To say that 2022 was a big year for data privacy would be quite the understatement.
With regulatory scrutiny on the rise and the ghosts of platform privacy changes past, present and future haunting ad tech’s attic, there were few dull moments (unless you find the inner workings of GDPR consent strings to be dull; in that case, fair enough).
From the Federal Trade Commission’s plan to regulate privacy in the absence of a federal privacy law to Apple’s intimations about cracking down on fingerprinting, these are seven stories that sent ripples through the ad tech ecosystem in 2022 – and will keep on rippling in 2023.
Strung out
In February, Belgium’s data protection authority dropped a little bombshell on IAB Europe’s Transparency and Consent Framework (TCF), ruling that the TCF is illegal under GDPR in its current form.
To be fair, it wasn’t a total surprise. IAB Europe warned members in late 2021 that this pronouncement was on the way.
The Belgian DPA alleges that the TCF relies on legitimate interest (and shouldn’t). It also argues that IAB Europe is a data controller of TCF strings (a point IAB Europe vehemently disagrees with).
IAB Europe is working to overhaul the TCF and bring it into compliance, but the implications of the Belgian ruling are a big deal for ad tech companies. The TCF is a cornerstone of the ad tech industry’s plan to comply with GDPR and emerging state privacy laws in the US.
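For the uninitiated, the TC strings at the center of the ruling are compact, base64url-encoded bit fields that record a user's consent choices. As a minimal sketch (using a made-up string, not a real consent record), here's how the 6-bit version field at the front of the core segment can be read out:

```python
import base64

def tc_string_version(tc_string: str) -> int:
    """Read the 6-bit version field at the start of a TC string's core segment."""
    core = tc_string.split(".")[0]       # segments are dot-separated
    core += "=" * (-len(core) % 4)       # restore base64 padding
    raw = base64.urlsafe_b64decode(core)
    return raw[0] >> 2                   # top 6 bits of the first byte

# Hypothetical (not real) consent string; TCF v2 strings begin with "C".
print(tc_string_version("C" + "A" * 39))  # → 2
```

Everything downstream – vendor consents, legitimate interest claims, publisher restrictions – is packed into the bits that follow, which is why the question of who controls that string matters so much.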
“Fingerprinting is never allowed”
Apple isn’t exactly renowned for its clear communications with developers, but there’s no confusing Apple’s stance on device fingerprinting.
During a session about ATT at Apple’s weeklong Worldwide Developers Conference in June, Julia Hanson, a member of Apple’s privacy engineering team, did not mince words.
“With permission, tracking is allowed, but fingerprinting is never allowed,” Hanson said. “Regardless of whether a user gives your app permission to track, fingerprinting – or using signals from the device to try to identify the device or user – is not allowed, per the Apple Developer Program License Agreement.”
It’s hard to be any clearer than that.
But there’s one thing that isn’t clear, and that’s how Apple intends to enforce against fingerprinting.
Apple doesn’t yet appear to have a technical solution for systematically cracking down on the practice.
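To see why enforcement is hard, consider what fingerprinting actually looks like. A toy sketch (the signal names here are illustrative, not a real SDK's): combine a handful of stable device attributes and hash them into an identifier that no permission prompt ever gates.

```python
import hashlib
import json

def fingerprint(signals: dict) -> str:
    # Fingerprinting in a nutshell: hash a canonical serialization of
    # stable device signals into an identifier no ATT prompt gates.
    canonical = json.dumps(signals, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {"model": "iPhone14,2", "os_version": "16.1", "locale": "en_US",
          "timezone": "America/New_York", "screen": "1170x2532"}
print(fingerprint(device))  # same signals in, same ID out
```

Because each signal on its own has a legitimate use (localization, layout, crash triage), spotting the moment they get combined into an identifier is a detection problem, not a permissions problem.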
Not so pretty
In August, Sephora earned the dubious honor of becoming the first company to be fined under the California Consumer Privacy Act (CCPA).
Sephora paid $1.2 million to settle allegations that it failed to disclose to consumers it was selling their personal information to third parties to create targeted advertising profiles. The cosmetics brand was also dinged for failing to process opt-out requests made through user-enabled privacy controls, like the Global Privacy Control.
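The Global Privacy Control piece is worth dwelling on, because honoring it is mechanically simple: under the GPC spec, a participating browser attaches a `Sec-GPC: 1` header to each request to signal a do-not-sell/share preference. A minimal server-side sketch:

```python
def wants_gpc_opt_out(headers: dict) -> bool:
    # Under the GPC spec, a participating browser sends "Sec-GPC: 1"
    # with each request to signal a do-not-sell/share preference.
    return headers.get("Sec-GPC") == "1"

print(wants_gpc_opt_out({"Sec-GPC": "1"}))  # → True
print(wants_gpc_opt_out({}))                # → False
```

The Sephora settlement makes clear that ignoring this signal – however easy it is to detect – now carries regulatory risk in California.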
Although the dollar amount of the fine wasn’t significant, the implications of the settlement most certainly are.
The Sephora case “was a big shot across the bow,” OptiMine CEO Matt Voda recently told AdExchanger, “and we should expect more when the CPRA [California Privacy Rights Act] gets going.”
Location, location, location
But regulatory scrutiny is coming at the federal level, too.
In August, the FTC sued Kochava, alleging the data broker sells geolocation data that can be used to trace visits to sensitive locations, such as abortion clinics, mental health facilities, domestic abuse shelters and places of worship.
According to the commission, the geolocation data provided by Kochava is not anonymized, meaning it could be combined with a mobile ad ID and offline information to identify a specific device owner.
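The re-identification the FTC describes doesn't require much sophistication. A toy sketch with invented data: raw pings keyed by a mobile ad ID (MAID) are "anonymous" in name only, because a device's most frequent coordinate is usually a home address.

```python
from collections import Counter

# Invented data to illustrate the FTC's point.
pings = [
    ("maid-123", (44.98, -93.26)),   # repeated overnight location
    ("maid-123", (44.98, -93.26)),
    ("maid-123", (44.85, -93.47)),   # daytime visit to a sensitive location
]

# Step 1: the most frequent coordinate for a MAID is a likely home address.
home = Counter(loc for maid, loc in pings if maid == "maid-123").most_common(1)[0][0]

# Step 2: join that coordinate against offline records (here, a toy lookup)
# and the "anonymous" MAID now names a person - visit history included.
address_book = {(44.98, -93.26): "J. Doe, 123 Elm St"}
print(address_book.get(home))  # → J. Doe, 123 Elm St
```

Once the MAID resolves to a person, every other ping tied to that MAID – including the sensitive visits – resolves with it.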
The FTC’s allegations are timely, coming just a few months after the US Supreme Court’s Dobbs decision overturning Roe v. Wade.
Further reading:
- The FTC Spells Out Why It Zeroed In On Kochava
- Why Did The FTC Fixate On Kochava? We Asked Kochava’s CEO
“Surveillance” enters the lexicon
Over the summer, the FTC announced what’s known as an Advance Notice of Proposed Rulemaking, or ANPR, which is part of a process to explore rules to crack down on lax data security practices and what the commission refers to as “commercial surveillance.”
The FTC defines commercial surveillance as “the business of collecting, analyzing and profiting from information about people.” (Sound familiar?)
As Gary Kibel, a partner at Davis+Gilbert, pointed out in an AdExchanger column published in March, “the growing association between ‘data driven’ and ‘surveillance’ is a problem.”
Although the terms “commercial surveillance” and “surveillance advertising” aren’t new, the FTC’s rulemaking process has helped cement them in the public consciousness – and the association is going to be hard to shake.
Close but no cigar (yet)
The American Data Privacy and Protection Act (ADPPA) is the closest the United States has come to passing a federal data privacy law – but it’s stalled in the House.
In September, House Speaker Nancy Pelosi – who, it’s worth pointing out, represents California, which is home to the CCPA and the CPRA – said she doesn’t support the bill as is because it’s not strong enough.
In other words, why support a federal bill that doesn’t provide as much protection as state laws that are already on the books? (Senator Maria Cantwell of Washington state feels similarly.)
That said, the ADPPA represents real progress. If it doesn’t pass in the next Congress, a similar (hopefully more successful) bill will likely bubble up in the Congress after that.
But in the meantime, there are five state privacy laws coming into effect in 2023 to contend with – in California, Virginia, Connecticut, Colorado and Utah.
Time to run
Work is proceeding apace on the Android Privacy Sandbox, which is set to enter beta in 2023 – and when that happens, life is going to change for many SDKs.
Developers integrate software development kits into their apps so they don’t have to write code from scratch to do things like monetize or get crash reporting. But when an SDK is executed in a host app, it inherits the same permissions as the app, meaning there’s the potential for undisclosed data collection and sharing.
But there’s an API incubating inside the Android Privacy Sandbox called SDK Runtime that will stop that practice in its tracks.
The SDK Runtime API creates a dedicated and separate environment in which to run third-party SDKs, effectively cutting off their ability to gather in-app data without explicit consent. As in, goodnight and good luck to some not-so-kosher business models.
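To make the shift concrete, here's a toy model of the two execution models – this is a conceptual sketch in Python, not the actual Android API. Today an SDK runs with whatever the host app was granted; under the SDK Runtime, it starts with nothing.

```python
def run_sdk_in_process(app_permissions: set, sdk) -> set:
    # Status quo: an SDK executed inside the host app inherits every
    # permission the app holds - location, contacts, whatever was granted.
    return sdk(app_permissions)

def run_sdk_in_runtime(sdk) -> set:
    # SDK Runtime model (conceptually): the SDK runs in its own sandboxed
    # environment and inherits no permissions from the host app.
    return sdk(set())

# A hypothetical SDK that helps itself to whatever data it can reach.
greedy_sdk = lambda granted: {"location", "contacts"} & granted

print(sorted(run_sdk_in_process({"location", "contacts", "camera"}, greedy_sdk)))
# → ['contacts', 'location']
print(run_sdk_in_runtime(greedy_sdk))  # → set()
```

Same SDK, same code – but in the sandboxed model, the undisclosed data collection simply has nothing to collect.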
As big of a deal as SDK Runtime is, it’s a wonder more people aren’t talking about it.