Uncovering the Hidden Market: Can the SECURE Data Act and GUARD Financial Data Act Protect Us?
In the spirit of our sprint toward the Singularity, I find myself updating a blog post not even a week after it was posted. Yesterday, a group of House Republicans introduced two bills, the SECURE Data Act and the GUARD Financial Data Act. SECURE is aimed at the commercial use of collected data by technology companies, and GUARD is more of an update to the Gramm-Leach-Bliley Act. They both attempt to build a federal standard for consumer data privacy. Given that I’d only just finished the article and Sarah was still fresh in my mind, I decided to read the texts of both bills and see how they would impact Sarah’s story, if adopted.
In my article, Sarah had a whole series of unfortunate situations with her data, particularly her location data. The root cause was an innocuous app on her phone that provided great weather updates and radar features but also dropped her location data into a real-time bidding auction for customized ads. This allows observers of the auction, who aren't even bidding, to assemble quite a bit of information about Sarah and millions of others. Sarah didn't realize the app was doing this, because the behavior was hidden behind a generic accept button. Most of the rest of Sarah's problems, from scammers to the FBI, stemmed from this massive trove of location data collected without her clear knowledge and consent. Will SECURE and GUARD give Sarah the protection she needs?
The good bits first. SECURE Section 2(b)(1) requires opt-in consent before processing sensitive data. Section 16(30)(D) puts “precise geolocation data” in the sensitive bucket. Section 16(8) defines consent as a “clear affirmative act that signifies the freely given, specific, informed, and unambiguous agreement.” The Section 16(22) definition of “precise” is within 1,750 feet.
In a reasonable reading of the bill, a tap on "Allow" at the end of a long page of tiny words would probably not be considered a "clear affirmative act" or "unambiguous agreement." This means the app would need to explicitly get permission from Sarah to auction off her location data. By not opting in, Sarah would cauterize the PII bleed, because the app couldn't sell it anymore. Section 3(a)'s data minimization and Section 3(b)'s limitation on secondary uses both reinforce this: the app collected the location for radar, and selling it into an auction is a secondary use requiring its own consent.
There's a catch, though. The app couldn't sell the data only if it is precise to within 1,750 feet (533.4 meters); anything less precise falls under a less rigorous opt-out rule. Additionally, the bill attempts to protect small businesses from onerous bureaucracy by setting minimum thresholds for how much data is moved and how much profit is made. Section 16(12)(A) defines a data broker as a controller that (i) collects data about someone who is NOT a customer, client, user, reader, or subscriber, AND (ii) derives 50% or more of annual gross revenue from the sale of that data.
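To make that two-prong test concrete, here's a minimal Python sketch; the function name and its inputs are my own shorthand for the statutory prongs, not anything from the bill's text:

```python
# A minimal sketch of SECURE Section 16(12)(A)'s two-prong data broker test.
# The function and its inputs are my own shorthand, not statutory language.

def is_data_broker(subject_is_customer_or_user: bool,
                   data_sale_revenue_share: float) -> bool:
    """Both prongs must hold: (i) the data concerns someone who is NOT a
    customer, client, user, reader, or subscriber, AND (ii) 50% or more of
    annual gross revenue comes from the sale of that data."""
    return (not subject_is_customer_or_user) and data_sale_revenue_share >= 0.50

# The weather app calls Sarah a "user", so prong (i) fails no matter how
# much money the location auction brings in:
print(is_data_broker(subject_is_customer_or_user=True,
                     data_sale_revenue_share=0.95))  # -> False
```

The AND between the prongs is doing the heavy lifting: failing either one is enough to stay off the registry.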
So, if the weather app considers Sarah and everyone else with the app a user, then it is free of the registry, the conspicuous data-broker notice, and the FTC's central searchable list. Sarah wouldn't find the weather app in the federal broker registry even though it's the source of her problem. To be fair, the weather app would now be a "controller" with some obligations, but under SECURE's carve-outs it could still be incredibly difficult for Sarah to identify it as the source of her PII headaches. And since a cell phone's GPS is accurate to within five to thirty feet, the app, as a controller, could still capture the precise coordinates, broadcast only the city block or census area (opt-out territory), and argue that it isn't violating SECURE. How effective this will be in protecting Americans' data privacy is up for debate, and I am hopeful it's a vigorous one.
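To illustrate the mechanics of that move, here's a hypothetical sketch; the rounding scheme, the sample coordinates, and the conversion factor are my own illustration, and whether any particular coarsening actually clears the statutory bar is a legal question, not a math one:

```python
# A hypothetical sketch of the coarsening trick described above; the grid
# size and conversion factor are my own illustration, not from the bill.

FT_PER_DEG_LAT = 364_567  # approximate feet per degree of latitude

def coarsen(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    """Round a precise GPS fix to a coarse grid.

    At 2 decimal places, grid points sit about 0.01 degrees apart, roughly
    3,600 feet of latitude, so the shared value arguably no longer
    identifies a location "within 1,750 feet" per Section 16(22).
    """
    return round(lat, decimals), round(lon, decimals)

# A 5-to-30-foot GPS fix becomes, at best, a census-block-scale point:
print(coarsen(39.9612, -82.9988))  # -> (39.96, -83.0)
```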
What about the clients sitting at the real-time auction? Surely they'll be registered, and Sarah can opt out with them. Well, for some of the big ones like LexisNexis or Epsilon, sure: Sarah isn't their customer, and probably 50% or more of their revenue comes from selling the kind of data they buy. However, most of the ad-tech-adjacent firms that handle Sarah's location in near-real-time make money selling Sarah's data while also selling analytics, ad-tech SaaS, and consulting services, so the data-sale share of their revenue may never hit 50%. Maybe they don't get on the registry either.
Sarah's victory is even more in question because this bill would supplant all state laws, including the portal we discussed that California was bringing online to allow bulk opt-outs from data brokers. Fewer brokers will be identifiable under this rule, and there will only be a registry, not any assistance with opting out. A federal opt-out portal is relegated to a three-year study to determine its feasibility.
The bills don't prohibit federal agencies from collecting or buying the data; in fact, they make an exception for federal agencies. Senators Lee and Wyden have reintroduced the bipartisan Government Surveillance Reform Act, which would require federal agencies to go back to court orders instead of purchase orders. Separately, the Bulk Data Rule and Executive Order 14117, which I mentioned last week, are working to stem the flow of PII to foreign adversaries.
So Sarah gets a new tool: the opt-in requirement and the ban on using data for purposes it wasn't collected for. It will likely have some impact, but there's a lot of room to improve the protections Sarah really needs. Now is the time for everyone to have a discussion about what protections we should reasonably expect. While there might be disagreement, at least we're having the conversation. These bills are going to subcommittee right now. The Wyden-Lee bill is a separate vehicle, but it's also actively being discussed.
I mentioned GUARD earlier since it and SECURE were proposed together. GUARD is financial though, and would have no impact on Sarah’s weather app.
If Sarah were using Plaid for multiple different transactions, then GUARD would kick in. It includes several nice additions, like enforcing data minimization for financial institutions and providing a continuous opt-out rather than the one-time "speak now or we forever sell your data" option the law currently allows. It also sets some limits on aggregation, and Sarah could opt out of any aggregation whenever she used the app. The right to request deletion of data is another nice modernization.
Of course, it isn't perfect. It's far friendlier to the banks than SECURE is to the tech companies. That, however, would be a whole blog post unto itself and would probably make Sarah even more depressed than she already is. But she lives in Ohio, and the weather here makes us all depressed; if only there were an app for that.
If you’ve been reading this, then I suspect you care about data privacy. If you can identify with Sarah, or if you’re one of the people whose job is to protect the Sarahs of the world, let your representatives know what you expect from them when it comes to protecting our most sensitive information.