New Privacy Rights Act Exempts Government and Gives More Power to the FTC

Consumer Choice Center Deputy Director Yaël Ossowski (Tom Williams/CQ Roll Call/Newscom)

Data privacy talk in Congress seems kind of ironic coming just a week after lawmakers rejected a proposal to make federal authorities get a warrant to search Americans' electronic communications. But in keeping with that move, the American Privacy Rights Act—a draft data privacy bill that will be getting a hearing in the House Innovation, Data, and Commerce Subcommittee today—would exempt governments and entities dealing with data on behalf of the government from its protections.

The bill would also give more power to the Federal Trade Commission (FTC), and create an "unprecedented" private right of action to sue companies over data handling, according to Yaël Ossowski.

Ossowski is deputy director of the Consumer Choice Center, which bills itself as "an independent, non-partisan consumer advocacy group championing the benefits of freedom of choice, innovation, and abundance in everyday life." I talked to him yesterday about the bill's (few) benefits and its (myriad) drawbacks.

When your average libertarian or classical liberal hears "federal privacy law," it sounds bad. Can you talk about why small government supporters might endorse some sort of federal privacy law? 

Ossowski: It mostly has to do with the patchwork legislation. You have stricter privacy rules that already exist in places like California or Virginia or Vermont. Essentially, any attempt to have a nationwide bill that applies to everyone makes it a lot easier for businesses, makes it a lot easier for consumers, and is generally just easier to understand….It's at least a uniform policy, so that people can figure out what works and what doesn't, and they can craft their strategy. [Nonprofits] can figure out how they can work their own data collection or petition drives, these kinds of things. It's just what we call regulatory stability.

A couple of terms that often come up in data privacy discussions are portability and tech neutrality. Before we go any further, can you briefly define these things? 

Portability is the ability to take your data—whether it's been collected about you or you've given it to a particular platform or service—with you, basically to export it in an Excel file, a JSON format, a zip file. It's just the ability to export all the information that you've given, which is what we have with many different tech companies. [It allows] us to export that data in a machine-readable, easy way that we can then input into another service, like a competing service we'd like to use.

Then tech neutrality is just that—the government does not determine the type of technology that businesses or consumers are supposed to use. The example is that in the E.U., they mandate that every phone can only use USB-C, which then means that if we want some kind of wireless charging, is there any way that's slowed down? Consumers who prefer Lightning on their Apple iPhone won't be able to choose anymore….That's just something you generally want to avoid with any kind of tech regulation, but also with privacy regulation, too.
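For illustration only—this is a hypothetical sketch, not language from the bill or from Ossowski—the kind of machine-readable export he describes might look something like this minimal Python example, with made-up field names:

import json

# Hypothetical data-portability export: gather the data a service holds about
# a user and write it out in a machine-readable format (JSON) that a competing
# service could import. All field names here are illustrative only.
user_data = {
    "profile": {"name": "Jane Doe", "email": "jane@example.com"},
    "posts": [{"date": "2024-01-15", "text": "Hello, world"}],
    "ad_preferences": {"personalized_ads": False},
}

with open("export.json", "w") as f:
    json.dump(user_data, f, indent=2)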

Let's talk about the American Privacy Rights Act. Who's behind this bill? Has it been formally introduced yet, or are we still just talking about a draft version? 

We only have a draft version, from Senator Maria Cantwell, Democrat of Washington, and Republican Representative Cathy McMorris Rodgers, also of Washington. This is only a discussion draft. It builds on past privacy legislation that you could consider much more left-wing.

What are the main positives of the American Privacy Rights Act?

The things I like are preemption—that's number one. Basically, every other state privacy law is essentially preempted by this national privacy law. The more stringent requirements in California or in Virginia will no longer apply, and it will just be the national privacy law that's the law of the land. That's good.

As I mentioned before, I think the data portability is good. It's a great principle, and I think it's very consumer-friendly, tech-friendly, and fairly reasonable.

The other one is transparency on what platforms or services collect. That is pretty standard fare. Most app stores do this. Most cell phones already do this. It's generally a very good tech practice, it would just kind of be backed up by at least some portion of the law.

What are the main negatives of this bill? 

I think the outright veto on targeted advertising just does not make sense with how most companies and services offer things today. [Under this measure, "a consumer has the right to opt out of the use of their personal information for targeted advertising," per Cantwell and McMorris Rodgers' summary of the bill.] It's not just social media companies, it's also journalistic institutions and universities and small businesses that use Facebook ads or Marketplace, or people who want to sell products online. If you gut the ability to do any kind of targeted advertising, you essentially make that business conduct illegal.

For example, my dad is an electrician, he has a small business. If he wanted to advertise just for people in Cincinnati because he's not working outside of Cincinnati, that wouldn't be allowed? Because that would be targeted advertising? 

Technically, per the bill right now, there are covered entities—these are the people who would be responsible for following this law. If you're under $40 million [in annual revenue] as a for-profit company, this does not apply to you. But if you're a nonprofit—say, Reason Foundation—it does apply to you.

What if you're a small business but you're using Facebook to advertise? Obviously, Facebook falls into that covered entity category. Does that mean that if you're a small business, you could still use Facebook ads in a targeted way? Or no, since Facebook itself can't do targeted advertising? 

No, because there's a specific section in the bill that…makes social media companies specifically not able to do this. They call them "covered high-impact social media companies." That would mean that you would not be able to run targeted advertising on Facebook.

That's obviously going to affect tons of average people and small businesses—not just Facebook. 

Yeah….Basically, what this bill says is that the consumer has the right to opt out of any of their personal information being used for targeting. You have opt-out rights with every covered service, which essentially kills targeted advertising. So that's one negative point.

[There's also] this idea of data minimization—basically, you should not be collecting any more information beyond what's necessary, proportionate, and limited—which I think is in principle very good. It's just very unworkable as a legal standard, because the types of information that you would need from people you're trying to service or sell goods to are always going to change. And what you might consider necessary or proportionate, the government might not. That is essentially not a good legal standard to have in law. It would probably harm a lot of people and a lot of businesses, and our ability to use those services.

Any other negatives?

The bill actually exempts any government agency from any privacy actions, so the government itself would not be subject to these privacy rules.

Wow. Of course. 

Just like when California's database of concealed carry permit holders—I think that's what it was—leaked. No crime, no foul. There's no penalty, no problem. I think the same thing happened in the state of New York.

A lot of the government agencies—particularly the [National Security Agency]—actually buy a lot of information from data brokers that they couldn't otherwise get without a warrant. And they're not subject to any of these privacy rules, so they could basically continue doing that.

This idea that government is exempt from the privacy rules is pretty bogus and actually contrary to the European [General Data Protection Regulation], in which governments are subject to the data privacy rules.

You mentioned that there are elements of the Kids Online Safety Act (KOSA) in this?

It deals with children and their ability to sign up for services and be targeted with advertising….Essentially, any social media firm would have to put up a kind of walled garden around any users who are under 17…and would have specific rules around targeted advertising—basically a no-go zone for anyone who's under 17. It would create different tiers of users and consumers for various services.

Would this then require various places to check IDs so they know people's ages, or are they just supposed to say that if their users might be under 17 then they have to follow these rules?  

It's not as specific as KOSA, so it doesn't say that they need to somehow electronically verify….It's just any user who inputs their data and they're under 17….There's no technological requirement of checking ID or anything….But you would, I believe, need to provide proof. They just leave it up to the FTC, in the bill, to determine that.

That's no good. 

The FTC will be the effective enforcer of this privacy law—which, technically, is what the FTC should be doing anyway. It's supposed to be concentrating on unfair and deceptive practices. But the FTC will not only determine if privacy has been violated, it will also act as the court. It'll have a new division inside of it.

But there will also be this requirement for algorithms—any algorithm that's deployed. This is another "Why should you oppose this?" or "Why is it bad?" There's essentially a prior restraint for algorithms. The FTC can have veto power over any algorithmic innovation. [Companies] have to pass a safety test and a data privacy test before [an algorithm] can ever be released to the market.

So it gives the FTC total power over—and it just says any algorithm. It's not [more] specific; it just says any covered algorithm. It's essentially a Trojan horse to legislate on AI without actually legislating on AI.

This is one of those kinds of laws where we don't really know exactly how it would work out in practice, because it's kind of like: here's a broad framework, but we'll leave it up to some administrative agency—in this case, the FTC—to figure out all the particulars.

Yeah, there are two ways [for enforcement]. One is that the FTC will crack down.

But what this bill actually does, which is unprecedented, is create a private right of action. Any individual consumer can sue a company for violating their privacy under these statutes.

While the bill itself does give the FTC the power to decide things, it also enshrines into law what you as a private citizen can use in your court case against Marriott, British Airways, or whatever company might have breached your data privacy in some way. I think that is the more unprecedented thing, and it ultimately would empower trial attorneys and lawyers. I think [Sen.] Ted Cruz [R–Texas] had comments specific to this. ["I cannot support any data privacy bill that empowers trial lawyers, strengthens Big Tech by imposing crushing new regulatory costs on upstart competitors or gives unprecedented power to the FTC to become referees of internet speech and DEI compliance," Cruz said in an official statement.]

It sounds like there could be major potential for abuse of that sort of thing, right?  

There is. There's obviously [potential for] abuse of lawsuits and just bogus lawsuits.

Is there anything about encryption in the American Privacy Rights Act? 

There are a few things. They do say that there's a preference for privacy-enhancing technology….Essentially, if you do implement encryption, that is seen as positive. It is a good thing. It does not try to outlaw encryption—though I'm sure they'd love to figure that out at some point. But, so far, no, there is no way of doing that. They actually seem very positive toward what they call homomorphic encryption, differential privacy, and zero-knowledge proofs.

Is there anything missing from the bill that you feel is important? 

Well, obviously the government being exempted is a big issue.

They have a big section on a kind of repository of so-called data brokers, to be like a national database. I don't think that's too egregious, because these are companies that sell a lot of our information and data, no one knows who they are, and they're the ones who deal with law enforcement. I think just transparency on that stuff is good.

I don't know too much that's missing. I think it just went a bit overboard.

This is, so far, probably the least offensive privacy bill that has made it out of the House and the Senate negotiating together. But there are still a lot of problematic parts to it.

Are there any other serious privacy law proposals in Congress right now?

In [today's subcommittee hearing], they are looking at [other] privacy proposals. But from all indications, this is the end result of negotiations between Democrats and Republicans, from what we can tell. There are a few others because this does not really touch financial privacy, nor does it touch health privacy too much. They have left that to existing law.

Is there anything that I'm missing here that you think is important to mention?  

Just that the large covered entities, like the social media entities—it's technically any company with revenues over $3 billion a year. Right now, that's only Meta, YouTube, X, LinkedIn, and TikTok. So just to mention that there's particular scorn there, with social media companies being called out. Very similar to KOSA.

This interview has been condensed and edited for style and clarity.

More Sex & Tech News

• Hold Fast, a new podcast on Audible (with some episodes featuring yours truly), explores the rise of the Phoenix New Times and Michael Lacey and James Larkin's alt-weekly empire, their efforts to take the papers' classified-ad sections digital with Backpage, and the subsequent (and ongoing) prosecution by multiple levels of government over sex work advertising. Nancy Rommelmann and Sarah Hepola ("who spent a combined 25 years in the alt-weekly trenches," as they note) talk to Hold Fast co-creator Mike Mooney about the series and the erstwhile world of alt-weeklies here.

• Today's Innovation, Data, and Commerce Subcommittee hearing will also cover a dizzying array of proposals related to children and technology, including the Kids Online Safety Act (KOSA), the Children and Teens' Online Privacy Protection Act (aka COPPA 2.0), the Protecting Kids on Social Media Act, and the SCREEN Act.

• "Schools were just supposed to block porn. Instead they sabotaged homework and censored suicide prevention sites": The Markup explores how the Children's Internet Protection Act (CIPA), which "requires schools seeking subsidized internet access to keep students from seeing obscene or harmful images online," has led to "school districts all over the country…limiting not only what images students can see but what words they can read."

• Social media platforms have property rights, too, writes Ethan Blevins.

Today's Image

Phoenix | 2018 (ENB/Reason)

