Opt-In Vs Opt-Out, Consent In Privacy

There is no privacy online without informed consent. But then there's also the opt-in, opt-out dilemma. And since opt-out favors the people who want your data, well...

Image by Gerd Altmann from Pixabay

Do you ever read those cookie policies that pop up when you visit some websites?

Do you really read that waiver the doctor's office makes you sign about who they may share your healthcare data with?

Ever read the EULA when you first log into a new computer or phone, the one covering what your operating system might phone home about to the vendor?

I know one of you is feeling smug right now because you're answering yes to all of these questions. So let me ask another:

What are your options if you decline to agree to these conditions?

Following up on that, did you get to review these conditions when you had the time to make an informed decision, such as before you purchased the computer/phone, or before you chose to visit that particular physician? Are you forced to go to that website because your company requires you to? Because your phone company requires you to use it to modify your service?

Informed consent is a principle demanding that when you give your consent to something, you do so aware of the impacts of your choice, and that you have chosen freely. In the US, we have laws that attempt to enforce this in a great many ways. If you've ever signed for a mortgage in the past 20 years, you'll remember the stack of forms that say the same thing over and over, plus the forms affirming that when you signed the previous form you understood what it meant. The regulations are doing everything they can to make sure you're not getting into a mortgage you don't understand, and the mortgage companies are doing everything they can to ensure that if you later try to sue your way out of the mortgage, you can't claim you were tricked into it or unaware of some of the terms.

In the mortgage example, you are shown a number of these documents well before closing and are encouraged to address these topics beforehand, so you don't get to the point of no return only to suddenly change your mind, or feel forced into conditions you don't accept.

Implicit in this is that you're mentally and emotionally capable of making decisions. There is a reason, for example, that governments set minimum age limits on things like entering into contracts, voting, getting married, etc. There are also limits that can be put on those who do not demonstrate the mental capacity to make their own decisions; a famous pop-culture example is the conservatorship of Britney Spears.

So to summarize, in order to provide informed consent you need:

  • To clearly understand the choice
  • To be able to freely make your own decision
  • To be capable of making such a decision

Opt-Out Fails This Test

Try a little test. Pick a website that, when you visit it in your regular browser with your tracking blocker turned on, asks you to disable the blocker to "support" the site. Now go ahead and open that exact same site in a private browsing window. Did you get any warning in the private window that cookies are being used to track you and serve you advertisements?

Or what about when you see a banner at the bottom of the page like this?

Typical "you're on our site so you're being tracked" notice

Doesn't this notification actually come AFTER you've been exposed to the tracking cookies? So even if you leave the site now, how does this meet the requirements of informed consent? Did you know what would happen BEFORE you clicked that particular link? Were you given the opportunity to make a decision before cookies were deployed to your browser?

In fact, no, you were not. In some cases, the website will allow you (if you go find their privacy and cookie policy page) to opt out of these things. But what does that mean for data that's already been shared about you? Nothing. It is already too late.
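You can see the mechanics of this for yourself: tracking cookies ride in on the very first HTTP response, before any consent banner has even rendered. Below is a minimal sketch that parses `Set-Cookie` headers the way a browser would; the header values and the `example.com` domain are made up for illustration.

```python
# Cookies arrive in the *first* HTTP response, before any consent banner
# is rendered or clicked. Parse some (hypothetical) Set-Cookie headers
# the way a browser would, using the standard library.
from http.cookies import SimpleCookie

# Hypothetical headers a site might send on your very first request:
raw_headers = [
    "session_id=abc123; Path=/; HttpOnly",
    "ad_tracker=u-98765; Domain=.example.com; Max-Age=31536000",
]

jar = SimpleCookie()
for header in raw_headers:
    jar.load(header)

# By the time the page (and its consent banner) appears, these are already set:
for name, morsel in jar.items():
    print(f"{name} = {morsel.value} (max-age: {morsel['max-age'] or 'session'})")
```

Note the year-long `Max-Age` on the (made-up) `ad_tracker` cookie: the tracking identifier persists long after you close the tab, consent banner or not.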

That may not seem like that big a deal. I'd argue it is, but OK, let's raise the stakes. Let's say you've had a Facebook account going back years, maybe even decades by now. And you've just found out that Facebook is using your content on their site to help train a Large Language Model (LLM) AI solution. So posts that you shared only with a select group of people are now being used to train an AI that will probably be used publicly. What are your options?

Well, in the EU, you can invoke GDPR and challenge Facebook's right to use that data, and there are reports that this has been successful. But you're not a citizen of the EU, and you don't reside in the EU. Literally your only option is to quit using Facebook. But guess what, that doesn't prevent Meta from using all your old posts. When you signed up for Facebook in 2012 were you explicitly aware that Facebook (Meta didn't exist yet) would someday use your posts to train a technology that you hadn't even heard of yet? Of course not.

Editable Policies and Legalese

Further hampering our ability to provide informed consent is the fact that End User License Agreements (EULAs) and other policies we must consent to live online and are allowed to change at the whim of the company providing the service or solution. We've all gotten that snail-mail notice from our credit card company or our bank telling us that terms and conditions have changed. Well, that happens all the time online as well, sometimes with less visibility.
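Since these documents can change silently, one pragmatic countermeasure is to snapshot and fingerprint them yourself, so a quiet edit at least triggers a flag. A minimal sketch, assuming you save the policy text periodically (the snippets below are invented):

```python
import hashlib

def policy_fingerprint(text: str) -> str:
    """Hash a policy's text so silent edits are detectable later."""
    # Normalize whitespace so cosmetic reflows don't raise false alarms.
    normalized = " ".join(text.split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Pretend these are two snapshots of the same terms-of-service page,
# saved months apart:
snapshot_jan = "We only disclose your data as described in this Statement."
snapshot_jun = "We disclose your data as described in this Statement."

if policy_fingerprint(snapshot_jan) != policy_fingerprint(snapshot_jun):
    print("Policy changed since last snapshot -- time to reread it.")
```

A fingerprint only tells you *that* something changed, not *what*; for that you'd keep the full snapshots and diff them.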

Part of the reason these change a lot? Well, discounting changing leadership priorities at these companies, regulations and laws can cause significant rounds of edits as well. Let's look at a recent example involving Sonos, as documented by The Verge. Here's the 2023 language from Sonos's privacy policy:

📘
Sonos does not and will not sell personal information about our customers. However, certain data practices described throughout this Privacy Statement may constitute a “sale” or “sharing” of data under California and/or other US state laws. See the below CA Addendum for more information applicable to CA residents. We want you to understand that information about our customers is an important part of our business. We only disclose your data as described in this Statement.

Compare that to the 2024 update:

📙
Certain data practices described throughout this Privacy Statement may constitute a “sale” or “sharing” of data under California and/or other US state laws. See the below CA Addendum for more information applicable to CA residents. We want you to understand that information about our customers is an important part of our business. We only disclose your data as described in this Statement.

So did Sonos change their policy or not? Yes, the first sentence of the 2023 paragraph was removed in 2024, but did that actually change what they do with your data? Or did a lawyer get nervous that, under California law, Sonos is considered to be "selling" personal data, making the 2023 paragraph incorrect? Did the 2023 version give their customers a false sense of privacy? I am not a lawyer, so I can't say with authority, nor have I seen a trustworthy interpretation I can reference and share with you.
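Edits like this are easy to surface mechanically if you keep snapshots. Here's a sketch using Python's standard `difflib` on abbreviated versions of the two paragraphs quoted above (split naively on sentence boundaries, which is good enough for a quick look):

```python
import difflib

policy_2023 = (
    "Sonos does not and will not sell personal information about our customers. "
    "Certain data practices may constitute a 'sale' or 'sharing' of data "
    "under California and/or other US state laws."
)
policy_2024 = (
    "Certain data practices may constitute a 'sale' or 'sharing' of data "
    "under California and/or other US state laws."
)

# A sentence-level unified diff: removed sentences are prefixed with "-".
diff = difflib.unified_diff(
    policy_2023.split(". "),
    policy_2024.split(". "),
    fromfile="2023", tofile="2024", lineterm="",
)
print("\n".join(diff))
```

The removed "does not and will not sell" sentence jumps right out, which is exactly the kind of change you'd want flagged for a closer read.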

It's enough to drive you to just click "accept" on these policies and avoid the headache that trying to interpret and understand them brings.

It Matters, Even Cookies

The zeroth law of both cybersecurity and privacy states that you cannot lose data you do not have. Every data breach, be it a cellular phone provider, a medical insurance company, or even your car manufacturer puts your private data at potential risk. That data can be as innocuous as what websites you visited and what ads you interacted with, all the way through to real-time location data to health diagnoses and beyond. The damage that can be caused ranges from mild embarrassment to, well, doxing and physical harm.

But even if the data collected seems innocuous, there's the principle of the matter. The question isn't "what do you have to hide," the question is "what right do you have to information about me, and why do you need it?" If they can't answer that question to your satisfaction, they shouldn't be getting your private data.

Take Action and Get Your Privacy Back

So to be blunt, the things you can do as an individual to protect your privacy in the online era are somewhat limited at best. There are lots of things you can do, some of which are outlined in other blogs available in the privacy collection here on Between Two Firewalls.

Options include choosing open-source software that doesn't collect data, choosing privacy focused browsers and plugins to your browsers, all the way up to divorcing from Google, Amazon, Apple, and Microsoft. The choices are yours, and every choice has tradeoffs. But even one change is better than whatever you're doing now.

But the only way this actually gets resolved is via legislation or through the legal process. Everything you do on your end, or I do on mine, is a patchwork, attempting to hide from data collection. One of the best organizations out there focused on privacy issues (especially online privacy) is the Electronic Frontier Foundation (EFF). The EFF can assist you with your potential legal privacy issues in court, and has a plethora of other great resources, including advice on making your voice heard to lawmakers. If you're not ready to get directly involved in the legal or legislative process, you can always start by voting for candidates who are interested in these policies and concerns, as well as donating to groups like the EFF.

Taking back your privacy is not quick. But understanding how systems are set up to make giving up your privacy the default condition, without informed consent, and what you can do about it is a start.