When Your Choice Impacts My Privacy

Copilot+ Recall might be fine for you, but if I'm sharing any data with you, your choice impacts my privacy. And that's just one example.

A wise woman known as Em recently pointed out on Mastodon:

If privacy is important to you, then you must also value the privacy of others.
— Em (@Em0nM4stodon@infosec.exchange)

The original post the quote above was taken from reads: "Tiny Privacy Tip for Everyone 🔒✨: We often talk about what we can do to protect our own privacy, but we don't talk enough about what we should be doing to protect other people's privacy. If privacy is important to you, then you *must* also value the privacy of others. This is a great cultural shift we all need to work on, collectively. Data privacy isn't only about using the right software and implementing legislation, it is also about people and cultures. Always think about what you can do to improve the privacy of others around you. This is how we build a better world 💚 #TinyPrivacyTip #Privacy"

Perhaps an obvious statement. Perhaps so obvious you've overlooked it. I know a great many companies seem to operate that way. Take, for example, the tale of at least some Wyndham hotels that chose to install an app called pcTattletale to keep an eye on the productivity of their front desk workers. You know, the people who take your ID when you check in and verify your address, phone, email, and the duration of your stay. Those people. Well, you see, somebody was able to compromise pcTattletale and gained access to all the collected data. The breach has even (in an extremely rare turn of events) caused the spyware company to shut down. Anyone who checked into one of the hotels in question (or who interacted with any of hundreds of other pcTattletale customers) may have had their private data exposed.

Who do we get to hold accountable for this? I suppose we'll get some answers as the inevitable lawsuits come forth, which we know will have somewhat less to do with who is at fault and somewhat more to do with who can pay out the most if they are found at fault.

We could talk for days about cookies and other website tracking, PHI breaches from insurance companies, and the like. But let's focus on you and me, shall we?

It's Always Been a Matter of Trust

If I know you and you know me, you probably have my phone number. You may know my birth date. You may even have my address. You may even have my itinerary for that months-long trip I'm taking to Europe, along with responsibility for watering what few plants we still have alive in the house, which means you also have the code to my front door keypad. I'm trusting you with private data, and I'm confident you're not going to tell that group of home invaders when I'll be gone so they can come take my one-of-a-kind collection of cybersecurity textbooks, recently valued at tens of dollars.

These things are collectors' items I tell ya!

I Trust You, But I Have To Trust Your Tech Too

There's the rub. Maybe I made it a point to write down that front door code on a sticky note and hand it to you. You, realizing you'd lose that note, put it in your contact info for me. Now I'm placing my trust in your cell phone, your contacts software, the cloud solution that stores that contact info, your strong password and multi-factor authentication solution...and any computing device you might access that contact software on.

Did I go overboard with the example? Yes and no. Presumably the contact info has both the code and my address in it, but if it doesn't also have the dates of my big trip, the risk is lowered. And hopefully I have other countermeasures in place to keep my collection of books safe. Maybe a top-of-the-line humidity-controlled safe bolted to the concrete slab, or a complicated taser system that only deactivates based on facial recognition.

But let's change this up a bit. Let's say your parents are getting to later stages in life. You have responsibility for their healthcare now, which means you likely have a whole list of private data at your fingertips because you are dealing with it daily:

  • Prescription information
  • Financial account information
  • Insurance information
  • Social Security number
  • Driver's license number

...and so on. Now does it matter for them if somehow this data gets out beyond your private sphere of trust? I'd argue it absolutely does.

Sorry, where was I? Ah yes. The thing is, we share our personal data with people we believe we can and should trust all the time. And we rarely stop to think about the ways third parties can access that data. (Let's not even get started on "private messages" on nearly any social media platform; we'll be here all day.) What do Google and Apple do with our contact lists? What do T-Mobile, AT&T, and Verizon do with our text messages? Good luck getting straight, complete answers to any of those questions, or the hundreds more we could ask.

Is It a Feature, Or Is It Spyware?

Which brings us to "fabulous" new AI-led features like Microsoft Copilot+ Recall. Please don't get me wrong, this isn't the only software of questionable privacy to hit the mainstream recently, but it is a very easy one to discuss, and discuss thoroughly. From the 10,000-foot view, this is a full-system recording of every move you make on a Windows 11 workstation. That means any interaction you and I have while you're using that computer, running that software, is exposing my data to that system.

I know that when I join a web meeting with an AI recording agent turned on, I get warnings telling me it is on and asking me to opt in before continuing. But there's no such thing for me if you've turned on a tool like Recall. I don't even know what OS you're running, so I don't know to ask the question. I am not a lawyer, but that seems to potentially tread all over wiretap and other recording laws, along with international privacy laws like GDPR. We may soon find out if I'm right.

Making a Point of Privacy

All I'm asking here is that you think about what you take for granted about data, and your responsibility for it, a bit more often. Think about how careful you're being with your own data, and ensure you're being at least as careful with the private data available to you from those who trust you. How would they want you to protect that data? The more you think this way, the more you'll start to question how businesses, governments, and the like are using and protecting your data. Sooner or later you'll even help take action on those questions, and ultimately that's what I'm hoping for from all of us. The sooner the better.