Your Fridge is Spying on You: How Big Tech Became Big Brother (and Brought Friends)
From phones to toasters, the global surveillance state isn’t coming—it’s already in your kitchen
Welcome to the Age of Surveillance:
There was a time, not long ago, when the biggest thing your TV “knew” was how to switch to Channel 4. Your fridge existed to keep milk cold. Your watch told the time. Your phone—well, alright, phones have been problematic for a while. But the point is: everyday objects weren’t plotting against you.
Now? Welcome to 2025, where your lightbulbs are snitches, your car might be quietly tracking your mood, and the concept of privacy has been air-fried into oblivion by the very gadgets we invited into our homes.
At the core of it all: Big Tech. Meta. Google. Amazon. Microsoft. Apple. X. The faceless grey blur of consumer electronics manufacturers. And every app that promised to make your life easier—while quietly sucking up your location, preferences, contacts, and bathroom schedule.
And while this may sound dramatic, the truth is even more unsettling: these companies aren’t just collecting data. Surveillance is the business model.
Surveillance as a Service (SaaS): The Real Revenue Stream
Google doesn’t make most of its money selling you phones. Meta isn’t surviving on your aunt’s Farmville addiction. Amazon didn’t become a trillion-dollar empire on discounted paper towels alone.
They rake it in by mining your personal data—where you go, who you talk to, what you type, what you say aloud in your living room—and then selling access to those behavioural insights to advertisers, governments, and anyone with enough cash to buy a targeted ad or data set.
Take Google. It tracks your search queries, Gmail content, YouTube viewing history, app usage, and Android device activity. Even “incognito mode” is more PR than privacy. And with recent tweaks to its terms of service, Google has made it clear that it can now sell your information to “government partners,” or basically anyone else it sees fit, including for use in weapons and software deployed to target and kill civilians. From “don’t be evil” to “don’t be naïve” in just two decades.
Meta, meanwhile, has turned the personal details of billions into behavioural gold. Their algorithms don’t just understand your likes—they understand your triggers, your insecurities, your breaking points. It’s all very efficient, especially when harvesting data from Instagram, WhatsApp, Oculus, and whatever remains of Facebook. They own the ecosystem and the microscope.
Amazon? It tracks more than your delivery habits. Every Alexa request, Fire TV click, Ring doorbell detection, and Kindle tap feeds into a profile that makes you easier to predict, persuade, or surveil. At one point, Ring was partnered with over 2,000 US police departments—blurring the line between corporate convenience and state surveillance.
Even Microsoft, the supposedly boring one, is quietly in on the game. Windows 10 and 11 send usage data back to HQ by default, and platforms like LinkedIn give employers profiling capabilities that look suspiciously like corporate-level people-watching.
And X (formerly Twitter)? Under Elon Musk, it’s a data Wild West. He’s used internal analytics to fuel mass layoffs, track dissent, and push political narratives—all while publicly courting governments, hosting far-right voices, and reinstating banned accounts with disturbingly little regard for safety or privacy.
Smart Tech, Dumb Trust: The Devices Watching You
But it’s not just the platforms. Your stuff is watching you, too.
Smart TVs track what you watch—even when you’re not using built-in apps. Some models even scan the nearby Wi-Fi environment, just to get a little extra information. Smart assistants like Siri, Alexa, and Google Assistant have all, at some point, been caught recording conversations by accident. Or “accident.”
Wearables like smartwatches monitor your heart rate, sleep, blood oxygen, and movement. That’s great if you’re training for a marathon, less great if you’re worried about employers or insurers gaining access to that data through partnerships or leaks.
Even your car—yes, your actual car—is likely collecting data on your driving style, navigation history, cabin activity, and whether you’ve had a cheeky cry on the M25.
And all this data? It’s not always stored securely. It’s often sold, shared, or leaked. The so-called “anonymisation” of your personal data doesn’t work either—not really. With just a handful of data points, researchers have repeatedly shown they can re-identify individuals in massive anonymous data sets. Your phone’s metadata (the “data about data”)—who you called, when, where you were—is often more revealing than the content of the call itself.
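To make the re-identification point concrete, here’s a minimal Python sketch of the classic linkage attack, using entirely invented tables and column names: join an “anonymised” dataset to a public record on a few quasi-identifiers (postcode, birth date, gender) and the names come straight back. Nothing here is real data; it’s only an illustration of the mechanism researchers have demonstrated at scale.

```python
# Toy linkage attack: re-identifying an "anonymised" dataset by joining it
# to a public record on a few quasi-identifiers. All data here is invented.
import pandas as pd

# "Anonymised" health data: names stripped, but quasi-identifiers retained.
health = pd.DataFrame({
    "postcode":   ["SW1A 1AA", "M1 1AE", "SW1A 1AA"],
    "birth_date": ["1984-03-02", "1991-07-19", "1984-03-02"],
    "gender":     ["F", "M", "M"],
    "diagnosis":  ["anxiety", "diabetes", "hypertension"],
})

# Public register (think electoral roll) that still carries names.
register = pd.DataFrame({
    "name":       ["A. Example", "B. Example", "C. Example"],
    "postcode":   ["SW1A 1AA", "M1 1AE", "SW1A 1AA"],
    "birth_date": ["1984-03-02", "1991-07-19", "1984-03-02"],
    "gender":     ["F", "M", "M"],
})

# Join on the quasi-identifiers: the "anonymous" diagnoses now have names.
reidentified = health.merge(register, on=["postcode", "birth_date", "gender"])
print(reidentified[["name", "diagnosis"]])
```

Swap the toy tables for a leaked fitness dataset and a public voter roll and you have the attack researchers keep repeating: with only a handful of quasi-identifiers, “anonymous” rarely stays anonymous.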
The Aggregation Equation: Why Little Things Reveal Big Truths
Individually, none of this feels dramatic. So what if your weather app knows where you are? Who cares if your voice assistant heard you say “Play Phil Collins”?
But all of this data—when stitched together—builds an alarmingly intimate profile. Your preferences, fears, health status, income bracket, sexual orientation, religious beliefs, political leanings. It becomes a portrait of your life so detailed, it makes your passport look like a post-it note.
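To put a rough number on that “aggregation equation”: singling out one person among roughly eight billion takes only about 33 bits of information (2^33 is about 8.6 billion), and every seemingly trivial attribute chips away at that total. Here’s a back-of-the-envelope sketch; the attribute counts are guesses I’ve picked purely for the arithmetic, not measurements.

```python
# Back-of-the-envelope: how quickly small, "harmless" data points narrow you down.
# The attribute cardinalities below are illustrative guesses, not measurements.
import math

world_population = 8_000_000_000
bits_to_identify = math.log2(world_population)  # ~33 bits singles out one person

attributes = {
    "rough location (city)":       10_000,  # one of ~10k cities
    "age bracket (5-year bands)":  20,
    "phone model":                 500,
    "favourite streaming service": 10,
    "commute start time (15 min)": 96,
}

bits_gained = sum(math.log2(n) for n in attributes.values())
print(f"Bits needed to single out one person: {bits_to_identify:.1f}")
print(f"Bits leaked by these five attributes:  {bits_gained:.1f}")
```

Real attributes are correlated, so treat the arithmetic as directional rather than exact. The point stands: a profile doesn’t need your name to be about you, and you specifically.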
Data brokers then sell these profiles, not just to marketers, but to landlords, insurers, and yes—governments. And when surveillance becomes this detailed, it’s not just creepy. It’s dangerous.
From Ad Tech to Authoritarianism: The Global Creep
The scary part? We’re already past the hypothetical stage.
In Israel, advanced surveillance tools powered by AI and behavioural data—some of it derived from consumer tech—have been deployed to identify and kill targets in Gaza. Facial recognition, metadata, and phone tracking play into what’s chillingly known as “the Gospel,” an IDF automated system that assigns targets for elimination. The line between commercial technology and state violence? Blurred into dust.
In the United States, under the Trump administration, ICE used private data brokers like LexisNexis and tools from Google’s partners to identify, track, and detain immigrants. Many of those targeted were legal residents or citizens mislabelled by flawed or biased systems. Google, meanwhile, quietly changed its terms of service to allow broader sale of your data, including to government agencies.
In India, facial recognition is increasingly used to monitor protests. In China, social credit scores are tied directly to surveillance data to restrict access to travel, jobs, and loans. And in the UK, real-time facial recognition is being piloted by police forces in ways that disproportionately target ethnic minorities and working-class communities—because of course it is.
Back in the corporate realm, Elon Musk used internal data from Twitter/X to justify mass layoffs and weed out “inefficient” staff, particularly targeting those in regulatory or oversight departments. He now controls a global behavioural database and has openly aligned himself with authoritarian voices. As for Donald Trump regaining power—with Musk’s X as the de facto town square? Let’s just say: history really did have a sense of irony.
“Give me six lines written by the most honest man and I will find something in them which will hang him.”
A quote that springs to mind, and one you’ll hear again in the video embedded below, is attributed to the French Cardinal Richelieu: “Give me six lines written by the most honest man and I will find something in them which will hang him.” Now imagine those six lines are your Google search history, your location trail, your sleep schedule, your doorbell footage, and the background chatter picked up by your TV or in your car. This isn’t a future dystopia. It’s a push notification away.
A Final Word of Thanks (and Warning)
Before we wrap up, I want to give enormous credit to the British journalist Carole Cadwalladr, who, six years ago, helped expose Cambridge Analytica’s role in harvesting Facebook data to manipulate voters. It was a groundbreaking piece of journalism. For me, it was the straw that broke the camel’s back in terms of ditching all my Facebook/Meta products. It was a huge scandal, and breaking the story, which by her own admission almost broke her, was a remarkable achievement. She was sued in a brutal court case simply for speaking the truth, and her courage lit a beacon in the darkness surrounding Big Tech’s influence on democracy. Her reporting didn’t just peel back the curtain; it proved that the surveillance machine was already shaping our elections, our discourse, and our fate.

On the day I’m writing this section (11 April 2025), I’ve also just found out that her contract at The Observer, where she has worked for many years and where she published this story, has not been renewed following the paper’s acquisition by Tortoise Media. This is despite the fact that many of the stories she brought to the paper, particularly those covering the Cambridge Analytica scandal, have been among the most-read articles in the newspaper’s history. Who says censorship is dead? Well, I’ll tell you for free: heritage news media certainly is. The Observer is Britain’s oldest newspaper; it was owned by The Guardian, part of the Guardian Media Group, which in turn is owned by the Scott Trust Limited. They sold The Observer to Tortoise Media, a British news website co-founded in 2018 by former BBC News director and Times editor James Harding and former US ambassador to the United Kingdom Matthew Barzun. To be perfectly honest, I think it’s highly unlikely that either Tortoise or, more sadly, The Observer will be around in another ten years.

We all owe Carole a debt for what we now know. You can start by subscribing to her Substack. But we also owe it to ourselves to stay aware (another reason to subscribe to her Substack)! After that, be sure to watch her recent and incredibly powerful TED Talk below. Unfortunately, this is a YouTube link, only because I couldn’t get Substack to successfully embed the original link from TED. That irony isn’t lost on me.
Finally, the surveillance economy isn’t something we “opt into”—it’s the air we breathe. But we can, at the very least, start paying attention to who’s selling us the oxygen.
This article kicks off the first of a three-part series where I’ll be diving into the tech industry, its products, and their impact on our lives. The aim? To make you stop, think, and seriously question how we interact with this technology. When should we switch things off, cut back, or even ditch them altogether?
More importantly, let’s discuss the treasure trove of information we hand over to these companies. It’s time to reflect on which brands we’re willing to trust with our hard-earned cash—and our deepest secrets. After all, if you’re paying for a product or service, you should be absolutely certain about who you’re sharing your personal details with, not to mention your precise location—24/7. So, buckle up and get ready to rethink your tech choices. And don’t forget to subscribe for free now and receive all the articles in your inbox when they go live!
Part Two, “Hey World, We Have a Problem” (how the entire planet outsourced its nervous system to a handful of American corporations born from Cold War paranoia, and why that’s now an existential threat under a rogue administration), goes live on Friday 25th April.
Thanks for reading. If you enjoyed this article, please take a second and hit share. If you’d like to see more like it, take two seconds and hit subscribe below:

References:
Metadata and Anonymised Data Risks
https://mcmillan.ca/insights/risks-of-anonymized-and-aggregated-data
https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2014/md_201410
Smart Tech Overreach
https://www.theguardian.com/technology/2022/oct/06/smart-home-gadgets-which-investigation-data
https://www.vox.com/culture/2023/3/30/23660933/apple-siri-lawsuit-settlement-is-my-phone-spying-on-me
Carole Cadwalladr and Cambridge Analytica
https://www.theguardian.com/news/series/cambridge-analytica-files
https://www.bbc.co.uk/news/uk-politics-66047394
Israeli Government Use of AI and Consumer Data
https://www.theguardian.com/world/2024/apr/03/israel-ai-bombing-targets-civilian-deaths-gaza
https://www.972mag.com/gospel-ai-targeting-gaza/
US Government and ICE Using Commercial Data
https://www.washingtonpost.com/technology/2021/03/09/ice-lexisnexis-contract/
https://theintercept.com/2022/07/08/ice-surveillance-data-google-lexisnexis/
Google’s User Agreement Changes
https://www.techspot.com/news/98077-google-quietly-updates-terms-allow-sharing-data-government.html
Musk’s Layoffs and Internal Analytics
https://www.bloomberg.com/news/articles/2022-11-14/elon-musk-uses-employee-data-to-cut-jobs-at-twitter
Richelieu Quote
https://en.wikiquote.org/wiki/Cardinal_Richelieu
International Surveillance Practices
India: https://www.technologyreview.com/2022/12/21/1064860/india-facial-recognition-police-surveillance/
China: https://www.wired.com/story/china-surveillance/
UK: https://news.sky.com/story/police-trial-facial-recognition-tech-that-wrongly-targeted-black-people-study-finds-12806364