Your Home Is A Data Factory

Your router is logging which devices are on your network, what they're named, and which networks your neighbors use. Your television is taking a screenshot of whatever is on screen — every 500 milliseconds, twice per second — and matching it against a database to build a profile of your viewing habits. Your smart speaker is listening for a wake word that may or may not be the only thing it's listening for. Your phone knows where you slept last night, how long you were there, and whether you left the house before 8am.

None of this was in the box.

This is the state of the connected home in 2026. Not a dystopian projection. Not a conspiracy theory. A documented, legally contested, occasionally fined, and persistently profitable system of data extraction that runs through every device you own — quietly, continuously, and in most cases, with your consent buried somewhere on page 47 of a terms of service document you clicked through in thirty seconds.

The Deal We All Made

In 2006, Facebook was free. And we knew why.

Not in the legal sense — nobody read the terms of service. But in the intuitive sense, we understood the exchange. You share your life, they show you ads. Your holiday photos reach your family across the continent. Your mother sees your dinner and types "eet smakelijk" (enjoy your meal). The algorithm learns you like cycling and shows you cycling gear. Fine. Reasonable. A trade you could see.

Google Maps showed you the fastest route. Gmail organized your inbox. YouTube had every music video ever made. All free. All powered by the same basic logic: your attention and your data in exchange for services that would otherwise cost hundreds of euros a year.

We knew. Not everything — not the depth, not the permanence, not the resale, not the political targeting, not the mental health research that Facebook conducted on users without consent, not the location data sold to law enforcement, not the shadow profiles built on people who had never signed up at all. We did not know those things. But we knew the broad shape of the deal and we accepted it.

That deal, and the data infrastructure it funded, also built something else entirely. The investment in understanding human behavior at scale — billions of data points, trillions of interactions, decades of pattern recognition — produced the machine learning breakthroughs that became the AI models you use today. The recommendation engine that learned what you'd watch next became the transformer architecture that powers the tools that write, think, and reason alongside you now.

You are reading this because that data existed. The AI that helped research and shape this piece exists because of it too. That is true and worth acknowledging.

But here is what changed.

The deal was Facebook knowing you liked cycling. What we have now is your router knowing the name of every device in your bedroom. Your television watching you watch it. Your home speaker storing your voice. Your phone tracking your location through the night. These are not advertisements. This is not a timeline algorithm. This is the infrastructure of your private life, quietly mapped and monetized without your meaningful knowledge — and in the case of Odido and Lifemote, without even the fig leaf of a privacy statement that covers what was actually happening.

The deal we made with Facebook set a precedent. The industry took that precedent and ran with it — through every device, into every room, past every boundary we thought we understood.

That is where we are now.


The Router That Told Strangers Your Name

Let's start close to home. Literally.

Odido — one of the Netherlands' largest telecom providers — spent at least three years sending data from its customers' routers to a company called Lifemote. Not just connection quality data. Not just traffic statistics. The MAC addresses of every device on your home network. The names of those devices. Names like "Laptop_CEO_Financiën." "iPhone van Marie." "Smart TV slaapkamer."

The names you gave your devices. Sent, without your knowledge, to a company you had never heard of.

Lifemote is not a household name. It is a small AI startup — Turkish, not American as initially reported — that sells network intelligence services to telecom providers. Its pitch to operators: we analyze what's on your customers' home networks and tell you how to optimize their experience. Its pitch to data brokers: we know what devices are in 35 million European homes.

When security researcher Sipke Mellema published his findings on March 3, 2026, Odido patched it within five days. Quietly. No press release. No customer notification. No apology. Just a patch, and a redirect to the privacy statement on their website.

The privacy statement that did not mention Lifemote. That listed MAC addresses as data collected from the modem — not that they were sent to a third party in Turkey. That stated Odido does not sell personal data to third parties. A statement flatly contradicted by the Dutch Data Protection Authority's position that "MAC addresses are personal data and may not simply be shared."

Odido said, in effect: our privacy statement, which we comply with, is on our website. They pointed to a document that did not cover what they had done. That is not a defense. That is an instruction to lawyers.

This is Odido's third privacy incident in a matter of weeks. First: a cyberattack in which millions of customer records were stolen — including data of customers who had not been with Odido for a decade, data that should have been deleted years ago under Odido's own stated retention policy. Second: this. The Lifemote data stream running silently for three years. Third, still unresolved: questions about whether similar data flows exist that have not yet been discovered.

Odido is not unique. Lifemote's customer list includes Telia, A1 Austria, and online.nl. The same service, running in the background, in homes across Sweden, Austria, and the Netherlands. The question is not whether other providers do similar things. The question is whether we have found them yet.

The Television That Watches You Back

While your router was talking to Lifemote, your television was doing something more intimate.

Samsung's Automatic Content Recognition technology — built into every Samsung smart TV, enabled by default on most models — takes a screenshot of your screen every 500 milliseconds. Not of your streaming interface. Of whatever is on your screen. Cable television. A film from a USB drive. Your laptop mirrored to the TV. Every 500 milliseconds, a fingerprint is taken, matched against a database, and used to build a profile of what you watch, when you watch it, and how long you watch it.
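Commercial ACR implementations are proprietary, but the underlying technique, perceptual fingerprinting, is well understood. A minimal sketch of the idea: reduce a frame to a compact bit-string hash, then match it against a database by Hamming distance. The frames, hashes, and "database" below are invented for illustration; real systems use far richer fingerprints and databases of millions of titles.

```python
# Toy sketch of perceptual fingerprinting, the technique behind ACR.
# All frames, hashes, and database entries here are invented.

def average_hash(frame):
    """Reduce a grayscale frame (2D list of 0-255 values) to a bit string:
    each pixel becomes 1 if it is brighter than the frame's mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Count differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

def match(fingerprint, database, max_distance=2):
    """Return the best-matching known content, or None if nothing is close."""
    best = min(database.items(), key=lambda kv: hamming(fingerprint, kv[1]))
    return best[0] if hamming(fingerprint, best[1]) <= max_distance else None

# Invented 4x4 "frame": a dark scene with a bright top-left corner.
frame = [[200, 200, 10, 10],
         [200, 200, 10, 10],
         [10, 10, 10, 10],
         [10, 10, 10, 10]]

database = {
    "episode_s01e04": "1100110000000000",
    "commercial_421": "0000111100001111",
}

print(match(average_hash(frame), database))  # prints episode_s01e04
```

Run twice per second against a large enough database, a hash this cheap is sufficient to reconstruct a complete viewing timeline. That is the entire trick.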

This profile is sold. To Google. To X, formerly Twitter. To advertising platforms whose names you would not recognize but whose algorithms definitely recognize you.

In January 2026, five Samsung TV owners filed a federal class action in New York, alleging Samsung violated the Video Privacy Protection Act. Texas Attorney General Ken Paxton filed separate lawsuits against five TV manufacturers — Samsung, Sony, LG, Hisense, and TCL — describing ACR as "an uninvited, invisible digital invader" that transformed "millions of American living rooms into mass surveillance systems."

LG does the same thing under a different name. It is called Live Plus, and the data goes to a company called Alphonso. The television contacts alphonso.tv — not a domain you authorized — with the content of your screen. LG's privacy policy explicitly notes that viewing history may be sold or shared with third parties. It does note this. In the privacy policy. That nobody reads.
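For readers who run their own DNS (dnsmasq on a router, or a Pi-hole-style resolver), one practical countermeasure is to sinkhole the beacon domain so the TV's reports never resolve. A sketch, assuming the TV actually uses your router's DNS rather than a hard-coded resolver, which some sets do:

```
# /etc/dnsmasq.d/tv-trackers.conf
# Resolve the ACR beacon domain named in the LG case to an unroutable
# address. Other manufacturers use other domains; capture your own TV's
# DNS queries to find them.
address=/alphonso.tv/0.0.0.0
```

This blocks one data flow from one manufacturer. It does not touch the flows that never pass through your DNS at all.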

Here is the business model, stated plainly: Vizio, another smart TV manufacturer, generated twice as much revenue selling customer viewing data as it did from selling the actual televisions. The hardware is subsidized by your data. You pay for the TV once. The TV pays its manufacturer back, continuously, for as long as it sits in your living room.

Samsung settled with the Texas Attorney General. It agreed to stop collecting ACR data from Texas residents without explicit consent, and to rewrite its privacy prompts. For Texas residents. The rest of the world received no such agreement.


The Glasses That Watched You Undress

The smart speaker listens. The television watches. The glasses see everything.

In February 2026, Swedish newspapers Svenska Dagbladet and Göteborgs-Posten published an investigation based on interviews with more than thirty employees at Sama, a data annotation company headquartered in Nairobi, Kenya. Sama's work for Meta: reviewing footage captured by Ray-Ban Meta smart glasses to train the AI system built into the frames.

What those employees saw was not what Meta's marketing described.

"I saw a video where a man puts the glasses on the bedside table and leaves the room," one anonymous Sama employee said. "Shortly afterwards, his wife comes in and changes her clothes."

Others described footage of people having sex, using the bathroom, and undressing — sent to Meta's servers, filtered through a pipeline to Nairobi, reviewed by human workers who were, as one employee put it, "just expected to carry out the work." Another said: "We see everything, from living rooms to naked bodies. Meta has that type of content in its databases."

Meta sold seven million pairs of these glasses in 2025 alone. Its marketing slogan: "Designed for privacy, controlled by you." Its advertisements: "You're in control of your data and content." Its fine print, buried in supplemental terms of service, mentioned that contractors might review content shared with Meta AI. It did not mention Nairobi. It did not describe what those contractors would see.

The pattern is identical to what Odido did with Lifemote, and what Samsung does with ACR. A device enters your home with a privacy promise on the box. The promise is the marketing. The data flow is the business model. The terms of service are the legal shield.

Meta confirmed the practice when pressed. It claimed faces were blurred before review. Employees said the blurring was inconsistent. The UK's Information Commissioner's Office wrote to Meta demanding clarification. A class action lawsuit was filed in the United States on March 5, 2026 — the same week Meta announced plans to add facial recognition to the same glasses "as soon as this year."

The router mapped your house. The television profiled your viewing. The glasses watched you in your bedroom. Each device sold separately. Each privacy violation buried in a different document. Each company pointing to a policy page when asked to explain.

You are wearing a surveillance device on your face. It was marketed as a fashion accessory.

The Fine That Doesn't Hurt

Here is what accountability looks like in 2026.

Since the GDPR came into force in 2018, European regulators have issued approximately €5.88 billion in fines. That sounds significant until you look at who paid them. Meta: €1.2 billion for transferring European data to the US. LinkedIn: €310 million for misusing behavioral data for advertising. TikTok: €530 million for transferring EU user data to China. Uber: €290 million for transferring European driver data to the US. Clearview AI: €30.5 million for scraping facial images without consent.

Every single one of those companies continued operating in Europe the day after the fine was announced. Meta's €1.2 billion fine represented approximately four days of revenue. The fine is the cost of doing business. It is calculated into the spreadsheet. It is cheaper than compliance.

The pattern is consistent and documented. A violation occurs — sometimes for years before discovery. A regulator investigates — sometimes for years before a decision. A fine is issued — calculated as a percentage of turnover that large companies can absorb without changing behavior. The company appeals — buying years of additional time. In some cases, the violation quietly stops. In others, it continues under a different name.

In Croatia, a major telecom operator was fined €4.5 million in November 2025 for sending customer data to Serbia without proper safeguards — and for continuing to do so after being told to stop. In Spain, Vodafone received more than 30 GDPR fines in four years and continued making illegal marketing calls. In the first half of 2025 alone, European regulators issued more than €3 billion in fines. The violations did not stop.

This is not a system of accountability. It is a system of taxation. Pay the fine, continue the extraction.

What Is Actually In Your Home

Let us be specific about what the average connected European household contains and what each device is doing.

Your router maps every device by name and MAC address and, as Odido demonstrated, may be sharing that map with parties unknown. Your router manufacturer also collects data. Your ISP collects data. That data may be shared with third parties your ISP contracted before you became a customer.
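What the router sees is not exotic; any machine on your LAN can read the same map. A sketch that parses `arp -a`-style output into hostname, IP, and MAC triples, exactly the per-device identifiers Odido's routers were forwarding. The sample output is hard-coded here and the hostnames and MAC addresses are invented; on a real machine you would capture it with `subprocess.run(["arp", "-a"], ...)`.

```python
import re

# Sample `arp -a` output. Hostnames and MAC addresses are invented.
SAMPLE = """\
iphone-van-marie.lan (192.168.1.23) at 3c:2e:f9:aa:bb:cc on en0 ifscope [ethernet]
smart-tv-slaapkamer.lan (192.168.1.40) at b8:27:eb:11:22:33 on en0 ifscope [ethernet]
? (192.168.1.1) at f0:9f:c2:44:55:66 on en0 ifscope [ethernet]
"""

LINE = re.compile(r"^(\S+) \((\d+\.\d+\.\d+\.\d+)\) at ([0-9a-f:]{17})", re.M)

def device_map(arp_output):
    """Extract (hostname, ip, mac) tuples: the per-device identifiers
    that were being sent off-network in the Odido case."""
    return LINE.findall(arp_output)

for name, ip, mac in device_map(SAMPLE):
    print(f"{mac}  {ip}  {name}")
```

The point of the exercise: the "network intelligence" being sold is three fields per device. You named the devices. The names did the profiling.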

Your smart TV takes screenshots of your screen twice per second and builds an advertising profile from your viewing habits. The profile is sold. You consented to this in the setup wizard.

Your smart speaker — an Amazon Echo, a Google Home — listens for a wake word and processes requests in the cloud. Recordings are stored. Amazon employs human reviewers who listen to recordings to improve the voice recognition system. You consented to this in the terms of service.

Your smartphone tracks your location continuously, even when you tell it not to, through a combination of GPS, cell tower triangulation, WiFi network detection, and Bluetooth proximity. Your location history is used for advertising. It may be sold to data brokers. In the US, it has been sold to law enforcement without a warrant.

Your smart home devices — thermostats, doorbells, baby monitors, connected appliances — each report back to manufacturer servers. Your Nest thermostat tells Google when you are home, when you sleep, and how warm you keep your house. Your Ring doorbell gives Amazon a video record of everyone who approaches your front door. Your connected washing machine tells its manufacturer how often you do laundry.

None of this is hidden. All of it is in the terms of service. The terms of service that are, on average, longer than Shakespeare's Hamlet and written in language that requires a postgraduate education to fully parse.


The Death of Shadow AI — In Your Living Room

The paper The Great Return described the death of "Shadow AI" in organizational settings — employees using cloud AI tools without oversight, sending sensitive business data to external servers without knowing where it went or who could access it. The paper argued that local AI infrastructure was the structural solution: data that never leaves the building cannot be leaked, subpoenaed, or sold.

The Odido case extends that argument into the living room. Shadow AI is not only a corporate phenomenon. It is a domestic one. The shadow infrastructure of your home — the data flows you did not configure, did not authorize in any meaningful sense, and cannot see — has been running for years.

The router that mapped your household devices. The television that screenshotted your viewing habits. The speaker that stored your voice. The phone that tracked your location. All of this flows, continuously, to servers in Virginia, Seoul, Austin, Istanbul — to companies operating under legal jurisdictions with different privacy standards than the ones you live under, subject to law enforcement requests you will never be informed of, processed by algorithms you cannot inspect.

The question the paper asks for organizations — "why would we send this data to the cloud?" — applies with equal force at home. The answer is the same: because there was no alternative. Until now.

What Changes

The paper's Chapter 8 described the household as the most private domain of local AI deployment — the place where the principle of the "postman at the door" is most viscerally relevant. The AI that lives on your hardware, knows your context, and reaches out to the internet only for specific, bounded requests. The intelligence that is yours because you own the hardware. The data that stays home because there is no cloud to send it to.

This is not science fiction. It runs on hardware available today. A local AI system on an NPU-equipped device — or, for the technically inclined, on a repurposed desktop with a capable CPU — can handle the tasks currently distributed across a dozen cloud services, without a single query leaving your network. Your calendar. Your documents. Your questions. Your household management. Local. Private. Yours.
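"Not a single query leaving your network" is concrete, not rhetorical. A local model runner such as Ollama exposes an HTTP API on the loopback interface, so every request terminates on your own hardware. A sketch using Ollama's documented /api/generate endpoint; the model name is illustrative, and you would substitute whatever model you have pulled locally.

```python
import json
import urllib.request

# Build a request for a local model server (Ollama's /api/generate API).
# "llama3" is an illustrative model name, not a recommendation.
def build_request(prompt, model="llama3"):
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",  # loopback: stays on your machine
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt):
    # The query is answered by hardware you own; nothing is sent upstream.
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Summarize this week's household calendar."))
```

The design choice worth noticing is the address. A cloud assistant differs from this sketch by one line: the URL. Everything else in this chapter follows from where that line points.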

The Odido patch fixed the Lifemote data stream. It did not fix the router manufacturer's own data collection. It did not fix the television's ACR. It did not fix the smart speaker recordings. It did not fix the location tracking. Each of those requires a different fix, from a different company, negotiated through a different legal process.

Or a different infrastructure entirely.

Awareness on its own is not enough. Knowing that your television watches you does not stop it watching you. Knowing that your router maps your household does not stop it being mapped. The knowledge creates the demand for an alternative. The hardware to build that alternative exists. The software to run it is free and open. The only thing missing is the moment when Jan and Marie decide that enough is enough.

That moment, historically, follows a scandal. The Odido story is that scandal. Quietly patched. Barely covered. Easy to miss.

You just read it.


This investigation is part of the research series accompanying The Great Return: Why 2026 Marks the Tipping Point for Local AI Migration in Europe — published February 2026. Full paper: DOI 10.5281/zenodo.18511984