The Camera You're Wearing to Dinner Is Streaming to a Data Center
Here's the thing Meta doesn't put in the marketing copy for its Ray-Ban smart glasses: when you ask the AI assistant what's in front of you, that moment doesn't stay between you and an algorithm. It goes to a server. It gets processed. And in a lot of cases, a human being in Nairobi, Kenya looks at it.

A joint investigation published last week by the Swedish newspapers Svenska Dagbladet and Göteborgs-Posten interviewed over thirty employees at Sama, a data annotation subcontractor Meta uses to train its AI systems. What those workers described is about as far from "built with your privacy in mind" — which is literally how Meta markets the glasses — as you can get [1].

Workers said they regularly see footage of users in the bathroom, undressing, having sex, or unknowingly filming their partners in private moments. Some described footage of people watching pornography while wearing the glasses. Others flagged exposed bank card details captured as the wearer typed PINs or scanned cards [1][3].

"We see everything — from living rooms to naked bodies," one worker told the Swedish outlets. "Meta has that type of content in its databases." [3]

This isn't a theoretical privacy risk. It's an actual person, in an actual building in Nairobi, watching video of your actual life.
