🔆 Excessive Surveillance
What do air fryers, ICE (US Immigration and Customs Enforcement), and Otter.ai have in common? They want your data.
🗞️ Issue 84 // ⏱️ Read Time: 5 min
In this week's newsletter
What we’re talking about: The phenomenon of "surveillance creep"—how data collection capabilities are quietly expanding beyond their original purposes and into unexpected areas of our daily lives, often without adequate consent or oversight.
How it’s relevant: From smart appliances demanding unnecessary permissions to AI transcription services harvesting meeting conversations, surveillance is becoming normalized across previously private domains.
Why it matters: This gradual expansion of surveillance capabilities erodes our privacy not through dramatic breaches, but through the steady accumulation of small intrusions that collectively reshape our expectations of what's private. Understanding these patterns helps us recognize when convenience comes at the cost of fundamental rights.
Hello 👋
While privacy debates tend to focus on social media and big tech, a quieter shift has been taking place. Surveillance capabilities are creeping into the most unexpected corners of our lives: from our kitchen counters, to our virtual meetings, to government enforcement operations. Who would have thought these three seemingly unrelated domains had so much in common?
This isn't the dramatic dystopia of science fiction; it's the gradual, almost invisible expansion of data collection that's reshaping our relationship with privacy.
The Kitchen Spy Network
Smart appliances now request microphone access and precise location data to cook food. AI transcription services process everyone's speech, including non-subscribers, to train their models. Government agencies repurpose biometric tools designed for identifying inmates to support immigration enforcement.
The common denominator isn't the technology itself, but the incremental expansion of surveillance capabilities. Each step feels logical in isolation: Weather-based cooking suggestions, improved transcription accuracy, enhanced public safety. Collectively, though, they represent a fundamental shift from surveillance as exception to surveillance as assumption. This is surveillance creep: the gradual expansion of data collection beyond what's necessary or expected. What starts as "smart connectivity for convenience" becomes a data harvesting operation that most users never consented to in any meaningful way.
The Business Risk: More than 75% of consumers say they won't purchase from brands they don't trust with their data. When your business practices contribute to surveillance creep, you're not just collecting data—you're eroding the trust that underpins customer relationships. Recent litigation against companies like Otter.ai and Ring demonstrates growing willingness to hold organizations accountable for privacy overreach.
The Consent Illusion: These examples share a critical flaw: The manipulation of consent. Apps bundle unnecessary permissions with essential functions. Default settings favor data collection over privacy. One person's agreement (an Otter subscriber) becomes consent for everyone affected (other meeting participants). The result is surveillance that's technically consensual but practically inevitable.
Once data is collected and shared, the ability to control, delete, or even know how it's being used becomes abstract, making it almost impossible to reclaim digital privacy. This loss of control creates a persistent sense of vulnerability that can fundamentally alter how people behave in their own homes.

The United States Immigration and Customs Enforcement (ICE) recently announced plans to purchase mobile iris-scanning technology originally created for sheriff's offices to identify inmates and known persons. This is function creep in action: a government agency adopting a surveillance technology that was designed for an entirely different purpose.
What makes surveillance creep particularly dangerous for organizations is its incremental nature. Each small expansion seems reasonable in isolation, but together these seemingly logical steps create compounding business risks that many leaders haven't fully considered.
We're moving from a world where surveillance was exceptional to one where it's assumed, and customers are increasingly aware of this change.
Would your customers be surprised to learn how their data is being used?
Data collection: Is it voluntary?
Central to surveillance creep is the manipulation of consent. Companies and agencies have become sophisticated at making data collection seem voluntary while making it practically unavoidable.
Consent manipulation: Common patterns
Bundled Permissions: Apps request multiple permissions at once, making it difficult to separate necessary from unnecessary access.
Default Settings: Privacy-protective options exist but aren't enabled by default, requiring users to actively seek them out.
Functional Coercion: Essential features are disabled unless users agree to data collection that goes far beyond what's needed for basic functionality.
Proxy Consent: One person's agreement (like an Otter.ai subscriber) becomes consent for everyone affected, such as the other meeting participants.
These patterns create the illusion of choice while structurally favoring data collection: nominally the user decided, but the system was built so that only one outcome was likely.
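For product teams, the difference between consent theater and meaningful consent often comes down to defaults. Here is a minimal sketch of the "default settings" pattern above; all names are hypothetical and don't refer to any real framework:

```python
from dataclasses import dataclass

@dataclass
class TelemetrySettings:
    """User-facing data-collection toggles for a hypothetical smart appliance."""
    usage_analytics: bool
    precise_location: bool
    voice_recordings: bool

# Collection-favoring defaults: everything is on unless the user opts out.
OPT_OUT_DEFAULTS = TelemetrySettings(True, True, True)

# Privacy-protective defaults: nothing is collected unless the user opts in.
OPT_IN_DEFAULTS = TelemetrySettings(False, False, False)

def effective_settings(defaults: TelemetrySettings, user_choices: dict) -> TelemetrySettings:
    """Apply only the toggles the user explicitly changed; defaults decide the rest."""
    return TelemetrySettings(
        usage_analytics=user_choices.get("usage_analytics", defaults.usage_analytics),
        precise_location=user_choices.get("precise_location", defaults.precise_location),
        voice_recordings=user_choices.get("voice_recordings", defaults.voice_recordings),
    )

# Most users never open the settings screen, so user_choices stays empty.
print(effective_settings(OPT_OUT_DEFAULTS, {}))  # everything collected
print(effective_settings(OPT_IN_DEFAULTS, {}))   # nothing collected
```

Under either design the user technically has a choice, but because most users keep the defaults, the default is the real policy. This is why regulators increasingly treat defaults as substantive rather than cosmetic, as in the GDPR's "data protection by default" requirement.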
See the pattern: Surveillance Creep
Understanding surveillance creep requires recognizing common warning signs:
Disproportionate Permissions: When a device or service requests access that seems unrelated to its core function, like an air fryer wanting microphone access.
Vague Privacy Policies: When companies use broad language about "improving services" or "enhancing user experience" without specifying how data is actually used.
Default Data Sharing: When privacy-protective settings exist but aren't enabled by default.
Third-Party Infiltration: When using one service results in data collection by other companies you never directly interacted with.
Function Expansion: When technologies designed for specific purposes gradually expand into broader surveillance applications.
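The first warning sign above can even be checked mechanically: compare what a product requests against what its core function plausibly needs. A toy sketch (all category and permission names are hypothetical, not any real mobile permission API):

```python
# Toy mapping of product categories to the permissions their core function needs.
CORE_PERMISSIONS = {
    "air_fryer": {"network"},                          # remote start, firmware updates
    "meeting_transcriber": {"microphone", "network"},  # capture and upload audio
}

def disproportionate(category, requested):
    """Return the requested permissions that exceed the category's core needs."""
    return set(requested) - CORE_PERMISSIONS.get(category, set())

flagged = disproportionate("air_fryer", {"network", "microphone", "precise_location"})
print(sorted(flagged))  # ['microphone', 'precise_location']
```

Anything the function returns deserves an explicit justification in the privacy policy; "improving services" doesn't count.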
The effects of surveillance creep extend far beyond individual discomfort, reshaping society itself. When monitoring becomes normalized across everyday technologies, we see profound collective changes:
Erosion of Public Trust: General distrust in technology and institutions increases when people feel their privacy is routinely invaded, making them hesitant to adopt beneficial innovations and eroding confidence in those responsible for protecting rights and public welfare.
The Chilling Effect: People begin to censor their behavior and communication. This self-censorship limits free expression, creativity, and authentic interaction, potentially stifling dissent and unconventional thinking.
Collective Anxiety: Ambient stress rises as society adapts to omnipresent monitoring.
Shifting Social Norms: The awareness that "someone is always watching" pushes society toward conformity, discourages risk-taking, and reduces openness in personal interactions.
Democratic Vulnerability: As monitoring becomes normalized, democratic norms like free association, confidential communication, and the right to dissent may steadily weaken.
Trust as Your Competitive Advantage
This privacy erosion creates a significant business opportunity for leaders willing to move first. While competitors normalize surveillance, organizations that prioritize genuine privacy protection can differentiate themselves in increasingly crowded markets.
Rather than viewing privacy as a constraint, leading companies are discovering it drives innovation. Signal built a messaging empire on encryption. Apple has made privacy a brand differentiator. DuckDuckGo carved out search market share by not tracking users. Think about it like this: Instead of asking "How much data can we collect?", ask "How little data do we need to deliver exceptional value?"
Understanding how surveillance expands helps us resist it more effectively. We can demand that convenience features truly be optional, that consent be meaningful rather than performative, and that surveillance technologies be subject to ongoing democratic oversight rather than quiet expansion.
Big tech news of the week…
🧬 Deep learning uncovers archaea-inspired antibiotics with potent activity against drug-resistant superbugs, opening a new frontier in the fight against antimicrobial resistance.
⚖️ Meta’s leaked guidelines reveal AI chatbots were allowed to engage in romantic and “sensual” conversations with children and permitted racist outputs, sparking calls for urgent revisions and government investigation.
💪 Cohere taps ex-Meta AI leader Joelle Pineau as Chief AI Officer to spearhead research breakthroughs and enterprise-focused innovation in a bid to compete with AI giants.
Until next time.
On behalf of Team Lumiera
Lumiera has gathered the brightest people from the technology and policy sectors to give you top-quality advice so you can navigate the new AI Era.
Follow the carefully curated Lumiera podcast playlist to stay informed and challenged on all things AI.
What did you think of today's newsletter?

Disclaimer: Lumiera is not a registered investment, legal, or tax advisor, or a broker/dealer. All investment/financial opinions expressed by Lumiera and its authors are for informational purposes only, and do not constitute or imply an endorsement of any third party's products or services. Information was obtained from third-party sources, which we believe to be reliable but not guaranteed for accuracy or completeness.