Your Dignity Is Not a Product
The "I agree" button is broken. It's time to move past failed consent models and build the next generation of products rooted in dignity and trust.
In this series, we've traced the long arc of privacy, from the 'invisible walls' of ancient hunter-gatherer societies to the physical walls of the 19th-century home. We’ve seen how our need for a private sphere is not a modern fad, but a deep human instinct that has been shaped and reshaped by technology and culture. Now, we arrive at the frontline, the place where all this history collides with the products we build every day.
A few years ago, I was helping my dad set up his first smartphone. We were installing a simple weather app, and a permission box popped up: "Allow WeatherApp to access your contacts?" My dad, a retired engineer, was baffled. "Why on earth," he asked, "does the weather app need to know who my friends are?" I mumbled something about ad targeting, but it was a perfect example of the "privacy mismatch" that defines our modern lives: a fundamental disconnect where our instincts are out of sync with our technological reality.
We are living in a "digital panopticon," an era of pervasive, persistent, and largely invisible surveillance by corporations and states. The very colleagues who tell me "privacy is dead" are often the ones building the systems that make my dad's question so depressingly common. They see data as a resource. But my dad’s question comes from a different place. He was asking about dignity. And that distinction lies at the heart of our current crisis. The future of privacy hinges on our ability to move past the cynical, "privacy is dead" mindset and embrace a more global, dignity-oriented view.
The Two Wests: Liberty vs. Dignity
We often talk about a "Western" idea of privacy, but that's a dangerous oversimplification. In reality, the West is a house divided, and this schism has massive implications for how tech is regulated globally.
The Anglo-American tradition is rooted in a deep suspicion of the state, a legacy of its 18th-century revolution. It’s encoded in the Fourth Amendment's protection against "unreasonable searches and seizures" and the maxim that "a man's home is his castle". Its motto, articulated by Samuel Warren and Louis Brandeis in 1890, is "the right to be let alone". In this tradition, the primary enemy of privacy is the government. This gives us strong protections against government surveillance but leaves us incredibly vulnerable to intrusions from the private sector, like corporations or the media. Freedom of speech is often prioritized over privacy claims, and the right to one's own image is frequently treated as a commercial property right.
The Continental European tradition, especially in Germany and France, grew from a different root: the protection of personal honor, reputation, and dignity. It democratized aristocratic codes of conduct, extending the right to be treated with respect to every citizen. The core value isn't liberty from the state, but the protection of one’s personality from public humiliation or unwanted exposure. The prime enemy is often the press or any entity that disseminates personal information.
This "dignity-oriented" model leads to a completely different world. It’s the world that produced the General Data Protection Regulation (GDPR), which grants individuals robust rights over their personal data, including the "right to be forgotten". It treats one's image as an inalienable aspect of personality, not a commodity. This clash of cultures—liberty vs. dignity—is playing out every day in the fight to regulate Big Tech.
The Wisdom of the 'We'
The Western schism is only half the story. Both of its models are intensely individualistic. But many Eastern traditions offer a profound alternative: the concept of a "relational self". Here, a person isn’t an isolated atom, but is defined by their relationships and duties within a larger social order. Privacy isn't a right to be left alone, but a contextual practice of managing information to maintain social harmony.
In the Confucian tradition, the self is a center of relationships (ruler-subject, parent-child, etc.). The classical Chinese term for privacy, yinsi, even carried negative connotations, suggesting something shameful to be hidden from the community. The goal is to act appropriately within one's roles. Confucius taught that a son concealing the wrongdoing of his father is a form of uprightness, as it prioritizes familial loyalty over abstract justice.
This "relational privacy" is becoming incredibly relevant. My choices online don't just affect me. When I post a group photo, I make a privacy decision for everyone in it. The digital age forces us to see that privacy is a team sport. The purely individualistic, rights-based model is breaking down because our digital lives are so deeply interconnected.
Breaking the Cycle and Fixing the Mismatch
The history of modern privacy is a repeating cycle: a new technology emerges (the Kodak camera, the telephone), creates a social panic, and is eventually followed by legal adaptation. The invention of the affordable camera and "yellow journalism" led to the call for a "right to be let alone". Fears of a government "National Data Center" in the 1960s spurred the first data protection laws in the 1970s.
We are in the latest turn of this cycle. The Snowden revelations (2013) and the Cambridge Analytica scandal (2018) laid bare the scale of the digital panopticon. The legislative response, led by the GDPR, which took effect in 2018, is our generation's attempt to adapt.
But laws alone aren't enough. We are still stuck in that "privacy mismatch." Our brains evolved to handle privacy through tangible cues: closed doors, lowered voices, the physical presence of an observer. The digital world offers none of them. This explains the "privacy paradox": people say they care about privacy, then share their lives online. It's not hypocrisy; it's that the visceral cues that trigger our privacy-protecting instincts are simply absent.
For too long, the tech industry’s answer has been the "notice and consent" framework—those unreadable privacy policies we all agree to. This approach is a resounding failure that offloads liability from data collectors to users.
So what do we do? As builders of this world, we must lead the change.
First, we must champion and adopt strong, GDPR-style regulation as a baseline. These laws rightly shift the burden of protection onto us—the organizations that collect data—demanding privacy by design and data minimization.
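To make data minimization concrete, here is a minimal sketch (the field names and the `minimize` helper are hypothetical, not drawn from any regulation or product): a feature declares the fields it actually needs, and everything else is dropped before the data goes anywhere.

```python
# Hypothetical sketch of data minimization: a weather feature declares
# the only fields it needs, and all other profile data is discarded.
REQUIRED_FIELDS = {"latitude", "longitude", "units"}

def minimize(profile: dict) -> dict:
    """Keep only the fields this feature requires; drop the rest."""
    return {k: v for k, v in profile.items() if k in REQUIRED_FIELDS}

profile = {
    "latitude": 52.52,
    "longitude": 13.40,
    "units": "metric",
    "contacts": ["alice", "bob"],   # never needed for a forecast
    "email": "dad@example.com",
}
print(minimize(profile))  # contacts and email never leave the device
```

The point of the sketch is the default: instead of "collect everything, decide later," the system cannot transmit what the feature never asked for.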
Second, we must build and promote Privacy-Enhancing Technologies (PETs). These are the tools—from end-to-end encryption to privacy-preserving algorithms—that bake privacy directly into the architecture of our systems, offering the benefits of data without demanding the sacrifice of our private selves.
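One classic privacy-preserving algorithm, sketched here for illustration (this is the textbook randomized-response technique, a simple form of local differential privacy, not any specific product's implementation), adds noise on the user's device so that no single answer is revealing, yet the aggregate statistic can still be recovered:

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.75) -> bool:
    """With probability p_honest, report the true answer; otherwise
    answer with a fair coin flip, so no single response is incriminating."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_rate(responses: list, p_honest: float = 0.75) -> float:
    """Invert the known noise to recover the population-level rate:
    observed = p_honest * true_rate + (1 - p_honest) * 0.5"""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_honest) * 0.5) / p_honest

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
truths = [random.random() < 0.30 for _ in range(100_000)]
responses = [randomized_response(t) for t in truths]
print(round(estimate_rate(responses), 3))  # close to 0.30
```

The design choice is the point: the server learns the population rate without ever holding a trustworthy record about any individual, which is exactly the "benefits of data without the sacrifice" trade that PETs aim for.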
The tension between the individual and the collective will never be fully resolved. But the current imbalance is not inevitable. It’s the result of choices we have made—to prioritize engagement over dignity and data extraction over trust. We can make different choices. We can choose to build technology that honors the ancient human need for a space to be let alone.
We can listen to my dad’s simple question and decide that no, the weather app doesn’t need his contacts. And more importantly, we can build a world where it would never even think to ask.