I was tinkering with some old satellite telemetry data last week when a news alert popped up on my tablet about facial recognition software being deployed at airports. Had to laugh – there I was, looking at actual surveillance technology I’d helped build decades ago, while the media treated it like some shocking new development. Thing is, we engineers have been living in the dystopian future for years. We just called it “systems integration.”
See, when you’ve spent four decades designing the technical infrastructure that makes modern surveillance possible, reading Orwell hits different. I remember working on GPS systems in the ’80s, thinking about navigation and scientific applications. Never really considered that someday everyone would carry a tracking device voluntarily and call it a convenience. But that’s how it works – the scary tech doesn’t announce itself with ominous music and storm troopers. It shows up as a software update.
My daughter asked me last month why I keep re-reading *1984* every few years. Good question. I mean, I lived through the actual Cold War, worked on projects where we worried about Soviet satellites photographing our facilities. You’d think fiction about totalitarian surveillance would feel quaint by now. Instead, it feels like a technical manual I helped write without realizing it.
The dystopian novels that still make me uncomfortable aren't the ones with impossible technology. They're the ones where I recognize the engineering principles. When Orwell writes about telescreens, I think about the cameras I've seen integrated into display systems. When Huxley describes soma distribution, I remember designing automated delivery mechanisms for spacecraft life support. The physics works out.
*Fahrenheit 451* used to annoy me, honestly. As an engineer, I kept getting distracted by the impracticalities. What's the kerosene logistics for a fleet of Salamanders? What's the maintenance schedule on the Mechanical Hound? But then I noticed something interesting at my nephew's high school graduation last year. Kids weren't bringing books to read during the boring parts – they were watching videos on their phones. Same effect as burning books, but way more efficient distribution system.
I had coffee with a former colleague recently who’d moved into civilian satellite work. We got talking about *The Handmaid’s Tale* and he made this observation that stuck with me: Atwood didn’t invent any new oppression technology. She just described existing surveillance and control systems operating at full efficiency. From an engineering standpoint, Gilead’s monitoring capabilities are basically what we had in the ’90s, just with the safety protocols turned off.
That’s what gets me about effective dystopian fiction – it shows you your own work from a different angle. I spent years designing communication systems that could track signals, identify sources, monitor transmission patterns. All perfectly legitimate aerospace applications. But the same technology that helps coordinate spacecraft operations can also map human social networks and predict behavior patterns.
*Never Let Me Go* bothered me for weeks after I finished it. Not because of the cloning stuff – that’s just biology, not my department. What bothered me was the psychological conditioning system. How do you convince people to accept their exploitation? You design the information environment they grow up in. You control what they learn, what questions they’re allowed to ask, what alternatives they can imagine. It’s systems engineering applied to human consciousness.
I actually tried an experiment after reading that book. Spent a day documenting every piece of data I generated just by existing. GPS coordinates from my phone, purchase patterns from credit cards, biometric readings from unlocking devices, search queries, email metadata… Filled up most of a legal pad. Thing is, I know how this data gets processed because I’ve worked on the processing systems. The storage capabilities, the analysis algorithms, the correlation techniques – none of it’s science fiction anymore.
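A toy sketch of why that legal pad unnerved me. Every record below is invented for illustration – no real data, and no real analysis pipeline – but even a handful of timestamped metadata events, correlated, is enough to reconstruct a daily routine:

```python
from collections import Counter

# A fabricated day of metadata exhaust, one tuple per event:
# (hour_of_day, source, location). All values are made up.
events = [
    (7, "phone_gps", "home"),
    (8, "transit_card", "station_a"),
    (9, "badge_swipe", "office"),
    (12, "credit_card", "cafe_near_office"),
    (17, "transit_card", "station_a"),
    (18, "credit_card", "grocery"),
    (19, "phone_gps", "home"),
]

# No single record says much. Correlated, they sketch a routine:
# where the subject sleeps, works, shops, and when.
locations_by_hour = {hour: loc for hour, _src, loc in events}
visit_counts = Counter(loc for _h, _s, loc in events)

print("Most-visited location:", visit_counts.most_common(1)[0][0])
print("Active hours:", min(locations_by_hour), "to", max(locations_by_hour))
```

Seven records, a dictionary, and a counter. The real systems work on billions of records with far better math, but the principle is the same one I filled that legal pad with.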
*Black Mirror* episodes crack me up sometimes because they’re basically engineering case studies. That social credit rating system? I’ve seen the database architecture that could support it. The memory recording technology? We’ve had similar data collection capabilities in spacecraft black boxes for decades. The only difference is scale and application.
But here’s where it gets interesting from a technical perspective. The most effective dystopian control systems aren’t the dramatic ones. They’re the ones designed for user acceptance. Make surveillance feel like service. Make compliance feel like convenience. Make monitoring feel like safety. Basic human factors engineering.
*Brave New World* understands this perfectly. Huxley wasn’t worried about force-feeding people drugs. He was worried about designing drugs people would want to take. From an engineering standpoint, that’s a much more elegant solution. Why build expensive enforcement mechanisms when you can build addiction mechanisms instead?
I remember visiting my first grandson last year and watching his parents use all these “smart” monitoring devices. Baby cam with motion sensors, temperature tracking, sleep pattern analysis, automatic alerts for anything unusual. All perfectly reasonable safety features. But I couldn’t stop thinking about how the same monitoring infrastructure could be repurposed if the parameters changed.
*Station Eleven* hit me right in the supply chain anxiety I developed during my aerospace career. We always worried about component failures, backup systems, what happens when critical infrastructure goes down. Mandel gets the cascade effect exactly right – how quickly complex systems can unravel when key nodes fail. I’ve seen it happen in satellite networks. Same principles apply to civilization.
The scary thing about working in aerospace is you learn how fragile our technological systems really are. Everything depends on everything else. Power grids, communication networks, transportation systems, financial systems – they’re all interconnected in ways most people never consider. When I read about societal collapse in fiction, I’m not thinking about the dramatic stuff. I’m thinking about maintenance schedules and spare parts inventory.
*The Circle* reads like a documentation review of every tech company I’ve consulted with since retiring. The mission creep, the feature expansion, the gradual normalization of invasive monitoring presented as user benefits. I’ve sat in meetings where people discussed these exact implementation strategies. The only unrealistic part is how quickly they rolled everything out.
What really gets under my skin about contemporary dystopian fiction is how accurately it predicts engineering decisions I’ve watched colleagues make. Not malicious decisions – practical ones. Efficiency improvements, cost reductions, performance optimizations. But each small technical choice shapes the larger system, and the larger system shapes human behavior.
Sometimes I wonder if we engineers are the real villains in dystopian stories. Not because we’re evil, but because we’re good at solving the technical problems that enable social control. We build systems that work exactly as designed, and sometimes the design specifications come from people with concerning priorities.
But that’s the thing about dystopian fiction that ages well – it’s not really about the future. It’s about recognizing the present moment when technical capability meets political opportunity. The engineering is already here. We’re just waiting to see how it gets used.
John spent forty years designing real spacecraft before turning his attention to fictional ones. Writing from Oregon, he brings a scientist’s curiosity to sci-fi—separating good speculation from bad physics while keeping his sense of wonder firmly intact.