That’s Not My AI
Since ancient times, the human impulse to extend thought through tools has been shadowed by doubt. Socrates warned that writing might erode memory and genuine understanding, offering only the illusion of true wisdom.
Such anxieties remind us that technology is never simply about machines. Each innovation is embedded in social and political life, carrying with it a particular order. Philosopher Lewis Mumford captured this dynamic through his distinction between authoritarian technics – large-scale, hierarchical, and centralized systems that demand rigid organization, mass labor, and concentrated resources under elite or state control – and democratic technics, which are smaller, more flexible, and rooted in local initiative and ecological limits. These are oriented toward cooperation and participation – like bicycles, craft workshops, or community-scale renewable energy systems.1 Mumford’s idea of the megamachine reflects the authoritarian type: a coordinated system of people and tools linked by signals, commands, and roles. Take mining, for example – it draws workers, overseers, transport lines, and entire towns
into a single apparatus for the extraction of minerals.

Military domination and corporate expansion have long driven technological innovation, oriented toward extending power, speed, and profit. The computer was forged in the crucible of World War II and further developed during the Cold War, backed by unprecedented state investment in weapons research.2 Familiar inventions like the internet, GPS, touchscreens, microwaves – even Velcro – emerged from this nexus of militarization and state-backed science. Private sectors such as insurance, manufacturing, and finance were among the first to adopt this new computing power, using it to reorganize production, administration, and circulation. This set the stage for the exponential acceleration we now inhabit, where flows of capital traverse the globe in the blink of an eye, reshaping markets and the rhythms of daily life.3

The Cold War of the 1950s, steeped in paranoia and nuclear anxiety, left a deep mark on the next generation. “Duck and cover” fallout drills became routine, and universities became extensions of the defense industry. Many felt reduced to cogs in a vast war machine – a sense sharpened by the escalation of the Vietnam War. By the 1960s, these pressures helped spark student revolts that soon converged with civil rights struggles, anti-war protests, Native sovereignty movements, feminism, LGBTQ+ activism, and more. Everyday life itself came under scrutiny, giving rise to both anger and new horizons of possibility. Mumford’s thinking resonated strongly in this context, capturing both the alienation of the megamachine and the hope of democratic technics. Robots, it was thought, might one day relieve humans of hard labor and expand leisure.
Marshall McLuhan envisioned a utopian global village4 in which everyone would be connected.

It is telling that the Greek root of the word cyber means “to govern.” As we scroll and click with haptic precision, we hear the affirming sounds of folders opened, files saved, and the wastebasket emptied with a satisfying “crunch” – giving us a pleasurable sense of agency. But deep down, we know that the “desktop” before us is no such thing. It is a graphical user interface – like the mouse itself – born in military laboratories. Beneath the calm surface of friendly pictograms lies an unruly complexity. Code executes in ways that no one, not even its creators, can fully predict.5 Still, what we encounter on the screen appears as choice – even though those choices have already been made for us. Invisible barriers arise; defaults are renamed “Your Preferences” and marketed as “user-friendliness.”6 Algorithms guide and constrain our movement, determining what will appear and what will remain hidden – arranging possibilities, nudging us into predictable paths, shaping our behavior.

Each keystroke, each click becomes data – converted into prediction, which becomes control. Ultimately, it is we who are being governed – not by elected states, but by non-state actors operating with the force of a state.

I recall first opening Google Earth and being struck by its ability to deliver seamless, zoomable imagery of almost any place on the planet.
It effectively rendered the physical world computationally searchable and addressable.7 Google’s mission is “to organize the world’s information and make it universally accessible and useful”8 – a goal that implicitly suggests that organizing information means organizing everything.9 AI systems are now doing something similar with human knowledge, creativity, and decision-making – proving the durability of Socrates’ anxiety: whenever human capacities are outsourced to technology, we risk mistaking storage for knowledge and access for comprehension.

Over 200 years ago, Mary Shelley wrote Frankenstein.10 Set in Switzerland, it is less about a monster with bolts in its head than about a creator’s failure to take responsibility when his invention wreaks havoc. With uncanny foresight, Shelley offered an allegory that anticipates today’s debates on AI – raising enduring questions about autonomy, morality, and what it means to be human in relation to technology.