Frankenstein and AI: Innovation without accountability

Artificial Intelligence bears uncanny parallels with a gothic novel written more than two centuries ago that reveals a fundamental truth: Creation and responsibility are inseparable, writes Paul Budde.
IN 1818, a young Mary Shelley published Frankenstein, a novel read today less as gothic fiction than as a warning about technological ambition.
At its center is Victor Frankenstein, a scientist who managed to create life but could not take responsibility for it. His downfall is not the act of invention, but his refusal to deal with the consequences.
This failure ends in disaster.
Shelley was writing during the Enlightenment, an age built on faith in reason and progress. This belief still shapes the way we think about technology today. We are told that technology is neutral, just a tool, and that its effects depend only on how it is used.
But this is becoming increasingly untenable.
Technologies are shaped by the systems (commercial incentives, political interests, and social dynamics) that create them. Once deployed at scale, they are not neutral. They actively shape behavior, institutions, and even reality itself.
Artificial intelligence is a case in point
Artificial intelligence systems are being developed at an extraordinary pace, with capabilities emerging that are not fully understood. But the structures driving this development are dominated by competition: speed, scale and market control. Ethics is considered, but rarely decisive.
The result is a widening accountability gap.
No single actor can fully bear the consequences. Governments are falling behind, regulations are reactive, and companies are working under pressure to avoid falling behind their competitors. Responsibility is dispersed and therefore weakened.
We’ve seen this before
The rise of social media platforms was initially celebrated as a force for connection and democratisation. Instead, it has contributed to misinformation, polarisation and the erosion of democratic norms. These consequences were not entirely unforeseeable, but the warnings were not taken seriously when it mattered.
Even now, meaningful reforms are struggling against business models built on engagement and growth.
This isn’t just a failure of foresight. This is a failure of responsibility.
Shelley’s insight remains strikingly timely. Frankenstein’s mistake was not innovation, but abandonment. He created something powerful and then stepped back, leaving society to deal with the consequences.
Today, this pattern has become institutionalised. Innovation moves quickly; responsibility follows slowly, if at all.
If technology is not neutral, responsibility cannot be optional.
This requires a change in thinking. Ethical considerations should not be added later, but should be included from the beginning. Regulation should anticipate, not react. And we need to question the assumption that faster innovation is always better.
More than two centuries ago, Shelley captured a fundamental truth: Creation and responsibility are inseparable.
In the age of artificial intelligence, this truth has become urgent.
We are no longer dealing with isolated inventions, but with systems that shape societies. If we continue to innovate without accountability, we risk repeating Frankenstein's mistake, this time not in fiction but at scale.
The question is no longer whether we can develop these technologies.
It’s whether we’re ready to take responsibility for what happens next.
Paul Budde is an IA columnist and managing director of independent telecommunications research and consultancy company Paul Budde Consulting. You can follow Paul on Twitter @PaulBudde.
Support independent journalism Subscribe to IA.