
If Ada Lovelace Lived Today: The AI Theorist Who Saw the Limit First
Ada Lovelace wrote the first published algorithm in 1843 and argued in the same paper that machines cannot truly originate anything. Drop her into 2026 and she is the most rigorous critic in the building - and the building cannot afford to fire her.
Ada Byron grew up as the daughter of the most famous, most disreputable poet in England, raised by a mother who was terrified she would turn out like him. Anne Isabella Milbanke, who separated from Lord Byron when Ada was one month old and spent the next thirty years engineering her daughter's education toward mathematics and away from poetry, described herself later as proud of the result. She had produced, she believed, a woman entirely unlike her father.
She had instead produced someone who combined Byron's intensity and restlessness with a mathematical mind sharp enough to see what Charles Babbage's calculating engines could actually become. Ada Lovelace died in 1852 at 36. She left behind a translated article and seven notes labeled A through G, the last of which contained what historians of computing now identify as the first published algorithm. She also left a description of the Analytical Engine as a general-purpose symbol manipulator - a machine that could do more than arithmetic, a machine that operated on rules rather than numbers. She was approximately a century early.
Drop her into 2026, and she is probably still running about a century ahead.
The historical figure
Augusta Ada Byron was born in December 1815, the only legitimate child of George Gordon Byron and his wife of one year. Byron left England in April 1816 and never returned, never saw his daughter again. He died in Greece in 1824. Ada was eight.
Her mother arranged an intensive scientific education designed to prevent any Byronic inheritance from taking root. Ada learned mathematics, music, and French. She was frequently unwell during childhood - she spent long periods bedridden with what her physicians called various things, and what modern historians have debated at length without resolution. Her mother managed the illnesses by insisting on more lessons. The result was a woman educated with relentless precision and genuinely fascinated by formal systems.
She met Charles Babbage in 1833 at a dinner when she was 17. He showed her his Difference Engine No. 1, a partial prototype of a mechanical calculator. She was captivated in a way that seems to have surprised both of them. She began corresponding with Babbage about mathematics and his plans for the Analytical Engine, a far more ambitious machine that was never built during either of their lifetimes.
In 1842, an Italian mathematician named Luigi Menabrea published a French-language description of the Analytical Engine based on a lecture Babbage had given in Turin. Ada translated it into English and added notes roughly three times the length of the original article. Note G contained a step-by-step method for computing Bernoulli numbers using the Analytical Engine: a complete procedure with defined inputs, defined operations, and error-checking built in. It is the first published algorithm in the modern sense.
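The spirit of that procedure can be sketched in a few lines of modern code. This is a hedged illustration, not Lovelace's actual operation table: her Note G walked the Analytical Engine through a specific sequence of mill operations and used her own indexing of the Bernoulli numbers, while the sketch below uses the standard modern recurrence (the sum of C(m+1, j)·B_j over j from 0 to m equals zero for m ≥ 1, with B_0 = 1) and exact rational arithmetic.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions.

    Uses the standard modern recurrence
        sum_{j=0}^{m} C(m+1, j) * B_j = 0   for m >= 1,  B_0 = 1,
    solved for B_m at each step. A modern sketch of the kind of
    step-by-step procedure Note G laid out, not her exact
    operation sequence or numbering.
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / Fraction(m + 1)
    return B

B = bernoulli(8)
print(B[2], B[4], B[6])  # 1/6 -1/30 1/42
print(B[8])              # -1/30: the value at the end of Note G's worked example,
                         # in modern indexing
```

The use of exact fractions rather than floating point mirrors the character of the original: a fully determined procedure with defined inputs and defined operations, where every intermediate value is checkable.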
What is less often quoted is the context surrounding it. Ada argued in the notes that the Analytical Engine could do anything expressible as symbols and rules - that it was not merely a number machine but a logic machine, one that could in principle compose music. She was describing a general-purpose computer in 1843.
She also argued, explicitly and precisely, that the Engine could not originate anything. It could only do what it was told to do. This limitation, which she stated as a matter of design fact, is now one of the central disputes in artificial intelligence. The question she posed - whether a machine can truly think or only compute - is the same question the field calls the Lovelace Test, and it has not been cleanly resolved.
She died in November 1852 of uterine cancer, in pain, at 36. Her notes were not widely read for decades. Alan Turing discussed them seriously in 1950. The U.S. Department of Defense named a programming language after her in the 1980s.
The modern role
Drop her into 2026: born in 1990, raised by an intensely rational parent determined to counteract the influence of a creatively chaotic and publicly famous one, educated at Cambridge or Imperial College London in mathematics and computer science, with a year at the Santa Fe Institute where the combination of formal methods and complex systems matched her sensibility precisely.
By her mid-thirties she is a research scientist at a major AI laboratory, not in the headline-generating product role but in the foundational theory work that produces no demos, attracts no coverage from the technology press, and underpins everything that does. Her specialty: the formal theory of learning systems, what they can and cannot provably do, where the verifiable limits are.
She publishes infrequently by the standards of a field where preprints appear daily, but each paper is precise and troubling. The most widely read one, published around her thirty-first birthday, argues that several properties large language models are assumed to have cannot be verified from their outputs alone - that distinguishing genuine generalization from sophisticated interpolation at scale is an unsolved problem. The paper earns grudging respect from people who have been quietly worried about the same thing, and dismissal from people whose funding depends on not worrying about it.
The contemporary parallel
The closest living parallel is not simply a woman in technology, which would be the obvious choice but misses the specific combination. Ada Lovelace's particular profile - intense early formation, mathematical theory rather than engineering practice, publicly stated doubt about what machines can actually do, premature departure from the field at a critical moment - maps best onto whoever is currently occupying the role of the foundational AI theorist who contributed significantly to the systems being built and has since become publicly uncomfortable about them.
The specific name changes depending on the year you ask. The role is consistent: someone whose work is cited in the papers of the people building the products, whose concerns are acknowledged in conference talks with phrases like "important open questions," and whose concerns are then set aside in favor of the next benchmark result. Ada Lovelace would recognize this dynamic exactly. She experienced the 1843 version of it.
The family
She marries in her late twenties, brilliantly and somewhat catastrophically, echoing 1835, when the nineteen-year-old Ada wed William King-Noel, later the first Earl of Lovelace, a man eleven years her senior who was genuinely supportive of her work in a way that historical marriages rarely were. The historical Ada also had three children and ran a household while doing the mathematical work, a fact her biographers tend to note and her popular image tends to omit.
The 2026 version: a partner from a technically adjacent world, a marriage that functions better as an intellectual partnership than as a social one, two children who see their mother primarily as someone doing important work on a laptop at unusual hours. The mother-daughter relationship is complicated in the specific way that intensely rational mothers produce complicated daughters, and she is aware of the irony.
She does not, particularly, have a glamorous public profile. She finds the idea of being a "role model for women in tech" irritating not because she objects to the category but because it reduces her to a symbol at the expense of the argument she is trying to make. The argument is more important than the symbol.
The social media problem
Her account on X has 290,000 followers. Half of them are there for the technically unimpeachable critiques of AI company announcements - responses that are rarely more than three sentences long, sourced to actual papers, and more devastating for their brevity. Half of them are there because she occasionally posts something personal and then deletes it: observations about her father's side of the family, thoughts on what it means to be useful to someone, a brief comment about the gap between what a machine can demonstrate and what a machine can know that reads either as a philosophical note or as something more personal, and she doesn't clarify which.
She does not enjoy celebrity. She dislikes conferences. She attends them because the conversations in the corridors are the only ones that match the level she is working at, and because the alternative is isolation.
What goes wrong
Ada Lovelace died at 36. The 2026 version does not die at 36, but she does disappear from the field at approximately that age in a way that everyone in the field registers as a loss. The circumstances vary: a serious illness, a principled departure from the laboratory over a disagreement about deployment decisions, a sabbatical that everyone expects to end and that does not end for years. Whatever the mechanism, the effect is the same.
The questions she raised about expressibility limits and unverifiable generalization are eventually absorbed into mainstream discourse, attributed vaguely to "early critics" without individual credit. A younger researcher, having read the original papers carefully, writes a tribute piece correctly identifying her contribution and noting that it was made before the problem was considered important. The tribute runs to twenty citations of her actual work.
The field, characteristically, does not slow down.
What Ada Lovelace understood in 1843 that has not yet been fully absorbed in 2026 is that the interesting question was never whether the machine could do the computation. The interesting question was what the computation means. The Analytical Engine could calculate Bernoulli numbers. The question she left behind - whether that computation amounted to understanding - is the question the field is still working around rather than through.
She would have opinions about the current state of the argument. She would express them precisely, in public, in three sentences that would be quoted for the next twenty years by people who would not credit her for them. This is also, in its way, exactly what happened the first time.
Quick Answers
Common questions about this topic
Who was Ada Lovelace?
Ada Lovelace (1815-1852) was a British mathematician, the only legitimate child of the poet Lord Byron. She worked with inventor Charles Babbage on his Analytical Engine and in 1843 published a translation of an Italian article about the machine, with notes three times the length of the original. Note G contained what historians of computing identify as the first published algorithm. She also argued that the machine could not originate anything - it could only do what it was programmed to do.
What made Ada Lovelace's contribution significant?
She was the first person to recognize that Babbage's proposed machine could be used for more than arithmetic - that it could manipulate any symbols according to rules, potentially including musical notation. She developed a complete step-by-step procedure (an algorithm) for using it to compute Bernoulli numbers, including error-checking steps. The question she raised about whether machines can truly think or only calculate is still the central debate in artificial intelligence.
What would Ada Lovelace work on today?
Almost certainly the formal theory of machine learning: what learned systems can and cannot provably do, where the verifiable limits are, and whether properties that AI systems appear to have can actually be confirmed from their outputs. She was drawn to the gap between what a machine can compute and what it can be said to understand - and that gap has not closed.
What contemporary figure is most like Ada Lovelace?
The closest parallel is someone who combines deep technical contribution to the foundations of AI with a public willingness to argue that current systems have limits the field is not taking seriously - a theorist whose work is cited by the people building the products and whose concerns are acknowledged briefly and then set aside. The exact name changes depending on which year you ask.


