CSHPM Notes bring scholarly work on the history and philosophy of mathematics to the broader mathematics community. Authors are members of the Canadian Society for History and Philosophy of Mathematics (CSHPM). Comments and suggestions are welcome; they may be directed to either of the column’s co-editors:

Amy Ackerberg-Hastings, Independent Scholar (aackerbe@verizon.net)
Hardy Grant, York University [retired] (hardygrant@yahoo.com)

Figure 1. Ada Lovelace (1815–1852).

On July 25, 2018, in a rare display of consensus, the Senate of the United States passed a resolution “honoring the life and legacy of Ada Lovelace” and “designating October 9, 2018, as ‘National Ada Lovelace Day’.”  This was a somewhat belated recognition, not only of Lovelace herself, but also of the fact that in every year since 2009, the second Tuesday in October has been celebrated worldwide as “Ada Lovelace Day”. The brainchild of Suw Charman-Anderson, Ada Lovelace Day has become an international celebration of the achievements of women in science, technology, engineering and mathematics (STEM) that aims to increase the profile of women in STEM subjects and to encourage more young women into scientific careers.

Today, the name of Ada Lovelace is as recognizable as those of other famous female scientists, including Marie Curie and Rosalind Franklin, and, to the general public, better known than other female mathematicians such as Emmy Noether and Sophie Germain. Yet Lovelace made no famous scientific discovery, proved no mathematical theorem, and died at the age of 36, having published only one paper—which credited her not by name but merely by the initials “A.A.L.” In fact, in her lifetime and for many years after it, the lady whose full name was Augusta Ada King, Countess of Lovelace (see Figure 1) was famous primarily for being the sole legitimate child of the poet Lord Byron.

Her fame today derives from the paper she published in 1843 in a journal called Taylor’s Scientific Memoirs [6]. Strictly speaking, this was a translation of someone else’s paper. The original article, entitled “Notions sur la machine analytique de M. Charles Babbage,” had been published in French the previous year by the Italian engineer Luigi Menabrea, and contained a discussion of a machine, as yet unbuilt, called the analytical engine. This theoretical contraption had been devised by the famous Victorian mathematician, inventor, and polymath Charles Babbage in the 1830s. Had it ever been built, it would have been the world’s first general-purpose computer—100 years before the work of Alan Turing and John von Neumann. Menabrea’s article was intended to explain and promote Babbage’s ideas to the European scientific community; Lovelace’s translation performed the same task for a British audience. But she also wrote seven lengthy appendices, or “Notes”, to her translation, which, at a total of 41 pages, amounted to more than one-and-a-half times the length of the original paper.

It is the last of these notes, Note G, on which her current fame rests. In it, she outlined an iterative process by which Babbage’s machine could, via a series of steps, compute the Bernoulli numbers, an irregular sequence of rational numbers highly useful in number theory and analysis. Although the algorithm she devised was never run and the computer for which it was intended was never built, if Ada Lovelace is remembered for anything today, it is for having written the world’s “first computer program”. (This is despite the fact that what she actually published was closer to what we would now call an execution trace than an actual program. See Figure 2.) Unsurprisingly, interest in Lovelace and her work resurfaced as the era of modern computing began in the 1940s and 1950s. Pioneers such as Alan Turing referenced her paper, and other early writers on computer science paid tribute to her ability. Perhaps the most tangible display of the esteem in which she was held was the U.S. Department of Defense’s choice of the name “Ada” for its new programming language in 1979.
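To give a modern flavor of the kind of stepwise computation Note G describes, here is a short sketch in Python that generates the Bernoulli numbers exactly. It is an illustration only: it uses the standard textbook recurrence obtained from the binomial expansion, not the derivation Lovelace herself employed, and the function name and indexing convention are my own.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the exact Bernoulli numbers [B_0, B_1, ..., B_n].

    Uses the standard binomial recurrence
        sum_{j=0}^{m} C(m+1, j) * B_j = 0   for m >= 1,   B_0 = 1,
    solved at each step for B_m. (This is a modern textbook recurrence,
    not the formula Lovelace used in Note G.)
    """
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        # Known terms of the recurrence, then solve for B_m
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-s, m + 1))
    return B

print(bernoulli(8))  # B_0 .. B_8 as exact fractions
```

In the indexing convention used in Note G, the quantity Lovelace’s chart computes as “B7” corresponds (allowing for the shift in numbering) to the modern B_8 = −1/30, which the sketch above reproduces.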

Figure 2. Chart that accompanied Note G in Lovelace’s translation of Menabrea’s paper.

Since the 1980s, though, evaluations of Lovelace’s scientific ability have been more mixed, with some authors claiming that her command of mathematics was actually rather limited and pointing to various algebraic errors as “evidence of the tenuousness with which she grasped the subject of mathematics” [7, p. 90].  The most forthright even described Lovelace as “mad as a hatter . . . with the most amazing delusions about her own talents”, calling her “the most overrated figure in the history of computing” [2, preface]. Yet for every study in which she is portrayed as a charlatan, there is another in which she is described as “a synthesizer and a visionary [who] saw the need for a mathematical and scientific language which was more expressive and which incorporated imagination” [8, p. 2]. To provide a more balanced estimation of Lovelace’s mathematical abilities, recent research has aimed to shed more light on precisely what mathematics Ada Lovelace actually studied in order to ultimately produce her famous paper of 1843.

This research, undertaken by a team comprising Chris Hollings and Ursula Martin of the University of Oxford and myself, focused on the 66 surviving letters from an eighteen-month correspondence course that Lovelace undertook in 1840–41 under the tutelage of the British mathematician and logician Augustus De Morgan. During this period, De Morgan introduced Lovelace to a large segment of what then comprised an undergraduate course in mathematics, since women were not permitted to receive a formal university education at that time. From basic algebra and trigonometry, she progressed through functional equations, calculus, and differential equations, even reading some of De Morgan’s own research papers. Their letters from this period show her to have been a tremendously keen and capable student, although certainly prone to the usual beginner’s mistakes and misapprehensions. Our study differs from previous ones in delving into the actual details of the mathematics that Lovelace was studying with De Morgan. It reveals that, far from being mathematically limited, she in fact had very strong mathematical skills, together with an inquiring mind that led her to pose questions and speculations quite unlike the usual sort of enquiries De Morgan received from his (male) students.

For a start, Lovelace had a keen eye for detail, spotting several typos and other errors in De Morgan’s published works. Charles Babbage later recalled that, during the composition of the 1843 paper, when he provided the underlying algebra for the Bernoulli numbers algorithm, Lovelace had “detected a grave mistake which I had made in the process” [1, p. 136]. This critical eye resulted in significant independence of thought throughout her studies, for example when she refused to accept De Morgan’s proof of the binomial theorem because of its reliance on the so-called “Principle of the Permanence of Equivalent Forms”, an unproved (and now discredited) assumption then commonly used in algebra. It also led her to a prescient speculation, prompted by her introduction to the two-dimensional representation of complex numbers: “It cannot help striking me that this extension of Algebra ought to lead to a further extension similar in nature, to Geometry in Three-Dimensions; & that again perhaps to a further extension into some unknown region, & so on ad-infinitum possibly” [3, p. 219]. This was a strikingly accurate prediction, foreshadowing by two years the discovery of quaternions, which in turn gave rise to vectors, now used in the study of n-dimensional space. For a relative beginner in mathematics, Lovelace showed remarkable foresight.

Figure 3. Cover of Ada Lovelace: The Making of a Computer Scientist.

Lovelace’s correspondence course with De Morgan appears to have ended in late 1841, or possibly early 1842, but by that time she had learned all the mathematics necessary for her computational algorithm for the Bernoulli numbers: the algebra of functions, infinite series, and the calculus of finite differences. By the summer of 1843, as she wrote in a letter to Babbage, she was working “like the Devil” [8, p. 216] on her paper on his analytical engine. It was published in September, and Lovelace wrote excitedly about what further mathematical projects she would like to undertake in the future. She had already expressed an interest in the mathematical analysis of games like solitaire, and in 1844 she wrote of her hope to “bequeath to the generations a Calculus of the Nervous System” [3, p. 228]. But none of these grand ideas were realized. Her subsequent years were plagued by ill health and financial worries. By 1852, her condition had worsened and it was discovered that she was suffering from cancer of the uterus. She finally succumbed on 27 November of that year.

Our research into Ada Lovelace has not only revealed far more detail about the actual mathematics she studied, but our study of the original manuscripts has also helped to restore her mathematical reputation by revealing some key historical errors made by earlier scholars. The details can be found in our two papers [3] and [4], while those looking for an easy read (or a gift for a non-mathematical friend!) might enjoy our expository book [5], lavishly illustrated with over 50 color images relating to her life and work (see Figure 3). Finally, for those who really like to get their hands dirty, high-quality images (plus transcriptions) of all of the letters in the Lovelace-De Morgan correspondence may be viewed online at: https://www.claymath.org/content/correspondence-de-morgan-0.

This recent research—plus the many other publications that continue to appear on the subject—attests to the fact that the life and work of Ada Lovelace are still of great interest to mathematicians, computer scientists, and the public at large. So perhaps her greatest mathematical achievement is that she continues to attract scholarly attention, not only in the mathematics she actually produced, but in the possibilities of what might have been.

Adrian Rice is the Dorothy and Muscoe Garnett Professor of Mathematics at Randolph-Macon College in Ashland, Virginia, USA. His research focuses on the history of mathematics, specifically the development of algebra, analysis and logic in 19th- and early 20th-century Britain. He was awarded the Paul R. Halmos-Lester R. Ford Award for expository excellence by the Mathematical Association of America in 2019 for his article “Partnership, Partition, and Proof: The Path to the Hardy-Ramanujan Partition Formula”, published in The American Mathematical Monthly in 2018.


[1] Babbage, C. (1864) Passages from the Life of a Philosopher. Longman, Green, Longman, Roberts, & Green.

[2] Collier, B. (1990) The Little Engines That Could’ve: The Calculating Machines of Charles Babbage. Garland Publishing.

[3] Hollings, C., U. Martin, and A. Rice. (2017) The Lovelace-De Morgan mathematical correspondence: A critical re-appraisal. Historia Mathematica 44, 202–231.

[4] Hollings, C., U. Martin, and A. Rice. (2017) The early mathematical education of Ada Lovelace. BSHM Bulletin: Journal of the British Society for the History of Mathematics 32, 221–234.

[5] Hollings, C., U. Martin, and A. Rice. (2018) Ada Lovelace: The Making of a Computer Scientist. Bodleian Library Press.

[6] [Lovelace, A. A., trans. and ed.] (1843) Sketch of the Analytical Engine invented by Charles Babbage Esq. By L. F. Menabrea, of Turin, officer of the Military Engineers, with notes upon the memoir by the translator. Taylor’s Scientific Memoirs 3, 666–731.

[7] Stein, D. (1985) Ada: A Life and a Legacy. MIT Press.

[8] Toole, B. A. (1992) Ada, the Enchantress of Numbers. Strawberry Press.