My Life as a Logician

Martin Davis

“My father and mother were honest, though poor-"

“Skip all that!" cried the Bellman in haste.

“If it once becomes dark, there’s no chance of a Snark-
We have hardly a minute to waste!"

“I skip forty years," said the Baker, in tears,

“And proceed without further remark ..."

-Lewis Carroll’s “The Hunting of the Snark”

Abstract This brief autobiography highlights events that have had a significant effect on my professional development.

I was just over a year old when the great stock market crash occurred. My parents, Polish Jews, had immigrated to the United States after the First World War. My father’s trade was machine embroidery of women’s apparel and bedspreads. During the depression, embroidery was hardly in great demand, so we were dependent on home relief—what today would be called “welfare”. Only with the upturn of the economy that came with the outbreak of war in 1939 was my father able to find steady work. In his spare time, he was a wonderful self-taught painter. (One of his paintings is in the collection of the Jewish Museum in New York and two others are at the Judah Magnes Museum in Berkeley.) My mother, eager to contribute to the family income, taught herself the corsetiere’s craft. Until I left New York for graduate school, the room where she conducted her business by day was my bedroom at night.

This is a revision and expansion to the present of an earlier autobiographical essay [22], much of which is included verbatim. I am grateful to Springer Verlag for their permission.


In the New York City public schools, I was an adequate, but not at all exceptional, student. I’ve always enjoyed writing, but an interest in numbers came early as well. I remember trying to find someone who could teach me long division before I encountered it in school. My parents, whose schooling was minimal, could not help. My first “theorem” was the explanation of a card trick. I learned the trick from a friend who had no idea why it worked; I was delighted to see that I could use the algebra I was being taught in junior high school to explain it.

It was at the Bronx High School of Science that I first found myself with young people who shared the interests I had been developing in mathematics and physics. My burning ambition was to really understand Einstein’s theory of relativity. There were a number of books available in the local public library as well as in the school library, but I couldn’t understand many of the equations. Somehow I got the idea that it was calculus I needed to learn, so I got a textbook and taught myself. When I arrived at City College as a freshman, I was able to begin with advanced calculus. During those years, the math majors at City College were an enthusiastic, talented group, many of whom eventually became professional mathematicians. The faculty, on the other hand, was badly overworked, and, with a few notable exceptions, had long since lost their enthusiasm. Even by the standards of the time, teaching loads were excessive, and none of the usual amenities of academic life (such as offices and secretarial help) were available. Only very few of the most determined faculty members remained active researchers. In addition to these obstacles, Emil Post struggled against physical and psychological handicaps: his left arm had been amputated in childhood and he suffered from periodically disabling manic-depressive disease. Nevertheless, Post not only continued a program of important fundamental research, but also willingly accepted students for special advanced studies on top of his regular teaching load (16 contact hours). I absorbed his belief in the overriding importance of the computability concept and especially of Turing’s formulation.

At City College my academic performance was hardly outstanding. I allowed myself the luxury of working hard only on what interested me. My A grades were in mathematics, German, history, and philosophy. My worst class was a required general biology course. I hated the amount of memorization of names of plant and animal parts I had no desire to know, and found genuinely difficult the “practicums”, in which we were asked to identify specimens we viewed under the microscope. I actually failed the course, and even on the second try only managed a C.

During my Freshman and Sophomore years, my passionate interest was in the foundations of real analysis. I learned various alternate approaches and proofs of the main theorems. I spent weeks working out the convergence behavior of the sequence

$x,\; x^x,\; x^{x^x},\; \ldots$

for $x > 0$. (It converges for $(1/e)^e \le x \le e^{1/e}$. The case $0 < x < 1$ is tricky because, although the even-numbered terms and the odd-numbered terms each converge, when $x < (1/e)^e$ their limits are different.) I liked sequences and saw how to prove that every sequence of real numbers has a monotone subsequence as a way of obtaining the basic theorems. I even wrote quite a few chapters of a proposed textbook.
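A few lines of Python (a quick check of my own, not anything from those undergraduate notebooks) make the behavior visible, including the two different limits of the even- and odd-numbered terms when $x < (1/e)^e$:

def tower(x, n):
    # n-th term of the sequence x, x^x, x^(x^x), ...
    t = x
    for _ in range(n - 1):
        t = x ** t
    return t

for x in (0.5, 0.04):   # 0.04 lies below (1/e)**e, which is roughly 0.066
    print(f"x = {x}: term 200 = {tower(x, 200):.6f}, term 201 = {tower(x, 201):.6f}")

# For x = 0.5 the two terms agree (the sequence converges); for x = 0.04
# they settle on two different limits, as described above.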

My fellow student John Stachel and I began to be interested in logic, and at his suggestion, we approached Post about a reading course in mathematical logic. Thus, in my junior year, we began studying an early version of Alonzo Church’s textbook under Post’s supervision. Unfortunately, it only lasted a few weeks: Post had made his discovery of the existence of incomparable degrees of unsolvability, the excitement precipitated a manic episode, and he was institutionalized. The following year Post was back and we spent a great deal of time talking about logic. He gave me a collection of his reprints and also referred me to Kleene’s paper [37]. This was a paper Kleene had written in haste to get some results in publishable form before he was requisitioned for war work. For me this was a boon because it was written in a relatively informal style quite unlike Kleene’s usual more opaque exposition. I spent a lot of time filling in the gaps, and in the process became enamored of the Herbrand-Gödel-Kleene equation formalism. In considerable part, my dissertation developed from that paper.

Kleene’s paper showed that the sets definable in the language of arithmetic[1] formed a natural hierarchy in terms of alternating strings of quantifiers,

$\exists\forall\exists\cdots \quad\text{or}\quad \forall\exists\forall\cdots,$

applied to a computable relation: each additional quantifier makes it possible to define new sets. This result was applied to give short incisive proofs of Gödel’s incompleteness theorem and the unsolvability results of Alonzo Church.

I would undoubtedly have remained at City College for my graduate studies to work with Post if that option had been available. But City College was strictly an undergraduate school, and I had to look elsewhere. I had offers of financial support from Princeton, where I could work with Church, and from the University of Wisconsin, where Kleene would have been my mentor. Post advised me to go to Princeton, and that is what I did. There was quite a culture clash between my New York Jewish working-class background and the genteel Princeton atmosphere, and at one point it seemed that my financial support would not be renewed for a second year for reasons having nothing to do with my academic performance. Although eventually I was given support for a second year, the unpleasantness made me eager to leave. Fortunately, the requirements at Princeton were sufficiently flexible that it was quite possible to obtain a Ph.D. in just 2 years, and that is what I did.

The problem that I knew would readily yield results was the extension of Kleene’s arithmetic hierarchy into the constructive transfinite, what later became known as the hyperarithmetic sets. Post had shown that the successive layers of Kleene’s hierarchy could also be generated using the jump operator,[2] and it was easy to see how to extend this method into the transfinite. But the problem that I found irresistibly seductive was Hilbert’s tenth problem, the problem of the existence of integer solutions to polynomial Diophantine equations. Post had declared that the problem “begs for an unsolvability proof” and I longed to find one. Not being at all expert in number theory, I thought that it was foolish to spend my time on Diophantine equations when I had a dissertation to write and a sure thing to work on. But I couldn’t keep away from Hilbert’s tenth problem.

Diophantine problems often occur with parameters. In general one can consider a polynomial equation

$p(a_1, \ldots, a_m, x_1, \ldots, x_n) = 0,$

where $p$ is a polynomial with integer coefficients, $a_1, \ldots, a_m$ are parameters whose range is the natural numbers, and $x_1, \ldots, x_n$ are unknowns. I began to study Diophantine sets, that is, sets that could be defined by such an equation as the set of $m$-tuples of values of the parameters for which the corresponding equation has a solution in natural numbers.[3] Another way to say this is that Diophantine sets are those definable by an expression of the form

$(\exists x_1, \ldots, x_n)\,[\,p(a_1, \ldots, a_m, x_1, \ldots, x_n) = 0\,].$

It was not hard to see that the class of Diophantine sets is not only a sub-class of the class of recursively enumerable (r.e.) sets,[4] but also shares a number of important properties with that class. In particular, both classes are easily seen to be closed under union and intersection, and under existential quantification of the defining expressions. A crucial property of the class of r.e. sets, a property that leads to unsolvability results, is that the class is not closed under taking complements. I was quite excited when I realized that the class of Diophantine sets has the same property. This was because if the Diophantine sets were closed under complementation, then the de Morgan relation

$(\forall x)\,\Phi \;\Longleftrightarrow\; \neg(\exists x)\,\neg\Phi$

would lead to the false conclusion that all of the sets in Kleene’s hierarchy, all arithmetically definable sets, are Diophantine. (False because there are arithmetically definable sets that are not r.e. and hence certainly not Diophantine.) Although this proof is quite non-constructive,[5] the result certainly suggested that the classes of r.e. sets and of Diophantine sets might be one and the same. If every r.e. set were indeed Diophantine, there would be a Diophantine set that is not computable, which would lead at once to the unsolvability of Hilbert’s tenth problem in a particularly strong form. So, I began what turned into a 20-year quest, the attempt to prove that every r.e. set is Diophantine, what Yuri Matiyasevich much later called my “daring hypothesis”.
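To make the closure claims just mentioned concrete (a standard observation, included purely as an illustration): if two sets of $m$-tuples are defined by the conditions $(\exists \bar{x})[p(\bar{a},\bar{x}) = 0]$ and $(\exists \bar{y})[q(\bar{a},\bar{y}) = 0]$, with $\bar{x}$ and $\bar{y}$ disjoint lists of unknowns, then their union and intersection are defined by

$(\exists \bar{x}, \bar{y})\,[\,p(\bar{a},\bar{x}) \cdot q(\bar{a},\bar{y}) = 0\,] \quad\text{and}\quad (\exists \bar{x}, \bar{y})\,[\,p(\bar{a},\bar{x})^2 + q(\bar{a},\bar{y})^2 = 0\,],$

respectively, so both are again Diophantine.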

During the summer between my 2 years at Princeton I was able to prove that every r.e. set is definable by an expression of the form

$(\exists y)(\forall k)_{\le y}(\exists x_1, \ldots, x_n)\,[\,p(a_1, \ldots, a_m, k, y, x_1, \ldots, x_n) = 0\,],$

where $p$ is a polynomial with integer coefficients. From a purely formal point of view, this result (later known as “Davis normal form”) seemed tantalizingly close to my conjecture; the only difference was the presence of the bounded universal quantifier $(\forall k)_{\le y}$. However, there was no method in sight for getting rid of this quantifier, and I couldn’t help agreeing with Church’s assessment when he expressed disappointment that the result was not stronger.

Meanwhile, I had a dissertation to write. I didn’t think at the time that my normal form by itself would suffice, although in retrospect I think it likely would have been accepted. In any case, I worked out an extension of Kleene’s hierarchy into the constructive transfinite using Kleene’s system of notations for ordinals.[6] Kleene had defined a set $O$ of natural numbers and a partial well-ordering $<_O$ on this set. Each $m \in O$ represented an ordinal $|m|$, and $m <_O n$ implied $|m| < |n|$.

With each $m \in O$ I associated a set $L_m$ in such a way that $m <_O n$ implied that $L_m$ is computable relative to $L_n$ as oracle, but not the other way around. Then to extend Kleene’s hierarchy, it was only necessary to consider the sets many-one reducible to the $L_m$. I studied their representation in terms of second order quantification and obtained the ridiculously weak result that up to $\omega^2$ all of these sets were indeed so representable.[7] In addition I defined a constructive ordinal $\gamma$ to be a uniqueness ordinal if whenever $|m| = |n| = \gamma$ the Turing degrees of $L_m$ and $L_n$ are the same. I proved that every $\gamma < \omega^2$ is indeed a uniqueness ordinal.[8]

I presented the results from my dissertation in brief talks at two professional meetings. The Diophantine result was given at a small meeting of the Association for Symbolic Logic in Worcester, Massachusetts in December 1949, which I attended with my first wife a few days after our marriage. Eight months later I attended the first post-war International Congress of Mathematicians at Harvard University, and spoke about my results on hyperarithmetic sets. This time I was alone—our marriage had proved short-lived; my wife had left me shortly before the Congress. At the Congress I met the great logician Alfred Tarski who showed considerable interest in my work, and, of particular significance, I also met Julia and Raphael Robinson. I had studied some of their published work, and was very pleased to meet them. I was surprised to find that Julia was presenting a short contributed paper on Diophantine sets. It turned out that we had approached the subject from opposite directions. While I had been trying to find a general representation for arbitrary r.e. sets, as close as possible to a Diophantine definition, she had been seeking such definitions for various particular sets. Her result that turned out to have the most important consequences was that from the existence of a single Diophantine equation with two parameters, one of which grows exponentially as a function of the other, she could obtain a Diophantine definition of exponentiation. In a memoir[9] based on conversations with Julia shortly before her tragic death from leukemia, she is quoted as remembering me saying when we met that I couldn’t see how her work “could help solve Hilbert’s problem, since it was just a series of examples”. I do not want to believe that I said anything so ungracious and so foolish. Julia is also quoted as remembering my “presenting a ten minute paper” at that Congress on my Diophantine results, and as that was not the case, I can comfort myself with the thought that her recollection of what I had said may also have been mistaken.

A few days after the Congress, I was on a plane from New York to Chicago, my first experience of air travel. After considerable difficulty in landing a job in a tight market, with my specialization in logic a definite disadvantage, I had had a stroke of luck. My former fellow student Richard Kadison, having received a coveted National Research Fellowship, turned down the offer from the University of Illinois at Champaign-Urbana to be “Research Instructor”. As their second choice, the position was offered to me, and I was delighted to accept. Research Instructors were expected to be recent Ph.D.’s and were required to teach only one course per semester at a time when the regular faculty taught three. In addition, we were given the opportunity to teach a second graduate course in our own specialty if we wished. I was very happy to take advantage of this possibility: I taught mathematical logic in the fall and recursive function theory in the spring. In this second course, I decided to begin with Turing machines. Kleene had applied Gödel’s methods of arithmetic coding to develop his results for the equation calculus. I saw that the same could be done for Turing machines and that this had certain technical advantages. However, in order to develop the necessary machinery, I had to design Turing machines to carry out various specific operations; without realizing it, I was being a computer programmer!

Edward Moore (later known for his basic work on sequential machines), also a very recent Ph.D. in mathematics, was an auditor in my course. He came up to the front of the room after one of my classes and showed me how one of the Turing programs I had written on the blackboard could be improved. Then he said something very much like the following: “You should come across the street; we’ve got one of those machines there.” In fact a superb engineering group was just finishing a computer called ORDVAC of the “johnniac class” on the University campus. I had been paying no attention to computers, and up to that moment had not considered that Turing’s abstract machines might have some relation to real world computing machines. It would make a better story if I said that the next day I took Ed up on his invitation. But the truth is that it was the Korean War and the hot breath of the draft that led me to take that walk “across the street” some weeks later. It was clear to me that if I remained in my faculty position, I would be inducted into the army, and it was equally clear to me that that was something I wanted to avoid.

A faculty group, led by the physicist Frederick Seitz, determined to contribute to the war effort and convinced of the military significance of automated systems, started a project within the university called the Control Systems Laboratory (C.S.L.). I was recruited for the project and, with the promise of a draft exemption, accepted. My boss was the mathematician Abe Taub, an expert in relativity theory and shock waves. It was a heady time. Norbert Wiener’s Cybernetics, heralding a new age of information and control, had appeared a few years earlier, von Neumann had developed the basic computer architecture still used today and was investigating the use of redundancy to obtain reliable results from unreliable components, and the transistor had just been developed at Bell Labs. There was much discussion of all this at the C.S.L., and after some vacillation, a report from the battlefield on the need for better fighter plane support for the front line troops decided the direction of the first major effort.

A working model was to be produced of an automated system for navigating airplanes in real time. The “brain” of the system was to be the newly constructed ORDVAC. And the job of writing the code fell to me. My instruction in the art of computer programming was delivered by Taub in less than five minutes of “This is how it is done”. I also had as textbook the basic reports by von Neumann and Goldstine with many sample programs. Of course, the project was ludicrously overambitious given the technology available in 1951. The ORDVAC had 5 kB of RAM; memory access required 24 μs. Addition time was 44 μs, and multiplication time a hefty millisecond. From a programmer’s point of view, interpreters, compilers, or even assembly language were all non-existent. There were no index registers. Inductive loops had to be coded by incrementing the address portion of the instructions themselves. And of course all the code had to be written in absolute binary. The RAM was implemented as static charge on the surface of cathode ray tubes, which tended to decay rapidly, and was continuously being refreshed. This worked so long as the programmer was careful not to write loops so tight that the same position on the CRTs was bombarded by electrons too rapidly for the refreshing cycle to prevent spillover of charge to neighboring positions. To a contemporary programmer, these conditions seem nightmarish, but in fact it was lots of fun (especially when I let myself forget what it was all supposed to be for).

My experience as an ORDVAC programmer led me to rethink what I had been doing with Turing machines in the course I had just finished teaching. I began to see that Turing machines provided an abstract mathematical model of real-world computers. (It wasn’t until many years later that I came to realize that Alan Turing himself had made that connection long before I did.) I conceived the project of writing a book that would develop recursive function theory (or as I preferred to think of it: computability theory) in such a way as to bring out this connection. I hardly imagined that 7 years would go by before I held in my hand a printed copy of Computability & Unsolvability. I enticed a group of my C.S.L. colleagues into providing an audience for a series of lectures on computability; the notes I provided for the lectures were a rough draft of the first part of the book.

Champaign-Urbana in the early 1950s was not an ideal locale for a young bachelor looking for a social life. In the university community, young men outnumbered young women by something like 10 to 1. (Even among undergraduates the ratio was 4 to 1.) But I was lucky enough to attract the interest of Virginia Palmer, a graduate student. By the spring of my first year there, she had moved into my apartment, an arrangement far more unusual in those days than it would be today. In fact, the university administration took an active interest in students’ intimate lives. Female graduate students (and only female students) were subject to expulsion if they were found cohabiting with a male. So our ménage was somewhat dangerous, especially as Virginia’s parents didn’t find me a particularly desirable suitor. We planned to marry on the earliest date after the legal formalities officially dissolving my first marriage were complete. That date turned out to be the first day of autumn just about a year after my arrival in Champaign-Urbana; we were married by the local Unitarian minister in a simple ceremony with only three friends present. My second marriage has proved somewhat more durable than the first; as I write this, we have recently celebrated our 63rd anniversary.

Christmas week 1951 provided an occasion to drive East and introduce Virginia to my New York friends. It also enabled me to attend a mathematical meeting in Providence where I heard Kurt Gödel deliver his astonishing lecture in which he proposed that reflecting on his undecidability results would force one to adopt ontological assumptions characteristic of idealistic philosophy. The lecture was published only recently, after Gödel’s death, but the audacious ideas he propounded have remained with me ever since I heard the lecture.

The spring of 1952 marked a major change for the ORDVAC. It had been built under contract for Army Ordnance, and it was time for its delivery to their Proving Grounds in Aberdeen, Maryland. The computer group had been busy working on a twin (not quite identical) to the ORDVAC dubbed the ILLIAC (later ILLIAC I). But here I was with my code and no computer to debug it on until the ILLIAC came on line. So I was sent to Aberdeen. Virginia came with me and we stayed in a motel in the nearby town, Havre de Grace. It was in that motel that we conceived our first child.

The ORDVAC had been installed in the building housing the Ballistics Research Laboratory along with two older, indeed historic, computers: the EDVAC and the ENIAC. The ENIAC consisted of racks of vacuum tube circuits and plugboards such as were used by telephone switchboards, filling the four walls of a large room. The building was locked from the inside and one could only leave by first going to the ENIAC room and asking one of the people there to unlock the door. The ORDVAC was in use by Aberdeen people until 4 p.m., after which it was made available to me. Instead of the watchful crew in Urbana used to babying their creation, the computer operator was a sergeant whose main qualification was that in civilian life he had been an amateur radio operator. I was soon operating the machine myself, something I never would have been permitted to do in Urbana. One evening, I noticed that the machine seemed to be making many errors. I also noticed that I was getting very warm, but it didn’t occur to me to connect these facts. Finally, when I saw a 0 change to 1 on a CRT at a time that the computer was not executing any instructions, I gave up and left. The folks back in Urbana were furious with me. The air conditioning had broken down, and there had been a very real danger that the ORDVAC could have been destroyed by the heat. It should have been powered down at once.

Back in Urbana, I found myself increasingly unhappy with what I was doing at the C.S.L. The Office of Naval Research came to my rescue with a small grant that enabled me to spend 2 years as a visiting member at the Institute for Advanced Study in Princeton. I thought that with that sponsorship, I would probably be safe from the draft. My proposal was to work on connections between logic and information theory. That was a really good idea: the great Russian mathematician Kolmogoroff and Gregory Chaitin showed what could be done with it quite a few years later. However, I found myself moving in other directions.

The Institute for Advanced Study in those years was directed by J. Robert Oppenheimer. On the faculty were Albert Einstein, Kurt Gödel, and John von Neumann. Einstein and Gödel, good friends, were often seen walking to or from the Institute buildings together. I well remember the first time we encountered them walking down the middle of Olden Lane together: Einstein dressed like a tramp accompanied by Gödel in a suit and tie carrying his briefcase. “Einstein and his lawyer” was Virginia’s vivid characterization.

I had met Norman Shapiro as an undergraduate in Urbana. He had come to Princeton University as a graduate student and was writing a thesis on recursive functions. He and I organized a logic seminar. Among the regular attendees were Henry Hiz, John Shepherdson, and Hao Wang. Hilary Putnam, with whom I was later to do some of my best work, gave a philosophical talk which Norman and I mercilessly attacked. In my research, I was struck by the fact that the phenomenon of undecidability in logic could be understood abstractly in terms of the way each particular logical system provided a mapping from recursively enumerable sets[10] to subsets of their complements. I was particularly struck by the fact that Gödel’s famous result about the unprovability of consistency could be expressed simply as the fact that the iteration of this map always produces the empty set. Some years later I told one of my first doctoral students, Robert Di Paola, about this, and he based his dissertation on studying that mapping. Gödel himself was uninterested when I summoned the courage to tell him about my ideas.

I occasionally thought about Hilbert’s tenth problem, and I worked on my book. The chapter on applications of computability theory to logic gave me particular trouble. The problem I faced was giving a coherent exposition without writing a whole book on logic. I rewrote that chapter many times before I was satisfied. A problem of another kind was the difficulty I had in getting the Institute typists to produce a decent copy from my handwritten manuscripts. Our son was born in January 1953. After he was weaned, a year later, Virginia took a job at the Princeton Public Library. I imagined I could take care of the baby and work on my book at the same time. Of course this did not work out very well.

My arrangement with the Office of Naval Research left me free to seek employment during the summer months. We certainly needed the extra money. I was able to spend the summer of 1953 working at Bell Labs, a short commute from Princeton. My boss was Shannon, the inventor of information theory, and I was able to renew my acquaintance with Ed Moore. Shannon had recently constructed a universal Turing machine with only two states. He posed the question of giving a well-defined criterion for specifying when a Turing machine could be said to be universal. I liked that question and wrote two short papers dealing with it.[11] The intellectual atmosphere at Bell Labs was stimulating and open to fundamental research. I could well understand how a fundamental breakthrough like the transistor could develop in such an environment. Shannon himself was treated like the star he was. He had a small shop with two technicians available to build any of his whimsical gadgets. His “mouse” that successfully solved mazes was already famous. Less well known was his desk calculator “Throwback I” that accepted its input in the form of Roman numerals. Shannon was also an expert unicycle rider. One day he brought his unicycle to the labs and created mass disruption by riding it down the long corridors and even into and out of elevators, bringing swarms of Bell Labs employees streaming out of their offices to watch. Another thing I remember about that summer is the excitement of a real workers’ uprising in East Berlin against the Communist regime.

For the summer of 1954 I thought about applying the programming skills I had learned in Urbana to a logical decision procedure. My first choice was Tarski’s quantifier elimination algorithm for the first order theory of real closed fields. But on second thought I saw that this was going to be too difficult for a first try, and instead I settled on Presburger’s procedure for integer arithmetic without multiplication, since this was a much simpler quantifier elimination procedure. Had I known the Fischer-Rabin super-exponential lower bound for Presburger arithmetic (proved 20 years later), I would presumably have hesitated. But I went blithely ahead with the blessing of the Office of Ordnance Research of the U.S. Army which agreed to support the effort. I was able to do the work without leaving Princeton, using the original JOHNNIAC at the Institute for Advanced Study. To my dismay the code used all of the 5 kB of RAM available and was only able to deal with the simplest statements on the order of “The sum of two even numbers is even”. My report on the program, duly delivered to the Army on its completion and included as well in the Proceedings of an important Summer Institute of Logic at Cornell in 1957 (about which, more later), ended with the understatement[12]:

The writer’s experience would indicate that with equipment presently available, it will prove impracticable to code any decision procedures considerably more complicated than Presburger’s. Tarski’s procedure for elementary algebra falls under this head.
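To give the flavor of the inputs involved (this rendering is my own, not taken from the report), the sample statement above, that the sum of two even numbers is even, becomes in Presburger arithmetic, where only addition, equality, and quantification over the natural numbers are available:

$(\forall x)(\forall y)\,[\,(\exists u)(x = u + u) \wedge (\exists v)(y = v + v) \rightarrow (\exists w)(x + y = w + w)\,].$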

An anthology [43] of “classical papers on computational logic 1957-1966” published in 1983 begins its preface with the sentence:

In 1954 a computer program produced what appears to be the first computer generated mathematical proof: Written by M. Davis at the Institute of Advanced Studies (sic), USA, it proved a number theoretic theorem in Presburger Arithmetic.

In the spring of 1954 my 2 years at the Institute were drawing to a close, and I needed to find a job. Again the market was rather tight. We had a few possibilities, but opted for the one that took us furthest west: an Assistant Professorship at the University of California at Davis. For the first time we experienced what was to be repeated over a dozen times in our lives: the trip by automobile across the United States with a stopover in Kansas City to visit Virginia’s parents. As we approached our new home, the road signs seemed to be directing us: “Davis use right lane”.

The liberal arts program was newly instituted at U.C. Davis, which had originally been devoted to agriculture. In 1954, the population of Davis was just about 5000 people. It was not a cultural center. Amusements were in such short supply that we would drive to the local soft ice-cream drive-in just to watch the customers come and go. It was not a year in which I accomplished much scientifically. The teaching load was quite modest: just two courses per semester. When I taught calculus (to students majoring in agricultural engineering), I had to speak in a loud voice to be heard above the clatter of the turkeys in the building next door.

Virginia was pregnant again and we needed to find an obstetrician. There were none in Davis itself, but there was a hospital at the county seat, Woodland, a few miles away. Virginia found the local obstetrician there quite unacceptable. Sacramento, the state capital, was perhaps 18 miles away, but we had heard too many obstetrical horror stories coming from that quarter. So we headed for progressive Berkeley 80 miles away, where Virginia found an excellent obstetrician. Today there is an excellent superhighway connecting Davis with Berkeley, but in 1954 the drive took at least 2 hours. The highway ended in Richmond with the rest of the route being through city streets. Virginia’s first labor had been swift and uneventful, so we knew that we had to be prepared for the possibility of not making it to Berkeley in time. We obtained a government pamphlet for farmers on delivering babies, and bought a second-hand obstetrics textbook. We were in Berkeley a few days before our Nathan made his appearance, and Virginia was assured that all was well. Back in Davis, we were awakened at 2 a.m. by a flood of amniotic fluid drenching the sheets. By 4 a.m. the crying baby had arrived. Virginia tells people that I “delivered” him, although really I just watched. Except for one detail: Nathan was born with his umbilical cord wrapped around his neck. Before I had time to think about it, I had lifted the loop away.

People we hardly knew had strong opinions about what had happened. Those who thought we had done something praiseworthy in defense of the natural seem to have been outnumbered by those who thought we had behaved in an irresponsible manner. There was even the suggestion that we should be imprisoned. We were convinced that Davis was not for us, and were determined to leave. I found a position at Ohio State University in Columbus and quickly accepted. For the summer, I got a job at the Moore School of Electrical Engineering in Philadelphia where the ENIAC and the EDVAC had been built. So, we set out for Princeton in our 1951 Studebaker sedan with Harold and Nathan in the rear. Our plan was to spend the summer there, an easy commute to the Moore School.

The summer at the Moore School was a pretty complete disaster. They wanted me to prove a particular kind of theorem about certain numerical methods for solving ordinary differential equations. I knew very little about such things, but I saw no reason to believe that there was a theorem of the kind they wanted. I did not accomplish much for them. The best part of the summer was getting to know Hilary Putnam who was living in the same prefab housing complex for graduate student and junior faculty families where we had subleased an apartment for the summer. To my surprise, he was very interested in Hilbert’s tenth problem and proposed that we collaborate. Nothing much came of this until a few years later.

A major plus of my new position at Ohio State University was that Kleene’s student Clifford Spector was on the faculty. His brother was a close friend of a good friend of mine, and on his brother’s advice, he had written me some years earlier about his interest in logic. Apparently, this interest had been actively discouraged at Columbia University where he had been informed that there are no interesting problems in logic. I had suggested a number of possibilities for graduate study in logic including Madison, Wisconsin with Kleene. Somewhat to my surprise, I detected something not entirely friendly in Clifford’s welcome. It was several months before he became open. I learned that Kleene had been rather displeased with me. Kleene had gone to considerable trouble to get a fancy fellowship for me at the University of Wisconsin, and I had not only gone to Princeton instead, but had written a dissertation largely in areas where Kleene himself had been working. Kleene had given Spector the uniqueness ordinal problem left open in my thesis as an appropriate topic for his dissertation. Clifford reported that Kleene had whipped him on with the warning that “Davis is working on it” emphasizing the importance of reaching the goal first. In fact, I hadn’t been thinking about uniqueness ordinals at all. In any case Spector was a more powerful mathematician than I. In his excellent dissertation, he not only proved that every constructive ordinal is a uniqueness ordinal (thus settling the question raised in my dissertation), but also proved a deep result in the theory of degrees of unsolvability.[13]

The year we spent in Columbus was not a happy one. Among other difficulties, we were feeling financially pinched. I received my last paycheck from Davis at the beginning of June and the first from Columbus only in November. The money from the Moore School helped, but I had to return half of the money for moving expenses I had received from Davis because I had left after only one year, and Ohio State did not cover moving expenses. And, apparently impossible to please, Virginia and I just didn’t like life in Columbus very much. To save money, we moved into an apartment with just one bedroom that we gave to our two babies, while we slept on a convertible couch in the living room. The Chair of the department helped by offering me the opportunity to teach an off-campus advanced calculus course to Air Force officers at the nearby Wright-Patterson base in the summer. In the hot Ohio summer, I often taught wearing short pants. I later found that a Colonel had complained about my attire to the department Chair.

One morning that summer, at the breakfast table, Virginia pointed to an advertisement in the New York Times. An anonymous “long established university in the northeast of the United States” was seeking teachers of engineering subjects including calculus and differential equations. Salaries, the ad said, would be “comparable to industry”. I sent off a letter at once, and I was interviewed and hired. My academic year salary increased from $5100 to $7900 and we felt rich. The “long established university” turned out to be Rensselaer Polytechnic Institute (R.P.I.). The position was not at the main campus in Troy, New York, but at the Hartford Graduate Division in Eastern Connecticut. In 1956 the nation was experiencing an acute shortage of engineers. In the Connecticut valley, the United Aircraft Company with its Pratt & Whitney subsidiary (a major manufacturer of jet engines) had been finding it extremely difficult to hire the engineers it needed. To help to solve this problem, R.P.I. was asked to form the Hartford Graduate Division so United Aircraft engineering employees could take courses leading to a master’s degree, with tuition to be paid by the company. This had helped, but not enough. So, liberal arts graduates who satisfied the minimum requirement of having completed a year of calculus and a year of physics were hired by United Aircraft and sent to the Hartford Graduate Division to study mechanical engineering. Those who completed the forty-week program received a certificate and were put to work. They were also eligible to apply to R.P.I.’s master’s program.

Faculty was needed to teach in this new program, and that was the reason for the ad I had answered. The Hartford Graduate Center was housed in a one-story, industrial-style building with a huge parking lot on the main highway between Hartford and Springfield, Massachusetts. Friends had predicted that moving to an environment with no research aspirations, to do elementary teaching, would be the end of my research career. In fact it turned out to be an excellent move. From a personal point of view, Eastern Connecticut was beautiful and an easy drive to New York where there were friends, the amazing resources of that city, and my mother’s apartment in the Bronx where she would cheerfully serve as baby sitter, and where we could spend the night. But it turned out very well professionally also. The student body consisted of relatively mature, interesting people of varied backgrounds who were fun to teach. And as the forty-week program wound down, I moved into the master’s level program, teaching a variety of courses far more interesting than what would have been available to a lowly assistant professor at Ohio State. Student notes for a course in functional analysis later became a short book.[14] The clerical staff turned out to be cheerful and competent and quite willing to turn my mangled and worked-over manuscript for Computability & Unsolvability into a typescript I could send to publishers. There were mixed reviews, including one that derided the connection I was proposing with actual computers and included an invidious comparison with Kleene’s recently published book (with which, by the way, the overlap was not extensive). It turned out that McGraw-Hill had chosen Hartley Rogers as their reviewer, and he not only wrote the kind of laudatory review that gladdens an author’s heart, but also produced an astonishingly detailed helpful critique. The book was published in McGraw-Hill’s series on “Information Processing and Computers”, appearing in 1958. It was eventually translated into Japanese and Italian,[15] and, reprinted by Dover in 1982,[16] it remains in print today [3].

The summer of 1957 was an exciting time for American logicians. A special “institute” on logic was held at Cornell University. For five weeks, 85 logicians participated: established old-timers, those in mid-career, fresh Ph.D.’s, and graduate students. There was even Richard Friedberg, still an undergraduate, who had just created a sensation by proving the existence of two r.e. sets neither of which is computable relative to the other, thus solving what had been called Post’s problem. There were seminars all day. The gorges of Ithaca were beautiful, and swimming under Buttermilk Falls was a summertime pleasure. Hilary Putnam and I seized the opportunity to work together. Our two families shared a house with an unusual distribution of the quarters: there were three small separate apartments; the adult couples each got one of them, and the third went to the three children, our two boys and Hilary’s Erika, who was two days younger than our Nathan. Hilary and I talked all day long about everything under the sun, including Hilbert’s tenth problem. Hilary tended to generate ideas non-stop, and some of them were very good. I tended to be cool and critical and could be counted on to shoot down ideas that were pretty obviously bad. Hilary’s idea that turned out to be very good indeed was to begin with the normal form from my dissertation and to try to get rid of that bounded universal quantifier that blocked the path to my “daring hypothesis” by using the Chinese Remainder Theorem to code the finite sequences of integers that the quantifier generates.[17] Using little more than the fact that congruences are preserved under addition and multiplication, we obtained two relations with rather simple definitions about which we were able to show that their being Diophantine would imply that every r.e. set is likewise Diophantine.[18]
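To indicate the kind of coding involved (the classical device is Gödel’s $\beta$ function, recalled here purely as an illustration of the idea, not as a description of our argument): setting

$\beta(a, b, i) = a \bmod \bigl(1 + (i + 1)\,b\bigr),$

the Chinese Remainder Theorem guarantees that for any finite sequence $k_0, k_1, \ldots, k_y$ of natural numbers there are $a$ and $b$ (for instance, with $b$ a sufficiently large multiple of $y!$, which makes the moduli $1 + (i + 1)b$ pairwise relatively prime) such that $\beta(a, b, i) = k_i$ for $i = 0, \ldots, y$. Since the graph of $\beta$ is given by a congruence, that is, by conditions built from addition and multiplication alone, a bounded universal quantifier ranging over such a coded sequence can be traded for existential quantifiers over $a$ and $b$.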

We resolved to seek other opportunities to work together. Hilary suggested we try to get funding so we could spend summers together. He proposed investigations of possible computer implementations of proof procedures for first order logic.[19] I guess we thought we’d have more luck being funded with that than with Hilbert’s tenth problem. We agreed to work through R.P.I. By the time we got our proposal together, it was too late to be funded for the summer of 1958 by any of the usual agencies. Someone suggested that we try the National Security Agency (NSA). Although the NSA is now notorious, I’d never heard of them. Nevertheless I sent the proposal to them.

Our idea was to define a procedure that would generate a proof of a sentence by seeking a counter-example to its negation in what later became known as its Herbrand universe. This involved generating ever longer Herbrand expansions, and testing periodically for a truth-functional inconsistency. When I was called to NSA headquarters, it turned out that it was this test for truth-functional inconsistency that interested them. They told me that this was a very hard problem, and seemed dubious of our ability to make serious inroads in just one summer, but, finally, they did agree to sponsor our work. We were to provide a report at the end of the summer. However, unlike typical funding agencies, they specifically asked that their support not be acknowledged in the report. When I told them that I had never heard of the NSA, the reply was that their “publicity department” was doing a good job.

I found a summer cottage on Lake Coventry for Hilary and his family. As I said elsewhere about my summers with Hilary:

We had a wonderful time. We talked constantly about everything under the sun. Hilary gave me a quick course in classical European philosophy, and I gave him one in functional analysis. We talked about Freudian psychology, about the current political situation, about the foundations of quantum mechanics, but mainly we talked mathematics.[20]

My first copy of Computability & Unsolvability, smelling of printer’s ink, arrived that summer. Elated, I showed it to Hilary. He smilingly offered to find a typographical error on any page I’d select. Determined to show him, I turned to the reverse side of the title page containing the copyright notice, only six lines. Giving the page a quick glance, Hilary noted that the word “permission” was missing its first “i”.

Our report for the NSA, entitled Feasible Computational Methods in the Propositional Calculus, is dated October 1958. It emphasizes the use of conjunctive normal form for satisfiability testing[21] (or, equivalently, the dual disjunctive normal form for tautology testing). The specific reduction methods whose combined use has come to be linked with the name Davis-Putnam are all present in this report.[22]

After that first summer, our research was supported by the U.S. Air Force Office of Scientific Research. It was in the summer of 1959 that Hilary and I really hit the jackpot. We decided to see how far we could get with the approach we had used at the Logic Institute in Ithaca, if, following Julia Robinson’s lead, we were willing to permit variable exponents in our Diophantine equations. That is, we tried to show that every r.e. set could be defined by such an exponential Diophantine equation. After some very hard work, using Julia Robinson’s techniques as well as a good deal of elementary analysis,[23][24][25] we had our result, but, alas, only by assuming as given a fact about prime numbers that was certainly believed to be true, but which wasn’t proved until many years later, namely: there exist arbitrarily long arithmetic progressions consisting entirely of prime numbers. As we wrote up our summer’s work, we decided to include an account of a proof procedure for first order logic based on our work on the propositional calculus from the previous summer. Our report to the Air Force included the work on Hilbert’s tenth problem, the proof procedure, and a separate paper on finite axiomatizability. Years later Julia Robinson brought a copy of this report[26] with her to Russia where the mathematicians to whom she showed it were astonished to learn that this work was supported by the U.S. Air Force. It was the proof procedure [28] that brought some notoriety to the Davis-Putnam partnership. It proposed to deal with problems in first order logic by beginning with a preprocessing step that became standard: the negation of the proposition to be proved was put into prenex normal form, followed by Skolemization to eliminate existential quantifiers, and then put into conjunctive normal form. Our crude algorithm generated a continuing Herbrand expansion periodically interrupted by tests for satisfiability along the lines mentioned above.
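As a toy illustration of that preprocessing (an example of my own, not one from the paper): to prove $(\forall x)P(x) \rightarrow (\exists y)P(y)$, one negates it, obtaining $(\forall x)P(x) \wedge (\forall y)\neg P(y)$, which is already in prenex form and has no existential quantifiers to Skolemize; its matrix, in conjunctive normal form, yields the clauses $\{P(x)\}$ and $\{\neg P(y)\}$, and the first level of the Herbrand expansion, built over a single constant $c$, already contains $P(c)$ and $\neg P(c)$, a truth-functionally inconsistent pair, so the original sentence is proved. Had the negation contained existential quantifiers, each would have been replaced by a new function symbol applied to the universally quantified variables governing it.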

We submitted our work on Hilbert’s tenth problem for publication and at the same time sent a copy to Julia Robinson. Julia responded soon afterwards with an exciting letter:


I am very pleased, surprised, and impressed with your results on Hilbert’s tenth problem.

Quite frankly, I did not think your methods could be pushed further ...

I believe I have succeeded in eliminating the need for [the assumption about primes in arithmetic progression] by extending and modifying your proof.

She sent us her proof soon afterwards; it was a remarkable tour de force. She showed how to get all the primes we needed by using, instead of a then unproved hypothesis about primes in arithmetic progression, the prime number theorem for arithmetic progressions which provided a measure of how frequently primes occurred “on average” in such progressions. We proposed that we withdraw our paper in favor of a joint publication, and she graciously accepted. She undertook the task of writing up the work, and (another surprise), she succeeded in drastically simplifying the proof so only the simplest properties of prime numbers were used. Combined with Julia’s earlier work, this new result showed that my “daring hypothesis” that all r.e. sets are Diophantine was equivalent to the existence of a single Diophantine equation whose solutions grow exponentially (in a suitable technical sense of the word).[26] The hypothesis that such an equation exists had been raised by Julia in her earlier work, and Hilary and I called it JR.

For years I thought of myself as an exile from New York. Now came an opportunity to move there. From the Institute of Mathematics (to be renamed a few years later the Courant Institute of Mathematical Sciences) at New York University came an invitation to visit for a year. Although this was just a visiting appointment, I was confident that we would not be returning to Connecticut. Cutting our bridges behind us, we sold the house we had bought just a year before. Virginia was as enthusiastic as I about our new life. We moved into an apartment overlooking the Hudson River on the Upper West Side of Manhattan. At NYU, I was asked to teach a graduate course in mathematical logic, which was a great pleasure. One of the students in that course, Donald Loveland, later became one of my first doctoral students, and, still later, a colleague. One result of my new situation was access to an IBM 704 computer. I jumped at the chance to try out the proof procedure Hilary and I had proposed. Two graduate students, Loveland and his friend George Logemann, were assigned to me to do the programming. Donald was a particularly apt choice because he had been involved at IBM with Gelernter’s “geometry machine”, a program to prove theorems in high school geometry. Their programming effort was successful, but when the program was run it was found that the periodic tests for truth-functional consistency were generating large numbers of ever longer formulas that rapidly filled the available RAM. It was the rule for eliminating atomic formulas (later called ground resolution) which replaced a formula

$(A \vee p) \wedge (B \vee \neg p) \wedge R$

by

$(A \vee B) \wedge R$

(where the atom $p$ occurs in none of $A$, $B$, and $R$) that was causing the problem. It was when the three of us met that we decided to try to use instead the splitting rule,[27] which generates the pair of formulas

$A \wedge R \qquad\text{and}\qquad B \wedge R.$

The idea was that a stack for formulas to be tested could be kept in external storage (in fact a tape drive) so that formulas in RAM never became too large.

Much to my surprise, this problem of testing for satisfiability a Boolean polynomial presented in the form of a list of disjunctive clauses, which Hilary and I had introduced as an adjunct to a theorem proving procedure for first order logic, has turned out to be of fundamental importance. The problem has been given the name SAT and it has been the subject of a huge literature, both theoretical and pragmatic. The form of our algorithm that uses the splitting rule, currently referred to as DPLL, has proved to be very successful and has been incorporated in many implementations. In the case of our work, it proved successful in testing formulas consisting of thousands of clauses. Nevertheless the program was overwhelmed by the explosive nature of the Herbrand expansion in all but the simplest examples.
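To make the splitting-rule idea concrete, here is a minimal sketch in Python of a DPLL-style satisfiability test. It is purely illustrative: only the unit-clause rule and the splitting rule are kept (the pure-literal and other reduction rules of the original reports are omitted), literals are represented as signed integers, and it bears no relation to the actual code written for the 704.

def simplify(clauses, lit):
    # Assume literal `lit` is true: drop clauses containing it and delete
    # its negation from the rest; return None if an empty clause appears.
    out = []
    for clause in clauses:
        if lit in clause:
            continue                    # clause satisfied
        reduced = clause - {-lit}
        if not reduced:
            return None                 # contradiction
        out.append(reduced)
    return out

def dpll(clauses, assignment=None):
    # Return a satisfying assignment {variable: bool} or None.
    if assignment is None:
        assignment = {}
    clauses = [set(c) for c in clauses]

    # Unit-clause rule: a one-literal clause forces that literal's value.
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if len(clause) == 1:
                lit = next(iter(clause))
                assignment[abs(lit)] = lit > 0
                clauses = simplify(clauses, lit)
                if clauses is None:
                    return None
                changed = True
                break

    if not clauses:                     # every clause satisfied
        return assignment

    # Splitting rule: pick a variable and try it true, then false.
    var = abs(next(iter(clauses[0])))
    for value in (True, False):
        branch = simplify(clauses, var if value else -var)
        if branch is not None:
            result = dpll(branch, {**assignment, var: value})
            if result is not None:
                return result
    return None

# (p or q) and (not p or q) and (not q or r):
print(dpll([{1, 2}, {-1, 2}, {-2, 3}]))  # e.g. {1: True, 2: True, 3: True}

Here the recursion plays the role of the stack of untried branches that the original program kept in external storage.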

As I had expected and hoped, I was offered a regular faculty appointment at NYU. At that time, there were three more or less separate mathematics departments at NYU: the graduate department, the undergraduate department at the main campus in Greenwich Village, and another undergraduate department at the Bronx campus. The appointment I was offered was in the undergraduate department at the main campus. Although not what I had hoped for, I would certainly have accepted this offer, had not the first Sputnik gone aloft a few years earlier. The Soviet launching of a satellite in 1957 had provoked a furore in the United States. We were “falling behind” in science and technology. All at once, science became a growth industry. And that was why I received a very attractive offer from Yeshiva University.

Yeshiva College is housed in a building with a curiously Middle Eastern flavor in the part of Manhattan known as Washington Heights (because Washington fought a rear guard action against the British there as the revolutionary forces were retreating from New York City). It takes its name from the traditional East European yeshivas, institutions of advanced religious training based on the Talmud with instruction mostly in Yiddish, training that could lead to rabbinical ordination. Yeshiva College adjoined to this traditional curriculum a liberal arts program in the American mode. Various schools were added to the complex, most of them secular, leading to the “university” designation in 1945. The Mathematics Department at Yeshiva College was the home of the periodical Scripta Mathematica, specializing in mathematical oddities and issued regularly beginning in 1932. Abe Gelbart was a mathematician at Syracuse University who had become involved with the Scripta Mathematica effort. He began to imagine the possibility of building a first-rate graduate program in mathematics and physics in this milieu. He was able to convince the Yeshiva University administration that in the post-Sputnik atmosphere, external funding would be readily available, and he received a go-ahead to found a new Graduate School of Science (later the Belfer Graduate School of Science) with himself as dean.

My teaching load at Yeshiva was to be two graduate courses per semester with every encouragement to develop a program in logic, as opposed to the NYU offer which would have required three undergraduate courses per semester with the option to conduct a logic seminar on my own time. In addition the Yeshiva offer came with a salary of $500 more than I would make at NYU. For various reasons I would have preferred to remain at NYU, but they were unwilling to respond to the Yeshiva offer, and so, I phoned Gelbart and accepted. Late that spring I was informed that NYU had reconsidered and was now willing to come closer to the Yeshiva offer. However, I felt that I had made a commitment to Yeshiva that it would have been unethical to break. I was told to keep in touch and let them know if circumstances changed. It was 5 years before I took them up on that suggestion.

Gelbart found a home for the new school in a building not far from Yeshiva College. When I was taken to see it, I was quite startled. The building had previously been a catering palace, and I remembered it very well. It had been the scene of the celebration of my ill-fated marriage to my first wife a decade earlier. Gelbart turned out to be difficult and ill-tempered, but eager to please so long as his beneficence was duly acknowledged. The faculty, mathematicians and physicists all together on one floor, formed a very congenial group and a good deal of first-rate research was accomplished. I worked to develop a logic program and was successful in having an offer made to Raymond Smullyan which he accepted. Although Donald Loveland’s degree was awarded by NYU, he was effectively part of our logic group at Yeshiva. From the beginning Robert Di Paola was at Yeshiva in order to work with me. Both Di Paola and Loveland received their doctorates in 1964.

I was able to publish several papers that were spin-offs of the work with Hilary and Julia (one of them joint with Hilary).[28] I also worked on proof procedures, a field that was beginning to be called automatic theorem proving (ATP). In fact my work with Logemann and Loveland was continuing after I had left NYU, with the IBM 704 continuing to be available to us. It was after the program was running and its weaknesses were apparent to us that an article arrived in the mail that had a major influence on my thinking. It was a paper by Prawitz [39] in which he showed how the kind of generation of spurious substitution instances that overwhelmed our procedure could be avoided. However, the procedure he proposed was subject to a combinatorial explosion from another direction. I set as my goal finding a procedure that combined the benefits of Prawitz’s ideas with those of our procedure. I believed that we were on the right track in using as our basic data objects sets of disjunctive clauses (each consisting of literals) containing variables for which substitution instances could be sought. Prawitz had proposed to avoid spurious substitutions from the Herbrand universe by forming systems of equations the satisfaction of which would give the desired result. I came to realize that for problems expressed in our form, the required equalities always were such as to render literals complementary. That is, given a pair of clauses one of which contains the literal $R(u_1, u_2, \ldots, u_n)$ while the other contains $\neg R(v_1, v_2, \ldots, v_n)$, what was needed was to find substitutions to satisfy the system of equations

$u_1 = v_1,\quad u_2 = v_2,\quad \ldots,\quad u_n = v_n.$

I also saw that for any system of substitutions that was successful in producing an inconsistent set of clauses, there necessarily had to be a subset of that set which was linked in the following sense:

A set of clauses is linked if for each literal l in one of the clauses of the set, the complementary literal ¬l occurs in one of the remaining clauses.[27]
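As a concrete illustration of this definition (my own sketch, not anything from the original papers), here is how the linked property can be checked for a small set of ground clauses, with literals written as strings and a leading "-" standing in for ¬:

```python
# Hypothetical illustration: testing whether a set of ground clauses is "linked",
# i.e. every literal occurring in some clause has its complement in one of the
# *other* clauses of the set.

def complement(literal):
    """Return the complementary literal: 'R(a,b)' <-> '-R(a,b)'."""
    return literal[1:] if literal.startswith('-') else '-' + literal

def is_linked(clauses):
    """clauses: a list of sets of literal strings."""
    for i, clause in enumerate(clauses):
        # Union of the literals occurring in all the remaining clauses.
        others = set().union(*[c for j, c in enumerate(clauses) if j != i])
        if any(complement(lit) not in others for lit in clause):
            return False
    return True

# {P, Q}, {-P}, {-Q} is linked; dropping {-Q} leaves Q with no complement.
print(is_linked([{'P', 'Q'}, {'-P'}, {'-Q'}]))   # True
print(is_linked([{'P', 'Q'}, {'-P'}]))           # False
```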

I had the opportunity to explain these ideas and to place them in the context of existing research at a symposium on Experimental Arithmetic organized by the American Mathematical Society, held in Chicago in April 1962, in which I was invited to participate. The ideas developed in the paper that was published in the proceedings of the conference [4] turned out to be very influential (although I believe that many of those whose work was ultimately based on this paper were unaware of the fact).[28]

Around this time I had been invited to spend several hours weekly as a consultant at Bell Labs in Murray Hill, New Jersey. I was delighted to have the opportunity to see some of my ideas implemented. Doug McIlroy undertook to produce a working program for Bell Labs’ IBM 7090, and did so in short order. The problem of finding solutions to the systems of equations needed to establish “links” was dealt with in McIlroy’s program by using what was later called unification.[29] Peter Hinman joined the effort as a summer Bell Labs employee and found and corrected some bugs in the McIlroy program. We wrote up our work and submitted it for publication. It was accepted with some rather minor changes. These changes were not made, for reasons now obscure, and the paper never appeared.
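The sketch below is only my own schematic reconstruction of the kind of unification involved, that is, of solving the equation systems u1 = v1, ..., un = vn that make a pair of literals complementary; it is not McIlroy's Bell Labs program, and the term representation is an assumption of the sketch:

```python
# Schematic reconstruction (not the Bell Labs code): syntactic unification used
# to solve the equations that render R(u1,...,un) and -R(v1,...,vn) complementary.
# Representation: variables are strings beginning with '?', e.g. '?x';
# constants and compound terms are tuples: ('a',), ('g', '?y'), ...

def walk(t, subst):
    # Follow variable bindings until an unbound variable or a non-variable term.
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(var, t, subst):
    t = walk(t, subst)
    if t == var:
        return True
    return isinstance(t, tuple) and any(occurs(var, arg, subst) for arg in t[1:])

def unify(s, t, subst):
    """Extend subst so that s and t become equal, or return None on failure."""
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if isinstance(s, str):                          # unbound variable
        return None if occurs(s, t, subst) else {**subst, s: t}
    if isinstance(t, str):
        return unify(t, s, subst)
    if s[0] != t[0] or len(s) != len(t):            # clash of symbols or arity
        return None
    for a, b in zip(s[1:], t[1:]):
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst

# Make R(?x, g(?y)) and -R(a, g(b)) complementary by unifying their atoms:
print(unify(('R', '?x', ('g', '?y')), ('R', ('a',), ('g', ('b',))), {}))
# -> {'?x': ('a',), '?y': ('b',)}
```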

In [4] I carelessly called the algorithm which Logemann and Loveland had implemented “the Davis-Putnam procedure”, presenting it as one of a number of early theorem-proving programs for first order logic. I saw the modification we had made to DP as a mere ad hoc adjustment, and was very slow to realize that from a practical point of view it was the DPLL algorithm for SAT that was the most important part of that effort and that the application to first order logic was really just a footnote. (The interest of the National Security Agency in just that part of the effort should have been a tipoff!) All of this may have had the inadvertent effect of depriving Logemann and Loveland, for many years, of their proper share of the credit for developing it.
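For readers who know DPLL only by name, here is a compact, textbook-style sketch of the splitting-based SAT procedure (unit rule, pure-literal rule, splitting); it is of course not the 1962 IBM 704 program, and the clause representation is my own:

```python
# Textbook-style sketch of DPLL for SAT (not the 1962 program).
# A formula is a list of clauses; a clause is a set of integer literals,
# with -v denoting the negation of variable v.

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    clauses = [set(c) for c in clauses]

    # Unit rule and pure-literal rule, applied until neither is applicable.
    changed = True
    while changed:
        changed = False
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        literals = {l for c in clauses for l in c}
        pures = [l for l in literals if -l not in literals]
        for lit in units + pures:
            if -lit in units:                     # two contradictory unit clauses
                return None
            assignment[abs(lit)] = lit > 0
            clauses = [c - {-lit} for c in clauses if lit not in c]
            changed = True
            break

    if not clauses:
        return assignment                         # every clause satisfied
    if any(len(c) == 0 for c in clauses):
        return None                               # empty clause: contradiction

    # Splitting rule (the "rule of case analysis"): branch on a remaining literal.
    lit = next(iter(clauses[0]))
    for choice in (lit, -lit):
        result = dpll(clauses + [{choice}], assignment)
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2) and (not x2 or x3):
print(dpll([{1, 2}, {-1, 2}, {-2, 3}]))           # e.g. {3: True, 2: True}
```

Where the published DP procedure applied the rule for eliminating atomic formulas, this sketch, like the Logemann and Loveland program, splits on a literal instead; it is that splitting variant which is nowadays called DPLL (see footnote [22] below).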

The year 1963 brought great excitement to the world of logic. Paul Cohen invented a powerful new method he called forcing for constructing models of the axioms of set theory, and he had used this method to show that Cantor’s continuum hypothesis could not be proved from the standard axioms for set theory, the Zermelo-Fraenkel axioms together with the axiom of choice. This settled a key question that had been tacitly posed by Godel when, more than two decades previously, he had shown that the continuum hypothesis couldn’t be disproved from those same axioms. I was astonished to receive a letter from Paul Cohen dated November of that year reading in part:

I really should thank you for the encouragement you gave me in Stockholm. You were directly responsible for my looking once more at set theory. ...Of course, the problem I solved had little to do with my original intent. In retrospect, though, the basic ideas I developed previously played a big role when I tried to think of throwing back a proof of the Axiom of Choice, as I had previously thought about throwing back a proof of contradiction.

In the summer of 1962 I had attended the International Congress of Mathematicians in Stockholm. These conferences are scheduled to occur every 4 years, but this was my first since the 1950 Congress at Harvard. At the Congress, I talked briefly with Paul Cohen. I knew that although he was not primarily a logician, he was a very powerful mathematician who had been attempting to find a consistency proof for the axioms of set theory. He indicated that some logicians he had talked to had been discouraging, and I urged him to pay no attention. That was really the total extent of my “encouragement”. Of course, I was very pleased to receive the letter.

It was in 1963 that we realized that we were outgrowing our apartment overlooking the river. Our two sons had been sharing a bedroom. Now aged eight and ten, they had quite different temperaments. If we were to have any peace, they would have to have separate rooms. At this point Virginia found a “brownstone” town house a mile south of our apartment that we were able to buy. Although the price we paid was ridiculously low by later standards, the house and its renovation put an enormous strain on our budget. Of course, it turned out to be far and away the best investment we ever made. We lived there for 33 years. In order to make ends meet, I found myself becoming an electrician and a plumber. With the help of a friend (as much a novice as I), I even installed a new furnace.

A project that was absorbing a good deal of time and energy at this time was the preparation of my anthology of fundamental articles by Godel, Church, Turing, Post, Kleene, and Rosser.32 I wrote some of the commentaries for the book while attending a conference in the delightful town of Ravello south of Naples.

Meanwhile my relationship with Abe Gelbart was becoming more and more difficult. Things were brought to a head in the spring of 1965 when, interviewing a prospective faculty member, someone I had very much hoped would be a colleague, Gelbart behaved in an insulting manner. I decided that I had no choice but to resign my position. I called a friend at the Courant Institute and reminded him of the suggestion that I let them know if I were interested in a position at NYU. Soon enough an offer arrived. It was not quite what I had expected: the position was half at the Bronx campus and only half at the Institute. However, I was urged to regard the relative vacuum on the Bronx campus as an opportunity to develop a logic group there, and I was assured that I would be treated as a regular member of the graduate faculty. I accepted the position and remained with the Courant Institute until my retirement in September 1996.

I took to the notion of developing a logic group at the Bronx campus with avidity. My old friend and student Donald Loveland was already there, and he was soon joined by the Yasuharas, Ann and Mitsuru. I was able to provide support, in the form of time released from teaching for research, out of my continued funding by the Air Force. During these years Norman Shapiro, with whom I had organized a logic seminar in Princeton many years earlier, and my former student Bob Di Paola were both at the RAND Corporation. Norman arranged for me to be able to spend summers at RAND. Our family found housing in the hills above Topanga Canyon near the Malibu coast, and I enjoyed the daily drive along the beach to the RAND facility in Santa Monica. I was required to obtain security clearance at the “Top Secret” level, not because I did any secret work there, but because classified documents in the building were not necessarily under lock and key. I was tempted once to use my clearance. The Cultural Revolution was in full swing in China, and I was thoroughly mystified by it. Di Paola urged me to seek enlightenment by looking at the intelligence reports from the various agencies easily available to me. I did so, feeling that I was losing my innocence, and was greatly disappointed. Not only did these “secret” reports contain nothing that couldn’t be found in newspapers and magazines, they turned out to be anything but unbiased, clearly reflecting the party line of the agency from which they emanated.

What I did at RAND was work on Hilbert’s tenth problem; specifically, I tried to prove JR. I used the computing facility at RAND to print tables of Fibonacci numbers and solutions of the Pell equation, looking for patterns that would do the trick. I also found one interesting equation:

9(x² + 7y²)² − 7(u² + 7v²)² = 2.
I proved that JR would follow if this equation had only the trivial solution x = u = 1, y = v = 0. In fact the equation turns out to have many non-trivial solutions, but the reasoning actually shows that JR would follow if there are only finitely many of them, and this question remains open.
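The RAND runs themselves are long gone; purely as an illustration of the kind of tables mentioned above (my own sketch, with the choice of d = 2 and of the fundamental solution (3, 2) entirely arbitrary), the following prints Fibonacci numbers and solutions of the Pell equation x² − d·y² = 1:

```python
# Illustration only (not the original RAND computations): small tables of
# Fibonacci numbers and of solutions of the Pell equation x^2 - d*y^2 = 1.

def fibonacci(n):
    """Return the first n Fibonacci numbers."""
    fibs, a, b = [], 0, 1
    for _ in range(n):
        fibs.append(a)
        a, b = b, a + b
    return fibs

def pell_solutions(d, fundamental, n):
    """Return the first n solutions of x^2 - d*y^2 = 1 built from a fundamental solution."""
    x1, y1 = fundamental
    x, y, sols = x1, y1, []
    for _ in range(n):
        assert x * x - d * y * y == 1                  # sanity check on each entry
        sols.append((x, y))
        x, y = x1 * x + d * y1 * y, x1 * y + y1 * x    # standard Pell recurrence
    return sols

print(fibonacci(10))                   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(pell_solutions(2, (3, 2), 4))    # [(3, 2), (17, 12), (99, 70), (577, 408)]
```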

In the academic year 1968-1969 I finally had a sabbatical leave. I would have been due for one at Yeshiva, and as part of my negotiation with NYU, I secured this leave. I spent the year in London loosely attached to Westfield College of the University of London where I taught a “postgraduate” course on Hilbert’s tenth problem. I continued efforts to prove JR (and thereby settle Hilbert’s tenth problem). I found myself working on sums of squares in quadratic rings, but I didn’t make much progress. Meanwhile “Swinging London” was in full bloom with the mood of the “sixties” very much in evidence. Although not quite swept away by the mood, I did not entirely escape its influence.

While I was in London, Jack Schwartz, a friend from my City College days and a colleague at NYU, was working to found a computer science department in the Courant Institute. I was pressed by the Courant Institute to become part of the new department. I accepted, but not with alacrity. Among other issues, it meant abandoning my logic group in the Bronx. In fact, Fred Ficken, the amiable Chair at the Bronx campus, was about to retire, and the applied mathematician Joe Keller had agreed to take on this role with the intention of making the Bronx campus a bastion of applied mathematics. So my group didn’t have much future. What neither Joe nor I knew was that the entire Bronx campus would be shut down a few years later because NYU found itself in a financial crunch.

I had been back in New York only a few months when I received an exciting phone call from Jack Schwartz. A young Russian had used the Fibonacci numbers to prove JR! Hilbert’s tenth problem was finally settled! I had been half-jokingly predicting that JR would be proved by a clever young Russian, and, lo and behold, he had appeared. (I met the 22-year-old Yuri Matiyasevich in person that summer at the International Congress in Nice.) After getting the news I quickly phoned Julia, and about a week later, I received from her John McCarthy’s notes on a lecture he had just heard in Novosibirsk on the proof. It was great fun to work out the details of Yuri’s lovely proof from the brief outline I had. I saw that the properties of the Fibonacci numbers that Yuri had used in his proof had analogues for the Pell equation solutions with which Julia had worked, and I enjoyed recasting the proof in those terms. I also wrote a short paper in which I derived some consequences of the new result in connection with the number of solutions of a Diophantine equation.34

To make the proof of the unsolvability of Hilbert’s tenth problem widely accessible, I wrote a survey article for the American Mathematical Monthly which was later reprinted as an appendix to the Dover edition of Computability & Unsolvability [11]. In addition, I collaborated with Reuben Hersh on a popular article on the subject for the Scientific American [26]. Suddenly awards were showered on me. For the Monthly article I received the Leroy P. Steele prize from the American Mathematical Society and the Lester R. Ford prize from the Mathematical Association of America, and for the Scientific American article Reuben Hersh and I shared the Chauvenet prize, also from the Mathematical Association of America. I was also invited by the Association to give the Hedrick lectures for 1976.

In May 1974, the Society sponsored a symposium on mathematical problems arising from the Hilbert problems, and of course, Yuri was invited to speak on the tenth. But he was unable to get permission from the Soviet authorities to come. So, Julia was invited instead, and she agreed on condition that I be invited as well to introduce her. When it came to writing a paper for the proceedings of the symposium, we agreed that it should be by the three of us. Yuri’s contribution faced the bureaucratic obstacle that any of his draft documents had to be approved before being sent abroad.

But there was no such problem with letters. So he would send letters to Julia on his parts of the article generally beginning,

Dear Julia,

Today I would like to write about ....

One of his topics was “Famous Problems”. The idea was that the same techniques that had been used to show that there is no algorithm for the tenth problem could also be used to show that various well-known problems were equivalent to the non-existence of solutions to certain Diophantine equations. One of Yuri’s letters did this for the famous Riemann Hypothesis, an assertion about the complex zeros of the function ζ(s) = Σ_{n=1}^∞ n^{-s}, which remains unproved although it has important implications for the theory of prime numbers. The Hypothesis can be expressed in terms of the values of certain contour integrals, and Yuri’s technique was to approximate these integrals by sums in a straightforward way. It was done in a very workman-like manner, but it seemed very inelegant to me. I went to my colleague Harold Shapiro, an expert in analytic number theory, and he told me what to do. Julia was so pleased by the result that she sent me a note I kept on my bulletin board for years saying “I like your reduction of RH immensely”.35

My joint appointment in mathematics and the new Computer Science Department defined my new situation at Courant when I returned from London. I had been flirting with computers and computer science for years. But now I had come out of the closet and identified myself as a computer scientist. The new Computer Science Department was developing its own culture, and clashes with the mathematicians at Courant were not infrequent. It didn’t help that among the mathematicians were some outstanding scientists who thoroughly outclassed our young department. To begin with, our graduate students took the same exams as the mathematics students, and hiring was done by the same committee that hired mathematicians. The evolution towards autonomy was slow and painful. In the spring of 1973, two applied mathematicians and I constituted a hiring committee for computer science. The mathematicians were both heavily involved with scientific computing, but neither had any real appreciation or understanding of computer science as an autonomous discipline. I remember all too well a particularly tense lunch meeting on a Friday in June of that year in which possible appointments were discussed in an atmosphere I did not find friendly. When I left the meeting I became aware of a sensation like a brick placed on my chest. I continued to experience this disagreeable sensation through the weekend and finally entered the hospital, where a myocardial infarction was diagnosed. Although in retrospect I had done plenty to bring this about by poor diet and lack of exercise, I have always thought that that disagreeable meeting played a precipitating role. Before my heart attack, I had been an enthusiastic New Yorker even when in exile; but now I began to yearn to live someplace where I could have a rich professional life without the tension that I found in everyday life in New York.

Over the years I continued to have doctoral students from both the Mathematics Department and the Computer Science Department. I certainly taught hard-core computer science courses: beginning programming and data structures, and, on the graduate level, theory of computation, logic programming, and artificial intelligence. In the early years I could also count on teaching mathematical logic, set theory, and even nonstandard analysis. Unfortunately for me this flexibility gradually vanished, a fact that contributed to my decision to retire from NYU in 1996. But this is getting ahead of the story.

In the 1970s the Italian universities were still not offering Ph.D. programs. Having received their bachelor’s degree, graduates were entitled to use the title “Dottore”. The C.N.R., the agency of the Italian government involved with scientific research, concerned to do something about the inadequate training young Italian mathematicians received, set up summer programs in which they were exposed to graduate level courses. I was invited to give such a course on Computability in the lovely town of Perugia during the summer of 1975. This was the beginning of a connection between Italian computer science and the Courant Institute. A number of my students from that course became graduate students at NYU. Two in particular, Alfredo Ferro and Eugenio Omodeo, obtained Ph.D.’s in computer science from NYU. Ferro went back to his home town of Catania in Sicily, where he started a computer science program at the university there and sent his own students back to NYU. The relationship continues to be active.

As an undergraduate I had tried briefly to rehabilitate Leibniz’s use of infinitesimal quantities as a foundation for calculus. It was easy enough to construct algebraic structures containing the real numbers as well as infinitesimals; the problem that baffled me was how to define the elementary functions such as sine and log on such structures. It was therefore with great excitement and pleasure that I heard Abraham Robinson’s address before the Association for Symbolic Logic towards the end of 1961 in which he provided an elegant solution to this problem using techniques that he dubbed nonstandard analysis. Some years later together with Melvin Hausner, my roommate at Princeton and now a colleague, I started an informal seminar on the subject. We had available Robinson’s treatise and some rather elegant lecture notes by Machover and Hirschfeld. Hausner was inspired to apply the technique to prove the existence of Haar measure. Reuben Hersh and I wrote a popular article on nonstandard analysis, also for the Scientific American.[30] Nonstandard analysis really tickled my fancy. As I wrote in the flush of enthusiasm:
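A one-line gloss, in modern notation and entirely my own rather than Robinson's 1961 formulation, of how his framework answers the question that had baffled me: the transfer principle automatically extends every real function (sine and log included) to the hyperreals, and differentiation becomes the taking of standard parts of infinitesimal difference quotients.

```latex
% Gloss (mine): f^* is the nonstandard extension of f supplied by transfer,
% st(.) is the standard part, and epsilon ranges over nonzero infinitesimals.
\[
  f'(x) \;=\; \operatorname{st}\!\left( \frac{f^{*}(x+\varepsilon) - f(x)}{\varepsilon} \right),
  \qquad \varepsilon \neq 0 \text{ infinitesimal,}
\]
% with differentiability amounting to this standard part being the same for every such epsilon.
```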

It is a great historical irony that the very methods of mathematical logic that developed (at least in part) out of the drive toward absolute rigor in analysis have provided what is necessary to justify the once disreputable method of infinitesimals. Perhaps indeed, enthusiasm for nonstandard methods is not unrelated to the well-known pleasures of the illicit. But far more, this enthusiasm is the result of the mathematical simplicity, elegance, and beauty of these methods and their far-reaching application.

I taught nonstandard analysis at Courant and benefited from class notes prepared by my student Barry Jacobs. In the summer of 1971, I taught it again at the University of British Columbia. Finally I wrote a book (the quotation above is from the preface).37

For the academic year 1976-77, I was able to go on sabbatical leave. I had spent two summers in Berkeley and was eager to try a whole year. John McCarthy (who had been a fellow student at Princeton) hired me to work for the month of July at his Artificial Intelligence Laboratory at Stanford University. I loved the atmosphere of play that John had fostered. The terminals that were everywhere proclaimed “Take me, I’m yours”, when not in use. I was encouraged to work with the FOL proof checker recently developed by Richard Weyhrauch. Using this system, I developed a complete formal proof of the pigeon-hole principle from axioms for set theory. I found it neat to be able to sit at a keyboard and actually develop a complete formal proof, but I was irritated by the need to pass through many painstaking tiny steps to justify inferences that were quite obvious. FOL formalized a “natural deduction” version of First Order Logic. The standard paradigm for carrying out inferences was to strip quantifiers, apply propositional calculus, and replace quantifiers. I realized that from the viewpoint of Herbrand proofs, each of these mini-deductions could be carried out using no more than one substitution instance of each clause. I decided that this very possibility provided a reasonable characterization of what it means for an inference to be obvious. Using the LISP source code for the linked-conjunct theorem prover that had been developed at Bell Labs, a Stanford undergraduate successfully implemented an “obvious” facility as an add-on to FOL. I found that having this facility available cut the length of my proof of the pigeon-hole principle by a factor of 10. This work was described at the Seventh Joint International Congress on Artificial Intelligence held in Vancouver in 1981 and reported in the Proceedings of that conference [15].

During the 1976-77 academic year, it was a great pleasure to be able to interact with the Berkeley logic group and especially with Julia Robinson. We worked on the analogue of Hilbert’s tenth problem for strings under concatenation, but didn’t make much progress. It had at one time been thought that proving this problem unsolvable would be the way to obtain the desired unsolvability result for the Diophantine problem. Julia guessed that the string problem was actually decidable, and she turned out to be right as we learned when we got word of Makanin’s positive solution of the problem. At Berkeley that year, I taught two trimester courses, an undergraduate computability theory course for Computer Science and a graduate course in nonstandard analysis for Mathematics. For the nonstandard analysis course, I was able to use my newly published book as a text. It was a class of about thirty students, and a little intimidating. It was clear to me that among these Berkeley educated students were a number who were far better versed in model theory (the underlying basis for nonstandard analysis) than I.

Ever since my days with Hilary Putnam, I have had a continuing interest in the foundations of quantum mechanics. A preprint I received from the logician Gaisi Takeuti caught my attention as having important ramifications for quantum theory.

This paper explored Boolean-valued models of set theory using algebras of projections on a Hilbert space. Boolean-valued models (in which the “truth value” of a sentence can be any element of a given complete Boolean algebra, rather than being restricted to the usual two element algebra consisting of {true, false}), had been studied as an alternative way to view Paul Cohen’s forcing technique for obtaining independence results in set theory. What Takeuti found was that the real numbers of his models were in effect just the self-adjoint operators on the underlying Hilbert space. Since a key element in “quantizing” a classical theory is the representation of “observables” by such operators, I felt that the connection was surely no coincidence. I wrote a short paper about the application of Takeuti’s mathematics to quantum mechanics, and I was very pleased when it was published in the International Journal of Theoretical Physics [13].

I worked at John McCarthy’s AI lab again, and this time John asked me to think about some questions involving so-called non-monotonic reasoning. I wrote a pair of short notes which John later arranged to be combined for publication in Artificial Intelligence [14].

I spent the academic year 1978-79 as a Visiting Professor at the Santa Barbara campus of the University of California. There was some mutual interest in a permanent appointment, but it all faded away as a consequence of wrangling over the status of the campus’s two computer science programs: the one in the Mathematics Department and the one in Electrical Engineering. On my return to New York, I met a new faculty member, Elaine Weyuker, with whom I was to find a number of shared interests. Although trained as a theoretical computer scientist, she had moved into the turbulent field of software testing. Of course all software must be tested before it is released. Often, in practice, this testing phase is ended simply because some deadline is reached or because funding runs out. From an academic point of view, the field invites attention to the problem of finding a more rational basis for the testing process. Elaine and I wrote two papers attempting to provide an explication for the notion of test data adequacy.38

I had been teaching theory of computation for many years, and had developed lecture notes for some of the topics covered. For a long time I had wanted to produce a book based on my course, but had never found the time or energy to complete the task. Elaine came to the rescue adding the needed critical dose of energy. In addition, she produced lots of exercises, and tested some of the material with undergraduates. The book was published and was sufficiently successful that we were asked to update the book for a second edition. Neither of us being willing to undertake this, we coaxed Ron Sigal, who had written a doctoral dissertation under my supervision, to join the team as a third author largely in charge of the revision [30].

The CADE (Conference on Automated Deduction) meetings were occurring annually, devoted to theoretical and practical aspects of logical deduction by computer. The organizers of the February 1979 CADE meeting in Austin, realizing that that year was the centennial of Frege’s Begriffsschrift, in which the rules of quantificational logic were first presented, thought that it would be appropriate to have a lecture that would place their field in a proper historical context. Their invitation to me to give such a lecture fundamentally changed the direction of my work. I found that trying to trace the path from ideas and concepts developed by logicians, sometimes centuries ago, to their embodiment in software and hardware was endlessly fascinating. Since then I have devoted a great deal of time and energy to these questions. I’ve published a number of articles and a book and also given many lectures with a historical flavor.[31] For 1983-84, when I was again on sabbatical leave, I received support from the Guggenheim Foundation for this work. One key figure whose ideas I tried to promulgate was my old teacher Emil L. Post. I lectured on his work on a number of occasions including one talk at Erlangen in Germany. It was very much a labor of love to edit his collected works.[32]

For the two academic years 1988-90, I was Chair of the Computer Science Department at NYU. I had always felt that I would not be happy in an administrative position, and this experience did nothing to change my mind. I would have been hopelessly swamped without the help of the department’s capable and ultra-conscientious administrative assistant Rosemary Amico. The NYU central administration had been increasingly unhappy with the fact that the Courant Institute as a whole was running an increasing deficit each year. At the same time, the CS department was encouraged to improve its national standing among research-oriented CS departments. The administration was said to be surprised and pleased that our department was rated among the top twenty in the nation, and we were urged to produce a plan showing how we could move up to the top ten. Assuming that the central administration understood that this would require their providing additional resources, the department prepared an ambitious plan calling for expansion in a number of directions. The central administration did not deign to reply.

After my term of office was over, it was time for another sabbatical leave. The fall 1990 semester was spent in Europe. Our first stop was Heidelberg where I lectured at the local IBM facility and at a logic meeting at the university. Next, a series of lectures on Hilbert’s tenth problem at a conference in Cortona in the north of Italy. Then a month visiting Alfredo Ferro at the University of Catania in Sicily. The fall semester was completed with a stay at the University of Patras in Greece sponsored by Paul Spirakis, and we were home in time for Christmas. I had completed an important article the day before our departure from Patras, had printed it, and left a copy on a secretary’s desk with a note asking her to make copies. Our departure was to be by car ferry to Italy scheduled for the following midnight. The next morning I arrived on the campus to discover that students had occupied the building where I’d been working, and were permitting no one to enter. This was dismaying. I had no copy of my article; it was stored in a VAX that I couldn’t access, and the only hard copy was on the secretary’s desk. At this point a faculty member, who had become a friend, appeared and, ascertaining the problem, spoke briefly to one of the students. Evidently a deal was struck. I got out my key to the massive doors locking the computer science section, and the three of us entered. There was the hard copy of my article where I had left it and a copying machine, and I soon had several copies one of which my friend kept to send to the editor in Germany. Meanwhile the student helped himself to the copier to duplicate a handwritten document, doubtless a manifesto. We left and I was permitted to lock up.

Virginia and I took our friend and his wife to dinner that evening. Finally I asked him what he had said to the student that turned the trick. His reply was not at all what I had expected. “I reminded him that he was applying to Courant, and told him that you are the Chairman.” Our ship due to sail at midnight didn’t actually leave before 3 a.m. It turned out that the stabilizers were not functioning, and the voyage to Ancona took a day longer than scheduled with me being seasick most of the way. We drove to Paris in time for our flight back to New York. But our stay in New York was very short. Over the years Virginia had accompanied me on many trips. Now it was my turn to accompany her. Virginia had become an artist with an international reputation. She is particularly adept at researching and mastering traditional textile techniques and using them to make works of art. For 1991 she had been awarded a three month Indo-American Fellowship[33] to study textiles in India. Of course I came along.

Our scheduled departure date was January 15. That was also the date on which President Bush’s ultimatum to Saddam Hussein demanding that his forces leave Kuwait was expiring. Friends urged us to abandon our travel plans at such an uncertain time, but we decided to go ahead. After a delay caused by a bomb scare at Kennedy airport, we arrived in New Delhi to learn that bombs were dropping on Baghdad. Given the chaos just outside airports in India, with throngs insistently offering their services, we were delighted to be met by representatives of the American Institute for India Studies (AIIS) who took us to their guest house. The next morning we found other American fellowship recipients in a state of panic. The U.S. State Department had issued an advisory to the effect that non-essential American personnel leave India at once. Most of the others agreed to postpone their fellowship periods and left. We decided to remain. So we were in India for the entire duration of the Gulf War. In an odd way, the situation was advantageous for us. The lack of tourists meant that it was easy to get reservations and services, and the AIIS guest house was always available. The U.S. embassy, which had been transformed into a virtual fortress, was the target of virtually daily vituperative demonstrations by militant Muslim groups, but we ourselves had no problems.

The textiles Virginia was most eager to study were in the state of Orissa, one of the poorest states in India, just south of Calcutta, and we spent most of our time there. I had a new job: I was Virginia’s camera man. My job was to use the video camera to record textile processes; we accumulated 12 hours of raw footage. There was a week-long tour of some of the small villages of Orissa, where often there were no hotels even minimally acceptable by U.S. standards. In one village, we were put up in the guest house of a cotton spinning factory.

In India the contrasts between the best and the worst are enormous. We saw people lying on the sidewalks of Calcutta waiting to die, and we had lunch with a matriarch whose huge family estate is guarded by a private police force and whose foot was kissed by her servants when she permitted them to take the lunch leftovers home to their families. The best educational and scientific research institutions are first-rate by any standard. On my previous sabbatical, I had spent a month as the guest of the Tata Institute of Fundamental Research (TIFR) in Bombay, and I was able to visit them again briefly this time. In addition I lectured at the Indian Institute of Technology (IIT) in New Delhi, an outstanding school whose entrance examinations in mathematics are quite formidable. (At IIT and TIFR I was able to collect my email.) But I also lectured at colleges, allegedly institutions of higher learning, that were sadly weak.

On our way back to New York from India, we stopped in Europe. I spent a week at the University of Udine as the guest of Professor Franco Parlamento, who had been a student in my Computability course in Perugia two decades earlier. Then we went to the wonderful mathematical research institute at Oberwolfach in Germany, an institute that started its successful life as an effort by German mathematicians to save their talented young people from becoming cannon fodder during the Second World War. There are week-long conferences throughout the year on a great variety of mathematical subjects. On this occasion, the conference was on automatic theorem proving, organized by Woody Bledsoe and Michael Richter, and a follow-up to a similar meeting 15 years earlier.

Back in New York, and back to teaching, I was approaching that sixty-fourth birthday the Beatles had sung about, and beginning to wonder how I wanted to spend the rest of my life. The things that really interested me seemed to be of less and less importance to my colleagues. I had my very own course called Arithmetic Undecidability; in a whirlwind semester I covered the elements of first order logic through the Godel completeness theorem, Hilbert’s tenth problem, and the essential undecidability of Peano arithmetic. I taught it for the last time in the spring 1993 semester, and was rebuffed in my request to teach it again. I taught the introductory programming course for computer science majors, and indeed supervised the sections taught by others, for three successive years. I love to program, and at first, I enjoyed these courses. But after a while, I did ask myself: do I really want to be teaching Pascal to classes of 60 students, not all of whom are especially receptive, at this stage of my life? A triple coronary bypass operation in January 1994 brought matters to a head. The operation was very successful, but it certainly forced me to face my mortality. In short I decided to investigate retirement possibilities. May 17, 1996 was “Martin Davis Day” at the Courant Institute. Organized by my old friends Jack Schwartz and Ricky Pollack, it featured eight speakers: two from Italy (my student Eugenio Omodeo, a Perugia veteran, and Mimmo Cantone, one of Alfredo Ferro’s proteges), my first two students Bob Di Paola and Don Loveland, Hilary Putnam, Elaine Weyuker, Ron Sigal, and my college chum John Stachel.

My study is in a house in the Berkeley hills, and I am enjoying the dazzling reflection of the late afternoon sun in San Francisco Bay. My older son, his wife and their four children are here in Berkeley. Virginia and I still go dancing on Tuesdays. And as Virginia likes to say, retiring gave me time to work. I have been a “Visiting Scholar” at the university here, where Alfred Tarski had developed a world-class logic group.

Although I had been involved with Turing’s abstract machines in my research and teaching, and considered myself a computer scientist as well as a mathematician, I was slow to appreciate Alan Turing’s pioneering role in both the theoretical and the practical side of computer science. As I came to understand that the circuits of a computer “embody the distilled insights of a remarkable collection of logicians, developed over centuries”, I became eager to bring that point of view to the attention of the public. Publication of Turing’s ACE report (in which he had presented the design of a computer proposed to be built at the British National Physical Laboratory) and of his 1947 address to the London Mathematical Society (in which he explicitly stressed the connection of general purpose digital computers to his abstract machines), both of which had languished in obscurity, made it possible to see Turing’s vision for the farsighted anticipation that it was. My essay [19] was an attempt to explain the importance of Turing’s role as a computer pioneer as well as the extent to which his work leaned on the accomplishments of generations of logicians. With my retirement I could devote myself to the project of expanding this essay into a book for the general educated public. My “Universal Computer” [23, 24] was published in 2000. In it I was particularly eager to emphasize the importance of ideas being pursued for their own sake without necessarily expecting the immediate practical payoff that nowadays is generally sought.

In the early years of the new millennium, I found myself devoting considerable effort to debunking foolish claims by scholars who surely should have known better. With the energy and self-assurance that had been shown by designers of perpetual motion machines, a computer scientist, a philosopher, and a physicist, each claimed to have found a way to circumvent the limitations on what can be computed that emerged from the work of Church, Kleene, Post, and Turing in the 1930s. This took on the aspect of a movement called hypercomputation. I spoke at meetings and wrote a number of articles. Here is my abstract for a talk I gave at a special session on hypercomputation at a meeting of the American Mathematical Society in San Francisco in 2003:

Hava Siegelmann claims that her neural nets go “beyond the Turing limit”. Jack Copeland proposes to harness Turing’s notion of “oracle” for a similar purpose. Tien D. Kieu proposes a quantum mechanical algorithm for Hilbert’s 10th problem, known to be unsolvable. It will be shown that in each case the results depend on the presumed physical availability of infinite precision real numbers. In the first two examples, uncomputable outputs are obtained only by slipping uncomputable inputs into the formalisms. Kieu’s method depends on a physically unrealizable form of adiabatic cooling.

2002 would have been Turing’s 90th birth year, and I spoke at a conference in his honor in Lausanne. Jack Copeland (one of the three hypercomputationalists just mentioned) and Andrew Hodges (Turing’s wonderful biographer) were also speakers. I spoke before and Andrew after Jack Copeland. Copeland did not come off unscathed.

A decade later came Turing’s centenary, and there were celebratory conferences devoted to Turing all over the world. I spoke at a math meeting in San Francisco in a panel with Andrew Hodges, and at meetings in Boston and Florida. In addition I spoke in England, Italy, and Peru. I was at a banquet at King’s College, Cambridge, where Turing’s nephew spoke. I crossed the Atlantic six times, three going and three returning.

I have always thought of myself as rather lazy and very lucky. I’ve had a wonderful marriage—Virginia and I recently celebrated our 63rd anniversary, and we have six grandchildren. Although I had a heart attack when I was only 45, I’m still alive at 86 and well enough that Virginia and I go dancing once a week. My conjecture that anything computable has a Diophantine definition, strongly doubted by most experts, turned out to be correct. And amazingly, there is this book being edited by two of my former students with contributions by a remarkable group of scholars [34].

References

  • 1. Davis, M. (1956). A note on universal Turing machines. In C. E. Shannon & J. McCarthy (Eds.), Automata studies, Annals of Mathematics Studies (pp. 167-175). Princeton University Press.
  • 2. Davis, M. (1957). The definition of universal Turing machine. Proceedings of the American Mathematical Society, 8, 1125-1126.
  • 3. Davis, M. (1958). Computability and unsolvability. New York: McGraw-Hill. (Reprinted with an additional appendix, Dover 1983)
  • 4. Davis, M. (1963). Eliminating the irrelevant from mechanical proofs. Proceedings of Symposia in Applied Mathematics, 15, 15-30. (Reprinted from [43], pp. 315-330).
  • 5. Davis, M. (1963). Extensions and corollaries of recent work on Hilbert’s tenth problem. Illinois Journal of Mathematics, 7, 246-250.
  • 6. Davis, M. (Ed.). (1965). The undecidable. Raven Press. (Reprinted by Dover, 2004).
  • 7. Davis, M. (1967). First course in functional analysis. Gordon and Breach. (Reprinted by Dover, 2013).
  • 8. Davis, M. (1968). One equation to rule them all. Transactions of the New York Academy of Sciences, Sec., II(30), 766-773.
  • 9. Davis, M. (1971). An explicit Diophantine definition of the exponential function. Communications on Pure and Applied Mathematics, 24, 137-145.
  • 10. Davis, M. (1972). On the number of solutions of Diophantine equations. Proceedings of the American Mathematical Society, 35, 552-554.
  • 11. Davis, M. (1973). Hilbert’s tenth problem is unsolvable. American Mathematical Monthly, 80, 233-269. (Reprinted as an appendix to the Dover edition of [3]).
  • 12. Davis, M. (1977). Applied nonstandard analysis. Interscience-Wiley. (Reprinted by Dover, 2005).
  • 13. Davis, M. (1977). A relativity principle in quantum mechanics. International Journal of Theoretical Physics, 16, 867-874.
  • 14. Davis, M. (1980). The mathematics of non-monotonic reasoning. Artificial Intelligence, 13, 73-80.
  • 15. Davis, M. (1981). Obvious logical inferences. In Proceedings of the Seventh Joint International Congress on Artificial Intelligence (pp. 530-531).
  • 16. Davis, M. (1982). Why Godel didn’t have Church’s thesis. Information and Control, 54, 3-24.
  • 17. Davis, M. (1983). The prehistory and early history of automated deduction. In J. Siekmann & G. Wrightson (Eds.), Automation of reasoning (Vol. 1, pp. 1-28). Springer.
  • 18. Davis, M. (1987). Mathematical logic and the origin of modern computers. Studies in the History of Mathematics (pp. 137-165). Mathematical Association of America. (Reprinted in R. Herken (Ed.), The universal Turing machine—a half-century survey (pp. 149-174), 1988, Hamburg, Berlin: Verlag Kemmerer & Unverzagt, Oxford University Press.)

  • 19. Davis, M. (1988). Influences of mathematical logic on computer science. In R. Herken (Ed.), The universal Turing machine-a half-century survey (pp. 315-326). Hamburg, Berlin: Verlag Kemmerer & Unverzagt, Oxford University Press.
  • 20. Davis, M. (1989). Emil Post’s contributions to computer science. In Proceedings of the Fourth Annual Symposium on Logic in Computer Science (pp. 134-136). Washington, D.C.: IEEE Computer Society Press.
  • 21. Davis, M. (1995). American logic in the 1920s. Bulletin of Symbolic Logic, 1, 273-278.
  • 22. Davis, M. (1999). From logic to computer science and back. In C. S. Calude (Ed.), People and ideas in theoretical computer science (pp. 53-85). Springer.
  • 23. Davis, M. (2000). The universal computer: The road from Leibniz to Turing. W.W. Norton. Turing Centenary Edition, CRC Press, Taylor & Francis 2012.
  • 24. Davis, M. (2001). Engines of logic: Mathematicians and the origin of the computer. W.W. Norton. [Paperback edition of The Universal Computer]
  • 25. Davis, M., & Hersh, R. (1972). Nonstandard analysis. Scientific American, 226, 78-86.
  • 26. Davis, M., & Hersh, R. (1973). Hilbert’s tenth problem. Scientific American, 229, 84-91. (Reprinted in J.C. Abbott (Ed.), The Chauvenet papers, 2, 555-571. Math. Assoc. America, 1978).
  • 27. Davis, M., & Putnam, H. (1958). Reductions of Hilbert’s tenth problem. Journal of Symbolic Logic, 23, 183-187.
  • 28. Davis, M., & Putnam, H. (1960). A computing procedure for quantification theory. Journal of the Association for Computing Machinery, 7, 201-215. (Reprinted from [43], pp. 125-139).
  • 29. Davis, M., & Putnam, H. (1963). Diophantine sets over polynomial rings. Illinois Journal of Mathematics, 7, 251-255.
  • 30. Davis, M., & Weyuker, E. J. (1983). Computability, complexity, and languages. Academic Press. (Second edition, with Ron Sigal as third author, 1994).
  • 31. Davis, M., & Weyuker, E. J. (1983). A formal notion of program-based test data adequacy. Information and Control, 56, 52-71.
  • 32. Davis, M., & Weyuker, E. J. (1988). Metric space based test data adequacy criteria. The Computer Journal, 31, 17-24.
  • 33. Davis, M., Putnam, H., & Robinson, J. (1961). The decision problem for exponential Diophantine equations. Annals of Mathematics, 74, 425-436.
  • 34. Davis, M., Logemann, G., & Loveland, D. (1962). A machine program for theorem proving. Communications of the Association for Computing Machinery, 5, 394-397. (Reprinted from [43], pp. 267-270).
  • 35. Davis, M., Matijasevic, Y., & Robinson, J. (1976). Hilbert’s tenth problem: Diophantine equations: positive aspects of a negative solution. Proceedings of Symposia in Pure Mathematics, 28, 323-378.
  • 36. Green, B., & Tao, T. (2008). The primes contain arbitrarily long arithmetic progressions. Annals of Mathematics, 167, 481-547.
  • 37. Kleene, S. C. (1943). Recursive predicates and quantifiers. Transactions of the American Mathematical Society, 53, 41-73. (Reprinted from [6], pp. 254-287).
  • 38. Post, E. L. (1994). Solvability, provability, definability: The collected works of Emil L. Post (M. Davis, Ed., with a biographical introduction). Boston, Basel and Berlin: Birkhauser.
  • 39. Prawitz, D. (1960). An improved proof procedure. (Reprinted from [43] with an additional comment by the author, pp. 162-201).
  • 40. Reid, C. (1996). Julia: A life in mathematics. Washington, D.C.: Mathematical Association of America.
  • 41. Robinson, J. A. (1963). Theorem proving on the computer. Journal of the Association for Computing Machinery, 10. (Reprinted from [43], pp. 372-383).
  • 42. Robinson, J. A. (1965). A machine-oriented logic based on the resolution principle. Journal of the Association for Computing Machinery, 12. (Reprinted from [43], pp. 397-415).
  • 43. Siekmann, J., & Wrightson, G. (Eds.). (1983). Automation of reasoning 1: Classical papers on computational logic 1957-1966. Berlin: Springer.

  • [1] That is, the language using the symbols ¬, ⊃, ∨, ∧, ∃, ∀, = of elementary logic together with the symbols 0, 1, +, × of arithmetic.
  • [2] The jump of a set A of natural numbers may be understood as the set of (numerical codes of) those Turing machines that will eventually halt when starting with a blank tape and able to obtain answers to any question of the form “n ∈ A?”.
  • [3] For example, the “Pell” equation (x + 1)² − d(y + 1)² = 1 has natural number solutions in x, y just in case d belongs to the set consisting of 0 and all positive integers that are not perfect squares; hence that latter set is Diophantine.
  • [4] A set of natural numbers is r.e. if it is the set of inputs to some given Turing machine for which that machine eventually halts.
  • [5] It furnishes no example of a Diophantine set whose complement is not Diophantine.
  • [6] Actually, Kleene’s system S3.
  • [7] Actually, without any bound on the ordinal, all the sets in the hierarchy are representable with only one second order function quantifier.
  • [8] Clifford Spector showed that the result remains true for all constructive ordinals in his dissertation, written a few years later under Kleene’s supervision.
  • [9] {(a, b, c) | c = a^b}.

    I would like to say that I expressed my pleasure at finding another Hilbert’s tenth problem enthusiast. However, in Julia’s sister Constance Reid’s memoir, The Autobiography of Julia Robinson ([40], p. 61), ...

  • [10] Actually, indices of r.e. sets.
  • [11] [1, 2].
  • [12] The report was reprinted in [43], pp. 41-48.
  • [13] The existence of “minimal” degrees. Only 31 years old, Clifford Spector died quite suddenly in 1961 of acute leukemia, a tragic loss.
  • [14] [7].
  • [15] The translator for the Italian version called it a “libro classico”.
  • [16] A review of the Dover edition by David Harel referred to the book as one of the few “classics” in computer science.
  • [17] (∀k)
  • [18] [27].
  • [19] Abraham Robinson had proposed similar investigations in a talk at the Cornell Institute. I attended that talk, but Hilary didn’t. Somehow, I didn’t connect the two ideas until years later when I noticed Robinson’s paper in the proceedings of the Institute.
  • [20] [40] p. 93.
  • [21] What has become known as the satisfiability problem.
  • [22] These are: 1. The one-literal rule, also known as the unit rule. 2. The affirmative-negative rule, also known as the pure literal rule. 3. The rule for eliminating atomic formulas. 4. The splitting rule, called in the report the rule of case analysis. The procedure proposed in our later published paper used rules 1, 2, and 3. The computer program written by Logemann and Loveland discussed below used 1, 2, and 4. The first of these is the “Davis-Putnam procedure” which has been the subject of theoretical analysis, nowadays referred to as DP. The second choice, the one generally implemented, is usually called DPLL to refer to
  • [23] (Footnote 22 continued)
  • [24] Among other matters, we needed to find an exponential Diophantine definition for the relation: We didn’t go about it in the easiest way. We used the fact that
  • [25] See [36].
  • [26] AFOSR TR59-124.
  • [27] Here if l = ¬R(c1, c2, ..., cn) then it is understood that by ¬l the literal R(c1, c2, ..., cn) is meant.
  • [28] It was J.A. Robinson’s key insight that a theorem-proving procedure for first order logic couldbe based on not merely finding complementary literals by unification, but also then eliminatingthem—what he called “resolution”—that revolutionized the field [42]. Anyone interested in tracingthe history might notice that whereas Robinson’s [42] does not refer to my [4], his earlier [41] does
  • [29] and in its content clearly shows its influence.
  • [30] Actually, as I remember it, we worked on that article and the one on Hilbert’s tenth problem, for which we received the Chauvenet prize, pretty much at the same time. The one on nonstandard analysis appeared in 1972 [25], a year before the prize-winning article.
  • [31] [16-21].
  • [32] [38].
  • [33] These fellowships are administered by the CIES, the same office that manages Fulbright awards.
 