The post The Phantastic Worlds of M. C. Escher appeared first on SciHi Blog.

On June 17, 1898, Dutch graphic artist **Maurits Cornelis Escher**, better known as M. C. Escher, was born. He is known for his often mathematically inspired woodcuts, lithographs, and mezzotints, which feature impossible constructions, explorations of infinity, architecture, and tessellations.

“The ideas that are basic to [my work] often bear witness to my amazement and wonder at the laws of nature which operate in the world around us. He who wonders discovers that this is in itself a wonder. By keenly confronting the enigmas that surround us, and by considering and analyzing the observations that I had made, I ended up in the domain of mathematics.”

– M. C. Escher, 1959, quoted in the introduction to M.C. Escher: The Graphic Work (1978)

M. C. Escher was born in 1898 in the Princessehof in Leeuwarden as the youngest of five sons of the hydraulic engineer George Arnold Escher. In 1903 the family moved to Arnhem, where the young Escher completed his basic school education. However, he was a rather poor pupil, had to repeat two classes and, despite his talent for drawing, even received bad marks in art. After school, he enrolled at the Haarlem School of Architecture and Decorative Arts, finishing in 1922. During these years he gained considerable experience in drawing as well as in making woodcuts. Very influential on his artistic career was the journey Escher made through Europe in 1922. He visited Italy and Spain, fascinated above all by the countryside and the architecture. The impressions he gathered drew him back to Italy again and again, and he eventually settled there; the country became an important part of his life and work. During his time in Italy, Escher created ‘*Still Life and Street*‘, one of his most famous prints of an impossible reality, inspired by a street in Savona, Italy. The picture was an early example of his playful handling of perspective: the books on the table in the foreground appear to lean against the buildings far down the street. Escher travelled several times through Italy, mostly on foot or on a donkey, as well as through Spain, where he studied Arabic ornamentation in the Alhambra. In 1923 he met the Swiss Jetta Umiker, whom he married in Viareggio in 1924. The couple settled near Rome.

Due to the rise of fascism in Italy, M. C. Escher moved with his family to Switzerland, where he was never really able to settle, always in search of the beautiful landscapes he had seen in Italy. He then moved to Belgium, which he in turn had to leave because of World War II, and finally settled in the Netherlands. As depressing as this period seems, it was also the most productive in Escher’s life. The dark and cloudy weather made him focus on his work, and he completed numerous pictures during this time with almost no break. His teacher Mesquita was abducted by the German occupiers in 1944 and murdered in the Auschwitz concentration camp. Escher was able to save at least a large part of Mesquita’s work.

After the end of the war Escher learned the mezzotint technique, and from 1946 he increasingly turned to perspective pictures (*Up and Down*, 1947). He received more and more well-paid commissions, sold many of his prints, and by 1950 was a sought-after artist in the USA. His great breakthrough in Europe came in September 1954, when the Stedelijk Museum in Amsterdam granted him a solo exhibition on the occasion of a mathematicians’ congress held there at the same time.

Escher’s prints of impossible realities made him famous. Next to *Still Life and Street* stands *Drawing Hands*, published in 1948, in which two hands draw each other. Without any higher mathematical education, M. C. Escher knew how to portray mathematical relationships of shapes and bodies brilliantly and to play with their perspectives. The mathematical dimension of his pictures also evolved during his stay in Italy, where he began to take an interest in orders of shapes and in symmetry. Escher’s brother once sent him a paper by the mathematician George Pólya on plane symmetry groups, which caught Escher’s attention; he decided to focus more closely on the mathematical issues in his works, which is one reason the artist is so well liked by scientists. He spent much time figuring out how to combine the infinity of objects with two-dimensional planes, a problem he began discussing with several mathematicians. By around 1956 Escher’s interests changed again: he took the regular division of the plane to the next level by representing infinity on a fixed two-dimensional plane. Earlier in his career he had used the concept of a closed loop to try to express infinity, as demonstrated in *Horseman*. By 1958 Escher had achieved remarkable fame. He continued to give lectures and to correspond with people who were eager to learn from him. He had given his first important exhibition of his works and had also been featured in *Time* magazine. Escher received numerous awards over his career, including the Knighthood of the Order of Orange-Nassau (1955), and was regularly commissioned to design art for dignitaries around the world.[8]

His best-known works, which almost earned Escher the status of a pop star, deal with the representation of perspective impossibilities, optical illusions and multistable perceptual phenomena. One sees objects or buildings that seem natural at first glance, but are completely contradictory at second glance. In his works, Escher also devoted himself to themes such as Möbius bands, crystal shapes, reflections, optical distortions, fractals and approaches to infinity.

Through his works, Escher opened up entirely new forms of artistic creation and took the mathematical component of art to a whole new level. At yovisto, you may enjoy the video lecture *Gödel, Escher, Bach*, part of a whole series by Justin Curry at MIT, bringing together the title characters’ interests in art, mathematics, logic, physics and numerous other scientific fields.

**References and Further Reading: **

- [1] Official Escher Website
- [2] Artful Mathematics: The Heritage of M. C. Escher
- [3] Escher at NGA
- [4] “Math and the Art of M.C. Escher”. USA: SLU.
- [5] Schattschneider, Doris (June–July 2010). “The Mathematical Side of M. C. Escher” (PDF). *Notices of the American Mathematical Society* **57**(6): 706–18.
- [6] Escher-Museum, Den Haag
- [7] Escher for Real, Israel Institute of Technology
- [8] John J. O’Connor, Edmund F. Robertson: *Maurits Cornelius Escher.* In: *MacTutor History of Mathematics archive*
- [9] M.C. Escher at Wikidata
- [10] Timeline for M.C. Escher, via Wikidata


The post The Beautiful Mind of John Forbes Nash appeared first on SciHi Blog.

On June 13, 1928, American mathematician **John Forbes Nash Jr.** was born. Nash made fundamental contributions to game theory, differential geometry, and the study of partial differential equations. His work has provided insight into the factors that govern chance and decision-making inside complex systems found in everyday life. John Nash is the only person to be awarded both the Nobel Memorial Prize in Economic Sciences and the Abel Prize.

“You don’t have to be a mathematician to have a feel for numbers.”

– John Forbes Nash Jr.

Nash was born on June 13, 1928, in Bluefield, West Virginia. His father, John Forbes Nash, was an electrical engineer for the Appalachian Electric Power Company. His mother, Margaret Virginia (née Martin) Nash, had been a schoolteacher before she was married. From 1945 to 1948 Nash studied at the Carnegie Institute of Technology in Pittsburgh, where he received both his Bachelor’s and Master’s degrees in 1948. Originally he wanted to become an engineer like his father, but developed a great passion for mathematics. Already in Pittsburgh his interest began in game theory, whose central questions John von Neumann and Oskar Morgenstern had left open in their 1944 book *Theory of Games and Economic Behavior*.[1, 2]

Nash received his doctorate from Princeton University in 1950 under the mathematician Albert W. Tucker. The thesis, entitled *Non-cooperative Games*, extended the game theory of Morgenstern and von Neumann by the so-called Nash equilibrium. Nash proved that such an equilibrium – unlike von Neumann’s minimax solutions – also exists for non-zero-sum games and for more than two players. A Nash equilibrium is a combination of strategies in a non-cooperative game in which no player can benefit by unilaterally deviating from his chosen strategy. In a Nash equilibrium, every player is content with his choice in retrospect and would make it again in the same situation; the players’ strategies are mutually best responses. The Nash equilibrium is an elementary solution concept of game theory.
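The defining best-response property can be checked mechanically. A minimal sketch, using hypothetical Prisoner’s Dilemma payoff values: a strategy pair is a pure-strategy Nash equilibrium exactly when neither player can improve his own payoff by unilaterally deviating.

```python
# payoffs[r][c] = (payoff to row player A, payoff to column player B);
# strategies: 0 = cooperate, 1 = defect (illustrative Prisoner's Dilemma values)
payoffs = [
    [(-1, -1), (-3, 0)],
    [(0, -3), (-2, -2)],
]

def pure_nash_equilibria(payoffs):
    """Return all strategy pairs (r, c) from which no player can gain
    by unilaterally deviating -- the defining property of a Nash equilibrium."""
    n_rows, n_cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for r in range(n_rows):
        for c in range(n_cols):
            a, b = payoffs[r][c]
            # r must be a best response to c for player A ...
            best_a = all(payoffs[r2][c][0] <= a for r2 in range(n_rows))
            # ... and c a best response to r for player B.
            best_b = all(payoffs[r][c2][1] <= b for c2 in range(n_cols))
            if best_a and best_b:
                equilibria.append((r, c))
    return equilibria

print(pure_nash_equilibria(payoffs))  # [(1, 1)] -- mutual defection
```

Note how the equilibrium (mutual defection) is worse for both players than mutual cooperation; this tension is exactly why the concept became so influential in economics.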

The importance of this work from 1950 was only recognised later, in connection with the further development of game theory, and earned him the Nobel Prize for Economics in 1994. Von Neumann himself was not very impressed at the time at a meeting with Nash; he considered the result trivial and mentioned it only in passing in the introduction to the 1953 edition of his book on game theory with Morgenstern. Nash himself also regarded the work more as a by-product in comparison with his later works.

In 1952 his work on real algebraic manifolds appeared, which he himself regarded as his most perfect work. The idea was to approximate every manifold by an algebraic variety (much easier to handle and describable by polynomials), if necessary by passing to spaces of much higher dimension. In this context, Nash manifolds and Nash functions are named after him. After receiving his doctorate, Nash increasingly turned to analysis, in particular differential geometry and partial differential equations. He proved that every Riemannian manifold can be isometrically embedded in a Euclidean space (the Nash embedding theorem). The question of whether this is possible had already been raised by Bernhard Riemann, and the common opinion in the 1950s was that it was not. Nash’s result came unexpectedly and had far-reaching consequences. [3]

From 1950 onwards, Nash worked for four years in the summer months at the RAND Corporation on secret research related to applications of game theory to strategic situations in the Cold War. From 1951 to 1953, Nash was a Moore instructor at the Massachusetts Institute of Technology, where he became assistant professor in 1953 and was associate professor from 1957 to 1959. In 1955, he submitted a proposal for an encryption procedure to the National Security Agency, but it was rejected. In 1958 he published (in parallel with Ennio De Giorgi, but independently of him) a solution to the regularity problem for partial differential equations, which David Hilbert had included in 1900 in his well-known list of the most important open problems in mathematics (the 19th problem).[4] The result became known as the De Giorgi–Nash theorem and has far-reaching consequences for the theory of partial differential equations.

At the end of the 1950s Nash was widely recognised as a leading mathematician, which was also reflected in an article in *Fortune* magazine, and he was nominated for the Fields Medal in 1958, particularly for his work on Hilbert’s 19th problem simultaneously with De Giorgi. He came third in the final evaluation behind Klaus Roth and René Thom, who received the Fields Medals in 1958. At MIT he was on the verge of a full professorship when the first signs of Nash’s illness became apparent in 1959. In May 1959 he was diagnosed with paranoid schizophrenia. According to Nash biographer Sylvia Nasar, Nash now increasingly showed anti-Semitic tendencies and was prone to violence. Nash gave up his position at MIT and, after a short stay in hospital, went first to Paris and Geneva in 1959/60, where he saw himself as a citizen of the world and an exile. In 1961, his wife Alicia Lardé and his mother were forced to commit Nash to a mental hospital (Trenton State Hospital). There he was treated with insulin shock therapy, which put him into an artificial coma. He recovered and was able to attend a conference on game theory in 1961.

In 1964 his schizophrenia became so severe that he had to be admitted to a psychiatric clinic (the private Carrier Clinic) for a longer period of time. Over the next 20 years he was repeatedly hospitalized for relapses, with interruptions. Between 1966 and 1996, as a result of his illness, he did not produce any publications. Before that, however, some outstanding works appeared. The 1960s saw the emergence of an idea in the theory of the resolution of singularities in algebraic geometry known as the *Nash blowing-up* (so named by Heisuke Hironaka, to whom Nash communicated the idea orally), as well as some influential papers on partial differential equations. From the 1970s to the 1990s he lived in Princeton, where he was regularly seen on campus. While at first he attracted the students’ attention with the strange messages he left behind, from the early 1990s mathematicians in Princeton (like Peter Sarnak) increasingly noticed that he had regained some of his old problem-solving skills. In his last years, he increasingly turned to monetary theory, arguing for index money.

Nash and his wife died in May 2015 in a traffic accident on the New Jersey Turnpike; they were in a taxi on their way home from receiving the Abel Prize. Neither was wearing a seatbelt, and both were thrown from the vehicle. The 2001 feature film *A Beautiful Mind*, starring Russell Crowe, tells the story of Nash’s genius, illness and recovery; the film won four Oscars in 2002. The script is based on Sylvia Nasar’s 1998 biography of the same name.

**References and Further Reading**:

- [1] John von Neumann – Game Theory and the Digital Computer, SciHi Blog
- [2] Oskar Morgenstern and the Game Theory, SciHi Blog
- [3] Bernhard Riemann’s novel approaches to Geometry, SciHi Blog
- [4] David Hilbert’s 23 Problems, SciHi Blog
- [5] O’Connor, John J.; Robertson, Edmund F., “John Forbes Nash Jr.“, MacTutor History of Mathematics archive, University of St Andrews.
- [6] Home Page of John F. Nash Jr. at Princeton
- [7] Fisher, Len (May 25, 2015). “John Nash obituary”. *The Guardian*.
- [8] Biography of John Forbes Nash Jr. from the Institute for Operations Research and the Management Sciences
- [9] John Forbes Nash Jr at Wikidata
- [10] Timeline of Game Theorists, via DBpedia and Wikidata


The post Please Don’t Ignite the Earth’s Atmosphere… appeared first on SciHi Blog.

When the world‘s first thermonuclear fusion bomb was ignited in 1952, mathematicians and physicists thought it rather unlikely that testing the newly developed device would burn all the nitrogen in the earth‘s atmosphere. However, the possibility could not be excluded completely. Nevertheless, they tested the bomb, and fortunately for all of us nothing of the kind happened. One of the key persons behind the development of the hydrogen bomb was **Stanislaw Ulam**, who together with physicist Edward Teller came up with the first successful design.[1]

On May 13, 1984, Stanislaw Ulam, Polish-American scientist in the fields of mathematics and nuclear physics, passed away. Before developing the hydrogen bomb, he also participated in the Manhattan Project, which produced the very first atomic weapon, based on nuclear fission. After the Second World War scientists believed that sustaining nuclear fusion to power another kind of weapon was rather unlikely. But in January 1951, Ulam came up with the decisive idea: put a fission bomb and fusion fuel together inside a massive casing. When the bomb detonated, the casing would contain the explosion long enough for mechanical shock to heat and compress the fusion fuel, and for fission neutrons to ignite nuclear fusion.

Ulam was born in Lemberg, Galicia (today’s Lviv in Ukraine) on 13 April 1909. He came from a Polish-Jewish middle-class family with members in the banking and wood-processing industries; his father was a lawyer. Among his mathematics teachers was the Polish mathematician Stefan Banach, one of the leading minds of the Lwów School of Mathematics. Ulam obtained his Ph.D. from the Polytechnic Institute in Lviv in 1933. He investigated a problem, originating with Lebesgue in 1902, of finding a measure on [0,1] with certain properties.[6] Banach had solved a related measure problem in 1929, but only by assuming the Generalised Continuum Hypothesis. Ulam, in 1930, strengthened Banach’s result by proving it without the Generalised Continuum Hypothesis.[5]

In 1938, Ulam went to the United States at the invitation of George David Birkhoff as a Harvard Junior Fellow. In 1940 he became assistant professor at the University of Wisconsin. In 1943 he became a US citizen and in the same year his friend John von Neumann invited him to a secret project in New Mexico, the Manhattan Project.[3] Ulam had a number of specialties, including set theory, mathematical logic, functions of real variables, thermonuclear reactions, topology, and the Monte Carlo theory.

I don’t support any weapon technology, and it might be controversial to acknowledge a scientist like Stanislaw Ulam or his colleague Edward Teller, who spent a great deal of their lives developing apocalyptic technology. But what really shocked me was the fact that both could not completely exclude the possibility of igniting the earth’s atmosphere when testing the first thermonuclear fusion bomb, and thus possibly causing the end of the world:

“There remains the distinct possibility that some other less simple mode of burning may maintain itself in the atmosphere… the complexity of the argument and the absence of satisfactory experimental foundations makes further work on the subject highly desirable.”

(Report LA-602, Ignition of the Atmosphere with Nuclear Bombs [8])

Nevertheless, Stanislaw Ulam also left us something more substantial: the Monte Carlo method of computation, a class of computational algorithms that rely on repeated random sampling, often used for computer simulations of mathematical or physical models. Through the use of electronic computers, this method became widespread, finding applications in weapons design, mathematical economy, and operations research.
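A minimal illustration of the Monte Carlo idea (sample size and seed are arbitrary choices here): estimate π by throwing random points into the unit square and counting how many land inside the quarter circle of radius 1, whose area is π/4 of the square’s.

```python
import random

def estimate_pi(n_samples, seed=42):
    """Monte Carlo estimate of pi: fraction of random points in the
    unit square that fall inside the quarter circle, times 4."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples

print(estimate_pi(200_000))  # close to 3.14; error shrinks like 1/sqrt(n)
```

The slow 1/√n convergence is characteristic: Monte Carlo shines not in precision but in problems (high-dimensional integrals, neutron transport) where deterministic methods are hopeless.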

Ulam, with C. J. Everett, also proposed the ‘Orion’ plan for nuclear propulsion of space vehicles. The aim of the Orion project was to develop a nuclear pulse engine to power spaceships. The project ran in the USA from 1957 to 1965. The design envisaged driving a spaceship with a nuclear pulse engine through a series of atomic bomb explosions, each taking place only a few metres behind the stern of the spaceship. Protected by a massive shield and a shock-absorber system, the spaceship “rides” on the shock waves of the explosions. A nuclear pulse engine based on the Orion principle combines a high specific impulse with a high thrust. The project was abandoned in 1965 for political reasons and because of the 1963 treaty banning nuclear tests in the atmosphere, in space and under water.

Ulam remained at Los Alamos until 1965 when he was appointed to the chair of mathematics at the University of Colorado. Stanislaw Ulam died of an apparent heart attack in Santa Fe on 13 May 1984 at age 75.

At yovisto academic video search, you might watch a documentary on ‘*Operation Ivy*‘, the detonation of the world’s first hydrogen bomb back in 1952.

**Related Articles in the Blog: **

- [1] Edward Teller and Stanley Kubrick’s Dr. Strangelove, SciHi Blog
- [2] Stefan Banach and Modern Function Analysis, SciHi Blog
- [3] John von Neumann – Game Theory and the Digital Computer, SciHi Blog
- [4] Stanislaw Marcin Ulam, American Scientist, at Britannica Online
- [5] John J. O’Connor, Edmund F. Robertson: Stanisław Marcin Ulam. In: MacTutor History of Mathematics archive
- [6] Henri Léon Lebesgue and the Theory of Integration, SciHi Blog
- [7] Stanislaw Ulam at Wikidata
- [8] Report LA-602, Ignition of the Atmosphere with Nuclear Bombs
- [9] Timeline with people related to nuclear weapon design, via DBpedia and Wikidata


The post Carl Friedrich Gauss – The Prince of Mathematicians appeared first on SciHi Blog.

On April 30, 1777, German mathematician and physical scientist Carl Friedrich Gauss was born. He contributed significantly to many fields, including number theory, algebra, statistics, analysis, differential geometry, geodesy, geophysics, electrostatics, astronomy and optics. He is often referred to as *Princeps mathematicorum* (Latin, “the Prince of Mathematicians”) as well as “greatest mathematician since antiquity”.

“Mathematics is the Queen of Science, and Arithmetic is the Queen of Mathematics”

– handed down by Wolfgang Sartorius von Waltershausen, Gauss zum Gedächtniss, Verlag von S. Hirzel, Leipzig 1856, p. 79

Carl Friedrich Gauss grew up as an only child. His mother could barely read but was known to be incredibly intelligent. According to anecdote, Gauss could calculate before he was able to speak, and corrected his father’s wage calculations at the age of only three. Whether or not these stories are true, they indicate that Gauss’s talents and his love for complex calculations were recognized very early. At the age of seven he started to attend school, and already during math class he designed formulas to ease his calculations.

There is one famous story about Carl Friedrich Gauss’s boyhood discovery of the “trick” for summing an arithmetic progression. The event occurred when Gauss was seven and attended the Katharina school in Brunswick. The teacher, one Büttner, had set the class the task of calculating the sum 1 + 2 + 3 + … + 100 – probably to get a bit of peace for himself – with instructions that each should place his slate on a table as soon as he had completed the task. Almost immediately Gauss placed his slate on the table, saying, “*There it is.*” The teacher looked at him scornfully while the others worked diligently. When the instructor finally looked at the results, Gauss’s slate was the only one to have the correct answer, 5050, with no further calculation. The young boy had evidently computed mentally the sum of the arithmetic progression 1 + 2 + 3 + … + 99 + 100, presumably through the formula *m(m+1)/2*. His teachers soon called Gauss’s talent to the attention of the Duke of Brunswick [1].
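The trick in code: pairing the terms (1 + 100, 2 + 99, …) gives m/2 pairs each summing to m + 1, hence the closed form *m(m+1)/2*, which we can check against the brute-force sum.

```python
def gauss_sum(m):
    """Sum 1 + 2 + ... + m via the closed form m*(m+1)/2."""
    return m * (m + 1) // 2

# Büttner's task: the formula agrees with adding the 100 terms one by one.
assert gauss_sum(100) == sum(range(1, 101))
print(gauss_sum(100))  # 5050
```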

At the age of 14, Gauss was introduced to Duke Karl Wilhelm Ferdinand von Braunschweig, who sent him to the Collegium Carolinum (now Braunschweig University of Technology), which he attended from 1792 to 1795, and to the University of Göttingen from 1795 to 1798. While at university, Gauss independently rediscovered several important theorems. His breakthrough occurred in 1796 when he showed that a regular polygon can be constructed by compass and straightedge if the number of its sides is the product of distinct Fermat primes and a power of 2. This was a major discovery in an important field of mathematics; construction problems had occupied mathematicians since the days of the Ancient Greeks, and the discovery ultimately led Gauss to choose mathematics instead of philology as a career. Gauss was so pleased with this result that he requested that a regular heptadecagon be inscribed on his tombstone. The stonemason declined, stating that the difficult construction would essentially look like a circle.
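Gauss’s constructibility criterion can be stated as a simple divisibility test. A sketch (restricted to the five known Fermat primes, which covers all reasonable n): a regular n-gon is constructible by compass and straightedge if and only if n is a power of 2 times a product of *distinct* Fermat primes.

```python
# The only known Fermat primes, i.e. primes of the form 2^(2^k) + 1.
KNOWN_FERMAT_PRIMES = [3, 5, 17, 257, 65537]

def is_constructible(n):
    """True iff the regular n-gon is compass-and-straightedge constructible,
    per Gauss's criterion (valid for n below the product of known Fermat primes)."""
    if n < 3:
        return False
    while n % 2 == 0:          # strip the power-of-2 factor
        n //= 2
    for p in KNOWN_FERMAT_PRIMES:
        if n % p == 0:
            n //= p
            if n % p == 0:     # a repeated Fermat prime is not allowed
                return False
    return n == 1

# The heptadecagon (17-gon) is constructible; the 7-gon and 9-gon are not.
print([n for n in range(3, 21) if is_constructible(n)])
# [3, 4, 5, 6, 8, 10, 12, 15, 16, 17, 20]
```

Note that 9 = 3 × 3 fails because the Fermat prime 3 repeats, exactly the case Gauss’s criterion excludes.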

“The enchanting charms of this sublime science reveal themselves in all their beauty only to those who have the courage to go deeply into it.”

— Gauss in a letter to Sophie Germain (30 April 1807)

Gauss returned to Brunswick, where he received a degree in 1799. After the Duke of Brunswick had agreed to continue Gauss’s stipend, he requested that Gauss submit a doctoral dissertation to the University of Helmstedt. Gauss already knew Johann Friedrich Pfaff, who was chosen to be his advisor. Gauss’s dissertation was a discussion of the fundamental theorem of algebra. With his stipend to support him, Gauss did not need to find a job and so devoted himself to research. He published his book *Disquisitiones Arithmeticae*, devoted to number theory, in the summer of 1801.[7] In the following years, Carl Friedrich Gauss was offered several positions at foreign universities, but out of loyalty to the Duke and in the hope of getting his own observatory he stayed in Göttingen, where he had to give lectures. Despite the fact that he did not enjoy teaching, several famous future mathematicians were taught by him, like Richard Dedekind or Bernhard Riemann. [8,9]

Gauss’s contributions to the field of mathematics are numerous. At the age of only 16, he made his first attempts toward non-Euclidean geometry. Two years later, Gauss began researching the properties of the distribution of prime numbers, which later led him to calculate areas under graphs and to the Gaussian bell curve. Independently of Caspar Wessel and Jean-Robert Argand, Gauss found the geometrical interpretation of complex numbers as points in the plane.

In June 1801, Zach, an astronomer whom Gauss had come to know two or three years earlier, published the orbital positions of Ceres, a new “small planet” discovered by the Italian astronomer Giuseppe Piazzi on 1 January 1801.[10] Unfortunately, Piazzi had only been able to observe 9 degrees of its orbit before it disappeared behind the Sun. Zach published several predictions of its position, including one by Gauss which differed greatly from the others. When Ceres was rediscovered by Zach on 7 December 1801, it was almost exactly where Gauss had predicted.[7]

“It may be true, that men, who are mere mathematicians, have certain specific shortcomings, but that is not the fault of mathematics, for it is equally true of every other exclusive occupation. So there are mere philologists, mere jurists, mere soldiers, mere merchants, etc. To such idle talk it might further be added: that whenever a certain exclusive occupation is coupled with specific shortcomings, it is likewise almost certainly divorced from certain other shortcomings.”

– Gauss-Schumacher Briefwechsel (1862)

At the age of 18, he discovered some properties of the prime number distribution and found the method of least squares, which minimizes the sum of the squares of the deviations, without publishing anything about it at first. After Adrien-Marie Legendre published his “*Méthode des moindres carrés*” in a treatise in 1805, while Gauss only published his results in 1809, a priority dispute arose.
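The least-squares principle is easiest to see in the one-variable case. A small sketch (toy data, not Gauss’s orbital computations): fit a line y = a + b·x by minimizing the sum of squared deviations, whose minimum has a closed form obtained by setting the gradient to zero.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x: closed-form normal equations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Setting d/db sum((y - a - b*x)^2) = 0 and d/da ... = 0 yields:
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Noise-free points on y = 2 + 3x are recovered exactly.
a, b = fit_line([0, 1, 2, 3], [2, 5, 8, 11])
print(a, b)  # 2.0 3.0
```

With noisy observations the same formula yields the line minimizing the total squared error, which is exactly the property that made the method so powerful for orbit determination.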

Gauss rejected an appointment to the St. Petersburg Academy of Sciences out of gratitude to his patron, the Duke of Braunschweig, and probably in the hope that the Duke would build him an observatory in Braunschweig. After the Duke’s sudden death following the Battle of Jena and Auerstedt, Gauss became professor at the Georg August University of Göttingen and director of the observatory there in November 1807. There he had to teach courses, but he developed an aversion to them.

“I mean the word proof not in the sense of the lawyers, who set two half proofs equal to a whole one, but in the sense of a mathematician, where ½ proof = 0, and it is demanded for proof that every doubt becomes impossible.”

— Gauss in a letter to Heinrich Wilhelm Matthias Olbers (14 May 1826)

Gauss had been asked in 1818 to carry out a geodesic survey of the state of Hanover to link up with the existing Danish grid. Gauss was pleased to accept and took personal charge of the survey, making measurements during the day and reducing them at night, using his extraordinary mental capacity for calculations. For the survey, Gauss invented the heliotrope, which worked by reflecting the Sun’s rays using a design of mirrors and a small telescope. However, the survey suffered from inaccurate baselines and an unsatisfactory network of triangles.[7]

Gauss began working in the field of astronomy after finishing his famous ‘*Disquisitiones Arithmeticae*‘ and managed to calculate planetary orbits with his method of least squares. He shared his experiences in the work ‘*Theoria motus corporum coelestium in sectionibus conicis solem ambientium*‘. His achievements in this field made Gauss internationally famous, and several of his astronomical methods are still in use today.

Starting in 1831, Gauss worked with Wilhelm Eduard Weber in the field of magnetism. Together they invented the magnetometer and in 1833 connected Gauss’s observatory with the Institute of Physics, exchanging messages via electromagnetically deflected compass needles: the world’s first telegraph connection. With Weber he also developed the CGS system of units, which an international congress in Paris adopted in 1881 as the basis for electrotechnical units.

Gauss worked in many fields, but only published his results when he believed a theory was complete. As a result, he occasionally pointed out to colleagues that he had long since proved this or that result, but had not yet published it because the underlying theory was incomplete or because he lacked the nonchalance needed to publish quickly. Numerous mathematical methods and formulas carry Gauss’s name today, and during his lifetime and beyond he earned the reputation of one of the most brilliant and productive mathematicians of all time.

He was still scientifically active towards the end of his life and gave lectures on the method of least squares in 1850/51. Two of his most important students, Bernhard Riemann (who received his doctorate from Gauss in 1851 and impressed Gauss in 1854 with his habilitation lecture on the foundations of Riemannian geometry) and Richard Dedekind, came only at the end of his career. Gauss suffered in his last years from heart failure (diagnosed as dropsy) and insomnia. In June 1854 he travelled with his daughter Therese to the construction site of the railway line from Hanover to Göttingen; a passing train frightened the horses and overturned the carriage, seriously injuring the coachman, while Gauss and his daughter remained unharmed. Gauss took part in the inauguration of the railway line on 31 July 1854, after which he was increasingly confined to his house by illness. He died on 23 February 1855 in Göttingen in his armchair.

At yovisto academic video search, you may enjoy a video lecture by Prof. Ramamurti Shankar on Gauss’s Law at Yale University.

**References:**

- [1] Boyer, Carl B. 1968, 1991. A History of Mathematics. Second edition. Revised by Uta C. Merzbach; foreword by Isaac Asimov. New York: Wiley. (p. 497)
- [2] Carl Friedrich Gauss Biography
- [3] Carl Friedrich Gauss Info Website
- [4] Carl Friedrich Gauss at Wolfram Research
- [5] Carl Friedrich Gauss at Britannica
- [6] Carl Friedrich Gauss at zbMATH
- [7] O’Connor, John J.; Robertson, Edmund F., “Carl Friedrich Gauss“, MacTutor History of Mathematics archive, University of St Andrews.
- [8] Richard Dedekind and the Real Numbers, SciHi Blog
- [9] Bernhard Riemann’s novel approaches to Geometry, SciHi Blog
- [10] Giuseppe Piazzi and the Dwarf Planet Ceres, SciHi Blog
- [11] Carl Friedrich Gauss at Wikidata
- [12] Carl Friedrich Gauss at the Mathematics Genealogy Project
- [13] Timeline for Carl Friedrich Gauss


The post Walter Pitts and the Mathematical Model of a Neural Network appeared first on SciHi Blog.

On April 23, 1923, American logician Walter Harry Pitts, Jr. was born. Pitts worked in the field of computational neuroscience. He proposed landmark theoretical formulations of neural activity and generative processes that influenced diverse fields such as cognitive sciences and psychology, philosophy, neurosciences, computer science, artificial neural networks, cybernetics and artificial intelligence. Moreover, he proposed the first mathematical model of a neural network. The unit of this model, a simple formalized neuron, is still the standard of reference in the field of neural networks. It is often called a McCulloch–Pitts neuron.

Pitts was considered an eccentric genius and an outsider when he began researching at the University of Chicago as a non-enrolled student, having run away from home at the age of 15. As a teenager he taught himself ancient Greek, Latin, Sanskrit and other languages as well as logic and mathematics. At 12 he read the *Principia Mathematica* by Bertrand Russell [1] and Alfred North Whitehead in the library and wrote a letter to Russell, which impressed Russell so much that he invited the boy to England; Pitts instead attended Russell’s lectures in Chicago in 1938. Pitts also impressed the Chicago professor Rudolf Carnap by handing him an annotated copy of Carnap’s *Logical Structure of the World*.[2] Carnap then spent months trying in vain to find out who he was, as Pitts had not introduced himself, and arranged a subordinate job for him at the university; but Pitts never sought a degree as a student and never received a university degree.

He became an associate of neurophysiologist and cyberneticist Warren McCulloch in Chicago, who took him into his home as he had no fixed accommodation at the time. This led to the classical work on early mathematical neuron models (*A logical calculus of ideas immanent in nervous activity*, 1943) and neural networks. The work influenced, among others, the mathematician and computer pioneer John von Neumann.[3] The research question McCulloch and Pitts wanted to answer was whether the human brain is, at least in theory, able to compute Turing-computable functions. To this end they designed a mathematical model of a single biological neuron, the so-called McCulloch-Pitts neuron. Artificial neural networks consisting of McCulloch-Pitts cells use only binary signals: each individual neuron can generate only 1 or 0 as output. Analogous to biological neuronal networks, inhibitory signals can also be processed. Each McCulloch-Pitts cell has an arbitrary real number as its threshold. McCulloch and Pitts were able to show that Turing-computable programs can be computed by a finite network of such artificial neurons, which made the two of them founding fathers of neuroinformatics.
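The behaviour of a single McCulloch-Pitts cell is simple enough to sketch in a few lines of Python. The sketch below is an illustrative simplification, not the authors’ original 1943 formalism: in particular, inhibitory inputs, which in the original model act as an absolute veto, are approximated here by negative weights.

```python
def mcp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts cell: binary inputs, binary (0/1) output.

    The cell fires iff the weighted sum of its inputs reaches the
    threshold. Inhibition is approximated with negative weights here;
    the original 1943 model treats inhibitory inputs as an absolute veto.
    """
    s = sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= threshold else 0

# Elementary logic gates realized as single cells:
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a: mcp_neuron([a], [-1], threshold=0)

# XOR cannot be computed by one cell, but a two-layer network of cells
# suffices -- a small instance of the claim that finite networks of such
# neurons can realize any Boolean function:
XOR = lambda a, b: OR(AND(a, NOT(b)), AND(NOT(a), b))
```

For example, `XOR(1, 0)` returns 1 while `XOR(1, 1)` returns 0.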

McCulloch and his friend Jerome Lettvin (1920-2011), also a physician, placed Pitts in an assistant position with the mathematician Norbert Wiener at MIT in 1943.[4] He was given the post after the sceptical Wiener, who himself had earlier been regarded as a mathematical prodigy, tested him by going through his proof of the ergodic theorem with him at the blackboard. Pitts was accepted by Wiener as a doctoral student, and Wiener even personally provided him with a curriculum covering subjects ranging from mathematics to circuit theory and electronics. In 1944, Pitts was also hired by the Kellex Corporation, a petrochemical company that also dealt with the processing of radioactive materials. From 1946, Pitts was a core member of the Macy conferences, whose principal purpose was to set the foundations for a general science of the workings of the human mind.

In 1952, together with McCulloch, Lettvin and Pat Wall, Pitts was part of the group hired by MIT Professor Jerome Wiesner at the Research Laboratory of Electronics, on Wiener’s advice, to study the functioning of the nervous system. Within this group, which pioneered cognitive science, he was one of the leading minds alongside McCulloch, although he did not like to be mentioned in publications and rejected academic degrees and official university positions. His way of working was also unusual – he was often seen reading in a bar. He remained at MIT until his death, but was increasingly isolated after Wiener broke with McCulloch and all his associates, including Pitts, for private reasons (his wife did not like McCulloch). In 1959, the paradigmatic “*What the Frog’s Eye Tells the Frog’s Brain*” (credited to Humberto Maturana, Lettvin, McCulloch and Pitts) conclusively demonstrated that “*analog processes in the eye were doing at least part of the interpretive work*” in image processing, as opposed to “*the brain computing information digital neuron by digital neuron using the exacting implement of mathematical logic*”, leading Pitts to burn his unpublished doctoral dissertation and years of unpublished research. Pitts died in 1969 from bleeding esophageal varices, a common concomitant disease of cirrhosis of the liver. The theoretical foundations he formulated together with McCulloch were important for the development of neuroinformatics and the cognitive sciences.

**References and Further Reading:**

- [1] The time you enjoy wasting is not wasted time – Bertrand Russell, Logician and Pacifist, SciHi Blog
- [2] Rudolf Carnap and the Logical Structure of the World, SciHi Blog
- [3] John von Neumann – Game Theory and the Digital Computer, SciHi Blog
- [4] Norbert Wiener and the Science of Cybernetics, SciHi Blog
- [5] Piccinini, Gualtiero, “The First Computational Theory of Mind and Brain: A Close Look at McCulloch and Pitts’s ‘Logical Calculus of Ideas Immanent in Nervous Activity'”, *Synthese* 141: 175–215, 2004. Kluwer Academic Publishers
- [6] “The Man who Tried to Redeem the World with Logic”, *Nautilus Magazine* issue 21, 5 February 2015
- [7] Walter Pitts, “Some observations on the simple neuron circuit”, *Bulletin of Mathematical Biology*, Volume 4, Number 3, 121–129, 1942.
- [8] Warren McCulloch and Walter Pitts, “A Logical Calculus of Ideas Immanent in Nervous Activity”, *Bulletin of Mathematical Biophysics* 5:115–133, 1943.
- [9] Walter Pitts at Wikidata
- [10] Timeline for the History of Artificial Intelligence, via DBpedia and Wikidata

The post Walter Pitts and the Mathematical Model of a Neural Network appeared first on SciHi Blog.

On March 22, 1946, American mathematician, computer scientist, science fiction author, and philosopher **Rudolph von Bitter Rucker**, better known as Rudy Rucker, was born. He is also one of the founders of the cyberpunk literary movement.

“The space of our universe is the hypersurface of a vast expanding hypersphere.”

– Rudy Rucker, The Sex Sphere (1983)

Rucker was born and raised in Louisville, Kentucky, where his father Embry Cobb Rucker, Sr., a descendant of Flemish Huguenots, ran a small furniture-manufacturing company. Later in life, Embry Cobb Rucker, Sr. became an Episcopal minister and worked as a parish priest for the rest of his life. His mother, Marianne von Bitter, originally from Berlin, came to study at the Pennsylvania Academy of Fine Arts in Philadelphia in 1937. She was an enthusiastic gardener, amateur artist and potter. Moreover, she was also a descendant of the famous German philosopher Georg Wilhelm Friedrich Hegel.[5] Rucker attended St. Xavier High School before earning a B.A. in mathematics from Swarthmore College in 1967 and M.S. (1969) and PhD (1973) degrees in mathematics from Rutgers University, with a specialization in mathematical logic.

In 1972, Rucker started teaching in the mathematics department at the State University College at Geneseo, New York, with a “Higher Geometry” course, which turned into a series of lectures on the fourth dimension. Eventually he wrote the lectures up as *Geometry, Relativity and the Fourth Dimension*, published by Dover Publications, which would become the foundation of his writing career. Thanks to a grant from the German Alexander von Humboldt Foundation, Rucker taught math at the Ruprecht Karl University of Heidelberg from 1978 to 1980. He then taught at Randolph-Macon Woman’s College in Lynchburg, Virginia from 1980 to 1982, before trying his hand as a full-time author. Inspired by an interview with British scientist Stephen Wolfram, Rucker became a computer science professor at San José State University in 1986, from which he retired in 2004.

“Computations are everywhere, once you begin to look at things in a certain way.” (Rudy Rucker)

A mathematician with philosophical interests, Rucker published *Infinity and the Mind* in 1982. The book contains accessible popular expositions on the mathematical theory of infinity, and a number of related topics. These include Gödel‘s incompleteness theorems [6] and their relationship to concepts of artificial intelligence and the human mind, as well as the conceivability of some unconventional cosmological models. The material is approached from a variety of viewpoints, some more conventionally mathematical and others being nearly mystical.

As his “own alternative to cyberpunk,” Rucker developed a writing style he terms Transrealism. The essence of transrealism, as outlined in his 1983 essay “The Transrealist Manifesto,” is to write about one’s real life in fantastic terms. The Secret of Life, White Light, and The Sex Sphere are examples of his transreal novels. The first recasts a traditional coming of age memoir as a UFO novel, the second is about Rucker’s time as a mystical mathematician at SUNY Geneseo, while the third turns his two years in Germany into a tale of higher dimensions and nuclear terrorism.

Rucker is presently working on a 1950s SF novel called *The Turing Chronicles*, featuring a love affair between computer pioneer Alan Turing [4] and Beat author William Burroughs. Rucker often uses his novels to explore scientific or mathematical ideas; White Light examines the concept of infinity, while the Ware Tetralogy is in part an explanation of the use of natural selection to develop software. His non-fiction book, *The Lifebox*, t*he Seashell, and the Soul: What Gnarly Computation Taught Me About Ultimate Reality*, *the Meaning Of Life, and How To Be Happy* summarizes the various philosophies he’s believed over the years and ends with the tentative conclusion that we might profitably view the world as made of computations, with the final remark, “perhaps this universe is perfect.”

At yovisto academic video search you can watch Rudy Rucker delivering a rather interesting talk at TEDx Brussels 2012 about “*Beyond Machines: The Year 3000*“.

**References and Further Reading: **

- [1] Rudy Rucker’s website
- [2] Rudy Rucker’s autobiographical essay (2004)
- [3] More autobiographical notes on Rudy Rucker’s old webpage
- [4] Churchill’s Best Horse in the Barn – Alan Turing, Codebreaker and AI Pioneer, SciHi Blog, June 23, 2012.
- [5] Georg Friedrich Wilhelm Hegel and the Secret of his Philosophy, SciHi Blog, August 27, 2012.
- [6] Kurt Gödel Shaking the Very Foundations of Mathematics, SciHi Blog
- [7] Rudy Rucker Bibliography at *Fantastic Fiction*
- [8] Rudy Rucker at Wikidata
- [9] Timeline of American Science Fiction Writers born before 1950, via DBpedia and Wikidata

The post Rudy Rucker – Infinity and the Mind appeared first on SciHi Blog.

On March 18, 1796, Swiss mathematician **Jakob Steiner** was born. Steiner‘s work was mainly confined to geometry. Moreover, he has been considered the greatest pure geometer since Apollonius of Perga.

“Calculating replaces, while geometry stimulates, thinking”

-Jakob Steiner (1796-1863)

Steiner was the son of a small farmer. He attended the local village school, where he learned to write only at the age of fourteen, and at seventeen went to Yverdon to Johann Heinrich Pestalozzi, at whose institution he later worked for some time as an assistant teacher. When it was closed, he moved to Heidelberg in 1818 to study mathematics with Ferdinand Schweins (1780-1856), among others, but due to the poor quality of the lectures there he was almost entirely dependent on self-study. He financed his living by giving private lessons. The lectures on algebra and differential and integral calculus stimulated investigations into mechanics, which he recorded in his compendia of 1821, 1824 and 1825. From the winter of 1820/21 he lived in Berlin, initially as a private teacher of mathematics, and was soon regarded as the best private teacher in the city.

During this time Steiner published some papers on geometrical problems in Crelles Journal für die reine und angewandte Mathematik. Then he was a teacher at the Plamann educational institution, which was influenced by Pestalozzi’s pedagogy. From 1827 Steiner worked at the Gewerbeakademie (*Oberlehrer*, from 1833 with professor title). Steiner became acquainted with several influential scientists, who supported his career like A. L. Crelle [6] or N. H. Abel.[7] In 1832, he published his *Systematische Entwickelungen* and earned himself a great reputation. Through the influence of Carl Gustav Jacob Jacobi [8] and of the brothers Alexander and Wilhelm von Humboldt a new chair of geometry was founded for him at the University of Berlin in 1834. He held this position for the rest of his life as a full member of the Prussian Academy of Sciences. He spent the last years of his life in Switzerland, tormented by severe physical ailments.

Steiner devoted most of his mathematical work to geometry and tried to avoid analysis as much as he could, since he hated it. His research became known for its great generality, for the fertility of his methods, and for the rigour of his proofs. Jakob Steiner was soon widely considered the greatest pure geometer since Apollonius of Perga. In his work “*Systematische Entwickelung der Abhängigkeit geometrischer Gestalten von einander*“, he laid the foundation of modern synthetic geometry and introduced what are now called the geometrical forms. Between their elements, Steiner established a one-to-one correspondence, or, as he referred to it, made them projective. Furthermore, by aid of these projective rows and pencils he gave a new generation of conics and ruled quadric surfaces, which leads more quickly and directly than former methods into the inner nature of conics and reveals the organic connection of their innumerable properties and mysteries. Also, for the first time, the principle of duality appears from the very beginning as an immediate outflow of the most fundamental properties of the plane, the line and the point.

Another important work by Steiner was “*Die geometrischen Constructionen ausgeführt mittels der geraden Linie und eines festen Kreises*“, published in 1833. The work was influenced by J. V. Poncelet [9], and in it Steiner showed how all problems of the second order can be solved by aid of the straight edge alone, without the use of compasses, as soon as one circle is given on the drawing-paper. Another famous result is the ‘Poncelet-Steiner theorem’, which shows that only one given circle and a straight edge are required for Euclidean constructions. [10] A posthumously published work was his “*Vorlesungen über synthetische Geometrie*“. Next to his scientific books, Steiner also authored numerous papers, which were later published in Crelle’s Journal. Especially the ones concerning algebraic curves and surfaces were the most influential on later mathematicians. Further very important research published by the Swiss mathematician was his work on maxima and minima, which surpassed that of contemporary scientists significantly. Well known is his geometric solution of the isoperimetric problem (to show that the circle is the curve that encloses the largest area for a given circumference).

In his lectures Steiner attached great importance to the formation of geometric views, which was also an important topic of Pestalozzi pedagogy. In order to promote this view, Steiner refrained from using geometric figures in his lectures. A further characteristic, which came from the school of Pestalozzi, was the responsiveness to the needs of the pupils, who were supposed to discover mathematical knowledge as far as possible by themselves, whereby the teacher only indicates the direction, similar to the Socratic method or the Moore method later influential in the USA. Steiner demanded a lot from his students, there was often a harsh tone and he was not easily satisfied, yet he could gather a circle of loyal students around him.

The last ten years of Steiner’s life were increasingly difficult through illness. Kidney problems caused him to spend most of the year in his native Switzerland, only going to Berlin in the winter to deliver his lectures. Eventually he became totally bedridden and was unable to carry out any teaching duties. Jakob Steiner passed away on April 1, 1863, at age 67.

At yovisto academic video search you may be interested in a video lecture on “*Steiner’s regions of space problem*” by N. J. Wildberger.

**References and Further Reading:**

- [1] Viktor Blåsjö (2009), “Jakob Steiner’s Systematische Entwickelung: The Culmination of Classical Geometry”
- [2] Jacob Steiner’s work on the Isoperimetric Problem
- [3] Steiner Biography
- [4] Jacob Steiner at zbMATH
- [5] Jacob Steiner at Mathematics Genealogy Project
- [6] August Leopold Crelle and his Journal, SciHi Blog, March 11, 2017.
- [7] The Short but Influential Life of Niels Henrik Abel, April 6, 2013.
- [8] Carl Jacobi and the Elliptic Functions, SciHi Blog, December 10, 2014.
- [9] Jean-Victor Poncelet and Projective Geometry, SciHi Blog, July 1, 2016.
- [10] O’Connor, John J.; Robertson, Edmund F., “Jakob Steiner“, MacTutor History of Mathematics archive, University of St Andrews.
- [11] Works of Jacob Steiner at Wikisource
- [12] Timeline of Geometers, born before 1900, via DBpedia and Wikidata

The post Jakob Steiner and Analytical Geometry appeared first on SciHi Blog.

On February 21, 1591, French mathematician and engineer **Girard Desargues** was born. Desargues is considered one of the founders of projective geometry. Desargues‘ theorem, the Desargues graph, and the crater Desargues on the Moon are named in his honour. In his later years, he designed an elaborate spiral staircase, and an ingenious new form of pump, but the most important of Desargues‘ interests was geometry. He invented a new, non-Greek way of doing geometry, now called ‘projective’ or ‘modern’ geometry. As a mathematician he was highly original and completely rigorous. However, he was far from lucid in his mathematical style.

“I freely confess that I never had taste for study or research either in physics or geometry except in so far as they could serve as a means of arriving at some sort of knowledge of the proximate causes… for the good and convenience of life, in maintaining health, in the practice of some art,… having observed that a good part of the arts is based on geometry, among others that cutting of stone in architecture, that of sundials, that of perspective in particular.”

– Girard Desargues, (ca. 1640) as quoted in [13]

Little is known about Girard Desargues’ personal life. Born in Lyon, Desargues came from a wealthy family devoted to service to the French crown. His father was a royal notary, an investigating commissioner of the Seneschal’s court in Lyon (1574), the collector of the tithes on ecclesiastical revenues for the city of Lyon (1583) and for the diocese of Lyon, then the second most important city in France. Desargues seems to have made several extended visits to Paris in connection with a lawsuit for the recovery of a huge debt. Despite this loss, the family still owned several large houses in Lyon, a manor house at the nearby village of Vourles, and a small chateau surrounded by the best vineyards in the vicinity. Thus, it is clear that Desargues had every opportunity of acquiring a good education, could afford to buy what books he chose, and had leisure to indulge in whatever pursuits he might enjoy.

Girard Desargues worked as an architect from 1645. Prior to that, he had worked as a tutor and may have served as an engineer and technical consultant in the entourage of Richelieu. As an architect, Desargues planned several private and public buildings in Paris and Lyon. As an engineer, he designed a system for raising water that he installed near Paris. It was based on the use of the at the time unrecognized principle of the epicycloidal wheel. When in Paris, Desargues became part of the mathematical circle surrounding Marin Mersenne, which also included Rene Descartes,[4] Étienne Pascal and his son Blaise Pascal.[5] It was probably essentially for this limited readership of friends that Desargues prepared his mathematical works, and had them printed.

Desargues wrote on ‘practical’ subjects such as perspective (1636), the cutting of stones for use in building (1640) and sundials (1640). His writings are, however, dense in content and theoretical in their approach to the subjects concerned. His research on perspective and geometrical projections can be seen as a culmination of centuries of scientific inquiry across the classical epoch in optics that stretched from al-Hasan Ibn al-Haytham (Alhazen) to Johannes Kepler,[6] and going beyond a mere synthesis of these traditions with Renaissance perspective theories and practices. Desargues conceived projective geometry as a natural extension of Euclidean geometry [7] in which parallel lines meet at infinity, sizes can vary as long as proportions are kept, and shapes are considered to be one with the totality of shadows they can cast. This is exactly what is needed in perspective design, where each object appears deformed according to the point of observation. Thus the plane sections of a cone are nothing but the different images projected by a light source on a wall when its inclination varies. In this framework, a circle is equivalent to an ellipse, which becomes a parabola as soon as the intersection point of the axis of the light cone with the wall ends up in infinity.

Desargues’ most important work, the one in which he invented his new form of geometry, *Rough draft for an essay on the results of taking plane sections of a cone*, was published in 1639 in only a small print run; just one copy is now known to survive. Desargues’ theorem of projective geometry states that the intersection points of corresponding sides of two triangles lie on a straight line when the connecting lines of corresponding vertices intersect in one point (and vice versa). The painter Laurent de La Hire and the engraver Abraham Bosse found Desargues’s method attractive. Bosse, who taught perspective constructions based on Desargues’s method at the Royal Academy of Painting and Sculpture in Paris, published a more accessible presentation of this method in 1648.
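Desargues’ theorem is easy to check numerically using homogeneous coordinates, in which both the line through two points and the intersection of two lines are given by a cross product. The sketch below is a modern illustration, not anything from Desargues’ own text; the two example triangles are an arbitrary choice, placed in perspective from the origin.

```python
def cross(u, v):
    """Cross product of two 3-vectors (homogeneous/projective coordinates)."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def line(P, Q):
    """Homogeneous coordinates of the line through plane points P and Q."""
    return cross((P[0], P[1], 1.0), (Q[0], Q[1], 1.0))

def collinear(P, Q, R):
    """Three homogeneous points are collinear iff their 3x3 determinant vanishes."""
    det = (P[0] * (Q[1] * R[2] - Q[2] * R[1])
           - P[1] * (Q[0] * R[2] - Q[2] * R[0])
           + P[2] * (Q[0] * R[1] - Q[1] * R[0]))
    return abs(det) < 1e-9

# Two triangles in perspective from the origin: the lines AA', BB', CC'
# all pass through (0, 0).
A, B, C = (1, 0), (0, 1), (1, 1)
A2, B2, C2 = (2, 0), (0, 3), (1.5, 1.5)

# Intersection points of the three pairs of corresponding sides:
P = cross(line(A, B), line(A2, B2))
Q = cross(line(B, C), line(B2, C2))
R = cross(line(A, C), line(A2, C2))

print(collinear(P, Q, R))  # → True, as Desargues' theorem predicts
```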

In the 17th century Desargues’s new approach to geometry, i.e. studying figures through their projections, was appreciated by a few gifted mathematicians, such as Blaise Pascal and Gottfried Wilhelm Leibniz,[8] but it did not become widely influential. René Descartes’s algebraic way of treating geometrical problems, published in *Discours de la méthode* (1637), came to dominate geometrical thinking, and Desargues’s ideas were forgotten. Desargues’ work, however, was rediscovered and republished in 1864. Late in his life, Desargues published a paper with the cryptic title of *DALG*. The most common theory about what this stands for is *Des Argues, Lyonnais, Géometre* (proposed by Henri Brocard). He died in Lyon in 1661.

At yovisto academic video search you can learn more about Desargues and his projective geometry in the lecture series of Yale Prof N.J. Wildberger on the History of Mathematics

**References and Further Reading**

- [1] O’Connor, John J.; Robertson, Edmund F., “Girard Desargues”, *MacTutor History of Mathematics archive*, University of St Andrews.
- [2] Girard Desargues at Britannica.com
- [3] Girard Desargues at Scienceworld.wolfram.com
- [4] Cogito Ergo Sum – René Descartes, SciHi Blog, March 31, 2013.
- [5] It is not Certain that Everything is Uncertain – Blaise Pascal’s Thoughts, SciHi Blog, June 19, 2012.
- [6] And Kepler Has His Own Opera – Kepler’s 3rd Planetary Law, SciHi Blog, May 15, 2012.
- [7] Euclid – the Father of Geometry, SciHi Blog, January 30, 2015.
- [8] Let Us Calculate – the Last Universal Academic Gottfried Wilhelm Leibniz, SciHi Blog, July 1, 2012.
- [9] Girard Desargues in Wikidata
- [10] Girard Desargues at Reasonator
- [11] Girard Desargues at Mathematics Genealogy Project
- [12] Timeline for geometers, via Wikidata
- [13] William Thompson Sedgwick, Harry Walter Tyler, *A Short History of Science* (1917)

The post Girard Desargues and Projective Geometry appeared first on SciHi Blog.

On January 13, 1908, German physician and obstetrician-gynecologist **Wilhelm Weinberg** delivered an exposition of his ideas on the principle of genetic equilibrium in a lecture before the Verein für vaterländische Naturkunde in Württemberg. He developed the idea of genetic equilibrium independently of British mathematician G. H. Hardy.

Wilhelm Weinberg studied medicine at the Universities of Berlin, Tübingen, and Munich, Germany. He returned to his birth town, Stuttgart, in 1889, where he remained, running a large practice as a gynecologist and obstetrician. He is known to have been a physician to the poor and delivered around 3,500 babies in his life. Still, he managed in addition to write over 160 scientific papers as well as numerous reviews and comments. According to contemporary scientists, one reason his recognition outside the German-speaking world remained so small was highly noticeable in his writings: his criticism was often very personal and his reviews very argumentative.

However, the scientist is mainly known for his contributions to the Hardy–Weinberg principle, also known as the Hardy–Weinberg equilibrium, model, theorem, or law. It states that allele and genotype frequencies in a population will remain constant from generation to generation in the absence of other evolutionary influences. These influences include non-random mating, mutation, selection, genetic drift, gene flow and meiotic drive. Because one or more of these influences are typically present in real populations, the Hardy–Weinberg principle describes an ideal condition against which the effects of these influences can be analyzed. The principle was developed independently by the British mathematician Godfrey Harold Hardy and by Weinberg. Weinberg’s work was presented in a lecture before the Society for the Natural History of the Fatherland in Württemberg, six months before the publication of Hardy’s paper in English.
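The principle is easy to state numerically: if allele A has frequency p and allele a has frequency q = 1 − p, random mating yields genotype frequencies p², 2pq and q², and these genotype frequencies reproduce the same allele frequency in the next generation. A minimal Python sketch, with a purely illustrative example frequency:

```python
def hardy_weinberg(p):
    """Expected genotype frequencies (AA, Aa, aa) for allele frequency p."""
    q = 1.0 - p
    return p * p, 2.0 * p * q, q * q

def allele_frequency(AA, Aa, aa):
    """Frequency of allele A recovered from genotype frequencies."""
    return AA + 0.5 * Aa

p = 0.3                          # example frequency of allele A
AA, Aa, aa = hardy_weinberg(p)   # 0.09, 0.42, 0.49 -- sums to 1
# In the absence of selection, mutation, drift, migration and
# non-random mating, the next generation has the same allele frequency,
# so the genotype distribution is already at equilibrium:
assert abs(allele_frequency(AA, Aa, aa) - p) < 1e-12
```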

Since Weinberg’s scientific work was published in German, it remained completely unrecognized for over 35 years. Curt Stern, a contemporary German geneticist who emigrated to the United States just before World War II, found out that the achievement of both scientists was being called “Hardy’s law” or “Hardy’s formula”, and pointed out Weinberg’s work in a scientific paper. Another reason why Weinberg’s achievement was ignored for so many years may be the fact that his writing was very difficult to follow, even for native speakers. Still, he used only elementary mathematics and avoided calculus as much as he could.

Wilhelm Weinberg also pioneered twin studies. He developed techniques to analyze phenotypic variation that partitioned its variance into genetic and environmental components. Weinberg recognized that ascertainment bias was affecting many of his calculations, and he produced methods to correct for it. He observed that the proportions of homozygotes in familial studies of classic autosomal recessive genetic diseases generally exceed the expected Mendelian ratio of 1:4, and he explained how this is the result of ascertainment bias. He discovered the answer to several seeming paradoxes caused by ascertainment bias and recognized that ascertainment was responsible for a phenomenon known as anticipation, the tendency of a genetic disease to manifest earlier in life and with increased severity in later generations.

Wilhelm Weinberg passed away on November 27, 1937.

At yovisto academic video search, you may be interested in a short introduction lecture to the Hardy-Weinberg Principle.

**References and Further Reading:**

- [1] Hardy, Weinberg and Language Impediments. James F. Crow, 1999 [PDF]
- [2] Weinberg, W., 1908: Über den Nachweis der Vererbung beim Menschen, Jahreshefte des Vereins für Vaterländische Naturkunde in Württemberg 64: 369-82. Digitized version
- [3] Hardy–Weinberg Equilibrium Calculator
- [4] G. H. Hardy and the aesthetics of Mathematics, SciHi Blog, December 1, 2016.
- [5] Crick and Watson decipher the DNA, SciHi Blog
- [6] The Avery-McLeod-McCarthy Experiment, SciHi Blog
- [7] Max Delbrück and the Genes, SciHi Blog
- [8] Wilhelm Weinberg at Wikidata

The post Wilhelm Weinberg and the Genetic Equilibrium appeared first on SciHi Blog.

On November 27, 1852, **Augusta Ada King, Countess of Lovelace** passed away, who is considered to be the world’s very first computer programmer. Every student of computer science has most probably heard of Ada, Countess of Lovelace, assistant to mathematician Charles Babbage, [1] inventor of the very first programmable (mechanical) computer, the analytical engine. Although probably not widely known to the general public, there are Ada Lovelace tuition programs for girls, a programming language called ‘*ADA*‘, as well as numerous references in popular culture, literature, and even a graphic novel.

“[The Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine… Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

— Ada Lovelace Notes in Menabrea, Luigi (1842). Sketch of the Analytical Engine invented by Charles Babbage Esq..

Augusta Ada King, Countess of Lovelace, was born on December 10, 1815, as Ada Augusta Byron, daughter of the famous as well as notorious English poet Lord Byron, into a wealthy family of nobility. Still, her childhood was rather unfortunate. Byron had numerous affairs and three children by three women; only Ada was born legitimate. Due to ongoing disputes with Lord Byron, her mother moved with the one-month-old Ada to her parents in Kirkby Mallory on January 16, 1816. On April 21, 1816 Lord Byron signed a deed of separation and left England a few days later. Lord Byron had no relationship with his daughter; she never met him. When Ada was eight years old, he died.

Ada’s mathematically interested mother, who in her youth had herself been taught natural sciences and mathematics by house teachers, gave Ada an education in the natural sciences, rather uncommon for girls of her time. Luckily, Ada’s talents and brilliance were quickly recognized and would essentially determine the rest of her life.

Ada’s love of mathematics and her admiration for the scientist Mary Somerville led to a meeting between Charles Babbage and herself in 1833, which was to change her life decisively. Babbage had published a paper on his famous difference engine, a calculating machine designed to tabulate polynomial functions, about 10 years earlier. Augusta Ada Byron was highly interested in Charles Babbage’s work and especially in his machine, which many scientists were talking about. After some scientific debates with her, Babbage was deeply impressed by her writing abilities as well as her mathematical skills, which is why he called her the ‘*enchantress of numbers*‘. Augustus De Morgan, professor at University College London, who himself made fundamental contributions to the development of mathematical logic, had a major influence on her later education and on her main work – *the Notes*. Lovelace took lessons with him from 1841 on her own initiative. At the age of 19 Ada Byron married William King, 8th Baron King, who in 1838 became the 1st Earl of Lovelace. He, too, had a mathematical education and, since women were forbidden to enter libraries and universities at that time, had himself accepted into the Royal Society for her sake, where he copied articles for her.

Ada’s ascent to recognition as a scientist was hard, owing both to her family’s public notoriety and to the fact that women in science and technology were still rare in the middle of the 19th century. Still, her chance came with Babbage’s Analytical Engine, a successor to the earlier Difference Engine and the very first design for a general-purpose programmable mechanical computer. In 1843 she translated a description of Babbage’s Analytical Engine, written in French by the Italian mathematician Luigi Menabrea, into English. Encouraged by Babbage, she added her own extensive notes and reflections explaining and commenting on the planned machine’s workings. Ada’s notes turned out to be longer than the original work itself, partly because most scientists were not able to understand the difference between Babbage’s two machines.

Ada also explained an algorithm for calculating a sequence of Bernoulli numbers with the new machine, which is why she is now mostly known as the world’s first computer programmer. Many scholars believe that Babbage must have written programs for the Analytical Engine beforehand; however, Ada Lovelace’s program was the first published computer program. As researchers read Ada’s work over the years, they recognized that she was in some respects even more visionary than Babbage himself. Babbage’s machine was never built during his lifetime. On the one hand, precision mechanics had not yet advanced far enough to produce the machine parts with the necessary accuracy; on the other hand, the British Parliament refused to finance Babbage’s research programme, having already supported the development of its predecessor, the Difference Engine, with 17,000 British pounds (worth around 3.4 million British pounds in 2014). Unfortunately, Ada was fully recognized for her work only over a century after its first publication, when the engine was understood to be an early model of the computer.
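Lovelace’s Note G computed Bernoulli numbers by stepping the Analytical Engine through a table of operations; a faithful reproduction of her notation is beyond this post, but the underlying mathematics can be sketched in a few lines of modern Python. The sketch below (an illustration, not her actual program) uses the standard recurrence on binomial coefficients, with exact fractions since Bernoulli numbers are rational:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    using the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1."""
    b = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * b[j] for j in range(m))
        b.append(-s / Fraction(m + 1))  # solve the recurrence for B_m
    return b

print(bernoulli(6))
# [1, -1/2, 1/6, 0, -1/30, 0, 1/42]
```

This follows the modern convention B_1 = −1/2; Lovelace’s own numbering of the sequence in Note G differs slightly, but the recursive idea of building each value from all previous ones is the same.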

Ada Lovelace critically influenced early achievements in programming, but she also faced lifelong difficulties with her family as well as with a society that emphasized her antics with alcohol, men, and gambling rather than her mathematical brilliance. Lovelace’s *Notes* contain a number of concepts that were far ahead of the state of research around 1840. While her contributions to computer architecture and the fundamentals of programming were largely forgotten until their rediscovery in the 1980s, her views on artificial intelligence played a certain role in epistemological debates, as “*Lady Lovelace’s Objection*”, even at the founding of this field of computer science research. She wrote that

“The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.”

This objection has been the subject of much debate and rebuttal, for example by Alan Turing in his paper “*Computing Machinery and Intelligence*“.

Babbage’s motivation for the Analytical Engine was the calculation of numerical tables for use in science and engineering. Lovelace, on the other hand, had recognized the far greater potential of the machine: it would not only be able to perform numerical calculations, but could also combine letters and compose music, since music was based on relations of tones that could be expressed as combinations of numbers. Ada Lovelace also recognized that the machine had a physical part, namely the wheels and punched cards, and a symbolic part, i.e. the automatic calculations encoded in the punched cards. She thus anticipated the division into hardware and software.

Ada Lovelace died at the age of 36 – the same age that her father had died – on 27 November 1852, from uterine cancer probably exacerbated by bloodletting by her physicians.

At yovisto academic video search you can learn more about Ada Lovelace in the video report from the Institution of Engineering and Technology (IET) explaining the contribution and importance of the Countess of Lovelace in the field of engineering.

**References and further Reading:**

- [1] Charles Babbage – The Father of the Computer who hated Street Music, SciHi Blog
- [2] The Thrilling Adventures of Lovelace and Babbage – a web comic
- [3] Augustus de Morgan and Formal Logic, SciHi Blog
- [4] Ada Lovelace at San Diego Supercomputer Center Website
- [5] Ada Byron, Lady Lovelace at Agnes Scott College
- [6] The Babbage Engine at Computer History Museum
- [7] H. Sack, *Programmieren*, in: Historisches Wörterbuch des Mediengebrauchs, Band 2, hrsg. von Heiko Christians, Matthias Bickenbach und Nikolaus Wegmann, Böhlau Verlag, Köln, Weimar, Wien, 2018, 363-375. *[in German]*
- [8] Churchill’s Best Horse in the Barn – Alan Turing, Codebreaker and AI Pioneer, SciHi Blog
- [9] Ada Lovelace at Wikidata
- [10] O’Connor, John J.; Robertson, Edmund F., “Ada Lovelace“, MacTutor History of Mathematics archive, University of St Andrews.

The post Ada Lovelace – The World’s First Programmer appeared first on SciHi Blog.

]]>