On April 16, 1894, Polish mathematician and statistician **Jerzy Neyman** was born. Neyman was one of the principal architects of modern theoretical statistics. He introduced the modern concept of the confidence interval into statistical estimation and co-developed the theory of null hypothesis testing in collaboration with Egon Pearson.

“Statistics is the servant to all sciences.” – Jerzy Neyman

#### Youth and Education

Jerzy Neyman was born into a Polish family in Bendery, in the Bessarabia Governorate of the Russian Empire, the fourth of four children of Czesław Spława-Neyman and Kazimiera Lutosławska. Neyman’s family descended from a long line of Polish nobles and military heroes. In 1906, however, his father died of a heart attack, and his mother, now having little money to bring up her son, moved to Kharkov, where she had relatives.[1] Neyman graduated from the Kamieniec Podolski gubernial gymnasium for boys in 1909 under the name Yuri Cheslavovich Neyman, the Russian version of his name, and began studies at Kharkov University in 1912, where he was taught by Russian probabilist Sergei Natanovich Bernstein, who strongly influenced Neyman and encouraged him to read Karl Pearson’s *The Grammar of Science*.

#### World War I and Russian Revolution

Many students left for military service when World War I started in 1914, but Neyman failed the eyesight test and so remained at the university.[1] After reading *Lessons on Integration and the Search for Primitive Functions* by French mathematician Henri Lebesgue, he was fascinated with measure and integration.[6] In 1917, however, the last year of the war, the Russian Revolution and the ensuing civil war totally disrupted academic life at the university. There was great hardship, and not surprisingly Neyman’s health began to deteriorate; he was diagnosed with tuberculosis. Despite these difficulties, Neyman passed his examinations and became a lecturer at Kharkov University.[1]

#### Travels to the USA and France

In 1921 he returned to Poland in a program of repatriation of POWs after the Polish-Soviet War. He earned his Doctor of Philosophy degree at the University of Warsaw in 1924 for a dissertation titled “*On the Applications of the Theory of Probability to Agricultural Experiments*”. He was examined by Wacław Sierpiński and Stefan Mazurkiewicz, among others. Receiving a Rockefeller Fellowship to work with Karl Pearson in London, he arrived there in September 1925 but was disappointed to discover that Pearson was ignorant of modern mathematics. He traveled on to Paris, where he attended lectures by Borel,[7] Lebesgue and Hadamard,[8] and his interests began to move back towards sets, measure and integration.[1] While there, his interest in statistics was renewed by an encounter with Karl Pearson‘s son Egon, also then in Paris, who was trying to find a general principle from which Gosset’s (“Student’s”) t-test could be derived.[2]

#### The Gosset Problem and Bayes’ Theorem

After his return to Poland he established the Biometric Laboratory at the Nencki Institute of Experimental Biology in Warsaw. He spent time in both Warsaw and Kraków, and on 26 June obtained his habilitation and began lecturing as a docent.[1] He wrote several papers jointly with Egon Pearson, one of them relevant to the Gosset problem and another dealing with Bayes’ theorem about prior and posterior probabilities. This collaboration eventually led to the Neyman-Pearson Lemma, a “frequentist” or Bernoullian alternative to the Bayesian approach to hypothesis testing.[2]
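The Neyman-Pearson Lemma states that, for testing a simple null hypothesis against a simple alternative, the likelihood-ratio test is the most powerful test at a given significance level. As a minimal sketch of the idea (the Gaussian model, the data, and the threshold below are illustrative assumptions, not from Neyman and Pearson’s papers):

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio_test(sample, mu0, mu1, threshold):
    """Neyman-Pearson test of H0: mu = mu0 against H1: mu = mu1.

    Rejects H0 exactly when the likelihood ratio L(H1)/L(H0)
    exceeds the chosen threshold (computed in log space for stability).
    """
    log_ratio = sum(
        math.log(normal_pdf(x, mu1)) - math.log(normal_pdf(x, mu0))
        for x in sample
    )
    return log_ratio > math.log(threshold)

# Hypothetical data clustered near 1 should favour H1: mu = 1 over H0: mu = 0.
sample = [0.9, 1.2, 0.8, 1.1, 1.0]
print(likelihood_ratio_test(sample, mu0=0.0, mu1=1.0, threshold=1.0))  # True
```

In practice the threshold is chosen so that the probability of rejecting a true H0 equals the desired significance level; the lemma guarantees no other test of that level has greater power.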

#### Modern Scientific Sampling

Neyman proposed and studied randomized experiments as early as 1923. Furthermore, his paper “*On the Two Different Aspects of the Representative Method: The Method of Stratified Sampling and the Method of Purposive Selection*”, given at the Royal Statistical Society on 19 June 1934, was the groundbreaking event leading to modern scientific sampling. He introduced the confidence interval in his 1937 paper.[13] Another noted contribution is the Neyman–Pearson lemma, the basis of hypothesis testing.
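A confidence interval in Neyman’s sense is a range constructed so that, over repeated sampling, it covers the true parameter with a stated frequency (e.g. 95% of the time). A minimal sketch for the mean of a sample, assuming the common normal approximation with z = 1.96 (the data here are made up for illustration):

```python
import math

def mean_confidence_interval(sample, z=1.96):
    """Approximate 95% confidence interval for the population mean,
    using the normal approximation (z = 1.96)."""
    n = len(sample)
    mean = sum(sample) / n
    # Sample variance with Bessel's correction (divide by n - 1).
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    half_width = z * math.sqrt(var / n)
    return mean - half_width, mean + half_width

heights = [170, 165, 180, 175, 168, 172, 177, 169]
low, high = mean_confidence_interval(heights)
print(f"mean in [{low:.1f}, {high:.1f}] with ~95% confidence")
# prints: mean in [168.5, 175.5] with ~95% confidence
```

The key frequentist point, due to Neyman, is that the probability statement attaches to the interval-generating procedure, not to any one realized interval.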

#### Moving to Berkeley

In 1938 he moved to Berkeley, where he worked for the rest of his life. Berkeley, as he put it, was “tabula rasa” – no statistical study at all then existed.[2] There he taught probability and statistics in the mathematics department but aimed to set up a centre to train American statisticians. Soon there was an officially recognized Statistical Laboratory within the Department of Mathematics but the start of World War II diverted most academics from their intended course. Neyman undertook military research, particularly working on bomb sights and targeting problems.[1]

Neyman’s contributions to research in statistics over the latter part of his career were mostly in the areas of applications to meteorology and medicine. He received many honors for his remarkable contributions. In particular, we note the Guy Medal of the Royal Statistical Society in 1966 and the Medal of Science from President Johnson in January 1969.[1] Jerzy Neyman died on August 5, 1981 at age 87.

Raymond Flood, *Are Averages Typical?*, [11]

**References and Further Reading:**

- [1] O’Connor, John J.; Robertson, Edmund F., “Jerzy Neyman”, MacTutor History of Mathematics archive, University of St Andrews.
- [2] Jerzy Neyman, Tales of Statisticians, at University of Massachusetts
- [3] Karl Pearson and Mathematical Statistics, SciHi Blog
- [4] Jerzy Neyman at zbMATH
- [5] Jerzy Neyman at the Mathematics Genealogy Project
- [6] Henri Léon Lebesgue and the Theory of Integration, SciHi Blog
- [7] Émile Borel and the Infinite Monkey Problem, SciHi Blog
- [8] Jacques Hadamard and the Description of Mathematical Thought, SciHi Blog
- [9] Jerzy Neyman at Wikidata
- [10] Chin Long Chiang, Jerzy Neyman, Statisticians in History
- [11] Raymond Flood, *Are Averages Typical?*, Gresham College @ youtube
- [12] Kendall, D. G.; Bartlett, M. S.; Page, T. L. (1982). “Jerzy Neyman. 16 April 1894 – 5 August 1981”. *Biographical Memoirs of Fellows of the Royal Society*. **28**: 379–412.
- [13] Neyman, Jerzy (1937). “Outline of a Theory of Statistical Estimation Based on the Classical Theory of Probability”. *Philosophical Transactions of the Royal Society of London. Series A, Mathematical and Physical Sciences*. **236** (767): 333–380.
- [14] Timeline of Statisticians, via DBpedia and Wikidata