Michael I. Jordan

Michael I. Jordan is a leading researcher in machine learning and artificial intelligence. He was a prime mover behind the popularisation of Bayesian networks in the machine learning community and is known for highlighting the links between machine learning and statistics. He was also prominent in the formalisation of variational methods for approximate inference and in popularising the expectation-maximization (EM) algorithm in machine learning.

Jordan was a student of David E. Rumelhart and a member of the PDP Group in the 1980s, during which time he developed recurrent neural networks as a cognitive model. In more recent years his work has been driven less by a cognitive perspective and more by the tradition of statistics.

Jordan is currently a full professor at the University of California, Berkeley, where his appointment is split between the Department of Statistics and the Department of Electrical Engineering and Computer Sciences (EECS).

Many of Jordan's graduate students have remained strong influences on the machine learning field after their PhDs. Zoubin Ghahramani, Tommi Jaakkola, Andrew Ng, Lawrence Saul and David Blei, all former students of Jordan, have gone on to make significant contributions of their own.