Michael I. Jordan
| Michael I. Jordan | |
|---|---|
| Born | February 25, 1956 |
| Residence | Berkeley, CA |
| Institutions | University of California, Berkeley; University of California, San Diego; Massachusetts Institute of Technology |
| Alma mater | University of California, San Diego |
| Thesis | The Learning of Representations for Sequential Performance (1985) |
| Doctoral advisors | David Rumelhart; Donald Norman |
| Known for | Latent Dirichlet allocation |
| Notable awards | Fellow of the U.S. National Academy of Sciences,[1] AAAI Fellow (2002), Rumelhart Prize (2015),[2] IJCAI Award for Research Excellence (2016) |
| Website | www |
Michael Irwin Jordan (born February 25, 1956) is an American scientist, a professor at the University of California, Berkeley, and a leading researcher in machine learning and artificial intelligence.[3][4][5]
Biography
Jordan received his BS magna cum laude in Psychology in 1978 from Louisiana State University, his MS in Mathematics in 1980 from Arizona State University, and his PhD in Cognitive Science in 1985 from the University of California, San Diego.[6] At the University of California, San Diego, Jordan was a student of David Rumelhart and a member of the Parallel Distributed Processing (PDP) research group in the 1980s.
Jordan is currently a full professor at the University of California, Berkeley, where his appointment is split across the Department of Statistics and the Department of Electrical Engineering and Computer Sciences (EECS). He was a professor at MIT from 1988 to 1998.[6]
Work
In the 1980s Jordan began developing recurrent neural networks as a cognitive model. In recent years, however, his work has been driven less by a cognitive perspective and more by the perspective of traditional statistics.
He popularised Bayesian networks in the machine learning community and is known for pointing out links between machine learning and statistics. He was also prominent in the formalisation of variational methods for approximate inference[1] and in the popularisation of the expectation-maximization (EM) algorithm[7] in machine learning.
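Both of these contributions revolve around a variational lower bound on the log marginal likelihood. As an illustrative textbook sketch (the notation below is generic and not taken from the cited papers), for observed data x, latent variables z, parameters θ, and any distribution q(z):

```latex
% Variational lower bound on the log marginal likelihood (evidence lower bound).
% The gap between the two sides is the KL divergence from q(z) to the posterior.
\log p(x \mid \theta)
  \;\ge\; \mathbb{E}_{q(z)}\!\bigl[\log p(x, z \mid \theta)\bigr]
          - \mathbb{E}_{q(z)}\!\bigl[\log q(z)\bigr]
  \;=\; \log p(x \mid \theta)
          - \mathrm{KL}\!\bigl(q(z) \,\|\, p(z \mid x, \theta)\bigr)
```

The EM algorithm alternately sets q(z) to the exact posterior p(z | x, θ) (E-step) and maximises the bound over θ (M-step); variational methods instead restrict q to a tractable family and optimise the bound within that family when the exact posterior is intractable.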
Resignation from the journal Machine Learning
In 2001, Jordan and others resigned from the editorial board of the journal Machine Learning. In a public letter, they argued for less restrictive access to published research, and a new journal, the Journal of Machine Learning Research (JMLR), was created to support the evolution of the field of machine learning.[8]
Honors and awards
Jordan has received numerous awards, including a best student paper award[9] (with X. Nguyen and M. Wainwright) at the International Conference on Machine Learning (ICML 2004), a best paper award (with R. Jacobs) at the American Control Conference (ACC 1991), the ACM/AAAI Allen Newell Award, the IEEE Neural Networks Pioneer Award, and an NSF Presidential Young Investigator Award. In 2010 he was named a Fellow of the Association for Computing Machinery "for contributions to the theory and application of machine learning."[10]
Jordan is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences.
He has been named a Neyman Lecturer and a Medallion Lecturer by the Institute of Mathematical Statistics. He received the David E. Rumelhart Prize in 2015 and the ACM/AAAI Allen Newell Award in 2009.
Notable students
Many of Jordan's former graduate students and postdoctoral researchers have continued to shape the machine learning field after leaving his group. Among many others, Andrew Ng, Zoubin Ghahramani, Francis Bach, Tommi Jaakkola, Lawrence Saul, David Blei, Eric Xing, Martin Wainwright, Yee Whye Teh, Ben Taskar and Yoshua Bengio (all former students or postdocs of Jordan) have made significant contributions to the field.
References
1. Zeliadt, N. (2013). "Profile of Michael I. Jordan". Proceedings of the National Academy of Sciences. 110 (4): 1141–1143. doi:10.1073/pnas.1222664110. PMID 23341554.
2. Bio highlights of Prof. M. I. Jordan.
3. Jacobs, R. A.; Jordan, M. I.; Nowlan, S. J.; Hinton, G. E. (1991). "Adaptive Mixtures of Local Experts". Neural Computation. 3 (1): 79. doi:10.1162/neco.1991.3.1.79.
4. Blei, David M.; Ng, Andrew Y.; Jordan, Michael I. (2003). "Latent Dirichlet Allocation". Journal of Machine Learning Research. 3.
5. Jordan, Michael I., ed. Learning in Graphical Models. Proceedings of the NATO Advanced Study Institute, Ettore Majorana Centre, Erice, Italy, September 27 – October 7, 1996.
6. Vitae, Michael I. Jordan, Department of Electrical Engineering and Computer Science, Department of Statistics, University of California. Accessed September 3, 2013.
7. "Hierarchical mixtures of experts and the EM algorithm". Neural Computation. 1994. doi:10.1109/IJCNN.1993.716791. Retrieved 2015-12-19.
8. Editorial board of the Kluwer journal Machine Learning: resignation letter (2001).
9. "Long Nguyen's Publications". Stat.lsa.umich.edu. Retrieved 2012-05-21.
10. "ACM Names 41 Fellows from World's Leading Institutions". Association for Computing Machinery (Acm.org). Retrieved 2012-05-21.
External links
- Homepage (at University of California, Berkeley)
- Published papers (chronological)