Dr. Yoshua Bengio
Professor, department of computer science and operations research, Université de Montréal, Canada
Yoshua Bengio (born March 5, 1964, in Paris, France) grew up in France and in Quebec, Canada. He attended McGill University (Montreal, Quebec, Canada), graduating with a B.Eng. in 1986, an M.Sc. in computer science in 1988, and a Ph.D. in computer science in 1991. His thesis was on neural networks and hidden Markov models (HMMs) and their combination in discriminant learning algorithms. He pursued his machine learning research in post-doctoral internships, first at MIT (Brain and Cognitive Sciences) with Michael Jordan and then at AT&T Bell Labs in the group of Larry Jackel, Yann Le Cun, and Vladimir Vapnik. His most noted contribution from these post-doctoral studies was a series of papers showing the fundamental limitations of gradient-based learning for parametrized dynamical systems such as recurrent neural networks and HMMs. Since 1993 he has been a faculty member in the department of computer science and operations research, Université de Montréal, Canada. Since 2000 he has held a Canada Research Chair in statistical learning algorithms.
Throughout his career he has sought to understand the principles of learning that may one day provide computers with intelligence. In this quest he has focused on the fundamental limitations of current learning algorithms, with the aim of designing algorithms that bypass these limitations. In the nineties he outlined the difficulties of learning to represent context. In recent years he has focused on the limitations of shallow architectures, and on strategies for dealing with the difficult optimization and inference problems that deeper architectures present.