Michael A Osborne is an expert in the development of intelligent algorithms capable of making sense of complex big data. His work in Machine Learning and non-parametric data analytics has been successfully applied in diverse and challenging contexts. For example, in astrostatistics, Michael’s probabilistic algorithms have aided the detection of planets in distant solar systems, and in autonomous robotics, his work has enabled self-driving cars to determine when their maps may have changed due to roadworks.
More recently, he has addressed key societal challenges, analysing how intelligent algorithms might soon substitute for human workers, and predicting the resulting impact on employment. Michael is an Associate Professor in Machine Learning, an Official Fellow of Exeter College, and a Faculty Member of the Oxford-Man Institute for Quantitative Finance, all at the University of Oxford.
Within machine learning, Michael has particular expertise in Gaussian processes, active learning, Bayesian optimisation and Bayesian quadrature, with a particular focus on the emerging field of probabilistic numerics.
He has applied machine learning to scientific and engineering problems in fields as diverse as exoplanet search, finance, crystallography, autonomous robotics, and the study of pigeon navigation. He is also interested in how new technologies, particularly machine learning, are changing the nature of work.
Technology & Employment
Machine intelligence will underpin many of the most profound developments in the 21st century; Michael works both in developing such algorithms and in considering the broader societal consequences of automation. Towards this latter goal, Michael and economist Carl Frey have studied the societal and economic impact of new technologies, along with the differing and changing capabilities of both man and machine.
Bayesian optimisation is the use of probabilistic modelling techniques to perform the global optimisation of black-box functions.
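To make the idea concrete, here is a minimal pure-Python sketch of Bayesian optimisation: a Gaussian-process surrogate is fitted to the evaluations gathered so far, and an upper-confidence-bound acquisition function chooses where to evaluate next. All function names, kernel choices, and parameter values below are illustrative assumptions, not drawn from any particular library or from Michael's own implementations.

```python
import math
import random

def rbf(x1, x2, ls=0.3):
    """Squared-exponential (RBF) kernel on scalars."""
    return math.exp(-0.5 * ((x1 - x2) / ls) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(X, y, xs, noise=1e-4):
    """Zero-mean GP posterior mean and variance at a query point xs."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    k_star = [rbf(x, xs) for x in X]
    mean = sum(ks * a for ks, a in zip(k_star, alpha))
    v = solve(K, k_star)
    var = rbf(xs, xs) - sum(ks * vi for ks, vi in zip(k_star, v))
    return mean, max(var, 0.0)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, beta=2.0):
    """Maximise black-box f via a GP surrogate and UCB acquisition."""
    random.seed(0)
    X = [random.uniform(*bounds) for _ in range(n_init)]
    y = [f(x) for x in X]
    grid = [bounds[0] + i * (bounds[1] - bounds[0]) / 200 for i in range(201)]
    for _ in range(n_iter):
        def ucb(x):
            m, v = gp_posterior(X, y, x)
            return m + beta * math.sqrt(v)
        x_next = max(grid, key=ucb)  # evaluate where the optimistic bound is highest
        X.append(x_next)
        y.append(f(x_next))
    best = max(range(len(y)), key=lambda i: y[i])
    return X[best], y[best]

# Maximise a toy black-box function whose peak is at x = 0.7.
x_best, y_best = bayes_opt(lambda x: -(x - 0.7) ** 2)
```

Each iteration refits the surrogate to all observations and trades off exploitation (high posterior mean) against exploration (high posterior variance), which is the defining loop of Bayesian optimisation.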
Probabilistic Numerics is the study of numerical methods as learning algorithms.
Active Inference for Sensor Networks
Sensor networks are a growing source of complex data, offering insight into many environmental and human phenomena, but demanding novel forms of information processing.
Fault, Changepoint and Anomaly Tolerant Inference
Fast and reliable fault and changepoint detection is vital in achieving near-zero-breakdown performance for many real-world systems.
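One classical sketch of changepoint detection, assuming only a known pre-change mean, is the CUSUM statistic, which raises an alarm once the cumulative deviation of a stream from its nominal level grows too large. The parameter values below are illustrative, not drawn from Michael's work.

```python
import random

def cusum(samples, target_mean, threshold=5.0, drift=0.5):
    """Two-sided CUSUM: return the index of the first sample at which the
    accumulated deviation from target_mean exceeds threshold, else None."""
    g_pos = g_neg = 0.0
    for i, x in enumerate(samples):
        g_pos = max(0.0, g_pos + (x - target_mean) - drift)  # upward shifts
        g_neg = max(0.0, g_neg - (x - target_mean) - drift)  # downward shifts
        if g_pos > threshold or g_neg > threshold:
            return i  # change detected at this sample
    return None  # no change detected

# A noisy stream whose mean jumps from 0 to 2 at index 50.
random.seed(1)
stream = ([random.gauss(0, 1) for _ in range(50)]
          + [random.gauss(2, 1) for _ in range(50)])
alarm = cusum(stream, target_mean=0.0)
```

The `drift` term discounts small fluctuations so that noise alone rarely triggers an alarm, while a sustained mean shift accumulates quickly past the threshold.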
Machine learning will play a central role in many of the key aspects of delivering and storing energy in future.
Michael, along with Professor Stephen J Roberts, supervises DPhil (PhD) students in the Machine Learning Research Group.
C24 Advanced Probability Theory
This course is less esoteric than its name might suggest, providing a grounded review of probabilistic tools relevant to many applications in data science and predictive analytics. In particular, the course will cover model comparison, the value of information, and Gaussian processes.
This course covers maximum likelihood estimation and compares it with Bayesian inference. The course concludes by introducing decision theory and classification.
Data, Estimation and Inference
This course will be taught within the AIMS CDT. It will include an introduction to probability theory, including various properties of distributions and their representation as belief networks. It covers maximum likelihood estimation and compares it with Bayesian inference, before introducing decision theory and classification. The course continues by covering Gaussian processes.
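The contrast between maximum likelihood and Bayesian inference can be made concrete with a small worked example (illustrative only, not course material): estimating the bias of a coin from ten flips.

```python
# Observed coin flips: 1 = heads, 0 = tails (7 heads in 10 tosses).
flips = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
n_heads = sum(flips)
n = len(flips)

# Maximum likelihood: the estimate is simply the observed frequency.
theta_mle = n_heads / n  # 0.7

# Bayesian inference with a uniform Beta(1, 1) prior: the posterior is
# Beta(1 + heads, 1 + tails), whose mean is (heads + 1) / (n + 2).
alpha, beta = 1 + n_heads, 1 + (n - n_heads)
theta_post_mean = alpha / (alpha + beta)  # 8 / 12, about 0.667
```

The posterior mean is pulled slightly towards 0.5 by the prior, and the difference between the two estimates shrinks as more data are observed.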
C19 Machine Learning: Inference
Machine learning is a rapidly developing field that lies at the intersection of statistics and computer science. Its growth can be partially attributed to the vast quantities of data (known as "big data") that are now routinely being captured in science and industry. Machine learning aims to find means of taking decisions upon data that are both computationally tractable and statistically principled. This course will briefly introduce core concepts in machine learning: supervised vs unsupervised learning; nearest neighbour methods; the challenge of generalisation; and approximate inference (in the form of Laplace's method and variational methods).
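One of the core concepts above, nearest-neighbour classification, fits in a few lines of Python; the sketch below is illustrative, not course material.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify query by majority vote among the k nearest training points
    under Euclidean distance. train is a list of (point, label) pairs."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Two toy clusters of labelled points.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b"), ((1.1, 0.9), "b")]
label = knn_classify(train, (0.15, 0.1))  # all three nearest points are "a"
```

The method is statistically principled but stores all training data, illustrating the tension the course highlights between statistical and computational considerations.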
B16 Object Oriented Programming
This course provides a brief introduction to object-oriented design, along with an introduction to C++. It covers: encapsulation; constructors; implementation and interface; functions and operators; inheritance and composition; polymorphism; and templates.