AFIT’s Provost Dr. Sivaguru Sritharan presents an AFIT coin to Dr. Vladimir Vapnik (left) and Dr. Rauf Izmailov (right) following their presentation on Learning Using Statistical Invariants as part of the AFRL – AFIT Distinguished Speaker Series on 24 Oct 2018.
The Air Force Research Laboratory and the Air Force Institute of Technology hosted Dr. Vladimir Vapnik and Dr. Rauf Izmailov for a presentation titled "Learning Using Statistical Invariants (Revision of Learning Theory)" as part of the AFRL – AFIT Distinguished Speaker Series. Dr. Doug Riecken, Program Officer for Science of Information, Computation, Learning and Fusion at the Air Force Office of Scientific Research, was instrumental in organizing this exciting event.
The talk focused on a new learning paradigm. In the classical paradigm, the learning machine uses a purely data-driven model of learning. In the Learning Using Statistical Invariants (LUSI) paradigm, the learning machine computes statistical invariants that are specific to the problem and then minimizes the expected error in a way that preserves these invariants; learning is thus driven by both data and intelligence. Mathematically, methods of the new paradigm employ both strong and weak convergence mechanisms, increasing the rate of convergence. LUSI describes a complete theory of learning and can be considered a mathematical alternative to "deep learning."
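The invariant-preserving minimization described above can be sketched as an equality-constrained least-squares problem: the model is fit to the training data while being forced to reproduce the empirical averages of a few problem-specific predicates exactly. The toy data, the choice of predicates, and the linear model below are hypothetical illustrations of the idea, not the speakers' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y is linear in x plus noise.
l, d = 200, 3
X = rng.normal(size=(l, d))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.3 * rng.normal(size=l)

# Predicates psi_k(x) defining the statistical invariants; here the
# constant function 1 and the first coordinate (illustrative choices).
Psi = np.stack([np.ones(l), X[:, 0]])   # shape (m, l)

# Invariant constraints for a linear model f(x) = w . x:
#   sum_i psi_k(x_i) f(x_i) = sum_i psi_k(x_i) y_i,  i.e.  (Psi X) w = Psi y
A = Psi @ X                             # (m, d) constraint matrix
b = Psi @ y                             # (m,)   constraint targets

# Minimize ||X w - y||^2 subject to A w = b via the KKT linear system:
#   [ 2 X^T X   A^T ] [   w    ]   [ 2 X^T y ]
#   [    A       0  ] [ lambda ] = [    b    ]
m = A.shape[0]
KKT = np.block([[2 * X.T @ X, A.T],
                [A, np.zeros((m, m))]])
rhs = np.concatenate([2 * X.T @ y, b])
w = np.linalg.solve(KKT, rhs)[:d]

# The fitted model reproduces the chosen empirical invariants exactly
# while still fitting the data in the least-squares sense.
print(np.allclose(A @ w, b))
```

The point of the sketch is the two roles played by the training set: the same data drive the least-squares fit (strong, data-driven mechanism) and supply the invariant targets that the solution must preserve (weak, predicate-driven mechanism).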
Professor Vapnik is considered the father of machine learning for pioneering ideas such as the Vapnik–Chervonenkis dimension, which characterizes the learnability of a class of hypotheses; Support Vector Machines, which separate data into classes; and, most recently, Learning Using Statistical Invariants.
“During his lecture Dr. Vapnik emphasized and advised the Wright-Patt community on the importance of grasping the mathematical foundations of machine learning at the deepest level first prior to implementation,” said Dr. Sivaguru Sritharan, AFIT’s Provost.
Dr. Vapnik also challenged the audience by pointing to the famous theorem of George Cybenko, which seems to question the need for deep neural networks. He presented a refreshing look at machine learning using inherent statistical invariants and posed a challenge problem to the audience: demonstrate that LUSI can produce better results than deep neural networks (DNNs) with much less effort.
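For reference, the Cybenko result alluded to is the 1989 universal approximation theorem, which shows that a single hidden layer of sigmoidal units already suffices to approximate any continuous function on the unit cube; this is why it can be read as questioning the need for depth:

```latex
% Cybenko (1989): for any continuous sigmoidal function \sigma,
% finite sums of the form
G(x) = \sum_{j=1}^{N} \alpha_j \,\sigma\!\left(w_j^{\top} x + \theta_j\right)
% are dense in C([0,1]^n): for every f \in C([0,1]^n) and \varepsilon > 0
% there exist N, \alpha_j, w_j, \theta_j such that
\sup_{x \in [0,1]^n} \left\lvert f(x) - G(x) \right\rvert < \varepsilon .
```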
“The Wright-Patt artificial intelligence (AI) and machine learning community was galvanized by Dr. Vapnik's distinguished lecture. AI and machine learning are considered a top-priority scientific discipline for the Department of Defense, and this year's annual AFRL-AFIT distinguished lecture with Dr. Vapnik has raised the bar on this annual event," said Dr. Sritharan.
Dr. Vladimir Vapnik has taught and researched in computer science and in theoretical and applied statistics for over 30 years. His major achievements include a general theory of minimizing the expected risk using empirical data and a new type of learning machine, the Support Vector Machine, that possesses a high level of generalization ability. These techniques have been used in constructing intelligent machines. Vapnik earned a master's degree in mathematics from Uzbek State University, Samarkand, USSR, in 1958 and completed his Ph.D. in statistics at the Institute of Control Sciences, Moscow, in 1964, where he became Head of the Computer Science Research Department before joining AT&T Bell Laboratories in New Jersey. He has held a position as Professor of Computer Science and Statistics at Royal Holloway, University of London, since 1995, and a position as Professor of Computer Science at Columbia University, New York City, since 2003. Dr. Vapnik was inducted into the U.S. National Academy of Engineering in 2006, and most recently he received the 2018 Kolmogorov Medal from the University of London.
Dr. Rauf Izmailov is a Senior Research Scientist at Perspecta Labs and an established researcher in mathematical and computer models for networking and control systems, machine learning, optimization, and statistical data analysis. He has more than 20 years of industry experience (including AT&T Bell Labs and NEC Labs America) in research and technical leadership of R&D teams. With Dr. Vapnik, he co-invented the new machine learning paradigm, Learning Using Privileged Information. He was the principal investigator on the Defense Advanced Research Projects Agency’s Probabilistic Programming for Advanced Machine Learning program and is currently the PI on the DARPA Data-Driven Discovery of Models program and the analytics task leader on the DARPA Leveraging the Analog Domain for Security program. He is also a co-PI with Dr. Vapnik on the Air Force Office of Scientific Research program Science of Information, Computation, Learning and Fusion.