Vladimir Vapnik: Predicates, Invariants, and the Essence of Intelligence
1 hr 45 min

Vladimir Vapnik is the co-inventor of support vector machines, support vector clustering, VC theory, and many other foundational ideas in statistical learning. He was born in the Soviet Union and worked at the Institute of Control Sciences in Moscow, then moved to the US, where he worked at AT&T, NEC Labs, and Facebook AI Research; he is now a professor at Columbia University. His work has been cited over 200,000 times.

This conversation is part of the Artificial Intelligence podcast. If you would like more information about this podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube, where you can watch video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow it on Spotify, or support it on Patreon.

This episode is presented by Cash App. Download it (App Store, Google Play) and use the code “LexPodcast”.

Here’s the outline of the episode. On some podcast players you should be able to click a timestamp to jump to that point in the conversation.

00:00 – Introduction
02:55 – Alan Turing: science and engineering of intelligence
09:09 – What is a predicate?
14:22 – Plato’s world of ideas and world of things
21:06 – Strong and weak convergence
28:37 – Deep learning and the essence of intelligence
50:36 – Symbolic AI and logic-based systems
54:31 – How hard is 2D image understanding?
1:00:23 – Data
1:06:39 – Language
1:14:54 – Beautiful idea in statistical theory of learning
1:19:28 – Intelligence and heuristics
1:22:23 – Reasoning
1:25:11 – Role of philosophy in learning theory
1:31:40 – Music (speaking in Russian)
1:35:08 – Mortality
