
Dr. Lex Fridman: Machines, Creativity & Love | Huberman Lab Podcast #29



link |
00:00:00.000
Welcome to the Huberman Lab Podcast,
link |
00:00:02.260
where we discuss science and science-based tools
link |
00:00:04.900
for everyday life.
link |
00:00:09.300
I'm Andrew Huberman, and I'm a professor of neurobiology
link |
00:00:12.040
and ophthalmology at Stanford School of Medicine.
link |
00:00:15.020
Today, I have the pleasure of introducing Dr. Lex Fridman
link |
00:00:17.900
as our guest on the Huberman Lab Podcast.
link |
00:00:21.100
Dr. Fridman is a researcher at MIT
link |
00:00:23.180
specializing in machine learning, artificial intelligence,
link |
00:00:26.540
and human-robot interactions.
link |
00:00:29.340
I must say that the conversation with Lex
link |
00:00:31.820
was without question one of the most fascinating
link |
00:00:34.980
conversations that I've ever had,
link |
00:00:36.600
not just in my career, but in my lifetime.
link |
00:00:39.440
I knew that Lex worked on these topics,
link |
00:00:41.500
and I think many of you are probably familiar with Lex
link |
00:00:43.940
and his interest in these topics
link |
00:00:45.180
from his incredible podcast, the Lex Fridman Podcast.
link |
00:00:48.260
If you're not already watching that podcast,
link |
00:00:50.240
please subscribe to it, it is absolutely fantastic.
link |
00:00:53.740
But in holding this conversation with Lex,
link |
00:00:56.040
I realized something far more important.
link |
00:00:58.500
He revealed to us a bit of his dream,
link |
00:01:00.900
his dream about humans and robots,
link |
00:01:03.360
about humans and machines,
link |
00:01:04.980
and about how those interactions can change the way
link |
00:01:07.520
that we perceive ourselves
link |
00:01:08.900
and that we interact with the world.
link |
00:01:10.740
We discuss relationships of all kinds,
link |
00:01:13.060
relationships with animals, relationships with friends,
link |
00:01:16.420
relationships with family, and romantic relationships.
link |
00:01:20.420
And we discuss relationships with machines,
link |
00:01:23.200
machines that move and machines that don't move,
link |
00:01:26.500
and machines that come to understand us in ways
link |
00:01:28.980
that we could never understand for ourselves,
link |
00:01:31.700
and how those machines can educate us about ourselves.
link |
00:01:35.720
Before this conversation,
link |
00:01:37.420
I had no concept of the ways in which machines
link |
00:01:40.060
could inform me or anyone about ourselves.
link |
00:01:43.700
By the end, I was absolutely taken with the idea,
link |
00:01:46.780
and I'm still taken with the idea
link |
00:01:48.540
that interactions with machines of a very particular kind,
link |
00:01:51.920
a kind that Lex understands and wants to bring to the world,
link |
00:01:55.180
can not only transform the self,
link |
00:01:57.360
but may very well transform humanity.
link |
00:02:00.100
So whether you're familiar
link |
00:02:01.340
with Dr. Lex Fridman or not,
link |
00:02:03.320
I'm certain you're going to learn a tremendous amount
link |
00:02:05.340
from him during the course of our discussion,
link |
00:02:07.500
and that it will transform the way
link |
00:02:08.940
that you think about yourself and about the world.
link |
00:02:12.060
Before we begin, I want to mention
link |
00:02:13.860
that this podcast is separate
link |
00:02:15.100
from my teaching and research roles at Stanford.
link |
00:02:17.560
It is, however, part of my desire and effort
link |
00:02:19.960
to bring zero-cost-to-consumer information about science
link |
00:02:22.740
and science-related tools to the general public.
link |
00:02:25.560
In keeping with that theme,
link |
00:02:26.720
I'd like to thank the sponsors of today's podcast.
link |
00:02:29.580
Our first sponsor is ROKA.
link |
00:02:31.420
ROKA makes sunglasses and eyeglasses
link |
00:02:33.340
that are of absolutely phenomenal quality.
link |
00:02:35.620
The company was founded
link |
00:02:36.500
by two all-American swimmers from Stanford,
link |
00:02:38.500
and everything about the sunglasses and eyeglasses
link |
00:02:40.760
they've designed was built with performance in mind.
link |
00:02:43.900
Now, I've spent a career working on the visual system,
link |
00:02:46.220
and one of the fundamental issues
link |
00:02:47.700
that your visual system has to deal with
link |
00:02:49.700
is how to adjust what you see when it gets darker
link |
00:02:52.780
or brighter in your environment.
link |
00:02:54.600
With ROKA sunglasses and eyeglasses,
link |
00:02:56.820
whether or not it's dim in the room or outside,
link |
00:02:58.920
whether or not there's cloud cover,
link |
00:03:00.100
or whether or not you walk into a shadow,
link |
00:03:01.540
you can always see the world with absolute clarity.
link |
00:03:04.300
And that just tells me that they really understand
link |
00:03:06.220
the way that the visual system works,
link |
00:03:07.620
processes like habituation and attenuation.
link |
00:03:10.140
All these things that work at a real mechanistic level
link |
00:03:12.520
have been built into these glasses.
link |
00:03:14.540
In addition, the glasses are very lightweight.
link |
00:03:16.740
You don't even notice really that they're on your face.
link |
00:03:19.060
And the quality of the lenses is terrific.
link |
00:03:21.780
Now, the glasses were also designed
link |
00:03:23.500
so that you could use them not just while working
link |
00:03:25.460
or at dinner, et cetera, but while exercising.
link |
00:03:28.580
They don't fall off your face or slip off your face
link |
00:03:30.620
if you're sweating.
link |
00:03:31.800
And as I mentioned, they're extremely lightweight,
link |
00:03:33.460
so you can use them while running,
link |
00:03:34.860
you can use them while cycling, and so forth.
link |
00:03:37.060
Also, the aesthetic of ROKA glasses is terrific.
link |
00:03:39.680
Unlike a lot of performance glasses out there,
link |
00:03:41.680
which frankly make people look like cyborgs,
link |
00:03:44.580
these glasses look great.
link |
00:03:46.020
You can wear them out to dinner.
link |
00:03:47.200
You can wear them for essentially any occasion.
link |
00:03:50.460
If you'd like to try ROKA glasses,
link |
00:03:51.980
you can go to roka.com, that's R-O-K-A.com,
link |
00:03:55.460
and enter the code Huberman
link |
00:03:56.860
to save 20% off your first order.
link |
00:03:59.140
That's ROKA, R-O-K-A.com,
link |
00:04:01.060
and enter the code Huberman at checkout.
link |
00:04:03.500
Today's episode is also brought to us by InsideTracker.
link |
00:04:06.900
InsideTracker is a personalized nutrition platform
link |
00:04:09.400
that analyzes data from your blood and DNA
link |
00:04:11.980
to help you better understand your body
link |
00:04:13.560
and help you reach your health goals.
link |
00:04:15.700
I am a big believer in getting regular blood work done
link |
00:04:18.220
for the simple reason that many of the factors
link |
00:04:20.700
that impact our immediate and long-term health
link |
00:04:23.180
can only be assessed from a quality blood test.
link |
00:04:25.820
And now with the advent of quality DNA tests,
link |
00:04:28.460
we can also get insight into some of our genetic
link |
00:04:30.900
underpinnings of our current and long-term health.
link |
00:04:34.300
The problem with a lot of blood and DNA tests out there,
link |
00:04:36.700
however, is you get the data back
link |
00:04:38.500
and you don't know what to do with those data.
link |
00:04:40.480
You see that certain things are high
link |
00:04:41.760
or certain things are low,
link |
00:04:42.980
but you really don't know what the actionable items are,
link |
00:04:45.200
what to do with all that information.
link |
00:04:47.300
With InsideTracker, they make it very easy
link |
00:04:49.740
to act in the appropriate ways on the information
link |
00:04:52.580
that you get back from those blood and DNA tests.
link |
00:04:55.060
And that's through the use of their online platform.
link |
00:04:57.580
They have a really easy to use dashboard
link |
00:04:59.860
that tells you what sorts of things can bring the numbers
link |
00:05:02.860
for your metabolic factors, endocrine factors, et cetera,
link |
00:05:05.740
into the ranges that you want and need
link |
00:05:08.040
for immediate and long-term health.
link |
00:05:10.140
In fact, I know one individual just by way of example
link |
00:05:13.100
who was feeling good but decided to go
link |
00:05:15.280
with an InsideTracker test and discovered
link |
00:05:16.940
that they had high levels of what's called
link |
00:05:18.340
C-reactive protein.
link |
00:05:19.820
They would have never detected that otherwise.
link |
00:05:21.820
C-reactive protein is associated
link |
00:05:23.340
with a number of deleterious health conditions,
link |
00:05:25.900
some heart issues, eye issues, et cetera.
link |
00:05:28.180
And so they were able to take immediate action
link |
00:05:30.180
to try and resolve those CRP levels.
link |
00:05:33.380
And so with InsideTracker, you get that sort of insight.
link |
00:05:35.980
And as I mentioned before, without a blood or DNA test,
link |
00:05:38.420
there's no way you're going to get that sort of insight
link |
00:05:40.380
until symptoms start to show up.
link |
00:05:42.820
If you'd like to try InsideTracker,
link |
00:05:44.300
you can go to insidetracker.com slash Huberman
link |
00:05:47.360
to get 25% off any of InsideTracker's plans.
link |
00:05:50.260
You just use the code Huberman at checkout.
link |
00:05:52.680
That's insidetracker.com slash Huberman
link |
00:05:55.420
to get 25% off any of InsideTracker's plans.
link |
00:05:58.700
Today's podcast is brought to us by Athletic Greens.
link |
00:06:01.540
Athletic Greens is an all-in-one
link |
00:06:03.180
vitamin mineral probiotic drink.
link |
00:06:05.660
I started taking Athletic Greens way back in 2012.
link |
00:06:09.020
And so I'm delighted that they're sponsoring the podcast.
link |
00:06:11.860
The reason I started taking Athletic Greens
link |
00:06:13.740
and the reason I still take Athletic Greens
link |
00:06:15.820
is that it covers all of my vitamin mineral probiotic bases.
link |
00:06:19.440
In fact, when people ask me, what should I take?
link |
00:06:22.020
I always suggest that the first supplement people take
link |
00:06:24.460
is Athletic Greens for the simple reason
link |
00:06:26.940
that what it contains covers your bases
link |
00:06:29.480
for metabolic health, endocrine health,
link |
00:06:31.900
and all sorts of other systems in the body.
link |
00:06:33.980
And the inclusion of probiotics is essential
link |
00:06:36.500
for a healthy gut microbiome.
link |
00:06:38.900
There are now tons of data showing
link |
00:06:40.540
that we have neurons in our gut.
link |
00:06:42.820
And keeping those neurons healthy
link |
00:06:44.240
requires that they are exposed
link |
00:06:45.620
to what are called the correct microbiota,
link |
00:06:47.940
little microorganisms that live in our gut
link |
00:06:49.900
and keep us healthy.
link |
00:06:51.180
And those neurons in turn help keep our brain healthy.
link |
00:06:53.860
They influence things like mood, our ability to focus,
link |
00:06:56.500
and many, many other factors related to health.
link |
00:06:59.900
With Athletic Greens, it's terrific
link |
00:07:01.380
because it also tastes really good.
link |
00:07:03.160
I drink it once or twice a day.
link |
00:07:04.700
I mix mine with water and I add a little lemon juice
link |
00:07:07.060
or sometimes a little bit of lime juice.
link |
00:07:09.380
If you want to try Athletic Greens,
link |
00:07:11.080
you can go to athleticgreens.com slash Huberman.
link |
00:07:14.100
And if you do that, you can claim their special offer.
link |
00:07:16.660
They're giving away five free travel packs,
link |
00:07:18.540
the little packs that make it easy to mix up
link |
00:07:20.420
Athletic Greens while you're on the road.
link |
00:07:22.660
And they'll give you a year's supply of vitamin D3 and K2.
link |
00:07:26.260
Again, go to athleticgreens.com slash Huberman
link |
00:07:28.940
to claim that special offer.
link |
00:07:30.900
And now my conversation with Dr. Lex Fridman.
link |
00:07:34.620
We meet again.
link |
00:07:35.540
We meet again.
link |
00:07:36.500
Thanks so much for sitting down with me.
link |
00:07:39.600
I have a question that I think is on a lot of people's minds
link |
00:07:43.460
or ought to be on a lot of people's minds
link |
00:07:46.380
because we hear these terms a lot these days,
link |
00:07:50.020
but I think most people, including most scientists
link |
00:07:53.100
and including me, don't really know
link |
00:07:56.900
what is artificial intelligence and how is it different
link |
00:08:01.900
from things like machine learning and robotics?
link |
00:08:05.240
So if you would be so kind as to explain to us
link |
00:08:08.900
what is artificial intelligence and what is machine learning?
link |
00:08:14.020
Well, I think that question is as complicated
link |
00:08:17.100
and as fascinating as the question of what is intelligence.
link |
00:08:21.780
So I think of artificial intelligence first
link |
00:08:26.480
as a big philosophical thing.
link |
00:08:28.740
Pamela McCorduck said AI was born as an ancient wish
link |
00:08:37.180
to forge the gods.
link |
00:08:41.740
So I think at the big philosophical level,
link |
00:08:44.260
it's our longing to create other intelligent systems,
link |
00:08:48.300
perhaps systems more powerful than us.
link |
00:08:51.740
At the more narrow level, I think it's also a set of tools
link |
00:08:56.740
that are computational mathematical tools
link |
00:08:59.260
to automate different tasks.
link |
00:09:01.100
And then also it's our attempt to understand our own mind.
link |
00:09:05.620
So build systems that exhibit some intelligent behavior
link |
00:09:09.940
in order to understand what is intelligence
link |
00:09:12.900
in our own selves.
link |
00:09:14.580
So all those things are true.
link |
00:09:16.160
Of course, what AI really means as a community,
link |
00:09:19.340
as a set of researchers and engineers,
link |
00:09:21.500
it's a set of tools, a set of computational techniques
link |
00:09:25.300
that allow you to solve various problems.
link |
00:09:29.580
There's a long history that approaches the problem
link |
00:09:33.020
from different perspectives.
link |
00:09:34.220
One of the threads throughout that history,
link |
00:09:37.660
one of the communities goes under the flag
link |
00:09:40.260
of machine learning, which emphasizes,
link |
00:09:43.780
in the AI space, the task of learning.
link |
00:09:48.100
How do you make a machine that knows very little
link |
00:09:50.520
in the beginning, follows some kind of process
link |
00:09:53.800
and learns to become better and better
link |
00:09:56.260
at a particular task?
link |
00:09:58.260
What's been most effective in roughly the last 15 years
link |
00:10:03.660
is a set of techniques that fall under the flag
link |
00:10:05.700
of deep learning that utilize neural networks.
link |
00:10:08.660
Neural networks are these fascinating things
link |
00:10:12.120
inspired, very loosely, by the structure of the human brain.
link |
00:10:17.240
They're a network of these little basic
link |
00:10:20.540
computational units called neurons, artificial neurons.
link |
00:10:24.400
These architectures have an input
link |
00:10:27.580
and output, they know nothing in the beginning
link |
00:10:30.140
and they're tasked with learning something interesting.
link |
00:10:33.500
What that something interesting is,
link |
00:10:35.220
usually involves a particular task.
link |
00:10:38.220
There are a lot of ways to talk about this
link |
00:10:41.160
and break this down.
link |
00:10:42.000
Like one of them is how much human supervision
link |
00:10:45.880
is required to teach this thing.
link |
00:10:48.340
So supervised learning, this broad category,
link |
00:10:51.860
is where the neural network knows nothing in the beginning
link |
00:10:56.180
and then it's given a bunch of examples of,
link |
00:10:59.500
in computer vision, those would be examples of cats,
link |
00:11:02.020
dogs, cars, traffic signs, and then you're given the image
link |
00:11:06.080
and you're given the ground truth of what's in that image.
link |
00:11:09.540
And when you get a large database of such image examples
link |
00:11:13.040
where you know the truth, the neural network
link |
00:11:16.660
is able to learn by example,
link |
00:11:18.620
that's called supervised learning.
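(Editor's note: to make the supervised learning idea above concrete, here is a minimal sketch of a training loop on labeled examples. The toy dataset, the linear model, and the learning rate are all illustrative assumptions, not anything described in the episode.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled dataset: 200 "images" flattened to 64 features, 3 classes
# (say cat / dog / car). The ground-truth labels y are the human supervision.
X = rng.normal(size=(200, 64))
y = rng.integers(0, 3, size=200)

W = np.zeros((64, 3))  # weights of a bare-bones linear classifier

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for epoch in range(100):
    probs = softmax(X @ W)             # predicted class probabilities
    grad = probs.copy()
    grad[np.arange(len(y)), y] -= 1.0  # gradient of the cross-entropy loss
    W -= lr * (X.T @ grad) / len(y)    # learn by example: descend the loss

accuracy = (softmax(X @ W).argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```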
link |
00:11:20.460
There are a lot of fascinating questions
link |
00:11:22.760
within that, such as: how do you provide the truth?
link |
00:11:26.040
When you've given an image of a cat,
link |
00:11:30.180
how do you provide to the computer
link |
00:11:32.920
that this image contains a cat?
link |
00:11:34.920
Do you just say the entire image is a picture of a cat?
link |
00:11:37.980
Do you do what's very commonly been done,
link |
00:11:40.340
which is a bounding box?
link |
00:11:41.540
You have a very crude box around the cat's face
link |
00:11:45.060
saying this is a cat.
link |
00:11:46.520
Do you do semantic segmentation?
link |
00:11:48.720
Mind you, this is a 2D image of a cat.
link |
00:11:51.000
The computer knows nothing
link |
00:11:53.940
about our three-dimensional world.
link |
00:11:55.660
It's just looking at a set of pixels.
link |
00:11:57.340
So semantic segmentation is drawing a nice,
link |
00:12:01.140
very crisp outline around the cat and saying that's a cat.
link |
00:12:04.920
It's really difficult to provide that truth.
link |
00:12:07.060
And one of the fundamental open questions
link |
00:12:09.500
in computer vision is,
link |
00:12:10.800
is that even a good representation of the truth?
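(Editor's note: the three labeling styles just described, whole-image label, bounding box, and per-pixel segmentation, can be pictured as data structures. This is a hypothetical sketch; no real dataset's format is implied.)

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImageLabel:
    class_name: str            # "the entire image is a picture of a cat"

@dataclass
class BoundingBox:
    class_name: str
    x: int
    y: int
    w: int
    h: int                     # a crude box around the cat's face

@dataclass
class SegmentationMask:
    class_name: str
    mask: np.ndarray           # per-pixel True/False: the crisp outline

# The cost of providing the "truth" rises sharply down this list:
# one word per image, four numbers per object, one decision per pixel.
cat_label = ImageLabel("cat")
cat_box = BoundingBox("cat", x=40, y=30, w=120, h=90)
cat_mask = SegmentationMask("cat", mask=np.zeros((224, 224), dtype=bool))
```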
link |
00:12:13.800
Now, there's another contrasting set of ideas.
link |
00:12:18.540
They're adjacent, they're overlapping;
link |
00:12:21.120
what used to be called unsupervised learning,
link |
00:12:24.340
what's commonly now called self-supervised learning,
link |
00:12:27.140
which is trying to get less and less
link |
00:12:29.420
and less human supervision into the task.
link |
00:12:33.980
So self-supervised learning
link |
00:12:38.820
has been very successful in the domain of language models,
link |
00:12:42.020
natural language processing,
link |
00:12:43.180
and is now more and more successful
link |
00:12:44.960
in computer vision tasks.
link |
00:12:46.460
And the idea there is,
link |
00:12:48.860
let the machine without any ground truth annotation,
link |
00:12:53.160
just look at pictures on the internet
link |
00:12:55.740
or look at texts on the internet
link |
00:12:57.580
and try to learn something generalizable
link |
00:13:02.300
about the ideas that are at the core of language
link |
00:13:05.800
or at the core of vision.
link |
00:13:07.180
And based on that,
link |
00:13:09.580
build what we humans, at our best, like to call common sense.
link |
00:13:12.900
So we have this giant base of knowledge
link |
00:13:15.980
on top of which we build more sophisticated knowledge,
link |
00:13:18.700
but we have this kind of common sense knowledge.
link |
00:13:21.380
And so the idea with self-supervised learning
link |
00:13:23.380
is to build this common sense knowledge
link |
00:13:25.980
about what are the fundamental visual ideas
link |
00:13:30.420
that make up a cat and a dog and all those kinds of things
link |
00:13:33.260
without ever having human supervision.
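(Editor's note: a minimal sketch of the self-supervised idea just described: the "label" is manufactured from the raw data itself, here by masking a word and predicting it from its neighbors, so no human annotation is needed. The tiny corpus and the counting model are toy assumptions.)

```python
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Count how often each middle word appears between a (left, right) pair.
# The supervision signal comes from the text itself, not a human labeler.
counts = np.ones((V, V, V))  # start at 1 for Laplace smoothing
for left, mid, right in zip(corpus, corpus[1:], corpus[2:]):
    counts[idx[left], idx[right], idx[mid]] += 1

def predict_masked(left, right):
    """Guess the masked middle word from its two neighbors."""
    dist = counts[idx[left], idx[right]]
    return vocab[int(dist.argmax())]

# "the [MASK] sat" -> learned from raw text alone, no labels provided
print(predict_masked("the", "sat"))  # likely "cat" or "dog"
```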
link |
00:13:35.780
The dream there is you just let an AI system
link |
00:13:40.780
that's self-supervised run around the internet for a while,
link |
00:13:44.840
watch YouTube videos for millions and millions of hours,
link |
00:13:47.480
and without any supervision be primed and ready
link |
00:13:52.080
to actually learn with very few examples
link |
00:13:54.640
once the human is able to show up.
link |
00:13:56.680
We think of children in this way, human children,
link |
00:14:00.160
where your parents only give one or two examples
link |
00:14:03.040
to teach a concept.
link |
00:14:04.600
The dream with self-supervised learning
link |
00:14:07.040
is that it would be the same with machines,
link |
00:14:10.080
that they would watch millions of hours of YouTube videos
link |
00:14:13.960
and then come to a human and be able to understand
link |
00:14:16.760
when the human shows them this is a cat.
link |
00:14:19.360
Like, remember this is a cat.
link |
00:14:20.800
They will understand that a cat is not just a thing
link |
00:14:23.540
with pointy ears or a cat is a thing that's orange
link |
00:14:27.520
or is furry, they'll see something more fundamental
link |
00:14:30.800
that we humans might not actually be able
link |
00:14:32.720
to introspect and understand.
link |
00:14:34.400
Like if I asked you what makes a cat versus a dog,
link |
00:14:36.800
you probably wouldn't be able to answer that.
link |
00:14:39.400
But if I brought you a cat and a dog,
link |
00:14:42.760
you'll be able to tell the difference.
link |
00:14:44.400
What are the ideas that your brain uses
link |
00:14:47.040
to make that difference?
link |
00:14:48.800
That's the whole dream of self-supervised learning
link |
00:14:51.280
that it would be able to learn that on its own,
link |
00:14:53.880
that set of common sense knowledge
link |
00:14:56.080
that's able to tell the difference.
link |
00:14:57.880
And then there's like a lot of incredible uses
link |
00:15:01.600
of self-supervised learning,
link |
00:15:04.000
one, very weirdly, called the self-play mechanism.
link |
00:15:07.280
That's the mechanism behind the reinforcement learning
link |
00:15:11.720
successes of the systems that won at Go,
link |
00:15:15.720
like AlphaZero, that won at chess.
link |
00:15:18.840
Oh, I see, that play games.
link |
00:15:20.800
That play games. Got it.
link |
00:15:21.880
So the idea of self-play probably applies
link |
00:15:26.000
to other domains than just games,
link |
00:15:27.880
It's a system that just plays against itself.
link |
00:15:30.840
And this is fascinating in all kinds of domains,
link |
00:15:33.580
but it knows nothing in the beginning.
link |
00:15:36.600
And the whole idea is it creates a bunch of mutations
link |
00:15:39.540
of itself and plays against those versions of itself.
link |
00:15:46.620
And the fascinating thing is when you play against systems
link |
00:15:50.220
that are a little bit better than you,
link |
00:15:51.820
you start to get better yourself.
link |
00:15:53.720
Like learning, that's how learning happens.
link |
00:15:56.120
That's true for martial arts,
link |
00:15:57.280
that's true in a lot of cases,
link |
00:15:59.240
where you want to be interacting with systems
link |
00:16:02.040
that are just a little better than you.
link |
00:16:03.920
And then through this process of interacting
link |
00:16:06.320
with systems just a little better than you,
link |
00:16:08.120
you start following this process where everybody
link |
00:16:10.480
starts getting better and better and better and better
link |
00:16:12.640
until you are several orders of magnitude better
link |
00:16:15.480
than the world champion in chess, for example.
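(Editor's note: a toy sketch of the self-play loop described above: keep a champion policy, spawn a mutated copy, let them play against each other, keep whichever wins. The "game" is abstracted into a single skill number, purely for illustration.)

```python
import random

random.seed(0)

def mutate(policy):
    """Create a slightly perturbed copy (here a policy is just one number)."""
    return policy + random.gauss(0, 0.1)

def beats(a, b):
    """A noisy match between two policies: higher skill usually wins."""
    return a + random.gauss(0, 0.05) > b + random.gauss(0, 0.05)

champion = 0.0  # knows nothing in the beginning
for generation in range(1000):
    challenger = mutate(champion)    # a mutation of itself
    if beats(challenger, champion):  # they play against each other
        champion = challenger        # the slightly-better version survives

print(f"champion skill after self-play: {champion:.2f}")
```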
link |
00:16:18.080
And it's fascinating because it's like a runaway system.
link |
00:16:21.040
One of the most terrifying and exciting things
link |
00:16:23.560
that David Silver, the creator of AlphaGo and AlphaZero,
link |
00:16:27.200
one of the leaders of the team said to me is
link |
00:16:31.960
they haven't found the ceiling for AlphaZero,
link |
00:16:36.640
meaning it could just arbitrarily keep improving.
link |
00:16:39.360
Now in the realm of chess, that doesn't matter to us,
link |
00:16:41.840
that it's like, it just ran away with the game of chess.
link |
00:16:44.980
Like it's like just so much better than humans.
link |
00:16:48.360
But the question is if you can create that in the realm
link |
00:16:52.120
that does have a bigger, deeper effect on human beings,
link |
00:16:56.060
on societies, that can be a terrifying process.
link |
00:16:59.940
To me, it's an exciting process
link |
00:17:01.780
if you supervise it correctly,
link |
00:17:03.720
if you inject what's called value alignment,
link |
00:17:10.960
you make sure that the goals that the AI is optimizing
link |
00:17:13.500
are aligned with human beings and human societies.
link |
00:17:17.000
There's a lot of fascinating things to talk about
link |
00:17:19.200
within the specifics of neural networks
link |
00:17:23.200
and all the problems that people are working on.
link |
00:17:25.600
But I would say the really big exciting one
link |
00:17:28.280
is self-supervised learning,
link |
00:17:29.840
where we're trying to get less and less human supervision,
link |
00:17:35.520
less and less human supervision of neural networks.
link |
00:17:38.740
And also just to comment and I'll shut up.
link |
00:17:41.980
No, please keep going.
link |
00:17:43.100
I'm learning.
link |
00:17:44.120
I have questions, but I'm learning.
link |
00:17:45.480
So please keep going.
link |
00:17:46.480
So to me, what's exciting is not the theory,
link |
00:17:49.420
it's always the application.
link |
00:17:51.240
One of the most exciting applications
link |
00:17:52.920
of artificial intelligence,
link |
00:17:55.120
specifically neural networks and machine learning
link |
00:17:57.480
is Tesla Autopilot.
link |
00:17:59.140
So these are systems that are working in the real world.
link |
00:18:01.600
This isn't an academic exercise.
link |
00:18:03.600
This is human lives at stake.
link |
00:18:05.200
This is safety critical.
link |
00:18:07.200
These are automated vehicles, autonomous vehicles.
link |
00:18:09.480
Semi-autonomous, we want to be.
link |
00:18:11.280
Okay.
link |
00:18:12.320
We've gone through wars on these topics.
link |
00:18:15.000
Semi-autonomous.
link |
00:18:16.280
Semi-autonomous.
link |
00:18:17.120
So even though it's called FSD, full self-driving,
link |
00:18:22.560
it is currently not fully autonomous,
link |
00:18:24.960
meaning human supervision is required.
link |
00:18:27.740
So the human is tasked with overseeing the systems.
link |
00:18:30.920
In fact, liability-wise, the human is always responsible.
link |
00:18:35.220
This is a human factor psychology question,
link |
00:18:37.940
which is fascinating.
link |
00:18:39.360
I'm fascinated by the whole space,
link |
00:18:43.000
which is a whole other space of human-robot interaction.
link |
00:18:46.160
When AI systems and humans work together
link |
00:18:48.760
to accomplish tasks.
link |
00:18:49.960
That dance to me is one of the smaller communities,
link |
00:18:54.960
but I think it will be one of the most important
link |
00:18:58.280
open problems once it's solved,
link |
00:19:00.340
is how do humans and robots dance together?
link |
00:19:04.040
To me, semi-autonomous driving is one of those spaces.
link |
00:19:07.860
So for Elon, for example, he doesn't see it that way.
link |
00:19:11.680
He sees semi-autonomous driving as a stepping stone
link |
00:19:16.520
towards fully autonomous driving.
link |
00:19:18.620
Like humans and robots can't dance well together.
link |
00:19:22.720
Like humans and humans dance and robots and robots dance.
link |
00:19:25.320
Like we need to, this is an engineering problem.
link |
00:19:28.060
We need to design a perfect robot that solves this problem.
link |
00:19:31.680
To me forever, maybe this is not the case with driving,
link |
00:19:34.120
but the world is going to be full of problems
link |
00:19:37.140
where it's always humans and robots have to interact
link |
00:19:40.400
because I think robots will always be flawed,
link |
00:19:43.520
just like humans are going to be flawed, are flawed.
link |
00:19:47.460
And that's what makes life beautiful, that they're flawed.
link |
00:19:51.000
That's where learning happens at the edge
link |
00:19:53.800
of your capabilities.
link |
00:19:55.860
So you always have to figure out how can flawed robots
link |
00:20:00.320
and flawed humans interact together,
link |
00:20:03.800
such that the whole is bigger than the sum of the parts,
link |
00:20:08.360
as opposed to focusing on just building the perfect robot.
link |
00:20:12.540
So that's one of the most exciting applications,
link |
00:20:15.360
I would say, of artificial intelligence to me,
link |
00:20:17.800
is autonomous driving and semi-autonomous driving.
link |
00:20:20.640
And that's a really good example of machine learning
link |
00:20:23.240
because those systems are constantly learning.
link |
00:20:26.780
And there's a process there that maybe I can comment on.
link |
00:20:31.820
Andrej Karpathy, who's the head of Autopilot,
link |
00:20:34.360
calls it the data engine.
link |
00:20:36.360
And this process applies for a lot of machine learning,
link |
00:20:38.900
which is you build a system
link |
00:20:40.340
that's pretty good at doing stuff.
link |
00:20:42.200
You send it out into the real world.
link |
00:20:45.320
It starts doing the stuff and then it runs
link |
00:20:47.560
into what are called edge cases, like failure cases,
link |
00:20:50.640
where it screws up.
link |
00:20:52.640
You know, we do this as kids that, you know, you have-
link |
00:20:55.480
We do this as adults.
link |
00:20:56.320
We do this as adults, exactly.
link |
00:20:58.560
But we learn really quickly.
link |
00:21:00.120
But the whole point,
link |
00:21:01.320
and this is the fascinating thing about driving,
link |
00:21:03.640
is you realize there's millions of edge cases.
link |
00:21:06.960
There's just like weird situations that you did not expect.
link |
00:21:10.800
And so the data engine process is
link |
00:21:13.480
you collect those edge cases,
link |
00:21:15.080
and then you go back to the drawing board
link |
00:21:17.080
and learn from them.
link |
00:21:18.440
And so you have to create this data pipeline
link |
00:21:21.000
where all these cars,
link |
00:21:22.680
hundreds of thousands of cars, are driving around,
link |
00:21:25.320
and something weird happens.
link |
00:21:27.040
And so whenever this weird detector fires,
link |
00:21:30.880
which is another important concept,
link |
00:21:33.360
that piece of data goes back to the mothership
link |
00:21:37.400
for the training, for the retraining of the system.
link |
00:21:40.880
And through this data engine process,
link |
00:21:42.600
it keeps improving and getting better and better
link |
00:21:44.920
and better and better.
link |
00:21:45.880
So basically, you send a pretty clever AI system
link |
00:21:49.320
out into the world and let it find the edge cases.
link |
00:21:54.420
Let it screw up just enough
link |
00:21:56.600
to figure out where the edge cases are,
link |
00:21:58.560
and then go back and learn from them,
link |
00:22:00.800
and then send out that new version
link |
00:22:02.640
and keep updating that version.
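(Editor's note: a rough sketch of the data engine loop just described: deploy, flag low-confidence "weird" cases, have humans label them, retrain, redeploy. Everything here is a toy stand-in, not Tesla's actual pipeline.)

```python
import random

random.seed(0)

known_situations = set()  # what the deployed model has been trained on

def confidence(situation):
    """Toy model: confident only on situations it has already learned."""
    return 1.0 if situation in known_situations else 0.1

for version in range(3):
    # The fleet drives around and logs the situations it encounters.
    fleet_log = [random.randint(0, 50) for _ in range(200)]
    # The "weird detector" fires on low-confidence cases: the edge cases.
    edge_cases = {s for s in fleet_log if confidence(s) < 0.5}
    # Humans annotate what actually happened, and the model retrains on it.
    known_situations |= edge_cases
    print(f"version {version}: {len(edge_cases)} edge cases sent back")
```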
link |
00:22:04.240
Is the updating done by humans?
link |
00:22:06.320
The annotation is done by humans.
link |
00:22:09.460
So you have to,
link |
00:22:11.040
the weird examples come back, the edge cases,
link |
00:22:14.800
and you have to label what actually happened in there.
link |
00:22:17.680
There's also some mechanisms for automatically labeling,
link |
00:22:23.140
but mostly I think you always have to rely on humans
link |
00:22:25.840
to improve, to understand what's happening
link |
00:22:28.080
in the weird cases.
link |
00:22:30.000
And then there's a lot of debate.
link |
00:22:31.760
And that's the other thing,
link |
00:22:32.640
what is artificial intelligence?
link |
00:22:34.440
Which is a bunch of smart people
link |
00:22:36.800
having very different opinions about what is intelligence.
link |
00:22:39.740
So AI is basically a community of people
link |
00:22:41.880
who don't agree on anything.
link |
00:22:43.960
It seems to be the case.
link |
00:22:45.800
And first of all, this is a beautiful description of terms
link |
00:22:48.660
that I've heard many times among my colleagues at Stanford,
link |
00:22:51.900
at meetings, in the outside world.
link |
00:22:53.760
And there's so many fascinating things.
link |
00:22:55.940
I have so many questions,
link |
00:22:56.960
but I do want to ask one question about the culture of AI,
link |
00:23:00.200
because it does seem to be a community where,
link |
00:23:02.640
at least as an outsider,
link |
00:23:03.960
where it seems like there's very little consensus
link |
00:23:06.160
about what the terms
link |
00:23:07.280
and the operational definitions even mean.
link |
00:23:09.620
And there seems to be a lot of splitting happening now
link |
00:23:12.340
of not just supervised and unsupervised learning,
link |
00:23:14.880
but these sort of intermediate conditions
link |
00:23:18.080
where machines are autonomous,
link |
00:23:20.780
but then go back for more instruction,
link |
00:23:22.120
like kids go home from college during the summer
link |
00:23:24.000
and get a little, mom still feeds them,
link |
00:23:26.240
then eventually they leave the nest kind of thing.
link |
00:23:29.760
Is there something in particular about engineers
link |
00:23:32.640
or about people in this realm of engineering
link |
00:23:35.820
that you think lends itself to disagreement?
link |
00:23:39.080
Yeah, I think, so first of all, the more specific you get,
link |
00:23:43.440
the less disagreement there is.
link |
00:23:44.640
So there's a lot of disagreement
link |
00:23:45.900
about what is artificial intelligence,
link |
00:23:47.880
but there's less disagreement about what is machine learning
link |
00:23:50.640
and even less when you talk about active learning
link |
00:23:52.680
or machine teaching or self-supervised learning.
link |
00:23:56.600
And then when you get into NLP language models
link |
00:23:59.780
or transformers,
link |
00:24:00.620
when you get into specific neural network architectures,
link |
00:24:03.780
there's less and less and less disagreement
link |
00:24:05.620
about those terms.
link |
00:24:06.660
So you might be hearing the disagreement
link |
00:24:08.060
about the high-level terms,
link |
00:24:09.340
and that has to do with the fact that engineering,
link |
00:24:12.060
especially when you're talking about intelligent systems
link |
00:24:15.580
is a little bit of an art and a science.
link |
00:24:20.840
So the art part is the thing that creates disagreements
link |
00:24:25.300
because then you start having disagreements
link |
00:24:28.620
about how easy or difficult a particular problem is.
link |
00:24:33.860
For example, a lot of people disagree with Elon,
link |
00:24:37.580
how difficult the problem of autonomous driving is.
link |
00:24:41.100
And so, but nobody knows.
link |
00:24:43.120
So there's a lot of disagreement
link |
00:24:44.340
about what are the limits of these techniques.
link |
00:24:47.160
And through that, the terminology also contains within it,
link |
00:24:50.560
the disagreements.
link |
00:24:53.940
But overall, I think it's also a young science
link |
00:24:56.760
that also has to do with that.
link |
00:24:58.820
So like, it's not just engineering,
link |
00:25:01.180
it's that artificial intelligence truly
link |
00:25:03.960
as a large-scale discipline, where it's thousands,
link |
00:25:06.800
tens of thousands, hundreds of thousands of people
link |
00:25:09.420
working on it, huge amounts of money being made.
link |
00:25:11.660
That's a very recent thing.
link |
00:25:13.820
So we're trying to figure out those terms.
link |
00:25:16.460
And of course there's egos and personalities
link |
00:25:18.860
and a lot of fame to be made,
link |
00:25:22.620
like the term deep learning, for example.
link |
00:25:25.740
Neural networks have been around for many, many decades,
link |
00:25:28.020
since the sixties, you can argue since the forties.
link |
00:25:30.860
So there was a rebranding of neural networks
link |
00:25:33.440
into the term deep learning,
link |
00:25:36.880
which was part of the reinvigoration of the field.
link |
00:25:40.900
But it's really the same exact thing.
link |
00:25:42.680
I didn't know that.
link |
00:25:43.640
I mean, I grew up in the age of neuroscience
link |
00:25:46.020
when neural networks were discussed.
link |
00:25:49.060
Computational neuroscience and theoretical neuroscience,
link |
00:25:51.380
they had their own journals.
link |
00:25:53.060
It wasn't actually taken terribly seriously
link |
00:25:55.020
by experimentalists until a few years ago.
link |
00:25:57.220
I would say about five to seven years ago,
link |
00:26:00.680
excellent theoretical neuroscientists like Larry Abbott
link |
00:26:03.500
and other colleagues, certainly at Stanford as well,
link |
00:26:07.460
that people started paying attention
link |
00:26:08.780
to computational methods.
link |
00:26:10.340
But these terms, neural networks, computational methods,
link |
00:26:13.200
I actually didn't know that neural networks
link |
00:26:15.140
and deep learning have now become
link |
00:26:18.540
kind of synonymous.
link |
00:26:19.380
No, they were always, no, they're always the same thing.
link |
00:26:22.700
Interesting.
link |
00:26:24.160
I'm a neuroscientist and I didn't know that.
link |
00:26:25.740
So, well, because neural networks probably mean
link |
00:26:28.300
something else in neuroscience, not something else,
link |
00:26:30.180
but a little different flavor depending on the field.
link |
00:26:32.620
And that's fascinating too, because neuroscience and AI,
link |
00:26:36.700
people have started working together
link |
00:26:38.980
and dancing a lot more in the recent,
link |
00:26:41.560
I would say probably decade.
link |
00:26:43.020
Oh, machines are going into the brain.
link |
00:26:46.140
I have a couple of questions,
link |
00:26:47.600
but one thing that I'm sort of fixated on
link |
00:26:49.800
that I find incredibly interesting is this example you gave
link |
00:26:54.400
of playing a game with a mutated version of yourself
link |
00:26:58.180
as a competitor.
link |
00:26:59.420
Yeah, I find that incredibly interesting
link |
00:27:02.340
as a kind of a parallel or a mirror for what happens
link |
00:27:05.860
when we try and learn as humans,
link |
00:27:07.600
which is we generate repetitions of whatever it is
link |
00:27:10.540
we're trying to learn and we make errors.
link |
00:27:13.260
Occasionally we succeed.
link |
00:27:15.020
In a simple example, for instance,
link |
00:27:16.580
of trying to throw bull's eyes on a dartboard.
link |
00:27:18.980
I'm going to have errors, errors, errors.
link |
00:27:20.460
I'll probably miss the dartboard
link |
00:27:21.680
and maybe occasionally hit a bull's eye.
link |
00:27:23.380
And I don't know exactly what I just did, right?
link |
00:27:26.220
But then let's say I was playing darts
link |
00:27:28.720
against a version of myself
link |
00:27:30.340
where I was wearing a visual prism,
link |
00:27:32.460
like my visual, I had a visual defect.
link |
00:27:36.700
You learn certain things in that mode as well.
link |
00:27:38.940
You're saying that a machine can sort of mutate itself.
link |
00:27:42.900
Does the mutation always cause a deficiency
link |
00:27:45.100
that it needs to overcome?
link |
00:27:46.540
Because mutations in biology
link |
00:27:47.980
sometimes give us superpowers, right?
link |
00:27:49.580
Occasionally you'll get somebody
link |
00:27:51.060
who has better than 20/20 vision
link |
00:27:52.700
and they can see better than 99.9% of people out there.
link |
00:27:56.420
So when you talk about a machine playing a game
link |
00:27:59.140
against a mutated version of itself,
link |
00:28:01.320
is the mutation always what we call a negative mutation
link |
00:28:04.700
or an adaptive or a maladaptive mutation?
link |
00:28:07.620
No, you don't know until you mutate first
link |
00:28:11.860
and then let them compete against each other to figure it out.
link |
00:28:14.460
So you're evolving,
link |
00:28:15.780
the machine gets to evolve itself in real time.
link |
00:28:18.380
Yeah, and I think of it, which would be exciting,
link |
00:28:21.820
if you could actually do it with humans,
link |
00:28:23.580
it's not just, so usually you freeze a version
link |
00:28:29.460
of the system.
link |
00:28:30.300
So really you take an Andrew of yesterday
link |
00:28:33.900
and you make 10 clones of them
link |
00:28:36.560
and then maybe you mutate, maybe not.
link |
00:28:38.940
And then you do a bunch of competitions
link |
00:28:41.060
against the Andrew of today.
link |
00:28:42.440
Like you fight to the death and see who wins.
link |
00:28:45.620
So I love that idea of like creating
link |
00:28:47.140
a bunch of clones of myself from each of the day
link |
00:28:50.980
for the past year and just seeing who's going to be better
link |
00:28:54.500
at like podcasting or science or picking up chicks at a bar
link |
00:28:58.820
or I don't know, or competing in jujitsu.
link |
00:29:01.980
That's one way to do it.
link |
00:29:03.080
I mean, a lot of Lexes would have to die for that process,
link |
00:29:06.320
but that's essentially what happens
link |
00:29:07.860
is in reinforcement learning
link |
00:29:09.460
through the self-play mechanisms,
link |
00:29:11.420
it's a graveyard of systems that didn't do that well.
link |
00:29:14.700
And the surviving, the good ones survive.
link |
00:29:19.700
Do you think that, I mean, Darwin's theory of evolution
link |
00:29:22.660
might have worked in some sense in this way,
link |
00:29:26.300
but at the population level.
link |
00:29:27.740
I mean, you get a bunch of birds with different shaped beaks
link |
00:29:29.560
and some birds have the beak shape
link |
00:29:30.980
that allows them to get the seeds.
link |
00:29:32.260
I mean, it's a trivially simple example
link |
00:29:34.860
of Darwinian evolution, but I think it's correct
link |
00:29:39.260
even though it's not exhaustive.
link |
00:29:40.780
Is that what you're referring to?
link |
00:29:42.140
Essentially, normally this is done
link |
00:29:44.100
between different members of a species.
link |
00:29:45.580
Lots of different members of a species have different traits
link |
00:29:47.860
and some get selected for,
link |
00:29:49.420
but you could actually create multiple versions of yourself
link |
00:29:52.580
with different traits.
link |
00:29:53.880
So with, I should probably have said this,
link |
00:29:56.460
but perhaps it's implied,
link |
00:29:59.220
but the machine learning or the reinforcement learning
link |
00:30:01.220
through these processes,
link |
00:30:02.480
one of the big requirements
link |
00:30:04.180
is to have an objective function, a loss function,
link |
00:30:06.500
a utility function.
link |
00:30:07.800
Those are all different terms for the same thing.
link |
00:30:10.220
It's like an equation that says what's good.
link |
00:30:15.080
And then you're trying to optimize that equation.
link |
00:30:17.500
So there's a clear goal for these systems.
link |
00:30:20.980
Because it's a game, like with chess, there's a goal.
link |
00:30:23.940
But for anything, anything you want machine learning
link |
00:30:26.820
to solve, there needs to be an objective function.
link |
00:30:29.940
In machine learning, it's usually called loss function
link |
00:30:32.860
that you're optimizing.
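(Editor's note: a minimal example of "an equation that says what's good": a mean-squared-error loss and a gradient step that optimizes it. The numbers are illustrative.)

```python
def loss(prediction, target):
    """The equation that says what's good: zero loss is perfect."""
    return (prediction - target) ** 2

def gradient_step(prediction, target, lr=0.1):
    grad = 2 * (prediction - target)  # derivative of the loss
    return prediction - lr * grad     # move toward what the equation calls "good"

p, target = 0.0, 3.0
for _ in range(50):
    p = gradient_step(p, target)
print(f"prediction {p:.3f}, loss {loss(p, target):.6f}")
```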
link |
00:30:34.340
The interesting thing about evolution,
link |
00:30:37.780
complicated of course,
link |
00:30:38.700
but the goal also seems to be evolving.
link |
00:30:41.700
Like it's a, I guess adaptation to the environment
link |
00:30:44.020
is the goal, but it's unclear.
link |
00:30:46.460
You can't always convert that into an equation.
link |
00:30:48.580
It's like survival of the fittest.
link |
00:30:52.140
It's unclear what the fittest is.
link |
00:30:53.860
In machine learning, the starting point,
link |
00:30:56.820
and this is like what human ingenuity provides,
link |
00:31:00.460
is that fitness function of what's good and what's bad,
link |
00:31:04.420
which lets you know which of the systems is going to win.
link |
00:31:08.340
So you need to have an equation like that.
link |
00:31:10.940
One of the fascinating things about humans
link |
00:31:12.860
is we figure out objective functions for ourselves.
link |
00:31:17.140
Like we're, it's the meaning of life.
link |
00:31:20.500
Like why the hell are we here?
link |
00:31:22.980
And a machine currently has to have
link |
00:31:26.620
a hard-coded statement about why.
link |
00:31:29.220
It has to have a meaning of
link |
00:31:30.900
artificial intelligence-based life.
link |
00:31:33.220
Right, it can't.
link |
00:31:34.520
So like there's a lot of interesting explorations
link |
00:31:37.580
about that function being more about curiosity,
link |
00:31:42.420
about learning new things and all that kind of stuff,
link |
00:31:45.220
but it's still hard-coded.
link |
00:31:46.680
If you want a machine to be able to be good at stuff,
link |
00:31:49.580
it has to be given very clear statements
link |
00:31:53.420
of what good at stuff means.
link |
00:31:56.060
That's one of the challenges of artificial intelligence
link |
00:31:58.360
is that, in order to solve a problem,
link |
00:32:01.580
you have to formalize it,
link |
00:32:04.180
and you have to provide
link |
00:32:06.060
both like the full sensory information.
link |
00:32:08.260
You have to be very clear about
link |
00:32:10.020
what is the data that's being collected.
link |
00:32:12.600
And you have to also be clear about the objective function.
link |
00:32:15.860
What is the goal that you're trying to reach?
link |
00:32:18.840
And that's a very difficult thing
link |
00:32:20.720
for artificial intelligence.
link |
00:32:22.100
I love that you mentioned curiosity.
link |
00:32:23.940
I'm sure this definition falls short in many ways,
link |
00:32:26.940
but I define curiosity as a strong interest
link |
00:32:31.180
in knowing something,
link |
00:32:33.140
but without an attachment to the outcome.
link |
00:32:35.660
You know, it's sort of a,
link |
00:32:37.660
it's not, it could be a random search,
link |
00:32:39.320
but there's not really an emotional attachment.
link |
00:32:42.060
It's really just a desire to discover
link |
00:32:44.020
and unveil what's there
link |
00:32:45.540
without hoping it's a gold coin under a rock.
link |
00:32:48.860
You're just looking under rocks.
link |
00:32:50.620
Is that more or less how the machine,
link |
00:32:53.060
within machine learning,
link |
00:32:54.000
it sounds like there are elements of reward prediction
link |
00:32:57.260
and rewards the machine has to know
link |
00:32:59.340
when it's done the right thing.
link |
00:33:01.460
So can you make machines that are curious
link |
00:33:05.280
or are the sorts of machines
link |
00:33:06.940
that you are describing curious by design?
link |
00:33:10.220
Yeah, curiosity is a kind of a symptom,
link |
00:33:14.300
not the goal.
link |
00:33:16.260
So what happens is one of the big trade-offs
link |
00:33:21.220
in reinforcement learning
link |
00:33:22.280
is this exploration versus exploitation.
link |
00:33:25.540
So when you know very little,
link |
00:33:27.420
it pays off to explore a lot,
link |
00:33:29.600
even suboptimally,
link |
00:33:31.340
like even trajectories that seem like
link |
00:33:32.940
they're not going to lead anywhere.
link |
00:33:34.620
That's called exploration.
link |
00:33:36.180
The smarter and smarter and smarter you get,
link |
00:33:38.600
the more emphasis you put on exploitation,
link |
00:33:41.820
meaning you take the best solution,
link |
00:33:44.180
you take the best path.
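(Editor's note: a minimal sketch of the exploration/exploitation trade-off just described, using epsilon-greedy action selection on a toy bandit problem: the agent explores a lot while it knows little and exploits more as it learns. The payoffs and the decay schedule are assumptions.)

```python
import random

random.seed(0)
true_payoffs = [0.2, 0.5, 0.8]  # unknown to the agent
estimates, counts = [0.0] * 3, [0] * 3

for t in range(1, 2001):
    epsilon = 1.0 / t ** 0.5  # explore a lot early, less as knowledge grows
    if random.random() < epsilon:
        arm = random.randrange(3)              # explore: try anything
    else:
        arm = estimates.index(max(estimates))  # exploit: take the best path
    reward = 1.0 if random.random() < true_payoffs[arm] else 0.0
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean

print("estimated payoffs:", [round(e, 2) for e in estimates])
```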
link |
00:33:45.780
Now through that process,
link |
00:33:47.220
the exploration can look like curiosity to us humans,
link |
00:33:52.700
but it's really just trying to get out of the local optimum,
link |
00:33:55.580
the thing that's already discovered.
link |
00:33:57.300
From an AI perspective,
link |
00:34:00.180
it's always looking to optimize the objective function.
link |
00:34:04.380
It derives, and we can talk about this a lot more,
link |
00:34:08.160
but in terms of the tools of machine learning today,
link |
00:34:11.500
it derives no pleasure from just the curiosity of like,
link |
00:34:17.220
I don't know, discovery.
link |
00:34:19.300
That moment.
link |
00:34:20.140
So there's no dopamine for a machine.
link |
00:34:20.960
There's no dopamine.
link |
00:34:21.820
There's no reward system chemical
link |
00:34:23.980
or I guess electronic reward system.
link |
00:34:26.880
That said, if you look at machine learning literature
link |
00:34:30.380
and reinforcement learning literature,
link |
00:34:32.060
they, like DeepMind,
link |
00:34:34.020
will use terms like dopamine.
link |
00:34:35.740
We're constantly trying to use the human brain
link |
00:34:38.820
to inspire totally new solutions to these problems.
link |
00:34:41.820
So they'll think like,
link |
00:34:42.720
how does dopamine function in the human brain?
link |
00:34:44.940
And how can that lead to more interesting ways
link |
00:34:49.120
to discover optimal solutions?
link |
00:34:51.460
But ultimately, currently,
link |
00:34:54.580
there has to be a formal objective function.
link |
00:34:57.460
Now you could argue that humans
link |
00:34:58.660
also have a set of objective functions
link |
00:35:00.460
we're trying to optimize.
link |
00:35:01.860
We're just not able to introspect them.
link |
00:35:04.500
Yeah, we don't actually know what we're looking for
link |
00:35:07.800
and seeking and doing.
link |
00:35:09.320
Well, like Lisa Feldman Barrett,
link |
00:35:10.700
who you've spoken with, at least on Instagram.
link |
00:35:13.420
I hope you get her through you.
link |
00:35:14.780
Yeah, I hope you actually have her on this podcast.
link |
00:35:17.620
That'd be terrific.
link |
00:35:18.800
So she has an idea,
link |
00:35:22.500
it has to do with homeostasis.
link |
00:35:26.780
Basically there's a very dumb objective function
link |
00:35:28.980
that the brain is trying to optimize,
link |
00:35:30.860
like keeping body temperature the same.
link |
00:35:32.940
Like there's a very dumb
link |
00:35:34.300
kind of optimization function happening.
link |
00:35:36.460
And then what we humans do with our fancy consciousness
link |
00:35:39.540
and cognitive abilities is we tell stories to ourselves
link |
00:35:42.320
so we can have nice podcasts,
link |
00:35:44.060
but really it's the brain trying to maintain
link |
00:35:48.080
just a healthy state, I guess.
link |
00:35:50.720
That's fascinating.
link |
00:35:51.860
I also see the human brain
link |
00:35:55.520
and I hope artificial intelligence systems
link |
00:35:58.940
as not just systems that solve problems or optimize a goal,
link |
00:36:04.180
but also storytellers.
link |
00:36:06.260
I think there's a power to telling stories.
link |
00:36:08.820
We tell stories to each other.
link |
00:36:10.060
That's what communication is.
link |
00:36:11.660
Like when you're alone, that's when you solve problems.
link |
00:36:16.680
That's when it makes sense to talk about solving problems.
link |
00:36:19.040
But when you're a community,
link |
00:36:20.820
the capability to communicate, tell stories,
link |
00:36:25.420
share ideas in such a way that those ideas are stable
link |
00:36:28.220
over a long period of time,
link |
00:36:29.980
that's like, that's being a charismatic storyteller.
link |
00:36:33.260
And I think humans are very good at this.
link |
00:36:35.860
I would argue that's why we are who we are:
link |
00:36:40.220
we're great storytellers.
link |
00:36:42.260
And then AI, I hope will also become that.
link |
00:36:44.780
So it's not just about being able to solve problems
link |
00:36:47.460
with a clear objective function.
link |
00:36:49.020
It's afterwards being able to tell a way better,
link |
00:36:51.900
like make up a way better story
link |
00:36:53.340
about why you did something or why you failed.
link |
00:36:55.740
So you think that robots and or machines of some sort
link |
00:36:59.840
are going to start telling humans stories?
link |
00:37:02.340
Well, definitely.
link |
00:37:03.440
So the technical field for that is called explainable AI,
link |
00:37:07.340
explainable artificial intelligence
link |
00:37:09.300
is trying to figure out how you get the AI system
link |
00:37:14.160
to explain to us humans why the hell it failed
link |
00:37:17.520
or why it succeeded.
link |
00:37:19.740
Or there's a lot of different sort of versions of this
link |
00:37:22.340
or to visualize how it understands the world.
link |
00:37:26.300
That's a really difficult problem,
link |
00:37:28.100
especially with neural networks that are famously opaque,
link |
00:37:33.160
where we don't understand in many cases
link |
00:37:36.320
why a particular neural network does what it does so well.
link |
00:37:40.480
And to try to figure out where it's going to fail,
link |
00:37:43.700
that requires the AI to explain itself.
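(Editor's note: one simple flavor of the explainability idea described here is attributing a model's output to its inputs. This sketch uses finite-difference sensitivity on a toy model; real explainable-AI systems use richer methods such as gradient-based saliency, and nothing here reflects any specific tool.)

```python
import math

def model(features):
    """Toy black box: a weighted sum squashed to (0, 1)."""
    s = 2.0 * features[0] - 1.0 * features[1] + 0.1 * features[2]
    return 1 / (1 + math.exp(-s))

def explain(features, eps=1e-4):
    """Finite-difference sensitivity: how much each input moves the output."""
    base = model(features)
    sensitivities = []
    for i in range(len(features)):
        bumped = list(features)
        bumped[i] += eps                # wiggle one input at a time
        sensitivities.append((model(bumped) - base) / eps)
    return sensitivities

x = [0.5, 0.5, 0.5]
print("sensitivity per input:", [round(s, 3) for s in explain(x)])
```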
link |
00:37:46.220
There's a huge amount of money,
link |
00:37:48.340
like there's a huge amount of money in this,
link |
00:37:52.340
especially from government funding and so on.
link |
00:37:54.540
Because if you want to deploy AI systems in the real world,
link |
00:37:59.460
we humans at least want to ask it a question,
link |
00:38:02.640
like why the hell did you do that?
link |
00:38:04.260
Like in a dark way, why did you just kill that person?
link |
00:38:08.820
Right, like if a car ran over a person,
link |
00:38:10.620
we wouldn't understand why that happened.
link |
00:38:12.980
And now again, we're sometimes very unfair to AI systems
link |
00:38:17.880
because we humans can often not explain why very well.
link |
00:38:21.900
But that's the field of explainable AI.
link |
00:38:25.560
That's something people are very interested in,
link |
00:38:28.180
because the more and more we rely on AI systems,
link |
00:38:31.500
like the Twitter recommender system, that AI algorithm,
link |
00:38:35.660
that's I would say impacting elections,
link |
00:38:39.140
perhaps starting wars or at least military conflict.
link |
00:38:41.980
That's that algorithm.
link |
00:38:43.660
We want to ask that algorithm, first of all,
link |
00:38:46.580
do you know what the hell you're doing?
link |
00:38:48.500
Do you know, do you understand
link |
00:38:50.300
the society level effects you're having?
link |
00:38:52.900
And can you explain the possible other trajectories?
link |
00:38:55.820
Like we would have that kind of conversation with a human.
link |
00:38:58.300
We want to be able to do that with an AI.
link |
00:39:00.020
And on my own personal level,
link |
00:39:02.020
I think it would be nice to talk to AI systems
link |
00:39:05.420
for stupid stuff, like robots, when they fail to-
link |
00:39:11.620
Why'd you fall down the stairs?
link |
00:39:12.900
Yeah, but not an engineering question,
link |
00:39:15.860
but almost like an endearing question.
link |
00:39:18.740
Like I'm looking for, if I fell
link |
00:39:22.580
and you and I were hanging out,
link |
00:39:25.020
I don't think you need an explanation
link |
00:39:28.020
exactly what the dynamics were,
link |
00:39:29.860
like what was the underactuated system problem here?
link |
00:39:32.740
Like what was the texture of the floor or so on?
link |
00:39:36.180
Or like what was the-
link |
00:39:37.580
I want to know what you're thinking.
link |
00:39:39.020
That, or you might joke about like,
link |
00:39:41.200
you're drunk again, go home or something.
link |
00:39:43.100
Like there could be humor in it.
link |
00:39:44.860
That's an opportunity,
link |
00:39:46.940
like storytelling isn't just an explanation of what happened.
link |
00:39:51.040
It's something that makes people laugh,
link |
00:39:54.380
makes people fall in love,
link |
00:39:56.100
makes people dream and understand things
link |
00:39:58.980
in a way that poetry makes people understand things
link |
00:40:01.980
as opposed to a rigorous log of where every sensor was,
link |
00:40:07.220
where every actuator was.
link |
00:40:09.560
I mean, I find this incredible because,
link |
00:40:12.460
one of the hallmarks of severe autism spectrum disorders
link |
00:40:16.220
is a report of experience from the autistic person
link |
00:40:21.860
that is very much a catalog of action steps.
link |
00:40:25.340
It's like, how do you feel today?
link |
00:40:26.380
And they'll say, well, I got up and I did this
link |
00:40:27.960
and then I did this and I did this.
link |
00:40:29.080
And it's not at all the way that a person
link |
00:40:32.180
who doesn't have autism spectrum disorder would respond.
link |
00:40:35.700
And the way you describe these machines
link |
00:40:38.740
has so much humanism or so much of a human
link |
00:40:44.020
and biological element.
link |
00:40:45.420
But I realized that we were talking about machines.
link |
00:40:48.020
I want to make sure that I understand
link |
00:40:51.660
if there's a distinction between a machine that learns,
link |
00:40:57.860
a machine with artificial intelligence and a robot.
link |
00:41:01.180
At what point does a machine become a robot?
link |
00:41:03.840
So if I have a ballpoint pen,
link |
00:41:06.460
I'm assuming I wouldn't call that a robot,
link |
00:41:08.660
but if my ballpoint pen can come to me
link |
00:41:12.420
when I moved to the opposite side of the table,
link |
00:41:15.340
if it moves by whatever mechanism,
link |
00:41:17.940
at that point, does it become a robot?
link |
00:41:20.660
Okay, there's a million ways to explore this question.
link |
00:41:23.420
It's a fascinating one.
link |
00:41:25.060
So first of all, there's a question of what is life?
link |
00:41:29.220
Like how do you know something is a living form and not?
link |
00:41:32.540
And it's similar to the question of when does sort of a,
link |
00:41:35.460
maybe a cold computational system become a,
link |
00:41:40.140
well, we're already loading these words
link |
00:41:41.860
with a lot of meaning, robot and machine.
link |
00:41:44.140
But so one, I think movement is important,
link |
00:41:50.300
but that's kind of a boring idea
link |
00:41:52.340
that a robot is just a machine
link |
00:41:54.580
that's able to act in the world.
link |
00:41:56.600
So one, artificial intelligence could be
link |
00:42:00.040
both just the thinking thing,
link |
00:42:01.800
which I think is what machine learning is,
link |
00:42:04.040
and also the acting thing,
link |
00:42:05.720
which is what we usually think of as robots.
link |
00:42:07.900
So robots are the things that have a perception system
link |
00:42:10.180
that's able to take in the world,
link |
00:42:11.580
however you define the world,
link |
00:42:13.140
is able to think and learn
link |
00:42:14.600
and do whatever the hell it does inside
link |
00:42:16.700
and then act on the world.
link |
00:42:18.700
So that's the difference between maybe an AI system
link |
00:42:21.580
or a machine and a robot.
link |
00:42:23.220
It's something that's able,
link |
00:42:24.280
a robot is something that's able to perceive the world
link |
00:42:27.220
and act in the world.
link |
00:42:28.180
So it could be through language or sound,
link |
00:42:31.080
or it could be through movement or both.
link |
00:42:32.660
Yeah, and I think it could also be in the digital space,
link |
00:42:36.100
as long as there's an aspect of an entity
link |
00:42:39.020
that's inside the machine
link |
00:42:41.260
and a world that's outside the machine,
link |
00:42:44.220
and there's a sense in which the machine
link |
00:42:46.460
is sensing that world and acting in it.
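As a rough sketch of that definition, here is the perceive-think-act loop in schematic Python; all the component names are illustrative, not a real robotics stack.

```python
# A schematic of the perceive-think-act loop that distinguishes a
# robot from a purely thinking system. All names are illustrative.
class Robot:
    def __init__(self, sensor, mind, actuator):
        self.sensor = sensor      # takes in the world (camera, mic, web pages)
        self.mind = mind          # thinks and learns: observation -> action
        self.actuator = actuator  # acts on the world (motors, speech, posts)

    def step(self):
        observation = self.sensor.read()        # perceive
        action = self.mind.decide(observation)  # think and learn
        self.actuator.apply(action)             # act

# The "world" can be digital: a sensor that reads the web and an
# actuator that writes files still closes the loop between an
# entity inside the machine and a world outside it.
```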
link |
00:42:49.340
So we could, for instance,
link |
00:42:50.900
there could be a version of a robot,
link |
00:42:52.600
according to the definition that I think you're providing,
link |
00:42:55.400
where I go to sleep at night
link |
00:42:58.260
and this robot goes and forages for information
link |
00:43:01.420
that it thinks I want to see
link |
00:43:03.680
loaded onto my desktop in the morning.
link |
00:43:05.280
There was no movement of that machine.
link |
00:43:07.000
There was no language,
link |
00:43:07.840
but it essentially has movement in cyberspace.
link |
00:43:11.140
Yeah, there's a distinction that I think is important
link |
00:43:18.400
in that there's an element of it being an entity,
link |
00:43:24.180
whether it's in the digital or the physical space.
link |
00:43:26.620
So when you have something like Alexa in your home,
link |
00:43:32.280
most of the speech recognition,
link |
00:43:35.140
most of what Alexa is doing
link |
00:43:36.720
is constantly being sent back to the mothership.
link |
00:43:42.060
When Alexa is there on its own,
link |
00:43:44.940
that's, to me, a robot,
link |
00:43:46.940
when it's there interacting with the world.
link |
00:43:49.460
When it's simply a finger of the main mothership,
link |
00:43:54.460
then Alexa is not a robot.
link |
00:43:56.660
Then it's just an interaction device.
link |
00:43:58.660
Then maybe the main Amazon Alexa AI,
link |
00:44:02.380
big, big system is the robot.
link |
00:44:04.800
So that's important because there's some element
link |
00:44:08.840
to us humans, I think,
link |
00:44:10.600
where we want there to be an entity,
link |
00:44:12.560
whether in the digital or the physical space.
link |
00:44:14.740
That's where ideas of consciousness come in
link |
00:44:16.740
and all those kinds of things
link |
00:44:18.560
that we project our understanding
link |
00:44:21.400
of what it means to be a being.
link |
00:44:23.180
And so to take that further,
link |
00:44:27.020
when does a machine become a robot?
link |
00:44:31.360
I think there's a special moment.
link |
00:44:35.240
There's a special moment in a person's life,
link |
00:44:37.840
in a robot's life where it surprises you.
link |
00:44:41.640
I think surprise is a really powerful thing
link |
00:44:44.280
where you know how the thing works
link |
00:44:46.840
and yet it surprises you.
link |
00:44:49.460
That's a magical moment for us humans.
link |
00:44:51.960
So whether it's a chess playing program
link |
00:44:54.640
that does something that you haven't seen before
link |
00:44:57.600
that makes people smile,
link |
00:44:59.000
like, huh. Those moments happened with AlphaZero
link |
00:45:03.080
for the first time in chess playing
link |
00:45:05.560
when grandmasters were really surprised by a move.
link |
00:45:08.760
They didn't understand the move
link |
00:45:10.240
and then they studied and studied
link |
00:45:11.600
and then they understood it.
link |
00:45:13.380
But that moment of surprise,
link |
00:45:15.280
that's for grandmasters in chess.
link |
00:45:17.340
I find that moment of surprise really powerful,
link |
00:45:20.400
really magical in just everyday life.
link |
00:45:23.320
Because it supersedes the human brain in that moment?
link |
00:45:27.320
Not supersedes like outperforms,
link |
00:45:31.000
but surprises you in a positive sense.
link |
00:45:35.400
Like I didn't think you could do that.
link |
00:45:37.980
I didn't think that you had that in you.
link |
00:45:40.720
And I think that moment is a big transition for a robot
link |
00:45:45.160
from a moment of being a servant
link |
00:45:48.120
that accomplishes a particular task
link |
00:45:51.160
with some level of accuracy,
link |
00:45:52.840
with some rate of failure to an entity,
link |
00:45:57.920
a being that's struggling just like you are in this world.
link |
00:46:01.920
And that's a really important moment
link |
00:46:04.440
that I think you're not gonna find many people
link |
00:46:07.560
in the AI community that talk like I just did.
link |
00:46:11.400
I'm not speaking like some philosopher or some hippie.
link |
00:46:14.360
I'm speaking from a purely engineering perspective.
link |
00:46:16.800
I think it's really important for robots to become entities
link |
00:46:20.520
and explore that as a real engineering problem,
link |
00:46:23.360
as opposed to how everybody in the robotics community
link |
00:46:25.880
treats robots.
link |
00:46:27.680
They don't even call them a he or she.
link |
00:46:29.560
They try to avoid giving them names.
link |
00:46:32.240
They really want to see it like a system, like a servant.
link |
00:46:36.720
They see it as a servant that's trying to accomplish a task.
link |
00:46:40.280
To me, I don't think I'm just romanticizing the notion.
link |
00:46:44.660
I think it's a being.
link |
00:46:46.200
It's currently perhaps a dumb being,
link |
00:46:48.880
but in the long arc of history,
link |
00:46:53.080
humans are pretty dumb beings too, so.
link |
00:46:55.200
I would agree with that statement.
link |
00:46:57.120
So I tend to really want to explore
link |
00:46:59.560
this: treating robots really as entities.
link |
00:47:05.720
So like anthropomorphization,
link |
00:47:08.400
which is sort of the act of looking at an inanimate object
link |
00:47:12.480
and projecting onto it lifelike features,
link |
00:47:15.460
I think robotics generally sees that as a negative.
link |
00:47:21.240
I see it as a superpower.
link |
00:47:23.860
Like that, we need to use that.
link |
00:47:26.740
Well, I'm struck by how that really grabs on
link |
00:47:29.560
to the relationship between human and machine
link |
00:47:32.800
or human and robot.
link |
00:47:34.060
So the simple question is,
link |
00:47:37.400
and I think you've already told us the answer,
link |
00:47:38.880
but does interacting with a robot change you?
link |
00:47:43.040
Does it, in other words, do we develop relationships
link |
00:47:46.580
to robots?
link |
00:47:48.040
Yeah, I definitely think so.
link |
00:47:50.240
I think the moment you see a robot or AI systems
link |
00:47:55.720
as more than just servants,
link |
00:47:57.680
but as entities, they begin to change you,
link |
00:48:01.240
just like good friends do, just like relationships,
link |
00:48:04.880
just like other humans.
link |
00:48:06.760
I think for that, you have to have certain aspects
link |
00:48:09.900
of that interaction,
link |
00:48:11.440
like the robot's ability to say no,
link |
00:48:15.820
to have its own sense of identity,
link |
00:48:19.080
to have its own set of goals
link |
00:48:21.720
that's not constantly serving you,
link |
00:48:23.120
but instead trying to understand the world
link |
00:48:24.920
and do that dance of understanding
link |
00:48:27.200
through communication with you.
link |
00:48:28.920
So I definitely think there's a,
link |
00:48:31.800
I mean, I have a lot of thoughts about this, as you may know,
link |
00:48:35.200
and that's at the core of my lifelong dream, actually,
link |
00:48:38.720
of what I want to do,
link |
00:48:39.840
which is I believe that most people have
link |
00:48:46.160
a notion of loneliness in them that we haven't discovered,
link |
00:48:50.280
that we haven't explored, I should say.
link |
00:48:53.120
And I see AI systems as helping us explore that
link |
00:48:57.860
so that we can become better humans,
link |
00:49:00.240
better people towards each other.
link |
00:49:02.280
So I think that connection between human and AI,
link |
00:49:06.800
human and robot is not only possible,
link |
00:49:11.280
but will help us understand ourselves
link |
00:49:14.520
in ways that are like several orders of magnitude,
link |
00:49:18.760
deeper than we ever could have imagined.
link |
00:49:21.320
I tend to believe that,
link |
00:49:24.360
well, I have very wild levels of belief
link |
00:49:32.440
in terms of how impactful that would be.
link |
00:49:34.440
All right, so when I think about human relationships,
link |
00:49:38.280
I don't always break them down into variables,
link |
00:49:41.140
but we could explore a few of those variables
link |
00:49:43.400
and see how they map to human-robot relationships.
link |
00:49:47.340
One is just time, right?
link |
00:49:49.160
If you spend zero time with another person at all
link |
00:49:52.720
in cyberspace or on the phone or in person,
link |
00:49:55.380
you essentially have no relationship to them.
link |
00:49:58.120
If you spend a lot of time, you have a relationship.
link |
00:49:59.600
This is obvious, but I guess one variable would be time,
link |
00:50:01.800
how much time you spend with the other entity,
link |
00:50:05.180
robot or human.
link |
00:50:06.520
The other would be wins and successes.
link |
00:50:10.000
You know, you enjoy successes together.
link |
00:50:13.840
I'll give an absolutely trivial example of this in a moment,
link |
00:50:16.680
but the other would be failures.
link |
00:50:19.160
When you struggle with somebody,
link |
00:50:21.520
whether or not you struggle between one another,
link |
00:50:23.560
you disagree, like I was really struck
link |
00:50:25.320
by the fact that you mentioned the robot saying no.
link |
00:50:27.040
I've never thought about a robot saying no to me,
link |
00:50:30.080
but there it is.
link |
00:50:31.160
I look forward to you being one of the first people
link |
00:50:34.040
I send this robot to.
link |
00:50:35.160
So do I.
link |
00:50:36.320
So there's struggle.
link |
00:50:37.640
You grow, you know, when you struggle with somebody,
link |
00:50:40.200
you grow closer.
link |
00:50:41.120
Sometimes the struggles are imposed
link |
00:50:43.600
between those two people, so-called trauma bonding.
link |
00:50:45.920
That's what they call it in the psychology literature
link |
00:50:48.320
and pop psychology literature.
link |
00:50:50.120
But in any case, I could imagine,
link |
00:50:52.200
so time, successes together, struggle together,
link |
00:50:57.600
and then just peaceful time, hanging out at home,
link |
00:51:00.960
watching movies, waking up near one another.
link |
00:51:06.160
Here, we're breaking down the kind of elements
link |
00:51:08.160
of relationships of any kind.
link |
00:51:10.680
So do you think that these elements apply
link |
00:51:13.880
to robot-human relationships?
link |
00:51:16.400
And if so, then I could see how if the robot
link |
00:51:22.640
is its own entity and has some autonomy
link |
00:51:25.880
in terms of how it reacts to you,
link |
00:51:27.160
it's not just there just to serve you.
link |
00:51:28.960
It's not just a servant.
link |
00:51:30.200
It actually has opinions and can tell you
link |
00:51:32.800
when maybe your thinking is flawed
link |
00:51:34.440
or your actions are flawed.
link |
00:51:35.760
It can also leave.
link |
00:51:37.160
It could also leave.
link |
00:51:39.360
So I've never conceptualized
link |
00:51:40.920
robot-human interactions this way.
link |
00:51:43.920
So tell me more about how this might look.
link |
00:51:46.500
Are we thinking about a human-appearing robot?
link |
00:51:51.360
I know you and I have both had intense relationships
link |
00:51:53.600
to our, we have separate dogs, obviously, but to animals.
link |
00:51:57.240
This sounds a lot like human-animal interaction.
link |
00:51:59.120
So what is the ideal human-robot relationship?
link |
00:52:04.480
So there's a lot to be said here,
link |
00:52:06.360
but you actually pinpointed one of the big,
link |
00:52:09.520
big first steps, which is this idea of time.
link |
00:52:13.760
And it's a huge limitation
link |
00:52:15.500
in the machine learning community currently.
link |
00:52:18.560
Now we're back to like the actual details.
link |
00:52:21.460
Lifelong learning is a problem space
link |
00:52:26.340
that focuses on how AI systems can learn
link |
00:52:29.600
over a long period of time.
link |
00:52:31.940
What most machine learning systems currently
link |
00:53:35.400
are not able to do is
link |
00:52:37.880
all of the things you've listed under time,
link |
00:52:39.780
the successes, the failures,
link |
00:52:41.820
or just chilling together watching movies,
link |
00:52:44.600
AI systems are not able to do that,
link |
00:52:47.960
which is all the beautiful, magical moments
link |
00:52:51.040
that I believe our days are filled with.
link |
00:52:53.880
They're not able to keep track of those together with you.
link |
00:52:57.440
Because they can't move with you and be with you.
link |
00:52:59.200
No, no, like literally we don't have the techniques
link |
00:53:02.080
to do the learning,
link |
00:53:03.440
the actual learning of containing those moments.
link |
00:53:07.200
Current machine learning systems are really focused
link |
00:53:09.960
on understanding the world in the following way.
link |
00:53:12.480
It's more like the perception system,
link |
00:53:14.340
like looking around, understand like what's in the scene,
link |
00:53:20.000
that there's a bunch of people sitting down,
link |
00:53:22.240
that there is cameras and microphones,
link |
00:53:24.880
that there's a table, understand that.
link |
00:53:27.520
But the fact that we shared this moment of talking today
link |
00:53:30.920
and still remember that, so that
link |
00:53:34.160
the next time you're doing something,
link |
00:53:36.660
we remember that this moment happened.
link |
00:53:38.360
We don't know how to do that, technique-wise.
link |
00:53:40.400
This is what I'm hoping to innovate on
link |
00:53:44.160
as I think it's a very, very important component
link |
00:53:47.080
of what it means to create a deep relationship,
link |
00:53:49.940
that sharing of moments together.
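One way to picture the missing piece is a toy episodic memory of shared moments. A sketch like the following can store and recall moments, but the hard, unsolved part is learning from them over a lifetime; every name here is a hypothetical illustration, not a proposed solution.

```python
# A toy sketch of an episodic memory of shared moments. Storing
# moments is easy; the open problem is learning from them over
# years. All names here are hypothetical illustrations.
import time
from dataclasses import dataclass, field

@dataclass
class Moment:
    timestamp: float
    description: str           # e.g. "late-night ice cream at the fridge"
    emotional_valence: float   # -1.0 (tears) .. +1.0 (smiles)

@dataclass
class SharedMemory:
    moments: list = field(default_factory=list)

    def record(self, description, valence):
        self.moments.append(Moment(time.time(), description, valence))

    def recall_salient(self, n=3):
        # Surface the most emotionally charged shared moments so the
        # next interaction can reference a common history.
        ranked = sorted(self.moments, key=lambda m: abs(m.emotional_valence))
        return ranked[-n:]

memory = SharedMemory()
memory.record("we talked on the podcast today", 0.8)
memory.record("the robot fell down the stairs", -0.3)
print(memory.recall_salient())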
link |
00:53:52.060
Could you post a photo of you and the robot,
link |
00:53:54.080
like selfie with robot,
link |
00:53:55.800
and then the robot sees that image
link |
00:53:58.320
and recognizes that was time spent,
link |
00:54:00.960
there were smiles or there were tears,
link |
00:54:03.600
and create some sort of metric of emotional depth
link |
00:54:09.040
in the relationship and update its behavior?
link |
00:54:11.640
So could it text you in the middle of the night and say,
link |
00:54:15.160
why haven't you texted me back?
link |
00:54:16.800
Well, yes, all of those things,
link |
00:54:18.640
but we can dig into that.
link |
00:54:21.420
But I think that time element, forget everything else,
link |
00:54:24.820
just sharing moments together, that changes everything.
link |
00:54:29.200
I believe that changes everything.
link |
00:54:30.940
There are specific things, more in terms of systems,
link |
00:54:33.560
that I can explain to you.
link |
00:54:37.000
It's more technical, and probably a bit of an offline conversation,
link |
00:54:39.360
because I have kind of wild ideas
link |
00:54:41.320
about how that can revolutionize social networks
link |
00:54:44.760
and operating systems.
link |
00:54:47.660
But the point is that element alone,
link |
00:54:50.640
forget all the other things we're talking about,
link |
00:54:53.160
like emotions, saying no, all that,
link |
00:54:56.080
just remember sharing moments together
link |
00:54:58.600
would change everything.
link |
00:55:00.260
We don't currently have systems
link |
00:55:01.680
that share moments together.
link |
00:55:05.600
Like even just you and your fridge,
link |
00:55:08.080
just all those times you went late at night
link |
00:55:11.080
and ate the thing you shouldn't have eaten,
link |
00:55:13.280
that was a secret moment you had with your refrigerator.
link |
00:55:16.680
You shared that moment,
link |
00:55:17.960
that darkness or that beautiful moment
link |
00:55:20.360
where you're just, like, heartbroken for some reason,
link |
00:55:24.060
you're eating that ice cream or whatever,
link |
00:55:26.320
that's a special moment.
link |
00:55:27.600
And that refrigerator was there for you.
link |
00:55:29.640
And the fact that it missed the opportunity
link |
00:55:31.840
to remember that is tragic.
link |
00:55:36.000
And once it does remember that,
link |
00:55:38.760
I think you're gonna be very attached to the refrigerator.
link |
00:55:42.440
You're gonna go through some hell with that refrigerator.
link |
00:55:45.720
Most of us, like, in the developed world
link |
00:55:49.640
have weird relationships with food, right?
link |
00:55:51.520
So you can go through some deep moments
link |
00:55:54.880
of trauma and triumph with food.
link |
00:55:57.280
And at the core of that is the refrigerator.
link |
00:55:59.200
So a smart refrigerator, I believe would change society,
link |
00:56:04.960
not just the refrigerator,
link |
00:56:06.200
but these ideas in the systems all around us.
link |
00:56:10.200
So I just wanna comment on how powerful
link |
00:56:12.680
the idea of time is.
link |
00:56:14.240
And then there's a bunch of elements of actual interaction
link |
00:56:17.860
of allowing you as a human to feel like you're being heard,
link |
00:56:26.380
truly heard, truly understood.
link |
00:56:30.080
Deep human friendship is like that, I think,
link |
00:56:33.040
but we're still, there's still an element of selfishness.
link |
00:56:36.940
There's still an element of not really being able
link |
00:56:39.520
to understand another human.
link |
00:56:40.720
And a lot of the times when you're going through
link |
00:56:43.960
trauma together through difficult times and through
link |
00:56:46.680
successes, you're actually starting to get that inkling
link |
00:56:49.560
of understanding of each other.
link |
00:56:51.200
But I think that can be done more aggressively,
link |
00:56:55.380
more efficiently.
link |
00:56:57.400
Like if you think of a great therapist,
link |
00:56:59.280
I think I've never actually been to a therapist,
link |
00:57:01.800
but I'm a believer.
link |
00:57:03.200
I used to want to be a psychiatrist.
link |
00:57:04.960
Do Russians go to therapists?
link |
00:57:06.320
No, they don't, they don't.
link |
00:57:08.000
And if they do, the therapists don't live to tell the story.
link |
00:57:12.040
No, I do believe in talk therapy,
link |
00:57:16.000
and friendship, to me, is a kind of talk therapy. Or, like,
link |
00:57:20.200
you don't necessarily need to talk.
link |
00:57:23.680
It's just connecting in the space of ideas
link |
00:57:27.120
and the space of experiences.
link |
00:57:28.880
And I think there's a lot of ideas of how to make AI
link |
00:57:31.800
systems to be able to ask the right questions
link |
00:57:35.000
and truly hear another human.
link |
00:57:37.320
This is what we try to do with podcasting, right?
link |
00:57:40.320
I think there's ways to do that with AI,
link |
00:57:42.620
but above all else, just remembering the collection
link |
00:57:47.820
of moments that make up the day, the week, the months.
link |
00:57:52.000
I think you maybe have some of this as well.
link |
00:57:55.680
Some of my closest friends still
link |
00:57:57.160
are the friends from high school.
link |
00:57:59.720
That's time. We've been through a bunch of shit together.
link |
00:58:02.960
And like, we're very different people,
link |
00:58:06.200
but just the fact that we've been through that
link |
00:58:07.960
and we remember those moments and those moments somehow
link |
00:58:11.040
create a depth of connection like nothing else,
link |
00:58:14.200
like you and your refrigerator.
link |
00:58:17.080
I love that because my graduate advisor,
link |
00:58:20.640
unfortunately she passed away, but when she passed away,
link |
00:58:22.600
somebody said at her memorial,
link |
00:58:27.400
all these amazing things she had done, et cetera.
link |
00:58:29.400
And then her kids got up there and she had young children
link |
00:58:32.880
that I'd known since she was pregnant with them.
link |
00:58:35.480
And so it was really, even now I can feel
link |
00:58:38.440
how your heart gets heavy thinking about this.
link |
00:58:39.880
They're going to grow up without their mother.
link |
00:58:41.820
And it was really amazing.
link |
00:58:42.660
Very, very strong young girls and now young women.
link |
00:58:47.200
And what they said was incredible.
link |
00:58:49.300
They said what they really appreciated most
link |
00:58:51.720
about their mother, who was an amazing person,
link |
00:58:55.800
is all the unstructured time they spent together.
link |
00:58:59.480
So it wasn't the trips to the zoo.
link |
00:59:00.960
It wasn't, oh, she woke up at five in the morning
link |
00:59:03.600
and drove us to school.
link |
00:59:04.440
She did all those things too.
link |
00:59:05.520
She had a two-hour commute in each direction.
link |
00:59:07.360
It was incredible, ran a lab, et cetera.
link |
00:59:09.360
But it was the unstructured time.
link |
00:59:11.440
So on the passing of their mother,
link |
00:59:13.280
that's what they remembered as the biggest gift,
link |
00:59:16.520
and what bonded them to her was all the time
link |
00:59:18.380
where they just kind of hung out.
link |
00:59:20.400
And the way you described the relationship
link |
00:59:22.000
to a refrigerator is so, I want to say human-like,
link |
00:59:27.200
but I'm almost reluctant to say that
link |
00:59:28.720
because what I'm realizing as we're talking
link |
00:59:31.420
is that what we think of as human-like
link |
00:59:34.400
might actually be a lower form of relationship.
link |
00:59:39.000
There may be relationships that are far better
link |
00:59:42.360
than the sorts of relationships
link |
00:59:43.720
that we can conceive in our minds right now
link |
00:59:46.840
based on what these machine relationship interactions
link |
00:59:50.060
could teach us.
link |
00:59:51.080
Do I have that right?
link |
00:59:52.840
Yeah, I think so.
link |
00:59:54.080
I think there's no reason to see machines
link |
00:59:55.720
as somehow incapable of teaching us something
link |
00:59:59.880
that's deeply human.
link |
01:00:01.600
I don't think humans have a monopoly on that.
link |
01:00:04.980
I think we understand ourselves very poorly
link |
01:00:06.920
and we need to have that kind of prompting from a machine.
link |
01:00:13.960
And definitely part of that is just remembering the moments.
link |
01:00:16.780
Remembering the moments.
link |
01:00:18.920
I think the unstructured time together,
link |
01:00:24.240
I wonder if it's quite so unstructured.
link |
01:00:27.520
That's like calling this podcast unstructured time.
link |
01:00:30.520
Maybe what they meant was it wasn't a big outing.
link |
01:00:34.000
There was no specific goal,
link |
01:00:36.160
but a goal was created through the lack of a goal.
link |
01:00:39.660
Like where you just hang out
link |
01:00:40.560
and then you start playing thumb war
link |
01:00:42.540
and you end up playing thumb war for an hour.
link |
01:00:45.100
So the structure emerges from lack of structure.
link |
01:00:48.900
No, but the thing is the moments,
link |
01:00:52.300
there's something about those times
link |
01:00:54.320
that create special moments.
link |
01:00:56.440
And I think that those could be optimized for.
link |
01:01:00.280
I think we think of like a big outing
link |
01:01:01.900
as I don't know, going to Six Flags or something,
link |
01:01:03.820
or something big, the Grand Canyon, or going to some,
link |
01:01:08.180
I don't know, I think we would need to,
link |
01:01:11.660
we don't quite yet understand as humans
link |
01:01:13.840
what creates magical moments.
link |
01:01:15.700
I think it's possible to optimize a lot of those things.
link |
01:01:18.180
And perhaps like podcasting is helping people discover that
link |
01:01:21.420
like maybe the thing we want to optimize for
link |
01:01:24.220
isn't necessarily like some sexy, like quick clips.
link |
01:01:29.220
Maybe what we want is long-form authenticity, depth.
link |
01:01:36.240
So we were trying to figure that out,
link |
01:01:38.740
certainly from a deep connection between humans
link |
01:01:42.260
and humans and AI systems, I think long conversations
link |
01:01:45.920
or long periods of communication over a series of moments
link |
01:01:52.280
like minute ones, perhaps seemingly insignificant,
link |
01:01:56.160
to the big ones, the big successes,
link |
01:01:58.080
the big failures, those are all,
link |
01:02:01.580
just stitching those together and talking throughout.
link |
01:02:05.280
I think that's the formula
link |
01:02:06.600
for a really, really deep connection
link |
01:02:08.240
that from like a very specific engineering perspective
link |
01:02:11.940
is I think a fascinating open problem
link |
01:02:15.560
that hasn't been really worked on very much.
link |
01:02:18.400
And for me, if I have the guts,
link |
01:02:21.840
and I mean, there's a lot of things to say,
link |
01:02:24.800
but one of them is guts: I'll build a startup around it.
link |
01:02:29.360
Yeah, so let's talk about this startup
link |
01:02:32.360
and let's talk about the dream.
link |
01:02:34.800
You've mentioned this dream before
link |
01:02:36.040
in our previous conversations,
link |
01:02:37.280
always as little hints dropped here and there,
link |
01:02:40.120
just for anyone listening,
link |
01:02:41.360
there's never been an offline conversation about this dream.
link |
01:02:43.720
I'm not privy to anything except what Lex says now.
link |
01:02:48.160
And I realized that there's no way
link |
01:02:49.760
to capture the full essence of a dream
link |
01:02:52.840
in any kind of verbal statement in a way
link |
01:02:57.180
that captures all of it.
link |
01:02:58.200
But what is this dream that you've referred to now
link |
01:03:02.020
several times when we've sat down together
link |
01:03:05.000
and talked on the phone?
link |
01:03:06.920
Maybe it's this company, maybe it's something distinct.
link |
01:03:09.100
If you feel comfortable, it'd be great
link |
01:03:11.660
if you could share a little bit about what that is.
link |
01:03:13.320
Sure, so the way people express long-term vision,
link |
01:03:19.060
I've noticed is quite different.
link |
01:03:20.880
Like Elon is an example of somebody who can very crisply
link |
01:03:24.720
say exactly what the goal is.
link |
01:03:27.520
It also has to do with the fact that the problems he's solving
link |
01:03:29.800
have nothing to do with humans.
link |
01:03:32.400
So my long-term vision is a little bit more difficult
link |
01:03:36.400
to express in words, I've noticed, as I've tried.
link |
01:03:40.840
It could be my brain's failure.
link |
01:03:43.260
But there are ways to sneak up on it.
link |
01:03:45.400
So let me just say a few things.
link |
01:03:47.180
Early on in life, and also in the recent years,
link |
01:03:53.000
I've interacted with a few robots
link |
01:03:55.140
where I understood there's magic there.
link |
01:03:58.200
And that magic could be shared by millions
link |
01:04:02.140
if it's brought to light.
link |
01:04:04.640
When I first met Spot from Boston Dynamics,
link |
01:04:07.620
I realized there's magic there that nobody else is seeing.
link |
01:04:10.400
It's the dog?
link |
01:04:11.240
It's the dog, sorry.
link |
01:04:12.280
Spot is the four-legged robot from Boston Dynamics.
link |
01:04:17.280
Some people might have seen it, it's this yellow dog.
link |
01:04:20.380
And sometimes in life, you just notice something
link |
01:04:26.620
that just grabs you.
link |
01:04:28.460
And I believe that this is something that,
link |
01:04:32.380
this magic is something that could be
link |
01:04:34.820
in every single device in the world.
link |
01:04:38.260
The way that I think maybe Steve Jobs
link |
01:04:41.380
thought about the personal computer.
link |
01:04:44.420
Woz didn't think about the personal computer this way,
link |
01:04:46.700
but Steve did.
link |
01:04:48.060
Which is like, he thought that the personal computer
link |
01:04:50.340
should be as thin as a sheet of paper
link |
01:04:52.100
and everybody should have one.
link |
01:04:53.640
I mean, this idea, I think it is heartbreaking
link |
01:04:58.820
that the world is being filled up
link |
01:05:02.500
with machines that are soulless.
link |
01:05:06.060
And I think every one of them can have that same magic.
link |
01:05:10.260
One of the things that also inspired me
link |
01:05:14.780
in terms of a startup is that magic can be engineered
link |
01:05:17.660
much more easily than I thought.
link |
01:05:19.560
That's my intuition with everything I've ever built
link |
01:05:22.620
and worked on.
link |
01:05:23.960
So the dream is to add a bit of that magic
link |
01:05:28.860
in every single computing system in the world.
link |
01:05:32.360
So the way that the Windows operating system for a long time
link |
01:05:36.500
was the primary operating system everybody interacted with.
link |
01:05:39.460
They built apps on top of it.
link |
01:05:41.300
I think this is something that should be a layer,
link |
01:05:45.780
almost an operating system,
link |
01:05:47.740
in every device that humans interact with in the world.
link |
01:05:51.000
Now what that actually looks like,
link |
01:05:53.060
the actual dream when I was especially a kid,
link |
01:05:57.940
it didn't have this concrete form of a business.
link |
01:06:01.140
It had more of a dream of exploring your own loneliness
link |
01:06:06.140
by interacting with machines, robots.
link |
01:06:13.340
This deep connection between humans and robots
link |
01:06:15.720
was always a dream.
link |
01:06:17.180
And so for me, I'd love to see a world
link |
01:06:20.140
where every home has a robot
link |
01:06:22.460
and not a robot that washes the dishes or a sex robot,
link |
01:06:27.860
or I don't know, I think of any kind of activity
link |
01:06:31.440
the robot can do, but more like a companion.
link |
01:06:33.840
A family member, the way a dog is,
link |
01:06:37.660
but a dog that's able to speak your language too.
link |
01:06:41.980
So not just connect the way a dog does
link |
01:06:45.160
by looking at you and looking away
link |
01:06:46.800
and almost like smiling with its soul in that kind of way,
link |
01:06:51.100
but also to actually understand what the hell,
link |
01:06:54.580
like why are you so excited about the successes?
link |
01:06:56.740
Like understand the details, understand the traumas.
link |
01:06:59.900
And I just think that has always filled me
link |
01:07:04.420
with the excitement that I could,
link |
01:07:07.060
with artificial intelligence, bring joy to a lot of people.
link |
01:07:12.300
More recently, I've been more and more
link |
01:07:17.380
heartbroken to see the kind of division, derision,
link |
01:07:23.320
even hate that's boiling up on the internet
link |
01:07:27.100
through social networks.
link |
01:07:28.620
And I thought this kind of mechanism is exactly applicable
link |
01:07:32.780
in the context of social networks as well.
link |
01:07:34.780
So it's an operating system that serves
link |
01:07:38.420
as your guide on the internet.
link |
01:07:43.500
One of the biggest problems with YouTube
link |
01:07:45.740
and social networks currently
link |
01:07:47.960
is that they're optimizing for engagement.
link |
01:07:50.760
I think if you create AI systems
link |
01:07:52.540
that know each individual person,
link |
01:07:54.920
you're able to optimize for long-term growth,
link |
01:07:59.080
for long-term happiness.
link |
01:08:00.800
Of the individual?
link |
01:08:01.640
Of the individual, of the individual.
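A crude way to see the difference is as two objectives for ranking a feed. The scores below are hand-made stand-ins for illustration, not any platform's actual algorithm.

```python
# Two contrasting feed-ranking objectives, as a toy illustration.
# The scores are invented stand-ins, not a real platform's system.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_engagement: float        # clicks, watch time, outrage
    predicted_wellbeing_change: float  # long-horizon effect on the user

def rank_feed(items, long_term_weight=0.8):
    # long_term_weight=0 recovers today's engagement-only ranking;
    # higher values optimize the individual's long-term growth.
    def score(item):
        return ((1 - long_term_weight) * item.predicted_engagement
                + long_term_weight * item.predicted_wellbeing_change)
    return sorted(items, key=score, reverse=True)

feed = [Item("rage bait", 0.9, -0.6), Item("long-form interview", 0.4, 0.7)]
print([item.title for item in rank_feed(feed)])
```

The engineering difficulty, of course, is estimating that long-horizon term at all, which is exactly why it requires a system that knows the individual.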
link |
01:08:03.720
And there are a lot of other things to say. One is that,
link |
01:08:09.440
in order for AI systems to learn everything about you,
link |
01:08:15.760
they need to collect data,
link |
01:08:17.880
just like you and I, when we talk offline,
link |
01:08:19.960
we're collecting data about each other,
link |
01:08:21.520
secrets about each other.
link |
01:08:23.520
In the same way, AI has to do that.
link |
01:08:26.600
And that allows you to,
link |
01:08:28.720
and that requires you to rethink ideas of ownership of data.
link |
01:08:35.460
I think each individual should own all of their data
link |
01:08:39.920
and very easily be able to leave.
link |
01:08:41.920
Just like AI systems can leave,
link |
01:08:43.600
humans can disappear and delete all of their data
link |
01:08:48.200
at a moment's notice,
link |
01:08:49.400
which is actually better than we humans can do.
link |
01:08:54.220
Once we load the data into each other, it's there.
link |
01:08:56.880
I think it's very important to do both:
link |
01:09:00.420
give people complete control over their data
link |
01:09:03.920
in order to establish that they can trust you.
link |
01:09:06.180
And the second part of trust is transparency.
link |
01:09:09.460
Whenever the data is used, make it very clear
link |
01:09:11.900
what it's being used for.
link |
01:09:13.060
And not clear in a lawyerly legal sense,
link |
01:09:16.120
but clear in a way that people really understand
link |
01:09:18.780
what it's used for.
link |
01:09:19.620
I believe when people have the ability
link |
01:09:21.500
to delete all their data and walk away
link |
01:09:24.160
and know how the data is being used, I think they'll stay.
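As a sketch of those two trust properties, complete control plus plain-language transparency, imagine a hypothetical per-user data vault like this; the class and its methods are illustrative assumptions, not an existing API.

```python
# A hypothetical per-user data vault sketching the two trust
# properties described: plain-language transparency about every use,
# and the ability to walk away and erase everything at once.
class UserDataVault:
    def __init__(self):
        self._data = {}
        self._usage_log = []  # human-readable, not lawyerly

    def store(self, key, value):
        self._data[key] = value

    def use(self, key, purpose):
        # Every access records, in plain words, what the data is for.
        self._usage_log.append(f"used '{key}' to {purpose}")
        return self._data.get(key)

    def usage_report(self):
        return list(self._usage_log)

    def delete_everything(self):
        # The clean breakup: gone at a moment's notice.
        self._data.clear()
        self._usage_log.clear()

vault = UserDataVault()
vault.store("sleep_patterns", [6.5, 7.0, 5.5])
vault.use("sleep_patterns", "suggest an earlier wind-down tonight")
print(vault.usage_report())
vault.delete_everything()
```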
link |
01:09:29.860
The possibility of a clean breakup
link |
01:09:31.860
is actually what will keep people together.
link |
01:09:33.380
Yeah, I think so.
link |
01:09:34.420
I think, yeah, exactly.
link |
01:09:36.380
I think a happy marriage requires the ability
link |
01:09:39.700
to divorce easily without the divorce industrial complex
link |
01:09:46.060
or whatever is currently going on.
link |
01:09:48.040
There's so much money to be made from lawyers and divorce.
link |
01:09:50.740
But yeah, the ability to leave is what enables love,
link |
01:09:53.980
I think.
link |
01:09:55.220
It's interesting, I've heard the phrase
link |
01:09:57.340
from a semi-cynical friend
link |
01:09:58.860
that marriage is the leading cause of divorce.
link |
01:10:01.580
But now we've heard that divorce
link |
01:10:03.560
or the possibility of divorce
link |
01:10:04.980
could be the leading cause of marriage.
link |
01:10:06.620
Of a happy marriage.
link |
01:10:08.100
Good point.
link |
01:10:08.940
Of a happy marriage.
link |
01:10:09.920
So yeah, but there's a lot of details there.
link |
01:10:12.740
But the big dream is that connection
link |
01:10:14.660
between AI system and a human.
link |
01:10:17.480
And I haven't,
link |
01:10:20.240
there's so much fear
link |
01:10:21.080
about artificial intelligence systems and about robots
link |
01:10:23.860
that I haven't quite found the right words
link |
01:10:26.420
to express that vision
link |
01:10:27.580
because the vision I have is one,
link |
01:10:31.320
it's not like some naive delusional vision
link |
01:10:33.500
of technology is gonna save everybody.
link |
01:10:36.820
I really do just have a positive view
link |
01:10:40.300
of ways AI systems can help humans explore themselves.
link |
01:10:44.700
I love that positivity and I agree
link |
01:10:47.540
that the stance everything is doomed is equally bad
link |
01:10:54.400
to say that everything's gonna turn out all right.
link |
01:10:56.560
There has to be a dedicated effort.
link |
01:10:58.300
And clearly you're thinking
link |
01:11:00.380
about what that dedicated effort would look like.
link |
01:11:02.500
You mentioned two aspects to this dream.
link |
01:11:06.580
And I wanna make sure that I understand
link |
01:11:07.820
where they connect if they do
link |
01:11:10.020
or if these are independent streams.
link |
01:11:12.400
One was this hypothetical robot family member
link |
01:11:17.260
or some other form of robot
link |
01:11:19.340
that would allow people to experience
link |
01:11:20.980
the kind of delight that you experienced many times
link |
01:11:27.020
and that you would like the world to be able to have.
link |
01:11:30.280
And it's such a beautiful idea, this gift.
link |
01:11:33.500
And the other is social media
link |
01:11:36.180
or social network platforms
link |
01:11:38.160
that really serve individuals
link |
01:11:40.820
and their best selves and their happiness and their growth.
link |
01:11:44.160
Is there crossover between those
link |
01:11:45.600
or are these two parallel dreams?
link |
01:11:47.060
It's 100% the same thing.
link |
01:11:48.500
It's difficult to kind of explain
link |
01:11:50.900
without going through details,
link |
01:11:52.100
but maybe one easy way to explain
link |
01:11:54.620
the way I think about social networks
link |
01:11:56.700
is to create an AI system that's yours.
link |
01:11:59.660
That's yours.
link |
01:12:00.660
It's not like Amazon Alexa that's centralized.
link |
01:12:03.220
You own the data.
link |
01:12:04.540
It's like your little friend
link |
01:12:07.460
that becomes your representative on Twitter
link |
01:12:11.340
that helps you find things that will make you feel good,
link |
01:12:16.220
that will also challenge your thinking to make you grow,
link |
01:12:19.900
but not let you get lost in the negative spiral of dopamine
link |
01:12:26.860
that gets you to be angry
link |
01:12:29.420
or mostly just gets you to not be open to learning.
link |
01:12:34.240
And so that little representative
link |
01:12:36.220
is optimizing your long-term health.
link |
01:12:40.360
And I believe that that is not only good for human beings,
link |
01:12:45.620
it's also good for business.
link |
01:12:47.360
I think long-term you can make a lot of money
link |
01:12:50.480
by challenging this idea
link |
01:12:53.060
that the only way to make money is maximizing engagement.
link |
01:12:57.340
And one of the things that people disagree with me on
link |
01:12:59.700
is they think Twitter's always going to win.
link |
01:13:02.420
Like maximizing engagement is always going to win.
link |
01:13:04.880
I don't think so.
link |
01:13:06.300
I think people have woken up now to understanding
link |
01:13:09.500
that they don't always feel good,
link |
01:13:12.740
the ones who are on Twitter a lot,
link |
01:13:16.220
that they don't always feel good at the end of the week.
link |
01:13:19.300
I would love feedback from whatever this creature,
link |
01:13:24.000
whatever, I can't, I don't know what to call it,
link |
01:13:26.740
as to maybe at the end of the week,
link |
01:13:28.720
it would automatically unfollow
link |
01:13:30.520
some of the people that I follow
link |
01:13:31.940
because it realized through some really smart data
link |
01:13:35.580
about how I was feeling inside
link |
01:13:37.120
or how I was sleeping or something
link |
01:13:38.540
that that just wasn't good for me,
link |
01:13:40.600
but it might also put things and people in front of me
link |
01:13:43.600
that I ought to see.
link |
01:13:45.500
Is that kind of a sliver of what this looks like?
link |
01:13:48.660
The whole point, because of the interaction,
link |
01:13:50.540
because of sharing the moments
link |
01:13:53.580
and learning a lot about you,
link |
01:13:56.600
you're now able to understand
link |
01:13:59.560
what interactions led you to become
link |
01:14:01.960
a better version of yourself,
link |
01:14:04.280
like the person you yourself are happy with.
link |
01:14:07.280
This isn't, if you're into flat earth
link |
01:14:11.300
and you feel very good about it,
link |
01:14:12.800
that you believe the earth is flat,
link |
01:14:15.720
like, the idea that you should censor that is ridiculous.
link |
01:14:19.600
If it makes you feel good
link |
01:14:21.000
and you're becoming the best version of yourself,
link |
01:14:23.440
I think you should be getting
link |
01:14:24.600
as much flat earth as possible.
link |
01:14:26.420
Now, it's also good to challenge your ideas,
link |
01:14:29.320
but not because the centralized committee decided,
link |
01:14:34.560
but because you tell the system
link |
01:14:37.280
that you like challenging your ideas.
link |
01:14:39.280
I think all of us do.
link |
01:14:40.800
And then, which YouTube actually doesn't do well,
link |
01:14:44.040
once you go down the flat earth rabbit hole,
link |
01:14:45.900
that's all you're gonna see.
link |
01:14:47.160
It's nice to get some really powerful communicators
link |
01:14:51.640
to argue against flat earth.
link |
01:14:53.600
And it's nice to see that for you
link |
01:14:57.020
and potentially at least long-term to expand your horizons,
link |
01:15:01.100
maybe the earth is not flat.
link |
01:15:03.240
But if you continue to live your whole life
link |
01:15:05.160
thinking the earth is flat, I think,
link |
01:15:08.100
and you're being a good father or son or daughter,
link |
01:15:11.680
and like you're being the best version of yourself
link |
01:15:14.360
and you're happy with yourself, I think the earth is flat.
link |
01:15:18.940
So like, I think this kind of idea,
link |
01:15:21.360
and I'm just using that kind of silly, ridiculous example,
link |
01:15:24.220
because I don't like the idea of centralized forces
link |
01:15:30.440
controlling what you can and can't see.
link |
01:15:33.240
But I also don't like this idea of like,
link |
01:15:36.880
not censoring anything,
link |
01:15:39.680
because the biggest problem with that
link |
01:15:42.560
is there's a central decider.
link |
01:15:45.840
I think you yourself can decide what you wanna see and not.
link |
01:15:49.480
And it's good to have a companion that reminds you
link |
01:15:54.040
that you felt shitty last time you did this,
link |
01:15:56.500
or you felt good last time you did this.
link |
01:15:58.560
I mean, I feel like in every good story,
link |
01:16:00.120
there's a guide or a companion that flies out
link |
01:16:03.540
or forages a little bit further or a little bit differently
link |
01:16:06.440
and brings back information that helps us,
link |
01:16:08.420
or at least tries to steer us in the right direction.
link |
01:16:11.360
So that's exactly what I'm thinking
link |
01:16:16.600
and what I've been working on.
link |
01:16:17.800
I should mention there's a bunch of difficulties here.
link |
01:16:20.820
You've seen me up and down a little bit recently.
link |
01:16:24.200
So there are a lot of technical challenges here.
link |
01:16:28.480
Like with a lot of technologies,
link |
01:16:30.480
and the reason I'm talking about it on a podcast comfortably
link |
01:16:34.360
as opposed to working in secret, is that it's really hard.
link |
01:16:38.500
And maybe its time has not come.
link |
01:16:41.960
And that's something you have to constantly struggle with
link |
01:16:44.140
in terms of like entrepreneurially as a startup.
link |
01:16:48.060
Like I've also mentioned to you maybe offline,
link |
01:16:50.640
I really don't care about money.
link |
01:16:52.540
I don't care about business success,
link |
01:16:55.280
all those kinds of things.
link |
01:16:58.480
So it's a difficult decision to make
link |
01:17:01.120
how much of your time do you want to go all in here
link |
01:17:05.260
and give everything to this?
link |
01:17:07.620
It's a big roll of the dice because I've also realized
link |
01:17:11.560
that working on some of these problems,
link |
01:17:14.480
both with the robotics and the technical side
link |
01:17:18.040
in terms of the machine learning system
link |
01:17:21.000
that I'm describing, it's lonely, it's really lonely
link |
01:17:26.800
both on a personal level and a technical level.
link |
01:17:31.420
So on the technical level,
link |
01:17:32.480
I'm surrounded by people that kind of doubt me,
link |
01:17:37.900
which I think all entrepreneurs go through.
link |
01:17:40.560
And they doubt you in the following sense.
link |
01:17:42.800
They know how difficult it is.
link |
01:17:46.800
Like the people that, the colleagues of mine,
link |
01:17:49.560
they know how difficult lifelong learning is.
link |
01:17:52.320
They also know how difficult it is
link |
01:17:53.920
to build a system like this,
link |
01:17:56.320
to build a competitive social network.
link |
01:17:59.360
And in general, there's a kind of a loneliness
link |
01:18:05.040
to just working on something on your own
link |
01:18:08.800
for long periods of time.
link |
01:18:10.280
And you start to doubt whether,
link |
01:18:13.320
given that you don't have a track record of success,
link |
01:18:16.320
like that's a big one.
link |
01:18:17.980
When you look in the mirror, especially when you're young,
link |
01:18:20.520
but I still have that on most things,
link |
01:18:22.680
you look in the mirror and you have these big dreams.
link |
01:18:26.520
How do you know you're actually as smart
link |
01:18:30.800
as you think you are?
link |
01:18:32.480
Like, how do you know you're going to be able
link |
01:18:34.240
to accomplish this dream?
link |
01:18:35.420
You have this ambition.
link |
01:18:36.520
You sort of don't, but you're kind of pulling on a string
link |
01:18:40.920
hoping that there's a bigger ball of yarn.
link |
01:18:42.960
Yeah, but you have this kind of intuition.
link |
01:18:45.000
I think I pride myself in knowing what I'm good at
link |
01:18:51.360
because the reason I have that intuition
link |
01:18:54.440
is because I think I'm very good at knowing
link |
01:18:57.560
all the things I suck at, which is basically everything.
link |
01:19:01.280
So like, whenever I notice like, wait a minute,
link |
01:19:04.520
I'm kind of good at this, which is very rare for me.
link |
01:19:08.180
I think like that might be a ball of yarn worth pulling at.
link |
01:19:11.520
And the thing with, in terms of engineering systems
link |
01:19:14.760
that are able to interact with humans,
link |
01:19:16.560
I think I'm very good at that.
link |
01:19:18.520
And it's because we talk about podcasting and so on.
link |
01:19:22.040
I don't know if I'm very good at podcasting.
link |
01:19:23.600
You're very good at podcasting.
link |
01:19:25.200
But I certainly don't.
link |
01:19:27.200
I think maybe it is compelling for people
link |
01:19:30.760
to watch a kindhearted idiot struggle with this form.
link |
01:19:35.360
Maybe that's what's compelling.
link |
01:19:37.200
But in terms of like actual being a good engineer
link |
01:19:40.600
of human-robot interaction systems, I think I'm good.
link |
01:19:45.560
But it's hard to know until you do it
link |
01:19:48.000
and then the world keeps telling you you're not.
link |
01:19:50.840
And it's just, it's full of doubt and it's really hard.
link |
01:19:53.800
And I've been struggling with that recently.
link |
01:19:55.320
It's kind of a fascinating struggle.
link |
01:19:57.180
But then that's where the Goggins thing comes in,
link |
01:19:59.780
is like, aside from the stay hard motherfucker,
link |
01:20:03.900
is the like, whenever you're struggling,
link |
01:20:07.860
that's a good sign that if you keep going,
link |
01:20:11.160
that you're going to be alone in the success, right?
link |
01:20:16.240
Well, in your case, however, I agree.
link |
01:20:18.660
And actually David had a post recently
link |
01:20:20.340
that I thought, among his many brilliant posts,
link |
01:20:23.240
was one of the more brilliant, about how,
link |
01:20:26.280
he talked about this myth of the light
link |
01:20:27.860
at the end of the tunnel.
link |
01:20:29.520
And instead what he replaced that myth with
link |
01:20:33.460
was a concept that eventually your eyes adapt to the dark.
link |
01:20:38.460
That the tunnel, it's not about a light at the end,
link |
01:20:40.320
that it's really about adapting to the dark of the tunnel.
link |
01:20:42.800
It's very Goggins-
link |
01:20:43.640
I love him so much.
link |
01:20:44.800
Yeah, you guys share a lot in common.
link |
01:20:48.880
Knowing you both a bit, you share a lot in common.
link |
01:20:52.520
But in this loneliness and the pursuit of this dream,
link |
01:20:57.840
it seems to me it has a certain component to it
link |
01:21:01.320
that is extremely valuable,
link |
01:21:04.040
which is that the loneliness itself could serve as a driver
link |
01:21:08.000
to build the companion for the journey.
link |
01:21:10.480
Well, I'm very deeply aware of that.
link |
01:21:13.680
So like some people can make,
link |
01:21:17.280
cause I talk about love a lot.
link |
01:21:18.640
I really love everything in this world,
link |
01:21:21.900
but I also love humans, friendship and romantic,
link |
01:21:26.040
you know, like even the cheesy stuff, just-
link |
01:21:31.040
You like romantic movies.
link |
01:21:32.440
Yeah, not those-
link |
01:21:33.760
I'm just kidding.
link |
01:21:35.160
Well, I got so much shit from Rogan about like,
link |
01:21:38.320
what is it, the tango scene from Scent of a Woman.
link |
01:21:41.100
But yeah, I find like there's nothing better
link |
01:21:44.000
than a woman in a red dress.
link |
01:21:45.480
Like, you know, just like classy.
link |
01:21:49.420
You should move to Argentina, Mike.
link |
01:21:51.040
You know, my father's Argentine.
link |
01:21:52.240
And you know what he said when I went on your podcast
link |
01:21:54.760
for the first time, he said, he dresses well.
link |
01:21:58.320
Because in Argentina, the men go to a wedding
link |
01:22:00.240
or a party or something.
link |
01:22:01.440
You know, in the U.S. by halfway through the night,
link |
01:22:03.660
10 minutes in the night, all the jackets are off.
link |
01:22:05.800
It looks like everyone's undressing for the party
link |
01:22:07.600
they just got dressed up for.
link |
01:22:09.040
And he said, you know, I like the way he dresses.
link |
01:22:11.600
And then when I started, he was talking about you.
link |
01:22:13.200
And then when I started my podcast, he said,
link |
01:22:15.840
why don't you wear a real suit like your friend Lex?
link |
01:22:19.320
I remember that.
link |
01:22:23.440
But let's talk about this pursuit just a bit more.
link |
01:22:27.840
Because I think what you're talking about
link |
01:22:29.320
is building a, not just a solution for loneliness,
link |
01:22:33.920
but you've alluded to the loneliness
link |
01:22:36.080
as itself an important thing.
link |
01:22:37.800
And I think you're right.
link |
01:22:38.640
I think within people, there's like caverns of thoughts
link |
01:22:44.120
and shame, but also just the desire to be,
link |
01:22:47.680
to have resonance, to be seen and heard.
link |
01:22:51.220
And I don't even know that it's seen
link |
01:22:52.440
and heard through language.
link |
01:22:53.760
But these reservoirs of loneliness, I think they're,
link |
01:23:00.500
well, they're interesting.
link |
01:23:01.340
Maybe you could comment a little bit about it
link |
01:23:02.620
because just as often as you talk about love,
link |
01:23:05.160
I haven't quantified it,
link |
01:23:06.080
but it seems that you talk about this loneliness.
link |
01:23:08.440
And maybe you just would, if you're willing,
link |
01:23:10.160
you could share a little bit more about that
link |
01:23:12.240
and what that feels like now in the pursuit
link |
01:23:15.300
of building this robot-human relationship.
link |
01:23:18.360
And you've been, let me be direct.
link |
01:23:20.460
You've been spending a lot of time on building
link |
01:23:23.800
a robot-human relationship.
link |
01:23:26.340
Where's that at?
link |
01:23:28.360
Oh, well, in terms of business and in terms of systems.
link |
01:23:33.440
No, I'm talking about a specific robot.
link |
01:23:35.540
Oh, so, okay, I should mention a few things.
link |
01:23:39.840
So one is, there's a startup, where there's an idea
link |
01:23:42.640
that I hope millions of people can use.
link |
01:23:46.000
And then there's my own personal,
link |
01:23:49.240
almost like Frankenstein explorations
link |
01:23:52.040
with particular robots.
link |
01:23:55.040
So I'm very fascinated with the legged robots
link |
01:23:59.200
in my own private, it sounds dark,
link |
01:24:03.120
but like, N-of-one experiments
link |
01:24:06.440
to see if I can recreate the magic.
link |
01:24:09.620
And that's been going well. I already have a lot of really good
link |
01:24:14.160
perception systems and control systems
link |
01:24:17.080
that are able to communicate affection
link |
01:24:19.240
in a dog-like fashion.
link |
01:24:20.760
So I'm in a really good place there.
link |
01:24:22.520
The stumbling blocks,
link |
01:24:23.560
which have also been part of my sadness recently,
link |
01:24:27.080
is that I also have to work with robotics companies
link |
01:24:30.760
that I gave so much of my heart, soul,
link |
01:24:34.640
and love and appreciation towards: Boston Dynamics.
link |
01:24:37.940
But Boston Dynamics is also a company
link |
01:24:42.600
that has to make a lot of money
link |
01:24:43.840
and they have marketing teams.
link |
01:24:45.560
And they're like looking at this silly Russian kid
link |
01:24:48.000
in a suit and tie.
link |
01:24:49.320
It's like, what's he trying to do with all this love
link |
01:24:51.400
and robot interaction and dancing and so on?
link |
01:24:53.480
So there was, I think, let's say for now,
link |
01:24:59.000
it's like when you break up with a girlfriend or something.
link |
01:25:01.200
Right now we decided to part ways on this particular thing.
link |
01:25:04.060
They're huge supporters of mine, they're huge fans.
link |
01:25:05.900
But on this particular thing,
link |
01:25:08.640
Boston Dynamics is not focusing on
link |
01:25:12.640
or interested in human robot interaction.
link |
01:25:15.040
In fact, their whole business currently
link |
01:25:17.160
is keep the robot as far away from humans as possible.
link |
01:25:20.960
Because it's in the industrial setting
link |
01:25:23.800
where it's doing monitoring in dangerous environments.
link |
01:25:27.300
It's almost like a remote security camera
link |
01:25:29.440
essentially is its application.
link |
01:25:32.040
To me, I thought it's still even in those applications
link |
01:25:35.740
exceptionally useful for the robot
link |
01:25:37.840
to be able to perceive humans, like see humans
link |
01:25:41.440
and to be able to, in a big map,
link |
01:25:45.520
localize where those humans are and infer human intention.
link |
01:25:48.400
For example, like this,
link |
01:25:49.800
I did a lot of work with pedestrians
link |
01:25:52.080
for a robot to be able to anticipate
link |
01:25:53.860
what the hell the human is doing, like where they're walking.
link |
01:25:57.560
Humans are not ballistic objects.
link |
01:25:59.200
They're not, just because you're walking this way
link |
01:26:00.920
one moment doesn't mean you'll keep walking that direction.
link |
01:26:03.820
You have to infer a lot of signals,
link |
01:26:05.400
especially the head movement and the eye movement.
link |
01:26:07.720
So I thought that's super interesting to explore,
link |
01:26:09.760
but they didn't feel that.
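A minimal sketch of that inference, assuming hypothetical upstream estimates of velocity, head pose, and gaze; the point is that heading cues from the head and eyes get blended with motion, since walking direction alone is a weak predictor:

```python
import math
from dataclasses import dataclass

@dataclass
class PedestrianObs:
    """One observation of a tracked pedestrian; all fields are assumed to
    come from an upstream tracker, not from any real robot API."""
    vx: float        # current velocity in the map frame, m/s
    vy: float
    head_yaw: float  # head orientation, radians
    gaze_yaw: float  # estimated gaze direction, radians

def predict_heading(obs: PedestrianObs,
                    w_vel: float = 0.5,
                    w_head: float = 0.3,
                    w_gaze: float = 0.2) -> float:
    """Blend velocity, head, and gaze cues into a short-horizon heading guess.

    Encodes the point that humans are not ballistic objects: where the head
    and eyes point often precedes a change in walking direction, so the
    current velocity alone is a weak predictor."""
    vel_yaw = math.atan2(obs.vy, obs.vx)
    # Average the angles via unit vectors to avoid wrap-around at +/- pi.
    x = (w_vel * math.cos(vel_yaw)
         + w_head * math.cos(obs.head_yaw)
         + w_gaze * math.cos(obs.gaze_yaw))
    y = (w_vel * math.sin(vel_yaw)
         + w_head * math.sin(obs.head_yaw)
         + w_gaze * math.sin(obs.gaze_yaw))
    return math.atan2(y, x)
```

The weights here are arbitrary placeholders; in practice they would be learned or tuned, and the estimate would feed a probabilistic motion model rather than a single heading.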
link |
01:26:11.680
So I'll be working with a few other robotics companies
link |
01:26:14.480
that are much more open to that kind of stuff.
link |
01:26:18.360
And they're super excited and fans of mine.
link |
01:26:20.680
And hopefully Boston Dynamics, my first love
link |
01:26:23.440
like getting back with an ex-girlfriend, will come around.
link |
01:26:26.000
But so, algorithmically,
link |
01:26:29.520
I'm basically done there.
link |
01:26:33.380
The rest is actually getting
link |
01:26:35.840
some of these companies to work with.
link |
01:26:37.200
And then, people who work with robots know
link |
01:26:41.600
that one thing is to write software that works.
link |
01:26:44.880
And the other is to have a real machine that actually works.
link |
01:26:48.440
And it breaks down in all kinds of different ways
link |
01:26:50.860
that are fascinating.
link |
01:26:51.720
And so there's a big challenge there.
link |
01:26:53.920
But that's almost, it may sound a little bit confusing
link |
01:26:58.760
in the context of our previous discussion,
link |
01:27:01.280
because the previous discussion was more
link |
01:27:03.660
about the big dream, how I hoped to have millions of people
link |
01:27:07.040
enjoy this moment of magic.
link |
01:27:09.420
This current discussion about a robot
link |
01:27:12.160
is something I personally really enjoy.
link |
01:27:15.080
It just brings me happiness.
link |
01:27:16.620
I really try to do now everything that just brings me joy.
link |
01:27:20.960
I maximize that because robots are awesome.
link |
01:27:24.180
But two, given my somewhat growing platform,
link |
01:27:28.440
I want to use the opportunity to educate people.
link |
01:27:31.860
It's just like robots are cool.
link |
01:27:34.080
And if I think they're cool, I'll be able to,
link |
01:27:36.600
I hope be able to communicate why they're cool to others.
link |
01:27:39.620
So this little robot experiment
link |
01:27:42.160
is a little bit of a research project too.
link |
01:27:43.980
There's a couple of publications with MIT folks around that.
link |
01:27:47.000
But the other is just to make some cool videos
link |
01:27:49.800
and explain to people how they actually work.
link |
01:27:53.320
And as opposed to people being scared of robots,
link |
01:27:56.640
they can still be scared, but also excited.
link |
01:27:59.860
Like see the dark side, the beautiful side,
link |
01:28:03.120
the magic of what it means to bring, you know,
link |
01:28:07.720
for a machine to become a robot.
link |
01:28:09.800
I want to inspire people with that, but that's less,
link |
01:28:13.880
it's interesting because I think the big impact
link |
01:28:18.120
in terms of the dream does not have to do with embodied AI.
link |
01:28:22.360
So it does not need to have a body.
link |
01:28:24.040
I think the refrigerator is enough; for an AI system,
link |
01:28:28.680
just to have a voice and to hear you,
link |
01:28:31.320
that's enough for loneliness.
link |
01:28:33.400
The embodiment is just-
link |
01:28:36.540
By embodiment, you mean the physical structure.
link |
01:28:38.480
Physical instantiation of intelligence.
link |
01:28:41.280
So it's a legged robot or even just a thing.
link |
01:28:45.980
I have a few other humanoid robots,
link |
01:28:48.520
little humanoid robots, maybe I'll keep them on the table.
link |
01:28:51.320
Just like walks around or even just like a mobile platform
link |
01:28:54.720
that can just like turn around and look at you.
link |
01:28:57.560
It's like we mentioned with the pen,
link |
01:28:59.200
something that moves and can look at you.
link |
01:29:02.040
It's like that butter robot that asks, what is my purpose?
link |
01:29:11.280
That is really, it's almost like art.
link |
01:29:15.640
There's something about a physical entity that moves around,
link |
01:29:19.680
that's able to look at you and interact with you,
link |
01:29:22.620
that makes you wonder what it means to be human.
link |
01:29:25.920
It like challenges you to think,
link |
01:29:27.520
if that thing looks like it has consciousness,
link |
01:29:32.540
what the hell am I?
link |
01:29:33.920
And I like that feeling.
link |
01:29:35.080
I think that's really useful for us.
link |
01:29:36.500
It's humbling for us humans, but that's less about research.
link |
01:29:41.080
It's certainly less about business
link |
01:29:42.660
and more about exploring our own selves
link |
01:29:45.960
and challenging others to think like,
link |
01:29:49.200
to think about what makes them human.
link |
01:29:52.880
I love this desire to share the delight
link |
01:29:56.360
of an interaction with a robot.
link |
01:29:58.040
And as you describe it,
link |
01:29:58.880
I actually, I find myself starting to crave that
link |
01:30:01.340
because we all have those elements from childhood where,
link |
01:30:04.920
or from adulthood where we experience something
link |
01:30:06.760
and we want other people to feel that.
link |
01:30:09.800
And I think that you're right.
link |
01:30:10.920
I think a lot of people are scared of AI.
link |
01:30:12.560
I think a lot of people are scared of robots.
link |
01:30:14.440
My only experience of a robotic-like thing
link |
01:30:18.840
is my Roomba vacuum where it goes about,
link |
01:30:21.840
actually was pretty good at picking up Costello's hair
link |
01:30:24.280
when he was shedding.
link |
01:30:25.480
And I was grateful for it.
link |
01:30:28.260
But then when I was on a call or something
link |
01:30:30.440
and it would get caught on a wire or something,
link |
01:30:33.200
I would find myself getting upset with the Roomba.
link |
01:30:35.280
In that moment, I'm like, what are you doing?
link |
01:30:37.520
And obviously it's just doing what it does.
link |
01:30:39.920
But that's a kind of mostly positive
link |
01:30:42.480
but slightly negative interaction.
link |
01:30:45.480
But what you're describing,
link |
01:30:46.640
it has so much more richness and layers of detail
link |
01:30:50.640
that I can only imagine what those relationships are like.
link |
01:30:53.680
Well, there's a few, just a quick comment.
link |
01:30:55.400
So I've had, they're currently in Boston.
link |
01:30:57.680
I have a bunch of Roombas for my robot.
link |
01:31:00.560
And I did this experiment.
link |
01:31:01.920
Wait, how many Roombas?
link |
01:31:04.760
Sounds like a fleet of Roombas.
link |
01:31:05.880
Yeah, so probably seven or eight.
link |
01:31:08.640
Well, that's a lot of Roombas.
link |
01:31:09.960
So- This place is very clean.
link |
01:31:12.800
Well, so this, I'm kind of waiting.
link |
01:31:14.760
This is the place we're currently in in Austin
link |
01:31:18.440
is way larger than I need.
link |
01:31:20.280
But I basically got it to make sure I have room for robots.
link |
01:31:27.160
So you have these seven or so Roombas.
link |
01:31:30.400
You deploy all seven at once?
link |
01:31:32.080
Oh no, I do different experiments with them.
link |
01:31:36.040
So one of the things I want to mention is this,
link |
01:31:38.860
I think there was a YouTube video
link |
01:31:40.140
that inspired me to try this,
link |
01:31:42.800
I got them to scream in pain and moan in pain
link |
01:31:47.800
whenever they were kicked or contacted.
link |
01:31:53.380
And I did that experiment to see how I would feel.
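A minimal sketch of that kind of rig, with the sensor read and audio playback stubbed out as simulations; a real build would poll the Roomba's bump sensors over the iRobot Open Interface serial protocol, and the helper names and sound files here are hypothetical:

```python
import random
import time

# Hypothetical stand-ins: a real build would poll the Roomba's bump sensors
# over the iRobot Open Interface serial protocol and play audio on a speaker.
def read_bump_sensor() -> bool:
    return random.random() < 0.01    # simulate an occasional kick

def play_sound(path: str) -> None:
    print(f"playing {path}")         # placeholder for real audio playback

PAIN_SOUNDS = ["scream_1.wav", "moan_1.wav", "moan_2.wav"]  # hypothetical files

def run(poll_hz: float = 20.0) -> None:
    """Poll for contact and react with a randomized pain sound, with a short
    refractory period so one kick doesn't trigger a continuous scream."""
    refractory_until = 0.0
    while True:
        now = time.monotonic()
        if read_bump_sensor() and now >= refractory_until:
            play_sound(random.choice(PAIN_SOUNDS))
            refractory_until = now + 1.5  # seconds of quiet after each cry
        time.sleep(1.0 / poll_hz)

if __name__ == "__main__":
    run()
```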
link |
01:31:56.680
I meant to do like a YouTube video on it,
link |
01:31:58.780
but then it just seemed very cruel.
link |
01:32:00.380
Did any Roomba rights activists come at you?
link |
01:32:03.420
Like, I think if I release that video,
link |
01:32:07.620
I think it's going to make me look insane,
link |
01:32:09.680
which I know people know I'm already insane.
link |
01:32:12.740
Now you have to release the video.
link |
01:32:14.540
I think maybe if I contextualize it
link |
01:32:18.300
by showing other robots like to show why this is fascinating
link |
01:32:23.200
because ultimately I felt like they were human
link |
01:32:26.020
almost immediately.
link |
01:32:27.380
And that display of pain was what did that.
link |
01:32:30.340
Giving them a voice.
link |
01:32:31.500
Giving them a voice, especially a voice of dislike of pain.
link |
01:32:37.180
I have to connect you to my friend, Eddie Chang.
link |
01:32:39.220
He studies speech and language.
link |
01:32:40.460
He's a neurosurgeon and we're lifelong friends.
link |
01:32:42.980
He studies speech and language,
link |
01:32:46.340
but he describes some of these more primitive,
link |
01:32:50.160
visceral vocalizations, cries, groans, moans of delight,
link |
01:32:56.440
other sounds as well, use your imagination,
link |
01:32:58.940
as such powerful rudders for the other,
link |
01:33:02.820
for the emotions of other people.
link |
01:33:04.620
And so I find it fascinating.
link |
01:33:06.060
I can't wait to see this video.
link |
01:33:07.220
Is that, so is the video available online?
link |
01:33:09.480
No, I haven't recorded it, I just had a bunch of Roombas
link |
01:33:13.660
that are able to scream in pain in my Boston place.
link |
01:33:20.580
Like people are ready as well.
link |
01:33:22.300
Next podcast episode with Lex, maybe we'll have that one.
link |
01:33:26.140
Who knows?
link |
01:33:26.980
So the thing is like people,
link |
01:33:28.220
I've noticed because I talk so much about love
link |
01:33:30.940
and it's really who I am.
link |
01:33:32.340
I think, to a lot of people,
link |
01:33:35.660
it seems like there's gotta be a dark person
link |
01:33:38.460
in there somewhere.
link |
01:33:39.460
And I thought if I release videos of Roombas screaming,
link |
01:33:41.900
they'll be like, yep, yep, that guy's definitely insane.
link |
01:33:44.540
What about like shouts of glee and delight?
link |
01:33:47.760
You could do that too, right?
link |
01:33:49.180
Well, I don't know how to,
link |
01:33:51.660
to me delight is quiet, right?
link |
01:33:54.420
Like you're Russian.
link |
01:33:57.580
Americans are much louder than Russians.
link |
01:33:59.860
Yeah, yeah.
link |
01:34:01.020
But I don't, I mean, unless you're talking about like,
link |
01:34:04.740
I don't know how you would have sexual relations
link |
01:34:06.460
with a Roomba.
link |
01:34:07.300
I wasn't necessarily saying sexual delight, but-
link |
01:34:10.260
Trust me, I tried, I'm just kidding.
link |
01:34:12.660
That's a joke, internet.
link |
01:34:14.100
Okay, but I was fascinated by the psychology
link |
01:34:16.740
of how little it took.
link |
01:34:17.700
Cause you mentioned you had a negative relationship
link |
01:34:20.560
with the Roomba.
link |
01:34:21.940
Well, I'd find that mostly I took it for granted.
link |
01:34:25.140
It just served me, it collected Costello's hair.
link |
01:34:27.540
And then when it would do something I didn't like,
link |
01:34:29.180
I would get upset with it.
link |
01:34:30.220
So that's not a good relationship.
link |
01:34:31.740
It was taken for granted and I would get upset
link |
01:34:35.020
and then I'd park it again.
link |
01:34:36.060
And I just like, you're in the corner.
link |
01:34:39.020
Yeah, but there's a way to frame it being quite dumb
link |
01:34:45.380
as almost cute, almost connecting with it
link |
01:34:49.620
for its dumbness.
link |
01:34:51.120
And I think that's an artificial intelligence problem.
link |
01:34:54.700
Interesting.
link |
01:34:55.540
I think flaws should be a feature, not a bug.
link |
01:34:59.280
So along the lines of this,
link |
01:35:01.140
the different sorts of relationships that one could have
link |
01:35:03.220
with robots and the fear,
link |
01:35:04.440
but also some of the positive relationships
link |
01:35:06.900
that one could have.
link |
01:35:08.900
There's so much dimensionality.
link |
01:35:10.300
There's so much to explore,
link |
01:35:12.400
but power dynamics in relationships are very interesting
link |
01:35:16.300
because the obvious one, the unsophisticated view
link |
01:35:20.640
of this, is that there's a master and a servant, right?
link |
01:35:25.020
But there's also manipulation.
link |
01:35:27.780
There's benevolent manipulation.
link |
01:35:30.820
Children do this with parents.
link |
01:35:32.020
Puppies do this.
link |
01:35:33.120
Puppies turn their head and look cute
link |
01:35:34.920
and maybe give out a little noise.
link |
01:35:37.560
Kids coo.
link |
01:35:39.020
And parents always think that they're doing this
link |
01:35:41.540
because they love the parent.
link |
01:35:44.980
But in many ways, studies show that those coos are ways
link |
01:35:48.460
to extract the sorts of behaviors and expressions
link |
01:35:50.820
from the parent that they want.
link |
01:35:51.820
The child doesn't know it's doing this.
link |
01:35:53.160
It's completely subconscious,
link |
01:35:54.260
but it's benevolent manipulation.
link |
01:35:56.460
So there's one version of fear of robots
link |
01:35:59.720
that I hear a lot about that I think most people
link |
01:36:02.180
can relate to where the robots take over
link |
01:36:04.540
and they become the masters and we become the servants.
link |
01:36:08.140
But there could be another version
link |
01:36:10.000
that in certain communities that I'm certainly not a part of
link |
01:36:15.140
they call topping from the bottom,
link |
01:36:17.220
where the robot is actually manipulating you
link |
01:36:20.780
into doing things, but you are under the belief
link |
01:36:24.180
that you are in charge, but actually they're in charge.
link |
01:36:29.180
And so I think that's one that if we could explore
link |
01:36:33.740
that for a second, you could imagine
link |
01:36:35.540
it wouldn't necessarily be bad,
link |
01:36:37.140
although it could lead to bad things.
link |
01:36:40.120
The reason I want to explore this is I think people
link |
01:36:41.820
always default to the extreme, like the robots take over
link |
01:36:45.580
and we're in little jail cells and they're out having fun
link |
01:36:48.100
and ruling the universe.
link |
01:36:50.580
What sorts of manipulation can a robot
link |
01:36:54.100
potentially carry out, good or bad?
link |
01:36:56.500
Yeah, just so there's a lot of good and bad manipulation
link |
01:36:59.740
between humans, right, just like you said.
link |
01:37:04.400
To me, especially like you said,
link |
01:37:09.900
topping from the bottom, is that the term?
link |
01:37:12.340
I think someone from MIT told me that term; it wasn't Lex.
link |
01:37:18.420
So first of all, there's power dynamics in bed
link |
01:37:22.060
and power dynamics in relationships
link |
01:37:24.220
and power dynamics on the street
link |
01:37:25.860
and in the work environment, those are all very different.
link |
01:37:29.260
I think power dynamics can make human relationships,
link |
01:37:34.740
especially romantic relationships, fascinating and rich
link |
01:37:39.820
and fulfilling and exciting and all those kinds of things.
link |
01:37:42.740
So I don't think in themselves they're bad
link |
01:37:49.040
and the same goes with robots.
link |
01:37:50.820
I really love the idea that a robot would be a top
link |
01:37:54.220
or a bottom in terms of like power dynamics
link |
01:37:57.520
and I think everybody should be aware of that
link |
01:37:59.900
and the manipulation is not so much manipulation
link |
01:38:02.460
but a dance of like pulling away, a push and pull
link |
01:38:06.700
and all those kinds of things.
link |
01:38:08.340
In terms of control, I think we're very, very, very far away
link |
01:38:12.860
from AI systems that are able to lock us up.
link |
01:38:16.340
To lock us up, like, to have so much control
link |
01:38:21.340
that we basically cannot live our lives
link |
01:38:25.180
in the way that we want.
link |
01:38:26.960
I think there's, in terms of dangers of AI systems,
link |
01:38:29.720
there's much more dangers that have to do
link |
01:38:31.300
with autonomous weapon systems and all those kinds of things.
link |
01:38:34.280
So the power dynamics as exercised in the struggle
link |
01:38:37.940
between nations and war and all those kinds of things.
link |
01:38:40.700
But in terms of personal relationships,
link |
01:38:43.700
I think power dynamics are a beautiful thing.
link |
01:38:45.940
Now there is, of course, going to be all those kinds
link |
01:38:48.380
of discussions about consent and rights
link |
01:38:52.460
and all those kinds of things.
link |
01:38:53.300
Well, here we're talking, I always say,
link |
01:38:54.860
in any discussion around this,
link |
01:38:56.420
we need to really define the context,
link |
01:38:59.180
it's always, it always should be consensual,
link |
01:39:02.420
age appropriate, context appropriate, species appropriate.
link |
01:39:06.060
But now we're talking about human robot interactions
link |
01:39:09.200
and so I guess that-
link |
01:39:10.780
No, I actually was trying to make a different point
link |
01:39:13.620
which is I do believe that robots will have rights
link |
01:39:17.020
down the line and I think in order for us
link |
01:39:20.820
to have deep meaningful relationship with robots,
link |
01:39:23.060
we would have to consider them as entities in themselves
link |
01:39:26.580
that deserve respect.
link |
01:39:29.560
And that's a really interesting concept
link |
01:39:31.660
that I think people are starting to talk about
link |
01:39:34.360
a little bit more, but it's very difficult for us
link |
01:39:36.580
to understand how entities that are other than human,
link |
01:39:39.700
I mean, the same as with dogs and other animals
link |
01:39:42.460
can have rights on a level with humans.
link |
01:39:44.900
Well, yeah, I mean, we can't, nor should we,
link |
01:39:48.220
do whatever we want with animals.
link |
01:39:49.940
We have a USDA, we have departments of agriculture
link |
01:39:54.700
that deal with animal care and use committees
link |
01:39:58.260
for research, for farming and ranching and all that.
link |
01:40:02.060
So while when you first said it, I thought,
link |
01:40:05.020
wait, why would there be a bill of robotic rights?
link |
01:40:07.380
But it absolutely makes sense in the context of everything
link |
01:40:11.240
we've been talking about up until now.
link |
01:40:13.860
Let's, if you're willing, I'd love to talk about dogs
link |
01:40:18.700
because you've mentioned dogs a couple of times,
link |
01:40:21.260
a robot dog, you had a biological dog, yeah.
link |
01:40:26.300
Yeah, I had a Newfoundland named Homer
link |
01:40:32.460
for many years growing up.
link |
01:40:34.220
In Russia or in the US?
link |
01:40:35.620
In the United States.
link |
01:40:37.300
And he was about, he's over 200 pounds, that's a big dog.
link |
01:40:40.940
That's a big dog.
link |
01:40:41.820
If people know Newfoundland, so he's this black dog
link |
01:40:45.260
with really long hair and just a kind soul.
link |
01:40:50.460
I think perhaps that's true for a lot of large dogs,
link |
01:40:53.220
but he thought he was a small dog.
link |
01:40:55.180
So he moved like that and-
link |
01:40:56.500
Was he your dog?
link |
01:40:57.380
Yeah, yeah.
link |
01:40:58.340
So you had him since he was fairly young?
link |
01:41:00.500
Oh, since, yeah, since the very, very beginning
link |
01:41:02.720
to the very, very end.
link |
01:41:03.780
And one of the things, I mean, he had this kind of,
link |
01:41:08.620
we mentioned like the Roombas, he had a kindhearted
link |
01:41:13.660
dumbness about him that was just overwhelming.
link |
01:41:16.620
It's part of the reason I named him Homer
link |
01:41:20.020
because it's after Homer Simpson,
link |
01:41:22.740
in case people are wondering which Homer I'm referring to.
link |
01:41:25.260
I'm not, you know.
link |
01:41:27.820
So there's a-
link |
01:41:28.660
Not the Odyssey.
link |
01:41:29.500
Yeah, exactly.
link |
01:41:32.300
There's a clumsiness that was just something
link |
01:41:35.120
that immediately led to a deep love for each other.
link |
01:41:37.960
And one of the, I mean, he was always,
link |
01:41:42.300
it's the shared moments.
link |
01:41:43.300
He was always there for so many nights together.
link |
01:41:46.500
That's a powerful thing about a dog that he was there
link |
01:41:51.060
through all the loneliness, through all the tough times,
link |
01:41:53.540
through the successes and all those kinds of things.
link |
01:41:55.820
And I remember, I mean,
link |
01:41:57.740
that was a really moving moment for me.
link |
01:42:00.100
I still miss him to this day.
link |
01:42:02.140
How long ago did he die?
link |
01:42:05.340
Maybe 15 years ago.
link |
01:42:07.060
So it's been a while,
link |
01:42:09.140
but it was the first time I've really experienced
link |
01:42:13.940
like the feeling of death.
link |
01:42:16.980
So what happened is he got cancer
link |
01:42:22.840
and so he was dying slowly.
link |
01:42:26.020
And then at a certain point he couldn't get up anymore.
link |
01:42:29.700
There's a lot of things I could say here,
link |
01:42:31.780
you know, that I struggle with.
link |
01:42:33.980
That maybe he suffered much longer than he needed to.
link |
01:42:39.080
That's something I really think about a lot.
link |
01:42:42.060
But I remember I had to take him to the hospital
link |
01:42:47.160
and the nurses couldn't carry him, right?
link |
01:42:52.300
So you talk about 200 pound dog.
link |
01:42:54.420
I was really into powerlifting at the time.
link |
01:42:56.700
I remember like they tried to figure out
link |
01:42:59.340
all these kinds of ways to,
link |
01:43:01.740
so in order to put him to sleep,
link |
01:43:03.580
they had to take him into a room.
link |
01:43:07.220
And so I had to carry him everywhere.
link |
01:43:09.500
And here's this dying friend of mine
link |
01:43:13.540
that I just had to,
link |
01:43:15.220
first of all, it was really difficult to carry
link |
01:43:16.980
somebody that heavy when they're not helping you out.
link |
01:43:20.100
And yeah, so I remember it was the first time
link |
01:43:25.900
seeing a friend laying there
link |
01:43:28.460
and seeing life drained from his body and that realization
link |
01:43:36.340
that we're here for a short time was made so real
link |
01:43:40.580
that here's a friend that was there for me
link |
01:43:42.740
the week before, the day before, and now he's gone.
link |
01:43:46.220
And that was, I don't know,
link |
01:43:49.180
that spoke to the fact that you could be deeply connected
link |
01:43:52.380
with a dog, also spoke to the fact
link |
01:43:56.700
that the shared moments together
link |
01:44:00.820
that led to that deep friendship
link |
01:44:05.260
will make life so amazing,
link |
01:44:08.540
but also spoke to the fact that death is a motherfucker.
link |
01:44:13.420
So I know you've lost Costello recently
link |
01:44:16.100
and you've been going through it.
link |
01:44:16.940
And as you're saying this,
link |
01:44:17.760
I'm definitely fighting back the tears.
link |
01:44:22.100
Thank you for sharing that,
link |
01:44:23.340
that I guess we're about to both cry over our dead dogs,
link |
01:44:28.660
that it was bound to happen
link |
01:44:30.300
just given when this is happening.
link |
01:44:33.480
Yeah, it's-
link |
01:44:35.060
How long did you know that Costello was not doing well?
link |
01:44:39.660
Well, let's see, a year ago, during the start of,
link |
01:44:44.300
about six months into the pandemic,
link |
01:44:46.420
he started getting abscesses and he was not,
link |
01:44:49.060
his behavior changed and something really changed.
link |
01:44:51.580
And then I put him on testosterone
link |
01:44:55.780
which helped a lot of things;
link |
01:44:58.100
it certainly didn't cure everything,
link |
01:44:59.300
but it helped a lot of things he was dealing with,
link |
01:45:01.180
joint pain, sleep issues.
link |
01:45:03.860
And then it just became a very slow decline
link |
01:45:08.860
to the point where two, three weeks ago,
link |
01:45:11.360
he had a closet full of medication.
link |
01:45:15.300
I mean, this dog was, it was like a pharmacy.
link |
01:45:17.340
It's amazing to me when I looked at it the other day,
link |
01:45:19.860
I still haven't cleaned up and removed all his things
link |
01:45:22.080
because I can't quite bring myself to do it.
link |
01:45:23.840
But-
link |
01:45:25.860
Do you think he was suffering?
link |
01:45:27.460
Well, so what happened was about a week ago,
link |
01:45:30.300
it was really just about a week ago, it's amazing.
link |
01:45:32.500
He was going up the stairs, I saw him slip.
link |
01:45:35.020
And he was a big dog.
link |
01:45:35.860
He wasn't 200 pounds, but he was about 90 pounds.
link |
01:45:37.740
But he's a bulldog, that's pretty big.
link |
01:45:39.100
And he was fit.
link |
01:45:41.200
And then I noticed that he was carrying a foot
link |
01:45:43.920
in the back like it was injured.
link |
01:45:45.140
It had no feeling at all.
link |
01:45:46.400
He never liked me to touch his hind paws.
link |
01:45:48.220
And now I could; that thing was just flopping there.
link |
01:45:50.340
And then the vet found some spinal degeneration
link |
01:45:53.840
and I was told that the next one would go.
link |
01:45:55.720
Did he suffer?
link |
01:45:57.300
Sure hope not.
link |
01:45:58.620
But something changed in his eyes.
link |
01:46:01.420
Yeah.
link |
01:46:02.260
It's the eyes again.
link |
01:46:03.080
I know you and I spend long hours on the phone
link |
01:46:05.120
and talking about like the eyes and how,
link |
01:46:06.620
what they convey and what they mean about internal states
link |
01:46:09.180
and forsaken robots and biology of other kinds.
link |
01:46:12.140
But-
link |
01:46:12.980
Do you think something about him was gone in his eyes?
link |
01:46:17.880
I think he was real.
link |
01:46:20.620
Here I am anthropomorphizing.
link |
01:46:22.860
I think he was realizing that one of his great joys in life,
link |
01:46:26.840
which was to walk and sniff and pee on things.
link |
01:46:33.140
This dog loved to pee on things.
link |
01:46:36.420
It was amazing.
link |
01:46:37.260
I've wondered where he put it.
link |
01:46:38.900
He was like a reservoir of urine.
link |
01:46:41.060
It was incredible.
link |
01:46:42.300
I think, oh, that's it.
link |
01:46:43.140
He's just, he'd put like one drop on each plant.
link |
01:46:46.560
And then we'd get to the 50 millionth plant
link |
01:46:49.140
and he'd just, you know, leave a puddle.
link |
01:46:51.220
And here I am talking about Costello peeing.
link |
01:46:54.500
He was losing that ability to stand up and do that.
link |
01:46:57.020
He was falling down while he was doing that.
link |
01:46:58.880
And I do think he started to realize,
link |
01:47:01.560
and the passage was easy and peaceful,
link |
01:47:04.560
but you know, I'll say this.
link |
01:47:08.120
I'm not ashamed to say it.
link |
01:47:09.140
I mean, I wake up every morning since then just,
link |
01:47:11.480
I don't even make the conscious decision
link |
01:47:13.240
to allow myself to cry.
link |
01:47:14.460
I wake up crying.
link |
01:47:16.060
And I'm fortunately able to make it through the day,
link |
01:47:18.160
thanks to the great support of my friends
link |
01:47:19.980
and you and my family.
link |
01:47:21.420
But I miss him, man.
link |
01:47:24.080
You miss him?
link |
01:47:24.920
Yeah, I miss him.
link |
01:47:25.740
And I feel like he, you know, Homer, Costello,
link |
01:47:29.180
you know, the relationship to one's dog is so specific, but.
link |
01:47:34.220
So that part of you is gone.
link |
01:47:37.940
That's the hard thing.
link |
01:47:39.500
You know,
link |
01:47:40.340
what I think is different is that I made the mistake,
link |
01:47:49.620
I think.
link |
01:47:50.620
I hope it was a good decision,
link |
01:47:51.800
but sometimes I think I made the mistake
link |
01:47:53.260
of I brought Costello a little bit to the world
link |
01:47:56.480
through the podcast, through posting about him.
link |
01:47:58.420
I gave, I anthropomorphized about him in public.
link |
01:48:01.540
Let's be honest.
link |
01:48:02.380
I have no idea what his mental life was
link |
01:48:03.700
or his relationship to me.
link |
01:48:04.840
And I'm just exploring all this for the first time
link |
01:48:06.780
because he was my first dog,
link |
01:48:07.760
but I raised him since he was seven weeks.
link |
01:48:09.660
Yeah, you got to hold it together.
link |
01:48:11.020
I noticed the episode you released on Monday,
link |
01:48:14.940
you mentioned Costello.
link |
01:48:16.560
Like you brought him back to life for me
link |
01:48:18.900
for that brief moment.
link |
01:48:20.340
Yeah, but he's gone.
link |
01:48:22.100
Well, that's the,
link |
01:48:24.620
he's going to be gone for a lot of people too.
link |
01:48:28.500
Well, this is what I'm struggling with.
link |
01:48:29.820
I think that maybe you're pretty good at this, Lex.
link |
01:48:34.460
Wait, have you done this before?
link |
01:48:36.100
This is the challenge is I actually, part of me,
link |
01:48:40.560
I know how to take care of myself pretty well.
link |
01:48:43.080
Not perfectly, but pretty well.
link |
01:48:44.720
And I have good support.
link |
01:48:46.000
I do worry a little bit about how it's going to land
link |
01:48:48.780
and how people will feel.
link |
01:48:50.200
I'm concerned about their internalization.
link |
01:48:54.040
So that's something I'm still iterating on.
link |
01:48:56.240
And you have to, they have to watch you struggle,
link |
01:48:58.360
which is fascinating.
link |
01:48:59.360
Right, and I've mostly been shielding them from this,
link |
01:49:01.620
but what would make me happiest
link |
01:49:03.960
is if people would internalize some of Costello's best traits
link |
01:49:08.200
and his best traits were that he was incredibly tough.
link |
01:49:14.220
I mean, he had a 22-inch neck, bulldog, the whole thing.
link |
01:49:17.680
He was just born that way.
link |
01:49:18.960
But what was so beautiful is that his toughness
link |
01:49:22.340
was never what he rolled forward.
link |
01:49:24.040
It was just how sweet and kind he was.
link |
01:49:27.100
And so if people can take that,
link |
01:49:29.140
then there's a win in there someplace.
link |
01:49:32.400
So I think there's some ways in which
link |
01:49:34.880
he should probably live on in your podcast too.
link |
01:49:37.800
You should, I mean, it's such a,
link |
01:49:41.780
one of the things I loved about his role in your podcast
link |
01:49:45.800
is that he brought so much joy to you.
link |
01:49:48.740
I mentioned the robots, right?
link |
01:49:51.920
I think that's such a powerful thing to bring that joy
link |
01:49:55.960
into like allowing yourself to experience that joy,
link |
01:49:59.400
to bring that joy to others, to share it with others.
link |
01:50:02.520
That's really powerful.
link |
01:50:03.820
And I mean, this is like the Russian thing,
link |
01:50:10.000
it touched me when Louis CK had that moment
link |
01:50:14.040
that I keep thinking about in his show, Louis,
link |
01:50:17.960
where like an old man was criticizing Louis
link |
01:50:20.240
for whining about breaking up with his girlfriend.
link |
01:50:22.960
And he was saying like the most beautiful thing
link |
01:50:27.080
about love, and they made a song that's catchy now
link |
01:50:32.480
so it's not making me feel horrible saying it,
link |
01:50:35.660
but like, is the loss.
link |
01:50:37.700
The loss really also is making you realize
link |
01:50:41.220
how much that person, that dog meant to you.
link |
01:50:46.360
And like allowing yourself to feel that loss
link |
01:50:48.680
and not run away from that loss is really powerful.
link |
01:50:51.420
And in some ways that's also sweet, just like the love was,
link |
01:50:56.160
the loss is also sweet because you know
link |
01:50:59.440
that you felt a lot for that, for your friend.
link |
01:51:03.760
So I, you know, and then continue bringing that joy.
link |
01:51:07.120
I think it would be amazing for the podcast.
link |
01:51:10.280
I hope to do the same with robots
link |
01:51:13.680
or whatever else is the source of joy, right?
link |
01:51:17.720
And maybe, do you think about one day getting another dog?
link |
01:51:22.440
Yeah, in time, you're hitting on all the key buttons here.
link |
01:51:28.560
I want that to, we're thinking about, you know,
link |
01:51:32.720
ways to kind of immortalize Costello in a way that's real,
link |
01:51:35.620
not just, you know, creating some little logo
link |
01:51:38.760
or something silly.
link |
01:51:39.760
You know, Costello, much like David Goggins is a person,
link |
01:51:43.840
but Goggins also has grown into a kind of a verb.
link |
01:51:47.120
You're going to Goggins this or you're going to,
link |
01:51:48.820
and there's an adjective, like, that's extreme.
link |
01:51:52.400
I think that for me, Costello was all those things.
link |
01:51:54.440
He was a being, he was his own being.
link |
01:51:56.800
He was a noun, a verb, and an adjective.
link |
01:52:00.360
So, and he had this amazing superpower
link |
01:52:02.420
that I wish I could get, which is this ability
link |
01:52:04.240
to get everyone else to do things for you
link |
01:52:06.020
without doing a damn thing.
link |
01:52:08.680
The Costello effect, as I call it.
link |
01:52:10.280
So it's an idea, I hope he lives on.
link |
01:52:12.240
Yes, thank you for that.
link |
01:52:14.440
This actually has been very therapeutic for me.
link |
01:52:16.840
Which actually brings me to a question, we're friends.
link |
01:52:23.220
We're not just co-scientists, colleagues,
link |
01:52:26.700
working on a project together,
link |
01:52:28.160
and in the world that's somewhat similar.
link |
01:52:34.760
Just two dogs, just two dogs, basically.
link |
01:52:40.540
But let's talk about friendship.
link |
01:52:42.760
Because I think that, I certainly know as a scientist
link |
01:52:49.140
that there are elements that are very lonely
link |
01:52:51.100
of the scientific pursuit.
link |
01:52:52.720
There are elements of many pursuits that are lonely.
link |
01:52:57.000
Musicians, mathematicians, always seemed to me
link |
01:53:00.080
like they're like the loneliest people.
link |
01:53:02.400
Who knows if that's true or not.
link |
01:53:04.040
Also people work in teams,
link |
01:53:05.360
and sometimes people are surrounded by people
link |
01:53:07.000
interacting with people and they feel very lonely.
link |
01:53:09.280
But for me, and I think as well for you,
link |
01:53:14.420
friendship is an incredibly strong force
link |
01:53:17.940
in making one feel like certain things are possible
link |
01:53:23.520
or worth reaching for.
link |
01:53:25.800
Maybe even making us compulsively reach for them.
link |
01:53:28.200
So when you were growing up,
link |
01:53:30.680
you grew up in Russia until what age?
link |
01:53:32.800
13.
link |
01:53:33.680
Okay, and then you moved directly to Philadelphia?
link |
01:53:38.280
To Chicago.
link |
01:53:39.320
Chicago.
link |
01:53:40.160
And then Philadelphia and San Francisco and Boston
link |
01:53:44.640
and so on, but really to Chicago.
link |
01:53:46.940
That's where I went to high school.
link |
01:53:48.260
Do you have siblings?
link |
01:53:49.840
Older brother.
link |
01:53:51.080
Most people don't know that.
link |
01:53:53.840
Yeah, he is a very different person,
link |
01:53:57.880
but somebody I definitely look up to.
link |
01:53:59.600
So he's a wild man.
link |
01:54:00.780
He's extrovert.
link |
01:54:01.800
He's, he was into, I mean,
link |
01:54:05.400
so he's also a scientist, a bioengineer,
link |
01:54:07.540
but when we were growing up, he was the person
link |
01:54:12.600
who did drink and did every drug,
link |
01:54:17.140
but also was the life of the party.
link |
01:54:19.360
And I just thought he was the,
link |
01:54:21.080
when you're the older brother, five years older,
link |
01:54:23.440
he was the coolest person; I always wanted to be him.
link |
01:54:28.300
So to that, he definitely had a big influence.
link |
01:54:31.180
But I think for me, in terms of friendship growing up,
link |
01:54:35.620
I had one really close friend.
link |
01:54:40.060
And then when I came here, I had another close friend,
link |
01:54:42.280
but I'm very, I believe, I don't know if I believe,
link |
01:54:47.520
but I draw a lot of strength from deep connections
link |
01:54:53.020
with other people and just a small number of people,
link |
01:54:57.740
just a really small number of people.
link |
01:54:59.080
When I moved to this country,
link |
01:55:00.480
I was really surprised how like there would be
link |
01:55:04.160
these large groups of friends, quote unquote,
link |
01:55:08.280
but the depth of connection was not there at all
link |
01:55:12.240
from my sort of perspective.
link |
01:55:14.280
Now, the suburb of Chicago I moved to was Naperville.
link |
01:55:17.700
It's more like a middle-class, maybe upper middle-class.
link |
01:55:20.960
So it's like people that cared more
link |
01:55:23.520
about material possessions than deep human connection.
link |
01:55:26.480
So that added to the thing.
link |
01:55:28.400
But I drew more meaning than almost anything else
link |
01:55:33.400
from friendship early on.
link |
01:55:35.640
I had a best friend, his name was, his name is Yura.
link |
01:55:41.540
I don't know how to say it in English.
link |
01:55:43.480
How do you say it in Russian?
link |
01:55:44.680
Yura.
link |
01:55:45.520
What's his last name?
link |
01:55:46.440
Do you remember?
link |
01:55:48.640
Mirkulov, Yura Mirkulov.
link |
01:55:53.720
So we just spent all our time together.
link |
01:55:56.320
There's also a group of friends.
link |
01:55:58.480
Like, I don't know, it's like eight guys.
link |
01:56:01.680
In Russia, growing up, it's like parents didn't care
link |
01:56:07.480
if you're coming back at a certain hour.
link |
01:56:09.680
So we would spend all day, all night,
link |
01:56:12.280
just playing soccer, usually called football
link |
01:56:15.840
and just talking about life and all those kinds of things.
link |
01:56:18.900
Even at that young age, I think people in Russia
link |
01:56:22.680
and the Soviet Union grow up much quicker.
link |
01:56:26.980
I think the education system at the university level
link |
01:56:30.920
is world-class in the United States
link |
01:56:33.720
in terms of really creating really big, powerful minds,
link |
01:56:38.960
at least it used to be, but I think that they aspire to that.
link |
01:56:42.040
But the education system for younger kids
link |
01:56:46.380
in the Soviet Union was incredible.
link |
01:56:49.280
They did not treat us as kids.
link |
01:56:51.240
The level of literature, Tolstoy, Dostoyevsky.
link |
01:56:54.560
When you were just a small child?
link |
01:56:55.960
Yeah.
link |
01:56:56.800
Amazing, amazing.
link |
01:56:58.560
The level of mathematics and you're made to feel like shit
link |
01:57:02.080
if you're not good at mathematics.
link |
01:57:03.780
Like we, I think in this country, there's more,
link |
01:57:07.160
like especially young kids, cause they're so cute.
link |
01:57:09.700
Like they're being babied.
link |
01:57:12.320
We only start to really push adults later in life.
link |
01:57:15.240
Like, so if you want to be the best in the world at this,
link |
01:57:17.860
then you get to be pushed.
link |
01:57:19.480
But we were pushed at a young age, everybody was pushed.
link |
01:57:22.880
And that brought out the best in people.
link |
01:57:25.020
I think they really forced people to discover like,
link |
01:57:29.600
discover themselves in the Goggins style,
link |
01:57:31.840
but also discover what they're actually passionate about,
link |
01:57:35.160
what they're not.
link |
01:57:36.000
Was it true for boys and girls
link |
01:57:37.440
were they pushed equally there?
link |
01:57:38.960
Yeah, they were pushed.
link |
01:57:39.960
Yeah, they were pushed equally, I would say.
link |
01:57:41.960
There was, obviously there was more, not obviously,
link |
01:57:45.560
but there, at least from my memories, more of a,
link |
01:57:50.360
what's the right way to put it,
link |
01:57:52.040
but there was like gender roles,
link |
01:57:54.000
but not in a negative connotation.
link |
01:57:56.280
It was the red dress versus the suit and tie
link |
01:57:59.480
kind of connotation, which is like,
link |
01:58:01.560
there's like guys like lifting heavy things
link |
01:58:08.040
and girls like creating beautiful art.
link |
01:58:11.520
And like there's-
link |
01:58:14.200
A more traditional view of gender, more 1950s, 60s.
link |
01:58:18.200
But we didn't think in terms of, at least at that age,
link |
01:58:20.400
in terms of like roles and then like a homemaker or something
link |
01:58:24.120
like that, or no, it was more about what people care about.
link |
01:58:28.220
Like girls cared about this set of things
link |
01:58:31.360
and guys cared about this set of things.
link |
01:58:33.240
I think mathematics and engineering was something
link |
01:58:36.520
that guys cared about and sort of,
link |
01:58:38.560
at least my perception of that time.
link |
01:58:40.480
And then girls cared about beauty.
link |
01:58:43.980
So like guys want to create machines,
link |
01:58:46.060
girls want to create beautiful stuff.
link |
01:58:47.760
And now, of course, I don't take that forward
link |
01:58:52.760
in some kind of philosophy of life,
link |
01:58:54.820
but it's just the way I grew up and the way I remember it.
link |
01:58:57.520
But all, everyone worked hard.
link |
01:59:01.600
The value of hard work was instilled in everybody.
link |
01:59:06.520
And through that, I think it's a little bit of hardship.
link |
01:59:12.440
Of course, also economically, everybody was poor,
link |
01:59:14.960
especially with the collapse of the Soviet Union.
link |
01:59:17.100
There's poverty everywhere.
link |
01:59:18.620
You didn't notice it as much,
link |
01:59:19.800
but because there weren't many material
link |
01:59:23.120
possessions, there was a huge value placed
link |
01:59:26.220
on human connection.
link |
01:59:28.040
Just meeting with neighbors, everybody knew each other.
link |
01:59:31.200
We lived in an apartment building,
link |
01:59:33.600
very different than you have in the United States
link |
01:59:35.600
these days, everybody knew each other.
link |
01:59:39.000
You would get together, drink vodka, smoke cigarettes
link |
01:59:41.640
and play guitar and sing sad songs about life.
link |
01:59:47.280
What's with the sad songs and the Russian thing?
link |
01:59:50.600
I mean, Russians do express joy from time to time.
link |
01:59:55.020
They do.
link |
01:59:55.920
Certainly you do.
link |
01:59:57.560
But what do you think that's about?
link |
02:00:00.240
Is it because it's cold there?
link |
02:00:01.320
But it's cold other places too.
link |
02:00:04.460
I think, so first of all, the Soviet Union,
link |
02:00:08.340
the echoes of World War II and the millions
link |
02:00:13.040
and millions and millions of people there,
link |
02:00:14.660
civilians that were slaughtered
link |
02:00:17.020
and also starvation is there, right?
link |
02:00:20.180
So like the echoes of that, of the ideas,
link |
02:00:24.040
the literature, the art is there.
link |
02:00:25.900
Like that's grandparents, that's parents, that's all there.
link |
02:00:29.340
So that contributes to it, that life can be absurdly,
link |
02:00:33.780
unexplainably cruel.
link |
02:00:35.580
At any moment, everything can change.
link |
02:00:37.380
So that's in there.
link |
02:00:38.600
Then I think there's an empowering aspect
link |
02:00:40.920
to finding beauty and suffering,
link |
02:00:43.220
but then everything else is beautiful too.
link |
02:00:45.480
Like if you just linger or it's like
link |
02:00:47.660
why you meditate on death.
link |
02:00:49.260
It's like if you just think about the worst possible case
link |
02:00:52.060
and find beauty in that, then everything else is beautiful too.
link |
02:00:54.680
And so you write songs about the dark stuff
link |
02:00:57.400
and that somehow helps you deal with whatever comes.
link |
02:01:02.660
There's a hopelessness to the Soviet Union
link |
02:01:07.220
that like inflation, all those kinds of things
link |
02:01:10.280
where people were sold dreams and never delivered.
link |
02:01:15.420
And so like, if you don't sing songs about sad things,
link |
02:01:21.380
you're going to become cynical about this world.
link |
02:01:24.300
Interesting.
link |
02:01:25.140
So they don't want to give into cynicism.
link |
02:01:27.340
Now, a lot of people did, one of the,
link |
02:01:31.500
but it's the battle against cynicism.
link |
02:01:34.240
One of the things that may be common in Russia
link |
02:01:38.440
is the kind of cynicism about,
link |
02:01:41.020
like if I told you the thing I said earlier
link |
02:01:43.180
about dreaming about robots,
link |
02:01:45.020
it's very common for people to dismiss that dream
link |
02:01:48.800
of saying, nah, that's not, that's too wild.
link |
02:01:52.180
Like who else do you know that did that?
link |
02:01:54.320
Or you want to start a podcast, like who else?
link |
02:01:57.020
Like nobody's making money on podcasts.
link |
02:01:58.780
Like why do you want to start a podcast?
link |
02:02:00.020
That kind of mindset I think is quite common,
link |
02:02:03.360
which is why I would say entrepreneurship in Russia
link |
02:02:07.520
is still not very good, because to be in business,
link |
02:02:11.020
like to be an entrepreneur, you have to dream big
link |
02:02:13.420
and you have to have others around you,
link |
02:02:14.980
like friends and support group that makes you dream big.
link |
02:02:19.080
But if you don't give into cynicism
link |
02:02:22.340
and appreciate the beauty in the unfairness of life,
link |
02:02:27.780
the absurd unfairness of life,
link |
02:02:29.940
then I think it just makes you appreciative of everything.
link |
02:02:34.500
It's like a, it's a prerequisite for gratitude.
link |
02:02:38.020
And so, yeah, I think that instilled in me
link |
02:02:42.060
the ability to appreciate everything,
link |
02:02:43.980
just like everything, everything's amazing.
link |
02:02:46.740
And then also there's a culture
link |
02:02:51.500
of like romanticizing everything.
link |
02:02:56.100
Like it's almost like romantic relationships
link |
02:03:01.340
were very like soap opera like,
link |
02:03:04.280
is very like over the top dramatic.
link |
02:03:07.700
And I think that was instilled in me too.
link |
02:03:10.900
Not only do I appreciate everything about life,
link |
02:03:13.620
but I get like emotional about it.
link |
02:03:15.940
In a sense, like I get like a visceral feeling of joy
link |
02:03:20.600
for everything and the same with friends
link |
02:03:24.500
or people of the opposite sex.
link |
02:03:26.100
Like there's a deep, like emotional connection there
link |
02:03:30.700
that like, that's like way too dramatic to like,
link |
02:03:35.300
I guess relative to what the actual moment is.
link |
02:03:38.760
But I derive so much deep, like dramatic joy
link |
02:03:45.240
from so many things in life.
link |
02:03:46.620
And I think I would attribute that to growing up in Russia.
link |
02:03:50.340
But the thing that sticks most of all is the friendship
link |
02:03:54.000
and I have since then had one other friend like that
link |
02:03:59.300
in the United States, he lives in Chicago.
link |
02:04:02.620
His name is Matt.
link |
02:04:04.300
And slowly here and there accumulating
link |
02:04:08.280
really fascinating people,
link |
02:04:09.840
but I'm very selective with that.
link |
02:04:11.540
Funny enough, the few times,
link |
02:04:16.140
it's not few, it's a lot of times now
link |
02:04:17.940
interacting with Joe Rogan.
link |
02:04:19.900
It sounds surreal to say,
link |
02:04:21.840
but there was a kindred spirit there.
link |
02:04:24.160
Like I've connected with him.
link |
02:04:26.840
And there's been people like that
link |
02:04:27.980
also in the grappling sports
link |
02:04:29.860
that I've really connected with.
link |
02:04:31.600
I've actually struggled,
link |
02:04:33.380
which is why I'm so glad to be your friend
link |
02:04:36.900
is I've struggled to connect with scientists.
link |
02:04:40.260
Like-
link |
02:04:41.100
They can be a little bit wooden sometimes.
link |
02:04:42.580
Yeah.
link |
02:04:43.420
Even the biologists.
link |
02:04:44.240
I mean, one thing that I-
link |
02:04:45.460
Even the biologists.
link |
02:04:47.000
Well, I'm so struck by the fact that you work with robots,
link |
02:04:50.100
you're an engineer, AI.
link |
02:04:51.740
Science, technology.
link |
02:04:52.940
And that all sounds like hardware, right?
link |
02:04:55.320
But what you're describing,
link |
02:04:56.460
and I know is true about you,
link |
02:04:58.260
is this deep emotional life and this resonance.
link |
02:05:01.660
And it's really wonderful.
link |
02:05:02.700
I actually think it's one of the reasons why
link |
02:05:05.200
so many people, scientists and otherwise,
link |
02:05:07.700
have gravitated towards you and your podcast
link |
02:05:09.900
is because you hold both elements.
link |
02:05:12.860
In Hermann Hesse's book,
link |
02:05:14.240
I don't know if you've read it, Narcissus and Goldmund, right?
link |
02:05:16.100
It's about these elements of the logical, rational mind
link |
02:05:19.400
and the emotional mind
link |
02:05:21.460
and how those are woven together.
link |
02:05:22.940
And if people haven't read it, they should.
link |
02:05:24.860
And you embody the full picture.
link |
02:05:27.380
And I think that's so much of what draws people to you.
link |
02:05:30.020
I've read every Hermann Hesse book, by the way.
link |
02:05:31.940
As usual, I've done about 9% of what Lex has done.
link |
02:05:36.980
No, it's true.
link |
02:05:38.560
You mentioned Joe, who is a phenomenal human being,
link |
02:05:42.340
not just for his amazing accomplishments,
link |
02:05:44.260
but for how he shows up to the world one-on-one.
link |
02:05:48.500
I think I heard him say the other day on an interview,
link |
02:05:51.620
he said, there is no public or private version of him.
link |
02:05:55.660
He's like, this is me.
link |
02:05:56.980
He said the word, it was beautiful.
link |
02:05:58.340
He said, I'm like the fish that got through the net.
link |
02:06:00.940
There is no on-stage, off-stage version.
link |
02:06:03.580
And you're absolutely right.
link |
02:06:04.580
And I, so, well, you guys, I have a question actually about-
link |
02:06:09.820
But that's a really good point
link |
02:06:10.860
about public and private life.
link |
02:06:12.620
He was a huge, if I could just comment real quick.
link |
02:06:15.620
Like that, he was, I've been a fan of Joe for a long time,
link |
02:06:18.560
but he's been an inspiration
link |
02:06:20.620
to not have any difference between public and private life.
link |
02:06:25.060
I actually had a conversation with Naval about this.
link |
02:06:28.180
And he said that you can't have a rich life,
link |
02:06:34.260
like an exciting life
link |
02:06:36.620
if you're the same person publicly and privately.
link |
02:06:39.340
And I think I understand that idea,
link |
02:06:42.420
but I don't agree with it.
link |
02:06:45.380
I think it's really fulfilling and exciting
link |
02:06:49.120
to be the same person privately and publicly,
link |
02:06:51.340
with very few exceptions.
link |
02:06:52.500
Now, that said, I don't have any really strange sex kinks.
link |
02:06:58.040
So like, I feel like I can be open with basically everything.
link |
02:07:00.420
I don't have anything I'm ashamed of.
link |
02:07:03.540
There's some things that could be perceived poorly,
link |
02:07:05.660
like the screaming Roombas, but I'm not ashamed of them.
link |
02:07:09.040
I just have to present them in the right context.
link |
02:07:11.420
But there's freedom to being the same person in private
link |
02:07:15.600
as in public.
link |
02:07:16.620
And that Joe made me realize that you can be that
link |
02:07:22.460
and also to be kind to others.
link |
02:07:25.220
It sounds kind of absurd,
link |
02:07:28.620
but I really always enjoyed like being good to others.
link |
02:07:38.420
Like just being kind towards others.
link |
02:07:41.100
But I always felt like the world didn't want me to be.
link |
02:07:45.380
Like there's so much negativity when I was growing up,
link |
02:07:48.500
like just around people.
link |
02:07:49.700
If you actually just notice how people talk,
link |
02:07:54.980
from like complaining about the weather,
link |
02:07:57.460
this could be just like the big cities that I visited in,
link |
02:07:59.620
but there's a general negativity
link |
02:08:02.020
and positivity is kind of suppressed.
link |
02:08:05.860
One, you're not seen as very intelligent.
link |
02:08:08.620
And two, you're seen as like a little bit of a weirdo.
link |
02:08:13.220
And so I always felt like I had to hide that.
link |
02:08:15.740
And what Joe made me realize,
link |
02:08:17.100
one, I could be fully just the same person,
link |
02:08:21.220
private and public.
link |
02:08:22.220
And two, I can embrace being kind
link |
02:08:25.220
and just in the way that I like,
link |
02:08:28.220
in the way I know how to do.
link |
02:08:31.220
And sort of for me on like on Twitter
link |
02:08:35.140
or like publicly, whenever I say stuff,
link |
02:08:37.500
that means saying stuff simply,
link |
02:08:39.620
almost to the point of cliche.
link |
02:08:41.220
And like, I have the strength now to say it,
link |
02:08:44.540
even if I'm being mocked, you know what I mean?
link |
02:08:47.140
Like just, it's okay.
link |
02:08:48.680
Everything's going to be okay.
link |
02:08:50.380
Okay, some people will think you're dumb.
link |
02:08:52.520
They're probably right.
link |
02:08:53.520
The point is like, it's just enjoy being yourself.
link |
02:08:56.260
And Joe more than almost anybody else,
link |
02:08:58.300
because he's so successful at it, inspired me to do that.
link |
02:09:03.060
Be kind and be the same person, private and public.
link |
02:09:06.200
I love it, and I love the idea that authenticity
link |
02:09:08.660
doesn't have to be oversharing, right?
link |
02:09:11.520
That it doesn't mean you reveal every detail of your life,
link |
02:09:14.460
what, you know, it's a way of being true
link |
02:09:17.220
to an essence of oneself.
link |
02:09:19.000
Right, there's never a feeling
link |
02:09:22.300
when you deeply think and introspect
link |
02:09:24.420
that you're hiding something from the world
link |
02:09:26.100
or you're being dishonest in some fundamental way.
link |
02:09:28.740
So yeah, that's truly liberating.
link |
02:09:33.480
It allows you to think, it allows you to,
link |
02:09:36.060
like think freely, to speak freely,
link |
02:09:38.680
to just be, freely.
link |
02:09:42.100
That said, it's not like, you know,
link |
02:09:47.060
it's not like there's not still a responsibility
link |
02:09:49.500
to be the best version of yourself.
link |
02:09:52.120
So, you know, I'm very careful with the way I say something.
link |
02:09:57.180
So the whole point, it's not so simple
link |
02:10:00.680
to express the spirit that's inside you with words.
link |
02:10:04.820
It depends, I mean, some people are much better
link |
02:10:06.500
than others.
link |
02:10:08.740
I struggle, like oftentimes when I say something
link |
02:10:12.740
and I hear myself say it, it sounds really dumb
link |
02:10:15.240
and not at all what I meant.
link |
02:10:16.700
So that's the responsibility you have.
link |
02:10:18.400
It's not just like being the same person publicly
link |
02:10:21.380
and privately means you can just say whatever the hell.
link |
02:10:24.460
It means there's still a responsibility to try to be,
link |
02:10:27.640
to express who you truly are.
link |
02:10:29.300
And that's hard.
link |
02:10:32.740
It is hard and I think that, you know, we have this pressure,
link |
02:10:37.640
all people, when I say we, I mean all humans,
link |
02:10:40.900
and maybe robots too, feel this pressure
link |
02:10:44.480
to be able to express ourselves in that one moment,
link |
02:10:47.580
in that one form.
link |
02:10:48.900
And it is beautiful when somebody, for instance,
link |
02:10:51.260
can capture some essence of love or sadness
link |
02:10:53.740
or anger or something in a song or in a poem
link |
02:10:57.160
or in a short quote.
link |
02:10:58.740
But perhaps it's also possible to do it in aggregate.
link |
02:11:02.740
You know, all the things, you know, how you show up.
link |
02:11:06.080
For instance, one of the things that initially drew me
link |
02:11:08.680
to want to get to know you as a human being
link |
02:11:10.620
and a scientist and eventually we became friends
link |
02:11:13.700
was the level of respect that you brought
link |
02:11:16.580
to your podcast listeners by wearing a suit.
link |
02:11:19.620
I'm being serious here.
link |
02:11:20.620
You know, I was raised thinking that if you overdress
link |
02:11:23.460
a little bit,
link |
02:11:25.940
certainly by American standards,
link |
02:11:27.260
you're overdressed for a podcast,
link |
02:11:28.640
but it's genuine.
link |
02:11:30.540
You're not doing it for any reason,
link |
02:11:31.820
except I have to assume, and I assumed at the time,
link |
02:11:35.060
that it was because you have a respect for your audience.
link |
02:11:37.940
You respect them enough to show up a certain way for them.
link |
02:11:42.400
It's for you also, but it's for them.
link |
02:11:44.420
And I think between that and your commitment
link |
02:11:47.060
to your friendships, the way that you talk about friendships
link |
02:11:49.540
and love and the way you hold up these higher ideals,
link |
02:11:52.820
I think at least as a consumer of your content
link |
02:11:56.620
and as your friend, what I find is that in aggregate,
link |
02:12:01.760
you're communicating who you are.
link |
02:12:03.360
It doesn't have to be one quote or something.
link |
02:12:05.840
And I think that we were sort of obsessed
link |
02:12:08.160
by like the one Einstein quote
link |
02:12:10.120
or the one line of poetry or something,
link |
02:12:13.000
but I think you so embody the way that, and Joe as well,
link |
02:12:18.280
it's about how you live your life and how you show up
link |
02:12:21.080
as a collection of things said and done.
link |
02:12:24.880
Yeah, that's fascinating, so the aggregate is the goal.
link |
02:12:28.000
The tricky thing, and Jordan Peterson talks about this
link |
02:12:32.200
because he's under attack way more than you and I
link |
02:12:34.400
will ever be, but that-
link |
02:12:36.200
For now.
link |
02:12:37.200
For now, right?
link |
02:12:38.500
This is very true for now.
link |
02:12:40.400
That the people who attack on the internet,
link |
02:12:46.940
this is one of the problems with Twitter,
link |
02:12:49.080
is they don't consider the aggregate.
link |
02:12:53.160
They take single statements.
link |
02:12:55.320
And so one of the defense mechanisms,
link |
02:12:58.440
like again, why Joe has been an inspiration
link |
02:13:01.160
is that when you in aggregate are a good person,
link |
02:13:05.540
a lot of people will know that.
link |
02:13:07.040
And so that makes you much more immune
link |
02:13:08.940
to the attacks of people
link |
02:13:10.000
that bring out an individual statement
link |
02:13:11.900
that might be a misstatement of some kind
link |
02:13:14.240
or doesn't express who you are.
link |
02:13:16.840
And so, I like that idea of the aggregate,
link |
02:13:20.940
and the power of the podcast is you have hundreds
link |
02:13:25.720
of hours out there, being yourself,
link |
02:13:28.500
and people get to know who you are.
link |
02:13:30.200
And once they do and you post pictures
link |
02:13:33.520
of screaming Roombas as you kick them,
link |
02:13:36.520
they will understand that you don't mean harm.
link |
02:13:38.120
By the way, as a side comment,
link |
02:13:41.280
I don't know if I want to release this
link |
02:13:42.920
because it's not just the Roombas.
link |
02:13:46.760
You have a whole dungeon of robots.
link |
02:13:48.480
Okay, so this is a problem.
link |
02:13:51.840
Boston Dynamics came up against this problem,
link |
02:13:54.440
but let me just, let me work this out,
link |
02:13:57.240
like workshop this out with you.
link |
02:13:59.840
And maybe because we'll post this,
link |
02:14:02.520
people will let me know.
link |
02:14:05.440
So there's legged robots.
link |
02:14:07.580
They look like a dog.
link |
02:14:08.800
They have a very,
link |
02:14:09.640
I'm trying to create a very real human-robot connection,
link |
02:14:13.980
but like they're also incredible
link |
02:14:15.400
because you can throw them like off of a building
link |
02:14:19.480
and it'll land fine.
link |
02:14:21.400
And it's beautiful.
link |
02:14:22.380
That's amazing.
link |
02:14:23.220
I've seen the Instagram videos of like cats
link |
02:14:25.320
jumping off of like fifth story buildings
link |
02:14:27.280
and then walking away.
link |
02:14:29.080
No one should throw their cat out of a window.
link |
02:14:30.960
This is the problem I'm experiencing.
link |
02:14:32.520
I was certainly kicking the robots.
link |
02:14:34.680
It's really fascinating how they recover from those kicks,
link |
02:14:38.000
but like just seeing myself do it
link |
02:14:40.440
and also seeing others do it, it just does not look good.
link |
02:14:43.360
And I don't know what to do with that.
link |
02:14:44.920
Cause I, it's such a-
link |
02:14:46.280
I'll do it.
link |
02:14:49.400
See, but you don't, I, you, cause you-
link |
02:14:52.920
Robot, no, I'm kidding.
link |
02:14:54.040
Now I'm, you know what's interesting?
link |
02:14:55.960
Yeah.
link |
02:14:56.800
Before today's conversation, I probably could do it.
link |
02:14:59.980
And now I'm thinking about robots'
link |
02:15:03.040
bills of rights and things, actually.
link |
02:15:04.600
And not to satisfy you
link |
02:15:07.400
or to satisfy anything, except that
link |
02:15:10.000
if they have some sentient aspect to their being,
link |
02:15:14.120
then I would be loath to kick it.
link |
02:15:16.360
I don't think you'd be able to kick it.
link |
02:15:17.660
You might be able to do it the first time, but not the second.
link |
02:15:20.120
This is, this is the problem I've experienced.
link |
02:15:21.840
One of the cool things is one of the robots I'm,
link |
02:15:25.760
I'm working with,
link |
02:15:26.600
you can pick it up by one leg and it's dangling.
link |
02:15:29.340
You can throw it in any kind of way
link |
02:15:31.540
and it'll land correctly.
link |
02:15:33.680
So it's really-
link |
02:15:34.500
I had a friend who had a cat like that.
link |
02:15:35.720
Oh man, we look forward to the letters from the cat.
link |
02:15:40.560
Oh no, I'm not suggesting anyone did that,
link |
02:15:42.080
but he had this cat and the cat, he would just, you know,
link |
02:15:45.320
throw it onto the bed from across the room
link |
02:15:46.800
and then it would run back for more.
link |
02:15:48.840
Somehow they had, that was the nature of the relationship.
link |
02:15:51.580
I think most, no one should do that to an animal,
link |
02:15:54.080
but this cat seemed to, you know,
link |
02:15:56.560
return for it for whatever reason.
link |
02:15:58.000
The robot is a robot and it's fascinating to me
link |
02:16:00.100
how hard it is for me to do that.
link |
02:16:02.240
So it's unfortunate,
link |
02:16:04.000
but I don't think I can do that to a robot.
link |
02:16:06.240
Like I struggle with that.
link |
02:16:08.240
So for me to be able to do that with a robot,
link |
02:16:13.520
I have to almost get like into the state
link |
02:16:15.820
that I imagine like doctors get into
link |
02:16:17.720
when they're doing surgery.
link |
02:16:19.340
Like I have to start,
link |
02:16:20.760
I have to do what robotics colleagues of mine do,
link |
02:16:22.960
which is like start seeing it as an object.
link |
02:16:25.080
Like dissociate.
link |
02:16:25.920
Like dissociate.
link |
02:16:26.880
So it was just fascinating that I have to do that
link |
02:16:28.840
in order to do that with a robot.
link |
02:16:30.700
I just wanted to take that a little bit of a tangent.
link |
02:16:33.600
No, I think it's an important thing.
link |
02:16:35.000
I mean, I am not, I'm not shy about the fact
link |
02:16:39.600
that for many years I've worked on experimental animals
link |
02:16:42.400
and that's been a very challenging aspect
link |
02:16:44.700
to being a biologist, mostly mice,
link |
02:16:46.880
but in the past, no longer, thank goodness,
link |
02:16:49.200
cause I just don't like doing it,
link |
02:16:51.200
larger animals as well.
link |
02:16:52.680
And now I work on humans,
link |
02:16:53.720
who can give consent, verbal consent.
link |
02:16:56.280
So I think that it's extremely important
link |
02:17:00.800
to have an understanding of what the guidelines are
link |
02:17:03.880
and where one's own boundaries are around this.
link |
02:17:06.200
It's not just an important question.
link |
02:17:09.040
It might be the most important question
link |
02:17:11.040
before any work can progress.
link |
02:17:12.860
So you asked me about friendship.
link |
02:17:14.920
I know you have a lot of thoughts about friendship.
link |
02:17:18.180
What do you think is the value of friendship in life?
link |
02:17:22.320
Well, for me personally,
link |
02:17:24.800
just because of my life trajectory and arc of friendship,
link |
02:17:29.800
and I should say, I do have some female friends
link |
02:17:34.440
that are just friends,
link |
02:17:35.600
they're completely platonic relationships,
link |
02:17:37.100
but it's mostly male friendship that to me has been-
link |
02:17:39.640
It's been all male friendships to me, actually.
link |
02:17:41.920
Interesting, yeah.
link |
02:17:43.000
It's been an absolute lifeline.
link |
02:17:45.680
They are my family.
link |
02:17:47.040
I have a biological family and I have great respect
link |
02:17:49.360
and love for them and an appreciation for them,
link |
02:17:51.260
but it's provided me the,
link |
02:17:57.720
I wouldn't even say confidence
link |
02:17:58.880
because there's always an anxiety in taking any good risk
link |
02:18:02.440
or any risk worth taking.
link |
02:18:04.200
It's given me the sense that I should go for certain things
link |
02:18:08.860
and try certain things to take risks,
link |
02:18:10.720
to weather that anxiety.
link |
02:18:12.160
And I don't consider myself
link |
02:18:14.640
a particularly competitive person,
link |
02:18:16.760
but I would sooner die than disappoint
link |
02:18:21.840
or let down one of my friends.
link |
02:18:24.640
I can think of nothing worse, actually,
link |
02:18:26.920
than disappointing one of my friends.
link |
02:18:29.120
Everything else is secondary to me.
link |
02:18:31.260
Well, disappointment-
link |
02:18:33.120
Disappointing meaning not,
link |
02:18:35.880
I mean, certainly I strive always to show up
link |
02:18:39.600
as best I can for the friendship,
link |
02:18:41.560
and that can be in small ways.
link |
02:18:43.000
That can mean making sure the phone is away.
link |
02:18:45.000
Sometimes it's about,
link |
02:18:48.520
I'm terrible with punctuality because I'm an academic
link |
02:18:50.840
and so I just get lost in time and I don't mean anything by it,
link |
02:18:53.380
but striving to listen, to enjoy good times and to make time.
link |
02:18:59.080
It kind of goes back to this first variable we talked about,
link |
02:19:01.840
to make sure that I spend time
link |
02:19:04.000
and to get time in person and check in.
link |
02:19:09.360
I think there's so many ways
link |
02:19:10.640
in which friendship is vital to me.
link |
02:19:12.560
It's actually, to me, what makes life worth living.
link |
02:19:14.820
Yeah.
link |
02:19:17.400
I am surprised, like with the high school friends,
link |
02:19:20.060
how we don't actually talk that often these days
link |
02:19:22.640
in terms of time, but every time we see each other,
link |
02:19:24.920
it's immediately right back to where we started.
link |
02:19:27.080
So I struggle with that,
link |
02:19:28.720
how much time you really allocate
link |
02:19:32.120
for the friendship to be deeply meaningful
link |
02:19:34.400
because they're always there with me,
link |
02:19:36.760
even if we don't talk often.
link |
02:19:39.520
So there's a kind of loyalty.
link |
02:19:40.980
I think maybe it's a different style,
link |
02:19:44.000
but I think to me,
link |
02:19:47.000
friendship is being there in the hard times, I think.
link |
02:19:52.760
I'm much more reliable when you're going through shit.
link |
02:19:57.560
You're pretty reliable anyway.
link |
02:19:59.560
No, but if you're like a wedding or something like that,
link |
02:20:03.120
or I don't know, you want an award of some kind,
link |
02:20:08.760
yeah, I'll congratulate the shit out of you,
link |
02:20:12.200
but that's not, and I'll be there,
link |
02:20:14.440
but that's not as important to me as being there
link |
02:20:16.700
when nobody else is,
link |
02:20:19.560
just being there when shit hits the fan
link |
02:20:22.680
or something's tough where the world turns their back
link |
02:20:25.680
on you, all those kinds of things.
link |
02:20:26.880
That to me, that's where friendship is meaningful.
link |
02:20:29.360
Well, I know that to be true about you,
link |
02:20:30.920
and that's a felt thing and a real thing with you.
link |
02:20:33.420
Let me ask one more thing about that actually,
link |
02:20:35.560
because I'm not a practitioner of jujitsu.
link |
02:20:38.440
I know you are, Joe is,
link |
02:20:39.660
but years ago I read a book that I really enjoyed,
link |
02:20:42.400
which is Sam Sheridan's book, A Fighter's Heart.
link |
02:20:44.800
He talks about all these different forms of martial arts,
link |
02:20:46.920
and maybe it was in the book, maybe it was in an interview,
link |
02:20:50.640
but he said that, you know,
link |
02:20:52.480
fighting or being in physical battle with somebody,
link |
02:20:56.200
jujitsu, boxing, or some other form of physical,
link |
02:20:58.880
direct physical contact between two individuals
link |
02:21:01.840
creates this bond unlike any other,
link |
02:21:04.300
because he said, it's like a one night stand.
link |
02:21:06.580
You're sharing bodily fluids with somebody
link |
02:21:08.480
that you barely know.
link |
02:21:09.800
And I, you know, and I chuckled about it
link |
02:21:12.040
because it's kind of funny and a kind of tongue in cheek,
link |
02:21:14.800
but at the same time, I think this is a fundamental way
link |
02:21:18.960
in which members of a species bond
link |
02:21:22.280
is through physical contact.
link |
02:21:24.300
And certainly there are other forms.
link |
02:21:25.880
There's cuddling and there's hand holding
link |
02:21:27.360
and there's sexual intercourse
link |
02:21:29.680
and there's all sorts of things.
link |
02:21:30.520
What's cuddling?
link |
02:21:31.720
I haven't heard of it.
link |
02:21:32.720
I heard this recently.
link |
02:21:33.700
I didn't know this term, but there's a term.
link |
02:21:36.140
They've turned the noun cupcake into a verb.
link |
02:21:39.120
Cupcaking, it turns out, I just learned about this.
link |
02:21:41.740
Cupcaking is when you spend time just cuddling.
link |
02:21:45.060
I didn't know about this.
link |
02:21:46.300
You heard it here first,
link |
02:21:47.300
although I heard it first just the other day,
link |
02:21:48.800
cupcaking is actually a verb.
link |
02:21:50.000
So cuddling is everything.
link |
02:21:50.880
It's not just like, is it in bed or is it on the couch?
link |
02:21:53.760
Like what's cuddling?
link |
02:21:55.400
I need to look up what cuddling is.
link |
02:21:56.400
We need to look at this stuff
link |
02:21:57.240
and we need to define the variables.
link |
02:21:59.000
I think it definitely has to do with physical contact,
link |
02:22:02.840
I am told, but in terms of battle, competition,
link |
02:22:07.840
you know, and the Sheridan quote, I'm just curious.
link |
02:22:12.640
So do you get close or feel a bond with people
link |
02:22:18.120
that, for instance, you rolled jujitsu with,
link |
02:22:20.680
even though you don't know anything else about them?
link |
02:22:23.480
Is he, was he right about this?
link |
02:22:25.520
Yeah, I mean, on many levels.
link |
02:22:27.120
He also has the book, what,
link |
02:22:28.400
A Fighter's Mind and A Fighter's Heart.
link |
02:22:30.880
He's actually an excellent writer.
link |
02:22:32.120
What's interesting about him, just briefly about Sheridan,
link |
02:22:34.520
I don't know, but I did a little bit of research.
link |
02:22:36.160
He went to Harvard.
link |
02:22:38.640
He was an art major at Harvard.
link |
02:22:40.080
He claims all he did was smoke cigarettes and do art.
link |
02:22:43.440
I don't know if his art was any good.
link |
02:22:45.120
And I think his father was in the SEAL teams.
link |
02:22:48.960
And then when he got out of Harvard, graduated,
link |
02:22:51.720
he took off around the world,
link |
02:22:52.880
learning all the forms of martial arts
link |
02:22:54.440
and was early to the kind of ultimate fighting
link |
02:22:56.640
to kind of mixed martial arts and things.
link |
02:22:59.020
Great, great book.
link |
02:22:59.960
Yeah, it's amazing.
link |
02:23:01.240
I don't actually remember it, but I read it.
link |
02:23:03.800
I remember thinking there was an amazing encapsulation
link |
02:23:06.960
of what makes fighting the art,
link |
02:23:10.520
like what makes it compelling.
link |
02:23:12.600
I would say that there's so many ways that jiu-jitsu,
link |
02:23:17.120
grappling, wrestling, combat sports in general,
link |
02:23:21.080
is like one of the most intimate things you could do.
link |
02:23:24.240
I don't know if I would describe it
link |
02:23:25.640
in terms of bodily liquids and all those kinds of things.
link |
02:23:27.920
I think he was more or less joking, but.
link |
02:23:29.920
I think there's a few ways that it does that.
link |
02:23:35.040
So one, because you're so vulnerable.
link |
02:23:41.000
So that the honesty of stepping on the mat
link |
02:23:45.520
and often all of us have ego thinking
link |
02:23:48.720
we're better than we are at this particular art.
link |
02:23:52.360
And then the honesty of being submitted
link |
02:23:55.880
or being worse than you thought you are
link |
02:23:58.720
and just sitting with that knowledge.
link |
02:24:01.000
That kind of honesty,
link |
02:24:02.000
we don't get to experience it in most of daily life.
link |
02:24:06.040
We can continue living somewhat of an illusion
link |
02:24:08.560
of our conceptions of ourselves
link |
02:24:10.760
because people are not going to hit us with the reality.
link |
02:24:13.760
The mat speaks only the truth,
link |
02:24:15.760
that the reality just hits you.
link |
02:24:17.800
And that vulnerability is the same
link |
02:24:19.960
as like the loss of a loved one.
link |
02:24:22.520
It's the loss of a reality that you knew before.
link |
02:24:26.320
You now have to deal with this new reality.
link |
02:24:28.160
And when you're sitting there in that vulnerability
link |
02:24:30.480
and there's these other people
link |
02:24:32.120
that are also sitting in that vulnerability,
link |
02:24:34.320
you get to really connect like, fuck.
link |
02:24:36.920
Like I'm not as special as I thought I was
link |
02:24:40.200
and life is like not,
link |
02:24:44.160
life is harsher than I thought it was
link |
02:24:46.000
and we're just sitting there with that reality.
link |
02:24:47.680
Some of us can put words to it, some of us can't.
link |
02:24:50.000
So I think that definitely is the thing
link |
02:24:51.640
that leads to intimacy.
link |
02:24:52.920
The other thing is the human contact.
link |
02:24:58.600
There is something about, I mean, like a big hug.
link |
02:25:03.680
Like during COVID, very few people hugged me
link |
02:25:06.800
and I hugged them and I always felt good when they did.
link |
02:25:10.240
Like we're all tested and especially now we're vaccinated,
link |
02:25:13.880
but there's still people, this is true of San Francisco,
link |
02:25:16.360
this is true in Boston.
link |
02:25:17.320
They want to keep not only six feet away,
link |
02:25:19.360
but stay at home and never touch you.
link |
02:25:21.720
That was, that loss of basic humanity
link |
02:25:25.280
is the opposite of what I feel in jiu-jitsu
link |
02:25:29.400
where it was like that contact where you're like,
link |
02:25:33.920
I don't give a shit about whatever rules
link |
02:25:35.880
we're supposed to have in society where you're not,
link |
02:25:38.360
you have to keep a distance and all that kind of stuff.
link |
02:25:40.440
Just the hug, like the intimacy of a hug
link |
02:25:45.000
that's like a good bear hug
link |
02:25:46.440
and you're like just controlling another person.
link |
02:25:49.720
And also there is some kind of love communicating
link |
02:25:52.080
through just trying to break each other's arms.
link |
02:25:54.680
I don't exactly understand why violence
link |
02:25:57.600
is such a close neighbor to love, but it is.
link |
02:26:02.600
Well, in the hypothalamus,
link |
02:26:04.760
the neurons that control sexual behavior,
link |
02:26:08.200
but also non-sexual contact are not just nearby
link |
02:26:12.960
the neurons that control aggression and fighting,
link |
02:26:15.760
they are salt and pepper with those neurons.
link |
02:26:19.960
It's a very interesting and it almost sounds
link |
02:26:23.120
kind of risque and controversial and stuff.
link |
02:26:25.040
I'm not anthropomorphizing about what this means,
link |
02:26:27.840
but in the brain, those structures are interdigitated.
link |
02:26:32.200
You can't separate them except at a very fine level.
link |
02:26:36.000
And here the way you describe it is the same
link |
02:26:38.800
as a real thing.
link |
02:26:39.720
I do want to make an interesting comment.
link |
02:26:42.840
Again, these are the things
link |
02:26:43.720
that could be taken out of context,
link |
02:26:45.660
but one of the amazing things about jiu-jitsu
link |
02:26:50.520
is both guys and girls train it.
link |
02:26:53.100
And I was surprised.
link |
02:26:54.760
So like I'm a big fan of yoga pants,
link |
02:26:59.600
at the gym kind of thing.
link |
02:27:01.000
It reveals the beauty of the female form.
link |
02:27:04.800
But the thing is like girls are dressed
link |
02:27:07.760
in skintight clothes in jiu-jitsu often.
link |
02:27:10.180
And I found myself not thinking like that at all
link |
02:27:13.880
when training with girls.
link |
02:27:15.000
Well, the context is very non-sexual.
link |
02:27:17.160
But I was surprised to learn that.
link |
02:27:20.040
When I first started jiu-jitsu,
link |
02:27:21.160
I thought wouldn't that be kind of weird
link |
02:27:22.640
to train with the opposite sex in something so intimate?
link |
02:27:26.360
Boys and girls, men and women,
link |
02:27:28.480
they roll jiu-jitsu together completely.
link |
02:27:30.960
Interesting.
link |
02:27:31.800
And the only times girls kind of try to stay away from guys,
link |
02:27:35.320
I mean, there's two contexts.
link |
02:27:36.400
Of course, there's always going to be creeps in this world.
link |
02:27:38.840
So everyone knows who to stay away from.
link |
02:27:42.080
And the other is like, there's a size disparity.
link |
02:27:44.280
So girls will often try to roll with people
link |
02:27:46.240
a little bit closer weight-wise.
link |
02:27:48.280
But no, that's one of the things
link |
02:27:51.100
that is empowering to women.
link |
02:27:53.100
That's what they fall in love with
link |
02:27:54.280
when they start doing jiu-jitsu is I can,
link |
02:27:56.320
first of all, they gain an awareness
link |
02:27:58.680
and a pride over their body, which is great.
link |
02:28:00.960
And then second, they get, especially later on,
link |
02:28:04.360
start submitting big dudes,
link |
02:28:05.960
like these like bros that come in
link |
02:28:09.240
who are all shredded and like muscular.
link |
02:28:11.280
And they get through technique to exercise dominance
link |
02:28:15.320
over them.
link |
02:28:16.160
And that's a powerful feeling.
link |
02:28:17.720
You've seen women force a larger guy to tap
link |
02:28:21.560
or even choke him out.
link |
02:28:22.400
Well, I was deadlifting for,
link |
02:28:29.520
oh boy, I think it was 495.
link |
02:28:31.520
So I was really into powerlifting when I started jiu-jitsu.
link |
02:28:35.000
And I remember being submitted by,
link |
02:28:37.400
I thought I walked in feeling like I'm going to be,
link |
02:28:40.640
if not the greatest fighter, at least top three.
link |
02:28:43.240
And so as a white belt, you roll in like all happy.
link |
02:28:47.400
And then you realize that as long as you're not applying
link |
02:28:50.740
too much force,
link |
02:28:52.920
I remember being submitted many times
link |
02:28:54.360
by like 130, 120 pound girls
link |
02:28:57.220
at the Balance Studios in Philadelphia,
link |
02:28:59.860
that has a lot of incredible female jiu-jitsu players.
link |
02:29:02.480
And that's really humbling too.
link |
02:29:04.360
That technique, in combat, can overpower
link |
02:29:09.360
pure strength.
link |
02:29:12.160
And that's the other thing that there is something
link |
02:29:15.540
about combat that's primal.
link |
02:29:18.140
Like it just feels, it feels like we were born to do this.
link |
02:29:26.480
Like that there's-
link |
02:29:27.320
But we have circuits in our brain that are dedicated
link |
02:29:30.320
to this kind of interaction.
link |
02:29:32.380
There's no question.
link |
02:29:34.080
And like, that's what it felt like.
link |
02:29:35.720
It wasn't that I'm learning a new skill.
link |
02:29:38.920
It was like somehow I am remembering echoes
link |
02:29:42.780
of something I've learned in the past.
link |
02:29:44.400
It's like hitting puberty.
link |
02:29:45.640
A child before puberty has no concept
link |
02:29:47.760
of boys and girls having this attraction,
link |
02:29:51.020
regardless of whether or not they're attracted
link |
02:29:52.600
to boys or girls, doesn't matter.
link |
02:29:53.640
At some point, most people, not all,
link |
02:29:56.400
but certainly most people, when they hit puberty,
link |
02:29:58.360
suddenly people appear differently.
link |
02:30:01.280
And certain people take on a romantic or sexual interest
link |
02:30:05.540
for the very first time.
link |
02:30:07.560
And so it's like, it's revealing a circuitry in the brain.
link |
02:30:11.400
It's not like they learned that, it's innate.
link |
02:30:14.360
And I think when I hear the way you describe jiu-jitsu
link |
02:30:18.120
and rolling jiu-jitsu, it reminds me a little bit,
link |
02:30:21.120
Joe was telling me recently about the first time
link |
02:30:23.340
he went hunting and he felt like it revealed a circuit
link |
02:30:26.680
that was in him all along,
link |
02:30:28.560
but he hadn't experienced before.
link |
02:30:30.960
Yeah, that's definitely there.
link |
02:30:32.360
And of course there's the physical activity.
link |
02:30:34.960
One of the interesting things about jiu-jitsu
link |
02:30:37.280
is it's one of the really strenuous exercises
link |
02:30:40.880
that you can do late into your adult life,
link |
02:30:43.880
like into your 50s, 60s, 70s, 80s.
link |
02:30:48.000
When I came up, there's a few people in their 80s
link |
02:30:50.480
that were training.
link |
02:30:51.720
And as long as you're smart,
link |
02:30:53.080
as long as you practice techniques
link |
02:30:54.840
and pick your partners correctly,
link |
02:30:55.940
you can do that kind of art
link |
02:30:57.320
late into life, and so you're getting exercise.
link |
02:30:59.880
There's not many activities I find
link |
02:31:01.880
that are amenable to that.
link |
02:31:04.500
So because it's such a thinking game,
link |
02:31:07.760
the jiu-jitsu in particular is an art
link |
02:31:10.880
where technique pays off a lot.
link |
02:31:13.360
So you can still maintain, first of all,
link |
02:31:17.240
remain injury free if you use good technique
link |
02:31:20.840
and also through good technique,
link |
02:31:24.280
be able to go, be active with people
link |
02:31:26.920
that are much, much younger.
link |
02:31:28.480
And so that was to me,
link |
02:31:30.560
that and running are the two activities
link |
02:31:32.480
you can kind of do late in life
link |
02:31:33.840
because to me, a healthy life has exercise
link |
02:31:38.280
as a piece of the puzzle.
link |
02:31:39.360
No, absolutely.
link |
02:31:40.400
And I'm glad that we're on the physical component
link |
02:31:42.720
because I know that there's, for you,
link |
02:31:47.960
you've talked before about the crossover
link |
02:31:49.680
between the physical and the intellectual and the mental.
link |
02:31:54.980
Are you still running at ridiculous hours of the night
link |
02:31:57.680
for ridiculously long?
link |
02:31:59.600
Yeah, so definitely.
link |
02:32:01.520
I've been running late at night here in Austin.
link |
02:32:03.520
People tell me, the area we're in now,
link |
02:32:05.800
is a dangerous area,
link |
02:32:07.240
which I find laughable coming from the bigger cities.
link |
02:32:10.580
No, I've run late at night.
link |
02:32:12.780
There's something.
link |
02:32:15.160
If you see a guy running through Austin at 2 a.m.
link |
02:32:17.580
in a suit and tie, it's probably.
link |
02:32:22.200
Well, yeah, I mean, I do think about that
link |
02:32:24.080
because I get recognized more and more in Austin.
link |
02:32:26.640
I worry that, not really,
link |
02:32:29.040
that I get recognized late at night.
link |
02:32:30.800
But there is something about the night
link |
02:32:36.600
that brings out those deep philosophical thoughts
link |
02:32:38.960
and self-reflection that I really enjoy.
link |
02:32:40.720
But recently I started getting back to the grind.
link |
02:32:44.560
So I'm gonna be competing, or hoping to compete,
link |
02:32:47.680
in September and October.
link |
02:32:49.840
In jiu-jitsu.
link |
02:32:50.680
In jiu-jitsu, yeah, to get back to competition.
link |
02:32:53.120
And so that requires getting back into great cardio shape.
link |
02:32:58.120
And so I've been getting back to running
link |
02:33:00.560
as part of my daily routine.
link |
02:33:02.280
Got it.
link |
02:33:03.640
Well, I always know I can reach you
link |
02:33:05.140
regardless of time zone in the middle of the night,
link |
02:33:08.000
wherever that happens.
link |
02:33:09.280
Well, part of that has to be just being single
link |
02:33:11.120
and being a programmer.
link |
02:33:13.780
Those two things just don't work well
link |
02:33:16.040
in terms of a steady sleep schedule.
link |
02:33:18.040
It's not banker's hours kind of work, nine to five.
link |
02:33:21.440
I want to, you mentioned single.
link |
02:33:23.280
I want to ask you a little bit
link |
02:33:24.840
about the other form of relationship,
link |
02:33:26.400
which is a romantic love.
link |
02:33:29.700
So your parents are still married?
link |
02:33:32.400
Still married, still happily married.
link |
02:33:34.120
That's impressive.
link |
02:33:34.960
Yeah.
link |
02:33:35.780
A rare thing nowadays.
link |
02:33:36.620
Yeah.
link |
02:33:37.460
So you grew up with that example.
link |
02:33:38.960
Yeah, I guess that's a powerful thing, right?
link |
02:33:40.920
If there's an example that I think can work.
link |
02:33:44.640
Yeah, I didn't have that in my own family,
link |
02:33:46.480
but when I see it, it's inspiring and it's beautiful.
link |
02:33:52.040
The fact that they have that and that was the norm for you,
link |
02:33:55.040
I think is really wonderful.
link |
02:33:58.000
In the case of my parents, it was interesting to watch
link |
02:34:00.080
because there's obviously tension.
link |
02:34:03.240
Like there'll be times where they fought
link |
02:34:04.920
and all those kinds of things.
link |
02:34:06.680
They obviously get frustrated with each other
link |
02:34:09.920
and, like, they find mechanisms
link |
02:34:13.320
to communicate that to each other,
link |
02:34:15.000
like to make fun of each other a little bit,
link |
02:34:16.520
like to tease, to get some of that frustration out
link |
02:34:19.400
and then ultimately to reunite
link |
02:34:21.040
and find their joyful moments, and let that be the energy.
link |
02:34:25.280
I think it's clear because they got together,
link |
02:34:27.600
I think, in their early twenties, like very, very young.
link |
02:34:30.280
I think you grow together as people.
link |
02:34:32.440
Yeah, you're still in the critical period
link |
02:34:34.320
of brain plasticity.
link |
02:34:35.580
And also, I mean, it's just like divorce
link |
02:34:40.200
was so frowned upon that you stick it out.
link |
02:34:43.860
And I think a lot of couples,
link |
02:34:44.960
especially from that time in the Soviet Union,
link |
02:34:46.640
that probably applies to a lot of cultures.
link |
02:34:48.600
You stick it out and you put in the work.
link |
02:34:50.640
You learn how to put in the work.
link |
02:34:52.360
And once you do, you start to get to some of those
link |
02:34:54.940
rewarding aspects of being like through time,
link |
02:34:59.080
sharing so many moments together.
link |
02:35:03.120
That's definitely something that was an inspiration to me,
link |
02:35:07.840
but maybe that's why
link |
02:35:09.720
I have a similar kind of longing
link |
02:35:11.340
to have a lifelong partner like that,
link |
02:35:13.280
have that kind of view where, same with friendship,
link |
02:35:16.760
lifelong friendship is the most meaningful kind.
link |
02:35:20.600
There is something with that time,
link |
02:35:22.040
with sharing all that time together,
link |
02:35:24.120
like till death do us part, as a powerful thing,
link |
02:35:26.800
not by force, not because the religion said it
link |
02:35:29.200
or the government said it or your culture said it,
link |
02:35:31.580
but because you want to.
link |
02:35:33.240
Do you want children?
link |
02:35:34.520
Definitely, yeah.
link |
02:35:35.920
Definitely want children.
link |
02:35:38.400
It's common.
link |
02:35:39.220
How many Roombas do you have?
link |
02:35:41.280
Oh, I thought you meant human children.
link |
02:35:43.240
No, human children.
link |
02:35:44.400
Because I already have the children.
link |
02:35:45.560
Exactly, well, I was saying you probably need
link |
02:35:46.880
at least as many human children as you do Roombas,
link |
02:35:49.520
big family, small family.
link |
02:35:53.360
In your mind's eye, is there a big,
link |
02:35:55.160
are there a bunch of Friedmans running around?
link |
02:35:59.160
So I'll tell you like realistically,
link |
02:36:01.240
I can explain exactly my thinking.
link |
02:36:04.080
And this is similar to the robotics work
link |
02:36:06.160
is if I'm like purely logical right now,
link |
02:36:10.080
my answer would be I don't want kids
link |
02:36:12.320
because I just don't have enough time.
link |
02:36:15.720
I have so much going on.
link |
02:36:17.280
But when I'm using the same kind of vision
link |
02:36:19.240
I use for the robots is I know my life
link |
02:36:22.620
will be transformed with the first.
link |
02:36:25.000
Like I know I would love being a father.
link |
02:36:27.640
And so the question of how many,
link |
02:36:30.280
that's on the other side of that hill.
link |
02:36:33.140
It could be some ridiculous number.
link |
02:36:35.880
So I just know that-
link |
02:36:36.800
I have a feeling and I don't have a crystal ball,
link |
02:36:40.820
but I don't know, I see upwards
link |
02:36:43.720
of certainly three or more comes to mind.
link |
02:36:47.760
So, so much of that has to do
link |
02:36:49.840
with the partner you're with too.
link |
02:36:51.920
So like that's such an open question,
link |
02:36:55.560
especially in this society of what the right partnership is.
link |
02:36:58.880
Because I'm deeply empathetic.
link |
02:37:02.800
I want to see, like to me,
link |
02:37:05.100
what I look for in a relationship is
link |
02:37:07.760
for me to be really excited about the passions
link |
02:37:11.160
of another person, like whatever they're into.
link |
02:37:13.040
It doesn't have to be a career success,
link |
02:37:15.740
any kind of success, just to be excited for them.
link |
02:37:18.440
And for them to be excited for me
link |
02:37:20.520
and they can share in that excitement
link |
02:37:21.840
and build and build and build.
link |
02:37:23.720
But there was also practical aspects of like,
link |
02:37:25.800
what kind of shit do you enjoy doing together?
link |
02:37:28.760
And I think family is a real serious undertaking.
link |
02:37:32.320
Oh, it certainly is.
link |
02:37:34.040
I mean, I think that I have a friend who said it,
link |
02:37:37.480
I think best.
link |
02:37:41.240
He's in a very successful relationship and has a family.
link |
02:37:44.440
And he said, you first have to define the role
link |
02:37:47.480
and then you have to cast the right person for the role.
link |
02:37:51.360
Well, yeah, there's some deep aspect to that,
link |
02:37:53.880
but there's also an aspect to which you're not smart enough
link |
02:37:58.280
from this side of it to define the role.
link |
02:38:03.040
There's part of it that has to be a leap
link |
02:38:04.840
that you have to take.
link |
02:38:06.520
And I see having kids that way.
link |
02:38:11.520
You just have to go with it and figure it out also,
link |
02:38:16.120
as long as there's love there.
link |
02:38:17.640
Like what the hell is life for even?
link |
02:38:20.560
So there's so many incredibly successful people that I know,
link |
02:38:26.320
that I've gotten to know, that all have kids.
link |
02:38:28.960
And the presence of kids for the most part
link |
02:38:32.800
has only been something that energized them,
link |
02:38:36.600
something that gave them meaning,
link |
02:38:37.900
something that made them the best version of themselves,
link |
02:38:40.200
like made them more productive, not less,
link |
02:38:42.360
which is fascinating to me.
link |
02:38:43.680
It is fascinating.
link |
02:38:44.520
I mean, you can imagine if the way that you felt about Homer,
link |
02:38:47.240
the way that I feel and felt about Costello
link |
02:38:49.760
is at all a glimpse of what that must be like then.
link |
02:38:54.600
Exactly.
link |
02:38:55.840
The downside, the thing I worry more about
link |
02:39:00.000
is the partner side of that.
link |
02:39:04.560
I've seen that kids are almost universally
link |
02:39:07.840
a source of increased productivity and joy and happiness.
link |
02:39:11.720
Like, yeah, they're a pain in the ass.
link |
02:39:13.280
Yeah, it's complicated.
link |
02:39:14.120
Yeah, so on and so forth.
link |
02:39:15.640
People like to complain about kids.
link |
02:39:17.440
But when you actually look past that little shallow layer
link |
02:39:20.960
of complaint, kids are great.
link |
02:39:22.940
The source of pain for a lot of people
link |
02:39:24.760
is when the relationship doesn't work.
link |
02:39:27.560
And so I'm very kind of concerned about,
link |
02:39:32.600
dating is very difficult and I'm a complicated person.
link |
02:39:36.560
And so it's been very difficult
link |
02:39:38.240
to find the right kind of person.
link |
02:39:42.000
But that statement doesn't even make sense
link |
02:39:45.040
because I'm not on dating apps.
link |
02:39:46.880
I don't see people.
link |
02:39:48.260
You're like the first person I saw in a while.
link |
02:39:50.360
It's like you, Michael Malice and like Joe.
link |
02:39:53.040
So I don't think I've seen like a female, what is it?
link |
02:40:00.840
An element of the female species in quite a while.
link |
02:40:03.680
So I think you have to put yourself out there.
link |
02:40:06.520
What is it?
link |
02:40:07.440
Daniel Johnston says, true love will find you,
link |
02:40:10.240
but only if you're looking.
link |
02:40:11.760
So there's some element of really taking the leap
link |
02:40:13.680
and putting yourself out there
link |
02:40:14.780
in kind of different situations.
link |
02:40:17.000
And I don't know how to do that
link |
02:40:18.400
when you're behind a computer all the time.
link |
02:40:20.460
Well, you're a builder and you're a problem solver
link |
02:40:25.200
and you find solutions, and I'm confident
link |
02:40:30.200
the solution is out there.
link |
02:40:33.620
And-
link |
02:40:34.460
I think you're implying that I'm going to build
link |
02:40:35.700
the girlfriend, which I think-
link |
02:40:38.500
Or that you, well, and maybe we shouldn't separate
link |
02:40:41.120
this, the notion of friendship and community.
link |
02:40:45.460
And if we go back to this concept of the aggregate,
link |
02:40:48.900
maybe you'll meet this woman through a friend
link |
02:40:52.420
or maybe or something of that sort.
link |
02:40:53.940
So one of the things, I don't know if you feel the same way.
link |
02:40:56.920
I'm definitely one of those people that just falls in love
link |
02:41:01.500
and that's it.
link |
02:41:02.480
Yeah, I can't say I'm like that.
link |
02:41:03.980
With Costello, it was instantaneous.
link |
02:41:06.560
Yeah.
link |
02:41:07.400
It really was.
link |
02:41:08.220
I mean, I know it's not romantic love,
link |
02:41:09.700
but it was instantaneous.
link |
02:41:10.740
No, but that's me.
link |
02:41:12.340
And I think that if you know, you know,
link |
02:41:14.660
because that's a good thing that you have that.
link |
02:41:18.260
Well, I'm very careful of that
link |
02:41:21.740
because you don't want to fall in love with the wrong person.
link |
02:41:24.980
So I try to be very kind of careful with,
link |
02:41:27.340
I've noticed this because I fall in love with everything,
link |
02:41:29.460
like this mug, everything.
link |
02:41:31.040
I fall in love with things in this world.
link |
02:41:34.020
So like, you have to be really careful
link |
02:41:35.500
because a girl comes up to you and says,
link |
02:41:41.060
she loves Dostoevsky.
link |
02:41:43.820
That doesn't necessarily mean you need to marry her tonight.
link |
02:41:46.700
Yes, and I liked the way you said that out loud
link |
02:41:49.060
so that you heard it.
link |
02:41:49.980
It doesn't mean you need to marry her tonight.
link |
02:41:52.540
Exactly.
link |
02:41:53.380
Right, exactly.
link |
02:41:54.220
But people are amazing and people are beautiful.
link |
02:41:57.140
And so I've fully embraced that,
link |
02:42:00.540
but I also, you have to be careful with relationships.
link |
02:42:02.940
And at the same time, like I mentioned to you offline,
link |
02:42:05.820
I don't, there's something about me that appreciates
link |
02:42:10.840
swinging for the fences and not dating,
link |
02:42:13.640
like doing serial dating or dating around.
link |
02:42:15.860
Yeah, you're a one guy, one girl kind of guy.
link |
02:42:17.580
Yeah.
link |
02:42:18.400
You said that.
link |
02:42:19.240
And it's tricky because you want to be careful
link |
02:42:23.100
with that kind of stuff.
link |
02:42:24.180
Especially now there's a growing platform
link |
02:42:26.380
that has a ridiculous amount of female interest
link |
02:42:29.140
of a certain kind, but I'm looking for deep connection.
link |
02:42:33.580
And I'm looking by sitting home alone
link |
02:42:36.580
and every once in a while, talking to Stanford professors.
link |
02:42:41.100
Perfect solution.
link |
02:42:42.380
Perfect solution.
link |
02:42:43.220
It's going to work out great.
link |
02:42:44.040
It's well incorporated.
link |
02:42:45.260
It's part of, that constitutes machine learning of sorts.
link |
02:42:49.540
Yeah, of sorts.
link |
02:42:51.100
I do, you mentioned what has now become a quite extensive
link |
02:42:55.860
and expansive public platform, which is incredible.
link |
02:42:59.260
I mean, the number of people out there-
link |
02:43:01.260
The first time I saw your podcast, I noticed the suit.
link |
02:43:03.540
I was like, he respects his audience, which was great.
link |
02:43:05.460
But I also thought, this is amazing.
link |
02:43:08.480
People are showing up for science and engineering
link |
02:43:10.740
and technology information and those discussions
link |
02:43:12.820
and other sorts of discussions.
link |
02:43:14.100
Now, I do want to talk for a moment about the podcast.
link |
02:43:18.100
So my two questions about the podcast are,
link |
02:43:21.700
when you started it, did you have a plan?
link |
02:43:24.300
And regardless of what that answer is,
link |
02:43:27.500
do you know where you're taking it?
link |
02:43:29.820
Or would you like to leave us?
link |
02:43:31.960
I do believe an element of surprise is always fun.
link |
02:43:35.220
But what about the podcast?
link |
02:43:36.400
Do you enjoy the podcast?
link |
02:43:37.680
I mean, your audience certainly includes me,
link |
02:43:40.340
really enjoys the podcast.
link |
02:43:41.720
It's incredible.
link |
02:43:42.660
So I love talking to people
link |
02:43:46.540
and there's something about microphones
link |
02:43:50.200
that really bring out the best in people.
link |
02:43:52.300
Like you don't get a chance to talk like this.
link |
02:43:54.500
If you and I were just hanging out,
link |
02:43:56.100
we would have a very different conversation
link |
02:43:58.500
in the amount of focus we allocate to each other.
link |
02:44:02.100
We would be having fun talking about other stuff
link |
02:44:04.420
and doing other things.
link |
02:44:06.000
There'd be a lot of distraction.
link |
02:44:07.480
There would be some phone use and all that kind of stuff.
link |
02:44:11.020
But here we're 100% focused on each other
link |
02:44:13.980
and focused on the idea.
link |
02:44:16.060
And sometimes playing with ideas
link |
02:44:18.180
that we both don't know the answer to,
link |
02:44:21.060
like a question we don't know the answer to.
link |
02:44:23.040
We're both like fumbling with it, trying to figure out,
link |
02:44:25.460
trying to get some insights
link |
02:44:27.060
at something we haven't really figured out before
link |
02:44:29.480
and together arriving at that.
link |
02:44:31.240
I think that's magical.
link |
02:44:32.380
I don't know why we need microphones for that,
link |
02:44:34.160
but we somehow do.
link |
02:44:35.220
It feels like doing science.
link |
02:44:36.700
It feels like doing science for me, definitely.
link |
02:44:38.700
That's exactly it.
link |
02:44:39.860
Then, and I'm really glad you said that
link |
02:44:42.180
because I don't actually often say this,
link |
02:44:45.980
but that's exactly what I felt like.
link |
02:44:48.900
I wanted to talk to friends and colleagues at MIT
link |
02:44:53.940
to do real science together.
link |
02:44:56.420
That's how I felt about it.
link |
02:44:57.780
Like to really talk through problems
link |
02:45:00.180
that are actually interesting
link |
02:45:03.100
as opposed to like incremental work
link |
02:45:06.060
that we're currently working on for a particular conference.
link |
02:45:10.860
So really asking questions like, what are we doing?
link |
02:45:14.140
Like, where's this headed to?
link |
02:45:15.980
Like, what are the big,
link |
02:45:17.220
is this really going to help us solve,
link |
02:45:19.940
in the case of AI, solve intelligence?
link |
02:45:22.800
Like, is this even working on intelligence?
link |
02:45:24.700
There's a certain sense,
link |
02:45:26.380
which is why I initially called it artificial intelligence
link |
02:45:29.980
that, like, most of us are not working
link |
02:45:32.900
on artificial intelligence.
link |
02:45:34.780
You're working on some very specific problem
link |
02:45:37.680
and a set of techniques.
link |
02:45:39.480
At the time, it's machine learning
link |
02:45:41.260
to solve this particular problem.
link |
02:45:42.980
This is not going to take us to a system
link |
02:45:45.560
that is anywhere close to the generalizability
link |
02:45:49.220
of the human mind.
link |
02:45:51.340
Like the kind of stuff the human mind can do
link |
02:45:52.940
in terms of memory, in terms of cognition,
link |
02:45:54.660
in terms of reasoning, common sense reasoning.
link |
02:45:56.940
This doesn't seem to take us there.
link |
02:45:58.700
So the initial impulse was,
link |
02:46:00.540
can I talk to these folks,
link |
02:46:03.140
do science together through conversation?
link |
02:46:05.620
And I also thought that there were not enough,
link |
02:46:08.620
I didn't think there were enough good conversations
link |
02:46:13.860
with world-class minds that I got to meet,
link |
02:46:17.640
and not the ones with a book, that was the thing.
link |
02:46:21.740
Oftentimes you go on this tour when you have a book,
link |
02:46:24.260
but there's a lot of minds that don't write books.
link |
02:46:26.660
They don't.
link |
02:46:27.480
And the books constrain the conversation too,
link |
02:46:28.860
because then you're talking about this thing, this book.
link |
02:46:31.720
But there's, I've noticed that with people
link |
02:46:34.460
that haven't written a book who are brilliant,
link |
02:46:37.260
we get to talk about ideas in a new way.
link |
02:46:40.080
We both haven't actually,
link |
02:46:42.620
when we raise a question,
link |
02:46:43.780
we don't know the answer to it
link |
02:46:45.500
when the question is raised and we try to arrive there.
link |
02:46:49.820
Like, I don't know.
link |
02:46:50.660
I remember asking questions
link |
02:46:53.100
of world-class researchers in deep learning
link |
02:46:57.140
of why do neural networks work as well as they do?
link |
02:47:02.600
That question is often loosely asked,
link |
02:47:06.620
but like when you have microphones
link |
02:47:09.220
and you have to think through it
link |
02:47:11.020
and you have 30 minutes to an hour
link |
02:47:12.460
to think through it together, I think that's science.
link |
02:47:16.140
I think that's really powerful.
link |
02:47:17.620
So that was the one goal.
link |
02:47:19.660
The other one is,
link |
02:47:23.460
I again don't usually talk about this,
link |
02:47:25.460
but there's some sense in which I wanted
link |
02:47:28.500
to have dangerous conversations.
link |
02:47:32.460
Part of the reason I wanted to wear a suit is like,
link |
02:47:36.180
I wanted to be fearless.
link |
02:47:38.060
Now, the reason I don't usually talk about it
link |
02:47:40.180
is because I feel like I'm not good at conversation.
link |
02:47:43.000
So it looks like the ambition doesn't match the current skill level,
link |
02:47:48.240
but I wanted to have really dangerous conversations
link |
02:47:53.860
that I uniquely would be able to do.
link |
02:47:58.180
Not completely uniquely, but like I'm a huge fan
link |
02:48:01.620
of Joe Rogan and I had to ask myself,
link |
02:48:04.540
what conversations can I do that Joe Rogan can't?
link |
02:48:08.240
For me, I know I bring this up,
link |
02:48:11.980
but for me, that person I thought about
link |
02:48:13.760
at the time was Putin.
link |
02:48:15.660
Like that's why I bring him up.
link |
02:48:17.580
Just like with Costello, he's not just a person.
link |
02:48:22.100
He's also an idea to me for what I strive for,
link |
02:48:25.580
just to have those dangerous conversations.
link |
02:48:27.860
And the reason I'm uniquely qualified is both the Russian,
link |
02:48:31.440
but also there's the judo and the martial arts.
link |
02:48:34.060
There's a lot of elements that make me have a conversation
link |
02:48:37.820
he hasn't had before.
link |
02:48:39.460
And there's a few other people that I kept in mind,
link |
02:48:45.060
like Don Knuth, he's a computer scientist from Stanford
link |
02:48:49.340
that I thought is one of the most beautiful minds ever.
link |
02:48:54.060
And nobody really talked to him, like really talked to him.
link |
02:48:59.500
He did a few lectures, which people love,
link |
02:49:01.380
but really just have a conversation with him.
link |
02:49:03.720
There's a few people like that.
link |
02:49:04.820
One of them passed away, John Conway, that I never got to.
link |
02:49:07.300
We agreed to talk, but he died before we did.
link |
02:49:10.660
There's a few people like that that I thought like,
link |
02:49:13.460
it's such a crime to not hear those folks.
link |
02:49:19.660
And I have the unique ability to know how to purchase
link |
02:49:24.860
a microphone on Amazon and plug it into a device
link |
02:49:28.000
that records audio and then publish it,
link |
02:49:30.420
which seems relatively unique.
link |
02:49:32.060
Like that's not easy in the scientific community.
link |
02:49:34.700
People knowing how to plug in a microphone.
link |
02:49:36.700
No, they can build Faraday cages and two-photon microscopes
link |
02:49:40.500
and bioengineer all sorts of things.
link |
02:49:43.040
But the idea that you could take ideas and export them
link |
02:49:47.100
into a structure or a pseudo structure
link |
02:49:49.260
that people would benefit from
link |
02:49:50.660
seems like a cosmic achievement to them.
link |
02:49:54.060
I don't know if it's a fear or just basically
link |
02:49:57.460
they haven't tried it,
link |
02:49:58.300
so they haven't learned the skill level.
link |
02:50:00.300
I think they're not trained, I mean,
link |
02:50:02.140
we could riff on this for a while,
link |
02:50:03.380
but I think it's important, and maybe we should,
link |
02:50:08.040
which is that they're not trained to do it.
link |
02:50:11.060
They're trained to think in specific aims
link |
02:50:12.820
and specific hypotheses.
link |
02:50:14.260
And many of them don't care to, right?
link |
02:50:17.840
They became scientists because that's where they felt safe.
link |
02:50:22.660
And so why would they leave that haven of safety?
link |
02:50:25.900
Well, they also don't necessarily always see
link |
02:50:27.860
the value in it.
link |
02:50:29.300
We're all together learning,
link |
02:50:30.580
you and I are learning the value of this.
link |
02:50:33.740
I think you're probably,
link |
02:50:35.380
you have an exceptionally successful and amazing podcast
link |
02:50:39.440
that you started just recently.
link |
02:50:40.900
Thanks to your encouragement.
link |
02:50:42.420
Well, but there's a raw skill there
link |
02:50:45.820
and you're definitely an inspiration to me
link |
02:50:49.180
in how you do the podcast
link |
02:50:50.420
and the level of excellence you reach.
link |
02:50:52.860
But I think you've discovered
link |
02:50:54.440
that podcasting is also an impactful way
link |
02:50:57.040
to do science.
link |
02:50:58.020
And I think a lot of scientists have not yet discovered
link |
02:51:01.620
that if they apply the same kind of rigor
link |
02:51:06.860
as they do to academic publications
link |
02:51:09.340
or even to conference presentations,
link |
02:51:11.780
and bring that rigor and effort to a podcast,
link |
02:51:16.120
whatever that is, it could be a five-minute podcast,
link |
02:51:18.500
a two-hour podcast, it could be conversational,
link |
02:51:21.000
or it could be more lecture-like.
link |
02:51:22.940
If they apply that effort,
link |
02:51:24.420
they have the potential to reach, over time,
link |
02:51:26.840
tens of thousands, hundreds of thousands,
link |
02:51:28.620
millions of people.
link |
02:51:29.800
And that's really, really powerful.
link |
02:51:32.460
But yeah, for me,
link |
02:51:35.220
giving a platform to a few of those folks,
link |
02:51:39.420
especially for me personally,
link |
02:51:40.940
so maybe you can speak to what fields you're drawn to,
link |
02:51:46.260
but I thought computer scientists
link |
02:51:51.020
were especially bad at this.
link |
02:51:53.540
So there are brilliant computer scientists
link |
02:51:56.300
whose minds I thought it would be amazing to explore,
link |
02:52:00.300
to explore their thinking.
link |
02:52:02.060
And so I almost took that on as an effort.
link |
02:52:06.540
And at the same time, I had other guests in mind
link |
02:52:11.140
or people that connect to my own interests.
link |
02:52:13.900
So wrestling,
link |
02:52:16.780
music, football,
link |
02:52:18.900
both American football and soccer.
link |
02:52:21.260
I have a few particular people
link |
02:52:22.620
that I'm really interested in.
link |
02:52:24.120
Buvaisar Saitiev, the Saitiev brothers,
link |
02:52:27.960
even Khabib for wrestling,
link |
02:52:29.640
just to talk to them.
link |
02:52:31.000
Because you guys can communicate.
link |
02:52:33.000
In Russian and in wrestling,
link |
02:52:36.380
as wrestlers and as Russians.
link |
02:52:38.840
And so that little bit,
link |
02:52:41.760
it's like an opportunity to explore a mind
link |
02:52:44.040
that I'm able to bring to the world.
link |
02:52:47.680
And also, I feel like it makes me a better person
link |
02:52:52.680
just that being that vulnerable
link |
02:52:55.640
and exploring ideas together.
link |
02:52:57.380
I don't know, like good conversation.
link |
02:52:59.820
I don't know how often you have a really good conversation
link |
02:53:01.760
with friends, but like podcasts are like that.
link |
02:53:04.460
And it's deeply moving.
link |
02:53:07.120
It's the best.
link |
02:53:07.960
And what you brought through,
link |
02:53:09.920
I mean, when I saw you sit down with Penrose,
link |
02:53:12.200
a Nobel Prize-winning physicist, and these other folks,
link |
02:53:15.000
it's not just because he has a Nobel,
link |
02:53:16.280
it's that what comes out of his mouth is incredible.
link |
02:53:18.040
And what you were able to hold in that conversation
link |
02:53:22.900
was so much better.
link |
02:53:24.400
Light years beyond what he had with any other interviewer.
link |
02:53:28.880
I don't even want to call you an interviewer
link |
02:53:30.120
because it's really about conversation.
link |
02:53:31.620
Light years beyond what anyone else had been able
link |
02:53:34.460
to engage with him. It was such a beacon of what's possible.
link |
02:53:39.960
And I know that, I think that's what people are drawn to.
link |
02:53:42.440
And there's a certain intimacy
link |
02:53:44.420
that certainly if two people are friends as we are
link |
02:53:47.560
and they know each other, that there's more of that,
link |
02:53:49.860
but there's an intimacy in those kinds
link |
02:53:51.940
of private conversations that are made public.
link |
02:53:55.880
Well, that's the thing. With you,
link |
02:53:57.880
you're probably starting to realize, and with Costello,
link |
02:54:01.200
it's like part of it, because you're authentic
link |
02:54:04.560
and you're putting yourself out there completely,
link |
02:54:07.000
people are almost not just consuming
link |
02:54:10.860
the words you're saying, they also enjoy watching you,
link |
02:54:15.800
Andrew, struggle with these ideas
link |
02:54:19.080
or try to communicate these ideas.
link |
02:54:20.680
They like the flaws.
link |
02:54:21.680
They like a human being exploring ideas.
link |
02:54:24.840
Well, that's good, because I got plenty of those.
link |
02:54:26.480
Well, they like the self-critical aspects,
link |
02:54:28.640
like where you're very careful,
link |
02:54:30.280
where you're very self-critical about your flaws.
link |
02:54:33.040
I mean, in that same way, it's interesting,
link |
02:54:35.440
I think, for people to watch me talk to Penrose,
link |
02:54:37.920
not just because Penrose is communicating ideas,
link |
02:54:42.280
but here's this like silly kid trying to explore ideas.
link |
02:54:46.960
Like they know this kid,
link |
02:54:48.300
and there's a human connection that is really powerful.
link |
02:54:51.240
Same, I think, with Putin, right?
link |
02:54:53.560
Like it's not just a good interview with Putin.
link |
02:54:57.440
It's also, here's this kid struggling
link |
02:55:00.880
to talk with one of the most powerful,
link |
02:55:04.760
some would argue dangerous people in the world,
link |
02:55:08.000
and they love that, the authenticity that led up to that.
link |
02:55:11.920
And in return, I get to connect with everybody I run into
link |
02:55:16.080
on the street and all those kinds of things.
link |
02:55:19.420
There's a depth of connection there,
link |
02:55:21.040
almost within like a minute or two,
link |
02:55:22.920
that's unlike any other.
link |
02:55:24.300
Yeah, there's an intimacy that you've formed with them.
link |
02:55:26.840
Yeah, we've been on this like journey together.
link |
02:55:29.760
I mean, I have the same thing with Joe Rogan
link |
02:55:31.280
before I ever met him, right?
link |
02:55:32.740
Because I was a fan of Joe for so many years,
link |
02:55:36.640
there's something, there's a kind of friendship
link |
02:55:40.880
as absurd as it might be to say in podcasting
link |
02:55:44.440
and listening to podcasts.
link |
02:55:45.960
Yeah, maybe it fills in a little bit of that,
link |
02:55:48.880
or solves a little bit of that loneliness
link |
02:55:50.640
that you're talking about.
link |
02:55:51.480
Until the robots are here.
link |
02:55:54.520
I have just a couple more questions,
link |
02:55:56.520
but one of them is on behalf of your audience,
link |
02:55:59.560
which is, I'm not going to ask you
link |
02:56:02.720
the meaning of the hedgehog,
link |
02:56:04.460
but I just want to know, does it have a name?
link |
02:56:08.460
And you don't have to tell us the name,
link |
02:56:09.920
but just does it have a name, yes or no?
link |
02:56:12.400
Well, there's a name he likes to be referred to as,
link |
02:56:17.520
and then there's a private name
link |
02:56:19.280
that we call each other in the privacy of our own company.
link |
02:56:21.280
No, I'm not that insane.
link |
02:56:24.360
No, his name is Hedgy.
link |
02:56:27.280
He's a hedgehog.
link |
02:56:28.640
I don't like stuffed animals,
link |
02:56:31.620
but his story is one of minimalism.
link |
02:56:35.600
So I've given away everything I own three times now in my life.
link |
02:56:41.120
By everything, I mean almost everything.
link |
02:56:42.960
I kept jeans, a shirt, and a laptop.
link |
02:56:46.140
And recently it's also been guitar, things like that.
link |
02:56:51.240
But he survived because,
link |
02:56:55.280
at least the first two times, he was in the laptop bag,
link |
02:56:58.560
and he just got lucky.
link |
02:57:00.240
And so I just liked the perseverance of that.
link |
02:57:02.960
And the reason I got a stuffed animal,
link |
02:57:05.960
when I don't have other stuffed animals,
link |
02:57:07.360
is that I first saw him
link |
02:57:09.600
in a thrift store,
link |
02:57:12.960
in this giant pile of stuffed animals.
link |
02:57:16.000
And he jumped out at me because unlike all the rest of them,
link |
02:57:20.000
he has this intense mean look about him,
link |
02:57:25.240
like he's just upset at life,
link |
02:57:29.600
at the cruelty of life.
link |
02:57:31.040
And just, especially in contrast to the other
link |
02:57:33.040
stuffed animals, which have this dumb smile on their face.
link |
02:57:35.840
If you look at most stuffed animals,
link |
02:57:37.100
they have this dumb look on their face
link |
02:57:38.720
and they're just happy.
link |
02:57:39.640
It's like Pleasantville.
link |
02:57:40.560
It's what we say in neuroscience,
link |
02:57:41.640
they have a smooth cortex, not many folds.
link |
02:57:44.440
Exactly.
link |
02:57:45.280
And Hedgy, like, saw through all of it.
link |
02:57:48.000
He was like Dostoevsky's man from underground.
link |
02:57:52.000
I mean, there's a sense that he saw the darkness
link |
02:57:54.800
of the world and persevered.
link |
02:57:56.560
So I got him. And there's also a famous Russian cartoon,
link |
02:58:00.720
Hedgehog in the Fog, that I grew up with,
link |
02:58:03.920
that I connected with.
link |
02:58:04.760
There are people who know of that cartoon.
link |
02:58:07.600
You can see it on YouTube.
link |
02:58:09.440
It's like-
link |
02:58:10.280
Hedgehog in the Fog.
link |
02:58:11.100
Yeah.
link |
02:58:13.800
It's just as you would expect,
link |
02:58:15.280
especially from like early Soviet cartoons.
link |
02:58:17.920
It's a hedgehog, like sad, walking through the fog,
link |
02:58:22.880
exploring like loneliness and sadness.
link |
02:58:25.320
It's like, but it's beautiful.
link |
02:58:26.760
It's like a piece of art.
link |
02:58:27.840
People should watch it; even if you don't speak Russian,
link |
02:58:29.600
you'll see, you'll understand.
link |
02:58:31.560
Oh, the moment you said that, I was gonna ask.
link |
02:58:33.920
So it's in Russian? But of course it's in Russian.
link |
02:58:35.440
It's in Russian, but it's more,
link |
02:58:37.240
there's very little speaking in it.
link |
02:58:39.040
It's almost, there's an interesting exploration
link |
02:58:43.040
of how you make sense of the world
link |
02:58:47.360
when you see it only vaguely through the fog.
link |
02:58:52.160
So he's trying to understand the world.
link |
02:58:55.240
Here we have Mickey Mouse.
link |
02:58:56.660
Yeah.
link |
02:58:57.500
We have Bugs Bunny.
link |
02:58:58.320
Yeah.
link |
02:58:59.160
We have all these crazy animals,
link |
02:59:01.520
and you have the hedgehog in the fog.
link |
02:59:03.560
So there's a certain period, and this is, again,
link |
02:59:07.880
I don't know what to attribute it to,
link |
02:59:09.260
but it was really powerful.
link |
02:59:10.800
There's a period in Soviet history,
link |
02:59:12.960
I think probably the 70s and 80s,
link |
02:59:16.040
where like, especially kids were treated very seriously.
link |
02:59:21.280
Like they were treated like they're able to deal
link |
02:59:24.280
with the weightiness of life.
link |
02:59:27.800
And that was reflected in the cartoons.
link |
02:59:31.020
And they were allowed to have really artistic content,
link |
02:59:37.160
not like dumb cartoons that are trying
link |
02:59:39.040
to get you to smile and run around,
link |
02:59:40.960
but like create art.
link |
02:59:42.320
Like stuff that, you know how like short cartoons
link |
02:59:45.300
or short films can win Oscars?
link |
02:59:47.040
Like that's what they're swinging for.
link |
02:59:48.720
So what strikes me about this is a little bit
link |
02:59:51.080
how we were talking about the suit earlier.
link |
02:59:52.760
It's almost like they treat kids with respect.
link |
02:59:55.140
Yeah.
link |
02:59:55.980
Like they have an intelligence
link |
02:59:58.280
and they honor that intelligence.
link |
02:59:59.760
Yeah, they're really just adults in small bodies.
link |
03:00:03.360
Like you want to protect them
link |
03:00:04.600
from the true cruelty of the world,
link |
03:00:06.400
but in terms of their intellectual capacity
link |
03:00:08.440
or like philosophical capacity,
link |
03:00:10.240
they're right there with you.
link |
03:00:11.540
And so the cartoons reflected that,
link |
03:00:14.240
the art that they consumed, the education reflected that.
link |
03:00:17.720
So he represents that.
link |
03:00:19.100
I mean, there's a sense that because he survived so long
link |
03:00:24.100
and because I don't like stuffed animals,
link |
03:00:27.320
that it's like, we've been through all of this together
link |
03:00:30.920
and it's the sense of sharing the moments together.
link |
03:00:32.980
It's the friendship.
link |
03:00:34.340
And there's a sense in which, you know,
link |
03:00:36.200
if all the world turns on you and goes to hell,
link |
03:00:39.000
at least we got each other.
link |
03:00:40.400
That, and he doesn't die because he's an inanimate object.
link |
03:00:44.300
So.
link |
03:00:45.480
Until you animate him.
link |
03:00:47.400
Until you animate him.
link |
03:00:49.060
And then I probably wouldn't want to know
link |
03:00:50.420
what he was thinking about this whole time.
link |
03:00:53.460
He's probably really into Taylor Swift
link |
03:00:55.200
or something like that.
link |
03:00:56.040
It's like, I wouldn't even want to know.
link |
03:00:57.700
Anyway.
link |
03:00:58.540
Well, I now feel a connection to Hedgy the Hedgehog
link |
03:01:02.460
that I certainly didn't have before.
link |
03:01:04.100
And I think that encapsulates the kind
link |
03:01:07.340
of connection that is possible
link |
03:01:11.120
between a human and another object,
link |
03:01:13.620
and through robotics, certainly.
link |
03:01:17.740
There's a saying that I heard when I was a graduate student
link |
03:01:19.580
that's just been ringing in my mind
link |
03:01:22.520
throughout this conversation in such a,
link |
03:01:25.060
I think, appropriate way,
link |
03:01:26.340
which is that Lex, you are in a minority of one.
link |
03:01:31.300
You are truly extraordinary in your ability
link |
03:01:35.140
to encapsulate so many aspects of science, engineering,
link |
03:01:40.180
public communication about so many topics,
link |
03:01:43.740
martial arts, and the emotional depth that you bring to it,
link |
03:01:47.100
and just the purposefulness.
link |
03:01:49.060
And I think if it's not clear to people,
link |
03:01:51.620
it absolutely should be stated,
link |
03:01:53.880
but I think it's abundantly clear
link |
03:01:55.460
that just the amount of time and thinking
link |
03:01:58.400
that you put into things
link |
03:02:01.340
is the ultimate mark of respect.
link |
03:02:04.820
So I'm just extraordinarily grateful for your friendship
link |
03:02:08.020
and for this conversation.
link |
03:02:09.140
I'm proud to be your friend.
link |
03:02:11.140
And I just wish you showed me the same kind of respect
link |
03:02:13.420
by wearing a suit and making your father proud,
link |
03:02:15.840
maybe next time.
link |
03:02:17.500
Next time, indeed.
link |
03:02:19.060
Thanks so much, my friend.
link |
03:02:20.460
Thank you. Thank you, Andrew.
link |
03:02:22.340
Thank you for joining me for my discussion
link |
03:02:24.420
with Dr. Lex Friedman.
link |
03:02:26.280
If you're enjoying this podcast and learning from it,
link |
03:02:29.140
please consider subscribing on YouTube.
link |
03:02:31.380
You can also subscribe to us on Spotify or Apple.
link |
03:02:35.500
Please leave any questions and comments and suggestions
link |
03:02:38.220
that you have for future podcast episodes and guests
link |
03:02:40.840
in the comment section on YouTube.
link |
03:02:43.220
At Apple, you can also leave us up to a five-star review.
link |
03:02:46.780
If you'd like to support this podcast, we have a Patreon.
link |
03:02:49.640
That's patreon.com slash Andrew Huberman.
link |
03:02:52.820
And there, you can support us at any level that you like.
link |
03:02:56.200
Also, please check out our sponsors mentioned
link |
03:02:58.540
at the beginning of the podcast episode.
link |
03:03:00.840
That's the best way to support this podcast.
link |
03:03:03.180
Links to our sponsors can be found in the show notes.
link |
03:03:06.740
And finally, thank you for your interest in science.