
Elon Musk on DOGE, AI, & Are we in a Simulation? | KMP Ep.18

Intermediate · ⏱ 52:58

In this episode, I sit down with @elonmusk for a wide-ranging conversation on DOGE, AI, simulation theory, and much more.

0:00 — Introduction
1:13 — How DOGE Began
2:22 — Has DOGE Been Successful?
3:51 — Would Elon Do DOGE Again?
5:40 — Importing Illegal Immigrants
7:00 — Thoughts on Ilhan Omar
7:56 — AI and Robots Replacing Jobs
9:59 — Elon’s Biggest Irrational Fear
10:17 — Average Hours of Sleep per Night
10:52 — A Day in Elon’s Life
11:55 — Funniest Person Elon Knows
12:37 — Believing in God
12:56 — Last Time Elon Was in the General Public
13:25 — Charlie Kirk Assassination Attempt
13:44 — One Moment Elon Wishes He Could Relive
14:05 — Projects in the Pipeline
15:24 — Evolutionary Hall of Fame
16:27 — Humans Becoming Multi-Planetary
19:00 — Why Fashion Needs to Evolve
21:02 — A Believable Conspiracy Theory
22:58 — Biggest Misconception About Elon
24:26 — The Idea Behind Starbase
26:07 — Visiting Disney More Than 10 Times
26:59 — Elon’s Favorite Age to Parent
27:07 — Is Humanity Inherently Good?
28:53 — One Invention That Has Made Humanity Worse
29:56 — Simulation Theory
31:57 — Is Social Media Making People More Performative?
32:51 — X’s Country-of-Origin Feature
33:48 — Rapid-Fire “Would You Rather”
43:35 — TV Show Elon Is Currently Watching
44:41 — Elon’s Top Motivational Song
44:58 — Read the Instructions or Wing It?
45:09 — Starting From Scratch With $1,000
47:30 — A Random Job Elon Would Enjoy
47:55 — Daily Diet and Favorite Food
50:54 — One Emoji to Describe Elon
51:09 — Dream Dinner Party Guests

Transcript (200 lines)

I think the story of Doge from your
0:01
[music] perspective has never been told.
0:03
Do you think you were successful?
0:04
>> We were a little bit successful.
0:06
We were somewhat successful.
0:07
>> Would you ever do Doge again?
0:08
>> Um I mean, no. I don't think so.
0:12
I think instead of doing Doge, I would have basically
0:15
built, you know, worked on my companies essentially…
0:16
[music] built…
0:18
and not — and the cars would — they wouldn't have been burning the cars.
0:22
>> What's your biggest irrational fear?
0:23
>> I try not to have irrational fears.
0:25
[music] None. If I find an irrational fear, I squelch it.
0:31
If you had to start from scratch today with only $1,000…
0:34
>> Well, I did originally come to North America with like 2500 bucks…
0:38
2500 Canadian, maybe two grand US.
0:41
At this point, I have a lot of knowledge.
0:43
A lot of things have to go wrong for that to be the case.
0:46
It's like, am I just emerging from prison perhaps with a stipend?
1:02
Hi everyone and welcome to this week's episode…
1:03
We are in Texas today joined by Elon Musk.
1:10
>> Nice to see you again Katie.
1:12
>> Nice to see you Elon.
1:14
So I want to take us back. It's January 20th…
1:21
You’re getting sworn in, they hand you a computer and a phone.
1:25
I want to go back to what happened next.
1:27
The story of Doge from your perspective has never been told.
1:31
What was your first thought on how Doge was going to proceed?
1:35
>> Well, um…
1:39
I guess I couldn't believe I was there.
1:41
For the most part it all seemed extremely surreal.
1:48
Doge was a made-up name…
1:50
…made up two or three months before.
1:56
And based on internet suggestions.
2:03
I was going to call it the Government Efficiency Commission…
2:07
Then someone online said: No, it should be the Department of Government Efficiency—DOGE.
2:13
I'm like, “That sounds great.”
2:20
So we just kind of made up a department.
2:22
>> Do you think you were successful?
2:25
>> We were a little bit successful.
2:27
We were somewhat successful.
2:33
We stopped a lot of funding that made no sense.
2:40
Just entirely wasteful.
2:46
For example…
2:49
Probably 100–200 billion dollars of zombie payments per year.
2:53
Which simply by enforcing payment codes…
2:56
…those payments would not go out.
3:03
We made that change to the main Treasury computer.
3:10
Seems insanely obvious.
3:18
There are just 2–3% of government payments that really shouldn't be going out.
3:30
And it's actually quite hard to stop.
3:39
It’s a pretty rare individual who asks the government to stop sending them money.
3:50
>> Would you ever do Doge again?
3:52
>> Um do you mean would I repeat history or…
3:55
>> Two ways to think about it.
3:58
One is if you could go back and start from scratch…
4:00
like it's January 20th all over again. Would
4:02
you go back and do it differently? And
4:04
knowing what you know now, do you think
4:06
there's ever a place to restart
4:09
you? Not saying others in your stead, you
4:12
go back and restart doing Doge.
4:16
[sighs]
4:16
>> I mean, no, I don't think so. Um, would
4:20
I... do I... I think I probably...
4:24
I don't know. Um,
4:26
>> would you do Doge again knowing what you
4:27
know now?
4:29
I mean, the thing is, like, I think
4:31
instead of doing Doge I would have
4:34
basically built...
4:37
you know, worked on my companies
4:39
essentially. So, and not... and the cars,
4:42
they wouldn't have been burning
4:43
the cars. Um
4:46
>> You gave up a lot to do it. Uh, yeah,
4:50
like, if you stop money
4:52
going, uh, to...
4:55
uh,
4:57
going to political corruption,
4:58
they will lash out big time.
5:02
>> Mhm.
5:02
>> Um
5:04
so they really want the money to keep
5:06
flowing. Um,
5:10
so if you stop it from flowing, there's
5:13
like a very strong reaction to
5:17
stopping the money flowing.
5:18
>> After you were in DC for a while, did
5:20
you become disillusioned with how it
5:22
operates?
5:25
>> Well, I wouldn't say I was super
5:27
illusioned to begin with. Uh, I
5:30
mean, I guess it's just like you really
5:32
want the least amount done by government
5:34
possible. The least amount.
5:37
I guess maybe, like, the
5:40
biggest thing is, I guess the
5:42
biggest single thing is, that
5:45
there are massive transfer
5:47
payments going to, um, illegal immigrants
5:53
um
5:54
like, massive. Essentially, we're paying
5:57
people to come here from somewhere else
6:00
um in vast numbers including flying them
6:03
in. So, like it's not like you need a
6:06
border wall if you're flying them in.
6:08
Um, then fast-tracking them to
6:12
citizenship and
6:15
um, making them beholden to government
6:18
payments um and uh
6:22
and voting hard left. That's
6:26
essentially it's like voter importation.
6:30
If you create a gigantic
6:32
money magnet, um... you say, if anyone
6:37
comes here from anywhere else, we're
6:38
going to pay you tons of money, give
6:40
you lots of free stuff. Um
6:43
come to America and get
6:46
paid
6:48
to do so. Um
6:51
like you're going to get a lot of people
6:52
taking up that offer. Um, and people
6:56
say, like, this is fake. I'm
6:57
like, uh, actually, well, let's look at,
7:01
um, you know, Ilhan Omar, who was
7:05
literally voted into power, voted
7:08
into Congress by, uh, you know, a large
7:12
group of people from Somalia, who are in
7:14
Minnesota, which is really far from
7:16
Somalia,
7:18
or Mamdani,
7:20
who was voted in to be mayor.
7:24
But if... but
7:26
>> [clears throat]
7:27
>> by a majority of people who are not um
7:32
born in America.
7:34
That's my understanding at least. Um so
7:38
um, and then California is a big-time,
7:42
um, situation.
7:45
So
7:47
uh, I... we just don't want to turn
7:48
into a um
7:53
you know, communist hellhole, basically.
7:56
You've said that in the future no
7:58
one's going to need to worry about money
7:59
or work because AI is going to take care
8:03
of the rest, AI and robotics. What do
8:05
you mean that people won't have to work
8:06
in the future?
8:08
>> Assuming the current trend of artificial
8:10
intelligence and robotics continues,
8:12
which seems likely, um, AI and robots
8:17
will be able to do anything
8:20
that humans want them to do
8:23
essentially. So hopefully not more than
8:26
that, but
8:28
AI and robotics will be able to
8:31
provide us
8:33
provide all the goods and services that
8:35
anyone could possibly want. So
8:37
>> But you wouldn't need to work.
8:40
Like, what would you do with your free
8:41
time?
8:44
>> People will be able to do
8:45
whatever they want with their free time.
8:47
Um
8:48
work will be optional.
8:51
I mean, I just want to separate out
8:52
from like what I wish would happen
8:54
versus what I predict will happen
8:55
because people get confused about that.
8:56
They think that what I predict will
8:58
happen is what I want to happen.
9:00
>> What I want what I predict to happen is
9:01
not the same as what I want to happen.
9:03
>> Um, if I could, I would...
9:08
I would certainly slow down, uh, AI and
9:12
robotics, but I can't. It seems to be,
9:15
well
9:19
it's advancing at a very rapid
9:21
pace. Um whether I like it or not.
9:23
>> Is AI what keeps you up at night?
9:26
>> It used to. At this point, I don't know.
9:30
I wouldn't say there's
9:33
anything particularly keeping me up at
9:34
night right now, except that.
9:38
But if you say,
9:41
what... where do I wake up in nightmares?
9:42
Oh, AI. Yeah. Actually, [laughter]
9:46
I've had a lot of AI nightmares. Uh,
9:49
I had AI nightmares many days in a row.
9:55
>> What am I supposed to do about it?
9:57
>> What's your biggest irrational fear?
10:00
>> Um, I try not to have irrational
10:02
fears.
10:03
>> None.
10:04
>> If I find an irrational fear, I squelch
10:08
it. I don't believe fear is... fear is
