Sept. 17, 2025
Colocating Data with David Aronchick

David has worked on a lot of cool tech, like Kubernetes and Kubeflow, and he's usually a few years ahead of the game, so catching up with him about what he's working on now is something you'll want to hear before you run into these problems yourself. He has great insights into how to get companies to support open source and how Kubernetes has evolved over time.
1
00:00:00,000 --> 00:00:12,000
Welcome to Fork Around and Find Out, the podcast about building, running and maintaining software and systems.
2
00:00:19,000 --> 00:00:26,000
Hello and welcome to Fork Around and Find Out. I'm your host Justin Garrison and with me as always is Autumn Nash. How's it going Autumn?
3
00:00:26,000 --> 00:00:29,000
I'm just really excited to see what Joe keeps coming up with.
4
00:00:29,000 --> 00:00:34,000
I can't even think of a joke. I'm just trying to think of like we are co-locating our jokes with our smarts today.
5
00:00:34,000 --> 00:00:35,000
Be funny now.
6
00:00:35,000 --> 00:00:38,000
I know. It's like it's too much pressure. This is why I don't do stand-up.
7
00:00:38,000 --> 00:00:39,000
Exactly.
8
00:00:40,000 --> 00:00:43,000
This is why instead of stand-up I just have kids.
9
00:00:43,000 --> 00:00:44,000
Don't lie.
10
00:00:44,000 --> 00:00:46,000
You're interrupting my, like, intro.
11
00:00:46,000 --> 00:00:50,000
You just do like random stand-up.
12
00:00:50,000 --> 00:00:55,000
Yeah. It has to be ad hoc and today on the show we do not have an ad hoc guest.
13
00:00:55,000 --> 00:00:59,000
Today we have David Aronchick, CEO and founder of Expanso. Welcome to the show, David.
14
00:00:59,000 --> 00:01:01,000
Thank you so much. Real pleasure.
15
00:01:01,000 --> 00:01:06,000
David, we met like, I feel like it was like a decade ago. It was like one of the first KubeCons.
16
00:01:06,000 --> 00:01:07,000
Absolutely.
17
00:01:08,000 --> 00:01:09,000
No, for sure.
18
00:01:09,000 --> 00:01:15,000
And we've just kind of been around. It's just infrastructure, Kubernetes, cloud. We've been doing all this.
19
00:01:15,000 --> 00:01:20,000
Tell us about what your journey's been throughout the last decade of doing infrastructure stuff.
20
00:01:20,000 --> 00:01:29,000
I mean, you know, when I was looking back, you know, we've certainly exchanged Twitter and Bluesky messages and so on all these days.
21
00:01:29,000 --> 00:01:35,000
But you're exactly right. You were one of the first Kubernetes adopters.
22
00:01:35,000 --> 00:01:40,000
So my background is I've been doing enterprise and consumer. I'm on my fourth startup.
23
00:01:40,000 --> 00:01:42,000
I'll get to that in a second.
24
00:01:42,000 --> 00:01:47,000
But the most recent arc of my career started just before you and I met.
25
00:01:47,000 --> 00:02:00,000
I worked at Chef, where I was director of product management, leading a bunch of their work, you know, as Docker and all the container stuff was first coming out.
26
00:02:00,000 --> 00:02:08,000
Then I left there to go be the first non-founding PM for Kubernetes, which I did for Google for a bunch of years.
27
00:02:09,000 --> 00:02:14,000
And helped start the Cloud Native Computing Foundation and so on and so forth.
28
00:02:14,000 --> 00:02:18,000
And that's where we met, you know; you, you were at Disney at the time.
29
00:02:18,000 --> 00:02:25,000
You were one of the first big adopters of this from a more traditional enterprise.
30
00:02:25,000 --> 00:02:37,000
And I know Disney is like super forward looking, you know, because of folks like you, but it was really like, you know, no one really understood what this Docker thing was, what these containers were.
31
00:02:37,000 --> 00:02:40,000
And, you know, how is this going to affect me?
32
00:02:40,000 --> 00:02:46,000
But you were, you were absolutely one of the first. Actually, one of the other media properties, not, not Disney's media properties.
33
00:02:46,000 --> 00:02:54,000
Although who knows now, with all this consolidation, but another one of your media cohorts was right after you.
34
00:02:54,000 --> 00:03:01,000
I always remember because HBO, HBO Max, adopted Kubernetes really early as well.
35
00:03:01,000 --> 00:03:08,000
And I remember watching or streaming Game of Thrones on my laptop.
36
00:03:08,000 --> 00:03:12,000
And I was like, oh, my God, this is running on our stuff.
37
00:03:12,000 --> 00:03:14,000
I'm very, very proud of it.
38
00:03:14,000 --> 00:03:17,000
But um, yeah, you know, that's where you and I met.
39
00:03:17,000 --> 00:03:23,000
And so I led, I started GKE, the Google Kubernetes Engine.
40
00:03:23,000 --> 00:03:27,000
And then I did that for a bunch of years.
41
00:03:27,000 --> 00:03:38,000
I then moved from there into starting a machine learning platform called Kubeflow, which has been very popular.
42
00:03:38,000 --> 00:03:52,000
And so I did that, then I left Google to go work at Microsoft to lead open source machine learning strategy for the Azure ML group and out of the Office of the CTO.
43
00:03:52,000 --> 00:04:03,000
And I did that for a few years and now I'm on to my startup, which is, you know, crazy, like I honestly never thought I would go back.
44
00:04:03,000 --> 00:04:12,000
Not because I don't like the startup game, but because like I had a perfectly reasonable job at, like, a big co, like kick your feet up and chill, dude.
45
00:04:12,000 --> 00:04:17,000
Like, but no, I have a vision for the world and I'd like it to exist.
46
00:04:17,000 --> 00:04:20,000
So I'm off doing the startup thing again.
47
00:04:20,000 --> 00:04:24,000
Is there ever really a kick your feet up and chill moment in tech?
48
00:04:24,000 --> 00:04:30,000
You know, it should be. Like, I keep hearing about this, like it exists.
49
00:04:30,000 --> 00:04:33,000
At what point do you get there?
50
00:04:33,000 --> 00:04:35,000
Some people are wired that way.
51
00:04:35,000 --> 00:04:39,000
I wish I was. I truly wish I was.
52
00:04:39,000 --> 00:04:40,000
I didn't, I didn't hear the word.
53
00:04:40,000 --> 00:04:42,000
I didn't hear the word "wired."
54
00:04:42,000 --> 00:04:47,000
I heard "whiter," like my skin color, and I was like, well, yeah, that also probably plays into it too.
55
00:04:47,000 --> 00:04:52,000
Where I've worked with quite a few people that have kicked their feet up and usually they look like me.
56
00:04:52,000 --> 00:04:53,000
Yeah, yeah.
57
00:04:53,000 --> 00:04:56,000
Autumn takes the biggest drink of her coffee.
58
00:04:56,000 --> 00:05:01,000
She's talking idiot.
59
00:05:01,000 --> 00:05:05,000
Take me back a little bit to the GKE creation.
60
00:05:05,000 --> 00:05:07,000
Was that always the intention of Kubernetes?
61
00:05:07,000 --> 00:05:14,000
Like you open sourced it and it felt like it was meant as just like a pure open source play and then just the popularity was there immediately.
62
00:05:14,000 --> 00:05:20,000
So, so it's long enough ago that I think I can say all this stuff without pissing too many people off.
63
00:05:20,000 --> 00:05:25,000
But no, the story here goes back to 2003.
64
00:05:25,000 --> 00:05:26,000
Okay.
65
00:05:26,000 --> 00:05:35,000
And the story is that Google came out and released the Hadoop paper, the MapReduce paper, in 2003.
66
00:05:35,000 --> 00:05:43,000
And Yahoo came along and very helpfully read this wonderful paper, this groundbreaking paper, and said, wow, this sounds really cool.
67
00:05:43,000 --> 00:05:44,000
Let's go launch something around it.
68
00:05:44,000 --> 00:05:46,000
And they created Hadoop.
69
00:05:46,000 --> 00:05:47,000
Right.
70
00:05:47,000 --> 00:05:49,000
And Google was like, oh, this is good.
71
00:05:49,000 --> 00:05:52,000
You know, we're glad people are out there and Google is a very academic place.
72
00:05:52,000 --> 00:05:57,000
So they like really don't take any like ownership over that until it gets to Google Cloud.
73
00:05:57,000 --> 00:06:07,000
And, like, at the time that they launched Google Cloud, they had to create an HDFS compatibility layer for Hadoop.
74
00:06:07,000 --> 00:06:20,000
And what that meant was you had something that Google invented, re-implemented by someone else, implemented on this like compatibility layer that ultimately went through another layer that ultimately was still running on MapReduce.
75
00:06:20,000 --> 00:06:21,000
Right.
76
00:06:21,000 --> 00:06:23,000
And they're like, why the hell did this happen?
77
00:06:23,000 --> 00:06:25,000
Like we could have just done the thing.
78
00:06:25,000 --> 00:06:26,000
Right.
79
00:06:26,000 --> 00:06:28,000
So that's going to be angle one.
80
00:06:28,000 --> 00:06:32,000
They were like, hey, look, if we're going to release something to the world,
81
00:06:32,000 --> 00:06:34,000
let's actually release the thing.
82
00:06:34,000 --> 00:06:35,000
Okay.
83
00:06:35,000 --> 00:06:36,000
So that's category one.
84
00:06:36,000 --> 00:06:41,000
And category two is they saw AWS and they saw it growing.
85
00:06:41,000 --> 00:06:45,000
And they're like, holy shit, you know, this, this, oh, sorry, I don't know if this is a safe-for-work podcast.
86
00:06:45,000 --> 00:06:46,000
No, we're good.
87
00:06:46,000 --> 00:06:47,000
You're good.
88
00:06:47,000 --> 00:06:48,000
We don't, we don't believe in it anymore.
89
00:06:48,000 --> 00:06:49,000
I curse like a sailor.
90
00:06:49,000 --> 00:06:50,000
So you'll have to.
91
00:06:50,000 --> 00:06:54,000
So do I. Justin's very good at not cursing.
92
00:06:54,000 --> 00:06:55,000
I don't have it in me.
93
00:06:55,000 --> 00:06:57,000
I just don't have it in me.
94
00:06:57,000 --> 00:07:05,000
I almost feel like swearing is like, like I look for slightly spicy people because I appreciate their honesty.
95
00:07:05,000 --> 00:07:07,000
Justin's spicy in other ways.
96
00:07:07,000 --> 00:07:08,000
I was going to say that.
97
00:07:08,000 --> 00:07:14,000
Like I'm so excited for you to be here because I love being in between a spicy interviewee
98
00:07:14,000 --> 00:07:15,000
and Justin.
99
00:07:15,000 --> 00:07:16,000
Cause it's.
100
00:07:16,000 --> 00:07:19,000
You're in between there.
101
00:07:19,000 --> 00:07:20,000
All right.
102
00:07:20,000 --> 00:07:21,000
Yeah.
103
00:07:21,000 --> 00:07:24,000
This podcast has taken a very interesting turn.
104
00:07:24,000 --> 00:07:25,000
Let me just say that.
105
00:07:25,000 --> 00:07:32,000
Within five minutes of like meeting you, I was like, David's going to be so much better.
106
00:07:32,000 --> 00:07:34,000
Like, I thought this was a spank around and find out.
107
00:07:34,000 --> 00:07:36,000
And this is not a BDSM podcast.
108
00:07:36,000 --> 00:07:37,000
Right.
109
00:07:37,000 --> 00:07:40,000
Oh, thank God.
110
00:07:40,000 --> 00:07:42,000
Tim is in here.
111
00:07:42,000 --> 00:07:43,000
Oh shit.
112
00:07:43,000 --> 00:07:45,000
My things went all blurred here.
113
00:07:45,000 --> 00:07:46,000
Okay.
114
00:07:46,000 --> 00:07:52,000
Um, so, so then, so then, uh, AWS comes along and they're like killing it.
115
00:07:52,000 --> 00:07:53,000
Right.
116
00:07:53,000 --> 00:07:54,000
And we all look at that.
117
00:07:54,000 --> 00:07:57,000
We're like, Hey, but wait, we have a cloud that we're trying to get going here.
118
00:07:57,000 --> 00:08:01,000
Um, like we think that the right thing here, the right,
119
00:08:01,000 --> 00:08:04,000
or, uh, uh, element is not a VM.
120
00:08:04,000 --> 00:08:05,000
Right.
121
00:08:05,000 --> 00:08:08,000
We think the right element is a container and look at Docker.
122
00:08:08,000 --> 00:08:09,000
They're doing great.
123
00:08:09,000 --> 00:08:10,000
Right.
124
00:08:10,000 --> 00:08:11,000
So let's take.
125
00:08:11,000 --> 00:08:12,000
Docker's.
126
00:08:12,000 --> 00:08:18,000
You know, extension and wisdom, which by the way, again, another thing that Google launched.
127
00:08:18,000 --> 00:08:24,000
Again, no one is saying that Docker wasn't an enormous part in arguably the reason that
128
00:08:24,000 --> 00:08:25,000
containers are successful.
129
00:08:25,000 --> 00:08:30,000
Um, but a lot of it was based on, you know, kernel changes that came in, you know, in
130
00:08:30,000 --> 00:08:32,000
2004 or 2005, right?
131
00:08:32,000 --> 00:08:34,000
Like there's an enormous amount of stuff there.
132
00:08:34,000 --> 00:08:36,000
And so they're like, hey, look, Docker's killing it.
133
00:08:36,000 --> 00:08:39,000
Let's help Docker extend even further.
134
00:08:39,000 --> 00:08:44,000
And let's help people see, you know, that VMs are not the right thing.
135
00:08:44,000 --> 00:08:45,000
It's just not.
136
00:08:45,000 --> 00:08:49,000
And so, you know, again, I was not here during this time.
137
00:08:49,000 --> 00:08:54,000
Craig McLuckie and Joe Beda and Brendan Burns and Tim Hockin and Brian Grant and whatever,
138
00:08:54,000 --> 00:08:58,000
they were like all working on a bunch of stuff internally to Google where they're like, we
139
00:08:58,000 --> 00:09:04,000
think there's a new orchestration paradigm that people should be adopting here.
140
00:09:04,000 --> 00:09:08,000
Um, they were going to build it internally to Google in a project called Omega and you
141
00:09:08,000 --> 00:09:09,000
should go read, everyone.
142
00:09:09,000 --> 00:09:13,000
You should go read Brian Grant's blog history of this.
143
00:09:13,000 --> 00:09:16,000
It's so good and it's so real.
144
00:09:16,000 --> 00:09:22,000
It is like transparently going through it, and he's a nice human, which is like amazing, that he's
145
00:09:22,000 --> 00:09:26,000
like a humble, nice human after doing all of that. Like, he's so smart.
146
00:09:26,000 --> 00:09:28,000
He was guest two or three on this show.
147
00:09:28,000 --> 00:09:30,000
So yeah, he's so good.
148
00:09:30,000 --> 00:09:34,000
When you talk to him, you're just like, dude, you're so smart.
149
00:09:34,000 --> 00:09:37,000
Like he is just so intelligent.
150
00:09:37,000 --> 00:09:41,000
So, so, uh, uh, Brendan is a good friend.
151
00:09:41,000 --> 00:09:45,000
And when I first get to Google, he tells me this thing, which is amazing.
152
00:09:45,000 --> 00:09:50,000
He says your goal inside Google is not to be the smartest person on day one or day two
153
00:09:50,000 --> 00:09:52,000
or day 400, right?
154
00:09:52,000 --> 00:09:53,000
Your goal is the following.
155
00:09:53,000 --> 00:09:56,000
Like you should go and you should come up with a smart idea.
156
00:09:56,000 --> 00:09:58,000
We hired you because you have a smart idea.
157
00:09:58,000 --> 00:09:59,000
Okay.
158
00:09:59,000 --> 00:10:02,000
You should go and you should try and figure out where that idea is because I guarantee
159
00:10:02,000 --> 00:10:04,000
somebody internally has already thought about it.
160
00:10:04,000 --> 00:10:05,000
Right.
161
00:10:05,000 --> 00:10:12,000
And there will be a window between you thinking of this idea and the paper, and that window
162
00:10:12,000 --> 00:10:14,000
will start off at like four years.
163
00:10:14,000 --> 00:10:19,000
Like the idea was four years ago that somebody looked at this and they decided this was a
164
00:10:19,000 --> 00:10:22,000
bad idea or they implemented it or whatever.
165
00:10:22,000 --> 00:10:23,000
And then you should.
166
00:10:23,000 --> 00:10:24,000
Okay.
167
00:10:24,000 --> 00:10:25,000
You get smarter about it.
168
00:10:25,000 --> 00:10:27,000
You read the paper and then you come back and then you do that again.
169
00:10:27,000 --> 00:10:30,000
You're going to come up with another great idea and it'll be two years.
170
00:10:30,000 --> 00:10:31,000
You're like, what?
171
00:10:31,000 --> 00:10:33,000
Two years and then you'll do it again.
172
00:10:33,000 --> 00:10:34,000
It'll be like nine months.
173
00:10:34,000 --> 00:10:35,000
Then you do it again.
174
00:10:35,000 --> 00:10:37,000
It'll be like three months and then you do it again.
175
00:10:37,000 --> 00:10:39,000
You won't find a paper, and then you're like, that.
176
00:10:39,000 --> 00:10:43,000
That is the thing you should go and implement and, and so on and so forth.
177
00:10:43,000 --> 00:10:45,000
So Brendan says this, and I was like, I still take this wisdom with me.
178
00:10:45,000 --> 00:10:48,000
I think it's so interesting, especially in the real world where you can go out and you
179
00:10:48,000 --> 00:10:51,000
can research it and you can figure out why things worked and didn't work and so on and
180
00:10:51,000 --> 00:10:52,000
so forth.
181
00:10:52,000 --> 00:10:55,000
Brian is interesting because he's the other half of the coin.
182
00:10:55,000 --> 00:11:00,000
Like he's the one who will like, he just has canonical knowledge of everything.
183
00:11:00,000 --> 00:11:04,000
And so, whenever I'm trying to come up with a new feature for our platform, or,
184
00:11:04,000 --> 00:11:06,000
you know, hey, you know, why didn't people do this?
185
00:11:06,000 --> 00:11:09,000
I go and talk to Brian or another guy, Eric Brewer.
186
00:11:09,000 --> 00:11:11,000
He's, he's also a really wonderful human.
187
00:11:11,000 --> 00:11:14,000
You should have him on if you haven't already.
188
00:11:14,000 --> 00:11:20,000
And the two of them together, you're just kind of like, oh, you know, what's, what about
189
00:11:20,000 --> 00:11:21,000
this idea?
190
00:11:21,000 --> 00:11:22,000
Oh yeah, we did look at that.
191
00:11:22,000 --> 00:11:26,000
And this is the problem in distributed systems, and consensus did this in this year, and
192
00:11:26,000 --> 00:11:27,000
you're running into this.
193
00:11:27,000 --> 00:11:30,000
And eventually you'll get to a point where they're like, yeah, that's actually not a
194
00:11:30,000 --> 00:11:31,000
bad idea.
195
00:11:31,000 --> 00:11:32,000
And you're like, ah, I'm going to go with it.
196
00:11:32,000 --> 00:11:36,000
I feel like having Brian as a friend has got to be like some sort of life hack because
197
00:11:36,000 --> 00:11:41,000
to be able to bounce ideas off of someone like that, like God.
198
00:11:41,000 --> 00:11:47,000
I mean, I say that I am, I try and collect smart friends like they're fucking Pokemon.
199
00:11:47,000 --> 00:11:48,000
That is that.
200
00:11:48,000 --> 00:11:49,000
Okay.
201
00:11:49,000 --> 00:11:51,000
Like that is like the top tier.
202
00:11:51,000 --> 00:11:56,000
Like if all of the rest of the world is questionable at the moment, having smart friends and good
203
00:11:56,000 --> 00:11:57,000
friends.
204
00:11:57,000 --> 00:11:58,000
100%.
205
00:11:58,000 --> 00:12:02,000
I mean, just having someone who can be honest with you is like brutally important.
206
00:12:02,000 --> 00:12:07,000
I might tease Justin on the internet, but like having good friends, like top tier.
207
00:12:07,000 --> 00:12:09,000
Like if you want to know how to improve your life.
208
00:12:09,000 --> 00:12:11,000
I don't know if that was including or excluding me.
209
00:12:11,000 --> 00:12:13,000
That's kind of, it goes both ways.
210
00:12:13,000 --> 00:12:14,000
Duh.
211
00:12:14,000 --> 00:12:20,000
Like having Justin, but also I have good friends in between your questionable moments of the
212
00:12:20,000 --> 00:12:22,000
fact that you don't drink coffee.
213
00:12:22,000 --> 00:12:25,000
But like, we won't go into that today.
214
00:12:25,000 --> 00:12:28,000
But it's like, I just, you don't drink coffee.
215
00:12:28,000 --> 00:12:30,000
How's that even possible?
216
00:12:30,000 --> 00:12:31,000
Thank you.
217
00:12:31,000 --> 00:12:34,000
Like, like you work in tech and you have children.
218
00:12:34,000 --> 00:12:35,000
What is wrong with you?
219
00:12:35,000 --> 00:12:38,000
He, he does have a Dr. Pepper obsession.
220
00:12:38,000 --> 00:12:39,000
I love Dr. Pepper.
221
00:12:39,000 --> 00:12:43,000
Last night I was at an event and somebody had, like, so it's, it's tech week here in
222
00:12:43,000 --> 00:12:44,000
Seattle and it's been phenomenal.
223
00:12:44,000 --> 00:12:47,000
You live in Seattle and I've never met you, David.
224
00:12:47,000 --> 00:12:49,000
What are you doing this afternoon?
225
00:12:49,000 --> 00:12:51,000
There's like three more events.
226
00:12:51,000 --> 00:12:52,000
What?
227
00:12:52,000 --> 00:12:53,000
It's tech week.
228
00:12:53,000 --> 00:12:54,000
Yeah.
229
00:12:54,000 --> 00:12:55,000
It's tech week, man.
230
00:12:55,000 --> 00:12:58,000
But I was going to say, I was at an event yesterday afternoon. All week
231
00:12:58,000 --> 00:12:59,000
I've been drinking Diet Coke.
232
00:12:59,000 --> 00:13:00,000
Don't get me wrong.
233
00:13:00,000 --> 00:13:01,000
I love Diet Coke.
234
00:13:01,000 --> 00:13:03,000
But like at the same time, like I was at an event and they had Diet Dr. Pepper.
235
00:13:03,000 --> 00:13:06,000
I'm like, oh, you, how do I get in business with you?
236
00:13:06,000 --> 00:13:08,000
Diet Dr. Pepper is amazing.
237
00:13:08,000 --> 00:13:09,000
I love it.
238
00:13:09,000 --> 00:13:12,000
I love how you said, how do I get in business with you?
239
00:13:12,000 --> 00:13:14,000
I'm making this happen.
240
00:13:14,000 --> 00:13:16,000
I totally understand that.
241
00:13:16,000 --> 00:13:18,000
Do you know how hard it is to find Justin a Dr. Pepper?
242
00:13:18,000 --> 00:13:19,000
It's not that hard.
243
00:13:19,000 --> 00:13:23,000
So I continue to annoy him at conferences.
244
00:13:23,000 --> 00:13:26,000
I don't understand how everyone is not drinking Diet Dr. Pepper.
245
00:13:26,000 --> 00:13:27,000
It's so much better.
246
00:13:27,000 --> 00:13:31,000
I went to three different stores at SCaLE so I could, be like, here's a Dr.
247
00:13:31,000 --> 00:13:34,000
Pepper and a Rice Krispie so I can keep stealing your chargers.
248
00:13:34,000 --> 00:13:37,000
The only reason I would go to Texas was to get good Dr. Pepper.
249
00:13:37,000 --> 00:13:39,000
They have the original OG sugar.
250
00:13:39,000 --> 00:13:41,000
Do they have a different Dr. Pepper?
251
00:13:41,000 --> 00:13:45,000
It was invented in Texas and they have real sugar Dr. Pepper.
252
00:13:45,000 --> 00:13:49,000
And so it started to percolate out some other places and one store near me sells it.
253
00:13:49,000 --> 00:13:50,000
And so I go there sometimes.
254
00:13:50,000 --> 00:13:52,000
It's fine.
255
00:13:52,000 --> 00:13:57,000
I said it's fine, but it didn't sound fine at all.
256
00:13:57,000 --> 00:14:00,000
Let me, let me finish up the story so we can get out to other interesting things.
257
00:14:00,000 --> 00:14:04,000
There's too much ADHD here, David.
258
00:14:04,000 --> 00:14:06,000
No shit.
259
00:14:06,000 --> 00:14:09,000
I'm like ADHD, like on ADHD.
260
00:14:10,000 --> 00:14:14,000
So anyhow, so Brian Grant and Brendan and whatever, they come up with these things
261
00:14:14,000 --> 00:14:16,000
and Brendan, you know, literally.
262
00:14:16,000 --> 00:14:19,000
Why are you not in business with Brian?
263
00:14:19,000 --> 00:14:21,000
Because he's got his own thing.
264
00:14:21,000 --> 00:14:22,000
Yeah, he's got his own thing.
265
00:14:22,000 --> 00:14:23,000
Yeah.
266
00:14:23,000 --> 00:14:24,000
I love what he's doing, by the way.
267
00:14:24,000 --> 00:14:25,000
Yeah.
268
00:14:25,000 --> 00:14:29,000
Like the configuration management stuff, and he's with another wonderful friend of mine that
269
00:14:29,000 --> 00:14:31,000
I get ideas off of all the time, Alexis.
270
00:14:31,000 --> 00:14:32,000
Alexis.
271
00:14:32,000 --> 00:14:33,000
Yeah.
272
00:14:33,000 --> 00:14:34,000
ConfigHub.
273
00:14:34,000 --> 00:14:35,000
ConfigHub.
274
00:14:35,000 --> 00:14:36,000
Yeah.
275
00:14:36,000 --> 00:14:37,000
Huge, huge.
276
00:14:37,000 --> 00:14:40,000
I want to be a fly on the wall while you guys are having, like, technical discussions.
277
00:14:40,000 --> 00:14:42,000
Like, can I just sit in the background?
278
00:14:42,000 --> 00:14:43,000
I mean, we never have it.
279
00:14:43,000 --> 00:14:45,000
We never have a technical discussion.
280
00:14:45,000 --> 00:14:50,000
You get in the room and you're like arguing about like how, you know, whatever, blah,
281
00:14:50,000 --> 00:14:52,000
blah, blah, like bad mouth, blah, blah, blah.
282
00:14:52,000 --> 00:14:55,000
And like, oh, you see what these idiots are doing.
283
00:14:55,000 --> 00:14:59,000
I mean, I feel like your group chat is fire.
284
00:14:59,000 --> 00:15:01,000
More group chats.
285
00:15:01,000 --> 00:15:04,000
They're the best.
286
00:15:04,000 --> 00:15:09,000
I mean, our group chat is hilarious.
287
00:15:09,000 --> 00:15:10,000
I don't know.
288
00:15:10,000 --> 00:15:11,000
I don't know.
289
00:15:11,000 --> 00:15:12,000
It's an interesting question.
290
00:15:12,000 --> 00:15:13,000
Finish your story.
291
00:15:13,000 --> 00:15:21,000
Anyhow, so Brendan gets it running on his laptop in Java, like a total skunkworks.
292
00:15:21,000 --> 00:15:23,000
That was kind of a fork.
293
00:15:23,000 --> 00:15:28,000
It wasn't a fork, but it was kind of like a conceptual fork of the thing they were doing
294
00:15:28,000 --> 00:15:29,000
internally to Google.
295
00:15:29,000 --> 00:15:35,000
And then it starts to catch fire and somehow it breaks through, like, because Google was
296
00:15:35,000 --> 00:15:38,000
really internally opposed to Kubernetes.
297
00:15:38,000 --> 00:15:44,000
Not that they were, there was just a lot of motion around like what the hell is going
298
00:15:44,000 --> 00:15:47,000
on and, you know, what kind of team do we spin up?
299
00:15:47,000 --> 00:15:53,000
And then, you know, Craig McLuckie and, like I said, Brendan and Brian and all these people
300
00:15:53,000 --> 00:15:57,000
ended up forcing it through, like, I think releasing it to the world forcibly
301
00:15:57,000 --> 00:16:00,000
and then, you know, just kind of cascaded forward from there.
302
00:16:00,000 --> 00:16:04,000
And so I joined in January of 2015.
303
00:16:04,000 --> 00:16:09,000
And Craig was like, Hey, look, I need someone to take over Kubernetes management for me
304
00:16:09,000 --> 00:16:12,000
because I'm going to go off and work on three other things.
305
00:16:12,000 --> 00:16:14,000
I mean, there's another genius for you.
306
00:16:14,000 --> 00:16:19,000
And so he, he proceeds to go and do that.
307
00:16:19,000 --> 00:16:22,000
And I, like, launched GKE.
308
00:16:22,000 --> 00:16:26,000
And so they're like, Well, all right, we're going to have this open source thing.
309
00:16:26,000 --> 00:16:30,000
We've got to, you know, get this project going.
310
00:16:30,000 --> 00:16:32,000
It had already, like, it had already been written.
311
00:16:32,000 --> 00:16:34,000
There were some early versions and so on and so forth.
312
00:16:34,000 --> 00:16:42,000
But, you know, I started leading it and, and, you know, it was the three, three core pillars of
313
00:16:42,000 --> 00:16:54,000
GKE: compute under Navneet, Paul Nash, who was off running compute, and
314
00:16:55,000 --> 00:16:56,000
totally blanking on his name.
315
00:16:56,000 --> 00:16:59,000
I feel terrible, but like it was the lead for App Engine.
316
00:16:59,000 --> 00:17:00,000
This was, this was 10 years ago.
317
00:17:00,000 --> 00:17:01,000
We're not old.
318
00:17:01,000 --> 00:17:03,000
I know, but I can't remember his name.
319
00:17:03,000 --> 00:17:05,000
I feel really terrible because he was great.
320
00:17:05,000 --> 00:17:08,000
Crazy, like in 10 years, how much things have changed?
321
00:17:08,000 --> 00:17:09,000
Oh, absolutely.
322
00:17:09,000 --> 00:17:10,000
Absolutely.
323
00:17:10,000 --> 00:17:11,000
So anyhow, so that was it.
324
00:17:11,000 --> 00:17:14,000
And it was just like, let's help people adopt containers.
325
00:17:14,000 --> 00:17:18,000
And, and for better or worse, it's not that we're opposed to AWS.
326
00:17:18,000 --> 00:17:20,000
It's just, we don't want people building on VMs.
327
00:17:20,000 --> 00:17:21,000
That was it.
328
00:17:21,000 --> 00:17:25,000
And we think the world is better if, if everyone isn't completely married to a
329
00:17:25,000 --> 00:17:30,000
VM, because a VM is so heavyweight; even the lightest weight VM has to have an
330
00:17:30,000 --> 00:17:36,000
entire kernel to care about your serial port and your Ethernet driver.
331
00:17:36,000 --> 00:17:38,000
And I mean, it's just like, it's insane.
332
00:17:38,000 --> 00:17:42,000
Like let's, let's give people what they want: an isolated environment that allows
333
00:17:42,000 --> 00:17:43,000
you to execute against things.
334
00:17:43,000 --> 00:17:46,000
And, and that was the, the whole idea of the container.
335
00:17:46,000 --> 00:17:49,000
And then obviously letting people do a whole bunch of those at the same time was
336
00:17:49,000 --> 00:17:50,000
really powerful.
337
00:17:50,000 --> 00:17:54,000
And even just to like paint the scene of people that weren't in technology or
338
00:17:54,000 --> 00:17:57,000
weren't doing infrastructure around this time, right?
339
00:17:57,000 --> 00:18:04,000
Like Docker was kind of launched in, in 2014; the first DockerCon was 2014.
340
00:18:04,000 --> 00:18:06,000
So this is still super early.
341
00:18:06,000 --> 00:18:11,000
ECS came out from AWS, which was like basically just like a big Docker engine
342
00:18:11,000 --> 00:18:13,000
in 2014.
343
00:18:13,000 --> 00:18:16,000
So this is within six months of all these other things.
344
00:18:16,000 --> 00:18:21,000
Google already had the app engine, which was already kind of this like,
345
00:18:21,000 --> 00:18:25,000
has sort of, you know, you didn't have to care that it was a container sort of
346
00:18:25,000 --> 00:18:27,000
environment where it's like, Hey, you just bring us your application that looks
347
00:18:27,000 --> 00:18:29,000
like this, we'll run it for you.
348
00:18:29,000 --> 00:18:32,000
No VM, no OS management, all of that stuff is going to work.
349
00:18:32,000 --> 00:18:37,000
And then launching this new, very configurable kind of complex looking
350
00:18:37,000 --> 00:18:41,000
container engine into the world had to have contention because I know like all
351
00:18:41,000 --> 00:18:45,000
the internal Google stuff around Borg is like, well, you can't just ship Borg to
352
00:18:45,000 --> 00:18:46,000
other people.
353
00:18:46,000 --> 00:18:50,000
How do you wrap that to make it easier just like Hadoop?
354
00:18:50,000 --> 00:18:52,000
It must, it must have been like a political struggle.
355
00:18:52,000 --> 00:18:56,000
I think even more than a technical struggle to be able to push that through.
356
00:18:56,000 --> 00:19:00,000
No, I mean, look, you know, we, again, we all forget, right?
357
00:19:00,000 --> 00:19:04,000
But, but Google was not successful in open source at that point, right?
358
00:19:04,000 --> 00:19:06,000
They were very successful publishing papers.
359
00:19:06,000 --> 00:19:13,000
But to that point, they had Android, which they bought and they had Angular.
360
00:19:13,000 --> 00:19:16,000
But other than that, they had Go.
361
00:19:16,000 --> 00:19:17,000
That's true.
362
00:19:17,000 --> 00:19:18,000
They did have Go.
363
00:19:18,000 --> 00:19:19,000
I take that back.
364
00:19:19,000 --> 00:19:20,000
That's the only other thing.
365
00:19:20,000 --> 00:19:23,000
But Go wasn't, Go wasn't as popular as it is now.
366
00:19:23,000 --> 00:19:25,000
It was like, you know what I mean?
367
00:19:25,000 --> 00:19:26,000
Yeah.
368
00:19:26,000 --> 00:19:27,000
Yeah.
369
00:19:27,000 --> 00:19:28,000
Yeah.
370
00:19:28,000 --> 00:19:29,000
Go.
371
00:19:29,000 --> 00:19:31,000
I don't think people know how old Go is because Go got so popular in the last
372
00:19:31,000 --> 00:19:32,000
few years.
373
00:19:32,000 --> 00:19:35,000
And then when they, like, and because so much of Kubernetes and certain
374
00:19:35,000 --> 00:19:37,000
infrastructure is built on it.
375
00:19:37,000 --> 00:19:41,000
Now it's like, I won't say it's like Java, but it's like, you can't avoid
376
00:19:41,000 --> 00:19:42,000
Go in a lot of ways.
377
00:19:42,000 --> 00:19:45,000
So much of the infrastructure tooling was Ruby before that.
378
00:19:45,000 --> 00:19:46,000
Right.
379
00:19:46,000 --> 00:19:47,000
Because Ruby on Rails exploded.
380
00:19:47,000 --> 00:19:48,000
That makes me so mad.
381
00:19:48,000 --> 00:19:49,000
And then there was.
382
00:19:49,000 --> 00:19:52,000
I have like flashbacks.
383
00:19:52,000 --> 00:19:55,000
Chef and Puppet were like, that was like, if you were doing infrastructure,
384
00:19:55,000 --> 00:19:58,000
you were doing config management and you had to know Ruby to be able to write
385
00:19:58,000 --> 00:19:59,000
Chef and Puppet.
386
00:19:59,000 --> 00:20:01,000
Oh, and now so many things at AWS make sense.
387
00:20:01,000 --> 00:20:02,000
Yeah.
388
00:20:02,000 --> 00:20:03,000
Yeah.
389
00:20:03,000 --> 00:20:04,000
Absolutely.
390
00:20:04,000 --> 00:20:06,000
I was like, why would you do this?
391
00:20:06,000 --> 00:20:07,000
Yeah.
392
00:20:07,000 --> 00:20:08,000
Absolutely.
393
00:20:08,000 --> 00:20:10,000
Well, now it's all TypeScript at AWS.
394
00:20:10,000 --> 00:20:11,000
Yeah.
395
00:20:11,000 --> 00:20:14,000
So I mean, like, again, it's, it was like, and so.
396
00:20:14,000 --> 00:20:15,000
I have a question.
397
00:20:15,000 --> 00:20:16,000
Oh, sorry, please.
398
00:20:16,000 --> 00:20:21,000
What do you think is harder, politics trying to get things done internally in
399
00:20:21,000 --> 00:20:24,000
like a mega corporation, or open source?
400
00:20:24,000 --> 00:20:28,000
Because I feel like they're two very different, like.
401
00:20:28,000 --> 00:20:29,000
They're.
402
00:20:29,000 --> 00:20:31,000
You bring up a really interesting question.
403
00:20:31,000 --> 00:20:34,000
I, I, you know, they just are very different.
404
00:20:34,000 --> 00:20:39,000
The nice part about internal politics is there are at least defined motivations.
405
00:20:39,000 --> 00:20:41,000
It's very rare that someone's just absolute chaos.
406
00:20:41,000 --> 00:20:42,000
Right.
407
00:20:42,000 --> 00:20:45,000
Every now and then I'm like, you play D&D, huh?
408
00:20:45,000 --> 00:20:48,000
Cause you are just a chaos goblin for no reason.
409
00:20:48,000 --> 00:20:52,000
Like you just walk in and you're just like, for no reason.
410
00:20:52,000 --> 00:20:53,000
But yeah.
411
00:20:53,000 --> 00:20:54,000
So that's very rare.
412
00:20:54,000 --> 00:20:56,000
And I'm like, I don't know.
413
00:20:56,000 --> 00:20:57,000
I don't know.
414
00:20:57,000 --> 00:20:58,000
I don't know.
415
00:20:58,000 --> 00:20:59,000
I don't know.
416
00:20:59,000 --> 00:21:00,000
I don't know.
417
00:21:00,000 --> 00:21:01,000
I don't know.
418
00:21:01,000 --> 00:21:02,000
I don't know.
419
00:21:02,000 --> 00:21:03,000
Yeah.
420
00:21:03,000 --> 00:21:04,000
So that's very rare internally.
421
00:21:04,000 --> 00:21:08,000
At least you can say, okay, well that person has this job and this VP asked them to do
422
00:21:08,000 --> 00:21:09,000
this.
423
00:21:09,000 --> 00:21:10,000
So like that's a thing.
424
00:21:10,000 --> 00:21:14,000
I don't agree with that thing, but at least you can like unpack what, what they're doing.
425
00:21:14,000 --> 00:21:16,000
I think knowing your audience is important.
426
00:21:16,000 --> 00:21:17,000
100%.
427
00:21:17,000 --> 00:21:18,000
No.
428
00:21:18,000 --> 00:21:19,000
100%.
429
00:21:19,000 --> 00:21:23,000
And, and those politics almost always come down to this:
430
00:21:23,000 --> 00:21:27,000
I'm responsible for a thing and you are risking that thing.
431
00:21:27,000 --> 00:21:30,000
So I'm going to like be a dick to you.
432
00:21:30,000 --> 00:21:31,000
Right.
433
00:21:31,000 --> 00:21:37,000
And so you got to figure out because I love the way that you like, you're like, can we
434
00:21:37,000 --> 00:21:38,000
be friends?
435
00:21:38,000 --> 00:21:40,000
You're like, so this person has motivation.
436
00:21:40,000 --> 00:21:41,000
So they're going to be a dick to you.
437
00:21:41,000 --> 00:21:44,000
Like I felt that in my soul.
438
00:21:44,000 --> 00:21:50,000
So, but, but in open source, all you have is ego, right?
439
00:21:50,000 --> 00:21:53,000
And so ego can be way more irrational.
440
00:21:53,000 --> 00:21:59,000
But sometimes it's like really, like, you got to love the purists, right?
441
00:21:59,000 --> 00:22:04,000
Like, you know, like we, I feel like we all like really care about open source, but every
442
00:22:04,000 --> 00:22:08,000
now and then you get someone and you're like, do you go outside?
443
00:22:08,000 --> 00:22:09,000
Like,
444
00:22:09,000 --> 00:22:13,000
I don't, you know, it's funny.
445
00:22:13,000 --> 00:22:15,000
So let me give you an example.
446
00:22:15,000 --> 00:22:22,000
When I was leading Kubernetes, there was one of the first,
447
00:22:22,000 --> 00:22:24,000
hearty debates we had in public.
448
00:22:24,000 --> 00:22:26,000
I was excited about what you're going to say.
449
00:22:26,000 --> 00:22:28,000
He said "hearty" and the way you perked up.
450
00:22:28,000 --> 00:22:30,000
I was like, this is going to be good.
451
00:22:30,000 --> 00:22:37,000
There was a, I can't remember it because we just introduced job sets.
452
00:22:37,000 --> 00:22:39,000
So I can't remember what the name was.
453
00:22:39,000 --> 00:22:41,000
Oh, maybe it's a stateful set.
454
00:22:41,000 --> 00:22:44,000
But like, I think that's what we were calling it.
455
00:22:44,000 --> 00:22:45,000
We're still calling it that.
456
00:22:45,000 --> 00:22:49,000
But like at the time, 2015, there was a thing called pet set.
457
00:22:49,000 --> 00:22:50,000
Right.
458
00:22:51,000 --> 00:22:52,000
And
459
00:22:52,000 --> 00:22:53,000
Who names these things?
460
00:22:53,000 --> 00:22:55,000
What is a pet set?
461
00:22:55,000 --> 00:22:57,000
So this is the old name for stateful sets.
462
00:22:57,000 --> 00:22:58,000
You bring up the right point.
463
00:22:58,000 --> 00:22:59,000
Right.
464
00:22:59,000 --> 00:23:01,000
So there was a thing called pet set.
465
00:23:01,000 --> 00:23:03,000
And that's because there was a whole idea.
466
00:23:03,000 --> 00:23:05,000
Oh, the whole, are things cattle or pets?
467
00:23:05,000 --> 00:23:06,000
Okay.
468
00:23:06,000 --> 00:23:10,000
Like, because you, you, you're going to put a bullet in a cattle, but you're not going
469
00:23:10,000 --> 00:23:12,000
to put a bullet in a, in a, right.
470
00:23:12,000 --> 00:23:15,000
And so the whole idea was like to keep it around.
471
00:23:15,000 --> 00:23:19,000
And this person submitted a bug to say, Hey, you should change the name of pets.
472
00:23:20,000 --> 00:23:25,000
And they, they go into this big, long explanation, like, look, you know, animal
473
00:23:25,000 --> 00:23:28,000
welfare and this, that and the other and so on and so forth.
474
00:23:29,000 --> 00:23:31,000
And I'm not dismissing.
475
00:23:31,000 --> 00:23:36,000
Like what their feelings were, but like, that's, that's your deal, dude.
476
00:23:37,000 --> 00:23:38,000
That's not our deal.
477
00:23:38,000 --> 00:23:41,000
Like I, I, I get you want that, but we don't.
478
00:23:41,000 --> 00:23:43,000
Like, that's not going to help the project.
479
00:23:43,000 --> 00:23:44,000
The fact that this is the motivation.
480
00:23:44,000 --> 00:23:48,000
Now that said, the name is terrible.
481
00:23:48,000 --> 00:23:50,000
Exactly what you said on it.
482
00:23:50,000 --> 00:23:51,000
What the fuck is that?
483
00:23:51,000 --> 00:23:52,000
What is that?
484
00:23:52,000 --> 00:23:53,000
What does that set even mean?
485
00:23:53,000 --> 00:23:54,000
That you're refusing.
486
00:23:54,000 --> 00:23:56,000
Why are you saying this?
487
00:23:56,000 --> 00:24:00,000
And so this is why having a lot of really smart friends that have been in the
488
00:24:00,000 --> 00:24:02,000
field for a long time is good.
489
00:24:02,000 --> 00:24:06,000
But every now and then a name comes out and I'm like, this is how we know you
490
00:24:06,000 --> 00:24:09,000
don't talk to anybody else besides white guys that have been in tech for forever.
491
00:24:09,000 --> 00:24:12,000
Who else knows what this means?
492
00:24:12,000 --> 00:24:14,000
And, and here you go.
493
00:24:14,000 --> 00:24:16,000
You can go search for it.
494
00:24:16,000 --> 00:24:18,000
It was, I can tell by Justin's face.
495
00:24:18,000 --> 00:24:20,000
He's already searching for it.
496
00:24:20,000 --> 00:24:21,000
Please.
497
00:24:21,000 --> 00:24:22,000
Please.
498
00:24:22,000 --> 00:24:24,000
It's, it's issue 27430.
499
00:24:24,000 --> 00:24:28,000
And this was, for the time, a small community.
500
00:24:28,000 --> 00:24:35,000
This was a, I don't know, 50 comment thread here.
501
00:24:35,000 --> 00:24:39,000
Like it was pretty, like there was a lot of annoying stuff, but I love that there
502
00:24:39,000 --> 00:24:41,000
was a 50 comment thread here.
504
00:24:44,000 --> 00:24:46,000
This is just, this is the epitome of open source.
505
00:24:46,000 --> 00:24:47,000
Okay.
506
00:24:47,000 --> 00:24:49,000
So when you get to the convergence, right?
507
00:24:49,000 --> 00:24:52,000
The, like, convergence zone of, like, open source.
508
00:24:52,000 --> 00:24:53,000
Oh no, sorry.
509
00:24:53,000 --> 00:24:56,000
120, 150, 150 total comments on this.
510
00:24:56,000 --> 00:24:57,000
Shut up.
511
00:24:57,000 --> 00:24:59,000
I love this.
512
00:24:59,000 --> 00:25:01,000
Like insane, insane.
513
00:25:01,000 --> 00:25:05,000
This is like those people who are so into open source that they refuse to use any
514
00:25:05,000 --> 00:25:06,000
proprietary software.
515
00:25:06,000 --> 00:25:08,000
So they refuse to use Google Maps or anything.
516
00:25:08,000 --> 00:25:11,000
And you're just like, bro.
517
00:25:11,000 --> 00:25:13,000
But like, okay.
518
00:25:13,000 --> 00:25:15,000
How do you, okay.
519
00:25:15,000 --> 00:25:19,000
I think that Google was, I won't say one of the first, but you guys, your time at
520
00:25:19,000 --> 00:25:24,000
Google, I imagine that you really learned that convergence zone of proprietary
521
00:25:24,000 --> 00:25:27,000
software and corporate and open source.
522
00:25:27,000 --> 00:25:31,000
And I feel like we're in this kind of, I don't know.
523
00:25:31,000 --> 00:25:34,000
I don't know if I'd say a transition, but weird time.
524
00:25:34,000 --> 00:25:39,000
So like, what do you think about the politics of trying to both balance
525
00:25:39,000 --> 00:25:47,000
corporation politics, but also like interacting with like open source, you
526
00:25:47,000 --> 00:25:52,000
know, because it's very, it's, it makes it even more complicated when you're both
527
00:25:52,000 --> 00:25:56,000
arguing for something internally, but then you have to go to the politics of
528
00:25:56,000 --> 00:25:57,000
open source, right?
529
00:25:57,000 --> 00:26:02,000
Which I think a lot of open source is corporations right now, but when you're
530
00:26:02,000 --> 00:26:07,000
actually the person that's inside doing the arguing with that company, you have
531
00:26:07,000 --> 00:26:12,000
to really know what that company's goals and business and leadership principles
532
00:26:12,000 --> 00:26:16,000
and then fighting it in open source is a whole different battle.
533
00:26:16,000 --> 00:26:21,000
And I feel like it's almost like sometimes if people only work in proprietary
534
00:26:21,000 --> 00:26:25,000
software or they only work in open source, they don't realize what it's like to
535
00:26:25,000 --> 00:26:30,000
kind of, so can you speak on the whole, like how you do that and like your
536
00:26:30,000 --> 00:26:34,000
experience and kind of making those worlds happen?
537
00:26:34,000 --> 00:26:38,000
I think you, you touched on it earlier, right?
538
00:26:38,000 --> 00:26:41,000
It's like, you have to understand their motivations.
539
00:26:41,000 --> 00:26:46,000
You know, people who are, like, in that open source world or in the
540
00:26:46,000 --> 00:26:53,000
corporation are both humans and corporate employees, right?
541
00:26:53,000 --> 00:26:59,000
And so you've got to figure out how to balance all of that mess and the
542
00:26:59,000 --> 00:27:05,000
extent to which you can help them achieve their corporate goals, but still
543
00:27:05,000 --> 00:27:07,000
enable them to be humans.
544
00:27:07,000 --> 00:27:11,000
I mean, that's, that's the sweet spot you're like looking to do.
545
00:27:11,000 --> 00:27:15,000
I think trying to teach open source to people that are used to corporate
546
00:27:15,000 --> 00:27:20,000
America, you know, and corporate internals and then trying to like show
547
00:27:20,000 --> 00:27:22,000
the business value of open source.
548
00:27:22,000 --> 00:27:26,000
It's like both hard, but like my favorite thing because there's so much
549
00:27:26,000 --> 00:27:32,000
value in open source and trying to get people to invest, but also understand
550
00:27:32,000 --> 00:27:36,000
the mindset of people that work in open source and what open source customers
551
00:27:36,000 --> 00:27:42,000
want because it's very different than proprietary like software, right?
552
00:27:42,000 --> 00:27:46,000
So trying to teach companies that are into proprietary software.
553
00:27:46,000 --> 00:27:49,000
Like I think Google is really interesting because they, they do, they're
554
00:27:49,000 --> 00:27:52,000
very academic in the papers and kind of where that flows.
555
00:27:52,000 --> 00:27:57,000
But if you look at it like Kubernetes is one of the like most thought out, like
556
00:27:57,000 --> 00:28:01,000
well built open source projects, which I think is going to like really
557
00:28:01,000 --> 00:28:03,000
change how that's done.
558
00:28:03,000 --> 00:28:07,000
Like if you look at how, like, the Linux Foundation and Kubernetes work, like
559
00:28:07,000 --> 00:28:10,000
you can tell that the Rust Foundation is really taking that into account when
560
00:28:10,000 --> 00:28:11,000
building it.
561
00:28:11,000 --> 00:28:16,000
So I feel like it's going to like change the future of how those like, so
562
00:28:16,000 --> 00:28:20,000
what's it like kind of seeing that from like the start and how corporate and
563
00:28:20,000 --> 00:28:23,000
open source kind of started this marriage and how we're still trying to
564
00:28:23,000 --> 00:28:24,000
navigate that.
565
00:28:24,000 --> 00:28:31,000
I mean, I think the thing is like it's the extent to which you become part of
566
00:28:31,000 --> 00:28:38,000
the critical chain of a corporate supply chain is where you start to do it.
567
00:28:38,000 --> 00:28:39,000
Right.
568
00:28:39,000 --> 00:28:44,000
So like, you know, to use a terrible example, you know, analogy here, right?
569
00:28:44,000 --> 00:28:45,000
It's like energy.
570
00:28:45,000 --> 00:28:46,000
Okay.
571
00:28:46,000 --> 00:28:48,000
You know, corporations don't care about energy.
572
00:28:48,000 --> 00:28:49,000
They don't care about electricity.
573
00:28:49,000 --> 00:28:50,000
Most of them don't, right?
574
00:28:50,000 --> 00:28:53,000
They're just like, all right, I plug in my laptop and it works and that allows me
575
00:28:53,000 --> 00:28:55,000
to, you know, produce some Excel files.
576
00:28:55,000 --> 00:29:00,000
If you ask them to like give a shit about, you know, what, what the
577
00:29:00,000 --> 00:29:06,000
transformer up the street is or what the, you know, greenness of your, you know,
578
00:29:06,000 --> 00:29:10,000
you know, power coming in is they're just like, Hey, that sounds great, but like
579
00:29:10,000 --> 00:29:14,000
unless they're very forward looking, they're just not going to care.
580
00:29:14,000 --> 00:29:20,000
Now, if you can align it to whatever their business goals are, then you're
581
00:29:20,000 --> 00:29:22,000
going to be off to the races.
582
00:29:22,000 --> 00:29:24,000
And so if you're like, Hey, you know what?
583
00:29:24,000 --> 00:29:30,000
It turns out that putting a 10 kilowatt battery on every retail outlet means
584
00:29:30,000 --> 00:29:35,000
that, you know, they get fewer brownouts or something like that.
585
00:29:35,000 --> 00:29:39,000
And that improves the ability to sell and great, they're going to do that.
586
00:29:39,000 --> 00:29:45,000
And now it happens to forward your other goals of being green and resilient and,
587
00:29:45,000 --> 00:29:49,000
and, you know, getting rid of fossil fuels and whatever, but like, that's
588
00:29:49,000 --> 00:29:51,000
not what you don't sell it that way.
589
00:29:51,000 --> 00:29:55,000
You sell it as, as part of this other thing that they care about.
590
00:29:55,000 --> 00:29:57,000
Open source is the same way, right?
591
00:29:57,000 --> 00:30:02,000
Like, you know, you're not, it is very unlikely that you're going to be able to
592
00:30:02,000 --> 00:30:05,000
walk into someone and say, Hey, you know what, you should do this because this
593
00:30:05,000 --> 00:30:08,000
is a social good and you need to support the Linux kernel.
594
00:30:08,000 --> 00:30:12,000
You need to support, you know, nginx or whatever, because of X.
595
00:30:12,000 --> 00:30:18,000
What you need to say is like, Hey, do you know that like 84% of our stack runs on
596
00:30:18,000 --> 00:30:22,000
this open source project and we have zero upstream developers on it?
597
00:30:22,000 --> 00:30:24,000
Like that seems like a supply chain risk.
598
00:30:24,000 --> 00:30:28,000
We are not going to go and build this better than they do.
599
00:30:28,000 --> 00:30:35,000
Uh, so let's put someone on maintaining it or, or allocate some dollars or whatever
600
00:30:35,000 --> 00:30:37,000
it is, because that's a supply chain component.
601
00:30:37,000 --> 00:30:39,000
And again, that's just one example.
602
00:30:39,000 --> 00:30:40,000
There are many other reasons.
603
00:30:40,000 --> 00:30:45,000
Like I'm so passionate about like trying to explain that to corporate America
604
00:30:45,000 --> 00:30:49,000
because I feel like it's the only way that we're going to move forward with this
605
00:30:49,000 --> 00:30:54,000
kind of, like, this new change in open source, rather than trying to license
606
00:30:54,000 --> 00:30:57,000
everything and trying to make people pay for it in a different way.
607
00:30:57,000 --> 00:31:00,000
I don't think we're going to get that forward movement that they think they
608
00:31:00,000 --> 00:31:01,000
want.
609
00:31:01,000 --> 00:31:06,000
But I think really showing like, uh, corporate, like companies, like
610
00:31:06,000 --> 00:31:10,000
70% of infrastructure I think is built on open source.
611
00:31:10,000 --> 00:31:14,000
And I think really showing people the value of like, Hey, contribute these
612
00:31:14,000 --> 00:31:19,000
things upstream, learn to get into the political, the politics and contributing
613
00:31:19,000 --> 00:31:21,000
and being a part of the community.
614
00:31:21,000 --> 00:31:25,000
And like it for one, it's like everybody is doing more with less right now.
615
00:31:25,000 --> 00:31:26,000
Right.
616
00:31:26,000 --> 00:31:30,000
So the more that you're contributing upstream, everybody's on the same page.
617
00:31:30,000 --> 00:31:32,000
It's easier to maintain software together.
618
00:31:32,000 --> 00:31:34,000
There's so much actual business value.
619
00:31:34,000 --> 00:31:35,000
Absolutely.
620
00:31:35,000 --> 00:31:41,000
And I just feel so passionately about trying to get people on that page
621
00:31:41,000 --> 00:31:45,000
because I think that we can, for one, we can get people paid to be
622
00:31:45,000 --> 00:31:46,000
maintainers, right?
623
00:31:46,000 --> 00:31:47,000
Absolutely.
624
00:31:47,000 --> 00:31:50,000
But we can all, like people are going to be doing this development anyways.
625
00:31:50,000 --> 00:31:53,000
And instead of taking from open source and internally developing it,
626
00:31:53,000 --> 00:31:59,000
contributing it back to open source is not only going to make you a better
627
00:31:59,000 --> 00:32:04,000
steward of that community, but also why maintain it solo and like
628
00:32:04,000 --> 00:32:06,000
siloed when you can maintain it together.
629
00:32:06,000 --> 00:32:09,000
Like look at how Log4j was, like, so, you know,
630
00:32:09,000 --> 00:32:14,000
I mean, I would say also maintaining things solo is easier than trying to
631
00:32:14,000 --> 00:32:18,000
get the, like, you slow down sometimes in what you're doing,
632
00:32:18,000 --> 00:32:22,000
but, like, it depends on how big the project is.
633
00:32:22,000 --> 00:32:23,000
Yeah, absolutely.
634
00:32:23,000 --> 00:32:26,000
There's a tipping point because like at some point, like if you're going to
635
00:32:26,000 --> 00:32:29,000
like legacy software, like Java, Linux, all these places.
636
00:32:29,000 --> 00:32:31,000
Thank you for saying Java's legacy.
637
00:32:31,000 --> 00:32:34,000
Why are you always trying to hurt my soul?
638
00:32:34,000 --> 00:32:35,000
You said it.
639
00:32:35,000 --> 00:32:37,000
I just said how good you were friends.
640
00:32:37,000 --> 00:32:39,000
Oh yeah, she is.
641
00:32:39,000 --> 00:32:40,000
Terrible.
642
00:32:40,000 --> 00:32:41,000
Terrible.
643
00:32:41,000 --> 00:32:42,000
Come on.
644
00:32:42,000 --> 00:32:43,000
Join the 21st century.
645
00:32:43,000 --> 00:32:44,000
I'm still IT.
646
00:32:44,000 --> 00:32:47,000
In my defense, I haven't gotten to write Java in God knows how long.
647
00:32:47,000 --> 00:32:52,000
So like apparently I'm a Python-and-everything-else head, and decrepit C
648
00:32:52,000 --> 00:32:54,000
and every open source.
649
00:32:54,000 --> 00:32:55,000
Yeah.
650
00:32:55,000 --> 00:32:56,000
But like, you know what I mean?
651
00:32:56,000 --> 00:32:58,000
Like if you think about it, the amount.
652
00:32:58,000 --> 00:32:59,000
Okay.
653
00:32:59,000 --> 00:33:06,000
In this world as engineers in 2025, everybody is so, like, under headcount.
654
00:33:06,000 --> 00:33:09,000
We haven't had headcount for years, right?
655
00:33:09,000 --> 00:33:13,000
We are doing more with less to get the extreme knowledge that you need for
656
00:33:13,000 --> 00:33:15,000
some of these open source projects.
657
00:33:15,000 --> 00:33:17,000
Think about how big Kubernetes is.
658
00:33:17,000 --> 00:33:22,000
Think about how big Linux is, how big Java is, to get that type
659
00:33:22,000 --> 00:33:26,000
of specialized knowledge.
660
00:33:26,000 --> 00:33:28,000
You're not going to just, unless you're buying a
661
00:33:28,000 --> 00:33:33,000
maintainer for a million dollars, like, and you'd have to have multiple of them,
662
00:33:33,000 --> 00:33:34,000
right?
663
00:33:34,000 --> 00:33:39,000
So if you can get, if you are putting that money towards getting developers to
664
00:33:39,000 --> 00:33:45,000
learn how to, like, become parts of those ecosystems, you now are, like, force
665
00:33:45,000 --> 00:33:50,000
multiplying your developers because you're maintaining it and you're
666
00:33:50,000 --> 00:33:51,000
contributing to this ecosystem.
667
00:33:51,000 --> 00:33:54,000
So if you have four corporations that are huge and they have the smartest
668
00:33:54,000 --> 00:33:59,000
minds and they're all now adding to this open source project, really, that's a
669
00:33:59,000 --> 00:34:03,000
force multiplier because when you have a horrible bug like log4j or something,
670
00:34:03,000 --> 00:34:08,000
if you have four smart, huge, like not huge, but smart teams, right, that are
671
00:34:08,000 --> 00:34:12,000
working at the same problem, that's more now secure than it would have been if
672
00:34:12,000 --> 00:34:13,000
you siloed that.
673
00:34:13,000 --> 00:34:14,000
Yeah.
674
00:34:14,000 --> 00:34:15,000
You know what I mean?
675
00:34:15,000 --> 00:34:18,000
That's like the whole reason for the CNCF as a foundation, right?
676
00:34:18,000 --> 00:34:21,000
So that these big corporations can work together in a neutral place because...
677
00:34:21,000 --> 00:34:25,000
But we have to teach people that because a lot of corporate America just thinks
678
00:34:25,000 --> 00:34:29,000
of like, I'm going to go pull this repo down, pretend like I didn't pull it down.
679
00:34:29,000 --> 00:34:34,000
And then, you know, and it's like, bro, it's better for your business.
680
00:34:34,000 --> 00:34:39,000
It is more business value for you to contribute these things upstream.
681
00:34:39,000 --> 00:34:42,000
I know you're going to have a little bit of politics, but hire someone who knows
682
00:34:42,000 --> 00:34:43,000
how to do that.
683
00:34:43,000 --> 00:34:46,000
They are in the market right now and they could use a job, you know what I mean?
684
00:34:46,000 --> 00:34:50,000
And do the work because it really is a force multiplier when you look at it.
685
00:34:50,000 --> 00:34:51,000
You know what I mean?
686
00:34:51,000 --> 00:34:55,000
So David, we're going to skip over everything you did at Microsoft because it doesn't work.
687
00:34:55,000 --> 00:34:57,000
And we're just going to jump right into it.
688
00:34:57,000 --> 00:34:58,000
I want to know all the things.
689
00:34:58,000 --> 00:34:59,000
Tell us all the cool things.
690
00:34:59,000 --> 00:35:01,000
I will talk for as long as you'd like.
691
00:35:01,000 --> 00:35:03,000
I'll come back for part two or whatever.
692
00:35:03,000 --> 00:35:07,000
I'm not that interesting, but I love hearing my own voice because I'm narcissistic or
693
00:35:07,000 --> 00:35:08,000
I don't know.
694
00:35:08,000 --> 00:35:14,000
I'm not going to give you any tea that you'd like about any of these places; I haven't
695
00:35:14,000 --> 00:35:15,000
gotten into the politics.
696
00:35:15,000 --> 00:35:16,000
All right.
697
00:35:16,000 --> 00:35:17,000
Kubeflow.
698
00:35:17,000 --> 00:35:18,000
There's a lot of drama there.
699
00:35:18,000 --> 00:35:24,000
Oh, you are like my favorite type of human because not only do you have all the intellect,
700
00:35:24,000 --> 00:35:27,000
but you actually have a personality, right?
701
00:35:27,000 --> 00:35:32,000
Because like, bro, you all know that sometimes you get the engineers and you're just like,
702
00:35:32,000 --> 00:35:33,000
oh, good Lord.
703
00:35:33,000 --> 00:35:39,000
Pulling any kind of personality and socialness out of them is just like, it's so hard.
704
00:35:39,000 --> 00:35:41,000
That is incredibly kind.
705
00:35:41,000 --> 00:35:42,000
I don't know.
706
00:35:42,000 --> 00:35:45,000
I don't know if I deserve that, but thank you.
707
00:35:45,000 --> 00:35:47,000
So I will skip over all that stuff.
708
00:35:47,000 --> 00:35:48,000
What else would you like to?
709
00:35:48,000 --> 00:35:50,000
So I'm kind of curious.
710
00:35:50,000 --> 00:35:57,000
What point did you decide that data co-located with compute was a problem to solve?
711
00:35:57,000 --> 00:35:59,000
And what are you doing that's new?
712
00:35:59,000 --> 00:36:04,000
Out of all the interesting questions you could ask David, like all the tea, look at his eyes.
713
00:36:04,000 --> 00:36:10,000
Like they're just, it's hiding behind the eyes and it wants to come out.
714
00:36:10,000 --> 00:36:15,000
And the funny part is like, it's funny that we started with Kubernetes because that was it.
715
00:36:15,000 --> 00:36:16,000
Right?
716
00:36:16,000 --> 00:36:20,000
It's one of these things where you're like, when you hear something for like 10 years
717
00:36:20,000 --> 00:36:25,000
and you're like, oh yeah, I actually heard this problem a million years ago.
718
00:36:25,000 --> 00:36:27,000
So a little bit more history for you.
719
00:36:27,000 --> 00:36:32,000
When I was at Google, like, one of the first features that I launched, one of
720
00:36:32,000 --> 00:36:36,000
the first PRDs I wrote when I was on the Kubernetes team, was for a
721
00:36:36,000 --> 00:36:38,000
product called Ubernetes.
722
00:36:38,000 --> 00:36:43,000
And it was the idea of how do you federate Kubernetes clusters together?
723
00:36:43,000 --> 00:36:44,000
Right?
724
00:36:44,000 --> 00:36:45,000
How do you have an API server?
725
00:36:45,000 --> 00:36:47,000
I'm sad this didn't work out because the name is cool.
726
00:36:47,000 --> 00:36:49,000
I mean, it was like, it was genius.
727
00:36:49,000 --> 00:36:51,000
It was a genius name, Ubernetes, right?
728
00:36:51,000 --> 00:36:57,000
But the problem was that these things don't work together, right?
729
00:36:57,000 --> 00:36:58,000
Kubernetes is incredible.
730
00:36:58,000 --> 00:37:04,000
It's not going anywhere, but it is built for a world in which nodes have continual connectivity
731
00:37:04,000 --> 00:37:06,000
between themselves and the API server.
732
00:37:06,000 --> 00:37:12,000
And if it goes away for even a small amount of time, Kubernetes is really unhappy.
733
00:37:12,000 --> 00:37:20,000
And, and so then you had someone come along like Brian from Chick-fil-A who has that amazing
734
00:37:20,000 --> 00:37:25,000
2017 talk about how they have a Kubernetes cluster in every single Chick-fil-A.
735
00:37:25,000 --> 00:37:27,000
And they still have that today, right?
736
00:37:27,000 --> 00:37:28,000
It's incredible.
737
00:37:28,000 --> 00:37:29,000
Doesn't Walmart have that too?
738
00:37:29,000 --> 00:37:30,000
Yeah.
739
00:37:30,000 --> 00:37:32,000
Almost everybody, them and a bunch of other people.
740
00:37:32,000 --> 00:37:35,000
And it's all kind of insane, right?
741
00:37:35,000 --> 00:37:38,000
Like you're kind of like, hey, you know, there's something weird about this, right?
742
00:37:38,000 --> 00:37:41,000
Why isn't there a platform that sits over the top?
743
00:37:41,000 --> 00:37:46,000
And, you know, when I was thinking about it, it really was around, you know, data, right?
744
00:37:46,000 --> 00:37:50,000
And it's the, it's the idea that like data is the challenge here.
745
00:37:50,000 --> 00:37:54,000
I like to say like there, there are three things that, that will never change, right?
746
00:37:54,000 --> 00:37:58,000
One, data growing, not really in dispute, data will continue to grow.
747
00:37:58,000 --> 00:38:01,000
But, but the key is that it will grow everywhere, right?
748
00:38:01,000 --> 00:38:06,000
Not in a central data center in Iowa or Oregon or Tokyo, but, you know,
749
00:38:06,000 --> 00:38:10,000
cross zone, cross region, cross cloud on Prem, Edge, IoT, blah, blah, blah, blah, blah.
750
00:38:10,000 --> 00:38:15,000
Like data is coming from all those places and Kubernetes is not in those places,
751
00:38:15,000 --> 00:38:17,000
nor are any of these other giant data warehouses.
752
00:38:17,000 --> 00:38:21,000
They're all sitting inside of your massive data center as they should.
753
00:38:21,000 --> 00:38:24,000
But somehow that data is there and you got to figure out how to get it in.
754
00:38:24,000 --> 00:38:26,000
And it can't just be a log shipper.
755
00:38:26,000 --> 00:38:27,000
Cause guess what?
756
00:38:27,000 --> 00:38:29,000
Look at what happened with log 4j.
757
00:38:29,000 --> 00:38:30,000
Exactly what you're saying earlier.
758
00:38:30,000 --> 00:38:33,000
You ship raw stuff from the edge.
759
00:38:33,000 --> 00:38:35,000
to your central data warehouse.
760
00:38:35,000 --> 00:38:36,000
Bam.
761
00:38:36,000 --> 00:38:37,000
Security vulnerability.
762
00:38:37,000 --> 00:38:38,000
Guaranteed.
763
00:38:38,000 --> 00:38:40,000
And we haven't even gotten into the other things, right?
764
00:38:40,000 --> 00:38:44,000
So that number one, that's a beautiful number two, speed of light, not getting any faster.
765
00:38:44,000 --> 00:38:45,000
Right?
766
00:38:45,000 --> 00:38:46,000
Just it is what it is.
767
00:38:46,000 --> 00:38:52,000
In 10,000 years, it will still be 49 millisecond ping time between Boston and LA.
768
00:38:52,000 --> 00:38:53,000
It just will.
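For intuition, here's a rough back-of-the-envelope check on that figure; the distance and the fiber slowdown below are approximations for illustration, not numbers quoted on the show:

```python
# Back-of-the-envelope check on the Boston <-> LA latency floor.
# Distance and fiber slowdown are rough approximations for illustration.
C_VACUUM_KM_PER_S = 299_792   # speed of light in vacuum
FIBER_SLOWDOWN = 1.47         # light in glass travels ~1/1.47 of c
DISTANCE_KM = 4_200           # rough great-circle Boston -> Los Angeles

one_way_s = DISTANCE_KM * FIBER_SLOWDOWN / C_VACUUM_KM_PER_S
rtt_ms = 2 * one_way_s * 1_000
print(f"best-case fiber RTT: {rtt_ms:.0f} ms")  # ~41 ms before any routing overhead
```

Real paths aren't great circles and routers add delay, which is how you land in the ~49 ms neighborhood; no amount of compute changes that term.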
769
00:38:53,000 --> 00:38:58,000
And so if you want to do anything that is faster than that, you're going to need a system
770
00:38:58,000 --> 00:39:01,000
that like can take the action remotely.
771
00:39:01,000 --> 00:39:04,000
But on top of that, like networking is just not keeping up.
772
00:39:04,000 --> 00:39:07,000
And it's not because they aren't out there busting their ass.
773
00:39:07,000 --> 00:39:09,000
It's cause data is growing even faster.
774
00:39:09,000 --> 00:39:11,000
And then the CAP theorem gets you every time.
775
00:39:11,000 --> 00:39:12,000
Sorry, go ahead.
776
00:39:12,000 --> 00:39:14,000
The CAP theorem gets you every time.
777
00:39:14,000 --> 00:39:15,000
CAP theorem gets you every time.
778
00:39:15,000 --> 00:39:16,000
Thank you very much.
779
00:39:16,000 --> 00:39:20,000
So then the third is around security and regulations and those are growing, right?
780
00:39:20,000 --> 00:39:22,000
So GDPR and HIPAA and things like that.
781
00:39:22,000 --> 00:39:27,000
You, you tend to put yourself at risk the moment you move a bit off wherever you generated it.
782
00:39:27,000 --> 00:39:28,000
Right?
783
00:39:28,000 --> 00:39:32,000
Now you, you're a wonderful segue because that's exactly it.
784
00:39:32,000 --> 00:39:37,000
Every major platform today is built around the C and the A of the CAP theorem, right?
785
00:39:37,000 --> 00:39:42,000
Consistency, or consensus, whatever you want to say, and availability.
786
00:39:42,000 --> 00:39:43,000
Right.
787
00:39:43,000 --> 00:39:44,000
That's amazing.
788
00:39:44,000 --> 00:39:46,000
Something should be built around the other half of it.
789
00:39:46,000 --> 00:39:49,000
Availability and support for network partitioning.
790
00:39:49,000 --> 00:39:52,000
And that's because of all those things I just said.
791
00:39:52,000 --> 00:39:56,000
And, and, you know, when you go and look at the Chick-fil-A example or the Home Depot or
792
00:39:56,000 --> 00:40:00,000
the millions of other folks out there who have these multiple deployments, retail outlets,
793
00:40:00,000 --> 00:40:04,000
manufacturing, security, et cetera, et cetera, this is the problem.
794
00:40:04,000 --> 00:40:05,000
Right?
795
00:40:05,000 --> 00:40:06,000
Because that data is over there.
796
00:40:06,000 --> 00:40:09,000
I want to do things declaratively.
797
00:40:09,000 --> 00:40:14,000
I want to take action over my data before I move it, but I still want it to move.
798
00:40:14,000 --> 00:40:21,000
So how do I do that when the network could go away for a minute, an hour, a day, because
799
00:40:21,000 --> 00:40:25,000
I'm going to put a backhoe through something who knows what, I want those systems and those
800
00:40:25,000 --> 00:40:26,000
systems to keep working.
801
00:40:26,000 --> 00:40:30,000
But when they reconnect, I want someone to like eventually bring this to consistency.
802
00:40:30,000 --> 00:40:31,000
And that's what we provided.
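As a rough illustration of that reconnect-and-reconcile pattern, here is a minimal sketch; it is not Expanso's implementation, and the class and field names are invented:

```python
# Minimal sketch of "keep working while partitioned, reconcile on reconnect".
# This illustrates the pattern only; it is not Expanso's implementation.
import time

class EdgeNode:
    def __init__(self):
        self.desired = {}   # last declarative spec received from the control plane
        self.pending = []   # locally observed events not yet reported upstream

    def apply(self, spec: dict):
        """Apply desired state locally; works with or without connectivity."""
        self.desired.update(spec)

    def record(self, event: str):
        """Queue events while the network is away (a minute, an hour, a day)."""
        self.pending.append((time.time(), event))

    def reconcile(self, upstream: list):
        """On reconnect, flush local history so both sides converge."""
        upstream.extend(self.pending)
        self.pending.clear()

node = EdgeNode()
node.apply({"pipeline": "schematize-v2"})   # keeps running while disconnected
node.record("sensor-7 over temperature")
control_plane_log: list = []
node.reconcile(control_plane_log)           # eventual consistency on reconnect
print(control_plane_log)
```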
803
00:40:31,000 --> 00:40:38,000
I also think that's even more value because for one, the more you spread your data, the
804
00:40:38,000 --> 00:40:40,000
more you're spreading your attack surface.
805
00:40:40,000 --> 00:40:41,000
You know what I mean?
806
00:40:41,000 --> 00:40:42,000
Absolutely.
807
00:40:42,000 --> 00:40:46,000
And then I think that networking and security are both things that developers aren't always
808
00:40:46,000 --> 00:40:50,000
educated on and they're very like in depth areas, right?
809
00:40:50,000 --> 00:40:56,000
So the more that you can do in those areas to set them up for success, the better because
810
00:40:56,000 --> 00:41:01,000
and now that we're going to have more agents just, you know.
811
00:41:01,000 --> 00:41:02,000
Doing whatever agents do?
812
00:41:02,000 --> 00:41:03,000
Yeah.
813
00:41:03,000 --> 00:41:08,000
Well, the thing is, we are struggling to teach developers security,
814
00:41:08,000 --> 00:41:09,000
right?
815
00:41:09,000 --> 00:41:13,000
So now if they don't understand all of the security and then they are giving permissions
816
00:41:13,000 --> 00:41:19,000
to agents that they already don't understand, it's just a recipe for, like, the more that
817
00:41:19,000 --> 00:41:23,000
you're scaling this, yeah, you're scaling it, but you're scaling disaster in some ways.
818
00:41:23,000 --> 00:41:24,000
You know what I mean?
819
00:41:24,000 --> 00:41:30,000
So it's like, I think that this has, like, so much value just on so many levels.
820
00:41:30,000 --> 00:41:32,000
I mean, I'm sorry, please.
821
00:41:32,000 --> 00:41:36,000
David, what you just described is basically just the, all the benefits of edge computing,
822
00:41:36,000 --> 00:41:37,000
right?
823
00:41:37,000 --> 00:41:38,000
Yeah.
824
00:41:38,000 --> 00:41:41,000
Like we can, we can, how do we get more compute at the edge?
825
00:41:41,000 --> 00:41:42,000
How do we make it more powerful?
826
00:41:42,000 --> 00:41:45,000
How do we make it easier to manage from a central place?
827
00:41:45,000 --> 00:41:48,000
So what is it that you're doing that's different than like, I don't know what to say, traditional
828
00:41:48,000 --> 00:41:53,000
edge computing, but, like, the idea behind edge computing is just, put compute
829
00:41:53,000 --> 00:41:58,000
closer, shave milliseconds, have better storage, whatever, to where the data's
830
00:41:58,000 --> 00:41:59,000
being created.
831
00:41:59,000 --> 00:42:00,000
Yeah.
832
00:42:00,000 --> 00:42:02,000
So, you know, that's, we don't provide the edge compute.
833
00:42:02,000 --> 00:42:07,000
That is my, what do you call it here, by the way, my visual aid, my Raspberry Pi, right?
834
00:42:07,000 --> 00:42:08,000
That we run great on.
835
00:42:08,000 --> 00:42:11,000
Well, I don't know why you're coming apart here.
836
00:42:11,000 --> 00:42:12,000
That's weird.
837
00:42:12,000 --> 00:42:14,000
It's a Raspberry Pi.
838
00:42:14,000 --> 00:42:15,000
Yeah, exactly.
839
00:42:15,000 --> 00:42:17,000
But no, you're exactly right.
840
00:42:17,000 --> 00:42:21,000
And what we provide is the distributed consensus layer.
841
00:42:21,000 --> 00:42:28,000
And what that means is, like, it turns out that that thing that you put on the edge is
842
00:42:28,000 --> 00:42:31,000
wonderful, but how do I know what is there?
843
00:42:31,000 --> 00:42:33,000
How do I know it stays running?
844
00:42:33,000 --> 00:42:35,000
How do I change the configuration?
845
00:42:35,000 --> 00:42:39,000
How do I do all this in a network and disconnected friendly way?
846
00:42:39,000 --> 00:42:41,000
That is the challenge of distributed computing.
847
00:42:41,000 --> 00:42:44,000
That is 40 years of academic research.
848
00:42:44,000 --> 00:42:51,000
And what we give you is a Kubernetes- or container-orchestrator-like experience, but one that
849
00:42:51,000 --> 00:42:55,000
is resilient to eventual consistency.
850
00:42:55,000 --> 00:42:59,000
And so if something happened on our side while you were disconnected, if something happened
851
00:42:59,000 --> 00:43:03,000
on that side while you were disconnected, we handle it. You know, one
852
00:43:03,000 --> 00:43:08,000
of the big things that we keep seeing is what we call intelligent data pipelines, right?
853
00:43:08,000 --> 00:43:15,000
Where you can much more intelligently do some, not all, of your processing before you move
854
00:43:15,000 --> 00:43:16,000
it.
855
00:43:16,000 --> 00:43:21,000
So a trivial example I talk about all the time is a lot of times factory owners, for example,
856
00:43:21,000 --> 00:43:24,000
will have all these sensors all over the place and they're great.
857
00:43:24,000 --> 00:43:30,000
But the sensors usually come straight from a warehouse in Shenzhen and get installed immediately,
858
00:43:30,000 --> 00:43:31,000
right?
859
00:43:31,000 --> 00:43:36,000
And they have no information, you know, no GPS, no, no schema.
860
00:43:36,000 --> 00:43:41,000
A lot of times they'll just output, like, a raw text string and you jam that
861
00:43:41,000 --> 00:43:43,000
into a backend, right?
862
00:43:43,000 --> 00:43:48,000
Well, geez, the moment you do that, you now have to reverse engineer all of the information
863
00:43:48,000 --> 00:43:50,000
that you lost along the way, right?
864
00:43:50,000 --> 00:43:52,000
What kind of machine was it running on?
865
00:43:52,000 --> 00:43:53,000
What was the firmware?
866
00:43:53,000 --> 00:43:54,000
What was the disk?
867
00:43:54,000 --> 00:43:57,000
What was, you know, what, where in the room was it?
868
00:43:57,000 --> 00:43:59,000
You know, so on and so forth.
869
00:43:59,000 --> 00:44:02,000
And so if you took something, right, you took one of these Raspberry Pis, you stuck it inside
870
00:44:02,000 --> 00:44:08,000
that factory and you said, Hey, you know what, before you send this raw into the back end,
871
00:44:08,000 --> 00:44:14,000
send it to this local machine or like local being like within the region and attach some
872
00:44:14,000 --> 00:44:19,000
metadata to it and do some initial data model enforcement and do some schematization.
873
00:44:19,000 --> 00:44:23,000
So change it from a flat text string into JSON or, you know, structured logging or whatever
874
00:44:23,000 --> 00:44:26,000
and take your pick, but still go to the backend.
875
00:44:26,000 --> 00:44:31,000
All you're doing is moving it. Like I say, you know, you may have a 17-step ETL pipeline
876
00:44:31,000 --> 00:44:34,000
and all your enterprise customers are like, Yeah, right.
877
00:44:34,000 --> 00:44:35,000
Add a zero buddy, right?
878
00:44:35,000 --> 00:44:42,000
But like, you know, take, you take those first, I don't know, four or five steps of your pipeline,
879
00:44:42,000 --> 00:44:48,000
data and model enforcement, schematization, adding metadata, adding provenance, adding location,
880
00:44:48,000 --> 00:44:53,000
filtering, aggregation, just do some of those things before you move it.
881
00:44:53,000 --> 00:44:59,000
And magically, all those things downstream become better, faster, smarter.
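A minimal sketch of what those first few steps can look like on the edge box; the raw format, field names, and schema tag here are hypothetical:

```python
# Sketch of the first few ETL steps running at the edge: parse the raw
# sensor string, enforce a schema, attach metadata and provenance, then
# ship structured JSON instead of raw text. Formats and fields are invented.
import json
import socket
import time

RAW = "73.4 21.9"   # imagine "temperature humidity" as a flat text string

def enrich(raw: str) -> str:
    temperature, humidity = (float(x) for x in raw.split())  # schema enforcement
    record = {
        "temperature_f": temperature,
        "humidity_pct": humidity,
        "collected_at": time.time(),         # metadata the sensor never had
        "edge_host": socket.gethostname(),   # provenance: which box saw it
        "schema": "factory-sensor/v1",
    }
    return json.dumps(record)

print(enrich(RAW))   # this, not the raw string, goes to the backend
```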
882
00:44:59,000 --> 00:45:01,000
You can multi-home stuff.
883
00:45:01,000 --> 00:45:05,000
So a lot of times, for example, you know, you might have all these sensors pushing
884
00:45:05,000 --> 00:45:11,000
into the back end, no matter how fast your pipeline is, it might take, you know, five,
885
00:45:11,000 --> 00:45:15,000
10 minutes to ultimately go all the way through the pipeline, very, very common, not because
886
00:45:15,000 --> 00:45:19,000
the pipeline isn't like busting its ass, but because it needs to aggregate from all these
887
00:45:19,000 --> 00:45:21,000
sources before it does anything.
888
00:45:21,000 --> 00:45:27,000
Imagine that you have a factory floor where four different sensors are simultaneously saying,
889
00:45:27,000 --> 00:45:31,000
Hey, you know what, we're above our temperature threshold, right?
890
00:45:31,000 --> 00:45:35,000
Um, do you want to wait 10 minutes to know that?
891
00:45:35,000 --> 00:45:39,000
Wouldn't it be nice if you could trigger an event from that location?
892
00:45:39,000 --> 00:45:40,000
We can do that for you.
893
00:45:40,000 --> 00:45:44,000
And again, it's just by taking some of this and moving that out there and saying, Hey,
894
00:45:44,000 --> 00:45:48,000
you know what, we're still going to send all the raw data back, but simultaneously, we're
895
00:45:48,000 --> 00:45:52,000
also going to, you know, trigger PagerDuty or whatever, take your pick,
896
00:45:52,000 --> 00:45:56,000
you know, any kind of other endpoint, SQS, we're going to trigger that from this
897
00:45:56,000 --> 00:45:57,000
location.
898
00:45:57,000 --> 00:45:59,000
We can help you do that too.
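Sketched out, that local trigger is just a small aggregation rule running next to the sensors; the threshold, sensor names, and stubbed alert sink below are made up for illustration:

```python
# Sketch: fire an alert from the edge the moment enough sensors agree,
# instead of waiting on the central pipeline to aggregate everything.
THRESHOLD_F = 100.0
REQUIRED_SENSORS = 4

readings = {"s1": 103.2, "s2": 101.7, "s3": 100.4, "s4": 102.9, "s5": 98.1}

def fire_alert(over: dict) -> None:
    # In practice: POST to PagerDuty, publish to SQS, hit any endpoint.
    print(f"ALERT: {len(over)} sensors over {THRESHOLD_F}F: {sorted(over)}")

over_limit = {k: v for k, v in readings.items() if v > THRESHOLD_F}
if len(over_limit) >= REQUIRED_SENSORS:
    fire_alert(over_limit)   # fires locally in milliseconds...
# ...while the raw readings still ship to the warehouse unchanged.
```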
899
00:45:59,000 --> 00:46:02,000
Again, we're not like stopping the rest of it.
900
00:46:02,000 --> 00:46:05,000
We're not even, you know, we can save you money if you want, we can reduce data if you
901
00:46:05,000 --> 00:46:10,000
want, we can do other things, but even just putting us there helps you be intelligent
902
00:46:10,000 --> 00:46:12,000
in a more distributed way.
903
00:46:12,000 --> 00:46:18,000
And to your point earlier about what edge compute, uh, doesn't do here, it's not that
904
00:46:18,000 --> 00:46:20,000
edge compute isn't critical to this.
905
00:46:20,000 --> 00:46:23,000
We can't operate without some form of edge compute.
906
00:46:23,000 --> 00:46:27,000
It's really about the orchestration of that job.
907
00:46:27,000 --> 00:46:32,000
So again, let's say you're like, Hey, you know what, I want to change this from being
908
00:46:32,000 --> 00:46:37,000
four, you know, sensors going bad, to five sensors going bad.
909
00:46:37,000 --> 00:46:39,000
Imagine what that involves today.
910
00:46:39,000 --> 00:46:40,000
Right.
911
00:46:40,000 --> 00:46:41,000
How do you actually push that down?
912
00:46:41,000 --> 00:46:44,000
How do you know what version of this job is running?
913
00:46:44,000 --> 00:46:48,000
How do you know what the last time you saw this error is, whatever it might be.
914
00:46:48,000 --> 00:46:51,000
Um, all of that is hard to get down to these places.
915
00:46:51,000 --> 00:46:56,000
And we give you a clean API that is resilient to these networks that gives you, you know,
916
00:46:56,000 --> 00:47:01,000
full and rich intelligent pipelines at the edge and helps you push that stuff through.
917
00:47:01,000 --> 00:47:06,000
Uh, and by the way, when people talk about the next trillion devices that are out there,
918
00:47:06,000 --> 00:47:09,000
all of which are doing inference, either way, we do that too.
919
00:47:09,000 --> 00:47:10,000
Right.
920
00:47:10,000 --> 00:47:16,000
Like, because at the end of the day, you know, inference is just remote data being processed
921
00:47:16,000 --> 00:47:19,000
and we help you do that as well.
922
00:47:19,000 --> 00:47:25,000
I think this is going to be really cool too, because now with AI and all the different,
923
00:47:25,000 --> 00:47:27,000
we want to get data from everything.
924
00:47:27,000 --> 00:47:32,000
It's the promise of the future, but to be able to analyze all that, people don't understand
925
00:47:32,000 --> 00:47:34,000
distributed systems.
926
00:47:34,000 --> 00:47:39,000
Like, especially when it comes to it, so IoT is one aspect,
927
00:47:39,000 --> 00:47:42,000
but then you have the retail aspect where you don't want to charge people's cards more
928
00:47:42,000 --> 00:47:43,000
than once.
929
00:47:43,000 --> 00:47:47,000
So you have to have that eventual consistency and really worry about how you're doing it. So, like,
930
00:47:47,000 --> 00:47:51,000
I went and made an order from Happy Lemon the other day and I was on my way to a tattoo
931
00:47:51,000 --> 00:47:55,000
appointment, and I was trying to speed this up because I'm always late to those,
932
00:47:55,000 --> 00:48:00,000
but, but I ended up getting there and I'm like, why isn't my order like ready?
933
00:48:00,000 --> 00:48:05,000
So I go and buy it and then all of a sudden my order comes through and I think it was
934
00:48:05,000 --> 00:48:08,000
because they had, like, a connection issue, and I'm just sitting there like, something's going
935
00:48:08,000 --> 00:48:14,000
on in your back end, you have an eventually consistent, like, database, but I'm sitting
936
00:48:14,000 --> 00:48:18,000
there at Happy Lemon, like, trying to figure out what's wrong with their back end.
937
00:48:18,000 --> 00:48:22,000
This is, this is why we're friends, because I often will do the same thing.
938
00:48:22,000 --> 00:48:24,000
I was diagnosing their problem.
939
00:48:24,000 --> 00:48:26,000
Is this a backup job?
940
00:48:26,000 --> 00:48:28,000
Is this, do you need more workers?
941
00:48:28,000 --> 00:48:30,000
Is this a Kafka thing?
942
00:48:30,000 --> 00:48:35,000
Yes, but see, the thing is, okay, so people don't realize that
943
00:48:35,000 --> 00:48:40,000
when you're streaming that amount of data, like David's talking about, you get now this
944
00:48:40,000 --> 00:48:45,000
bottleneck when all of it starts to come back in, and it's so bad.
945
00:48:45,000 --> 00:48:50,000
So the fact that you're doing it there, it's almost like the concept of breaking
946
00:48:50,000 --> 00:48:53,000
things up, just kind of compartmentalizing it.
947
00:48:53,000 --> 00:48:55,000
What's the word I'm looking for?
948
00:48:55,000 --> 00:48:59,000
Like, wouldn't you break up things when people want to do like service oriented or kind of
949
00:48:59,000 --> 00:49:04,000
like, so you're not a monolith, but you're kind of in different services.
950
00:49:04,000 --> 00:49:05,000
Yes.
951
00:49:05,000 --> 00:49:07,000
Not, but what's the other way to do it?
952
00:49:07,000 --> 00:49:08,000
Not microservices.
953
00:49:08,000 --> 00:49:10,000
So a service-oriented architecture.
954
00:49:10,000 --> 00:49:15,000
Sort of, but like basically you're keeping it so that way when one area breaks, it doesn't
955
00:49:15,000 --> 00:49:17,000
completely break everything else.
956
00:49:17,000 --> 00:49:21,000
But if you're processing this, like, for one, people are bad at processing
957
00:49:21,000 --> 00:49:25,000
and schemas and all of that stuff anyways, but if you're doing some of the work in these
958
00:49:25,000 --> 00:49:30,000
individual places, then when you get backed up and you send it all to the same pipeline,
959
00:49:30,000 --> 00:49:34,000
you're now not creating the same stress and bottleneck on your pipelines.
960
00:49:34,000 --> 00:49:38,000
And because we're going to get more and more and more data and people are going to want
961
00:49:38,000 --> 00:49:43,000
to like do all these crazy things with it, like that's great, but it's going to cause
962
00:49:43,000 --> 00:49:44,000
more and more stress.
963
00:49:44,000 --> 00:49:49,000
Like we keep making, like at this point, like Jenkins is not made for the amount of crazy
964
00:49:49,000 --> 00:49:51,000
stuff that like, cause think about it.
965
00:49:51,000 --> 00:49:53,000
It wasn't made for anything.
966
00:49:53,000 --> 00:49:56,000
It wasn't, but it was originally made for Java a million years ago.
967
00:49:56,000 --> 00:50:01,000
And now people are trying to use it in this new modern way or just pipeline services that
968
00:50:01,000 --> 00:50:05,000
were not made for this amount of heavy data streaming that we're doing, you know, so
969
00:50:05,000 --> 00:50:06,000
we're
970
00:50:06,000 --> 00:50:08,000
David, what do you think the next bottleneck is?
971
00:50:08,000 --> 00:50:09,000
Right?
972
00:50:09,000 --> 00:50:13,000
Cause I do think that data is the obvious one, and connectivity too, especially if you're looking
973
00:50:13,000 --> 00:50:14,000
at edge, right?
974
00:50:14,000 --> 00:50:18,000
You're like, Oh, there's some capacity limitation in an edge environment, whether that's compute,
975
00:50:18,000 --> 00:50:19,000
whether that's data.
976
00:50:19,000 --> 00:50:23,000
That's kind of like what we were talking about with LA too, though, about how
977
00:50:23,000 --> 00:50:24,000
hardware is moving faster.
978
00:50:24,000 --> 00:50:30,000
Like I don't know if the different parts of computing like are in sync with how some
979
00:50:30,000 --> 00:50:33,000
are moving so fast, you know, it's interesting to see.
980
00:50:33,000 --> 00:50:34,000
Yeah.
981
00:50:34,000 --> 00:50:35,000
Which one do you think is going to outpace the other thing?
982
00:50:35,000 --> 00:50:38,000
Cause like you said, uh, the speed of light is never going to get faster, but
983
00:50:38,000 --> 00:50:40,000
the pipes are getting a little bigger.
984
00:50:40,000 --> 00:50:44,000
But is that a, do we just need better compute to compress that?
985
00:50:44,000 --> 00:50:50,000
No, I, I, I, again, my personal opinion is, uh, first, you know, it's, it's not about
986
00:50:50,000 --> 00:50:51,000
the pipes.
987
00:50:51,000 --> 00:50:52,000
It's about the latency.
988
00:50:52,000 --> 00:50:53,000
Right.
989
00:50:53,000 --> 00:50:54,000
And that will never change.
990
00:50:54,000 --> 00:50:58,000
No amount of compute will ever improve latency, cause the speed of light can't get any faster.
991
00:50:58,000 --> 00:51:01,000
They'll lie to you and say, well, if you just do
992
00:51:01,000 --> 00:51:05,000
this, and I'm just like, no, that's not how that works.
993
00:51:05,000 --> 00:51:11,000
But, but, but that said, you know, the, the, the fact is, is that the compute, one of the
994
00:51:11,000 --> 00:51:16,000
things that I talked to a lot of people about is like, the compute is also unused.
995
00:51:16,000 --> 00:51:22,000
Like it's just, you know, a lot of this stuff, again, this Raspberry Pi can, can do a ridiculous
996
00:51:22,000 --> 00:51:27,500
amount of throughput, like ridiculous, um, uh, far more than people would think.
997
00:51:27,500 --> 00:51:30,500
And you're like, well, shit, you know, I already have it out there.
998
00:51:30,500 --> 00:51:31,500
It's already doing this other stuff.
999
00:51:31,500 --> 00:51:33,500
I might as well.
1000
00:51:33,500 --> 00:51:36,500
Um, and what I would say is I will contest your point.
1001
00:51:36,500 --> 00:51:40,500
Like, yeah, pipes are getting better, but data is getting bigger even faster.
1002
00:51:40,500 --> 00:51:41,500
That's all I'm saying.
1003
00:51:41,500 --> 00:51:45,500
And so it really is, like, we're going to be making just useless
1004
00:51:45,500 --> 00:51:50,500
data, because people, I think they just really believe data is the most valuable,
1005
00:51:50,500 --> 00:51:56,500
you know, commodity, but also because we have all of these sensors and we have all of this
1006
00:51:56,500 --> 00:51:58,500
AI trying to make all of this data.
1007
00:51:58,500 --> 00:52:03,500
I think we're going to end up, we're going to have so much compute
1008
00:52:03,500 --> 00:52:08,500
power with all these data centers that, like, people just, I don't think they
1009
00:52:08,500 --> 00:52:11,500
realize how much infrastructure and data are growing.
1010
00:52:11,500 --> 00:52:12,500
Yeah.
1011
00:52:12,500 --> 00:52:13,500
Totally agree.
1012
00:52:13,500 --> 00:52:14,500
Totally agree.
1013
00:52:14,500 --> 00:52:20,500
So all I want to say though, Justin, is, you know, it
1014
00:52:20,500 --> 00:52:26,500
will be a challenge in a year or two years, five years time to have anything, even this
1015
00:52:26,500 --> 00:52:30,500
Raspberry Pi, not have acceleration on it, right?
1016
00:52:30,500 --> 00:52:33,500
Like just at current power, right?
1017
00:52:33,500 --> 00:52:37,500
It will still have acceleration, and maybe that's because of the system-on-a-chip or whatever
1018
00:52:37,500 --> 00:52:38,500
it might be.
1019
00:52:38,500 --> 00:52:44,500
But then you're like, it's not that I don't want to use my Blackwell plus plus, you know,
1020
00:52:44,500 --> 00:52:48,500
whatever as a central thing, but why don't I have it work on the more interesting problems
1021
00:52:48,500 --> 00:52:53,500
and have whatever GPU is sitting on this thing do some of the work?
1022
00:52:53,500 --> 00:53:01,500
Like, I gave a demo earlier this week that showed onboarding from 20 nodes,
1023
00:53:01,500 --> 00:53:08,500
on three different clouds using Expanso, to BigQuery,
1024
00:53:08,500 --> 00:53:09,500
for example, right?
1025
00:53:09,500 --> 00:53:10,500
As a backend.
1026
00:53:10,500 --> 00:53:16,500
And, uh, I was able to push, uh, 27 million rows in under 40 seconds, right?
1027
00:53:16,500 --> 00:53:17,500
From all of these things.
1028
00:53:17,500 --> 00:53:21,500
And that's not because, you know, there was something magical happening here.
1029
00:53:21,500 --> 00:53:27,500
It was because I was adding together the aggregate of all that bandwidth at the same time.
1030
00:53:27,500 --> 00:53:32,500
And there's just, there's no way to like make that any faster.
1031
00:53:32,500 --> 00:53:38,500
Like no amount of network will ever achieve what you can do with the same network multiplied
1032
00:53:38,500 --> 00:53:41,500
times, you know, the number of nodes I have, right?
1033
00:53:41,500 --> 00:53:42,500
It's just the way that works.
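Working the demo's numbers through (taking the stated figures at face value) shows why aggregation wins:

```python
# The demo's numbers worked through: aggregate throughput is per-node
# throughput multiplied by node count (figures as stated on the show).
rows, seconds, nodes = 27_000_000, 40, 20

total_rows_per_s = rows / seconds                 # 675,000 rows/s in aggregate
per_node_rows_per_s = total_rows_per_s / nodes    # 33,750 rows/s per node
print(f"{total_rows_per_s:,.0f} rows/s total, {per_node_rows_per_s:,.0f} per node")
```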
1034
00:53:42,500 --> 00:53:44,500
So like I would argue, yeah.
1035
00:53:44,500 --> 00:53:49,500
I was just like, that's exactly the age-old problem we've had in infrastructure where
1036
00:53:49,500 --> 00:53:54,500
you can vertically scale one machine and say, I need, I need this big Oracle database on
1037
00:53:54,500 --> 00:53:59,500
that one machine and it needs a terabyte of memory, or you can go with, give me five different
1038
00:53:59,500 --> 00:54:02,500
racks and I'm going to spread that amount of memory across them.
1039
00:54:02,500 --> 00:54:07,500
And there is overhead for the coordination, but we've found it's just better performance
1040
00:54:07,500 --> 00:54:11,500
in general for resiliency and for all these other things.
1041
00:54:11,500 --> 00:54:17,500
So being able to spread out that and aggregate it is obviously like we can't make single
1042
00:54:17,500 --> 00:54:18,500
machines big enough.
1043
00:54:18,500 --> 00:54:22,500
We can't make single pipes big enough that are economical, right?
1044
00:54:22,500 --> 00:54:24,500
Cause we can do like Japan just broke a record.
1045
00:54:24,500 --> 00:54:27,500
They're doing like a, like you could download all of Netflix in under a minute.
1046
00:54:27,500 --> 00:54:28,500
Right.
1047
00:54:28,500 --> 00:54:31,500
It's like, yeah, it's like petabyte of throughput, but I can't buy that and no one's going to,
1048
00:54:31,500 --> 00:54:32,500
no one's going to buy that one.
1049
00:54:32,500 --> 00:54:35,500
They're like, actually, I'm just going to go spend it on 10 gig networks everywhere
1050
00:54:35,500 --> 00:54:37,500
rather than one giant pipe.
1051
00:54:37,500 --> 00:54:38,500
I think this is interesting.
1052
00:54:38,500 --> 00:54:42,500
I think it's interesting if you think about like this conversation we're having with
1053
00:54:42,500 --> 00:54:47,500
LA about how much hardware has advanced, right?
1054
00:54:47,500 --> 00:54:50,500
And now everybody really wants to run their own stuff.
1055
00:54:50,500 --> 00:54:55,500
But I think that we were almost, like, optimizing the different hardware because
1056
00:54:55,500 --> 00:54:57,500
for a long time everyone was using cloud.
1057
00:54:57,500 --> 00:55:02,500
So they were building it for these cloud companies and then for AI making this really advanced
1058
00:55:02,500 --> 00:55:06,500
hardware that people weren't playing with and experimenting as much because they were using
1059
00:55:06,500 --> 00:55:07,500
it in the cloud.
1060
00:55:07,500 --> 00:55:11,500
And now I think people are getting almost, like, reacquainted with hardware and what
1061
00:55:11,500 --> 00:55:13,500
hardware can do.
1062
00:55:13,500 --> 00:55:17,500
It's really interesting to see how people are going to push the limits with this hardware
1063
00:55:17,500 --> 00:55:22,500
because it's so optimized for AI and cloud and all these different places that it was
1064
00:55:22,500 --> 00:55:23,500
being used in.
1065
00:55:23,500 --> 00:55:29,500
And now developers and startups are getting that actual hardware back in their hands.
1066
00:55:29,500 --> 00:55:30,500
Yeah.
1067
00:55:30,500 --> 00:55:31,500
And I think it's going to be interesting.
1068
00:55:31,500 --> 00:55:35,500
Like you said, like what Raspberry Pi can do, you know what I mean?
1069
00:55:35,500 --> 00:55:36,500
Yeah.
1070
00:55:36,500 --> 00:55:37,500
Absolutely.
1071
00:55:37,500 --> 00:55:38,500
Absolutely.
1072
00:55:38,500 --> 00:55:41,500
And there's a threshold of when that hardware advancement becomes just universally available.
1073
00:55:41,500 --> 00:55:43,500
That's what I'm saying because there's so much of it.
1074
00:55:43,500 --> 00:55:48,500
This is so cheap that it's economical for me just to like redevelop something so that
1075
00:55:48,500 --> 00:55:49,500
it uses that.
1076
00:55:49,500 --> 00:55:53,500
I remember at Disney Animation at one point we were switching out hardware; we switched
1077
00:55:53,500 --> 00:55:57,500
out racks and racks of servers because the new chip had this.
1078
00:55:57,500 --> 00:56:01,500
I forget what the math was, but it was some function that we do a lot in rendering on
1079
00:56:01,500 --> 00:56:02,500
the CPU.
1080
00:56:02,500 --> 00:56:08,500
And we're like, actually, we will just render this movie like 10% faster by swapping out
1081
00:56:08,500 --> 00:56:09,500
all these CPUs.
1082
00:56:09,500 --> 00:56:13,000
And it's going to cost us millions of dollars to swap out all these CPUs, but we're going
1083
00:56:13,000 --> 00:56:15,500
to get the movie done in time versus delaying it.
1084
00:56:15,500 --> 00:56:17,000
And that's absolutely worth it or whatever.
1085
00:56:17,000 --> 00:56:20,500
It's just like, yeah, no, we're going to do that part in hardware and no longer do it
1086
00:56:20,500 --> 00:56:21,500
in software.
1087
00:56:21,500 --> 00:56:22,500
I think we're going to see a lot of that.
1088
00:56:22,500 --> 00:56:28,000
Like if you look at how Apple is processing AI inside of the iPhones just because that
1089
00:56:28,000 --> 00:56:31,500
way it's technically safer and it's more secure and they could promise more.
1090
00:56:31,500 --> 00:56:32,500
But Siri's still the dumbest one.
1091
00:56:32,500 --> 00:56:33,500
I don't know.
1092
00:56:33,500 --> 00:56:39,500
But still, like, if someone told you 10 years ago
1093
00:56:39,500 --> 00:56:44,000
that you were going to be able to run an AI model and have it processed in a chip
1094
00:56:44,000 --> 00:56:45,500
that's in your pocket, that's still amazing.
1095
00:56:45,500 --> 00:56:46,500
I don't care what you say.
1096
00:56:46,500 --> 00:56:48,500
I mean, there's, oh, sorry, please.
1097
00:56:48,500 --> 00:56:50,500
But just, you know what I mean?
1098
00:56:50,500 --> 00:56:52,500
Like hardware is changing so much.
1099
00:56:52,500 --> 00:56:56,500
And there are whole developers that went half a career, or, like, young developers,
1100
00:56:56,500 --> 00:57:00,500
like maybe that are mid-level developers now, that have never got to play with that kind of hardware.
1101
00:57:00,500 --> 00:57:07,500
And we're making it at such a fast speed and they're making these chips for AI that so much
1102
00:57:07,500 --> 00:57:12,500
hardware is going to, like, either be overproduced at some point
1103
00:57:12,500 --> 00:57:16,500
or they're going to get rid of all the old hardware and now it's going to get so cheap.
1104
00:57:16,500 --> 00:57:21,500
I keep saying, I'm so excited for three to four years from now when all of these GPUs
1105
00:57:21,500 --> 00:57:22,500
hit the secondhand market.
1106
00:57:22,500 --> 00:57:23,500
Yeah.
1107
00:57:23,500 --> 00:57:28,500
Everyone will have super powerful NVIDIA chips, like, actually I can run so much
1108
00:57:28,500 --> 00:57:29,500
stuff.
1109
00:57:29,500 --> 00:57:34,500
Like the, just the way that the tech market's been really weird and all of that.
1110
00:57:34,500 --> 00:57:40,500
Like I just wonder what cool advancements are going to come out of that, you know, moment.
1111
00:57:40,500 --> 00:57:41,500
Yeah.
1112
00:57:41,500 --> 00:57:42,500
You know, it's, it's, it's fascinating.
1113
00:57:42,500 --> 00:57:44,500
So there's an amazing story.
1114
00:57:44,500 --> 00:57:49,500
I remember way back when, the Plenty of Fish guy.
1115
00:57:49,500 --> 00:57:55,500
Does anyone remember Plenty of Fish? It was a dating site and it was like a competitor
1116
00:57:55,500 --> 00:57:59,500
to OkCupid and all these various things out there.
1117
00:57:59,500 --> 00:58:03,500
And it was really, really popular and very, very funny.
1118
00:58:03,500 --> 00:58:07,500
Like at the time this is way before containers and things like that.
1119
00:58:07,500 --> 00:58:13,500
He had built the entire thing to be one massive vertical machine.
1120
00:58:13,500 --> 00:58:17,500
And, like, I think I was at Microsoft the first time, and I think he
1121
00:58:17,500 --> 00:58:24,500
had the largest single-instance thing that we knew about; it was seriously
1122
00:58:24,500 --> 00:58:26,500
like 128 CPUs or something like that.
1123
00:58:26,500 --> 00:58:28,500
It was like just an absurd thing.
1124
00:58:28,500 --> 00:58:31,500
And everyone's like, why don't you, like, build this out?
1125
00:58:31,500 --> 00:58:33,500
And he's like, cause I don't need to and it's easier.
1126
00:58:33,500 --> 00:58:42,500
And you go to, like, Monzo's thing, that graph of microservices that
1127
00:58:42,500 --> 00:58:44,500
everyone got up in arms about a few years ago.
1128
00:58:44,500 --> 00:58:46,500
Oh my God, why are you doing all these microservices?
1129
00:58:46,500 --> 00:58:49,500
Like, and, and they're not wrong.
1130
00:58:49,500 --> 00:58:55,500
You know, like, you don't get complexity for free. That's a cost.
1131
00:58:55,500 --> 00:58:57,500
But the cost may be worth the benefit.
1132
00:58:57,500 --> 00:58:58,500
Right.
1133
00:58:58,500 --> 00:59:01,500
And so, exactly like you were just saying, Autumn, like, we're going to have all this
1134
00:59:01,500 --> 00:59:06,500
compute out there and we're going to have this, but it needs
1135
00:59:06,500 --> 00:59:11,500
to be sublinear cost in scaling; every additional machine should not cost the same.
1136
00:59:11,500 --> 00:59:14,500
It should be very, very small increment and benefit.
1137
00:59:14,500 --> 00:59:18,500
And, and you know, I'm trying to participate in that by like offering this platform that
1138
00:59:18,500 --> 00:59:21,500
helps with that, but like Kubernetes certainly did that.
1139
00:59:21,500 --> 00:59:25,500
And, and, you know, Docker with its portability certainly did that and all these kind of things.
1140
00:59:25,500 --> 00:59:26,500
Right.
1141
00:59:26,500 --> 00:59:29,500
There's just a bunch of different ways to go out and tackle this thing.
1142
00:59:29,500 --> 00:59:35,500
And so what I would say is that, you know, you're exactly right.
1143
00:59:35,500 --> 00:59:42,500
We need to enable this, but we need to enable it smartly because it is really, really, really
1144
00:59:42,500 --> 00:59:46,500
easy to go and get a massive machine, two terabyte machine and just stick everything
1145
00:59:46,500 --> 00:59:47,500
on there.
1146
00:59:47,500 --> 00:59:49,500
It's a piece of cake: single machine.
1147
00:59:49,500 --> 00:59:50,500
I know where to log in.
1148
00:59:50,500 --> 00:59:53,500
I can manage the entire thing with SSH and call it a day.
1149
00:59:53,500 --> 00:59:58,500
That's not particularly efficient, but it's also not terrible.
1150
00:59:58,500 --> 01:00:03,500
But that's also just a lot of attack surface.
1151
01:00:03,500 --> 01:00:04,500
Exactly.
1152
01:00:04,500 --> 01:00:06,500
They're going to have so much fun with you.
1153
01:00:06,500 --> 01:00:10,500
But what I like to say is the reason I built this company is, like, there are these immutable
1154
01:00:10,500 --> 01:00:11,500
truths.
1155
01:00:11,500 --> 01:00:16,500
Like the fact is, no matter how great that machine is, my entire business does not exist
1156
01:00:17,500 --> 01:00:24,500
co-located with that machine. Like, I have real physical things in the world, whatever
1157
01:00:24,500 --> 01:00:29,500
they may be users that are accessing my website from all over the world or retail outlets
1158
01:00:29,500 --> 01:00:35,500
or hospitals or factories or cars, like that stuff is happening too.
1159
01:00:35,500 --> 01:00:38,500
And, and so then people are like, well, don't worry.
1160
01:00:38,500 --> 01:00:42,500
I'll just take all that stuff and I'll build a digital twin and I'll just mimic all that
1161
01:00:42,500 --> 01:00:43,500
stuff.
1162
01:00:43,500 --> 01:00:45,500
And I'm like, oh, no, see.
1163
01:00:45,500 --> 01:00:51,500
Like, you want to be redundant, but just making an exact copy
1164
01:00:51,500 --> 01:00:52,500
is not always it.
1165
01:00:52,500 --> 01:00:53,500
100%.
1166
01:00:53,500 --> 01:00:57,500
I think we're going to be in a weird space though, because we're
1167
01:00:57,500 --> 01:01:02,500
removing so much abstraction, like cloud was an abstraction from hardware, but AI is
1168
01:01:02,500 --> 01:01:04,500
like a super abstraction.
1169
01:01:04,500 --> 01:01:11,500
And we're not only forcing engineers to use it, but we're going to grow a whole like
1170
01:01:11,500 --> 01:01:13,500
generation of developers on AI.
1171
01:01:13,500 --> 01:01:18,500
So you've got people that are either experimenting with hardware and they're in the like thick
1172
01:01:18,500 --> 01:01:23,500
of it, or they are even more abstracted than the whole generation of developers that we
1173
01:01:23,500 --> 01:01:25,500
just had that came into the cloud.
1174
01:01:25,500 --> 01:01:30,500
So how will we kind of like educate people on how to use those things, because like you
1175
01:01:30,500 --> 01:01:34,500
said, it's really easy, especially with the money that people are throwing at certain
1176
01:01:34,500 --> 01:01:39,500
like things where they're just going to buy this huge hardware and put everything, you
1177
01:01:39,500 --> 01:01:40,500
know what I mean?
1178
01:01:40,500 --> 01:01:44,500
Because it's going to be simple and it's going to be fewer permissions to give AI.
1179
01:01:44,500 --> 01:01:47,500
And like, how do we even educate people to do that?
1180
01:01:47,500 --> 01:01:48,500
Absolutely.
1181
01:01:48,500 --> 01:01:49,500
Absolutely right.
1182
01:01:49,500 --> 01:01:50,500
Yeah.
1183
01:01:50,500 --> 01:01:53,500
There's just, I mean, you just put your finger on it, what do you call it?
1184
01:01:53,500 --> 01:01:56,500
And the key is, again, you touched on it.
1185
01:01:56,500 --> 01:02:02,500
The education will be giving people a full picture of the world, right?
1186
01:02:02,500 --> 01:02:06,500
Like, you know, when we were first trying to get people to adopt the cloud, there were
1187
01:02:06,500 --> 01:02:08,500
so many times people like, Oh, I don't know.
1188
01:02:08,500 --> 01:02:10,500
You know, I got these machines and so on and so forth.
1189
01:02:10,500 --> 01:02:14,500
And they would be like, why would I go out and pay for something
1190
01:02:14,500 --> 01:02:17,500
when I already have the assets in house?
1191
01:02:17,500 --> 01:02:21,500
And, and the conversation was like, well, are you really capturing what you have and
1192
01:02:21,500 --> 01:02:22,500
what you're doing?
1193
01:02:22,500 --> 01:02:26,500
Like, do you want to go and reboot the machine at three o'clock in the morning?
1194
01:02:26,500 --> 01:02:31,500
Do you want to migrate, you know, the OS when you have to, the hypervisor,
1195
01:02:31,500 --> 01:02:32,500
et cetera, et cetera?
1196
01:02:32,500 --> 01:02:35,500
Again, no one's saying that's not an answer.
1197
01:02:35,500 --> 01:02:42,500
But when you're doing this, you need to think about the entire scope of the problem and
1198
01:02:42,500 --> 01:02:44,500
capture all the costs.
1199
01:02:44,500 --> 01:02:47,500
Because if it's just this, that's not going to be enough.
1200
01:02:47,500 --> 01:02:49,500
And so that's very much what it is.
1201
01:02:49,500 --> 01:02:53,500
I think that's, like, the thing: we get these trends and everybody wants to do it.
1202
01:02:53,500 --> 01:02:57,500
And then, yeah, you can be in the cloud, but then you have to think about
1203
01:02:57,500 --> 01:03:00,500
how expensive the cloud is and the fact that you're like abstracted.
1204
01:03:00,500 --> 01:03:04,500
But then you get on-prem and then you have to figure out, do you have a DBA to run all
1205
01:03:04,500 --> 01:03:05,500
this stuff?
1206
01:03:05,500 --> 01:03:10,500
And like, you have to be able to be good at kind of figuring out your future.
1207
01:03:10,500 --> 01:03:11,500
But okay.
1208
01:03:11,500 --> 01:03:15,500
So when you're talking about Kubernetes and Docker, it made me think of Corey Quinn's
1209
01:03:15,500 --> 01:03:19,500
scale talk, about what he compared Docker to.
1210
01:03:19,500 --> 01:03:21,500
Was it like an F-14 or something?
1211
01:03:21,500 --> 01:03:23,500
I didn't see it.
1212
01:03:23,500 --> 01:03:29,500
But he basically like was comparing Docker and Kubernetes and basically said it was like
1213
01:03:29,500 --> 01:03:30,500
the worst, but it was so funny.
1214
01:03:30,500 --> 01:03:34,500
It was basically he was just really explaining Kubernetes and Docker and the differences.
1215
01:03:34,500 --> 01:03:41,500
But what do you think the next, like, what is the next big, because everything in tech
1216
01:03:41,500 --> 01:03:44,500
is like databases are databases.
1217
01:03:44,500 --> 01:03:45,500
Compute is compute.
1218
01:03:45,500 --> 01:03:53,500
What's the next big, I guess, like, revolution in compute and Kubernetes and Docker?
1219
01:03:53,500 --> 01:03:58,500
Obviously, I personally have this general opinion, right?
1220
01:03:58,500 --> 01:04:03,500
Obviously, I think edge and distributed things is going to be enormous.
1221
01:04:03,500 --> 01:04:06,500
And I very much hope to be a part of that.
1222
01:04:06,500 --> 01:04:11,500
Because again, I love building on things that can never change, right?
1223
01:04:11,500 --> 01:04:13,500
All those things I said earlier will never change.
1224
01:04:13,500 --> 01:04:17,500
Data will grow, the speed of light won't get any faster, regulations will be out there, all that kind
1225
01:04:17,500 --> 01:04:18,500
of stuff.
1226
01:04:19,500 --> 01:04:22,500
I mean, we're so old that we like the old constant things.
1227
01:04:22,500 --> 01:04:24,500
Does that mean that we're old people now?
1228
01:04:24,500 --> 01:04:26,500
No, it's because like you just recognize it now.
1229
01:04:26,500 --> 01:04:27,500
Yeah.
1230
01:04:27,500 --> 01:04:30,500
Well, I mean, it's one of these things where it's like, you know what, the secret sauce
1231
01:04:30,500 --> 01:04:35,500
behind Manhattan, like why it gets such tall buildings, is because
1232
01:04:35,500 --> 01:04:37,500
they know where the bedrock is, right?
1233
01:04:37,500 --> 01:04:42,500
And so they're able to drill down to things that will never change for better or worse.
1234
01:04:42,500 --> 01:04:46,500
And so like, I think that's critically important to understand the things that will never change
1235
01:04:46,500 --> 01:04:50,500
and then figure out what will happen inside of that; that is what will be next.
1236
01:04:50,500 --> 01:04:55,500
So my take on it, and again, I say this with a high degree of bias, is that it will
1237
01:04:55,500 --> 01:05:03,500
be how do I act like a cloud, but in a thing that respects those immutable truths and matches
1238
01:05:03,500 --> 01:05:07,500
where the data and the compute and the things actually are growing.
1239
01:05:07,500 --> 01:05:12,500
And so when you see Jensen stand on stage and talk about the next, you know, trillion devices
1240
01:05:12,500 --> 01:05:17,500
where you talk about, you know, me being able to have instant response and instant memory
1241
01:05:17,500 --> 01:05:24,500
on my phone or whatever it might be, that's not everything going back to a central API.
1242
01:05:24,500 --> 01:05:31,500
That's those things out there having smarts at a level, you know,
1243
01:05:31,500 --> 01:05:32,500
that feels integrated.
1244
01:05:32,500 --> 01:05:38,500
And again, it's where it gets to that sublinear scaling because like I'm telling you the Gemini
1245
01:05:38,500 --> 01:05:43,500
people and the Anthropic people, they're like, they don't want to be out there, like, you know,
1246
01:05:43,500 --> 01:05:48,500
managing why, you know, something isn't working at some factory in, you know, the
1247
01:05:48,500 --> 01:05:53,500
Philippines, they want to have like a very easy way for someone out there to deploy a
1248
01:05:53,500 --> 01:05:59,500
model and run it and have it be reliable and debug it and, you know, have metadata around
1249
01:05:59,500 --> 01:06:05,500
it, which is the other thing that I think is super lost and something that we support
1250
01:06:05,500 --> 01:06:06,500
actively.
1251
01:06:06,500 --> 01:06:12,500
But it's, it's, I think that everything is a graph, right?
1252
01:06:12,500 --> 01:06:20,500
And, you know, all these things, workflows and transformations and so on, are super under-
1253
01:06:20,500 --> 01:06:23,500
invested in by us as a community.
1254
01:06:23,500 --> 01:06:30,500
And it really distills down to the simplest possible thing, which is: here's this artifact,
1255
01:06:30,500 --> 01:06:33,500
this binary thing that I want you to run.
1256
01:06:33,500 --> 01:06:40,500
And as you do that, I want to record, in a way that is programmatic for the computer
1257
01:06:40,500 --> 01:06:46,500
to understand what went into that thing, what the thing did, and then what the output was,
1258
01:06:46,500 --> 01:06:47,500
right?
1259
01:06:47,500 --> 01:06:54,500
And simply having a structured way to approach that will change so much. It will change CI/CD,
1260
01:06:54,500 --> 01:06:57,500
it will change execution, it will change all these various things.
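A minimal sketch of that record-what-ran idea, assuming a local file as the input and hashing as the programmatic fingerprint; the fields and example command are illustrative, not a real tool's API:

```python
# Minimal sketch of structured run provenance: record what went in, what
# ran, and what came out, in a machine-readable form.
import hashlib
import subprocess
import time

def run_recorded(cmd: list, input_path: str) -> dict:
    with open(input_path, "rb") as f:
        input_sha = hashlib.sha256(f.read()).hexdigest()   # what went into it
    started = time.time()
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return {
        "command": cmd,                                    # what ran
        "input_sha256": input_sha,
        "exit_code": proc.returncode,                      # what it did
        "stdout_sha256": hashlib.sha256(proc.stdout.encode()).hexdigest(),
        "duration_s": round(time.time() - started, 3),     # and how long
    }

# e.g. provenance = run_recorded(["wc", "-l", "data.csv"], "data.csv")
```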
1261
01:06:57,500 --> 01:07:02,500
And there are many larger efforts around stuff like this; OpenTelemetry is a perfect
1262
01:07:02,500 --> 01:07:03,500
example, right?
1263
01:07:03,500 --> 01:07:06,500
Where you start to think about things as traces and so on.
1264
01:07:06,500 --> 01:07:13,500
But I do think that when you hear the word reproducibility crisis, or you, you see someone
1265
01:07:13,500 --> 01:07:16,500
at two o'clock in the morning trying to figure out what the fuck is going on and why things
1266
01:07:16,500 --> 01:07:18,500
are hard to debug.
1267
01:07:18,500 --> 01:07:21,500
It's almost always that problem.
1268
01:07:21,500 --> 01:07:23,500
I don't know what went into this thing.
1269
01:07:23,500 --> 01:07:25,500
I don't know how it ran.
1270
01:07:25,500 --> 01:07:29,500
And I don't know how it came out in a deterministic way.
1271
01:07:29,500 --> 01:07:36,500
And if you don't have that, we will continue to like try and build these like incredibly
1272
01:07:36,500 --> 01:07:41,500
hacky scripts to parse stack traces to figure out what the fuck was going on.
1273
01:07:41,500 --> 01:07:44,500
Do you think AI is going to contribute to that problem?
1274
01:07:44,500 --> 01:07:49,500
No, I think it'll be much worse, because at its core AI is not deterministic.
1275
01:07:49,500 --> 01:07:53,500
And so I mean, like contribute to making more of it.
1276
01:07:53,500 --> 01:07:55,500
Like, so, okay, technically think about it.
1277
01:07:55,500 --> 01:07:58,500
If AI starts vibe coding a bunch of these things.
1278
01:07:58,500 --> 01:08:03,500
Oh, and, you know, we're already increasing the amount of agents
1279
01:08:03,500 --> 01:08:05,500
and different things that are coming back to us.
1280
01:08:05,500 --> 01:08:08,500
When you don't know what the expected output should be.
1281
01:08:08,500 --> 01:08:09,500
Absolutely.
1282
01:08:09,500 --> 01:08:11,500
It's really hard to diagnose a problem.
1283
01:08:11,500 --> 01:08:16,500
So I think that not only are you onto something for development in general, but, like, that
1284
01:08:16,500 --> 01:08:21,500
is going to almost be like multiplied by the new way of development.
1285
01:08:21,500 --> 01:08:23,500
Yeah, I totally misunderstood what you're saying.
1286
01:08:23,500 --> 01:08:24,500
Yes, exactly.
1287
01:08:24,500 --> 01:08:28,500
It's the, it's the fact that those models are not deterministic that is so brutal.
1288
01:08:28,500 --> 01:08:34,500
And, and whoever breaks through the determinism around AI.
1289
01:08:34,500 --> 01:08:37,500
I mean, you can do it.
1290
01:08:37,500 --> 01:08:42,500
You can get close with things like wrappers and things like that, but it's not there.
1291
01:08:42,500 --> 01:08:46,500
And I think the thing is, it's hard to be deterministic as a developer
1292
01:08:46,500 --> 01:08:49,500
that's a human, because there's so many ways to build things, right?
1293
01:08:49,500 --> 01:08:54,500
And there's so many ways to, like, argue about which way. Half the time you're like,
1294
01:08:54,500 --> 01:08:56,500
is it because you did this before?
1295
01:08:56,500 --> 01:08:58,500
Is it because you like this method?
1296
01:08:58,500 --> 01:08:59,500
You know what I mean?
1297
01:08:59,500 --> 01:09:01,500
So then, the fact that humans can't even do it deterministically.
1298
01:09:01,500 --> 01:09:05,500
Yeah, there's another incredibly smart friend of mine who's, who's right up here.
1299
01:09:05,500 --> 01:09:07,500
Who's saying exactly what you're saying.
1300
01:09:07,500 --> 01:09:12,500
And, and, you know, the new hotness around ML is, so his name's Hamel.
1301
01:09:12,500 --> 01:09:14,500
He has courses on this and things like that.
1302
01:09:14,500 --> 01:09:16,500
It's all about evals, right?
1303
01:09:16,500 --> 01:09:24,500
That is such a brutally important and totally missed thing by a lot of the people adopting ML and AI right now,
1304
01:09:24,500 --> 01:09:32,500
which is like, how do I programmatically verify that this model does what I said it should do, right?
1305
01:09:32,500 --> 01:09:37,500
Like, unless you have that, like, do not even begin to go down the ML path.
1306
01:09:37,500 --> 01:09:42,500
Because my God, you know, like, unless you put a human in the loop, which is fine,
1307
01:09:42,500 --> 01:09:47,500
you're never going to be able to like train or build your model in a sensible way.
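In that spirit, an eval doesn't have to be fancy to be worth gating on. A minimal sketch; `ask` stands in for whatever model call you actually make:

```python
# Sketch of a tiny eval harness: (prompt, checker) pairs turn "does the model
# do what I said it should do" into a pass rate you can gate deploys on.
from typing import Callable

EVALS: list[tuple[str, Callable[[str], bool]]] = [
    ("Reply with only the word OK.", lambda out: out.strip().upper() == "OK"),
    ("What is 17 * 23? Answer with just the number.", lambda out: "391" in out),
]

def run_evals(ask: Callable[[str], str]) -> float:
    passed = 0
    for prompt, check in EVALS:
        try:
            passed += check(ask(prompt))  # True counts as 1
        except Exception:
            pass  # a crash counts as a failure
    rate = passed / len(EVALS)
    print(f"pass rate: {rate:.0%}")
    return rate

# e.g. in CI: assert run_evals(ask) >= 0.95, "model regressed, do not ship"
```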
1308
01:09:47,500 --> 01:09:48,500
I think I'm trying to figure that out.
1309
01:09:48,500 --> 01:09:54,500
Like, how do you use AI to be faster at things, but also the fact that you have to then go verify.
1310
01:09:54,500 --> 01:09:55,500
Is it faster?
1311
01:09:55,500 --> 01:10:00,500
You know, like, I'm trying to figure out how do you use it to learn and to get better at things
1312
01:10:00,500 --> 01:10:06,500
without just losing the abstraction and the knowledge that you need to gain?
1313
01:10:06,500 --> 01:10:12,500
I mean, you know, the, the, that, that piece that came out, I think they got it pretty wrong about the like,
1314
01:10:12,500 --> 01:10:16,500
oh, you know, coders are slower when they use the ML and so on.
1315
01:10:16,500 --> 01:10:17,500
I think that missed it.
1316
01:10:17,500 --> 01:10:21,500
Like, because it didn't really represent the way that people do this, right?
1317
01:10:21,500 --> 01:10:25,500
Like, what they'll do is they'll like, you know, vibe code something,
1318
01:10:25,500 --> 01:10:29,500
and then they'll like try and compile it, or then they'll lint it, and then they'll actually run it,
1319
01:10:29,500 --> 01:10:34,500
and then they'll Google like something, you know, and see whether or not this was a good approach.
1320
01:10:34,500 --> 01:10:36,500
And then they'll go back to vibe code some more.
1321
01:10:36,500 --> 01:10:37,500
Right.
1322
01:10:37,500 --> 01:10:41,500
So it's like, it's not this like vibe code only or hand code only.
1323
01:10:41,500 --> 01:10:42,500
It really is a mix and match.
1324
01:10:42,500 --> 01:10:50,500
And, and right now the only way to solve what you're describing is with that human in the loop where they look at the thing.
1325
01:10:50,500 --> 01:10:52,500
They become the evaluator.
1326
01:10:52,500 --> 01:10:54,500
Do you think we're going to break Stack Overflow though?
1327
01:10:54,500 --> 01:10:57,500
Because this is what I was conspiring and thinking about last night.
1328
01:10:57,500 --> 01:11:00,500
Okay, like, I mean Stack Overflow, I love it to death.
1329
01:11:00,500 --> 01:11:01,500
I listened to it.
1330
01:11:01,500 --> 01:11:04,500
I listened to that podcast from day one.
1331
01:11:04,500 --> 01:11:10,500
But like the website, like, so many developers kind of depend on the fact that we're all hitting these issues, right?
1332
01:11:10,500 --> 01:11:15,500
And someone hit the problem before you, and then we wrote it down somewhere and it's like the,
1333
01:11:15,500 --> 01:11:19,500
it's the notebook that we all go and look in and we're like, hey, did you have this error?
1334
01:11:19,500 --> 01:11:22,500
I'm old enough that that notebook has moved a couple of times on the internet.
1335
01:11:22,500 --> 01:11:25,500
Like if you're, if you're of a certain vintage, you remember.
1336
01:11:25,500 --> 01:11:27,500
But it's the fact that no one's going to write it down.
1337
01:11:27,500 --> 01:11:28,500
David, do you remember Experts Exchange?
1338
01:11:28,500 --> 01:11:29,500
Of course.
1339
01:11:29,500 --> 01:11:30,500
Yeah.
1340
01:11:30,500 --> 01:11:32,500
Experts Exchange was the expert sex change.
1341
01:11:32,500 --> 01:11:33,500
Yeah, exactly.
1342
01:11:33,500 --> 01:11:36,500
But the models are getting it though, because we're like, you know what I mean?
1343
01:11:36,500 --> 01:11:42,500
Like, like if nobody asks questions to like other humans anymore and we're only asking it to AI,
1344
01:11:42,500 --> 01:11:45,500
do we then break the loop of knowledge?
1345
01:11:45,500 --> 01:11:46,500
You know what I mean?
1346
01:11:46,500 --> 01:11:50,500
I mean, what's going to happen is I already know what's going to happen, right?
1347
01:11:50,500 --> 01:11:54,500
One of these models right now, it's probably going to be Claude, right?
1348
01:11:54,500 --> 01:12:01,500
It'll, it'll get the majority of the questions and it will be able to do a little mini loop and say like,
1349
01:12:01,500 --> 01:12:06,500
oh, you know, this person did this in Python and then they ask it again and then it did it again.
1350
01:12:06,500 --> 01:12:10,500
And they'll be able to tie those together and say like, okay, this is what actually was happening.
1351
01:12:10,500 --> 01:12:12,500
And then their model will magically just become smarter.
1352
01:12:12,500 --> 01:12:18,500
Now, if they were like a social good, they would release those datasets to the world and make it easy,
1353
01:12:18,500 --> 01:12:20,500
but that's not going to be it either.
1354
01:12:20,500 --> 01:12:22,500
So it's funny.
1355
01:12:22,500 --> 01:12:25,500
I gave a talk, an ML talk, in the winter.
1356
01:12:25,500 --> 01:12:32,500
And it was an experiment because I took the exact same talk I gave when I was launching Kubeflow in 2018,
1357
01:12:32,500 --> 01:12:38,500
I want to say, and I gave it again with no changes, not a single change to the slides.
1358
01:12:38,500 --> 01:12:41,500
Everything was the exact same, which is hilarious, right?
1359
01:12:42,500 --> 01:12:49,500
It was all about security vulnerabilities through ML and like what you're going to face here and like how you defend your model
1360
01:12:49,500 --> 01:12:51,500
and how you do this and how you do that.
1361
01:12:51,500 --> 01:12:54,500
And one of them was around distillation of models, right?
1362
01:12:54,500 --> 01:13:03,500
And so what I said is true, Claude or whatever, you know, frontier model will walk away with like the best coding model today.
1363
01:13:04,500 --> 01:13:14,500
And someone else, the number two or number three on the list will use number one as a verifier as it's going through its testing.
1364
01:13:14,500 --> 01:13:15,500
Okay.
1365
01:13:15,500 --> 01:13:16,500
And it doesn't need to be a lot.
1366
01:13:16,500 --> 01:13:21,500
It's like a thousand queries or 2000 queries and they will get so much smarter.
1367
01:13:21,500 --> 01:13:22,500
They will.
1368
01:13:22,500 --> 01:13:24,500
It will not be defendable.
1369
01:13:24,500 --> 01:13:29,500
It will not. Whether ethically or unethically, that will leak out.
1370
01:13:29,500 --> 01:13:32,500
And then the second model will be good, right?
1371
01:13:32,500 --> 01:13:38,500
And then the third model and now you will have this consensus around these models and that will lift all the boats.
1372
01:13:38,500 --> 01:13:47,500
And so, you know, I would love if, if the, you know, whoever becomes number one model, like just releases data sets so that we can all grow together.
1373
01:13:47,500 --> 01:13:51,500
But, you know, what will happen is it will never stay secret.
1374
01:13:51,500 --> 01:13:52,500
David, hold on.
1375
01:13:55,500 --> 01:13:56,500
David.
1376
01:13:56,500 --> 01:13:57,500
Yes.
1377
01:13:58,500 --> 01:13:59,500
David, we lost you.
1378
01:13:59,500 --> 01:14:00,500
Oh.
1379
01:14:00,500 --> 01:14:02,500
You just, you just broke up for the last like two minutes.
1380
01:14:02,500 --> 01:14:03,500
I didn't hear it.
1381
01:14:03,500 --> 01:14:04,500
No.
1382
01:14:04,500 --> 01:14:05,500
Did you hear it on him?
1383
01:14:05,500 --> 01:14:06,500
Okay.
1384
01:14:06,500 --> 01:14:07,500
What's the last thing?
1385
01:14:07,500 --> 01:14:08,500
I dropped it too.
1386
01:14:08,500 --> 01:14:09,500
Yeah.
1387
01:14:09,500 --> 01:14:10,500
I just got a notice saying it's there.
1388
01:14:10,500 --> 01:14:11,500
Can you hear me?
1389
01:14:12,500 --> 01:14:13,500
Still not.
1390
01:14:13,500 --> 01:14:14,500
Yeah.
1391
01:14:14,500 --> 01:14:15,500
You're back.
1392
01:14:15,500 --> 01:14:16,500
Okay.
1393
01:14:16,500 --> 01:14:17,500
Where do you want me to go?
1394
01:14:19,500 --> 01:14:24,500
Go, go back to, to what will happen once the second model starts training on the first model.
1395
01:14:24,500 --> 01:14:25,500
Yeah.
1396
01:14:25,500 --> 01:14:26,500
Okay.
1397
01:14:26,500 --> 01:14:33,500
The, the second model will come along and they will begin to train using some of the wisdom from the first model.
1398
01:14:33,500 --> 01:14:36,500
They'll use it as a tool just as a verifier.
1399
01:14:36,500 --> 01:14:42,500
And as much as the first model wants to block it, it will not be possible to block, because it doesn't require very much.
1400
01:14:42,500 --> 01:14:52,500
Like you're talking about literally thousands of total queries over a few days and you can get a very accurate representation of the underlying model.
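Mechanically, the loop he's describing is tiny, which is why it can't really be blocked. A hypothetical sketch; `student_answer` and `teacher_verify` stand in for calls to the second and first models:

```python
# Sketch of distillation-by-verifier: the cheaper model drafts answers, the
# frontier model only says yes/no, and accepted pairs become fine-tuning data.
# Per the episode, a few thousand prompts is enough to move the needle.
import json

def distill(prompts, student_answer, teacher_verify, out_path="distilled.jsonl"):
    kept = 0
    with open(out_path, "w") as f:
        for prompt in prompts:                   # ~thousands over a few days
            answer = student_answer(prompt)      # second model drafts
            if teacher_verify(prompt, answer):   # first model just verifies
                f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")
                kept += 1
    print(f"kept {kept}/{len(prompts)} verified pairs for fine-tuning")

# student_answer / teacher_verify are hypothetical stand-ins for API calls;
# the point is how little teacher signal the loop actually needs.
```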
1401
01:14:52,500 --> 01:14:57,500
And then the second model will be good and the third model will be good and the open source one will be good.
1402
01:14:57,500 --> 01:15:02,500
And now everyone's boats are lifted and then you're going to do this again and again and again and again.
1403
01:15:02,500 --> 01:15:12,500
And so what does that mean for developers and for the tribal knowledge that we all share on the internet so we can all, you know... Excellent question.
1404
01:15:12,500 --> 01:15:18,500
I think that, you know, I love Stack Overflow, but I think it's going to go away or whatever.
1405
01:15:18,500 --> 01:15:19,500
I don't know.
1406
01:15:19,500 --> 01:15:22,500
It will migrate to a new community, right?
1407
01:15:22,500 --> 01:15:29,500
Because I think people will ask their first few questions of this and then want to talk to a human.
1408
01:15:29,500 --> 01:15:39,500
But over time, the chat, whatever the chat interface with your code will become so good, you're like, maybe I don't need to talk to a human.
1409
01:15:39,500 --> 01:15:40,500
I don't know.
1410
01:15:40,500 --> 01:15:41,500
I don't know.
1411
01:15:41,500 --> 01:15:47,500
But does that mean that the humans no longer hold that knowledge and they don't need us?
1412
01:15:47,500 --> 01:15:49,500
I don't...
1413
01:15:49,500 --> 01:15:53,500
I don't think we'll have the same depth of coding.
1414
01:15:53,500 --> 01:16:03,500
No, I mean, you know, to some degree, like, I haven't, you know, I took compilers years ago as a class and built the compiler.
1415
01:16:03,500 --> 01:16:05,500
I haven't done anything with a compiler.
1416
01:16:05,500 --> 01:16:08,500
I like to know how compilers work.
1417
01:16:08,500 --> 01:16:14,500
And I think it helps me when I am coding, you know, which is rare, like be a better coder.
1418
01:16:14,500 --> 01:16:22,500
But it becomes an abstraction layer that I just, I don't think about, you know, 99% of my day.
1419
01:16:22,500 --> 01:16:26,500
Maybe a lot of these concepts reach that point.
1420
01:16:26,500 --> 01:16:28,500
I don't know.
1421
01:16:28,500 --> 01:16:29,500
I don't know.
1422
01:16:29,500 --> 01:16:33,500
I mean, there are like half a dozen standard questions you ask when you're interviewing at Big Tech.
1423
01:16:33,500 --> 01:16:40,500
How do you do, you know, when do you want to pick a stack or a heap or whatever. Like, you know, people are going to be like, why are you even asking that question?
1424
01:16:40,500 --> 01:16:43,500
I don't know why they ask it half the time.
1425
01:16:43,500 --> 01:17:00,500
It's, you know, Joel, Joel Spolsky from Stack Overflow and Joel on Software talked about this, whatever, 20 years ago. He said that he would ask questions about, you know, basic HTML concepts, right, as part of his thing.
1426
01:17:00,500 --> 01:17:05,500
And he said that, like, it was just, it was just a filter.
1427
01:17:05,500 --> 01:17:06,500
That really is it.
1428
01:17:06,500 --> 01:17:16,500
It's not to say whether or not you know that, but like the amount that people would like exaggerate, I'll put it politely, on their resume and not even be able to answer the most basic thing.
1429
01:17:16,500 --> 01:17:21,500
That's, that's really all you're like looking for from this question.
1430
01:17:21,500 --> 01:17:24,500
But, you know, I'm with you.
1431
01:17:24,500 --> 01:17:32,500
Like people are like, oh, you know, I have a, I want you to develop a queue that can handle throttling and so on and so forth.
1432
01:17:32,500 --> 01:17:33,500
And you're like, all right, really?
1433
01:17:33,500 --> 01:17:34,500
Like, I get it.
1434
01:17:34,500 --> 01:17:36,500
Nobody's going to actually build that.
1435
01:17:36,500 --> 01:17:38,500
I mean, you will, you will.
1436
01:17:38,500 --> 01:17:41,500
But the first thing you're going to go do is Google how to do it.
1437
01:17:41,500 --> 01:17:43,500
That's what I'm saying.
1438
01:17:43,500 --> 01:17:44,500
Yeah.
1439
01:17:44,500 --> 01:17:47,500
Like no one does that off of like knowledge.
1440
01:17:47,500 --> 01:17:52,500
You're going to go look it up and then you're going to compare three different things and optimize it and...
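For the record, the classic answer to that interview question is a token bucket in front of a queue, which is exactly the kind of thing you'd Google first and then tune. A single-threaded sketch (a real one would need locking and fairness):

```python
# Sketch: a queue whose consumer is throttled by a token bucket, allowing at
# most `rate` items per second with bursts up to `capacity`. Not thread-safe.
import time
from collections import deque

class ThrottledQueue:
    def __init__(self, rate: float, capacity: int):
        self.items = deque()
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)   # start with a full burst budget
        self.last = time.monotonic()

    def put(self, item) -> None:
        self.items.append(item)

    def get(self):
        """Block until both an item and a token are available."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.items and self.tokens >= 1:
                self.tokens -= 1
                return self.items.popleft()
            time.sleep(0.05)

q = ThrottledQueue(rate=2, capacity=5)  # 2 items/sec, bursts of up to 5
for i in range(3):
    q.put(i)
for _ in range(3):
    print(q.get())                      # the first few drain as a burst
```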
1441
01:17:52,500 --> 01:17:54,500
David, this has been fun.
1442
01:17:54,500 --> 01:17:56,500
Where should people find you on the internet?
1443
01:17:56,500 --> 01:17:59,500
So I'm a big Bluesky guy.
1444
01:17:59,500 --> 01:18:02,500
I certainly continue to spew there.
1445
01:18:02,500 --> 01:18:04,500
Please come try.
1446
01:18:04,500 --> 01:18:06,500
Try our platform.
1447
01:18:06,500 --> 01:18:07,500
I would love to hear from you.
1448
01:18:07,500 --> 01:18:10,500
If you're a data person, if you touch data, right?
1449
01:18:10,500 --> 01:18:16,500
If you spend any meaningful amount on data, I want to hear how we can make our distributed data pipelines better.
1450
01:18:16,500 --> 01:18:18,500
Expanso.io.
1451
01:18:18,500 --> 01:18:23,500
I post all my talks at my website, davidaronchick.com.
1452
01:18:23,500 --> 01:18:25,500
And I just love talking to people.
1453
01:18:25,500 --> 01:18:34,500
I, you know, I am the first person to say I like to be the dumbest person in the room, which is very easy because, you know, when you're this dumb, everyone's smart.
1454
01:18:34,500 --> 01:18:41,500
But like, I, you know, I just want to be smarter about like you and your business and your, I don't know what your opinions are.
1455
01:18:41,500 --> 01:18:48,500
I got in an argument last night with a guy who like rewrote basically all of Excel in raw JavaScript, like from scratch.
1456
01:18:48,500 --> 01:18:50,500
And I was like, what the hell are you doing, man?
1457
01:18:50,500 --> 01:18:53,500
He was like, well, this isn't this because he wanted it.
1458
01:18:53,500 --> 01:18:54,500
He's just a JavaScript guy.
1459
01:18:54,500 --> 01:18:56,500
And I was like, my God, how do you even do that?
1460
01:18:56,500 --> 01:19:00,500
Not TypeScript, not CoffeeScript, raw JavaScript.
1461
01:19:00,500 --> 01:19:02,500
That's a quote.
1462
01:19:02,500 --> 01:19:03,500
I know.
1463
01:19:03,500 --> 01:19:05,500
That is, that is, but anyhow, I loved it.
1464
01:19:05,500 --> 01:19:08,500
It was an amazing conversation.
1465
01:19:08,500 --> 01:19:12,500
We will, we have you in the Bluesky starter pack for our guests.
1466
01:19:12,500 --> 01:19:14,500
I need to convert that to a list at some point.
1467
01:19:14,500 --> 01:19:15,500
So people can find you there.
1468
01:19:15,500 --> 01:19:17,500
I think you're ironyuppie on Bluesky.
1469
01:19:17,500 --> 01:19:18,500
I am ironyuppie.
1470
01:19:18,500 --> 01:19:19,500
That is me.
1471
01:19:19,500 --> 01:19:20,500
So yeah.
1472
01:19:20,500 --> 01:19:26,500
And we'll, we'll definitely, we'll figure out a time we can have you on for a second round of history, some of your antics at Kubeflow and Microsoft.
1473
01:19:26,500 --> 01:19:27,500
I have so many questions.
1474
01:19:27,500 --> 01:19:28,500
I just want to know the tea.
1475
01:19:28,500 --> 01:19:31,500
Like the next whole episode has to be the tea.
1476
01:19:31,500 --> 01:19:34,500
I will talk about this until I'm blue in the face.
1477
01:19:34,500 --> 01:19:38,500
I like, I say this as someone like, I just don't believe in like speaking ill of people.
1478
01:19:38,500 --> 01:19:42,500
So like, don't, don't tune in if you think I'm going to like badmouth someone.
1479
01:19:42,500 --> 01:19:46,500
Like it's just, the navigation of these things happening is just so fascinating.
1480
01:19:46,500 --> 01:19:50,500
I feel so lucky that I could be there.
1481
01:19:50,500 --> 01:19:51,500
All right.
1482
01:19:51,500 --> 01:19:52,500
Thank you so much.
1483
01:19:52,500 --> 01:19:53,500
And thank you everyone for listening.
1484
01:19:53,500 --> 01:19:55,500
We will talk to you again soon.
1485
01:20:02,500 --> 01:20:06,500
Thank you for listening to this episode of Fork Around and Find Out.
1486
01:20:06,500 --> 01:20:11,500
If you like this show, please consider sharing it with a friend, a coworker, a family member, or even an enemy.
1487
01:20:11,500 --> 01:20:16,500
However we get the word out about this show helps it to become sustainable for the long term.
1488
01:20:16,500 --> 01:20:26,500
If you want to sponsor this show, please go to fafo.fm slash sponsor and reach out to us there about what you're interested in sponsoring and how we can help.
1489
01:20:26,500 --> 01:20:30,500
We hope your systems stay available and your pagers stay quiet.
1490
01:20:30,500 --> 01:20:32,500
We'll see you again next time.
00:00:00,000 --> 00:00:12,000
Welcome to Fork Around and Find Out, the podcast about building, running and maintaining software and systems.
2
00:00:19,000 --> 00:00:26,000
Hello and welcome to Fork Around and Find Out. I'm your host Justin Garrison and with me as always is Autumn Nash. How's it going Autumn?
3
00:00:26,000 --> 00:00:29,000
I'm just really excited to see what Joe keeps coming up with.
4
00:00:29,000 --> 00:00:34,000
I can't even think of a joke. I'm just trying to think of like we are co-locating our jokes with our smarts today.
5
00:00:34,000 --> 00:00:35,000
Be funny now.
6
00:00:35,000 --> 00:00:38,000
I know. It's like it's too much pressure. This is why I don't do stand-up.
7
00:00:38,000 --> 00:00:39,000
Exactly.
8
00:00:40,000 --> 00:00:43,000
This is why instead of stand-up I just have kids.
9
00:00:43,000 --> 00:00:44,000
Don't lie.
10
00:00:44,000 --> 00:00:46,000
I'm interrupting me like intro.
11
00:00:46,000 --> 00:00:50,000
You just do like random stand-up.
12
00:00:50,000 --> 00:00:55,000
Yeah. It has to be ad hoc and today on the show we do not have an ad hoc guest.
13
00:00:55,000 --> 00:00:59,000
I guess we have David Aronchik, CEO and founder of Expanso. Welcome to the show David.
14
00:00:59,000 --> 00:01:01,000
Thank you so much. Real pleasure.
15
00:01:01,000 --> 00:01:06,000
David, we met like I feel like it was like a decade ago. It was like one of the first cube concept.
16
00:01:06,000 --> 00:01:07,000
Absolutely.
17
00:01:08,000 --> 00:01:09,000
No, for sure.
18
00:01:09,000 --> 00:01:15,000
And we just kind of been around. It's just infrastructure, Kubernetes, cloud. We've been doing all this.
19
00:01:15,000 --> 00:01:20,000
Tell us about what your journey's been throughout the last decade of doing infrastructure stuff.
20
00:01:20,000 --> 00:01:29,000
I mean, you know, when I was looking back, you know, I've certainly exchanged Twitter and Blue Sky and so on in all these days.
21
00:01:29,000 --> 00:01:35,000
But you're exactly right. You were one of the first Kubernetes adopters.
22
00:01:35,000 --> 00:01:40,000
So my background is I've been doing enterprise and consumer. I'm on my fourth startup.
23
00:01:40,000 --> 00:01:42,000
I'll get to that in a second.
24
00:01:42,000 --> 00:01:47,000
But the most recent arc of my career started in just before you and I met.
25
00:01:47,000 --> 00:02:00,000
I worked at Chef where I was doing director of product management and leading a bunch of their work on, you know, as Docker and all the container stuff was first coming out.
26
00:02:00,000 --> 00:02:08,000
Then I left there to go be the first non-founding PM for Kubernetes, which I did for Google for a bunch of years.
27
00:02:09,000 --> 00:02:14,000
And and help start the cloud native computer foundation and so on and so forth.
28
00:02:14,000 --> 00:02:18,000
And and that's where we met, you know, you, you were at Disney at the time.
29
00:02:18,000 --> 00:02:25,000
You were one of the first big adopters of this from a more traditional enterprise.
30
00:02:25,000 --> 00:02:37,000
And I know Disney is like super forward looking, you know, because of folks like you, but like it was really like, you know, no one really understood like what this Docker thing is what these containers are.
31
00:02:37,000 --> 00:02:40,000
And, you know, how does this going to affect me?
32
00:02:40,000 --> 00:02:46,000
But you were you were absolutely one of the first actually one of your other media proper and not not Disney's media properties.
33
00:02:46,000 --> 00:02:54,000
Although who knows now, though, it's all this consolidation, but another one of your media cohorts was right after you.
34
00:02:54,000 --> 00:03:01,000
I always remember because one of the HBO HBO Max adopted Kubernetes really early as well.
35
00:03:01,000 --> 00:03:08,000
And I remember watching or streaming Game of Thrones on my laptop.
36
00:03:08,000 --> 00:03:12,000
And I was like, oh, my God, this is running on our stuff.
37
00:03:12,000 --> 00:03:14,000
I'm very, very proud of it.
38
00:03:14,000 --> 00:03:17,000
But um, yeah, you know, that's where you and I met.
39
00:03:17,000 --> 00:03:23,000
And so I let I started to GKE the Google Kubernetes engine.
40
00:03:23,000 --> 00:03:27,000
And then I did that for a bunch of years.
41
00:03:27,000 --> 00:03:38,000
I then moved from there into starting a machine learning platform called cube flow, which has been very popular.
42
00:03:38,000 --> 00:03:52,000
And so I did that, then I left Google to go work at Microsoft to lead open source machine learning strategy out of the in for the Azure ML group and out of the office of the CTO.
43
00:03:52,000 --> 00:04:03,000
And I did that for a few years and now I'm on to my startup, which is, you know, crazy, like I honestly never thought I would go back.
44
00:04:03,000 --> 00:04:12,000
Not because I don't like the startup game, but because like I had a perfectly reasonable job at like big toe, like kick your feet up and chill, dude.
45
00:04:12,000 --> 00:04:17,000
Like, but no, I have a vision for the world and I'd like it to to exist.
46
00:04:17,000 --> 00:04:20,000
So I'm off doing the startup thing again.
47
00:04:20,000 --> 00:04:24,000
Is there ever really a kick your feet up and chill moment intact?
48
00:04:24,000 --> 00:04:30,000
You know, this should be like I keep hearing about this like it does.
49
00:04:30,000 --> 00:04:33,000
At what point do you get there?
50
00:04:33,000 --> 00:04:35,000
Some people are wired that way.
51
00:04:35,000 --> 00:04:39,000
I wish I was I truly wish I was.
52
00:04:39,000 --> 00:04:40,000
I didn't I didn't hear the word.
53
00:04:40,000 --> 00:04:42,000
I didn't hear the word wider wired.
54
00:04:42,000 --> 00:04:47,000
I heard whiter like my skin color and I was like, well, yeah, that also probably plays into it too.
55
00:04:47,000 --> 00:04:52,000
Where I've worked with quite a few people that have kicked their feet up and usually they look like me.
56
00:04:52,000 --> 00:04:53,000
Yeah, yeah.
57
00:04:53,000 --> 00:04:56,000
It's autumn takes the biggest drink of her coffee.
58
00:04:56,000 --> 00:05:01,000
She's talking idiot.
59
00:05:01,000 --> 00:05:05,000
Take me back a little bit to the GKE creation.
60
00:05:05,000 --> 00:05:07,000
Was that always the intention of Kubernetes?
61
00:05:07,000 --> 00:05:14,000
Like you open sourced it and it felt like it was meant as just like a pure open source play and then just the popularity was there immediately.
62
00:05:14,000 --> 00:05:20,000
So so it's long enough ago that I think I can say all this stuff without missing too many people off.
63
00:05:20,000 --> 00:05:25,000
But no, the story here goes back to 2003.
64
00:05:25,000 --> 00:05:26,000
Okay.
65
00:05:26,000 --> 00:05:35,000
And the story is that Google came out and released the the Hadoop paper, the MapReduce paper in 2003.
66
00:05:35,000 --> 00:05:43,000
And Yahoo came along and very helpfully read this wonderful paper, this groundbreaking paper and say, wow, this sounds really cool.
67
00:05:43,000 --> 00:05:44,000
Let's go launch something around it.
68
00:05:44,000 --> 00:05:46,000
And they created Hadoop.
69
00:05:46,000 --> 00:05:47,000
Right.
70
00:05:47,000 --> 00:05:49,000
And Google was like, oh, this is good.
71
00:05:49,000 --> 00:05:52,000
You know, we're glad people are out there and Google is a very academic place.
72
00:05:52,000 --> 00:05:57,000
So they like really don't take any like ownership over that until it gets to Google Cloud.
73
00:05:57,000 --> 00:06:07,000
And they like at the time that they launched Google Cloud, they they had to create an HDFS compatibility layer or Hadoop.
74
00:06:07,000 --> 00:06:20,000
And what that meant was you had something that Google invented, re-implemented by someone else, implemented on this like compatibility layer that ultimately went through another layer that ultimately was still running on MapReduce.
75
00:06:20,000 --> 00:06:21,000
Right.
76
00:06:21,000 --> 00:06:23,000
And they're like, why the hell did this happen?
77
00:06:23,000 --> 00:06:25,000
Like we could have just done the thing.
78
00:06:25,000 --> 00:06:26,000
Right.
79
00:06:26,000 --> 00:06:28,000
So that's going to be angle one.
80
00:06:28,000 --> 00:06:32,000
They were like, hey, look, we don't want to, we're going to release something in the world.
81
00:06:32,000 --> 00:06:34,000
Let's actually release something to them.
82
00:06:34,000 --> 00:06:35,000
Okay.
83
00:06:35,000 --> 00:06:36,000
So that's category one.
84
00:06:36,000 --> 00:06:41,000
And category two is they saw AWS and they saw it growing.
85
00:06:41,000 --> 00:06:45,000
And they're like, holy shit, you know, this, this, oh, sorry, I don't know what this is a non-safer word.
86
00:06:45,000 --> 00:06:46,000
No, we're good.
87
00:06:46,000 --> 00:06:47,000
You're good.
88
00:06:47,000 --> 00:06:48,000
We don't, we don't believe in it anymore.
89
00:06:48,000 --> 00:06:49,000
I curse like a sailor.
90
00:06:49,000 --> 00:06:50,000
So you'll have to.
91
00:06:50,000 --> 00:06:54,000
So do I. Justin's very good at not cursing.
92
00:06:54,000 --> 00:06:55,000
I don't have it in me.
93
00:06:55,000 --> 00:06:57,000
I just don't have it in me.
94
00:06:57,000 --> 00:07:05,000
I almost feel like swearing is like, like I look for slightly spicy people because I appreciate their honesty.
95
00:07:05,000 --> 00:07:07,000
Justin's spicy in other ways.
96
00:07:07,000 --> 00:07:08,000
I was going to say that.
97
00:07:08,000 --> 00:07:14,000
Like I'm so excited for you to be here because I love being in between a spicy interview
98
00:07:14,000 --> 00:07:15,000
E and Justin.
99
00:07:15,000 --> 00:07:16,000
Cause it's.
100
00:07:16,000 --> 00:07:19,000
You're in between there.
101
00:07:19,000 --> 00:07:20,000
All right.
102
00:07:20,000 --> 00:07:21,000
Yeah.
103
00:07:21,000 --> 00:07:24,000
This podcast has taken a very interesting turn.
104
00:07:24,000 --> 00:07:25,000
Let me just say that.
105
00:07:25,000 --> 00:07:32,000
Within five minutes of like meeting you, I was like, David's going to be so much better.
106
00:07:32,000 --> 00:07:34,000
Like, I thought this was a spark around to find out.
107
00:07:34,000 --> 00:07:36,000
And this is not a BDSM podcast.
108
00:07:36,000 --> 00:07:37,000
Right.
109
00:07:37,000 --> 00:07:40,000
Oh, thank God.
110
00:07:40,000 --> 00:07:42,000
Tim is in here.
111
00:07:42,000 --> 00:07:43,000
Oh shit.
112
00:07:43,000 --> 00:07:45,000
My things went all blurred here.
113
00:07:45,000 --> 00:07:46,000
Okay.
114
00:07:46,000 --> 00:07:52,000
Um, so, so then, so then, uh, AWS comes along and they're like killing it.
115
00:07:52,000 --> 00:07:53,000
Right.
116
00:07:53,000 --> 00:07:54,000
And we all look at that.
117
00:07:54,000 --> 00:07:57,000
We're like, Hey, but wait, we have a cloud that we're trying to get going here.
118
00:07:57,000 --> 00:08:01,000
Um, like we think that the right thing here, the right.
119
00:08:01,000 --> 00:08:04,000
Or, uh, uh, element is not a via.
120
00:08:04,000 --> 00:08:05,000
Right.
121
00:08:05,000 --> 00:08:08,000
We think the right element is a container and look at Docker.
122
00:08:08,000 --> 00:08:09,000
They're doing great.
123
00:08:09,000 --> 00:08:10,000
Right.
124
00:08:10,000 --> 00:08:11,000
So let's take.
125
00:08:11,000 --> 00:08:12,000
Dockers.
126
00:08:12,000 --> 00:08:18,000
You know, extension and wisdom, which by the way, again, another thing that Google launched.
127
00:08:18,000 --> 00:08:24,000
Again, no one is saying that Docker wasn't an enormous part in arguably the reason that
128
00:08:24,000 --> 00:08:25,000
containers are successful.
129
00:08:25,000 --> 00:08:30,000
Um, but a lot of it was based on, you know, kernel changes that came in, you know, in
130
00:08:30,000 --> 00:08:32,000
2004 or 2005, right?
131
00:08:32,000 --> 00:08:34,000
Like there's an enormous amount of stuff there.
132
00:08:34,000 --> 00:08:36,000
And so they're like, Hey, look, Walker's killing it.
133
00:08:36,000 --> 00:08:39,000
Let's help Docker extend even further.
134
00:08:39,000 --> 00:08:44,000
And let's help people say, you know, can take VMs are not the right thing.
135
00:08:44,000 --> 00:08:45,000
It's just not.
136
00:08:45,000 --> 00:08:49,000
And so, you know, again, I was not here during this time.
137
00:08:49,000 --> 00:08:54,000
Craig McClucky and Joe Beta and Brandon Burns and Tim Hawken and Brian Grant and whatever,
138
00:08:54,000 --> 00:08:58,000
they were like all working on a bunch of stuff internally to Google where they're like, we
139
00:08:58,000 --> 00:09:04,000
think there's a new orchestration paradigm that people should be adopting here.
140
00:09:04,000 --> 00:09:08,000
Um, they were going to build it internally to Google in a project called Omega and you
141
00:09:08,000 --> 00:09:09,000
should go read everyone.
142
00:09:09,000 --> 00:09:13,000
You should go read Brian Grant's blog history of this.
143
00:09:13,000 --> 00:09:16,000
It's so good and it's so real.
144
00:09:16,000 --> 00:09:22,000
It is like transparently going through a nice human, which is like amazing that he's
145
00:09:22,000 --> 00:09:26,000
like a humble, nice human after doing like he's so smart.
146
00:09:26,000 --> 00:09:28,000
He would guess two or three on this show.
147
00:09:28,000 --> 00:09:30,000
So yeah, he's so good.
148
00:09:30,000 --> 00:09:34,000
When you talk to him, you're just like, dude, you're so smart.
149
00:09:34,000 --> 00:09:37,000
Like he is just so intelligent.
150
00:09:37,000 --> 00:09:41,000
So, so, uh, uh, Brandon is a good friend.
151
00:09:41,000 --> 00:09:45,000
And when I first get to Google, he tells me this thing, which is amazing.
152
00:09:45,000 --> 00:09:50,000
He says your goal inside Google is not to be the smartest person on day one or day two
153
00:09:50,000 --> 00:09:52,000
or day 400, right?
154
00:09:52,000 --> 00:09:53,000
Your goal is the following.
155
00:09:53,000 --> 00:09:56,000
Like you should go and you should come up with a smart idea.
156
00:09:56,000 --> 00:09:58,000
We hired you because you have a smart idea.
157
00:09:58,000 --> 00:09:59,000
Okay.
158
00:09:59,000 --> 00:10:02,000
You should go and you should try and figure out where that idea is because I guarantee
159
00:10:02,000 --> 00:10:04,000
somebody internally has already thought about it.
160
00:10:04,000 --> 00:10:05,000
Right.
161
00:10:05,000 --> 00:10:12,000
And there will be a window between you thinking of this idea and the paper and the, that window
162
00:10:12,000 --> 00:10:14,000
will start off at like four years.
163
00:10:14,000 --> 00:10:19,000
Like the idea was four years ago that somebody looked at this and they decided this was a
164
00:10:19,000 --> 00:10:22,000
bad idea or they implemented it or whatever.
165
00:10:22,000 --> 00:10:23,000
And then you should.
166
00:10:23,000 --> 00:10:24,000
Okay.
167
00:10:24,000 --> 00:10:25,000
You get smarter about it.
168
00:10:25,000 --> 00:10:27,000
You read the paper and then you come back and then you do that again.
169
00:10:27,000 --> 00:10:30,000
You're going to come up with another great idea and it'll be two years.
170
00:10:30,000 --> 00:10:31,000
You're like, what?
171
00:10:31,000 --> 00:10:33,000
Two years and then you'll do it again.
172
00:10:33,000 --> 00:10:34,000
It'll be like nine months.
173
00:10:34,000 --> 00:10:35,000
Then you do it again.
174
00:10:35,000 --> 00:10:37,000
It'll be like three months and then you do it again.
175
00:10:37,000 --> 00:10:39,000
You won't find a paper and then you're like that.
176
00:10:39,000 --> 00:10:43,000
That is the thing you should go and implement and, and so on and so forth.
177
00:10:43,000 --> 00:10:45,000
So Brandon says this and I was like, I still take this wisdom way.
178
00:10:45,000 --> 00:10:48,000
I think it's so interesting, especially in the real world where you can go out and you
179
00:10:48,000 --> 00:10:51,000
can research it and you can figure out why things worked and didn't work and so on and
180
00:10:51,000 --> 00:10:52,000
so forth.
181
00:10:52,000 --> 00:10:55,000
Brian is interesting because he's the other half of the coin.
182
00:10:55,000 --> 00:11:00,000
Like he's the one who will like, he just has canonical knowledge of everything.
183
00:11:00,000 --> 00:11:04,000
And so he is whenever I'm trying to come up with a new feature for our platform or,
184
00:11:04,000 --> 00:11:06,000
you know, hey, you know, why didn't people do this?
185
00:11:06,000 --> 00:11:09,000
I go and talk to Brian or another guy, Eric Brewer.
186
00:11:09,000 --> 00:11:11,000
He's, he's also a really wonderful human.
187
00:11:11,000 --> 00:11:14,000
You should have him on if you haven't already.
188
00:11:14,000 --> 00:11:20,000
And the two of them together, you're just kind of like, oh, you know, what's, what about
189
00:11:20,000 --> 00:11:21,000
this idea?
190
00:11:21,000 --> 00:11:22,000
Oh yeah, we didn't look at that.
191
00:11:22,000 --> 00:11:26,000
And this is the problem and distributed to this and consensus that in this year and
192
00:11:26,000 --> 00:11:27,000
you're running into this.
193
00:11:27,000 --> 00:11:30,000
And eventually you'll get to a point where they're like, yeah, that's actually not a
194
00:11:30,000 --> 00:11:31,000
bad idea.
195
00:11:31,000 --> 00:11:32,000
And you're like, ah, I'm going to go with it.
196
00:11:32,000 --> 00:11:36,000
I feel like having Brian as a friend has got to be like some sort of life hack because
197
00:11:36,000 --> 00:11:41,000
to be able to bounce ideas off of someone like that, like God.
198
00:11:41,000 --> 00:11:47,000
I mean, I say that I am, I try and collect smart friends like they're fucking Pokemon.
199
00:11:47,000 --> 00:11:48,000
That is that.
200
00:11:48,000 --> 00:11:49,000
Okay.
201
00:11:49,000 --> 00:11:51,000
Like that is like the top tier.
202
00:11:51,000 --> 00:11:56,000
Like if all of the rest of the world is questionable at the moment, having smart friends and good
203
00:11:56,000 --> 00:11:57,000
friends.
204
00:11:57,000 --> 00:11:58,000
100%.
205
00:11:58,000 --> 00:12:02,000
I mean, just having someone who can be honest with you is like brutally important.
206
00:12:02,000 --> 00:12:07,000
I might tease Justin on the internet, but like having good friends, like top tier.
207
00:12:07,000 --> 00:12:09,000
Like if you want to know how to improve your life.
208
00:12:09,000 --> 00:12:11,000
I don't know if that was including or excluding me.
209
00:12:11,000 --> 00:12:13,000
That's kind of it goes both ways.
210
00:12:13,000 --> 00:12:14,000
Duh.
211
00:12:14,000 --> 00:12:20,000
Like having Justin, but also I have good friends in between your questionable moments of the
212
00:12:20,000 --> 00:12:22,000
fact that you don't drink coffee.
213
00:12:22,000 --> 00:12:25,000
But like, we won't go into that today.
214
00:12:25,000 --> 00:12:28,000
But it's like, I think that drink coffee.
215
00:12:28,000 --> 00:12:30,000
How's that even possible?
216
00:12:30,000 --> 00:12:31,000
Thank you.
217
00:12:31,000 --> 00:12:34,000
Like, like you work in tech and you have children.
218
00:12:34,000 --> 00:12:35,000
What is wrong with you?
219
00:12:35,000 --> 00:12:38,000
He, he does have a Dr. Pepper obsession.
220
00:12:38,000 --> 00:12:39,000
I love Dr. Pepper.
221
00:12:39,000 --> 00:12:43,000
Last night it was at an event and somebody had like, so it's, it's tech week here in
222
00:12:43,000 --> 00:12:44,000
Seattle and it's been phenomenal.
223
00:12:44,000 --> 00:12:47,000
You live in Seattle and I've never met you, David.
224
00:12:47,000 --> 00:12:49,000
What are you doing this afternoon?
225
00:12:49,000 --> 00:12:51,000
There's like three more events.
226
00:12:51,000 --> 00:12:52,000
What?
227
00:12:52,000 --> 00:12:53,000
It's tech week.
228
00:12:53,000 --> 00:12:54,000
Yeah.
229
00:12:54,000 --> 00:12:55,000
It's tech week, man.
230
00:12:55,000 --> 00:12:58,000
But I was going to say I was at an event yesterday afternoon all week.
231
00:12:58,000 --> 00:12:59,000
I've been drinking Diet Coke.
232
00:12:59,000 --> 00:13:00,000
Don't get me wrong.
233
00:13:00,000 --> 00:13:01,000
I love Diet Coke.
234
00:13:01,000 --> 00:13:03,000
But like at the same time, like I was at an event and they had Diet Dr. Pepper.
235
00:13:03,000 --> 00:13:06,000
I'm like, oh, you, how do I get in business with you?
236
00:13:06,000 --> 00:13:08,000
Diet Dr. Pepper is amazing.
237
00:13:08,000 --> 00:13:09,000
I love it.
238
00:13:09,000 --> 00:13:12,000
I love how you said, how do I get in business with you?
239
00:13:12,000 --> 00:13:14,000
I'm making this happen.
240
00:13:14,000 --> 00:13:16,000
I totally understand that.
241
00:13:16,000 --> 00:13:18,000
Do you know how hard it is to find Justin Dr. Pepper?
242
00:13:18,000 --> 00:13:19,000
It's not that hard.
243
00:13:19,000 --> 00:13:23,000
So I continue to ignore him at conference, annoy him at conferences.
244
00:13:23,000 --> 00:13:26,000
I don't understand how everyone is at drinking Diet Dr. Pepper.
245
00:13:26,000 --> 00:13:27,000
It's so much better.
246
00:13:27,000 --> 00:13:31,000
I went to three different stores at scale so I could give, be like, here's a Dr.
247
00:13:31,000 --> 00:13:34,000
Pepper and Rice Krispie so I can keep stealing your charges.
248
00:13:34,000 --> 00:13:37,000
It was the only reason I would go to Texas was to get good Dr. Pepper.
249
00:13:37,000 --> 00:13:39,000
They have the original OG sugar.
250
00:13:39,000 --> 00:13:41,000
Do they have a different Dr. Pepper?
251
00:13:41,000 --> 00:13:45,000
It was invented in Texas and they have real sugar Dr. Pepper.
252
00:13:45,000 --> 00:13:49,000
And so it started to percolate out some other places and one store near me sells it.
253
00:13:49,000 --> 00:13:50,000
And so I go there sometimes.
254
00:13:50,000 --> 00:13:52,000
It's fine.
255
00:13:52,000 --> 00:13:57,000
I said it's fine, but it didn't sound fine at all.
256
00:13:57,000 --> 00:14:00,000
Let me, let me finish up the story so we can get out to other interesting things.
257
00:14:00,000 --> 00:14:04,000
There's too much ADHD here, David.
258
00:14:04,000 --> 00:14:06,000
No shit.
259
00:14:06,000 --> 00:14:09,000
I'm like ADHD, like on ADHD.
260
00:14:10,000 --> 00:14:14,000
So anyhow, so Brian Grant and Brennan and whatever, they come up with these things
261
00:14:14,000 --> 00:14:16,000
and Brennan, you know, literally.
262
00:14:16,000 --> 00:14:19,000
Why are you not in business with Brian?
263
00:14:19,000 --> 00:14:21,000
Because he's got his own thing.
264
00:14:21,000 --> 00:14:22,000
Yeah, he's got his own thing.
265
00:14:22,000 --> 00:14:23,000
Yeah.
266
00:14:23,000 --> 00:14:24,000
I love what he's doing, by the way.
267
00:14:24,000 --> 00:14:25,000
Yeah.
268
00:14:25,000 --> 00:14:29,000
Like the configuration management and easy with another wonderful friend of mine that
269
00:14:29,000 --> 00:14:31,000
I got ideas of all the time, Alexis.
270
00:14:31,000 --> 00:14:32,000
Alexis.
271
00:14:32,000 --> 00:14:33,000
Yeah.
272
00:14:33,000 --> 00:14:34,000
Config hub.
273
00:14:34,000 --> 00:14:35,000
Config hub.
274
00:14:35,000 --> 00:14:36,000
Yeah.
275
00:14:36,000 --> 00:14:37,000
Huge, huge.
276
00:14:37,000 --> 00:14:40,000
I'm going to fly on the wall while you guys are having like technical discussions.
277
00:14:40,000 --> 00:14:42,000
Like, can I just sit in the background?
278
00:14:42,000 --> 00:14:43,000
I mean, we never have it.
279
00:14:43,000 --> 00:14:45,000
We never have a technical discussion.
280
00:14:45,000 --> 00:14:50,000
You get in the room and you're like arguing about like how, you know, whatever, blah,
281
00:14:50,000 --> 00:14:52,000
blah, blah, like bad mouth, blah, blah, blah.
282
00:14:52,000 --> 00:14:55,000
And like, oh, you see what these idiots are doing.
283
00:14:55,000 --> 00:14:59,000
I mean, I feel like your group chat is fire.
284
00:14:59,000 --> 00:15:01,000
More group chats.
285
00:15:01,000 --> 00:15:04,000
They're the best.
286
00:15:04,000 --> 00:15:09,000
I mean, our group chat is hilarious.
287
00:15:09,000 --> 00:15:10,000
I don't know.
288
00:15:10,000 --> 00:15:11,000
I don't know.
289
00:15:11,000 --> 00:15:12,000
It's an interesting question.
290
00:15:12,000 --> 00:15:13,000
Finish your story.
291
00:15:13,000 --> 00:15:21,000
Anyhow, so Brandon gets it running on his laptop in Java, like his total skunkworks.
292
00:15:21,000 --> 00:15:23,000
That was kind of a fork.
293
00:15:23,000 --> 00:15:28,000
It wasn't a fork, but it was kind of like a conceptual fork of the thing they were doing
294
00:15:28,000 --> 00:15:29,000
internally to Google.
295
00:15:29,000 --> 00:15:35,000
And then it starts to catch fire and somehow it breaks through, like, because Google was
296
00:15:35,000 --> 00:15:38,000
really internally opposed to Kubernetes.
297
00:15:38,000 --> 00:15:44,000
Not that they were, there was just a lot of motion around like what the hell is going
298
00:15:44,000 --> 00:15:47,000
on and, you know, what kind of team do we spin up?
299
00:15:47,000 --> 00:15:53,000
And then, you know, Craig McClucky and like I said, Brandon and Brian and all these people
300
00:15:53,000 --> 00:15:57,000
ended up forcing it through, get like, I think releasing it to the world like forcibly
301
00:15:57,000 --> 00:16:00,000
and then, you know, just kind of cascaded forward from there.
302
00:16:00,000 --> 00:16:04,000
And so I joined in January of 2015.
303
00:16:04,000 --> 00:16:09,000
And Craig was like, Hey, look, I need someone to take over Kubernetes management for me
304
00:16:09,000 --> 00:16:12,000
because I'm going to go off and work on three other things.
305
00:16:12,000 --> 00:16:14,000
I mean, there's another genius for you.
306
00:16:14,000 --> 00:16:19,000
And so he, he proceeds to go and do that.
307
00:16:19,000 --> 00:16:22,000
And I like launch GK.
308
00:16:22,000 --> 00:16:26,000
And so they're like, Well, all right, we're going to have this open source thing.
309
00:16:26,000 --> 00:16:30,000
We've got to, you know, get this project going.
310
00:16:30,000 --> 00:16:32,000
It was already like put it already been written in.
311
00:16:32,000 --> 00:16:34,000
There were some early versions and so on and so forth.
312
00:16:34,000 --> 00:16:42,000
But, you know, I started leading it and, and, you know, it was the three, three core pillars of
313
00:16:42,000 --> 00:16:54,000
of GKE compute under Navneet, Paul Nash, who was off running compute and
314
00:16:55,000 --> 00:16:56,000
totally blanking on his name.
315
00:16:56,000 --> 00:16:59,000
I feel terrible, but like it was the lead for App Engine.
316
00:16:59,000 --> 00:17:00,000
This was, this was 10 years ago.
317
00:17:00,000 --> 00:17:01,000
We're not old.
318
00:17:01,000 --> 00:17:03,000
I know, but I can't remember his name.
319
00:17:03,000 --> 00:17:05,000
I feel really terrible because he was great.
320
00:17:05,000 --> 00:17:08,000
Crazy, like in 10 years, how much things have changed?
321
00:17:08,000 --> 00:17:09,000
Oh, absolutely.
322
00:17:09,000 --> 00:17:10,000
Absolutely.
323
00:17:10,000 --> 00:17:11,000
So anyhow, so that was it.
324
00:17:11,000 --> 00:17:14,000
And it was just like, let's help people adopt containers.
325
00:17:14,000 --> 00:17:18,000
And, and for better or worse, it's not that we're opposed to AWS.
326
00:17:18,000 --> 00:17:20,000
It's just, we don't want people building on VMs.
327
00:17:20,000 --> 00:17:21,000
That was it.
328
00:17:21,000 --> 00:17:25,000
And we think the world is better if, if everyone isn't completely married to a
329
00:17:25,000 --> 00:17:30,000
VM, because a VM is so heavy weight, even the lightest weight VM to have an
330
00:17:30,000 --> 00:17:36,000
entire kernel to care about your serial port and your Ethernet driver.
331
00:17:36,000 --> 00:17:38,000
And I mean, it's just like, it's insane.
332
00:17:38,000 --> 00:17:42,000
Like let's, let's give people what they want an isolated environment that allows
333
00:17:42,000 --> 00:17:43,000
you to execute against things.
334
00:17:43,000 --> 00:17:46,000
And, and that was the, the whole idea of on the container.
335
00:17:46,000 --> 00:17:49,000
And then obviously letting people do a whole bunch of those at the same time was
336
00:17:49,000 --> 00:17:50,000
really powerful.
337
00:17:50,000 --> 00:17:54,000
And even just to like paint the scene of people that weren't in technology or
338
00:17:54,000 --> 00:17:57,000
weren't doing infrastructure around this time, right?
339
00:17:57,000 --> 00:18:04,000
Like Docker was kind of launched in, in 2014, the first Docker con was 2014.
340
00:18:04,000 --> 00:18:06,000
So this is still super early.
341
00:18:06,000 --> 00:18:11,000
ECS came out from AWS, which was like basically just like a big Docker engine
342
00:18:11,000 --> 00:18:13,000
in 2014.
343
00:18:13,000 --> 00:18:16,000
So this is within six months of all these other things.
344
00:18:16,000 --> 00:18:21,000
Google already had the app engine, which was already kind of this like,
345
00:18:21,000 --> 00:18:25,000
has sort of, you know, you didn't have to care that it was a container sort of
346
00:18:25,000 --> 00:18:27,000
environment where it's like, Hey, you just bring us your application that looks
347
00:18:27,000 --> 00:18:29,000
like this, we'll run it for you.
348
00:18:29,000 --> 00:18:32,000
No VM, no S management, all of that stuff is going to work.
349
00:18:32,000 --> 00:18:37,000
And then launching this new, very configurable kind of complex looking
350
00:18:37,000 --> 00:18:41,000
container engine into the world had to have contention because I know like all
351
00:18:41,000 --> 00:18:45,000
the internal Google stuff around Borg is like, well, you can't just ship Borg to
352
00:18:45,000 --> 00:18:46,000
other people.
353
00:18:46,000 --> 00:18:50,000
How do you wrap that to make it easier just like Hadoop?
354
00:18:50,000 --> 00:18:52,000
It must, it must have been like political struggle.
355
00:18:52,000 --> 00:18:56,000
I think even more than a technical struggle to be able to push that through.
356
00:18:56,000 --> 00:19:00,000
No, I mean, look, you know, we, again, we all forget, right?
357
00:19:00,000 --> 00:19:04,000
But, but Google was not successful and open source at that point, right?
358
00:19:04,000 --> 00:19:06,000
They were very successful publishing papers.
359
00:19:06,000 --> 00:19:13,000
But to that point, they had Android, which they bought and they had Angular.
360
00:19:13,000 --> 00:19:16,000
But other than that, they had Go.
361
00:19:16,000 --> 00:19:17,000
That's true.
362
00:19:17,000 --> 00:19:18,000
They didn't go.
363
00:19:18,000 --> 00:19:19,000
I take that back.
364
00:19:19,000 --> 00:19:20,000
That's the only other thing.
365
00:19:20,000 --> 00:19:23,000
But Go wasn't, Go wasn't as popular as it is now.
366
00:19:23,000 --> 00:19:25,000
It was like, you know what I mean?
367
00:19:25,000 --> 00:19:26,000
Yeah.
368
00:19:26,000 --> 00:19:27,000
Yeah.
369
00:19:27,000 --> 00:19:28,000
Yeah.
370
00:19:28,000 --> 00:19:29,000
Go.
371
00:19:29,000 --> 00:19:31,000
I don't think people know how old Go is because Go got so popular in the last
372
00:19:31,000 --> 00:19:32,000
few years.
373
00:19:32,000 --> 00:19:35,000
And then when they, like, and because so much of Kubernetes and certain
374
00:19:35,000 --> 00:19:37,000
infrastructure is built on it.
375
00:19:37,000 --> 00:19:41,000
Now it's like, I won't say it's like Java, but it's like, you can't avoid
376
00:19:41,000 --> 00:19:42,000
Go in a lot of ways.
377
00:19:42,000 --> 00:19:45,000
So much of the infrastructure tooling was Ruby before that.
378
00:19:45,000 --> 00:19:46,000
Right.
379
00:19:46,000 --> 00:19:47,000
Because Ruby on Rails exploded.
380
00:19:47,000 --> 00:19:48,000
That makes me so mad.
381
00:19:48,000 --> 00:19:49,000
And then there was.
382
00:19:49,000 --> 00:19:52,000
I have like flashbacks.
383
00:19:52,000 --> 00:19:55,000
Chef and Puppet were like, that was like, if you were doing infrastructure,
384
00:19:55,000 --> 00:19:58,000
you were doing config management and you had to know Ruby to be able to write
385
00:19:58,000 --> 00:19:59,000
Chef and Puppet.
386
00:19:59,000 --> 00:20:01,000
Oh, and now so many things at AWS make sense.
387
00:20:01,000 --> 00:20:02,000
Yeah.
388
00:20:02,000 --> 00:20:03,000
Yeah.
389
00:20:03,000 --> 00:20:04,000
Absolutely.
390
00:20:04,000 --> 00:20:06,000
I was like, why would you do this?
391
00:20:06,000 --> 00:20:07,000
Yeah.
392
00:20:07,000 --> 00:20:08,000
Absolutely.
393
00:20:08,000 --> 00:20:10,000
Well, now it's all TypeScript at AWS.
394
00:20:10,000 --> 00:20:11,000
Yeah.
395
00:20:11,000 --> 00:20:14,000
So I mean, like, again, it's, it was like, and so.
396
00:20:14,000 --> 00:20:15,000
I have a question.
397
00:20:15,000 --> 00:20:16,000
Oh, sorry, please.
398
00:20:16,000 --> 00:20:21,000
What do you think is harder politics trying to get things done internally and
399
00:20:21,000 --> 00:20:24,000
then like a mega corporation or open source?
400
00:20:24,000 --> 00:20:28,000
Because I feel like they're very two different, like.
401
00:20:28,000 --> 00:20:29,000
They're.
402
00:20:29,000 --> 00:20:31,000
You break a really interesting question.
403
00:20:31,000 --> 00:20:34,000
I, I, you know, they just are very different.
404
00:20:34,000 --> 00:20:39,000
The nice part about internal politics is there are at least defined motivations.
405
00:20:39,000 --> 00:20:41,000
It's very rare that someone's just absolute chaos.
406
00:20:41,000 --> 00:20:42,000
Right.
407
00:20:42,000 --> 00:20:45,000
Every now and then I'm like, you play D&D, huh?
408
00:20:45,000 --> 00:20:48,000
Cause you are just a chaos goblin for no reason.
409
00:20:48,000 --> 00:20:52,000
Like you just walk in and you're just like, for no reason.
410
00:20:52,000 --> 00:20:53,000
But yeah.
411
00:20:53,000 --> 00:20:54,000
So that's very rare.
412
00:20:54,000 --> 00:20:56,000
And I'm like, I don't know.
413
00:20:56,000 --> 00:20:57,000
I don't know.
414
00:20:57,000 --> 00:20:58,000
I don't know.
415
00:20:58,000 --> 00:20:59,000
I don't know.
416
00:20:59,000 --> 00:21:00,000
I don't know.
417
00:21:00,000 --> 00:21:01,000
I don't know.
418
00:21:01,000 --> 00:21:02,000
I don't know.
419
00:21:02,000 --> 00:21:03,000
Yeah.
420
00:21:03,000 --> 00:21:04,000
So that's very rare internally.
421
00:21:04,000 --> 00:21:08,000
At least you can say, okay, well that person has this job and this VP asked them to do
422
00:21:08,000 --> 00:21:09,000
this.
423
00:21:09,000 --> 00:21:10,000
So like that's a thing.
424
00:21:10,000 --> 00:21:14,000
I don't agree with that thing, but at least you can like unpack what, what they're doing.
425
00:21:14,000 --> 00:21:16,000
I think knowing your audience is important.
426
00:21:16,000 --> 00:21:17,000
100%.
427
00:21:17,000 --> 00:21:18,000
No.
428
00:21:18,000 --> 00:21:19,000
100%.
429
00:21:19,000 --> 00:21:23,000
And, and those politics almost always come down from cheese.
430
00:21:23,000 --> 00:21:27,000
I'm responsible for a thing and you are risking that thing.
431
00:21:27,000 --> 00:21:30,000
So I'm going to like be a dick to you.
432
00:21:30,000 --> 00:21:31,000
Right.
433
00:21:31,000 --> 00:21:37,000
And so you got to figure out because I love the way that you like, you're like, can we
434
00:21:37,000 --> 00:21:38,000
be friends?
435
00:21:38,000 --> 00:21:40,000
You're like, so this person has motivation.
436
00:21:40,000 --> 00:21:41,000
So they're going to be a dick to you.
437
00:21:41,000 --> 00:21:44,000
Like I felt that in my soul.
438
00:21:44,000 --> 00:21:50,000
So, but, but in open source, the, all you have is ego, right?
439
00:21:50,000 --> 00:21:53,000
And so ego can be way more irrational.
440
00:21:53,000 --> 00:21:59,000
It's, but sometimes it's like really like, you got to love the purist, right?
441
00:21:59,000 --> 00:22:04,000
Like, you know, like we, I feel like we all like really care about open source, but every
442
00:22:04,000 --> 00:22:08,000
now you get someone and you're like, do you go outside?
443
00:22:08,000 --> 00:22:09,000
Like,
444
00:22:09,000 --> 00:22:13,000
I don't, you know, it's funny.
445
00:22:13,000 --> 00:22:15,000
So let me give you an example.
446
00:22:15,000 --> 00:22:22,000
When I was leading Kubernetes, they're one of the first.
447
00:22:22,000 --> 00:22:24,000
Hardy debates we had in public.
448
00:22:24,000 --> 00:22:26,000
I was excited about what you're going to say.
449
00:22:26,000 --> 00:22:28,000
He said a party in the way you perked up.
450
00:22:28,000 --> 00:22:30,000
I was like, this is going to be good.
451
00:22:30,000 --> 00:22:37,000
There was a, I can't remember it because we just introduced job sets.
452
00:22:37,000 --> 00:22:39,000
So I can't remember what was the name.
453
00:22:39,000 --> 00:22:41,000
Oh, maybe it's a stateful set.
454
00:22:41,000 --> 00:22:44,000
But like, I think that's what we were calling it.
455
00:22:44,000 --> 00:22:45,000
We're still calling it that.
456
00:22:45,000 --> 00:22:49,000
But like at the time, 2015, there was a thing called pet set.
457
00:22:49,000 --> 00:22:50,000
Right.
458
00:22:51,000 --> 00:22:52,000
And
459
00:22:52,000 --> 00:22:53,000
What name do these things?
460
00:22:53,000 --> 00:22:55,000
What is a pets?
461
00:22:55,000 --> 00:22:57,000
So this is the old name for states.
462
00:22:57,000 --> 00:22:58,000
You bring up the right point.
463
00:22:58,000 --> 00:22:59,000
Right.
464
00:22:59,000 --> 00:23:01,000
So there was a thing called pet set.
465
00:23:01,000 --> 00:23:03,000
And that's because there was a whole idea.
466
00:23:03,000 --> 00:23:05,000
Oh, or things pet cattle or pets.
467
00:23:05,000 --> 00:23:06,000
Okay.
468
00:23:06,000 --> 00:23:10,000
Like, because you, you, you're going to put a bullet in a cattle, but you're not going
469
00:23:10,000 --> 00:23:12,000
to put a bullet in a, in a, right.
470
00:23:12,000 --> 00:23:15,000
And so the whole idea was like to keep it around.
471
00:23:15,000 --> 00:23:19,000
And this person submitted a bug to say, Hey, you should change the name of pets.
472
00:23:20,000 --> 00:23:25,000
And they, they get going to this big, long explanation, like, look, you know, animal
473
00:23:25,000 --> 00:23:28,000
welfare and this, that and the other and so on and so forth.
474
00:23:29,000 --> 00:23:31,000
And I'm not dismissing.
475
00:23:31,000 --> 00:23:36,000
Like what their feelings were, but like, that's, that's your deal, dude.
476
00:23:37,000 --> 00:23:38,000
That's not our deal.
477
00:23:38,000 --> 00:23:41,000
Like I, I, I get you want that, but we don't.
478
00:23:41,000 --> 00:23:43,000
It like, that's not going to help the project.
479
00:23:43,000 --> 00:23:44,000
The fact that this is the motivation.
480
00:23:44,000 --> 00:23:48,000
Now that said, the name is terrible.
481
00:23:48,000 --> 00:23:50,000
Exactly what you said, Autumn.
482
00:23:50,000 --> 00:23:51,000
What the fuck is that?
483
00:23:51,000 --> 00:23:52,000
What is that?
484
00:23:52,000 --> 00:23:53,000
What does that set even mean?
485
00:23:53,000 --> 00:23:54,000
That you're refusing.
486
00:23:54,000 --> 00:23:56,000
Why are you saying this?
487
00:23:56,000 --> 00:24:00,000
And so this is why having a lot of really smart friends that have been in the
488
00:24:00,000 --> 00:24:02,000
field for a long time is good.
489
00:24:02,000 --> 00:24:06,000
But every now and then a name comes out and I'm like, this is how we know you
490
00:24:06,000 --> 00:24:09,000
don't talk to anybody else besides white guys that have been in tech for forever.
491
00:24:09,000 --> 00:24:12,000
Who else knows what this means?
492
00:24:12,000 --> 00:24:14,000
And, and here you go.
493
00:24:14,000 --> 00:24:16,000
You can go search for it.
494
00:24:16,000 --> 00:24:18,000
It was, I can tell by Justin's face.
495
00:24:18,000 --> 00:24:20,000
He's already searching for it.
496
00:24:20,000 --> 00:24:21,000
Please.
497
00:24:21,000 --> 00:24:22,000
Please.
498
00:24:22,000 --> 00:24:24,000
It's, it's issue 27430.
499
00:24:24,000 --> 00:24:28,000
And this was, for the time, a small community.
500
00:24:28,000 --> 00:24:35,000
This was a, I don't know, 50 comment thread here.
501
00:24:35,000 --> 00:24:39,000
Like it was pretty, like there was a lot of annoying stuff, and I love that there
502
00:24:39,000 --> 00:24:41,000
was a 50 comment thread here.
503
00:24:41,000 --> 00:24:44,000
A lot of annoying stuff, but I love that there were 50.
504
00:24:44,000 --> 00:24:46,000
This is just, this is the epitome of open source.
505
00:24:46,000 --> 00:24:47,000
Okay.
506
00:24:47,000 --> 00:24:49,000
So when you get to the convergence, right?
507
00:24:49,000 --> 00:24:52,000
The, like, convergence zone of, like, open source.
508
00:24:52,000 --> 00:24:53,000
Oh no, sorry.
509
00:24:53,000 --> 00:24:56,000
120, 150, 150 total comments on this.
510
00:24:56,000 --> 00:24:57,000
Shut up.
511
00:24:57,000 --> 00:24:59,000
I love this.
512
00:24:59,000 --> 00:25:01,000
Like insane, insane.
513
00:25:01,000 --> 00:25:05,000
This is like those people who are so into open source that they refuse to use any
514
00:25:05,000 --> 00:25:06,000
proprietary software.
515
00:25:06,000 --> 00:25:08,000
So they refuse to use Google maps or anything.
516
00:25:08,000 --> 00:25:11,000
And you're just like, bro.
517
00:25:11,000 --> 00:25:13,000
But like, okay.
518
00:25:13,000 --> 00:25:15,000
How do you, okay.
519
00:25:15,000 --> 00:25:19,000
I think that Google was, I won't say one of the first, but you guys, your time at
520
00:25:19,000 --> 00:25:24,000
Google, I imagine that you really learned that convergence zone of proprietary
521
00:25:24,000 --> 00:25:27,000
software and corporate and open source.
522
00:25:27,000 --> 00:25:31,000
And I feel like we're in this kind of, I don't know.
523
00:25:31,000 --> 00:25:34,000
I don't know if I'd say a transition, but weird time.
524
00:25:34,000 --> 00:25:39,000
So like, what do you think about the politics of trying to both balance
525
00:25:39,000 --> 00:25:47,000
corporation politics, but also like interacting with like open source, you
526
00:25:47,000 --> 00:25:52,000
know, because it's very, it's, it makes it even more complicated when you're both
527
00:25:52,000 --> 00:25:56,000
arguing for something internally, but then you have to go to the politics of
528
00:25:56,000 --> 00:25:57,000
open source, right?
529
00:25:57,000 --> 00:26:02,000
Which I think a lot of open source is corporations right now, but when you're
530
00:26:02,000 --> 00:26:07,000
actually the person that's inside doing the arguing with that company, you have
531
00:26:07,000 --> 00:26:12,000
to really know what that company's goals and business and leadership principles are,
532
00:26:12,000 --> 00:26:16,000
and then fighting it in open source is a whole different battle.
533
00:26:16,000 --> 00:26:21,000
And I feel like it's almost like sometimes if people only work in proprietary
534
00:26:21,000 --> 00:26:25,000
software or they only work in open source, they don't realize what it's like to
535
00:26:25,000 --> 00:26:30,000
kind of do both. So can you speak on the whole, like how you do that and like your
536
00:26:30,000 --> 00:26:34,000
experience and kind of making those worlds happen?
537
00:26:34,000 --> 00:26:38,000
I think you, you touched on it earlier, right?
538
00:26:38,000 --> 00:26:41,000
It's about how to understand their motivations.
539
00:26:41,000 --> 00:26:46,000
You know, people who are like in that open source community or in the
540
00:26:46,000 --> 00:26:53,000
corporation are both humans and corporate employees, right?
541
00:26:53,000 --> 00:26:59,000
And so you've got to figure out how to balance all of that mess and the
542
00:26:59,000 --> 00:27:05,000
extent to which you can help them achieve their corporate goals, but still
543
00:27:05,000 --> 00:27:07,000
enable them to be humans.
544
00:27:07,000 --> 00:27:11,000
I mean, that's, that's the sweet spot you're looking to hit.
545
00:27:11,000 --> 00:27:15,000
I think trying to teach open source to people that are used to corporate
546
00:27:15,000 --> 00:27:20,000
America, you know, and corporate internals and then trying to like show
547
00:27:20,000 --> 00:27:22,000
the business value of open source.
548
00:27:22,000 --> 00:27:26,000
It's both hard and, like, my favorite thing, because there's so much
549
00:27:26,000 --> 00:27:32,000
value in open source and trying to get people to invest, but also understand
550
00:27:32,000 --> 00:27:36,000
the mindset of people that work in open source and what open source customers
551
00:27:36,000 --> 00:27:42,000
want because it's very different than proprietary like software, right?
552
00:27:42,000 --> 00:27:46,000
So trying to teach companies that are into proprietary software.
553
00:27:46,000 --> 00:27:49,000
Like I think Google is really interesting because they, they do, they're
554
00:27:49,000 --> 00:27:52,000
very academic in the papers and kind of where that flows.
555
00:27:52,000 --> 00:27:57,000
But if you look at it like Kubernetes is one of the like most thought out, like
556
00:27:57,000 --> 00:28:01,000
well built open source projects, which I think is going to like really
557
00:28:01,000 --> 00:28:03,000
change how that's done.
558
00:28:03,000 --> 00:28:07,000
Like if you look at how, like, the Linux Foundation and Kubernetes work, like
559
00:28:07,000 --> 00:28:10,000
you can tell that the Rust Foundation is really taking that into account when
560
00:28:10,000 --> 00:28:11,000
building it.
561
00:28:11,000 --> 00:28:16,000
So I feel like it's going to like change the future of how those like, so
562
00:28:16,000 --> 00:28:20,000
what's it like kind of seeing that from like the start and how corporate and
563
00:28:20,000 --> 00:28:23,000
open source kind of started this marriage and how we're still trying to
564
00:28:23,000 --> 00:28:24,000
navigate that.
565
00:28:24,000 --> 00:28:31,000
I mean, I think the thing is like it's the extent to which you become part of
566
00:28:31,000 --> 00:28:38,000
the critical chain of a corporate supply chain is where you start to do it.
567
00:28:38,000 --> 00:28:39,000
Right.
568
00:28:39,000 --> 00:28:44,000
So like, you know, to use a terrible example, you know, analogy here, right?
569
00:28:44,000 --> 00:28:45,000
It's like energy.
570
00:28:45,000 --> 00:28:46,000
Okay.
571
00:28:46,000 --> 00:28:48,000
You know, corporations don't care about energy.
572
00:28:48,000 --> 00:28:49,000
They don't care about electricity.
573
00:28:49,000 --> 00:28:50,000
Most of them don't, right?
574
00:28:50,000 --> 00:28:53,000
They're just like, all right, I plug in my laptop and it works and that allows me
575
00:28:53,000 --> 00:28:55,000
to, you know, produce some Excel files.
576
00:28:55,000 --> 00:29:00,000
If you ask them to like give a shit about, you know, what, what the
577
00:29:00,000 --> 00:29:06,000
transformer up the street is or what the, you know, greenness of your, you know,
578
00:29:06,000 --> 00:29:10,000
you know, power coming in is, they're just like, Hey, that sounds great, but like
579
00:29:10,000 --> 00:29:14,000
unless we're very forward looking, we're just not going to care.
580
00:29:14,000 --> 00:29:20,000
Now, if you can align it to whatever their business goals are, then you're
581
00:29:20,000 --> 00:29:22,000
going to be off to the races.
582
00:29:22,000 --> 00:29:24,000
And so if you're like, Hey, you know what?
583
00:29:24,000 --> 00:29:30,000
It turns out that putting a 10 kilowatt battery on every retail outlet means
584
00:29:30,000 --> 00:29:35,000
that, you know, they get fewer brownouts or something like that.
585
00:29:35,000 --> 00:29:39,000
And that improves the ability to sell and great, they're going to do that.
586
00:29:39,000 --> 00:29:45,000
And now it happens to forward your other goals of being green and resilient and,
587
00:29:45,000 --> 00:29:49,000
and, you know, getting rid of fossil fuels and whatever, but like, that's
588
00:29:49,000 --> 00:29:51,000
not how you sell it. You don't sell it that way.
589
00:29:51,000 --> 00:29:55,000
You sell it as, as part of this other thing that they care about.
590
00:29:55,000 --> 00:29:57,000
Open source is the same way, right?
591
00:29:57,000 --> 00:30:02,000
Like, you know, you're not, it is very unlikely that you're going to be able to
592
00:30:02,000 --> 00:30:05,000
walk into someone and say, Hey, you know what, you should do this because this
593
00:30:05,000 --> 00:30:08,000
is a social good and you need to support the Linux kernel.
594
00:30:08,000 --> 00:30:12,000
You need to support, you know, nginx or whatever because of X.
595
00:30:12,000 --> 00:30:18,000
What you need to say is like, Hey, do you know that like 84% of our stack runs on
596
00:30:18,000 --> 00:30:22,000
this open source project and we have zero upstream developers on it?
597
00:30:22,000 --> 00:30:24,000
Like that seems like a supply chain risk.
598
00:30:24,000 --> 00:30:28,000
We are not going to go and build this better than they do.
599
00:30:28,000 --> 00:30:35,000
Uh, so let's put someone on maintaining it or, or allocate some dollars or whatever
600
00:30:35,000 --> 00:30:37,000
it is, because that's a supply chain component.
601
00:30:37,000 --> 00:30:39,000
And again, that's just one example.
602
00:30:39,000 --> 00:30:40,000
There are many other reasons.
603
00:30:40,000 --> 00:30:45,000
Like I'm so passionate about like trying to explain that to corporate America
604
00:30:45,000 --> 00:30:49,000
because I feel like it's the only way that we're going to move forward with this
605
00:30:49,000 --> 00:30:54,000
kind of, like, this new change in open source, rather than trying to license
606
00:30:54,000 --> 00:30:57,000
everything and trying to make people pay for it in a different way.
607
00:30:57,000 --> 00:31:00,000
I don't think we're going to get that forward movement that they think they
608
00:31:00,000 --> 00:31:01,000
want.
609
00:31:01,000 --> 00:31:06,000
But I think really showing like, uh, corporate, like companies, like
610
00:31:06,000 --> 00:31:10,000
70% of infrastructure I think is built on open source.
611
00:31:10,000 --> 00:31:14,000
And I think really showing people the value of like, Hey, contribute these
612
00:31:14,000 --> 00:31:19,000
things upstream, learn to get into the politics of contributing
613
00:31:19,000 --> 00:31:21,000
and being a part of the community.
614
00:31:21,000 --> 00:31:25,000
And like, for one, everybody is doing more with less right now.
615
00:31:25,000 --> 00:31:26,000
Right.
616
00:31:26,000 --> 00:31:30,000
So the more that you're contributing upstream, everybody's on the same page.
617
00:31:30,000 --> 00:31:32,000
It's easier to maintain software together.
618
00:31:32,000 --> 00:31:34,000
There's so much actual business value.
619
00:31:34,000 --> 00:31:35,000
Absolutely.
620
00:31:35,000 --> 00:31:41,000
And I just feel so passionately about trying to get people on that page
621
00:31:41,000 --> 00:31:45,000
because I think that we can, for one, we can get people paid to be
622
00:31:45,000 --> 00:31:46,000
maintainers, right?
623
00:31:46,000 --> 00:31:47,000
Absolutely.
624
00:31:47,000 --> 00:31:50,000
But we can all, like people are going to be doing this development anyways.
625
00:31:50,000 --> 00:31:53,000
And instead of taking from open source and internally developing it,
626
00:31:53,000 --> 00:31:59,000
contributing it back to open source is not only going to make you a better
627
00:31:59,000 --> 00:32:04,000
steward of that community, but also why maintain it solo and like
628
00:32:04,000 --> 00:32:06,000
siloed when you can maintain it together?
629
00:32:06,000 --> 00:32:09,000
Like look at how Log4j was, like, so, you know,
630
00:32:09,000 --> 00:32:14,000
I mean, I would say also maintaining things solo is easier than trying to
631
00:32:14,000 --> 00:32:18,000
get consensus, like you slow down sometimes in what you're doing,
632
00:32:18,000 --> 00:32:22,000
but it depends on how big the project is.
633
00:32:22,000 --> 00:32:23,000
Yeah, absolutely.
634
00:32:23,000 --> 00:32:26,000
There's a tipping point because like at some point, like if you're going to
635
00:32:26,000 --> 00:32:29,000
like a legacy software like Java, Linux, all these places.
636
00:32:29,000 --> 00:32:31,000
Thank you for saying Java's legacy.
637
00:32:31,000 --> 00:32:34,000
Why are you always trying to hurt my soul?
638
00:32:34,000 --> 00:32:35,000
You said it.
639
00:32:35,000 --> 00:32:37,000
I just said how good you were friends.
640
00:32:37,000 --> 00:32:39,000
Oh yeah, she is.
641
00:32:39,000 --> 00:32:40,000
Terrible.
642
00:32:40,000 --> 00:32:41,000
Terrible.
643
00:32:41,000 --> 00:32:42,000
Come on.
644
00:32:42,000 --> 00:32:43,000
Join the 21st century.
645
00:32:43,000 --> 00:32:44,000
I'm still IT.
646
00:32:44,000 --> 00:32:47,000
In my defense, I haven't gotten to write Java in God knows how long.
647
00:32:47,000 --> 00:32:52,000
So like apparently I'm a Python and everything else head, and decrepit C
648
00:32:52,000 --> 00:32:54,000
is in every open source project.
649
00:32:54,000 --> 00:32:55,000
Yeah.
650
00:32:55,000 --> 00:32:56,000
But like, you know what I mean?
651
00:32:56,000 --> 00:32:58,000
Like if you think about it, they're the amount.
652
00:32:58,000 --> 00:32:59,000
Okay.
653
00:32:59,000 --> 00:33:06,000
In this world as engineers in 2025, everybody is so under like headcount.
654
00:33:06,000 --> 00:33:09,000
We haven't had headcount for years, right?
655
00:33:09,000 --> 00:33:13,000
We are doing more with less to get the extreme knowledge that you need for
656
00:33:13,000 --> 00:33:15,000
some of these open source projects.
657
00:33:15,000 --> 00:33:17,000
Think about how big Kubernetes is.
658
00:33:17,000 --> 00:33:22,000
Think about how big Linux is, how big Java is. To get that type
659
00:33:22,000 --> 00:33:26,000
of specialized knowledge,
660
00:33:26,000 --> 00:33:28,000
You're not going to just, unless you're buying a
661
00:33:28,000 --> 00:33:33,000
maintainer for a million dollars, like, and you'd have to have multiple of them,
662
00:33:33,000 --> 00:33:34,000
right?
663
00:33:34,000 --> 00:33:39,000
So if you can get, if you are putting that money towards getting developers to
664
00:33:39,000 --> 00:33:45,000
learn how to like become parts of those ecosystems, you now are like a force
665
00:33:45,000 --> 00:33:50,000
multiplying your developers because you're maintaining it and you're
666
00:33:50,000 --> 00:33:51,000
contributing to this ecosystem.
667
00:33:51,000 --> 00:33:54,000
So if you have four corporations that are huge and they have the smartest
668
00:33:54,000 --> 00:33:59,000
minds and they're all now adding to this open source project, really, that's a
669
00:33:59,000 --> 00:34:03,000
force multiplier because when you have a horrible bug like Log4j or something,
670
00:34:03,000 --> 00:34:08,000
if you have four smart, huge, like not huge, but smart teams, right, that are
671
00:34:08,000 --> 00:34:12,000
working on the same problem, it's now more secure than it would have been if
672
00:34:12,000 --> 00:34:13,000
you siloed that.
673
00:34:13,000 --> 00:34:14,000
Yeah.
674
00:34:14,000 --> 00:34:15,000
You know what I mean?
675
00:34:15,000 --> 00:34:18,000
That's like the whole reason for the CNCF as a foundation, right?
676
00:34:18,000 --> 00:34:21,000
So that these big corporations can work together in a neutral place because...
677
00:34:21,000 --> 00:34:25,000
But we have to teach people that because a lot of corporate America just thinks
678
00:34:25,000 --> 00:34:29,000
of like, I'm going to go pull this repo down, pretend like I didn't pull it down.
679
00:34:29,000 --> 00:34:34,000
And then, you know, and it's like, bro, it's better for your business.
680
00:34:34,000 --> 00:34:39,000
It is more business value for you to contribute these things upstream.
681
00:34:39,000 --> 00:34:42,000
I know you're going to have a little bit of politics, but hire someone who knows
682
00:34:42,000 --> 00:34:43,000
how to do that.
683
00:34:43,000 --> 00:34:46,000
They are in the market right now and they could use a job, you know what I mean?
684
00:34:46,000 --> 00:34:50,000
And do the work because it really is a force multiplier when you look at it.
685
00:34:50,000 --> 00:34:51,000
You know what I mean?
686
00:34:51,000 --> 00:34:55,000
So David, we're going to skip over everything you did at Microsoft because it doesn't work.
687
00:34:55,000 --> 00:34:57,000
And we're just going to jump right into it.
688
00:34:57,000 --> 00:34:58,000
I want to know all the things.
689
00:34:58,000 --> 00:34:59,000
Tell us all the cool things.
690
00:34:59,000 --> 00:35:01,000
I will talk for as long as you'd like.
691
00:35:01,000 --> 00:35:03,000
I'll come back for part two or whatever.
692
00:35:03,000 --> 00:35:07,000
I'm not that interesting, but I love hearing my own voice because I'm narcissistic or
693
00:35:07,000 --> 00:35:08,000
I don't know.
694
00:35:08,000 --> 00:35:14,000
There's no way you were at any of these places and haven't
695
00:35:14,000 --> 00:35:15,000
gotten into the politics.
696
00:35:15,000 --> 00:35:16,000
All right.
697
00:35:16,000 --> 00:35:17,000
Kubeflow.
698
00:35:17,000 --> 00:35:18,000
There's a lot of drama there.
699
00:35:18,000 --> 00:35:24,000
Oh, you are like my favorite type of human because not only do you have all the intellect,
700
00:35:24,000 --> 00:35:27,000
but you actually have a personality, right?
701
00:35:27,000 --> 00:35:32,000
Because like, bro, you all know that sometimes you get the engineers and you're just like,
702
00:35:32,000 --> 00:35:33,000
oh, good Lord.
703
00:35:33,000 --> 00:35:39,000
Pulling any kind of personality and socialness out of them is just like, it's so hard.
704
00:35:39,000 --> 00:35:41,000
That is incredibly kind.
705
00:35:41,000 --> 00:35:42,000
I don't know.
706
00:35:42,000 --> 00:35:45,000
I don't know if I deserve that, but thank you.
707
00:35:45,000 --> 00:35:47,000
So I will skip over all that stuff.
708
00:35:47,000 --> 00:35:48,000
What else would you like to?
709
00:35:48,000 --> 00:35:50,000
So I'm kind of curious.
710
00:35:50,000 --> 00:35:57,000
At what point did you decide that data co-located with compute was a problem to solve?
711
00:35:57,000 --> 00:35:59,000
And what are you doing that's new?
712
00:35:59,000 --> 00:36:04,000
Out of all the interesting questions you could ask David, like all the tea, look at his eyes.
713
00:36:04,000 --> 00:36:10,000
Like they're just, it's hiding behind the eyes and it wants to come out.
714
00:36:10,000 --> 00:36:15,000
And the funny part is like, it's funny that we started with Kubernetes because that was it.
715
00:36:15,000 --> 00:36:16,000
Right?
716
00:36:16,000 --> 00:36:20,000
It's one of these things where you're like, when you hear something for like 10 years
717
00:36:20,000 --> 00:36:25,000
and you're like, oh yeah, I actually heard this problem a million years ago.
718
00:36:25,000 --> 00:36:27,000
So a little bit more history for you.
719
00:36:27,000 --> 00:36:32,000
When I was at Google, like one of the first features I wanted
720
00:36:32,000 --> 00:36:36,000
to launch, one of the first PRDs I wrote when I was on the Kubernetes team, was a
721
00:36:36,000 --> 00:36:38,000
product called Ubernetes.
722
00:36:38,000 --> 00:36:43,000
And it was the idea of how do you federate Kubernetes clusters together?
723
00:36:43,000 --> 00:36:44,000
Right?
724
00:36:44,000 --> 00:36:45,000
How do you have an API server?
725
00:36:45,000 --> 00:36:47,000
I'm sad this didn't work out because the name is cool.
726
00:36:47,000 --> 00:36:49,000
I mean, it was like, it was genius.
727
00:36:49,000 --> 00:36:51,000
It was a genius name, Ubernetes, right?
728
00:36:51,000 --> 00:36:57,000
But the problem was that these things don't work together, right?
729
00:36:57,000 --> 00:36:58,000
Kubernetes is incredible.
730
00:36:58,000 --> 00:37:04,000
It's not going anywhere, but it is built for a world in which nodes have continual connectivity
731
00:37:04,000 --> 00:37:06,000
between themselves and the API server.
732
00:37:06,000 --> 00:37:12,000
And if it goes away for even a small amount of time, Kubernetes is really unhappy.
733
00:37:12,000 --> 00:37:20,000
And, and so then you had someone come along like Brian from Chick-fil-A who has that amazing
734
00:37:20,000 --> 00:37:25,000
2017 talk about how they have a Kubernetes cluster in every single Chick-fil-A.
735
00:37:25,000 --> 00:37:27,000
And they still have that today, right?
736
00:37:27,000 --> 00:37:28,000
It's incredible.
737
00:37:28,000 --> 00:37:29,000
Doesn't Walmart have that too?
738
00:37:29,000 --> 00:37:30,000
Yeah.
739
00:37:30,000 --> 00:37:32,000
Almost everybody. Starbucks and a bunch of other people.
740
00:37:32,000 --> 00:37:35,000
And it's all kind of insane, right?
741
00:37:35,000 --> 00:37:38,000
Like you're kind of like, hey, you know, there's something weird about this, right?
742
00:37:38,000 --> 00:37:41,000
Why isn't there a platform that sits over the top?
743
00:37:41,000 --> 00:37:46,000
And, you know, when I was thinking about it, it really was around, you know, data, right?
744
00:37:46,000 --> 00:37:50,000
And it's the, it's the idea that like data is the challenge here.
745
00:37:50,000 --> 00:37:54,000
I like to say like there, there are three things that, that will never change, right?
746
00:37:54,000 --> 00:37:58,000
One, data growing, not really in dispute, data will continue to grow.
747
00:37:58,000 --> 00:38:01,000
But, but the key is that it will grow everywhere, right?
748
00:38:01,000 --> 00:38:06,000
Not in a central data center in Iowa or Oregon or Tokyo, but, you know,
749
00:38:06,000 --> 00:38:10,000
cross-zone, cross-region, cross-cloud, on-prem, edge, IoT, blah, blah, blah, blah, blah.
750
00:38:10,000 --> 00:38:15,000
Like data is coming from all those places and Kubernetes is not in those places,
751
00:38:15,000 --> 00:38:17,000
nor are any of these other giant data warehouses.
752
00:38:17,000 --> 00:38:21,000
They're all sitting inside of your massive data center as they should.
753
00:38:21,000 --> 00:38:24,000
But somehow that data is there and you got to figure out how to get it in.
754
00:38:24,000 --> 00:38:26,000
And it can't just be a log shipper.
755
00:38:26,000 --> 00:38:27,000
Cause guess what?
756
00:38:27,000 --> 00:38:29,000
Look at what happened with Log4j.
757
00:38:29,000 --> 00:38:30,000
Exactly what you're saying earlier.
758
00:38:30,000 --> 00:38:33,000
You ship raw stuff from the edge.
759
00:38:33,000 --> 00:38:35,000
Goes to your central data warehouse.
760
00:38:35,000 --> 00:38:36,000
Bam.
761
00:38:36,000 --> 00:38:37,000
Security vulnerability.
762
00:38:37,000 --> 00:38:38,000
Guaranteed.
763
00:38:38,000 --> 00:38:40,000
And we haven't even gotten into the other things, right?
764
00:38:40,000 --> 00:38:44,000
So that's number one. Number two, speed of light, not getting any faster.
765
00:38:44,000 --> 00:38:45,000
Right?
766
00:38:45,000 --> 00:38:46,000
Just it is what it is.
767
00:38:46,000 --> 00:38:52,000
In 10,000 years, it will still be 49 millisecond ping time between Boston and LA.
768
00:38:52,000 --> 00:38:53,000
It just will.
769
00:38:53,000 --> 00:38:58,000
And so if you want to do anything that is faster than that, you're going to need a system
770
00:38:58,000 --> 00:39:01,000
that like can take the action remotely.
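A rough back-of-the-envelope check on that ping figure, in Python; the distance, fiber speed, and route-inflation factor are assumed round numbers, not anything stated in the episode:

# Back-of-the-envelope check on that Boston-to-LA ping figure.
# Assumed round numbers: ~4,200 km great-circle distance, light in
# fiber at roughly 200,000 km/s, and real routes ~20% longer than
# the great circle.

DISTANCE_KM = 4200       # Boston to LA, great-circle (approx.)
FIBER_KM_PER_MS = 200    # ~200,000 km/s => 200 km per millisecond
ROUTE_FACTOR = 1.2       # fiber paths are not straight lines

one_way_ms = DISTANCE_KM * ROUTE_FACTOR / FIBER_KM_PER_MS
round_trip_ms = 2 * one_way_ms
print(f"one-way ~{one_way_ms:.0f} ms, round trip ~{round_trip_ms:.0f} ms")
# -> one-way ~25 ms, round trip ~50 ms: right around that 49 ms figure,
#    and no amount of compute changes it.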
771
00:39:01,000 --> 00:39:04,000
But on top of that, like networking is just not keeping up.
772
00:39:04,000 --> 00:39:07,000
And it's not because they aren't out there busting their ass.
773
00:39:07,000 --> 00:39:09,000
It's cause data is growing even faster.
774
00:39:09,000 --> 00:39:11,000
And then the CAP theorem gets you every time.
775
00:39:11,000 --> 00:39:12,000
Sorry, go ahead.
776
00:39:12,000 --> 00:39:14,000
The CAP theorem gets you every time.
777
00:39:14,000 --> 00:39:15,000
CAP theorem gets you every time.
778
00:39:15,000 --> 00:39:16,000
Thank you very much.
779
00:39:16,000 --> 00:39:20,000
So then the third is around security and regulations and those are growing, right?
780
00:39:20,000 --> 00:39:22,000
So GDPR and HIPAA and things like that.
781
00:39:22,000 --> 00:39:27,000
You, you tend to put yourself at risk the moment you move a bit off wherever you generated it.
782
00:39:27,000 --> 00:39:28,000
Right?
783
00:39:28,000 --> 00:39:32,000
Now you, you're a wonderful segue because that's exactly it.
784
00:39:32,000 --> 00:39:37,000
Every major platform today is built around the C and the A of CAP theorem, right?
785
00:39:37,000 --> 00:39:42,000
Consistency, or consensus, whatever you want to say, and availability.
786
00:39:42,000 --> 00:39:43,000
Right.
787
00:39:43,000 --> 00:39:44,000
That's amazing.
788
00:39:44,000 --> 00:39:46,000
Something should be built around the other half of it.
789
00:39:46,000 --> 00:39:49,000
Availability and support for network partitioning.
790
00:39:49,000 --> 00:39:52,000
And that's because of all those things I just said.
791
00:39:52,000 --> 00:39:56,000
And, and, you know, when you go and look at the Chick-fil-A example or the Home Depot or
792
00:39:56,000 --> 00:40:00,000
the millions of other folks out there who have these multiple deployments, retail outlets,
793
00:40:00,000 --> 00:40:04,000
manufacturing, security, et cetera, et cetera, this is the problem.
794
00:40:04,000 --> 00:40:05,000
Right?
795
00:40:05,000 --> 00:40:06,000
Because that data is over there.
796
00:40:06,000 --> 00:40:09,000
I want to do things declaratively.
797
00:40:09,000 --> 00:40:14,000
I want to take action over my data before I move it, but I still want it to move.
798
00:40:14,000 --> 00:40:21,000
So how do I do that when the network could go away for a minute, an hour, a day, because
799
00:40:21,000 --> 00:40:25,000
someone's going to put a backhoe through something, who knows what. I want those
800
00:40:25,000 --> 00:40:26,000
systems to keep working.
801
00:40:26,000 --> 00:40:30,000
But when they reconnect, I want someone to like eventually bring this to consistency.
802
00:40:30,000 --> 00:40:31,000
And that's what we provided.
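A minimal sketch of that reconcile-on-reconnect idea; this is a generic last-writer-wins merge for illustration, not Expanso's actual protocol:

# Minimal sketch of reconciling state after a partition heals.
# Generic last-writer-wins on a per-key logical clock; this is an
# illustration of eventual consistency, NOT Expanso's actual protocol.

def merge(local: dict, remote: dict) -> dict:
    """Each value is (logical_timestamp, payload); higher timestamp wins."""
    merged = dict(local)
    for key, (ts, payload) in remote.items():
        if key not in merged or ts > merged[key][0]:
            merged[key] = (ts, payload)
    return merged

# While disconnected, both sides kept accepting writes:
central = {"job/alert-threshold": (7, {"sensors": 4})}
edge = {"job/alert-threshold": (5, {"sensors": 4}),
        "job/firmware": (9, {"version": "1.2.3"})}

# On reconnect, merging in either order converges to the same state:
assert merge(central, edge) == merge(edge, central)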
803
00:40:31,000 --> 00:40:38,000
I also think that's even more value because for one, the more you spread your data, the
804
00:40:38,000 --> 00:40:40,000
more you're spreading your attack surface.
805
00:40:40,000 --> 00:40:41,000
You know what I mean?
806
00:40:41,000 --> 00:40:42,000
Absolutely.
807
00:40:42,000 --> 00:40:46,000
And then I think that networking and security are both things that developers aren't always
808
00:40:46,000 --> 00:40:50,000
educated on and they're very like in depth areas, right?
809
00:40:50,000 --> 00:40:56,000
So the more that you can do in those areas to set them up for success, the better because
810
00:40:56,000 --> 00:41:01,000
and now that we're going to have more agents just, you know.
811
00:41:01,000 --> 00:41:02,000
Doing whatever agents do?
812
00:41:02,000 --> 00:41:03,000
Yeah.
813
00:41:03,000 --> 00:41:08,000
Well, the thing is that we are struggling to teach developers security,
814
00:41:08,000 --> 00:41:09,000
right?
815
00:41:09,000 --> 00:41:13,000
So now if they don't understand all of the security and then they are giving permissions
816
00:41:13,000 --> 00:41:19,000
to agents that they already don't understand, it's just a recipe for, like, the more that
817
00:41:19,000 --> 00:41:23,000
you're scaling this, yeah, you're scaling it, but you're scaling disaster in some ways.
818
00:41:23,000 --> 00:41:24,000
You know what I mean?
819
00:41:24,000 --> 00:41:30,000
So it's like, I think that this is going to, that has, like, so much value on so many levels.
820
00:41:30,000 --> 00:41:32,000
I mean, I'm sorry, please.
821
00:41:32,000 --> 00:41:36,000
David, what you just described is basically just the, all the benefits of edge computing,
822
00:41:36,000 --> 00:41:37,000
right?
823
00:41:37,000 --> 00:41:38,000
Yeah.
824
00:41:38,000 --> 00:41:41,000
Like we can, we can, how do we get more compute at the edge?
825
00:41:41,000 --> 00:41:42,000
How do we make it more powerful?
826
00:41:42,000 --> 00:41:45,000
How do we make it easier to manage from a central place?
827
00:41:45,000 --> 00:41:48,000
So what is it that you're doing that's different than like, I don't know what to say, traditional
828
00:41:48,000 --> 00:41:53,000
edge computing, but like the idea behind edge computing is just like, put compute
829
00:41:53,000 --> 00:41:58,000
closer, save milliseconds, have it better, have better storage, whatever, to where the data's
830
00:41:58,000 --> 00:41:59,000
being created.
831
00:41:59,000 --> 00:42:00,000
Yeah.
832
00:42:00,000 --> 00:42:02,000
So, you know, that's, we don't provide the edge compute.
833
00:42:02,000 --> 00:42:07,000
That is my, what do you call it here, by the way, my visual aid, my Raspberry Pi, right?
834
00:42:07,000 --> 00:42:08,000
That we run great on.
835
00:42:08,000 --> 00:42:11,000
Well, I don't know why you're coming apart here.
836
00:42:11,000 --> 00:42:12,000
That's weird.
837
00:42:12,000 --> 00:42:14,000
It's a Raspberry Pi.
838
00:42:14,000 --> 00:42:15,000
Yeah, exactly.
839
00:42:15,000 --> 00:42:17,000
But no, you're exactly right.
840
00:42:17,000 --> 00:42:21,000
And what we provide is the distributed consensus layer.
841
00:42:21,000 --> 00:42:28,000
And what that means is that like, turns out that that thing that you put on the edge is
842
00:42:28,000 --> 00:42:31,000
wonderful, but how do I know what is there?
843
00:42:31,000 --> 00:42:33,000
How do I know it stays running?
844
00:42:33,000 --> 00:42:35,000
How do I change the configuration?
845
00:42:35,000 --> 00:42:39,000
How do I do all this in a network and disconnected friendly way?
846
00:42:39,000 --> 00:42:41,000
That is the challenge of distributed computing.
847
00:42:41,000 --> 00:42:44,000
That is 40 years of academic research.
848
00:42:44,000 --> 00:42:51,000
And what we give you is a Kubernetes or container or orchestrated like experience, but one that
849
00:42:51,000 --> 00:42:55,000
is resilient to eventual consistency.
850
00:42:55,000 --> 00:42:59,000
And so if something happened on our side while you were disconnected, if something happened
851
00:42:59,000 --> 00:43:03,000
on that side while you were disconnected, you know, we reconcile it. One
852
00:43:03,000 --> 00:43:08,000
of the big things that we keep seeing is what we call intelligent data pipelines, right?
853
00:43:08,000 --> 00:43:15,000
Where you can much more intelligently do some, not all, some of your processing before you move
854
00:43:15,000 --> 00:43:16,000
it.
855
00:43:16,000 --> 00:43:21,000
So a trivial example I talk about all the time is a lot of times factory owners, for example,
856
00:43:21,000 --> 00:43:24,000
will have all these sensors all over the place and they're great.
857
00:43:24,000 --> 00:43:30,000
But the sensors usually come straight from a warehouse in Shenzhen and get installed immediately,
858
00:43:30,000 --> 00:43:31,000
right?
859
00:43:31,000 --> 00:43:36,000
And they have no information, you know, no GPS, no, no schema.
860
00:43:36,000 --> 00:43:41,000
A lot of times they'll just output like a raw text string and you jam that
861
00:43:41,000 --> 00:43:43,000
into a backend, right?
862
00:43:43,000 --> 00:43:48,000
Well, geez, the moment you do that, you now have to reverse engineer all of the information
863
00:43:48,000 --> 00:43:50,000
that you lost along the way, right?
864
00:43:50,000 --> 00:43:52,000
What kind of machine was it running on?
865
00:43:52,000 --> 00:43:53,000
What was the firmware?
866
00:43:53,000 --> 00:43:54,000
What was the disk?
867
00:43:54,000 --> 00:43:57,000
What was, you know, what, where in the room was it?
868
00:43:57,000 --> 00:43:59,000
You know, so on and so forth.
869
00:43:59,000 --> 00:44:02,000
And so if you took something, right, you took one of these Raspberry Pis, you stuck it inside
870
00:44:02,000 --> 00:44:08,000
that factory and you said, Hey, you know what, before you send this raw into the back end,
871
00:44:08,000 --> 00:44:14,000
send it to this local machine or like local being like within the region and attach some
872
00:44:14,000 --> 00:44:19,000
metadata to it and do some initial data model enforcement and do some schematization.
873
00:44:19,000 --> 00:44:23,000
So change it from a flat text string into JSON or, you know, structured logging or whatever
874
00:44:23,000 --> 00:44:26,000
and take your pick, but still go to the backend.
875
00:44:26,000 --> 00:44:31,000
All you're doing is you're moving, like I say, you know, you may have a 17 step ETL pipeline
876
00:44:31,000 --> 00:44:34,000
and all your enterprise customers are like, Yeah, right.
877
00:44:34,000 --> 00:44:35,000
Add a zero buddy, right?
878
00:44:35,000 --> 00:44:42,000
But like, you know, take, you take those first, I don't know, four or five steps of your pipeline,
879
00:44:42,000 --> 00:44:48,000
data and model enforcement, schematization, adding metadata, adding provenance, adding location,
880
00:44:48,000 --> 00:44:53,000
filtering, aggregation, just do some of those things before you move it.
881
00:44:53,000 --> 00:44:59,000
And magically, all those things downstream become better, faster, smarter.
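As a rough illustration, those first few edge steps might look something like this; the sensor format and field names are hypothetical:

# Rough sketch of the first few ETL steps done at the edge before the
# data ships: parse the raw sensor string, enforce a schema, attach
# provenance metadata. Sensor format and field names are hypothetical.
import json
import time

SCHEMA = {"sensor_id": str, "temp_c": float}

def enrich(raw_line: str, node: str, site: str) -> str:
    sensor_id, temp = raw_line.strip().split(",")  # e.g. "s-104,81.5"
    record = {"sensor_id": sensor_id, "temp_c": float(temp)}
    # Schema enforcement: fail fast here, not in the warehouse.
    for field, ftype in SCHEMA.items():
        assert isinstance(record[field], ftype), f"bad field: {field}"
    # Metadata that is cheap to attach here and painful to
    # reverse-engineer downstream.
    record["_meta"] = {"node": node, "site": site, "ingested_at": time.time()}
    return json.dumps(record)

print(enrich("s-104,81.5", node="rpi-factory-7", site="shop-floor-2"))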
882
00:44:59,000 --> 00:45:01,000
You can multi-home stuff.
883
00:45:01,000 --> 00:45:05,000
So a lot of times, for example, you know, you might have these all these sensors pushing
884
00:45:05,000 --> 00:45:11,000
into the back end, no matter how fast your pipeline is, it might take, you know, five,
885
00:45:11,000 --> 00:45:15,000
10 minutes to ultimately go all the way through the pipeline, very, very common, not because
886
00:45:15,000 --> 00:45:19,000
the pipeline isn't like busting its ass, but because it needs to aggregate from all these
887
00:45:19,000 --> 00:45:21,000
sources before it does anything.
888
00:45:21,000 --> 00:45:27,000
Imagine that you have a factory floor where four different sensors are simultaneously saying,
889
00:45:27,000 --> 00:45:31,000
Hey, you know what, we're above our temperature threshold, right?
890
00:45:31,000 --> 00:45:35,000
Um, do you want to wait 10 minutes to know that?
891
00:45:35,000 --> 00:45:39,000
Wouldn't it be nice if you could trigger an event from that location?
892
00:45:39,000 --> 00:45:40,000
We can do that for you.
893
00:45:40,000 --> 00:45:44,000
And again, it's just by taking some of this and moving that out there and saying, Hey,
894
00:45:44,000 --> 00:45:48,000
you know what, we're still going to send all the raw data back, but simultaneously, we're
895
00:45:48,000 --> 00:45:52,000
also going to, you know, trigger PagerDuty or whatever, take your pick,
896
00:45:52,000 --> 00:45:56,000
BigQuery, you know, any kind of other endpoint, SQS, we're going to trigger that from this
897
00:45:56,000 --> 00:45:57,000
location.
898
00:45:57,000 --> 00:45:59,000
We can help you do that too.
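A toy version of that local trigger; the threshold, window, and alert sink are made-up placeholders:

# Toy version of triggering locally instead of waiting on the central
# pipeline: fire when N sensors are over a threshold within a window.
# Threshold, window, and the alert sink are made-up placeholders; the
# raw readings would still ship to the backend as usual.
import time

THRESHOLD_C = 80.0
MIN_SENSORS = 4
WINDOW_SEC = 60.0

recent_hot = {}  # sensor_id -> last time it was seen over threshold

def alert(sensors: list) -> None:
    # Stand-in for PagerDuty, SQS, or whatever endpoint you'd hit.
    print(f"ALERT: {len(sensors)} sensors over {THRESHOLD_C}C: {sensors}")

def on_reading(sensor_id: str, temp_c: float) -> None:
    now = time.time()
    if temp_c > THRESHOLD_C:
        recent_hot[sensor_id] = now
    hot = [s for s, t in recent_hot.items() if now - t <= WINDOW_SEC]
    if len(hot) >= MIN_SENSORS:
        alert(hot)

for i, temp in enumerate([82.1, 83.0, 81.4, 85.2]):
    on_reading(f"s-{i}", temp)  # fourth hot reading fires the alert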
899
00:45:59,000 --> 00:46:02,000
Again, we're not like stopping the rest of it.
900
00:46:02,000 --> 00:46:05,000
We're not even, you know, we can save you money if you want, we can reduce data if you
901
00:46:05,000 --> 00:46:10,000
want, we can do other things, but even just putting us there helps you be intelligent
902
00:46:10,000 --> 00:46:12,000
in a more distributed way.
903
00:46:12,000 --> 00:46:18,000
And to your point earlier about what edge compute, uh, doesn't do here, it's not that
904
00:46:18,000 --> 00:46:20,000
edge compute isn't critical to this.
905
00:46:20,000 --> 00:46:23,000
We can't operate without some form of edge compute.
906
00:46:23,000 --> 00:46:27,000
It's really about the orchestration of that job.
907
00:46:27,000 --> 00:46:32,000
So again, let's say you're like, Hey, you know what, I want to change this from being
908
00:46:32,000 --> 00:46:37,000
four, you know, uh, sensors going bad to five sensors going bad.
909
00:46:37,000 --> 00:46:39,000
Imagine what that involves today.
910
00:46:39,000 --> 00:46:40,000
Right.
911
00:46:40,000 --> 00:46:41,000
How do you actually push that down?
912
00:46:41,000 --> 00:46:44,000
How do you know what version of this job is running?
913
00:46:44,000 --> 00:46:48,000
How do you know what the last time you saw this error is, whatever it might be.
914
00:46:48,000 --> 00:46:51,000
Um, all of that is hard to get down to these places.
915
00:46:51,000 --> 00:46:56,000
And we give you a clean API that is resilient to these networks that gives you, you know,
916
00:46:56,000 --> 00:47:01,000
full and rich intelligent pipelines at the edge and help you push that stuff through.
917
00:47:01,000 --> 00:47:06,000
Uh, and by the way, when people talk about the next trillion devices that are out there,
918
00:47:06,000 --> 00:47:09,000
all of which are doing inference at the edge, we do that too.
919
00:47:09,000 --> 00:47:10,000
Right.
920
00:47:10,000 --> 00:47:16,000
Like, because at the end of the day, you know, inference is just remote data being processed
921
00:47:16,000 --> 00:47:19,000
and, and we help you do that as well.
922
00:47:19,000 --> 00:47:25,000
I think this is going to be really cool too, because now with AI and all the different,
923
00:47:25,000 --> 00:47:27,000
we want to get data from everything.
924
00:47:27,000 --> 00:47:32,000
It's the promise of the future, but to be able to analyze all that, people don't understand
925
00:47:32,000 --> 00:47:34,000
distributed systems.
926
00:47:34,000 --> 00:47:39,000
Like, especially when it comes to, like, IoT is one aspect,
927
00:47:39,000 --> 00:47:42,000
but then you have the retail aspect where you don't want to charge people's cards more
928
00:47:42,000 --> 00:47:43,000
than once.
929
00:47:43,000 --> 00:47:47,000
So you have to have that eventual consistency and really worry about how you're handling it. So like
930
00:47:47,000 --> 00:47:51,000
I went and made an order from Happy Lemon the other day and I was on my way to a tattoo
931
00:47:51,000 --> 00:47:55,000
appointment and I was trying to speed this up because I'm always late to those,
932
00:47:55,000 --> 00:48:00,000
but, but I ended up getting there and I'm like, why isn't my order like ready?
933
00:48:00,000 --> 00:48:05,000
So I go and buy it and then all of a sudden my order comes through and I think it was
934
00:48:05,000 --> 00:48:08,000
because they had a like connection issue and I'm just sitting there like something's going
935
00:48:08,000 --> 00:48:14,000
on in your back end, like you have an eventually consistent, like, database, but I'm sitting
936
00:48:14,000 --> 00:48:18,000
there in Happy Lemon, like trying to figure out what's wrong with their back end.
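That double-charge worry is usually handled with idempotency keys; a minimal sketch of the generic pattern, not any particular payment provider's API:

# Minimal sketch of the idempotency-key pattern that keeps a flaky
# connection from charging a card twice. Generic pattern, not any
# particular payment provider's API.

processed = {}  # idempotency_key -> charge_id

def charge(idempotency_key: str, amount_cents: int) -> str:
    if idempotency_key in processed:
        # A retry after a dropped connection returns the original result.
        return processed[idempotency_key]
    charge_id = f"ch_{len(processed) + 1}"  # stand-in for the real charge
    processed[idempotency_key] = charge_id
    return charge_id

first = charge("order-0042", 650)
retry = charge("order-0042", 650)  # client retried over a bad connection
assert first == retry              # charged exactly once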
937
00:48:18,000 --> 00:48:22,000
This is, this is why we're friends, because I often do the same thing.
938
00:48:22,000 --> 00:48:24,000
I was diagnosing their problem.
939
00:48:24,000 --> 00:48:26,000
Is this a backup job?
940
00:48:26,000 --> 00:48:28,000
Is this, do you need more workers?
941
00:48:28,000 --> 00:48:30,000
Is this a Kafka thing?
942
00:48:30,000 --> 00:48:35,000
Yes, but see the thing is, is that like what you're, okay, so people don't realize that
943
00:48:35,000 --> 00:48:40,000
when you're streaming that amount of data, like David's talking about, you get now this
944
00:48:40,000 --> 00:48:45,000
bottleneck when all of it starts to come back in, and it's so bad.
945
00:48:45,000 --> 00:48:50,000
So the fact that you're doing, like this is another, it's almost like the concept of breaking
946
00:48:50,000 --> 00:48:53,000
different, just kind of compartmentalizing it.
947
00:48:53,000 --> 00:48:55,000
What's the word I'm looking for?
948
00:48:55,000 --> 00:48:59,000
Like, wouldn't you break up things when people want to do like service oriented or kind of
949
00:48:59,000 --> 00:49:04,000
like, so you're not a monolith, but you're kind of in different services.
950
00:49:04,000 --> 00:49:05,000
Yes.
951
00:49:05,000 --> 00:49:07,000
Not, but what's the other way to do it?
952
00:49:07,000 --> 00:49:08,000
Not microservices.
953
00:49:08,000 --> 00:49:10,000
So a service oriented architecture.
954
00:49:10,000 --> 00:49:15,000
Sort of, but like basically you're keeping it so that way when one area breaks, it doesn't
955
00:49:15,000 --> 00:49:17,000
completely break everything else.
956
00:49:17,000 --> 00:49:21,000
But like if you, so if you're processing this, like for one people are bad at processing
957
00:49:21,000 --> 00:49:25,000
and schemas and all of that stuff anyways, but if you're doing some of the work in these
958
00:49:25,000 --> 00:49:30,000
individual places, then when you get backed up and you send it all to the same pipeline,
959
00:49:30,000 --> 00:49:34,000
you're now not creating the same stress and bottleneck on your pipelines.
960
00:49:34,000 --> 00:49:38,000
And because we're going to get more and more and more data and people are going to want
961
00:49:38,000 --> 00:49:43,000
to like do all these crazy things with it, like that's great, but it's going to cause
962
00:49:43,000 --> 00:49:44,000
more and more stress.
963
00:49:44,000 --> 00:49:49,000
Like we keep making, like at this point, like Jenkins is not made for the amount of crazy
964
00:49:49,000 --> 00:49:51,000
stuff that like, cause think about it.
965
00:49:51,000 --> 00:49:53,000
It wasn't made for anything.
966
00:49:53,000 --> 00:49:56,000
It wasn't, but it was originally made for Java a million years ago.
967
00:49:56,000 --> 00:50:01,000
And now people are trying to use it in this new modern way or just pipeline services that
968
00:50:01,000 --> 00:50:05,000
were not made for this amount of heavy data streaming that we're doing, you know, so
969
00:50:05,000 --> 00:50:06,000
we're
970
00:50:06,000 --> 00:50:08,000
David, what do you think the next bottleneck is?
971
00:50:08,000 --> 00:50:09,000
Right?
972
00:50:09,000 --> 00:50:13,000
Cause I do think that data is the obvious one, and connectivity too, especially if you're looking
973
00:50:13,000 --> 00:50:14,000
at edge, right?
974
00:50:14,000 --> 00:50:18,000
You're like, Oh, there's some capacity limitation in an edge environment, whether that's compute,
975
00:50:18,000 --> 00:50:19,000
whether that's data.
976
00:50:19,000 --> 00:50:23,000
That's kind of like what we were talking about with LA too, though, because about like how
977
00:50:23,000 --> 00:50:24,000
hardware is moving faster.
978
00:50:24,000 --> 00:50:30,000
Like I don't know if the different parts of computing like are in sync with how some
979
00:50:30,000 --> 00:50:33,000
are moving so fast, you know, it's interesting to see.
980
00:50:33,000 --> 00:50:34,000
Yeah.
981
00:50:34,000 --> 00:50:35,000
Which one do you think is going to outpace the other thing?
982
00:50:35,000 --> 00:50:38,000
Cause like you said, uh, the speed of light is never going to get faster, but
983
00:50:38,000 --> 00:50:40,000
the pipes are getting a little bigger.
984
00:50:40,000 --> 00:50:44,000
But is that a, do we just need better compute to compress that?
985
00:50:44,000 --> 00:50:50,000
No, I, I, I, again, my personal opinion is, uh, first, you know, it's, it's not about
986
00:50:50,000 --> 00:50:51,000
the pipes.
987
00:50:51,000 --> 00:50:52,000
It's about the latency.
988
00:50:52,000 --> 00:50:53,000
Right.
989
00:50:53,000 --> 00:50:54,000
And that will never change.
990
00:50:54,000 --> 00:50:58,000
No amount of compute will ever improve latency, cause the speed of light can't get faster.
991
00:50:58,000 --> 00:51:01,000
They'll lie to you and say, well, though, like they're just like, and then if you do
992
00:51:01,000 --> 00:51:05,000
this, and I'm just like, no, that's not how that works.
993
00:51:05,000 --> 00:51:11,000
But, but, but that said, you know, the, the, the fact is, is that the compute, one of the
994
00:51:11,000 --> 00:51:16,000
things that I talked to a lot of people about is like, the compute is also unused.
995
00:51:16,000 --> 00:51:22,000
Like it's just, you know, a lot of this stuff, again, this Raspberry Pi can, can do a ridiculous
996
00:51:22,000 --> 00:51:27,500
amount of throughput, like ridiculous, um, uh, far more than people would think.
997
00:51:27,500 --> 00:51:30,500
And you're like, well, shit, you know, I already have it out there.
998
00:51:30,500 --> 00:51:31,500
It's already doing this other stuff.
999
00:51:31,500 --> 00:51:33,500
I might as well.
1000
00:51:33,500 --> 00:51:36,500
Um, and what I would say is I will contest your point.
1001
00:51:36,500 --> 00:51:40,500
Like, yeah, pipes are getting better, but data is getting bigger even faster.
1002
00:51:40,500 --> 00:51:41,500
That's all I'm saying.
1003
00:51:41,500 --> 00:51:45,500
Like, and so it really is like the amount of, and we're going to be making just useless
1004
00:51:45,500 --> 00:51:50,500
data, because people, like, I think they just really believe data is the most valuable,
1005
00:51:50,500 --> 00:51:56,500
you know, commodity, but also because we have all of these sensors and we have all of this
1006
00:51:56,500 --> 00:51:58,500
AI trying to make all of this data.
1007
00:51:58,500 --> 00:52:03,500
I think we're going to end up just, and we have, we're going to have so much compute
1008
00:52:03,500 --> 00:52:08,500
power with all these data centers that like, so people are just almost, I don't think they
1009
00:52:08,500 --> 00:52:11,500
realize how much infrastructure and data are growing.
1010
00:52:11,500 --> 00:52:12,500
Yeah.
1011
00:52:12,500 --> 00:52:13,500
Totally agree.
1012
00:52:13,500 --> 00:52:14,500
Totally agree.
1013
00:52:14,500 --> 00:52:20,500
So all I want to say though is, uh, Justin, you know, it
1014
00:52:20,500 --> 00:52:26,500
will be a challenge in a year or two years or five years' time to have anything, even this
1015
00:52:26,500 --> 00:52:30,500
Raspberry Pi, not have acceleration on it, right?
1016
00:52:30,500 --> 00:52:33,500
Like just at current power, right?
1017
00:52:33,500 --> 00:52:37,500
It will still have acceleration, and maybe that's because of the system-on-a-chip or whatever
1018
00:52:37,500 --> 00:52:38,500
it might be.
1019
00:52:38,500 --> 00:52:44,500
But then you're like, it's not that I don't want to use my Blackwell plus plus, you know,
1020
00:52:44,500 --> 00:52:48,500
whatever as a central thing, but why don't I have it work on the more interesting problems
1021
00:52:48,500 --> 00:52:53,500
and have whatever a GPU that's sitting on this thing do some of the work?
1022
00:52:53,500 --> 00:53:01,500
Like I, I gave a demo, um, uh, earlier this week that showed, uh, onboarding from 20 nodes,
1023
00:53:01,500 --> 00:53:08,500
uh, on three different clouds using Expanso, um, pushing to, uh, BigQuery,
1024
00:53:08,500 --> 00:53:09,500
for example, right?
1025
00:53:09,500 --> 00:53:10,500
As a backend.
1026
00:53:10,500 --> 00:53:16,500
And, uh, I was able to push, uh, 27 million rows in under 40 seconds, right?
1027
00:53:16,500 --> 00:53:17,500
From all of these things.
1028
00:53:17,500 --> 00:53:21,500
And that's not because, you know, there was something magical happening here.
1029
00:53:21,500 --> 00:53:27,500
It was because I was adding together the aggregate of all that bandwidth at the same time.
1030
00:53:27,500 --> 00:53:32,500
And there's just, there's no way to like make that any faster.
1031
00:53:32,500 --> 00:53:38,500
Like no amount of network will ever achieve what you can do with the same network multiplied
1032
00:53:38,500 --> 00:53:41,500
times, you know, the number of nodes I have, right?
1033
00:53:41,500 --> 00:53:42,500
It's just the way that works.
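The arithmetic on that demo, taking the numbers as stated:

# The arithmetic behind that demo, using the numbers as stated.
rows, seconds, nodes = 27_000_000, 40, 20

aggregate = rows / seconds    # 675,000 rows/sec arriving in BigQuery
per_node = aggregate / nodes  # 33,750 rows/sec from each node

print(f"{aggregate:,.0f} rows/s total, {per_node:,.0f} rows/s per node")
# Each node only needs a modest pipe; the fan-in is what adds up,
# which is the "same network multiplied by the number of nodes" point.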
1034
00:53:42,500 --> 00:53:44,500
So like I would argue, yeah.
1035
00:53:44,500 --> 00:53:49,500
I was just like, that's exactly the age old problem we've had in infrastructure where
1036
00:53:49,500 --> 00:53:54,500
you can vertically scale one machine and say, I need, I need this big Oracle database on
1037
00:53:54,500 --> 00:53:59,500
that one machine and it needs a terabyte of memory, or you can go with, give me five different
1038
00:53:59,500 --> 00:54:02,500
racks and I'm going to spread that amount of memory across them.
1039
00:54:02,500 --> 00:54:07,500
And there is overhead for the coordination, but we've found that's just better performance
1040
00:54:07,500 --> 00:54:11,500
in general for resiliency and for all these other things.
1041
00:54:11,500 --> 00:54:17,500
So being able to spread that out and aggregate it, obviously, like, we can't make single
1042
00:54:17,500 --> 00:54:18,500
machines big enough.
1043
00:54:18,500 --> 00:54:22,500
We can't make single pipes big enough that are, that are economic, right?
1044
00:54:22,500 --> 00:54:24,500
Cause we can do like Japan just broke a record.
1045
00:54:24,500 --> 00:54:27,500
They're doing like a, like you could download all of Netflix in under a minute.
1046
00:54:27,500 --> 00:54:28,500
Right.
1047
00:54:28,500 --> 00:54:31,500
It's like, yeah, it's like a petabit a second of throughput, but I can't buy that and no one's going to,
1048
00:54:31,500 --> 00:54:32,500
no one's going to buy that one.
1049
00:54:32,500 --> 00:54:35,500
They're like, actually, I'm just going to go spend it on 10 gig networks everywhere
1050
00:54:35,500 --> 00:54:37,500
rather than one giant pipe.
1051
00:54:37,500 --> 00:54:38,500
I think this is interesting.
1052
00:54:38,500 --> 00:54:42,500
I think it's interesting if you think about like this conversation we're having with
1053
00:54:42,500 --> 00:54:47,500
LA about how much, like, hardware has advanced, right?
1054
00:54:47,500 --> 00:54:50,500
And now everybody really wants to run their own stuff.
1055
00:54:50,500 --> 00:54:55,500
But I think that we were almost like optimizing like these, the different hardware because
1056
00:54:55,500 --> 00:54:57,500
for a long time everyone was using cloud.
1057
00:54:57,500 --> 00:55:02,500
So they were building it for these cloud companies and then for AI making this really advanced
1058
00:55:02,500 --> 00:55:06,500
hardware that people weren't playing with and experimenting as much because they were using
1059
00:55:06,500 --> 00:55:07,500
it in the cloud.
1060
00:55:07,500 --> 00:55:11,500
And now I think people are getting almost like reacquainted with hardware and what
1061
00:55:11,500 --> 00:55:13,500
hardware can do.
1062
00:55:13,500 --> 00:55:17,500
It's really interesting to see what people are going to push the limits with this hardware
1063
00:55:17,500 --> 00:55:22,500
because it's so optimized for AI and cloud and all these different places that it was
1064
00:55:22,500 --> 00:55:23,500
being used in.
1065
00:55:23,500 --> 00:55:29,500
And now developers and startups are getting that actual hardware back in their hands.
1066
00:55:29,500 --> 00:55:30,500
Yeah.
1067
00:55:30,500 --> 00:55:31,500
And I think it's going to be interesting.
1068
00:55:31,500 --> 00:55:35,500
Like you said, like what Raspberry Pi can do, you know what I mean?
1069
00:55:35,500 --> 00:55:36,500
Yeah.
1070
00:55:36,500 --> 00:55:37,500
Absolutely.
1071
00:55:37,500 --> 00:55:38,500
Absolutely.
1072
00:55:38,500 --> 00:55:41,500
And there's a threshold of when that hardware advancement becomes just universally available.
1073
00:55:41,500 --> 00:55:43,500
That's what I'm saying because there's so much of it.
1074
00:55:43,500 --> 00:55:48,500
It's so cheap that it's economical for me just to, like, redevelop something so that
1075
00:55:48,500 --> 00:55:49,500
it uses that.
1076
00:55:49,500 --> 00:55:53,500
I remember at Disney Animation at one point we were switching out hardware, we switched
1077
00:55:53,500 --> 00:55:57,500
out racks and racks of servers because the new chip had this.
1078
00:55:57,500 --> 00:56:01,500
I forget what the math was, but it was some function that we do a lot in rendering on
1079
00:56:01,500 --> 00:56:02,500
the CPU.
1080
00:56:02,500 --> 00:56:08,500
And we're like, actually, we will just render this movie like 10% faster by swapping out
1081
00:56:08,500 --> 00:56:09,500
all these CPUs.
1082
00:56:09,500 --> 00:56:13,000
And it's going to cost us millions of dollars to swap out all these CPUs, but we're going
1083
00:56:13,000 --> 00:56:15,500
to get the movie done in time versus delaying it.
1084
00:56:15,500 --> 00:56:17,000
And that's absolutely worth it or whatever.
1085
00:56:17,000 --> 00:56:20,500
It's just like, yeah, no, we're going to do that part in hardware and no longer do it
1086
00:56:20,500 --> 00:56:21,500
in software.
1087
00:56:21,500 --> 00:56:22,500
I think we're going to see a lot of that.
1088
00:56:22,500 --> 00:56:28,000
Like if you look at how Apple is processing AI inside of the iPhones just because that
1089
00:56:28,000 --> 00:56:31,500
way it's technically safer and it's more secure and they could promise more.
1090
00:56:31,500 --> 00:56:32,500
But Siri's still the dumbest one.
1091
00:56:32,500 --> 00:56:33,500
I don't know.
1092
00:56:33,500 --> 00:56:39,500
But still that's, but that like, if somebody told you, if someone told you 10 years ago
1093
00:56:39,500 --> 00:56:44,000
that you are not only going to be able to run an AI model and have it processed in a chip
1094
00:56:44,000 --> 00:56:45,500
that's in your pocket, that's still amazing.
1095
00:56:45,500 --> 00:56:46,500
I don't care what you say.
1096
00:56:46,500 --> 00:56:48,500
I mean, there's, oh, sorry, please.
1097
00:56:48,500 --> 00:56:50,500
But just, you know what I mean?
1098
00:56:50,500 --> 00:56:52,500
Like hardware is changing so much.
1099
00:56:52,500 --> 00:56:56,500
And like there's whole developers that went half a whole career or like young developers,
1100
00:56:56,500 --> 00:57:00,500
like maybe that are mid-level developers now, that have never gotten to play with that kind of hardware.
1101
00:57:00,500 --> 00:57:07,500
And we're making it at such a fast speed and they're making these chips for AI that so much
1102
00:57:07,500 --> 00:57:12,500
hardware is going to now just have like either it's going to be overproduced at some point
1103
00:57:12,500 --> 00:57:16,500
or they're going to get rid of all the old hardware and now it's going to get so cheap.
1104
00:57:16,500 --> 00:57:21,500
I keep saying, I'm so excited for three to four years from now when all of these GPUs
1105
00:57:21,500 --> 00:57:22,500
hit the secondhand market.
1106
00:57:22,500 --> 00:57:23,500
Yeah.
1107
00:57:23,500 --> 00:57:28,500
Everyone will have super powerful, broadly available NVIDIA chips, like actually I can run so much
1108
00:57:28,500 --> 00:57:29,500
stuff.
1109
00:57:29,500 --> 00:57:34,500
Like the, just the way that the tech market's been really weird and all of that.
1110
00:57:34,500 --> 00:57:40,500
Like I just wonder what cool advancements are going to come out of that, you know, moment.
1111
00:57:40,500 --> 00:57:41,500
Yeah.
1112
00:57:41,500 --> 00:57:42,500
You know, it's, it's, it's fascinating.
1113
00:57:42,500 --> 00:57:44,500
So there's an amazing story.
1114
00:57:44,500 --> 00:57:49,500
I remember way back when, the Plenty of Fish guy.
1115
00:57:49,500 --> 00:57:55,500
Does anyone remember Plenty of Fish? It was a dating site and it was like a competitor
1116
00:57:55,500 --> 00:57:59,500
to OkCupid and all these various things out there.
1117
00:57:59,500 --> 00:58:03,500
And it was really, really popular and very, very funny.
1118
00:58:03,500 --> 00:58:07,500
Like at the time this is way before containers and things like that.
1119
00:58:07,500 --> 00:58:13,500
He had built the entire thing to be one massive vertical machine.
1120
00:58:13,500 --> 00:58:17,500
And, like, I think I was at Microsoft the first time, and I think he
1121
00:58:17,500 --> 00:58:24,500
had the largest, like, single-instance thing that we knew about. It was seriously
1122
00:58:24,500 --> 00:58:26,500
like 128 CPUs or something like that.
1123
00:58:26,500 --> 00:58:28,500
It was like just an absurd thing.
1124
00:58:28,500 --> 00:58:31,500
And everyone's like, why would you build it like this?
1125
00:58:31,500 --> 00:58:33,500
And he's like, cause I don't need to and it's easier.
1126
00:58:33,500 --> 00:58:42,500
And then you go to, like, Monzo's thing, that graph of microservices that
1127
00:58:42,500 --> 00:58:44,500
everyone got up in arms about a few years ago.
1128
00:58:44,500 --> 00:58:46,500
Oh my God, why are you doing all these microservices?
1129
00:58:46,500 --> 00:58:49,500
And they're not wrong.
1130
00:58:49,500 --> 00:58:55,500
You know, like, you don't get complexity for free. That's a cost.
1131
00:58:55,500 --> 00:58:57,500
But the cost may be worth the benefit.
1132
00:58:57,500 --> 00:58:58,500
Right.
1133
00:58:58,500 --> 00:59:01,500
And so, exactly like you were just saying, Autumn, like we're going to have all this
1134
00:59:01,500 --> 00:59:06,500
compute out there and we're going to have this, but it needs
1135
00:59:06,500 --> 00:59:11,500
to be sublinear cost in scaling, where every additional machine does not cost the same.
1136
00:59:11,500 --> 00:59:14,500
It should be a very, very small increment relative to the benefit.
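To make that sublinear-scaling point concrete, here's a rough back-of-the-envelope sketch in Python. The cost curves and dollar figures are made up purely for illustration, not taken from Expanso or anyone else:

```python
# Rough sketch of the sublinear-scaling idea from the conversation above.
# The cost curves and dollar figures are invented for illustration only.

def linear_ops_cost(machines: int, cost_per_machine: float = 100.0) -> float:
    """Hand-managed fleet: every additional machine costs the same to run."""
    return machines * cost_per_machine

def sublinear_ops_cost(machines: int, base: float = 100.0) -> float:
    """Orchestrated fleet: marginal cost shrinks as the fleet grows
    (modeled here as square-root growth, purely for illustration)."""
    return base * machines ** 0.5

for n in (10, 100, 1000):
    print(f"{n:>4} machines: linear ~${linear_ops_cost(n):,.0f}, "
          f"sublinear ~${sublinear_ops_cost(n):,.0f}")
```

The exact curve doesn't matter; the point is that the marginal cost of machine n+1 should keep falling as the fleet grows.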
1137
00:59:14,500 --> 00:59:18,500
And, you know, I'm trying to participate in that by offering this platform that
1138
00:59:18,500 --> 00:59:21,500
helps with that, but like Kubernetes certainly did that.
1139
00:59:21,500 --> 00:59:25,500
And, you know, Docker with its portability certainly did that, and all these kinds of things.
1140
00:59:25,500 --> 00:59:26,500
Right.
1141
00:59:26,500 --> 00:59:29,500
There's just a bunch of different ways to go out and tackle this thing.
1142
00:59:29,500 --> 00:59:35,500
And so what I would say is that, you know, you're exactly right.
1143
00:59:35,500 --> 00:59:42,500
We need to enable this, but we need to enable it smartly because it is really, really, really
1144
00:59:42,500 --> 00:59:46,500
easy to go and get a massive machine, a two-terabyte machine, and just stick everything
1145
00:59:46,500 --> 00:59:47,500
on there.
1146
00:59:47,500 --> 00:59:49,500
It's a piece of cake: a single machine.
1147
00:59:49,500 --> 00:59:50,500
I know where to log in.
1148
00:59:50,500 --> 00:59:53,500
I can manage the entire thing with SSH and call it a day.
1149
00:59:53,500 --> 00:59:58,500
That's not particularly efficient, but it's also not terrible.
1150
00:59:58,500 --> 01:00:03,500
But that's also just a lot of attack surface.
1151
01:00:03,500 --> 01:00:04,500
Exactly.
1152
01:00:04,500 --> 01:00:06,500
They're going to have so much fun with you.
1153
01:00:06,500 --> 01:00:10,500
But what I like to say is, the reason I built this company is that there are these immutable
1154
01:00:10,500 --> 01:00:11,500
truths.
1155
01:00:11,500 --> 01:00:16,500
Like the fact is, no matter how great that machine is, my entire business does not exist
1156
01:00:17,500 --> 01:00:24,500
co-located with that machine. Like, I have real physical things in the world, whatever
1157
01:00:24,500 --> 01:00:29,500
they may be: users that are accessing my website from all over the world, or retail outlets,
1158
01:00:29,500 --> 01:00:35,500
or hospitals or factories or cars, like that stuff is happening too.
1159
01:00:35,500 --> 01:00:38,500
And so then people are like, well, don't worry.
1160
01:00:38,500 --> 01:00:42,500
I'll just take all that stuff and I'll build a digital twin and I'll just mimic all that
1161
01:00:42,500 --> 01:00:43,500
stuff.
1162
01:00:43,500 --> 01:00:45,500
And I'm like, oh, that's not it, see.
1163
01:00:45,500 --> 01:00:51,500
Like, you want to be redundant, but that's not all of it. Making an exact copy
1164
01:00:51,500 --> 01:00:52,500
is not always it.
1165
01:00:52,500 --> 01:00:53,500
100%.
1166
01:00:53,500 --> 01:00:57,500
I think we're going to be in a weird space though, because we're
1167
01:00:57,500 --> 01:01:02,500
removing so much abstraction, like cloud was an abstraction from hardware, but AI is
1168
01:01:02,500 --> 01:01:04,500
like a super abstraction.
1169
01:01:04,500 --> 01:01:11,500
And we're not only forcing engineers to use it, but we're going to grow a whole like
1170
01:01:11,500 --> 01:01:13,500
generation of developers on AI.
1171
01:01:13,500 --> 01:01:18,500
So you've got people that are either experimenting with hardware and they're in, like, the thick
1172
01:01:18,500 --> 01:01:23,500
of it, or they are even more abstracted than the whole generation of developers that we
1173
01:01:23,500 --> 01:01:25,500
just had that came into the cloud.
1174
01:01:25,500 --> 01:01:30,500
So how will we kind of like educate people on how to use those things, because like you
1175
01:01:30,500 --> 01:01:34,500
said, it's really easy, especially with the money that people are throwing at certain
1176
01:01:34,500 --> 01:01:39,500
like things where they're just going to buy this huge hardware and put everything, you
1177
01:01:39,500 --> 01:01:40,500
know what I mean?
1178
01:01:40,500 --> 01:01:44,500
Because it's going to be simple and it's going to be fewer permissions to give the AI.
1179
01:01:44,500 --> 01:01:47,500
And like, how do we even educate people to do that?
1180
01:01:47,500 --> 01:01:48,500
Absolutely.
1181
01:01:48,500 --> 01:01:49,500
Absolutely right.
1182
01:01:49,500 --> 01:01:50,500
Yeah.
1183
01:01:50,500 --> 01:01:53,500
There's just a, I mean, you just put your finger on it, what do you call it?
1184
01:01:53,500 --> 01:01:56,500
And the key is, again, you touched on it.
1185
01:01:56,500 --> 01:02:02,500
The education will be giving people a full picture of the world, right?
1186
01:02:02,500 --> 01:02:06,500
Like, you know, when we were first trying to get people to adopt the cloud, there were
1187
01:02:06,500 --> 01:02:08,500
so many times people like, Oh, I don't know.
1188
01:02:08,500 --> 01:02:10,500
You know, I got these machines and so on and so forth.
1189
01:02:10,500 --> 01:02:14,500
And, you know, they would be like, why would I go out and pay for something
1190
01:02:14,500 --> 01:02:17,500
when I already have the assets in house?
1191
01:02:17,500 --> 01:02:21,500
And the conversation was like, well, are you really capturing what you have and
1192
01:02:21,500 --> 01:02:22,500
what you're doing?
1193
01:02:22,500 --> 01:02:26,500
Like, do you want to go and reboot the machine at three o'clock in the morning?
1194
01:02:26,500 --> 01:02:31,500
Do you want to migrate, you know, the OS when you have to, or the, you know, the hypervisor,
1195
01:02:31,500 --> 01:02:32,500
et cetera, et cetera?
1196
01:02:32,500 --> 01:02:35,500
Again, no one's saying that's not an answer.
1197
01:02:35,500 --> 01:02:42,500
But when you're doing this, you need to think about the entire scope of the problem and
1198
01:02:42,500 --> 01:02:44,500
capture all the costs.
1199
01:02:44,500 --> 01:02:47,500
Because if it's just this, that's not going to be enough.
1200
01:02:47,500 --> 01:02:49,500
And so that's very much what it is.
1201
01:02:49,500 --> 01:02:53,500
I think that's, like, the thing: we get these trends and everybody wants to do it.
1202
01:02:53,500 --> 01:02:57,500
And then they never like, yeah, you can be in the cloud, but then you have to think about
1203
01:02:57,500 --> 01:03:00,500
how expensive the cloud is and the fact that you're like abstracted.
1204
01:03:00,500 --> 01:03:04,500
But then you get on-prem and then you have to figure out, do you have a DBA to run all
1205
01:03:04,500 --> 01:03:05,500
this stuff?
1206
01:03:05,500 --> 01:03:10,500
And like, you have to be able to be good at kind of figuring out your future.
1207
01:03:10,500 --> 01:03:11,500
But okay.
1208
01:03:11,500 --> 01:03:15,500
So when you're talking about Kubernetes and Docker, it made me think of Corey Quinn's
1209
01:03:15,500 --> 01:03:19,500
SCALE talk, where he compared Docker to, like, that...
1210
01:03:19,500 --> 01:03:21,500
Was it like an F-14 or something?
1211
01:03:21,500 --> 01:03:23,500
I didn't see it.
1212
01:03:23,500 --> 01:03:29,500
But he basically like was comparing Docker and Kubernetes and basically said it was like
1213
01:03:29,500 --> 01:03:30,500
the worst, but it was so funny.
1214
01:03:30,500 --> 01:03:34,500
It was basically he was just really explaining Kubernetes and Docker and the differences.
1215
01:03:34,500 --> 01:03:41,500
But what do you think the next, like, what is the next big, because everything in tech
1216
01:03:41,500 --> 01:03:44,500
is like databases are databases.
1217
01:03:44,500 --> 01:03:45,500
Compute is compute.
1218
01:03:45,500 --> 01:03:53,500
What's the next big, I guess, like, revolution in compute and Kubernetes and Docker?
1219
01:03:53,500 --> 01:03:58,500
Obviously, I personally have this general opinion, right?
1220
01:03:58,500 --> 01:04:03,500
Obviously, I think edge and distributed things is going to be enormous.
1221
01:04:03,500 --> 01:04:06,500
And I very much hope to be a part of that.
1222
01:04:06,500 --> 01:04:11,500
Because again, I love building on things that can never change, right?
1223
01:04:11,500 --> 01:04:13,500
All those things I said earlier will never change.
1224
01:04:13,500 --> 01:04:17,500
Data will grow, the speed of light won't get faster, regulations will be out there, all that kind
1225
01:04:17,500 --> 01:04:18,500
of stuff.
1226
01:04:19,500 --> 01:04:22,500
I mean, we're so old that we like the old constant things.
1227
01:04:22,500 --> 01:04:24,500
Does that mean that we're old people now?
1228
01:04:24,500 --> 01:04:26,500
No, it's because like you just recognize it now.
1229
01:04:26,500 --> 01:04:27,500
Yeah.
1230
01:04:27,500 --> 01:04:30,500
Well, I mean, it's one of these things where it's like, you know what, the secret sauce
1231
01:04:30,500 --> 01:04:35,500
behind Manhattan, you know, like why it gets such tall buildings, is because
1232
01:04:35,500 --> 01:04:37,500
they know where the bedrock is, right?
1233
01:04:37,500 --> 01:04:42,500
And so they're able to drill down to things that will never change for better or worse.
1234
01:04:42,500 --> 01:04:46,500
And so like, I think that's critically important to understand the things that will never change
1235
01:04:46,500 --> 01:04:50,500
and then figure out what will happen inside that is what will be next.
1236
01:04:50,500 --> 01:04:55,500
So my take on it is, and again, I say this as a highly biased person, it will
1237
01:04:55,500 --> 01:05:03,500
be: how do I act like a cloud, but in a way that respects those immutable truths and matches
1238
01:05:03,500 --> 01:05:07,500
where the data and the compute and the things actually are growing.
1239
01:05:07,500 --> 01:05:12,500
And so when you see Jensen stand on stage and talk about the next, you know, trillion devices
1240
01:05:12,500 --> 01:05:17,500
where you talk about, you know, me being able to have instant response and instant memory
1241
01:05:17,500 --> 01:05:24,500
on my phone or whatever it might be, that's not everything going back to a central API.
1242
01:05:24,500 --> 01:05:31,500
That's those things out there having smarts at a level that is incredibly, you know,
1243
01:05:31,500 --> 01:05:32,500
that feels integrated.
1244
01:05:32,500 --> 01:05:38,500
And again, it's where it gets to that sublinear scaling because like I'm telling you the Gemini
1245
01:05:38,500 --> 01:05:43,500
AI and the Anthropic people, they're like, they don't want to be out there, like, you know,
1246
01:05:43,500 --> 01:05:48,500
managing why, you know, something isn't working at some factory in, whatever, you know, the
1247
01:05:48,500 --> 01:05:53,500
Philippines, they want to have like a very easy way for someone out there to deploy a
1248
01:05:53,500 --> 01:05:59,500
model and run it and have it be reliable and debug it and, you know, have metadata around
1249
01:05:59,500 --> 01:06:05,500
it, which is the other thing that I think is super lost and something that we support
1250
01:06:05,500 --> 01:06:06,500
actively.
1251
01:06:06,500 --> 01:06:12,500
But I think that everything is a graph, right?
1252
01:06:12,500 --> 01:06:20,500
And that, you know, all these things, workflows and transformations and so on, are super under-
1253
01:06:20,500 --> 01:06:23,500
invested in by us as a community.
1254
01:06:23,500 --> 01:06:30,500
And it really distills down to the simplest possible thing, which is: here's this artifact,
1255
01:06:30,500 --> 01:06:33,500
this binary thing that I want you to run.
1256
01:06:33,500 --> 01:06:40,500
And as you do that, I want to record, in a way that is programmatic, for the computer
1257
01:06:40,500 --> 01:06:46,500
to understand what went into that thing, what the thing did, and then what the output was,
1258
01:06:46,500 --> 01:06:47,500
right?
1259
01:06:47,500 --> 01:06:54,500
And simply having a structured way to approach that will change so much. It will change CI/CD,
1260
01:06:54,500 --> 01:06:57,500
it will change execution, it will change all these various things.
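As a thought experiment, here's roughly what recording "what went in, what ran, and what came out" could look like. This is not Expanso's (or any tool's) actual schema, just a minimal sketch of the idea:

```python
# Minimal sketch of a machine-readable job record: inputs, execution, outputs.
# Not any real tool's schema -- just an illustration of the idea above.
import hashlib
import subprocess
import time

def sha256_file(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def run_recorded(cmd: list[str], inputs: list[str], outputs: list[str]) -> dict:
    record = {
        "inputs": {p: sha256_file(p) for p in inputs},   # what went into it
        "command": cmd,                                   # what the thing did
        "started_at": time.time(),
    }
    result = subprocess.run(cmd, capture_output=True, text=True)
    record.update(
        finished_at=time.time(),
        exit_code=result.returncode,
        stdout_tail=result.stdout[-2000:],  # keep a bounded tail of the logs
        outputs={p: sha256_file(p) for p in outputs},     # what came out
    )
    return record

# Hypothetical usage: run_recorded(["./transform.sh"], ["raw.csv"], ["clean.csv"])
```

With records like that, the two-in-the-morning question "what went into this thing and how did it run?" becomes a query instead of stack-trace archaeology.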
1261
01:06:57,500 --> 01:07:02,500
And there are many larger efforts around stuff like this. OpenTelemetry is a perfect
1262
01:07:02,500 --> 01:07:03,500
example, right?
1263
01:07:03,500 --> 01:07:06,500
Where you start to think about things as traces and so on.
1264
01:07:06,500 --> 01:07:13,500
But I do think that when you hear the phrase reproducibility crisis, or you see someone
1265
01:07:13,500 --> 01:07:16,500
at two o'clock in the morning trying to figure out what the fuck is going on and why things
1266
01:07:16,500 --> 01:07:18,500
are hard to debug.
1267
01:07:18,500 --> 01:07:21,500
It's almost always that problem.
1268
01:07:21,500 --> 01:07:23,500
I don't know what went into this thing.
1269
01:07:23,500 --> 01:07:25,500
I don't know how it ran.
1270
01:07:25,500 --> 01:07:29,500
And I don't know how it came out in a deterministic way.
1271
01:07:29,500 --> 01:07:36,500
And if you don't have that, we will continue to like try and build these like incredibly
1272
01:07:36,500 --> 01:07:41,500
hacky scripts to parse stack traces to figure out what the fuck was going on.
1273
01:07:41,500 --> 01:07:44,500
Do you think AI is going to contribute to that problem?
1274
01:07:44,500 --> 01:07:49,500
No, I think it'll be much worse, because at its core AI is not deterministic.
1275
01:07:49,500 --> 01:07:53,500
And so I mean, like contribute to making more of it.
1276
01:07:53,500 --> 01:07:55,500
Like, so, okay, technically think about it.
1277
01:07:55,500 --> 01:07:58,500
If AI starts vibe coding a bunch of these things.
1278
01:07:58,500 --> 01:08:03,500
Oh, and we're going to be like, you know, we're already increasing the amount of agents
1279
01:08:03,500 --> 01:08:05,500
and different things that are coming back to us.
1280
01:08:05,500 --> 01:08:08,500
When you don't know what the expected output should be.
1281
01:08:08,500 --> 01:08:09,500
Absolutely.
1282
01:08:09,500 --> 01:08:11,500
It's really hard to diagnose a problem.
1283
01:08:11,500 --> 01:08:16,500
So I think that not only are you onto something with just development in general, but like that
1284
01:08:16,500 --> 01:08:21,500
is going to almost be like multiplied by the new way of development.
1285
01:08:21,500 --> 01:08:23,500
Yeah, I totally misunderstood what you're saying.
1286
01:08:23,500 --> 01:08:24,500
Yes, exactly.
1287
01:08:24,500 --> 01:08:28,500
It's the fact that those models are not deterministic that is so brutal.
1288
01:08:28,500 --> 01:08:34,500
And whoever breaks through the determinism around AI...
1289
01:08:34,500 --> 01:08:37,500
I mean, you can do it.
1290
01:08:37,500 --> 01:08:42,500
You can get close with things like wrappers and things like that, but it's not there.
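For what it's worth, the "wrapper" approach usually looks something like this sketch: pin the sampling knobs and replay cached answers. `call_model` is a hypothetical stand-in for whatever client you actually use, and even temperature 0 is not a hard determinism guarantee on real serving stacks:

```python
# Sketch of a determinism-ish wrapper around an LLM: pin sampling knobs,
# cache by prompt, replay the cached answer on repeats. `call_model` is a
# hypothetical stand-in; even temperature=0 is not a hard guarantee.
import hashlib

_cache: dict[str, str] = {}

def call_model(prompt: str, temperature: float, seed: int) -> str:
    """Hypothetical model client -- swap in your real API call."""
    raise NotImplementedError

def deterministic_ish(prompt: str) -> str:
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:  # first call wins; repeats replay the same answer
        _cache[key] = call_model(prompt, temperature=0.0, seed=42)
    return _cache[key]
```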
1291
01:08:42,500 --> 01:08:46,500
And I think the thing is, it's hard to be deterministic as a developer
1292
01:08:46,500 --> 01:08:49,500
that's a human, because there's so many ways to build things, right?
1293
01:08:49,500 --> 01:08:54,500
And there's so many ways to, like, argue about which way. Half the time you're like,
1294
01:08:54,500 --> 01:08:56,500
is it because you did this before?
1295
01:08:56,500 --> 01:08:58,500
Is it because you like this method?
1296
01:08:58,500 --> 01:08:59,500
You know what I mean?
1297
01:08:59,500 --> 01:09:01,500
So then, if even humans can't do it...
1298
01:09:01,500 --> 01:09:05,500
Yeah, there's another incredibly smart friend of mine who's right up here,
1299
01:09:05,500 --> 01:09:07,500
who's saying exactly what you're saying.
1300
01:09:07,500 --> 01:09:12,500
And, you know, the new hotness around ML is, so his name's Hamel.
1301
01:09:12,500 --> 01:09:14,500
He has courses on this and things like that.
1302
01:09:14,500 --> 01:09:16,500
It's all about evals, right?
1303
01:09:16,500 --> 01:09:24,500
That is such a brutally important and totally missed thing by a lot of the people adopting ML and AI right now,
1304
01:09:24,500 --> 01:09:32,500
which is like, how do I programmatically verify that this model does what I said it should do, right?
1305
01:09:32,500 --> 01:09:37,500
Like, unless you have that, like, do not even begin to go down the ML path.
1306
01:09:37,500 --> 01:09:42,500
Because my God, you know, like, unless you put a human in the loop, which is fine,
1307
01:09:42,500 --> 01:09:47,500
you're never going to be able to, like, train or build your model in a sensible way.
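A minimal sketch of what "programmatically verify the model does what I said it should do" can look like: a fixed set of cases, a check per case, and a pass-rate gate. The `model` callable and the cases here are hypothetical toys:

```python
# Minimal eval harness sketch: fixed cases, programmatic checks, a gate.
# `model` is a hypothetical callable (prompt -> answer); cases are toys.
CASES = [
    {"prompt": 'Return the JSON {"ok": true} and nothing else.',
     "check": lambda out: out.strip() == '{"ok": true}'},
    {"prompt": "What is 2 + 2? Answer with just the number.",
     "check": lambda out: out.strip() == "4"},
]

def run_evals(model, threshold: float = 0.95) -> bool:
    passed = sum(1 for c in CASES if c["check"](model(c["prompt"])))
    rate = passed / len(CASES)
    print(f"eval pass rate: {rate:.0%} ({passed}/{len(CASES)})")
    return rate >= threshold  # gate a deploy or a model swap on this
```

Run it on every model or prompt change, the same way you'd run unit tests on every commit.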
1308
01:09:47,500 --> 01:09:48,500
I think I'm trying to figure that out.
1309
01:09:48,500 --> 01:09:54,500
Like, how do you use AI to be faster at things, but also the fact that you have to then go verify.
1310
01:09:54,500 --> 01:09:55,500
Is it faster?
1311
01:09:55,500 --> 01:10:00,500
You know, like, I'm trying to figure out how do you use it to learn and to get better at things
1312
01:10:00,500 --> 01:10:06,500
without just losing the abstraction and the knowledge that you need to gain?
1313
01:10:06,500 --> 01:10:12,500
I mean, you know, that piece that came out, I think they got it pretty wrong, the one about, like,
1314
01:10:12,500 --> 01:10:16,500
oh, you know, coders are slower when they use the ML and so on.
1315
01:10:16,500 --> 01:10:17,500
I think that missed it.
1316
01:10:17,500 --> 01:10:21,500
Like, because it didn't really represent the way that people do this, right?
1317
01:10:21,500 --> 01:10:25,500
Like, what they'll do is they'll, you know, vibe code something,
1318
01:10:25,500 --> 01:10:29,500
and then they'll like try and compile it, or then they'll lint it, and then they'll actually run it,
1319
01:10:29,500 --> 01:10:34,500
and then they'll Google like something, you know, and see whether or not this was a good approach.
1320
01:10:34,500 --> 01:10:36,500
And then they'll go back to vibe code some more.
1321
01:10:36,500 --> 01:10:37,500
Right.
1322
01:10:37,500 --> 01:10:41,500
So it's like, it's not this like vibe code only or hand code only.
1323
01:10:41,500 --> 01:10:42,500
It really is a mix and match.
1324
01:10:42,500 --> 01:10:50,500
And right now the only way to solve what you're describing is with that human in the loop, where they look at the thing.
1325
01:10:50,500 --> 01:10:52,500
They become the evaluator.
1326
01:10:52,500 --> 01:10:54,500
Do you think we're going to break Stack Overflow though?
1327
01:10:54,500 --> 01:10:57,500
Because this is what I was conspiring and thinking about last night.
1328
01:10:57,500 --> 01:11:00,500
Okay, like, I mean Stack Overflow, I love it to death.
1329
01:11:00,500 --> 01:11:01,500
I listened to it.
1330
01:11:01,500 --> 01:11:04,500
I listened to that podcast from day one.
1331
01:11:04,500 --> 01:11:10,500
But like the website, like, so many developers kind of depend on the fact that we're all hitting these issues, right?
1332
01:11:10,500 --> 01:11:15,500
And someone hit the problem before you, and then we wrote it down somewhere, and
1333
01:11:15,500 --> 01:11:19,500
it's the notebook that we all go and look in and we're like, hey, did you have this error?
1334
01:11:19,500 --> 01:11:22,500
I'm old enough that that notebook has moved a couple of times on the internet.
1335
01:11:22,500 --> 01:11:25,500
Like if you're, if you're of a certain vintage, you remember.
1336
01:11:25,500 --> 01:11:27,500
But it's the fact that no one's going to write it down.
1337
01:11:27,500 --> 01:11:28,500
David, do you remember Experts Exchange?
1338
01:11:28,500 --> 01:11:29,500
Of course.
1339
01:11:29,500 --> 01:11:30,500
Yeah.
1340
01:11:30,500 --> 01:11:32,500
Experts Exchange was the expert sex change.
1341
01:11:32,500 --> 01:11:33,500
Yeah, exactly.
1342
01:11:33,500 --> 01:11:36,500
But the models are getting it though, because we're like, you know what I mean?
1343
01:11:36,500 --> 01:11:42,500
Like, if nobody asks questions of other humans anymore and we're only asking AI,
1344
01:11:42,500 --> 01:11:45,500
do we then break the loop of knowledge?
1345
01:11:45,500 --> 01:11:46,500
You know what I mean?
1346
01:11:46,500 --> 01:11:50,500
I mean, what's going to happen is I already know what's going to happen, right?
1347
01:11:50,500 --> 01:11:54,500
One of these models right now, it's probably going to be Claude, right?
1348
01:11:54,500 --> 01:12:01,500
It will get the majority of the questions and it will be able to do a little mini loop and say, like,
1349
01:12:01,500 --> 01:12:06,500
oh, you know, this person did this in Python and then they ask it again and then it did it again.
1350
01:12:06,500 --> 01:12:10,500
And they'll be able to tie those together and say like, okay, this is what actually was happening.
1351
01:12:10,500 --> 01:12:12,500
And then their model will magically just become smarter.
1352
01:12:12,500 --> 01:12:18,500
Now, if they were like a social good, they would release those datasets to the world and make it easy,
1353
01:12:18,500 --> 01:12:20,500
but that's not going to be it either.
1354
01:12:20,500 --> 01:12:22,500
So it's funny.
1355
01:12:22,500 --> 01:12:25,500
I gave a talk, an ML talk, in the winter.
1356
01:12:25,500 --> 01:12:32,500
And it was an experiment because I took the exact same talk I gave when I was launching Kubeflow in 2018,
1357
01:12:32,500 --> 01:12:38,500
I want to say, and I gave it again with no changes, not a single change to the slides.
1358
01:12:38,500 --> 01:12:41,500
Everything was the exact same, which is hilarious, right?
1359
01:12:42,500 --> 01:12:49,500
It was all about security vulnerabilities through ML and like what you're going to face here and like how you defend your model
1360
01:12:49,500 --> 01:12:51,500
and how you do this and how you do that.
1361
01:12:51,500 --> 01:12:54,500
And one of them was around distillation of models, right?
1362
01:12:54,500 --> 01:13:03,500
And so what I said is true: Claude or whatever, you know, frontier model will walk away with, like, the best coding model today.
1363
01:13:04,500 --> 01:13:14,500
And someone else, the number two or number three on the list will use number one as a verifier as it's going through its testing.
1364
01:13:14,500 --> 01:13:15,500
Okay.
1365
01:13:15,500 --> 01:13:16,500
And it doesn't need to be a lot.
1366
01:13:16,500 --> 01:13:21,500
It's like a thousand queries or 2000 queries and they will get so much smarter.
1367
01:13:21,500 --> 01:13:22,500
They will.
1368
01:13:22,500 --> 01:13:24,500
It will not be defendable.
1369
01:13:24,500 --> 01:13:29,500
Whether ethically or unethically, that will leak out.
1370
01:13:29,500 --> 01:13:32,500
And then the second model will be good, right?
1371
01:13:32,500 --> 01:13:38,500
And then the third model and now you will have this consensus around these models and that will lift all the boats.
1372
01:13:38,500 --> 01:13:47,500
And so, you know, I would love it if, you know, whoever becomes the number one model just releases datasets so that we can all grow together.
1373
01:13:47,500 --> 01:13:51,500
But, you know, what will happen is that it will never stay secret.
1374
01:13:51,500 --> 01:13:52,500
David, hold on.
1375
01:13:55,500 --> 01:13:56,500
David.
1376
01:13:56,500 --> 01:13:57,500
Yes.
1377
01:13:58,500 --> 01:13:59,500
David, we lost you.
1378
01:13:59,500 --> 01:14:00,500
Oh.
1379
01:14:00,500 --> 01:14:02,500
You just, you just broke up for the last like two minutes.
1380
01:14:02,500 --> 01:14:03,500
I didn't hear it.
1381
01:14:03,500 --> 01:14:04,500
No.
1382
01:14:04,500 --> 01:14:05,500
Did you hear it on him?
1383
01:14:05,500 --> 01:14:06,500
Okay.
1384
01:14:06,500 --> 01:14:07,500
What's the last thing?
1385
01:14:07,500 --> 01:14:08,500
I dropped it too.
1386
01:14:08,500 --> 01:14:09,500
Yeah.
1387
01:14:09,500 --> 01:14:10,500
I just got a notice saying it's there.
1388
01:14:10,500 --> 01:14:11,500
Can you hear me?
1389
01:14:12,500 --> 01:14:13,500
Still not.
1390
01:14:13,500 --> 01:14:14,500
Yeah.
1391
01:14:14,500 --> 01:14:15,500
You're back.
1392
01:14:15,500 --> 01:14:16,500
Okay.
1393
01:14:16,500 --> 01:14:17,500
Where do you want me to go?
1394
01:14:19,500 --> 01:14:24,500
Go back to what will happen once the second model starts training on the first model.
1395
01:14:24,500 --> 01:14:25,500
Yeah.
1396
01:14:25,500 --> 01:14:26,500
Okay.
1397
01:14:26,500 --> 01:14:33,500
The second model will come along and they will begin to train using some of the wisdom from the first model.
1398
01:14:33,500 --> 01:14:36,500
They'll use it as a tool just as a verifier.
1399
01:14:36,500 --> 01:14:42,500
And as much as the first model wants to block it, it will not be possible to block, because it doesn't require very much.
1400
01:14:42,500 --> 01:14:52,500
Like you're talking about literally thousands of total queries over a few days and you can get a very accurate representation of the underlying model.
1401
01:14:52,500 --> 01:14:57,500
And then the second model will be good and the third model will be good and the open source one will be good.
1402
01:14:57,500 --> 01:15:02,500
And now everyone's boats are lifted and then you're going to do this again and again and again and again.
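The verifier pattern being described, sketched very roughly: the weaker "student" model proposes answers, the stronger "teacher" only grades them, and the verified pairs become training data. Both model calls below are hypothetical stand-ins, and the budget is deliberately small, matching the "thousands of total queries" point above:

```python
# Rough sketch of using a stronger model as a verifier to improve a weaker
# one. Both calls are hypothetical stand-ins for real model clients.
def student(prompt: str) -> str:
    raise NotImplementedError  # hypothetical: the model being improved

def teacher_verifies(prompt: str, answer: str) -> bool:
    raise NotImplementedError  # hypothetical: the frontier model as judge

def collect_training_pairs(prompts: list[str], budget: int = 2000) -> list[tuple[str, str]]:
    pairs = []
    for prompt in prompts[:budget]:  # a few thousand queries is enough
        answer = student(prompt)
        if teacher_verifies(prompt, answer):
            pairs.append((prompt, answer))  # keep only verified answers
    return pairs  # fine-tune the student on these
```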
1403
01:15:02,500 --> 01:15:12,500
And so what does that mean for developers and for the tribal knowledge that we all share on the internet so we can all, you know... Excellent question.
1404
01:15:12,500 --> 01:15:18,500
I think that, you know, I love Stack Overflow, but I think it's going to go away or whatever.
1405
01:15:18,500 --> 01:15:19,500
I don't know.
1406
01:15:19,500 --> 01:15:22,500
It will migrate to a new community, right?
1407
01:15:22,500 --> 01:15:29,500
Because I think people will ask their first few questions of this and then want to talk to a human.
1408
01:15:29,500 --> 01:15:39,500
But over time, the chat, whatever the chat interface with your code will become so good, you're like, maybe I don't need to talk to a human.
1409
01:15:39,500 --> 01:15:40,500
I don't know.
1410
01:15:40,500 --> 01:15:41,500
I don't know.
1411
01:15:41,500 --> 01:15:47,500
But does that mean that the humans no longer hold that knowledge and they don't need us?
1412
01:15:47,500 --> 01:15:49,500
I don't
1413
01:15:49,500 --> 01:15:53,500
think we'll have the same depth of coding.
1414
01:15:53,500 --> 01:16:03,500
No, I mean, you know, to some degree. Like, you know, I took compilers years ago as a class and built a compiler.
1415
01:16:03,500 --> 01:16:05,500
I haven't done anything with a compiler since.
1416
01:16:05,500 --> 01:16:08,500
I, like, know how compilers work.
1417
01:16:08,500 --> 01:16:14,500
And I think it helps me when I am coding, you know, which is rare, like be a better coder.
1418
01:16:14,500 --> 01:16:22,500
But it becomes an abstraction layer that I just, I don't think about, you know, 99% of my day.
1419
01:16:22,500 --> 01:16:26,500
Maybe a lot of these concepts reach that point.
1420
01:16:26,500 --> 01:16:28,500
I don't know.
1421
01:16:28,500 --> 01:16:29,500
I don't know.
1422
01:16:29,500 --> 01:16:33,500
I mean, like, the half a dozen standard questions you ask when you're interviewing at Big Tech.
1423
01:16:33,500 --> 01:16:40,500
How do you do, you know, when do you want to pick a stack or a heap or whatever. Like, you know, people are going to be like, why are you even asking that question?
1424
01:16:40,500 --> 01:16:43,500
I don't know why they ask it half the time.
1425
01:16:43,500 --> 01:17:00,500
It's, you know, Joel Spolsky from Stack Overflow and Joel on Software talked about this, whatever, 20 years ago. He said that he would ask questions about, you know, basic HTML concepts, right, as part of his thing.
1426
01:17:00,500 --> 01:17:05,500
And he said that, like, it was just, it was just a filter.
1427
01:17:05,500 --> 01:17:06,500
That really is it.
1428
01:17:06,500 --> 01:17:16,500
It's not to say whether or not you know that, but like the amount that people would, I'll put it politely, exaggerate on their resume and not even be able to answer the most basic thing.
1429
01:17:16,500 --> 01:17:21,500
That's really all you're looking for from this question.
1430
01:17:21,500 --> 01:17:24,500
But, you know, I'm with you.
1431
01:17:24,500 --> 01:17:32,500
Like, people are like, oh, you know, I want you to develop a queue that can handle throttling and so on and so forth.
1432
01:17:32,500 --> 01:17:33,500
And you're like, all right, really?
1433
01:17:33,500 --> 01:17:34,500
Like, I get it.
1434
01:17:34,500 --> 01:17:36,500
Nobody's going to actually build that.
1435
01:17:36,500 --> 01:17:38,500
I mean, you will, you will.
1436
01:17:38,500 --> 01:17:41,500
But the first thing you're going to go do is Google how to do it.
1437
01:17:41,500 --> 01:17:43,500
That's what I'm saying.
1438
01:17:43,500 --> 01:17:44,500
Yeah.
1439
01:17:44,500 --> 01:17:47,500
Like no one does that off of like knowledge.
1440
01:17:47,500 --> 01:17:52,500
You're going to go look it up and then you're going to compare three different things and optimize it and.
1441
01:17:52,500 --> 01:17:54,500
David, this has been fun.
1442
01:17:54,500 --> 01:17:56,500
Where should people find you on the internet?
1443
01:17:56,500 --> 01:17:59,500
So I'm a big Blue Sky guy.
1444
01:17:59,500 --> 01:18:02,500
I certainly continue to spew there.
1445
01:18:02,500 --> 01:18:04,500
Please come try.
1446
01:18:04,500 --> 01:18:06,500
Try our platform.
1447
01:18:06,500 --> 01:18:07,500
I would love to hear from you.
1448
01:18:07,500 --> 01:18:10,500
If you're a data person, if you touch data, right?
1449
01:18:10,500 --> 01:18:16,500
If you spend any meaningful amount on data, I want to hear how we can make our distributed data pipelines better.
1450
01:18:16,500 --> 01:18:18,500
Expanso.io.
1451
01:18:18,500 --> 01:18:23,500
I post all my talks at my website, davidaronchick.com.
1452
01:18:23,500 --> 01:18:25,500
And I just love talking to people.
1453
01:18:25,500 --> 01:18:34,500
I, you know, I am the first person to say I like to be the dumbest person in the room, which is very easy because, you know, when you're this dumb, everyone's smart.
1454
01:18:34,500 --> 01:18:41,500
But like, I, you know, I just want to be smarter about, like, you and your business and your... I don't know what your opinions are.
1455
01:18:41,500 --> 01:18:48,500
I got in an argument last night with a guy who, like, rewrote basically all of Excel in raw JavaScript, like, from scratch.
1456
01:18:48,500 --> 01:18:50,500
And I was like, what the hell are you doing, man?
1457
01:18:50,500 --> 01:18:53,500
He was like, well, this isn't this because he wanted it.
1458
01:18:53,500 --> 01:18:54,500
He's just a JavaScript guy.
1459
01:18:54,500 --> 01:18:56,500
And I was like, my God, how do you even do that?
1460
01:18:56,500 --> 01:19:00,500
Not TypeScript, not coffee script, raw JavaScript.
1461
01:19:00,500 --> 01:19:02,500
That's a quote.
1462
01:19:02,500 --> 01:19:03,500
I know.
1463
01:19:03,500 --> 01:19:05,500
That is, that is, but anyhow, I loved it.
1464
01:19:05,500 --> 01:19:08,500
It was amazing conversation.
1465
01:19:08,500 --> 01:19:12,500
We have you in the Blue Sky starter pack for our guests.
1466
01:19:12,500 --> 01:19:14,500
I need to convert that to a list at some point.
1467
01:19:14,500 --> 01:19:15,500
So people can find you there.
1468
01:19:15,500 --> 01:19:17,500
I think you're iron yuppie on blue sky.
1469
01:19:17,500 --> 01:19:18,500
I am iron yuppie.
1470
01:19:18,500 --> 01:19:19,500
That is me.
1471
01:19:19,500 --> 01:19:20,500
So yeah.
1472
01:19:20,500 --> 01:19:26,500
And we'll definitely figure out a time we can have you on for a second round of history, some of your antics at Kubeflow and Microsoft.
1473
01:19:26,500 --> 01:19:27,500
I have so many questions.
1474
01:19:27,500 --> 01:19:28,500
I just want to know the tea.
1475
01:19:28,500 --> 01:19:31,500
Like the next whole episode has to be the tea.
1476
01:19:31,500 --> 01:19:34,500
I will talk about this until I'm blue in the face.
1477
01:19:34,500 --> 01:19:38,500
I like, I say this as someone like, I just don't believe in like speaking ill of people.
1478
01:19:38,500 --> 01:19:42,500
So like, don't, don't tune in if you think I'm going to like badmouth someone.
1479
01:19:42,500 --> 01:19:46,500
Like it's just the navigations of these things happening is just so fascinating.
1480
01:19:46,500 --> 01:19:50,500
I feel so lucky that I could be there.
1481
01:19:50,500 --> 01:19:51,500
All right.
1482
01:19:51,500 --> 01:19:52,500
Thank you so much.
1483
01:19:52,500 --> 01:19:53,500
And thank you everyone for listening.
1484
01:19:53,500 --> 01:19:55,500
We will talk to you again soon.
1485
01:20:02,500 --> 01:20:06,500
Thank you for listening to this episode of fork around and find out.
1486
01:20:06,500 --> 01:20:11,500
If you like this show, please consider sharing it with a friend, a coworker, a family member, or even an enemy.
1487
01:20:11,500 --> 01:20:16,500
However we get the word out about this show helps it to become sustainable for the long term.
1488
01:20:16,500 --> 01:20:26,500
If you want to sponsor this show, please go to fafo.fm slash sponsor and reach out to us there about what you're interested in sponsoring and how we can help.
1489
01:20:26,500 --> 01:20:30,500
We hope your systems stay available and your pagers stay quiet.
1490
01:20:30,500 --> 01:20:32,500
We'll see you again next time.