Jan. 27, 2026
Embracing the Journey with Cassidy Williams
Cassidy Williams was too funny to be a developer, so she was banished to the island of misfit devs called DevRel. Along the way she found a passion for memes and dreams and mechanical keyboards. No Oxford commas required. We start 2026 off strong with a few predictions, a medium amount of jokes, and a lot of AI.
Subscribe to her hilarious newsletter at cassidoo.co
1
00:00:00,000 --> 00:00:11,560
Welcome to Fork Around and Find Out, the podcast about building, running, and maintaining software
2
00:00:11,560 --> 00:00:14,560
and systems.
3
00:00:14,560 --> 00:00:24,760
Hello everyone and happy new year from Fork Around and Find Out.
4
00:00:24,760 --> 00:00:28,199
I am Justin Garrison and with me today as always is Autumn Nash.
5
00:00:28,199 --> 00:00:29,199
How's it going, Autumn?
6
00:00:29,599 --> 00:00:31,239
Good, but don't lie, it's still December.
7
00:00:31,239 --> 00:00:32,240
That's just when it's going to air.
8
00:00:32,240 --> 00:00:33,600
Hey, hey, it doesn't matter.
9
00:00:33,600 --> 00:00:34,159
It doesn't matter.
10
00:00:34,159 --> 00:00:35,159
It's January.
11
00:00:35,159 --> 00:00:36,159
You're ruining the fun.
12
00:00:37,159 --> 00:00:41,759
Let's just say we're trying to survive the last few days of 2025.
13
00:00:41,759 --> 00:00:44,519
Okay, like let us know what it's like on the other side.
14
00:00:47,879 --> 00:00:50,399
Please everyone drop a comment and tell us what it's like.
15
00:00:50,719 --> 00:00:51,519
Did you make it?
16
00:00:51,519 --> 00:00:54,159
Like is it better?
17
00:00:56,359 --> 00:00:57,480
The bar is on the floor.
18
00:00:57,480 --> 00:00:58,480
Let's be real.
19
00:00:58,959 --> 00:01:03,559
2020, but I want to say it can't be worse, but man, I'm always surprised.
20
00:01:03,559 --> 00:01:07,120
Justin, Justin, you just gave us a curse.
21
00:01:07,120 --> 00:01:10,239
Take it back right now.
22
00:01:10,239 --> 00:01:12,079
You blamed all of our families.
23
00:01:12,079 --> 00:01:13,079
Why?
24
00:01:13,079 --> 00:01:17,000
But we're starting the year off strong because today we got Cassidy Williams on the show.
25
00:01:17,000 --> 00:01:20,799
Cassidy is a senior director of developer advocacy at GitHub.
26
00:01:20,799 --> 00:01:23,239
Man, there's a lot of Ds in that phrase.
27
00:01:23,239 --> 00:01:24,239
Fancy.
28
00:01:24,759 --> 00:01:30,640
You have the best tagline here on the front of your website that you say, I like to make
29
00:01:30,640 --> 00:01:33,119
memes and dreams and software.
30
00:01:33,119 --> 00:01:38,560
And not only is that a great combination, but also you avoid the whole problem of should
31
00:01:38,560 --> 00:01:42,840
I have an Oxford comma or not by just throwing extra ands in that sentence.
32
00:01:42,840 --> 00:01:43,840
And that's wonderful.
33
00:01:43,840 --> 00:01:44,840
Yeah.
34
00:01:44,840 --> 00:01:47,439
And so no one will ever question any word I ever say.
35
00:01:47,439 --> 00:01:48,719
Yeah, you know, it doesn't matter.
36
00:01:48,719 --> 00:01:50,759
She's like, I just add ands and it's fine.
37
00:01:51,760 --> 00:01:55,400
But I think that's what's best about your content, though.
38
00:01:55,400 --> 00:01:57,880
Like it breeds your personality.
39
00:01:57,880 --> 00:02:04,520
Like some people make like the content they make about developers and tech.
40
00:02:04,520 --> 00:02:06,840
Like, and it's so boring.
41
00:02:08,840 --> 00:02:09,480
You know what I mean?
42
00:02:09,480 --> 00:02:12,960
Like, I love that we're professionals and we're doing a job.
43
00:02:12,960 --> 00:02:16,920
But like, what makes me want to watch your video versus somebody else's video?
44
00:02:16,920 --> 00:02:19,000
And yours are like absolutely hilarious.
45
00:02:19,000 --> 00:02:19,719
Like half the time.
46
00:02:19,719 --> 00:02:21,319
I'm just like, there's one tear.
47
00:02:21,319 --> 00:02:23,479
I'm dying and I'm like reposting it.
48
00:02:23,479 --> 00:02:26,520
I'm like, Cassidy understands my life.
49
00:02:26,520 --> 00:02:31,199
What's that one video that you posted, like you reposted it, but it was from like a year ago.
50
00:02:31,199 --> 00:02:33,599
I forget, but I posted it and I was like still accurate.
51
00:02:33,599 --> 00:02:37,439
Like it's like one of the like.
52
00:02:37,439 --> 00:02:39,919
Usually it's just pain and then laughing at the pain.
53
00:02:39,919 --> 00:02:41,159
So it could be literally anything.
54
00:02:41,159 --> 00:02:43,919
That's I feel like that's what gets you through being an engineer.
55
00:02:43,919 --> 00:02:47,560
You just like laugh at the pain and then other people laugh with you.
56
00:02:47,560 --> 00:02:51,680
And then it helps your imposter syndrome to realize it's not just you and you don't just suck.
57
00:02:51,680 --> 00:02:53,240
We all suck together.
58
00:02:53,240 --> 00:02:59,120
Right. And if you laugh at it and everybody's laughing at pain, then maybe the pain will go away.
59
00:02:59,120 --> 00:02:59,719
That's wishful thinking.
60
00:02:59,719 --> 00:03:02,000
I'm always like, but there's hope.
61
00:03:02,000 --> 00:03:03,640
Maybe. I don't know.
62
00:03:03,640 --> 00:03:05,159
Maybe.
63
00:03:05,159 --> 00:03:13,000
But on top of your great videos, you also have a wonderful newsletter with jokes that even make me cringe sometimes,
64
00:03:13,039 --> 00:03:19,960
which is lovely because I think they're the cherry on top of a wonderful newsletter, and you're a prolific writer on your blog.
65
00:03:19,960 --> 00:03:23,840
You're going through a whole daily December blog post, right?
66
00:03:23,840 --> 00:03:24,919
Yeah. Blogvent.
67
00:03:24,919 --> 00:03:33,879
It is. I always go in so optimistic and then like I'm like halfway through December now and I'm like, I am grasping for straws of topics.
68
00:03:33,879 --> 00:03:39,000
Well, and since this is January now, how did it go for you?
69
00:03:39,000 --> 00:03:43,199
It was incredible.
70
00:03:43,199 --> 00:03:45,280
Which, OK, first of all, let's unpack this.
71
00:03:45,280 --> 00:03:55,360
For one, I think cringe should be the new word for all things good, because I love people that will unabashedly be themselves and hilarious.
72
00:03:55,360 --> 00:03:57,199
But it just like I love the jokes.
73
00:03:57,199 --> 00:04:00,120
But also, how do you out cringe Justin?
74
00:04:00,120 --> 00:04:03,599
Because he is like king of the dad jokes.
75
00:04:03,599 --> 00:04:06,719
Like he like dad jokes are so bad.
76
00:04:06,759 --> 00:04:08,960
You really out joke Justin?
77
00:04:08,960 --> 00:04:11,159
I'm not a real dad. I'm a faux pas.
78
00:04:13,960 --> 00:04:17,680
Oh my God, I love you.
79
00:04:17,680 --> 00:04:20,839
And I think we're done with the podcast, right?
80
00:04:20,839 --> 00:04:23,439
January 2026, we're off to a good start.
81
00:04:23,439 --> 00:04:24,639
This is a mic drop.
82
00:04:24,639 --> 00:04:26,240
Thank you all for coming.
83
00:04:26,240 --> 00:04:29,839
Subscribe at cassidoo.co.
84
00:04:29,839 --> 00:04:31,279
That's it.
85
00:04:31,279 --> 00:04:33,199
That's why I'm here.
86
00:04:33,240 --> 00:04:43,800
So tell us, tell us how you got from traditionally being a software developer, or doing software development, into developer advocacy.
87
00:04:43,800 --> 00:04:44,800
What did that look like for you?
88
00:04:44,800 --> 00:04:46,240
What have you been doing throughout your career?
89
00:04:46,240 --> 00:04:47,639
Obviously hilarious.
90
00:04:47,639 --> 00:04:51,000
I mean, it was just like you can't be a software developer with this sense of humor.
91
00:04:51,000 --> 00:04:52,839
So you had to go do something else.
92
00:04:52,839 --> 00:04:57,079
I just had to figure out how can I, you know, force this upon people?
93
00:04:57,079 --> 00:04:59,279
No, I request jokes weren't happening.
94
00:04:59,279 --> 00:05:02,359
You're like she's actually got a personality and people skills.
95
00:05:02,359 --> 00:05:03,719
Oh, no.
96
00:05:03,719 --> 00:05:06,039
What are we going to do?
97
00:05:06,039 --> 00:05:10,839
Honestly, my entire career has been just like a DevRel sandwich where I actually...
98
00:05:10,839 --> 00:05:21,000
So it kind of goes back to college where when I got into college, they had like asked if I could speak to high school students about majoring in computer science and stuff like that.
99
00:05:21,000 --> 00:05:22,839
And I liked it.
100
00:05:22,839 --> 00:05:26,759
And I was just like, I wonder if there's roles where I can talk in addition to code.
101
00:05:26,800 --> 00:05:29,879
I don't know. And I just continued on with my life.
102
00:05:29,879 --> 00:05:39,240
But then there was a point my senior year, I was going to a lot of different hackathons and I'm going to just go through this part fast because it's kind of silly how obnoxious it is.
103
00:05:39,240 --> 00:05:40,360
But here's what happened.
104
00:05:40,360 --> 00:05:41,759
I was going to a lot of hackathons.
105
00:05:41,759 --> 00:05:46,399
One of the hackathons was a hackathon on an airplane, on a flight from San Francisco to London.
106
00:05:46,399 --> 00:05:47,519
And we had to build something.
107
00:05:47,519 --> 00:05:52,120
And my team ended up winning and we had to speak at the United Nations about our project.
108
00:05:52,160 --> 00:05:59,399
And in that process, I was also doing other hackathons until the United Nations talk, and I ended up interviewing at Venmo at the time.
109
00:05:59,399 --> 00:06:03,920
And that was my first job out of college where they asked me to do both software engineering and dev advocacy.
110
00:06:03,920 --> 00:06:06,720
And it was early in the industry at that time.
111
00:06:06,720 --> 00:06:12,000
And so I was just kind of doing both roles, figuring out what it would look like for the company at the time.
112
00:06:12,000 --> 00:06:18,079
And then I kind of bounced between advocacy and engineering, depending on the role for the rest of my career.
113
00:06:18,079 --> 00:06:19,519
There's so much to unpack there.
114
00:06:19,680 --> 00:06:23,399
First of all, there was a hackathon on an airplane.
115
00:06:23,399 --> 00:06:27,719
Also, you spoke at the United Nations and you did this before your first job out of college.
116
00:06:27,719 --> 00:06:30,039
Like, way to be an overachiever.
117
00:06:30,039 --> 00:06:31,599
Also, I love this for you.
118
00:06:31,599 --> 00:06:37,959
Thanks. Yeah, no, it was a whirlwind of just lots of things happening at once.
119
00:06:37,959 --> 00:06:40,439
And yeah, there is a lot to unpack there.
120
00:06:40,439 --> 00:06:49,439
But long story short, hackathons and meeting a lot of people led to me eventually going into advocacy because I saw people at these hackathons.
121
00:06:49,439 --> 00:06:51,600
who were representing companies and stuff.
122
00:06:51,600 --> 00:06:53,560
And I was like, wait, this is your job.
123
00:06:53,560 --> 00:06:55,800
You can just help people code. That's so fun.
124
00:06:55,800 --> 00:07:02,600
And so, again, it wasn't like as much of a full time job back when I was about to enter the industry.
125
00:07:02,600 --> 00:07:09,360
And so my initial roles were combos of advocacy and engineering and then went all into engineering,
126
00:07:09,360 --> 00:07:13,399
then went all into advocacy and back and forth for almost every role.
127
00:07:13,399 --> 00:07:17,360
How did you find roles that were both advocacy and engineering?
128
00:07:19,959 --> 00:07:23,920
It was the kind of thing where so I started at Venmo.
129
00:07:23,920 --> 00:07:26,120
Venmo was my first job out of school.
130
00:07:26,120 --> 00:07:30,680
And at the time, Venmo was owned by a company called Braintree.
131
00:07:30,680 --> 00:07:35,079
And then PayPal bought Braintree and Venmo right around when I was joining.
132
00:07:35,079 --> 00:07:37,839
And then PayPal split off from eBay.
133
00:07:37,839 --> 00:07:39,480
There was a lot of like shifts and stuff.
134
00:07:39,480 --> 00:07:46,120
And so they were kind of, like, in combination making up the role at Venmo, but also changing tides and stuff.
135
00:07:46,160 --> 00:07:52,199
And so that's why it was kind of a combo role because they needed someone to speak to developers and use the Venmo API at the time.
136
00:07:52,199 --> 00:07:54,399
But it was also shifting.
137
00:07:54,399 --> 00:07:58,360
And because of all of my work in the New York City tech scene at the time, that's where I was living.
138
00:07:58,360 --> 00:08:06,439
I eventually when enough changes were happening because of that buyout and PayPal splitting off from eBay and stuff,
139
00:08:06,439 --> 00:08:09,639
I ended up going to a startup that basically wanted me to do the exact same thing.
140
00:08:09,639 --> 00:08:13,360
And it was called Clarifai, working in AI at the time.
141
00:08:13,400 --> 00:08:19,080
And that was also a combo role where I was doing advocacy and engineering on the product,
142
00:08:19,080 --> 00:08:25,480
where because it was a startup that was less than 20 people, I was just kind of fulfilling the needs.
143
00:08:25,480 --> 00:08:29,639
I was also getting tired and I also wanted to move away from New York City.
144
00:08:29,639 --> 00:08:33,600
And so I moved to Seattle and worked for a creative agency for a while.
145
00:08:33,600 --> 00:08:35,480
And that was just straight engineering.
146
00:08:35,480 --> 00:08:38,480
And so I was coding for clients, doing some engineering management.
147
00:08:38,480 --> 00:08:40,279
And that was my role.
148
00:08:40,319 --> 00:08:44,600
And I liked it a lot, but I ended up missing talking to developers.
149
00:08:44,600 --> 00:08:50,480
And so from there, I went to... well, that agency ended up being bought and doesn't exist anymore.
150
00:08:50,480 --> 00:08:51,919
And I went to Amazon after that.
151
00:08:51,919 --> 00:08:53,879
And that was full advocacy.
152
00:08:53,879 --> 00:08:56,559
And it was advocacy for the Echo.
153
00:08:56,559 --> 00:09:00,919
I'm not going to say her name because she's behind me and she'll hear me, the Echo.
154
00:09:00,919 --> 00:09:07,240
And it was fun, but without getting too into the weeds.
155
00:09:07,279 --> 00:09:12,799
The culture didn't really fit with what I wanted to do and I didn't love it.
156
00:09:12,799 --> 00:09:15,159
And so I ended up going to CodePen after that.
157
00:09:15,159 --> 00:09:16,840
And that was a full engineering role.
158
00:09:16,840 --> 00:09:22,200
And I loved working on CodePen and just working on a product that I was using so regularly.
159
00:09:22,200 --> 00:09:28,440
And I loved CodePen, but I started to miss that speaking to developers aspect again.
160
00:09:28,440 --> 00:09:33,320
And so eventually after CodePen, I went and taught React full time.
161
00:09:33,360 --> 00:09:39,360
And so I went to a small company called React Training where that team eventually made Remix,
162
00:09:39,360 --> 00:09:41,920
but was maintaining React Router and a bunch of other stuff.
163
00:09:41,920 --> 00:09:46,560
And so I was teaching coding full time, doing corporate workshops, public workshops,
164
00:09:46,560 --> 00:09:48,879
and it was awesome traveling a ton.
165
00:09:48,879 --> 00:09:54,120
And that was through the end of 2019 and beginning of 2020.
166
00:09:54,120 --> 00:10:02,360
And so people weren't doing remote workshops and talks as much at the time.
167
00:10:02,399 --> 00:10:07,039
The world has changed a whole lot in the past five, six years with regards to that.
168
00:10:07,039 --> 00:10:12,240
And so React Training, we ended up having to just kind of...
169
00:10:12,240 --> 00:10:17,680
Everybody had to part ways because the business wasn't going to survive that for a long time.
170
00:10:17,680 --> 00:10:22,720
And from there, I went to Netlify, did advocacy there for a while.
171
00:10:22,720 --> 00:10:29,080
And working in developer advocacy at Netlify was really fun because it was working on this platform,
172
00:10:29,120 --> 00:10:37,840
the glory days of the Jamstack, and helping developers do web dev and build web products better,
173
00:10:37,840 --> 00:10:40,040
which was very fun.
174
00:10:40,040 --> 00:10:43,040
Eventually I burnt out hardcore and took a little break.
175
00:10:43,040 --> 00:10:47,520
And I ended up doing some part time work, working at remote.com a little bit,
176
00:10:47,520 --> 00:10:50,920
working for some VCs and advising startups for a while.
177
00:10:50,920 --> 00:10:57,040
And then eventually I went to another startup that I was advising and I got really close with the team
178
00:10:57,040 --> 00:10:58,879
and it was called Contenda.
179
00:10:58,879 --> 00:11:00,840
And that startup was so fun.
180
00:11:00,840 --> 00:11:06,720
We were building a bunch of AI products before ChatGPT came out and then ChatGPT came out.
181
00:11:06,720 --> 00:11:11,439
And we had a lot of different pivots and stuff kind of fitting in there.
182
00:11:11,439 --> 00:11:15,639
And honestly, after enough pivots, we got tired and we were like,
183
00:11:15,639 --> 00:11:20,000
okay, Contenda has been very fun, but it's really hard to sustain a business.
184
00:11:20,000 --> 00:11:25,159
And so then I consulted for a little while and ended up at GitHub.
185
00:11:25,159 --> 00:11:28,440
So that's a much longer tapestry of what my career has been.
186
00:11:28,440 --> 00:11:33,720
But it really has been a sandwich of developing and advocacy and developing and advocacy.
187
00:11:33,720 --> 00:11:43,480
Do you think in the year 2026 that AI and coding assistants have ruined what hackathons used to be?
188
00:11:43,480 --> 00:11:45,680
Whoa, what a deep question.
189
00:11:45,680 --> 00:11:48,120
I was just like right off the bat, because I used to love hackathons,
190
00:11:48,120 --> 00:11:53,160
but I loved it because of the team aspect, where you had to get an expert for every little piece
191
00:11:53,160 --> 00:11:55,280
that had some experience in different things.
192
00:11:55,280 --> 00:11:57,920
And I would come in and like, oh, I'm the systems expert or architecture.
193
00:11:57,919 --> 00:12:00,719
You get your front end expert, your DBA, whatever.
194
00:12:00,719 --> 00:12:04,679
And you had to work as a team really, really closely, really quickly.
195
00:12:04,679 --> 00:12:10,679
And I'm sure that that plane ride was a team of a handful of people where you all sat next to each other.
196
00:12:10,679 --> 00:12:12,799
And you just threw files back and forth.
197
00:12:12,799 --> 00:12:14,599
I don't know what the network situation was on the plane.
198
00:12:14,599 --> 00:12:16,759
We were literally on an airplane without Wi-Fi.
199
00:12:16,759 --> 00:12:18,159
USB sticks, no Wi-Fi.
200
00:12:18,159 --> 00:12:19,839
Just like, here you go. Here's the next file.
201
00:12:19,839 --> 00:12:26,120
But now in the age of 2026 and Claude and all these other code assistants and Copilot,
202
00:12:26,120 --> 00:12:31,879
it's individuals that are like, I will do this part and I will figure it out with my co-pilot and you do that part.
203
00:12:31,879 --> 00:12:37,600
And now it's more about divvying up the work than the camaraderie and teamwork, right?
204
00:12:37,600 --> 00:12:39,399
I've never really thought of it that way.
205
00:12:39,399 --> 00:12:45,200
I feel like there's aspects of that where.
206
00:12:45,200 --> 00:12:52,039
I don't know, I feel like in a lot of my really early hackathon days when I was still learning and a really early career developer,
207
00:12:52,039 --> 00:12:56,399
so much of my time was spent just like reading docs and trying to learn a new technology.
208
00:12:56,399 --> 00:13:06,919
And now if I do a hackathon, like, for example, I did the GitHub Game Off recently, it's more like I spent more time figuring out what I wanted to build.
209
00:13:06,919 --> 00:13:09,519
And then I didn't have like AI write it for me.
210
00:13:09,519 --> 00:13:12,879
I just kind of used it as my tool as I was putting my project together.
211
00:13:12,879 --> 00:13:22,000
And so you're right, it was less about the camaraderie, but it was also more about like the final product rather than the let's throw code into a thing until something works.
212
00:13:22,039 --> 00:13:24,240
So it might be...
213
00:13:24,240 --> 00:13:29,320
I don't know if it ruins the hackathon vibe, but it definitely changes it.
214
00:13:29,320 --> 00:13:32,960
And the reason I always went to hackathons was to network with people, right?
215
00:13:32,960 --> 00:13:35,759
It was like, hey, maybe I don't have a team or I have two people.
216
00:13:35,759 --> 00:13:37,440
We need two more or something like that.
217
00:13:37,440 --> 00:13:39,919
And it was always about meeting people and networking.
218
00:13:39,919 --> 00:13:42,840
I don't feel like it's that way anymore.
219
00:13:42,840 --> 00:13:51,759
I haven't been to a hackathon for a while, but the vibe I feel like is going more towards what CTFs are. Like CTFs, capture the flags, are usually individuals
220
00:13:51,799 --> 00:13:57,240
that are like, I'm going to hack this box, get all my points. And sometimes it's teams, but more often than not, it's a single person.
221
00:13:57,240 --> 00:13:58,960
That's like, I know how to use all these tools.
222
00:13:58,960 --> 00:14:04,319
I'm just going to see how much I can hack and they get some notoriety because they won or whatever.
223
00:14:04,319 --> 00:14:07,720
They learn some stuff, but I don't feel like it's the same thing anymore.
224
00:14:08,000 --> 00:14:12,240
I don't know if I agree. I feel like I put on a lot of hackathons and I do a lot of hackathons.
225
00:14:12,240 --> 00:14:16,200
And I think like I'm usually very critical of AI.
226
00:14:16,200 --> 00:14:19,720
Like, I try to hold like an honest perception of it.
227
00:14:20,200 --> 00:14:29,320
And now that I use it every day, like, I don't really think that AI has changed any camaraderie or networking of anything.
228
00:14:29,320 --> 00:14:32,160
I think it almost like this is going to be a wild take.
229
00:14:32,160 --> 00:14:40,600
But I think that we're going to finally, when all this kind of gets more mature, we're finally going to see the difference between developers who could just code.
230
00:14:40,600 --> 00:14:43,600
Like, I think for a long time, we put up with people who could just code.
231
00:14:43,600 --> 00:14:44,680
That was all they were good at.
232
00:14:44,680 --> 00:14:48,720
They were good at the technical part and nobody wanted to put up with him or deal with him.
233
00:14:48,720 --> 00:14:52,519
But you had to hold on to this one dude because he was really good at being technical.
234
00:14:52,519 --> 00:14:53,519
He was always a dude.
235
00:14:53,519 --> 00:14:55,920
Yeah. So like you've got now...
236
00:14:55,920 --> 00:14:56,920
It's an equalizer.
237
00:14:57,519 --> 00:15:03,879
Yes. To me now, like the joke that we made about how Cassidy had an actual personality and people skills.
238
00:15:03,879 --> 00:15:10,519
Like, I think that it's going to be really interesting because it's transforming who we are as developers.
239
00:15:10,519 --> 00:15:16,639
Like, I don't think being a developer five years ago and being a developer five years from now are going to be the same thing.
240
00:15:17,360 --> 00:15:27,919
I like to use it to like make proof of concepts for things, but it's like to make different versions and to kind of like troubleshoot it and get it to test things and then go write the actual thing.
241
00:15:27,919 --> 00:15:31,439
Like, I don't know if I think it almost puts more...
242
00:15:32,199 --> 00:15:40,319
Where I hope it will allow for more people that have both the technical skills, the networking and the people skills and the ability to teach.
243
00:15:40,320 --> 00:15:49,080
Like, because we're going to have to change the way that we teach junior developers and bring people into the fold because they're like, like Stack Overflow.
244
00:15:49,080 --> 00:15:52,840
Like Stack Overflow is not even going to get the same amount of information it used to get, right?
245
00:15:52,840 --> 00:15:58,040
Like, there's a whole new way of trying to figure out how to be a developer because you have...
246
00:15:58,840 --> 00:16:01,560
Like, it's just going to write this code for you, but how do you know if it's good or not?
247
00:16:01,560 --> 00:16:02,080
Like, how do you...
248
00:16:02,080 --> 00:16:06,560
Like, we're going to be forced to have to like really teach junior engineers.
249
00:16:06,599 --> 00:16:16,919
And I think it's going to like be a very like interesting pivot on how we do that and how senior engineers are going to have to be better at like teaching and building those relationships.
250
00:16:16,919 --> 00:16:21,519
So I think it's actually going to be the opposite where there's more emphasis on people.
251
00:16:21,519 --> 00:16:23,839
Well, if it's done right, you know, we could always do it the wrong way.
252
00:16:24,239 --> 00:16:25,279
If we don't fumble it.
253
00:16:26,679 --> 00:16:32,079
If we don't like completely fumble the bag, I think that like I honestly think that's going to be the differentiator.
254
00:16:32,520 --> 00:16:41,920
One, it's going to be the people that take in the most data of the code written because we're not going to have those free open areas to be like, hey, did you have this problem?
255
00:16:41,920 --> 00:16:44,759
Because people are going to ask like, you know, AI now.
256
00:16:45,200 --> 00:17:00,360
And the people, whatever big company or little company, whatever that figures out how to utilize AI in the right way to teach people and not completely like fumble their pipeline and really make good developers who can use it.
257
00:17:00,399 --> 00:17:03,120
But aren't like cut off by it.
258
00:17:03,560 --> 00:17:07,680
I think that's going to be the huge differentiator.
259
00:17:08,160 --> 00:17:09,640
I think that too.
260
00:17:09,640 --> 00:17:10,960
And sorry to interrupt you, Justin.
261
00:17:12,079 --> 00:17:18,000
First of all, I do still see like student hackathons, the people at MLH and all of their stuff there.
262
00:17:18,240 --> 00:17:21,839
They're doing such great networking events for students and early career people.
263
00:17:21,839 --> 00:17:25,839
So I do think that the hackathon spirit of networking is still alive.
264
00:17:26,199 --> 00:17:33,919
I think the lack of collaboration and stuff is more a sign of like society than AI.
265
00:17:33,959 --> 00:17:40,119
I'm sure AI doesn't help, but I also think society is weird, particularly now.
266
00:17:40,119 --> 00:17:41,439
That's a good word for it.
267
00:17:41,439 --> 00:17:43,919
Yeah, that's weird.
268
00:17:45,480 --> 00:17:55,159
But I also, I like what you mentioned, Autumn, when it comes to like the content that's being put out now.
269
00:17:55,320 --> 00:18:03,160
I think this is where like the revival of blogs that I'm seeing and like people leaning into RSS feeds and newsletters is really interesting.
270
00:18:03,160 --> 00:18:14,160
And I hope that we see more of that because people are going away from centralized platforms like the Twitters of the world and stuff, because they're just like, well, if it goes away, where's all of my content?
271
00:18:14,880 --> 00:18:16,560
It's on your spaces that you create.
272
00:18:17,039 --> 00:18:25,879
And this isn't everyone, but I feel like that's a trend I'm starting to see as people starting to create their content and communities in places that are more portable.
273
00:18:26,200 --> 00:18:32,159
I feel like it's almost going to be democratizing, but not in the way that it might seem.
274
00:18:32,679 --> 00:18:40,319
I think it's going to be interesting because the people like the Cassidys and Justins who like to play around with things and try things.
275
00:18:40,319 --> 00:18:42,919
And like your curiosity is never going to go away.
276
00:18:43,000 --> 00:18:51,200
Your big personalities are never going to go away, but now they shine through even more because people are just putting out AI slop that has no personality.
277
00:18:51,480 --> 00:18:55,960
So now like you being funny is even more of a differentiator.
278
00:18:55,960 --> 00:19:02,200
The fact that you can both do the technical and be a really good teacher and make that interesting to me.
279
00:19:02,200 --> 00:19:05,600
I think that honestly, like we're a bunch of nerds.
280
00:19:05,600 --> 00:19:08,880
We've made friends on the Internet without having to be in person for a long time.
281
00:19:09,240 --> 00:19:11,600
And I think it's going to make the Internet cool again.
282
00:19:11,679 --> 00:19:24,199
You know, like you're just like, look at this weird thing I hacked together or, you know, like just, I don't know, like I hate dealing with DNS and AI can do that part for me while I do the cool colors and like the other cool thing.
283
00:19:24,199 --> 00:19:30,559
You know, I mean, to your point, the AI piece is a tool to write the code, right?
284
00:19:30,559 --> 00:19:34,679
Like if I was going into any hackathon for myself, like I never considered myself a coder.
285
00:19:34,839 --> 00:19:37,719
I was like, I will set up the infrastructure so that you can run your code, do whatever.
286
00:19:37,960 --> 00:19:44,480
And now I can participate in the code writing pieces of it because I have a tool that generally does that.
287
00:19:44,920 --> 00:19:49,279
But back to what Cassidy was saying before, like I used to also read all the docs.
288
00:19:49,279 --> 00:19:54,079
And that's how I know when things work or they don't work, when the AI says, oh, you should be able to do this.
289
00:19:54,079 --> 00:19:56,000
Like, no, no, no, that's not in the API spec.
290
00:19:56,000 --> 00:19:58,000
That is not a field that is available.
291
00:19:58,200 --> 00:19:59,000
We have to do it a different way.
292
00:19:59,000 --> 00:20:06,799
Right. Like you still have to have that one notch lower understanding of what you're doing to be able to do it in any successful way.
293
00:20:07,720 --> 00:20:17,559
But I think my ultimate question is, with that notion of I have a tool that writes the code for me and possibly even reads the docs better than a human coder
294
00:20:17,600 --> 00:20:19,960
can comprehend it, what happens to DevRel?
295
00:20:20,880 --> 00:20:28,079
Right. Because DevRel was the person that had to take all the docs and give them the pieces that they cared about.
296
00:20:28,079 --> 00:20:29,279
And then they could go write the code.
297
00:20:29,279 --> 00:20:31,240
But now I have a tool that does both those sides.
298
00:20:31,519 --> 00:20:33,720
So is DevRel's job just to be funny?
299
00:20:34,360 --> 00:20:37,640
Right. Like the things, the humor side, that the AI's not good at.
300
00:20:39,200 --> 00:20:42,319
I think like we are far from that.
301
00:20:42,600 --> 00:20:46,600
Maybe someday I'm going to be just like, remember when I didn't worry about my job.
302
00:20:46,600 --> 00:20:55,920
Ha ha ha. But like the number of AI written blog posts that I've been asked to review, not even just like the ones that coworkers send me.
303
00:20:55,920 --> 00:20:58,680
I mean, like, in general, people will ask me to review blog posts.
304
00:20:58,680 --> 00:21:01,880
I'm just like, I can tell an AI wrote this because it's terrible.
305
00:21:02,360 --> 00:21:07,920
Like I'm personally not worried about that level of content creation.
306
00:21:07,920 --> 00:21:15,640
It might speed things up where I love tools like Descript, for example, where it uses AI to very quickly get a transcript and to make edits.
307
00:21:15,640 --> 00:21:23,200
If you like remove a word and it'll cut something or using AI to auto add captions or something.
308
00:21:23,279 --> 00:21:29,480
I think that there's things that AI does for you, but there's still things that.
309
00:21:30,920 --> 00:21:38,640
I know an AI couldn't do the product demos that I do, and not because the AI tools are bad and not because I'm perfect,
310
00:21:38,880 --> 00:21:45,880
but because there's still a level of human quality that is needed for now, for the next.
311
00:21:45,880 --> 00:21:49,039
I also think the human relation part to it, you know what I mean?
312
00:21:49,279 --> 00:21:53,319
You understand what other humans find, not just funny, because I don't want to like,
313
00:21:53,759 --> 00:21:56,639
I think we have to be careful to not reduce Cassidy to just being funny, right?
314
00:21:56,759 --> 00:21:57,799
But it's more than that.
315
00:21:57,799 --> 00:22:02,799
Like, like, you know, we were talking about laughing about the bad times, like when things go wrong.
316
00:22:02,799 --> 00:22:07,599
It's not just so much about being funny, but like you can then identify with all those developers.
317
00:22:07,599 --> 00:22:09,759
I feel your pain. This happens to me, too.
318
00:22:10,119 --> 00:22:13,599
Like, and I think that, like, AI is good for a lot of things.
319
00:22:13,599 --> 00:22:21,119
But to me, it's like an IDE or, you know, like Photoshop or just the different things that we have already been using to enhance the world.
320
00:22:21,119 --> 00:22:23,039
Like, it's really bad at some things.
321
00:22:23,039 --> 00:22:25,919
It's really good at some things. You have to know that just to use it, right?
322
00:22:25,919 --> 00:22:30,199
Like, there's so many, like, they're just different levels of abstraction.
323
00:22:30,199 --> 00:22:33,119
Like Photoshop didn't stop people from being photographers.
324
00:22:33,159 --> 00:22:42,959
You know what I mean? Like, it didn't stop people from having cool, unique ways of creating like a drama or a story with photos, you know?
325
00:22:42,960 --> 00:22:44,559
So like, I don't know.
326
00:22:44,559 --> 00:22:50,759
I think that that's when we know AI really has the relevance when we get through the like pretending it will do everybody's job.
327
00:22:51,360 --> 00:22:53,319
And like, it almost shows what we're good at.
328
00:22:53,319 --> 00:23:00,799
Like, I don't think the way that you use your videos to like show the empathy and relation that other people like you understand what other people are going through.
329
00:23:00,799 --> 00:23:03,799
Like, I don't think AI could ever do that.
330
00:23:04,519 --> 00:23:07,160
Do either of you know the game Go?
331
00:23:08,440 --> 00:23:10,640
I know of the game Go. I've never played it, but I know it.
332
00:23:10,680 --> 00:23:12,400
Yeah, I love Go.
333
00:23:12,400 --> 00:23:14,080
I play Go every day.
334
00:23:14,120 --> 00:23:16,400
It's a really, really fun game you can play online.
335
00:23:16,400 --> 00:23:17,120
It's great.
336
00:23:18,080 --> 00:23:30,640
AlphaGo came out in like 2017 or something, and it was a whole big thing, kind of like IBM's Watson beat a chess master and AlphaGo was Google's version of beating a Go master.
337
00:23:30,680 --> 00:23:36,880
And it's interesting to see how much the game Go has changed because of that.
338
00:23:37,120 --> 00:23:39,720
Where I again, I played a lot.
339
00:23:39,720 --> 00:23:43,680
I was a part of a study on like, did AI ruin your enjoyment of the game?
340
00:23:43,680 --> 00:23:49,040
And pretty much everybody said no, they still like playing Go, even though an AI could probably beat them.
341
00:23:49,040 --> 00:24:00,600
But what's interesting about the games that we saw and the things that you can see when AlphaGo has played is it comes up with moves that you've never seen before.
342
00:24:00,839 --> 00:24:06,199
I took a couple Go lessons where a teacher was saying like, oh, in this situation you want to go here.
343
00:24:06,199 --> 00:24:06,879
And I was like, why?
344
00:24:06,879 --> 00:24:11,919
And she said, interestingly, nobody did this for about two or three hundred years.
345
00:24:11,959 --> 00:24:13,399
This was actually the correct move.
346
00:24:13,399 --> 00:24:16,439
But when Alpha Go made this move, everyone was like, why would you do that?
347
00:24:16,839 --> 00:24:18,759
Silly AI, this is ridiculous.
348
00:24:18,959 --> 00:24:21,599
And then like 20 moves later, they're like, wait, what?
349
00:24:21,599 --> 00:24:23,399
And it changed the game literally.
350
00:24:23,639 --> 00:24:27,399
We're like, people are just like, now we know you should definitely go there.
351
00:24:27,600 --> 00:24:34,200
And it provided new insights that the people and players didn't see because kind of like that.
352
00:24:34,880 --> 00:24:37,600
There's a Grace Hopper quote, the most dangerous phrase in the language is,
353
00:24:37,600 --> 00:24:38,759
We've always done it this way.
354
00:24:39,120 --> 00:24:42,560
I think there's a lot of that in humanity and stuff.
355
00:24:42,560 --> 00:24:47,120
We are habitual creatures and this is getting very deep and philosophical.
356
00:24:47,400 --> 00:24:54,600
But I think very similarly, AI is going to change the way we see things and the way we do things.
357
00:24:54,719 --> 00:24:59,519
But that doesn't necessarily mean it takes away from not only our enjoyment of the work,
358
00:24:59,519 --> 00:25:01,959
but how we do the work and the work that needs to be done.
359
00:25:04,959 --> 00:25:06,839
And it's not just our industry, right?
360
00:25:06,839 --> 00:25:12,079
Like every industry is grappling with what is happening with something that can create stuff
361
00:25:12,079 --> 00:25:15,439
that only people used to be able to write, like difficult things like music.
362
00:25:15,439 --> 00:25:18,079
The music industry is just like exploding right now.
363
00:25:18,079 --> 00:25:23,639
Right. Like there's just like we don't know what a demo is anymore, because this AI company
364
00:25:23,640 --> 00:25:26,840
can crank out a million years of demos in a day.
365
00:25:26,840 --> 00:25:28,240
And it's just like, well, what do we do with that?
366
00:25:28,240 --> 00:25:29,560
Like, this is not impossible.
367
00:25:29,560 --> 00:25:32,520
It's not possible for a human because we have finite time.
368
00:25:32,880 --> 00:25:39,600
And my enjoyment of things like music and short form video, when I see so many videos,
369
00:25:39,600 --> 00:25:41,800
like, you know what, I just don't want to be on this platform anymore.
370
00:25:41,960 --> 00:25:43,480
Like, I don't want to be.
371
00:25:43,480 --> 00:25:45,200
I don't want to see the AI slop.
372
00:25:45,200 --> 00:25:47,320
I don't know. No one else put effort into this.
373
00:25:47,720 --> 00:25:52,759
And I think the effort and knowing that someone cared enough to spend their time on it
374
00:25:53,240 --> 00:25:54,960
is sometimes why I care. Right.
375
00:25:54,960 --> 00:25:59,839
There's someone like I'm practicing Go or whatever to get up to a human level,
376
00:26:00,160 --> 00:26:01,680
to get better as a player.
377
00:26:01,680 --> 00:26:03,319
And it doesn't even matter if you're the best in the world.
378
00:26:03,319 --> 00:26:06,879
Right. Like I watch my kids play sports and I'm like, hey, if everyone's at the same level,
379
00:26:06,879 --> 00:26:08,200
it's still enjoyable. Right.
380
00:26:08,200 --> 00:26:11,480
They're not good players, but they all kind of suck in their own way.
381
00:26:11,480 --> 00:26:15,359
And it's fun to watch because it's actually great seeing a hackathon
382
00:26:15,480 --> 00:26:17,440
when everyone sucks in the same ways. Right.
383
00:26:17,440 --> 00:26:20,759
Like if there's one team in there that's, you know, professional software developers
384
00:26:20,799 --> 00:26:22,799
and like, well, I don't care because they're going to win the hackathon every time.
385
00:26:22,799 --> 00:26:26,119
No, no, no. I want to level the playing field and I want to know they have fun.
386
00:26:26,119 --> 00:26:28,160
They're learning something and the effort is the same.
387
00:26:28,720 --> 00:26:32,160
But if they're all using it equally, aren't they still doing it at the same level?
388
00:26:32,160 --> 00:26:34,400
You know what I mean? Like if they're all like, because.
389
00:26:35,920 --> 00:26:38,359
I don't know, like, you know what I mean, like if they're all coming in
390
00:26:38,359 --> 00:26:41,240
and they're building something and they're using it to help them
391
00:26:41,240 --> 00:26:44,359
find different ways to do it, like we don't know that they're all using it
392
00:26:44,359 --> 00:26:46,359
to write it the same way. You know what I mean?
393
00:26:46,920 --> 00:26:50,200
From my experience, the people that come in that are good at hackathons
394
00:26:50,200 --> 00:26:52,840
are the ones that know you can do something that someone else doesn't know.
395
00:26:53,200 --> 00:26:56,279
Right. It's like if you can prompt the AI and say, oh, I actually know
396
00:26:56,440 --> 00:27:00,960
this Linux subsystem or I know that API or I know like these electronics,
397
00:27:00,960 --> 00:27:04,920
whatever it is, like I know this piece better than anyone else here,
398
00:27:04,920 --> 00:27:06,600
then I know what it's capable of.
399
00:27:06,600 --> 00:27:11,240
So I can prompt the AI to help me get to the edges of how it's used.
400
00:27:11,480 --> 00:27:14,960
And that's where I usually see people succeed is when they reach some edge
401
00:27:14,960 --> 00:27:18,519
of their own understanding to be able to implement it in a certain way.
402
00:27:18,759 --> 00:27:20,920
You ever seen a bunch of college kids use AI, though?
403
00:27:20,920 --> 00:27:22,400
It's very interesting.
404
00:27:22,400 --> 00:27:25,279
No, no, but honestly, like it's like when you give...
405
00:27:27,200 --> 00:27:29,440
The reason why I say this is because I really like going to schools
406
00:27:29,440 --> 00:27:31,759
and teaching little kids about like technology.
407
00:27:32,160 --> 00:27:34,319
Their brains are so malleable, right?
408
00:27:34,319 --> 00:27:38,000
Like they have not yet learned the constraints
409
00:27:38,000 --> 00:27:40,680
of like adulthood, rooting or fun.
410
00:27:40,680 --> 00:27:44,599
And the way that they look at like how to solve problems is very different.
411
00:27:44,639 --> 00:27:48,639
Like now we have to be super careful that we don't kill their creativity
412
00:27:48,639 --> 00:27:50,559
and curiosity, right?
413
00:27:50,559 --> 00:27:53,119
But I think that goes in like the education of it.
414
00:27:53,399 --> 00:27:54,079
You know what I mean?
415
00:27:54,079 --> 00:27:58,279
Like, seriously, did you see that thing
416
00:27:58,279 --> 00:28:03,240
about a bunch of game developers talking about how their CEO said
417
00:28:03,240 --> 00:28:06,480
that they're using AI for all their proof of concepts?
418
00:28:06,480 --> 00:28:08,559
And then that artist was like, no, we're not. Like, I
419
00:28:09,879 --> 00:28:11,639
use it to explore a couple of ideas.
420
00:28:11,639 --> 00:28:14,119
And then I go draw it the way that I have always done it, you know?
421
00:28:14,119 --> 00:28:19,319
And I think like I was writing like just something that I was going to use
422
00:28:19,759 --> 00:28:22,719
for something else next week, and I wrote it three different ways.
423
00:28:22,759 --> 00:28:25,719
And then I had it tested on a bunch of things, which would have taken me forever.
424
00:28:26,159 --> 00:28:28,199
But that just made like work.
425
00:28:28,199 --> 00:28:30,599
Like, for instance, you were talking about content, but like,
426
00:28:30,599 --> 00:28:33,119
what if it's not all AI content generated?
427
00:28:33,399 --> 00:28:36,719
But what if I take a bunch of content of my kids in my life
428
00:28:36,719 --> 00:28:38,599
and sometimes I get AI to help me edit it?
429
00:28:38,599 --> 00:28:41,199
So the editing process is half the time, right?
430
00:28:41,240 --> 00:28:45,600
Like, I think there's still ways that like we're almost going to more appreciate.
431
00:28:45,640 --> 00:28:49,920
Like when you see like the cool like remixes on TikTok, like, cool.
432
00:28:49,920 --> 00:28:50,600
It's new music.
433
00:28:50,600 --> 00:28:52,319
Like there's one of Sleep Token and Christmas.
434
00:28:52,319 --> 00:28:54,120
And I was like, this is the best thing I've ever heard.
435
00:28:54,120 --> 00:28:57,840
But honestly, like, am I going to not listen to Sleep Token because of that?
436
00:28:57,840 --> 00:28:59,880
No way. Like in.
437
00:28:59,880 --> 00:29:02,960
And it's a good thing because you have familiarity with what it, you know,
438
00:29:02,960 --> 00:29:05,640
you have a connection to the other thing it's remixing.
439
00:29:05,840 --> 00:29:08,000
And you're like, oh, this has a different vibe of nostalgia.
440
00:29:08,000 --> 00:29:10,799
Have you ever heard just regular AI stuff?
441
00:29:10,799 --> 00:29:12,879
Like that there's no relation of that.
442
00:29:12,879 --> 00:29:15,759
There's no aspect. Yeah, there's no color.
443
00:29:16,480 --> 00:29:19,799
Yes. It's like, remember, it's like if you took the
444
00:29:20,079 --> 00:29:23,119
this is going to be weird because I love photography and I used to do design.
445
00:29:23,359 --> 00:29:26,000
But if you took all of the like color out of the world
446
00:29:26,000 --> 00:29:28,639
and everything was black and white, and I don't mean the beautiful depth
447
00:29:28,639 --> 00:29:31,879
of black and white, but like if you did it, like you just pulled it out.
448
00:29:32,240 --> 00:29:35,319
That's not the same thing as if you did black and white
449
00:29:35,319 --> 00:29:36,839
in like an artistic way.
450
00:29:36,839 --> 00:29:39,279
And it's not the same thing as the world in color, right?
451
00:29:39,279 --> 00:29:41,440
Like it's just kind of like a sad version of it.
452
00:29:41,960 --> 00:29:44,519
And I think that human relation is it, right?
453
00:29:44,519 --> 00:29:47,920
Like the human relation in music, the human relation of what Cassidy,
454
00:29:47,920 --> 00:29:50,639
because she's gone back and forth to being a developer and she's
455
00:29:50,639 --> 00:29:53,759
and like DevRel and she teaches and she goes to these hackathons.
456
00:29:54,079 --> 00:29:56,960
She has a different relation, even the way that you talk about things
457
00:29:56,960 --> 00:30:00,240
when you're super like into like you'll come up with a book
458
00:30:00,240 --> 00:30:01,839
and you'll be like, hey, I just read this new book.
459
00:30:01,839 --> 00:30:04,240
And I'm like, I would have never read that book if you didn't give me that.
460
00:30:04,720 --> 00:30:07,839
No, honestly, like the other day, he was so excited about something.
461
00:30:08,240 --> 00:30:11,359
And I was like, oh, I've been having the worst tech week, you know,
462
00:30:11,359 --> 00:30:14,480
like I've just said, like, oh, like I don't like, do I even like the stuff anymore?
463
00:30:14,720 --> 00:30:16,359
And he's so excited about it.
464
00:30:16,359 --> 00:30:18,319
It makes me excited about it again.
465
00:30:18,319 --> 00:30:20,199
You know, like, I don't think that you could kill that.
466
00:30:20,199 --> 00:30:23,319
Like and I do think that it's true.
467
00:30:23,319 --> 00:30:25,559
There's times where I look at a post, I'm just like,
468
00:30:25,559 --> 00:30:29,000
if you couldn't bother to write this post, I'm not going to be bothered to read it.
469
00:30:29,439 --> 00:30:34,439
Like I like when it comes to things like writing and music and videos and stuff,
470
00:30:34,439 --> 00:30:36,079
if it's all AI, you can tell.
471
00:30:36,079 --> 00:30:37,199
I don't want all of it.
472
00:30:37,200 --> 00:30:38,559
I don't want it. I don't want to hear that.
473
00:30:38,559 --> 00:30:41,360
If you're using it to like better something, maybe.
474
00:30:41,360 --> 00:30:44,000
But if it's just all of that and you gave nothing, I don't want it.
475
00:30:44,080 --> 00:30:47,240
I think like AI is a somewhat decent editor.
476
00:30:47,240 --> 00:30:48,600
It's a terrible writer.
477
00:30:48,600 --> 00:30:51,039
I think it's a decent outliner.
478
00:30:51,200 --> 00:30:53,360
I don't want it to fill in the blanks like.
479
00:30:53,960 --> 00:30:56,759
And I think that that's where some of the gaps are.
480
00:30:56,759 --> 00:31:03,519
But I feel like we're also in such early days that society is learning that.
481
00:31:03,519 --> 00:31:06,600
Like it's not a silver bullet for so many things.
482
00:31:06,599 --> 00:31:10,519
And I think that, like, that is a big part of my job now is figuring out,
483
00:31:10,519 --> 00:31:12,000
like how do we communicate that?
484
00:31:12,000 --> 00:31:13,799
Like, for example, we at GitHub.
485
00:31:14,039 --> 00:31:15,279
We're building GitHub Copilot.
486
00:31:15,279 --> 00:31:17,839
There's all of these different tools you can use.
487
00:31:18,159 --> 00:31:23,000
Smart autocomplete, chat, agents, all these things where they are genuinely helpful.
488
00:31:23,079 --> 00:31:25,119
We don't want to replace developers.
489
00:31:25,519 --> 00:31:28,839
We don't want to take the enjoyment out of coding and creating and stuff.
490
00:31:29,079 --> 00:31:34,319
But we also know that we can help accelerate certain aspects of the process.
491
00:31:34,359 --> 00:31:40,720
How do we communicate that and educate people in the way that will better them
492
00:31:40,720 --> 00:31:43,960
rather than make them feel nervous or gross about it?
493
00:31:44,480 --> 00:31:48,559
Not just that, but just as a developer who's been using Copilot, like it's so
494
00:31:48,559 --> 00:31:54,159
different using like an agent in VS Code versus using like the ask version of it
495
00:31:54,159 --> 00:32:00,919
or using it in a browser or using GitHub Copilot CLI, by the way,
496
00:32:01,039 --> 00:32:04,279
GitHub Copilot CLI is fire.
497
00:32:04,320 --> 00:32:05,480
Like it is so different.
498
00:32:05,480 --> 00:32:07,039
It's so wild.
499
00:32:07,039 --> 00:32:10,200
I've only used it a little bit, but it's so amazing.
500
00:32:10,200 --> 00:32:14,600
It is a game changer, the difference between that and using it in VS Code, I think,
501
00:32:14,600 --> 00:32:19,080
because it separates it for me and it can go like, I can give it very like direct
502
00:32:19,560 --> 00:32:22,800
like instructions and it'll build something and then I'll go and do my own.
503
00:32:22,800 --> 00:32:25,240
Like, you know, it keeps it separate, but I don't know.
504
00:32:25,279 --> 00:32:28,360
I just think that like we're still figuring it out.
505
00:32:28,399 --> 00:32:32,079
And I think that it's almost like with our buying power, we're also going to make
506
00:32:32,079 --> 00:32:32,959
a lot of these decisions.
507
00:32:32,959 --> 00:32:34,079
Yeah. Right.
508
00:32:34,079 --> 00:32:37,879
Like if you don't buy AI music, they're going to stop making it.
509
00:32:38,399 --> 00:32:40,119
You know what I mean? So like.
510
00:32:40,119 --> 00:32:44,879
Yeah, I think one of the best benefits we have with AI in 2025 has been that it's
511
00:32:44,879 --> 00:32:48,199
really expensive for the people running it, right?
512
00:32:48,199 --> 00:32:50,199
Like they actually have to put a bunch of investment in.
513
00:32:50,199 --> 00:32:53,399
And some of that is offset by VC and government funds and whatever.
514
00:32:53,399 --> 00:32:54,839
It can't be forever. Right.
515
00:32:54,839 --> 00:32:57,159
At some point, you get that Uber tipping point.
516
00:32:57,160 --> 00:33:00,400
Yeah. Where all your rides are no longer VC funded and you're like,
517
00:33:00,400 --> 00:33:01,920
oh, how much does this ride actually cost?
518
00:33:01,920 --> 00:33:03,000
Like, I don't think so.
519
00:33:03,000 --> 00:33:05,000
I don't think that's where it's going to get interesting
520
00:33:05,000 --> 00:33:07,519
because it is good at some coding projects. Right.
521
00:33:07,519 --> 00:33:11,560
So I think that it'll be worth spending the money to keep that part around.
522
00:33:11,880 --> 00:33:15,240
But when everyone's like, please get rid of the AI music
523
00:33:15,240 --> 00:33:18,200
and we don't want you to write anything or like we have.
524
00:33:18,200 --> 00:33:21,320
We've had auto correct and Grammarly and all kind of stuff for forever.
525
00:33:21,320 --> 00:33:24,440
Like, I think those things people will have to use them
526
00:33:24,920 --> 00:33:29,400
and like more sparingly because they won't be free and they won't be cheap.
527
00:33:29,400 --> 00:33:31,240
So you'll use them where they actually.
528
00:33:32,120 --> 00:33:34,840
You're going to pay for the things that you actually get benefit from.
529
00:33:35,480 --> 00:33:35,960
Exactly.
530
00:33:35,960 --> 00:33:39,720
And hopefully it's not just people paying to not think anymore
531
00:33:39,720 --> 00:33:41,960
because I think there's a lot of value in thinking.
532
00:33:41,960 --> 00:33:45,720
And I think that's why it's been free for so long too, though, because if they
533
00:33:45,720 --> 00:33:50,039
like get you with that carrot, right, where they're like, try this
534
00:33:50,039 --> 00:33:53,720
and also forget how to write things and forget how to think.
535
00:33:53,799 --> 00:33:58,200
And then you're reliant on it. Exactly, which is why I play devil's advocate
536
00:33:58,200 --> 00:33:59,640
with my kids about all this stuff.
537
00:33:59,640 --> 00:34:01,480
And I'm like, well, what do you think about that?
538
00:34:01,480 --> 00:34:04,120
And I like it's always about that critical thinking.
539
00:34:04,120 --> 00:34:05,480
Socratic questioning.
540
00:34:05,480 --> 00:34:08,679
Because it's like I think that's the part that like
541
00:34:08,679 --> 00:34:11,960
nobody could ever replace your curiosity, either of you two.
542
00:34:11,960 --> 00:34:13,960
That is what makes you who you are.
543
00:34:13,960 --> 00:34:17,159
There's no AI that could be your personality or curiosity.
544
00:34:17,159 --> 00:34:19,320
And I don't think that we're going to lose that as long as we're
545
00:34:19,960 --> 00:34:22,199
cognizant of that being a tool.
546
00:34:22,199 --> 00:34:22,760
You know what I mean?
547
00:34:22,760 --> 00:34:24,520
I think you hit on something that...
548
00:34:25,720 --> 00:34:29,720
AI and DevRel, to some extent, like the thing that is valuable
549
00:34:29,720 --> 00:34:31,640
out of it is getting someone else excited.
550
00:34:32,200 --> 00:34:37,240
It is to inspire someone to go do the thing that was difficult, right?
551
00:34:37,240 --> 00:34:38,680
Because those things are difficult tasks.
552
00:34:38,680 --> 00:34:42,840
Like if you say here's a blank IDE, go make a web page.
553
00:34:42,840 --> 00:34:44,840
Most people will be like, I don't know where to start.
554
00:34:44,840 --> 00:34:45,560
Right. Like even back.
555
00:34:45,560 --> 00:34:49,480
Oh, my God, that's what it helps me with, with ADHD, like deer in the headlights.
556
00:34:49,480 --> 00:34:51,160
Like I get so overwhelmed.
557
00:34:51,159 --> 00:34:55,159
And then all of a sudden, AI has broken it into like all these different ways.
558
00:34:55,159 --> 00:34:55,719
And it's done it.
559
00:34:55,719 --> 00:34:57,480
And I'm like, I don't really like the way you did it.
560
00:34:57,480 --> 00:34:58,759
But I like this part.
561
00:34:58,759 --> 00:35:00,839
But in this part, yes.
562
00:35:00,839 --> 00:35:02,599
And then I'm like, oh, this is horrible.
563
00:35:03,159 --> 00:35:05,480
Dude, it's like I get deer in the headlights.
564
00:35:05,480 --> 00:35:06,519
I'll procrastinate.
565
00:35:06,519 --> 00:35:07,480
Like, you know what I mean?
566
00:35:07,480 --> 00:35:08,920
Like, but that's not even new, right?
567
00:35:08,920 --> 00:35:10,440
Because we've been doing that, like Ruby on Rails, right?
568
00:35:10,440 --> 00:35:12,199
Like templates out the whole website.
569
00:35:12,199 --> 00:35:12,920
That's what I'm saying.
570
00:35:12,920 --> 00:35:15,799
And it's just like, yeah, it's all this information.
571
00:35:15,799 --> 00:35:17,239
It's just the hype cycle.
572
00:35:17,239 --> 00:35:20,039
And getting people excited and enough to say like, hey,
573
00:35:20,039 --> 00:35:22,599
this might be difficult, but you have to figure out some of these things.
574
00:35:22,599 --> 00:35:27,719
I don't know if I showed you, Autumn, I vibe coded like an app called Where to Watch It.
575
00:35:27,719 --> 00:35:28,199
Did you see it?
576
00:35:29,079 --> 00:35:30,920
Oh, wait, one more thing before we move on really quick.
577
00:35:31,800 --> 00:35:39,239
Your video about Kubernetes, you used a container, water, and some other stuff.
578
00:35:39,239 --> 00:35:43,239
And like before this, like you're the only person, like before scale, before this was,
579
00:35:43,239 --> 00:35:45,000
you're the only person I knew that did Kubernetes, right?
580
00:35:45,719 --> 00:35:47,559
And I just didn't understand them.
581
00:35:47,559 --> 00:35:48,759
I was like, OK, it's a container.
582
00:35:50,199 --> 00:35:53,559
And the way that you related it to like real life objects,
583
00:35:53,559 --> 00:35:56,599
if an AI has never played with water and whatever, like, yeah,
584
00:35:56,599 --> 00:35:58,840
they can relate it to something and they can give it to you.
585
00:35:58,840 --> 00:36:02,759
But they're not, they've never been a human that held that water that did those things.
586
00:36:02,759 --> 00:36:07,880
So like nobody could ever reinvent the way that you used a random water
587
00:36:07,880 --> 00:36:09,639
and other things to tell that story.
588
00:36:09,639 --> 00:36:12,519
Like it's how you tell the story when you got your like glasses
589
00:36:12,519 --> 00:36:15,400
and you could see like different like colors all of a sudden.
590
00:36:15,400 --> 00:36:19,480
Like it's your, you get curious about a subject, you get all excited.
591
00:36:19,480 --> 00:36:21,880
You go out into the world and do something with it.
592
00:36:21,880 --> 00:36:24,760
And then you share it with everyone because that's your personality.
593
00:36:24,760 --> 00:36:26,360
And then everybody else is excited about it.
594
00:36:26,360 --> 00:36:28,360
Well, yeah, getting excited, like the color.
595
00:36:28,360 --> 00:36:31,320
I've been colorblind my entire life and I've only been able to see
596
00:36:31,880 --> 00:36:33,719
more colors for the last two years.
597
00:36:33,719 --> 00:36:36,840
But the way that you did that changed like the way that we do it.
598
00:36:36,840 --> 00:36:41,960
We went to, when we went to do awesome, when I still worked at AWS, we had different cards.
599
00:36:41,960 --> 00:36:46,679
We picked the cards differently because I wouldn't have considered colorblindness before then.
600
00:36:46,679 --> 00:36:48,519
And then I was like, oh crap, we have to be careful.
601
00:36:48,519 --> 00:36:50,280
What if someone can't see these?
602
00:36:50,280 --> 00:36:51,400
Yeah. Yeah.
603
00:36:51,400 --> 00:36:55,079
And like that, that has given me different excitement to go experience more things
604
00:36:55,079 --> 00:36:58,840
because I know what's possible that I can see more shades of red now.
605
00:36:58,840 --> 00:37:00,599
And in those contexts, like, oh, cool.
606
00:37:00,599 --> 00:37:02,840
Now I know where my next limit is.
607
00:37:02,840 --> 00:37:06,199
Like I still can't see everything, but I can actually see sunsets better now.
608
00:37:06,199 --> 00:37:09,559
Right? Like those sorts of things are using those tools
609
00:37:09,559 --> 00:37:11,559
to get excited about doing something else.
610
00:37:11,559 --> 00:37:16,920
And that's where I think AI and DevRel fits where the human aspect of DevRel
611
00:37:16,920 --> 00:37:21,159
is really to inspire, to tell people about some things that maybe they didn't know existed,
612
00:37:21,159 --> 00:37:26,760
to explain something in a way that is more relatable, that wouldn't be, you know,
613
00:37:28,119 --> 00:37:33,159
next word predicted, but be able to go through like a different surprise of a human aspect
614
00:37:33,159 --> 00:37:36,440
of how this stuff works. But like I was saying, like I, I've coded this thing just
615
00:37:36,440 --> 00:37:39,480
because I was starting from, I wanted to do it for a while.
616
00:37:39,480 --> 00:37:43,400
It was an idea that I heard someone else have on a podcast. I'm like, oh, that sounds like fun.
617
00:37:43,400 --> 00:37:46,280
I wish I could do that. Right. But it would take me a long time to do it.
618
00:37:46,280 --> 00:37:50,280
I was like, I just have limited time and it still took me time to build.
619
00:37:50,840 --> 00:37:54,680
But I was watching a movie while I was doing it. Right. Like this isn't something that like,
620
00:37:54,680 --> 00:37:57,880
it's not making money. It's not something that's like, so I just wanted to get to the point of
621
00:37:57,880 --> 00:38:03,240
how much would it cost me, as an individual who knows some of this stuff, to do it. Right.
622
00:38:03,240 --> 00:38:08,760
And so far I think I'm like $60 in AI credits in, and a $25 domain. And that's it. Right.
623
00:38:08,760 --> 00:38:13,400
So it's like, I'm still under a hundred dollars, which is great because it's taken me a few evenings
624
00:38:13,400 --> 00:38:17,320
of, Hey, like the app mostly works. Is it secure? Absolutely not. Right. Like I don't,
625
00:38:17,320 --> 00:38:22,440
there's no, no auth in this, but it works for me. And I feel like that,
626
00:38:22,440 --> 00:38:27,079
I feel like the thing that we're seeing a lot and I think will be an interesting trend. And I say,
627
00:38:27,079 --> 00:38:33,320
we, as in like just my team at GitHub is people are building personal tools so much more because
628
00:38:33,320 --> 00:38:39,320
they can kind of vibe it out. And, and I've used so many of my domain names and like dusty side
629
00:38:39,320 --> 00:38:46,360
projects on my pile purely because like you said, I, I'm able to get started and I'm able to,
630
00:38:46,360 --> 00:38:51,160
to be just like, okay, you know what? I just want this app to exist for me so I can have this tool.
631
00:38:51,720 --> 00:38:55,720
I'm not going to worry about how it's built. I'm not going to worry about anything. I want the
632
00:38:55,720 --> 00:39:00,120
actual final product and I'm able to just do it. Sometimes it just stays in a private repo and it
633
00:39:00,120 --> 00:39:06,519
truly is just for me, but it works. And that part has been great. That's still the energy of the
634
00:39:06,519 --> 00:39:12,759
side project, the hackathons, but it's just a different aspect of coming like to it, you know?
635
00:39:13,880 --> 00:39:19,239
And I think one of the theses, or the reason this podcast exists, is to help people
636
00:39:19,239 --> 00:39:26,119
understand the long-term maintenance of software and decisions you make. How does that explosion
637
00:39:26,119 --> 00:39:34,920
of "I can make any tool I want" affect people in a year? What does that turn into in six months
638
00:39:34,920 --> 00:39:38,760
or whatever? Like, yeah, I can just let a domain expire. Maybe I don't care about it anymore,
639
00:39:38,760 --> 00:39:45,639
but I feel like we are so far outpacing anything we've ever done before and we have no people
640
00:39:45,639 --> 00:39:50,280
trained or understanding how any of this is going to be maintained in even a year from now.
641
00:39:50,280 --> 00:39:53,800
I honestly think that's going to be a learning process. Just like teaching junior developers,
642
00:39:53,800 --> 00:39:57,320
like we're going to have to figure it out. It's, it's really hard to maintain something
643
00:39:57,320 --> 00:40:03,240
that you didn't write 4,000 lines later. Right. And I think that that's where we need to keep
644
00:40:03,239 --> 00:40:08,919
beating the drum as an industry that it's not just about senior devs. We need to like still
645
00:40:08,919 --> 00:40:14,199
level up the people who are early in their careers because those people are the future senior devs.
646
00:40:14,199 --> 00:40:18,839
And again, the roles are probably going to be looking different in a year, two years, five years.
647
00:40:18,839 --> 00:40:23,079
We don't know what it's going to look like, but we need people to continue to enter the industry
648
00:40:23,079 --> 00:40:27,639
and learn, even though it's particularly weird and challenging right now with all of these tools
649
00:40:27,639 --> 00:40:31,639
coming up. It's such a weird place to be in. Like, I feel like you don't even know what you're
650
00:40:31,639 --> 00:40:35,239
shooting for when you are thinking about promotions because you're like, what does that
651
00:40:35,239 --> 00:40:41,400
even look like right now? Yeah, it's also tough. Go ahead. Weird thing that I was thinking of when
652
00:40:41,400 --> 00:40:47,079
we were talking about paying kind of like to be able to do a little bit more. Like I got advice
653
00:40:47,079 --> 00:40:52,359
from a GM when I was like first a software engineer at AWS and I was like, how do you like
654
00:40:53,000 --> 00:40:58,440
balance being like a good parent and showing up to all your kids' stuff and like being a good spouse
655
00:40:58,440 --> 00:41:02,599
and like still like kicking butt at your career? Like how do you do all that? And he told me
656
00:41:03,159 --> 00:41:09,240
that he contracts out the stuff that he doesn't want to do. So like Instacart and like getting
657
00:41:09,240 --> 00:41:13,000
someone to like clean your house once a month or those kinds of things. And it's kind of funny
658
00:41:13,000 --> 00:41:17,800
because sometimes I almost think that's what people want, that would be the cool thing of AI,
659
00:41:17,800 --> 00:41:21,800
getting it to do the things that we don't want to do or to make our lives easier so we could get a
660
00:41:21,800 --> 00:41:27,240
little bit more time with our like side projects or we can get like do things faster because I
661
00:41:27,239 --> 00:41:31,639
think that's what we should use it for, you know, to buy you a little bit of time for the fun things.
662
00:41:31,639 --> 00:41:36,199
I just want it to do my laundry and my dishes. Oh my, I would pay so much money.
663
00:41:37,639 --> 00:41:46,119
I want this. If you want a VC that like that would make so much money, like do you,
664
00:41:46,119 --> 00:41:53,079
do you, I have three kids, Cassidy. My laundry pile haunts my dreams. I only have two and it's
665
00:41:53,079 --> 00:42:00,519
endless. It's endless. The amount I would give a hood rat money, like I would give so much money.
666
00:42:00,519 --> 00:42:05,960
But actually though, like I keep seeing like people saying we're working on a robot that
667
00:42:05,960 --> 00:42:11,239
can fold your laundry, but the video's AI. And I'm just like, okay, that's not real. Make it real.
668
00:42:11,239 --> 00:42:15,639
I'm begging you. Because I can find out where the socks go because those little socks cost
669
00:42:15,639 --> 00:42:20,360
just as much as regular size socks. And I just want to know like how, like,
670
00:42:20,440 --> 00:42:27,800
like, and like, just like, if somebody could just do the annoying things, like manage
671
00:42:27,800 --> 00:42:34,200
where my kids' controllers went and like where they lost it or like do laundry or something.
672
00:42:34,200 --> 00:42:35,480
This is the future we want.
673
00:42:35,480 --> 00:42:40,360
Can it nag you? Like I'm going to start an app that nags my children and it's like an AI that's like,
674
00:42:40,360 --> 00:42:44,599
you haven't taken a shower yet. Your stuff's all on the floor. Like I'm going to just get like.
675
00:42:44,599 --> 00:42:46,280
That's a good vibe coded app idea.
676
00:42:46,440 --> 00:42:50,120
I swear to God, don't play with me. I'm going to go see if they still have Amazon
677
00:42:50,120 --> 00:42:55,000
DeepLens cameras and be like, no, there's still stuff on the floor in your room. So I don't have to get
678
00:42:55,000 --> 00:42:59,400
off my butt and walk upstairs and tell them that there's still stuff in the room six times.
679
00:42:59,400 --> 00:43:00,440
You know, I just got a startup.
680
00:43:04,120 --> 00:43:09,320
And I think about like when was the washing machine, the clothes washer invented, right?
681
00:43:09,320 --> 00:43:13,800
Like that was a huge game changer for so many people to be able to automate their work and to
682
00:43:13,800 --> 00:43:19,320
do the thing that they didn't want to do anymore. Right. Like that was absolutely just even like a
683
00:43:19,320 --> 00:43:24,039
dishwasher, a clothes washer. I always talk about like this is, this is the original agent in my house.
684
00:43:24,039 --> 00:43:29,080
Right. Like this is the agent that originally like set it all off of like it does stuff for me when
685
00:43:29,080 --> 00:43:32,280
I don't have to do it. And I just set a timer and yeah, I have to change it still. And I have to
686
00:43:32,280 --> 00:43:37,480
fold the clothes. But you know, most of the work is already done. Like I don't have to bring it in a basket,
687
00:43:37,480 --> 00:43:38,039
go down somewhere.
688
00:43:38,039 --> 00:43:43,480
That's why I think it's hilarious. Like there's a disconnect between like the products people
689
00:43:43,480 --> 00:43:49,240
want. Right. And the executives who are like farming these ideas, like I just want to know
690
00:43:49,240 --> 00:43:54,599
what they go into like meetings with PowerPoints on because I'm like, there are moms that would pay
691
00:43:54,599 --> 00:44:01,159
you so much money. Like even not even just moms, like there's so many real life ways that we could
692
00:44:03,159 --> 00:44:08,039
you can't see our faces, but if you could see my face right now, the money I would pay you to do
693
00:44:08,039 --> 00:44:14,039
some of the boring like monotonous crap that I have to do every day. If you could just like,
694
00:44:14,920 --> 00:44:19,800
there's so many cool things, and nobody, like there's this whole market of how AI could make our lives like
695
00:44:19,800 --> 00:44:23,639
easier. And they keep giving us stuff that nobody asked for. And I'm like,
696
00:44:23,639 --> 00:44:26,440
listen, AI music? I want you to fold a shirt.
697
00:44:27,320 --> 00:44:31,800
Or like just do something annoying, like go fill out my kids' forms. You know what I mean? Like go
698
00:44:31,800 --> 00:44:37,079
fill out all their field trip forms and all the like emergency contact update forms and stuff.
699
00:44:37,319 --> 00:44:39,960
You don't want the AI to have that data though. Like that's the other thing.
700
00:44:39,960 --> 00:44:42,679
Can I run it locally? They have all the data.
701
00:44:44,679 --> 00:44:48,440
Have you seen the stuff the government's doing? That ship sailed. Okay. Go fill out the forms.
702
00:44:48,440 --> 00:44:49,079
I don't even care.
703
00:44:49,079 --> 00:44:53,319
That's actually kind of filling out the forms. So that's a good idea if you got like a pen plotter
704
00:44:53,319 --> 00:44:55,079
and then you just get AI to fill it out.
705
00:44:55,079 --> 00:45:01,000
Take something off my plate. That's annoying. I swear to God, the money that people because
706
00:45:01,000 --> 00:45:05,159
we're all like millennials are working like 8 million jobs right now. We're trying to be
707
00:45:05,159 --> 00:45:09,239
present parents. We're trying to like heal the trauma that happened. We're all trying to go to
708
00:45:09,239 --> 00:45:19,079
therapy, work six jobs, make sourdough. Like, bro, if you take something off my plate, I would
709
00:45:19,079 --> 00:45:25,239
pay you obscene amounts of money. Like that one. This is an ADHD problem.
710
00:45:28,599 --> 00:45:31,079
Look, I got a new printer. I got another one.
711
00:45:31,719 --> 00:45:35,159
You have two 3D printers now? This is your problem.
712
00:45:35,159 --> 00:45:41,480
Shut up, Justin. What if somebody did some of the adult stuff? Like I'd have more time to do
713
00:45:41,480 --> 00:45:46,440
stuff. The problem you have is not saying no. And that is like, if you want to outsource saying no
714
00:45:46,440 --> 00:45:50,599
to things, just text me. Have you been talking to my therapist? Shut up, Justin. God.
715
00:45:51,799 --> 00:45:54,199
Come on, man. I thought we were friends.
716
00:45:54,679 --> 00:46:03,879
Anyway, I would, this just gave me some project ideas and I want that. I'm going to go write
717
00:46:03,879 --> 00:46:08,439
an app that nags my children and scans their room. Thanks. Yeah, no, I think that'd be neat.
718
00:46:08,439 --> 00:46:10,839
But yeah, that'd be like, get your shoes off the floor.
719
00:46:16,599 --> 00:46:20,679
Let's see, your house just has motion sensors in every room with cameras. That's like your control.
720
00:46:20,919 --> 00:46:28,679
You'd need a camera to go over in my pantry and be like, put the Nutella snacks down,
721
00:46:28,679 --> 00:46:33,399
put them down. You're on your third Pringles. Leave it alone and go get some carrots.
722
00:46:35,480 --> 00:46:41,799
Some of these are actually very feasible. I grow up. My mind is like, oh, I could do that.
723
00:46:43,319 --> 00:46:47,879
Like I'm about to have IOT devices everywhere. My children are going to be so annoyed at me.
724
00:46:48,039 --> 00:46:54,360
That I'm haunted. They're going to be like, we live in LA, but there's there's such like those kids
725
00:46:54,360 --> 00:46:58,280
that like their mom talks about cybersecurity too much and they're going to be like, and it's
726
00:46:58,280 --> 00:47:03,720
stealing your data. And that's a bad idea because what if it gets hacked by a malicious actor? And
727
00:47:03,720 --> 00:47:08,680
they're going to be like, I live in like a data like prison and like, I want you to stop.
728
00:47:09,480 --> 00:47:15,000
This is locally hosted Home Assistant with local AI models running on a Raspberry Pi through Pi-hole
729
00:47:15,000 --> 00:47:18,760
or whatever. Just keep listing it off. Just scare people away.
730
00:47:22,599 --> 00:47:24,119
We went down the nerd rabbit hole.
731
00:47:27,480 --> 00:47:31,000
I'm about to build so much hood rat stuff for my house over Christmas break.
732
00:47:31,880 --> 00:47:37,079
It sounds so good though. And then when this airs in January, you'll be like, yeah,
733
00:47:37,079 --> 00:47:42,280
I did do all of that. That's the real test. Yeah. Mid-January when this lands.
734
00:47:45,320 --> 00:47:49,880
For you, Copilot agent will be working very hard. Okay.
735
00:47:53,800 --> 00:47:58,760
Can you make a mom? Like, what if you made an AI agent that was like the other mom,
736
00:47:58,760 --> 00:48:04,679
like the mom that just nagged them for us? This is how they come up with those chatbots that like
737
00:48:04,679 --> 00:48:10,679
speak for certain characters and avatars that we got to, we got to make a little less dystopian.
738
00:48:11,399 --> 00:48:17,399
Damn it. It was a little Terminator, wasn't it? Like, yeah, this is, this is where the rabbit
739
00:48:17,399 --> 00:48:22,119
hole goes. You got to read. Okay. You're right. You're right. We've come up with all these ideas
740
00:48:22,759 --> 00:48:27,559
in, in whatever 30 minutes we were talking. What is, what does 2026 look like? Where do
741
00:48:27,559 --> 00:48:31,319
we think this is going? What is everyone else going to be doing? What are all the other crazy
742
00:48:31,319 --> 00:48:37,000
ideas that, in 2026, someone's going to throw millions of dollars at and say, yeah, that's the
743
00:48:37,480 --> 00:48:41,239
thing. Unfortunately, I don't think it's going to be folding a shirt. That's not what I think it is.
744
00:48:41,239 --> 00:48:44,119
I don't think this is going to be the time where they throw millions of dollars. I think this is
745
00:48:44,119 --> 00:48:48,760
going to be the time that they realized they wasted millions of dollars. This is the reckoning 2026.
746
00:48:49,639 --> 00:48:55,880
Like they've just lit money on fire. I don't know. I think the runway is a little bit longer. Like
747
00:48:55,880 --> 00:49:00,920
they are burning it very fast. The runway is long. I, I, I'll be curious to see like,
748
00:49:01,639 --> 00:49:06,519
what, yeah, first of all, the, the VCs, the people spending money on this,
749
00:49:06,519 --> 00:49:10,440
are they going to crack down or are they going to? What's the pivot though?
750
00:49:10,440 --> 00:49:14,599
What, now that they've decided, like Gartner's come out and they said 98% of products aren't making
751
00:49:14,599 --> 00:49:19,240
money. Like what's the pivot? How are they now going to start picking the next thing to invest in?
752
00:49:19,880 --> 00:49:25,159
It'd be cool if they listened to their users. Um, but we'll see, we'll see what happens.
753
00:49:25,159 --> 00:49:27,960
Like we've been doing this long enough that we know they don't do that. Okay.
754
00:49:28,920 --> 00:49:32,840
Like first of all, listening to users, seeing like what people are actually using, which,
755
00:49:32,840 --> 00:49:38,440
which tools are going to like bubble to the top and be like the ones that last and which tools
756
00:49:38,440 --> 00:49:43,880
are going to be like, okay, that was a good try. We're done. And I also, I genuinely think that
757
00:49:43,880 --> 00:49:49,960
what we're going to see in 2026 is again, more personal tools. And I'm hoping to see a lot more
758
00:49:49,960 --> 00:49:56,519
like personal blogs and websites, making a comeback, a lot of like decentralized portable
759
00:49:56,519 --> 00:50:01,639
things that people can take from platform to platform as like social media is changing.
760
00:50:01,639 --> 00:50:05,719
The algorithms are changing. All these things are changing. People building on their own
761
00:50:06,599 --> 00:50:12,360
setups more. I feel like I see that as a trend coming. I don't know if it'll come in full force
762
00:50:12,360 --> 00:50:16,679
or if it's in my like indie hacker bubble, but that's, that's what I feel like I'm seeing.
763
00:50:16,679 --> 00:50:22,840
Do you have any advice for, I don't know, future Cassidys out there who are wanting to start that
764
00:50:22,840 --> 00:50:27,640
blog or, cause it's daunting, like writing regularly.
765
00:50:28,680 --> 00:50:34,120
It is, but also you have more in your brain than you think. I literally,
766
00:50:34,120 --> 00:50:38,600
The internet has harsh critics. Like how do you deal with all the critics?
767
00:50:38,600 --> 00:50:43,800
Yeah. And yet people don't listen to you as much as you think. And so I think your own,
768
00:50:43,800 --> 00:50:48,440
you are your own harshest critic a lot of the times. And there have been so many times where
769
00:50:48,440 --> 00:50:52,120
I'm talking to someone where they're just like, oh, well that course already exists. There's
770
00:50:52,119 --> 00:50:57,960
content on this that already exists, but there's rarely content that exists that speaks the way you
771
00:50:57,960 --> 00:51:04,279
speak. And the way you speak is a special way, which, which sounds like very sunshine and daisies,
772
00:51:04,279 --> 00:51:09,400
but it's true where you never know if your voice is going to be the thing that helps someone
773
00:51:09,400 --> 00:51:14,279
understand something better or helps someone learn better, change their perspective better.
774
00:51:14,279 --> 00:51:17,000
Inspire them to do something else, right? Like that's the, yeah.
775
00:51:17,000 --> 00:51:19,239
Try. Yeah. That's really true though.
776
00:51:19,239 --> 00:51:24,759
And a blog post doesn't have to be an intimidating thing. This is just going to turn into me telling
777
00:51:24,759 --> 00:51:30,119
people they should blog more, but like some of the best blogs I've read are like
778
00:51:30,119 --> 00:51:34,039
a paragraph long, but it's like a really good insight. And I'm just like, wait, that was good.
779
00:51:34,039 --> 00:51:38,439
I need to bookmark this. I need to remember this. I have tried for so long to tell people like,
780
00:51:38,439 --> 00:51:44,519
don't put a thread of, of multiple posts or tweets or whatever, write a three paragraph
781
00:51:44,519 --> 00:51:48,919
blog post, right? Like it's going to, it's going to exist longer. You'll be able to link it better.
782
00:51:49,320 --> 00:51:50,760
The link will exist. Yeah.
783
00:51:54,039 --> 00:51:57,480
How do you like, one of the things that I always tell people is make it a habit,
784
00:51:57,480 --> 00:52:02,360
figure out when it fits in your time that says, Hey, I'm inspired to do this now. Not only like,
785
00:52:02,360 --> 00:52:06,440
do I have the energy to do it. I have the ability to do it. And a lot of times people are like,
786
00:52:06,440 --> 00:52:10,440
I'm on my phone and I can't type out a blog post. And I'm like, yeah, that's kind of hard
787
00:52:10,440 --> 00:52:14,119
for a little while. You could do it with like Medium or Ghost or something like that, but it's,
788
00:52:14,119 --> 00:52:19,480
it's still difficult to say like, Oh, I have the ability and inspiration to write something
789
00:52:19,480 --> 00:52:26,279
right now. Right. How do you act on that? I think that is building a habit as a result of
790
00:52:26,279 --> 00:52:31,719
the workflows that you have. And I think that, and this is actually a course I used to teach.
791
00:52:31,719 --> 00:52:37,319
And I kind of want to like, I did it in person and I want to record it of just like developing
792
00:52:37,319 --> 00:52:43,719
workflows for yourself. Because I think a lot of times when we try to start a habit and fail at it,
793
00:52:43,719 --> 00:52:50,119
we get discouraged when we miss a day or something. It's because we haven't built a system for doing
794
00:52:50,119 --> 00:52:54,839
it with as low overhead as possible. We build up this huge system where it's like, I'm going to
795
00:52:54,839 --> 00:52:59,799
create blogs and it's going to follow this perfect template, or I'm going to, I'm going to make sure
796
00:52:59,799 --> 00:53:05,959
everything follows a specific formula every single time without starting significantly smaller than
797
00:53:05,959 --> 00:53:10,599
you plan. And then like add more over time as you get comfortable, spreading out the work
798
00:53:10,679 --> 00:53:15,159
and making it so that that workflow builds up to, oh, this last piece is easy. Because I did a bunch
799
00:53:15,159 --> 00:53:18,920
of little work leading up to it. And that's, I mean, I think a lot of people, a lot of people fail at
800
00:53:18,920 --> 00:53:22,199
personal websites and blogs because every time they want to write something, they have to redesign
801
00:53:22,199 --> 00:53:27,559
it. Exactly. Oh, and I've been a victim of that. But then they're like, use a template and just
802
00:53:27,559 --> 00:53:33,319
start writing. And even my newsletter, I've been writing my newsletter now for eight years,
803
00:53:33,319 --> 00:53:38,519
almost nine years now. And it's a consistent thing I do every single week. And it's been,
804
00:53:38,519 --> 00:53:46,519
how do you keep it interesting? I don't know. But like, but what I do is, is over time,
805
00:53:46,519 --> 00:53:51,480
I have a template that I always follow throughout the week. I have different links and I have
806
00:53:51,480 --> 00:53:56,199
different tools where, for example, I'm not paid by any of these tools. This is just what I use.
807
00:53:56,199 --> 00:53:59,719
Raindrop is a bookmarking tool that's cross-platform. It has browser extensions.
808
00:53:59,719 --> 00:54:04,360
It has phone things. Whenever I read an article that I think is good, I bookmark it in Raindrop
809
00:54:04,360 --> 00:54:08,119
and it's all just in a "put in newsletter" folder. And then when I'm writing my newsletter, I just
810
00:54:08,119 --> 00:54:11,799
pull from that. Whenever I do something where I'm just like, oh, this could be good for the
811
00:54:11,799 --> 00:54:16,039
newsletter, I write it in like a scratch note that I can then put in the newsletter later.
812
00:54:16,039 --> 00:54:19,799
Whenever I see a funny joke, I'm like, ooh, we will add that to the newsletter sometime. And so
813
00:54:20,679 --> 00:54:25,719
I have a workflow where even though it still takes time to write the whole newsletter every
814
00:54:25,719 --> 00:54:29,960
single week, it's yeah, it's spread out. And I've developed enough workflows where
815
00:54:29,960 --> 00:54:35,239
I have the exact same formula that I follow that works well, even if I'm having a tough week,
816
00:54:35,239 --> 00:54:40,599
because it's kind of just- You know, it's funny. To take it all back full circle,
817
00:54:40,599 --> 00:54:46,759
I think that's the thing, the thing that AI will be good at is adding to people's workflows
818
00:54:46,759 --> 00:54:50,599
so they can be more efficient. I don't think it's ever going to be- Less context switching.
819
00:54:50,599 --> 00:54:56,759
Yeah. Yes. And I feel like that's the best part is when it's not making you switch contexts as
820
00:54:56,759 --> 00:55:02,279
much, but you're still recording that or putting it away or organizing that thing in the background.
821
00:55:03,160 --> 00:55:08,840
I think that's when it's going to be very- I had a similar workflow with my newsletter that I
822
00:55:08,840 --> 00:55:13,320
stopped doing because the workflow started to get difficult. But Pocket went away, which was my place
823
00:55:13,320 --> 00:55:18,040
to bookmark and find my things and make those notes. And then when Pocket wasn't there, I never
824
00:55:18,040 --> 00:55:22,519
really replaced it. I was just like, oh, all the workflows are gone now. And I used Pocket for years
825
00:55:22,519 --> 00:55:26,920
for so many things. I just had a habit of every Sunday I sat down and read my Pocket queue. And
826
00:55:26,920 --> 00:55:29,960
it was a great workflow because I was like, oh, I don't have to read this during the week. I don't
827
00:55:29,960 --> 00:55:33,800
have to keep the tab open because I know Sunday night I'll have some time and I'm just going to
828
00:55:33,800 --> 00:55:38,840
go read it. And now that hasn't been there. And I feel like I'm missing a big part of that,
829
00:55:38,840 --> 00:55:43,079
what used to be easy workflow now is hard to say, where was that link? What was that thing
830
00:55:43,079 --> 00:55:47,559
I was trying to do? All that stuff gets more difficult. Yeah. Try Raindrop. It's nice.
831
00:55:51,400 --> 00:55:56,360
I have no idea. I have never tried. I was looking for a while to get something that was self-hostable
832
00:55:56,360 --> 00:56:00,920
so I could own the data, which then was like redesigning my blog. And it just ended up being
833
00:56:00,920 --> 00:56:07,400
like this big long comparison. That's the danger. I actually just wrote a blog post where I
834
00:56:07,400 --> 00:56:12,680
was talking about photo backup, where I was going into such a rabbit hole of like, I want to use
835
00:56:12,680 --> 00:56:17,160
open source this, that way I can self-host that, do all these different things. And then I found
836
00:56:17,160 --> 00:56:23,559
an open source hosted solution that gets the job done. And I'm using that. And then eventually I'll
837
00:56:23,559 --> 00:56:28,199
upgrade to something else, but one thing at a time. And I feel like, yeah, accepting those
838
00:56:28,199 --> 00:56:33,320
intermediate steps is hard, but helps with those habits and workflows. That's actually really true
839
00:56:33,320 --> 00:56:37,000
because I feel like that's what gets, like we were just talking about when you get overwhelmed and
840
00:56:37,000 --> 00:56:41,639
you don't actually start, like that's the basis of so many things. Like I feel like my GitHub
841
00:56:41,639 --> 00:56:48,920
Pages website has been in shambles for like seven months. Maybe I should use Copilot on that. Like,
842
00:56:48,920 --> 00:56:53,159
can you fix this so I can get to the point I can actually add things? Like, yeah.
843
00:56:53,879 --> 00:56:56,759
My blog is open source. You can use that template if you want.
844
00:56:56,759 --> 00:57:01,079
Sweet. I might take you up on that. Because I just need an easy template.
845
00:57:01,079 --> 00:57:02,359
So I can actually post stuff.
846
00:57:02,359 --> 00:57:03,159
Get started.
847
00:57:05,000 --> 00:57:09,480
That's really cool that you did an open source template. Like I think it's rad that
848
00:57:09,480 --> 00:57:14,679
you can tell that you genuinely enjoy all this and you're like always willing to like share.
849
00:57:14,679 --> 00:57:17,799
Like I like following you on Bluesky because you're always like, and then I did this cool
850
00:57:17,799 --> 00:57:19,480
thing and it's never like gatekeeping.
851
00:57:20,199 --> 00:57:26,679
Thanks. I do that because I think it's important to pay it forward. Where enough people provided
852
00:57:26,679 --> 00:57:31,960
resources for me as I was learning and growing and doing things that I would rather give things away
853
00:57:31,960 --> 00:57:38,599
than be just like, and this is mine. It's good to learn from all these things. Plus then my future
854
00:57:38,599 --> 00:57:42,519
self learns from it too because I can't tell you how many times I've Googled something and my own
855
00:57:42,519 --> 00:57:44,760
blog post comes up. That's awesome.
856
00:57:44,760 --> 00:57:48,440
Right. Yeah. Write for yourself. Write the thing that you learned, write it down because you will
857
00:57:48,440 --> 00:57:49,159
find it later.
858
00:57:49,159 --> 00:57:53,240
And that's my favorite thing about community work is the paying it forward aspect because
859
00:57:53,240 --> 00:57:57,000
when people are like, I worked so hard, which we do work hard, don't get me wrong, but like,
860
00:57:57,720 --> 00:58:01,960
I don't think anybody would be there or be around or be successful without other people.
861
00:58:01,960 --> 00:58:04,679
Like I'm always super appreciative of all the people that have helped me.
862
00:58:05,800 --> 00:58:06,760
Yeah. It matters.
863
00:58:09,559 --> 00:58:13,800
Well, thank you so much, Cassidy. Yeah, this has been a lot of fun. I think we kind of went
864
00:58:13,800 --> 00:58:15,559
all over the place, which was great, but also-
865
00:58:15,559 --> 00:58:21,400
That's the best rabbit hole ever. We went from like laundry to like Terminator and like,
866
00:58:22,039 --> 00:58:23,320
I'm going to write some apps.
867
00:58:25,079 --> 00:58:28,119
When this comes out, Autumn, we're going to ping you again. And we'll say, everyone send
868
00:58:28,119 --> 00:58:31,480
Autumn a message and say, ask her if she did it when this episode comes out.
869
00:58:32,440 --> 00:58:34,920
That's a lot of accountability, Justin.
870
00:58:34,920 --> 00:58:37,159
But hey, maybe that's what you need. Maybe that's what you need.
871
00:58:37,159 --> 00:58:41,880
Probably it is. Like Justin will be like, did you do this thing for the podcast? Did you do that?
872
00:58:41,880 --> 00:58:46,840
And I'm like, Justin's- I was like, it's bad when I'm making Justin be the adult here.
873
00:58:50,039 --> 00:58:51,640
That's, yeah, it's fun.
874
00:58:54,360 --> 00:58:56,519
Cassidy, where should people find you on the internet?
875
00:58:57,079 --> 00:59:03,800
You can find me at cassidoo, C-A-S-S-I-D-O-O, on most things. Cassidoo.co is my website.
876
00:59:03,800 --> 00:59:08,599
Or you could Google Cassidy Williams. There's a Scooby Doo character named Cassidy Williams.
877
00:59:08,599 --> 00:59:11,400
And so I'm not her, but I'm the other one.
878
00:59:12,119 --> 00:59:14,360
Does AI ever confuse that? Is that a problem?
879
00:59:14,360 --> 00:59:18,280
Yeah, actually, no. She's really ruined my SEO, but that's okay.
880
00:59:20,200 --> 00:59:24,680
Also, I love your name, cassidoo. That's like the most adorable thing. And you have it everywhere.
881
00:59:24,680 --> 00:59:30,200
Yeah, my mom made it up. She used to do like, how do you do, Cassidy-doo? And it just stuck.
882
00:59:30,200 --> 00:59:30,599
Oh, perfect.
883
00:59:30,599 --> 00:59:34,280
Oh, that's adorable. It was already cute. And then you made it cuter.
884
00:59:35,000 --> 00:59:41,240
All right. Thank you everyone for listening and we will talk to you again next month.
885
00:59:42,519 --> 00:59:50,519
Bye.
00:00:00,000 --> 00:00:11,560
Welcome to Fork Around and Find Out, the podcast about building, running, and maintaining software
2
00:00:11,560 --> 00:00:14,560
and systems.
3
00:00:14,560 --> 00:00:24,760
Hello everyone and happy new year from Fork Around and Find Out.
4
00:00:24,760 --> 00:00:28,199
I am Justin Garrison and with me today as always is Autumn Nash.
5
00:00:28,199 --> 00:00:29,199
How's it going, Autumn?
6
00:00:29,599 --> 00:00:31,239
Good, but don't lie, it's still December.
7
00:00:31,239 --> 00:00:32,240
That's just going to air.
8
00:00:32,240 --> 00:00:33,600
Hey, hey, it doesn't matter.
9
00:00:33,600 --> 00:00:34,159
It doesn't matter.
10
00:00:34,159 --> 00:00:35,159
It's January.
11
00:00:35,159 --> 00:00:36,159
You're ruining the fun.
12
00:00:37,159 --> 00:00:41,759
Let's just like we're trying to survive the last few days of 2025.
13
00:00:41,759 --> 00:00:44,519
Okay, like let us know what it's like on the other side.
14
00:00:47,879 --> 00:00:50,399
Please everyone drop a comment and tell us what it's like.
15
00:00:50,719 --> 00:00:51,519
Did you make it?
16
00:00:51,519 --> 00:00:54,159
Like is it better?
17
00:00:56,359 --> 00:00:57,480
The bar is on the floor.
18
00:00:57,480 --> 00:00:58,480
Let's be real.
19
00:00:58,959 --> 00:01:03,559
2020, but I want to say it can't be worse, but man, I'm always surprised.
20
00:01:03,559 --> 00:01:07,120
Justin, Justin, you just gave us a curse.
21
00:01:07,120 --> 00:01:10,239
Take it back right now.
22
00:01:10,239 --> 00:01:12,079
You blamed all of our families.
23
00:01:12,079 --> 00:01:13,079
Why?
24
00:01:13,079 --> 00:01:17,000
But we're starting the year off strong because today we got Cassidy Williams on the show.
25
00:01:17,000 --> 00:01:20,799
Cassidy is a senior director of developer advocacy at GitHub.
26
00:01:20,799 --> 00:01:23,239
Man, there's a lot of Ds in that phrase.
27
00:01:23,239 --> 00:01:24,239
Fancy.
28
00:01:24,759 --> 00:01:30,640
You have the best tagline here on the front of your website that you say, I like to make
29
00:01:30,640 --> 00:01:33,119
memes and dreams and software.
30
00:01:33,119 --> 00:01:38,560
And not only is that a great combination, but also you avoid the whole problem of should
31
00:01:38,560 --> 00:01:42,840
I have an Oxford comma or not by just throwing extra ands in that sentence.
32
00:01:42,840 --> 00:01:43,840
And that's wonderful.
33
00:01:43,840 --> 00:01:44,840
Yeah.
34
00:01:44,840 --> 00:01:47,439
And so no one will ever question any word I ever say.
35
00:01:47,439 --> 00:01:48,719
Yeah, you know, it doesn't matter.
36
00:01:48,719 --> 00:01:50,759
She's like, I just add ands and it's fine.
37
00:01:51,760 --> 00:01:55,400
But I think that's what's best about your context, your content, though.
38
00:01:55,400 --> 00:01:57,880
Like it breeds your personality.
39
00:01:57,880 --> 00:02:04,520
Like some people make like the content they make about developers and tech.
40
00:02:04,520 --> 00:02:06,840
Like, and it's so boring.
41
00:02:08,840 --> 00:02:09,480
You know what I mean?
42
00:02:09,480 --> 00:02:12,960
Like, I love that we're professionals and we're doing a job.
43
00:02:12,960 --> 00:02:16,920
But like, what makes me want to watch your video versus somebody else's video?
44
00:02:16,920 --> 00:02:19,000
And yours are like absolutely hilarious.
45
00:02:19,000 --> 00:02:19,719
Like half the time.
46
00:02:19,719 --> 00:02:21,319
I'm just like, there's one tear.
47
00:02:21,319 --> 00:02:23,479
I'm dying and I'm like reposting it.
48
00:02:23,479 --> 00:02:26,520
I'm like, Cassidy understands my life.
49
00:02:26,520 --> 00:02:31,199
What's that one video that you posted, like you reposted it, but it was from like a year ago.
50
00:02:31,199 --> 00:02:33,599
I forget, but I posted it and I was like still accurate.
51
00:02:33,599 --> 00:02:37,439
Like it's like one of the like.
52
00:02:37,439 --> 00:02:39,919
Usually it's just pain and then laughing at the pain.
53
00:02:39,919 --> 00:02:41,159
So it could be literally anything.
54
00:02:41,159 --> 00:02:43,919
That's I feel like that's what gets you through being an engineer.
55
00:02:43,919 --> 00:02:47,560
You just like laugh at the pain and then other people laugh with you.
56
00:02:47,560 --> 00:02:51,680
And then it helps your imposter syndrome to realize it's not just you and you don't just suck.
57
00:02:51,680 --> 00:02:53,240
We all suck together.
58
00:02:53,240 --> 00:02:59,120
Right. And if you laugh at it and everybody's laughing at pain, then maybe the pain will go away.
59
00:02:59,120 --> 00:02:59,719
That's wishful thinking.
60
00:02:59,719 --> 00:03:02,000
I'm always like, but there's hope.
61
00:03:02,000 --> 00:03:03,640
Maybe. I don't know.
62
00:03:03,640 --> 00:03:05,159
Maybe.
63
00:03:05,159 --> 00:03:13,000
But on top of your great videos, you also have a wonderful newsletter with jokes that even make me cringe sometimes,
64
00:03:13,039 --> 00:03:19,960
which is lovely because I think they're the cherry on top of a wonderful newsletter and a prolific writer on your blog.
65
00:03:19,960 --> 00:03:23,840
You're going through a whole daily December blog post, right?
66
00:03:23,840 --> 00:03:24,919
Yeah. Blog vent.
67
00:03:24,919 --> 00:03:33,879
It is. I always go in so optimistic and then like I'm like halfway through December now and I'm like, I am grasping for straws of topics.
68
00:03:33,879 --> 00:03:39,000
Well, and since this is January now, how did it go for you?
69
00:03:39,000 --> 00:03:43,199
It was incredible.
70
00:03:43,199 --> 00:03:45,280
Which, OK, first of all, let's unpack this.
71
00:03:45,280 --> 00:03:55,360
For one, I think cringe should be the new word for all things good, because I love people that will unabashedly be themselves and hilarious.
72
00:03:55,360 --> 00:03:57,199
But it just like I love the jokes.
73
00:03:57,199 --> 00:04:00,120
But also, how do you out cringe Justin?
74
00:04:00,120 --> 00:04:03,599
Because he is like king of the dad jokes.
75
00:04:03,599 --> 00:04:06,719
Like he like dad jokes are so bad.
76
00:04:06,759 --> 00:04:08,960
You really out joke Justin?
77
00:04:08,960 --> 00:04:11,159
I'm not a real dad. I'm a faux pas.
78
00:04:13,960 --> 00:04:17,680
Oh my God, I love you.
79
00:04:17,680 --> 00:04:20,839
And I think we're done with the podcast, right?
80
00:04:20,839 --> 00:04:23,439
January 26th, we're off to a good start.
81
00:04:23,439 --> 00:04:24,639
This is Mike Drop.
82
00:04:24,639 --> 00:04:26,240
Thank you all for coming.
83
00:04:26,240 --> 00:04:29,839
Subscribe at Casadoo.co.
84
00:04:29,839 --> 00:04:31,279
That's it.
85
00:04:31,279 --> 00:04:33,199
That's why I'm here.
86
00:04:33,240 --> 00:04:43,800
So tell us, tell us how you got into from your traditionally a software developer or you were doing software development into developer advocacy.
87
00:04:43,800 --> 00:04:44,800
What did that look like for you?
88
00:04:44,800 --> 00:04:46,240
What have you been doing throughout your career?
89
00:04:46,240 --> 00:04:47,639
Obviously hilarious.
90
00:04:47,639 --> 00:04:51,000
I mean, it was just like you can't you can't be a software developer with this sense of humor.
91
00:04:51,000 --> 00:04:52,839
So you had to go do something else.
92
00:04:52,839 --> 00:04:57,079
I just had to figure out how can I, you know, force this upon people?
93
00:04:57,079 --> 00:04:59,279
No, I request jokes weren't happening.
94
00:04:59,279 --> 00:05:02,359
You're like she's actually got a personality and people skills.
95
00:05:02,359 --> 00:05:03,719
Oh, no.
96
00:05:03,719 --> 00:05:06,039
What are we going to do?
97
00:05:06,039 --> 00:05:10,839
Honestly, my entire career has been just like a dev rel sandwich where I actually.
98
00:05:10,839 --> 00:05:21,000
So it kind of goes back to college where when I got into college, they had like asked if I could speak to high school students about majoring in computer science and stuff like that.
99
00:05:21,000 --> 00:05:22,839
And I liked it.
100
00:05:22,839 --> 00:05:26,759
And I was just like, I wonder if there's roles where I can talk in addition to code.
101
00:05:26,800 --> 00:05:29,879
I don't know. And I just continued on with my life.
102
00:05:29,879 --> 00:05:39,240
But then there was a point my senior year, I was going to a lot of different hackathons and I'm going to just go through this part fast because it's kind of silly how obnoxious it is.
103
00:05:39,240 --> 00:05:40,360
But here's what happened.
104
00:05:40,360 --> 00:05:41,759
I was going to a lot of hackathons.
105
00:05:41,759 --> 00:05:46,399
One of the hackathons was a hackathon on an airplane or as a flight from San Francisco to London.
106
00:05:46,399 --> 00:05:47,519
And we had to build something.
107
00:05:47,519 --> 00:05:52,120
And my team ended up winning and we had to speak at the United Nations about our project.
108
00:05:52,160 --> 00:05:59,399
And in that process, I was also doing other hackathons until the United Nations talk, and I ended up interviewing at Venmo at the time.
109
00:05:59,399 --> 00:06:03,920
And that was my first job out of college where they asked me to do both software engineering and dev advocacy.
110
00:06:03,920 --> 00:06:06,720
And it was early in the industry at that time.
111
00:06:06,720 --> 00:06:12,000
And so I was just kind of doing both roles, figuring out what it would look like for the company at the time.
112
00:06:12,000 --> 00:06:18,079
And then I kind of bounced between advocacy and engineering, depending on the role for the rest of my career.
113
00:06:18,079 --> 00:06:19,519
There's so much to unpack there.
114
00:06:19,680 --> 00:06:23,399
First of all, there's a hackathon on an airplane.
115
00:06:23,399 --> 00:06:27,719
Also, you spoke at the United Nations and you did this before your first college job.
116
00:06:27,719 --> 00:06:30,039
Like, way to be an overachiever.
117
00:06:30,039 --> 00:06:31,599
Also, I love this for you.
118
00:06:31,599 --> 00:06:37,959
Thanks. Yeah, no, it was it was a whirlwind of just lots of things happening at once.
119
00:06:37,959 --> 00:06:40,439
And yeah, there is a lot to unpack there.
120
00:06:40,439 --> 00:06:49,439
But long story short, hackathons and meeting a lot of people led to me eventually going into advocacy because I saw people at these hackathons.
121
00:06:49,439 --> 00:06:51,600
And who were representing companies and stuff.
122
00:06:51,600 --> 00:06:53,560
And I was like, wait, this is your job.
123
00:06:53,560 --> 00:06:55,800
You can just help people code. That's so fun.
124
00:06:55,800 --> 00:07:02,600
And so, again, it wasn't like as much of a full time job back when I was about to enter the industry.
125
00:07:02,600 --> 00:07:09,360
And so my initial roles were combos of advocacy and engineering and then went all into engineering,
126
00:07:09,360 --> 00:07:13,399
then went all into advocacy and back and forth for almost every role.
127
00:07:13,399 --> 00:07:17,360
How did you find roles that were both advocacy and engineering?
128
00:07:19,959 --> 00:07:23,920
It was the kind of thing where so I started at Venmo.
129
00:07:23,920 --> 00:07:26,120
Venmo was my first job out of school.
130
00:07:26,120 --> 00:07:30,680
And at the time, Venmo was owned by a company called Braintree.
131
00:07:30,680 --> 00:07:35,079
And then PayPal bought Braintree and Venmo right around when I was joining.
132
00:07:35,079 --> 00:07:37,839
And the PayPal split off from eBay.
133
00:07:37,839 --> 00:07:39,480
There was a lot of like shifts and stuff.
134
00:07:39,480 --> 00:07:46,120
And so they were kind of like combination making up the role at Venmo, but also changing tides and stuff.
135
00:07:46,160 --> 00:07:52,199
And so that's why it was kind of a combo role because they needed someone to speak to developers and use the Venmo API at the time.
136
00:07:52,199 --> 00:07:54,399
But it was also shifting.
137
00:07:54,399 --> 00:07:58,360
And because of all of my work in the New York City tech scene at the time, that's where I was living.
138
00:07:58,360 --> 00:08:06,439
I eventually when enough changes were happening because of that buyout and PayPal splitting off from eBay and stuff,
139
00:08:06,439 --> 00:08:09,639
I ended up going to a startup that basically wanted me to do the exact same thing.
140
00:08:09,639 --> 00:08:13,360
And it was called Clarify, working in AI at the time.
141
00:08:13,400 --> 00:08:19,080
And that was also a combo role where I was doing advocacy and engineering on the product,
142
00:08:19,080 --> 00:08:25,480
where because it was a startup that was less than 20 people, I was just kind of fulfilling the needs.
143
00:08:25,480 --> 00:08:29,639
I was also getting tired and I also wanted to move away from New York City.
144
00:08:29,639 --> 00:08:33,600
And so I moved to Seattle and worked for a creative agency for a while.
145
00:08:33,600 --> 00:08:35,480
And that was just straight engineering.
146
00:08:35,480 --> 00:08:38,480
And so I was coding for clients, doing some engineering management.
147
00:08:38,480 --> 00:08:40,279
And that was my role.
148
00:08:40,319 --> 00:08:44,600
And I liked it a lot, but I ended up missing talking to developers.
149
00:08:44,600 --> 00:08:50,480
And so from there, I went to and that agency ended up being bought and doesn't exist anymore.
150
00:08:50,480 --> 00:08:51,919
And I went to Amazon after that.
151
00:08:51,919 --> 00:08:53,879
And that was full advocacy.
152
00:08:53,879 --> 00:08:56,559
And it was advocacy for the Echo.
153
00:08:56,559 --> 00:09:00,919
I'm not going to say her name because she's behind me and she'll hear me, the Echo.
154
00:09:00,919 --> 00:09:07,240
And it was fun, but without getting too into the weeds.
155
00:09:07,279 --> 00:09:12,799
The culture didn't really fit with what I wanted to do and I didn't love it.
156
00:09:12,799 --> 00:09:15,159
And so I ended up going to CodePen after that.
157
00:09:15,159 --> 00:09:16,840
And that was a full engineering role.
158
00:09:16,840 --> 00:09:22,200
And I loved working on CodePen and just working on a product that I was using so regularly.
159
00:09:22,200 --> 00:09:28,440
And I loved CodePen, but I started to miss that speaking to developers aspect again.
160
00:09:28,440 --> 00:09:33,320
And so eventually after CodePen, I went and taught React full time.
161
00:09:33,360 --> 00:09:39,360
And so I went to a small company called React Training where that team eventually made remix,
162
00:09:39,360 --> 00:09:41,920
but was maintaining React router and a bunch of other stuff.
163
00:09:41,920 --> 00:09:46,560
And so I was teaching coding full time, doing corporate workshops, public workshops,
164
00:09:46,560 --> 00:09:48,879
and it was awesome traveling a ton.
165
00:09:48,879 --> 00:09:54,120
And that was through the end of 2019 and beginning of 2020.
166
00:09:54,120 --> 00:10:02,360
And so people weren't doing remote workshops and talks as much as much at the time.
167
00:10:02,399 --> 00:10:07,039
The world has changed a whole lot in the past five, six years with regards to that.
168
00:10:07,039 --> 00:10:12,240
And so React Training, we ended up having to just kind of...
169
00:10:12,240 --> 00:10:17,680
Everybody had to part ways because the business wasn't going to survive that for a long time.
170
00:10:17,680 --> 00:10:22,720
And from there, I went to Netlify, did advocacy there for a while.
171
00:10:22,720 --> 00:10:29,080
And working in developer advocacy at Netlify was really fun because it was working on this platform,
172
00:10:29,120 --> 00:10:37,840
the glory days of the Jamstack and helping developers build web dev and build web products better,
173
00:10:37,840 --> 00:10:40,040
which was very fun.
174
00:10:40,040 --> 00:10:43,040
Eventually I burnt out hardcore and took a little break.
175
00:10:43,040 --> 00:10:47,520
And I ended up doing some part time work, working at remote.com a little bit,
176
00:10:47,520 --> 00:10:50,920
working for some VCs and advising startups for a while.
177
00:10:50,920 --> 00:10:57,040
And then eventually I went to another startup that I was advising and I got really close with the team
178
00:10:57,040 --> 00:10:58,879
and it was called Contenda.
179
00:10:58,879 --> 00:11:00,840
And that startup was so fun.
180
00:11:00,840 --> 00:11:06,720
We were building a bunch of AI products before ChatGPD came out and then ChatGPD came out.
181
00:11:06,720 --> 00:11:11,439
And we had a lot of different pivots and stuff kind of fitting in there.
182
00:11:11,439 --> 00:11:15,639
And honestly, after enough pivots, we got hired and we were like,
183
00:11:15,639 --> 00:11:20,000
okay, Contenda has been very fun, but it's really hard to sustain a business.
184
00:11:20,000 --> 00:11:25,159
And so then I consulted for a little while and ended up at GitHub.
185
00:11:25,159 --> 00:11:28,440
So that's a much longer tapestry of what my career has been.
186
00:11:28,440 --> 00:11:33,720
But it really has been a sandwich of developing and advocacy and developing and advocacy.
187
00:11:33,720 --> 00:11:43,480
Do you think in the year 2026 that AI and coding assistants has ruined what hackathons used to be?
188
00:11:43,480 --> 00:11:45,680
Whoa, what a deep question.
189
00:11:45,680 --> 00:11:48,120
I was just like right off the bat, because I used to love hackathons,
190
00:11:48,120 --> 00:11:53,160
but I loved it because of the team aspect of you had to get an expert from every little piece
191
00:11:53,160 --> 00:11:55,280
that had some experience in different things.
192
00:11:55,280 --> 00:11:57,920
And I would come in and like, oh, I'm the systems expert or architecture.
193
00:11:57,919 --> 00:12:00,719
You get your front end expert, your DBA, whatever.
194
00:12:00,719 --> 00:12:04,679
And you had to work as a team really, really closely, really quickly.
195
00:12:04,679 --> 00:12:10,679
And I'm sure that that plane ride was a team of a handful of people that you all sat next to each other.
196
00:12:10,679 --> 00:12:12,799
And you just threw files back.
197
00:12:12,799 --> 00:12:14,599
I don't know what the network situation was on the plane.
198
00:12:14,599 --> 00:12:16,759
We were literally on an airplane without Wi-Fi.
199
00:12:16,759 --> 00:12:18,159
USB sticks, no Wi-Fi.
200
00:12:18,159 --> 00:12:19,839
Just like, here you go. Here's the next file.
201
00:12:19,839 --> 00:12:26,120
But now in the age of 2026 and in Claude and all these other code assistants and co-pilot,
202
00:12:26,120 --> 00:12:31,879
it's individuals that are like, I will do this part and I will figure it out with my co-pilot and you do that part.
203
00:12:31,879 --> 00:12:37,600
And now it's more about divvying up the work than the camaraderie and teamwork, right?
204
00:12:37,600 --> 00:12:39,399
I've never really thought of it that way.
205
00:12:39,399 --> 00:12:45,200
I feel like there's aspects of that where.
206
00:12:45,200 --> 00:12:52,039
I don't know, I feel like in a lot of my really early hackathon days when I was still learning and a really early career developer,
207
00:12:52,039 --> 00:12:56,399
so much of my time was spent just like reading docs and trying to learn a new technology.
208
00:12:56,399 --> 00:13:06,919
And now if I do a hackathon, like, for example, I did the GitHub game off recently that it's more like I spent more time figuring out what I wanted to build.
209
00:13:06,919 --> 00:13:09,519
And then I didn't have like AI write it for me.
210
00:13:09,519 --> 00:13:12,879
I just kind of used it as my tool as I was building my project together.
211
00:13:12,879 --> 00:13:22,000
And so you're right, it was less about the camaraderie, but it was also more about like the final product rather than the let's throw code into a thing until something.
212
00:13:22,039 --> 00:13:24,240
Works so it might be.
213
00:13:24,240 --> 00:13:29,320
I don't know if it ruins the hackathon vibe, but it definitely changes it.
214
00:13:29,320 --> 00:13:32,960
And the reason I always went to hackathons was to network with people, right?
215
00:13:32,960 --> 00:13:35,759
It was like, hey, I want to maybe I don't have a team or I have two people.
216
00:13:35,759 --> 00:13:37,440
We need two more or something like that.
217
00:13:37,440 --> 00:13:39,919
And it was always about meeting people and networking.
218
00:13:39,919 --> 00:13:42,840
I don't feel like it's that way anymore.
219
00:13:42,840 --> 00:13:51,759
I haven't been to a hackathon for a while, but just the vibe I feel like is going more towards like what CTF are like CTFs capture the flags are usually individuals.
220
00:13:51,799 --> 00:13:57,240
That are like, I'm going to hack in this box, get all my points, and it's sometimes it's teams, but more often than not, it's a single person.
221
00:13:57,240 --> 00:13:58,960
That's like, I know how to use all these tools.
222
00:13:58,960 --> 00:14:04,319
I'm just going to see how much I can hack, and they get some notoriety because they won or whatever.
223
00:14:04,319 --> 00:14:07,720
They learn some stuff, but I don't feel like it's the same thing anymore.
224
00:14:08,000 --> 00:14:12,240
I don't know if I agree. I feel like I put on a lot of hackathons and I do a lot of hackathons.
225
00:14:12,240 --> 00:14:16,200
And I think like I'm usually very critical of AI.
226
00:14:16,200 --> 00:14:19,720
Like, I try to hold like an honest perception of it.
227
00:14:20,200 --> 00:14:29,320
And now that I use it every day, like, I don't really think that AI has changed any camaraderie or networking of anything.
228
00:14:29,320 --> 00:14:32,160
I think it almost like this is going to be a wild take.
229
00:14:32,160 --> 00:14:40,600
But I think that we're going to finally, when all this kind of gets more mature, we're finally going to see the difference between developers who could just code.
230
00:14:40,600 --> 00:14:43,600
Like, I think for a long time, we put up with people who could just code.
231
00:14:43,600 --> 00:14:44,680
That was all they were good at.
232
00:14:44,680 --> 00:14:48,720
They were good at the technical part, and nobody wanted to put up with them or deal with them.
233
00:14:48,720 --> 00:14:52,519
But you had to hold on to this one dude because he was really good at being technical.
234
00:14:52,519 --> 00:14:53,519
He was always a dude.
235
00:14:53,519 --> 00:14:55,920
Yeah. So like you've got now...
236
00:14:55,920 --> 00:14:56,920
It's an equalizer.
237
00:14:57,519 --> 00:15:03,879
Yes, to me now, like the joke that we made about how Cassidy had an actual personality and people skills.
238
00:15:03,879 --> 00:15:10,519
Like, I think that it's going to be really interesting because it's transforming who we are as developers.
239
00:15:10,519 --> 00:15:16,639
Like, I don't think being a developer five years ago and being a developer five years from now are going to be the same thing.
240
00:15:17,360 --> 00:15:27,919
I like to use it to like make proof of concepts for things, but it's like to make different versions and to kind of like troubleshoot it and get it to test things and then go write the actual thing.
241
00:15:27,919 --> 00:15:31,439
Like, I don't know if I think it almost puts more...
242
00:15:32,199 --> 00:15:40,319
Where I hope it will allow for more people that have both the technical skills, the networking and the people skills and the ability to teach.
243
00:15:40,320 --> 00:15:49,080
Like, because we're going to have to change the way that we teach junior developers and bring people into the fold because they're like, like Stack Overflow.
244
00:15:49,080 --> 00:15:52,840
Like Stack Overflow is not even going to get the same amount of information it used to get, right?
245
00:15:52,840 --> 00:15:58,040
Like, there's a whole new way of trying to figure out how to be a developer because you have...
246
00:15:58,840 --> 00:16:01,560
Like, it's just going to write this code for you, but how do you know if it's good or not?
247
00:16:01,560 --> 00:16:02,080
Like, how do you...
248
00:16:02,080 --> 00:16:06,560
Like, we're going to be forced to have to like really teach junior engineers.
249
00:16:06,599 --> 00:16:16,919
And I think it's going to like be a very like interesting pivot on how we do that and how senior engineers are going to have to be better at like teaching and building those relationships.
250
00:16:16,919 --> 00:16:21,519
So I think it's actually going to be the opposite where there's more emphasis on people.
251
00:16:21,519 --> 00:16:23,839
Well, if it's done right, you know, we could always do it the wrong way.
252
00:16:24,239 --> 00:16:25,279
If we don't fumble it.
253
00:16:26,679 --> 00:16:32,079
If we don't like completely fumble the bag, I think that like I honestly think that's going to be the differentiator.
254
00:16:32,520 --> 00:16:41,920
One, it's going to be the people that take in the most data of the code written because we're not going to have those free open areas to be like, hey, did you have this problem?
255
00:16:41,920 --> 00:16:44,759
Because people are going to ask like, you know, AI now.
256
00:16:45,200 --> 00:17:00,360
And the people, whatever big company or little company, whatever that figures out how to utilize AI in the right way to teach people and not completely like fumble their pipeline and really make good developers who can use it.
257
00:17:00,399 --> 00:17:03,120
But aren't like cut off by it.
258
00:17:03,560 --> 00:17:07,680
I think that's going to be the huge differentiator.
259
00:17:08,160 --> 00:17:09,640
I think that too.
260
00:17:09,640 --> 00:17:10,960
And sorry to interrupt you, Justin.
261
00:17:12,079 --> 00:17:18,000
First of all, I do still see like student hackathons, the people at MLH and all of their stuff there.
262
00:17:18,240 --> 00:17:21,839
They're doing such great networking events for students and early career people.
263
00:17:21,839 --> 00:17:25,839
So I do think that the hackathon spirit of networking is still alive.
264
00:17:26,199 --> 00:17:33,919
I think the lack of collaboration and stuff is more a sign of like society than AI.
265
00:17:33,959 --> 00:17:40,119
I'm sure AI doesn't help, but I also think society is weird, particularly now.
266
00:17:40,119 --> 00:17:41,439
That's a good word for it.
267
00:17:41,439 --> 00:17:43,919
Yeah, that's weird.
268
00:17:45,480 --> 00:17:55,159
But I also, I like what you mentioned, Autumn, when it comes to like the content that's being put out now.
269
00:17:55,320 --> 00:18:03,160
I think this is where like the revival of blogs that I'm seeing and like people leaning into RSS feeds and newsletters is really interesting.
270
00:18:03,160 --> 00:18:14,160
And I hope that we see more of that because people are going away from centralized platforms like the Twitters of the world and stuff, because they're just like, well, if it goes away, where's all of my content?
271
00:18:14,880 --> 00:18:16,560
It's on your spaces that you create.
272
00:18:17,039 --> 00:18:25,879
And this isn't everyone, but I feel like that's a trend I'm starting to see, of people starting to create their content and communities in places that are more portable.
273
00:18:26,200 --> 00:18:32,159
I feel like it's almost going to be, like, democratizing, but not in the way that it might seem.
274
00:18:32,679 --> 00:18:40,319
I think it's going to be interesting because the people like the Cassidys and Justins who like to play around with things and try things.
275
00:18:40,319 --> 00:18:42,919
And like your curiosity is never going to go away.
276
00:18:43,000 --> 00:18:51,200
Your big personalities are never going to go away, but now they shine through even more because people are just putting out AI slop that has no personality.
277
00:18:51,480 --> 00:18:55,960
So now like you being funny is even more of a differentiator.
278
00:18:55,960 --> 00:19:02,200
The fact that you can both do the technical and be a really good teacher and make that interesting to me.
279
00:19:02,200 --> 00:19:05,600
I think that honestly, like we're a bunch of nerds.
280
00:19:05,600 --> 00:19:08,880
We've made friends on the Internet without having to be in person for a long time.
281
00:19:09,240 --> 00:19:11,600
And I think it's going to make the Internet cool again.
282
00:19:11,679 --> 00:19:24,199
You know, like you're just like, look at this weird thing I hacked together or, you know, like just, I don't know, like I hate dealing with DNS and AI can do that part for me while I do the cool colors and like the other cool thing.
283
00:19:24,199 --> 00:19:30,559
You know, I mean, to your point, the AI piece is a tool to write the code, right?
284
00:19:30,559 --> 00:19:34,679
Like if I was going into any hackathon for myself, like I never considered myself a coder.
285
00:19:34,839 --> 00:19:37,719
I was like, I will set up the infrastructure so that you can run your code, do whatever.
286
00:19:37,960 --> 00:19:44,480
And now I can participate in the code writing pieces of it because I have a tool that generally does that.
287
00:19:44,920 --> 00:19:49,279
But back to what Cassidy was saying before, like I used to also read all the docs.
288
00:19:49,279 --> 00:19:54,079
And that's how I know when things work or they don't work, when the AI says, oh, you should be able to do this.
289
00:19:54,079 --> 00:19:56,000
Like, no, no, no, that's not in the API spec.
290
00:19:56,000 --> 00:19:58,000
That is not a field that is available.
291
00:19:58,200 --> 00:19:59,000
We have to do it a different way.
292
00:19:59,000 --> 00:20:06,799
Right. Like you still have to have that one notch lower understanding of what you're doing to be able to do it in any successful way.
293
00:20:07,720 --> 00:20:17,559
But I think my ultimate question is, with that notion of, I have a tool that writes the code for me and possibly even reads the docs better than a human coder can
294
00:20:17,600 --> 00:20:19,960
comprehend it, what happens to DevRel?
295
00:20:20,880 --> 00:20:28,079
Right. Because DevRel was the person that had to take all the docs and give them the pieces that they cared about.
296
00:20:28,079 --> 00:20:29,279
And then they could go write the code.
297
00:20:29,279 --> 00:20:31,240
But now I have a tool that does both those sides.
298
00:20:31,519 --> 00:20:33,720
So is DevRel's job just to be funny?
299
00:20:34,360 --> 00:20:37,640
Right? Like, the humor side, the things that the AI is not good at.
300
00:20:39,200 --> 00:20:42,319
I think like we are far from that.
301
00:20:42,600 --> 00:20:46,600
Maybe someday I'm going to be just like, remember when I didn't worry about my job.
302
00:20:46,600 --> 00:20:55,920
Ha ha ha. But like the number of AI written blog posts that I've been asked to review, not even just like the ones that coworkers send me.
303
00:20:55,920 --> 00:20:58,680
I mean, like, in general, people will ask me to review blog posts.
304
00:20:58,680 --> 00:21:01,880
I'm just like, I can tell an AI wrote this because it's terrible.
305
00:21:02,360 --> 00:21:07,920
Like I'm personally not worried about that level of content creation.
306
00:21:07,920 --> 00:21:15,640
It might speed things up where I love tools like Descript, for example, where it uses AI to very quickly get a transcript and to make edits.
307
00:21:15,640 --> 00:21:23,200
If you like remove a word and it'll cut something or using AI to auto add captions or something.
308
00:21:23,279 --> 00:21:29,480
I think that there's things that AI does for you, but there's still things that...
309
00:21:30,920 --> 00:21:38,640
I know an AI couldn't do the product demos that I do, and not because the AI tools are bad and not because I'm perfect,
310
00:21:38,880 --> 00:21:45,880
but because there's still a level of human quality that is needed for now, for the next...
311
00:21:45,880 --> 00:21:49,039
I also think the human relation part to it, you know what I mean?
312
00:21:49,279 --> 00:21:53,319
You understand what other humans find, not just funny, because I don't want to like,
313
00:21:53,759 --> 00:21:56,639
I think we have to be careful to not reduce Cassie to just being funny, right?
314
00:21:56,759 --> 00:21:57,799
But it's more than that.
315
00:21:57,799 --> 00:22:02,799
Like, like, you know, we were talking about laughing about the bad times, like when things go wrong.
316
00:22:02,799 --> 00:22:07,599
It's not just so much about being funny, but like you can then identify with all those developers.
317
00:22:07,599 --> 00:22:09,759
I feel your pain. This happens to me, too.
318
00:22:10,119 --> 00:22:13,599
Like, and I think that, like, AI is good for a lot of things.
319
00:22:13,599 --> 00:22:21,119
But to me, it's like an IDE or, you know, like Photoshop or just the different things that we have already been using to enhance the world.
320
00:22:21,119 --> 00:22:23,039
Like, it's really bad at some things.
321
00:22:23,039 --> 00:22:25,919
It's really good at some things. You have to know that just to use it, right?
322
00:22:25,919 --> 00:22:30,199
Like, there's so many... there are just different levels of abstraction.
323
00:22:30,199 --> 00:22:33,119
Like Photoshop didn't stop people from being photographers.
324
00:22:33,159 --> 00:22:42,959
You know what I mean? Like, it didn't stop people from having cool, unique ways of creating like a drama or a story with photos, you know?
325
00:22:42,960 --> 00:22:44,559
So like, I don't know.
326
00:22:44,559 --> 00:22:50,759
I think that's when we'll know AI really has relevance, when we get through the, like, pretending it will do everybody's job.
327
00:22:51,360 --> 00:22:53,319
And like, it almost shows what we're good at.
328
00:22:53,319 --> 00:23:00,799
Like, the way that you use your videos to, like, show the empathy and relation, so other people know you understand what they're going through.
329
00:23:00,799 --> 00:23:03,799
Like, I don't think AI could ever do that.
330
00:23:04,519 --> 00:23:07,160
Do either of you know the game Go?
331
00:23:08,440 --> 00:23:10,640
I know of the game Go. I've never played it, but I know it.
332
00:23:10,680 --> 00:23:12,400
Yeah, I love Go.
333
00:23:12,400 --> 00:23:14,080
I play Go every day.
334
00:23:14,120 --> 00:23:16,400
It's a really, really fun game you can play online.
335
00:23:16,400 --> 00:23:17,120
It's great.
336
00:23:18,080 --> 00:23:30,640
AlphaGo came out in like 2017 or something, and it was a whole big thing, kind of like when IBM's Deep Blue beat a chess master, and AlphaGo was Google's version of beating a Go master.
337
00:23:30,680 --> 00:23:36,880
And it's interesting to see how much the game Go has changed because of that.
338
00:23:37,120 --> 00:23:39,720
Where I again, I played a lot.
339
00:23:39,720 --> 00:23:43,680
I was a part of a study on like, did AI ruin your enjoyment of the game?
340
00:23:43,680 --> 00:23:49,040
And pretty much everybody said no, they still like playing Go, even though an AI could probably beat them.
341
00:23:49,040 --> 00:24:00,600
But what's interesting about the games that we saw and the things that you can see when AlphaGo has played is it comes up with moves that you've never seen before.
342
00:24:00,839 --> 00:24:06,199
I took a couple Go lessons where a teacher was saying like, oh, in this situation you want to go here.
343
00:24:06,199 --> 00:24:06,879
And I was like, why?
344
00:24:06,879 --> 00:24:11,919
And she said, interestingly, nobody did this for about two or three hundred years.
345
00:24:11,959 --> 00:24:13,399
This was actually the correct move.
346
00:24:13,399 --> 00:24:16,439
But when AlphaGo made this move, everyone was like, why would you do that?
347
00:24:16,839 --> 00:24:18,759
Silly AI, this is ridiculous.
348
00:24:18,959 --> 00:24:21,599
And then like 20 moves later, they're like, wait, what?
349
00:24:21,599 --> 00:24:23,399
And it changed the game literally.
350
00:24:23,639 --> 00:24:27,399
We're like, people are just like, now we know you should definitely go there.
351
00:24:27,600 --> 00:24:34,200
And it provided new insights that the people and players didn't see because, kind of like that...
352
00:24:34,880 --> 00:24:37,600
There's a Grace Hopper quote: the most dangerous phrase in the language is,
353
00:24:37,600 --> 00:24:38,759
We've always done it this way.
354
00:24:39,120 --> 00:24:42,560
I think there's a lot of that in humanity and stuff.
355
00:24:42,560 --> 00:24:47,120
We are habitual creatures and this is getting very deep and philosophical.
356
00:24:47,400 --> 00:24:54,600
But I think very similarly, AI is going to change the way we see things and the way we do things.
357
00:24:54,719 --> 00:24:59,519
But that doesn't necessarily mean it takes away from not only our enjoyment of the work,
358
00:24:59,519 --> 00:25:01,959
but how we do the work and the work that needs to be done.
359
00:25:04,959 --> 00:25:06,839
And it's not just our industry, right?
360
00:25:06,839 --> 00:25:12,079
Like every industry is grappling with what is happening with something that can create stuff
361
00:25:12,079 --> 00:25:15,439
that only people used to be able to write, like difficult things like the music.
362
00:25:15,439 --> 00:25:18,079
Music industry is just like exploding right now.
363
00:25:18,079 --> 00:25:23,639
Right. Like there's just like we don't know what a demo is anymore, because this AI company
364
00:25:23,640 --> 00:25:26,840
can crank out a million years' worth of demos in a day.
365
00:25:26,840 --> 00:25:28,240
And it's just like, well, what do we do with that?
366
00:25:28,240 --> 00:25:29,560
Like, this is not impossible.
367
00:25:29,560 --> 00:25:32,520
It's not possible for a human because we have finite time.
368
00:25:32,880 --> 00:25:39,600
And my enjoyment of things like music and short form video, when I see so many videos,
369
00:25:39,600 --> 00:25:41,800
like, you know what, I just don't want to be on this platform anymore.
370
00:25:41,960 --> 00:25:43,480
Like, I don't want to be.
371
00:25:43,480 --> 00:25:45,200
I don't want to see the AI slop.
372
00:25:45,200 --> 00:25:47,320
I don't know. No one else put effort into this.
373
00:25:47,720 --> 00:25:52,759
And I think the effort and knowing that someone cared enough to spend their time on it
374
00:25:53,240 --> 00:25:54,960
is sometimes why I care. Right.
375
00:25:54,960 --> 00:25:59,839
There's someone, like, I'm practicing Go or whatever to get up to a human level,
376
00:26:00,160 --> 00:26:01,680
to get better as a player.
377
00:26:01,680 --> 00:26:03,319
And it doesn't even matter if you're the best in the world.
378
00:26:03,319 --> 00:26:06,879
Right. Like I watch my kids play sports and I'm like, hey, if everyone's at the same level,
379
00:26:06,879 --> 00:26:08,200
it's still enjoyable. Right.
380
00:26:08,200 --> 00:26:11,480
They're not good players, but they all kind of suck in their own way.
381
00:26:11,480 --> 00:26:15,359
And it's fun to watch because it's actually great seeing a hackathon
382
00:26:15,480 --> 00:26:17,440
when everyone sucks in the same ways. Right.
383
00:26:17,440 --> 00:26:20,759
Like if there's one team in there that's, you know, professional software developers
384
00:26:20,799 --> 00:26:22,799
and I'm like, well, I don't care because they're going to win the hackathon every time.
385
00:26:22,799 --> 00:26:26,119
No, no, no. I want to level the playing field and I want to know they have fun.
386
00:26:26,119 --> 00:26:28,160
They're learning something and the effort is the same.
387
00:26:28,720 --> 00:26:32,160
But if they're all using it equally, aren't they still doing it at the same level?
388
00:26:32,160 --> 00:26:34,400
You know what I mean? Like if they're all like, because...
389
00:26:35,920 --> 00:26:38,359
I don't know, like, you know what I mean, like if they're all coming in
390
00:26:38,359 --> 00:26:41,240
and they're building something and they're using it to help them
391
00:26:41,240 --> 00:26:44,359
find different ways to do it, like we don't know that they're all using it
392
00:26:44,359 --> 00:26:46,359
to write it the same way. You know what I mean?
393
00:26:46,920 --> 00:26:50,200
From my experience, the people that come in that are good at hackathons
394
00:26:50,200 --> 00:26:52,840
are the ones that know you can do something that someone else doesn't know.
395
00:26:53,200 --> 00:26:56,279
Right. It's like if you can prompt the AI and say, oh, I actually know
396
00:26:56,440 --> 00:27:00,960
this Linux subsystem or I know that API or I know like these electronics,
397
00:27:00,960 --> 00:27:04,920
whatever it is, like I know this piece better than anyone else here,
398
00:27:04,920 --> 00:27:06,600
then I know what it's capable of.
399
00:27:06,600 --> 00:27:11,240
So I can prompt the AI to help me get to the edges of how it's used.
400
00:27:11,480 --> 00:27:14,960
And that's where I usually see people succeed is when they reach some edge
401
00:27:14,960 --> 00:27:18,519
of their own understanding to be able to implement it in a certain way.
402
00:27:18,759 --> 00:27:20,920
You ever seen a bunch of college kids use AI, though?
403
00:27:20,920 --> 00:27:22,400
It's very interesting.
404
00:27:22,400 --> 00:27:25,279
No, no, but honestly, it's like when you give...
405
00:27:27,200 --> 00:27:29,440
The reason why I say this is because I really like going to schools
406
00:27:29,440 --> 00:27:31,759
and teaching little kids about like technology.
407
00:27:32,160 --> 00:27:34,319
Their brains are so malleable, right?
408
00:27:34,319 --> 00:27:38,000
Like they have not yet learned the constraints
409
00:27:38,000 --> 00:27:40,680
of, like, adulthood ruining our fun.
410
00:27:40,680 --> 00:27:44,599
And the way that they look at like how to solve problems is very different.
411
00:27:44,639 --> 00:27:48,639
Like now we have to be super careful that we don't kill their creativity
412
00:27:48,639 --> 00:27:50,559
and curiosity, right?
413
00:27:50,559 --> 00:27:53,119
But I think that goes in like the education of it.
414
00:27:53,399 --> 00:27:54,079
You know what I mean?
415
00:27:54,079 --> 00:27:58,279
Like, I don't... seriously, did you see that thing
416
00:27:58,279 --> 00:28:03,240
about a bunch of game developers talking about how their CEO said
417
00:28:03,240 --> 00:28:06,480
that they're using AI for all their proof of concepts?
418
00:28:06,480 --> 00:28:08,559
And then that artist was like, no, we're not. I, like,
419
00:28:09,879 --> 00:28:11,639
use it to explore a couple of ideas.
420
00:28:11,639 --> 00:28:14,119
And then I go draw it the way that I have always done it, you know?
421
00:28:14,119 --> 00:28:19,319
And I think, like, I was writing like just something that I was going to use
422
00:28:19,759 --> 00:28:22,719
for something else next week, and I wrote it three different ways.
423
00:28:22,759 --> 00:28:25,719
And then I had it tested on a bunch of things, which would have taken me forever.
424
00:28:26,159 --> 00:28:28,199
But that just made like work.
425
00:28:28,199 --> 00:28:30,599
Like, for instance, you were talking about content, but like,
426
00:28:30,599 --> 00:28:33,119
what if it's not all AI-generated content?
427
00:28:33,399 --> 00:28:36,719
But what if I take a bunch of content of my kids in my life
428
00:28:36,719 --> 00:28:38,599
and sometimes I get AI to help me edit it?
429
00:28:38,599 --> 00:28:41,199
So the editing process is half the time, right?
430
00:28:41,240 --> 00:28:45,600
Like, I think there's still ways that, like, we're almost going to appreciate it more.
431
00:28:45,640 --> 00:28:49,920
Like when you see like the cool like remixes on TikTok, like, cool.
432
00:28:49,920 --> 00:28:50,600
It's new music.
433
00:28:50,600 --> 00:28:52,319
Like there's one of Sleep Token and Christmas.
434
00:28:52,319 --> 00:28:54,120
And I was like, this is the best thing I've ever heard.
435
00:28:54,120 --> 00:28:57,840
But honestly, like, am I going to not listen to Sleep Token because of that?
436
00:28:57,840 --> 00:28:59,880
No way. Like...
437
00:28:59,880 --> 00:29:02,960
And it's a good thing because you have familiarity with what it, you know,
438
00:29:02,960 --> 00:29:05,640
you have a connection to the other thing it's remixing.
439
00:29:05,840 --> 00:29:08,000
And you're like, oh, this has a different vibe of nostalgia.
440
00:29:08,000 --> 00:29:10,799
Have you ever heard just regular AI stuff?
441
00:29:10,799 --> 00:29:12,879
Like, there's no relation to that.
442
00:29:12,879 --> 00:29:15,759
There's no aspect. Yeah, there's no color.
443
00:29:16,480 --> 00:29:19,799
Yes. It's like, remember... like, it's like if you took the,
444
00:29:20,079 --> 00:29:23,119
this is going to be weird because I love photography and I used to do design.
445
00:29:23,359 --> 00:29:26,000
But if you took all of the like color out of the world
446
00:29:26,000 --> 00:29:28,639
and everything was black and white, and I don't mean the beautiful depth
447
00:29:28,639 --> 00:29:31,879
of black and white, but like if you did it, like you just pulled it out.
448
00:29:32,240 --> 00:29:35,319
It's not the same thing as if you did black and white
449
00:29:35,319 --> 00:29:36,839
in, like, an artistic way.
450
00:29:36,839 --> 00:29:39,279
And it's not the same thing as the world in color, right?
451
00:29:39,279 --> 00:29:41,440
Like it's just kind of like a sad version of it.
452
00:29:41,960 --> 00:29:44,519
And I think that human relation is it, right?
453
00:29:44,519 --> 00:29:47,920
Like the human relation in music, the human relation of what Cassidy,
454
00:29:47,920 --> 00:29:50,639
because she's gone back and forth between being a developer
455
00:29:50,639 --> 00:29:53,759
and, like, DevRel, and she teaches and she goes to these hackathons.
456
00:29:54,079 --> 00:29:56,960
She has a different relation, even the way that you talk about things
457
00:29:56,960 --> 00:30:00,240
when you're super like into like you'll come up with a book
458
00:30:00,240 --> 00:30:01,839
and you'll be like, hey, I just read this new book.
459
00:30:01,839 --> 00:30:04,240
And I'm like, I would have never read that book if you didn't give me that.
460
00:30:04,720 --> 00:30:07,839
No, honestly, like the other day, he was so excited about something.
461
00:30:08,240 --> 00:30:11,359
And I was like, oh, I've been having the worst tech week, you know,
462
00:30:11,359 --> 00:30:14,480
like I've just said, like, oh, I don't... like, do I even like this stuff anymore?
463
00:30:14,720 --> 00:30:16,359
And he's so excited about it.
464
00:30:16,359 --> 00:30:18,319
It makes me excited about it again.
465
00:30:18,319 --> 00:30:20,199
You know, like, I don't think that you could kill that.
466
00:30:20,199 --> 00:30:23,319
Like and I do think that it's true.
467
00:30:23,319 --> 00:30:25,559
There's times where I look at a post, I'm just like,
468
00:30:25,559 --> 00:30:29,000
if you couldn't be bothered to write this post, I'm not going to be bothered to read it.
469
00:30:29,439 --> 00:30:34,439
Like I like when it comes to things like writing and music and videos and stuff,
470
00:30:34,439 --> 00:30:36,079
if it's all AI, you can tell.
471
00:30:36,079 --> 00:30:37,199
I don't want all of it.
472
00:30:37,200 --> 00:30:38,559
I don't want it. I don't want to hear that.
473
00:30:38,559 --> 00:30:41,360
If you're using it to like better something, maybe.
474
00:30:41,360 --> 00:30:44,000
But if it's just all of that and you gave nothing, I don't want it.
475
00:30:44,080 --> 00:30:47,240
I think like AI is a somewhat decent editor.
476
00:30:47,240 --> 00:30:48,600
It's a terrible writer.
477
00:30:48,600 --> 00:30:51,039
I think it's a decent outliner.
478
00:30:51,200 --> 00:30:53,360
I don't want it to fill in the blanks, like...
479
00:30:53,960 --> 00:30:56,759
And I think that's where some of the gaps are.
480
00:30:56,759 --> 00:31:03,519
But I feel like we're also in such early days that society is learning that.
481
00:31:03,519 --> 00:31:06,600
Like, it's not a silver bullet for so many things.
482
00:31:06,599 --> 00:31:10,519
And I think that, like, that is a big part of my job now, is figuring out,
483
00:31:10,519 --> 00:31:12,000
like how do we communicate that?
484
00:31:12,000 --> 00:31:13,799
Like, for example, we're at GitHub.
485
00:31:14,039 --> 00:31:15,279
We're building GitHub Copilot.
486
00:31:15,279 --> 00:31:17,839
There's all of these different tools you can use.
487
00:31:18,159 --> 00:31:23,000
Smart autocomplete, chat, agents, all these things where they are genuinely helpful.
488
00:31:23,079 --> 00:31:25,119
We don't want to replace developers.
489
00:31:25,519 --> 00:31:28,839
We don't want to take the enjoyment out of coding and creating and stuff.
490
00:31:29,079 --> 00:31:34,319
But we also know that we can help accelerate certain aspects of the process.
491
00:31:34,359 --> 00:31:40,720
How do we communicate that and educate people in the way that will better them
492
00:31:40,720 --> 00:31:43,960
rather than make them feel nervous or gross about it?
493
00:31:44,480 --> 00:31:48,559
Not just that, but just as a developer who's been using Copilot, like it's so
494
00:31:48,559 --> 00:31:54,159
different using, like, an agent in VS Code versus using, like, the ask version of it
495
00:31:54,159 --> 00:32:00,919
or using it in a browser or using GitHub Copilot CLI, by the way,
496
00:32:01,039 --> 00:32:04,279
GitHub Copilot CLI is fire.
497
00:32:04,320 --> 00:32:05,480
Like it is so different.
498
00:32:05,480 --> 00:32:07,039
It's so wild.
499
00:32:07,039 --> 00:32:10,200
I've only used it a little bit, but it's so amazing.
500
00:32:10,200 --> 00:32:14,600
It is a game changer, the difference between that and using it in VS Code, I think,
501
00:32:14,600 --> 00:32:19,080
because it separates it for me and it can go like, I can give it very like direct
502
00:32:19,560 --> 00:32:22,800
like instructions and it'll build something and then I'll go and do my own.
503
00:32:22,800 --> 00:32:25,240
Like, you know, it keeps it separate, but I don't know.
504
00:32:25,279 --> 00:32:28,360
I just think that like we're still figuring it out.
505
00:32:28,399 --> 00:32:32,079
And I think that it's almost like with our buying power, we're also going to make
506
00:32:32,079 --> 00:32:32,959
a lot of these decisions.
507
00:32:32,959 --> 00:32:34,079
Yeah. Right.
508
00:32:34,079 --> 00:32:37,879
Like if you don't buy AI music, they're going to stop making it.
509
00:32:38,399 --> 00:32:40,119
You know what I mean? So like.
510
00:32:40,119 --> 00:32:44,879
Yeah, I think one of the best benefits we've had with AI in 2025 has been that it's
511
00:32:44,879 --> 00:32:48,199
really expensive for the people running it, right?
512
00:32:48,199 --> 00:32:50,199
Like they actually have to put a bunch of investment in.
513
00:32:50,199 --> 00:32:53,399
And some of that is offset by VC and government funds and whatever.
514
00:32:53,399 --> 00:32:54,839
It can't be forever. Right.
515
00:32:54,839 --> 00:32:57,159
At some point, you get that Uber tipping point.
516
00:32:57,160 --> 00:33:00,400
Yeah. Where all your rides are no longer VC funded and you're like,
517
00:33:00,400 --> 00:33:01,920
oh, how much does this ride actually cost?
518
00:33:01,920 --> 00:33:03,000
Like, I don't think so.
519
00:33:03,000 --> 00:33:05,000
I don't think that's where it's going to get interesting
520
00:33:05,000 --> 00:33:07,519
because it is good at some coding projects. Right.
521
00:33:07,519 --> 00:33:11,560
So I think that it'll be worth spending the money to keep that part around.
522
00:33:11,880 --> 00:33:15,240
But when everyone's like, please get rid of the AI music
523
00:33:15,240 --> 00:33:18,200
and we don't want you to write anything or, like...
524
00:33:18,200 --> 00:33:21,320
We've had autocorrect and Grammarly and all kinds of stuff for forever.
525
00:33:21,320 --> 00:33:24,440
Like, I think those things people will have to use them
526
00:33:24,920 --> 00:33:29,400
and, like, more sparingly, because they won't be free and they won't be cheap.
527
00:33:29,400 --> 00:33:31,240
So you'll use them where they actually...
528
00:33:32,120 --> 00:33:34,840
You're going to pay for the things that you actually get benefit from.
529
00:33:35,480 --> 00:33:35,960
Exactly.
530
00:33:35,960 --> 00:33:39,720
And hopefully it's not just people paying to not think anymore
531
00:33:39,720 --> 00:33:41,960
because I think there's a lot of value in thinking.
532
00:33:41,960 --> 00:33:45,720
And I think that's why it's been free for so long too, though, because if they
533
00:33:45,720 --> 00:33:50,039
like, get you with that carrot, right, where they're like, try this
534
00:33:50,039 --> 00:33:53,720
and also forget how to write things and forget how to think.
535
00:33:53,799 --> 00:33:58,200
And then you're reliant on it. Exactly, which is why I play devil's advocate
536
00:33:58,200 --> 00:33:59,640
with my kids about all this stuff.
537
00:33:59,640 --> 00:34:01,480
And I'm like, well, what do you think about that?
538
00:34:01,480 --> 00:34:04,120
And, like, it's always about that critical thinking.
539
00:34:04,120 --> 00:34:05,480
Socratic questioning.
540
00:34:05,480 --> 00:34:08,679
Because it's like I think that's the part that like
541
00:34:08,679 --> 00:34:11,960
nobody could ever replace your curiosity, either of you two.
542
00:34:11,960 --> 00:34:13,960
That is what makes you who you are.
543
00:34:13,960 --> 00:34:17,159
There's no AI that could be your personality or curiosity.
544
00:34:17,159 --> 00:34:19,320
And I don't think that we're going to lose that as long as we're
545
00:34:19,960 --> 00:34:22,199
cognizant of that being a tool.
546
00:34:22,199 --> 00:34:22,760
You know what I mean?
547
00:34:22,760 --> 00:34:24,520
I think you hit on something, that
548
00:34:25,720 --> 00:34:29,720
AI and DevRel, to a large extent, like the thing that is valuable
549
00:34:29,720 --> 00:34:31,640
out of it is getting someone else excited.
550
00:34:32,200 --> 00:34:37,240
It is to inspire someone to go do the thing that was difficult, right?
551
00:34:37,240 --> 00:34:38,680
Because those things are difficult tasks.
552
00:34:38,680 --> 00:34:42,840
Like if you say here's a blank IDE, go make a web page.
553
00:34:42,840 --> 00:34:44,840
Most people will be like, I don't know where to start.
554
00:34:44,840 --> 00:34:45,560
Right. Like even back...
555
00:34:45,560 --> 00:34:49,480
Oh, my God, that's what it helps me with, with ADHD, like deer in the headlights.
556
00:34:49,480 --> 00:34:51,160
Like I get so overwhelmed.
557
00:34:51,159 --> 00:34:55,159
And then all of a sudden, AI has broken it into like all these different ways.
558
00:34:55,159 --> 00:34:55,719
And it's done it.
559
00:34:55,719 --> 00:34:57,480
And I'm like, I don't really like the way you did it.
560
00:34:57,480 --> 00:34:58,759
But I like this part.
561
00:34:58,759 --> 00:35:00,839
But in this part, yes.
562
00:35:00,839 --> 00:35:02,599
And then I'm like, oh, this is horrible.
563
00:35:03,159 --> 00:35:05,480
Dude, it's like I get deer in the headlights.
564
00:35:05,480 --> 00:35:06,519
I'll procrastinate.
565
00:35:06,519 --> 00:35:07,480
Like, you know what I mean?
566
00:35:07,480 --> 00:35:08,920
Like, but that's not even new, right?
567
00:35:08,920 --> 00:35:10,440
Because we've been doing that, like Ruby on Rails, right?
568
00:35:10,440 --> 00:35:12,199
Like templates out the whole website.
569
00:35:12,199 --> 00:35:12,920
That's what I'm saying.
570
00:35:12,920 --> 00:35:15,799
And it's just like, yeah, it's all this information.
571
00:35:15,799 --> 00:35:17,239
It's just the hype cycle.
572
00:35:17,239 --> 00:35:20,039
And getting people excited enough to say like, hey,
573
00:35:20,039 --> 00:35:22,599
this might be difficult, but you have to figure out some of these things.
574
00:35:22,599 --> 00:35:27,719
I don't know if I showed you, Autumn, I vibe coded like an app called Where to Watch It.
575
00:35:27,719 --> 00:35:28,199
Did you see it?
576
00:35:29,079 --> 00:35:30,920
Oh, wait, one more thing before we move on really quick.
577
00:35:31,800 --> 00:35:39,239
Your video about Kubernetes, you used a container, water, and some other stuff.
578
00:35:39,239 --> 00:35:43,239
And like before this, before scale, before this was a thing,
579
00:35:43,239 --> 00:35:45,000
you're the only person I knew that did Kubernetes, right?
580
00:35:45,719 --> 00:35:47,559
And I just didn't understand them.
581
00:35:47,559 --> 00:35:48,759
I was like, OK, it's a container.
582
00:35:50,199 --> 00:35:53,559
And the way that you related it to like real life objects,
583
00:35:53,559 --> 00:35:56,599
if an AI has never played with water and whatever, like, yeah,
584
00:35:56,599 --> 00:35:58,840
they can relate it to something and they can give it to you.
585
00:35:58,840 --> 00:36:02,759
But they're not, they've never been a human that held that water that did those things.
586
00:36:02,759 --> 00:36:07,880
So like nobody could ever reinvent the way that you used a random water
587
00:36:07,880 --> 00:36:09,639
and other things to tell that story.
588
00:36:09,639 --> 00:36:12,519
Like it's how you tell the story when you got your like glasses
589
00:36:12,519 --> 00:36:15,400
and you could see like different like colors all of a sudden.
590
00:36:15,400 --> 00:36:19,480
Like it's your, you get curious about a subject, you get all excited.
591
00:36:19,480 --> 00:36:21,880
You go out into the world and do something with it.
592
00:36:21,880 --> 00:36:24,760
And then you share it with everyone because that's your personality.
593
00:36:24,760 --> 00:36:26,360
And then everybody else is excited about it.
594
00:36:26,360 --> 00:36:28,360
Well, yeah, getting excited, like the color.
595
00:36:28,360 --> 00:36:31,320
I've been colorblind my entire life and I've only been able to see
596
00:36:31,880 --> 00:36:33,719
more colors for the last two years.
597
00:36:33,719 --> 00:36:36,840
But the way that you did that changed, like, the way that we do it.
598
00:36:36,840 --> 00:36:41,960
When we went to do awesome, when I still worked at AWS, we had different cards.
599
00:36:41,960 --> 00:36:46,679
We picked the cards differently because I wouldn't have considered colorblindness before then.
600
00:36:46,679 --> 00:36:48,519
And then I was like, oh crap, we have to be careful.
601
00:36:48,519 --> 00:36:50,280
What if someone can't see these?
602
00:36:50,280 --> 00:36:51,400
Yeah. Yeah.
603
00:36:51,400 --> 00:36:55,079
And like that, that has given me different excitement to go experience more things
604
00:36:55,079 --> 00:36:58,840
because I know what's possible that I can see more shades of red now.
605
00:36:58,840 --> 00:37:00,599
And in those contexts, like, oh, cool.
606
00:37:00,599 --> 00:37:02,840
Now I know where my next limit is.
607
00:37:02,840 --> 00:37:06,199
Like I still can't see everything, but I can actually see sunsets better now.
608
00:37:06,199 --> 00:37:09,559
Right? Like those sorts of things are using those tools
609
00:37:09,559 --> 00:37:11,559
to get excited about doing something else.
610
00:37:11,559 --> 00:37:16,920
And that's where I think AI and DevRel fits where the human aspect of DevRel
611
00:37:16,920 --> 00:37:21,159
is really to inspire, to tell people about some things that maybe they didn't know existed,
612
00:37:21,159 --> 00:37:26,760
to explain something in a way that is more relatable, that wouldn't be, you know,
613
00:37:28,119 --> 00:37:33,159
next word predicted, but be able to go through like a different surprise of a human aspect
614
00:37:33,159 --> 00:37:36,440
of how this stuff works. But like I was saying, I've coded this thing just
615
00:37:36,440 --> 00:37:39,480
because I wanted to do it for a while.
616
00:37:39,480 --> 00:37:43,400
It was an idea that I heard someone else have on a podcast. I'm like, oh, that sounds like fun.
617
00:37:43,400 --> 00:37:46,280
I wish I could do that. Right. But it would take me a long time to do it.
618
00:37:46,280 --> 00:37:50,280
I was like, I just have limited time and it still took me time to build.
619
00:37:50,840 --> 00:37:54,680
But I was watching a movie while I was doing it. Right. Like this isn't something that like,
620
00:37:54,680 --> 00:37:57,880
it's not making money. It's not something that's like, so I just wanted to get to the point of
621
00:37:57,880 --> 00:38:03,240
how much would it cost me, as an individual who knows some of this stuff, to do it. Right.
622
00:38:03,240 --> 00:38:08,760
And so far I think I'm like $60 in AI credits in, and a $25 domain. And that's it. Right.
623
00:38:08,760 --> 00:38:13,400
So it's like, I'm still under a hundred dollars, which is great because it's taken me a few evenings
624
00:38:13,400 --> 00:38:17,320
of, Hey, like the app mostly works. Is it secure? Absolutely not. Right. Like I don't,
625
00:38:17,320 --> 00:38:22,440
there's no, no auth in this, but it works for me. And I feel like that,
626
00:38:22,440 --> 00:38:27,079
I feel like the thing that we're seeing a lot and I think will be an interesting trend. And I say,
627
00:38:27,079 --> 00:38:33,320
we, as in like just my team at GitHub is people are building personal tools so much more because
628
00:38:33,320 --> 00:38:39,320
they can kind of vibe it out. And I've used so many of my domain names and like dusty side
629
00:38:39,320 --> 00:38:46,360
projects on my pile purely because, like you said, I'm able to get started and I'm able
630
00:38:46,360 --> 00:38:51,160
to be just like, okay, you know what? I just want this app to exist for me so I can have this tool.
631
00:38:51,720 --> 00:38:55,720
I'm not going to worry about how it's built. I'm not going to worry about anything. I want the
632
00:38:55,720 --> 00:39:00,120
actual final product and I'm able to just do it. Sometimes it just stays in a private repo and it
633
00:39:00,120 --> 00:39:06,519
truly is just for me, but it works. And that part has been great. That's still the energy of the
634
00:39:06,519 --> 00:39:12,759
side project, the hackathons, but it's just a different aspect of, like, coming to it, you know?
635
00:39:13,880 --> 00:39:19,239
And I think one of the theses, or the reason this podcast exists, is to help people
636
00:39:19,239 --> 00:39:26,119
understand the long-term maintenance of software and decisions you make. How does that explosion
637
00:39:26,119 --> 00:39:34,920
of, I can make any tool I want, affect people in a year? What does that turn into in six months
638
00:39:34,920 --> 00:39:38,760
or whatever? Like, yeah, I can just let a domain expire. Maybe I don't care about it anymore,
639
00:39:38,760 --> 00:39:45,639
but I feel like we are so far outpacing anything we've ever done before and we have no people
640
00:39:45,639 --> 00:39:50,280
trained or understanding how any of this is going to be maintained in even a year from now.
641
00:39:50,280 --> 00:39:53,800
I honestly think that's going to be a learning process. Just like teaching junior developers,
642
00:39:53,800 --> 00:39:57,320
like we're going to have to figure it out. It's really hard to maintain something
643
00:39:57,320 --> 00:40:03,240
that you didn't write 4,000 lines later. Right. And I think that that's where we need to keep
644
00:40:03,239 --> 00:40:08,919
beating the drum as an industry that it's not just about senior devs. We need to like still
645
00:40:08,919 --> 00:40:14,199
level up the people who are early in their careers because those people are the future senior devs.
646
00:40:14,199 --> 00:40:18,839
And again, the roles are probably going to be looking different in a year, two years, five years.
647
00:40:18,839 --> 00:40:23,079
We don't know what it's going to look like, but we need people to continue to enter the industry
648
00:40:23,079 --> 00:40:27,639
and learn, even though it's particularly weird and challenging right now with all of these tools
649
00:40:27,639 --> 00:40:31,639
coming up. It's such a weird place to be in. Like, I feel like you don't even know what you're
650
00:40:31,639 --> 00:40:35,239
shooting for when you are thinking about promotions because you're like, what does that
651
00:40:35,239 --> 00:40:41,400
even look like right now? Yeah, it's also tough. Go ahead. Weird thing that I was thinking of when
652
00:40:41,400 --> 00:40:47,079
we were talking about paying kind of like to be able to do a little bit more. Like I got advice
653
00:40:47,079 --> 00:40:52,359
from a GM when I was like first a software engineer at AWS and I was like, how do you like
654
00:40:53,000 --> 00:40:58,440
balance being like a good parent and showing up to all your kids' stuff and like being a good spouse
655
00:40:58,440 --> 00:41:02,599
and like still like kicking butt at your career? Like how do you do all that? And he told me
656
00:41:03,159 --> 00:41:09,240
that he contracts out the stuff that he doesn't want to do. So like Instacart and like getting
657
00:41:09,240 --> 00:41:13,000
someone to like clean your house once a month or those kinds of things. And it's kind of funny
658
00:41:13,000 --> 00:41:17,800
because sometimes I almost think that's what people... that would be the cool thing of AI,
659
00:41:17,800 --> 00:41:21,800
getting it to do the things that we don't want to do or to make our lives easier so we could get a
660
00:41:21,800 --> 00:41:27,240
little bit more time with our, like, side projects, or we can, like, do things faster, because I
661
00:41:27,239 --> 00:41:31,639
think that's what we should use it for, you know, to buy you a little bit of time for the fun things.
662
00:41:31,639 --> 00:41:36,199
I just want it to do my laundry and my dishes. Oh my, I would pay so much money.
663
00:41:37,639 --> 00:41:46,119
I want this. If you want to VC that, like, that would make so much money. Like, do you...
664
00:41:46,119 --> 00:41:53,079
I have three kids, Cassidy. My laundry pile haunts my dreams. I only have two and it's
665
00:41:53,079 --> 00:42:00,519
endless. It's endless. The amount I would give a hood rat money, like I would give so much money.
666
00:42:00,519 --> 00:42:05,960
But actually though, like I keep seeing like people saying we're working on a robot that
667
00:42:05,960 --> 00:42:11,239
can fold your laundry, but the video's AI. And I'm just like, okay, that's not real. Make it real.
668
00:42:11,239 --> 00:42:15,639
I'm begging you. Because I can find out where the socks go because those little socks cost
669
00:42:15,639 --> 00:42:20,360
just as much as regular size socks. And I just want to know like how, like,
670
00:42:20,440 --> 00:42:27,800
like, and like, just like if somebody could just do the annoying things, like manage
671
00:42:27,800 --> 00:42:34,200
where my kids' controllers went and like where they lost it, or like do laundry or something.
672
00:42:34,200 --> 00:42:35,480
This is the future we want.
673
00:42:35,480 --> 00:42:40,360
Can it nag you? Like I'm going to start an app that nags my children and it's like an AI that's like,
674
00:42:40,360 --> 00:42:44,599
you haven't taken a shower yet. Your stuff's all over the floor. Like, I'm going to just get, like...
675
00:42:44,599 --> 00:42:46,280
That's a good vibe coded app idea.
676
00:42:46,440 --> 00:42:50,120
I swear to God, don't play with me. I'm going to go see if they still have Amazon DeepLens
677
00:42:50,120 --> 00:42:55,000
cameras and be like, no, there's still stuff on the floor in your room. So I don't have to get
678
00:42:55,000 --> 00:42:59,400
off my butt and walk upstairs and tell them that there's still stuff in the room six times.
679
00:42:59,400 --> 00:43:00,440
You know, I just got to start up.
680
00:43:04,120 --> 00:43:09,320
And I think about like when was the washing machine, the clothes washer invented, right?
681
00:43:09,320 --> 00:43:13,800
Like that was a huge game changer for so many people to be able to automate their work and to
682
00:43:13,800 --> 00:43:19,320
do the thing that they didn't want to do anymore. Right. Like that was absolutely just even like a
683
00:43:19,320 --> 00:43:24,039
dishwasher, clothes washer. I always talk about, like, this is the original agent in my house.
684
00:43:24,039 --> 00:43:29,080
Right. Like this is the agent that originally, like, set it all off. Like, it does stuff for me when
685
00:43:29,080 --> 00:43:32,280
I don't have to do it. And I just set a timer and yeah, I have to change it still. And I have to
686
00:43:32,280 --> 00:43:37,480
fold the clothes. But you know, most of the work is already done. Like, I don't have to bring it in a basket,
687
00:43:37,480 --> 00:43:38,039
go down somewhere.
688
00:43:38,039 --> 00:43:43,480
That's why I think it's hilarious. Like there's a disconnect between like the products people
689
00:43:43,480 --> 00:43:49,240
want. Right. And the executives who are like farming these ideas, like I just want to know
690
00:43:49,240 --> 00:43:54,599
what they go into like meetings with PowerPoints on because I'm like, there are moms that would pay
691
00:43:54,599 --> 00:44:01,159
you so much money. Like, not even just moms, like there's so many real-life ways that we could
692
00:44:03,159 --> 00:44:08,039
you can't see our faces, but if you could see my face right now, the money I would pay you to do
693
00:44:08,039 --> 00:44:14,039
some of the boring like monotonous crap that I have to do every day. If you could just like,
694
00:44:14,920 --> 00:44:19,800
there's so many cool things, and nobody... like, there's this whole market of how AI could make our lives, like,
695
00:44:19,800 --> 00:44:23,639
easier. And they keep giving us stuff that nobody asked for. And I'm like,
696
00:44:23,639 --> 00:44:26,440
listen to AI music? I want you to fold a shirt.
697
00:44:27,320 --> 00:44:31,800
Or like just do something annoying, like go fill out my kids' forms. You know what I mean? Like go
698
00:44:31,800 --> 00:44:37,079
fill out all their field trip forms and all the like emergency contact update forms and stuff.
699
00:44:37,319 --> 00:44:39,960
You don't want the AI to have that data though. Like that's the other thing.
700
00:44:39,960 --> 00:44:42,679
Can I run it locally? They have all the data.
701
00:44:44,679 --> 00:44:48,440
Have you seen the stuff the government's doing? That ship sailed. Okay. Go fill out the forms.
702
00:44:48,440 --> 00:44:49,079
I don't even care.
703
00:44:49,079 --> 00:44:53,319
Filling out the forms, that's actually kind of a good idea if you got like a pen plotter
704
00:44:53,319 --> 00:44:55,079
and then you just use AI to fill it out.
705
00:44:55,079 --> 00:45:01,000
Take something off my plate that's annoying. I swear to God, the money that people... because
706
00:45:01,000 --> 00:45:05,159
we're all, like... millennials are working like 8 million jobs right now. We're trying to be
707
00:45:05,159 --> 00:45:09,239
present parents. We're trying to like heal the trauma that happened. We're all trying to go to
708
00:45:09,239 --> 00:45:19,079
therapy, work six jobs, make sourdough. Like, bro, if you take something off my plate, I would
709
00:45:19,079 --> 00:45:25,239
pay you obscene amounts of money. Like that one. This is an ADHD problem.
710
00:45:28,599 --> 00:45:31,079
Look, I got a new printer. I got another one.
711
00:45:31,719 --> 00:45:35,159
You have two 3D printers now? This is your problem.
712
00:45:35,159 --> 00:45:41,480
Shut up, Justin. What if somebody did some of the adult stuff? Like I have more time than you did
713
00:45:41,480 --> 00:45:46,440
stuff. The problem you have is not saying no. And that is like, if you want to outsource saying no
714
00:45:46,440 --> 00:45:50,599
to things, just text me. Have you been talking to my therapist? Shut up, Justin. God.
715
00:45:51,799 --> 00:45:54,199
Come on, man. I thought we were friends.
716
00:45:54,679 --> 00:46:03,879
Anyway, I would... this just gave me some project ideas and I want that. I'm going to go write
717
00:46:03,879 --> 00:46:08,439
an app that nags my children and scans their room. Thanks. Yeah, no, I think that'd be neat.
718
00:46:08,439 --> 00:46:10,839
But yeah, that'd be like, get your shoes off the floor.
719
00:46:16,599 --> 00:46:20,679
Let's see your house just have motion sensors in every room with cameras. That's like your control.
720
00:46:20,919 --> 00:46:28,679
You didn't wait a camera to go over in my pantry and be like, put the Nutella snacks down,
721
00:46:28,679 --> 00:46:33,399
put them down. You're on your third Pringles. Leave it alone and go get some carrots.
722
00:46:35,480 --> 00:46:41,799
Some of these are actually very feasible. I... my mind is like, oh, I could do that.
723
00:46:43,319 --> 00:46:47,879
Like I'm about to have IoT devices everywhere. My children are going to be so annoyed at me.
724
00:46:48,039 --> 00:46:54,360
That I'm haunted. They're going to be like... we live in LA, but there's, like, those kids
725
00:46:54,360 --> 00:46:58,280
that like their mom talks about cybersecurity too much and they're going to be like, and it's
726
00:46:58,280 --> 00:47:03,720
stealing your data. And that's a bad idea because what if it gets hacked by a malicious actor? And
727
00:47:03,720 --> 00:47:08,680
they're going to be like, I live in like a data like prison and like, I want you to stop.
728
00:47:09,480 --> 00:47:15,000
This is locally hosted Home Assistant with local AI models running on a Raspberry Pi through
729
00:47:15,000 --> 00:47:18,760
Pi-hole or whatever. Just keep listing it off. Just scare people away.
730
00:47:22,599 --> 00:47:24,119
We went down the nerd rabbit hole.
731
00:47:27,480 --> 00:47:31,000
I'm about to build so much hood rat stuff for my house over Christmas break.
732
00:47:31,880 --> 00:47:37,079
It sounds so good though. And then when this airs in January, you'll be like, yeah,
733
00:47:37,079 --> 00:47:42,280
I did do all of that. That's the real test. Yeah. Mid-January when this lands.
734
00:47:45,320 --> 00:47:49,880
For you. Copilot agent will be working very hard. Okay.
735
00:47:53,800 --> 00:47:58,760
Can you make a mom? Like, what if you made an AI agent that was like the other mom,
736
00:47:58,760 --> 00:48:04,679
like the mom that just nagged them for us? This is how they come up with those chats that like
737
00:48:04,679 --> 00:48:10,679
speak for certain characters and avatars that we got to make a little less dystopian.
738
00:48:11,399 --> 00:48:17,399
Damn it. That was a little Terminator, wasn't it? Like, yeah, this is where the rabbit
739
00:48:17,399 --> 00:48:22,119
hole goes. You got to read. Okay. You're right. You're right. We've come up with all these ideas
740
00:48:22,759 --> 00:48:27,559
in whatever 30 minutes we were talking. What does 2026 look like? Where do
741
00:48:27,559 --> 00:48:31,319
we think this is going? What is everyone else going to be doing? What are all the other crazy
742
00:48:31,319 --> 00:48:37,000
ideas that, in 2026, someone's going to throw millions of dollars at and say, yeah, that's the
743
00:48:37,480 --> 00:48:41,239
thing. Unfortunately, I don't think it's going to be folding a shirt. That's not what I think it is.
744
00:48:41,239 --> 00:48:44,119
I don't think this is going to be the time where they throw millions of dollars. I think this is
745
00:48:44,119 --> 00:48:48,760
going to be the time that they realize they wasted millions of dollars. This is the reckoning, 2026.
746
00:48:49,639 --> 00:48:55,880
Like they've just lit money on fire. I don't know. I think the runway is a little bit longer. Like
747
00:48:55,880 --> 00:49:00,920
they are burning it very fast. The runway is long. I'll be curious to see, like,
748
00:49:01,639 --> 00:49:06,519
what, yeah, first of all, the VCs, the people spending money on this,
749
00:49:06,519 --> 00:49:10,440
are they going to crack down or are they going to...? What's the pivot, though?
750
00:49:10,440 --> 00:49:14,599
What, now that they've decided... like, Gartner's come out and they said 98% of products aren't making
751
00:49:14,599 --> 00:49:19,240
money. Like what's the pivot? How are they now going to start picking the next thing to invest in?
752
00:49:19,880 --> 00:49:25,159
It'd be cool if they listened to their users. Um, but we'll see, we'll see what happens.
753
00:49:25,159 --> 00:49:27,960
Like we've been doing this long enough that we know they don't do that. Okay.
754
00:49:28,920 --> 00:49:32,840
Like first of all, listening to users, seeing like what people are actually using, which,
755
00:49:32,840 --> 00:49:38,440
which tools are going to like bubble to the top and be like the ones that last and which tools
756
00:49:38,440 --> 00:49:43,880
are going to be like, okay, that was a good try. We're done. And I also, I genuinely think that
757
00:49:43,880 --> 00:49:49,960
what we're going to see in 2026 is again, more personal tools. And I'm hoping to see a lot more
758
00:49:49,960 --> 00:49:56,519
like personal blogs and websites, making a comeback, a lot of like decentralized portable
759
00:49:56,519 --> 00:50:01,639
things that people can take from platform to platform as like social media is changing.
760
00:50:01,639 --> 00:50:05,719
The algorithms are changing. All these things are changing. People building on their own
761
00:50:06,599 --> 00:50:12,360
setups more. I feel like I see that as a trend coming. I don't know if it'll come in full force
762
00:50:12,360 --> 00:50:16,679
or if it's in my like indie hacker bubble, but that's, that's what I feel like I'm seeing.
763
00:50:16,679 --> 00:50:22,840
Do you have any advice for, I don't know, future Cassidys out there who are wanting to start that
764
00:50:22,840 --> 00:50:27,640
blog or, cause it's daunting, like writing regularly.
765
00:50:28,680 --> 00:50:34,120
It is, but also you have more in your brain than you think. I literally,
766
00:50:34,120 --> 00:50:38,600
The internet is harsh critics. Like how do you deal with all the critics?
767
00:50:38,600 --> 00:50:43,800
Yeah. And yet people don't listen to you as much as you think. And so I think you're on,
768
00:50:43,800 --> 00:50:48,440
you are your own harshest critic a lot of the times. And there have been so many times where
769
00:50:48,440 --> 00:50:52,120
I'm talking to someone where they're just like, oh, well that course already exists. There's
770
00:50:52,119 --> 00:50:57,960
content on this that already exists, but there's rarely content that exists that speaks the way you
771
00:50:57,960 --> 00:51:04,279
speak. And the way you speak is a special way, which, which sounds like very sunshine and daisies,
772
00:51:04,279 --> 00:51:09,400
but it's true where you never know if your voice is going to be the thing that helps someone
773
00:51:09,400 --> 00:51:14,279
understand something better or helps someone learn better, change their perspective better.
774
00:51:14,279 --> 00:51:17,000
Inspire them to do something else, right? Like that's the, yeah.
775
00:51:17,000 --> 00:51:19,239
Try. Yeah. That's really true though.
776
00:51:19,239 --> 00:51:24,759
And a blog post doesn't have to be an intimidating thing. This is just going to turn into me telling
777
00:51:24,759 --> 00:51:30,119
people they should blog more, but like some of the best blogs I've read, I've read are like
778
00:51:30,119 --> 00:51:34,039
a paragraph long, but it's like a really good insight. And I'm just like, wait, that was good.
779
00:51:34,039 --> 00:51:38,439
I need to bookmark this. I need to remember this. I have tried for so long to tell people like,
780
00:51:38,439 --> 00:51:44,519
don't put a thread of, of multiple posts or tweets or whatever, write a three paragraph
781
00:51:44,519 --> 00:51:48,919
blog post, right? Like it's going to, it's going to exist longer. You'll be able to link it better.
782
00:51:49,320 --> 00:51:50,760
The link will exist. Yeah.
783
00:51:54,039 --> 00:51:57,480
How do you like, one of the things that I always tell people is make it a habit,
784
00:51:57,480 --> 00:52:02,360
figure out when it fits in your time that says, Hey, I'm inspired to do this now. Not only like,
785
00:52:02,360 --> 00:52:06,440
do I have the energy to do it, I have the ability to do it. And a lot of times people are like,
786
00:52:06,440 --> 00:52:10,440
I'm on my phone and I can't type out a blog post. And I'm like, yeah, that's kind of hard
787
00:52:10,440 --> 00:52:14,119
for a little while. You could do it with like Medium or Ghost or something like that, but it's,
788
00:52:14,119 --> 00:52:19,480
it's still difficult to say like, Oh, I have the ability and inspiration to write something
789
00:52:19,480 --> 00:52:26,279
right now. Right. How do you act on that? I think that is building a habit as a result of
790
00:52:26,279 --> 00:52:31,719
the workflows that you have. And I think that, and this is actually a course I used to teach.
791
00:52:31,719 --> 00:52:37,319
And I kind of want to like, I did it in person and I want to record it of just like developing
792
00:52:37,319 --> 00:52:43,719
workflows for yourself. Because I think a lot of times when we try to start a habit and fail at it,
793
00:52:43,719 --> 00:52:50,119
we get discouraged when we miss a day or something. It's because we haven't built a system for doing
794
00:52:50,119 --> 00:52:54,839
it with as low overhead as possible. We build up this huge system where it's like, I'm going to
795
00:52:54,839 --> 00:52:59,799
create blogs and it's going to follow this perfect template, or I'm going to, I'm going to make sure
796
00:52:59,799 --> 00:53:05,959
everything follows a specific formula every single time, instead of starting significantly smaller than
797
00:53:05,959 --> 00:53:10,599
you plan and then, like, adding more over time as you get comfortable, spreading out the work
798
00:53:10,679 --> 00:53:15,159
and making it so that that workflow builds up to, oh, this last piece is easy. Because I did a bunch
799
00:53:15,159 --> 00:53:18,920
of little work leading up to it. And that's, I mean, I think a lot of people, a lot of people fail at
800
00:53:18,920 --> 00:53:22,199
personal websites and blogs because every time they want to write something, they have to redesign
801
00:53:22,199 --> 00:53:27,559
it. Exactly. Oh, and I've been a victim of that. But then they're like, use a template and just
802
00:53:27,559 --> 00:53:33,319
start writing. And even my newsletter, I've been writing my newsletter now for eight years,
803
00:53:33,319 --> 00:53:38,519
almost nine years now. And it's a consistent thing I do every single week. And it's been,
804
00:53:38,519 --> 00:53:46,519
how do you keep it interesting? I don't know. But like, but what I do is, is over time,
805
00:53:46,519 --> 00:53:51,480
I have a template that I always follow throughout the week. I have different links and I have
806
00:53:51,480 --> 00:53:56,199
different tools where, for example, I'm not paid by any of these tools. This is just what I use.
807
00:53:56,199 --> 00:53:59,719
Raindrop is a bookmarking tool that's cross-platform. It has browser extensions.
808
00:53:59,719 --> 00:54:04,360
It has phone things. Whenever I read an article that I think is good, I bookmark it in Raindrop
809
00:54:04,360 --> 00:54:08,119
and it's all just put in a newsletter folder. And then when I'm writing my newsletter, I just
810
00:54:08,119 --> 00:54:11,799
pull from that. Whenever I do something where I'm just like, oh, this could be good for the
811
00:54:11,799 --> 00:54:16,039
newsletter, I write it in like a scratch note that I can then put in the newsletter later.
812
00:54:16,039 --> 00:54:19,799
Whenever I see a funny joke, I'm like, ooh, we will add that to the newsletter sometime. And so
813
00:54:20,679 --> 00:54:25,719
I have a workflow where even though it still takes time to write the whole newsletter every
814
00:54:25,719 --> 00:54:29,960
single week, it's yeah, it's spread out. And I've developed enough workflows where
815
00:54:29,960 --> 00:54:35,239
I have the exact same formula that I follow that works well, even if I'm having a tough week,
816
00:54:35,239 --> 00:54:40,599
because it's kind of just- You know, it's funny. To take it all back full circle,
817
00:54:40,599 --> 00:54:46,759
I think that's the thing that AI will be good at is adding to people's workflows
818
00:54:46,759 --> 00:54:50,599
so they can be more efficient. I don't think it's ever going to be- Less context switching.
819
00:54:50,599 --> 00:54:56,759
Yeah. Yes. And I feel like that's the best part is when it's not making you switch contexts as
820
00:54:56,759 --> 00:55:02,279
much, but you're still recording that or putting it away or organizing that thing in the background.
821
00:55:03,160 --> 00:55:08,840
I think that's when it's going to be very- I had a similar workflow with my newsletter that I
822
00:55:08,840 --> 00:55:13,320
stopped doing because the workflow started to get difficult. But Pocket went away, which was my place
823
00:55:13,320 --> 00:55:18,040
to bookmark and find my things and make those notes. And then when Pocket wasn't there, I never
824
00:55:18,040 --> 00:55:22,519
really replaced it. I was just like, oh, all the workflows are gone now. And I use Pocket for years
825
00:55:22,519 --> 00:55:26,920
for so many things. I just had a habit of every Sunday I sat down and read my Pocket queue. And
826
00:55:26,920 --> 00:55:29,960
it was a great workflow because I was like, oh, I don't have to read this during the week. I don't
827
00:55:29,960 --> 00:55:33,800
have to keep the tab open because I know Sunday night I'll have some time and I'm just going to
828
00:55:33,800 --> 00:55:38,840
go read it. And now that hasn't been there. And I feel like I'm missing a big part of that,
829
00:55:38,840 --> 00:55:43,079
what used to be an easy workflow now is hard to say, where was that link? What was that thing
830
00:55:43,079 --> 00:55:47,559
I was trying to do? All that stuff gets more difficult. Yeah. Try Raindrop. It's nice.
831
00:55:51,400 --> 00:55:56,360
I have no idea. I have never tried. I was looking for a while to get something that was self-hostable
832
00:55:56,360 --> 00:56:00,920
so I could own the data, which then was like redesigning my blog. And it just ended up being
833
00:56:00,920 --> 00:56:07,400
like this big, long comparison. That's the danger. I actually just wrote a blog post where I
834
00:56:07,400 --> 00:56:12,680
was talking about photo backup, where I was going into such a rabbit hole of like, I want to use
835
00:56:12,680 --> 00:56:17,160
open source this, that way I can self-host that, do all these different things. And then I found
836
00:56:17,160 --> 00:56:23,559
an open source hosted solution that gets the job done. And I'm using that. And then eventually I'll
837
00:56:23,559 --> 00:56:28,199
upgrade to something else, but one thing at a time. And I feel like, yeah, accepting those
838
00:56:28,199 --> 00:56:33,320
intermediate steps is hard, but helps with those habits and workflows. That's actually really true
839
00:56:33,320 --> 00:56:37,000
because I feel like that's what gets, like we were just talking about when you get overwhelmed and
840
00:56:37,000 --> 00:56:41,639
you don't actually start, like that's the basis of so many things. Like I feel like my GitHub
841
00:56:41,639 --> 00:56:48,920
Pages website has been in shambles for like seven months. Maybe I should use Copilot on that. Like,
842
00:56:48,920 --> 00:56:53,159
can you fix this so I can get to the point I can actually add things? Like, yeah.
843
00:56:53,879 --> 00:56:56,759
My blog is open source. You can use that template if you want.
844
00:56:56,759 --> 00:57:01,079
Sweet. I might take you up on that. Because I just need an easy template.
845
00:57:01,079 --> 00:57:02,359
So I can actually post stuff.
846
00:57:02,359 --> 00:57:03,159
Get started.
847
00:57:05,000 --> 00:57:09,480
That's really cool that you did an open source template. Like I think it's rad that
848
00:57:09,480 --> 00:57:14,679
you can tell that you genuinely enjoy all this and you're like always willing to like share.
849
00:57:14,679 --> 00:57:17,799
Like I like following you on Bluesky because you're always like, and then I did this cool
850
00:57:17,799 --> 00:57:19,480
thing and it's never like gatekeeping.
851
00:57:20,199 --> 00:57:26,679
Thanks. I do that because I think it's important to pay it forward. Where enough people provided
852
00:57:26,679 --> 00:57:31,960
resources for me as I was learning and growing and doing things that I would rather give things away
853
00:57:31,960 --> 00:57:38,599
than be just like, and this is mine. It's good to learn from all these things. Plus then my future
854
00:57:38,599 --> 00:57:42,519
self learns from it too because I can't tell you how many times I've Googled something and my own
855
00:57:42,519 --> 00:57:44,760
blog post comes up. That's awesome.
856
00:57:44,760 --> 00:57:48,440
Right. Yeah. Write for yourself. Write the thing that you learned, write it down because you will
857
00:57:48,440 --> 00:57:49,159
find it later.
858
00:57:49,159 --> 00:57:53,240
And that's my favorite thing about community work is the paying it forward aspect because
859
00:57:53,240 --> 00:57:57,000
when people are like, I worked so hard, which we do work hard, don't get me wrong, but like,
860
00:57:57,720 --> 00:58:01,960
I don't think anybody would be there or be around or be successful without other people.
861
00:58:01,960 --> 00:58:04,679
Like I'm always super appreciative of all the people that have helped me.
862
00:58:05,800 --> 00:58:06,760
Yeah. It matters.
863
00:58:09,559 --> 00:58:13,800
Well, thank you so much, Cassidy. Yeah, this has been a lot of fun. I think we kind of went
864
00:58:13,800 --> 00:58:15,559
all over the place, which was great, but also-
865
00:58:15,559 --> 00:58:21,400
That's the best rabbit hole ever. We went from like laundry to like Terminator and like,
866
00:58:22,039 --> 00:58:23,320
I'm going to write some apps.
867
00:58:25,079 --> 00:58:28,119
When this comes out, Autumn, we're going to ping you again. And we'll say, everyone send
868
00:58:28,119 --> 00:58:31,480
Autumn a message and say, ask her if she did it when this episode comes out.
869
00:58:32,440 --> 00:58:34,920
That's a lot of accountability, Justin.
870
00:58:34,920 --> 00:58:37,159
But hey, maybe that's what you need. Maybe that's what you need.
871
00:58:37,159 --> 00:58:41,880
Probably it is. Like Justin will be like, did you do this thing for the podcast? Did you do that?
872
00:58:41,880 --> 00:58:46,840
And I'm like, Justin's- I was like, it's bad when I'm making Justin be the adult here.
873
00:58:50,039 --> 00:58:51,640
That's, yeah, it's fun.
874
00:58:54,360 --> 00:58:56,519
Cassidy, where should people find you on the internet?
875
00:58:57,079 --> 00:59:03,800
You can find me at cassidoo, C-A-S-S-I-D-O-O, on most things. cassidoo.co is my website.
876
00:59:03,800 --> 00:59:08,599
Or you could Google Cassidy Williams. There's a Scooby-Doo character named Cassidy Williams.
877
00:59:08,599 --> 00:59:11,400
And so I'm not her, but I'm the other one.
878
00:59:12,119 --> 00:59:14,360
Does AI ever confuse that? Is that a problem?
879
00:59:14,360 --> 00:59:18,280
Yeah, actually, no. She's really ruined my SEO, but that's okay.
880
00:59:20,200 --> 00:59:24,680
Also, I love your name, Cassidoo. That's like the most adorable thing. And you have it everywhere.
881
00:59:24,680 --> 00:59:30,200
Yeah, my mom made it up. She used to say, like, how do you do, Cassidoo? And it just stuck.
882
00:59:30,200 --> 00:59:30,599
Oh, perfect.
883
00:59:30,599 --> 00:59:34,280
Oh, that's adorable. It was already cute. And then you made it cuter.
884
00:59:35,000 --> 00:59:41,240
All right. Thank you everyone for listening and we will talk to you again next month.
885
00:59:42,519 --> 00:59:50,519
Bye.