255: AI in the Streets

You spend all day steering AI through code. Then you step outside and it's steering everything else. AI is listening to therapy sessions and suggesting treatments to your therapist. Your spouse is arguing with a chatbot about where Savannah, Georgia is. You call a company for help and get handed from one AI pretending to be human to another AI pretending to be human. The crew has been noticing it everywhere, and this week they compare notes on what it actually feels like when AI stops being a tool you chose and starts being a thing that just happens to you.

Links mentioned in the show:

Follow the show and be sure to join the discussion on Discord! Our website is workingcode.dev and we're @workingcode.dev on Bluesky. New episodes drop weekly on Thursday.

And, if you're feeling the love, support us on Patreon.

With audio editing and engineering by ZCross Media.


Transcript

Spot an error? Send a pull request on GitHub.

[00:00:00] Cold Open

[00:00:00] Carol: I called this one company and it's like, hi, I'm Amy, your virtual assistant. I'm like, let me speak to someone. She's like, where are you at?

[00:00:06] Carol: I'm like, telling her. And she's like, okay, how can I help you? I'm like, let me speak to someone. And finally it sends me through to what's supposed to be another person like to talk to. And every response is, oh, that's wonderful. Hold on, lemme look at that. And I'm like, this isn't even another human. I'm so pissed off now.

[00:00:24]

[00:00:44] Intro

[00:00:44] Adam: Okay, here we go. It is show number 255, and on today's show we've got two copies of Microsoft Outlook running on, wait, no, sorry, wrong thing. Did you guys watch the Artemis launch?

[00:00:54] Ben: No,

[00:00:55] Carol: Yes.

[00:00:56] Tim: The toilets didn't work and two copies of Outlook didn't work.

[00:01:00] Adam: Yeah. Their computer had two copies of Outlook. It was an issue

[00:01:02] Carol: And during the live feed shot, you could see him put in his pen to the laptop, or to his iPad, then

[00:01:08] Adam: Yeah,

[00:01:08] Carol: know that. Yeah.

[00:01:09] Tim: that's,

[00:01:09] Adam: I missed that part.

[00:01:10] Tim: that's too, too many copies of Outlook running on, on that space station.

[00:01:14] Adam: Agreed. But, so on today's show we're gonna talk about running into AI on the street. You know, not, not using coding assistants, just like, you know, where have we run into AI in the real world. But first, as usual, let's start with our triumphs and fails, the whole crew here tonight, as you may have heard. And Ben, I'm coming to you first.

[00:01:31] Adam: What do you got going on my friend?

[00:01:33] Ben's Fail

[00:01:33] Ben: I am gonna go with a failure, and this is a very soft failure, I think. I did some hand coding over the weekend with my digits. I know, right? I, I was playing around with some ColdFusion custom tags. I have a, um, a library that I'm sort of trying to build that helps me author email markup, which is notoriously hard, even apparently in today's age, where Outlook Desktop is still the third most used email client, as reported by Litmus.

[00:02:02] Ben: I have this library I'm trying to use, and, and I had a lot of fun doing it. It was one of these things where I would just stare off into space and think about what I wanted the code to look like and think about how I wanted the code to feel as I was writing it. And I was thinking about it in the shower, and I was thinking about it a little bit when I'm out driving. And the failure is that

[00:02:24] Ben: I loved it. I love that mind space and being lost in thoughts like that. And I, I haven't had that yet, that same kind of saturated, I'm marinating in the thought space type feeling when I'm building something with AI. AI for me is still predominantly like an angst based experience, which, I guess that's the, the meat of the failure.

[00:03:08] Ben: I want to get to a point, or I'm hoping that I can get to a point, where I can be building something with AI and have that joy of just having it permeate my thoughts and think about it when I'm not at my desk, the same that I, that I do when I'm hand coding.

[00:03:08] Adam: So you wanna like write a prompt and then go, I know kung fu.

[00:03:11] Ben: No, like I just, I just want it to be part of the fabric of being Ben.

[00:03:16] Ben: You know, I want, I, that's, that is, even if, even if like my identity has historically been very enmeshed with handwriting code, I am okay with the evolution of that, where my identity is becoming more about building products and thinking about products and dealing with people. But I just haven't gotten to a point yet where I'm feeling that immersive joy. I guess joy's almost not like the right word.

[00:03:51] Ben: It's just, I, I feel like handwriting code for me has always been just part of the ether. I'm not, I'm not articulating well, but I, I'm just, I'm just not there yet, even remotely close when it comes to building stuff with AI and I, I, I don't know if I'm holding myself back or maybe I'm just expecting that to happen and maybe it won't happen the same way.

[00:04:12] Ben: I'm, I'm not sure.

[00:04:14] Tim: I will say that the past few weeks I felt very engaged with kind of the process. So I kind of look at coding with AI as like being the teacher at an extremely gifted programming academy for kindergartners. Okay? And you got all, they're all so energetic, they all just wanna please you so bad.

[00:04:39] Tim: And like, so I don't even close it, I don't know the last time I've closed my command prompt. Like 10 command prompt windows right now open that I just keep like checking on. So I'll like, I'll start a little something over here. I'm like, hey, can you go, this, this is not pulling the correct policy and the amount, and go do this thing.

[00:04:59] Tim: And it starts beeping and booping. And then I go to my next tab and I just keep checking on all my little children. And then, you know, sometimes I'll, I'll, I'll stop and I'll look at what their, you know, 'cause it talks out loud. So you're like, wait, wait, wait. You're way off. Base control C, stop, stop, stop, stop.

[00:05:16] Tim: Right? And then you're like, you're looking in the wrong folder. Stop looking there. Look here. So then I'll, I'll get it back on track, and then I'll go check on the other kids. And then it's like, when it's all done, the whole day I feel like I'm herding cats, but at the end of the day I look back and I'm like, I got so much done. See, that's, that's kinda what gets me the joy, that it's done.

[00:05:35] Tim: Not necessarily that I hand coded it, but that I've had it do it. I've had it create it, I've had it tested and reiterated and tested again and again, writing documentation, telling it to go update the Jira, 'cause I hate Jira tickets. But now, now that I've found the Atlassian MCP server, I'm just like, go explain exactly what we did on this ticket and mark four hours worth of time.

[00:05:59] Tim: And I'm like, that's giving me the feels that I used to get from like writing one really nice, beautiful page of

[00:06:06] Ben: Yeah.

[00:06:07] Carol: I am kind of like you there. I get super excited building some features out and doing all the initial like research and I'm like, oh, I'm gonna get to implement this thing. But then when I look in the backlog and someone's already picked it up and started doing the work, I kind of like shake my fist and go, I wanted to write that so bad.

[00:06:23] Carol: Like that sounded like so much fun. And now I don't get to.

[00:06:28] Ben: So here's another aspect of it, and this is very particular to me. I think because I've done a lot of writing over my career and what I have noticed very clearly is that as I have moved into the AI world, not just that my desire to write things down has waned, but the inspiration to write anything down has like all but disappeared.

[00:06:55] Ben: And then that gets juxtaposed with: from handwriting a little bit of code over the weekend, I wrote two blog posts, one about kind of unifying the way some custom tag hierarchies work. And then I came across a bug, and I wrote about the bug in Adobe ColdFusion. And it's, it's like, I dip my toe back into the hand coding and suddenly I feel inspired to write again, because I'm running into problems.

[00:07:21] Ben: And then I write about running into those problems, and it's very deterministic stuff. And I can write up a demo and I can say, here's how the demo works, and here's the little fix that I could put into place. And the AI world is so non-deterministic that, even when I, even when I struggle and I come up with something that feels like it's alleviated that struggle, I don't feel confident to write about it in any way whatsoever, because I feel, one, so new to the whole concept, and it all feels like a black box still. Like I don't wanna write about it 'cause I don't even understand what it is that's breaking and what it is that's being fixed. And I don't know.

[00:07:58] Ben: So it, it's like I love the hand coding and everything that came with it and the whole ethos of it and the way my mind worked and the way I was inspired to write. And then I go back to the AI world and I like, I don't have any of that. And uh, that's, that's my failure.

[00:08:11] Tim: I, I'll push back on you here. So, I mean, early days, when you first started your blog that all of us were exposed to in the, in the early days of ColdFusion. What I do love about what you did is you were learning in public. There was a lot of stuff you didn't know about ColdFusion, and you did a lot of experiments, and you told us about those experiments.

[00:08:30] Tim: And I'm like, okay, cool. I didn't have to do the experiment you did. I could just take your takeaway, right? And that's what really made me admire your blog and want to meet you in person. So I, I mean, I think there's some correlation there. You're learn, you can learn in public about something. I mean, you don't understand the Java compiler that,

[00:08:50] Ben: Right.

[00:08:50] Tim: Macromedia wrote, right?

[00:08:52] Tim: So, or, well, actually Adobe put it in Java. But anyway, the C++ compiler, you don't understand all of that. But you understand, you know, you were learning in public, you were experimenting, you were reporting on your experiments. And I think there's a, there's a big space for that right now.

[00:09:06] Tim: 'cause I mean, AI is a complete experiment for us right now,

[00:09:10] Adam: Yeah, it's all new.

[00:09:11] Tim: yeah. So I, I don't see why you can't do that same thing with your writing in the blog, to help a new audience of people.

[00:09:21] Ben: It's a great, it's a great probing question, and I think maybe part of the issue there is that it's a different exploratory mindset. Meaning, I feel like I'm just constantly prompting the AI to do stuff, and if it doesn't quite do what I want, you know, I say, is there anything we can put into the CLAUDE.md file to help do that so we don't do it again?

[00:09:47] Ben: Or is there a skill I can create to do it? But it's all just like me constantly asking Claude to sort of figure it out.

[00:09:55] Tim: Mm-hmm.

[00:09:55] Ben: And I, I don't know, it just feels like it's, it's, I don't know. I'm going to take what you said, and I'm going to reflect on it, and I'm going to see what happens.

[00:10:08] Tim: Cool.

[00:10:08] Adam: Do you at least feel like you're getting some productivity gains from the AI stuff? You know, like you, you mentioned last week you spent a lot of time working on an app where you don't know the technology stack, right? It was like React and

[00:10:18] Ben: Yeah, I, so I'm getting a lot done, but it's just not rewarding in the same way. I hate to say that. I hate to say that. It's, it's like, it's not that it's not rewarding. It's more like,

[00:10:33] Ben: I'm thinking of changing the title of this episode. This is just gonna be therapy, right? I think, I think this is gonna be

[00:10:39] Ben: I, it's, it's, it's,

[00:10:42] Ben: I think the app was built. I don't know.

[00:10:45] Adam: Let's,

[00:10:46] Ben: Yeah. Let, let's, let's move on, and, and we'll come

[00:10:50] Tim: the other day.

[00:10:51] Ben: So that's my failure. Carol, what do you got going on?

[00:10:55] Carol's Triumph

[00:10:55] Carol: Well, mine's gonna be a much simpler answer and I'm gonna go with a triumph and you guys will be very excited to know. I saw a movie that is current.

[00:11:04] Ben: What?

[00:11:05] Carol: Yeah, in the theater. Yeah, in the theater. Granted, I went to watch it the weekend after it left IMAX because some stupid Luigi movie took its spot. But I went and saw Hail Mary, so

[00:11:18] Ben: oh yes.

[00:11:19] Carol: the book, I loved it.

[00:11:20] Carol: I loved it so much. It's such a good movie. I will definitely be rewatching it.

[00:11:25] Ben: It was a lot of fun, I have to

[00:11:27] Carol: Mm-hmm.

[00:11:28] Adam: Okay, so you, you were a read-the-book-first person, right?

[00:11:31] Carol: Yeah. Twice. Yeah.

[00:11:33] Adam: And how do you think that that colored your experience of the movie?

[00:11:37] Carol: So when you have ADHD, you tend to predict lots of things in movies already. So it just made it worse for me, because I knew what was about to come. So I was just constantly like, oh, when's that gonna happen? And when's this coming up? And when's that gonna be exposed? And at one point Steve looks at me and goes, I need to pee.

[00:11:56] Carol: I was like, you have like four minutes. I was like, I guarantee you you're gonna wanna be back within four minutes. So make it quick.

[00:12:03] Carol: But yeah, so saw a movie and I'm very proud of myself for going to do it 'cause I tend to be way behind you guys on current social like events in the media.

[00:12:13] Tim: You got real stuff to do.

[00:12:15] Carol: Yeah. Thing called light.

[00:12:17] Tim: Yeah, we have them.

[00:12:19] Carol: I enjoy baking sourdough.

[00:12:23] Ben: I usually go to this little local movie theater and they only ever show one trailer. So for the past, like two years, that's been my context for watching movies. You have one trailer and then you jump in. And when I went to watch Project Hail Mary, I, we had like half an hour of trailers or

[00:12:40] Tim: Yeah. That's my usual experience at

[00:12:42] Ben: And I was so panicked because it's a two and a half hour movie and I got the, I got the baby bladder and I was, I was like, you're killing me here, guys.

[00:12:51] Carol: you didn't have anyone to tell you when to go pee.

[00:12:54] Ben: No, I asked Adam, before I went, I said, Hey, is there a thing you can describe that is the right time to go pee? And he said, no.

[00:13:06] Adam: Well, no, I think my exact words were bring an empty bottle and a jacket to put over your legs.

[00:13:13] Carol: Well, like, talking about the runtime, though: in the previews, we were trying to book a dinner reservation for after. I was like, okay, it's a two-hour-37-minute runtime, but we're not gonna watch the credits. And then how much is it for previews? I was like, we can't book dinner reservations for three and a half hours by the time we drive there.

[00:13:32] Carol: I was like, this is a long movie and a long process. It should be, you know, it's almost like boarding a plane. It's like, hey, your flight's at, you know, 11:35, but we're gonna board at 11:05, so you really have to be at the gate at 11:05. Like, just tell me the time it starts. Just tell me when I need to be in my seat, not the show-up-early time.

[00:13:50] Carol: Yeah.

[00:13:52] Tim: I, I know, so Ray Camden wrote one, but there's also another website. It's called RunPee.com,

[00:13:57] Ben: Oh, that's

[00:13:58] Tim: tells you when to go pee during the movie. Like when you,

[00:14:00] Carol: No way.

[00:14:01] Tim: you gotta go RunPee.com, you, you could, uh, it will tell you so,

[00:14:07] Carol: We'll have to check that out in 10 years when we wanna go watch another movie again.

[00:14:12] Adam: Uh,

[00:14:12] Adam: I'll say that. Uh, uh, this is no spoilers, I think, here, but I went to pee, I think it was when some language translation stuff was being worked out, and I feel like that was the best time to go.

[00:14:25] Carol: Yeah. Yeah.

[00:14:26] Adam: Really? Ugh.

[00:14:28] Carol: don't know that

[00:14:29] Ben: I mean,

[00:14:29] Tim: see for me and Adam, that was like my favorite part of the book. So I really wanna see how they did it in the

[00:14:33] Carol: I

[00:14:33] Ben: it was a fun, it's, I didn't miss it entirely, but, you know, I missed two minutes of it and I, and I felt okay with that.

[00:14:40] Tim: you, you angered

[00:14:41] Adam: we need to move on, or I'm gonna

[00:14:43] Ben: Okay.

[00:14:45] Carol: Well, that was me, Tim. What do you have?

[00:14:48] Tim's Triumph and Fail

[00:14:48] Tim: Well, I'm gonna double dip today. I'm taking a triumph and a fail. The triumph is one that's been in the making since Feb, middle of February. It's now middle of April. From the procedure for my retina, the bubble is finally 100% gone. I woke up Sunday morning

[00:15:04] Tim: and the bubble is gone. I can see fairly well out of my right eye.

[00:15:10] Adam: Rub it in.

[00:15:11] Tim: There's still, there's still a little, there's still a little gas there, but it should clear up. So I go to the, my last visit at the Emory Hospital in Atlanta is tomorrow, Wednesday. And so hopefully I get a clean bill of health and I can finally put this chapter of my life behind me, and I can shave off my goatee.

[00:15:31] Carol: You, do you feel like you're gonna have to get new glasses, or do you feel like your

[00:15:34] Tim: Oh, I'm definitely gonna, I mean, if this, if it doesn't, yeah. I'm sorry to interrupt, but if it doesn't clear up, I definitely need a new prescription in my right eye. It, it's so much worse. But they did tell me it would take a full year for it to completely heal physically,

[00:15:49] Carol: Wow.

[00:15:50] Tim: I might have to, I might have to get a prescription and then go back again and get another prescription 'cause my, it can completely

[00:15:55] Ben: Bonkers.

[00:15:56] Adam: Can I make a suggestion? Just get like a pirate eye patch instead.

[00:16:02] Tim: I can see if I, I mean, I'm enjoying actual, you know, binocular depth of field vision finally. It's amazing.

[00:16:14] Ben: That's awesome though, dude.

[00:16:15] Tim: yeah, yeah. It's been, it's been weird. You know, it's like, most of the time when you're sick, you hurt. I, I didn't ever hurt,

[00:16:22] Adam: Yeah.

[00:16:23] Tim: but on the inside my emotions were hurt.

[00:16:26] Tim: I just felt like a, it's odd. Just like, your body breaks down and you realize just how complex and amazing bodies are, but at the same time you're like, I felt like a failure, even though it's like, it's not my fault, right? It's just bodies being

[00:16:40] Ben: Right?

[00:16:41] Tim: so

[00:16:41] Ben: so stupid.

[00:16:42] Tim: they're so stupid.

[00:16:43] Ben: The

[00:16:44] Tim: So that's, that's my triumph, my failure.

[00:16:48] Tim: So I was talking to my parents about something, I don't remember what it was, and I was talking about some website domains I had sold in the past. 'cause oh, I know what brought it up. I was telling them about how I was using AI. I have all these domains, I have tons of domains I've been sitting on for years, I don't do anything with.

[00:17:03] Tim: But now, with AI, it's so easy. I can create a, a Hugo static-generated site in probably 30, 40 minutes. And so I was going through my, my different domains, and one of them I had put up for sale a long time ago, back in like 2015. And so I went to look at Sedo, Sedo.com, which is a domain site that does domain auctions and holds it in escrow, to make sure that, you know, no one steals your money and you don't steal the domain and not give it to them. I'd sold, I'd sold different ones through there over the years.

[00:17:39] Tim: And I looked at some of my messages going back to 2018. And so someone had offered me $60,000 for the domain.

[00:17:48] Ben: Dang

[00:17:49] Tim: Someone offered me $80,000 for the domain. And then the final one was, someone offered me, in 2018, $100,000

[00:17:58] Ben: what

[00:17:58] Tim: for hairstyles.net. And I missed it. I didn't, I don't know what, I don't know what happened.

[00:18:03] Tim: I got notified, I never saw it, and never acted on it. 'Cause trust me, I would've pulled the trigger on

[00:18:08] Carol: mm-hmm.

[00:18:09] Tim: thousand dollars. And when I saw that, I was so sick. I texted my dad, I texted my dad. It was like, hey, you know that conversation we had where I sold a domain for like, you know, $20,000?

[00:18:21] Tim: I'm like, yeah, I missed one for a hundred K.

[00:18:25] Carol: Oh, wow.

[00:18:26] Tim: So I, I used my newfound AI skills to create a placeholder page. So if you go to hairstyles.net, it will redirect, it will tell you it's for sale. You click a button, it takes you to Sedo, where you can make an offer. And the minimum bid is a hundred grand.

[00:18:39] Tim: If, if people, was it eight years ago,

[00:18:41] Carol: Yeah,

[00:18:42] Adam: Mm-hmm.

[00:18:42] Tim: if eight years ago would do a hundred thousand, I probably can get 150.

[00:18:47] Adam: Maybe, yeah.

[00:18:49] Tim: So

[00:18:51] Adam: So one of these days Tim's gonna take us all out for a fancy testicle dinner

[00:18:54] Tim: there we go.

[00:18:56] Carol: and we're hairstyles.

[00:18:58] Tim: It's gonna pay off my house.

[00:19:01] Adam: yeah. Really.

[00:19:03] Tim: So that's my triumph. That's my fail. How about you, Adam?

[00:19:06] Adam's Triumph or Fail

[00:19:06] Adam: So I don't know what to call this one. You know, I'm just not sure yet. The jury's still out on whether this is a triumph or a fail, but I have some interesting stuff to share. You know, we've talked a lot over the last several weeks, maybe several months, about our usage of Claude and Claude Code. And I have in the past dabbled with like ChatGPT, mostly just the free one, and some of the other models that are available, like through OpenCode, for free.

[00:19:30] Adam: But I, I've, I just keep hearing people say that like the, the new ChatGPT Codex models are pretty good. And as we're having this conversation over the last, like, two weeks, Anthropic has just really doubled down on being hardheaded about not releasing their stuff as open source. Not the model itself, but, you know, the, the tooling and stuff around it.

[00:19:53] Adam: Like, for example, the OpenAI

[00:19:55] Tim: Until their, until their code got leaked.

[00:19:58] Adam: Until it leaked, yeah. But even then, they, you know, they, they sent out a bunch of DMCA takedowns for people. Like, people reconstituted the source from the source maps and stuff like that, and they sent out a bunch of legal crap after that. Anyway, but like, you know, there's that, and they just refuse to answer questions. Like, so we've talked about Matt Pocock, who makes his living selling courses.

[00:20:22] Adam: Right. He came onto my radar from selling TypeScript courses, and now he's doing stuff with AI. He has a course he's selling right now, I believe it's called AI Hero, that is very heavily built around Claude Code. And like, you know, he teaches how to do specific things with the Claude harness, or the Claude Code harness.

[00:20:42] Adam: And he has, he's always been a very positive, non-judgmental person. And I saw some tweets that he posted where he was like, you know, just, he finally lost it. Like, he seems like he's had it. He's, he

[00:20:54] Carol: Uhoh.

[00:20:55] Adam: asked, like, weeks ago, for clarification, like, can I tell people how to do this? And OpenAI, or, not OpenAI,

[00:21:01] Adam: Anthropic just refuses to answer. They just ghost him. And it's, it's just ridiculous. 'Cause like, I, I don't think it's, I haven't heard of it happening recently, but in the past, Anthropic has banned people from using their models for doing things, like using it in OpenCode or, you know, finding ways to get different models to talk to each other in like conversational modes and stuff.

[00:21:24] Adam: And, you know, none of us is a lawyer. We can't read the terms of service and then, you know, have to like back that up in court, right? Like, and, and people just want a, a clear explanation, or, barring a clear explanation, at least a direct answer that is in writing. Like, can I do this, yes or no? And they, like, refuse to do that.

[00:21:44] Adam: And so I've been getting increasingly frustrated with Anthropic. And combine that with hearing that Codex is getting pretty good, and combine all of that with continuing to struggle to try to get my work done that I'm trying to get done,

[00:21:58] Adam: because I'm bumping into my, my usage limits on the a hundred dollars a month Claude plan. So finally I was just like, okay, it's time I need to try out Codex. I bit the bullet. They don't have a hundred dollars a month level, so I signed up for the $200 a month level. And this was like a little bit more than 24 hours ago, right?

[00:22:14] Adam: So like midday on Monday. And it's stupid.

[00:22:19] Tim: It's

[00:22:20] Adam: It's, I mean,

[00:22:21] Tim: when it comes to coding, it's stupid.

[00:22:23] Adam: it's decent, but when you're coming from months of being a very heavy Claude user, and just seeing the, like, certain things, ChatGPT, Codex, whatever you call it, let's just call it Codex. Certain things, I can see it digs a little deeper, it tries a little harder, but it still makes stupid decisions.

[00:22:44] Adam: And, and I'm, I'm very focused on this one migration task that I've been working on. Like, did the migration, now I'm working my way through a bunch of automated code reviews, and like review, fix, re-review, re-fix, re-review, re-fix, that sort of cycle. And it's just, I, I, it's made me incredibly frustrated. I do think part of it might be the harness itself.

[00:23:05] Adam: Like I started with the Codex CLI app,

[00:23:09] Ben: Is that like Claude Code Desktop?

[00:23:12] Adam: No, it's like Claude Code, in the, in the CLI, the, the terminal version.

[00:23:17] Ben: Oh, so.

[00:23:18] Adam: separate desktop native

[00:23:20] Tim: you just type codex and it runs.

[00:23:21] Adam: Right. You type codex in your, in your terminal and it runs, and it's very similar to Claude Code or OpenCode. And, you know, like, visually it's fine, it works okay. But like, simple stuff, like, Claude was really good at helping you customize your Claude Code installation, right?

[00:23:39] Adam: Like you and I, Ben talked a little bit about customizing my status line, right? Like I had it show my current Git branch and like had a progress bar for the context window and stuff like that. And there is stuff in Codex, the CLI client, to do similar things like the, the status line in particular, right?

[00:23:57] Adam: So like they do have a slash status line command, and that pulls up like a, just a checklist, and you can say, show this, show this, don't show that, don't show that. It's a little bit less customizable. It's not so much like, it doesn't just allow you to put whatever you want in there, but you can turn on and off different things.

[00:24:13] Adam: And I just asked it, like, okay, using your official client, and I'm asking your official model, can I do this? And it's like, no, sorry, it's not possible. And then, you know, I did some Google searches. Their documentation is terrible, it doesn't explain how to do this. And then finally I just tried, like, slash status line.

[00:24:30] Adam: It's like, oh, okay. I can just toggle it on. It was right there. Like it seems so simple that it should know this, but it didn't.

[00:24:36] Ben: Yo, I, I have to say, Claude Code, from, from the other side, in the last two weeks it has gotten really randomly janky, specifically with knowing how to do its own things. So I have a couple of skills, and they're skills that only I should execute. Meaning that skills can either be auto-discovered by Claude, or they can be manually invoked, or a bit of both.

[00:25:02] Ben: And you can set it to, I, I forget what the right flag is, but in the, in the front matter, you can do like model invocation disabled: true, to say the user's the only one who can run it. So I will try to run the command, and then Claude will say, okay, I see you wanna run a command. What would you like to do with it?

[00:25:20] Ben: And I'm like, run it. That's what it's there for. And it would go, okay, unfortunately, I can't seem to run it because you have the model flag disabled. And I said, yeah, I know, I'm trying to run it. You don't have to discover it. And it says, well, if you just turn that flag off, then I'd be able to run it for you.

[00:25:37] Ben: And I said, okay, then what the heck is the point of that flag in the first place? And then he goes, oh, let me do a web search. And then it comes back, like, oh, let me read the Claude Code best practices guidelines. He goes, oh, you're totally right. That's exactly what that flag is supposed to be able to do. Something must be going wrong.

[00:25:52] Ben: And it's like, and then I'll try it again, and it works perfectly. And something happened like two weeks ago where it's just become like, it doesn't even know how to use itself. It's really weird.
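For anyone following along at home, the setup Ben is describing is a Claude Code skill: a markdown file with YAML front matter, where the flag he couldn't recall is, as far as we know, `disable-model-invocation`. A minimal sketch, with the skill name and body invented for illustration:

```markdown
---
name: deploy-notes
description: Summarize the current branch's changes for a deploy announcement.
# Hypothetical skill for illustration. The flag below is the one discussed
# in the episode: with it set, Claude should not auto-discover the skill,
# and only an explicit user invocation (e.g. /deploy-notes) should run it.
disable-model-invocation: true
---

Summarize the commits on the current branch in three bullet points.
```

Check Anthropic's current skills documentation for the exact field name and placement, since these details have been changing quickly.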

[00:26:03] Adam: Yeah. And it's been buggy lately. There, there's another thing that was kind of pushing me away, right? So, I dunno if you guys have noticed, there's been a lot of buzz on Twitter and on Reddit and, and various places about people feeling like their quota runs out really, really fast. You know, I'm doing the same. You know, I, I have literal scripts that I run, not necessarily deterministic, but, you know, very consistently low token usage, like, to do the code review stuff.

[00:26:32] Adam: And before, where it would take maybe two to two and a half hours to burn through my five-hour session quota, if I were to just, like, start it at the beginning of the session and let it rip, you know, it would burn through the entire quota in about two to two and a half hours. And then there was a day where I burned it all out in like 45 minutes, and I was like, that's bananas.

[00:26:49] Tim: could this be the surge pricing? 'cause they've started surge pricing.

[00:26:52] Adam: No. Well, I think that might have had a little bit to do with it. But, I don't, I, maybe. They, they didn't announce that until like two days later, after I started noticing this stuff, which is another thing. It's just, you know, again, we're just adding fuel to the fire of, like, things that were pissing me off about Anthropic.

[00:27:10] Adam: It was like, instead of announcing a change ahead of time and then giving you time to prepare for it, they just make the change and then announce, oh, this is what everybody's been whining about for the last week.

[00:27:20] Tim: Yeah.

[00:27:22] Adam: You know, their claim was, to, to kind of clarify what Tim was saying, that it wasn't necessarily surge pricing, but it was like surge rate limiting, right?

[00:27:31] Adam: So during peak hours of the day, like during the, the basically US business hours, your session quota would be

[00:27:40] Tim: 10:10 AM to 2:00 PM something like that, or maybe nine to two.

[00:27:45] Adam: I think it was until 2:00 PM Pacific, so that would be 5:00 PM Eastern. Anyway, and I think it was, yeah, I don't know, whatever. But the way I interpreted it, which is inferring some stuff, 'cause they didn't, of course, Anthropic, they're not just gonna give you a straight answer. But what I inferred from it was that they, they're not changing your weekly quota, but during those hours, your five-hour session quota was greatly reduced.

[00:28:10] Adam: So basically they're just kind of forcing you to work outside of business hours if you wanna make the most out of your subscription, which again, pissed me off. So, yeah. And so, I have been kind of harsh on Codex here. I do want to say I have tried it in a couple of different harnesses, which is one of the great things. One of the things drawing me away from Anthropic is that with Codex, they're more permissive in what they allow you to use it with.

[00:28:37] Adam: For the last like three or four hours I've been playing with it in OpenCode, and I'm liking that interface a lot more. I find that everything seems to be gray on gray on gray, which is annoying to me. I like a little bit of color. But of all the gray on gray on gray interfaces, OpenCode seems to be the most scannable, like I can actually look at it and make sense of it.

[00:29:00] Tim: Mm-hmm. So I'll kind of, I think, reinforce some of your assumptions based off my experience with the two. 'Cause I mentioned before, I have two subscriptions, so I'll pit them against each other, and they kind of expose, based off their answers, what their strong points are and what their weak points are.

[00:29:20] Tim: So I think Anthropic went the route of being very business focused, very developer focused. You give it a problem and immediately it wants to start spitting out Python scripts, right? Whereas OpenAI, ChatGPT, tends to be a bit more general user, high level things. And so when I would pit these against each other, I would notice that Claude would immediately get down to implementation details without spending enough time in the planning stage.

[00:29:52] Tim: Whereas ChatGPT, OpenAI, would spend a lot more time researching public information. And so using the two of them back and forth really helped, 'cause OpenAI caught a lot of assumptions that Claude didn't. But I think OpenAI is pivoting. They got rid of their video generation, what was that thing

[00:30:19] Adam: Oh, Sora.

[00:30:20] Tim: They got rid of that. I think they're seeing that Anthropic is actually making money, and they're not.

[00:30:27] Adam: my, my theory on that is that they need the GPUs and Sora wasn't making any money, and so it was just wasting money.

[00:30:33] Adam: And they need those GPUs for training or whatever.

[00:30:35] Tim: Yeah, but I think they're gonna pivot more to the business side of things, 'cause that's the people who really... I mean, a mom and pop, an individual user, they're maybe gonna spend 20 bucks a month, 30 bucks max. Whereas a company who's making money off of building things, they're gonna spend $200 per user happily if they get a good return on it.

[00:30:59] Tim: And

[00:30:59] Adam: they're likely to have minimum three, four users, if not

[00:31:02] Tim: Right. And our company, yeah, we got like 50 something. So Anthropic has really kind of cornered that market, and OpenAI, I think, is trying to push into that, but they haven't got there yet.

[00:31:15] Adam: Well, I guess to kind of wrap up here, I said I was being kind of hard on the OpenAI stuff, and I do think that my frustration is primarily coming from the model. The tools, the openness, the open source of their tooling, I feel like it's decent enough.

[00:31:35] Adam: In some ways I feel like it may be lagging behind Claude Code, but that could just be because I have two months or more of Claude Code experience and 24 hours of Codex experience. But the models will get better, right? So if everything else except the model is equal or better on this side of the fence, right?

[00:31:55] Adam: The grass is always greener sort of thing. But if everything else is good over here and it's just the model that's a little lagging behind, then maybe I should stick it out. But I will say I had a period today, I even copied a series of my prompts into our Discord, where I was just getting more and more and more furious with

[00:32:14] Tim: saw that.

[00:32:15] Adam: Oh my god. To the point where I told Codex, I've been working with you for less than a day and I already want to cancel my subscription. You are awful. But like I said, I do think it's the model, and I do think the models will continue to get better. So, you know, you can't just... I mean, some people I'm sure do bounce back and forth between whoever has the best model on any given week, but,

[00:32:37] Tim: Brian,

[00:32:40] Adam: but yeah.

[00:32:41] Adam: Anyway, so let's, let's wrap it there. We've been doing triumphs and fails for quite a long time now. So let's, uh, let's move on to the topic of the day, right?

[00:32:48] Tim: more AI.

[00:32:49] Ben: Yeah, what are we talking about?

[00:32:51] AI Beyond Coding

[00:32:51] Adam: Uh, well, you know, AI beyond coding uses, right? Like, where are we running into it in the real world? That's kind of, I think that would be an interesting discussion to have and,

[00:33:00] Tim: The, the non, the non programming world?

[00:33:02] Adam: Yes. Yeah, we're real people too, if you ask us. In our opinion. But my wife is a therapist, and she uses a tool to do her electronic medical records.

[00:33:14] Adam: And insurance companies require like a specific way that you format your notes for, you know, getting paid for your sessions that you do and stuff like that. And so at dinner tonight we were talking about, the tooling that's available to her in her, like the AI specific tooling that's available through her tool suite.

[00:33:32] Adam: I'm trying to walk around the words; I don't wanna name drop any particular tools or anything. But she uses a system to take notes, and basically the thing that she's been making use of is, you know, insurance companies are very particular: they wanna see these words in the note, or they wanna make sure the note covers this particular aspect of the treatment plan, that sort of thing.

[00:33:55] Adam: And, you know, therapists don't get billed for, or don't get paid for, the time they spend writing notes. They get paid for the time that they're actually talking to their clients.

[00:34:05] Tim: I pay for an hour and they give me 45 minutes, so

[00:34:09] Adam: that's, yeah. It depends. Every therapist does it kind of differently or, or every company. I think my wife charges for 50 minutes and, and sees them for 50 minutes. But

[00:34:17] Tim: okay.

[00:34:17] Adam: Either way, whatever. Yeah, so the tool that she has, she just kind of puts in an outline of notes, or a summary or shorthand of what she wants to say, and then there's like an AI button in it, and it takes the note that she has written and kind of fleshes it out to be a full, professional

[00:34:35] Adam: note with everything that the insurance company wants to see, and she likes that. But then we got into, well, the same tool, the same system, has a new feature now where you can let it listen to the sessions. Like it'll record the audio of your sessions and it will automatically take the note.

[00:34:54] Adam: And she hasn't used it, but we kind of talked about what that could enable, right? It could offer some advice, not to the client, but to her as the therapist. Like, uh, you might not have picked up on this, or, have you considered this kind of a treatment? That sort of thing, which would be interesting.

[00:35:13] Adam: But there's huge privacy concerns

[00:35:14] Carol: Huge. Yeah. I'm a big no-no on that one. Like

[00:35:18] Carol: So when we've been through this training, they'll try showing us different tools that we can use, and they're like, oh, did you know you can just speak to it and have it record for you and give you answers? I was like, the last thing I want is my laptop recording more of what I'm doing. I don't even want it to hear me when I'm talking outside of work hours.

[00:35:38] Carol: Right. I'm like, no. If my therapist was like, may I record this session with audio? I'd be like, I'm finding a new therapist. Like, there's no way I would want that to be a thing. But I think of the privacy side of it, like, who's gonna use that data for what now?

[00:35:50] Tim: Right.

[00:35:51] Carol: Yeah.

[00:35:53] Tim: Yeah. I had a therapy session yesterday, and it's online, so one of the options was, do you allow recording? And I clicked no. 'Cause, you know, I don't want them throwing that into AI and then it gets leaked. I mean, you never know. It's like, no one thought when 23andMe came out that data about your DNA and your genetic disposition would eventually be bought

[00:36:16] Tim: 'cause they go out of business. Once you give permission to something, you can't get that back

[00:36:22] Carol: yep, I agree.

[00:36:22] Adam: Once you give up the data, there's no taking it back.

[00:36:25] Security Breach Fatigue

[00:36:25] Ben: Yo, sorry, quick side tangent for two seconds. One thing that I've noticed in this AI world is I feel like people have become extremely lackadaisical about security issues. Whereas you go back five, six years ago, if there was any kind of a big breach, people would be like, oh, well that's the end of that company.

[00:36:49] Ben: I mean, it was never the end of that company, but people thought about these things like, this is catastrophic, you've destroyed trust with your users, et cetera, et cetera. And now, several different podcasts have been talking about the Claude Code thing, you know, how they accidentally shipped their source maps and exposed all of their agent code.

[00:37:09] Ben: People are like, yeah, it's no big deal. And you're like, it's no big deal that a company that's raised $300 billion...? I'm just saying

[00:37:18] Adam: of their customer

[00:37:18] Ben: No, no, no, no. I'm just saying, I feel like the temperature of the programming world has been like, oh yeah, bad stuff happens, no big deal. And if you went back five, six years ago, I think it was very different.

[00:37:33] Tim: yeah, I

[00:37:34] Adam: think you're right. But I think it's a good thing.

[00:37:36] Tim: I don't know if it is. So, 20 something years ago there was a company called TransUnion that we were a customer of, that did credit scoring and things like that. You now know them as Equifax, and the entire reason they changed their name is because they got breached and they leaked their customer data.

[00:37:55] Tim: And so they had to do a very expensive corporate shift to make it look like they went out of business, basically. They didn't; they just became another brand. And now, like you're saying, their stuff gets leaked and they just go, yep, sorry, that's embarrassing.

[00:38:13] Tim: And then nothing ever happens.

[00:38:15] Ben: It's the way people react to the OpenCode stuff, right? So OpenCode became super popular, and then it was like, oh yeah, all these people's API tokens are getting leaked. And Andrej, what is it, Andrej Karpathy? You know, the vibe coding guy. His API tokens were leaked because he signed up for OpenCode.

[00:38:33] Ben: And again, I feel like you go back a couple years ago and that would've been it; no one would've ever talked about this project again, because it was so terribly insecure. And people today are just like, oh, how hilarious is it that people are opening up their whole worlds to these totally insecure engines.

[00:38:52] Ben: Oh, let's just have that company get acquired by OpenAI or

[00:38:56] Tim: Well, I, I think it's, there's no government control. There's no repercussions for it, so what can people do,

[00:39:01] Ben: No, no, I a hundred percent. I'm just saying like the numbing that seems to have happened is shocking to me.

[00:39:07] Adam: So the OpenCode thing is interesting. Honestly, I put that in the category of people who probably stand to financially gain from the AI hype train continuing on the tracks, right? Anything that's going to make it start to look like a train wreck has probably got a lot of people that stand to lose money from it.

[00:39:27] Adam: And so, you know, if it's their personal investments, or if they helped fund any of these companies, like VC money type thing, anything like that. But going back to my earlier comment about, I think it's good: obviously I don't think it's good for the credit bureaus to leak their customer data.

[00:39:47] Adam: That's not what I was getting at. But more like, companies make mistakes, people make mistakes, and barring something horrible, like leaking all of our customer data and having that contain incredibly sensitive private information... Right. If I leak your address, yeah, that's crappy.

[00:40:05] Tim: But it's also basically public record, right? Like, if I know where to look, I can just go get that, so I might not even have to pay for it. I had to explain to my kids what the White Pages were. I said, basically, it was a big giant book, a big giant book that doxed everybody in your neighborhood.

[00:40:20] Adam: Yeah.

[00:40:21] Ben: That's

[00:40:22] Adam: they, they, they gave it to you for free, whether you wanted it

[00:40:25] Tim: Exactly.

[00:40:27] Carol: It's not like OPM so many years ago, when our background checks were leaked. Right? Like we had a breach, and fingerprints were taken. Like

[00:40:37] Adam: Mm,

[00:40:37] Carol: of that stuff. Like, this is everything you need to make another copy of you.

[00:40:43] Adam: Yeah.

[00:40:44] Carol: Yeah.

[00:40:45] Adam: So, but like, you know, I don't know what kind of data I'm talking about, but if we make a mistake, if we ship a bug and it causes a bunch of people a lot of grief and maybe they lose a little bit of money, I think people are more forgiving of that now than they were five years ago.

[00:41:03] Adam: And I think what that represents to me is kind of two things. People are better understanding that not every website you go to on the internet is run by a mega corporation with a billion dollars available to spend on testing their software. And two, people are understanding that, and that there's, you know, there's

[00:41:26] Tim: but it's, it's the billion dollar companies that are getting breached.

[00:41:29] Adam: True. I, I can't, I'm not gonna argue.

[00:41:31] Tim: argue. I mean, those are the ones that get in the news. If our company got breached, you know, only our corporate parents would care; it wouldn't make the news.

[00:41:40] Adam: Yeah.

[00:41:41] Ben: I feel like the slippery slope has been: the AI makes mistakes, yeah, but people make mistakes. And then somehow that became, mistakes are okay in general. Which again, is not to say that we don't all make mistakes and bad stuff happens and we try to learn from it, a hundred percent. That's the reality of the world.

[00:42:00] Ben: That's the reality of product development. But I feel like, in order to allow for having these agentic systems that make mistakes, we've had to soften our stance on mistakes. And in some ways maybe way too much, to paint it as if all mistakes are equal, and they are not. Anyway.

[00:42:20] Adam: Yeah. No, I, I, I see where you're coming from. I don't know that I have anything to add to that.

[00:42:24] Tim: So, anybody else got any AI in the streets, LLM in the sheets stories?

[00:42:29] Adam: I was waiting for that.

[00:42:31] Carol: Heck yeah, I do.

[00:42:33] AI in Everyday Life

[00:42:33] Carol: So, I haven't really talked about it on the show yet, but I guess I can kind of talk about it. Um, we're PCSing again, so we're about to make another move across the country. That's like a four night trip we'll be doing, and my husband thinks that all of a sudden the best way to get our route is to ask Claude to generate a map image of where we should stay, which is military bases along this route.

[00:43:00] Carol: First, let me just let you know, it doesn't know where Savannah, Georgia is. Ten different tries. Dropped the pin near Destin, Florida, which is

[00:43:12] Tim: Yeah. Nowhere near here.

[00:43:13] Carol: and a half, five hour drive away. And my husband's going, well, according to Claude, we can make it in three days and this is how many miles it is. And I said, lemme look at that map.

[00:43:25] Carol: And instead of just opening Google Maps and generating a route, it's regenerate the image, regenerate the image. And I'm going, sometimes AI isn't the answer.

[00:43:37] Tim: use

[00:43:38] Carol: We're not going to use it. Yeah, we're not going to use it. I'm just gonna tell you all the bases: they're along I-10 and I-20, and you can make the decision.

[00:43:46] Carol: Do you wanna go North Louisiana or South Louisiana? Then we'll pick where we're staying. We don't need Claude for this.

[00:43:54] Tim: That's funny.

[00:43:56] Adam: It's like me with magnets, man. If, if there's a way to solve a problem using magnets, I will. I'll be all about that.

[00:44:03] Carol: Well, I even asked him, I'm like, do you have a concept of tokens? Like, do you understand you are limited on what you're allowed to use, and you're just regenerating the same information over and over, and not telling it how to fix it? You're just saying regenerate, regenerate. Not like, oh, that's not Savannah, that's Destin, Florida.

[00:44:25] Tim: right.

[00:44:26] Ben: That's frustrating though.

[00:44:28] Tim: I, I

[00:44:29] Ben: It makes mistakes, Carol.

[00:44:31] Tim: It does. And I use ChatGPT. So when I created all my recipes for our wild game dinner a few weeks back, I had to scale them up, right? 'Cause there's 20 diners. And I just took it at its word, like the amounts it gave me were legit. And when I had the other chefs create these salads and different side dishes, it was enough to feed 100 people. It was ridiculous. I had this huge vat of cucumbers, it's like this koji cucumber salad thing that was gonna go on the side. I told them, it's just a little tiny bit on the side, it's not a full plate. Yeah, we threw away probably like four pounds worth of that stuff when we were

[00:45:12] Carol: Well, at least it wasn't the opposite and you only had enough food to feed like five.

[00:45:16] Tim: Yeah. Yeah. So I have one AI in the streets. Anytime I talk about AI to my kids, they get, like, viscerally angry. They don't like it at all. And I've talked to other people; it seems to be Gen Z particularly, which I think is their generation, I don't know.

[00:45:39] Tim: They get confusing. I know I'm Gen X, but everyone else, whatever. I think Gen Z just, they were growing up in high school when AI was just getting started, and the teachers were so worried that everyone was gonna cheat that they basically said, don't use it. They drilled into their heads that if you do that, you're gonna flunk, you're not gonna learn anything.

[00:45:59] Tim: And so now that they're actually.

[00:46:01] Adam: We got "this is your brain on drugs." They got "this is your brain

[00:46:03] Tim: Right. This is your brain. "I learned it from watching you, dad." Yeah. So now my daughter, she's in the workforce. And it's funny, she's working for the guy who first hired me at the company I'm at now. He sold the company; he owns like a country club and a bunch of other things.

[00:46:24] Tim: And so she's working at the country club, doing kind of like sales and marketing for them. And I showed him some stuff I generated with AI, and he's like, this is awesome, I'm gonna buy this for Lily so Lily can use this. And Lily basically just looked at him and said, nope. I know how to use Canva.

[00:46:42] Tim: I don't need this, I'm gonna do it my way. And my son's the same way. Particularly my son, 'cause if he goes into corporate programming like I am, pretty much every corporation is forcing a mandate that you gotta use AI.

[00:46:59] Ben: Well, I remember you were telling us a story, when you were in Portugal, and he had drawn some manga character or something, and you had AI color it in, and he was

[00:47:07] Tim: Hated it.

[00:47:08] Ben: he was very off put by that.

[00:47:10] Tim: Oh yeah. He was extremely, yeah. He's like, that's the worst thing you could have done to me. I'm like, geez, dude, chill. It's not like I beat you. Come on.

[00:47:17] Adam: Yeah, I mean, that's interesting, 'cause that's a very artistic point of view. Right? I don't know if Lily's is similar. It sounds kind of like it is, 'cause you talked about her using Canva, so that's kind of an artistic approach to the work that she'd be doing.

[00:47:33] Adam: I have two kids. One of them, we haven't talked about it a whole lot, but I would say they're kind of ambivalent about AI stuff, at least as much as I've heard. And the other one, my older kid, is more of an artist. She draws stuff, she does a little bit of painting, and anytime she sees me using AI, or anytime I mention that I made something with AI, she just gets disgusted and wants to leave the room.

[00:47:55] Tim: Yeah. Same for my kids.

[00:47:56] Adam: And I'm not doing anything directly to her, and I'm not even usually doing anything art related. Like, I generated my profile picture, right? And that was based on a selfie of me. I was like, here's a selfie of me, make it in the style of Rick and Morty or whatever. But, I don't know.

[00:48:13] Adam: Like, I get it, and I think that honestly it's probably, to some extent, gonna do them good, right? For our generation, I feel like the thing I'm most worried about is the brain rot, right? I feel like we're eager to give up these skills because we're a little bit later in our careers, and it's like, okay, hey, cool, I can take my foot off the gas, I can kind of relax and go on cruise control for a couple of years toward the end here.

[00:48:37] Adam: And it's probably not doing us a service; like we mentioned last week, it's making us stupid and lazy. And I don't blame them for it. I think the danger for them, and for that attitude, is that they're going to end up in a situation where they don't have a choice, they have to use it, or the people who do choose to use it are going to outpace them and do better and more work. You can take it philosophically in a bunch of different directions, but

[00:49:04] Adam: that's my worry: I do think that they're avoiding the brain rot, which I think is great, it's healthy, but I also worry that they're not gonna have that tool in their tool belt when they need it.

[00:49:13] Tim: right. They're gonna get out competed by people that have been using it.

[00:49:16] Adam: Yeah.

[00:49:17] Carol: Yeah, like for us, we were molded by the creation of the internet, right? Like everything we've been doing is new. Our kids have had phones in their pockets for how many years, right? So having technology at their fingertips isn't something that is new to them or even exciting. It's just an expectation. So I think it's easier for them to go.

[00:49:38] Carol: I don't care. I just don't care. Like, if it's not providing for me right now, why do, why should I look into it? Like why should I, let that change how I'm going to like live my life?

[00:49:48] Tim: Yeah. That's, that's a really good viewpoint, Carol. I, I didn't think of it that way. 'cause Yeah, like internet was extremely new. Like I was dialing up on a modem, right. And going, oh, machines can talk to each other. This is so cool.

[00:49:58] Adam: Mm-hmm.

[00:49:59] Tim: yeah. And so I see the next new cool thing. I'm like, this is gonna be, you know, I, I am assuming that what we're going through now with AI is gonna be as transformational as what the internet was.

[00:50:11] Tim: In the late nineties, two thousands. Maybe it is, maybe it isn't. They haven't had that experience; it's been, like you said, around them the whole time. Yeah, that's a really good takeaway.

[00:50:22] The Human Connection Problem

[00:50:22] Ben: One thing that I'm surprised by when people are surprised by this is I'll hear them run some sort of little experiment where they take a piece of AI written prose or AI generated music, and they play it for a group of people and say, you know, did you like this? And they, oh yeah, this was really good. It was very catchy.

[00:50:41] Ben: And they say, did you know this was generated by AI? And then the people who listened to it are like, oh, this is gross, I actually don't like it anymore. And people are surprised by that. And I'm like, how are you surprised by that? A human thought they were connecting with another human through some sort of medium.

[00:50:57] Ben: And then you said, actually, you've been connecting with a machine. We've all seen the movie Her, I think, at this point. There's the one pivotal... you know, spoiler alert here, it's 15 years old or whatever, but spoiler alert: he's deeply connected to his OS, whatever her name is.

[00:51:15] Ben: I can't remember her name. And,

[00:51:18] Carol: Her.

[00:51:19] Ben: no, the, the actress

[00:51:21] Tim: Scarlett

[00:51:21] Ben: Scarlett Johansson. Yeah. He's deeply connected to Scarlett Johansson. And then at one

[00:51:26] Tim: wouldn't we all wanna be

[00:51:27] Ben: he says, you know, are you talking to other people? And she says, yeah, and you know, how many other people are you talking to? And she's like, 10,642. And he's clearly crushed by this idea that the machine is talking to so many other people.

[00:51:41] Ben: And on one hand you could be like, well, why is that affecting his relationship at all?

[00:51:45] Ben: You're like, because he was connecting with somebody. So I'm just shocked, anytime in the modern age, when people are still surprised that humans want to connect with other humans, and when you rug pull or catfish them, so to speak, that they're bothered by that. I don't know. I'm just so befuddled by people who don't get that,

[00:52:10] Ben: And I'm hoping that we don't let AI take over everything. It's like, I'll be listening to an interview and they say, oh, how great would it be to get on a call and your call is powered by AI? Like, who ever wants to talk to a person? All they wanna do is get things done quickly and efficiently.

[00:52:27] Ben: And I'm like, yeah, in the most abstract way, that is true. But then you talk to anyone who actually gets on a call with people and you know, everyone's like, operator, operator, operator.

[00:52:38] Carol: Every time, I'm like, speak to a human. Uh, so with that: we've been looking for houses, so I've been calling up agents and stuff, trying to figure out where we're gonna live. I called this one company, and it's like, hi, I'm Amy, your virtual assistant. I'm like, let me speak to someone. She's like, where are you at?

[00:52:56] Carol: I'm like, telling her. And she's like, okay, how can I help you? I'm like, let me speak to someone. And finally it sends me through to what's supposed to be another person like to talk to. And every response is, oh, that's wonderful. Hold on, lemme look at that. And I'm like, this isn't even another human. I'm so pissed off now.

[00:53:15] Carol: And I hung up. I was like, I quit. I quit. I won't work with you. Don't give me another model to talk to you. I want a person.

[00:53:22] Tim: But I mean, what if the person is obviously in, like, a call center somewhere, you know, India, Pakistan, and you can't understand a word they say, because that

[00:53:32] Carol: sorry.

[00:53:32] Tim: to

[00:53:33] Ben: that is tough for sure.

[00:53:35] Carol: At least I still get to feel like I'm talking to someone who's looking at the data, and at exactly what I wanna see.

[00:53:41] Adam: I,

[00:53:42] Tim: but but if I can't understand them,

[00:53:44] Adam: Exactly. I'm 50-50. The company that I use to refill my medication uses a call center overseas somewhere, and I feel like there's only two people that work there; I only ever talk to one of two people. And I cannot understand either one of them. You know, they ask me a question, and it's the same, like, nine questions they ask me every month when we call to refill my medication.

[00:54:07] Adam: And still, I'm like, I don't understand what you're saying. Can you try again? Can you slow down? Or, or enunciate, or whatever it is. And it just makes me so mad, like not mad at the person. I'm not mad at them for having an accent, but like,

[00:54:20] Tim: The company's so

[00:54:21] Adam: I'm trying. I'm actively listening and I'm trying to make out what you're saying.

[00:54:26] Adam: And I know that there's only, like, so many questions it could be, right? And still... it just, it's like

[00:54:34] Ben: It's embarrassing to have to say, like, five times: sorry, can you repeat that? I feel self-conscious about it.

[00:54:41] Tim: I do

[00:54:41] Carol: do too. I always hate when I have to do that.

[00:54:44] Adam: And you know, look, with the previous story, when you were talking about the real estate agent, like, you know, menu system, and it sort of sends you on to the next person, my first gut reaction was like, okay, what would I say to curse out this machine? I would be like, I will find where you live and I will come over there and I will unplug you.

[00:55:01] Adam: Like,

[00:55:02] Carol: I'll unplug you for sure.

[00:55:04] Adam: Right. But then Tim brought up the, like, would you rather talk to a person that you can't understand? Honestly, I do think if I knew that those were the two choices I had, I would take the machine.

[00:55:15] Carol: I take the person. Yeah.

[00:55:19] AI for the Elderly

[00:55:19] Tim: So another area of AI in the streets: I know Dan Wilson, we had him on the show, he's doing a new thing where he's trying to encourage elderly people, like people who are homebound, to use AI to answer questions and to help them navigate their lives in their senior years. Um,

[00:55:43] Tim: which I think is a great idea. I just don't know if that generation, the Boomers, is gonna adopt it. They just keep calling me.

[00:55:55] Carol: same.

[00:55:56] Tim: But maybe when I'm in my eighties, that's what I'll be doing. Although, I take that back. My father-in-law, my wife's father, he is like an AI guru. He's ridiculous. 84 years old, an absolute master at using DeepSeek, doing things like building websites. And then he's asking me questions and I'm like, he's really advanced.

[00:56:16] Tim: I'm like, that's pretty impressive. It's how he keeps his youthful brain going. He just always has some challenge.

[00:56:22] Adam: what's your middle name? Tim.

[00:56:24] Tim: James.

[00:56:25] Adam: So they're calling it Chat TJC?

[00:56:28] Tim: There you go. There you go. Chat T-J-C. T-J-C. I like that. Yeah. Yeah. My parents do that, 'cause my dad is like the least technical person in the world. I was in my mid twenties when I learned the whole thing about righty tighty, lefty loosey, for how screws work, and someone was like, how old are you?

[00:56:48] Tim: And you don't know that? I'm like, my dad knows nothing about anything mechanical, electrical, anything. It's all magic to him. So,

[00:56:57] Adam: Well, it's a good thing he didn't ever run into any meridians. I.

[00:57:01] Tim: exactly.

[00:57:03] Ben: speaking of the elderly, my wife had read some article, probably in the New York

[00:57:08] Adam: Oh, that's harsh to say about your wife.

[00:57:10] Carol: Are you calling her old?

[00:57:11] Tim: Oh wow.

[00:57:13] Ben: no, that's not, you're misunderstanding my words. She was reading an article about an elderly woman who lived out, you know, in farm country or something, and she was of an age where most of her friends had already died, and she wasn't super mobile, so even going to church wasn't something that she did a whole lot.

[00:57:31] Ben: And I think it was her kids or her grandkids who ended up getting her this little device that sits in her home. It's like a little desktop robot or something. And throughout the day, it'll ask her questions, you know, how are you, how's it going, what's on your mind, anything you wanna talk about. And I'm so of two minds about it, because apparently this has completely changed this woman's life, and she feels so much more alive and so much more engaged and fulfilled, which is amazing.

[00:58:00] Ben: And I can't say anything bad about that. That sounds nothing but amazing. But part of me, something about that feels so dystopian. Like, the only person that this woman

[00:58:10] Tim: Very black mirror.

[00:58:11] Ben: talks with is a machine.

[00:58:13] Carol: Yeah.

[00:58:14] Ben: Oh, I feel good and bad for her at the same time.

[00:58:18] Tim: It's kinda like the horror movies where you see the guy in a dress having a tea party with all the dolls in the attic. You're like, something bad's gonna happen. It's kinda like, you're not talking to real things, dude.

[00:58:33] Ben: I know. And then I feel bad for feeling judgmental. Oh,

[00:58:39] Tim: Hmm.

[00:58:41] Ben: I'm, yeah, I've noticed that I don't hear people generally talking about AI outside of the tech world. When I'm just out and about in the world, I don't hear people talking about it.

[00:58:57] Tim: I don't either.

[00:58:58] Adam: It comes up not a lot, but occasionally at the drop zone. Like my skydiving buddies are talking about it and very few of them write code.

[00:59:07] Tim: Yeah. When I explain to people, like at church, who, you know, are not in the tech world at all, kind of what's going on and my fears and concerns about it, they look at me like, one, I'm strange, and I am, so I admit that. And two, there's almost a sense of disbelief.

[00:59:24] Tim: Like I'm overblowing it and maybe I am, but it's like one day there's gonna be a day of reckoning, and I don't know which side you're gonna be on, but at least I feel like I'm prepared.

[00:59:38] Adam: Alright, well let's, uh, let's go ahead and wrap it up there.

[00:59:41] Patreon

[00:59:41] Adam: so this episode of Working Code was brought to you by having two copies of Microsoft Outlook installed on your machine. You know, if that's you, good luck, take some good pictures of the moon while you're there.

[00:59:50] Adam: And listeners like you, if you are enjoying the show and you wanna make sure that we can keep putting more of whatever this is out into the universe, you should consider supporting us on Patreon. Our patrons cover our recording, editing and transcription costs, and we couldn't do this every week without them.

[01:00:04] Adam: Special thanks of course. As always, to our top patron, Monte, you rock. And yeah. thanks for your longtime support.

[01:00:12] Thanks For Listening!

[01:00:12] Adam: we are gonna go record the after show, which is the thing that I tell you about at this point of the show every week. Basically the outro music plays. Some of you will stick around and continue listening 'cause we got some more stuff to talk about.

[01:00:23] Adam: so, we're gonna talk more about AI, it looks like, and a 30-year-old mixtape. Hmm. That sounds interesting. I think I know where that's going. But anyway, uh, if you wanna get, uh, content like that, you can go to patreon.com/workingcodepod,

[01:00:36] Adam: throw a few dollars our way, and we will throw a few podcasts your way. That's how that works. Like for like. Anyway, that's gonna do it for us this week.

[01:00:46] Adam: We'll catch you again next week and until then,

[01:00:48] Tim: Hey listeners, remember, you're listeners in the streets and contributors in the Excel sheets. Your heart matters.

prev episode: 254: Claudependent