DevEx in the Age of AI - Frictionless by Nicole Forsgren and Abi Noda
p.1 - 150
Book Covered

Frictionless: 7 Steps to Remove Barriers, Unlock Value, and Outpace Your Competition in the AI Era
by Nicole Forsgren, Abi Noda
Transcript
This transcript was auto-generated by our recording software and may contain errors.
Nathan Toups (00:00)
And so I had to make a business case, like, hey, we're sitting on a bunch of wasted time and energy, and there's this headcount. And this is the kind of thing: learning how to speak to a CEO or CTO decision maker, learning how to get sponsorship from executive leadership on why something like this makes a huge difference, is its own skill.
Carter Morgan (00:32)
Hey there, welcome to Book Overflow. This is a podcast for software engineers, by software engineers, where every week we read one of the best technical books in the world in an effort to improve our craft. I am Carter Morgan, and I'm joined here as always by my co-host, Nathan Toups. How are you doing, Nathan?
Nathan Toups (00:43)
Doing great. Hey, everybody.
Carter Morgan (00:45)
Well, as always, make sure to like, comment, subscribe, and share the podcast with your friends and coworkers. You can book time with us on Leland if you'd like to get some advice from Nathan and me. And you can also join the Discord, which is a hoppin' place full of lots of Book Overflow listeners, and which I really enjoy visiting. So we'd love to see you there. And I just wanted to address something some people brought up in Discord, and that some people have messaged us about privately. If you follow us on an audio platform: really since the inception of the podcast,
we've had a few advertisements that were sponsorships Nathan and I read in our own voices, but we haven't had dynamically inserted advertising. YouTube has had that for a while, but we haven't had it on our audio platforms. We were recently selected for the Spotify Partner Program, and Spotify is where we host our podcast, so we are now eligible to run advertisements there, and we turned that on last week. Just a note about advertising: I know some people are really morally opposed to advertising in general.
If you create content, advertising is how you make a lot of money, and as much as Nathan and I enjoy making this podcast for the love of the game, we also enjoy making money on it, if anything just to cover the expenses, and hopefully one day to be a nice little chunk of change for Nathan and me. As far as advertising goes, we've gotten two comments. One is that YouTube appears to be much better at dynamically inserting ad breaks than Spotify; they've been at this for a lot longer, and we had some people say it was jarring when the ad breaks
appeared. So, duly noted, and we're going to make sure in the future that we manually select where the ad breaks happen, so it's less of a jarring experience for you guys. The other is about the content of the advertising. I hate it, Nathan hates it, I think most of us hate it: there were some gambling advertisements in there. One, just personally, I have a moral opposition to gambling.
And I am concerned, from a kind of public policy perspective, about how much gambling has infected American life. And it does not please me that there are gambling advertisements on this podcast. We are doing what we can to look into it. Nathan, do you want to say anything about gambling? I don't want to speak for you too much.
Nathan Toups (02:53)
No, I'm in the same boat. I think any business whose entire business model is causing harm to its customers is not okay with me, and gambling is a subset of that larger category.
Carter Morgan (03:01)
Yeah.
If you want to have
a fun night in Vegas once a year, there's probably nothing wrong with that. But the amount that gambling is affecting young men especially in our society, I think, is troubling. And so we are looking into whether we can opt out of that, and we are also looking at other potential advertising partnerships that would not serve up gambling ads. But these gambling companies are also spending a lot on advertising. While we get all this sorted out, just know that
any ad that appears through dynamic insertion works the same way it does on YouTube; we don't control YouTube advertisements either. And there's something about the YouTube format that makes it much clearer that an ad is a YouTube-run ad and is not associated with us. That isn't as clear on an audio platform. But just know that if an ad is read by Nathan or me in our own voices, that is us endorsing the product, that is us saying we are at least morally okay with it. If it's popping up through dynamic insertion, just know we don't have complete control over that. And so,
as we sort this out, we appreciate your patience.
Nathan Toups (04:06)
And if you're interested in other alternatives: you know, I would love it if we could support this podcast without relying on advertising. So if you would be a fan of something like a premium subscription model, or have ideas for how we could make this listener supported, I'm all ears. We can't commit to anything yet; we're still trying to at least pay for the books and keep the lights on. But we want to do something that serves you all. If it's
bombarding you with trashy ads, that's not our goal. So yeah.
Carter Morgan (04:37)
Yeah.
Absolutely.
If there's a subset of listeners that's willing to pay five bucks a month for a Patreon, I mean, we would gladly turn advertisements off on this podcast, because I think we'd be much more comfortable with something like that. Anyhow, lots of meta talk about advertisements, about Book Overflow, and about producing podcasts. Let's talk about the actual book, which this week is Frictionless by Nicole Forsgren and Abi Noda. I can give you the author introductions real quick. Nicole Forsgren, if you remember, she gave,
right Nathan, the foreword for The DevOps Handbook. I don't believe she has an official... Yes.
Nathan Toups (05:13)
Exactly. And
Nicole is just deeply embedded in this platform engineering and DevOps world. DORA, the SPACE metrics: she's had a hand in all of this deep research. We'll get into it. Her name is going to keep coming up anytime we get into this SRE, platform engineering, DevOps world.
Carter Morgan (05:32)
Absolutely. And then Abi Noda. We're going to give the full introductions, but I thought it was cool that you mentioned beforehand that you actually know Abi, don't you, Nathan?
Nathan Toups (05:41)
Yeah, he's the CEO of a company called DX that just got acquired by Atlassian. I'm a big fan. Actually, full disclosure: yes, I've had direct interactions with Abi. He's awesome. And also, there's a small feature on the DX website about the work I did using DX at a previous employer, at Flyer. We rolled out their software, which powers a lot of the ideas in this book.
And you don't have to use their software to get the benefits of the ideas, but there's a really nice overlap, in that these two were deeply involved with how the software is being built to unblock engineers and get them to be super productive and also super happy at work. So yeah, it's pretty cool. These are good folks. When I saw this book was coming out at the end of last year, I was like, okay, this has to get on the list, because there are just lots of great ideas in here.
Carter Morgan (06:27)
Yeah.
DX is a Utah tech success story, which is fun. Abi lives in Salt Lake City and DX is headquartered here in Utah. So yeah, really cool people are the authors of this book. Let me give you the introductions the book gives them. Nicole Forsgren is an expert in DevEx, DevOps, and decision-making, and is the lead author of the Shingo Publication Award-winning book Accelerate: The Science of Lean Software and DevOps. Her work on technical practices and development has been published in industry and academic journals
Nathan Toups (06:40)
Yep.
Carter Morgan (07:05)
and is used to guide organizational transformations around the world. Abi Noda is the co-founder and CEO of DX, where he leads the company's strategic direction and R&D efforts. His work focuses on helping leaders measure and improve developer experience. Before joining DX, Noda held engineering leadership roles at several companies and was the founder and CEO of Pull Panda, which was acquired by GitHub in 2019. So very, very, very cool people. As far as the book introduction, what we have is: AI can generate code in minutes,
so why does shipping software still take forever? The answer is friction, the invisible barriers that turn quick wins into endless delays. While your competitors ship daily updates, your developers burn out fighting broken tools instead of solving real problems. In Frictionless, developer experience experts Nicole Forsgren and Abi Noda reveal why eliminating friction isn't just about developer happiness, it's about business survival. Drawing from research with hundreds of software teams, they provide a complete seven-step methodology to identify bottlenecks, measure impact, and systematically remove barriers that slow innovation.
Whether your teams use AI coding assistants or established workflows, this book shows you how to unlock your engineering organization's full potential. You'll discover how companies like LinkedIn tackle developer friction and transform their business as a result. So this is another two-parter episode. ⁓ We read the first half this week. Nathan, give me your thoughts on the first half of Frictionless.
Nathan Toups (08:27)
Yeah, so I've been super excited to read this since I knew it was coming out. Friction is something I talk about a lot with my consulting clients, and the framing of this book, I thought, was just super solid. This is a very easy-to-read book. I know we talk about that from a Book Overflow format standpoint: it's very easy to get through. I was actually sad that we couldn't read a 300-page book in one sitting, because we kind of had to stop somewhere in the middle, and there's no perfect place to do it.
Carter Morgan (08:50)
Yeah.
Nathan Toups (08:56)
But I mean, I think I highlighted half this book. For me, this is exactly the type of thing I love thinking about: that intersection of organizational structure and engineering and how we ship and build things. And I thought it was a very approachable style, but there's also a confidence of deep expertise, where you're looking at this and going, oh yeah, yeah, this is great.
Carter Morgan (08:59)
Yeah, you can tell that the people who wrote this have that deep expertise. And I also really enjoy books like this; I enjoy that kind of intersection between business and software. I appreciate, and we're going to talk a lot about this, how they talk about how you sell developer experience improvements to leadership. And I have had kind of that experience joining my startup, where I've implemented a lot of things like CI/CD and Terraform and some
testing philosophies. It's funny: when I started doing them, and we'll talk about this throughout the podcast, I was just like, well, this is good practice, and it is not going to be hurtful to the organization in any way, and I've been wanting to get hands-on with some of these things, so yeah, let's do this. But as we've moved to a more AI-native world, and we as an organization are trying to become more AI-native engineers,
yeah, having all this kind of DX stuff in place is a huge competitive advantage. As far as this book goes specifically, I do think a lot of this first half is spent on gathering information about DX and how you do that, and I found some of that to maybe not be as immediately applicable, at least to my role. It's also interesting that this book bills itself as being about how to
move fast in the world of AI. And I think that's all true. I really do believe that we're at this fortunate moment in time with artificial intelligence where what's good for AI is also good for humans. Fast builds, tests, documentation, clean code: it all helps AI, but it's also just easier for humans to reason about. But to me, it feels like this book was probably written before AI, and then AI kind of
gained steam, and they didn't change most of the content of the book, but they changed some of the framing around it. I don't think that's necessarily a terrible thing, because I think what applies here is applicable both in the age of AI and pre-AI. But yeah, I thought it was kind of an interesting framing.
Nathan Toups (11:18)
Mm-hmm.
Yeah, I felt the same way. I don't think I articulated it clearly in my notes, but you're right: especially the front-loading of the book sets up this sort of, we're in this world of AI and these processes are more important than ever. And then you get into the meat of the book, and I think I was probably
two-thirds of the way through what we were supposed to read when it just kind of hit me: we haven't talked about AI in a while. And I was like, well, it actually doesn't matter, really. I think maybe this is where AI conversations are going to happen increasingly: it's sort of assumed, like oxygen in the room, that you're going to use these things on top of it. So yeah, I thought that was a really astute observation, Carter.
Carter Morgan (12:21)
Well, thank you very much. But it is a great book and we're excited to dive more into it. We're going to take a quick break and be right back.
And we're back. Thanks for listening. Okay, so we just talked about what developer experience is. I think most of us understand that, but I've mentioned on this podcast before that I look at the listener analytics, and I'm always pleasantly surprised that we have a decent chunk of listeners who are still in college. So we want to make sure that we are addressing them as well. As far as developer experience goes, there are some great quotes from the book we have here.
It says developer experience isn't just about making developers happy, it's about removing friction that slows down your entire business, which I think is really the whole theme of this book. So when we talk about developer experience, it really is just how pleasant it is to build software at your company. And what that's going to mean is things like: do your unit tests take a super long time to run? Are your build pipelines quick? When you commit code to main,
does it get out to production quickly? But also, are they reliable? Because if you have a flaky build pipeline that fails halfway through, that's really frustrating. Documentation is another thing that falls under developer experience. Onboarding: when you join a company, how quickly can you actually start contributing? I think those are the big ones that stand out to me. Am I missing anything here, Nathan?
Nathan Toups (14:12)
Great.
No, this is exactly it. I think if you want to put it into some framing: developer experience basically allows the DevOps and platform engineering part of the organization to become a product team, right? The product is the things that you're building for your user, and your user is the software engineers in your organization, right? And it kind of flips this thing of being stuck in Ops Land, where you're kind of
Carter Morgan (14:29)
Yeah, yeah.
Nathan Toups (14:41)
saying no, and you're provisioning resources, and you're doing everything that The Phoenix Project and The Unicorn Project talked about, where you throw the responsibility over the wall and let some other team deal with it. This is about a sort of extreme-ownership idea, where you use this product methodology. I will say there's a lot of terminology that gets pretty confusing. Sometimes you'll have forward-deployed engineers at organizations that have a developer experience,
where maybe you're a company like Stripe that has an API that outside developers use; they might actually have a forward-deployed engineer, forward-deployed meaning customer-facing, doing developer experience. That's not what we're talking about here. I mean, they overlap in some of the ideas, but typically when we're talking about developer experience, we're talking about an inward-facing product. So this is exactly it: how do I reduce friction
in the onboarding process? How do I reduce friction in our CI/CD pipelines? How do I reduce friction when it comes to all kinds of processes? And the idea here is, hey, if I can reduce friction, my engineers are going to be happier, we're going to ship more valuable code, and we'll compete better against our competition because we can ship new features to production with fewer errors. All of the research that's been going on for the last 15 years under DevOps is really kind of
put together here in this book, framed around the idea that faster processes, unencumbered by a bunch of friction, are actually what we're shooting for. And so, yeah.
Carter Morgan (16:22)
And friction is a great word for this, because these
processes, these little pain points, are not designed on purpose, and they just kind of accumulate over time. It's never the biggest problem you're dealing with, but they all add up. And this is a great quote from the book. They say: AI acceleration makes existing friction even more costly. As AI transforms development workflows, a new dynamic emerges: friction that was manageable at slower speeds becomes a serious bottleneck.
When developers can generate code in minutes but deployments take hours, or when AI can suggest dozens of solutions but testing infrastructure can't validate them quickly, the constraint shifts from creation to validation and delivery. It feels like the AI discussion has picked up in recent weeks; we talked about it on the podcast last week. You know, my opinion on it is that there's a there there. I think there have been some developers who have kind of steadfastly refused to use AI, and I would
encourage you to maybe not be one of those developers. We don't know what the industry is going to look like in a few years. But I do think this is something the AI hype crowd on Twitter does not consider: yes, coding has been the primary bottleneck for a long time in our industry, just how fast you can actually turn out code, and that's no longer the case. But that doesn't mean there aren't a lot of other bottlenecks and friction points. And it's also, I don't know,
it's one of those things where, if they invent the superintelligence, all bets are off, right? But as the models stand today: I know at our organization, we are actually undergoing a big rewrite, because with our old stack, there were some ways we were structuring our data, and just the stack in general, that put hard caps on scaling, or were at least going to be very, very painful to scale with. And so we figured, you know what, we're at a point right now with our company where it makes sense, where we can afford to do this what we think is the right way.
But another reason we did it, and we're going to talk later about this when we talk about selling developer experience improvements to management, is that AI agents were not able to reason about our code base, right? Because it had a bunch of weird patterns and some non-standard frameworks. So anyhow, I think that's an underrated hurdle to AI adoption, especially if you've worked at kind of larger legacy companies.
You know, AI does really great when it has all the context in one place, and developers do really great when they have all the context in one place. If you've worked at any sort of bigger company, that is not the case.
Nathan Toups (18:55)
Right. Yeah, it really does amplify: if you have great processes, AI tooling can really compound them. There's a term I've actually been using interchangeably with developer experience, and this is a really good thing to bring up: some folks are starting to transition to a term called engineering enablement,
which is sort of complementary to developer experience. Developer experience can be misinterpreted a lot of ways, but one of the goals is to enable the engineers, right? And this actually came up in Team Topologies: enablement was one of the focuses that a platform engineering team could take on. You kind of embed, you enable these patterns that you want to have in place, and then the teams go off and do their own thing. The same thing has to be true
Carter Morgan (19:19)
Yeah, yeah.
Nathan Toups (19:49)
with these best practices that are in place. And it's really kind of funny: a lot of the folks in the DevEx world have been realizing how right they were. It was hard to get CEOs to prioritize things like good documentation and excellent test coverage. And then all of a sudden, now that these initiatives are trying to get autonomous agents to do more and more work in the infrastructure, it turns out it's mission critical that
Carter Morgan (20:01)
Yeah.
Nathan Toups (20:18)
the very same stuff that you were lacking before is in place. So all of a sudden you're seeing documentation-first development, and having really nice quality gates in place, as an essential part of safely using agents, right? I think they talked about this. There's that famous story of Jason Lemkin, who, funny enough, was actually a guest on Lenny's Podcast not too long ago. He has this company called SaaStr.AI.
And he kind of blew up on Twitter because he was vibe coding and the agent deleted production. It deleted the production database, even though he'd put all these constraints in place, set a code freeze and what changes were allowed, and then it very gladly bypassed all of that and deleted the production database. You know, I would argue this is a good example of: in an organization, if your intern could have done this on accident by copying and pasting something off of Stack Overflow,
Carter Morgan (20:55)
Yeah.
Nathan Toups (21:17)
it wasn't their fault, in the sense that yes, they actually hit the button and did the thing, but your processes were broken if you allowed someone at the intern level to actually delete your production database. There should have been checks and balances and controls, so that even if you tried to execute this command, it wouldn't work. And I think we're still, for whatever reason, like, we want to use all this agentic stuff, but we really need better controls, so that safe behavior is
Carter Morgan (21:30)
Right.
Nathan Toups (21:47)
within a set of constraints that are in place. so.
Carter Morgan (21:50)
Yeah, it kind of reminds me of how a lot of companies, I mean, every company, went remote during the pandemic. And then some companies really struggled with that, because they tried to take their existing business approach and just map it onto Zoom, and that doesn't work. And the companies that really thrived, and still are thriving as full-remote companies, recognized that full remote works, I think, but is a completely different paradigm, and you have to have different processes around it. I feel the same way here:
you can just unleash an agent onto your code base, but if it doesn't have all of this DevEx stuff around it, it's not going to be effective. Someone on Twitter actually compared it to NP-complete problems, which Nathan and I are very familiar with from our graduate algorithms experience at Georgia Tech: there are categories of problems that are hard to solve but easy to verify.
And AI agents are in some ways like that, but they're guessing machines. And so what makes them particularly good at software is that they can verify themselves, usually through comprehensive testing, right? And so if you have kind of these boundaries around it, then yeah, there's a little more freedom for an agent to
contribute autonomously. I was actually reading, Gergely Orosz has this great article out this week about how the Codex team, ChatGPT Codex, uses Codex to build Codex. I'm going to link it in the show notes. One I'm not going to link: there was this dumb essay that went viral on Twitter, called something like "Something Big Is Happening," which I thought was a lot of AI fear-mongering. There's another great essay, I think titled "Why I'm Not Worried About AI Taking Our Jobs,"
from an economics perspective, where the author makes the argument that AI is a transformational technology, maybe even as transformational as electricity was. But, like electricity, we are not going to see massive layoffs; it'll be more of a standard diffusion of the technology. Anyhow, all that to say, what was I talking about? Oh yeah, Codex. That dumb essay was like,
Claude Code is being used to build Claude Code, AI is iterating on itself. That's like saying VS Code is used to build VS Code. If the developers of Claude Code are using Claude Code to build Claude Code, that makes sense; I would be shocked if they weren't doing that. But, I don't know. Anyhow.
Nathan Toups (24:32)
Right. Yeah, I
know, I think it's somewhere in between. I think it is a little different than VS Code being used to build VS Code. It's kind of like, not every programming language, I should say, but every compiled programming language that doesn't use a virtual machine runtime. Again, Java doesn't write itself in Java; Java is like C++ under the hood, or whatever, with this virtual machine. We go to
Carter Morgan (24:39)
Right, right.
Right.
Nathan Toups (25:02)
C, ⁓ Rust, all of these languages end up dogfooding themselves. They end up, at some point, you write a version of the compiler in a previous version of that code, and then you get to this sort of like weird inflection point where you're no longer bootstrapping it with an external dependency. Though in the history of changes, you can't build that original version without it. But once you've bootstrapped yourself, it's kind of in this weird
Carter Morgan (25:09)
Yes.
Right, right.
Nathan Toups (25:31)
inflection point, I think that Cloud Code building itself is kind of somewhere in between no big deal VS Code building VS Code and almost like a compiler compiling itself in that we're using these agentic AI to build more features to the agentic AI and that there's something interesting to that, right? Yeah, I find it equally obnoxious, like the hype machine.
Carter Morgan (25:38)
Right.
Right, right.
Nathan Toups (26:00)
and reminds me a lot of the NFT craze that's happened in 2021. I do think that we're personifying these things in a weird way that I'm just not comfortable with. But I also find the folks that are like, oh, doom and gloom. If you even touch this stuff, you're an idiot. And I'm to be laughing at you when the AI bubble bursts and I'm over here doing, you know,
Carter Morgan (26:04)
Yeah, yeah.
Nathan Toups (26:26)
I'm still doing lead code or whatever. I'm like, okay, that world's gone too. Both of these worlds are not our future, and I think it's something much more nuanced in between.
Carter Morgan (26:29)
Yeah. Right.
I thought it was interesting with the Codex team, because I've been hearing all the time, oh, I've got my agents, I've got swarms of agents. And I'm a little like, what are you building? Tell me what you're building. Everyone just talks about their agents. And so this Codex article was the first one I read where Gergely Orosz asked some great questions and actually found out, okay, how are they using these agents? And I thought it was interesting that they have a PR review agent, right, that will automatically review PRs. And they do have agents, a ton of them,
Nathan Toups (26:49)
Yeah.
Right.
Carter Morgan (27:06)
autonomously contributing to the Codex code base, and they have sectioned off parts of the code base as non-critical. So if an AI agent makes a contribution to a non-critical part of the code base, and the pull-request review agent decides that it looks good and there's nothing wrong with it, then it can be merged automatically, which I thought was interesting. Now,
these agents that they've developed are also fed massive amounts of curated context from the Codex team, so this PR review agent is not some generic PR review agent; it's stuffed with all of the opinions of the Codex team, and the development agents are too. So yeah, I thought that was interesting. But it's also a sign of how
DevEx works, because, I believe it was mentioned in the article, but I can't remember exactly, I'm confident that Codex has lots and lots of tests around it. And you cannot have these agents contributing autonomously if they're not able to verify that what they're contributing hasn't broken anything. So yeah, just another win for DevEx.
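The auto-merge policy Carter describes can be sketched as a tiny gate. This is a hypothetical illustration, not the Codex team's actual system: all the path names and function names here are made up, but the shape of the rule, merge only when every touched file is non-critical and both the review agent and the tests are green, is the one described above.

```python
# Toy sketch of the merge policy described above: an agent-authored PR
# merges automatically only when every touched file is non-critical AND
# the automated review and the test run both pass. All names here are
# hypothetical.

# Paths treated as safe for autonomous changes (made-up layout).
NON_CRITICAL_PREFIXES = ("docs/", "examples/", "tools/lint/")

def is_non_critical(path: str) -> bool:
    """A file is non-critical if it lives under an allow-listed prefix."""
    return path.startswith(NON_CRITICAL_PREFIXES)

def can_auto_merge(touched_files: list[str],
                   review_approved: bool,
                   tests_passed: bool) -> bool:
    """Merge without a human only when all three gates hold."""
    return (
        all(is_non_critical(f) for f in touched_files)
        and review_approved
        and tests_passed
    )

# A docs-only change with a green review and green tests merges itself;
# anything touching src/ waits for a human.
print(can_auto_merge(["docs/intro.md"], True, True))   # True
print(can_auto_merge(["src/core/db.py"], True, True))  # False
```

The point of the sketch is that the autonomy lives entirely inside hard boundaries: the agent never decides what counts as critical, the allow-list does.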
Nathan Toups (28:20)
Yeah,
some of that still makes me uncomfortable, I'll say. I agree in principle that these things have the potential, but it still bothers me. At the end of the day, we have a for loop where we're basically using the equivalent of eval, right? Eval being: I take text and I turn it into executable code. And we're basically saying,
Carter Morgan (28:27)
Right, right.
Right.
Nathan Toups (28:50)
you know, we're putting that eval inside this for loop, where the large language model runs on some set of input text, the output comes out of it, and then the output becomes the input for the next round of the loop. And that's just about the most dangerous way you could do anything. And the idea that you can think of all the possibilities: this gets to that unbounded context, this is inherently an unbounded context. Like, I'm still uncomfortable with the fact that
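The loop Nathan describes can be written down in a few lines. Here a stub stands in for the model call (a real system would hit an LLM API and hand its output to tools), which is exactly the risky part: nothing in the loop itself bounds what the accumulated text can grow into.

```python
# Sketch of the pattern Nathan describes: each round, the model's output
# becomes the next round's input. `call_model` is a stub standing in for
# a real LLM API call.

def call_model(prompt: str) -> str:
    # Stub: a real implementation would be an API request to a model.
    return prompt + " -> step"

def agent_loop(initial_prompt: str, rounds: int) -> str:
    text = initial_prompt
    for _ in range(rounds):
        output = call_model(text)  # model runs on the current text
        text = output              # output feeds back in: the "eval loop"
    return text

print(agent_loop("plan", 3))  # "plan -> step -> step -> step"
```

With a stub the loop is harmless; with a real model wired to tools, every iteration widens what the next iteration can ask for, which is the unbounded-context worry above.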
Carter Morgan (29:06)
Right.
Right.
Nathan Toups (29:21)
our code bases are going to get increasingly complex, and we don't have humans in the loop, and we don't have a proper way of looking for emergent problems. I'm not talking about looking at something in isolation. A perfect example of this, and I think I've brought this up before: when NASA had the Challenger disaster, there was no one failed test that caused it.
Carter Morgan (29:28)
Right.
Nathan Toups (29:49)
Right. The Challenger explosion happened because there was a cascading failure of a bunch of things that were each within the error thresholds considered acceptable, but a problem emerged from the complexity of the system. This just feels like that. It feels like the norms we're pushing right now, this obsession with speed, are getting us to a point where we're like,
Carter Morgan (30:15)
Yes.
Nathan Toups (30:18)
don't worry about reasoning about the system, the AIs are self-correcting, they're doing all these things; if you're not moving faster, you're going to get bypassed by everybody else. There's a lot of FOMO. And that's an awful thing to have juxtaposed with safety, right? And so that's the part where, again, I don't want to be a fear-monger, but there could be some sort of pretty devastating cascading failure. I would actually highly recommend checking out Carl Brown's
Carter Morgan (30:35)
right.
Yes.
Nathan Toups (30:47)
newest
Internet of Bugs episode on this; he explores these very same topics. And again, he's not a head-in-the-sand person. He's just looking at these things and saying, hey, you don't just throw out all of these good ideas because something new and interesting has come out. We should actually have a very sober conversation about it. And so, yeah.
Carter Morgan (31:06)
right.
Yeah,
it all feels to me like a bet on the superintelligence emerging, because it's like, once we have the superintelligence, then it'll just be able to do all these things. But I don't know. I think the models have gotten better and they're doing some really cool things. I released a swarm of agents the other day where we are migrating Mongo to Postgres, and we are using Prisma as our ORM. And so Prisma has kind of this schema language. And so
Nathan Toups (31:16)
Great.
Cool.
Carter Morgan (31:37)
I wrote up a script, and when I say I wrote a script, I had Claude write up the script. I wrote just a simple script that grabs like a hundred documents from our Mongo collection. And then I wrote some Claude Code skills that basically say, hey, go run this script, get a hundred of these documents, analyze them, and then generate a Prisma schema for us. And then I wrote kind of a Claude Code skill on top of that, that basically said, hey, spin up 10 agents at a time and run
this Claude Code skill. And so it was really, really cool to watch all of these parallel agents go and look at the database. And then it came back. It didn't produce code; it produced a markdown document for us, which basically had the Prisma schema and then its design considerations, right? For why it chose what it chose. Really cool. And it got us a really, really great starting point. And then as we started digging into it and adding some of our own thoughts, like, well, actually, no, we want to adopt this pattern, we'd like to rename this field this way, or we want to use this convention,
Nathan Toups (32:14)
Mm-hmm.
Carter Morgan (32:34)
we fed that back into the skill context and then ran it again, and it did a lot better the second time. Yeah, there's really, really cool stuff going on, but I'm with you. I just think at some point someone's going to have to answer for the code that's written. And I think people are just optimistic that no, they won't; that an AI agent will answer for the code that was written.
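For context, Prisma's schema language looks something like the fragment below. This is a hypothetical example of the kind of model such a skill might emit from sampled Mongo documents, not Carter's actual schema; the model and field names are illustrative assumptions:

```prisma
// Hypothetical Prisma model inferred from sampled documents.
// Names and types here are illustrative, not from the episode.
model Book {
  id        Int      @id @default(autoincrement())
  title     String
  isbn      String?  @unique // nullable: not every sampled doc had one
  createdAt DateTime @default(now())
  reviews   Review[] // a relation replacing an embedded Mongo array
}

model Review {
  id     Int  @id @default(autoincrement())
  rating Int
  book   Book @relation(fields: [bookId], references: [id])
  bookId Int
}
```

The interesting design choice in the workflow Carter describes is that the agents emit this schema inside a markdown document alongside their reasoning, so a human reviews the inferred types and relations before any code or migration exists.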
Nathan Toups (32:56)
We're so entrenched with Dunning-Kruger that it's not even funny. There's this weird resentment that software engineers even exist. It's a really strange thing. Again, I'm in the camp that this is a yes-and sort of situation, where yeah, there are some really amazing things that this can do. And I will say it's redefined the...
Carter Morgan (32:59)
Yeah.
⁓ absolutely.
Nathan Toups (33:21)
conversation around toil, and we've seen this on Twitter, this sort of debate about what is toil. What we used to consider toil and what we consider toil now has shifted. The goalposts have shifted. I would like to have a world in which being human is a lot of fun, and we could spend more time having fun solving interesting problems, and not this, like, you're irrelevant. And there's this kind of mean-natured
Carter Morgan (33:23)
Yes.
Nathan Toups (33:49)
finger-pointing of, like, you're irrelevant, that seems to be the undertone of some of the folks that are in this conversation. And I'm like, I don't want to hang out with you. The way that you're talking, you're just a nasty person. This excitement over, you know, disrupting a bunch of people. I'm just like, okay, this is an anti-human way of looking at the world. So yeah.
Carter Morgan (33:59)
Yeah.
Yeah, and
it's so funny. It's like these people have this vision that there's gonna be 50% layoffs, but 300 people in San Francisco get to be billionaires. I'm like, that's not what's gonna happen, guys, because if that happens, might I remind you of the French Revolution, right? And I hate to talk like that. And I am generally a believer in free people and free markets, but this idea that, like, you only have
Nathan Toups (34:30)
Yeah, you know, it's.
Carter Morgan (34:41)
one year to escape the permanent underclass. I'm telling you folks, there's not gonna be a permanent underclass, because the moment we have a permanent overclass and a permanent underclass, everything about how we have structured American society is up for discussion, right? Is up for violent discussion.
Nathan Toups (34:54)
I love
that this book on friction has really, really riled us up. This is great. ⁓
Carter Morgan (35:00)
Yeah. So,
so I guess getting back to the book, and I promise this is all related, because, yeah, it's about how do you reduce that friction to make it easier to operate? And as you make it easier for a human to operate, you make it easier for an agent to operate. But I stand by that: again, you just can't vibe-code slop, because a lot of this
Nathan Toups (35:21)
Yeah.
Carter Morgan (35:29)
reducing friction, that's engineering. It's just so interesting. At our company, we're talking about trying to become more like an AI-native company. And there's talk about how we'll do product design, whether product managers and designers and software engineers kind of all eventually fold in and have the same role. And so we had all the product managers and designers and engineers in a room as we were talking about, okay, this big AI-native enablement rewrite. And I put together the proposal doc.
Nathan Toups (35:34)
Yeah.
Carter Morgan (35:58)
And so it's this big proposal doc that talks about all the technologies we're going to use and why we're going to use them. It's talking about Node and Express and REST and Kinesis and the specific type of event-driven architecture patterns we want to use. And I couldn't help but notice that the designers and the product managers got up and left the room the moment we started reading the document, right? And it's just, all of this reducing friction, this establishing good patterns, this making it so an agent can
reason effectively about your code base, that's all still software engineering, and that's all still valuable domain knowledge. And so, you know, I think there's still a lot of value in what this book is talking about.
Nathan Toups (36:39)
Yeah, you know, I'm working with a new client and we are taking a very AI-forward approach, partially to keep the team small. It's a smaller tech initiative, and we like the way that we can iterate with it. And it does change some of the ways we're taking our approach. But it's funny, because I think...
The CTO has a background as a software architect and he really understands infrastructure. And maybe this is not so scary to a lot of us who've been in the DevOps and SRE world. The good folks, the folks that have been doing this for a while, have always joked about automating ourselves out of a job. My approach has always been: automate away the thing I got hired for. And if you're at a good company, they're gonna give you higher-order problems to work on.
And I think this is the inflection point we're having with AI tools, which is: can I automate the boring parts of my job? Can I use these sorts of problem-solving tools? And I think this is where this ties into the Frictionless book. What this exposes is, like when you were doing the troubleshooting with MongoDB, you
Carter Morgan (37:37)
Right.
Nathan Toups (37:54)
kept coming back to the episodes saying, hey, we solved this thing, and then we now actually realize this problem's further up, and further up, and further up. Those are the friction points, right? Those are the friction points within the application itself. And you can find this everywhere: you won't notice where something's slowing you down until you actually speed up other parts of the system. And I like that they outlined, and I think we talked about this briefly, onboarding friction, code-base friction, process friction,
Carter Morgan (38:01)
Right, right.
Nathan Toups (38:25)
development and deployment friction. As you start optimizing for these things, it's like spinning a bunch of plates. And you want to play this game of, how much can I reduce friction safely? Obviously, the safety part is an essential piece of this. And I like that the book, after it frames this, kind of gives us a toolkit for understanding what's important and how we approach things. I think we talked about this in Team Topologies as well.
And I kind of want to talk about the DevEx framework. I think this is an important framing here, and we saw this in the DevOps Handbook, actually: feedback loops, flow state, cognitive load, right? Feedback loops is, how quick can my feedback loop be? If I have a six-month feedback loop cycle, that's a huge risk, right? Not only is there tons of friction in the process. Flow state is how much time I can spend where I'm on the edge of not quite understanding what I'm doing and deeply engaged in it.
Carter Morgan (39:07)
Mm-hmm.
Nathan Toups (39:25)
Right, to me that's flow state. So I should be kind of out of my comfort zone, but also well within my domain expertise, right? How long can I stay there? Because that's where the most productivity is coming out of me. And then cognitive load is, can I reduce the cognitive load, right? Cognitive load is if I have to do a bunch of context switching, being like, wait, I have to update this YAML file, and then there's this Prisma schema, and then I have to get back into the business logic, and then all of these things have to go through separate CI/CD processes before its
updated module is available to my main business logic. That's a lot of crap to deal with, right? So if I can reduce cognitive load, where I can solve one part of the business problem without having to understand how the entire system functions, now we start seeing it. And again, this is what AI agents are dealing with all the time. It's like, can I reduce the context window, right? We all know that if you get 70% full in your context window, your agent's screwed. It's not making good decisions anymore.
Carter Morgan (40:15)
Right, right.
Right, yeah.
Nathan Toups (40:24)
Well, that's the same as cognitive load. If your context window is 70% full, you're not gonna make good decisions anymore, right? These are the same sorts of things that map back to each other. And so, I thought that the framing of this, and tying it all the way back to the DevOps Handbook, was super valuable from a friction standpoint.
Carter Morgan (40:31)
Right.
Well, we're gonna take a quick break and then we'll be right back. And we're back, thanks for listening. All right, well, let's talk about how you make the business case for this. This is something... I love you engineers, I love all of us as engineers, but sometimes we understand the value of this,
and we understand, but I think even we as engineers only understand the value of DevEx from a "this makes my job more pleasant" angle. And even sometimes we have trouble articulating why there is business value to all of these improvements. So part three has a great couple of chapters on how you make the business case. Nathan, I mean, I know you've done a lot of DevEx work.
Have you had to make the case in the past or were you mostly assigned after the case had been made?
Nathan Toups (41:33)
No, I was at a company that had 300 ICs. Actually, I'll go ahead and disclose it, because there's a case study on this. DX actually did a case study when I was at Flyer. Flyer is this airline software company. There were about 300 ICs at the time; we were Series C funded. I was on the platform engineering team, but we didn't have any sort of centralized developer experience initiatives. And I was working with one
part of the organization that I really couldn't understand. I knew the engineers weren't happy, and I knew that leadership, the management, had one view of the world and the software engineers had a different view, because I was actually doing some of the stuff this book talked about: I started scheduling a lot of one-to-ones and talking to people. And I realized we needed something more than me just talking to some people. We needed a framework for doing a mix of qualitative and quantitative analysis
to have a real conversation about how we fix some of the major friction that was causing us to miss timelines on a lot of promises we'd made. And somehow I was able to strong-arm getting DX adopted in the whole organization. So I had to own this business case of, here's why a survey that we send out every 90 days, and a bunch of measuring of our tools,
Carter Morgan (42:47)
Nice.
Nathan Toups (42:57)
is actually gonna be good for the engineers, make them happier, and let you make business decisions better. And I had to pitch it like, you know, we have this super-high-performing team over here, and we have this team over here that's struggling, and we don't know why. We have some ideas: they have a great engineering manager, they do all this cool stuff, and they find valuable work to work on. But really, what is unique about them? Are they engaging with product differently? Are they
Carter Morgan (43:13)
Right.
Nathan Toups (43:26)
mired in a bunch of tech debt, right? All these questions that kind of come up. And so I had to make a business case, as in: hey, we're sitting on a bunch of wasted time and energy, and there's this headcount. And this is the kind of thing where learning how to speak to a CEO or CTO decision-maker, learning how to get sponsorship from executive leadership on why something like this makes a huge difference, is its own skill.
I mean, it's kind of an inside-sales skill set that you have to gain. But yeah, I had to convince my manager that I should even do this. This is not something that was assigned to me. I had to figure out how to do this in my spare time, quote unquote. And then I had to get enough champions throughout the organization, both technical and non-technical, and make a case for why it was gonna solve a problem for each one of these stakeholders, right?
Carter Morgan (43:58)
Yeah.
Nathan Toups (44:22)
You can't go and talk to the CEO unless you actually know why this problem is this way and that we're not measuring it, right? Or: if we could get your team to perform like this one over here, this is how many more features you could crank out per month, or whatever, right? And I had to get scrappy and figure out ways to measure progress and improvements. And not every engineer wants to do that, right? If what I'm saying right now sounds exhausting,
Carter Morgan (44:22)
Right.
Yeah
Nathan Toups (44:52)
DevEx is probably not the career path for you. If it actually sounds kind of exciting, like there's a puzzle, and that's how I look at it: there's a puzzle here that can be solved. The way I pitched it to the organization was, I want every engineer to have their best engineering day ever, right? If you described your ideal day at work, where you said, I get to do this, I get to work on this type of problem,
I'm not being called at three in the morning, right? We talk about this ideal workday. How do we as an organization cultivate the ideal workday for our engineers, right? And it turns out that developer happiness is tightly coupled to how much friction we can reduce, how much impact you feel like you're making in the organization, whether you feel like you have a voice in changing processes, right? All of these things that sound kind of woo-woo sometimes are actually deeply involved with,
why did this guy leave? Well, he left because nobody listened to him and his manager sucked, right? Turns out that engineer stays if you give them a manager that ensures they do valuable work and cultivates their growth, and if you actually listen to your engineers. And so much of this book is about listening to engineers, which I thought was really interesting.
Carter Morgan (45:53)
Right.
Yeah.
Right.
My manager is giving a masterclass in this, in translating it to business value, because we've been struggling with the existing system for a long time. But you know, it's the system. It has all of the business logic in it. And we did not like the tech stack. We did not like MongoDB. We did not like Java. We really did not like GraphQL. And so there had been talk about kind of just declaring bankruptcy on it and starting over. And I was very against that,
because everything I have read has said, do not do a rewrite. There are a lot of challenges with doing a rewrite. But I had spent all of January trying to fix the system, and like I said, driving some of those bottlenecks deeper and deeper, until we got to the point where, yeah, our problem is how we're structuring our MongoDB data,
and our data would be better in a relational schema. And so we were kind of left with: okay, do we change all of our data access patterns and try to use MongoDB aggregation pipelines a lot more? Or do we change the shape of the data, try to denormalize a lot more things so we have to make fewer calls out to MongoDB? What about GraphQL? I mean, maybe we try to really lock in on GraphQL and develop all these kinds of safeguards around it.
I just got to the point where I said, I think unless we make some pretty drastic changes to the system, we're not going to be able to achieve the scale we want. So kind of from the engineering side, I signed off and was like, something's got to change here. And if it has to change in as extreme a way as changing the entire shape of your data, then maybe some other extreme changes are worthwhile. And so we as engineers were just kind of thinking about this as scale: this will get us the scale that we need.
And then my boss pitched it as the AI enablement project, right? And that is something we care about. We do care about having patterns that are clear enough to reason about, so that it's easy for both humans and agents to contribute to a code base. But the moment he started selling it as the AI-native enablement project, yeah, our CEO was very, very interested in that. In a way, he was interested in the original argument of, we have bigger ambitions for
us as a company than this, and that involves scale; we need some big changes to the system in order to be able to scale. That was already resonating with him, but the moment my boss paired it with, and also we think we'd be able to achieve faster velocity with new patterns, then he was like, oh, very interesting. So hopefully it works, you know. It's a big experiment.
Nathan Toups (48:58)
That's great. I think I just
read an article about how OpenAI is using Postgres with 800 million users. It was very interesting, because they have a single write node and a bunch of read replicas, and a bunch of other interesting stuff that they're doing, but they're not doing
Carter Morgan (49:11)
Nice.
wow.
Nathan Toups (49:26)
super weird fancy stuff. They just really looked at what their workloads are. They kept it as simple as possible. Of course, we know after reading Designing Data-Intensive Applications that there are huge advantages, if you have a single source of truth, for the linearizability of your data, right? Things like this. And so it is interesting. I'm not saying there aren't times to reach for other database types, but we're really in a world in which,
you know, something that's easy to reason about and is well established has a ton of advantages, especially with the AI portion as well, right? Yeah.
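The single-write-node, many-read-replica setup Nathan mentions can be sketched in miniature. This is a hypothetical illustration with stand-in connection objects, not OpenAI's actual architecture:

```typescript
// Toy sketch of primary/replica routing: all writes go to one
// primary; reads are spread round-robin across replicas.
interface Conn { name: string; }

class RoutingPool {
  private next = 0;
  constructor(private primary: Conn, private replicas: Conn[]) {}

  // Writes must hit the single primary, so there is one source of
  // truth for write ordering (the linearizability point from DDIA).
  forWrite(): Conn {
    return this.primary;
  }

  // Reads rotate across replicas; note they may lag the primary.
  forRead(): Conn {
    const conn = this.replicas[this.next % this.replicas.length];
    this.next++;
    return conn;
  }
}

const pool = new RoutingPool(
  { name: "primary" },
  [{ name: "replica-1" }, { name: "replica-2" }],
);
console.log(pool.forWrite().name); // always "primary"
```

The trade-off is the one implied in the conversation: a single writer keeps reasoning simple, and you scale read throughput by adding replicas rather than by distributing writes.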
Carter Morgan (50:04)
Yeah, and especially with
relational data, the ability to have an explicit schema is a big advantage as well. For one, a human can look at it and say, okay, I know the exact shape of this data. But it's also easier for an AI agent to know, I know where all the data lives, and it all lives here. But again, what's good for agents is good for humans, at least for now, until they develop their own language and replace us all.
Nathan Toups (50:21)
Yep.
And to a much smaller
degree, I did something similar: the Book Overflow website went from basically a brochure site to now having some application logic in it. I didn't end up using Prisma, but something very similar called Drizzle, which is the same idea, right? It's a TypeScript-native sort of typing system that translates into schemas and has a data migration pattern built in.
Carter Morgan (50:42)
huh.
Nathan Toups (50:54)
What's beautiful about it, though, is that these strongly typed systems make it easy for the language model to reason about, right? You now have these really nice annotations. It's much easier for it to generate code, and for you to review it in a way where you're like, okay, I can reason about this. I know what the schema is, and it's interacting with the schema in the way I expect, right? These kinds of
pieces are really nice. And yeah, if your data is highly relational, you should use a relational database. That's a life lesson.
Carter Morgan (51:23)
Right. But MongoDB is web scale. We said that all the time. We
don't know much about MongoDB. We just know it's web scale. Well, do you want to talk about... they have this whole seven-step process, which, I believe, did we get through all of them? We got through five of the seven steps this week. But maybe this would be a good time to talk about how you gather data. Because like you mentioned up top, a lot of DevEx, and a lot of improving DevEx, is listening.
Nathan Toups (51:31)
It is web scale, yes.
Yep.
Carter Morgan (51:52)
What are
the strategies for listening? And then once you've gathered that sort of data, how are you deciding what to act on first? Because at most companies, there's not just gonna be the one silver bullet. That's why, again, it's friction: it's lots of little pain points that add up to make an overall punishing experience. So what do you tackle first? There are some good thoughts in this book about that.
Nathan Toups (52:15)
Yeah, and the way that this book is structured is kind of cool. I like it a lot. If you just read the first couple chapters, it's like the TL;DR version, and then as you get through, I think, part three, you've gotten a little more expanded into these ideas. There's some repetition. It's not really repetition, though; it's sort of like an overview,
and then you zoom in. It's like Google Maps: you zoom in a little bit, and you zoom in a little bit more, and it gets more and more detail. So they're kind of reiterating ideas that were earlier in the book, but with significantly more detail. And I think that's where part four comes in; it's all about improving DevEx and the seven-step process. One of the things I loved, and I think this was super important: I did this at a
300-IC organization, but you don't have to have an organization that's that big. You don't have to have some formal software. You don't have to use DX. You don't have to have all these things. If you're in a 10-person engineering org, there's a really clear way to get started, which is to start a listening tour. And the listening tour becomes really simple, because there are 10 people, and you literally sit down with all 10 of them one-on-one and just talk to them about what they love and what they don't love.
Carter Morgan (53:31)
Yeah.
Nathan Toups (53:39)
You know, what's the biggest thing that you dread? Or what's the thing that you're most excited about? What's a tool that, if it disappeared, you wouldn't care? They actually give you a question bank of some really great questions. And I'll tell you, no matter how big or small the organization is, don't start with some super formal software and measuring every git commit and all these other things. Literally just go on a listening tour. Go talk to people.
Carter Morgan (54:00)
Right,
I thought it was interesting, just a little aside, when they were talking about how one question you can ask is basically, how disappointed would you be if you couldn't use this tool anymore? And the answers were: very disappointed, somewhat disappointed, or not at all disappointed. And they said, for any kind of developer experience tool, the sign that it's found product-market fit is if 40% of its users say they would feel very disappointed if they couldn't use it. I was like, that's interesting.
So yeah, just a little aside there.
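The 40% rule Carter mentions (commonly attributed to Sean Ellis's product-market-fit survey) is easy to sketch as a tiny scoring function. The response labels come from the conversation; the threshold check itself is a hypothetical illustration:

```typescript
// Product-market-fit check: what share of respondents would be
// "very disappointed" if the tool went away?
type Response = "very" | "somewhat" | "not at all";

function hasProductMarketFit(responses: Response[]): boolean {
  if (responses.length === 0) return false; // no data, no signal
  const very = responses.filter((r) => r === "very").length;
  return very / responses.length >= 0.4; // the 40% threshold
}

// 2 of 5 respondents (40%) say "very disappointed": just clears the bar.
const sample: Response[] = ["very", "very", "somewhat", "not at all", "somewhat"];
console.log(hasProductMarketFit(sample)); // true
```

The same shape works for internal DevEx tooling: survey the engineers who use a tool, and the "very disappointed" share tells you whether it has actually earned its place.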
Nathan Toups (54:36)
And
I'll tell you, it sounds so simple, but it made a huge difference. Two things. I had to kind of go out on a propaganda tour, I guess, pitching it, because some of the pushback you're gonna get is: these are obvious questions, or these aren't the things that are gonna move the needle on productivity. Some people are just really anti this, or they're worried that you're gonna start asking a bunch of questions so that you can do a reduction in force, right? There's that kind of
Carter Morgan (55:03)
Right, right.
Nathan Toups (55:03)
Office
Space thing, where the consultant comes in and asks a bunch of questions: are you actually useful? You really have to come from a position of trust for this. And you really are trying to make sure that your questions are posed in the sense of developer happiness. When we led this, I said, look, it is really important to me that your voice is heard.
Carter Morgan (55:06)
I'm just...
Nathan Toups (55:26)
You will have a disproportionate impact on the future of this organization if you help me answer these questions, right? Kind of selling it this way. We got 97% participation, which is unheard of. We got 97% participation among 300 individual contributors at our organization, mostly because I went out and, well, DX had a really good view of who hadn't filled it out. And I would literally just drop into the DMs in Slack and be like, hey, I would really like your
Carter Morgan (55:42)
Wow.
Nathan Toups (55:54)
input on this. I would have no qualms about going out and being slightly annoying if I had to be, because I knew that if we got high participation, we could make a really positive impact on the organization. And so I'm so glad they started with this in the seven-step journey, because it really is important. You know, put yourself in their shoes. If somebody is coming around to you and,
in your mind, you're like, man, this CI/CD pipeline sucks, or, I hate that we use this tool. The CTO loves this tool, but I was at this other company and this other tool is so much better. I wish this tool would just burn up and never be here again. This is your opportunity to talk about this. Please, please share it. So yeah.
Carter Morgan (56:41)
Well, and we should touch on the RICE framework, because once you've identified all the different pain points and synthesized that information across developers, they say, okay, how do you choose what to do next? And that is the RICE framework, which is reach, impact, confidence, and effort. So from the book, they say: reach, how many developers feel this pain? Impact, will developers immediately notice the improvement? Confidence,
Nathan Toups (56:44)
yeah.
Carter Morgan (57:10)
can you deliver this successfully? And effort: how quickly can you show results?
Nathan Toups (57:15)
And the formula is reach times impact times confidence, divided by effort. And it's really important to think about it this way, in the sense that you want to make effort as close to zero as possible. So how do you make the number as close to infinity as you can? You either make the numerator really large or the denominator really, really small. We know how to make a number really big.
Carter Morgan (57:21)
interesting. Yeah.
Nathan Toups (57:45)
Obviously, you can't have zero effort; that would be undefined behavior. But the bigger we can get the number on top, and the smaller we can get the number on the bottom... It's actually funny, this is pretty similar to, I don't know if you've heard of him, Alex Hormozi. He's an internet marketing guru guy. He has a similar value equation where you try to make effort as small as possible. So you say, hey, you know, I can
Carter Morgan (58:02)
I haven't.
Nathan Toups (58:12)
get your CI/CD pipeline from two hours down to 10 minutes in one week, with 100% satisfaction guaranteed, right? You hear all this and it sounds like a too-good-to-be-true sales pitch. But put it in the RICE formula and really look at it. What's the reach? Every engineer in the organization is affected by it. Let's say it's a flaky test; that's a good one to tie into the RICE formula. Flaky test: huge reach.
Every engineer is affected by flaky tests. The impact is, I don't trust my build system. I kind of roll my eyes every time I send it in, because maybe sometimes it's going to fail and sometimes it's not. I might have to rerun it, and it's going to ruin my lunch break or whatever. So it has a huge impact on happiness. Then confidence: maybe we realize it's a five-line fix. We realize that either we turn this test off or we have this new pattern.
Carter Morgan (59:03)
Right,
Nathan Toups (59:08)
And effort: it's a five-line fix. We could do this in one sprint cycle, including ensuring that it's all working the way it should. That would put this kind of initiative super high on a RICE score. Now let's say instead I have low confidence: the flaky test, no one can reproduce it, we're not quite sure, we have some ideas, but it's a 50% chance,
50% confidence that we're gonna fix it. And it looks like it's gonna be a six-month initiative, right? Now we have a really low RICE score, which says: yeah, it's important work, yes, it impacts a lot of people, yes, it has huge reach, but you have a high risk that it won't work and it's gonna take a long time to figure out. You should deprioritize that versus something that's a quick turnaround, high impact, and high confidence you can get it done. So
I think this is a really important formula for holding yourself accountable, especially early in these processes. You can't always just do what I would call the low-hanging-fruit tasks. But if you want to build credibility with the team, to show that you're able to solve problems, it's really good to identify things that score high on the RICE formula, so you can get quick wins to earn credibility, right?
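The formula Nathan walks through (reach times impact times confidence, divided by effort) is trivial to encode. A minimal sketch follows; the two scenarios mirror his flaky-test discussion, with made-up numbers:

```typescript
// RICE prioritization score: reach * impact * confidence / effort.
// A bigger numerator or smaller denominator pushes the score up.
function riceScore(
  reach: number,      // how many developers feel this pain
  impact: number,     // how noticeable the improvement would be
  confidence: number, // 0..1: how sure you are it will work
  effort: number      // cost to deliver (must be > 0)
): number {
  if (effort <= 0) throw new Error("effort must be positive");
  return (reach * impact * confidence) / effort;
}

// Made-up numbers for Nathan's two flaky-test scenarios:
// a five-line fix, one sprint of effort, high confidence...
const quickWin = riceScore(300, 3, 0.9, 1);
// ...versus a six-month slog at 50% confidence.
const longShot = riceScore(300, 3, 0.5, 12);
console.log(quickWin > longShot); // true: prioritize the quick win
```

With identical reach and impact, the ordering is driven entirely by confidence and effort, which is exactly why the hard-to-reproduce six-month version of the same problem drops down the priority list.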
Carter Morgan (1:00:33)
And this doesn't just apply to DevEx; it applies especially as you become more senior in your career. You have to pick your battles, and you have to figure out, yeah, does this project have reach, meaning are lots of people gonna benefit? Impact, is it gonna move the needle in the business? Confidence, can I deliver it? And effort, how quickly can you show results?
Nathan Toups (1:00:39)
Yeah. Great.
Carter Morgan (1:01:00)
I just remember I had a project at one of the big tech companies I worked at, and it turned into a beast of a project, and my manager did a poor job of communicating to me what he thought an acceptable timeline was. I think he wanted it done in like three months, and it wound up taking like seven. I was not aware that it was supposed to be a three-month project, right? And so I was kind of blindsided at the end of it. I was just happy it was done. I was like, this is great. Like, you know, we did it, and...
Nathan Toups (1:01:19)
Yeah.
Carter Morgan (1:01:29)
And he told me, he said, Carter, if I had known it would have taken seven months, I just wouldn't have done it. And that was like a big wake-up call for me. I was like, whoa, it's not just about getting the work done. It's about, are we using our resources effectively? And so, you know, you need to develop that intuition as you get more senior in your career. Maybe this is a good time to give our hot takes, hot takes about the book, hot takes about what we've read. What do you got, Nathan?
Nathan Toups (1:01:44)
frame.
Yeah, yeah, yeah.
I think we can give more focus to the... Maybe we can tie in the parts we didn't finish covering here in the next episode, because I do think this idea of selling your strategy is a really good one. We kind of touched on it. I talked about that kind of earlier, but... One of the areas... So, hot takes. Yeah. It turns out if you actually ask people questions about DevEx, you'll get valuable answers, right? It's like...
Carter Morgan (1:02:07)
Yes.
Hehehehe.
Nathan Toups (1:02:23)
It's like they just keep bringing this up. You're like, hey, actually ask these questions.
Carter Morgan (1:02:27)
Yeah, for me... I can't remember who reviewed Michael Bay's Pearl Harbor, but they said, this is a two-hour movie stuffed into three hours. I'm not going to make that exact same criticism, but what I like in this book, I really like. I'm just looking at it right now, which is like, okay, so the first 50 pages are awesome. That is
understanding DevEx, the three essential elements of DevEx, and making the business case. Then you get to part four, which starts on page 51, improving DevEx: a seven-step process. And then there's about an additional 50 pages devoted to gathering information about DevEx, right? And that is a chunk of the book which I felt could have been more like 15 pages. I get it, like, I'm sure that these skills have been really, really useful to
Nicole and Abi in their work. And maybe it just turns this book into something that's a little more targeted towards someone like you, Nathan, who found yourself in a situation where you had like 300 individual contributors, right? But then this book does kind of call out sometimes, like, and if you're just working at a startup with like 10 engineers... Anyhow, I thought there was a lot of time dedicated to gathering information. And again, I'm at a startup. We have like 12 engineers.
When I'm thinking about how I would do this, I would just ask people at lunch what they're struggling with. I probably wouldn't even need to ask many people, because we all sit next to each other and hear what everyone's struggling with every day, right? So I would have liked a bit more of a balance there, but it's a minor complaint in a very readable book.
Nathan Toups (1:04:16)
Yeah, and it also could be, like, they actually have a workbook section at the end. And I think maybe some of the more specific details could probably be ripped out into that, where it's like, hey, if you want to dive in deeper, if you would like, you know, 20 great starter questions, go to the workbook section at the end, right? Then instead of having to go over each one of those... I think that's valuable feedback. I'm also like, this is a domain book for something I've done lots of work on,
and so maybe my tolerance or excitement about diving into it is higher. Yeah.
Carter Morgan (1:04:46)
Yeah.
Well, what about as far as our careers go, Nathan? What are you going to do differently, because we read the first half of Frictionless?
Nathan Toups (1:04:57)
Yeah, so one of the things that came up, we kind of breezed over it, but I thought it's worth saying: the book advocates, tell people what's going on and say it more than once. And I have a bad habit of... I do tell people what's going on, like, hey, we've got this initiative, but then I'm like, okay, I don't wanna annoy everyone, so I'm only gonna tell them that one time, right? But really, people aren't ignoring you on purpose,
Carter Morgan (1:05:14)
Yeah.
I am the exact same way.
Nathan Toups (1:05:26)
they just have a lot of stuff going on. And so if you're just repeating yourself with the excitement of, hey, I've got this thing, people will pick it up once the repetition is there, and it's not going to be an annoying thing. Most of the time they probably just weren't even listening, or they're so busy doing other stuff. So I'm trying to get better about having a unique view on the world and then being willing to repeat it, not being like, oh, I said it one time,
you know, I don't want to annoy anybody. If it's impactful enough, you should repeat yourself. Just do it. And so...
Carter Morgan (1:06:02)
Well,
and you might think that's annoying, but I would pay attention to the communication you're doing with other individuals, because I've been thinking about this more. I have found this is actually a useful signal to me. Like, I don't find this annoying when other people do it. When my manager mentions things more than once, I'm not like, oh my gosh, this broken record. It's a signal to me like, this matters. Like, this is what he's really thinking about, as opposed to when he throws out something and then never mentions it again,
Nathan Toups (1:06:27)
Right?
Carter Morgan (1:06:32)
right? Even if we both agree, like, that'd be a good idea, it's a signal that this isn't the most important thing. I mean, on my team... Yeah, I know, right? We know exactly what he cares about and how important it is to him, right?
Nathan Toups (1:06:32)
Right.
Yeah, love him or hate him, Bernie Sanders is probably the most... Right. Yeah,
you rewind it 40 years and you're like, well, Bernie's still talking about this thing, you know, and it's not... I'm just like, that's Bernie, that's Bernie's brand, you know?
Carter Morgan (1:06:51)
Yeah, exactly. Right. There's something to be said for that consistency there.
Yeah. And you know, like, I've seen that on my team. Now, I did threaten to murder my coworkers if they made changes in click-ops as opposed to Terraform. Maybe that's what got it through. But I've just kind of been banging the Terraform drum over and over again about, you know, its importance and its utility. And I was such a happy lead engineer, because I left for Disneyland last week, and
Nathan Toups (1:07:11)
Mm-hmm.
Carter Morgan (1:07:22)
when I came back, our junior engineers, they were rigging up a Lambda to generate thumbnails for PDFs, which is a trickier problem than you'd think. But anyhow, they got it done. And they did the Lambda and the IAM roles and the logs all in Terraform. And when I came back, there were so many Terraform PRs for me to review. I was like, that's awesome. Yeah, yeah. So as far as what I'm going to do, having read this book...
Nathan Toups (1:07:41)
Right. Amazing. That's so cool. That's awesome.
Carter Morgan (1:07:50)
We're kind of embarking on this big rewrite, and I just want to make sure... We have a lot of good DevEx stuff already set up with our current system. I want to make sure we keep those and use them in the new system, but also that we continue to focus on driving those friction points down. In particular, what I'm concerned about: we are moving to a monorepo, which I historically have been opposed to.
But I actually think in an AI-native world it makes sense to have all of your context centrally located. And I just want to make sure that our GitHub Actions are really intelligent, so that they're not linting everything when you commit, but only linting the appropriate projects, and that the deployment actions are intelligent enough to understand what files have been changed and how to act. And so I just want to make sure that our transition to the monorepo does not become a DevEx nightmare.
That's top of mind for me right now.
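The change-aware CI Carter describes usually comes down to mapping changed file paths to the projects they belong to, then only running lint or deploy jobs for those projects. A minimal sketch of that mapping logic — the project names and layout here are invented for illustration, not Carter's actual repo:

```python
from pathlib import PurePosixPath

# Hypothetical monorepo layout: one top-level directory per project.
PROJECTS = {"api", "web", "thumbnailer"}

def affected_projects(changed_files):
    """Map changed file paths to the projects whose CI jobs should run."""
    affected = set()
    for f in changed_files:
        top = PurePosixPath(f).parts[0]
        if top in PROJECTS:
            affected.add(top)
        else:
            # A change outside any project directory (shared config,
            # CI workflows, etc.) conservatively triggers everything.
            return set(PROJECTS)
    return affected

print(affected_projects(["thumbnailer/lambda.py", "thumbnailer/iam.tf"]))
# {'thumbnailer'}
```

In practice GitHub Actions can do this declaratively with `on.push.paths` filters per workflow, but a script like this is handy when one workflow has to decide which matrix jobs to fan out.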
Nathan Toups (1:08:48)
That's great.
Carter Morgan (1:08:49)
Okay, who would we recommend the book to, Nathan?
Nathan Toups (1:08:53)
So you have a much more concise answer that I'm gonna let you shine with. Mine's a little all over the map. But yeah, if you're tasked with addressing friction in an organization — the book's title is right there — but I actually think if you're in an early-stage startup, this is probably going to fall on you as well. So if you're in a large organization, obviously this is
Carter Morgan (1:08:55)
Hahaha
Nathan Toups (1:09:21)
platform-engineering centric. It's very applicable. If you're a platform engineer and you're trying to figure out better strategies for reducing friction, this is literally written for you. But I think this is actually applicable for early-stage startups too, if you're worried that the early hiring you've had is just slowing you down in a weird way, right? You maybe can skip some of the formality of some of the systems that they have in here, but the nuts and bolts are there. And so, yeah.
Carter Morgan (1:09:41)
Right, right.
Nathan Toups (1:09:50)
If you're trying to figure out how to ship code as a group and reduce the amount of friction and want to use sort of a modern take on all the DevOps learnings that we've had over the last 15 years, read this book.
Carter Morgan (1:10:03)
Yeah. And I said something similar: anyone tasked with DevEx. I do think the book would be more valuable to people at larger organizations, like with 200 engineers, and if you are tasked specifically with an overarching DevEx problem, like you were. I think, yeah, we'll see as the book goes on. Like I said, there was just a lot of time devoted to the whole process of organizing this effort, and not as much time devoted to what are common DevEx pain points and
what you could do to address them. So yeah, make of that what you will. But again, another big plus for this book: very, very readable and a nice, pleasant, enjoyable read. So I think that always, to me, lowers the barrier to entry for any recommendation I give. We'll be back next week. We'll wrap this up. It's been fun, and we are excited to finish it out. In the meantime, yeah, join the Discord. Link's at the bottom, or it'll be in the episode description.
You can always contact us at contact at rojoroboto.io. You can find us on Twitter at rojoroboto.io. You can find me on Twitter at Carter Morgan. And you can find Nathan and his work with Rojo Roboto and his newsletter at rojoroboto.com slash newsletter. Thanks for tuning in, folks. We will see you next week.
Nathan Toups (1:11:20)
See you.