Agile + 20
Alright. We're gonna talk software methodologies, which is a topic that no one has any opinions about, so I'm sure it'll be a very dead conversation. I should preface this by just saying what we've said before: definitely raise your hand and hop in if you have any opinions on anything. So, Adam, have you read this
Speaker 2:piece? No. I haven't seen it before. I assume you saw it on Hacker News, like, a few days ago. Yeah.
Speaker 1:I saw it just... yeah, I don't know. I think I actually saw someone make a reference to it on Twitter and then went back to the Hacker News discussion from a couple days ago. I've been offline for the last week. So, yeah.
Speaker 1:I mean, it conjures up some feelings for sure. So alright. Well, you and I have both been in software, Dan as well, for the 20 years since the Agile manifesto. So, what is your retrospective on Agile?
Speaker 2:Well, you know, I feel like I've educated myself a little bit more in reading this document and some supporting material. But, I mean, I went through the same thing that I'm sure everyone else has, which is the discovery of a bunch of cargo cult type rituals and poorly understood methodologies that didn't add up to much in sort of toxic environments, and lots of really toxic behavior kind of riding under the flag of Agile. Obviously, I've never really been subjected to it.
Speaker 1:Yeah. I was gonna say. Right? Way? Yeah.
Speaker 1:Okay. So you've never actually suffered with agile? Well... do you say agile or agile, by the way? I noticed you said agile. Do you say agile?
Speaker 2:You know what? I think I heard it as agile a lot.
Speaker 1:I think
Speaker 2:you say agile. I say agile. Alright. Well
Speaker 1:It it would I think there's even a reference in the actual manifesto to the fact that Martin Fowler is dismayed that everyone's pronouncing it incorrectly. But maybe you're actually pronouncing it correctly according to Martin Fowler.
Speaker 2:Oh, I wouldn't like that name.
Speaker 1:So have you ever had to endure Agile in any way, shape, or form,
Speaker 2:Adam? Well, sort of. So, in a couple of ways. One, through the lens of our Agile conversation, where we needed to paint ourselves in those kinds of stripes. So I was exposed to it in that way.
Speaker 1:Hold on. What do you mean? The whole... I... so
Speaker 4:you Yeah.
Speaker 2:I guess, like, you know, maybe, I don't know how familiar you are with this, but when there's a band passing through, you jump in and be part of that band. You know? When you're at a startup and you're trying to figure out what's working and what's not, and how you can be part of or counter to a bunch of prevailing winds, you hop on some bandwagons. So for a while, we decided we're, you know, part of the Agile story and had a bunch of pitch decks that described how we were part of it.
Speaker 1:So this is where you were actually offering up Delphix to those who were embracing agile, as a tool for agile development?
Speaker 2:Exactly. And, to be clear... You
Speaker 4:were a
Speaker 1:developer Delphix turned into a developer tools shop for some period of time?
Speaker 2:No. No. So, I mean, I realize now this feels like some sort of testimony.
Speaker 1:I'm sorry.
Speaker 2:Or like a therapist.
Speaker 1:No. No. No.
Speaker 2:It was... you know, what we heard from a bunch of customers was that they wanted to turn around environments more quickly. And this was sort of part of the agile process slash agile snake oil: the consultants would come in and say, well, you can't make forward progress because you're not agile enough, and in particular, you weren't creating environments, you weren't allowing folks to test things early in the development process. So we were trying to capture some of the desire in those organizations, some of it being well meaning and well founded, and some of it being based on snake
Speaker 1:oil. That actually sounds like some of the
Speaker 4:You're not genuflecting in the right direction, Adam.
Speaker 3:You just
Speaker 4:need to genuflect with your hands at a 45 degree angle. Yours are at a 43 degree angle.
Speaker 1:No. No. No. No. I I'm
Speaker 5:I I we we had
Speaker 2:lots of faith healing exercises, yes.
Speaker 1:But it sounds like... I mean, you could make the argument that what you were doing was more promoting agility rather than Agile, which I think
Speaker 2:That is... so, that's fair, and that's true. But, you know, in order to get into that conversation, I think we needed to, like, do our own military parades that looked agile and stuff like that. So that was one way in which I was exposed. The other was, and I'd be surprised if you hadn't encountered this too: folks we'd hire would say, and this is at all the companies I've been to, whether it was in engineering teams or in adjacent organizations, you know, why aren't you guys agile enough?
Speaker 2:Why aren't you doing agile? Everyone does agile. Don't you know anything?
Speaker 1:So have you ever had scrums and sprints and pigs and chickens and epics and stories?
Speaker 2:I have so, again, when I was, the CTO of Delphix, we had teams that were doing that. I did not participate in it necessarily. Like, I
Speaker 1:You sound like my teenager. Like, listen, you're not in trouble. Okay? I just wanna keep you safe. That's what I...
Speaker 2:didn't I I'm saying I didn't inhale.
Speaker 1:Okay.
Speaker 2:But, at least not too deeply. But I guess, to what I think is something that often gets missed in the agile discussions: those teams thought that that was what they wanted to do, so, you know, knock themselves out, go for it. And if it was helping them, even if it was because it was some sort of phantom cure, fine. If they wanted to do it that way, then that was fine. And often they kind of stabilized into something a little less doctrinaire.
Speaker 1:Yes. I mean, I think the manifesto itself is fine-ish. I mean, I don't know. I'm waiting for Dan to unmute himself up here.
Speaker 1:I think that there are some things that are fine about it, certainly. And I think the problem with Agile is when it became so prescriptive that it lost a lot of its agility.
Speaker 4:Look, here's the thing with the agile manifesto. This is a programmer response to something which is fundamentally not a programming problem. And, frankly, Adam hit this on the head, man. This is a bunch of hucksters, snake oil salesmen, basically peddling this like, hey, look, if you contort yourself into this weird shape 5 times a day, the sky is gonna turn orange and unicorns are gonna rain down and everything's gonna be fucking magical.
Speaker 4:Excuse me. And, you know, the reality is that that's just never true. And it was like, who are these people writing this manifesto? They're mostly consultants. There's a couple of
Speaker 1:points. Yeah. That is... I mean, it's a very fundamental problem. I mean, you're right.
Speaker 2:They billed themselves in 2001 as being practitioners, which they may have been, but it is telling that in the intervening 20 years, the number of books that has come out of this group of folks does kind of indicate a disposition. And certainly the majority of them, if not all of them, are not writing code.
Speaker 1:Well, even then... that's true. Yeah. I mean, the problem is that they write just enough code to be able to plausibly claim it. But, yes, I mean, this is a group of hucksters coming together at a huckster convention, coming up with a huckster manifesto, as long as we're all piling on here.
Speaker 1:And I do feel that there's something about software... there's so much ambiguity in software, so much that is unstructured in the way we develop software, that we are constantly seeking people to tell us how to do it. And the answer is: it's complicated, and there's not one way of developing software. There are different constraints at different times, and different things are effective for different kinds of software. And that's a very complicated answer that no one actually wants to hear.
Speaker 1:I think people are desperate for this kind of simplicity. I mean, it is the same reason that religion arises. Right? Where you're just like, hey, in this terrible lonely world, on this pale blue dot in the cosmos, I need some simplicity and some answers.
Speaker 1:So please provide me some answers. That's what's on offer.
Speaker 2:But I saw a great Steve Yegge blog post along these lines, which made the direct connection with Scientology, saying, you know, creating your own religion is a much better gig.
Speaker 1:So, yeah. Do you guys know who Ed Yourdon is? Do you remember him? I see that a bunch of you do. So Ed Yourdon was a,
Speaker 4:Oh, yeah. Yeah. Yeah. The death march guy. Right?
Speaker 1:Yes. And in particular, he wrote a book called The Decline and Fall of the American Programmer in 1992 about how there would be no software engineering in America because it can all be done more cheaply abroad. And this is basically written when I'm in college studying computer science. I'm basically being told that I'm gonna be obviated. And none of it rang true to me.
Speaker 1:But I couldn't... I didn't know why. I felt like this doesn't make sense, but I'm too young to actually know why none of this makes sense. And it turns out it was all wrong, and he wrote a book 4 years later called The Rise and Resurrection of the American Programmer. Like, actually, just kidding. I was totally wrong.
Speaker 1:But if you could actually buy both books, please. If I could sell you a book coming and going, that would be great. Which does kind of, I think, go to the point that the purpose of this is actually to sell books, unfortunately.
Speaker 2:You know, but you were saying, not to praise it too heavily, but the principles are not all wrong, and some of them at least feel obvious. I certainly didn't read this in 2001, but by the time I was aware of the agile manifesto, I sort of felt like a lot of this stuff was like, yeah: having the people working on a task understand the point of the task and the person for whom they're building it is a valuable thing.
Speaker 1:That's good. Yes. Simplicity... oh, yeah. Sorry, Dan. Go ahead.
Speaker 4:But but but, like, at the same time, thou shall not kill is also not wrong. Right? I mean, that's a pretty good rule.
Speaker 1:Good stuff. Yeah.
Speaker 4:Right. You know? I mean, these guys hit on the kernel of truth, and they kinda riff on it. But then where it goes off the rails
Speaker 2:is they sell it as, like, you know, oh, this
Speaker 4:like, do this, and this is the solution to your problems. And, you know, like, I hate to say this because it it's gonna sound horribly elitist, but who are they marketing these things to? Yeah.
Speaker 1:Dan, I almost think it's like you're almost outraged that there is a kernel of truth, because the kernel of truth is being used as a toehold to actually sell you things that are quackery.
Speaker 4:Well, but it always is. I mean, I suspect that if you look at the rise of most major religions, this is people observing the world around them and being like, hey, if we rotate crops in that field, if I let that field lie fallow after I plant corn in there for a couple of years, and then I plant corn again, that's sustainable. And we can kind of keep doing that more or less indefinitely. But if I just tell Fred, the farmer down the road, hey, do that. Hey, you should write unit tests, for example.
Speaker 4:Fred's gonna be like, well, I'm not gonna... you know, who are you? Right? But if I say, well, God told me to tell you to not plant something in that field for a couple of years after you plant corn, and, oh, by the way, if you do, fire and brimstone is gonna rain down and your family is gonna die, then it's like, woah. Okay.
Speaker 4:Alright. I won't plant corn there for a couple of years. You know? And similarly, if you have these kind of self-styled master agile software craftspeople, craftsmen, whatever, people like Robert Martin kinda coming out and telling you, well, if you're not running tests for everything, then you're just wrong. And, you know, this dude has 100,000 Twitter followers and has written 5 books
Speaker 2:on the subject. You're like, oh, he knows what he's talking about. No. He doesn't. The guy doesn't actually write any working software.
Speaker 2:Show me some and show show me a single important
Speaker 4:piece of code that that that dude has written. Right? Well Now contrast is like sorry.
Speaker 1:No. I I'm sorry. I don't I I don't wanna take you off your roll there. Sorry. Yeah.
Speaker 1:But, yeah, you're drawing on the fire and brimstone. No, I mean, I think you're right. I think that there is a certain sense that fear is kind of being used. But the thing is, there is a kernel of truth here.
Speaker 1:Right? I mean, I think a lot of the principles in the manifesto are not wrong. It's just
Speaker 2:And I would say it's less facile than just thou shalt not kill. Right? Because I've been in lots of software engineering shops where software engineers don't understand what the customer wants and are isolated from it. And, yeah, it's bad, but they don't recognize it as bad. So, I mean, perhaps where you're going, Brian: there's truth in a bunch of these things, and a lot of the failure is the religion that comes up surrounding it.
Speaker 2:I mean, just go to agilemanifesto.org, and it looks like everyone laying hands on this sacred text.
Speaker 1:It does look like an orb photo, like the Trump one, except that it's in a hotel room. It's in Snowbird or whatever. That's right. But
Speaker 2:but then there's also a lack of specificity, which gives lots of opportunity for faith healers to come in and say, you know, do it my way, and if you fail it's because you
Speaker 1:are not agile enough. Alright. So have you ever been
Speaker 4:in like a daily scrum,
Speaker 2:Adam? Yes. I mean, yes, I have. I've been in ones that were... in fact, at Fishworks, I think we did a daily scrum of a sort.
Speaker 1:Yes. But not calling it that. I mean, yes, a daily communication, fine. But the thing that I found a bit maddening, or actually maybe surprising, about agile is how rigid it became, in particular with the sprint cadence.
Speaker 1:It feels to me to be one of these... and, you know, if you read what they say, and I think it was even in this retrospective, they were saying the original intent of that was to allow engineers some time without the requirements changing, which I think is kind of an interesting idea. But then everything gets shoehorned into what became this 2 week cadence, which is, I mean, kind of ridiculous. There's so much stuff that is shorter than that and so much stuff that is much longer than that. Like, why would we... I don't know.
Speaker 1:I found that was one of the things that always rubbed me the wrong way.
Speaker 2:Yeah. I think in particular for the kinds of software that we have spent a lot of our careers working on, a lot of it just doesn't fit in 2 weeks. As noble as it might be to get these incremental pieces working, there's lots of stuff that just doesn't fit in 2 weeks.
Speaker 6:But I don't think 2 weeks was supposed to be a magic number. I mean, they were in a world where new versions of software were coming out every 5 years. And they wanted to be like, okay, we need something that's longer than an hour and less than 5 years.
Speaker 1:Yeah. No. I think it's a good point. And it was a world where you had this kind of big release model. And I think being able to release software more frequently and to be able to judge software by the act of its creation is good.
Speaker 1:I think that's all good. So, Aaron, when did 2 weeks become sacrosanct? Because somewhere along the line, it did. For the folks that I've interacted with that have done agile, that does seem to be a very fixed sprint cadence. Is that just me?
Speaker 1:I mean, is that not like
Speaker 4:I I I think you're right. I mean, I think that all of these things sort of calcified. I mean, look, bear in mind the context in which this stuff arose. Right? You know, like, I I remember sort of doing software development in the very late nineties and early 2000s when I was first sort of embarking on my professional career.
Speaker 4:And it was an era of a lot of really, like, micro tracking of all progress done by PM type people. You know, and these weren't, like, people fresh out of school. I mean, they were experienced manager types who had been in the industry for 20, 25 years. And I can remember having multiple meetings with a PM in one day where they were, like, adjusting something on a Gantt chart in Microsoft Project or whatever that software package was. And they were like, well, do you think you can do it like 2 hours earlier?
Speaker 4:And I'm just sitting there, I'm like, you know, I'm wasting a lot of time sitting in this meeting right now.
Speaker 3:Right.
Speaker 4:You know, totally distracted from my programming flow. Like, answering this question about whether I can adjust the schedule by 5 hours so that you can move a box around on a on a chart that has absolutely no meaning, no connection to reality. You know, to to
Speaker 2:In that lens, Agile felt so foreign to me coming from Sun Microsystems, where, you know, I started in the Solaris kernel group, worked with Brian and other folks, where our management was, I think, absent by and large. And certainly product management was, I mean, not absent, but very ignorable. And they sort
Speaker 1:of... Contained. Contained is the word we use for that. I don't know, it worked.
Speaker 2:Contained. Like, in a different building. They'd show up and they'd ask you how long a slider should be, and you'd give them an answer and then, you know, send them on some other goose chase. But you could really ignore them. In part, you know, people weren't paying attention to the products that we were working on. They were off
Speaker 6:So what was the process by which you found out whether the customer was happy with the product?
Speaker 1:We actually had a pretty direct connection with our customers. I mean, I think that was one of the strengths of that organization. We dealt with our customers pretty directly. And that's part of the reason why I think we developed a bunch of stuff that was pretty relevant to them over the course of a decade: because we had a pretty direct connection.
Speaker 1:So I mean, like, the Sorry. Go ahead.
Speaker 2:That notion of being in touch with the customer was just so innate. Yes.
Speaker 1:That's right. And really valued. And I do think that, I guess, Agile hits on some of that. I'm realizing, as I'm trying to actually get some of the Agile nomenclature: are meetings in Agile actually called ceremonies?
Speaker 1:I'm looking this up, because I'm on a blog entry titled Four Agile Ceremonies Demystified. And I'm thinking, like, okay, that's pretty funny. They're making a snarky remark.
Speaker 1:I'm like, oh my god. These actually are called ceremonies. Has anyone worked in an organization where they've actually become like, you're invited to the ceremony at 2 o'clock? It's like,
Speaker 4:Only if you wear the robe.
Speaker 1:Is this like a bris? I mean, what's gonna happen at the ceremony?
Speaker 2:That's the only ceremony you could think of? Alright. That's fine.
Speaker 1:Yeah. See what's happening? This is the first one that came to mind.
Speaker 2:What's wrong
Speaker 4:with that? It's a good ceremony.
Speaker 1:It's a good ceremony. That's fine. What's wrong with that ceremony?
Speaker 2:Nothing. Nothing.
Speaker 1:I just... I can't... a bris, a wedding, a funeral? How many ceremonies do we have? Graduation? I don't know. I mean, there are only, like, 5 ceremonies to pick from, really.
Speaker 1:Right? Or do
Speaker 2:we No. That's right. Please don't fact check
Speaker 1:that. That's right. There are only 5 ceremonies, outside of Agile, where there are many, many, many ceremonies. Oh, and so, Dan, you made the interesting point about, like, hey, can you pull this in 4 hours earlier?
Speaker 1:Because I think we overly enshrine schedule estimation in software, where we are trying to estimate something in which there are many, many, many unknowns. I mean, there's software where you get to the point where those unknowns fade away and you've got a lot of knowns. But in my experience, when you hit a date from a schedule perspective, it's because you're using that date to focus effort and to determine in particular what you do and what you don't do. So it's like, okay, we know we're not gonna do this because we need to deliver this to this customer on this date. But it's very hard to fix both the actual scope and the date unless you have a really known problem.
Speaker 1:If there are any unknowns to the problem, it becomes, I think, really, really, really hard. Or at least it's been hard for me. I don't know if
Speaker 4:Oh, I think that's absolutely true. I mean, look, I think that all of these methodologies, all of them, waterfall, agile, scrum, I don't care: they work best for a certain class of problems, and they do not work at all for other classes of problems. And I think if you're trying to deliver, like, a payroll system or something like this, and by the way, this isn't to poo on the people who are working on that stuff. That's necessary, important software, and I'm not suggesting that those people are lesser programmers or something like that.
Speaker 4:But the contours of that problem are a lot better understood than, say, implementing, like, a new
Speaker 1:Yeah. I mean, you're right. You're breaking up there, Dan, at least a little bit for me. But I think this point about repeatability is a really important point. The reason that there are so many unknowns in software is because, if you're doing it right, you are tautologically solving aspects of a new problem.
Speaker 1:Because if you're solving an existing problem, the cost of goods sold of software is 0, especially in an open source world. You should just be using that crate, you know, in Rust parlance. You should be using that... someone's unmuting themselves.
Speaker 7:Yeah. I wanted to touch on what you were talking about, but I think there's a Heisenberg principle at work with software, in that you can tell what's in a release or you can tell when it ships, but not both.
Speaker 1:That's right. Yeah. I think you're right. I think you're right.
Speaker 7:And it it's it's really, really, really true in my experience.
Speaker 1:And
Speaker 5:I don't wanna throw too much rain on your parade, Brian, but I'm a huge fan of Agile, and I've seen it work really effectively on all sorts of projects, from front end to back end, small to large. The first meaningful-sized project where I saw it implemented and done well was starting in 2004 with the building of S3 at AWS. And I think it's easy to hurl abuse at it from a distance if you've been burned once in an organization that sort of did a half-assed implementation or pursued some of these more kind of religious type approaches to how the process is done and how meetings are run and so on. But you can take a really pragmatic approach to building software, using not just the principles on the site, but some processes that have evolved over the last almost 20 years around scrum, and really get some nice velocity improvements.
Speaker 1:Alright. So, Tom, this is actually in many ways a much more interesting conversation, because for somebody who had agile backfire, it's easy to, as you say, hurl abuse at it. But knowing examples of where scrum or agile really worked, what were some of the things about it that were effective? I mean, what are some of the aspects of it that were so effective?
Speaker 5:Yeah. So here are a few. You give developers a quiet space for the duration of a sprint, and they have an uninterruptible window of time when they get to do nothing other than work on the tasks that were pulled into that sprint. That's just awesome. And I wouldn't get married to, like, 2 weeks.
Speaker 5:I've seen it work... most teams that I've had have used 3 week sprints, but 2 weeks works for some, 1 week works for some. That's somewhat immaterial. But that quiet space where product managers can't interrupt you is really golden.
Speaker 6:Was it at all tied to your release cadence that at the end of each sprint, we're going to push a new version of this particular service?
Speaker 5:Well, the most central part of it is that it's about the art of the possible. You pull into the sprint the things that you're pretty confident you can commit to actually being able to demo in a shippable form 3 weeks hence or 2 weeks hence. The most central process aspect of that to me, then, is the sprint demo, where you demonstrate success or failure against those things that you committed to in your sprint, in a test environment or actually live if you deployed it, and tick off each of the things that you committed to doing and in return got that kind of golden quiet window of time in which to work on them.
Speaker 1:So that's interesting, Tom, because, I mean, what you're saying is that the real value was the focus that it afforded engineers. Again, this is actually where the original manifesto, I think, came from: trying to afford folks that kind of focus.
Speaker 5:That's half of it. So it's a two way contract. I, as product owner, give you focus, and in return, you'll demonstrate which pieces of the things you committed to you actually achieved in that focused window of time. And it's never 100%, but it might be 70% or 80% of the tasks you took in.
Speaker 5:And, you know, we look at them together and agree that, yes, these all check the boxes, give them a green light, ship them, and you move on to the next planning process. It really... you know, I've seen so many teams, S3 again being the classic example, where the requirements were really, really iffy getting into starting to work on it, and they changed, like, month by month. So it would have been super frustrating if the team didn't have the protection of at least these sets of windows of time where they could just work through a fixed set of commitments, whatever direction they felt they needed to go.
Speaker 1:And then what was the interval between sprints? I mean, what was the kind of alternation between sprints and
Speaker 5:sprint planning? Sorry. What was the
Speaker 1:Well, because what I found is often sprints end up being back to back, and the kind of sprint planning ends up being a bit of an afterthought, or in our sprint planning, we just do the things that we were gonna do anyway. So I'm curious about how the sprint planning piece worked for something as large as S3. Right?
Speaker 5:Yeah. So, certainly... yeah, it'd be good to have Alan Atlas on here. Maybe we should get him on at some point, because he was the scrum master of that program, and he teaches scrum. So he is also a huge fan of it.
Speaker 5:But he trained a lot of people across Amazon then, on the back of the success of that, to be scrum masters. And, as a process, it was adopted right across the company. A common approach, one that I liked, is to have 14 day sprints, and have the 15th day start with your sprint demo. The engineering team, with the scrum master, then does a retrospective to assess their process and look at what worked well in that sprint and what could be better in terms of process improvements. And then you go into, typically, a half a day of planning for the next one.
Speaker 5:And then you start on the next, you know, 3 week cycle.
Speaker 1:And then how about for things that didn't fit into that cycle, where you've got someone on the team who is engaged in a project that's gonna be paying down some technical debt, and it's gonna take 8 weeks.
Speaker 4:It's not
Speaker 1:gonna take 3. How do you fit those folks into that? Because this is where I found that agile really struggles.
Speaker 5:Yeah. Well, first of all, paying down technical debt should always be part of the process in terms of how every team works, and it's all too easy to starve that out. Personally, my preferred approach, one that has worked well, is to maintain 2 backlogs: one that is a featureful, customer facing set of takeable tasks, and the other a prioritized list of technical debt and care-and-feeding and scaling and operations related things that, you know, the engineering team agrees are in the right sequence.
Speaker 5:And the product owner might not be involved at all in the prioritization of that one. They just know that the team feels like what's at number 1 on that list is the most important thing to take next. And, typically, I've shot for a 70/30 split in terms of story points that go towards features versus story points that go towards technical debt. If you get into a crisis of performance or something like that, it might ramp all the way up from 30 to 100 if you need to go and do an entire sprint that's paying down debt. And sometimes, if you're trying to meet an external objective, the 30 goes lower.
Speaker 5:But over the longer arc of time, I've seen that work well as a balance between the 2.
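To make the two-backlog idea concrete, here is a minimal sketch, in Python, of filling a sprint from a feature backlog and a technical-debt backlog with a roughly 70/30 story-point split, as Tom describes. The task names, point values, and helper names are illustrative assumptions, not anything prescribed in the conversation.

```python
# Sketch only: fill a sprint from two priority-ordered backlogs with a
# ~70/30 split of story points between features and technical debt.
# All names and numbers are illustrative assumptions.

def plan_sprint(feature_backlog, debt_backlog, capacity, debt_share=0.30):
    """Pull (task, points) items off two priority-ordered backlogs."""
    debt_budget = capacity * debt_share
    feature_budget = capacity - debt_budget
    sprint = []

    def pull(backlog, budget):
        # Take tasks strictly in priority order until the next one doesn't fit.
        taken = 0
        while backlog and taken + backlog[0][1] <= budget:
            task = backlog.pop(0)
            sprint.append(task)
            taken += task[1]
        return taken

    used_debt = pull(debt_backlog, debt_budget)
    # Any unused debt budget goes back to feature work.
    pull(feature_backlog, feature_budget + (debt_budget - used_debt))
    return sprint

features = [("customer-facing API", 5), ("dashboard graphs", 3), ("bulk import", 8)]
debt = [("flaky test cleanup", 3), ("index rebuild", 5)]
print(plan_sprint(features, debt, capacity=20))
```

In a performance crisis, the same sketch's `debt_share` could be raised toward 1.0 for a sprint, matching the 30-to-100 ramp described above.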
Speaker 1:So you're going, like, full-in: story points, epics. I mean, you're...
Speaker 5:It sounds like... Exactly.
Speaker 1:Yeah. The whole 9 yards. And then how do you kind of assure, and maybe this is what the scrum master's job is when it is working, that the team has that focus for that period of time? Because I think it just becomes very tempting. You know, a crisis arises, and, you know, we need to... you know?
Speaker 1:Okay, we need to go, we've got a customer... it just feels like what we said, you know, a week and a half ago, we actually now need to change our direction. And... yep.
Speaker 1:You know, how do you... and maybe that just didn't come up that frequently, or maybe the team was really protected.
Speaker 5:Yeah. Across hundreds of sprints, I've never had to abort. And, like I said, my most common duration would have been 3 weeks. So, you know, on average then you'd be a week and a half in when something would have changed. Maybe it happened one time, but something changing so radically that everything you've got in terms of tasks you've taken into the sprint is now irrelevant...
Speaker 5:That's just an extreme circumstance that is really rare, given it's only a week and a half ago that you planned what tasks you're gonna take in, and they were the top priority ones, you know.
Speaker 2:Hey. So, Tom, I'm sure you've seen lots of failed agile implementations, or folks talking about those. What do you think distinguished the success that you had against what seems to me like a pile of... I've seen mostly failures, again, not personally, but as I've brushed against these organizations.
Speaker 5:Teams that were doing agile badly, or doing something that they called agile but really wasn't... the most common thing that I've seen take teams astray is they don't do demos at all. And so accountability breaks down. And the contract that I talked about earlier, and the trust that you get between the product leader or the product owner, or in some companies the product organization, and the engineering team, fails when you don't do demos. And I think that's a real loss, not just because of the demise of this contract and what it represents in terms of being able to hit velocity.
Speaker 5:But also, if you construct it well, demos can be a real celebration of the work achieved. And I've seen them help engineers to raise their game, when I bring in EAs and paralegals and other people and just say, hey, can you sit in on this team's demo? And they actually love it, because they're in a software engineering organization, but they feel in the rest of their jobs that they're somewhat tangential to the nuts and bolts of what the company is delivering, and they're real happy to come in and be part of the audience of a demo. So that's the most common thing that I see not happening. The second most common thing, frankly, is that retrospectives don't happen. And so there's no sharpening of the saw, iterating on the process, and celebrating the things that work well with respect to process improvements over time.
Speaker 5:But more importantly, a lack of attention to the parts that don't work. I'd say the third thing is obsession over estimation and being good at it. And I think that, at the extreme margins, maybe, doing agile and estimating your work every 3 weeks for years at a time, people eventually get a little bit better at estimation, but not by very much. So, you know, people will get it wrong.
Speaker 5:And sometimes the sprint will blow up because something you took on as a 3 pointer turned out to be a 13 pointer, and you completely ran out of capacity. And that's okay. You have to shrug that off and carry on, try and assess what it was about it that was missed when you set out and planned and embarked on that sprint, but it shouldn't cause you to lose faith in the whole process.
Speaker 1:Yes. And I find that part of the reason that software is difficult to estimate is because it's so hard to know... you know, we thought this part operated in this way; as it turns out, it's actually misdocumented, or it actually has defects, or we didn't understand how it was used. There are so many layers to software. It's very easy to end up with a problem that you thought was simple actually being much more complicated than you thought, not because of a lack of foresight, but just because of the amount that's unknowable in so many different kinds of software systems before you actually get in and implement them. Totally.
Speaker 5:And so one of the key things, Brian, is being honest with the fact that that's the case. Over time, you agree on certain tasks that everyone understands the contours of, and say, you know, for this team, this was a 3 pointer, this is what a 3 pointer looks like, and this other thing was a 5 pointer. And you keep those as golden tasks. And the next time, maybe in 6 months' time, the team will be different in terms of its velocity. So you come along with something that looks a lot like that 5 pointer that you took 6 months ago and say, okay, this is really like that.
Speaker 5:This should be a 5 pointer as well. And you find out that it turns out to be an 8 this time around, because the profile of the team has changed a little bit, or there was some misunderstanding in terms of what it takes to get that done now and how it's different from 6 months ago. But, I think... yeah.
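A rough sketch of the golden-task recalibration being described here: keep reference tasks at known point values and, when a known 5-pointer re-estimates as an 8, scale other estimates accordingly and snap them back to the Fibonacci scale. The reference tasks, numbers, and helper names are assumptions for illustration only.

```python
# Sketch only: "golden" reference tasks for relative estimation, plus a
# crude recalibration when the team's velocity profile changes.

FIBONACCI = [1, 2, 3, 5, 8, 13, 21]

golden_tasks = {
    "add a field to an existing API and plumb it through": 3,
    "stand up a new internal service behind the load balancer": 5,
}

def nearest_point(raw):
    """Snap a raw size estimate to the nearest Fibonacci bucket."""
    return min(FIBONACCI, key=lambda p: abs(p - raw))

def recalibrate(old_points, new_points, estimate):
    """If the old 5-pointer now lands as an 8, scale other estimates too."""
    return nearest_point(estimate * new_points / old_points)

# Six months on, the task the team used to call a 5 is re-judged as an 8,
# so a piece of work they would have called a 3 is probably now a 5.
print(recalibrate(old_points=5, new_points=8, estimate=3))  # -> 5
```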
Speaker 4:Additionally, I think one of the reasons that it's so hard to estimate these things is that the industry is still very young. You know, we've been writing software now for, what, like 60 years, somewhere around there? Or maybe 70. Compare that to human beings building bridges, where we've been doing that for thousands of years. The body of knowledge around how to do that and how to successfully build a software project just isn't there.
Speaker 4:And, you know, I go back again to this context thing, and sort of the rise of agile. I think it's very important to contextualize these things in the sense that, in the nineties, nobody... I mean, we're still not good at this, but nobody knew how to do it back then. And that was why you would have the project manager pulling you in 3 times and being like, can you shave 5 hours off of this thing, and we're gonna track this stuff meticulously. I mean, does anybody remember things like the Personal Software Process and the Team Software Process and the Software Engineering Institute and stuff that those guys were putting out?
Speaker 2:Oh, yeah. Yeah. Yeah.
Speaker 1:Yeah. Definitely. I mean, we learned that we were all sinners because we weren't working on the space shuttle software. There was only one SEI level 5 organization in the world, and you don't work for it.
Speaker 4:Yeah. Exactly. But, you know, a lot of those processes were oriented around very fine grained tracking of your time. And it was like, write down everything you do all day. You know?
Speaker 4:And I remember... I tried to do this, and I ended up stopping because I was so embarrassed. Because I was like, well, it's 1999, and half of my time per day is spent reading Usenet, and maybe a couple hours a day are actually applying fingers to keyboard and text editor. And it's like, gee, am I a terrible software developer as a result of that? Well, the answer is kind of no, because I was learning a lot by reading Usenet at the time.
Speaker 4:And I could take those things and I could apply them to my work. And then, you know, that 4 hours might save me 2 weeks or some crazy thing like that. You know? And so when I look at things like agile and indeed, all of these processes and methodologies, it's all about trying to be like, hey, there's this completely unruly discipline that nobody knows how to do properly. Let's try to figure out a way to bring some structure and order to this thing, because right now it's just chaos.
Speaker 4:And chaos is not good from a business perspective.
Speaker 1:What else do you mean?
Speaker 5:The key point here, though, is that this is a young discipline. People will continue to be poor at estimating the effort to achieve a certain task for some considerable time to come. But one of the great things about agile is you get people working on the most important tasks at the top of the product backlog. And the way in which I've seen projects go off track in the worst way is you have teams of people working on the wrong
Speaker 2:thing. But, Tom, doesn't that just reflect a disconnect with the customer and the customer's needs? Like, understood that that's part of the Agile manifesto, but isn't it that lack of that true north?
Speaker 3:It is. It is lacking it. But a lot of the meetings that we're talking about here, and this conversation has been mostly scrum focused, most of these meetings are about facilitating communication between the people who are developing software and the people who are having conversations with customers, and making sure that everybody has the full context of what's happening. And some people really benefit from the level of rigidity that is set out by these individuals who are proposing very specific, rigid processes, at least at first, because prior to that, they were just not having these conversations. The engineers were not talking to the product owner or to the support engineers or to the salespeople or to anyone to be able to really understand the full context of the thing that they're doing.
Speaker 3:So these meetings are mostly about facilitating that communication. And I find that teams that become high performing will often start off with one of these more rigid processes, then have some number of retrospectives in which they pare down the number of meetings to the things that are truly delivering them some value in communication, to the point where they get a good balance between having those communications to gain context and having the space and time to really have uninterrupted blocks in which they can focus on producing software. Also, to your point, Dan, in the US at least, we're pretty bad at estimating how long it takes to build a bridge and how much it costs to build it. So I'm not sure bridge building and subway building are particularly gold standards in terms of estimation.
Speaker 4:Alright. Fair enough. Fair enough.
Speaker 1:Well, absolutely, Ian. And to your point, if you take construction, and let's not talk about a bridge, but maybe a parking lot or something where we've done this before: I'm gonna put a foundation in for a house. I have done many of these foundations. This foundation is no different than the other foundations I've done.
Speaker 1:I've got very good estimates for how long this is gonna take me. I know exactly how long it's gonna take me because there's a repeatability there. And probably like any software engineer here, if I were to go rewrite the software that I have written over and over again, I would get really, really good at estimating how long it would take me to go do it, because I've done it many times before. We don't do that in software. We don't rewrite our software.
Speaker 1:What we are doing is, in fact, often entirely not repeatable. It's new. And whenever a construction project looks like that, Ian, to your point, it has all the processes that software has. So we are in the Bay Area here, where we, for somewhat strange reasons, decided to build a self-anchored suspension bridge for the east bay span of the Bay Bridge. Only the second self-anchored suspension bridge ever made.
Speaker 1:This allows you to not have these concrete anchorages, which I guess are very upsetting to a bridge engineer. It's not really clear to me. I mean, I know that we software engineers do this all the time, so I shouldn't besmirch a civil engineer that's doing this. But in a self-anchored suspension bridge, as my neighbor, a cynical civil engineer, says, you get one bridge for the price of 2, because you have to build a bridge, build the bridge on top of that bridge, wrap the actual cable around the bridge, and then effectively destroy the first bridge that you built, and the thing will anchor itself. And that bridge went like 6, 7 x over budget in every conceivable dimension, had all sorts of problems that no bridge had ever seen before, because they were looking a lot more like a software project in the way they were doing it, and there were many aspects about that project that didn't have analogs in previous projects.
Speaker 1:So I don't feel that this is... you know, Dan, you're saying this is because software is new. I think it's deeper than that. I think that software is this very beautiful but paradoxical confluence of information and machine, and it looks like both and neither at the same time. Developing software, to me, is as much writing a novel as it is building a bridge. In fact, it's kind of the confluence of the 2 of them.
Speaker 1:And it can be, as a result, really, really hard to have visibility, especially into the most innovative software. I mean, if you take the software that has been the most important and you look at the history of its development... that's why, Tom, I'm really interested now in the history of S3, because S3 would be a real counterexample to that, where you've got a really innovative, very important piece of software developed with a methodology that's often not associated with that kind of software. Can someone write a book on that, Tom?
Speaker 3:I would just be
Speaker 7:So I think the repeatability aspect of that is sort of part of the context that Dan was asking for earlier. There's some talk about whether or not the authors or the signatories of the manifesto were writing code or not. But I think it's actually more important where they worked: not the type of projects that they worked on, but that customer relationship. You know, I don't disagree with anything Tom said, but what he explained was mostly about how it became a management methodology.
Speaker 7:But it started as more of a customer relationship tool. So when you showed up to that demo and you showed the working software that it values so much, you were showing that to the customer. These were like consultancies. They were building the 15th website that year, and all the websites were mostly the same. So there really was a repeatability aspect to it.
Speaker 7:But you were sort of saying, okay, Dan's right, we don't really know how to make this, but we're gonna try our darnedest for 2, 3, 4, however many weeks, and we're gonna come back to you and prove that we did something and give you a chance to walk away. So, going back a little bit further in the conversation to the discussion of Sun and Sun's engineering relationship with the customers: you had higher sticker prices, which meant probably fewer customers per engineer. And, you know, if a product manager comes in and wants to reprioritize something, why? The customer is not gonna walk away.
Speaker 7:They were using Solaris. They were bought into Solaris for the next decade. And whatever sort of hiccup had occurred in some 2 weeks that you may have called a sprint, it wasn't gonna change the relationship with that customer. So it wasn't worth doing. I think there's a gap that we've sort of been eliding over, where it jumped from manifesto principles that were adopted by teams to this, like, management methodology that is, quote, unquote, implemented across an organization.
Speaker 7:Yeah.
Speaker 1:True. Sorry. Go ahead, Ted.
Speaker 4:Alright. Since we're also sort of looking at the unicorn successes here, I would claim that for every successful deployment of Agile methodologies in AWS, there's a significantly larger number of failed applications of the methodology across a whole bunch of other organizations. And we've talked a lot about scrum specifically, but at the time that the agile manifesto was written, I don't know that the signatories were thinking, like, yes, scrum is the thing that should be. I mean, there were other things that were sort of in play at the time, like extreme programming.
Speaker 4:People still thought that was a good idea. And, you know, it's like, what we have to do here
Speaker 7:Well, the manifesto itself was a compromise between all of those people. Like, the inventor of scrum and the inventor of extreme programming were signatories, and the manifesto was sort of the, okay, what are the things we can all agree on? Because I'm sick of fighting with all you guys. Let's turn our attention outward and spread our gospel. We're basically
Speaker 1:Which gets to kind of a good point, though, in terms of, like, I'm sort of disagreeing with these other people; well, maybe you're focused on slightly different problems. I mean, the one thing that I think is a persistent point of frustration with Agile is this kind of trying to apply it to every kind of problem. And I think it applies really well to some kinds of problems and not really well to others.
Speaker 1:I think there's a real danger there. We used to call this a silver bullet, right? There's no single silver bullet, and I think there's a danger of deluding ourselves into thinking that there is one. One thing I like about the manifesto is that it is actually more directional. Or rather, a common failure mode of people who are not applying it properly is this kind of lack of a demo. And, you know, the idea that working software is the primary measure of progress is highlighted in the manifesto.
Speaker 1:And yet you go to a lot of these agile coaching kind of pages, and they don't really talk about the demonstration as much. They talk about chunking up the work, but not emphasizing the demo. I think it's kind of because... and certainly we found, just to speak from an Oxide perspective, that demos are great. They're energizing. They bring teams together.
Speaker 1:They give something very concrete to focus on. They can also be, though, very small. And so, Tom, one question I've got for you on the demos. Again, like I said, we've got a particular set of problems we're solving, but, you know, someone will demonstrate something booting, which to most people would just be like, is that even, like, a problem? And, of course, for us, that's amazing.
Speaker 1:You know? You'll get a guest booting with a particular device being attached or what have you, even though it's not very impressive as a demo. And, Tom, I gotta imagine that a lot of those S3 demos along the way were only impressive to those who were immersed in the problem being solved.
Speaker 5:Of course. Yeah. Seeing something boot, if it was previously unable to boot, is remarkable. A demo can be showing a graph that shows a shift in performance of something that isn't directly tangible for the participants who are observing the demo, but the graph carries the message.
Speaker 5:Yeah. It won't always be, you know, the most entertaining thing. But, again, it's the closing of the contract, and there's this sort of circular dimension in which the demo shapes how you qualify the things that get taken into the sprint. Because sometimes you see a task and you feel like, okay, that sounds like the most important thing to do, and I get it. But you start to think about, wait a minute, how would I demo that in 3 weeks?
Speaker 5:What would a demo, and a declaration of success against that, really look like? And there are times when you sort of step back from it and say, hey, wait, this really isn't takeable, because there are actually many different pieces of it, or it's too ambiguous to actually be able to demo something and declare that you successfully executed against how that task is described. So it needs more work.
Speaker 1:And, Tom, did you find teams inside of AWS that were taking aspects of it? Because certainly, when I look at software development that has gone well or been successful, there are pieces of agile that are often present: things like being demo heavy and focusing on working software and being able to iterate quickly and so on, but without some of the other rigidity. Or did you find that it was kinda like, no, listen, if you wanna take a fraction of this, you gotta take story points and the whole works?
Speaker 5:You know, you take the stuff that's of value to you. Call them story points or tokens or whatever, but there's some way of, you know, using Fibonacci numbers or the relative granularity of these sizes, in terms of how many of that sort of task would add up to this one. Bottom line, though: being prescriptive does you no good, but being attentive to what within the methodology works for you, and doubling down on that, I think, is the path to success. And by the way, Agile isn't just used in AWS; it's used throughout Amazon by, at this stage, probably many, many thousands of teams.
Speaker 1:And then I would argue Yeah. Sorry. I didn't
Speaker 2:hear that.
Speaker 6:The core idea behind agile is you don't know the requirements at the beginning. You have to build things and iterate and demo it to people and find out new things you didn't know about the requirements. The exact same thing is true for your agile development process. If you say, oh, I got this book and we're gonna follow this book's procedures exactly, you've just waterfalled your business by saying we're gonna follow agile perfectly. Like, no, you see what works for your team, and the things that work for your team you keep doing and evolve on, and the things that work badly you abandon. Don't do those things.
Speaker 6:I mean, to use the watchmaker analogy, if you come across a watch and it works, it's not because someone sat down, thought real hard about how to design it, and said, I have this perfectly intelligent design for a watch, now I'm gonna build it. No, they made thousands of different iterations and found out what worked and kept those things and made it more complicated and showed it to people who were like, you know what, I need a minute hand. And they're like, okay, let me go back and add that.
Speaker 4:New invention for 2022: the portable sundial. But isn't that, though, in direct opposition to what some of the agile manifesto signatories will say on forums like Twitter? And again, I go back to people like
Speaker 1:Ron Jeffries and Robert C Martin and so
Speaker 4:forth who, you know, Martin specifically will tell you if you're not unit testing everything, then you're wrong. And you don't need static piping, for example, because I just write all these unit tests. I don't have errors in my code because I'm a disciplined clean coder. Like, I just I mean, it it it sounds to me and I this this becomes the one true Scotsman thing at some point, but people are describing aspects of Agile methodology that do in fact work. And this goes back to what we were talking about at the beginning of the hour where, yeah, there's kernels of truth in all of these things.
Speaker 4:You know? But when we talk about agile programming, like, what does that mean? You know? And some, like, philosophical
Speaker 7:I think its chameleon nature is actually the key to its success and why we all pretty much hate it. It's that, you know, in the nineties we didn't know how to make software. Software was mainly used to sell servers or to run some particular type of business. And, I forget the author of the book Brian mentioned about, you know, the demise of programmers. Yourdon.
Speaker 4:Yeah.
Speaker 7:And, you know, I actually think that that book is right, but early. Eventually there is some point where we're gonna have created most of the software that is opportunity-cost based, and we're gonna go back to the type of software that is just incrementally used to improve a business, like it was in the eighties and early nineties. The thing that caused that book to be wrong, in my opinion, is things like Agile, which allowed the industry to scale up management precisely through letting people pretend to agree while actually disagreeing, going off and doing their own thing, and creating some software. And in the end, you know, it would be revealed that nobody was really talking about the same thing anyway, but everybody was left with working software to sell.
Speaker 7:And so, you know, money makes a lot of problems go away. So at some point we're gonna hit some new barrier. You know, miniaturization of technology was a key to letting Agile thrive, but we need, like, another management methodology and another miniaturization wave or something like that in order to push that book's predictions off another 20, 30, 40 years. But eventually, I think we can all tell, we're gonna get to that point. It's just: what is the, you know, social mechanism that we're all gonna use to keep pushing that off until we retire?
Speaker 1:Yeah. You know, I don't know if I've got the stamina to go back and reread that book. It's a book that I threw out, enraged, as a younger software engineer. So I'm not sure I could actually go back and reread it. Maybe I should.
Speaker 1:Yourdon, it should be said, had a long history of getting things very, very wrong. Yourdon became somewhat infamous for forecasting that Y2K would be an apocalypse; he moved off the grid in New Mexico, I believe, into a bunker, where he continued to hold court and explain to any media that would come calling how software would effectively destroy civilization at Y2K. And there's actually a great, I'd love to be able to find it, there's a great interview on what was then the MacNeil/Lehrer NewsHour with Yourdon, where he claimed that New York City would not have water on the morning of January 1, 2000. And, good on the MacNeil/Lehrer NewsHour, they actually went in and interviewed the folks that were responsible for New York City's water, the Department of Public Works.
Speaker 1:And they were like, we did not understand why you wanted to interview us with respect to this story. Well, you know, there are some folks in software engineering who believe that the water system will not operate on January 1, 2000. Do they understand the way it works? Well, so, the water is in the Adirondacks, which is at a much higher elevation than New York City, and it just works by gravity, actually. And it's like, well, are there microprocessor-controlled valves?
Speaker 1:It's like, no, let me show you what one of these valves looks like. Of course, they're gigantic. Right? And, of course, New York City had water, and there were no water issues at Y2K.
Speaker 1:But so Yourdon often got things wrong. I mean, he was, he was always
Speaker 6:No. We misinterpreted the sacred text. It's actually 2038.
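For context on the joke: 2038 refers to the Year 2038 problem, where a signed 32-bit count of seconds since the Unix epoch runs out. A minimal sketch of the arithmetic, doing the date math directly from the epoch rather than relying on any platform's time_t:

```python
# The Year 2038 problem: a signed 32-bit time_t holds seconds since
# 1970-01-01 UTC and tops out at 2**31 - 1.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
INT32_MAX = 2**31 - 1

print(EPOCH + timedelta(seconds=INT32_MAX))
# 2038-01-19 03:14:07+00:00 -- the last representable second

# One second later the counter wraps to a large negative number, which
# naive code reads as a date in December 1901.
wrapped = (INT32_MAX + 1) - 2**32   # -2147483648
print(EPOCH + timedelta(seconds=wrapped))
# 1901-12-13 20:45:52+00:00
```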
Speaker 1:That's right. It's 2038. And 2038, actually, it will be interesting to see what happens there. I do feel that, like, and, Tom, I guess one question I definitely have for you: it is unquestionable that there is some of this miasma around Agile. Even this blog entry that we're talking about refers to a Dave Thomas piece where Dave Thomas gets very frustrated with Agile and says that, you know, once the manifesto became popular, the word agile became a magnet for anyone with points to espouse, hours to bill, or products to sell; it became a marketing term, which I think is what a lot of us are reacting to. I mean, it must feel to you that it's, hey.
Speaker 1:This is somewhat unfortunate: this term is being sullied by the hucksters when this process, as described, has had so much value for you in your personal history.
Speaker 5:Yeah. A little bit, Brian, but I don't pay a lot of attention to
Speaker 1:that. Fair.
Speaker 5:You know, I've just seen so many positive proof points of how it has worked for teams. And each time I've been in an organization where they said, you know, that just hasn't worked for us, and I've asked, well, tell me about your implementation and so on, you come across people who are following a blog post or following a book, but not really. And, I
Speaker 1:mean, Tom, you know what this means. You have to write your own book, obviously. I mean, you gotta be
Speaker 5:maybe there are too many already, and that's the root cause.
Speaker 1:I think there's some truth to that. It just feels like one needs to be less prescriptive when talking about software; it's very hard to be as prescriptive as a lot of these folks are, because there are too many opportunities for it to go wrong. But I would be curious to learn more about the development of S3 in particular, because, as far as I'm concerned, it is effectively the first publicly available web service, really. I'm sure there are counterexamples to that, but it's definitely very early. And
Speaker 5:Yeah. To me, Brian, I did an article for ACM Queue. Oh, it's in, it's in CACM.
Speaker 1:Yes. Yes. Yes. Yes.
Speaker 5:Yeah. A conversation with Werner. But it really gets to not just, you know, the early days of how it was built out and so on; my starting point was that, at launch, it was composed of 8 services. And last, oh, actually, I guess it was a year and a half ago at re:Invent, Werner casually threw out that it was made up of, like, 262 services. And I felt like there was a fascinating nugget there in terms of how something can scale and keep fundamentally the same characteristics over a period of 14 years, and evolve from 8 services to 262, and from a single Agile team to a very large number of teams now; and both what it was about how it was initially designed and constructed, and about how the teams worked, that facilitated that evolution, because that's evolution on a scale that we don't often see. And also, you know, what were some of the unanticipated surprises that popped up along the way?
Speaker 5:So
Speaker 1:Yeah. And we'll obviously link in the show notes to this interview between you and Werner. So it sounds like, in terms of the actual development, you do mention in that interview that it was an Agile team in the canonical sense. It would be interesting to know: was that the first Agile team inside of AWS, or had it been adopted earlier?
Speaker 5:It was the first. And
Speaker 2:Interesting.
Speaker 5:The guy who was leading that team, and who scrum-mastered that process for however many sprints it took, was a very pragmatic advocate of Agile. And his next role after launch was to be a scrum master trainer throughout the company,
Speaker 1:be interesting to to actually, read that history, and certainly I did the interview. It's great. I know I read it when it came out, but it it merits a a reread for sure.
Speaker 6:So the idea is that Agile has proven to be something that's very powerful but easy to get the implementation wrong. Is there something that is to Agile what Rust is to C?
Speaker 2:Way to bring it back.
Speaker 1:Exactly.
Speaker 8:To me, so far, it feels like Agile is more of a guideline than a target to hit. Every team I've been on has treated it as such. I can't say that anyone has strictly followed Agile as a methodology, or Scrum strictly as a methodology. And going back to, like, measuring effort or time, I'd say if you do it over the long term, estimates are garbage. But the one thing that you do find is that if you have the same team for an extended period of time, and people are doing some sort of measurement, it does get better. But once your team changes, that can go under the bus really fast.
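One way to read that "some sort of measurement" point is velocity: the points a team actually completes per sprint only become a usable planning number after several sprints of the same people measuring the same way, and the history stops predicting anything once the team changes. A small sketch with made-up numbers:

```python
# Rolling velocity: average story points actually completed per sprint.
# All numbers here are invented; the average only means something while the
# team's composition (and its way of measuring) stays stable.
completed_points = [11, 9, 13, 10, 12]   # one entry per finished sprint

def rolling_velocity(history, window=3):
    """Average of the most recent `window` sprints' completed points."""
    recent = history[-window:]
    return sum(recent) / len(recent)

print(rolling_velocity(completed_points))        # ~11.7, a sane next-sprint commitment

# After significant turnover the old history stops predicting anything useful,
# so a common response is to re-baseline rather than keep averaging across teams.
points_after_turnover = [6, 7]
print(rolling_velocity(points_after_turnover))   # 6.5, the new baseline
```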
Speaker 7:That's been my experience. I was struck when Tom said that the scrum master stuck around the S3 team for the duration of the project. I've worked on about a dozen different Scrum teams, and I think on only one that I can remember did the entire team stay around for an entire project. And I'm talking, like, 4 months. So I think, you know, Agile is sort of successful in organizations and teams that have trust or can build trust, and it's unsuccessful when you don't have trust or can't build trust.
Speaker 7:I think, you know, in the absence of trust, one of the ways to build trust is reliability, and that's where all the ceremonies come in. But to go back to, was it Erin's question about what is better? I think you can look at just about any methodology that provides a set of points that people can rely on, in lieu of that trust, as an on-ramp to trust. But really, if you talk about successful Agile teams, they're almost always high-trust
Speaker 1:teams. So you are getting to, what is, I think, my favorite line from Tracy Getters' A Solved A New Machine, where Tom West deliberately decides that he's going to manage via trust. And the line that I love is he says that trust is risk, but that he found that the because I def I absolutely agree with that, that I think that the the the best work that we do on teams are when we trust one another, when the team trusts one another, and when I think that's when people feel they can do their their their best work. And how do you I I mean, I think you you want to have that trust. And then I have also found the other bit that I found that is extremely useful is having that those demos.
Speaker 1:The fixed cadence, I haven't quite gotten to, but I think that the or not on that, that tight cadence of that kind of 2 to 3 weeks. But having a a a real demonstrations, I think, are really important at for all the reasons that that you talked about, Tom, in terms of as serving as a as a team catalyst and everything else. But I but there's still a lot of ambiguity in there, and that's not prescriptive at all. But those are just the things that I've observed that are, tend to be true across high performing teams. So I think we we've we've hit the hour here.
Speaker 1:Adam, any, closing thoughts?
Speaker 2:You know, I thought this was a great discussion. And Tom, particularly, thanks for joining. In this blog post, I thought one of the most interesting questions it raised in this 20-year retrospective of Agile was: what do we wanna do differently next time? And in this discussion, the thing that's been coming up is that when Agile has failed, the answer can't just be, you're doing it wrong. You're not doing it enough.
Speaker 2:You're not believing in the religion sufficiently. But how do you execute that evaluation? How do you know if it's working for your team? And I think that's what I'd wanna see
Speaker 1:a book I'll buy for sure. Alright. On that note, thanks everyone, and we'll see you next week. We're sorry for the hiatus; we were kinda out for the last couple of weeks, but hopefully we'll get back on a more regular cadence. Thanks, everybody.
Speaker 3:Bye. Thank you. Thank you.
Speaker 1:Take care.