On Code Review

Speaker 1:

Alright. Excellent. We're here. The app, it seems to be working. It started after my minute of terror.

Speaker 1:

So, you know, really excited for this, and thank you so much for the blog entry on code review. I've got a lot to talk about, or ask about, on this topic. Just before we get started, Adam, I have, like, a baseline question for you, and Kendall, maybe for you as well. Are we of the last breed to have done an in-person code review? Do people still do in-person code reviews at all anywhere?

Speaker 3:

I still do in-person code reviews, specifically when a code review is very large or covers a couple of things that I don't have context on. Another time I do in-person code reviews is if there's just a lot of back and forth in a code review. Like, if an online code review takes more than, like, 2 or 3 revisions, I would just sit down with the person and do it at their desk. Or I used to do that when I was in an office.

Speaker 4:

That's right.

Speaker 1:

Oh, okay. So I guess that's the follow-up question, because I'm just realizing that I don't think I've done an in-person code review since, like, 2006 or earlier, I guess. Yeah, 2006. But I really kinda miss it, and I'm wondering, so, Kendall, do you do an in-person code review even in the post-pandemic world?

Speaker 1:

I mean, do you do it over video chat? Or

Speaker 3:

I haven't done that yet, in the post pandemic world.

Speaker 1:

But I liked your rubric about kinda when to go in person, and I've got a bunch of kinda follow-ups on that. But first, can you give us some context? What prompted you to write the blog entry? I think, as we talked about it last week, this is something we don't talk about nearly enough.

Speaker 3:

Yeah. So my former company, Indeed.com, they started doing these internal panel discussions with some of the senior engineers. So there's a bunch of topics on, like, observability, unit testing, design, and code review. And the one that I felt comfortable, like, volunteering to be on a panel for was the code review one. And so it just forced me to, like, think about how I do code reviews and write it down, and it was easy enough to turn those topics into a blog post.

Speaker 3:

And so that's kind of how this ended up, on my blog.

Speaker 1:

Oh, that's great. So there was a little bit of prompting then from kind of an organizational structure. Because I thought the blog entry was great. There are a lot of things in here that resonate with me. I don't know, I would love to... I'm sure you felt the same way.

Speaker 4:

Yeah. For sure. I mean, I think there are a lot of great topics in there that definitely resonated with my experience. And I've seen the opposite of some of the positive advice you gave; I've seen the consequences of the negative advice, or the absence of some of those things, too. So I thought that

Speaker 1:

that was great. Yeah. Kendall, you did a really good job keeping it really positive. But, surely, you must have been having some negative experiences in your mind on some of this stuff.

Speaker 3:

Yeah. I mean, I think, especially in conversations like this, it's a lot easier to talk about some of the negative things. But for something public facing, and just trying to get the essence of my thoughts down, everything was, for the most part, framed in, like, a positive context.

Speaker 1:

Yeah. Well, I have to say, I love that your first piece of advice is one that really, really, really resonates with me. Namely, review your code first. I have always found... first of all, let me just offer full disclosure: I don't think I'm a very good code reviewer.

Speaker 1:

I think that I should, and I think I probably need to stop thinking of myself that way, because I think it's something I could get a lot better at. But I just don't feel like... Adam, you had a story that you should probably relay that I feel was very revealing.

Speaker 4:

Oh, the one I mentioned beforehand? Yeah. Well, I was thinking about this topic over the last hour, and I was thinking back to my first code review at Sun. And I think I was an intern, working for Brian and our colleague, Mike Shapiro.

Speaker 4:

And I wrote some addition to our debugger, and I gave it to Mike to review. And it came back... I think, like, every other character had, like, a red line on it or whatever.

Speaker 1:

And, I mean, it should be said that you're not talking about metaphorical red here.

Speaker 4:

No. No. So, yeah. We killed trees to do these code reviews. Like, we printed them out with, like, a diff-to-PostScript generator and sent it to the printer, and it would sit there and generate, like, you know, not in this case, but in some cases, like, hundreds of lines of review.

Speaker 4:

And in fact, online code reviews, when they came out in the Solaris kernel group, were hugely controversial. Well, maybe "hugely" overstates it.

Speaker 1:

And also, the red was not metaphorical. Mike took out his red pen. And I still find it... actually, do you find it a little bit triggering to even talk about the red pen? Like, I find it... do you know what I mean? Because, you know, he had, like, this body language around it, like, oh, you want me to review this? And I swear he would, like, turn and take out a red pen, like a parent taking out, like, a paddle.

Speaker 4:

And... that's right. Like, cap the blue pen, take out the red pen.

Speaker 1:

Right. Cap... you know what I'm talking about. Right? Like, I didn't make that up. Yeah. Absolutely. And then, like, if the red pen cap comes off, you're like, fuck. Here it goes.

Speaker 1:

Like, I am about to get it's just gonna get red ink. It's gonna be bled on this review.

Speaker 4:

Yeah. And that is what happened. And, Brian, I think you saw the code review sort of sitting on my desk, like, dripping in ink. And you were, like, you'll figure it out. Like, you'll figure out how to do this thing.

Speaker 1:

I think I did.

Speaker 4:

But, you know, it took a minute.

Speaker 1:

I think that may be why I'm a suboptimal parent as well. I think that's, like, kinda my parenting. I think it was like, yeah, you'll figure it out.

Speaker 4:

You know, I like to think of it the way I did at the time, which was, like, Mike takes everyone through the wood chipper, which was, I think, basically true. That's how I chose to interpret it at the time. But who knows how it was intended?

Speaker 1:

So, alright. I thought the point of that story might have been that I was inattentive as a code reviewer, but this is putting a much more positive light on it.

Speaker 4:

But I will say, now that you've broken the seal on it...

Speaker 1:

Okay. Good. Yeah. If you

Speaker 6:

I yeah.

Speaker 1:

Why? Go ahead. I'm gonna have to say it.

Speaker 4:

One of the things we talked about in last week's call was that I think there are different people I'd go to for different kinds of review. And if I wanted an extremely thorough, deep, incisive review, it's fair, Brian, that I would not necessarily turn to you. And if I wanted, like, a rubber stamp, you know, I had a number of people who could provide rubber stamps.

Speaker 1:

Thank you for stopping short on that one. I really appreciate it.

Speaker 7:

There we go.

Speaker 1:

Well... and Kendall, you raised this point too, about picking your reviewer, about finding someone... because I thought it was really interesting. It kinda sent me to the bystander effect. Kendall, do you wanna talk about the bystander effect and what you wrote about in terms of assigning someone specific?

Speaker 3:

Yeah. I mean, so in, like, a general context, the bystander effect can be described as: you see a crime on the street, and instead of calling 911 yourself, you assume the other 30 people around you will do it. And those other 30 people are thinking the exact same thing, so nobody actually calls 911. Same thing with code reviews, in that, typically, anybody on the team is allowed to do reviews and approve reviews.

Speaker 3:

But if you don't assign it to a specific person, everybody assumes that somebody else is gonna look at it. And I found that, like, making 1 or 2 individuals responsible for getting it done puts it on that person's, quote, unquote, to-do list. But it also gives you somebody to reach out to directly if you've been waiting on review for a couple of days.

Speaker 4:

Yeah. I think that's great advice. And the way I've seen that go wrong, though, is that the person who's being asked to review, I think, should also be asking themselves, am I the right person to review this? And what kind of additional review could be warranted? You know, I've seen, in different groups over the years, instances where you ask the person, why did you review this?

Speaker 4:

You know, you're not an expert on this topic. Like, what, you know, why why did you, like, give this the thumbs up? And they said, well, they asked me to. And, you know, it's it's fine often to say, hey, I'm not the right person to do this. Like, the person who's gonna do the best review on this is this other person who you should assign it to.

Speaker 4:

But it's sometimes hard for the reviewer to to remember that or feel like that's, an appropriate kind of feedback.

Speaker 3:

Yeah. In the spirit of being succinct, I left a lot out of assigning someone specific to your review. I think there are things like: some people have significantly more context on the changes, so you want them to review it. Some people have significantly more experience in a particular domain or on a specific system, so you want them to review it. Sometimes someone is, like, new to the team and can ask interesting questions, so you want them to review it, or you have a mentor that you're working with.

Speaker 3:

So there's a ton of reasons to assign a reviewer. But being intentional about who you assign or ask to review is also, like, pretty important, because you don't wanna waste people's time. Right? Like, if I ask somebody for a review, it's for a specific reason. And sometimes a rubber stamp is that reason, and I'll pick people for that.

Speaker 1:

That's a... why did you pick me? Well, because you're generally a rubber stamp. Like, oh, okay. So I think... you also have a really good point about how reviewing other people's code, reviewing our peers' code, is something that is part of all of our jobs.

Speaker 1:

I like your prompt of, like, as you submit something to be reviewed, using that as an opportunity to catch up on your own reviews. Because, I don't know, that sits really right with me for a bunch of reasons. Could you talk about that in a little more detail?

Speaker 3:

Sure. I mean, I think something that most software engineers learn is, like, to try to leave things better than you found them. It's in, like, a ton of books or whatever. But my experience building software and working on teams is that people are usually blocked by code reviews, and I don't wanna be a blocker for anybody. And, usually, when I put out a code review myself, I end up being blocked on that code review getting done.

Speaker 3:

And so I just use that as an opportunity to try to unblock other people, and really low-hanging fruit for doing that is doing code reviews.

Speaker 1:

I think that is great. I think that is such a great little, I mean, life hack. I hate the fact that I'm even using that term, but I just love it, because you're exactly right that when you submit something for code review, you're kinda now blocked. You're actually ready for a context switch anyway. I also feel that it amps up your empathy, because I think one of the things that, I don't think you got into it too much, but I do think can be a problem, is that reviews that aren't really showing that empathy can be really problematic.

Speaker 1:

And it's really important that... you know, I think when you are conducting a review and submitting your code for review in the same kind of time frame, it puts you in the right... gives you the right disposition.

Speaker 3:

Yeah. I agree. Empathy is an interesting word because I feel like in asking for reviews, but also in doing reviews, it's important to have empathy, especially in, like, word choice.

Speaker 1:

Oh, man. Oh, man. Oh, this one... yeah. So, yeah. Expand on that a little bit.

Speaker 1:

I think, I mean, you are obviously just bull's-eye right on that. But, yeah, what do you mean?

Speaker 3:

Yeah. I mean, I think I'm similar to you, Brian, in that I don't think I'm a very good code reviewer, but I've gotten feedback that I am. And a part of that is just because I assume that the person knows more than me. Like, I'm not super deep in a lot of things.

Speaker 3:

I ask a lot of questions. And even if I think something is wrong, I frame it as, like, a question. Hey. Why did you do it like this instead of this? Instead of saying this is wrong, you should do it this way.

Speaker 3:

And I know when I've gotten reviews, for the most part, people have good intentions regardless of how they frame it. But the way you receive that can be either warm or, like, disheartening. And so if I was getting a review from somebody and they say, hey, this is completely wrong.

Speaker 3:

You should do it this way. Even though I know that the intentions aren't to, like, make me feel bad, that reads and puts me in a different mood than, hey. Why did you do this? This might be bad because x y z. And it forces the person to think about it rather than just giving them directives.

Speaker 1:

Man, that is just some super dense wisdom there. So there's a bunch there that I wanna take apart. One: asking questions. I think that is really important. I honestly feel like the value that I personally have served in code review is asking questions of the code that the code itself didn't answer, and getting people to write better comments.

Speaker 1:

I do feel that that is one thing that I've... I don't know. I mean, I feel like I can count on one hand the number of, like, really serious issues I have found in code review as a code reviewer. But I feel like frequently I have found: hey, where is this coming from? Or why... you know, this is not well explained, or I'm just missing some context here. And the answer to that often is a comment that can explain it.

Speaker 1:

But sometimes there's something where it's, like, oh, yeah, now that you ask that, there's actually a better way to do this, or this could be better thought out. So I think that is really, really good advice.

Speaker 4:

It's great. It's a great show of empathy too, because, I mean, the act of asking for review is sort of a vulnerable moment for the person submitting it.

Speaker 2:

So I see review as also a tool for learning about code. Right? There's nothing better than looking at some code you're less familiar with as the reviewer and having to answer those questions for yourself to some extent. And framing things as a question is just an extension of that, essentially. You ask the question and you get the answer.

Speaker 2:

Right? You learn something. I I think it's you you don't have to be positive that it's wrong. You could see something you don't understand and ask the question. Right?

Speaker 2:

Like, you may have just uncovered something that the person missed, and and then then you've you've already served your purpose. Like, you're the rubber duck. Yes. That's right. But you but but you actually force somebody to look at that again because it wasn't clear to you.

Speaker 2:

Or in the alternative, you've just learned something, and, you know, your understanding of the code base is is improved.

Speaker 1:

Yeah. I feel like my role in code review is really kind of as, like, a psychoactive rubber duck, basically. I was basically, like, a talking rubber duck. For those who haven't heard it, rubber duck is a metaphor, or it's used as a verb, like rubber ducking, where you're just letting an engineer say their thoughts aloud so they can work through the problem themselves. And I feel like, Edwin, just to your point, very often I would ask questions that would prompt the engineer.

Speaker 1:

I mean, I actually Adam, let me ask this. Whose code have you reviewed the most in your life, do you think?

Speaker 4:

That's a great question.

Speaker 1:

I mean, for me, like, there's a there's a just a super quick answer. So maybe maybe it's

Speaker 4:

Let me think about that. What's your what's your answer?

Speaker 1:

Oh, Bonwick. No question.

Speaker 4:

Oh, yeah. It makes sense.

Speaker 1:

And for a bunch of reasons. I mean, one, this is a different era. Right? This is the in-person code review era.

Speaker 4:

And you might just explain... give Jeff's short bio.

Speaker 1:

Yeah. So Jeff Bonwick was in Solaris kernel development. I came out to Sun to work with Jeff. Jeff ultimately led the charge, along with Matt Ahrens, on ZFS, developed the ZFS file system, and went on to DSSD after Sun. And Jeff had a very idiosyncratic working style. So, especially that first year that I was out, I was the youngest person in kernel development by a decade.

Speaker 1:

And Jeff had basically gone full nocturnal by this point. So this is in, like, 1996, the late nineties. Jeff was actually able to work from home a bit, in an era when you really couldn't work from home, but would still basically come into the office. And he would come into the office between, like, 2 and 3 PM, and then he would stick around the office until, like, dinner time, 6 or 7.

Speaker 1:

He would head home, and then he would be online until 4 or 5 in the morning. And I could not work from home. At the time, you had to be a certain, you know, rank in order to be able to get an at-home setup, and it wouldn't have made sense anyway, because I wasn't gonna get a SPARCstation at home. So I just was at work until 4 or 5 in the morning. I was basically keeping Jeff's hours.

Speaker 1:

And so, as a result, when Jeff wanted to get a code review at, like, 1 in the morning, I'm definitely the only one that's there. And Jeff and I were working very closely together on a bunch of other things. And I think he was, you know, looking to kinda educate me about the system. And then I think he also knew that, like, he was much more senior to me, obviously. He could just kind of, like, muscle his way through a code review to a degree.

Speaker 1:

And I found, and this is actually kinda one question I wanted to ask you, I do feel when you're reviewing someone's code, it's helpful to get their expectations. Like, are you trying to integrate this now? Am I? Because you said that, like, people were often blocked by code review. And with Jeff, and this took me a couple of years to be able to do this...

Speaker 1:

But the key with Jeff was, when you're sitting down to review his code, the very first thing you have to say is, like, look, you're not integrating this tonight. No matter what you think you're gonna do, you're not gonna integrate this tonight. Because when he was in... I mean, Adam, did you ever review his code when he was in, like, one of those locked-in mindsets?

Speaker 4:

Go go go mode and

Speaker 1:

Oh, man.

Speaker 4:

Yeah. Just, like, he would grab the pen out of your hand and give it the check mark if you needed it.

Speaker 1:

No. Wait. I mean, because he's on final approach. Right? He's on final approach on a big one, and, like, I get it. And so, you know, you're giving him a lot of feedback, and it's gonna be like, oh, yeah.

Speaker 1:

Like, well, I'll consider that in the future. I'll consider that in the future. I'll consider that in the future. This, I will also consider in the future. It's like, you know, how about we do that, like, before we integrate it?

Speaker 1:

But, so I think... no. I mean, that was definitely... to that point, do you remember Screaming Red Chairs, the conference room, Adam?

Speaker 4:

No. I don't.

Speaker 1:

Yes. Okay. Screaming Red Chairs. We must have ripped out Screaming Red Chairs when... so, Adam, you were an intern in the summer of 2000. Right?

Speaker 4:

That's right. Yeah.

Speaker 1:

So it is, like, go-go dot-com nuttiness, and we were converting conference rooms into office space. And I think Screaming Red Chairs was one of the ones that was converted into office space. So I think that was an office when you were at Sun. But we used to do all of our code reviews in Screaming Red Chairs, which had screaming red chairs. But it was in person.

Speaker 1:

It was printed out. You had a pen. And I think that one of the advantages the in-person review had, and I kinda wanna get back to this because you've raised this issue too about, like, when to go in person... one of the great things, because you're talking about how, like, words really matter: when you're in person, you've got a lot more to read, as either a reviewer or someone being reviewed.

Speaker 1:

You're getting a whole other set of body language to read that you're not getting over GitHub or whatever. Sorry. Kendall, go ahead.

Speaker 3:

No. I was just prematurely unmuting. I I assume that there was a question.

Speaker 1:

Yeah. Sorry. Well, I guess, I mean, I think that, like, your rubric now for when to go in person is... because you're talking about, like, where the words really matter. And phrasing things as a question. Actually, I'm just gonna ask you, because we just cannot resist ourselves and have to get into a tooling conversation.

Speaker 1:

What do you use? How do you do code reviews? I mean, wait. What what tools do you use to do code reviews today?

Speaker 3:

Today, mostly GitHub. At my previous company it was GitLab. And then when I was at Amazon, they had, I forget the name of it, like, an internal tool for code reviews. I also use my IDE sometimes.

Speaker 3:

I think there are some integrations into IntelliJ that allow you to do, like, GitLab or GitHub code reviews in the editor. So I do that sometimes, when the code reviews are pretty large.

Speaker 4:

Oh, I know this makes me sound like a fossil, Brian, but I miss those in-person code reviews. I miss those printed-out code reviews. Like, I like GitHub fine. I like some of these other tools fine.

Speaker 4:

But there is something about being able to, like, really write out your thoughts or explain them to a person, you know, or not be confined to commenting on one line. You know, one of the big challenges I've had with GitHub is, if a change on line 17 actually affects something on line 150, it's so hard to even see that code, to expand that window and make your intent clear. Anyway...

Speaker 1:

It's atrocious.

Speaker 4:

It's gone. That time is gone, and there's lots not to recommend about it, but I do miss it.

Speaker 1:

I think GitHub is atrocious for code review, I have to tell you. I really think it's just the worst. And I was really trying to be open-minded about this, but it is a real, real, real struggle. And I don't know.

Speaker 1:

Maybe... because kinda another point you had is, like, be mindful of the size of the code that you're reviewing, and GitHub definitely enforces that, because it does not like large changes. You know what I mean?

Speaker 4:

I'm serious. Oh, yeah. It's like, oh, there's a lot of change here. I'm not gonna show you any of it. Is that helpful?

Speaker 1:

It's like, there's a lot of diffs here, so I'm not gonna show them to you. It's like, does that feel like a good review to you, GitHub? I mean, out of curiosity, it's like, no, I'm not gonna show you... oh, there was a lot of change here, let's not look at that one. It's like, does that make sense to you?

Speaker 4:

What color should we paint the bike shed?

Speaker 1:

Right? So... Yeah. Go ahead, Kendall. Sorry.

Speaker 3:

No. Code review tooling is interesting. I don't have the experience of not using something similar to GitHub, so I don't have anything to contrast it with. But I do find, especially in my personal workflow, that even how I look at diffs is different. Like, I prefer side-by-side diffs instead of inline diffs, and most tools default to inline diffs.

Speaker 1:

Amen on the side by side. I am totally with you on side-by-side diffs. And I actually do like... obviously, you do get the side-by-side diffs. And I am a big... I still, Adam, I still print my diffs out for my own self-review. Wow.

Speaker 1:

Alright.

Speaker 8:

Good for you.

Speaker 1:

I just feel that I am much better at reviewing my own code. In fact, to try to improve myself as a code reviewer, I have tried to do a couple of things. One is to try to review other people's code the way I review my own, which is really hard because I don't have the context. The other thing that I have tried, with kinda mixed results... I will tell you a time when I am a very good reader of code, which is when I am now certain that the bug is in this code, when I'm debugging something.

Speaker 1:

I've spent a lot of time in my life debugging code, and debugging the code of others. And I get that kind of, you know, total... I mean, you almost feel like a predator with blood in the water. But that disposition is not necessarily helpful. Like, I've tried that a couple of times. And, you know, you've got kind of an interesting point about how the job is not just to find, like, bugs in the code. And I wanna get to... is it Darren's rule?

Speaker 1:

What was it... Darren's law. So tell me about Darren's law. That comes from someone named Darren, I assume?

Speaker 3:

It does. Darren was a principal engineer at Amazon. So for context, when I was at Amazon, I worked on internal security tooling. And so a lot of our things were not, like, critical per se, but, like, pretty important pieces of code for the company. And so, basically, anytime Darren looked at code or at design reviews, he always kept his laws in mind.

Speaker 3:

It's: cover your ass, availability, security, and support. And it's an easy mnemonic for me to remember. And, I mean, I think all rules are meant to be broken, right? So, like, take it with a grain of salt, but it gives you things to focus on. Like, if I'm looking at a code review and I don't know where to start, I look at, like, can this break our availability?

Speaker 3:

Is it gonna be easy for me to support this when it's in prod and it eventually breaks later? It frames the way I think of questions when I'm looking at code reviews.

Speaker 1:

I think it's great framing, because I also think it gets you off of the minutiae. It's very tempting, I think, in code review. And maybe this has gotten better over the years with the rise of formatting and reformatting tools. And I will talk about Rust in a bit, because I do think Rust has kinda changed this yet again. But I think it is tempting to focus on the smallest possible issues.

Speaker 1:

Adam, do you know what I'm talking about?

Speaker 4:

I don't.

Speaker 1:

Do you know what small issue I'm gonna bring up right now, that I've been waiting 20 years to bring up?

Speaker 4:

Weird. There's a 20-year mistake that I made? But I was saying, just back to GitHub. Hold on. I'm not teasing the subject.

Speaker 1:

Oh, yo. Yes. You are.

Speaker 4:

Is that... go ahead... the diffs on GitHub do kind of narrow the window so tightly that all you can see is de minimis changes.

Speaker 1:

Yes.

Speaker 4:

But now, I do want to know what mistake I made 20 years ago.

Speaker 1:

It's not a mistake at all. You're right.

Speaker 4:

Oh, is it my one line? Oh, geez.

Speaker 1:

Go ahead. Go ahead. It sounds like you have something you wanna say. Go ahead and say it.

Speaker 4:

Okay. Okay. Well, this is an example of asking the wrong reviewers, I guess we're gonna say. So, and, Brian, you might need to help with some of the details.

Speaker 4:

But one of the late things we did in DTrace was we converted a bunch of existing trace points, in lots of different places, to be DTrace trace points, which meant that, where previously they had only been enabled in debug kernels, because DTrace was dynamic, we allowed them to be in all kernels, in production kernels. And I had this really big diff where I was touching, like, hundreds of files, where I was taking these things out of debug content, out from behind, you know, #ifdef DEBUG, and putting them into just, like, the normal live production kernel. And, I don't know, there was a fuck-up in it. And neither of my reviewers caught the fuck-up, and I didn't either, obviously.

Speaker 4:

But I don't think either of my reviewers really looked at it. And then the the thing is you experienced this very differently than I did. So I was working from home.

Speaker 1:

But I'd like to say, this is not what I was talking about at all. This is not what I was thinking of. But, yeah, this is a great story, and we should...

Speaker 4:

This is another fuck-up?

Speaker 1:

No. No. This is not a fuck-up at all. I don't hold this one against you at all.

Speaker 4:

No. This is the... Okay.

Speaker 1:

But this one you do hold against me? What are you talking about? Well, yeah. We'll get there in a second.

Speaker 3:

So let's keep on.

Speaker 4:

So I'm working from home, like, just working on stuff. In the meantime, Brian and our colleague, Mike, were at the office. And I guess, like, somebody wandered into your office, Brian, and said, oh, by the way, you know, you guys are planning on integrating DTrace, but, like, this one esoteric system, like, its NIC is busted or something.

Speaker 1:

Close. It was a machine that did not boot. And the messenger... there would be lots of reasons to question this particular messenger, someone who I was even surprised was so detail-oriented as to be testing it. It was, like, you know, I've been trying to test the DTrace wad on this machine, which doesn't boot.

Speaker 1:

And I'm immediately like, yeah. Well, that's not I mean, okay.

Speaker 4:

That's probably you.

Speaker 1:

It's like, oh, it's almost certainly you. I mean, I will look at this, but this is almost certainly you. I didn't say that. I just thought it. I went to go look at it, because we were on final... we were within 24 hours of when we wanted to integrate at this point.

Speaker 1:

And sure enough, that machine did not boot. And then we heard from someone else, in QA, who I thought the world of, Claudia, who was very good. And she told me she had another machine that didn't boot, which was a pretty different kind of machine.

Speaker 1:

And, yeah, I went to some shit-scary places mentally. I'm like, we have done something, because we were pulling some tricks with DTrace for sure.

Speaker 6:

Oh, yeah. Yeah.

Speaker 1:

And I was like, one of the deeper tricks that I thought we were pulling off successfully can't actually be done, and I'm about to find out why. It's one of these old machine architectures. And it is actually funny that you mention it. I wasn't thinking about this at all, but we ended up actually talking about code review. Mike and I printed out all of the diffs of DTrace, and we did a self-review of all of DTrace.

Speaker 1:

And it was a one-liner that you had... no, I never blamed you for that, because that was, like, thankless work you were doing, to go convert vtrace to DTrace. But in the course of ripping all this stuff out, you deleted one additional line, and that additional line was the first interrupt that would fire on this old SCSI controller that only exists on these ancient machines.

Speaker 4:

And the thing that made me like, the the reason I held on to this is, like, you guys didn't even tell me. Like, I was just, like, fucking around at home, like, working on something else.

Speaker 1:

Dude, because I was so fucking stressed out.

Speaker 7:

You guys were, like, in the boiler room.

Speaker 1:

Absolute boiler room. It actually... so it was funny. Absolute boiler room. Everybody knows that DTrace is supposed to integrate in less than 24 hours, and something is very, very wrong on the DTrace farm, because I'm, like, just exuding, stay the fuck back. And all I'm doing, because these machines are, like, hard hanging...

Speaker 1:

So I'm just walking back and forth to the lab and then going into my office and closing the office door. Coming out, going to the lab. And into this, our colleague Eric Schrock... I think that was his first day.

Speaker 4:

It was, like, his first day on the job. Yeah.

Speaker 1:

His first day on the job. He's, like, I'm gonna go say hi to Brian. Like, I just got here. And I think people were even like, that's an extremely bad idea. Like, don't do it. And I remember he did not knock. He opened my door and just, like, popped his head in.

Speaker 1:

Opened my door and just, like, popped his head in. I remain grateful to this day that Eric did not quit on the spot based on what I I definitely, told him in no uncertain terms that this timing was extremely poor and that no. I was so relieved when we found that. And actually but although there's an interesting other lesson there that sometimes when we look at code review, that was not a code review suggestion. But, god, haven't we all had the code review suggestion broke my code?

Speaker 1:

Where I took someone's code review suggestion of, like, boy, you go clean up this other thing over here, and there's a cost associated with that, right, where we accidentally broke something else. These things have costs, when we have these suggestions and we implement them. And we're often very far away from that kind of economic conversation when we're in a code review. Like, okay, I could go do this, but is it worth another week?

Speaker 1:

You know, I don't know.

Speaker 4:

But it is worth... I mean, some of those mind-numbing code reviews, where it was, you know, small changes spread over hundreds of files. Like, those are the ones where there can be sleepers in there. There's sleepers.

Speaker 1:

So, no, that's not what I was thinking about at all. I was thinking of void.

Speaker 2:

Void.

Speaker 1:

Oh, god.

Speaker 4:

Oh, oh, oh. You're saying my favorite useless code review comment? Yes. That's fair. So my favorite... so, I have become a better code reviewer, and one of the ways in which I was not a good code reviewer is because I was young.

Speaker 4:

And one of the products of youth, for me, one of the symptoms, was being a know-it-all. And so the thing that I knew that most people didn't was that in C, if you wrote a function that took no parameters, it needed to be, like, foo, open paren, void, close paren. Because otherwise the compiler could interpret it as K&R C, and you could pass it any number of arguments you wanted. And this was, like, not really actually a bug, but sort of a bug if it were in a header file. But you're right, Brian.

Speaker 4:

If I saw that, I could not resist, back then when I was young and a bad code reviewer, telling people.
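For reference, here is a minimal sketch of the C pitfall Adam is describing, using hypothetical function names. Under pre-C23 rules, an empty parameter list is an old-style (K&R) declaration rather than a prototype, so the compiler has nothing to check call sites against:

```c
/* Hypothetical illustration only; "frob" and "frob2" are made-up names. */

int frob();       /* no prototype: parameter list left unspecified */
int frob2(void);  /* real prototype: takes exactly zero arguments  */

int main(void)
{
    frob(1, 2, 3);   /* accepted by the compiler: there is no prototype to
                        check against (undefined behavior at run time)    */
    /* frob2(1); */  /* error: too many arguments to function 'frob2'     */
    return 0;
}

int frob()      { return 0; }
int frob2(void) { return 0; }
```

With the explicit void, the same mistaken call is rejected at compile time, which is presumably why the comment kept coming up in review.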

Speaker 1:

I mean, it's like, mission accomplished. I stopped making that particular

Speaker 4:

mistake. Ostensible mistake.

Speaker 1:

Ostensible mistake. And it is kind of unconscionable that C has that. And actually, this is a good segue to something that I've been thinking about a lot recently, which is that Rust kinda changes the game. Right? Like, that kind of a thing is just, like, not possible.

Speaker 4:

Before we go there, I would just say, the thing that it reminds me of, and it's at the intersection of some of the things in Kendall's post, is that sometimes it's important what you don't say. And I think, it's like, just because something could be different...

Speaker 1:

Yes.

Speaker 4:

And it doesn't mean you need to say it. And even if it could be a little bit better, it doesn't mean you need to say it. Right? If there's some little embedded n-squared algorithm that could be n log n, and it's in a code path where performance literally doesn't matter, and n is only ever gonna be 6... Right. Exactly.

Speaker 4:

Exactly. You can you can you can just keep that to yourself.

Speaker 2:

You know

Speaker 4:

what I mean? But we've all we've also I mean, I think all made that comment. Right? I don't need to tell people anymore that there's void, that they write needs to write void, but, we've all made that comment. We've all had that comment received.

Speaker 4:

But but to us

Speaker 3:

It's interesting. I don't know if I called this out in the blog post or not, but for things like that, I think the tiebreaker is that it's also gonna be the implementer's choice whether to do it or not. Like, everything in a code review is a suggestion unless it's, like, breaking or incorrect. But the class structure, verbiage, void in the function signature: ultimately, it's up to the implementer whether or not they wanna do that, unless you have, like, some very strong reason to block the change on it.

Speaker 1:

That's a really good point. And I like everything there too, Kendall, about implementer's choice. Because I think...

Speaker 9:

I think I disagree with that, though. I think it's about who has to maintain the code, and it's not just about who's implementing the code that matters. So, you know, I agree that small, trivial things you sometimes have to let be, but at the same time, if it has an impact on maintainability, and that can be things like naming, you know, it can matter for how easy it is to follow the code, for somebody else to follow what's going on. I think...

Speaker 1:

it does matter. Yeah. Go ahead, Kendall.

Speaker 3:

Go ahead. I don't necessarily disagree with that, but, ultimately, like, certain decisions are team-wide decisions, and certain decisions are implementation details. Like, if you have a naming pattern and the implementer decides to reverse it, my review is: this is a blocker, this is how we name things. Versus saying, like, I don't know, calling this a utility class versus whatever class.

Speaker 3:

Like, that's a minor implementation detail that doesn't necessarily go against any standard. It's just kind of a style choice.

Speaker 1:

That's right. And I think, you know, the way this was phrased by one of our colleagues who has since passed away, Joe Kowalski: he used to say, is there a problem with this code, or is it just not implemented the way you would implement it? And I do think there is something there... it's very important, at least to me personally, that I retain my craft. It helps me retain my agency, my voice.

Speaker 1:

It's the way I express myself. And there is this kind of funny confluence of art and machine in software, and I don't wanna be completely deprived of that. And, you know... I mean, obviously, it's a spectrum. And, Edwin, to your point, like, nomenclature definitely matters.

Speaker 1:

And, you know, you said that too. It's like, hey, if the nomenclature is, like, flipped, or if it's gonna be confusing, that is an issue. That's definitely an issue, and that should be flagged. But you also wanna afford people... you don't wanna deprive the implementer... and it just kinda blew me away, because I love your point that everything is a suggestion.

Speaker 1:

I think one thing that we have to be mindful of in software is creating cultures where we effectively punish people for work. And that's actually, Adam, for whatever reason, just so you know, the reason I never, for a second, personalized that vtrace issue: because you volunteered to do this shitty job. You know what I mean? We can't have cultures that punish people for actually writing software. And we have to keep that in mind when we're doing code review.

Speaker 4:

Yeah. Totally agree.

Speaker 6:

Before we move on to talking about auto-formatters, which I think is the next topic that Brian is trying to move on to, I can talk a little bit to some of the other tooling stuff that you talked about. Just a little background: I work for Atlassian, and I do have some amount of code running in Bitbucket Cloud, Bitbucket Data Center, and Crucible. So I have worked on some of the tooling that you're talking about, or at least a competitor to some of the tooling that you're talking about. So I have a little bit of insight on some of these things.

Speaker 6:

I would say that some portion of this, the ability to handle large changes and the amount of context that is given around a pull request, around, like, a change, the number of lines and stuff... some of that is a little bit down to how the people who build the tool use the tool.

Speaker 1:

Yes.

Speaker 6:

Yeah. So you'll notice that Crucible and Bitbucket Data Center have a per-file view on a pull request. So you can see a single file at a time, and there is a file explorer on the left to be able to change files. Like, there's a file pane. And that's in part because both Crucible and Bitbucket Data Center are written in Java.

Speaker 6:

So that means that, for any change, you end up usually changing a larger number of files to be able to achieve the same thing. And therefore you end up with these slightly larger pull requests, and they need an interface that can handle that nicely and be able to comment on those. In contrast, Bitbucket Cloud has the same sort of all-changes-on-one-page view that has become customary in a number of these tools. And that's in part because Bitbucket Cloud is written in Python, a lot of it on Django. And a lot of the changes, the pull requests that they build, are significantly smaller.

Speaker 6:

And as a result, they don't have the need for extremely large changes. There's also a bit of this cloud versus behind-the-firewall thing, in that the diff is on GitHub's or Bitbucket's servers to compute. So there is some amount of computational constraint at play there when dealing with a larger diff. So, you know, there is that at play. Another thing to talk about briefly while I have the mic: you were talking a bit about in-person code reviews and the fact that, when you're communicating purely over text, you don't have that same level of fidelity of body language and tone and anything of that nature, and it's really hard to convey those over text.

Speaker 6:

One thing that I have found that the extremely effective remote employees I work with do is seamlessly change from a text-based medium to a higher-fidelity medium, whether that's audio or some kind of video, to be able to hash out problems. So what does that mean? Well, you'll be talking to someone on a pull request and they'll reach out on Slack and just be like, hey. Look.

Speaker 6:

It seems like we're going around in circles here. Let's just jump on a Zoom. They drop the Zoom link, they're on the Zoom for, like, 5 minutes, if that, and then they drop off. And that quick shift up into a higher-fidelity medium is so effective.

Speaker 6:

The other piece there is this new kind of async video tool, which we're starting to play around with a bit more, such as Loom. Where, like, one of my colleagues, Fran, has started following this practice of attaching a Loom to any sufficiently complex PR. Right? And it will have a short length of maybe 5 minutes where he just talks through the change, what he's trying to achieve, how he got to the solution, whatever. And that solves so many of the problems and so much of the back and forth.

Speaker 1:

So that is something that the engineer who's presenting gives: effectively, a little video intro.

Speaker 6:

A 100%. The video intro usually has the pull request open, and he'll have his, like, little video in the corner. And he'll just talk through, like, this is the change I'm trying to make. These are the things that I was thinking about while I put this together. These are the main... like, I don't know, the meat of this pull request that I really think you need to have the most input on.

Speaker 6:

These are the things that, you know, you can kinda gloss over, or whatever kind of intro you want there. But it does make a big difference compared to the amount of information that you can usually glean from a pull request description and from, like, commit messages.

Speaker 4:

I like it a lot. It seems to also connect up with people's humanity. So it's not just, like, some random at-username, even if it's some colleague of yours. It reminds you: oh, yeah, there's a human being who worked hard on this, and I should, you know, consider that when I'm providing feedback.

Speaker 6:

Totally. Yeah. It definitely humanizes that, and it's hard to get that sometimes when you're slogging through, you know, a 15-pull-request inbox or something. But that is a very helpful reminder that, yeah, there is a person on the other end of this pull request who has put some amount of effort and thought into this, and, you know, to consider their feelings as you are working your way through the pull request and leaving whatever feedback you see fit.

Speaker 1:

Yeah. I mean, a lot of good points in there. I love what you said about these code review tools reflecting the way that the tool authors themselves work. I mean, that's something I've believed for a long, long time. I think one of the ways that debuggers have struggled is that debuggers are often written by people adjacent to compiler groups.

Speaker 1:

And so the code that they're debugging is often a compiler. And debugging a compiler looks really, really different than debugging a system, which is part of the reason. So I think it's really interesting, what you're saying about these code review tools: they're gonna be good or bad at various things, or they're gonna have strengths and weaknesses, that may very well reflect the way that the software itself was developed. I have to say, and I'll just grab another kinda third rail here.

Speaker 1:

And I know that Ryan's now here, so just to get your take on this. But I miss Gerrit. I actually like Gerrit. And I feel like I have to say I actually like Gerrit, because I know Gerrit gets really maligned.

Speaker 1:

But I actually enjoy Gerrit as a code review tool. I think it's actually pretty good. But, you know, in what was a move that I think I would later come to regret... we were gonna be growing the team a lot, this is years ago at Joyent, and I kind of instituted Gerrit.

Speaker 1:

I mandated Gerrit. And there were some people that really liked that, and some people that really, really, really did not like that. And I definitely learned that it's stepping on toes to standardize on code review tools, even. And I'd be curious to what degree people... because, Adam, even we at Oxide... I mean, I think most of us are using GitHub for code review most of the time, but not by fiat.

Speaker 4:

No. But it's also sort of, like, you know, you do what everyone else is doing. You know, I think if someone were to propose that their subgroup was gonna use some particular thing, I guess it would be fine, also because we have all these isolated repos. But it kinda wouldn't be fine to say half of the PRs for a particular repo went through one path and half went through another. That'd be kinda tough to manage.

Speaker 4:

Yeah.

Speaker 11:

I kinda like the model of letting the individual repos choose, depending on who's primarily maintaining them, because I certainly prefer Gerrit. I was using GitHub before that. I've got plenty of years doing GitHub code reviews. And after getting used to Gerrit, I find it extremely hard to review things on GitHub. So for me, if you want me to give a good review, it helps enormously...

Speaker 4:

...Review Board, which was, like, fine... better than fine, I guess. I don't know, it was pretty good. I liked it better than GitHub, but, you know, then a bunch of the folks I worked with subsequently were most familiar with GitHub, and, you know, I don't love it, but I don't feel like it is debilitating in terms of my ability to do reviews.

Speaker 1:

So I feel that with Gerrit versus GitHub, there's a bit of a false dichotomy, in that Gerrit, historically, and I think this has actually changed, got in its own way by having a user interface that looked a little clunky and could be a little bit clunky at times. It was kinda like, you know, someone who's got, like, bad breath or body odor or whatever. It's like, no, this is actually an amazing person.

Speaker 1:

And I kinda feel that way with Gerrit, that Gerrit's first experience could be not so great. But, boy, if you were on a big diff that was complicated... and, Ryan, I don't know how you do it, but, man, the thing I really miss is, like, I'm on a big delta here, and, you know, Ryan's given me a ton of good feedback, and I've now, like, reworked a bunch of this stuff with his feedback in mind. For Ryan to go back and re-review my work on GitHub is, like, brutal. I mean, I keep waiting.

Speaker 1:

Like, there must be some, like, hidden mode that makes this easier than what it feels like now. But right now, reviewing deltas on deltas is just, like, excruciating. I don't know.

Speaker 11:

Yeah. It's it's devastating for me. I can't I just

Speaker 1:

And, you know, I think everyone feels like they have to kinda apologize for the Gerrit affinity, but I do actually like it as a tool, and, of course, it's all open source and so on. But, so, just for the record, I was actually not gonna go to the formatters, but we can talk about that a little bit too.

Speaker 4:

Yeah. And just to add to the record, Brian: not a fan of the formatters. Definitely, whatever you're gonna say about Rust, it was not rustfmt. And I think maybe that's yet another third rail to... exactly.

Speaker 2:

To touch.

Speaker 12:

Could I just speak to the thing before, about making a presentation about your code review, actually putting a presentation out there in front of the people who are looking at it? I've got a little analogy there with a patch that I did when I was with HPE as a systems integrator, sort of a systems engineer type person, not a developer. But we were doing 840 SCSI targets as block storage for a ZFS...

Speaker 1:

Of course. Nice.

Speaker 12:

...Lustre appliance. And that was not the actual design point, because the enclosure services code was missing for 840 targets.

Speaker 6:

Yeah.

Speaker 12:

And so, when you were talking before about, oh, this could be order n squared... well, this was order n squared, kinda. It was order n by m. It was, like, enclosures times targets... well, times total targets, because they were redundantly scanning all the enclosures for all the targets as they presented each target. And they didn't have, like, a batching mechanism for building the enclosure services stuff.
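For illustration, here is a hedged C sketch of the scaling difference Jason is describing; the data structures and function names are hypothetical, not the actual Linux enclosure-services code:

```c
#include <stddef.h>

struct enclosure { int id; };
struct target    { int id; int enclosure_id; };

/* Naive: as each target is presented, rescan every enclosure looking for
 * it.  Work is O(targets * enclosures), and in the story this all happened
 * under one subsystem-wide lock. */
static void attach_targets_naive(struct target *tgts, size_t ntgts,
                                 struct enclosure *encs, size_t nencs)
{
    for (size_t t = 0; t < ntgts; t++) {
        for (size_t e = 0; e < nencs; e++) {
            if (tgts[t].enclosure_id == encs[e].id) {
                /* bind target t to enclosure e */
                break;
            }
        }
    }
}

/* Batched: walk the enclosures once up front and remember where each one
 * lives (here, a direct-mapped table keyed by enclosure id), so each
 * target becomes a constant-time lookup: O(targets + enclosures). */
#define MAX_ENCLOSURE_ID 1024

static void attach_targets_batched(struct target *tgts, size_t ntgts,
                                   struct enclosure *encs, size_t nencs)
{
    struct enclosure *by_id[MAX_ENCLOSURE_ID] = { 0 };

    for (size_t e = 0; e < nencs; e++)
        if (encs[e].id >= 0 && encs[e].id < MAX_ENCLOSURE_ID)
            by_id[encs[e].id] = &encs[e];

    for (size_t t = 0; t < ntgts; t++) {
        int id = tgts[t].enclosure_id;
        if (id >= 0 && id < MAX_ENCLOSURE_ID && by_id[id] != NULL) {
            /* bind target t to enclosure by_id[id] */
        }
    }
}
```

The point is only the asymptotics: the naive loop does targets-times-enclosures work, while the batched one does targets-plus-enclosures.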

Speaker 12:

So the problem was that the thing would actually hold a lock on the entire SCSI subsystem for 40 minutes. Oh, okay. If you tried to push I/O to a drive during that time, nothing would happen. And then after 2 minutes, you get task hung messages. And I'm just gonna say here, because I don't work for HPE anymore, that we got these guys who had put this thing together in Houston and never tested it at the scale that we were deploying it at in Australia.

Speaker 12:

And they go, oh, yeah, we see those task hung messages occasionally. And it's like, did you not actually wonder what they were from? Anyway, the customer had a really good kernel hacker who was doing lots of Lustre work for them, and I managed to kind of get him interested in the problem, because it wasn't my specialty. And he produced a 7-line patch, in total, for the ordering of disks and enclosures that the HPE drivers presented to Linux.

Speaker 12:

It fixed the problem, and we went down to 17 seconds initializing the enclosure services. So this was a big deal for us. And I was trying to get this actually noticed by these folks in Houston, but they didn't really care. So I got the guy... I basically made it really easy for him... but this wasn't fixed in mainline.

Speaker 12:

He wrote the patch, submitted it to linux-scsi, and it just went into the yawning void. Because he just wrote it as, this fixes this problem, but he didn't put in any detail. And the thing was, it's almost like a feature request. It's not really like a bug fix, because everyone's going, do I have 840 targets? Do I care?

Speaker 12:

And so he submitted it again. And then finally I said, I really wanna see this get in, so I can kinda take my hands away from it. And I wrote a code review, effectively, for the patch and threw it at the mailing list. So I wrote, like, 80 lines of text and a whole bunch of included stuff about why this helps in any situation. You know, like, it can't hurt, and it can only help, no matter what the ordering of these things could be.

Speaker 1:

This is a great story. Did you write it as if you had... I have never seen this patch before in my life, but I would like to say this is...

Speaker 2:

No. No.

Speaker 4:

I did

Speaker 1:

this is a gentleman

Speaker 12:

with a cell. Yeah. No. No. I I no one's gonna come along and say, oh, I have 840 targets as well, and I decided that this was a really great idea.

Speaker 12:

No. I said, I'm the guy who's actually been integrating this thing, and here's exactly what it does. And I just wanted to put a Tested-by line there. And so the thing is, it's not so much rubber duck debugging as yawning void code review. Because you throw this thing at linux-scsi, and, you know, the patch just disappeared the first time.

Speaker 12:

It got reposted. No one took any interest in it. Then I put this thing out there, and then we just got a message saying, oh, yeah, it's, you know, moved through, and it'll go into mainline at x.

Speaker 12:

So that was kinda

Speaker 1:

like, job done. Yeah. There's a lot there. What a great story, albeit very dramatic. But you're also kinda getting to, well, first of all, like, when you do have that order of n squared or order of n cubed algorithm, like, is it 6, or could it be more than 6?

Speaker 1:

Could it be 800 plus? But I also love your point, because I feel this happens in open source a lot, where you get an issue that's open, and maintainers don't necessarily realize, like, is this important or is this not important? And what you were saying, Jason, is like, no, this is extremely important, and here is why this is important.

Speaker 12:

And the maintainer, in fact James Bottomley, was the guy who wrote the environmental services code, and no one had really touched it since then. So you had a big onus to try and actually demonstrate to him that what he did, sure, it was fine, except when we got this crazy corner case, it really could be a lot better. And the thing is that once I was writing the review, I was thinking I could actually get into this and try to do, rather than a very small change, a general purpose improvement to any ordering of targets and enclosures. But that wasn't my brief at all. I was just trying to make it so that it worked with the configuration that we were selling into the market.

Speaker 1:

Adam, this definitely reminds me of, do you remember this order of n cubed issue that I found in our colleague's code? Does this ring a

Speaker 4:

bell at all? Was this, I don't know, was this to do with CFS?

Speaker 1:

This was to do with the SCSI target management facility. And this is a colleague of ours, God bless him, love this guy. Anytime we would get together and there'd be any alcohol consumed, there was a bit of a sobriety test that he would fail, when he would offer to fight you in a way where he was, like, not actually gonna fight you.

Speaker 1:

I don't think. And so, do you remember this at all, Adam? No? Alright. Christmas party is coming up.

Speaker 1:

At some point, Bill's gonna ask to fight me, because that's what we do, and it's fine. We're not actually gonna fight. It's friendly. To be clear, he says, do you wanna fight? And what I'm gonna say is, like, yes. Because of this order of n cubed issue, like, I wanna throw down over this issue.

Speaker 1:

Do you remember this? This is not right. Okay. Yeah. So this is exactly what happened.

Speaker 1:

I was basically lying in wait, not much of a trickster, lying in wait at the holiday party. And sure enough, you know, we get a couple of beers in, Bill's like, do you wanna fight? I'm like, I've been waiting for this for a week and a half. Yes.

Speaker 1:

I found this order of n cubed, did you? But it was all in good fun. No one was actually harmed. And, of course, now my kids listen to this kind of stuff, so I have to be like, no, this is not the way I want my kids to conduct themselves.

Speaker 1:

So we should be telling this story. But, Jason, I think this highlights that when you think about, like, SCSI targets, like, how many SCSI targets can there possibly be? Like, when that code was written, 6. But as time goes on, you know, more and more and more, and now you get to the point where it's actually, like, absolutely debilitating. And people are disconnected from the consequences of that, which I think is an interesting way of connecting people.

Speaker 1:

And I like the video.

Speaker 4:

So, Brian, we know you're not excited about rustfmt, but tell me about

Speaker 8:

where were you going with Rust?

Speaker 1:

So, I think that the Rust compiler catches so many issues. It is so much easier to write correct code with Rust. I mean, you know so much more when you have code that compiles. And certainly there are things that I would look for as a code reviewer.

Speaker 2:

I mean

Speaker 1:

So, I mean, the classic is certainly in C. I mean, if I'm reviewing your C, I'm looking at all the error paths to make sure that everything is freed, that locks are dropped, and so on in those error paths. And when you find bugs, you often find them in those error paths. Now, Rust doesn't make that impossible by any means, but it makes it harder, you know, in so many ways, right, in terms of algebraic types, in terms of the question mark operator.
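
(A minimal sketch, with invented function names, of the contrast being described: in C the reviewer audits every early return for missing frees and undropped locks, while in Rust the ? operator and drop semantics run the cleanup on every exit path.)

```rust
use std::fs::File;
use std::io::{self, Read};
use std::sync::Mutex;

// In C, a reviewer walks every early return looking for the missing free() or the
// lock that never gets released. Here, `?` propagates errors, and both the File
// and the MutexGuard are released automatically on every exit path.
fn read_config(path: &str, state: &Mutex<Vec<String>>) -> io::Result<()> {
    let mut guard = state.lock().unwrap(); // dropped on success *and* on early return
    let mut contents = String::new();
    File::open(path)?.read_to_string(&mut contents)?; // early return on error is safe
    guard.push(contents);
    Ok(())
}
```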

Speaker 1:

I just think it forces you to kinda ask, okay, wait a minute, Rust compiler, you kinda did my job. What's my job now? And I'm still, like, trying to figure that out.

Speaker 1:

Ryan, I don't know. What do you think?

Speaker 11:

What I also like about Rust is it gives you seams. Right? Like, you can separate the different areas of the code better. Whereas when I'm reviewing C, especially if it's an area that I don't know that well, it's like, well, I have to invent the universe to even begin to understand this. Versus Rust, you can have abstractions that build seams around, or build firewalls around, the different parts. So I find it's easier to kind of review things both as a newcomer to the code and kind of in isolation from other parts of

Speaker 2:

the code.
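
(A hedged illustration of those seams, with invented names: a trait boundary lets a reviewer reason about one side of it without loading the whole subsystem into their head.)

```rust
// The trait is the "seam": a reviewer of `flush_dirty_blocks` only needs this
// signature, not the storage subsystem that eventually implements it.
pub trait BlockStore {
    fn write_block(&mut self, lba: u64, data: &[u8]) -> Result<(), String>;
}

// Reviewable in isolation: everything it can do to storage is visible right here.
pub fn flush_dirty_blocks<S: BlockStore>(
    store: &mut S,
    dirty: &[(u64, Vec<u8>)],
) -> Result<usize, String> {
    let mut written = 0;
    for (lba, data) in dirty {
        store.write_block(*lba, data)?;
        written += 1;
    }
    Ok(written)
}
```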

Speaker 4:

I'll tell you a place where I find Rust harder to review.

Speaker 1:

It's good. Yeah.

Speaker 4:

And maybe this isn't a fair comparison, but I love Rust analyzer. Like, I use it a ton. I find it just unbelievably helpful in terms of, like... Brian, do you use Rust analyzer?

Speaker 1:

That that felt very pointed. Yeah. We can talk about syntax highlighting next. I feel like I'm No.

Speaker 4:

No. No.

Speaker 1:

Do I need to talk to my lawyer?

Speaker 4:

No. No. I mean, you know, we're just having a friendly conversation.

Speaker 2:

Am I

Speaker 1:

being detained? Am I free to go? If I'm free if I'm free No. No. No.

Speaker 7:

You're free

Speaker 4:

you're free to go whenever you want. That's fine. But I just thought, you know, if you have nothing to hide

Speaker 1:

Just make it easy on me and let me know. Like, have you ever used Rust analyzer? Sorry, I haven't. But it's not because I'm against it.

Speaker 4:

No. No. No. I I of of course. Nothing like that.

Speaker 4:

No. So I maybe use it too much. But, you know, if I'm looking at the screen, it's like every one of these types that the compiler has inferred is now made explicit. And, you know, one of the neat things about Rust is, like, you don't have to be overly verbose about the types, because the compiler will figure out so much about it. But I do find that I miss it a ton when I'm doing code review.

Speaker 4:

Because, you know, I mean, maybe it's a crutch in some respect. But, I don't know, I find when I'm just looking at sort of dead code, it's tough. It's much harder to understand the typing of these things, because Rust doesn't force you to make it explicit like some other languages do.
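
(For illustration, a hypothetical example of the gap being described: the inferred type is invisible in a plain diff, and one low-tech remedy is simply writing it out.)

```rust
use std::collections::HashMap;

// Reading this in a plain diff, nothing tells the reviewer what `by_host` is;
// rust-analyzer would render the inferred type inline in the editor.
fn group_latencies(samples: &[(String, u64)]) -> HashMap<String, Vec<u64>> {
    let mut by_host = HashMap::new(); // inferred: HashMap<String, Vec<u64>>
    for (host, latency) in samples {
        by_host.entry(host.clone()).or_insert_with(Vec::new).push(*latency);
    }
    by_host
}

// One low-tech remedy for reviewers outside an IDE: spell the type out.
fn group_latencies_annotated(samples: &[(String, u64)]) -> HashMap<String, Vec<u64>> {
    let mut by_host: HashMap<String, Vec<u64>> = HashMap::new();
    for (host, latency) in samples {
        by_host.entry(host.clone()).or_insert_with(Vec::new).push(*latency);
    }
    by_host
}
```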

Speaker 1:

Well, actually, Kendall, that allows us to circle back to something you kinda said that we didn't really talk about, that I thought was really interesting, about doing a code review in the IDE itself. That seemed really interesting.

Speaker 3:

Yeah. I mean, I think, at least for me, especially when reviewing stuff that I'm not super familiar with, it's very useful to see it in context, and code review tools don't usually give you that. So, like, things like go to definition, or, like, the function signature might have changed in a specific file, and that's in the diff, but then where that is used in the code isn't really exposed anywhere in the diff at all. So, like, a lot of times, if I'm just trying to understand what a code review is doing, I'll just check out that branch locally and step through the code in the IDE. It's a way to give me context, and, also, it allows me to use tools that I can't necessarily use in the code review tool.

Speaker 3:

Like, maybe I wanna run it myself or run, like, a linter across it for whatever reason. So there's a lot of things you can do by, like, exiting the tool that you can't do in the tool when you're doing a code review.

Speaker 1:

Yeah. That's interesting. Sorry, Josh. Go ahead.

Speaker 5:

Oh, thank you. I completely agree, and I think this speaks to, like, Brian's point that the way we review gets reflected in the people and the tools that we use. And this is one of the things that I've actually gotten a huge amount from doing in person reviews. And, like, this reflects sort of when someone uses Rust analyzer as compared to when they don't. Right?

Speaker 5:

Like, that affects the way that you read code and work with it viscerally. And one of the things that I've gotten out of specifically in person reviews, that I don't quite know how to get out of even a video review, is when we're talking about the code and working through it, people talking about how they actually work with the code, and, like, when they see me editing parts of the code, just watching how I actually write the software and getting tips about that. I found that kind of hidden knowledge, you know, you're typing this way, but you can move your hands in this particular way as though you're playing the piano. And maybe this wasn't necessarily particular to the code review, but just seeing the different ways that you can use the computer and write software more efficiently.

Speaker 8:

Okay. Wait a minute. Wait. Wait. Wait.

Speaker 8:

Wait. Wait.

Speaker 1:

Wait. Wait. We can't just leave that one alone. Was that a professor? I mean, that's a gutsy professor to tell you how to type.

Speaker 5:

Yeah. No. So part of this was that we had a really good relationship beforehand, and it was just kind of, you know, we would get together at a whiteboard and we could finish each other's sentences. And so for him, it was, like, totally cool, because he comes from a family of musicians. And so to him, literally, typing code was like playing the piano, and he would, like, show me.

Speaker 5:

I would not recommend this in general. Like, this was because we had a very particular kind of relationship where that worked out well.

Speaker 1:

That's right. Exactly. Yes. In general, I don't think you should tell people how to type. I mean, that's just a level of... but I think you've got a good point, though, in terms of, like, seeing how, and this is something that we have lost in the pandemic.

Speaker 1:

Right? We're not in the same office, so we don't get a chance to. And it was very interesting when we were doing bring up in Minnesota a couple weeks ago. I was, you know, physically with one of my colleagues who I work very, very closely with, but not in the same room. And just, like, watching elements of his working style, I understood, like, oh, I got it.

Speaker 1:

I can see why certain things are important to you or certain things are more important to me than they are important to you.

Speaker 5:

Yeah. I agree. And, like, I don't know how you capture that over Zoom. Like, I do think there's something that gets lost.

Speaker 1:

We gotta, like, Twitch stream our... can't we Twitch now? Or is that what the kids do these days, you know?

Speaker 7:

That's right.

Speaker 1:

That's right,

Speaker 4:

Brian. They twitch.

Speaker 1:

Back here.

Speaker 3:

Sourcegraph does something interesting. I think it's, like, weekly or biweekly, but they grab a, quote, unquote, popular developer and then just kinda have them walk through their tooling and some of the things they use to write code. It's pretty interesting. I think it loses some of the things, like, keyboard shortcuts are not necessarily something you would capture in that, but it does go over, like, people's working style and how they write code, which is pretty interesting.

Speaker 11:

We did that at Basho back in, like, 2014, and I thought it was fascinating, because about 5 different people kind of presented their workspaces and, you know, their typical workflow. And they were all vastly different, and you learned about so many different tools. I think that was really cool.

Speaker 1:

And then, I think, Robert, you're trying to get in here. I think I saw your hand going up there earlier. Is that z k? That's just what's showing up here. Or Nick.

Speaker 1:

Sorry.

Speaker 8:

Oh, sorry.

Speaker 1:

Yeah. No worries.

Speaker 8:

Sorry. I had a question, but I think Kendall actually answered it. I do have another question, though. So you mentioned something about, like, empathy being important in code reviews and wanting the best out of the engineers and wanting to provide, you know, positive feedback.

Speaker 8:

I guess this term is used a lot, but how do you, like, deploy empathy at scale? Because I find, like, you know, if I have 5 or maybe 6 or some small number of reviews in, you know, 1 or 2 days, I can go in-depth. I can do what I hope is a meaningfully good job and provide some feedback of, you know, here's how your pipeline's wrong, here's how your code could be improved a little bit, have you considered this, yada yada yada. But my team's gotten bigger, and so I find it hard to actually go in-depth, and it's gone from, like, very qualitative, like,

Speaker 1:

but now we just got

Speaker 8:

yeah. Exactly.

Speaker 1:

I think it's a problem as you scale. I'd kinda love your take on it, because you've obviously been in some cultures that really value code review. I think that you have to figure out a way to get that into the culture so you're not the only one that is doing this. You've gotta get other people. And, again, Kendall, it goes back to that point you had earlier about, you know, everybody being expected to review other people's code.

Speaker 3:

Yeah. I think I've been lucky in my professional experience in that code review was an expectation at every job I've been at. It's never been something that I had to convince people is the right thing to do. It was already ingrained that we do code reviews. You should have an approval before you push.

Speaker 3:

Or if you don't have an approval, you should have, like, a really good reason why you don't.

Speaker 7:

What advice would you give to someone who is with an organization that does not do that? I mean, I'm sure it's more like a values thing, like, either they see it or they don't. But what would be some strong arguments to try and convince your team to start doing code reviews?

Speaker 6:

I could probably address that. Well, in early 2013, I managed to convince the Jira team to switch from an occasional code review practice to a code-review-on-every-change

Speaker 1:

practice. Okay. Wow. That's a big change. Which

Speaker 6:

Which, yeah. I mean, at the time, we were not a public company, and we did not have, you know, SOX compliance and various other compliance things as forcing functions, which could be a forcing function for you, Nick. Sometimes compliance is an apt forcing function to get people on board to contribute. The way that I went about making that change was partially around pointing out instances in which bugs had made their way through to production that I believe could have been caught during code review. Putting out, like, concrete examples: here are some bugs that we shipped that I'm pretty sure we could have caught if we had at least one other person read this code before it went out.

Speaker 6:

That was a pretty compelling argument. There was also just a little bit about selling the benefits around the extra layer of communication, to be able to say, okay, this is the change that we're currently making architecturally, from this pattern to this pattern, and it looks like you copy pasted the old pattern. This is the pattern that we're trying to move to, and this is why.

Speaker 6:

Right? This is, like, an opportunity for communication that is being missed by a team that is not doing code review. Same with, like, you know, communicating best practices or communicating stylistic choices. It's just an opportunity to learn that your team's missing out on.

Speaker 3:

I'm curious as to which of those arguments was the most useful. In my experience, and I feel like most people I've talked to kind of disagree with me, but I think finding bugs is the weakest argument for code reviews.

Speaker 1:

That's interesting. That's really interesting. Yeah.

Speaker 3:

I think everything else you said, I find true. Like, it's an opportunity for learning. It's an opportunity to talk about design decisions. But code reviews sometimes catch bugs, and that's kind of a side effect, not, like, the primary reason to do it. I've had countless bugs get skipped in code reviews.

Speaker 3:

I've had bugs introduced based on code reviews.

Speaker 1:

Yes. Yes. Exactly. Code review broke my code.

Speaker 3:

I think the argument, that code reviews help catch bugs is, while true, not necessarily why code reviews are useful.

Speaker 7:

It's also hard to say, like, I want you guys to start doing code reviews so I can find your bugs. You know? Right.

Speaker 4:

That's a... or even to say, this was a bug that you introduced. Another argument is sort of like cross training: to say, look, this person is no longer on an island with their code. And if they're on vacation, we're no longer screwed if there's a bug that needs to be solved. You know, I've found that in a lot of cultures that I've been in, from big companies to small companies.

Speaker 4:

You know what? In small companies, you're like, we don't have time to do code reviews. But I think the other side of it is that small companies can feel so fragile that the argument that this makes us more resilient, in terms of, like, not being dependent on any one individual, may be helpful.

Speaker 7:

Yeah. Yeah. And this is definitely a smaller team, and the biggest thing, I think, the resistance, would be in,

Speaker 2:

like, developer momentum Yes.

Speaker 7:

Or whatever. Right. Yeah. Velocity. I I I

Speaker 1:

Velocity. I I I Velocity.

Speaker 7:

That's what that's what I know.

Speaker 1:

Even though that term drives me nuts. I say velocity is for projectiles. I hate referring to developer velocity, but it is something that people love to talk about. Code review is gonna make us slower. That's why we can't do it.

Speaker 1:

And yeah. So, I've got my idea of the kind of counterarguments to that, but I'd love to hear them. Well, sadly, this is just gonna sound like a total acquiescence. But I think what you are hearing organizationally is, what we care about is the speed at which software's developed.

Speaker 1:

And so then I think the argument needs to be couched as, this makes us develop software faster in the long run. And that can take you into, we will have fewer defects in our software. But I totally agree with what Kendall said. Like Kendall, I'm very curious, and I definitely wanna hear your answer to Kendall's question: what argument carried the day?

Speaker 1:

I think that's a very tough argument to make. I think the other argument to make, and this is one that I feel I hear Adam and Kendall both making, is, hey, this actually communicates our architecture. Like, right now, you may have one person in charge of a single subsystem, and no one knows enough about that subsystem to actually go in and meaningfully help them and work with them.

Speaker 1:

And that may be a tack to take, if that works.

Speaker 2:

Yeah.

Speaker 6:

Yeah. I mean, I would say, just on that developer velocity standpoint, the counterpoint is, like, velocity is made up of 2 components. Right? It's both magnitude and direction. And I feel like there may be a reduction in the magnitude, but an improvement in the direction: you'll be more likely to be pointing in the correct direction even if the magnitude of your velocity is smaller.
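
(Taking the metaphor literally, just as an aside: the decomposition being reached for is \( \vec{v} = \lVert \vec{v} \rVert \, \hat{v} \), with \( \hat{v} = \vec{v} / \lVert \vec{v} \rVert \), magnitude times direction, so a small hit to \( \lVert \vec{v} \rVert \) can be worth it if \( \hat{v} \) points the right way.)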

Speaker 7:

Well, yeah. And this all might change. This is, like, a race to get a v1 or, like, even a v0 release.

Speaker 1:

Yeah.

Speaker 7:

This is a completely new push within the organization. This is like a whole new arm of the organization, you know, new branding, new release, new software, everything from the ground up. So

Speaker 3:

An interesting thing to try is pair coding. I'm not a huge fan of pair coding, but in one of my previous teams, you could use a pair coding session as a replacement for a code review.

Speaker 1:

Interesting. Yeah. So that's interesting, kinda, in terms of, like, can you find, Nick, maybe this would help, like, what are some midpoints you can find along the way that would get you closer to code review without having to go all in at once. And so, you know, actually, just because I can't contain my curiosity any longer: how did you actually make it happen?

Speaker 1:

Did you get buy-in from technical leadership? Was it done by fiat? Was it done by the grassroots? How did you do it?

Speaker 6:

It was grassroots. I submitted this proposal while I was kind of feature lead on a smaller team within Jira at the time. So I had the opportunity to introduce pull requests into the workflow for my sub team. That was, you know, the obvious first step. And then, from there, other teams started to adopt it when they saw some degree of benefit from doing that.

Speaker 6:

And I suspect that some portion of that was around the communication. Some portion of that was around, like, the team getting larger and just recognizing the need for this opportunity to be able to communicate about changes, particularly when you have newer people join the team who don't have the same level of familiarity with the code base. You know, you have to have some amount of communication about the changes if you're a new person to the team. I don't think that Nick has the same level of problem, because you're coming from a smaller team, perhaps.

Speaker 7:

Yeah. Yeah. And I also wonder if there was already, like, a fundamental value of code review. Like, code review was already deemed something that at some level was important or at least relevant.

Speaker 1:

Which that's Yeah.

Speaker 7:

It's kinda coming from scratch, where that doesn't really exist, which I think is gonna be the biggest difference.

Speaker 1:

Yeah. Interesting. I know a bunch of folks got their hands up, so I wanna let other folks kinda jump in here. Austin, did you,

Speaker 10:

Yeah. One thing I ran into when I was trying to introduce code reviews is, some management was worried about how quickly we could respond to, like, crises, like the tool being down or the site being down. So I found it eased management's mind a lot to say, okay, here's a backup, break-glass procedure to turn off permissions so people can land code in an emergency to fix the site.

Speaker 1:

And, Austin, did you so were you in kinda Nick's predicament of needing to get a new org to adopt code review before it was too late?

Speaker 10:

Yeah. It was a small org, and getting people to see the value of having CI or even unit tests was kind of immediate.

Speaker 7:

Was there a moment where everybody saw, like, oh, this was a good decision? This was, like, obviously worth it?

Speaker 10:

I mean, it was sort of, like, as the team got bigger and people got more annoyed by, like, the trunk breaking, it was sort of that developer velocity slowdown, and so I included code reviews as part of adding CI.

Speaker 1:

Yeah. Interesting. So you bundled it in, which you should. It's a good argument. Joshua, I saw you also had your hand up.

Speaker 5:

Yeah. One thing that I found very useful when I was working as a web developer was to explicitly talk about code review as an avenue for bettering communication, because communication is a skill, and it's a really hard one. And being a really good writer, this is going back a little bit to last week, but writing well is genuinely hard work. And so it's very valuable to talk about that as one way that code review is valuable.

Speaker 5:

The other thing is accountability. Right? Like, just knowing that, okay, 2 people have looked at this. They've signed off on it. Like, we're taking some measure of responsibility for this code.

Speaker 5:

That was also valuable when we were talking about it there.

Speaker 1:

Yeah. Interesting. Kendall, what what are your thoughts on that?

Speaker 3:

I think all of these are really good points. I think what's important, though, is to identify what's actually important to your team. Right? Because I think code review is good, but if it wasn't solving a problem for me, I wouldn't be doing it. So if you can't honestly come up with a reason that code reviews would be good for your team, maybe the things that code reviews are useful for just aren't valuable for the current project you're working on.

Speaker 1:

Well, and on that, I mean, I feel in any organization, there is code that we actually don't code review. At least for me personally, I've got code that is, you know, stuff that I'm using. Maybe it's tooling, maybe it's auxiliary, it's not in production, it's not gonna ship with a product. It's a tool that I use to, you know, hit our Airtable instance to query something.

Speaker 1:

And so there's always gonna be that boundary where you have code that is transitioning from code that's not code reviewed to code that is code reviewed. So I feel that even if you're in an organization that does code review, you have kind of these boundaries. So, yeah, just to your point, Kendall, in terms of, like, it's gotta be a fit for each individual job. Code review needs to be a good fit.

Speaker 7:

Well, these are all amazing points and suggestions. I'm actually gonna, because I'm a lazy man, just drop the link once it comes up on YouTube into the Slack and see what kind of conversation ensues afterwards.

Speaker 1:

Yeah. That sounds good. Well, I'll tell you, and Robert, I will get to you in a second here, this happened to me. So just to give you some history: we were at Joyent.

Speaker 1:

We had a bunch of different folks doing a bunch of different work, and I kind of left it to each individual repository to figure out how they wanted to go do code review. And we had a very rigorous Gerrit-based way of doing code review in the operating system, and then for other people it was a bit of a shrink-to-fit code review. But what we found is that there were actually some folks that just weren't doing code review. And, you know, I think this goes to the quality argument, even though I agree with you, Kendall, that it can be an argument that's hard to make. It is really tough when you are looking at someone's code that's in production, and you're thinking to yourself, I think I'm the first person that's ever looked at this.

Speaker 1:

Like, even you, the person who wrote this, didn't look at this. And to me, that absence of self review can be pretty galling. Jason, going back to your story of, like, you know, suffering in production with someone who really just didn't feel like they... This might even have been right before you joined, because I knew Samsung had acquired the company and we were gonna be, like, doubling the team. And I'm like, we need to get code review in place before Jordan starts.

Speaker 1:

Jordan was gonna join the team. She was coming out of school, and I'm basically like, we can't let Jordan see us without, like, organization-wide code review. So, for better or for ill, that's what it was. Well, wait.

Speaker 11:

So there was no code review before I

Speaker 1:

did? Okay. Well, that is the question of whether we did that before or after you arrived. Yeah. So before you showed up, it was more like code review was happening, but it was sporadic.

Speaker 1:

And so, as you can imagine, you could kinda fill in the gaps of who the people were that were always doing review.

Speaker 11:

How did Robert allow this to happen? I just can't see Robert living in a world

Speaker 1:

where No. I know.

Speaker 11:

You're right. And living recklessly and just committing to

Speaker 1:

master. Which, of course, Robert never did. Robert Mustacchi, our colleague at both Joyent and now at Oxide. And, no, we always did code review for all the parts of the code that Robert was touching.

Speaker 1:

But there were other parts further afield where we were not doing code review. And what we were doing was bringing everybody under kind of one tent and really mandating code review. And, Nick, one of the traps that I fell into is that if you're gonna mandate code review, you're kinda gonna mandate a tool. And then we ended up in this kind of protracted tool argument that was not as productive as I would have liked.

Speaker 7:

So Interesting.

Speaker 1:

I think you gotta be... I mean, definitely the lesson that it taught me is, like, man, you gotta be real easy on this. Everyone who, you know, works at Oxide knows I'm not an autocratic person by nature. Quiet, Adam. Don't unmute yourself. Thank you. But, no, I'm not an autocratic person.

Speaker 1:

We did it the grassroots way. We got there quickly, but not without collateral damage, is what I would say. And Robert, I know you've had your hand up forever, so I wanna allow you to jump in here.

Speaker 8:

Thanks, man. Yes. So I'll preface this by saying, I'm not recommending this to Nick. Something that I found very useful at a past company was, as opposed to trying to prove that I'm right, just let the other person be wrong. So it was a company culture that probably prioritized sales and the biz dev side of things far more than the engineering side.

Speaker 8:

And so we were in a position where the culture, you know, wasn't very strong. And so me and 2 other engineers sort of put an ad hoc process together of, this is how code should be deployed, this is how it should be developed, but there was nothing, you know, official. And so we tried to make that official. We tried to formalize this process so that we could allocate proper time for these things when we made deliverables.

Speaker 8:

So, suppose you're saying something looks like a week of work and justifying it with things that maybe some PM didn't really get, like why we spent that time on code review, that's something we'd need to explain. We would do that ad hoc ourselves without actually formalizing the process. And when we actually went to, I think it was probably, the VP of engineering, who was, again, more business inclined than engineering inclined, he saw that as a waste of time.

Speaker 1:

Waste.

Speaker 8:

And so the 3 so and

Speaker 1:

so and exactly Where where's the baseball emoji?

Speaker 8:

Let me find it. Stop it.

Speaker 1:

Are you alright?

Speaker 8:

But so, as opposed to... we thought about ways to argue the point. We thought about ways to make it more appealing. And being engineers, we're not very good at, you know, selling things to people. So we all came to the calculated risk decision of, okay, we can try and convince this person over time, or we can just stop doing code review and see what happens in the development environment. So we had segregated environments.

Speaker 8:

So for context, like, you know, usually you have a dev, a staging, and a production environment. Had this been, like, one environment, I would never have thought about this. That would have been suicide. But we had a safe environment where we could be willfully ignorant and sort of, quote, unquote, get away with it. We knew it would come back on us, because every other engineer and data scientist at the company sort of knew that we were the, you know, de facto code reviewers.

Speaker 8:

And so if something went wrong in dev, they'd be pointing the finger at us. So, like, okay, we're gonna do this. We are gonna stop reviewing code.

Speaker 8:

We're gonna say we're reviewing it, but we're going to keep on pushing to dev, and we're just gonna see what happens. And within 2 weeks, probably 60% of the services in Kubernetes just did not work, or they had these, like, random,

Speaker 4:

they they they have

Speaker 8:

these, like, random yeah. It's it's

Speaker 7:

Is that Machiavelli in your picture right there? Like, this is amazing.

Speaker 8:

Oh, it's Jacob Fugger, Jacob Fugger, if you wanna go and Google that; I won't go into that detail. But, yeah, I guess it is sort of Machiavellian, but it was a thing where, in the context, you know, we could sort of afford to do it. We'd all been, like, promoted or received raises in the past 2 months when we made this decision.

Speaker 8:

So we had some cachet to say, like, okay, we won't be fired immediately if we do this. We'll be on thin ice, but, like, this is something worth doing. And, yeah, it predictably came back to us. We were asked, like, you know, how did this happen? And, well, I didn't have the stones to say it at the time.

Speaker 8:

Another engineer friend of mine said, well, you wouldn't let us do it, so we just showed you what would happen. And there was a very tense, like, staredown that happened in this very small office where the 3 of us sat in front of this VP, and he's staring us down, like, the gall of you to try and prove your point. Very scary. I was shaking after that meeting. But, like, that ended up paying off.

Speaker 8:

Like, we showed them what happened in real time. We didn't, you know, try and persuade them. We didn't try and yank their arm. We just said, okay, they see no value in this, so we're gonna show them, like, just how invaluable it is.

Speaker 8:

And that ended up solving the problem. So the overall theme, you know, I'm not saying you should break dev. I'm just saying maybe you'll find a way to let them be wrong so they can see that

Speaker 4:

I'm not You're not advocating terrorism. You're just saying Right.

Speaker 1:

No. No. No.

Speaker 8:

But in case the NSA is listening, I'm not advocating cyber terrorism. I'm not Edward Snowden. Please don't arrest me. But yeah.

Speaker 7:

What I'm taking from this is, find a coworker and then just do code review really well at a high level with them, and then make it kind of the nice version of your story.

Speaker 3:

Like Yeah. I mean, I

Speaker 8:

I mean, this came from a place of, like, we did this ourselves, and we cared about the code. So, like, you can do it ad hoc yourself if you want.

Speaker 7:

Nice is the wrong way. Less of break.

Speaker 1:

Well, and I think also, step by step, get the little wins. You kinda said that earlier, like, step by step. And, you know, I thought one very important point in your story is that you didn't have the entire organization go at once. You got sub teams that began to adopt code review and get the wins, and let people see the value, and let it... Oh, yeah.

Speaker 6:

I mean, every incremental line of code reviewed is still better than 0 lines of code reviewed. So you don't have to go from, you know, low code review to code review is mandatory overnight, by fiat. There are definitely, like, steps along the way that you can take to be able to go on that journey.

Speaker 1:

There are a lot of steps along the way. Well, Kendall, this has been great. Thank you very much. I think the blog entry was terrific, and you had so much wisdom in here, I felt. I think you were right to be on that panel on the topic of code review. This is something that is not just near and dear to your heart, but something you've got a great deal of wisdom about that, certainly, I benefited from.

Speaker 1:

I think everybody really

Speaker 4:

Thanks, Kendall. This is great.

Speaker 1:

Really great stuff. Nick is gonna be sending this recording out to his entire org, so we'll see how that goes. Thanks, guys. So look, when Nick is looking for another gig, you know, just extend him an offer, because he believes he could. No.

Speaker 1:

Really great conversation as always. Thank you, everybody, and we'll see you next time. Thanks, everyone.
