How can engineering teams move beyond traditional metrics like velocity to create real business impact?
In this episode of Leaders of Code, Dan Lines, co-founder and COO of LinearB; Ben Matthews, Senior Director of Engineering at Stack Overflow; and host Ben Popper talk about why velocity should be a diagnostic tool, not the primary goal of engineering teams. They also touch on the need for cross-disciplinary collaboration to align engineering with business objectives.
[intro music plays]
Ben Popper Hello everybody, and welcome back to the Stack Overflow Podcast, a place to talk all things software and technology. I am Ben Popper, one of the hosts of the Stack Overflow Podcast, and today we are doing a Leaders of Code episode. I'm very excited to have with me two great engineering leaders: Dan Lines and Ben Matthews. We're going to have a conversation about moving with velocity and focusing on business impact. How do you unblock engineering productivity? What can leaders do? And who's fallen for the AI hype cycle and who's making something great out of it? So without further ado, Ben, Dan, welcome to the show.
Ben Matthews Thanks. Great to be here.
Dan Lines Awesome. Thanks for having us on.
BP Dan, you're the guest from outside of Stack Overflow. Why don't you introduce yourself real quick and just let folks know who you are, what it is you work on with LinearB and your podcast. Just a quick flyover of your bio.
DL So hey, everyone. I'm Dan Lines. I'm one of the founders at LinearB. We have a productivity platform for engineering teams and super excited to be on the pod today, talk some productivity, maybe talk some AI, and let's get the convo going.
BP Ben Matthews, say hello to the audience. Let them know what it is you do and what you're working on these days.
BM Hello. I'm Ben Matthews. I'm a Senior Director of Engineering at Stack Overflow, mostly working with the Community Products and Knowledge Solutions divisions. So I'm lucky enough to be part of that piece of Stack that gets to work with the community, the questions and answers, work with the moderators, and build tools that really empower them. And it's been a really rewarding five and a half years now. Very proud of what we're doing and looking forward to the next generation of things we'll do.
BP So let's start with a little bit of a prompt. How can engineering leaders move beyond velocity and focus on real business impact? Dan, you want to take this one away, and Ben, you'll volley?
DL I'll start with this. First of all, I think we're in one of the most exciting times for engineering teams, engineering organizations. I have the privilege of talking to CTOs, developer experience teams, all of that, and they're at the center of the universe right now. And what does that mean? Well, we have a few things that are going on. Rapid technology stuff happening, I think everyone knows with AI. They're getting asked, “Hey, what are you doing with AI? How are you going to deploy this?” They're also getting asked, “Hey, how are we going to measure our success? Hopefully it's not through velocity or something like that.” And they're also getting asked, here's a tough question: “What are you going to do with your workforce?” Okay, so that's what I'm hearing from the field. Now what you asked is, “Hey, how can we move beyond velocity and something like that and get to real business impact?” What our customer base and the people that I talk to, they're on the hook for productivity. So I think the first thing that we need to do is decide what is real business impact and does anyone know what that actually means? For some of the leaders that I work with, real business impact might be as simple as, “Hey, we’ve got to get to production faster. We’ve got to do something like reduce our cycle time.” Really, really clear. Other ones might say, “You know what? We're actually having issues in production. Our customers are a little angry at us right now. Real business impact is improving our quality.” And maybe another situation, I'll give you one more because this is the stuff I'm hearing: “Listen, business impact right now, it's reducing our costs. We’ve got to bring down the costs.” So the way that I would concisely answer it back is, “Make sure you understand what real business impact actually is with your organization, otherwise you'll get into trouble measuring velocity or something like that.”
BM I think you hit on a whole lot of good points there. I think it's a great diagnosis of some old-school thinking, where people were emphasizing symptoms of business impact. Velocity is a side effect of having real business impact; it's not business impact itself. To be clear though, I'm still a fan of velocity as an introspective data point. It's not the goal, but it's a way to diagnose a team. We still use some points now and again just so we can see, “Wow, we really did very well this sprint. Let's look at what we did differently and how that moved the needle, especially around business impact, so we can lean into that.” Or if it went down, “Okay, is there something about this work, or something going on within the team, that we could do better?” But all of that, I think, to your point, Dan, is to drive business impact. If we're not moving those KPIs, delivering value, and really moving the needle on the bottom line, then what are we really doing here? That's what we're all here to do. And I think one of the ways we can move away from some old-school thinking around velocity is to get more alignment between the disciplines within a company. You mentioned cost, you mentioned quality, you mentioned speed of delivery. Those are things sales really cares about, quality and delivery especially, so they can have a good relationship with their customers. Bring that to the engineers; make them part of that responsibility. So instead of just trying to move a velocity number, that's the KPI for the engineers too: how often are we addressing bugs? How quickly are we getting these features out? It's not just how many stories we moved; let's actually talk about the bottom line. And then, to really focus on business impact, a lot of it is getting out of the engineers' way sometimes, not trying to be a middleman who just brings them the results. You're a stakeholder like everyone else: we're putting you in meetings with users, with sales and customer support. That, I think, brings business impact more holistically to all of engineering.
DL I mean, when we talk velocity, I'm not talking about just generally moving faster. We're talking story points, that type of thing, so it's a subjective measure. What you don't want is to have your engineering team say, “Oh, you want more velocity? This story, instead of being two points, it's five points.” Where I've actually seen it work pretty well is when the business says, “You know what we're looking for? We're looking for predictable delivery. That's the thing that will help us with business impact. Why? Sales can communicate to prospects and say, ‘Hey, this is when this new feature will come out,’ and we can actually hit the date. We can make promises to customers. We can also line up with marketing.” So if you do want to use velocity as a leading indicator, like you said, my recommendation, which I've seen work well, is to go with the predictability side as opposed to just trying to up that velocity number.
BM The predictability is a great word for it too. Even if the news is good or bad, just keeping everyone in line and informed and making that predictable, because the worst thing you can do is make a promise to a customer or potential customer and then not make good on that. That's not just an engineering problem, that's a whole business problem.
DL It doesn't feel good. And that's actual business impact then, because it's a business problem.
BM 100%.
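To make the predictability idea concrete, here's a minimal sketch in Python of the kind of measure Dan describes: instead of pushing a raw velocity number up, track how much of each sprint's committed work actually ships. The sprint data and the simple ratio are illustrative for this episode, not LinearB's actual metric.

```python
# A sketch of "predictability over raw velocity": measure how reliably
# a team delivers what it committed to, sprint over sprint.

def sprint_predictability(committed_points: int, delivered_points: int) -> float:
    """Fraction of committed work actually delivered in a sprint."""
    if committed_points == 0:
        return 1.0
    return min(delivered_points / committed_points, 1.0)

# Illustrative sprint history, not real data.
sprints = [
    {"name": "Sprint 41", "committed": 30, "delivered": 28},
    {"name": "Sprint 42", "committed": 32, "delivered": 21},
    {"name": "Sprint 43", "committed": 28, "delivered": 27},
]

for s in sprints:
    ratio = sprint_predictability(s["committed"], s["delivered"])
    print(f'{s["name"]}: {ratio:.0%} of committed work delivered')

# A team that consistently lands near 100% gives sales and marketing
# dates they can actually promise, even if its raw velocity is lower
# than a team that swings between 60% and 120%.
```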
BP Ben, you had mentioned earlier, sometimes the best thing to do is just stay out of the engineers’ way. I would love to hear you discuss a little bit what are some mistakes you see engineering orgs make when trying to scale and how can they avoid them? Some lessons we can learn from maybe some things you've done in the past.
BM There's two big things that may almost seem counterintuitive, but they're really aligned. I think a lot of folks ignore the complexity that comes with just a bigger organization. The core piece of that is keeping communication fresh and open, between lines, across departments, and between engineering orgs. When you just have a two or three-team engineering department, that communication is much easier to do. When you scale up to 10 or 15 teams, you can't keep everyone in line with just a Slack message anymore. You have to keep everyone up to date with more formal, for lack of a better word, measures. You set up some paradigms to keep everyone up to date and make sure there's a feedback loop. Without that, you really just have a divided organization that can't move toward those business goals together. But on the other side, you also don't want to add too much ceremony as you scale. A lot of people add more red tape, more process, and that takes away some of the key things that got you to a place where you're able to scale. When you look at a two or three-team startup, being able to make quick decisions, take responsibility for decisions, and move quickly is a huge part of that. Adding extra steps gets in the way of people feeling empowered to actually make things and deliver them faster, and that's going to take away from the bottom line, and probably from what got you to the successful place you are now.
DL Ben, I agree with you. Actually, I think there's a nuance to this question: what scale are we talking about? If we're talking about going from two or three teams to ten teams, there's a balance there. You want to stay nimble, you want to stay agile, you don't want too much red tape. But 10 teams for some companies is actually still really, really small. I usually see a hundred developers as the breaking point of scale. If we're going to cross that hundred-developer mark, and maybe go to 150, 175, or up to 500, I think the balance shifts a little bit in the other direction, from what I've seen, where more centralization is actually key to scale. And the reason I say that is I encounter two different types of engineering organizations, and I have my own opinion about which I see being more effective. One type, crossing that 100 or 200 developer mark, kind of just says, “Hey, we're going to let every team be themselves, work on their own processes, and do their own thing.” But I've seen that there's something good when you get, for example, a centralized developer experience team whose responsibility is to help all teams become more streamlined and productive. Sometimes they're called engineering excellence teams, whatever you want to call it. I do think that raises all ships once you cross a certain number of developers.
BM I think that's a great example, a developer experience team or specialized teams. You bring up a good point, especially when you get to that super large size, but there are opportunities even before that 100 or 200 mark. At Stack, we have a front-end-specific team that just works on our front end framework. That takes a lot of lift off the teams building stuff; they can rely on this library that everyone's built. We have a DevX team. We even have specialization around networking and other platform stuff. You bring up a lot of good points, Dan, about finding the commonalities between all of these teams: what do they all have on their plate, and how can we specialize around that, find someone who can do it really well, and then scale it outwards? There are a lot of great opportunities for that, and I think it changes between orgs. The examples you gave are pretty consistent, great examples of what's common across a lot of teams. But even then, there was one team in a previous organization I was at that did nothing but data security, making sure things were encrypted, building all of these libraries to keep things safe, just because over and over again teams had to find new ways to solve that problem. Having one specialized team for that was a force multiplier for everyone. Everyone just got to move faster.
BP I want to touch on that for a second. Dan, he's referring to a bottleneck there that they were able to eliminate by having this really focused team. Can you tell us a few things about LinearB and how it works these days? You and I were on a podcast talking about this two years ago, so I'm sure it's changed a lot since then, but talk a little bit about some of the cutting-edge stuff that LinearB does. Maybe we can focus on what some of the biggest blockers to productivity are that you get to analyze and find through LinearB, and then perhaps solve for.
DL That's so awesome. We can compare two years ago to where we are now; I think a lot has changed in two years. Starting with the engineering landscape: in most of the companies we work with, we either have that CTO or that developer experience team, and they're on the hook for productivity gains. That's actually how they're making their business impact and what they're being asked to do, and that's what LinearB is helping with. I'll say it in the simplest way possible. First of all, we come in and measure, end to end, all aspects of the software development process. For example, Ben, you were talking about velocity and how teams are moving through sprints; we have data around that. And on the other side: is it easy for us to ship code to production? Is it easy for us to go through a code review? Once you can see something like that, you have the full landscape of where the bottlenecks actually are. Maybe I didn't have a problem over here where I thought I did, and I actually have a problem in this other place. Probably the newest thing with LinearB is that after we find those bottlenecks, we deploy AI automation and bots into those areas of pain. So for example, we have an AI code review now, we have an AI PR summary description. We can actually identify bots that are raising pull requests that don't need a human review. We deploy all these automations to actually solve the problems now, and not only measure where they are.
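As a rough illustration of the end-to-end measurement Dan describes, here's a minimal sketch that breaks a pull request's cycle time into phases from timestamped events. The field names and records are made up for the example; they're not LinearB's schema.

```python
# A sketch of per-PR cycle time, broken into phases so the actual
# bottleneck (coding, review, or deploy) becomes visible.

from datetime import datetime

def hours_between(start: str, end: str) -> float:
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600

# One illustrative record per PR: first commit, review start, merge, deploy.
prs = [
    {"id": 101, "first_commit": "2025-01-06T09:00:00", "review_start": "2025-01-06T15:00:00",
     "merged": "2025-01-08T11:00:00", "deployed": "2025-01-09T10:00:00"},
    {"id": 102, "first_commit": "2025-01-07T10:00:00", "review_start": "2025-01-10T09:00:00",
     "merged": "2025-01-10T16:00:00", "deployed": "2025-01-13T09:00:00"},
]

for pr in prs:
    coding = hours_between(pr["first_commit"], pr["review_start"])
    review = hours_between(pr["review_start"], pr["merged"])
    deploy = hours_between(pr["merged"], pr["deployed"])
    print(f'PR {pr["id"]}: pre-review {coding:.0f}h, review {review:.0f}h, '
          f'deploy {deploy:.0f}h, total {coding + review + deploy:.0f}h')

# Breaking the total down per phase is what surfaces the bottleneck:
# here, PR 102 sits almost three days before review even starts.
```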
BM I mean, those are really interesting places though. Just to check, you said you have some bots that are actually creating PRs on their own without human review?
DL I mean, it's actually not even us. When we go and scan a customer's engineering organization, for example, it's not even an AI bot. I think you all probably know Dependabot or Renovate. These are bots that open up a pull request, maybe for a patch or an update or something like that. So when we come into an organization, we'll say, “Hey, you know what? Dependabot has raised a thousand PRs in the last three months, and you're still using a human reviewer for those PRs, but all the human developer is doing is immediately typing ‘Looks good to me,’ and then going back to their development work.” So why don't we just run an AI code review and automatically merge that pull request instead of wasting the time of another developer and taking them off their focus? It's a really easy example.
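For the curious, here's a minimal sketch of that pattern: scan a repo's open PRs, pick out the ones raised by dependency bots, and merge only the lowest-risk ones. It uses GitHub's REST API via requests, but the token, repo, and the naive patch-only risk rule are assumptions for the example; a real setup would also gate on CI status and a review step, as Dan discusses later.

```python
# A sketch of auto-merging low-risk dependency-bot PRs so humans
# aren't typing "LGTM" a thousand times a quarter.

import re
import requests

TOKEN = "ghp_example"              # hypothetical token
REPO = "example-org/example-repo"  # hypothetical repository
BOTS = {"dependabot[bot]", "renovate[bot]"}
HEADERS = {"Authorization": f"Bearer {TOKEN}",
           "Accept": "application/vnd.github+json"}

# "Bump foo from 1.2.3 to 1.2.4": capture major/minor before and after.
PATCH_BUMP = re.compile(r"from (\d+)\.(\d+)\.\d+ to (\d+)\.(\d+)\.\d+")

def merge_low_risk_bot_prs() -> None:
    prs = requests.get(f"https://api.github.com/repos/{REPO}/pulls",
                       params={"state": "open"}, headers=HEADERS).json()
    for pr in prs:
        if pr["user"]["login"] not in BOTS:
            continue  # humans still review human-authored PRs
        m = PATCH_BUMP.search(pr["title"])
        if not m or m.group(1, 2) != m.group(3, 4):
            continue  # naive rule for the sketch: patch-level bumps only
        resp = requests.put(
            f"https://api.github.com/repos/{REPO}/pulls/{pr['number']}/merge",
            headers=HEADERS, json={"merge_method": "squash"})
        print(f"PR #{pr['number']} ({pr['title']}): HTTP {resp.status_code}")

if __name__ == "__main__":
    merge_low_risk_bot_prs()
```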
BP Dan, I'm going back here to the show notes from when we chatted in February 2023: “Engineering's hidden bottleneck: pull requests.” And at the time you were saying you had looked at 4 million PRs, you were figuring out what to do, and you had gitStream, and it was sort of like, “Okay, we're going to find the right person within your organization to be reviewing the PRs.” Here we are a year and a half later, and the right person is the AI bot, as long as there's a human in the loop somewhere and you've got enough testing and version control and all that kind of stuff.
DL I mean, so the technology is still gitStream, but you're right. I mean, AI's here. We’ve got to be deploying this, we’ve got to do it in a responsible way. That's the thing. So you're trying to kind of identify, “Hey, where could we use AI to our advantage?” It's not everywhere, by the way, but where could we also make sure we're focusing our developers to make sure they're actually working or reviewing in areas of need? And it's not an all or nothing thing. We're not at the point where it's like, “Okay, AI's creating code and AI's reviewing the code and AI's deploying it.” Maybe we'll get there someday, but I think we're more so at the point of, “Hey, let's take some of this monotonous work off of developers and make sure that we can provide them with something that does the easy stuff and let them focus on the fun stuff.”
BM I think that's the part of AI where there isn't hype, there isn't this bubble people are talking about. I admit I think AI is being thrown at a lot of places it doesn't need to be, but the examples you gave are exactly where it does need to be. There are a lot of boring, tedious tasks that are not creative exercises for people, that are really just steps to be taken, and those should be automated. One thing I thought was beyond the pale was an advertisement for an AI system that would read your kids stories, kind of read through books with them. And in the commercial I saw, while the parents got to do the dishes or the laundry, their kids were entertained. I'm like, “No, that's backwards. I want an AI to do the laundry and dishes for me so I can be with my kids.” And I think this is an example of that: the stuff that's not a creative exercise, that doesn't need human beings in the loop, that's exactly where AI can solve a lot of problems and free up people's time for the difficult ones.
BP Dan, to your point about what is toil work for an engineer, what is repetitive work, and what is engaging and challenging work, let's decide ahead of time and try to decide correctly which parts of this we want to pass off to the robots and make sure we get it right.
DL I mean, we rolled out an AI PR summary. All it does is, when you open up a pull request, it writes a summary of what the changes are in English. If you think about it, for some developers in the world English is a second language. Maybe I don't want to spend my time typing out in English exactly what this PR did, and I could use some help to do so. I think that's another obvious example.
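Here's a minimal sketch of what an automation like that might look like, assuming an OpenAI-style chat API: diff the branch, hand the diff to a model, and get back a plain-English summary. The model name, prompt, and truncation limit are illustrative; this is not LinearB's implementation.

```python
# A sketch of an AI PR summary: turn a branch diff into a short
# plain-English description for the pull request body.

import subprocess
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_branch_diff(base: str = "main") -> str:
    # Diff the current branch against the base branch; truncate so the
    # diff fits comfortably in the model's context window.
    diff = subprocess.run(["git", "diff", base, "--stat", "-p"],
                          capture_output=True, text=True).stdout[:20000]
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Summarize this code diff in 3-5 plain-English "
                        "bullet points for a pull request description."},
            {"role": "user", "content": diff},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(summarize_branch_diff())
```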
BM That's actually something Stack has leaned into. We have a question assistant now that has shown a 10 to 15% uptick in question success on Stack Overflow, especially helping people with English as a second language with things like grammatical mistakes. It finds commonalities between what makes a good question, what questions have been successful, and brings up things like, “Oh, you're missing a code block,” or even, “Your question might have been answered somewhere else; we don't want this closed as a duplicate.” For people who are first-time askers at Stack, that's probably the worst experience they could have: “I'm opening this relationship of being part of this community, asking a question, and then I'm not having a successful experience.” And AI is just saying, “Hey, I'm nudging you toward a great way to express yourself. I'm not telling you how to do it. I'm not creating your content, but I'm giving you some guardrails around what would really be helpful.” I think that's another great place AI can help.
DL That's really cool, because there are great developers all over the world who probably want to contribute on Stack and all of that, but maybe they're not asking the question the right way or they're missing something there. I think that's cool. I like it.
BM Well, even for experienced developers, asking technical questions in a good way is a skill unto itself. Being able to communicate around technology and around code, there's complexity there; it's almost a learned skill. So whatever we can do to nudge them toward it, give them some examples. We've been training some things here at Stack to help around that experience, and we've been having a lot of success with it.
BP I remember when I started at Stack Overflow, we had a course– Ben, I believe you were in the same cohort as me– and it was like, “No matter what part of the company you're at, you're going to learn a little bit of HTML, JavaScript, CSS, just to get you a little bit familiar with some of the stuff. If you're on the marketing team, the legal team, it doesn't matter. Stack Overflow is for developers, we want you to have a few of these skills.” What do the two of you think about the contributions of less technical staffers into an engineering organization? How do you do that right without frustrating your serious senior engineers, but for example, letting a marketing person maybe take some of the work, as you said before, Dan, that they consider tedious or non-essential off their plate and do it themselves?
DL I love marketing people, but I don't know if I ever want a marketing person in my app. No, I love marketing people. I can just tell you what we're doing. You talked about gitStream; that's our underlying technology for all of these automations and AI. What we're experimenting with now is kind of bleeding edge. Think of a product person. A product person is like a marketing person, not a developer, but also really good at designing products and how they should behave for customers. What we see is that product people open up a lot of tickets that don't necessarily need a developer. Let's say I'm a product person and I'm just saying, “Hey, you know what? This text is not clear, the wording here doesn't make sense,” and I open a ticket, or I want to change the color of something in the background. When I open up a Jira ticket, we're actually working on grabbing that ticket, opening up a PR, and doing the code change automatically, without bothering engineering with that tedious task. Who wants to change a color from red to blue? I'm sure there's maybe some people. Or who wants to inspect some grammar or something like that? We send that PR all the way through the system without bothering anyone in engineering. That's where we're dabbling, and I can see product development starting to head in that direction.
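Here's a minimal sketch of just the triage step in a pipeline like the one Dan describes: deciding whether an incoming ticket is a low-risk copy or style tweak that automation can turn into a PR, or real work for an engineer. The keyword rule and ticket shape are assumptions for the example; a production system would lean on a model and repo-level risk signals rather than keywords.

```python
# A sketch of routing product tickets: low-risk copy/style tweaks go
# to automation, everything else goes to an engineer.

AUTOMATABLE_HINTS = ("typo", "wording", "text", "copy", "color", "font")

def route_ticket(ticket: dict) -> str:
    summary = ticket["summary"].lower()
    if ticket.get("component") in {"auth", "billing"}:
        return "engineer"  # never automate sensitive areas
    if any(hint in summary for hint in AUTOMATABLE_HINTS):
        return "automation"  # candidate for a machine-generated PR
    return "engineer"

# Illustrative tickets, not real ones.
tickets = [
    {"summary": "Change the banner color from red to blue", "component": "web"},
    {"summary": "Fix wording on the signup confirmation page", "component": "web"},
    {"summary": "Checkout intermittently double-charges cards", "component": "billing"},
]

for t in tickets:
    print(f'{route_ticket(t):10s} <- {t["summary"]}')
```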
BP I think that's more what I was getting at when I said I was getting involved in vibe coding; exactly your point. Hey, you know what? We want to A/B test this landing page for the new product. I'd like to shift this over here, make this button bigger, change the color, and up this font. I don't want to open four Jira tickets. Maybe I have a certain level of access to the system, limited so marketers can't do any real damage, but I can get under the hood and change a few things while the engineer stays in the flow state.
DL Listen, if you have the guardrails in place, sure. I know we keep talking about an AI review, but even more so orchestration. If it's like, “Hey, we see that Ben has opened a PR, or our bot opened a PR on behalf of Ben, and it's in a low-risk repo, and all security has passed, and all tests have passed, and it went through a review,” then why not? Go for it, Ben. It's automatically going to get merged. That makes sense.
BM And then if production goes down we know who to– no, I'm kidding.
DL Then we know to go back to Ben.
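Dan's conditions translate almost directly into code. Here's a minimal sketch of that merge gate, with illustrative field names and an assumed list of low-risk repos.

```python
# A sketch of the guardrail check Dan lays out: a bot-opened PR only
# auto-merges if the repo is low risk and security, tests, and review
# have all passed.

LOW_RISK_REPOS = {"marketing-site", "docs", "internal-dashboards"}

def can_auto_merge(pr: dict) -> bool:
    checks = (
        pr["repo"] in LOW_RISK_REPOS,   # low-risk repo only
        pr["security_scan_passed"],     # all security checks green
        pr["tests_passed"],             # full test suite green
        pr["review_passed"],            # AI or human review approved
    )
    return all(checks)

pr = {"repo": "marketing-site", "security_scan_passed": True,
      "tests_passed": True, "review_passed": True}
print(can_auto_merge(pr))  # True: safe to merge the button tweak

pr["repo"] = "payments-core"
print(can_auto_merge(pr))  # False: high-risk repo, route to a human
```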
BM Everyone has to take precautions. And I think there are a lot of parallels here. Someone gave me a great example that I love to use when it comes to AI, especially automating development tasks or taking them off people's plates: a calculator. Right now, you go through math tests and you're allowed to use a calculator, but when you learn the foundations, you're still doing it with pencil and paper. You learn how to do math that way, but eventually you get to a point where you don't need to; you've learned those foundational parts and you can let a machine take care of it for you. That analogy really resonated with me. Maybe for someone new to code: don't use AI. I want you to actually go through the experience of changing fonts and colors, working within HTML. But at some point that becomes a tedious task. I want you to know those foundational skills, because as we grow more mid-level, senior, and staff-level engineers, those folks come from junior engineers. They don't grow out of nowhere; they have to go through those tasks. But eventually you get to a point where you don't need them to go through things they see as tedious. Another thing, just being honest: I personally hate CSS with a passion. If I never have to write CSS again, I will not be sad. There are some people who can make it sing and do magical things with it. I am not one of them. So I use AI all the time, saying: I know I could go through this long process, doing long division on paper if I wanted to, or I could have AI give me a class with rounded borders in blue, and it just spits it out. And that's a wonderful experience.
BP What are you excited about coming up in the next 6 to 12 months? Obviously, if we get to the singularity, things will change, but assuming we just stay on the same exponential curve we're on, how do you think about what it's like to be a leader at a software engineering organization and what you're going to do with your product to help all of your customers continue improving their productivity?
DL What I'm most excited about is that engineering organizations are back in the spotlight. What I see is that businesses really value the engineering team, the developers, the CTO. You're at the forefront of the conversation. You're getting asked how to be more productive, what your AI strategy is. This is an exciting time in history. And with LinearB, we're just a part of that piece of history; we're the ones that say, “Hey, I know you need an AI strategy. I know you need to roll out automations. I know you need productivity. Let me help you make it happen.” I don't know if it will actually come to fruition, but I can see some of the best orgs starting to pump out really, really great code and great products at, I don't know, maybe a 10x faster pace. I think it's exciting for humanity, and I'm excited to see it happen over the next 6 to 12 months, a few years even.
BP Ben, I know there have been a lot of interesting changes at Stack Overflow over the last year or two as we've adapted to the Gen AI era. As you mentioned, lots of great efforts to integrate AI appropriately and ethically with the community and that human-created knowledge base. What are you excited about for the next year looking ahead?
BM I think about it from two different areas. From a creator's point of view, of engineers making code, as Dan touched on, I'd say we're getting into an age of tooling now: how can tools make people go faster? I think that's really the inflection point, and AI is at the core of it. Though there are areas that are hype, it's not going away. It's going to really help people in certain ways as it normalizes. Even though it's been around for a couple of years now, we're still figuring out new ways it can help, and I think over the next 12 months it's going to look totally different than it does now. It's going to keep refining itself and getting bigger. I disagree with the people who think engineers or designers are going to go away. I don't think that at all. I think engineers and designers are going to go faster and find more creative ways of doing tasks. I really think that's where the next step is. But even outside of engineering, for the general population of users, we're going to see a whole lot more guardrails and guidance on how they operate with other tools, whether it's creating presentations or contributing to a community. Those sorts of tools are also going to emerge. Especially for people who may not be, for lack of a better term, technically literate, or who have some struggles, having more guardrails and tools to help them interact with the internet space, that's really ripe for a lot of tools to come in and help.
BP Dan, you were shaking your head. Anything you want to add?
DL Just in agreement, and I'm excited about the future.
BM The democratization it's adding, letting a lot more people interact and create things: it's no longer a silo of certain people holding the keys to certain kingdoms when it comes to technology. A lot of these tools are opening doors for folks, and for now, maybe until that singularity happens, there's still going to be a big need for people who understand the crux of things, operate at a higher level, and think of new creative solutions. Because even as AI grows, it needs human input; it needs people generating new creative ideas to feed off of in order to evolve. So how can AI have that symbiotic relationship, where people use it so it learns more, and people learn from the experience of using AI and being in those spaces?
BP And Dan, if you ever want to write some marketing copy, don't let me stop you. I mean, just go for it, share it with your team, let them give a little feedback. Nobody says engineers can't write marketing copy, okay?
DL Touché. Fair point.
[music plays]
BP All right, everybody. It is that time of the show. Let's give a shout out to someone who came onto the Stack Overflow Network and shared a little bit of curiosity or a little bit of knowledge. “How can I test whether a variable holds a lambda?” Augusta gave a great answer. That answer was so good it got more upvotes than the accepted answer, earning them a Populist Badge. So congrats, Augusta, for helping out; 25,000 people benefited from that knowledge. As always, I'm Ben Popper, one of the hosts here of the Stack Overflow Podcast. Find me on X @BenPopper, or shoot us an email, podcast@stackoverflow.com. Questions, suggestions, topics, guests, we're open to all of it. If you like what you heard today, leave us a rating and a review, but even more, just go tell one other person on your team, another developer or engineer, that this is a great podcast and they should check it out.
DL Dan Lines, LinearB co-founder. You can find me on LinkedIn. Just look me up with my name. And if you're looking to improve productivity for your engineering organization, check us out at linearb.io.
BM I'm Ben Matthews, Senior Director of Engineering for Stack Overflow. You can find me on Bluesky @BenMatthews, or on LinkedIn under Ben Matthews.
BP All right, everybody. Thank you so much for listening, and we will talk to you soon.
[outro music plays]