The Stack Overflow Podcast

ML and AI consulting-as-a-service

Episode Summary

The home team talks with Jaclyn Rice Nelson, cofounder and CEO of Tribe AI, about the explosion of hype surrounding generative AI, what it’s like to work at a startup after working at Google, and how Tribe is leveraging the power of a specialist network.

Episode Notes

Tribe is a distributed community of AI industry leaders, including ML engineers and data scientists, dedicated to helping companies apply machine learning to their business operations. Explore their case studies to see Tribe’s expertise in action.

Founder and CEO Jaclyn Rice Nelson formerly worked at Google, partnering with enterprise companies and incubating new ventures. As an early employee at CapitalG, Alphabet’s growth equity firm, she advised companies including Airbnb on scaling technical infrastructure, ensuring data security, and boosting growth with machine learning.

As we explored on our blog last year, the generative AI space has been expanding rapidly. Many of Tribe’s specialists have opted out of full-time employment, but are willing to provide companies without internal AI expertise with the skills they need to leverage this rapidly evolving technology inside their business.   

Connect with Jackie on LinkedIn or Twitter.

Today’s Lifeboat badge winner is PM 2Ring for their answer to Sort a list to form the largest possible number.
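For the curious, that answer concerns a classic comparator-sort puzzle: order the numbers so that their concatenation is as large as possible, by comparing each pair of candidates both ways. A minimal Python sketch of the idea (the function name and example values are ours, not taken from the answer itself):

```python
from functools import cmp_to_key

def largest_number(nums):
    """Concatenate integers to form the largest possible number.

    Sort with a pairwise comparator: a goes before b when the
    concatenation str(a) + str(b) beats str(b) + str(a).
    """
    strs = [str(n) for n in nums]
    # Comparator returns -1 when a+b is the bigger concatenation,
    # so that ordering wins the sort.
    strs.sort(key=cmp_to_key(lambda a, b: (a + b < b + a) - (a + b > b + a)))
    result = "".join(strs)
    # Normalize the all-zeros case ("000" -> "0")
    return result.lstrip("0") or "0"

print(largest_number([3, 30, 34, 5, 9]))  # prints 9534330
```

The pairwise comparison (rather than plain lexicographic order) is what correctly puts 3 ahead of 30, since "330" beats "303".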

Episode Transcription

[intro music plays]

Ben Popper Hello, everybody. Welcome back to the Stack Overflow Podcast, a place to talk all things software and technology. I am your host, Ben Popper, Director of Content here at Stack Overflow, joined as I often am by my wonderful colleagues and collaborators, Ryan Donovan and Cassidy Williams. Hey, y'all.

Cassidy Williams Hello!

Ryan Donovan Hey. How’re you doing, Ben?

BP So today we are going to be talking a little bit about something near and dear to our hearts: AI, generative AI, and the explosive hype cycle happening in that area. From big companies like Microsoft putting AI into Bing, to startups, Cassidy, like your own, and open source projects like Stable Diffusion, this has really been, I think, the center of the topic of conversation, something that is constantly in the news cycle especially for developers and more recently even for the mainstream. So today we are lucky to have a guest, Jaclyn Rice Nelson, who's coming to us from Tribe AI. It's a network of professionals in sort of the ML data science space, and they help enterprise companies and startups connect with top talent. So Jaclyn, welcome to the program. 

Jaclyn Rice Nelson Hi, everyone. Great to be here. 

BP So we usually just start off by asking folks to give us a quick flyover of how they got here. I know you spent some time at Google and also at their venture arm maybe, and now Tribe. So walk us through how you ended up where you are today, and a little bit of what it is you do day-to-day.

JN So I started my career in finance and I quickly moved out to the Bay Area to enter tech and join Google, as you mentioned. And most of my career there I started kind of working with large tech B2B enterprise companies on more of the go-to-market side, where Google makes most of the money, so on the ad side of the business. I had a chance to kind of apply a similar kind of service model, which was essentially providing free consulting services to help optimize ad spend. But instead of doing that for large enterprises, do that for emerging high growth tech B2B companies, so working with Box and DocuSign really early days, and it really just became an extension of their team because startups are, as many of the listeners will know, very fast-paced, and the good ones will take advantage of all the resources they have available because you never have enough. And so that was kind of when I first got exposed to startups and the pace, the innovation, the idea of kind of being one team going after a goal was something that really spoke to me. And basically the rest of my career has been going earlier and earlier stage since then. So I officially got the startup bug. Google is very good at retaining employees with shiny things, so even when I tried to leave, I got pulled back in to help launch this marketplace product. Very network-based leveraging expertise, it was called Helpouts. We had a large 30-person engineering team when I went over as one of the first business people, and that was really my first experience working closely with engineers and product people. Again, I really loved it, and built and launched this product. Unfortunately, we crashed and burned. RIP Helpouts. It was, for me, this really formative experience though, which was really learning that I loved being part of building and bringing a product to go to market. 
And so after that, I sort of was looking for a similarly entrepreneurial opportunity within the Google family, still feeling really Google loyal, and joined CapitalG (formerly known as Google Capital), a venture fund under the Alphabet umbrella focused on late stage tech companies. And the core thesis there was actually that they're really good at scaling things. And so the thesis was if we invest in late stage tech companies, they've already found product market fit, and now we're trying to figure out how to go global, which applies to engineering scalability, it applies to product, it applies to sales, marketing, HR. Really all of the places where Google had built these incredible scaling playbooks. And so what I did was build this expert network of 50,000+ people to help harness kind of the expertise within Google and scale it to accelerate the growth of the companies we invested in. And what I saw was that companies really came to us for help in all of these areas, but the area we were most differentiated was of course on hard engineering. So from security engineering, Google has an incredible security engineering team. And then data science and machine learning were really these patterns that kind of started to emerge. And for me, I got to see both what really excellent looked like, so working with these top engineers within Google, because we of course start to optimize for the people who are really just top, top. They’re the ones who are going to drive differentiated insights for these companies. So I really learned what good looked like on the talent side, or excellent looked like on the talent side. And then on the company side, I started to really just see patterns across these questions that the companies had. And I'm not sure if it will be surprising to the audience, but it was very surprising to me at the time that the questions were not these really deep, sophisticated technical questions.
They were, “How do I evaluate a data scientist?” Or, “Google, you're applying this technology to all these amazing places in your business. Where can I do that to have a lot of value, and not just do it in name, but actually do it to power my business. Where does it go and what should I be doing?” And so there were sort of these higher level strategic questions and it occurred to me that this was pretty shocking given the companies were Silicon Valley based, had large engineering teams, had Google as an investor, and still the questions were pretty fundamental, pretty basic. And I really sort of saw this as the barrier to AI adoption sort of across all companies. And so I set out to solve this problem. I left CapitalG and started really focusing on building solutions. The model actually was really similar to what I built at CapitalG, which was that expert network, talent network based model. I also teamed up with my co-founder who comes from Gigster, which was similarly this network of engineers that worked with companies on a consulting basis, and we kind of applied this model except for really the top 1% of engineers focused on data and AI. And the idea was that by actually figuring out how to unlock talent on the market and really bringing together some of the best minds in data science and machine learning, that if we could figure out how to partner them with companies that didn't know what to do, we could actually really make huge gains not just for those companies, but also for the industry. And so fast forward, we are a network of 250+ of these top experts in AI and machine learning. They come to us every week. We get dozens of applications. We do not even find them. And the reason for that is that we've built a technical brand that I think really speaks to things they care about, which are sort of freedom of how they work. 
That's mostly enabled by creating and finding opportunities for them to work in ways that are both really interesting from a technical perspective and provide them the freedom and flexibility to live their lives the way they want to. 

RD It's interesting that you say that the questions that these engineering companies had were basic, basic stuff. We talked to Intuit recently, and when they were scaling up their AI program, one of the big things they had to do was just get their engineers and data scientists to speak the same language. Do you think there is a disconnect in how data scientists and engineers work today?

JN Yeah, it's interesting. I mean, I always think back to kind of what were the patterns I both saw across companies at CapitalG, and then what do I see today? One of those other threads was basically any version of cross-functional networking. And so whether it was designers and engineers, or product and engineering, or now data science and engineering, I think that any time there are groups operating in silos, figuring out how to partner, how to do that handoff with other teams, particularly when you're moving in a fast-paced environment and scaling, tends to be where things break down. And so yes, I think that is almost certainly true. I would also argue that it's probably true on data science analytics, and sort of some of these more machine learning engineering functions, and product. So I think that has been a real unlock for us, because we work with companies in a few ways. One is there are going to be high growth companies that are quite technical. They just need speed. And we can just give them talent and they know how to work with that talent. So we can help provide them engineers that will work with them on a consulting basis and they're good to go. And then we have another bucket of companies that aren't as technical, and where they want to work on machine learning projects but they probably don't have as much experience working with that talent or really knowing how to apply it to their business. And there we’ll actually pull together and build teams of people that we will actually manage for them. And so they can still embed deeply with the sort of customer technical teams, but we're really responsible for driving the outcomes. And in those teams we always embed a product manager like a data PM with those really technical specialists because it is critical that you're always really orienting towards value and orienting towards integrating with the goals of the company. 
Otherwise, you're building things that are cool technically, but probably not going to do very much. 

BP It's interesting. Before I came here I worked as a journalist and did a whole bunch about the sort of long-term trend of outsourcing IT talent and what an Infosys or a Tata would do, or what it was like to hire a team in Ukraine to get your tech work done if you were not a very technical company. And one of the things you just said that struck me was that often if they needed a specialist because they had some experience, they could bring that in, or they could hire an entire team, kind of like you said. So when you're looking out at this network, I noticed in your website you'll list kind of what they worked on and where they worked and where they got their education, but what do you think are the key sort of experiences that folks have had that would qualify them to be the lead in a role like the one you just described, or to be able to bring a lot of value to a company as a consultant as opposed to a full-time hire?

JN I mean, you said so much in there that I think we can and should dive into, one of them being this sort of outsourcing trend and how is that evolving over time, particularly with large language models. Another piece I think is what you're describing, which is what are actually the skill sets that matter? And I think this is where I get the most excited about the power of a network-based model. The alternative of course is that I hire full-time all of these 250 people, they work for me or for Tribe, and we kind of pull from our bench. Now granted, there's no way I could hire these people in this case because these are people who are otherwise off the job market. Perhaps in the great resignation or other points in their career they decided that they didn't really want to work at some of these bigger companies, and really that's kind of the arbitrage play– that you can get this quality of people because you're not trying to hire them full-time. But let's just say for the sake of experiment I could, because that's what large consultancies do today, and then they just pull from their bench of who's available. So sure, you're sort of saying someone has the skill sets that are needed to do the work, but realistically you're probably just pulling from who has capacity, isn't on other projects, and is on your payroll. I think in our case we can really just go get to the right person for that work. So I think the first thing I would say is someone who has the experience for not the high level work but the actual work that needs to be done. So based on what you're talking about, is that a data engineer? Is that more of a strategist? Is that a model builder? There's sort of so many deep areas of what it takes to actually do the work. And because most companies don't even know what to do, which was sort of my CapitalG insight, they definitely don't know what needs to be done on the ground. 
And so generally when companies are making full-time hiring decisions, they both need to optimize for generalists because certainly at a startup or a company that's in any way constrained and is not a Google, you can't have a thousand specialists in any given area being sort of underutilized. And so they kind of optimize for the mean skill set which means you're generally not getting the person who's best at the work they're meant to be doing. And so instead, we can kind of pull in these specialists based on what actually needs to be done. And at the simplest level, the thing that's most important is that they've done the thing before. So it's not that they have to have worked in insurance or worked in construction or whatever it is that the company does. To me, the most amazing results happen when you take the industry expertise that lives in a company and marry it with deep technical expertise, people who know how to do the technical work and can learn the industry pieces and really pull those things together to drive incredible results.

CW When companies use Tribe is there an education component of, “This is actually the type of role that you need.” Because so many companies I've seen are just like, “Oh yeah, we just need a TPM and it'll be fine.” Or, “Oh, we need just someone to build the model things and then we'll be fine.” And it turns out that is not what they need at all; they need a data scientist or something like that. 

JN Yeah. I think the hardest part is actually figuring out what it is that they do need. Generally, what we'll try to do and in cases where we can sort of tell they don't really know what they need, is do kind of a quick investigation ourselves. And that usually means kind of two weeks to a month of working with them, putting really deep specialists based on what we think is the work they need to do, and someone who's more of a product thinker and sort of more of a strategist and really come up with where we actually think the opportunities are and what is that technical roadmap to get there such that we can then put together the right team. And so in those cases it's not knowable unless you actually do get into the weeds, and there's no one in the company who knows those answers. And so I think the hardest part is really helping people understand that actually doing that work is so important for the foundation of all future work and really derisks the technical investment you would make going forward because you will now have a plan to be working off of that is very informed by reality. 

BP It's interesting to hear you say that. Obviously the discovery and research part should come before the road mapping which should come before the execution. It doesn't always work that way, but hey. I guess I'm curious to hear your perspective on how you do that in a world where the pace of progress seems so rapid, that what you decide would be the best use of AI/ML during a six month research phase might look very different than what we now see in the wild six months later as state of the art. How do you skate to where the puck is going when there's just this constant drumbeat of breakthroughs and sort of leveling up across academia, open source, and company product?

JN I think the real question is still what matters for your business. So it's pretty rare that large companies are working on the most state-of-the-art technology, but the advantage they have is that they have data, and so if they can figure out how to use and harness that data and really understand who their customer is and have insights and really some of the basics, the things that I think in startups we kind of take for granted actually are not basics in a lot of companies. And so on the sort of larger end of the scale, we're just getting the data pipelines in place. We're really getting the sort of data to be usable and drive actual insights that are actionable. That lays the foundation so that you can then build cool stuff on top. So that would be predictive models. And what I mean is forecasting or predicting failures in hardware. There are sort of some of these applications that are so critical for businesses that they're not doing, but that sort of advanced technologists might think of as proven, de-risked, tried and true methods that have been applied in various contexts. And so that is generally what big companies are sort of focused on; kind of where can I actually get value from my data? How do I actually use it to power my company? And I think that's the right starting point. It's just the reality that that takes time to build such that you can do the cool stuff. So if prediction is the kind of first milestone of cool stuff, or pricing optimization, well then the next level of that is the large language models, et cetera. And there are businesses that have amazing applications of that.
I would take law as an incredible example where there are startups coming up who are focused on using large language models to do really, really interesting stuff in the legal industry, and where big companies similarly have a ton of data and if they can figure out how to really use that with advanced machine learning and build those technical capabilities, they too will have a real advantage. So I think there are really incredible applications. I think everything comes down to, what is the value this will drive for your company and therefore where is the right place for you to be spending your time. I also talk to lots of companies that want generative AI but where it really doesn't make any sense for their business. 

CW It's one of those things where when there's a hype cycle, people are just like, “Oh, we should be on that,” but they don't need it, necessarily. And again, it comes down to that education piece of figuring out what do these companies actually need to make AI work for them.

JN Look, if they really want generative, we can give them generative. But the first question that we ask is, “Are you going to be happy with the result?” Let's say you can wave your magic wand and you have generative AI, whatever that means. What happens to your business? What's different? Because alternatively, you're just going to end up throwing this stuff out and have spent a lot of money and probably not be thrilled with Tribe. So that's not the outcome we want to drive towards. We want to drive towards outcomes that people are going to be really excited about and that are going to be really core to their business.

BP For the insurance company, probably better to focus on price prediction and data analysis than creating customized art or text for all of your potential customers, but you never know.

JN You never know. I'd like to see that insurance art. 

BP Do you think similar to the way you talked about Google helping folks scale when you were there, that we’re getting now to a point where there will be enterprise offerings, cloud offerings, that don't do everything Tribe does, which is to actually have somebody in there writing code or product managing. But Microsoft has talked about AI as a service, letting other companies use it, use their data, slap their own label on it. So do you think that this is something we're going to start to see, and what would the difference be between using a service like that and relying on a network like Tribe?

JN Yeah, honestly, this is a super interesting question, and I don't just see the need from big companies. I spoke to a startup last week that wants exactly this. They don't want to invest all of their resources. They're not a generative startup. They're not a machine learning startup. They are an amazing company going after the wealth management industry. They have a team, very technology forward, but less so on the core ML side, and they'd be thrilled to just pay for a service that they can integrate with, leverage their existing data, and offer something pretty differentiated to their customer base. And I think that that is a need across the market and I would be surprised if the answer is that everyone needs to build their own bespoke offering on top of these APIs. I suspect that there will be an even easier way to be plugging in.

RD I was going to say, I think there already are some ML as a service. I know we talked to Warner Brothers Discovery, and they started off using a recommendation engine based on AWS. It was a full recommendation ML model. 

JN Yeah, that's true. On the recommendation side, I think the demand of the moment is more on this chat functionality, and I suspect that that will also become a horizontal layer. I think the biggest question that we get now is around IP. So that is the question of the moment. There are also lots of people working on IP solutions, but this is all a really gray area. So this is where I hear a lot of companies saying, “Yeah, this is super interesting, but I'm going to wait this out.” 

BP You have a waiver they can sign that makes you responsible? No, I'm just kidding.

JN We do take on some of the risk for sure. I wish we didn't, but I think the bigger issue is that nothing is known, right? There aren't answers right now. This is all pretty uncharted territory.

[music plays]

BP Alright, everybody. Thank you so much for listening to this episode. Tons of fascinating stuff to think about. I'm going to shout out a member of the community who came on and helped share some knowledge, saved some questions from the dustbin of history. Thanks to PM 2Ring, awarded a lifeboat badge on February 7th for an answer, “How to sort a list to form the largest possible number.” The largest. Helped over 10,000 people, so thank you. I am Ben Popper. I am the Director of Content here at Stack Overflow. You can always find me on Twitter @BenPopper. Email us with questions or suggestions, podcast@stackoverflow.com. And if you like the show, leave us a rating and a review. It really helps.

RD I'm Ryan Donovan. I edit the blog here at Stack Overflow. It's located at stackoverflow.blog. You can find me on Twitter @RThorDonovan. 

CW I'm Cassidy Williams. I'm CTO over at Contenda. You can find me @Cassidoo on most things. 

JN Awesome. I'm Jaclyn Rice Nelson. I'm the co-founder and CEO of Tribe AI. You can find me on LinkedIn and Twitter @JacSRice, and come check out our website: tribe.ai. 

BP All right. If you're in the AI/ML space and you want to sign up to do some great consulting work, you know where to head. All right everybody, thanks for listening and we will talk to you soon.

[outro music plays]