This week we chat with Lyel Resner, a fellow at the MIT Civic Data Lab and Adjunct Professor of Civic Technology at NYU's ITP program. He explains the surge in tech startups looking to ground themselves in ethical practices, and argues that the extra work and expense of these considerations are outweighed by the eventual costs that come with ignoring ethics until a company is better established.
You can find more about Resner here.
Learn more about the topics we discussed by following some of Resner's suggested links below:
People to follow on Twitter: Safiya Noble, Ruha Benjamin, and Kamal Sinclair.
Ellen Pao and Project Include.
Eli Pariser and New Public by Civic Signals.
Lyel Resner If you're not meaningfully thinking about ethical and social risk, and you're just focused on, like, hyper-growth in the short term, there will be blind spots in your business that will eventually cost you time, money, and reputation.
[INTRO MUSIC]
Ben Popper Are you struggling to deploy cloud-native applications to a hybrid cloud? Do you want to become familiar with Kubernetes and Istio? IBM Cloud has a set of free, hands-on training, ebooks, and an always-on free tier of services to help you learn. Visit IBM.biz/StackOverflow to learn more. That's IBM.biz/StackOverflow.
Sara Chipps Well hello, everyone, and welcome to the Stack Overflow Podcast. I'm here with Paul today. Hey, Paul!
Paul Ford Hey Sara!
SC How's it going?
PF It's going fine in today's crazy mixed up world, I would say things are pretty good. How are you doing?
SC Good! How was your weekend?
PF Um, it was a good parenting weekend, I guess. [Sara laughs]
SC You do like success metrics or KPIs around parenting?
PF Um, not yelling. Not yelling is a KPI.
SC That's a good one.
PF For real, you're just like, ''Did I keep it together?''
SC That's a good one.
PF I'm at about 75% for this, I'd like to get it higher. 100 is impossible. But you know, if we could just get that up to like 85 I'd feel a lot better. And then also I give the KPIs to my wife, and she's a huge fan of that, she loves it. [Sara laughs]
SC Good to have SMART goals. Good to have SMART goals.
PF Oh, absolutely.
SC Well, today on the show we have a guest, Lyel Resner, here with us, who is a professor of Public Interest Tech at ITP and formerly the co-founder of Swayable. He's here to talk to us about ethics in tech. Welcome, Lyel.
LR Hi Sara, Hi, Paul.
PF Hey, so we should point out that ITP is part of NYU in case you like acronyms.
LR That's true.
SC Yeah!
PF So New York University, ITP. Actually, what does ITP stand for? Better get this one right.
LR It's funny. Lots of pressure. Interactive Telecommunications Program. It was started in the 70s by a pretty remarkable woman named Red Burns.
PF I mean, if people don't know this program, it's sort of one of the archetypal new media thinking programs in the world. Like, it got there so early. You've got MIT, Stanford, and ITP.
LR I'd agree. It's a really, really special place, this expressly interdisciplinary, collaborative place across technology, engineering, design, media, art, and increasingly ethics, social responsibility, civics, and public interest.
PF And they are--do they still do stupid hack days?
LR Haha. Is that a formal title? I haven't--
PF I don't know what it is. No, but there was a point where ITP was just doing really, like, dumb projects on purpose, and it was completely magical.
LR Well, I can say, and this is one of the wonderful things about it, but I guess also part of a tension that we're trying to balance with some of the work that we're doing. I think the ethos there is to build something greater than the sum of its seemingly inauspicious, unconnected parts, right? Whether that's like a burrito 3D printer or something else.
PF That's right. It's the stupid s*** no one needs and terrible ideas hackathon. That's it--I found it. They did a cheese 3D printer, right? Things like that.
LR I mean, speak for yourself about the need for a cheese 3D printer. [Sara & Paul laugh]
SC Yeah, I was gonna say, when we have new transplants in New York, I tell them they have to do two things. First is the New York Tech Meetup, and second is go to the ITP spring or fall shows. They're just amazing experiences of people who have built incredible things over their tenure at ITP, really cool technologies that you can interact with. It's really awesome.
LR Totally. 100%.
PF That sense of humor is so unusual in our industry, which is the least funny industry that has ever existed. Like, even carpeting is funnier. There are probably good jokes about deep pile versus shag. But then you get to tech and it's like, ''Oh, no, Perl sucks.'' That's our humor. And so ITP is just good stuff, good ridiculousness, and a broader conversation about tech.
LR I mean, in tech's defense, Perl elicits strong opinions. It's just polarizing.
PF One day we're gonna do a special episode of this podcast all about Perl. Perl 6.
LR Oh my god, you're probably gonna get hate email right now just joking about it.
PF That will just be me doing a monologue. Sara will be like, ''and--'' No, Sara, Paul needs to talk about Perl.
SC It's about Paul time. [Sara laughs]
LR I would just say about that, I think--
PF Oh, did we set that up for you? [Paul laughs]
LR Sorry. Well, you could if you like. So I think ITP has really benefited from this sort of playful, experimental, cross-disciplinary ethos. But like a lot of the rest of the technology community, there hasn't really been space for thinking about intention and consequence in what people build. One of the exciting things happening now, and there's some terrific faculty driving it, but I think one of the more exciting things is that we're really seeing it being driven by student interest and student demand. Our folks want those spaces, they want to be more thoughtful about what they create and how they create, and the sort of impact that it has on the world.
SC Yeah, that's really neat. I think this is a really great topic to discuss, and I'm glad that you're coming on. Something that's made me think about this even more recently: SpaceX, about a month ago, announced that they were developing the capability to deliver a missile anywhere in the world within an hour. My reaction was like, oh man, there are a lot of coders working on that. And their project was: let's get a missile to anywhere in the world that needs it within an hour. What a deep thing to think about as part of your work, and what a rough thing to work on and maybe even live with. So I think this stuff is really interesting. What got you interested in it?
LR I mean, just personally, I've always sort of been at the intersection of technology and public interest, or technology and social change, mostly doing education and economic mobility work. And for this subject specifically, and the work that we're doing now with this research project: education is my first love, and I'd like to find my way back there as an entrepreneur again, but after the election in 2016 I took a little bit of a detour and wound up starting this political tech company called Swayable, which I'm no longer affiliated with, I should caveat, so I'm speaking for myself and not for my co-founders. We wound up going through Y Combinator in early 2018. Basically, what Swayable does is measure how content changes people's minds, and we built it to support progressive advocacy organizations and political campaigns. From the outset, we were really cognizant that in the case that we were successful, we were building a giant repository of how content changes people's minds, which is a really frightening data set to bring into the world. And so we tried to be really thoughtful about both product and organizational design decisions that we could make to help ensure integrity of purpose. And really, at YC, those conversations were happening, and we kind of felt like it was just hard to find resources and examples of people who were trying to be super intentional about that. Thankfully, over the course of the last couple years, there have been these exploding conversations around ethical tech and responsible tech and rights-respecting tech and public interest tech. But I think they've felt a bit over-indexed on the technology itself, when, if you build a company, the technology that gets created is ultimately subordinate to the mission, the values, the business model, the culture, the performance incentives, the infrastructure that exists within the company that builds it. And so our current work is really just trying to elevate those conversations around organizational design, because entrepreneurs make tons and tons of little micro organizational design decisions every day. So it seemed like a useful thing. There's no central body of knowledge for that. There are a lot of people doing cool stuff, but there's no central body of knowledge.
SC What do you mean by organizational design? Like, if I'm being purposeful about my organizational design, what am I thinking about?
LR Yeah, great question. So it starts all the way up with incorporation, right? Like, are you a traditional for-profit, or do you want to incorporate as a public benefit corporation, which is a really new, exciting legal structure that actually changes founders' fiduciary responsibility from shareholders to stakeholders. It goes from who you fundraise from and under what terms, and maybe how you bake social impact or stakeholder value expectations in with your investors, to your governance model, to your ownership model, to your pricing model, to how you hire and think about DEI work, to how you think about risks in your business model. In all those categories of company building, there's just a ton of different things that you can do to try to make sure your company is ultimately world-positive.
PF So describe what you're building. Like, I'm going to go to, I guess, a website, and what's going to happen?
LR Yeah, good question. We don't know yet! We just got support for the data collection part of this. So right now, we just kicked off interviews. We're looking to do upwards of 100 interviews with diverse, influential founders, operators, and investors, to hear their stories about how they're approaching these issues, so that we can hopefully generate case studies, examples, frameworks, and tools that we'd like to make open source and freely available, and build a community of practice around. But we don't know what that looks like yet. We're kind of in the data collection phase, and it's really fun. We're about a dozen interviews in, and the conversations have been freakin' awesome. I think, increasingly, this generation of founders and operators are like, we don't want to build a unicorn only to discover that we're tearing apart some important fabric of civil society. They want something different. And there's a really terrific group that we admire called Zebras Unite.
SC I'm a huge fan.
PF Let's tell the people what it does.
LR They came up with this idea of an alternative to unicorns as an aspiration for founders. The idea is a zebra because, first of all, unicorns aren't real.
SC A unicorn being a billion-dollar company, right? A company with a billion-dollar valuation.
LR And zebras are just companies that are both profitable and sustainable and good for the world. It's a really terrific group that's also creating a community of practice around an alternate set of aspirations in tech. It's led by a couple of really amazing women, and they're just doing really cool, inspiring work.
PF So give us a few examples. Like you said, the conversations have been freaking awesome, and obviously there's some confidentiality and stuff, but what have been some things that have jumped out?
LR Yeah, so one, for example: we talked to one founder who's really early stage and used to be a product executive at Google. Even at her super early stage, while she's trying to find product-market fit and her first set of customers, she has invested a ton in what she called project-based hiring. The idea is that she's looking to hire folks with technical skills, mainly from the African continent, and so she's trying to bypass traditional signals of competency that a lot of folks in tech use as a proxy, which wind up giving us a less diverse talent pipeline than we could have. She's built out this super impressive project-based hiring framework that took her a ton of time. So she invested all this time up front to think about this, but she talked about how it's yielded so far, and the results are amazing. She has this really diverse team of folks who don't have any traditionally fancy credentials, and they're all crushing it. So that's one example. We talked to another founder who is also early stage, in the middle of her seed raise, but she and her team have been thinking a ton about the accessibility of their business model, getting into really nuanced debates about sliding price scales and democratization versus access. All sorts of super nuanced, interesting things that are, on the one hand, nuanced, but also really generalizable, and something that a lot of founders will ultimately have to confront as they're building their businesses and could learn a lot from if we tell their stories.
SC What I've observed, I don't know if you've seen the same, is that the people who are thinking about ethical tech really prioritize it. But the people who are not thinking about it just aren't thinking about it. It almost seems like they're not interested, right? They need to move too fast, there's so much to do, ''I can't be thinking about, you know--''
PF Hold on, hold on, because what they're actually thinking, right, is: by building the fastest and most effectively possible, I'm generating tons of opportunity. Money is a good thing. They're fans of the system, right? And so, as far as they can tell, lots of people at Uber think Uber is incredibly ethical. No, they do! It's like, ''Yes, we had to change some laws, but now the laws are more fair, and there are more people coming into the industry,'' and so on. So I hear it, but you have to be mindful that there are people who are just absolutely not going to agree that they're unethical. They're going to say, ''Well, that seems like a lot of friction on my capitalism, which I really like. I bought a big house, and I created thousands of jobs. And you're gonna come here and tell me that I have to do everything completely differently?'' Like, what? Anyway, Lyel, work that out for us! Fix capitalism for me, would you? [Paul laughs]
LR So, you know, a couple of responses. I think the ethical framing is sort of problematic, but it's regrettably become this umbrella catch-all for a lot of these conversations. There are a bunch of other distinct framings, and I think the distinctions are important, although I don't want to put people to sleep if we talk about--maybe I'll talk about one.
PF This is a podcast for programmers, it's okay. They're out there. We've talked about semicolons for 45 minutes.
LR Let's talk about Perl. Yeah. So I guess the most important thing to call out in these conversations around ethical tech, responsible tech, rights-respecting tech, civic tech, public interest tech, tech for good, humane tech: the main one is that the ethical tech framework is inadequate insofar as ethics vary from person to person and from group to group, and in an industry that already struggles considerably with representation, an ethics framework can compound certain issues. So at best it's insufficient, and at worst it's harmful. One of the frameworks that I think is more powerful is this idea of rights-respecting tech: the idea of introducing considerations for thinking about harm relative to human rights. And I am certainly not the first person to think about this. There have been a lot of scholars, in particular women of color, who have been on this problem for a long-ass time. Folks like Safiya Noble, who we're lucky enough to have on our advisory committee, who wrote Algorithms of Oppression and has been saying this stuff for a really long time. And Ruha Benjamin at Princeton, who wrote Race After Technology. The idea is that it goes beyond ethics. Ultimately, there's a difference between, say, contributing to the dopamine economy and causing deeper harms. One of the examples we draw on: Instagram removed the number of likes from a post to lessen the dopamine hit a little bit, right? And that's great, but it has nothing to do with your business model and the sorts of harms you might be perpetuating at your core. There's a difference between that and, like, predictive policing and facial recognition that disproportionately affect marginalized and Black and brown communities. There's a fundamental divide there, and I think it's important to acknowledge that in this conversation.
PF I think you're hitting on something that's really tricky in this conversation, which is that there is a relatively small community, relative to the larger global world of tech, that is very focused on this problem space, right? So where could you get the most change? Well, what everyone has to do is follow the guidelines of the major app stores and platforms. We don't have a choice. You can't have an app in the App Store unless you do what Apple tells you to do. The same is also true of government regulation. When I'm listening to you, I'm like, this is good, and those guidelines will be very good. But there's also a real need for regulatory and larger platform pressure, because the other 95% just won't pay any attention.
LR I think it's threefold. I think there's definitely a need for a regulatory strategy here. But maybe to go back to the framework of motivations, like why should people pay attention to this if they're convinced their company's already ethical: I think you are already seeing, and are increasingly going to see, companies bump up against an aggressive and, hopefully, smart incoming administration and regulatory regime. So addressing these issues head on can, I think, both save you costs in the long term and maybe buy you a little bit of social capital.
PF I think it's true. They've had four years to think about it, too. So yeah, yeah, it's gonna be a lot.
LR Yeah, for sure. I also think, you know, millennials and Gen Z in particular are more inclined to buy products and services and be part of communities that resonate with their values than any generation before, so there's a consumer pressure there. And then, Sara, you kind of alluded to this with the missile issue: talent, more than ever before. I'm seeing this as a professor at NYU. Tech workers are increasingly not willing to build stuff that they have reservations about, and from a talent attraction and retention standpoint, if you're not thinking about this stuff, I think it can also cost you. So even if you're just a hyper capitalist, I think there are real incentives for addressing these issues head on. And then, my personal point of view is, if you're not meaningfully thinking about ethical and social risk, and you're just focused on, like, hyper-growth in the short term, there will be blind spots in your business that will eventually cost you time, money, and reputation. And if you truly want to build a long-term company, then it is a business imperative to think about these issues up front. One way to think about it is almost as insurance. You are probably going to have to pay some costs up front, right? It takes more time; there's a little bit of friction introduced into the system up front. So you pay your premium each month, your ethics or social responsibility premium, to hopefully prevent a disaster much later on that could really throw off your business.
PF Okay. [Lyel laughs]
LR Are you not convinced Paul?
PF Well, I'm living it from a different perspective and scale. I see it also in a lot of the work that I deal with: scaling this is very hard. The framework that you're talking about, you know, I've been watching this world for a while, and it feels like it works well for a small bootstrapped startup, or cooperative models where people can really work together, but it has a lot of trouble scaling to larger organizations, because lots and lots of different ethical systems start to happen in the org. So there's an assumption with ethical tech that there is what I would almost call an academic model of ethics that everybody shares. And after you get past, let's throw a number out, like 25 people, five of those people are going to be like, ''What is this? I need to do my job.'' You can talk a lot about values, but humans just don't line up that way. So I find that part complicated. I think it's also just that VC and the other ways that money gets into the tech ecosystem are often pretty indifferent to this. They see growth as its own good. So that's a big challenge too. I mean, look, I think this is important work. I will go read every page on the website. I've just been in this industry so long, and the things that you say are things that a lot of people will push back on. I'm just seeing the Twitter accounts like, ''No, no, I'm the best person in the world. How dare you say that? There's another way of seeing this!''
LR Yeah, I hear you. And we hear a lot of the caveats, right? More from the early stage, like: we're already resource constrained, we're still trying to find product-market fit and focus on making payroll, it's too early, it's too hard to predict, what can I realistically do, I sell SaaS software so this doesn't apply to us. But I think there are sort of two mechanisms here. One is, if people want it to change, the only way it's gonna change is if we change it and are more intentional about it. And I think the upcoming generation is going to demand much more rigor and thoughtfulness. Even if it is a challenging problem as you scale a company, where things kind of splinter off, it's something that I think top talent, and especially diverse top talent, increasingly wants to see prioritized in some kind of way. Because even with those challenges, just like anything else, there are a lot of easy wins you can take right off the top that I think people are missing right now. And the second point is, it's funny, one of the other motivations for this is that I had a bunch of early conversations with folks and they're like, ''We're ethical, right? If there were a problem, like a righteous problem or something, we'd know.'' And I'm like, ''Okay, well, how do you think about that?'' ''Well, we have a couple partners, and we think about this, and we know.'' ''Okay, well, do you do any of the following? Have you ever engaged a third party to conduct an ethical review? Have you ever used a risk assessment framework like Ethical OS or consequence scanning or the Data Ethics Canvas? Does your team explicitly hold regular meetings to anticipate or manage harms that your product or business might cause? Do you have offsites about this? Do you have clear policies on how your product can or can't be used? Are you 50% women, BIPOC, or LGBTQ+?'' I mean, there are a bunch of concrete things. If you say you're ethical, okay, are you doing X, Y, Z? I think a lot of people do care, and they think it's a challenging issue, and they kind of do their best on it. One of the things that we want folks to realize is that there's a whole discipline around this that exists. There are a whole set of best practices and ways you can introduce rigor into your practice if you say that you care. And you can say that you don't care, but I personally think that's a business mistake, and I think there's a growing case for that. And if you do care, then these are relatively low-cost things that you can do to help make good on your word.
SC That's great. Um, can you tell us a little bit about your co-founder, the person that you're working with on the project?
LR Yeah, yes. Thank you for that, Sara!
SC I know they were going to join us today, but they were around what you said was a mess of children.
LR Right. So my brilliant partner, Dr. Wilneida Negrón, does some work at Coworker and has done strategic tech work before, and brings a super interesting background around the intersection of tech, social justice, and movement building. I was hoping she'd be able to join us today, but she does a million things, including being a mom, and earlier, just before the call, I think the words she used were that she had a ''mess of children'' at her apartment, because the schools were closed in New York City. And I think we agreed that, quote, ''mess of children'' should maybe be the new formal way to describe a group of children who largely aren't your own.
PF That's anything above one really, but yes. [Sara laughs]
LR But yeah, Wilneida is incredible. And it's been really fun to do this project with her.
SC That's great. So what would you say to folks who are listening and are like, ''Oh, wow, I wonder if I work for an ethical company?'' How would you advise folks to evaluate whether the work they're currently doing is ethical, or whether the company they're currently working at is ethical?
LR Great question. I think, twofold. There really are a growing number of tools and communities that care about this stuff. Probably the Pandora's box for looking into it is the Ethical OS framework, which was built by the Institute for the Future and supported by the beneficial tech group at Omidyar. They did really terrific work that gives you a sense of, if you say you care about this, just how much work it really can be. They identify eight risk zones that your product or business model can touch on: things like truth, disinformation, and propaganda; economic inequality; algorithmic bias; surveillance state; data control and monetization; harmful actors; something like that. So that would be one area. And there are a whole bunch of amazing people that folks should probably follow on Twitter, like Safiya Noble, Ruha Benjamin, Kamal Sinclair, Ellen Pao and Project Include, and Eli Pariser and Civic Signals.
PF Project Include is so good if you're running an org. It's very, very thoughtful, it's very easy to read through, I would recommend it to anybody. And Ethical OS is very cool; the risk zones are good mental models, they're very helpful. We'll put the links to all of these in the show notes, so people listening can step away from their keyboards and just click instead.
LR Cool. And then people should sign up for our email list. We have something like 80 interviews in our pipeline, but any founders or operators or investors who feel like they want to learn more about this stuff, or who are really proud of something they've done in their company that they think will help ensure it ultimately has a positive impact on society: we'd love to talk with them and tell their story and help amplify it so that other founders can potentially do the same.
[MUSIC]
SC Great. Well, I'm Sara Chipps. I'm the Director of Community here at Stack Overflow. You can find me on GitHub at @SaraJo, and check out Jewelbots this holiday season if you're looking for gifts for the kids in your family.
PF Lyel, go ahead, tell us how to reach you.
LR My name is Lyel Resner, Professor of Public Interest Technology at NYU and a co-author of the Startup Society Initiative with Wilneida Negrón. You can find me online at lyelresner.com or on Twitter at @LyelR.
PF And I'm Paul Ford, friend of Stack Overflow. Check out my company, Postlight, and get yourself some Jewelbots this holiday season. They're bots and they're jewels! They're shiny, they light up, they have embedded hardware, you're gonna love them. Jewelbots! [Sara laughs]
LR Wow, that was amazing.
PF Alright. We don't have any lifeboats this week because one of Ben's children, his mess of children, spilled coffee... or not coffee. Why would a child be drinking coffee?
SC You don't know, you don't know their lives.
PF Spilled bourbon all over his laptop. So hopefully we'll see him again, or otherwise he's cut off from society and we'll never see him again.
SC Yeah, just a farmer.
PF We'll find out.
[OUTRO MUSIC]