The Stack Overflow Podcast

Building for AR with Niantic Labs augmented reality SDK

Episode Summary

We chat with Richard Bailey and Kelly Cho. Bailey leads the AR Experiences team at Niantic and Cho is a software engineer on the AR Experiences team.

Episode Notes

You can learn more about Lightship, Niantic's AR SDK, here. They are hiring developers, and openings can be found here.

Richard can be found on LinkedIn here.

Kelly can be found on LinkedIn here.

A big thanks to our lifeboat badge winner of the week, Karim, for answering the question: Check if value exists in Array object Javascript or Angular?

Episode Transcription

Richard Bailey The future of augmented reality, of course, moves away from using your hands to hold up a portal and just puts a portal directly on our eyes. And then we can look out at the world, and that has the advantage that now we're hands-free, and we can use our hands to interact with the world. Whereas right now in AR, we either interact through touching the screen, or we hold up the screen and reach out and track one-handed activity. So if we can free up our hands, of course, we're happier.

[intro music]

Ben Popper Hello, welcome to the Stack Overflow Podcast everybody, a place to talk about software and technology. I am Ben Popper, Director of Content joined by my friend, Paul Ford of Postlight. Hi, Paul.

Paul Ford Oh, my goodness, are we doing this again? That's great. Good for us. Good for us.

BP Paul, we were on a podcast recently with the great Jen Schiffer, and she was mentioning there's a lot of activity around WebXR. And she thinks that a lot of this is coming from people's increasing interest in augmented and virtual reality. And you and I were kind of saying, like, we're not seeing it, like, not happening. She brought up an interesting example, which is trying on some lipstick through the browser, just, like, seeing the color. So we have two great guests with us today, hopefully, to talk about, yeah, what's happening with AR programming, what that's like, and how people can get involved. We have Kelly Cho and Richard Bailey from Niantic Labs. Welcome.

PF Yay! Welcome!

Kelly Cho Hello!

RB Thank you very much. 

PF Wait, first of all, what is XR? What's the X? I'm guessing the R is reality.

RB The R is reality. The X went through a lot of iterations and a lot of explanation, so it's going to change based on who you ask. At one point it started as a star, an asterisk, as a wildcard. But then it became X as in extended, and so on and so forth. So XR we generally speak of as the umbrella of technologies that includes AR, VR, and MR. Part of the motivation was to get away from exclusively talking about the technology and focus a little bit more in our articles on the human side of the equation. A few years ago, every other sentence of every article was AR, VR, MR, AR, VR, MR, and it was largely distracting from, well, humans have to consume these experiences and enjoy them, and so on and so forth.

BP And so Kelly, for folks who don't know what is Niantic Labs, and what is it that you do there?

KC So I am an engineer on the ARDK Unity team, ARDK being part of Lightship, which is Niantic's platform for, you know, AR development. And the Unity team specifically is sort of the bridge between the research teams at Niantic and our ARDK customers. So I work mainly on the API, and on making sure that, you know, all the cool things that the research team does are actually usable in a production environment.

BP So yeah, for people who don't know, what are some of the things that Niantic makes in house? And then, to the degree that you can talk about clients, how do third parties use some of what you've built to create their own experiences?

RB So we currently have 10 games in production. We've seen announcements for a Pikmin-oriented game, Heavy Metal, Catan. We've also seen announcements for a reference platform that we're working on with Qualcomm for a headset device. And then, of course, the most recent announcement around Lightship ARDK, the development kit. Of course, the most popular game is Pokemon Go.

BP Always and forever. [Ben laughs]

PF If I wanted to use these technologies, where would I start? What would I do? Not as a gamer, but as a programmer.

KC The very first step would be to register to be part of the beta program for Niantic Lightship.

PF I'm typing it in right now. I'm doing this as we go along. Okay, Niantic.dev, I love a good .dev.

KC Yes. And then once you're approved to be part of that beta, you'll receive account information and you can download ARDK, as well as the examples, documentation, and so on, ARDK being the Lightship augmented reality development kit. And then, yeah, we have a getting started project. Obviously, I'm a little bit biased because I work with ARDK every day, but I'm confident that anyone who has even just a slight level of experience in Unity can get up and running with ARDK within an afternoon.

PF Give me a sense of the sort of concepts and skills. Obviously, it's good if I understand a little bit about programming. Should I know about, like, mapping? Should I know about geography? Should I know about latitude and longitude? Or is it more pure geometry? Like, what are the core concepts of this world that people should be interested in?

KC Well, in terms of what people might assume they need to know in order to build an AR app with ARDK, it's really three things, right? One, Unity. Two, coding, specifically C# coding, because that's what Unity uses. And then three is, yeah, like what you mentioned, some computer vision background and knowledge of what exactly mapping is and how AR works. Of all those things, I think you just need a basic understanding of Unity and, you know, an interest in building AR apps. We've made a substantial effort in ARDK to provide something that not only provides access to the exciting new technology that Niantic is working on, but also, you know, a high-level layer that makes using and accessing that tech really quick and simple.

BP And so are there certain devices and clusters of, you know, sensors that I need in order for AR to work? Like, when I go to the website, I see somebody rolling dice, you know, in a 3D environment, and they're bouncing off the world around them. You know, you can add clouds or weather to something, you can do mapping of, you know, your home and see what it'd be like to add extra furniture. So when you're using this, do people need to know, okay, I need a mobile device that has to have a camera and an IMU and various things? Or can this work almost like in a dev environment on my home PC, and then later I can, like, push that out to devices that have certain components?

KC So in order to run an actual AR experience, you do need an AR-capable device. And currently, that's limited to AR-capable mobile devices, headsets being in the future, potentially. But also, you know, it doesn't have to be the fanciest or latest mobile device. You know, the great thing about the depth and meshing technology that Niantic has developed is that it can function off a camera image. It doesn't need, you know, fancy LIDAR sensors or anything like that.

PF What is meshing?

RB So meshing is the process by which we detect edges and features and corners out in the world, and then line them up from multiple camera angles to figure out what their 3D position is. Then we create a whole bunch of little triangles that represent the approximate shape of the world that we see.

PF I'm holding my camera up, and I'm turning the world into polygons. And then I can put things on top of those polygons.

RB Right, we can get physics collisions and things bouncing off of them. And we can also do special lighting effects based on the meshes, etc.
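
For readers following along in Unity, here's a minimal sketch of that idea: wrapping reconstructed triangles in a Unity Mesh with a MeshCollider so virtual objects can collide with the real world. The vertex and triangle arrays are illustrative stand-ins; in practice a meshing system like Lightship's would supply that data, through its own API.

```csharp
using UnityEngine;

// Minimal sketch: take reconstructed geometry (vertices plus triangle
// indices, as a meshing pipeline might produce) and make it both
// visible and collidable in Unity. The input arrays are placeholders,
// not Lightship's actual API.
public class WorldMeshCollider : MonoBehaviour
{
    public void ApplyMesh(Vector3[] vertices, int[] triangles)
    {
        var mesh = new Mesh();
        mesh.vertices = vertices;
        mesh.triangles = triangles;   // every 3 indices form one triangle
        mesh.RecalculateNormals();    // needed for lighting effects

        // Render the mesh, and let rigidbodies bounce off it.
        gameObject.AddComponent<MeshFilter>().mesh = mesh;
        gameObject.AddComponent<MeshCollider>().sharedMesh = mesh;
    }
}
```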

PF Alright, so when I am holding my phone up and looking, that's what I'm doing. The computer is meshing, and then I'm able to sort of manipulate reality that way. What applications, I mean, we know Pokemon Go, like, we know the games. What other applications are you seeing that are exciting for AR?

RB We've seen Amazon starting to put augmented products in their app, so you can see furniture placed in your house while you're there. And IKEA has been doing the same thing. Of course, we're seeing lots of tape measure apps. And we're trying to enable a whole bunch of scenarios by making the technology a lot more accessible to the typical developer.

PF Where do you see this finding a niche? I mean, fast forward five years. Who should be learning this in our cohort of developers? I mean, besides everyone, right? Like, I know there's that answer. [Richard laughs] Yeah, well, I'm a developer, I know a little JavaScript, I do some web stuff. You know, and suddenly the boss comes to me and says, what do you think about AR? Like, obviously, I can go get the SDK, but what applications do you think are exciting outside of games?

RB Yeah, absolutely. I think they'll fall into two categories. The category we've seen a lot of is the gimmicks: you do it just because it's cool and cute and interesting. And then there's the category where it has an actual utility or purpose, where it gives you an advantage. That second category is going to be places where the content that you're working with will be distinctly better if you can spatialize it into your physical world. And that's why we're seeing it pop up a lot with furniture, and so on and so forth, because that's an obvious play: furniture, carpet, paint are easy, clear, and obvious today. But then there's any other application where you need unique data in a particular location, you know, rather than off at a computer terminal; you need it at the place where you're looking at it. I mean, take the temperature of a swimming pool: do you want to go back into the pump room to know the temperature of the swimming pool, or do you just want to look at the swimming pool and see the temperature there, and so on and so forth? Data will increasingly move away from these devices and be available wherever it's most appropriate to have the data and content and experiences in the world.

PF Are you thinking about interfaces? I mean, obviously Niantic has a very deep relationship with Google, so, like Google Glass. Does that factor a lot into your thinking, kind of augmented reality at an eyeglass level? Or are you mostly focused on mobile?

RB I mean, mobile is the one that's very accessible today. And Niantic's history is built around building real things that real people can use, right? That's why we have millions of players. And so that's, of course, our primary focus, because that's where we can ship games today, and we can enable developers today. But the future of augmented reality, of course, moves away from using your hands to hold up a portal and just puts a portal directly on our eyes. And then we can look out at the world, and that has the advantage that now we're hands-free, and we can use our hands to interact with the world. Whereas right now in AR, we either interact through touching the screen, or we hold up the screen and reach out and track one-handed. So if we can free up our hands, of course, we're happier.

BP Would that be an advantage in terms of, yeah, sort of the fidelity of it? I know, you know, with VR, I've read about this: having it on somebody's face and in front of their eyes, and kind of aligned with the way they see the world. Would that be interesting in terms of how you can create experiences?

RB Yeah, it becomes, in some ways, much easier, I think, for the user and for the developer. Because when you have a screen up in front of your face, it acts as a keyhole. Any interaction that you do through the spatial world, you're doing through that tiny keyhole that's out here at a distance. So your degrees of interaction are now very small, until you move the keyhole around like a flashlight, right, and try to find the next thing to interact with. Whereas up here, it's like you're putting your eye right up to the keyhole. Now you have a huge range of viewing, and now your arms can move in front of you and interact with things in that space very freely.

BP And so I guess, Kelly, before, you know, you got into AR, were you working in software in another field? Like, is there a field that tends to lead people here, whether they're in game development or doing Unity? What's a pathway that people often end up on when they find themselves doing AR stuff?

KC I was actually working at a VR startup, doing tool development at a VR animation studio. And that was my first job out of university, so I've always sort of been in the AR/VR space. Yeah, if anyone's interested in becoming an AR or VR engineer, the field is early. It's still in its nascent stages, enough that you don't need to be someone with 10 years of experience in AR/VR/XR, because pretty much no one has 10 years of experience in AR/VR/XR. I think if you're excited about the field, if you have an interest in it, and you're open to learning, you know, maybe you like prototyping or game development, then there are so many opportunities to jump into the field.

BP I think, yeah, you know, one of the things that interested us in this was the idea that it's kind of growing in the background and we're not seeing it. I had come across an article while researching for this podcast, from Harvard Business Review, saying that the pandemic had really been an accelerant for AR experiences, for people to try before they buy and try things on from home. You were mentioning, I guess, yeah, that typically you want this to be for a mobile device, and then, you know, the headset is where we want to get to. But a lot of people, it sounds like, you know, with Jen Schiffer's try-before-you-buy example, are doing it with a webcam on their computer. So are there particular challenges that you face there? And is that something that you think through? Like, how do you make that a good experience, even though you're using a device, and maybe a setup, that isn't ideal?

RB Definitely, working through the mobile phone creates a challenge. My more recent experiences have been with headsets, and working through the phone has the challenge that you're kind of interacting through this abstraction: you're touching the screen in order to interact with something in the spatial domain. And then, you know, really it just takes iteration to try things out. Think about what the physical-world action that you're trying to perform is, and then work through what's the abstraction on the phone, the touch and movement combinations that you want the user to do, to try to achieve that physical interaction. So if you want to have the interaction of swinging a baseball bat, you know, how, through touch and phone movement, might you create that interaction while the user can still look at the screen and see what they're interacting with?
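
A common starting point for that touch abstraction in Unity is raycasting from the touch point into the meshed world. Here's a rough sketch of that pattern; it's plain Unity code, not anything Lightship-specific.

```csharp
using UnityEngine;

// Sketch of the touch-to-spatial-world abstraction Richard describes:
// convert a 2D screen touch into a 3D ray and see where it lands on
// the meshed environment. Generic Unity code, not ARDK's API.
public class TouchInteraction : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Project the screen-space touch into the 3D scene.
        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            // hit.point is where the touch "landed" on the world mesh;
            // from here you could spawn an object, grab it, or start a swing.
            Debug.Log($"Touched world at {hit.point}");
        }
    }
}
```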

PF How in the world does one develop good 3D object creation skills? Every time I've opened up Blender, I give up 15 minutes later. As I'm looking through the demos and I look at this world, I mean, the ability to make a little guy with funny ears who's like, "Hi, welcome!" is critical to this world, right? You have to be able to do it. That, to me, feels like a big hurdle to get over: to actually create an environment and objects that people will engage with.

KC At Niantic, we are very lucky to work with a team of very talented artists and technical artists who handle that side of it for us.

RB I've seen some of the art of 3D artists, and I actually stood over their shoulder and watched for over an hour, just because the whole process was fascinating to me. It's an amazing skill.

PF Alright, so that's very comforting. So now we know that, you know, there's at least two jobs in the industry. That's good, right?

BP Paul, they're not hiring for full stack XR experience.

PF Not yet, Ben, you know that's coming, you know that's coming. We'd like the full stack: you need to be able to make a movie, and you need to be able to program the interactions with AR. I mean, really, when I think about this, what I think about is, I believe that we sort of do need to head towards a culture of greater maintainability. Because currently we have a very disposable one, you know, where people buy new cars and they buy new things all the time. And I always think about this technology as a wonderful layer to learn how to fix and repair things. And I'm wondering if that's just a fantasy I have, or do you think there are things happening along those lines? Like, you know, is this a tool for fixing your own car engine, or what kind of layers exist like that?

RB There's a couple aspects of that that I see as being very realistic. One, we've seen a transition in the past few years away from owner's manuals. They're essentially not being used anymore, like the getting started guide, right? Everything's transitioned over to YouTube. It's like, oh, forget the owner's manual, just jump on YouTube and search for it, and watch some two-minute video on how to set the thing up. And then likewise, with the augmentation technology, you now have the ability to have content where you look at the device and the step-by-step instructions show up floating around the device. And this is hugely beneficial, because, like, we've all assembled some piece of IKEA furniture and found some error in the diagram, right, in some translation—

PF You put the wrong leg on at the wrong time. And then you have to—

RB Or the diagram can also be wrong, right? Because it's been translated from 3D to 2D, and then, as interpreted by you or us, you look at the 2D and you try to translate it back to 3D, and you've got two layers of translation. With augmentation, right, you can look at it in 3D, and you can see augmentations showing exactly what direction the pin is going. And so you can pick it up, move it around, and say, oh, exactly these holes. And you hold up a screw and it'll label the screw for you and say, oh, this is the screw that you're looking at.

PF I just want to point out that Richard said IKEA can be wrong. So you know when the IKEA police [Richard laughs] hear this podcast.

RB I'm a world-class IKEA fanboy, so I can definitely say they can make mistakes.

BP There's an app that I really love. It's called Seek, and it lets you identify, like, plants and animals. But it uses, you know, computer vision, and, you know, people submit their pictures, so often you have to, like, rotate around the mushroom or the bug, you know, give it a bit more sort of depth and angles for it to get it. And I was thinking about this the other day, because I recently moved from the city to the country, and now I have to repair all this machinery and I don't have a clue how to do it. But what you were just saying makes a lot of sense: if I could take the hood off the lawnmower and look at it, and it was showing me what to do. But even more, especially like you're saying with IKEA, what not to do, you know, like Operation or something. Like, don't stick this here, you're going to break it and you won't be able to go back. That's what I could really use.

PF That's, you know, disassembly, right? Like, 'cause every piece of IKEA furniture survives maybe the first move, and by the second move you're back to buying a whole bunch of other articles. Alright, so this is good. I like this. We're gonna take on IKEA with augmented reality. What other industries can we disrupt? Come on, let's get out there. Let's mess with things.

RB No, I was just gonna say, there's one where we have a fairly disposable industry around kind of decorating and refreshing decorations in our homes and workplaces, and so on and so forth. And so in this kind of environment, when we eventually have the eyewear on, you can decorate your home with augmented things, and they don't have to be static anymore. And they also don't have to be expensive, right? If you put a big screen on the wall, you know, that's hundreds of dollars, or thousands of dollars, that you're spending. But if you have glasses on, you can put as many screens on as many walls as you want to.

BP Yeah, what is the Venn diagram of the person who goes all out with the Christmas lights and yard decorations, really cares about the environment, and is gonna AR-program their own, you know, Christmas decorations for when everybody comes over with the glasses? I like this.

PF I don't know, I might be in that Venn diagram. I mean, to me, I think this is interesting, because what you're saying is that, like, yeah, there's a vast amount of kind of energy consumption and production that happens to decorate our homes, or to power them, or to give us images through our big screens. And instead, that would be a localized experience, something that kind of comes out of the CPU in your pocket or even on your eyeglasses. And the world would be a little bit ad hoc, but also, maybe, you know, you come into my house, you're going to see what I want on the walls versus, you know, what you want.

BP I can finally put up my NFTs. And it'll say underneath how much ETH I spent on them, to justify all of that.

PF Oof. Oof.

RB You get kind of democratized, on-demand pixels, right? We're all paying for the pixels that we're each rendering, but you're only rendering the pixels that you happen to be looking at. So with the TV, as you're turning your head, you still pay for that TV; the pixels are paid for there on the wall, but you're not looking at the wall. With glasses, the pixels are there as soon as you look, and when you look away, the pixels aren't there anymore.

PF Interesting. So we may have the experience of lots of people sitting somewhere staring at a blank wall laughing.

RB I've already had that experience a few times. [Richard laughs] It's strange when you walk in. And I think that you've touched on one of the kind of fundamental differences between AR and VR, in that when you walk into a room and some people are doing AR together, they're still together, and they're doing AR, right? And when you walk into the room, they see you walk into the room. If they're doing VR and you walk in the room, they don't know you've walked into the room, right? And so now, to some degree, psychologically, they're vulnerable, because people can walk in and out of the room while they've got this VR on, and they can't see the room around them, and they don't really know what's going on in the room anymore. So AR can become a lot more approachable, because you're not potentially blindfolded in the room. You still have spatial awareness of what's in the room around you.

PF Interesting. So just that totally immersive aspect of VR: it does pull you out of the world. Even if you're in a shared sort of space with others, it's not the same as something being projected onto the actual space that you're in. Now that we've identified the use case, which is: I would like to watch TV on the wall, even through my mobile phone. I want us all to have a shared TV, so a video box needs to be projected on one wall in my home, and I could name that wall. So I've come to you with this use case, Niantic team, and I want you to help me actually program this. So I'll go get the SDK. Now what do I need to do?

KC This is actually pretty simple.

PF Great!

KC Yeah, I don't know if you've ever tried the multiplayer buddy feature in Pokemon Go?

PF My children have. Yes. Yes, yes, I do know what that is. Okay. So you're kind of all looking at the same Bulbasaur?

KC Yes. So because, you know, in this situation all of you are in the same environment, in the same area, the first step would be to, you know, join an AR networking session, an AR multiplayer session, all together.

PF So it sort of knows by lat-long where we are, GPS, and it's sort of like, hey, they're there.

KC Right now, the AR multiplayer doesn't use GPS. You join together using a QR code or a session identifier.

PF Oh, I see. Okay, so we tell it we're all together.

KC Yeah. And then once you're all in the same session, you will scan your environments to, you know, pick up feature points. And then ARDK will collect the data from all of your devices and locate your device relative to your friends' devices. So now everyone has, you know, the same frame of reference, and you can place your virtual TV at, you know, coordinate XYZ in your shared world, and everyone will see it in the same place.
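
As a rough illustration of that flow, here's what those steps might look like in Unity-style C#. The session interface below (IMultiplayerSession, Join, OnLocalized) is a hypothetical stand-in for the general pattern, not ARDK's actual API.

```csharp
using System;
using UnityEngine;

// Hypothetical session interface, defined here only so the sketch is
// self-contained; the real AR platform would provide its own types.
public interface IMultiplayerSession
{
    void Join(string sessionId);  // e.g. an ID decoded from a QR code
    event Action OnLocalized;     // fires once peers share a frame of reference
}

public class SharedTvPlacement : MonoBehaviour
{
    public GameObject tvPrefab;
    public IMultiplayerSession session; // supplied by the AR platform

    void Start()
    {
        // Step 1: everyone joins the same session via QR code or shared ID.
        session.Join("living-room-1234");

        // Step 2: users scan the room; the platform matches feature points
        // across devices to localize everyone into one coordinate frame.
        session.OnLocalized += PlaceSharedTv;
    }

    void PlaceSharedTv()
    {
        // Step 3: the same coordinates now mean the same physical spot
        // for every participant, so one shared Vector3 is enough.
        Instantiate(tvPrefab, new Vector3(0f, 1.5f, 2f), Quaternion.identity);
    }
}
```

The key idea is the last step: once the devices are localized into one frame of reference, a single shared coordinate is enough for everyone to see the TV in the same physical spot.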

PF Interesting. So it's an ad hoc network of things, sort of knowing where other things are, as opposed to some global positioning system. 

KC Yes.

PF We're all projecting at the same wall, and the computer says, yeah, I'm gonna put that rectangle there. That's cool. We all see the rectangle at slightly different angles as a result.

KC Yeah, yeah. 

PF Interesting. I would have thought that, you know, this was more of an eye-in-the-sky kind of thing, and it was like, okay, you are, you know, one arc millisecond away from this person. And, you know, that shows you how little I know about anything, including geo.

RB That has to do with the technology roadmap, the kinds of things we want to enable. The first layer of things you want to enable is basically enabling these developers to create the specific experiences they want, while we give them complete control around things like privacy, and so on and so forth. So right now, in this first-layer experience, you have total control over who's going to join you in your experience. We have Pokemon Go, and we have these Go Fest events, and then we have other games building similar concepts. And so where we are heading is, you know, eventually thousands of people in a park, playing Pokemon, you know, together, so on and so forth. So that is a progression that we're enabling.

PF In that case, they would sort of register with the system rather than kind of with each other. 

RB Right. The users would be opting into that experience.

PF Where does my data go? Is it in your world? Is it still hosted by Niantic? Am I building my own cloud service with my own 3D assets, you know, on AWS? Like, how do you platform this?

RB What we enable is going to be some of both. Basically, we're going to put options on the table. Not all game development studios have cloud engineers that are going to be able to build one of these things from the ground up. So we're looking at enabling a combination of these things on our roadmap, so that smaller studios can create these multiplayer AR experiences without having the expertise in cloud synchronization, real-time data, so on and so forth.

BP Is this also a platform business for you? Like, you know, people can check it out, but if somebody is building an experience and they get to a certain scale, then, you know, Niantic would be like a cloud service provider of sorts?

RB In some cases, we are already, for the first-party games that we've already shipped. Of course, they all have cloud backends. So you can imagine a unified roadmap, you know, in the future. Right now we have Lightship, and we're focused on Lightship ARDK. Lightship will have other pieces that release over time as well. And so we can imagine futures where the pieces most valuable to every game company start to become enabled and available.

PF Let me ask the most important question from the enterprise software point of view. What are my analytics? What kind of dashboard can I get if I build one of these experiences? Or what are the data points that people are most interested in?

RB From the gaming side, I haven't seen the data on that yet, beyond maybe the reliability statistics. There are obvious things around, if you have different points of interest in your game that are activated, whether they be gems or portals or so on, so forth, you might want to know how busy each of those are for your games. There are other avenues as we kind of start to enable, like, advertising and promoting a business through your game. You want the analytics about that, of course, because that might be how you get paid, for instance. So there's plenty of data around that as well.

PF It's an emerging space, analytics dashboards for XR experiences. There you go. Everyone just told you. I mean, in a funny way, Pokemon Go is a dashboard, where you hold it up and you see the Pokemon that you've captured. Alright, great. I have no further questions.

[music]

BP Alright, everybody. Thanks for listening. It is that time of the episode: I'm gonna shout out the winner of a lifeboat badge. Today it goes to Karim, for answering the question, "Check if value exists in Array object Javascript or Angular?" So thank you to Karim. I am Ben Popper, Director of Content here at Stack Overflow. You can always find me on Twitter @BenPopper, and you can always email us at podcast@stackoverflow.com. Paul, who are you? Where can people find you?

PF Thank you so much. I am Paul Ford, co-founder of Postlight, a growing software development and digital strategy firm. Check us out at postlight.com, apply to work there. It's all great.

BP Kelly?

KC I am Kelly from the ARDK Unity team at Niantic. If you are interested in developing an AR app, check out our Lightship platform. And also if you're interested in working at Niantic, we are hiring.

RB So I'm Richard Bailey. I'm a Senior Director for AR Experiences at Niantic. We're definitely interested in hiring great talent, and also in getting game developers to come try out our Lightship ARDK platform at Niantic.dev. So come sign up. Thank you very much.

[outro music]