The Stack Overflow Podcast

Ryan Dahl explains why Deno had to evolve with version 2.0

Episode Summary

On today’s episode, we chat with Ryan Dahl, creator of Node.js and Deno. He explains why he feels the first version of Deno has reached certain limits and what he and his team are doing with Deno 2.0 to scale up the module system and ensure it's a great tool for the modern web.

Episode Notes

If you’ve never seen it, check out Ryan’s classic talk, 10 Things I Regret About Node.js, which gives a great overview of the reasons he felt compelled to create Deno.

You can learn more about Ryan on Wikipedia, his website, and his GitHub page.

To learn more about Deno 2.0, listen to Ryan talk about it here and check out the project’s GitHub page here.

Congrats to Hugo G, who earned a Great Answer Badge for his input on the following question: 

How can I declare and use Boolean variables in a shell script?

Episode Transcription

[intro music plays]

Ryan Donovan Welcome to the Stack Overflow Podcast, a place to talk all things software and technology. I'm Ryan Donovan, I edit the blog here at Stack Overflow, and I'm joined by a returning guest, Ryan Dahl, co-founder and CEO over at Deno, and we're talking about his own sequel– the impending release of Deno 2. So Ryan, welcome back to the podcast.

Ryan Dahl Thank you, Ryan. Glad to be here. 

R Donovan So I know Deno was sort of an improvement on Node, which you also created. What's the impetus for Deno 2?

R Dahl So maybe some people remember I gave a talk at some point– 10 Things I Regret About Node.js, and it was kind of a litany of grievances that I have against my own project, namely that it just hasn't really kept up with the times and evolved. We started Deno, written in Rust. I think one of the major changes is that it uses HTTP specifiers, or let's say URLs, for a module system. You did not import from NPM. You can host files anywhere on the web, kind of a decentralized module registry, and the idea was to vastly simplify what was going on in the Node ecosystem by just running JavaScript. And we've kind of taken this to the limit in some sense. We've built all sorts of tooling around this. Deno has been worked on for years now under this idea. It's a nice idea because it's really simple. Everything gets very simple when all you have is URLs for modules, and it works really great for small scripts. But what we found is that as you kind of scale up your projects, and this is going to be unsurprising to many people, it becomes difficult to manage a bunch of URLs kind of scattered around your code base, and so the impetus behind Deno 2 is kind of scaling up the module system. So what we've done a lot of is work on NPM compatibility: the ability to have a URL that points to an NPM module and be able to pull that in, and to be able to use built-in Node modules, because it turns out that a lot of the JavaScript ecosystem depends heavily on those. You might have an SDK for AWS that you depend on; that thing is published to NPM, it's using the Node built-in APIs, and it's pretty much a roadblock if you can't use that to some extent in Deno. So we spent a lot of time on NPM compatibility, but also just generally on the workflows around larger projects.
So Deno 2 will have workspace support– that is, monorepo support, the ability to have multiple packages inside of it– and also, although we've kind of walked back from this, HTTP specifiers as kind of the central module system. We are still fairly unhappy with the NPM ecosystem. So you can still import NPM modules and the compatibility is pretty great, but we've also introduced JSR, which is a very simple way to publish some JavaScript or TypeScript code and share it with people without a build setup and without too much rigmarole.
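The specifier styles described here can be sketched roughly like this– the module names and URLs below are illustrative assumptions on my part, not packages named in the episode:

```typescript
// Deno 1 style: a plain HTTPS URL as the module specifier, no registry required.
import { assertEquals } from "https://deno.land/std/assert/mod.ts";

// Deno 2 additions: npm: specifiers resolve packages from the NPM registry...
import express from "npm:express";

// ...and jsr: specifiers resolve packages published to JSR.
import { parse } from "jsr:@std/toml";
```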

R Donovan It seems like for a long time people have been trying to get the pieces of their software smaller and smaller. You’ve got service architectures, you’ve got Node, which is just doing little scripts here and there. Is Deno 2 sort of a move away from that? You talk about allowing monorepos and such. 

R Dahl No, definitely not. Deno 1 shines particularly well when you're doing some one-off scripts. You just have a single file and you want to, I don't know, move some stuff around, parse some data, send it over the network somewhere else. It's really awesome for that and we definitely don't want that to be complicated. We don't want you to have to set up a directory and a package.json and do a bunch of boilerplate just to get started. So it's very important to scale down code, which I think Deno is already good at and Deno 2 will continue to be good at, but the question is how do you scale up, because there are very large JavaScript projects out there. Any of these web frameworks that people are using constitute pretty serious JavaScript code bases, and how do you manage dependencies in that sort of world? Where are you pulling those dependencies from? How do they link to each other? How do you make sure that you don't have 10 copies of slightly different versions of the same thing inside your dependency graph? These are kind of the problems that we're trying to solve. Deno 2 is the culmination of a bunch of work, but I think it can roughly be summed up as: Deno is ready for production now. We've spent a long time working on this thing. It's pretty stable, it's working pretty well, and it's now ready for bigger projects.
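The "single file, no boilerplate" workflow described above might look like this– one self-contained script with no package.json or install step (the data is made up for illustration):

```typescript
// A one-off script: parse some data, reshape it, print a result.
const raw = '[{"city":"Oslo","temp":3},{"city":"Lima","temp":21}]';

const rows: { city: string; temp: number }[] = JSON.parse(raw);
const warmest = rows.reduce((a, b) => (b.temp > a.temp ? b : a));

console.log(warmest.city); // Lima
```

With Deno, a file like this is run directly (e.g. `deno run script.ts`); nothing else needs to be set up first.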

R Donovan So part of that stability and sort of enterprise-readiness, is there a security aspect to this? It sounds like previously there may be a chain of dependencies, and at some point, you're no longer controlling where those dependencies are. Is there an aspect of that trying to lock down which URIs are being called?

R Dahl So Deno has this secure-by-default behavior where we don't allow you to access the network or file system just arbitrarily. You kind of have to opt into this access. That is still maintained in Deno 2. And I should say that in Deno 2, all your Deno programs will continue to work. It's not like this is kind of a radical change. This is an additive change. So the security aspect, I would say, remains roughly equivalent. What we're working on now is kind of the stability of this. So we are going to be releasing a long-term support (LTS) version of Deno, signaling that the APIs in Deno and kind of the workflows of Deno are not going to change, and actually offering support for that going forward. So that's not really answering your question– it's not really a security thing, but kind of. We'll be backporting fixes to this LTS version, and the standard library for Deno is also reaching 1.0 in a sense, so we also will not be changing the standard library APIs going forward. In fact, we haven't been changing those APIs for quite a long time, but we're now kind of signaling in a fairly concrete way that Deno is really pretty stable at this point and not going to change going forward, whether that's the standard library or the command line interface or the built-in APIs in Deno. This thing is ready to be built on. And coming back around to your question, in a sense that adds security to your system. You don't need to pull in random third party things to do small tasks. If you need a TOML parser, for example, the standard library has that in there, and that's a stable API that's going to be supported essentially forever.

R Donovan Were there specific customer needs and incidents that led to this, or was this part of your 10 Things I Regret About Node.js talk just sort of cascading down across projects? 

R Dahl I think it's just the maturity of the software. When you start a software project, you can change everything, everything's kind of crazy. And Node has definitely gone through this cycle as well. In the early days, everything was changing all the time. And bit by bit, month by month, it starts kind of getting set in stone. And with Deno, that is important because not everybody cares about how different functions are named or kind of the aesthetics of command line flags and stuff. We're just kind of not changing things arbitrarily anymore, and that is important for people to be able to build on stuff. People complain about things that change too much. 

R Donovan Breaking changes always make people mad. Stack Overflow would thank you for any breaking changes. We get plenty of traffic for it, but your customers would prefer not to, I'm sure. You mentioned you're adding NPM support, is that right? 

R Dahl NPM support has been added. Maybe 18 months ago we started work on this, but NPM support is not a feature that you implement in a month or something. You have no idea how much complexity goes into being able to run arbitrary NPM packages. You just have to support intricate built-in Node APIs. And it's just really good now. There's always going to be a long tail of incompatibilities, and getting to 100% is essentially impossible and not necessarily the goal. It's hard to put a number on it, but we're at like 98% compatibility these days. Most packages you can run– most of the ones that you care about. You will probably still find things, and there are issues every day, but we've been plugging away at this for almost two years now.

R Donovan You mentioned running the arbitrary code and I know you all created JSR as sort of a way to verify what packages you're pulling in. What did you have to do to secure and enable NPM in a sort of less arbitrary and dangerous way? 

R Dahl So all NPM modules sit on top of built-in Node APIs. And in Node, if you access the file system through ‘require node:fs’ and you open a file, you get unconditional access to the file system. In Deno you can't do require; you have to use ECMAScript module syntax. You can do ‘import node:fs’ and you can get the same APIs, but when you do readFile(), Deno's secure-by-default permission system kicks in and is going to check if the invoker of the program gave you access to read from the file system. So I think the answer is that we implemented the Node APIs with the same Deno security model that Deno started out with, and so when you do ‘allow-net,’ you can make outbound HTTP requests, whether those are with fetch or with the Node built-in HTTP client. And because all of the NPM stuff is built on top of Node built-in APIs, those permissions apply just as well in Deno. 
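A minimal sketch of that `node:fs` point: the same built-in API works as it does in Node, but under Deno the calls below are gated by permission flags (the flag names follow Deno's documented `--allow-read`/`--allow-write` convention):

```typescript
// Node's built-in fs API, reachable from Deno through the node: compat layer.
// Under Deno, run with: deno run --allow-read --allow-write demo.ts
import { readFile, writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

const path = join(tmpdir(), "perms-demo.txt");
await writeFile(path, "hello");            // write access is checked under Deno
const text = await readFile(path, "utf8"); // read access is checked under Deno
console.log(text); // hello
```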

R Donovan I always am amazed at how often those holes open up on software. You give access to the file system and then you have all of it. You give access to the internet and then you have all of it. 

R Dahl It's a far from perfect system, because for many things that you need to do– you are doing server-side JavaScript, so kind of by nature you are interacting with the operating system. If you had a pure sandbox, executing JavaScript that can't make network requests or can't talk to the file system, that's useful for some things, but a lot of what people are doing is talking to the network and talking to the file system and talking to the operating system. So it's not perfect, and there are many cases where you need to give the allow-all permission and you kind of fall into the same boat that Node is in, where you kind of have arbitrary access. But it is really nice to be able to use the sandbox– the V8 JavaScript sandbox– that is secure, and for some applications, maybe you just have an API server that makes outbound connections to a database and receives requests and is not talking to the file system, and so you can bound that access there. You can do this at other layers as well. You can kind of do this in the kernel with Linux. You can do this with Docker containers, but it's pretty nice to just have this in user space, kind of just right on the Deno command line, the Deno binary. In particular, if you're on a Macintosh and you're just kind of hacking around, it's pretty nice to be able to gate access to the operating system, especially when you're pulling in third party dependencies that you really have no control over and you definitely are not going to be auditing that code. One of the things that we were pretty adamant about is post install scripts: when you add an NPM dependency to your package, that package can run what's called a post install script. So it'll download the source code from npmjs.com and it will optionally run a script that that package specifies, and that script can do anything, and this is pretty scary– and it's not only the package you added but any dependencies of the package you added. 
So the Deno philosophy is that unless you kind of explicitly grant access to a package to run this post install script that we are not going to run this. This also kind of creates some incompatibility. You might download a package that expects a post install script to be run and it's not run, but Deno will be like, “Hey, just FYI, we didn't run this script because we don't run arbitrary scripts from the Internet without your permission. So you need to first give us permission to do that and then we'll run it.” So there are kind of minor incompatibility things, and you have to add flags to your command line invocations, and in some ways it is a little bit more verbose for users to invoke Deno commands, but I think it's ultimately beneficial to be able to say which programs from the internet you are actually going to be able to access from your system. It’s just all too easy for something to just open up, I don't know, your Chrome cache and dive in there and grab some information and then open a socket connection out to some nefarious server and dump those details. And I think we've seen a number of exploits. 

R Donovan I think you talked about performance enhancements. I saw a video a while back, somebody talking about all these companies that, when they hit product-market fit and scale up, do a big rewrite of their software for 15 or 20 percent performance increases. Did you have to go that far or were you just tinkering?

R Dahl No, no. Definitely no rewrite– that would be insane, because we have a big code base that has high velocity– but we definitely spend time on performance, and I think performance is pretty great. We've been able to make use over the last year of what's called the V8 Fast Call API. There's always been this problem in Node, in JavaScript generally, of transitioning between JavaScript and native code. You're in JavaScript running your program and then you have to call out into this native function, and you have to cross this boundary, and there's always a cost associated with that, and that can often dominate the program. This V8 Fast Call system basically makes this boundary crossing into something as fast as a native C call. So you're really staying in JavaScript and there's no extra setup to jump into native code. You can just jump in there in just a couple of CPU instructions. This Fast Call API is a pretty big change in V8 and kind of requires the native APIs to be rewritten for this. We've done this work and slowly it's percolating throughout the Deno codebase, and if people are interested in Rust, they should definitely check out the Deno repo, because it is one of the largest Rust codebases, but it's also just very interesting how we do this binding layer between V8 and Rust and how optimal it actually can be at the end of the day.

R Donovan Speaking of Rust, Rust ends up being our most admired language in our annual survey almost every year for five or six years now. Were you a fan of Rust before and are you still a fan after working on it in a massive code base? 

R Dahl So the original prototype of Deno was written in Go in 2018, and once we decided that we were going to commit to this project and move beyond a demo, we quickly were like, “Go is not a great choice,” because Go has a garbage collector, and V8 has a garbage collector, and those things potentially are going to interact in unholy, unnice ways. It was pretty clear that we needed some native code, and the choice was essentially C++ or Rust. And at the time, Rust was pretty new and the choice to jump over and use Rust was pretty risky in my mind. I didn't know too much about it. Nobody I knew had used it. I know the creator of Rust, Graydon, so I sent some emails to him asking, “Is this a good choice? I don't know,” and basically just kind of jumped in with it. That turned out to be one of the best choices that we've made in Deno. It is fantastic. Rust scales well. In some ways, you can think of a system like Deno as kind of like a metropolis. It's like a city and there's roads connecting it to other systems, and building those roads is a lot of what we're doing. Let me not use the city analogy. The Linux distribution analogy is kind of also appropriate. In some ways, Deno and Node are two different distributions of V8, in the same way that Ubuntu is a distribution of the Linux kernel. A lot of what we're doing is we're not programming JavaScript itself– we are making connections, we're making APIs that allow JavaScript to call AWS, for example, to talk to some AWS API. And being able to use the Rust ecosystem and pull in infrastructure from crates.io is incredibly powerful and useful in ways that were just inconceivable when building Node. Any kind of third party dependency in Node has to be carefully, carefully integrated into the build system and thought out very deeply. You don't just go willy-nilly pulling in random dependencies into the Node code base. It's very slow. 
But at Deno, we can really reach for the stars and move very quickly because of Rust and the ecosystem there, and it all compiles down to native code and we have very deep control over what actually gets executed on the CPU. So suffice to say, I'm a big fan of Rust, and in terms of building a company, I think it's also been fantastic because there's a lot of people out there who like programming Rust and find it fun, and just generally makes building a project like Deno much more enjoyable versus say working in C++ which is just terrible, just terrible.

R Donovan We did a piece covering that The White House was like, “Don't use C, don't use C++. It's not memory safe. Use Rust or something else.”

R Dahl Those aspects of it are also interesting and good, but we still have segfaults all the time. You can do really wacky stuff with Rust. I'm glad that it's memory safe and it is kind of like bowling with the bumper rails on. You just can't make too many mistakes, and because you can't make too many mistakes, you can just go that much faster when designing the systems. 

R Donovan And you want to sort of broaden it, you're not just building out Deno here. Like we talked about, you have JSR, you have an edge hosting service. It seems like you're building out more than just the roads and the city, building a little country around Deno. What's the philosophy behind that and how big is your country going to get? 

R Dahl I think the philosophy is that JavaScript is more than just another programming language. It is kind of the default programming language. I think just so much of human infrastructure is built on the web that if you interact with your Bank of America account, you're going to go through a website, or if you reserve a book from the library, you're going to probably go through a website. There's just so much infrastructure now that has been built on the web that I think it's safe to say that it's not going away anytime soon. I'm pretty sure web browsers and HTTP and CSS and HTML are going to be here five years from now, if not 10 or 20 or 50 years from now. That was not always the case. It was not super clear 10 years ago that that was going to happen, but I'm pretty bullish on the web surviving at this point. JavaScript is part of what makes the web work. In the modern web, you will not be able to log into your bank account without JavaScript. So JavaScript, like CSS and HTML and HTTP, has kind of this unfair advantage in how it persists into the future, and that's unlike other technologies. It's very hard to make predictions about where technology is going to be 5 or 10 years from now, and so the company is really founded around the observation that JavaScript has some future and that this technology is pretty useful, and that's evidenced, in particular, outside of the web browser– evidenced by Node, which essentially runs on every website. Unless you're programming HTML directly, every website in some form or another is using Node at some layer, and it's inevitable that this is going to become the bedrock of programming infrastructure. And I think we're still just at the infancy of this and I think Node is not the end state of where server side JavaScript is. I think serverless also is not in its end state. Serverless often means JavaScript. What most people do with AWS Lambda is program JavaScript. They're running Node programs in some serverless capacity. 
I think there's a lot to improve on there, and I think Cloudflare Workers is one direction that improves on serverless infrastructure. We are interested in taking this to its natural conclusion and really helping JavaScript actually become the structured, secure bedrock on which we can build all sorts of systems. Now, obviously, JavaScript is not appropriate for every use case. If you're building a database, don't write it in JavaScript. If you're building an operating system, don't write that in JavaScript. Those are totally inappropriate use cases for JavaScript. And there's going to be all sorts of use cases that are inappropriate for JavaScript, like building Deno itself. If I wrote that in JavaScript, it'd be absolutely ridiculous. Most of what I program, because I work at a lower layer, is Rust. So that's not to say that JavaScript is the end all and be all of everything, but I think when it comes to a lot of what you might call ‘business logic,’ you're not necessarily aiming for the fastest possible thing or the smallest possible memory footprint. Often what you're optimizing for is time to working program– how fast can you program– because the most expensive thing is the engineering time there. And so when you talk about ‘business logic’ in websites or other services, most of those cases are best suited to a scripting language where you have garbage collection and you can just really move fast. Of the scripting languages– Perl, Ruby, Python, there's a number of them– they're all more or less the same. By the way, I consider TypeScript to be exactly JavaScript. It's just like JavaScript v2. It's JavaScript with some added syntax on it, so I include TypeScript in this, although often it's thought of as a separate language. 
But what we're trying to do is remove as much boilerplate as possible so that people can really build these structured, simple business logic-only programs in the default programming language as fast as possible with as few errors, and that touches all sorts of aspects. There's the actual execution of JavaScript– that's what we're doing with Deno. There's the distribution of libraries and modules. We don't think NPM is ideal for that. There's a lot of problems about that so we've built JSR to make this really simple and nice to distribute TypeScript and JavaScript. And then running stuff in production is also something that we're undertaking with Deno Deploy and subhosting as kind of commercial offerings.
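The "TypeScript is just JavaScript with some added syntax" point can be seen in a tiny example: erase the type annotations below and valid JavaScript remains.

```typescript
// Only the annotations (number[] and : number) are TypeScript-specific;
// the rest of the code is plain JavaScript.
function total(amounts: number[]): number {
  return amounts.reduce((sum, n) => sum + n, 0);
}

console.log(total([10, 20, 12])); // 42
```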

[music plays]

R Donovan Well, it's that time of the show again. We're going to shout out a question and answer combo from Stack Overflow, somebody who came on and earned a badge for their great answer. Today's Great Answer Badge went to Hugo G for dropping some knowledge on, “How can I declare and use Boolean variables in a shell script?” So if you're curious about that, there's a great answer down there. I am Ryan Donovan. I edit the blog here at Stack Overflow. You can find it at stackoverflow.blog. And if you liked what you heard today, please drop a rating and review. It really helps. 

R Dahl I'm Ryan Dahl. You can find me on Twitter @Deno_land or @Rough__sea, or tinyclouds.org is my website. 

R Donovan All right. Thank you very much everybody, and we'll see you next time.

[outro music plays]