The Stack Overflow Podcast

After 30 years, Java is still brewing up new features

Episode Summary

It’s Java’s 30th anniversary! Ryan welcomes back Georges Saab, Senior VP of Development for the Java Platform Group and Chair of the OpenJDK Governing Board, to reflect on how Java has changed over the last five years, the shift from feature-based to time-based releases, and the advancements to Java’s rich ecosystem for AI workloads and beyond. Plus, where Georges envisions Java being five years from now.

Episode Notes

Connect with Georges on LinkedIn and see his work on inside.java.

Listen to our previous episode with Georges, a celebration of Java’s 25th anniversary. 

Today we’re shouting out the age-old question “What is a NullPointerException, and how do I fix it?”, which was answered 31 times as a community effort.

Episode Transcription

[intro music plays]

Ryan Donovan: Hello everyone. Welcome to the Stack Overflow podcast, a place to talk all things software and technology. I am Ryan Donovan, host of the podcast, and today we're catching up with Georges Saab, Senior VP of Development for the Java Platform Group and Chair of the OpenJDK Governing Board. We had him on about five years ago to talk about the 25th anniversary of Java, and, you know, five years later, it turns out it's the 30th anniversary. So welcome to the podcast, Georges. 

Georges Saab: Thanks Ryan. It's great to be back here with you again. 

RD: Slow five years, not much has happened, right? 

GS: Oh man. So much has happened. [Laughter]

RD: Yeah. I was going back over the old podcast and there was a line in there about creating roadmaps. Boy, I bet those have shifted a little bit, huh? 

GS: They, you know, they have. But it's really interesting because, you know, if you think back to five years ago, I think we had probably just released Java 9. Here we are in 2025 and we just released Java 24. 25 years to do nine versions, and we've done a lot since. Now, we didn't somehow magically find a way to compress three years into six months. What we did was decide to change the way that we've been releasing Java. So we moved from a model that was very much feature-based, with a major release in theory every three years, but because it was feature-based, if something took longer we'd have to delay the release, and we moved to time-based releases. And we decided we were gonna do that on a rapid cadence where we have a release every six months. So what that's meant is we've turned things on their head a little bit, where, you know, we know we're gonna have a release every six months. What we don't know is exactly what's gonna be in it. And what's in it is what's done at the time. And so the main shift there has been changing how we think about putting things in. And the nice thing is, I liken it to when, if you've ever lived in a city with a subway, and it's late night, it's maybe cold and icy, and the train is on the tracks, and you're like, well, I've got two minutes to get down there. I'm gonna race and probably kill myself sliding down icy steps, because if I miss it, it's gonna be an hour till the next train. Whereas if there's a train in five minutes, I'm like, you know, I'm gonna take it easy. I wanna be calm, I'm gonna wait. I'm gonna wait down on the platform. And then, you know, there are other people waiting on the platform to get on the train when it arrives. And so that's how it is for us with features now. Basically, people developing the JDK, in my group and elsewhere, have kind of taken this mentality of, we're gonna wait until something's really, really ready and get good at testing it before it boards the train, so we know it's ready to go when the train comes along. There were a lot of challenges in getting there, both technical and cultural, but mainly cultural. But now that we've been doing this since Java 10, people could not be happier and, you know, feel that it's gone really, really well. And so there's sort of a renewed excitement, both among developers of the JDK but also among users of it. We see around the ecosystem, people who were skeptical when we proposed this now being like, ‘this is the best thing that's happened to Java’, because now we're getting new and fresh things, with more features, on a really regular basis that's dependable, and we can plan our schedules around it. 

RD: I wonder how that's changed the adoption curve. 'cause I remember anytime I worked at a company that used Java, they were usually two or three versions behind because of just the amount of work it would take them to update.

GS: Yeah. So I mean, interestingly, I think you're right, but I think it's a combination of things. So one is this idea that knowing there are gonna be new things coming all the time kind of keeps it front of mind, and you're kinda looking at it going, well, are there things there that I could take advantage of? The second thing is that many people who use Java typically don't just write their program and it's just them and the JDK. There's such a rich ecosystem of tools and libraries and frameworks that people want to take advantage of that, in some cases in the past, they might wanna move forward but couldn't, because they had library dependencies and were sort of waiting on those libraries to update. We used to say there was sort of this catch-22 where we would ask people and they're like, we wanna move up, but our libraries haven't moved. And then we'd ask the library vendors and they would be like, well, we wanna move up, but, like, our users haven't moved. And you're like, oh, okay. Well, what do we do about that? So one of the things we've done is, we've actually, for our own development, adopted a method that we call ‘tip and tail’. This is, you know, nothing new. We didn't come up with it, but we've employed it, I think, very well, where all of the new development we do is on the tip. Now, it used to be the case, because it was so long between releases, that there would be a lot of pressure to take features from a new release and backport them to an old one. And essentially what we did when we made this change to the rapid cadence is we stopped trying to do that. Our view in the old days was, ‘well, this new stuff we're doing is so important that everyone wants it, and most people are using an old version, so we should put this into the old version’. And instead, what we've done is kind of realized that, ‘hey, look, the people who are staying on the old version often do that because they don't want a bunch of new features. They just want something that's rock solid and stable and kept secure. But if they want the new features, they'll move to the new version’. And so the second part of that was, A, making sure there was interesting and compelling content in the new versions, but also, B, doing things over time that really try to address the things that have been challenging or problematic for people moving forward. So I'll give you kind of an example. You know, Java is renowned for backwards compatibility. Write a program on 1.2 and it still runs great on Java 24, unless I used certain things that weren't part of the Java spec, right? Like maybe I, you know, was writing something and I looked at the implementation of some third-party module, or even, you know, part of a library in the JDK, and I was like, that looks really convenient, I'm just gonna call that, even though it wasn't part of the spec that's guaranteed. Now, you do that long enough and in enough places and you end up with something that can never change, because people are relying upon it. Even though you sort of had this thing all along that was like, well, here's what you can rely upon, and anything that isn't, you can't. The fact that so many people did meant that if you changed that, you were gonna break code. So what did we do? Well, in 9, as you recall, we put in the module system. And then gradually over time we used that to sort of ratchet up the amount of encapsulation. So at first it kind of gave you a warning if you were using stuff you shouldn't. 
Then it actually would say, no, you can't use it, but there's still a switch that will allow you to kind of get outta jail. And then over time, what we've done is use that to help people understand where they had some kind of technical debt, where they, or something they were dependent on, were doing those kinds of things that were gonna cause problems for migrating forward, and helped people try to get away from doing it. And a key part of that was also understanding where people were really reliant upon something: maybe there was a reason, maybe there was something missing in the library or the APIs that we needed to add. So we've done much of that over time. 
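To make the encapsulation story above concrete, here's a minimal sketch of the kind of off-spec internal access Georges is describing and how the module system pushes back on it. The specific field and flag shown are only illustrative of the general pattern: deep reflection into JDK internals that were never part of the spec.

```java
import java.lang.reflect.Field;

public class IllegalAccessSketch {
    public static void main(String[] args) throws Exception {
        // Deep-reflecting into a private field of a java.base class was never
        // guaranteed by the spec, but plenty of code in the wild did it anyway.
        Field value = String.class.getDeclaredField("value");

        // On early module-system JDKs this produced a runtime warning by default;
        // on current JDKs it fails with InaccessibleObjectException unless the
        // user explicitly opts back in, for example with
        //   --add-opens java.base/java.lang=ALL-UNNAMED
        value.setAccessible(true);
    }
}
```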

RD: I feel like we're teasing the lead here. What is the new stuff that's come in the last five years? 

GS: Oh, wow. So much. Actually, I was looking back at the talk that we had a few years ago, and, you know, we were talking about a bunch of things then that were projects. Some of them we had been working on for a while, but many of them were kind of new or just coming along. And the great thing is many of those have delivered in the last five years. So as an example, when we were talking five years ago, we were starting a project called Loom that was working on scaling up parallelism in the JDK, basically making it so I/O-bound workloads could be unblocked in a way that allowed you to write simple synchronous code and not have to resort to async or reactive code. This delivered over the course of a couple of those six-month releases, and it has been just excellent and it's been getting better and better. So the first version that we did worked really great for new code you were writing, but had a couple of issues if you had legacy code that was using certain patterns, and we've gone back, and I think in 24 there are a bunch of things that address some of those corner cases, so that now virtual threads in Java are awesome. That's something that we think is super popular. We've also seen a lot of advancement on the garbage collection side. So back then we were talking about ZGC, which was pretty new, which is a low-latency garbage collector for extremely large heaps. That's just gotten better and better over time, and in fact recently introduced a generational capability that extends the set of use cases that it can be good for. So that's been great. A whole area that we didn't even talk about that much last time is things from Project Amber, which is basically about reducing ceremony. You can look at that as sort of, you know, simplified syntax, but it's actually a lot more than that. So, as an example, we had lambdas in Java 8, but what we've been doing successively over time is having features come in from Amber that extend the story of pattern matching in Java. That's a technique which is very, very popular, used a lot in functional programming circles, and is basically a way of using Java for data-oriented programming. There's a great paper from Brian Goetz on that; if folks haven't seen it, you might wanna go and check it out. But that's something where in the past we would've spent years and years and years trying to perfect it all over the platform and then put it into one of the old big-bang releases. Instead, what we've seen is an approach where we've kind of successively added the capability to do pattern matching into different parts of Java. So starting off with instanceof, then adding it into switch, adding more capabilities there, including things like having records in Java, which I think is another thing that's new since we last talked, which is about having objects that are essentially data carriers. And then a lot of those other projects we've continued to deliver on. Leyden has delivered in the last couple of releases; that's about faster startup and warmup. And Valhalla is the one where we're still waiting for things to come in, but I think things are looking very, very promising there. James Gosling, I think, called out something like seven PhD theses in one project. So it's really a pretty major overhaul of all of those sort of inner workings of Java, but it's now at a point where the team is super happy with the design and is polishing all of the bits to the point where it's good enough for Java.
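For readers who want to see the Amber side of that in code, here's a minimal sketch of records plus pattern matching in switch. The Shape hierarchy is made up for illustration, and the record patterns shown assume a recent JDK (Java 21 or later), where these features are final.

```java
// Records are concise data carriers; sealed interfaces + records + switch
// patterns give the data-oriented style Georges describes.
sealed interface Shape permits Circle, Rectangle {}
record Circle(double radius) implements Shape {}
record Rectangle(double width, double height) implements Shape {}

public class ShapeDemo {
    static double area(Shape shape) {
        // The switch is exhaustive over the sealed hierarchy, so no default
        // branch is needed; record patterns destructure the components in place.
        return switch (shape) {
            case Circle(double r)              -> Math.PI * r * r;
            case Rectangle(double w, double h) -> w * h;
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Circle(2.0)));          // ~12.566
        System.out.println(area(new Rectangle(3.0, 4.0)));  // 12.0
    }
}
```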

RD: I know that we talked about the Valhalla project a little bit. Is that something that's been rolled out slowly or has it been in sort of preview releases? 

GS: So preview releases, we actually call 'em ‘preview features’, which has a very specific meaning for Java: it's a feature that is in the mainstream JDK that you download, but you have to turn on a switch in order to try it out. And that's so that you don't inadvertently end up depending on something that is likely to change. At that point, though, like in the old days, we would've called it done. It's fully specified. We think it's production quality, but we're kind of humble enough to realize that so many people use Java for so many different things that we wanna make sure people have had a chance to try to use it in anger, in real life, and give us feedback so that we can make it as good as it can be, so that when it's final and not gonna change for the next 30 years, right, it's in great shape. Even before that, though, the projects, when they're working on features, have the ability to do early access builds, and so that's often what you'll see. And we've actually seen several successive rounds of early access builds for Valhalla. That's why, when we look at the project, it hasn't delivered things in preview form yet, but I think we're pretty close to that. Another neat thing about this is, for developers who want to sort of follow the progress of one of these projects, you can kind of clearly see how far along things are and kind of gauge, ‘this has been in preview, there've been a couple previews, I can see down the road to when this is gonna be final’. And hopefully the changes are successively smaller as you go along.

RD: [Laughter] Hopefully. Hopefully. You talked about the 17 PhD theses that are -

GS: I think I said seven, but you know who knows how many, it’s a bunch - 

RD: One of your hats is Chair of the OpenJDK Governing Board. How much does the community drive these evolutions and changes? 

GS: We actually publish, every time we do a new Java release, so every six months, a little heat-map chart that sort of shows who has been contributing and how much. And, you know, one of the things we're very happy about is we have a steadily increasing number of contributors. It still, though, tends to remain the case that Oracle is by far the largest contributor and usually the one leading these kinds of evolution projects for Java and the JVM. We've got a lot of people who are contributing smaller bug fixes, or maybe there's a thing that's particularly interesting to them. Like, we're quite happy to see people who are making their own hardware or operating systems stepping up and taking responsibility for making sure that the support for the new hardware that they're bringing out is already really great in Java once the hardware hits the streets.

RD: That is an interesting question. Java has always run on a JVM, and it's been JVMs for individual operating systems, but now you have a lot of different chips coming into the market. Is that causing an issue, having all these OS/chipset combinations?

GS: Obviously that is sort of the complexity that we take on, so that if you're writing a Java program, you don't have to worry about it. And my view is, one of the things that's been a strength of Java over the years is ‘write once, run anywhere’, which is what we started out with. I used to joke that it's ‘write once, run anywhen’. It's like I'm writing a program not necessarily knowing what the hardware will look like at the time that it's in production, or over the lifetime that it's in production. But one of the things that people who have made the investment of doing something in Java have often reaped over the years is, as the new hardware comes along, the Java runtime evolves to take advantage of it, and usually without your having to change your Java program at all. And so that's an underappreciated value of what you get when you choose Java. But I think that, you know, when people look back at the track record, they see that's been the case over and over again. Another thing that's interesting: I mentioned folks from the hardware development community working side by side with us in OpenJDK to provide great support for their hardware. They know what's coming, and they often are able to plan for it in advance and try to make sure that we're skating to where the puck will be and help out with that triangulation. It's even the case that having those people involved in helping to evolve Java has meant that they often go back and are able to provide feedback to their design teams about things they can do to make the hardware better for Java workloads. 

RD: Are those folks ever pushing code into the open source?

GS: Yeah, absolutely. We see code contributions from a number of those kinds of companies. You can look at the set of people who are contributors to OpenJDK, and in fact you can see all of the individual contributions. That's actually another thing that's great for Java developers, right? It's not like it used to be in the old days, where things were developed behind a curtain and thrown over the wall. That's not how we do things in the Java community these days. 

RD: One of the other huge changes in the past five years is the rise of AI. Is Java making any changes to support that shift? 

GS: Interestingly, many of the projects that we were talking about five years ago were actually things that we put together based on the challenges that we saw for workloads that people wanted to do in Java but may have seen some kind of issue with. And so as we've been delivering those all along and looking at new things, the workloads that people want to run have evolved. Back then, we spent a bunch of time talking with people who had done a lot of work to support big data in Java. That community kind of evolved. We saw machine learning, and now we see AI. But interestingly, the problems are the same in many ways, right? It's like, you gotta have lots and lots of data. You've gotta be able to process that data very, very quickly. You need to be able to express algorithms that can take advantage of low-level hardware, but you wanna do that at the highest level possible so that you're not writing a bunch of very specific code that only works in one hardware situation. This is exactly what we've been doing, right? We talked five years ago about Project Panama, which hadn't delivered, but since then has: the Foreign Function & Memory API, which is basically about making it easier to call native code. So, as an example, if you wanna wrap tensors in Java, it's a great way to do it. But then, also, projects like Valhalla are basically about representing more complex but structured data in a way that can be much more compact, where large amounts of this kind of data can be processed really quickly because the runtime knows enough to be able to make sure that memory caches are fully utilized and populated. You're getting the data to the CPU core that's gonna be working on it. And so, you know, exactly those projects, I think, are the ones that turned out to be the things that mean people who want to do AI in Java, or want to do AI in general, can find that Java's a great platform for doing it in. The things that we've been doing don't stop with those ones that we started five years ago, and we do have sort of a continuing set of projects coming along. So as an example, we've got a project called Babylon, which is in general about code transformation. But there's a part of that we've been working on called HAT, which is specifically looking at the use case of trying to do code transformation so that you can do things like take algorithms in Java and turn them into distributed code that runs on GPUs.
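To give a flavor of what the Panama work looks like in practice, here's a minimal sketch using the Foreign Function & Memory API to call the C library's strlen. It assumes the finalized java.lang.foreign API shape in recent JDKs (Java 22 and later); the choice of function is purely illustrative.

```java
import java.lang.foreign.Arena;
import java.lang.foreign.FunctionDescriptor;
import java.lang.foreign.Linker;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;
import java.lang.invoke.MethodHandle;

public class FfmSketch {
    public static void main(String[] args) throws Throwable {
        Linker linker = Linker.nativeLinker();

        // Look up strlen in the default (C standard library) lookup and
        // describe its signature: size_t strlen(const char *s)
        MethodHandle strlen = linker.downcallHandle(
                linker.defaultLookup().find("strlen").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));

        // Allocate a C string in off-heap memory scoped to this arena.
        try (Arena arena = Arena.ofConfined()) {
            MemorySegment cString = arena.allocateFrom("Hello from Java");
            long len = (long) strlen.invokeExact(cString);
            System.out.println("strlen = " + len); // prints 15
        }
    }
}
```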

RD: Is that getting better at creating an abstract syntax tree from the code or something? 

GS: That's essentially the idea. But again, Babylon is much more general technology that can be applied to specific cases. Like this one of algorithms turn into GPU code, whether it be Kudo or something else. 

RD: Is that GPU interface through the virtual machine more difficult than if you had bare-metal access?

GS: Generally, people want to be writing at the highest level that supports the amount of power that they need. The question is, well, what is the ideal thing to be writing that in, and how generally applicable can it be? And again, do I know that the actual hardware that I'm gonna have available in the cloud where I'm running this thing is going to be from a particular vendor that only supports certain instructions? Or do I want to be able to write it in a way that is more abstract? But as things evolve over time, that translation layer will be there, written by the experts in that particular hardware, rather than my having to change my code because the world changed. 

RD: This is sort of a semi-layman's question about virtual threads. Is that a way to sort of lighten the load on the GPU? Is that possible? 

GS: Virtual threads are not something that mostly helps with hardware-bound or CPU-bound kinds of issues today. It really tends to be more about things that are I/O bound. Right? That mostly in the past has been something where, let's say you've got a feed of data that's coming in that you need to be able to process. Well, how are you gonna handle that if there's more coming in than you can process in time? You know, in the past, what people typically tended to do, starting out, was write synchronous Java, and Java threads were mapped one-to-one with native threads. And so once you got to as many things as you could process with the number of native threads that you had on your system, okay, now things are getting backed up. You can't process it. What are you gonna do? And so typically what people would do is they would turn to libraries that would do async protocols, basically turning things around so that they would be able to have back pressure, and that often resulted in code that was hard to understand and often hard to debug. It did help you scale, but the complexity shot way up and the maintainability of your code suffered. And so what we've done with virtual threads is let the runtime handle that for you, so that rather than being one Java thread to one native thread, you now can have hundreds of thousands of Java threads to a native thread, and the runtime handles all of that for you. So what that means is you can turn around and write your code in a simple synchronous style and actually have it continue to scale in places where you've got, for instance, so many transactions coming in that you need to be able to handle them.
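To make that concrete, here's a hedged sketch of the style Georges describes: plain blocking code per task, with the runtime multiplexing a large number of virtual threads over a small pool of carrier threads. The sleep is just a stand-in for real blocking I/O.

```java
import java.time.Duration;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

public class VirtualThreadSketch {
    public static void main(String[] args) {
        // One virtual thread per task; closing the executor (via
        // try-with-resources) waits for all submitted tasks to finish.
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 100_000).forEach(i ->
                    executor.submit(() -> {
                        // Simulated blocking I/O: the virtual thread parks and
                        // frees its carrier thread while it waits.
                        Thread.sleep(Duration.ofSeconds(1));
                        return i;
                    }));
        }
        System.out.println("All 100,000 tasks completed");
    }
}
```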

RD: Talking about Java, it always brings me back, 'cause the second programming course I ever took was in Java, and I think it was like Java 1 or 2. It seemed like you were really pushing into the educational space to get this in front of students as soon as possible. Is that still part of Java's DNA?

GS: I would say that not only is it, but it's something where we've really been putting a lot of focus in the last couple of years. So it's interesting, right, I think in academia and, you know, educational circles, there's always this sort of challenge and balance, right? Do you wanna teach things that are easy for people to learn, but also, how useful is it once they've learned it? Is it a stepping stone to get to something else, or where's it gonna be? And one of the things that we found is, when Java was originally created, if you look at the simplest program you could possibly think of to write, you know, ‘Hello, World!’, and you compare how you would normally write that in older versions of Java with other languages out there, Java was just much more complex, in the sense that you really needed to understand about 32 principles of object-oriented programming before you could write ‘Hello, World!’. And so this was always a challenge for educators, because it's like, well, do I explain to my students everything that's going on here? ‘Like, okay, return types and this and that, and what is this magic main thing?’ And like, we just wanna print ‘Hello, World!’ for God's sake. Versus some other languages, which I think did a bunch of work early on to make things more accessible. But maybe that accessibility came with some baggage, like it was easy to start, but you really couldn't do that much. And so what we've been doing is trying to get to the point where we really sort of look at what we can do to simplify that initial experience to make Java much more approachable. This is a concept we call ‘paving the on-ramp’; if you search for ‘paving the on-ramp Java’, you'll see a couple of papers about that. And what it means, and what we've been doing over time, is getting to the point where you really can have just a very, very simple ‘Hello, World!’. But as you are learning, as you're trying to do things that are more complex than simply printing out ‘Hello, World!’, you can gradually start to introduce more of the concepts. You only have to learn them when you need them, because that's kind of the place where people are gonna learn, right? It's like, ‘oh gosh, this is a problem, how do I do that? Oh, there's this thing. Oh, now I get it. This makes sense’. An interesting thing is that we originally thought about some of those simplifications as things that were really for novice programmers or people new to Java. But what we've actually learned over time is that for many more experienced developers, having a way of doing simple things, where you can just get started and do a small experiment or a small thing, is actually super useful as well. And of course, as we've always done, we try to be really careful to not create a weird, different dialect of Java where it's like, ‘okay, I'm doing the simple thing, but I fall off the cliff because now I've got all the complexity’, but really make it so that it can be kind of gradual, that you can learn and apply those new concepts.
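As a rough illustration of that paved on-ramp, here's what the simplified starter program looks like with compact source files and instance main methods. Depending on your JDK version this may still be a preview feature, so treat the exact form as a sketch rather than the final syntax.

```java
// A compact source file: no class declaration, no 'public static void main',
// no String[] args. Run it with something like:  java Hello.java
// (add --enable-preview on JDKs where this is still a preview feature)
void main() {
    System.out.println("Hello, World!");
}
```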

RD: I think when I first studied it, it was interesting, 'cause we got the intro to object-oriented stuff, and seeing Java was like, oh, there are sort of primitives here for streams and threads. Like, I can access this stuff directly, and I don't have to allocate memory directly either. So it almost seemed like a simpler process.

GS: It depends on what you're comparing with, right? You know, are you comparing it with a native technology where you have to worry about memory management yourself? Or are you comparing it with maybe an interpreted language that does a bunch of, like, syntax magic where you don't understand what it's doing, but it makes your life simple? We're definitely in the middle. We're comfortable being in the middle, but we're trying to make it so that that initial approach for learners of the language is just much easier. And of course, as always, we're working with people who are making popular tools like IDEs to try to make sure that the experience there is as good as it could be. And I'd have to say that's something where I've really been pleased. Like, one of the things we wondered about when we started ratcheting up the speed of releases was, ‘well, are tools like IDEs gonna be able to kind of keep up with things?’ And thankfully, I think in part because we've been doing the development in the open in OpenJDK, they can kind of see what things are like. You can, for instance, today download builds of 25, although 25 won't be GA until September. You can see what the current state is, you can test it out, and I think the tools vendors do that regularly so that they're able to basically declare support for new versions just as they're coming out. Sometimes they even do it before, which is weird. But yeah, there are all kinds of reasons why Java remains strong and popular. We, as always, are looking for the things that people want to do with the technology. I mean, generally these are the factors we look at: how are the systems that people use and want to run on evolving, and what can we do to support them? What are the programming paradigms that people want to be using, and how can we support them? And finally, what are the workloads that are important to people, and what are we doing to support those as well? So we're pretty pleased with where things are, and even more with where they're headed. We feel that this way of doing releases has not only allowed us to get more feature content out into the hands of developers, but it also has put us in a better place. It used to be you would do one of these big releases and then have no idea what's next, and you're kind of starting from scratch. In this case, we have these projects that are ongoing and they're continuing to improve. So we feel like the pipeline of new things coming is richer than it's ever been. We want to thank everyone who's been using Java and giving us feedback, 'cause that's also the thing that has made the Java ecosystem great. 

RD: So speaking of what's coming, what can people expect for the next five years? 

GS: People should look at the projects that are ongoing and how they're coming along, because, you know, basically what you can expect to see is projects like Valhalla, like Babylon, like Leyden, continuing to deliver interesting improvements to Java that prove your choice of Java to be the right one. Having the choice of vendors, of lots of different people working on it and bringing different perspectives, helps make it a really strong technology and always a wise choice for just about anything you want to do.

RD: It's that time of the show again where we shout out somebody who came on to Stack Overflow, dropped a little knowledge, shared some curiosity. Today, we're not shouting out a badge, we are shouting out a very popular Java question. What is null pointer exception and how do I fix it? [laughter] I'm sure that's been asked hundreds of times, so if you are curious, we have the answer. I'm Ryan Donovan. I edit the blog, host the podcast here at Stack Overflow. If you have questions, concerns, ratings, reviews, send it over to podcast@stackoverflow.com. And if you wanna reach out to me directly, you can find me on LinkedIn. 

GS: Thanks, Ryan. This is Georges Saab, the Senior Vice President of Development for the Java Platform Group at Oracle and the Chair of the OpenJDK Governing Board. I really wanna say thanks for the time talking about Java's 30th anniversary, and I'm looking forward to the next 30 years. 

RD: All right. Well thank you very much for listening everyone, and we'll talk to you next time.