Erik: Jack, thanks for joining me on the podcast today.
Jack: Yeah, thanks for having me.
Erik: I'm looking forward to this conversation. Actually, the topic came up last night. I was hosting a panel on this topic of innovating in Asia. We had people from Bosch, from Intel, and from JLL, the property developer, on the panel. Then RISC-V came up as a topic because, obviously, China semiconductors is an important issue right now. And you just corrected me during our conversation here that despite the fact that it reads "RISC-V," it's actually pronounced "RISC-five." So maybe that's a good place to start before we jump into your company, SiFive: what is the international standard, and what is the international standards body behind this technology?
Jack: Yeah, I think it's a great question. It might help if I give you a little bit of history of how RISC-V came to be. The fact that it came up on a panel yesterday with that diverse an audience is pretty incredible in terms of the global phenomenon that is RISC-V. Because it started off in 2010 as an academic project. A couple of grad students at UC Berkeley, along with a professor, needed an instruction set that they could modify. An instruction set, which we'll get into in more detail later, is the interface between hardware and software. For research purposes, they needed an instruction set architecture that they could play with, and change, and do different things with, just for their own PhD purposes.
Really, the only options realistically that exist in the world are either x86 or ARM. This is pretty interesting because if you look at all of the compute interfaces — think Ethernet, think Wi-Fi, think PCI Express, think any interface, anything that's a standard — the ISA is the only one where all that exists are proprietary standards. So x86 is something that's very complicated. There's only two companies in the world that pretty much build it: Intel and AMD. If you want to use that, you have to buy a chip from them. The other is ARM, which is used across the world in most chips in the world today. Many different vendors all deploy ARM-based solutions. But actually, all of the cores come from one company called ARM. That's also proprietary. The problem that the researchers faced at UC Berkeley back in 2010 was, they couldn't modify these ISAs without permission from either Intel, AMD or ARM. It's just not possible. So they decided to create their own — originally as a three-month project, as a research project to have something that can be used for research purposes.
The reason it's called RISC-V is that UC Berkeley is actually where they invented the original RISC style of computing. Back in the '80s, when there were the CISC versus RISC debates, the RISC-style architecture came out of UC Berkeley. This was actually the fifth-generation RISC research project to come out of Berkeley. That's why it's called RISC-V: it's the fifth generation. They used the Roman numeral 'V' to denote five, partly because they were students having fun, but also because one of the key areas of future innovation was vector-style computing. So it's kind of an inside joke that the V would also stand for vectors. But really, it was driven by the fifth generation of RISC-style computing, which is why it's called RISC-V.
Erik: What does the RISC-V acronym stand for?
Jack: RISC stands for Reduced Instruction Set Computing, which is a style of compute, as compared to CISC, which is Complex Instruction Set Computing. Traditionally, x86 would be considered a CISC-style machine, whereas ARM would be considered a RISC-style machine. But these are more semantics these days, because the CISC-type x86 does a lot of RISC things under the hood, and the RISC-like ARM actually has a lot of complex instructions. So it's really semantics more than anything. At the end of the day, what people really care about is your performance, power, and area, and what your ecosystem looks like, as opposed to whether it's a pure CISC or a pure RISC instruction set. Most people learn the distinction in their computer architecture or computer science classes, so it's an easy delineation for people to make.
RISC-V obviously follows the RISC lineage. Part of it is the ease of implementation: because it is a reduced instruction set, it's less complex than some of the other instruction sets out there. As it turns out, the original principles apply. You can actually make much more efficient implementations with a clean, reduced instruction set. It can actually be more capable than one that keeps adding lots of things, because those additions tend to become baggage after a while. After 30 years of adding things to an instruction set, you end up with baggage that can be detrimental for modern workloads. But I shouldn't go too deep into that, because now I think I'm getting off your original question.
Erik: Well, I know that SiFive focuses more on IoT applications. Is that generally true for the RISC-V standard? Is it generally more well-suited to IoT applications as opposed to, let's say, a desktop computer or a server?
Jack: No, actually, that's a misconception. RISC-V is well-suited for all sorts of computing. We say RISC-V is inevitable. We think RISC-V will be in every single device — from small embedded IoT devices, to consumer devices, to phones, to data centers, to AI, aerospace, autonomous driving, automotive. It's suitable for all of these cases. Because it's a standard, you can have different implementations of RISC-V: very high-performance implementations, very area-efficient ones, very small, very low power. You're going to have all these different implementations. One of the beauties of RISC-V is that they all share the common software and ecosystem, which is one of the things that's so appealing about it.
The reason most folks know about RISC-V in the context of IoT and edge devices is that's where it got its start. Because, as I mentioned earlier, RISC-V was a research project in 2010. SiFive, the company, wasn't founded till 2015, when we started to commercialize RISC-V devices. RISC-V is the standard you have to be compliant with. But then you have different implementations. It turns out that a lot of the early implementations of RISC-V tended to be for the smaller, less complex IoT-type devices, simply because it's faster to get there.
Over the last few years, you've seen this explosion in RISC-V, not just from SiFive but from other companies that are going into these other applications, because it is very well-suited there too. It gives competitive performance, power, and area in the high-performance spaces as well, such as autonomous driving and data center servers. Really, RISC-V will go everywhere. Ten years ago, when the inventors of RISC-V said that RISC-V is going to be in every computing device in the world, people were kind of like, oh, that's nice. That's very ambitious. I would say now people are like, "Oh, yeah. That's going to happen." It's just a matter of when, how long it will take, and which markets along the way. It doesn't mean that everything else goes away. But all the new things are moving there — which, I guess, brings me back to your original question.
Why is there a RISC-V International? Why is there a standards body? Because what's very important about an instruction set is that everybody knows what it is. It's really the contract between the hardware and the software. When you write the software, and you have the compilers and everything, what does that compile down to that the hardware knows how to execute? This standard has to be open and maintained. That's the big, big differentiation between RISC-V and ARM or x86: the RISC-V standard is an open standard, which means it's accessible by anybody. Anybody can use it. You can use the standard and develop your own CPU. You can develop your own CPU and keep it a secret. You can develop your own CPU and tell everybody you're doing it, but not give it to anybody. Or you can develop it and license it. That's what SiFive does: we license our CPU designs to other companies.
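The "contract" Jack describes is very concrete: the ISA fixes exactly which bits mean which operation, so any compliant hardware can run any compliant binary. As a rough sketch of what that contract looks like for one instruction (the field offsets follow the published RV32I I-type layout; the helper name is ours):

```python
# Decode one 32-bit RV32I I-type instruction word. The bit positions
# of each field are exactly what the RISC-V spec standardizes — this
# shared layout is the hardware/software "contract".
def decode_itype(word: int) -> dict:
    opcode = word & 0x7F          # bits 0-6
    rd     = (word >> 7) & 0x1F   # bits 7-11: destination register
    funct3 = (word >> 12) & 0x7   # bits 12-14
    rs1    = (word >> 15) & 0x1F  # bits 15-19: source register
    imm    = word >> 20           # bits 20-31: 12-bit immediate
    if imm & 0x800:               # sign-extend the immediate
        imm -= 0x1000
    return {"opcode": opcode, "rd": rd, "funct3": funct3,
            "rs1": rs1, "imm": imm}

# 0x00500093 is the canonical encoding of `addi x1, x0, 5`.
print(decode_itype(0x00500093))
```

Any compiler that emits `0x00500093` and any core that decodes it this way agree on the meaning "x1 = x0 + 5", regardless of who built either side.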
But because it's an open standard, everybody gets to compete. Everybody gets to participate in this ecosystem, which is very different from the proprietary ones I mentioned earlier, which are controlled by one or two companies. Because it's an open standard, you need an effective standards body that manages the standard. As things change and new enhancements come along, how do we decide what goes in and what doesn't? Well, that's the job of RISC-V International. RISC-V International is a nonprofit made up of member companies. Pretty much all the technology and semiconductor companies you can think of are members of RISC-V International. Their job is to manage the specification and keep it open for anybody to access. That's why that body is there. They don't actually produce anything. They don't create any cores. They really just maintain the spec, plus some general marketing and goodwill. Their real output is managing the specification.
Erik: Okay. I guess the advantage in the market of open source over a proprietary protocol is that you have a group of companies contributing to move this protocol forward, as opposed to a small set of companies that own the protocol. But you mentioned also that, of course, when companies develop something, they can keep it a secret. They don't necessarily have to share it. Who tend to be the big contributors here? Is it companies like SiFive? Is Intel involved in this? Are they pushing it forward despite having their own protocol? Or is it more universities and academic institutions that are putting forward the research that enhances the standard over time?
Jack: Yeah, so there are a couple of points in your question. One thing you mentioned was open source versus an open standard. This is a very important distinction that is easy to get confused. The RISC-V instruction set standard — the specification of the instructions and how they interact — is an open standard. That means it's accessible. It's a standard. You can use it to do whatever you want with it, which is different from open source.
Because the standard is open, as I mentioned earlier, it allows for different implementations. You can have your secret private one. You can license it. You can do whatever. It also means you can implement a core and put it in open source. This is what's happened with RISC-V. There are multiple open-source implementations of RISC-V, including the original design that was done at Berkeley. It's available in open source, which means you can just download and access it. That does not mean all of RISC-V is open source. It doesn't mean all implementations are open source. It just means that you can have open-source implementations. This is still appealing, because there are places where you want open-source implementations. Maybe you're doing research. Maybe you want to share what you're doing. Certain projects will do that. Then there's a place for commercial implementations. If you're spending hundreds of millions of dollars to build a chip, you probably want a company to stand behind your CPU. An open standard gives people that choice. So it's not correct to say RISC-V is open source. It's an open standard.
What we've seen is that all the other interfaces in our industry are open standards. Think Ethernet, for example. Anybody can implement Ethernet, but there are companies that specialize and do it very well — Cisco, for example. They implement Ethernet chips, and they do it better than everybody else. So that's the big difference between an open standard and open source. But then, within the standard itself, you can ask who the contributors are. It's actually both. You get the power of all the different companies looking into this.
RISC-V International has a technical standards committee. It's made up of member companies like SiFive. The commercial members have their commercial interests and their commercial deployments, so they have to make sure that what is in the standard makes sense from a deployment standpoint. But you also have the community — open-source developers, maintainers. Their input comes in because they are writing software, and they have very good input on what makes sense and what makes things more efficient. So you get the benefit of everybody contributing together, and innovation happens much faster versus it being controlled by any one single company.
Then, because it's an open standard, if there's ever something you need that's not in there, you can just go ahead and add it yourself. Once you do that, you have the option of coming back to the standards body and saying, "Hey, we should add this thing. It's really useful," or you can keep it for yourself. The key point here is that you're never stuck. You're never held captive by any individual company like ARM. For example, if you're a customer and you go to ARM — "Hey, please add this to your design" — and ARM says no, well, you really don't have much of a choice if you want to keep using ARM. They're the only ones who can build it for you. Whereas with RISC-V, you have the freedom to do it yourself. You can have SiFive do it. You can have other people who compete with SiFive do it for you. So the competitive environment in that ecosystem is completely different from the others, which is why you're going to have a proliferation of different design points. You can have way more design points. You can cover all these different markets that you talked about earlier with different implementations, because one company doesn't have to do it all.
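A toy way to picture what that extensibility buys you (this is an illustration of the idea, not real silicon or tooling; all names are ours): if the instruction set is a table you control, adding a domain-specific instruction needs no vendor's permission.

```python
# Toy model of an extensible ISA: a dispatch table mapping mnemonics
# to operations. The "base ISA" is fixed and shared by everyone.
BASE_ISA = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
}

def execute(isa, op, a, b):
    """Run one 'instruction' against a given instruction table."""
    return isa[op](a, b)

# Hypothetical custom extension: an 8-bit saturating multiply, the kind
# of niche operation a DSP or sensor-hub vendor might add themselves.
custom_isa = dict(BASE_ISA)
custom_isa["mul.sat8"] = lambda a, b: min(a * b, 255)

print(execute(custom_isa, "mul", 20, 20))       # 400 on the base op
print(execute(custom_isa, "mul.sat8", 20, 20))  # saturates at 255
```

The point of the sketch: the base table keeps working unchanged for everyone's existing software, while the vendor's extension lives alongside it — which mirrors why custom RISC-V extensions don't fragment the shared ecosystem.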
Erik: Got you. Okay. Great. Thanks. So maybe we can turn now to SiFive. If I think about SiFive at a very high level in terms of the business, would it be comparable to ARM, but built on an open standard? Is that a good way to think about the business?
Jack: Yeah, that's probably the simplest way to think about the business. It's mostly accurate. Of course, there are some differences. But fundamentally, SiFive was founded by the inventors of RISC-V — the same folks at Berkeley that I mentioned. We focus on developing RISC-V technologies. We have a very broad portfolio of licensable RISC-V cores — RISC-V IP that we license to companies who build chips. These are traditional chip companies. But these days, it's also getting more fashionable for OEMs to build their own chips. A lot of companies are building their own chips. That's also why the semiconductor industry is growing so fast: the number of people who build chips is no longer limited to chip companies that sell a final chip product to an OEM. A lot of the OEMs themselves, a lot of the data center and cloud guys — everybody is also building their own chips. So anybody who builds a chip could be a customer of SiFive.
I mentioned earlier that it's all built on the RISC-V standard. The standard is a standard, but you can have different implementations that go into many different types of products. So we have four families. We have our Essential cores, which go into a lot of the embedded and IoT things you talked about earlier. We have a Performance line — very high-performance CPU cores that go into mobile, consumer, data center, and edge server type devices. We have an Intelligence product line, which combines RISC-V with very wide vectors — the vectors I mentioned earlier. This is very well-suited for the new classes of generative AI, transformers, AI calculations, and connecting accelerators. It's about programmable number crunching in that space. Then we have a fourth one called Automotive, which takes elements of the other three but adds functional safety and qualifications. All the next-generation autonomous driving, central compute, and IVI-type automotive functionality is also well covered.
So it's quite wide in that sense. That's actually one of the strengths of SiFive, and one of the strengths of RISC-V: the wider it goes, the more software is written, the more the ecosystem builds up, and the more cross-sharing there is. It's a very momentum-driven ecosystem — it keeps growing and growing. Because it can touch so many different market segments, as it grows, you just get more software. Then that software moves around and makes each market stronger as it happens.
Erik: So let's use an example. Let's say I'm an automotive OEM, and I want a chip for my new e-vehicle that has a lot of computing requirements. What would the more typical case today be that I'm buying or licensing your SiFive automotive — what would you call this? I mean, it's IP. But would you call this like a—
Jack: Yeah, like IP core or CPU core.
Erik: CPU core. Okay. Then would I be using that off-the-shelf IP, sending it to a fab and having them manufacture chips for me? Or would I typically be taking it in-house and spending six months to a year working with you, or at least having my internal team customize it for my use case? Is that customization quite common, or is it possible but less common — more for special circumstances today?
Jack: Yeah, it's somewhere in between. In your example, there are really two ways the automotive companies can work. For some of the new ones, especially with electrification and what's happening with EVs, it's an opportunity for a lot of new entrants to rethink how their electronics work. So you see a lot of automotive companies looking to build their own chips to differentiate their products. Tesla is a very famous example of this. They started building their own chips, and a lot of people followed. Especially in China — I know you live in China. You probably see more EVs than we do here in the US, I guarantee you. The number of EVs in China is incredible, and those are all brand-new designs. So they're all looking at how to build new things. Either they're building chips themselves — that's option A — or they're working with their existing chip supplier and specifying it for the chip company: "Hey, we need these capabilities. We need these features."
In either case, the way they would work with somebody like SiFive is, obviously, they need the CPU IP from us. If they're looking at new capabilities, new functionality, they might work with us to make sure it meets the needs of their next-generation products. There might be certain enhancements we would do, or different configurations that make sense. But that's still just the CPU core. It's a large part of the chip, and a lot of the work can be done together. But then they still need to put that into their SoCs. There are other components in the SoC — peripherals, GPUs, maybe dedicated accelerators — that either the chip company or the automotive company would need to work on themselves or with other partners. You want those things to work well with your main CPU, your subsystems, your safety islands, and all that. So it's a very collaborative process during the entire chip-building period, which can be 12 to 18 months of design, something like that.
Fundamentally, we provide the baseline CPU IP core. We work closely with customers on configurations, customizations, and extensions — both for the core itself and within their SoC, to make it match up very well with their accelerators and other blocks. Because it's about differentiating the product. RISC-V lets you differentiate your product in ways that you couldn't before. In the existing market today, if you get your product from ARM, your product is the same as everybody else's. That's one of the issues people are facing: it's all one and the same, because it came from one place.
One of the other driving factors — this is a really good example, so sorry for the long answer — is that the car industry is going through a change. Cars are getting electrified. Autonomous driving capabilities are expanding significantly. So they all have to write brand-new software. When they write brand-new software, they have a fundamental decision to make: what ecosystem do I want to invest all of my software development, all my software resources, in? Really, there's x86, ARM, or RISC-V. We see a lot of customers say, "Hey, I had to rewrite this anyway. Why would I not use RISC-V, where I'm going to have lots of choices, options to innovate, and different suppliers, versus one of the older proprietary ones where I'm going to be single-vendor limited? If I'm investing all my resources in a single vendor, that doesn't seem prudent, especially when there's choice." So, if you zoom out, that's really the big reason why RISC-V is taking off — why everybody is looking at it, beyond the performance being good, the power being good, and the features being good. It's that freedom of choice, and being in an ecosystem where there are going to be lots of options. That's what's driving it, first order. Then they get to, "Hey, the power is good. The performance is good," and that validates the initial decision.
Erik: Yeah, you see this in a lot of areas of the tech stack, or different tech stacks, today — in terms of PLCs and so forth. Companies are starting to really push for open standards, or at least more integration between solutions, so that they're not locked into one vendor. This really seems to be a bit of a Zeitgeist in the industry right now, or really in the broader economy, because it's cross-sector.
Jack: Right. Yeah, because when you're locked into a single vendor, not only are there negative commercial implications, but it's really bad for your innovation capability. You're dependent on what that one vendor can do in a highly competitive environment — which both semiconductors and automotive are. You can't be the one that's stuck if your competitors are moving.
Erik: Then you have this software portfolio as well. If we think about how your business works, is software a profit center? Or is it that you license the technology and then say, okay, we give you access to our software so that you can customize it to meet your needs? How does that work on the business level?
Jack: Yeah, software is basically enablement. A lot of the software we do is actually in open source. This is one of the very nice things. If we had to build the entire RISC-V software ecosystem by ourselves — SiFive as a company — one, it would take forever. Two, there's no way we could do everything that's going on. So this is one of the really, really nice things about RISC-V: you have this whole community developing software for all these different markets and segments.
Where we view our role when it comes to software is, we are key maintainers for a lot of key technologies. We do a lot of contributions on the compiler side in LLVM, and in certain libraries. Things that might be perceived as hard, things that people aren't going to do, or things that are tied to the hardware — we want to enable those so people can use them. The key on the software side is not that we want to do everything ourselves; we want to be the enablement layer so that more software can be written. For things that are very important, or very performance-dependent on some of our microarchitectures, we will do the work ourselves and contribute it back as much as we can, so people can leverage it and build on top of it.
Software is the big enablement. It's the big ecosystem driver that's going to help RISC-V grow as a whole. We do have a sizable software team. A lot of the software development that happens in the RISC-V community happens on SiFive cores and SiFive silicon. Throughout our history, because we have been first with new products, new specifications, and new standards — because we lead — we have also put out developer boards that are compliant with the standard, so that there's actual silicon for people to write software against. Developing silicon is a big investment. Not everybody can do that. We're lucky enough to be in a commercial position where we can. So we put that out, again, to enable not just our own software development team but the broader software ecosystem to contribute. This is really where the community aspect of RISC-V comes in. It sets it apart and gives it a very sustainable differentiator.
Erik: Yeah, absolutely. Certainly, the strength of open. You also have this SiFive vectors solution. I'm curious how that compares to SiFive Intelligence, because they both seem oriented around enabling AI, ML, and computer vision applications. But then the RISC-V vector is a unique offering here. What is the solution around this?
Jack: That's a great question. Thanks for bringing that up. We should clarify our marketing a little bit. SiFive vectors — vectors is really a technology. Vector-style computing is not new. It was actually pioneered by Cray back in the supercomputer days. But it has come back in vogue with a lot of the workloads that exist today. Even ARM and Intel have adopted various types of vector-style computing. RISC-V vectors are kind of an enhancement of compute: you can have different vector lengths and different implementations. It's very well-suited for a lot of data crunching — say you're collecting sensor data. But it also lends itself very well to machine learning algorithms, AI, transformers, things of that nature. Those are where it's starting to see adoption.
Vectors is a technology. We then have different implementations, depending on the type of core, with 256-bit wide vectors, 512-bit wide vectors, and so on. The Intelligence product line that you mentioned is our product line with the wide vectors built in. People who license our Intelligence products will generally use them in some of those applications — maybe data center training, or AI offloads at the edge. If you want to do AI on the system — your smart cameras, for example — you use those products and just do the processing locally. That's where you would use the Intelligence products. They leverage the vector technology. Hopefully, that clears it up a little.
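To make "different vector widths, same software" concrete, here's a minimal pure-Python sketch (function names are ours) of the strip-mining pattern RISC-V's vector extension is built around: the loop asks how many elements can be processed per pass, so the same code runs unchanged on a 256-bit or 512-bit implementation.

```python
# Sketch of RISC-V-style vector "strip-mining". On real hardware the
# per-iteration count (vl) comes from a `vsetvli` instruction and the
# core's vector register width; here vlmax stands in for that width.
def vec_add(a, b, vlmax):
    """Add two equal-length lists, up to vlmax elements per pass."""
    out, i, n = [], 0, len(a)
    while i < n:
        vl = min(vlmax, n - i)  # hardware would set vl each iteration
        out.extend(x + y for x, y in zip(a[i:i+vl], b[i:i+vl]))
        i += vl
    return out

# The result is identical whatever the vector width happens to be —
# that's why one binary can span narrow and wide implementations.
data = list(range(10))
assert vec_add(data, data, vlmax=4) == vec_add(data, data, vlmax=8)
print(vec_add(data, data, vlmax=4))
```

The design point this illustrates: because the loop never hard-codes the vector width, the software ecosystem stays shared while each vendor picks the width that fits its power and area budget.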
Erik: Yeah, that's clear. Okay. Thanks. Maybe we can touch a little bit more on the markets and the use cases. Automotive is obviously a critical use case, because of the changes happening in the industry and the size of the industry; hence your dedicated product for automotive. Beyond automotive, you have your Performance line, your Intelligence line, and your Essential line. But what are the other industries you're particularly excited about at SiFive — ones with potential for adoption to scale up rapidly? Especially some of the more emerging ones, those that might be using ML on the edge and so forth.
Jack: Yeah, absolutely. Automotive we already covered. That's definitely one that's very exciting because of the changes happening in that ecosystem, the growth, and the number of companies that are participating in that space. The other one, of course, revolves around AI and what people are doing there. This is rapidly, rapidly changing.
One of the things we've seen over the last few years is that AI has changed so fast that if you don't have a programmable solution, it's very difficult. It's relatively straightforward to build hardwired hardware solutions that are really good at one particular function. But the industry is moving so fast, and the types of algorithms we need to run are changing so quickly, that none of those have been truly successful — by the time you're out in the market, everything has already moved. So having a programmable solution — which is what our vector technology enables, and what the Intelligence product line represents — is actually very attractive. People connect their accelerators to that programmable engine. By doing so, they get the benefit of their accelerators, but there's also this programmability that lets them adapt to new things like transformers, which are relatively new.
I think we're very excited about this. We've seen quite a bit of adoption on this front, both in what I would call the edge AI case — like the smart camera, where you're doing all the processing at the edge with a vector machine — and also in the data center. Google announced a while ago that they were using our Intelligence products along with the TPU in the data center. They connected their TPU accelerator to the vector core, and it made it much easier for them to program the system — to run pre-processing, post-processing, and offload certain things from their hardware engine onto the Intelligence core. It made for a very nice match. And that's in the data center. It's not the main data center server compute, but it's handling very important offload and acceleration tasks. Those are areas on the AI side that I'm quite excited about. I think everybody's excited about AI, because it's changing so quickly and there's so much opportunity there. But I think it's a technology that RISC-V is very well suited for. Again, if you're writing new software, you get to choose what ecosystem you want to write it on. In those cases, to me, it's a pretty easy choice: you choose the open ecosystem.
Now, if we talk about a market where there's lots of existing software, those are harder to change. You have all this existing software and ecosystem, and you think, why would I want to switch over? I'd have to redo all the software work. It's a lot. But we see that happening for RISC-V also. For example, one of the other markets we are very excited about is the mobile consumer market. If you haven't been following closely, Google has made a few announcements in the last few months about Android support coming for RISC-V: RISC-V will be a first-class citizen for Android. A couple of weeks ago, at the RISC-V Summit in Barcelona, Google was on stage talking about how the RISC-V NDK will be available by the end of this year, 2023. All of these things are being worked on. You can tell Google is very serious about Android on RISC-V.
I can't speak for them, because that's not my place. I can only point to the public information and ask you and the listeners to read the tea leaves. But Android is clearly coming for RISC-V. That opens up so many opportunities for the RISC-V products it would go into. So that's another area I'm quite excited about. When we started SiFive in 2015, we had to literally explain to every person we met what RISC-V was and why it mattered. If you'd asked me then whether I thought Android would come to RISC-V, I would have said maybe — but it might be the last thing to come, after everything else. It's happened much sooner than that, which just goes to show that the global phenomenon RISC-V tapped into was bigger than even we, the biggest proponents there could be, predicted. We all started the company for this, and we couldn't have predicted it back then.
Erik: What about LLMs? Because you have this situation right now where NVIDIA pretty much dominates the market, but you really have supply chain constraints that are limiting people's ability to actually build on this new technology. What would be a timeline for RISC-V becoming a viable competitor for NVIDIA in terms of enabling companies to build LLMs?
Jack: Yeah, I think getting into the prediction business is dangerous. I would say — I was at NVIDIA before I joined SiFive — to me, the strength they have is a programmable system. They have CUDA. People can program to their devices. They have the biggest, best programmable platform, and really, that's why people use it. As things continue to evolve and new models come up, people are going to look for additional ways, different ways, more efficient ways. The very nice part about this is it's not a done thing. We're at the very beginning stages of what AI, LLMs, and whatever comes next can do. So I think there are lots of software changes still happening.
Where there's compute, and where there's more compute, I think RISC-V has a very strong play. I don't view it as, oh, can you get there and just replace everything NVIDIA does. I think it's going to be: as new things get developed, some things are going to run better on NVIDIA GPUs, some things will probably run better on RISC-V vectors, and some things will run better on RISC-V vectors plus accelerators. It's going to be open competition for the combination of hardware and software together. That's really exciting, because I don't think anybody can predict it, other than that we know the amount of compute the world needs is going to massively increase. That's why everybody loves the chip companies and what we're doing right now in this space.
Erik: Yeah, great. Let's touch quickly on China. I'm sitting here in Shanghai. I've been here for about 12 years now, so it's kind of my second home. I know you're going to be visiting soon. China is a fascinating market, right? Because there's this enormous demand driven by e-commerce, by traditional industries, but also by the very rapid development of the EV market here. It's also now, for your industry, a more complicated market because of the tensions between the US and China. How does this look as a market, maybe for SiFive but also just for RISC-V adoption more generally?
Jack: I think certainly, from a semiconductor standpoint, nobody in the world can ignore China, right? If you look at the number of companies in China that are trying to build chips and develop products — you mentioned the EV companies. The number of EV companies developing autonomous driving solutions in China is mind boggling. I was there a few months ago, post-pandemic, and I got a refresh. If people haven't been there — you're there, so you understand — the number of electric vehicles, the energy level, the enthusiasm for building is just something you can't compare. The number of things being designed in China is just very high. So that naturally presents a huge opportunity in terms of design sockets and other things that require CPUs, including RISC-V.
But the other really important thing about RISC-V, as I mentioned earlier, is that it's an open standard. That's something very attractive to China, because they want to develop on an open standard. As they develop more software and more products, an open standard gives them a choice of different options for doing so.
SiFive is a US company. We're very careful and diligent in making sure we abide by all the regulations. Where there's business, we're happy to work with customers there. But customers view RISC-V as a way for them to develop more products within China in ways that will be less constrained. So there's a positive push towards RISC-V in China that is actually larger than in other regions of the world. In fact, the biggest RISC-V events have happened in China, with the largest numbers of attendees.
The other thing that's happened is that RISC-V is now the standard upon which computer architecture is taught at universities all around the world. So the number of engineers graduating who know RISC-V is increasing. But it's also increasing the most in China, because they have the most engineering and graduate students graduating each year. All of this is to say that I think RISC-V is very important for China and the China market. Actually, a lot of the open-source software work and some of the technology is being driven very much by companies local to China.
Erik: Okay. That makes complete sense. There's an enormous hunger here. I really encourage anybody who's listening who hasn't made it to China since 2018, 2019 to make a trip over. You might have come over for the automotive fair. I've talked to so many German executives who came over the first time in three or four years and were just stunned by what they saw at the automotive fair in terms of what's coming.
Jack: It was shocking. I was shocked too. I was just walking on the streets of Shanghai, and you're like, all these cars are brand new and they're all electric. You can tell because they've got the green license plates. All these are electric. They're all brand new. That's how I felt. These companies that you've maybe only partially heard of, they're all there, doing this stuff. They all sell their cars in open-air malls with fancy displays — for American listeners, like how Tesla does it. There are more car dealerships than Starbucks. It's crazy in that sense. It was definitely a post-pandemic surprise, quite shocking in a good way, I would say.
Erik: Yeah, well, I think in Q1 of this year, they surpassed Japan as the leading automotive exporter. We'll see if they maintain that for the remainder of the year, but that would be quite a shift over the past few years. Because only five years ago, China was viewed as a market for vehicles but basically unable to produce anything competitive. It was all state-owned enterprises in joint ventures with foreign automotive OEMs, basically just manufacturing centers, with nothing branded and nothing particularly innovative. So that's been a pretty rapid shift.
Jack: Absolutely. I think you see the global OEMs also reacting now as they see this happening. Part of this is that innovation in automotive is going to happen much faster than it has historically. Instead of introducing platforms every few years, you're going to see it happening sooner and sooner, maybe every year, primarily driven by a lot of the Chinese OEMs. So I think it's good for competition around the world. It's definitely good for the consumer, and good for semiconductor providers like ourselves, because it means there are more new designs.
Erik: Yeah, well, maybe this is a good place for me to ask my final question, which is: you're really at the nexus of a lot of the technological and economic shifts happening in the world right now. So what are you most excited about? If you look at the SiFive product roadmap, if you look at the market forces right now, what excites you?
Jack: When we started the company, there was this belief that, hey, this is pretty cool. It was like, well, can we convince everybody of that? Okay, we did that. Then, can we actually build the products that people care about? And we did that. Now I think that momentum is self-fulfilling. It's clear that RISC-V is going to be adopted everywhere, in all of these segments, all these new things that are happening. That's the most exciting thing. Actually, what keeps me up at night is our ability to execute on all these customer demands, because we get pulled in different directions. As we go into these different markets, we have to make sure we properly service our customers in each one so they can be successful, and then grow from there. That's the challenge. But the excitement is like, oh my goodness, I see a world where everything is RISC-V. That's pretty exciting for us.
Erik: Yeah, I know. You have a broad product portfolio. Pretty much any industry in the world could be your customer. Now it's about execution.
Jack: That's right. Execution. And being picky and choosy — making sure you prioritize the right ones. Because as much as you'd like to, you can't do everything right away; maybe everything over time. But that's kind of the path. That's the fun, right? At least, my day job is working with different customers, understanding what they're looking at, how they want to differentiate their product, and how we can help them. That's a really fulfilling part of what we do.
Erik: Yeah, great. Well, Jack, I think you've given me a good education. Hopefully, the audience feels the same. Anything that we didn't touch on that is important for folks to know? Any last thoughts?
Jack: No, I would just encourage anybody in the space to go read more and learn more about RISC-V. It will be pervasive, and you can get involved. There are lots of different ways: if you're a software developer, on the open-source software side; if you're a hardware developer, either with companies or with open source. If you find this exciting or interesting, you can get involved at an individual level, within your company, or otherwise. It's pretty easy to find the resources if you just search for SiFive or RISC-V.
Erik: Yeah, great. For the folks that are listening, RISC-V is at riscv.org, and SiFive is at SiFive.com. Jack, thanks so much for taking the time today. I really appreciate it.
Jack: Yeah, thank you for having me.