
I really worry about a startup taking a massive bet on their own custom hardware now in 2022. The world was much, much different in December 2019 when Oxide started than it is now. Let's hope the investment cash keeps flowing and the hardware gets to the folks who purchased it.


Right now is, in fact, the best time to be betting on custom hardware.

Moore's Law has been dead for a while. Getting "performance" now requires design and architecture again rather than just sitting back for 18 months and letting Moore's Law kill your competitor.

The big problem right now is that custom chip hardware is still stupidly expensive because of EDA software. Fab runs are sub-$50K, but EDA software is north of $100K per seat and goes up rapidly from there.


Do you really need proprietary EDA tools to get started on designing custom chips? Higher-level design languages like Chisel are showing a lot of potential right now, with full CPU cores being designed entirely in such languages. Of course EDA will be needed once the high-level design has to be ported to any specific hardware-fabbing process, but that step should still be relatively simple since most potential defects in the high-level design will have been shaken out by then.


> Do you really need proprietary EDA tools to get started on designing custom chips?

Yes, actually, you do.

The "interesting" bits in chip design aren't the digital parts--the interesting bits are all analog.

A RISC core is an undergraduate exercise in digital design and synthesis in any HDL--even just straight Verilog or VHDL. It's a boring exercise for anyone with a bit of industry experience as we have infinite and cheap digital transistors. (This is part of the reason I regard RISC-V as a bit interesting but not that exciting. It's fine, but the "RISC" part isn't where we needed innovation and standardization--we needed that in the peripherals.)
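To the point about "straight Verilog": the synthesizable digital side really is routine. A toy ALU slice, the kind of block that makes up such an exercise, looks like this (module name, width, and opcode encoding are made up for illustration, not from any real core):

```verilog
// Toy combinational ALU -- the "boring" digital part of a RISC datapath.
// Everything here synthesizes to cheap, abundant digital transistors;
// none of the genuinely hard (analog) problems show up at this level.
module toy_alu #(parameter W = 32) (
    input  wire [W-1:0] a,
    input  wire [W-1:0] b,
    input  wire [2:0]   op,   // made-up 3-bit operation select
    output reg  [W-1:0] y
);
    always @* begin
        case (op)
            3'b000:  y = a + b;        // add
            3'b001:  y = a - b;        // subtract
            3'b010:  y = a & b;        // and
            3'b011:  y = a | b;        // or
            3'b100:  y = a ^ b;        // xor
            3'b101:  y = a << b[4:0];  // shift left
            3'b110:  y = a >> b[4:0];  // shift right (logical)
            default: y = {W{1'b0}};
        endcase
    end
endmodule
```

Wrap a register file, a program counter, and an instruction decoder around a block like this and you have the skeleton of a single-cycle RISC core, which is exactly why it works as an undergraduate assignment.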

However, the interfaces are where things break down. Most communication is now wireless (WiFi, BLE, NB-IoT) and that's all RF (radio frequency) analog. Interfacing generally requires analog-to-digital and digital-to-analog converters (ADCs and DACs) and those are, obviously, analog. Even high-speed serial stuff requires signal integrity and termination systems--all of that requires parasitic extraction for modeling--yet more analog. And MEMS are even worse, as they require mechanical modeling inside your analog simulation.

If your system needs to run on a coin cell battery, that's genuinely low power, and you end up optimizing even the digital bits in the analog domain to cut energy consumption. This means that nominally "digital" blocks like clocks and clock trees now become tradeoffs in the analog space. How does your debug unit work when the chip is asleep? Most vendors just punt and power the whole chip on during debugging, but that screws up your ability to take power measurements. And many of your purely digital blocks now have "power on/power off" behavior that you need to model when your chip switches from active to sleep to hibernate.

All this is why I roll my eyes every time some group launches "design initiatives" for "digital" VLSI design--"digital" VLSI has been "mostly solved" for years (what the people behind these initiatives are really complaining about is that good VLSI designers are expensive, not that digital VLSI design is difficult). The hard part is analog design (even, and especially, for high-performance digital), with simulation modeling and parasitic extraction being the blockers. Until one of these "design initiatives" attacks analog parasitic extraction and modeling, they're just hot air. (Of course, you can turn that statement around: anyone attacking analog parasitic extraction is VERY serious and VERY interesting.)


> It's a boring exercise for anyone with a bit of industry experience as we have infinite and cheap digital transistors.

Having "infinite and cheap" transistors is what makes hardware design not boring. It means designs in the digital domain are now just as complex as the largest software systems we work with, while still being mission-critical for obvious reasons (if the floating point division unit you etched into your latest batch of chips is buggy and getting totally wrong results, you can't exactly ship a software bugfix to billions of chips in the field). This is exactly where we would expect shifting to higher-level languages to be quite worthwhile. Simple RISC cores are neither here nor there; practical multicore, superscalar, vector, DSP, AI etc. etc. is going to be a lot more complex than that.

Complicated analog stuff can hopefully be abstracted out as self-contained modules shipped as 'IP blocks', including the ADC and DAC components.


Why? If anything, commodity/non-custom hardware is what's hurting right now. Fat margins on hardware imply a kind of inherent flexibility that can be used to weather even extreme shocks.


There are plenty of commodity chips that go into a full server rack. If any little power regulator, etc. is backordered for months or years, that's just more unexpected pain. And that's before we even get to entire factories shutting down--just look at what's happening to Apple and Foxconn, of all companies, in Shenzhen this week. If the big players are struggling, the small fries are in for pain too.


The supply chain crisis is very, very real, but we are blessed with absolutely terrific operations folks coming from a wide range of industrial backgrounds (e.g., Apple, Lenovo, GE, P&G). They have pulled absolute supply chain miracles (knocking loudly on wood!) -- but we have also had the luxury of relatively small quantities (we're not buying millions of anything) and new design, where we can factor in lead times.

tl;dr: Smaller players are able to do things that larger players can't -- which isn't to minimize how challenging it currently is!


Just curious, are you all working out of the same place or all remote? Curious about hardware startups and how that works. Thanks


We have an office, but many people aren't in the Bay Area (myself included). Not everyone is doing hardware, and some folks who do have nice home setups they enjoy working with. It's a spectrum, basically.


Thanks steve



