> Do you really need proprietary EDA tools to get started on designing custom chips?
Yes, actually, you do.
The "interesting" bits in chip design aren't the digital parts--the interesting bits are all analog.
A RISC core is an undergraduate exercise in digital design and synthesis in any HDL--even just straight Verilog or VHDL. It's a boring exercise for anyone with a bit of industry experience as we have infinite and cheap digital transistors. (This is part of the reason I regard RISC-V as a bit interesting but not that exciting. It's fine, but the "RISC" part isn't where we needed innovation and standardization--we needed that in the peripherals.)
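For concreteness, here's the sort of "straight Verilog" this means--a minimal sketch of a purely combinational ALU slice (the module and signal names are made up for illustration, not taken from any real core):

```verilog
// Minimal sketch of routine "purely digital" RTL: a toy combinational
// ALU slice. Names and the opcode encoding are illustrative only.
module toy_alu #(
    parameter WIDTH = 8
) (
    input  wire [WIDTH-1:0] a,
    input  wire [WIDTH-1:0] b,
    input  wire [1:0]       op,     // 00 add, 01 sub, 10 and, 11 or
    output reg  [WIDTH-1:0] result
);
    always @(*) begin
        case (op)
            2'b00:   result = a + b;
            2'b01:   result = a - b;
            2'b10:   result = a & b;
            default: result = a | b;
        endcase
    end
endmodule
```

Synthesis turns blocks like this into gates without drama; none of the hard problems below live here.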
However, the interfaces are where things break down. Most communication is now wireless (WiFi, BLE, NB-IoT) and that's all RF (radio frequency) analog. Interfacing generally requires data converters (ADCs and DACs) and those are, obviously, analog. Even high-speed serial stuff requires signal integrity and termination work--all of that requires parasitic extraction for modeling--yet more analog. And MEMS are even worse, as they require mechanical modeling inside your analog simulation.
If your system needs to run on a coin cell battery, that's genuinely low power and you are optimizing even the digital bits in the analog domain in order to cut your energy consumption. This means that nominally "digital" blocks like clocks and clock trees now become tradeoffs in the analog space. How does your debug unit work when the chip is asleep? Most vendors just punt and power the whole chip up when debugging, but that screws up your ability to take power measurements. And many of your purely digital blocks now have "power on/power off" behavior that you need to model when your chip switches from active to sleep to hibernate.
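As a rough behavioral sketch of that "power on/power off" behavior (purely illustrative--the states, names and sequencing here are assumptions, not any vendor's power controller):

```verilog
// Toy power-state controller: gates a core clock enable and an isolation
// enable as the chip moves between active, sleep and hibernate.
// State encoding, ports and names are invented for illustration.
module toy_power_ctrl (
    input  wire clk,
    input  wire rst_n,
    input  wire req_sleep,
    input  wire req_hibernate,
    input  wire wake_event,
    output reg  core_clk_en,   // clock gate for the "digital" core
    output reg  iso_en         // isolate outputs of powered-down blocks
);
    localparam ACTIVE = 2'd0, SLEEP = 2'd1, HIBERNATE = 2'd2;
    reg [1:0] state;

    always @(posedge clk or negedge rst_n) begin
        if (!rst_n) state <= ACTIVE;
        else case (state)
            ACTIVE:    state <= req_hibernate ? HIBERNATE :
                                req_sleep     ? SLEEP     : ACTIVE;
            SLEEP,
            HIBERNATE: state <= wake_event    ? ACTIVE    : state;
            default:   state <= ACTIVE;
        endcase
    end

    always @(*) begin
        core_clk_en = (state == ACTIVE);
        iso_en      = (state != ACTIVE);  // real designs sequence this,
                                          // clamp values, retain state, etc.
    end
endmodule
```

And that RTL view is the easy half: it says nothing about isolation clamp values, retention registers, rush current or supply ramp times, which is exactly where the analog modeling comes back in.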
All this is why I roll my eyes every time some group launches "design initiatives" for "digital" VLSI design--"digital" VLSI has been "mostly solved" for years (what the people behind these initiatives are really complaining about is that good VLSI designers are expensive, not that digital VLSI design is difficult). The key point is that analog design (even and especially for high-performance digital) is where the blockers are: simulation modeling and parasitic extraction. Until one of these "design initiatives" attacks analog parasitic extraction and modeling, they're just hot air. (Of course, you can turn that statement around: someone attacking analog parasitic extraction is VERY serious and VERY interesting.)
> It's a boring exercise for anyone with a bit of industry experience as we have infinite and cheap digital transistors.
Having "infinite and cheap" transistors is what makes hardware design not boring. It means designs in the digital domain are now just as complex as the largest software systems we work with, while still being mission-critical for obvious reasons (if the floating point division unit you etched into your latest batch of chips is buggy and getting totally wrong results, you can't exactly ship a software bugfix to billions of chips in the field). This is exactly where we would expect shifting to higher-level languages to be quite worthwhile. Simple RISC cores are neither here nor there; practical multicore, superscalar, vector, DSP, AI etc. etc. is going to be a lot more complex than that.
Complicated analog stuff can hopefully be abstracted out as self-contained modules shipped as 'IP blocks', including the ADC and DAC components.
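A minimal sketch of what that abstraction looks like from the digital side, assuming a hypothetical 12-bit ADC macro (vendor_adc_12b and all of its ports are invented names, and the behavioral stand-in is only for simulation):

```verilog
// Behavioral stand-in for an analog ADC macro, for digital simulation only.
// The real thing is a hard macro with its own extracted/characterized views.
module vendor_adc_12b (
    input  wire        clk,
    input  wire        rst_n,
    input  wire        start_conv,
    output reg  [11:0] dout,
    output reg         dout_valid
);
    always @(posedge clk or negedge rst_n) begin
        if (!rst_n) begin
            dout       <= 12'd0;
            dout_valid <= 1'b0;
        end else begin
            dout_valid <= start_conv;
            if (start_conv)
                dout <= dout + 12'd1;   // fake ramp instead of a real conversion
        end
    end
endmodule

// The purely digital integration: wire the macro into the SoC fabric.
// The analog input pin is handled at the pad/macro level, outside this view.
module sensor_frontend (
    input  wire        clk,
    input  wire        rst_n,
    input  wire        adc_start,
    output wire [11:0] sample,
    output wire        sample_valid
);
    vendor_adc_12b u_adc (
        .clk        (clk),
        .rst_n      (rst_n),
        .start_conv (adc_start),
        .dout       (sample),
        .dout_valid (sample_valid)
    );
endmodule
```

From the RTL side that really is all there is; the converter design itself, its parasitic extraction and its per-process characterization all live inside the macro.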