CLaSH: A functional hardware description language (haskell.org)
87 points by rwosync on April 15, 2014 | hide | past | favorite | 44 comments


I did my bachelor's project on translation between functional HDLs that were also reversible, i.e. only describing total, bijective functions. The idea of functional HDLs goes back to at least the 1970s, and the coolest language I found was 𝜇FP by Mary Sheeran. It was an algebraic VLSI design language where logical gates as combinators are the only language constructs.


That sounds cool. I don't know anything about HDLs, but can you explain why you might want your functions to be bijective? What does that give you in hardware?


Maybe for reversible computing? [0]

[0] https://en.wikipedia.org/wiki/Reversible_computing




I think we have a winner for the most poorly named programming language of the last decade. I thought that "Go" and "Hack" were bad, but at least I could type those on my keyboard.


Seriously. I spent more time trying to figure out how to read/say it than reading about it. C(lambda)aSH? clambdaash? Oh, CLaSH.


Clam-dash. It makes your shellfish go fast, or your car smell.


The command-line tool / url / package name / ... are all "CLaSH"


I think ClaSH is the proper romanization.


Squirrel is also high up on that list.


I am able to get about 4 decent/good results by googling "clash language". Though I got more results with "CλaSH language".


I'm a hardware engineer in this industry. I know Haskell, and I even built my site with it. But I do not understand why Haskell would be used for hardware design.

Designing the hardware is much more important than describing the hardware logic itself. IMO, Visio and Excel are the tools for designing hardware logic, not Verilog or something like CLaSH.

But if this kind of HDL can be used alongside Verilog, it might be helpful for building verification IPs.


Wait, so what are you using to describe the circuit on the logic or RTL level? Do you have VLSI engineers using Cadence or something? From my understanding, using Verilog in ASIC design (and not just verification) is pretty widespread in industry.


I think what he's saying is the architectural decisions outweigh the implementation language.

When I was taught Verilog for IP implementation, one thing I noticed is that people get caught in the trap of trying to abstract away the hardware or approach it from a higher level. Haskell/Verilog 2001/SystemVerilog all give us tools to do this. However, when trying to make real silicon, you need to understand what is actually getting built (i.e. know exactly how many flip flops you're creating and how they fan out) and then use the language to describe it. If you use a 'for' loop to try to do computation, as you might in a programming language, you could end up with something entirely unexpected or unsynthesizable.

Traditionally you first design your module conceptually on a whiteboard (or Excel, Visio, etc.), then implement it in an HDL. Because of the influx of software engineers trying to get into hardware (via FPGAs, etc.), there has been a trend toward trying to abstract away the details of the implementation, and this can cause a lot of confusion.

That said, I've heard of projects that already translate native Haskell to HDL with some success. I'm not a programmer so I don't claim to understand if it's a good idea, but I still think understanding exactly what's being output is important to knowing if it can perform in a reasonable way, especially if you're doing something of any complexity.


FWIW, it is quite easy to write Verilog code that ends up being unsynthesizable, since the language wasn't originally designed to be an HDL. Many of the alternative HDLs, such as UC Berkeley's Chisel (https://chisel.eecs.berkeley.edu/) are designed with the express goal of making it impossible (or at least quite difficult) to write unsynthesizable code.

Also, though figuring out what Verilog to write is not difficult if you've properly thought out the microarchitecture, it can be rather tedious and error-prone to actually write it. I'm not sure how CLaSH works, but Chisel allows you to essentially script generation of hardware using Scala. This removes some of the tedium of writing Verilog and also encourages code reuse (for instance, by allowing you to generate a 32-bit adder and an 8-bit adder using the same code but with different parameters).
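For a flavor of what "scripting generation of hardware" can look like, here is a hypothetical Haskell sketch that emits Verilog text for an n-bit adder. All the module and port names are invented, and real generators like Chisel build a circuit graph rather than templating strings; the point is only that one parameterized function yields the 32-bit and 8-bit adders from the same code.

```haskell
-- Hypothetical sketch: Haskell as a generator layer that emits
-- Verilog for an n-bit adder. One function, any width. (Names are
-- invented; tools like Chisel construct a circuit graph instead of
-- concatenating strings.)
adderVerilog :: Int -> String
adderVerilog n = unlines
  [ "module adder_" ++ show n
  , "  ( input  [" ++ show (n - 1) ++ ":0] a"
  , "  , input  [" ++ show (n - 1) ++ ":0] b"
  , "  , output [" ++ show n ++ ":0] sum );"
  , "  assign sum = {1'b0, a} + {1'b0, b};"
  , "endmodule"
  ]
```

Running `putStr (adderVerilog 8)` prints the generated 8-bit module; `adderVerilog 32` gives the 32-bit one from the same source.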


Thank you for explaining my thought. I'm not good at English.

In my experience, it is quite easy to describe the hardware logic if the architecture is designed well. So what I mean by "Visio and Excel are much more important" is that the architecture should be concise and cycle-accurate. Then the Verilog coding is just a piece of cake.


The problem is that despite the Verilog being relatively easy, it's still incredibly tedious and error prone.

It's amazing how Verilog manages to be too low level and too high level at the same time. It's a simulation language not originally intended for synthesis, so it doesn't have access to hardware primitives, and requires you to write specific patterns to ensure they're inferred correctly. But at the same time, it's too low level to even allow you to abstract those patterns.


The need to know exactly what is being built is not completely incompatible with the notion of abstraction. Sure, trying to apply software ideas to hardware with no understanding is a recipe for disaster, but that's not what people are suggesting. The goal is to recognise and abstract patterns in hardware design.

Your example of for loops being fragile is actually a good argument for higher level abstractions: maps and folds are much better tools for working with hardware, since they constrain you to a specific hardware layout, and make it clear what's happening.
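To make that concrete, here is a plain-Haskell sketch (not actual CLaSH code) of a ripple-carry adder written as a fold: `mapAccumL` pins the design to exactly one full-adder cell per bit pair, with the carry threading from one cell to the next, so nothing "loopy" is left for the tools to guess at.

```haskell
import Data.List (mapAccumL)

-- One-bit full adder: carry-in plus two operand bits in,
-- (carry-out, sum bit) out. A single, fixed cluster of gates.
fullAdder :: Bool -> (Bool, Bool) -> (Bool, Bool)
fullAdder cin (a, b) = (cout, s)
  where
    s    = (a /= b) /= cin                 -- XOR chain
    cout = (a && b) || (cin && (a /= b))

-- Ripple-carry adder, LSB first: folding the cell over the bit
-- pairs fixes the layout to exactly n full adders.
rippleAdd :: [Bool] -> [Bool] -> (Bool, [Bool])
rippleAdd as bs = mapAccumL fullAdder False (zip as bs)
```

For example, adding 3 and 5 as 4-bit values (LSB first) yields 8 with no carry-out.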

>Traditionally you first design your module conceptually on a whiteboard (or Excel, Viso, etc.), then implement it in an HDL.

So wouldn't it be nice if the language you used could express the same concepts you use in your higher level diagrams?


How is this any different from compiling C to assembly? Why would higher level languages create unsynthesizable circuits? You trust the C compiler to create the proper instructions for your target architecture then I don't see why the same can't be done with a Haskell DSL that compiles to Verilog.


In several HDLs, a "bit" can take far more than two values: VHDL's std_logic has nine ('U', 'X', '0', '1', 'Z', 'W', 'L', 'H' and '-'; the weak 'L' and 'H' really are distinct from the strong '0' and '1'), and Verilog's four-state logic adds 'x' and 'z' on top of 0 and 1.
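As a concrete (and simplified) model of those multi-valued bits, here is a Haskell sketch of std_logic; the real IEEE 1164 tables are more careful about distinguishing 'U' from 'X' during propagation:

```haskell
-- A Haskell model of VHDL's nine-valued std_logic (a sketch, not
-- the full IEEE 1164 resolution tables).
data StdLogic
  = U        -- uninitialized
  | X        -- strong unknown
  | Zero     -- strong 0  ('0')
  | One      -- strong 1  ('1')
  | Z        -- high impedance
  | W        -- weak unknown
  | L        -- weak 0
  | H        -- weak 1
  | DontCare -- '-'
  deriving (Show, Eq, Enum, Bounded)

-- Collapse weak and exotic values to 0, 1 or unknown, roughly like
-- the standard's to_x01 conversion.
toX01 :: StdLogic -> StdLogic
toX01 Zero = Zero
toX01 L    = Zero
toX01 One  = One
toX01 H    = One
toX01 _    = X

-- An AND gate under this model: simulation must propagate unknowns,
-- while synthesized hardware only ever sees 0s and 1s. This gap is
-- how a design can simulate "fine" and still fail on silicon.
andSL :: StdLogic -> StdLogic -> StdLogic
andSL a b = case (toX01 a, toX01 b) of
  (Zero, _)  -> Zero
  (_, Zero)  -> Zero
  (One, One) -> One
  _          -> X
```

Note that `andSL Zero U` is a definite `Zero` (a 0 input dominates an AND gate), but `andSL One U` stays unknown.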

My experience with Verilog is that it's very easy to write things which look fine and simulate fine, then fail in hardware; the semantics of the language are just wrong.


Higher level languages inevitably come with built in semantics that the programmer takes for granted, but can't be synthesized directly to hardware. In C, it's the function call stack. In Haskell it's higher order types and recursive data structures (and more). You could in theory create some runtime package that you'd compile to hardware for your program to be synthesized to, or "run on".....but then you'd just be making a straight up computer, wouldn't you. ;-)


None of these things are relevant in most HDLs implemented as DSLs in high level languages. The point of most of the HDL work in Haskell (for example Lava, and Bluespec) is to provide primitives to talk about hardware and to use a sane language as a way to manipulate them to build larger specifications. It is embarrassing that people use tools that allow you to write un-synthesizable code.


A computer is a much simpler abstraction and much less leaky than a circuit model.

Yes, in theory a computer could take a high-level description of a circuit and turn it into a very efficient hardware implementation. In practice our computers are not good enough, the same way they were not good enough for compiling high-level languages in the '70s, when people wrote assembly by hand.


There's already a commercial functional hardware description language called Bluespec. http://en.wikipedia.org/wiki/Bluespec,_Inc.


Bluespec has a strange story: initially it was developed like Clash, as a subset of Haskell that could be translated to an HDL. However, its users hated the Haskell syntax, so the Bluespec designers added a layer of syntax on top to make it look more like a traditional HDL, and they also downplayed the powerful type system to aid adoption.

Since then, Bluespec has not become too popular, but Haskell has. Clash may have a chance.


I work with Bluespec. It's miles better than Verilog, but coming from a Haskell background, I can't help but feel disappointed with the limitations their alternative syntax puts on my ability to abstract things. (No lambdas, no do-notation, needlessly verbose sum types)

I'm also not really convinced by their hardware model (Guarded Atomic Actions). In practice, I've found that it leads to very disjointed flow control, and forbids what seem like intuitive designs.


In a former life (a long time ago), I used ELLA for designing processors. It was a pretty decent functional language. Great at abstracting components or subsystems and infinitely better than modelling stuff with C.

IIRC European Silicon Structures used to offer it as a part of their toolset when VLSI design was young and hot.

https://en.wikipedia.org/wiki/ELLA_%28programming_language%2...


Is this meant for hardware people or software people? Because I'm a hardware guy and I don't know Haskell. None of my EE friends know Haskell either. I really don't see a serious hardware engineer using this over VHDL or Verilog, even if it is more beautiful or provably better or whatever.


Strangely enough, alternative HDLs embedded in Haskell seem to be a thing. Besides this, there is also Bluespec and Lava.

http://wiki.bluespec.com/

http://raintown.org/lava/


Tom Hawkins is probably around here somewhere.

http://en.wikipedia.org/wiki/Atom_(programming_language)


Immutability looks too much like hardware.


For such a mission-critical application (hardware design), my mind is blown by how laughably bad the HDLs are compared to modern programming languages. Chunks of the languages aren't synthesizable, and which bits are/aren't isn't consistent across vendors. There are also IEEE standards that nobody reeeally seems to care about too much. Additionally, it's not clear, from either a syntax or a types point of view, whether the code you're writing is synthesizable. You just have to "know" ahead of time. It's terrible! It makes this whole field really obtuse and hard to enter from a design standpoint.

The field really needs some fresh blood or dare I say "disruption". I'm glad people are trying to make things better.


From the limited amount of VHDL I did in my Digital Design class at my university, I found hardware description similar to functional programming. State is expensive to represent in hardware (you need latches, flip-flops, etc), and so VHDL's concept of signals is similar to constants, and components map well to pure functions. With an intelligent enough compiler, I think this is totally feasible.
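To illustrate that mapping, here is a toy plain-Haskell model (CLaSH's actual Signal type is morally this, with clock-domain bookkeeping on top): a signal is the infinite list of its per-cycle values, a register is a one-cycle delay, and a counter is pure feedback with no mutable state anywhere.

```haskell
-- A toy model of a clocked signal as the infinite list of its
-- per-cycle values. (Not the real CLaSH API, just the idea.)
type Signal a = [a]

-- A register delays its input by one cycle, emitting its reset
-- value on the first cycle.
register :: a -> Signal a -> Signal a
register rst xs = rst : xs

-- A free-running counter: the output feeds back through an
-- incrementer into the register. Laziness ties the knot.
counter :: Signal Int
counter = let out = register 0 (map (+ 1) out) in out

-- Simulate a circuit for n cycles.
sample :: Int -> Signal a -> [a]
sample = take
```

`sample 5 counter` simulates five clock cycles and produces `[0,1,2,3,4]`.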

From a more short-term practical standpoint, no, I don't expect anyone to use this. Hardware engineers are incredibly stubborn when it comes to software. Their work typically involves large time investments with lots of costs and risks. For better or worse, they typically don't ever want to add more risk by using an "untrusted" tool, creating a chicken-and-egg problem.

In the second paragraph, the author admits that there's not yet a good way to represent a recursive algorithm in his language. I think this is more of a proof-of-concept that could eventually become useful.


If the finance industry (which is similarly large-investment and high-risk) can begin to use functional programming for critical applications, then I can see hardware moving towards it as well. If tools like Clash can be made to emit human-readable VHDL, I can see that as a path forward towards adoption.


> If the finance industry (which is similarly large-investment and high-risk) can begin to use functional programming for critical applications then I can see hardware moving towards it as well

I don't think these things are at all comparable. The finance industry is writing code to do software things. The hardware industry is writing code to build hardware.

It is not just a matter of how much money and risk is involved, it's a matter of whether the language is a good mapping for the things it is describing.


The finance industry is writing code for hardware to do financial things. http://www.pcmag.com/article2/0,2817,2424495,00.asp


About EE conservatism in regards to software tools, this is mainly because we are highly restricted in what tools we can use by the platforms we have to design for. If you want to use a specific FPGA or a specific ASIC process, you have to use the tools that the FPGA vendor or ASIC foundry officially supports.


I totally agree and sympathize here. I've seen a few solutions that compile to VHDL or Verilog, as most platforms support one of those, to work around this.


That is what UC Berkeley's Chisel (https://chisel.eecs.berkeley.edu/) does, and it seems to be what CLaSH does as well. The issue is the same as in the various compile-to-JS languages. The additional level of indirection can make things difficult to debug (and heaven knows HDL code is hard to debug already).

Also, there is the issue of interfacing with third-party IP blocks or builtin FPGA hardware slices. You either have to stub these out in your high-level description or simulate using the generated HDL.


without having looked at it in much detail, does anyone know how/whether this is related at all to bluespec (http://en.wikipedia.org/wiki/Bluespec,_Inc.)?


It's not related to bluespec. (cf my other comment in this thread)


Interesting language, I'll be interested to see how Clash develops.


Clash? C-lambda-ash? C-lambda-a-ess-aitch? C-lash?



