
Is FPGA code called firmware?

Started by Marko February 20, 2006
fpga_toys@yahoo.com wrote:
> Isaac Bosompem wrote:
> > For me the biggest hurdle of learning to utilize VHDL was programming
> > my brain to not think of it as a programming language. Then everything
> > began to fall into place.
>
> Interesting discussion. In a prior discussion regarding "programming"
> or "designing" with C-syntax HLLs or HDLs, it was interesting how many
> people took up arms that they could do everything in VHDL or Verilog
> that could be done with a C-based FPGA design language such as
> Celoxica's Handel-C, Impulse-C, FpgaC or similar tools. The argument
> was that VHDL/Verilog really isn't any different than C-based HLL/HDLs
> for FPGA design, frequently with the assertion that VHDL/Verilog
> was better.
C is certainly linked fairly closely to VHDL/Verilog. But there are a few key differences that I had to consider when learning HDLs to truly understand what was going on. For example, the non-blocking signal assignments in a clocked sequential process in VHDL: I originally assumed that, as in software, a signal assignment would take effect immediately after the line executed, but I was wrong. A few minutes playing around with ModelSim revealed that the new value only appears on the following clock pulse (when the flip-flops sample the data input). So there was a bit of a retraining process even though the syntax was somewhat familiar.
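As a minimal sketch of the behaviour described above (the entity and signal names here are invented for illustration, not taken from the post): in the clocked process below, q_dly is assigned from the internal signal q_i in the same process, but because VHDL signal assignments only take effect when the process suspends, the assignment to q_dly still sees the *old* value of q_i, so q_dly lags by one clock, which is exactly what shows up in a ModelSim waveform.

    library ieee;
    use ieee.std_logic_1164.all;

    entity delay_example is
      port (
        clk   : in  std_logic;
        d     : in  std_logic;
        q     : out std_logic;
        q_dly : out std_logic
      );
    end entity;

    architecture rtl of delay_example is
      signal q_i : std_logic := '0';
    begin
      process (clk)
      begin
        if rising_edge(clk) then
          q_i   <= d;    -- new value not visible until the process suspends
          q_dly <= q_i;  -- reads the old q_i, so q_dly trails q by one clock
        end if;
      end process;
      q <= q_i;
    end architecture;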
> So is an FPGA design in VHDL/Verilog hardware, and the same realized
> equivalent gates written in Celoxica's Handel-C software just because of
> the choice of language? Or is a VHDL/Verilog design that is the same
> as a Handel-C design software?
This is a fairly tough question; we wouldn't be discussing it if it were something we could all agree on. I believe that both are hardware, and I will explain my reasoning. FpgaC, for example, is a totally different ball game from VHDL/Verilog, but both ultimately result in a piece of hardware at the output. FpgaC (from the example posted on the TMCC website at U of Toronto, where I happen to live :) ) completely hides the hardware elements from the designer, allowing them to give a software-like *DESCRIPTION* (key word) of the hardware. What you get is ultimately hardware that implements your "program". VHDL/Verilog, on the other hand, do hide most of the grunt work of digital design, but some things are still left over, like what I pointed out above about the non-blocking signal assignments.

We have always progressed towards abstraction in the software world; similar pushes have been made in the hardware world with EDA and CAD software packages like MATLAB, which automate most of the grunt work. Perhaps program-like HDLs are the new progression. All I can say, though, is that only time will tell. It depends on how well compilers like FpgaC can convert a program into a hardware description, and also how well they can extract and find opportunities for concurrency.

-Isaac
Sorry I have a bad habit of not reading through my replies. I am using
Google so please spare me :)

I meant "Perhaps programs like FPGAC are the new progression"

On Wed, 22 Feb 2006 11:47:56 +1300, Jim Granville
<no.spam@designtools.co.nz> wrote:

...
> There is another thread, where this actually matters from a medical
> systems/regulatory basis.
>
> Since you must have SOFTWARE to create the bitstream, then the
> admin has to include software-handling discipline.
Does this include almost all ASIC design, where synthesis SOFTWARE is still used to generate gates from RTL, SOFTWARE is used to place those gates, and SOFTWARE is used to route the connections (not to mention sw to run LVS/DRC etc.)?

It must also include even any schematic-entry based ASICs, because SOFTWARE is used to enter/netlist all the schematics.

Forcing software-handling discipline whenever software is in the path is not an easy requirement, in my opinion, unless you want to go back to paper napkin diagrams and tape over transparencies.
mk wrote:
> Forcing software-handling discipline whenever software is in the path is
> not an easy requirement, in my opinion, unless you want to go back to
> paper napkin diagrams and tape over transparencies.
Forcing software handling discipline on software teams isn't easy either.
mk wrote:

> On Wed, 22 Feb 2006 11:47:56 +1300, Jim Granville
> <no.spam@designtools.co.nz> wrote:
>
> ...
>
>> There is another thread, where this actually matters from a medical
>> systems/regulatory basis.
>>
>> Since you must have SOFTWARE to create the bitstream, then the
>> admin has to include software-handling discipline.
>
> Does this include almost all ASIC design, where synthesis SOFTWARE is
> still used to generate gates from RTL, SOFTWARE is used to place those
> gates, and SOFTWARE is used to route the connections (not to mention
> sw to run LVS/DRC etc.)?
>
> It must also include even any schematic-entry based ASICs, because
> SOFTWARE is used to enter/netlist all the schematics.
>
> Forcing software-handling discipline whenever software is in the path is
> not an easy requirement, in my opinion, unless you want to go back to
> paper napkin diagrams and tape over transparencies.
"software-handling discipline" relates to the tools, as much as your own code. It is fairly common practice to archive the tools, when a design is passed to production, and then ALL MAINT changes are done with those tools. So just because what you ship the customer might look like HW, you still have to do risk-reduction in house. For a live, and classic, example look at the ISE v8 release. Some of the flaws that shipped in this, are frankly amazing, and one wonders just what regression testing was done.... -jg
Isaac Bosompem wrote:
> We have always progressed towards abstraction in the software world;
> similar pushes have been made in the hardware world with EDA and CAD
> software packages like MATLAB, which automate most of the grunt work.
> Perhaps program-like HDLs are the new progression.
Actually, VHDL stuck its toes into this some 20 years back. By 1993 the 1076.2 Standard Mathematical Package was part of the standards process, then the 1076.3 Numeric Standard, not long later IEEE 1076.3/floating point, and then discussions about supporting sparse arrays and other very high level concepts for pure mathematical processing rather than hardware logic from a traditional viewpoint. Interest in C-based HDL/HLLs for hardware design predates even Dave's TMCC work, which is itself over a decade old. So I don't think it's all that new; rather, it started with sequential high-level syntax, and automatic arithmetic/boolean expression processing was added to VHDL.

When computers were expensive in the 1960s and 1970s, we traded design labor for microcode and assembly language designs (frequently done by EEs). As computers dropped drastically in price, that practice rapidly became not cost effective and was almost completely replaced with higher and higher levels of abstract language compilers to improve design productivity, traded off against inexpensive computer cycles.

We see the same process with logic "hardware logic simulators", AKA FPGAs, which have dropped rapidly in price, allowing huge designs to be implemented on them that are no longer cost effective in schematic form. And we are seeing even larger designs implemented that are not even cost effective to design at the gate level using first-generation HDLs that let the designer waste design labor on detailed gate-level design. Hardware development with second- and third-generation description languages is likely to follow the software model of using higher degrees of abstraction, specifically to prevent designers from obsessing over a few gates and in the process creating non-verifiably correct designs which may break when ported to the next-generation FPGA or logic platform.
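As a small illustration of that arithmetic abstraction (a sketch only; the entity and names are invented here, not taken from the post): with the 1076.3 numeric package a designer writes an addition as an expression and lets synthesis build the adder, rather than wiring up full adders by hand.

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;   -- IEEE 1076.3 numeric package

    entity acc16 is
      port (
        clk : in  std_logic;
        d   : in  unsigned(15 downto 0);
        sum : out unsigned(15 downto 0)
      );
    end entity;

    architecture rtl of acc16 is
      signal acc : unsigned(15 downto 0) := (others => '0');
    begin
      process (clk)
      begin
        if rising_edge(clk) then
          acc <= acc + d;   -- synthesis infers the 16-bit adder; no gate-level work
        end if;
      end process;
      sum <= acc;
    end architecture;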
> All I can say, though, is that only time will tell. It depends on how
> well compilers like FpgaC can convert a program into a hardware
> description, and also how well they can extract and find opportunities
> for concurrency.
FpgaC/TMCC has a number of things that are less than optimal, but it rests on a process of expressing all aspects of the circuit as boolean expressions, then aggressively optimizing that netlist. The results are surprising to some, but hey, it's really not new, as VHDL covers nearly the same high-level language syntax for synthesis too. I think what is surprising to some is that low-level software design is long gone, and low-level hardware design is soon to be long gone, for all the same reasons of labor cost vs. hardware cost.
Jim Granville wrote:
> For a live, and classic, example look at the ISE v8 release.
> Some of the flaws that shipped in it are frankly amazing,
> and one wonders just what regression testing was done....
Or, more importantly, why the designs of the select beta-list developers didn't stumble into the same problems. In large-software land, alpha and beta pre-release cycles are the critical part of not slamming your complete customer base with critical bugs. Alpha and beta testers willing to do early-adoption testing are probably one of the most prized assets, and most carefully access-controlled resources, that any software vendor can develop. And for that privilege, and to build that relationship, it's frequently necessary to give your product away to those early adopters long term .... both the betas and the clean releases that follow.

fpga_toys@yahoo.com wrote:
> Isaac Bosompem wrote:
> > For me the biggest hurdle of learning to utilize VHDL was programming
> > my brain to not think of it as a programming language. Then everything
> > began to fall into place.
>
> Interesting discussion. In a prior discussion regarding "programming"
> or "designing" with C-syntax HLLs or HDLs, it was interesting how many
> people took up arms that they could do everything in VHDL or Verilog
> that could be done with a C-based FPGA design language such as
> Celoxica's Handel-C, Impulse-C, FpgaC or similar tools. The argument
> was that VHDL/Verilog really isn't any different than C-based HLL/HDLs
> for FPGA design, frequently with the assertion that VHDL/Verilog
> was better.
>
snipping
> So is an FPGA design in VHDL/Verilog hardware, and the same realized
> equivalent gates written in Celoxica's Handel-C software just because of
> the choice of language? Or is a VHDL/Verilog design that is the same
> as a Handel-C design software?
It really depends on the style of design and the synthesis level. Traditionally, Synopsys-style DC synthesis performed on RTL code is hardware design more or less no matter how the RTL is prepared, even in JavaScript if that were possible. But Handel-C and the other new entrants, from my knowledge, are usually based on behavioural synthesis; the whole point of their existence is to raise productivity by letting these tools figure out how to construct the RTL dataflow code, so that mere mortal software engineers don't have to be familiar with RTL design. They still find out about real hardware issues sooner or later, though.

On the ASIC side, Synopsys BC didn't fare too well with hardcore ASIC guys, better with system guys. Since FPGAs let software and system guys into the party, BC-style synthesis is going to be far more acceptable and widespread, the cost of failure being so much lower than with ASICs. As for whether it is HW or SW, I decline, but the ASIC guys would tend to call BC design too software-like given early results, and I don't think their opinion of C-style behavioural design has changed any either.

If the end result of BC-style design produces results as good as those typically achieved by DC synthesis, then it is every bit as good as hardware, but does it produce such results? In hardcore hardware land we expect to drive upwards of 300MHz cycle rates and plan a hardware design onto a floorplan, but I wouldn't expect such performance or efficiency from these BC tools. Do they routinely produce results as good? I very much doubt it. Replacing RTL with behavioural design may raise productivity, but it is not the same thing as replacing assembler coding with HLL coding, IMHO, given the current nature of OoO CPUs.
> I think what is surprising to some is that low-level software design is
> long gone,
Back in the old days, it was common to build FSMs using ROMs. That approach makes it natural to think of the problem as software: each word in the ROM holds the instruction you execute at that PC plus the current external conditions. That still seems like a good approach to me. Seems pretty low level, too.

I've done a reasonable amount of hack programming where I count every cycle to get the timing right. I could probably have done it in C, but I'm a bottom-up rather than top-down sort of person.
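A minimal sketch of that ROM-as-FSM idea (the entity, encoding and table contents are invented here for illustration, not taken from the post): the current state acts as a "PC", the external condition is appended to form the ROM address, and each ROM word holds the next state plus the outputs.

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;

    entity rom_fsm is
      port (
        clk  : in  std_logic;
        cond : in  std_logic;                    -- external condition input
        outp : out std_logic_vector(1 downto 0)  -- FSM outputs
      );
    end entity;

    architecture rtl of rom_fsm is
      -- Each ROM word: next_state(1:0) & outputs(1:0); address = state & cond
      type rom_t is array (0 to 7) of std_logic_vector(3 downto 0);
      constant ROM : rom_t := (
        0 => "0100", 1 => "1000",   -- state 00: next 01 if cond='0', 10 if cond='1'
        2 => "1001", 3 => "1101",
        4 => "0010", 5 => "0011",
        6 => "0000", 7 => "0000"
      );
      signal state : unsigned(1 downto 0) := "00";
      signal word  : std_logic_vector(3 downto 0);
    begin
      word <= ROM(to_integer(state & cond));     -- look up the "instruction" for this PC + condition
      process (clk)
      begin
        if rising_edge(clk) then
          state <= unsigned(word(3 downto 2));   -- next state becomes the new "PC"
        end if;
      end process;
      outp <= word(1 downto 0);
    end architecture;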
JJ wrote:
> On the ASIC side, Synopsys BC didn't fare too well with hardcore ASIC
> guys, better with system guys. Since FPGAs let software and system guys
> into the party, BC-style synthesis is going to be far more acceptable
> and widespread, the cost of failure being so much lower than with ASICs.
> As for whether it is HW or SW, I decline, but the ASIC guys would tend
> to call BC design too software-like given early results, and I don't
> think their opinion of C-style behavioural design has changed any either.
As a half-EE, half-CSc guy from the 1970s, I spent more than a few years doing hard-core assembly language systems programming and device driver work. The argument about C for systems programming was much louder and much more opinionated about "what a real systems programmer can do". As an early Unix evangelist and systems programmer, it didn't take long to discover that I could easily code C to produce exactly the asm I needed. As the DEC systems guys and the UNIX systems guys warred over what was best, it was more than fun to ask them for their rosetta assembly language, and frequently knock out a faster C design in a few hours for a piece of code that took weeks to fine-tune in asm. It was almost always because they got fixated on micro-optimization of a few loops and missed the big-picture optimizations. Rewriting asm libraries in C as we ported to microprocessors and away from PDP11s was seldom a performance hit at all.

I see similar things happening with large ASIC and FPGA designs, as the real performance gains are in highly optimized but more complex architectures, and less in the performance of any particular FSM and data path. Doing the very best gate-level designs, just like the very best asm designs, is at some point just a distraction, when you start looking at complex systems with high degrees of parallelism and specialized functional units, where the system design/architecture is the win, not a few cycles at the bottom of some subsystem.

The advantage of transferring optimization knowledge into HLL tools is that they do it right EVERY time after that, whereas the same energy spent optimizing one low-level design is seldom leveraged into other designs. Because of this, HLL programming languages routinely deliver three or four nines of the performance hand coding will achieve, and frequently better, as all optimizations are automatically taken and applied where a hand coder would not be able to.

We see the same evolution in bit-level boolean design for hardware engineers. A little over a decade ago, all equations were hand optimized .... today that is a lost art. As the tools do more of it, probably in another decade it will no longer be taught as a core subject to EEs, if not sooner. There are far more important things for them to learn that they WILL actually use and need. That will not stop the oldie moldies from lamenting how little the kids today know, and claiming they don't even know their trade. The truth is that the kids will have new skills that leave the Dinos just that.
> If the end result of BC-style design produces results as good as those
> typically achieved by DC synthesis, then it is every bit as good as
> hardware, but does it produce such results? In hardcore hardware land we
> expect to drive upwards of 300MHz cycle rates and plan a hardware
> design onto a floorplan, but I wouldn't expect such performance or
> efficiency from these BC tools. Do they routinely produce results as
> good? I very much doubt it. Replacing RTL with behavioural design may
> raise productivity, but it is not the same thing as replacing assembler
> coding with HLL coding, IMHO, given the current nature of OoO CPUs.
I believe, having seen this same technology war from the other side, that it is not only the same, but will actually evolve better, because the state of the art and the knowledge of how to take optimizations and exploit them are much better understood these days. The limits in software technology and machine performance that slowed extensive computational optimizations in software compilers are much less of a problem with VERY fast CPUs and large memory systems today. Probably the hard part will be yanking the knowledge needed to do a good job from the minds of heavily protesting EEs worried about job security. I suspect a few will see the handwriting on the wall and become coders for the tools, just to have a job.