
Is FPGA code called firmware?

Started by Marko February 20, 2006
Hal Murray wrote:
> Back in the old days, it was common to build FSMs using ROMs. That
> approach makes it natural to think of the problem as software - each word
> in the ROM holds the instruction you execute at that PC plus the right
> external conditions.
Any of us educated in engineering school in the 1970s probably built FSMs that way more than a few times. On the other hand, I also built a DMA engine out of an M68008 using address-space microcoding, which saved a bunch of expensive PALs and board space, plus used the baby 68k to implement a SCSI protocol engine to emulate a WD1000 chipset. The whole design took a couple of months to production. Having done it the hard way with bipolar bit slices gives you the tools to take a more powerful piece of silicon and refine it better.

That is the beauty of FPGAs as computational engines today: looking past what the device is meant to do, and looking forward to what you can do with it tomorrow, by exploiting the parallelism and avoiding the sequential bottlenecks of CPU/memory designs. Designers that only know how to use a CPU plus memory, and lack the skills to design with lower-level building blocks, miss the big picture - both at the software and the hardware level.
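To make the ROM-based FSM idea quoted above concrete, here is a minimal C sketch of the technique - the state encoding, table contents, and field widths below are invented for illustration, not taken from any particular design. Each ROM word packs the next state together with the output, and the ROM address is formed from the current state plus the external input:

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

#define STATE_BITS  2   /* four states        */
#define INPUT_BITS  1   /* one external input */
#define OUTPUT_BITS 1   /* one output bit     */

/* word layout: [ next_state (2 bits) | output (1 bit) ] */
static const uint8_t rom[1u << (STATE_BITS + INPUT_BITS)] = {
    /* addr = (state << 1) | input       value = (next << 1) | out */
    (1u << 1) | 0u,   /* state 0, in 0 -> state 1, out 0 */
    (0u << 1) | 0u,   /* state 0, in 1 -> state 0, out 0 */
    (2u << 1) | 0u,   /* state 1, in 0 -> state 2, out 0 */
    (0u << 1) | 0u,   /* state 1, in 1 -> state 0, out 0 */
    (3u << 1) | 0u,   /* state 2, in 0 -> state 3, out 0 */
    (0u << 1) | 0u,   /* state 2, in 1 -> state 0, out 0 */
    (3u << 1) | 1u,   /* state 3, in 0 -> state 3, out 1 */
    (0u << 1) | 1u,   /* state 3, in 1 -> state 0, out 1 */
};

int main(void)
{
    unsigned state = 0;
    const unsigned stimulus[] = { 0, 0, 0, 0, 1 };   /* sample input stream */

    for (size_t i = 0; i < sizeof stimulus / sizeof stimulus[0]; i++) {
        unsigned addr = (state << INPUT_BITS) | stimulus[i];
        unsigned word = rom[addr];
        unsigned out  = word & 1u;        /* output field     */
        state = word >> OUTPUT_BITS;      /* next-state field */
        printf("in=%u  out=%u  next_state=%u\n", stimulus[i], out, state);
    }
    return 0;
}

In hardware the loop body is just a register for the state bits and the ROM itself; the "program counter" is the state, and the external conditions extend the address, exactly as described above.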
> I've done a reasonable amount of hack programming where I count
> every cycle to get the timing right. I could probably have done
> it in c, but I'm a bottom up rather than top down sort of person.
It's not about up or down, it's simply learning your tools. For 30 years I've written C thinking asm, coding C line for line with the asm it produced. For the last couple of years, after learning TMCC and taking a one-day Celoxica intro seminar, I started writing C thinking gates, just as a VHDL/Verilog engineer writes in that syntax thinking gates. Hacking on, and extending, TMCC as FpgaC has only widened my visualization of what we can do with the tool.

The things that TMCC/FpgaC does wrong are almost purely masked by the back-end boolean optimizer, which comes very close to getting it right. Where it doesn't, it's because its synthesis rules are targeted at a generic device, and it lacks device-specific optimizations to target the available CLB/slice implementation. That will come with time, but it really doesn't have that big an impact today. Right now we are focused on implementing the rest of the C language that was left out of TMCC, which is mostly parser work and some utility-routine work inside the compiler. That will be completed March/April; then we can move on to back-end work and target what I call "compile, load and go" work, which will focus on targeting the back end to current devices. With that will come distributed arithmetic optimized to several platforms, as well as use of the carry chains and muxes available in the slices these days. At that point, FpgaC will be very close to fitting designs as current HDLs do .... if you can learn to write C thinking gates. FpgaC will over time hide most of that from less skilled programmers, requiring only modest retraining. The focus will be computing with FPGAs - reconfigurable computing (RC) - not FPGA designs to support computing hardware development.

For the last couple of weeks we have been doing integration and cleanup from a number of major internal changes, mostly from a symbol table manager and scoping rules fix to bring TMCC in line with standard C scoping and naming, so that we could support structures, typedef and enum. We've also implemented the first crack at traditional typing in prep for enum/typedef, allowing unsigned and floating point in the process. Both are likely to be in beta-2 this month. The work I checked in last night has the core code for FP using intrinsic functions, and probably needs a few days to finish. It also now has do-while and for loops, along with structures and small LUT-based arrays or BRAM arrays. It currently regression tests pretty well, with a couple of minor problems left, including one that causes some temp symbols to end up with the same names. That should be gone by this weekend, as FP gets finished and hopefully unsigned is done too.

svn co https://svn.sourceforge.net/svnroot/fpgac/trunk/fpgac fpgac

Alpha/beta testers and other developers welcome :)
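For anyone wondering what "writing C thinking gates" looks like in practice, here is a rough sketch in plain ANSI C - this is not FpgaC-specific syntax, and the functions are invented for illustration. The habits it shows: widths kept explicit with masks, loop bounds fixed so a C-to-gates compiler can unroll them into parallel logic, and an adder written as per-bit boolean equations rather than just '+':

#include <stdio.h>

#define WIDTH 16u

/* Fixed trip count -> unrolls into a 16-input XOR reduction tree. */
static unsigned parity16(unsigned x)
{
    unsigned p = 0;
    x &= 0xFFFFu;                        /* explicit 16-bit width */
    for (unsigned i = 0; i < WIDTH; i++)
        p ^= (x >> i) & 1u;
    return p & 1u;                       /* 1-bit result          */
}

/* A 4-bit ripple-carry adder written as the per-bit boolean equations,
 * thinking about the carry logic instead of just writing '+'. */
static unsigned add4(unsigned a, unsigned b, unsigned *carry_out)
{
    unsigned sum = 0, carry = 0;
    for (unsigned i = 0; i < 4; i++) {                /* fixed bound, unrolls */
        unsigned ai = (a >> i) & 1u;
        unsigned bi = (b >> i) & 1u;
        sum  |= (ai ^ bi ^ carry) << i;               /* sum bit i            */
        carry = (ai & bi) | (carry & (ai ^ bi));      /* carry into bit i+1   */
    }
    *carry_out = carry;
    return sum & 0xFu;                                /* explicit 4-bit width */
}

int main(void)
{
    unsigned cout;
    unsigned s = add4(0x9u, 0x8u, &cout);
    printf("parity16(0xA5A5) = %u\n", parity16(0xA5A5u));
    printf("add4(0x9, 0x8)   = 0x%X carry=%u\n", s, cout);
    return 0;
}

Compiled for a CPU this is ordinary sequential C; thought of as gates, each function is a flat combinational block, which is the mental shift being described.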
Gabor wrote:
> At our company we call FPGA configuration code "software" if it
> is stored on the hard drive and uploaded at run time by a user
> application. When it is stored in a serial PROM or flash memory
> on the board we call it firmware.
>
> I don't think the terms "firmware" or "software" have as much to do
> with the programming paradigm as with the delivery of the bits to
> make the actual hardware run.
At my current and previous jobs, FPGA "loads" are/were considered firmware, for the same reason that processor boot code and the lowest-level debug monitor were considered firmware: the images are stored on board in some kind of non-volatile memory.

-a
> I think
> what is surprising to some, is that low level software design is long
> gone,
????? No-one ever programs in assembly language any more then?
> and low level hardware design is soon to be long gone for all the
> same reasons of labor cost vs. hardware cost.
Where price, performance and power consumption don't matter, a higher level language might become more prevalent. I think we'll always need to be able to get down to a lower level hardware description to get the best out of the latest devices, stretch the performance of devices, or squeeze what needs to be done into a smaller device.

I also wonder if price/performance/power consumption will become much less important in the future, as it has with software. These days you can assume application software will be run on a 'standard' sufficiently powerful PC. It won't be the case that at the start of every hardware project you can assume you have a multi million gate FPGA (or whatever) at your disposal.

Nial.
Nial Stewart wrote:
> > I think what is surprising to some, is that low level software design is long gone,
> No-one ever programs in assembly language any more then?
When I started doing systems programming in the late 1960s, everything system-critical was written in assembly language on big-iron IBMs - 1401s, 1410s, 360s - with applications in RPG, COBOL, and Fortran. It was pretty much the same when I started programming minicomputers - DGs, DECs, and Varians - except some flavor of Basic was the applications language of choice. 99% of systems code - operating system, utilities, compilers, linkers, assemblers, etc. - was assembly language. I suspect that number is a very small fraction of a percent today, at least for new designs.
> > and low level hardware design is soon to be long gone for all the
> > same reasons of labor cost vs. hardware cost.
> Where price, performance and power consumption don't matter a higher
> level language might become more prevalent.
The power argument is moot, as the difference in power between a good C coder and a good asm coder is probably less than a fraction of a percent. Ditto for performance. The only case I can think of is using the wrong compiler for the job, such as using an ANSI 32-bit compiler on a micro that is natively 8 or 16 bits, rather than downsizing to a subset compiler designed to target that architecture. There are a lot of really good subset 8/16-bit compilers for micros, and it only takes a man-week or two to adapt SmallC/TinyC derivatives to a new architecture.

Cost, on the other hand, I believe is strongly in favor of using C or some other low level HLL instead of assembly. When the burdened cost of good software engineers is roughly $200-250K/yr and the productivity factor between asm and HLLs is better than 3-10x over the lifecycle of the product, it's nearly impossible to make up the difference in volume. Using junior labor that chooses the wrong tools and botches the design is a management problem. There are plenty of good HLL coders that can do low level software fit to hardware and get it right. And when experienced low level guys are in high demand, it only takes a few weeks for a good person to train an experienced coder how to do it.
> I think we'll always
> need to be able to get down to a lower level hardware description
> to get the best out of the latest devices, stretch the performance
> of devices or squeeze what needs to be done into a smaller device.
Yep ... but coding C or another systems-class HLL line for line as asm, the fraction of a percent gained by coding more than a few lines of asm is quickly lost in time to market, labor cost, and the ability to maintain the product long term, including porting to a different micro to chase the low cost components over time.
> I also wonder if price/performance/power consumption will become much
> less important in the future, as it has with software. These days
> you can assume application software will be run on a 'standard'
> sufficiently powerful PC. It won't be the case that at the start of
> every hardware project you can assume you have a multi million
> gate FPGA (or whatever) at your disposal.
Today, the price difference between low end FPGA boards and million gate boards is getting pretty small, with megagate FPGAs in high volume. Five, or even two, years ago it was pretty different. The real issue is that FPGAs with CPUs and softcore CPUs allow you to implement the bulk of the software design on a traditional sequential architecture where performance is acceptable, and push the most performance-sensitive part of the design down to raw logic.

The Google description for this group is: Field Programmable Gate Array based computing systems, under Computer Architecture FPGA. And, after a few years, I think we are finally getting there .... FPGA based computing instead of CPU based computing.

The days of FPGAs being only for hardware design are slipping away. While this group has been dominated by hardware designers using FPGAs for hardware designs, I suspect that we will see more and more engineers of all kinds here doing computing on FPGAs, at all levels.

fpga_toys@yahoo.com wrote:

Snipping - kind of tired of being told software is taking over hardware
design; it ain't.

> The Google description for this group is: Field Programmable Gate Array
> based computing systems, under Computer Architecture FPGA. And, after
> a few years, I think we are finally getting there .... FPGA based
> computing instead of CPU based computing.
>
> The days of FPGAs being only for hardware design are slipping away.
> While this group has been dominated by hardware designers using FPGAs
> for hardware designs, I suspect that we will see more and more
> engineers of all kinds here doing computing on FPGAs, at all levels.
While that's likely somewhat true, and I really welcome anyone with interesting content problems, I suspect it's already too late for newcomers without strong EE backgrounds or associates.

15 years ago FPGAs were pretty darn simple and not much use for anything but glue logic. Good ole days when any old CMOS logic slapped together just worked, with synthesis just around the corner. 10 years ago they got big enough, but not fast enough, to start making predictions about RC and the possibility of replacing general purpose cpus with hardware computing - the 4000-series days - and a couple of new companies to boot. ASIC design started to get harder. 5 years ago with Virtex I'd say they started to get to ASIC performance, with the embedded blocks and specialized IO resources making performance almost even to cover the much slower LUT blocks, and we also got the hugely more complex data sheets. Today most FPGAs seem to have the whole kitchen sink in there to make complex systems more practical, as long as the sink can be made small enough to hide the cost when not used.

Look at any data sheet for modern parts: maybe 5% or less could be understood by your avg SW engineer (far less I bet); the rest is all electrical stuff - signal integrity, power supplies, packaging, reliability, clocking, in no particular order. Ask around here for books on FPGA computing - there aren't any; they're all old stuff, 10 years or more out of date, from the easy days. I have one that covers the 3000 series. Ray has one coming and it sure ain't targeted at software guys; he's too busy with real work, as are most EEs with a job to write up their current knowledge. FPGAs are simply moving too fast to be documented for the laissez-faire user.

SW engineers with the mathematical applications are used to dealing with ready made PC boxen; give it enough ventilation and hot math shouldn't faze a P4. There really isn't anything available in the same sense of off the shelf FPGA computing that can be sold as a std board to all the math and idea guys without HW pain. Yeah, there are lots of FPGA PCI cards, but they are mostly not usable by software guys without some EE around, as well as hardware lab tools. So that means a special application likely needs special boards to be built. Welcome to the real world: power supplies, interfaces, SI, GHz IOs, lead free. They haven't taught that in CS school ever, and perhaps not in some EE schools either. I can feel pretty sure that EEs that don't know this won't get much work.

I suspect logic classes are going to be with us forever too. When I interviewed candidates that didn't know basic bool algebra but would like to do million gate designs, I'd say we'll let you know later, or let them know what they need to know. Is that job protection? Sure it is. EEs don't want Joe90 liability around; bad ASIC design kills companies. We are going the same place with FPGA systems: bad designs will never work, but only the project is lost, not million $ mask sets. My last employer's FPGA project cost far more than its predecessor, a full custom mixed signal ASIC, and it had lots of nice new math in it to figure out. Even really good EEs make logic mistakes, so some further abstractions are likely, but that doesn't help much with all the dirty backend EE stuff.
From time to time we have had a few math and bio guys come here with questions about their interesting problems, but what I noticed is that they seem to be pretty coy about what they are up to. I suspect the days of SW engineers coming to the FPGA party are already over. FPGAs are getting bigger and more interesting, but a darn sight harder to use, and that won't ever get covered up by synthesis tools.

Also, from what I have seen of some of the applications of FPGAs to computing v PC computing, the FPGA project didn't even match the PC solution on cost. Not because the FPGA doesn't have the grunt, but because too much was left on the table since the design was done by math oriented people. Now as I said before, cpu designers have decided to go the same way FPGAs are - packing density plus any incremental clock speed - so it's a parallel race again. My gut tells me that PC computing is still the best way to go if a plain C description works using 32 bit int math and especially FP math. But when the math looks like crypto with S boxes and shuffles, or has a dedicated IO interface, FPGAs cream all over. Multi disciplined teams are the future, but EEs won't be in the back seat.

I am done.

John Jakson
transputer guy
Marlboro MA

BTW I don't know how to change brake pads or do oil changes (or even spell), so none of the above means diddly squat.
<fpga_toys@yahoo.com> wrote in message news:1140727198.395726.97600@u72g2000cwu.googlegroups.com...


>> > and low level hardware design is soon to be long gone for all the
>> > same reasons of labor cost vs. hardware cost.
>> Where price, performance and power consumption don't matter a higher
>> level language might become more prevalent.
>
> The power argument is moot, as the difference between power for a good
> C coder and a good asm coder, is probably less than a fraction of a
> percent.
I was talking about the FPGA domain here, not SW.
>> I also wonder if price/performance/power consumption will become much
>> less important in the future, as it has with software. These days
>> you can assume application software will be run on a 'standard'
>> sufficiently powerful PC. It won't be the case that at the start of
>> every hardware project you can assume you have a multi million
>> gate FPGA (or whatever) at your disposal.
>
> Today, the price difference between low end FPGA boards and million
> gate boards is getting pretty small, with megagate FPGAs in high
> volume. Five, or even two years ago, was pretty different.
Not every design has the need for million gate device functionality; Altera and Xilinx's low cost families seem to be selling in big numbers. Sometimes it's important to push the performance of these lower cost devices to keep costs down. Getting the same functionality into a smaller device can also be important if power consumption is critical (my original point).

How many power supplies do you need for your big devices?
> The Google description for this group is: Field Programmable Gate Array
> based computing systems, under Computer Architecture FPGA. And, after
> a few years, I think we are finally getting there .... FPGA based
> computing instead of CPU based computing.
This newsgroup and FPGAs were around long before some numpty at Google decided what their description should be. I don't think we should be taking this as a guiding pointer for the future.
> The days of FPGAs being only for hardware design are slipping away.
> While this group has been dominated by hardware designers using FPGAs
> for hardware designs, I suspect that we will see more and more
> engineers of all kinds here doing computing on FPGAs, at all levels.
That's probably true, and I expect to be using other tools as well as VHDL in 5 years. However, as John posted above, there's a lot more to implementing an FPGA design than the description used for the logic, and I think we'll still be using HDLs to get the most out of them for a long time to come (to a bigger extent than with C/asm).

Nial.
Nial Stewart wrote:
> Not every design has the need for million gate device functionality,
> Altera and Xilinx's low cost families seem to be selling in big
> numbers. Sometimes it's important to push the performance of these
> lower cost devices to keep costs down. Getting the same functionality
> into a smaller device can also be important if power consumption is
> critical (my original point).
>
> How many power supplies do you need for your big devices?
Who said anything about C based HLLs NEEDING a large FPGA? ... The only NEED for a large FPGA is if you are doing reconfigurable computing on a grand scale. C based HLLs work just fine for small devices too. Since devices get bigger in 100% size jumps for most product lines, and the cost penalty for using a C based HLL is under a few percent, the window for justifying an HDL on device fit is pretty small, or non-existent. Any project which is crammed into a device with zero headroom probably needs the next larger size anyway, just to make sure that minor fixes don't obsolete the board or force an expensive rework, replacing the chip with the next larger device in the middle of a production run.
> > The days of FPGAs being only for hardware design are slipping away.
> > While this group has been dominated by hardware designers using FPGAs
> > for hardware designs, I suspect that we will see more and more
> > engineers of all kinds here doing computing on FPGAs, at all levels.
>
> That's probably true, and I expect to be using other tools as well as
> VHDL in 5 years. However as John posted above, there's a lot more to
> implementing an FPGA design than the description used for the logic
> and I think we'll still be using HDLs to get the most out of them for
> a long time to come (to a bigger extent than with C/asm).
Probably the biggest change is that EEs will still be putting the chips on boards as they have always done, and the FPGA programming will shift to systems programming staff, who are frequently Computer Engineering folks these days (1/2 EE and 1/2 CSc, or CSc types with a minor in the digital side of EE). It's similar to the 70's transition, where EEs were doing most of the low level software design and drivers, and it shifted to a clearer hardware/software split over time. With that, tools that expect a designer to mentally be doing gate level timing design are less important than higher level tools which handle that transparently.
>> That's probably true, and I expect to be using other tools as well as
>> VHDL in 5 years. However as John posted above, there's a lot more to
>> implementing an FPGA design than the description used for the logic
>> and I think we'll still be using HDLs to get the most out of them for
>> a long time to come (to a bigger extent than with C/asm).
>
> Probably the biggest change is that EEs will still be putting the
> chips on boards as they have always done, and the FPGA programming will
> shift to systems programming staff, which are frequently Computer
> Engineering folks these days (1/2 EE and 1/2 CSc, or CSc types with a
> minor in the digital side of EE).
That may be the case for large multi-designer designs; for smaller devices, someone who understands the underlying architecture and what they're actually trying to design to will be needed.

This is a quote from a current thread "Entering the embedded world... help?" on comp.arch.embedded. I don't know how accurate this is.

"If you meant to say "most everyone these days uses an HLL, and for embedded applications most people choose to use C whereas a significant minority choose to use C++" I would not have objected much - although, in terms of code volume on the shelf, assembly language is still at least 30% of all products. Consider that the really high-volume projects use the really cheap micros. I've seen numbers that say asm 40%, C 40%, C++ 10%, other 10%, and I'm quite prepared to believe them.

The problem is, people who talk about this stuff get into their niche and see everything else from that perspective. Few people routinely work with a broad spectrum of systems from 4-bit to 64-bit and code volumes from a few hundred bytes to a few dozen megabytes."

You seem to have a deeply entrenched view of the FPGA development future. Only time will tell if you are correct or not; I don't believe you are, and I'll leave it at that.

Nial.
Nial Stewart wrote:
> That may be the case for large multi designer designs, for smaller
> devices someone who understands the underlying architecture and
> what they're actually trying to design to will be needed.
I think that has always been the case for embedded, and realtime, and any other tightly integrated hardware/software design of any size.
> The problem is, people who talk about this stuff get into their niche
> and see everything else from that perspective. Few people routinely
> work with a broad spectrum of systems from 4-bit to 64-bit and code
> volumes from a few hundred bytes to a few dozen megabytes."
Certainly true. As a consultant, I can only view the diverse sample of my clients for a perspective ... and that is certainly harder for W-2 employees that have lived inside the same company for the last 10 years. It would be interesting to take a survey at an embedded developers conference and get a better feel for the real numbers.
> You seem to have a deeply entrenched view of the FPGA development future.
> Only time will tell if you are correct or not, I don't believe you are
> and I'll leave it at that.
More like a recently converted evangelist, with a pragmatic view from my prior 35 years of systems programming experience casting a view on this new field, and watching what is happening around me too.

I did have a little fun this evening, writing a PCI target mode core in FpgaC as an example for the beta-2 release that is nearly at hand. It's not quite done, but it's checked in to subversion on sourceforge in the FpgaC examples directory. For something that is a bus interface state machine, it is expressed in C pretty nicely, and will get better as unions/enums are added to FpgaC. It brought out a couple of problems with using I/O ports as structure members that I need to fix in FpgaC tomorrow, then finish the PCI coding along with a C test bench before testing/installing on my Dini DN2K card.
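For a flavor of what a bus-interface state machine looks like when expressed in C, here is a deliberately simplified sketch. This is not the FpgaC PCI core mentioned above: the state names, signal set, and handshake are invented for illustration, and the real PCI protocol (FRAME#, IRDY#, TRDY#, DEVSEL#, parity, turnaround cycles, ...) is far richer.

#include <stdio.h>

/* Simplified bus-target FSM, NOT the actual FpgaC PCI example.
 * Signals are modelled as plain unsigned fields; '_n' means active low. */

enum state { IDLE, ADDRESS, DATA };

struct bus_in  { unsigned frame_n, irdy_n, addr, is_our_addr; };
struct bus_out { unsigned devsel_n, trdy_n, data; };

static enum state target_step(enum state s, const struct bus_in *in,
                              struct bus_out *out)
{
    out->devsel_n = 1;                 /* defaults: everything deasserted */
    out->trdy_n   = 1;
    out->data     = 0;

    switch (s) {
    case IDLE:                         /* wait for a transaction to start */
        return in->frame_n ? IDLE : ADDRESS;
    case ADDRESS:                      /* claim the cycle if the address hits */
        if (!in->is_our_addr)
            return IDLE;
        out->devsel_n = 0;
        return DATA;
    case DATA:                         /* transfer one word per ready cycle */
        out->devsel_n = 0;
        out->trdy_n   = 0;
        out->data     = 0xDEAD;        /* placeholder read data */
        if (in->frame_n && !in->irdy_n)
            return IDLE;               /* last data phase completed */
        return DATA;
    }
    return IDLE;
}

int main(void)
{
    struct bus_in  in  = { .frame_n = 0, .irdy_n = 0,
                           .addr = 0x100, .is_our_addr = 1 };
    struct bus_out out;
    enum state s = IDLE;

    for (int cycle = 0; cycle < 4; cycle++) {    /* tiny C "test bench" */
        s = target_step(s, &in, &out);
        printf("cycle %d: state=%d devsel_n=%u trdy_n=%u\n",
               cycle, (int)s, out.devsel_n, out.trdy_n);
        if (cycle == 2)
            in.frame_n = 1;                      /* end the transaction */
    }
    return 0;
}

The appeal of the C form is exactly what the post says: the state machine and its test bench live in one language, and the same source can be exercised on a PC before being pushed through a C-to-gates flow.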
On Monday, February 20, 2006 at 1:50:15 PM UTC-8, James Morrison wrote:
> On Mon, 2006-02-20 at 10:18 -0800, Marko wrote:
> > Traditionally, firmware was defined as software that resided in ROM.
> > So, my question is, what do you call FPGA code? Is "firmware"
> > appropriate?
>
> In a former position I pondered this very question.
>
> What is firmer than firmware (the term they used to describe code
> designs for micro-controllers) but softer than hardware (designs using
> wires to connect together various components)?
>
> The answer I came up with was "stiffware".
>
> The problem is that there are elements of both in FPGA code, or at least
> there can be. And depending on how you write your VHDL it may resemble
> one more than the other.
>
> James.
"Stiffware" I love it!!