FPGARelated.com

Finally! A Completely Open Complete FPGA Toolchain

Started by rickman July 27, 2015
On 8/5/2015 9:13 PM, glen herrmannsfeldt wrote:
> rickman <gnuarm@gmail.com> wrote:
>
> (snip, I wrote)
>>> But one reason we have free HDL tools (from Xilinx and Altera)
>>> now is related to the competition between them. With only
>>> one FPGA company, there would be no need for competition,
>>> tools could be expensive, and there could be a significant
>>> advantage to FOSS tools.
>
>> I'm not sure the price of the tools is so much related to the
>> competition between the companies. Hypothesizing only one FPGA company
>> is not very realistic and it is certainly far down my list of concerns.
>> I expect the price of tools is much more related to promoting the
>> "exploration" of the use of FPGAs. If you even have to spend $100, that
>> makes for a barrier to anyone wanting to start testing the tools. I ran
>> into this myself in jobs where I wanted to try something, but couldn't
>> get one dime spent.
>
> OK, but as I understand it Altera started distributing free versions,
> and Xilinx followed, presumably for competitive reasons.
>
> As you note, the free versions allowed exploration.
>
> If one hadn't done it first, the other might not have.
Perhaps, or it was just a matter of time. Clearly the business model works, and I think it was inevitable. MCU vendors understand the importance and pay for tools to give away. Why not give away a $100 tool, or even a $1000 tool, if it will get you many thousands of dollars in sales? It's the tool vendors who I expect have the bigger problem with this model.

For FPGAs, the funny part is that I was told a long time ago that Xilinx spends more on the software than they do designing the hardware. The guy said they were a software company making money selling the hardware they support.
>> I can always find a little free time to spend on
>> ideas, but spending money almost always goes through a review of some
>> sort where they want you to show why and the "why" is what you want to
>> determine.
>
> The way free market is supposed to work.
Free market? I'm talking about company-internal management. It is so easy to track every penny, but hard to track your time to the same degree. Often this is penny wise, pound foolish, but that's the way it is. I'm clear of that now, working for myself, but I am still happier to spend my time than my money, lol.

-- Rick
On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
> Not trying to give anyone grief. I'd just like to understand what
> people expect to happen with FOSS that isn't happening with the vendor's
> closed, but free tools.
Here's one example: during development, I'm targeting an FPGA that's several times larger than it needs to be, and the design has plenty of timing margin. So why in the name of Woz do I have to cool my heels for 10 minutes every time I tweak a single line of Verilog?

If the tools were subject to community development, they probably wouldn't waste enormous amounts of time regenerating 99.9% of the same logic as last time. Incremental compilation and linking are ubiquitous in the software world, but as usual the FPGA tools are decades behind. That's the sort of improvement that could be expected with an open toolchain.

It's as if Intel had insisted on keeping the x86 ISA closed, and you couldn't get a C compiler or even an assembler from anyone else. How much farther behind would we be? Well, there's your answer.

-- john, KE5FX
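[Editor's note: the core of what incremental compilation buys can be sketched in a few lines: fingerprint each source unit and regenerate output only for the units whose fingerprint changed. A toy illustration in Python; the `build` function and the `netlist(...)` stand-in are hypothetical, not any vendor's API.]

```python
import hashlib

def build(sources, cache):
    """Rebuild only the units whose source text changed since the last run.

    sources: dict mapping unit name -> source text
    cache:   dict mapping unit name -> (sha256 hex digest, compiled output),
             mutated in place across runs
    Returns the set of unit names actually rebuilt this run.
    """
    rebuilt = set()
    for name, text in sources.items():
        digest = hashlib.sha256(text.encode()).hexdigest()
        cached = cache.get(name)
        if cached and cached[0] == digest:
            continue  # unchanged since last run: reuse the cached result
        # Stand-in for the expensive synthesis/P&R step:
        cache[name] = (digest, "netlist(" + name + ")")
        rebuilt.add(name)
    return rebuilt

srcs = {"top.v": "module top; endmodule", "uart.v": "module uart; endmodule"}
cache = {}
build(srcs, cache)        # first run: every unit is built
srcs["uart.v"] += " // tweak one line"
build(srcs, cache)        # second run: only uart.v is rebuilt
```

The hard part in a real toolchain, of course, is that placement and routing are global optimizations, so "the units that changed" is not as cleanly separable as it is for object files.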
On 8/5/2015 9:48 PM, John Miles wrote:
> On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
>> Not trying to give anyone grief. I'd just like to understand what
>> people expect to happen with FOSS that isn't happening with the
>> vendor's closed, but free tools.
>
> Here's one example: during development, I'm targeting an FPGA that's
> several times larger than it needs to be, and the design has plenty
> of timing margin. So why in the name of Woz do I have to cool my
> heels for 10 minutes every time I tweak a single line of Verilog?
>
> If the tools were subject to community development, they probably
> wouldn't waste enormous amounts of time generating 99.9% of the same
> logic as last time. Incremental compilation and linking is
> ubiquitous in the software world, but as usual the FPGA tools are
> decades behind. That's the sort of improvement that could be
> expected with an open toolchain.
>
> It's as if Intel had insisted on keeping the x86 ISA closed, and you
> couldn't get a C compiler or even an assembler from anyone else. How
> much farther behind would we be? Well, there's your answer.
Don't know about Intel, but I seem to recall that the Xilinx tools have incremental compilation. Maybe they have dropped that. They have dropped a number of things over the years, such as modular compilation, which at one point a Xilinx representative swore to me was in the works for the lower-cost Spartan chips and would be out by year end. I think that was over a decade ago.

Even so, there are already FOSS HDL compilers available. Do any of them offer incremental compilation?

I believe the P&R tools can work incrementally, but again, maybe that is not available anymore. You used to be able to retain a portion of the routing and keep working on the rest over and over. I think the idea was to let you have a lot of control over a small part of the design and then let the tool handle the rest on autopilot.

-- Rick
On Wednesday, August 5, 2015 at 8:37:41 PM UTC-7, rickman wrote:
> I believe the P&R tools can work incrementally, but again, maybe that is
> not available anymore. You used to be able to retain a portion of the
> routing and keep working on the rest over and over. I think the idea
> was to let you have a lot of control over a small part of the design and
> then let the tool handle the rest on autopilot.
If there's a way to do it in the general case, I haven't found it. :( I wouldn't be surprised if they could leverage *some* previous output files, but there are obviously numerous phases of the synthesis process that each take a long time, and they would all have to play ball.

Mostly what I want is an option to allocate extra logic resources beyond what's needed for a given build, and use them to implement incremental changes to the design. No P&R time should be necessary in about 4 out of 5 builds, given the way my edit-compile-test cycles tend to work. I'm pretty sure there's no way to tell it to do that. It would be nice to be wrong.

-- john, KE5FX
On 8/6/2015 1:54 AM, John Miles wrote:
> On Wednesday, August 5, 2015 at 8:37:41 PM UTC-7, rickman wrote:
>> I believe the P&R tools can work incrementally, but again, maybe
>> that is not available anymore. You used to be able to retain a
>> portion of the routing and keep working on the rest over and over.
>> I think the idea was to let you have a lot of control over a small
>> part of the design and then let the tool handle the rest on
>> autopilot.
>
> If there's a way to do it in the general case I haven't found it. :(
> I wouldn't be surprised if they could leverage *some* previous output
> files, but there are obviously numerous phases of the synthesis
> process that each take a long time, and they would all have to play
> ball.
>
> Mostly what I want is an option to allocate extra logic resources
> beyond what's needed for a given build and use it to implement
> incremental changes to the design. No P&R time should be necessary
> in about 4 out of 5 builds, given the way my edit-compile-test cycles
> tend to work. I'm pretty sure there's no way to tell it to do that.
> It would be nice to be wrong.
I'm not sure what that means, "allocate extra logic resources" and use them with no P&R time...? Are you using the Xilinx tools?

-- Rick
On 06/08/15 01:46, glen herrmannsfeldt wrote:

> People use gcc because it works well, and it works well because
> people use it, and want it to work well.
One key difference here is that gcc is written in C (and now some C++), and its main users program in C and C++. Although compiler design and coding is a different sort of programming than most of gcc's users do, there is still a certain overlap and familiarity. The barrier for going from user to contributor is smaller with gcc than it would be for a graphics artist using GIMP, a writer using LibreOffice, or an FPGA designer using these new tools.

The key challenge for open source projects like this is to develop a community of people who understand the use of the tools, and who understand (and can contribute to) the coding. Very often such tools are written by one or two people (university theses are common), and the project dies away when the original developers move on. To be serious contenders for real use, you need a bigger base of active developers, plus enthusiastic users who help with the non-development work (documentation, examples, testing, support on mailing lists). MyHDL is an example of this in the programmable logic world.
On 06.08.2015 00:52, thomas.entner99@gmail.com wrote:

> For FPGA design, the user base is much smaller than for a C compiler.
> How many of them would really use the open source alternatives when
> there are very advanced free vendor tools? And how many of them are
> really skilled software gurus? And have enough spare time? Of course
> you would find some students who are contributing (e.g. for their
> thesis), but I doubt that it will be enough to get a competitive
> product and to maintain it. New devices should be supported with
> short delay, otherwise the tool would not be very useful.
I don't see the big difference from compilers targeting microcontrollers here. There are plenty of older FPGA types, such as the Xilinx XC9500, still in use. A free toolchain for them would be useful, and having advanced optimizations would be beneficial there as well. On the microcontroller side, SDCC also targets mostly older architectures, plus a few newer ones, such as the Freescale S08 and STMicroelectronics STM8.

You don't need every user to become a developer. A few are enough.

Philipp
On Thursday, August 6, 2015 at 9:48:11 AM UTC+8, John Miles wrote:
> On Tuesday, August 4, 2015 at 4:46:49 PM UTC-7, rickman wrote:
>> Not trying to give anyone grief. I'd just like to understand what
>> people expect to happen with FOSS that isn't happening with the vendor's
>> closed, but free tools.
>
> Here's one example: during development, I'm targeting an FPGA that's
> several times larger than it needs to be, and the design has plenty of
> timing margin. So why in the name of Woz do I have to cool my heels for
> 10 minutes every time I tweak a single line of Verilog?
>
> If the tools were subject to community development, they probably
> wouldn't waste enormous amounts of time generating 99.9% of the same
> logic as last time. Incremental compilation and linking is ubiquitous in
> the software world, but as usual the FPGA tools are decades behind.
> That's the sort of improvement that could be expected with an open
> toolchain.
>
> It's as if Intel had insisted on keeping the x86 ISA closed, and you
> couldn't get a C compiler or even an assembler from anyone else. How
> much farther behind would we be? Well, there's your answer.
>
> -- john, KE5FX
Incremental synthesis/compilation is supported by both the Xilinx (ISE and Vivado) and Altera (Quartus) tools, even in the latest versions. One needs to use the appropriate switches/options. Of course, their definition of incremental compile/synthesis may not match yours exactly. They tend to support it more at the block level, using partitions etc.
The quoted post has been turned upside-down for the purposes of my typing.

On Wed, 05 Aug 2015 18:50:11 -0400, rickman wrote:

> Maybe I just don't have enough imagination.
A distinct possibility.

On Wed, 05 Aug 2015 18:50:11 -0400, rickman wrote:
> On 8/5/2015 5:30 PM, Philipp Klaus Krause wrote:
>> On 05.08.2015 01:46, rickman wrote:
>>> On 8/4/2015 7:05 PM, Aleksandar Kuktin wrote:
>>>>
>>>> Hackability. If you have an itch, you can scratch it yourself with
>>>> FOSS tools. If you discover a bug, you can fix it yourself. If you
>>>> want to repurpose, optimize or otherwise change the tool, you can do
>>>> it with FOSS.
>>>
>>> That's great. But only important to a small few.
The few matter. How many ISA designers are there? Yet if they get good tools that let them creatively hack out solutions, we're all better off. The same goes for random dudes banging on some FPGA somewhere. You never know where the next thing you want will appear, and having good peer-reviewed tools creates more potential for good stuff to be made.
>>> I use tools to get work done. I have zero interest in digging into
>>> the code of the tools without a real need. I have not found any bugs
>>> in the vendor's tools that would make me want to spend weeks learning
>>> how they work in the, most likely, vain hope that I could fix them.
Maybe you just didn't try hard enough? Or maybe you did, but didn't realize that what you had found was a gaping bug in the vendor tools.
>>> Not trying to give anyone grief. I'd just like to understand what
>>> people expect to happen with FOSS that isn't happening with the
>>> vendor's closed, but free tools.
Maybe you would be able to build an FPGA handheld device that can reconfigure itself on the fly. Like a smartphone^H^H^H^H^H^H^H^H^H^H PDA^H^H^H tricorder that runs on some energy-efficient MIPS and has a scriptable (meaning CLI) synthesizer you can feed random Verilog sources: instantiate an Ethernet device so you can jack yourself in while at home, an FM radio to listen to while driving down the road, a TV receiver with HDMI output so you can watch the news, and maybe a vibrator or something for the evening.

Anyway, that's what I want to have and can't right now, but COULD have with FOSS tools (since I'm not gonna use QEMU to instantiate a VM so I could synthesize on my phone).
>> This resulted in some quite unusual optimizations in SDCC currently
>> not found in any other compiler.
Okay, now we need to check out SDCC.
> I think this is the point some are making. The examples of the utility
> of FOSS often point to more obscure examples which impact a relatively
> small number of users. I appreciate the fact that being able to tinker
> with the tools can be very useful to a few. But those few must have the
> need as well as the ability.
> With hardware development both are less likely to happen.
But they will happen nevertheless.
On Wed, 05 Aug 2015 15:52:52 -0700, thomas.entner99 wrote:

> Of course you
> would find some students who are contributing (e.g. for their thesis),
> but I doubt that it will be enough to get a competitive product and to
> maintain it. New devices should be supported with short delay, otherwise
> the tool would not be very useful.
With GCC, Linux and their ilk, it's actually the other way around: they add support for new CPUs before those CPUs hit the market (x86_64 being one example). This is partially due to hardware producers understanding that they need toolchain support, and working actively to get that support in place. If even a single FOSS FPGA toolchain reaches similar penetration, you can count on FPGA houses paying their own people to hack on those leading FOSS toolchains, for the benefit of all.