FPGARelated.com
Forums

FPGA security, Actel down, now Xilinx too?

Started by Antti July 26, 2011
Hi

It's maybe not so commonly known that products using Actel secure FPGAs have already been cloned many years ago (readback done by dark engineers at Actel). A few months ago a paper was published indicating that ProASIC3 (and other newer Actel FPGAs) have a master key that is known not only inside Actel but also to the dark side outside the company. There is at least one known successful cloning of an Actel ProASIC3 based product (readback assumed done at the Actel fab, not outside).

The following post has links to documents that show that Xilinx V2/V4/V5 are vulnerable as well.

http://it.slashdot.org/story/11/07/21/1753217/FPGA-Bitstream-Security-Broken

P.S. We do not have more info nor the master keys, please do not ask :)


Antti Lukats
http://trioflex.blogspot.com/




On Jul 26, 2:04 am, Antti <antti.luk...@googlemail.com> wrote:
> Hi
>
> its maybe not so commonly known that there have been products using
> Actel secure FPGA's have been cloned already many years ago (readback
> done by dark engineers at Actel) [...] There is at least one known
> successful Actel ProAsic3 based product cloning done (assumed readback
> done at Actel fab, not outside).
>
> following post has link to documents that show that Xilinx V2/V4/V5 are
> vulnerable as well.
>
> http://it.slashdot.org/story/11/07/21/1753217/FPGA-Bitstream-Security...
>
> P.S. We do not have more info nor the master keys, please do not ask :)
>
> Antti Lukats
> http://trioflex.blogspot.com/

No one should ever assume the device security offered is 100% uncrackable. I used to know a guy who did legit "dark engineering" for government devices, and it was amazing to hear stories of drilling holes in "secure devices" and extracting data using microscopic probes. Another engineer I knew has a collection of ICs embedded in epoxy - the company he worked for would shave them layer by layer to extract the design physically. (So no, going to an ASIC won't necessarily be 100% secure either.)

If man can make it, man can break it.

The trick is to make it more expensive for the cloners to crack than it would be to just license, buy, or reverse engineer another way. Besides, a lot of places still send bit streams to China for programming during assembly, and at that point, adding bit-stream security is a bit like setting the deadbolt on an already open, and empty, barn.

A better metric for FPGA bitstream security, or any security product, is the cost per breach and/or time per breach. Assume it can be breached, and pick a method where the [cost/time]/[breach] equation works out in your favor. BTW - this also means that devices with a master key are very bad - because the time/breach is only paid once, and you can rest assured, someone besides the manufacturer has it already.

For an example of this done right, there is an IBM crypto chip that I believe is still unbroken - but it has wires around the die that control power to the SRAM memory holding the crypto keys. If you drill into the package and cut one of the wires, the device loses its memory - and becomes a dud. Obviously, you also have to do this work with the chip in-system, and running, for the same reason. This is the equivalent of the lock on an underground bank vault.

We will know FPGA vendors are equally serious when they offer a part with that level of security. Until then, it's pretty much the equivalent of the standard locks on our front doors. Good enough to keep the riff-raff out, but not enough to keep the serious thieves away.
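The [cost/time]/[breach] reasoning can be sketched as a quick back-of-the-envelope comparison. All the figures below are hypothetical placeholders, purely for illustration; the point is the structure of the decision, not the numbers:

```python
# Back-of-the-envelope model of the [cost/time]/[breach] metric.
# All figures are hypothetical, not real attack costs.

def cloning_worthwhile(attack_cost, units_cloned, profit_per_unit,
                       legit_alternative_cost):
    """An attacker bothers breaking in only if it beats both the payoff
    ceiling and the legal routes (licensing, buying, or independently
    reverse engineering the design)."""
    breach_payoff = units_cloned * profit_per_unit
    return attack_cost < min(breach_payoff, legit_alternative_cost)

# Per-device key: the full attack cost is paid for every target design.
print(cloning_worthwhile(attack_cost=50_000, units_cloned=1_000,
                         profit_per_unit=20,
                         legit_alternative_cost=200_000))   # False

# Master key: the attack cost was paid once, long ago, by someone else,
# so the marginal cost to clone any given product collapses to ~zero.
print(cloning_worthwhile(attack_cost=0, units_cloned=1_000,
                         profit_per_unit=20,
                         legit_alternative_cost=200_000))   # True
```

This is why a leaked master key is so much worse than a per-device key break: it flips the economics for every product built on the part, not just one.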
On Wed, 27 Jul 2011 05:17:32 -0700, radarman wrote:

> On Jul 26, 2:04 am, Antti <antti.luk...@googlemail.com> wrote:
>> its maybe not so commonly known that there have been products using
>> Actel secure FPGA's have been cloned already many years ago [...]
>
> [...]
>
> The trick is to make it more expensive for the cloners to crack than it
> would be to just license, buy, or reverse engineer another way. Besides,
> a lot of places still send bit streams to China for programming during
> assembly, and at that point, adding bit-stream security is a bit like
> setting the deadbolt on an already open, and empty, barn.
More like shipping all the barn's contents to a pack of known thieves, and asking them to please put them back in your barn and lock the door behind them when they leave.
> A better metric for FPGA bitstream security, or any security product, is
> the cost per breach and/or time per breach. Assume it can be breached,
> and pick a method where the [cost/time]/[breach] equation works out in
> your favor. [...]
>
> For an example of this done right, there is an IBM crypto chip that I
> believe is still unbroken - but it has wires around the die that control
> power to the SRAM memory holding the crypto keys. If you drill into the
> package, and cut one of the wires, the device loses its memory - and
> becomes a dud. [...] This is the equivalent of the lock on an
> underground bank vault.
Another problem with any high-security device, whether it be electronic or physical, is that the more it protects you from someone else's maliciousness, the more it'll harm you when you make an honest mistake.

To extend your bank vault analogy, consider what happens if you lock yourself in, or lose the key/combination/whatever.

--
www.wescottdesign.com
> A better metric for FPGA bitstream security, or any security product,
> is the cost per breach and/or time per breach. Assume it can be
> breached, and pick a method where the [cost/time]/[breach] equation
> works out in your favor.
The paper implies the cost is minimal, at least for the V2P parts. It seems that the equipment required places the attack within the reach of many universities and electronics companies.

http://eprint.iacr.org/2011/390.pdf
http://eprint.iacr.org/2011/391.pdf

"A full key recovery using 50000 measurements finishes in 8x39 minutes, i.e., in 6 hours (Virtex 4), and a full recovery on Virtex 5 devices using 90000 measurements finishes in 8x67 minutes, i.e., about 9 hours."

A semi-official Xilinx response is available on their forums:

http://forums.xilinx.com/t5/Virtex-Family-FPGAs/Successful-side-channel-attack-on-Virtex-4-and-5-bitstream/m-p/169062#M11290

In his post Austin Lesea says:

"...the attack is a sophisticated known attack method (Differential Power Analysis) which all crypto chips and systems are subject to, and there are no known and tested methods to avoid the attack (in theory, all crypto chips are vulnerable -- although one company is selling their patents, and is the primary driver behind getting this research into the public eye).

In practice, the attacker requires access, so any means to prevent access (anti-tamper) will prevent the attack, or make it more difficult. Encryption of the bitstream is one aspect of the solution: access control, and anti-tamper may also be required. Xilinx continues to research (and provide) solutions.

As with any solution in crypto, the attackers will figure it out, and succeed again. It is a never-ending battle between attacker, and defender."
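The quoted totals are easy to sanity-check. Reading "8x39 minutes" as eight sequential passes of the analysis (an inference from the quote's phrasing, not from any deeper detail of the papers):

```python
# Sanity-check of the key-recovery times quoted from the papers.
# The "8x" is read here as eight sequential analysis passes; that
# interpretation is inferred from the quote, not confirmed elsewhere.

def total_hours(passes, minutes_per_pass):
    return passes * minutes_per_pass / 60

v4 = total_hours(8, 39)   # Virtex 4: 50,000 power measurements
v5 = total_hours(8, 67)   # Virtex 5: 90,000 power measurements

print(f"Virtex 4: {v4:.1f} h")   # 5.2 h -- the paper rounds up to "6 hours"
print(f"Virtex 5: {v5:.1f} h")   # 8.9 h -- matches "about 9 hours"
```

Either way, an attack measured in single-digit hours on commodity lab equipment is well inside the budget of any serious cloning operation.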
> following post has link to documents that show that Xilinx V2/V4/V5 are
> vulnerable as well.
>
> http://it.slashdot.org/story/11/07/21/1753217/FPGA-Bitstream-Security...
Thought I'd add a few links to the discussion.

A post from a Xilinx employee (Austin Lesea) from 2008, discussing the lack of successful Differential Power Analysis (DPA) attacks on Xilinx FPGAs:

http://groups.google.com/group/comp.arch.fpga/msg/12769d42109799c4

"All 7 challengers gave up. Their basic conclusion was all the things they thought would work, differential power attack, spoofing by power glitches, attack with freeze spray, etc. FAILED."

A recent post from the same Xilinx employee responding to the latest announcement of successful DPA attacks on V2P, V4, and V5 FPGAs:

http://forums.xilinx.com/t5/Virtex-Family-FPGAs/Successful-side-channel-attack-on-Virtex-4-and-5-bitstream/m-p/169062/message-uid/169062/highlight/true#U169062

"Encryption of the bitstream is one aspect of the solution: access control, and anti-tamper may also be required."

Original papers describing the attacks:

http://eprint.iacr.org/2011/390.pdf
http://eprint.iacr.org/2011/391.pdf

Stephen