While the electronics industry works to improve its supply chain integrity, the need for chip companies to keep some features secret sometimes clashes with the need for more transparency at the OEM -- and ultimately the end-user -- level.
That's the case in one of the examples cited in a Gartner report published last fall (registration required). In this final of three blog posts on the insecurity of the commercial IT supply chain as described in that report, I'll explain the incident and the broader problem it represents.
Last summer, University of Cambridge PhD candidate Sergei Skorobogatov and fellow researcher Christopher Woods reported in an academic paper that they had discovered a backdoor in Microsemi/Actel ProASIC3 FPGA chips. The paper said the researchers had used a technique called pipeline emission analysis to extract a key to the backdoor.
According to Skorobogatov:
If you use this key you can disable the chip or reprogram it at will, even if locked by the user with their own key. This particular chip is prevalent in many systems from weapons, nuclear power plants to public transport. In other words, this backdoor access could be turned into an advanced Stuxnet weapon to attack potentially millions of systems.
The paper caused a flurry of concern because the ProASIC FPGAs are reportedly widely used in military systems, flight control, and industrial and automotive applications. But Microsemi (which acquired Actel in 2010) says that what the researchers found was not a backdoor, but rather an integral part of the chip's security. In order to access the information -- pre-programmed data such as the unique ID of the device and other data necessary for the production, manufacturing, and testing of the device -- a hacker would have to break into the chip's first-line security, or the front door, first, says Paul Ekas, vice president of marketing for SOC products at Microsemi.
He adds that the chip that was the focus of the research paper was an old device -- designed eight years ago -- and that the company's latest line of chips has improved, state-of-the-art security features.
Complex and Easy to Hack
Few FPGA customers implement security features on FPGAs, because it is so inconvenient, says Ekas. "Very few customers choose to encrypt that bit stream, because it tends to cost money and time, since most FPGAs require a battery backup." He notes that Microsemi's FPGAs use flash and have security features that make implementing security easier.
The OEMs that are most concerned, and most likely to implement the security, he says, are in government and avionics markets. But the Gartner report's main point was to alert CIOs to the fact that chips in commercial IT systems may not be secure. Indeed, one of its recommendations is that CIOs require their IT suppliers to be transparent about the functions implemented in firmware, programmable logic arrays, and FPGAs. It's unlikely that OEMs selling into the commercial market are securing the code in the FPGAs used in their designs. And they certainly wouldn't think to tell IT customers that their products even incorporate such chips.
Even if they did, the FPGA manufacturers hold a lot of this information close to the vest. Any OEM customer that wants to know more about the chip architecture and security features can get the information, but it has to sign an NDA first, says Ekas.
"Our customers want us to be open and transparent with them. They don't want us to be open and transparent to the whole market." That's because releasing this information to the public also releases it to potential attackers, who can then use it to try to discover and exploit vulnerabilities, he says. The FPGA vendors are also protecting their own IP, of course. The problem is, this also means the information will not be passed along to other manufacturers in the supply chain (from disk drive vendor, for example, to server manufacturer), much less to the final user. That leaves IT equipment in corporate America potentially vulnerable.
The other recommendations in the report:
- If IT equipment suppliers can't supply information about the functions implemented in programmable chips, buyers could use independent third-party inspection and verification.
- Buyers should ask the IT supplier whether there are default or hard-coded passwords in the system that allow access to chips.
- Buyers should ensure that all testing and debugging interfaces in both software and chips are disabled and, if possible, removed from the IT system before it is shipped.
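The second recommendation lends itself to a degree of automation. As an illustrative sketch (not from the Gartner report, and using a made-up credential list), here is how a buyer's security team might scan a firmware image for common hard-coded default credentials before accepting a shipment:

```python
import re

# Illustrative only: a few default credentials often found in shipped
# firmware. A real audit would use a much larger, vendor-specific list.
DEFAULT_CREDENTIALS = [b"admin", b"root", b"password", b"12345", b"debug"]

def scan_firmware(image: bytes) -> list[str]:
    """Return the default credential strings found in a firmware image."""
    hits = []
    for cred in DEFAULT_CREDENTIALS:
        # Look for the credential as a NUL-terminated string, the form
        # it typically takes inside a compiled firmware binary.
        if re.search(re.escape(cred) + b"\x00", image):
            hits.append(cred.decode())
    return hits

# Example: a fake firmware blob with an embedded default password.
blob = b"\x7fELF...\x00admin\x00...\x00telnetd\x00"
print(scan_firmware(blob))
```

A string scan like this only catches the most obvious cases, of course; credentials buried in programmable logic or stored obfuscated would still require the third-party inspection the report recommends.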
I suspect we'll be hearing more about this problem in commercial systems, not just in FPGAs and not just in military equipment. As Skorobogatov and Woods say in their research paper, in today's global semiconductor manufacturing industry "an adversary can introduce Trojans into the design during a stage of fabrication by modifying the mask at a foundry or fab. It can also be present inside third parties' modules or blocks used in the design."
Do you think commercial IT equipment is vulnerable to chip-level tampering? Or am I just being paranoid?