[PLUG] How do we know who to trust?

Louis Kowolowski louisk at cryptomonkeys.org
Sun Oct 7 13:32:55 PDT 2018


On Oct 5, 2018, at 1:24 AM, Keith Lofstrom <keithl at kl-ic.com> wrote:
> 
> On Thu, Oct 04, 2018 at 12:21:22PM -0700, Dick Steffens wrote:
>> The story about Elemental's computers having a spy chip on their
>> motherboards raises the question: how can we know if our computers
>> are compromised?
>> 
>> https://www.oregonlive.com/silicon-forest/index.ssf/2018/10/chinese_planted_spy_chips_insi.html
> 
> Assume your machines ARE compromised.  The only question is
> how many different organizations have their own compromises
> in your machine.
> 
> Without a completely open production process, end to end,
> which includes open source chip design, and a back end chip
> teardown process to compare design intent to samples of the
> actual silicon, there are just WAY too many places where
> very complex behavior may be inserted.  An extra chip on
> the circuit board, like this unconfirmed hack, is far too
> obvious for a deep-pockets adversary to bother with.
> 
> My nightmare:
> 
> The easiest place to insert malware is into the firmware
> boot tracks on your hard drive.
> 
> Hard drive behavior is controlled by "digital signal
> processing" software for motor control, head movement,
> and the high-level, pack-the-bits-onto-a-track behavior.
> That behavior is complex (vastly more complex than hard
> drives or even whole computers a decade ago), and is way
> more than manufacturers want to freeze into logic chips
> or store in an EPROM.  So the drive manufacturer stores
> those megabytes on the disk itself, in the "low
> performance" areas of the disk platter.
> 
> A few percent of the platter area is low performance, too
> slow to move user data quickly, but usable at lower speeds
> or bit densities, or with simpler encodings usable by
> simple "boot-the-boot" hardware.  There is room to store
> gigabytes of potential boot information in that area,
> a vast opportunity for mischief and malware. 
> 
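To put "gigabytes" in perspective, a back-of-the-envelope sketch (the 2 TB capacity and the 2% fraction are illustrative assumptions, not measurements of any particular drive):

```python
# Back-of-the-envelope: how much storage a "low performance" slice of
# the platter could hide.  Both numbers below are assumed for
# illustration, not taken from a real drive's specifications.
capacity_bytes = 2 * 10**12        # a 2 TB consumer drive
reserved_fraction = 0.02           # "a few percent" of platter area
hidden_bytes = capacity_bytes * reserved_fraction
print(f"{hidden_bytes / 10**9:.0f} GB outside the user-visible address range")
```

Even at reduced bit density or speed, that is ample room for the hundreds of per-OS payloads described below.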
> I can imagine conditions that trigger the loading of
> alternate disk control software, which inserts exploits
> into an operating system as it is read off the disk. 
> There is enough room on the disk to do this for hundreds
> of common operating systems.  That would NOT include all
> the zillions of variant kernels used by the Linux
> community, but there are far fewer variants of other
> Linux security software, like the SELinux suite.
> 
> My former neighbor worked for a Vancouver, Washington
> company ("C") that builds network monitoring systems.
> "C" assembles their machines in China, and installs
> firmware there so they can do acceptance testing on
> arrival here.  After acceptance, they wipe the hard
> drives down to the boot tracks and rebuild them, Just
> In Case, because their systems control the Internet. 
> 
> The silicon might still be compromised, though.  I am
> a chip designer.  If I control the fabrication process,
> especially the ion implanter or the photomask aberration
> correction system, I can hide behavior in a chip that you
> won't be able to find unless you take the chip apart atom
> by atom and compare that to a detailed mask level
> specification, then compare the mask specification to
> a mind-bogglingly expensive series of simulations.
> 
> Optimization-by-complexity is the antithesis of security.
> 
> In simple words, complex chips are vulnerable.  Use
> simpler chips, or avoid making enemies. 
> 
If you assume the hardware is compromised, how can you use it in a way that lets you believe the results it provides? Software, by definition, can't correct a hardware compromise.
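One partial answer is redundancy across independent supply chains: run the same computation on machines from different vendors and accept a result only when a quorum agrees, so a single compromised vendor has to coordinate with the others to go undetected. A toy sketch (the result strings and quorum size are made up for illustration):

```python
from collections import Counter

def majority_result(results, quorum):
    """Accept a result only if at least `quorum` independently built
    machines agree; otherwise treat the computation as untrustworthy."""
    value, count = Counter(results).most_common(1)[0]
    return value if count >= quorum else None

# Three machines from different vendors run the same job; one lies.
print(majority_result(["a1b2", "a1b2", "ffff"], quorum=2))  # → a1b2
```

This raises the attacker's cost rather than eliminating it: colluding compromises that return the same wrong answer still win.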

--
Louis Kowolowski                                louisk at cryptomonkeys.org
Cryptomonkeys:                                   http://www.cryptomonkeys.com/

Making life more interesting for people since 1977
