Secure Hardware Design
The Black Hat Briefings
July 26-27, 2000
Brian Oblivion, Kingpin
[oblivion, kingpin]@atstake.com
Why Secure Hardware?
 Embedded systems now common in the industry
 Hardware tokens, smartcards, crypto accelerators,
internet appliances
 Detailed analysis & reverse engineering techniques
available to all
 Increase difficulty of attack
 The means exist
Solid Development Process
 Clearly identified design requirements
 Identify risks in the life-cycle
– Secure build environment
– Hardware/Software Revision control
– Verbose design documentation
– Secure assembly and initialization facility
– End-of-life recommendations
 Identify single points of failure
 Security fault analysis
 Third-party design review
Sources of Attack
 Attacker resources and methods vary greatly
Resource        Teenager    Academic      Org. Crime   Gov't
Time            Limited     Moderate      Large        Large
Budget ($)      <$1000      $10K-$100K    $100K+       Unknown
Creativity      Varies      High          Varies       Varies
Detectability   High        High          Low          Low
Target          Challenge   Publicity     Money        Varies
Number          Many        Moderate      Few          Unknown
Organized       No          No            Yes          Yes
Spread info?    Yes         Yes           Varies       No
Source: Cryptography Research, Inc. 1999, “Crypto Due Diligence”
Accessibility to Product
 Purchase – All attacks possible
 Evaluation – Most attacks possible, with risk of detection
 Active, in-service – Most attacks possible
 Remote access – No physical access
Attack Scenarios
 System
 Enclosure
 Circuit
 Firmware
Attack Scenarios
 System
 Initial experimentation & probing
 Viewed as a “black box”
 Can be performed remotely
 Bootstrapping attacks
Attack Scenarios
 Enclosure
 Gaining access to product internals
 Probing (X-ray, thermal imaging, optical)
 Bypassing tamper-proofing mechanisms
Attack Scenarios
 Circuit
 PCB design & parts placement analysis
 Component substitution
 Active bus and device probing
 Fault induction attacks1
 Timing attacks2
 Integrated circuit die analysis3
Attack Scenarios
 Firmware
 Low-level understanding of the product
 Obtain & modify intellectual property
 Bypass system security mechanisms
 Ability to mask failure detection
Attack Scenarios
 Strictly Firmware - no product needed!
 Obtain firmware from vendor’s public facing
web site
 Can be analyzed and disassembled without
detection
What Needs To Be Protected?
 Firmware binaries
 Boot sequence
 Cryptographic functionality (offloaded to
coprocessor)
 Secret storage and management
 Configuration and management
communication channels
System
Trusted Base
 Minimal functionality
– Trusted base to verify the integrity of firmware and/or Operating System
– Secure store for secrets
– Secrets never leave the base unencrypted
– Security Kernel
 Examples of a Trusted Base
– A single IC (some provide secure store for secrets)
– May be purchased or custom built (Secure Coprocessor)
– All internals - circuit boards, components, etc.
– Entire trusted base resides within tamper envelope
Security Kernel
 Better when implemented in Trusted Base,
but can function in OS
 Enforces the security policy
 Ability to decouple secrets from OS
Example: Cryptlib4
Trusted Base example
[Block diagram: the CSOC connects to the host processor over a memory-mapped bus and has its own control bus, bulk transfer bus and external memory bus to data memory (may be dual-ported) used for bulk encrypt/decrypt; the host side holds the firmware, main memory (DRAM) and the system communication interface(s).]
Failure Modes
 Determine how the product handles failures
 Fail-open or fail-closed?
 Response depends on failure type
 Halt system
 Set failure flags and continue
 Zeroization of critical areas
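As an illustration of the fail-closed, zeroize-and-halt response, here is a minimal C sketch; the critical_secrets buffer and the log_failure()/halt_system() hooks are hypothetical, and the volatile pointer keeps the compiler from optimizing the wipe away.

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical region holding keys and other critical state. */
    static uint8_t critical_secrets[256];

    /* Hypothetical platform hooks. */
    extern void log_failure(int failure_code);   /* audit record */
    extern void halt_system(void);               /* stop or reset the device */

    /* Overwrite critical memory; volatile prevents the compiler from
     * discarding the loop as a "dead store". */
    static void zeroize(volatile uint8_t *buf, size_t len)
    {
        while (len--)
            *buf++ = 0x00;
    }

    /* Fail closed: wipe secrets, record the failure, then halt. */
    void on_security_failure(int failure_code)
    {
        zeroize(critical_secrets, sizeof(critical_secrets));
        log_failure(failure_code);
        halt_system();
    }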
Management Interfaces
 Do not include service backdoors!
 Utilize Access Control
 Encrypt all management sessions
 SSH for shell administration
 SSL for web administration
Firmware
Secure Programming Practice
 Code obfuscation & symbol stripping
 Use compiler optimizations
 Remove functionality not needed in production
 Two versions of firmware: Development, Prod.
 Remove symbol tables, debug info.
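A sketch of one common way to keep development and production firmware in a single source tree, assuming a hypothetical DEVELOPMENT_BUILD flag; the production image is built without the flag, with compiler optimizations on and symbols/debug info stripped.

    /* Build the production image without -DDEVELOPMENT_BUILD and strip
     * symbol tables and debug info from the resulting binary. */
    #ifdef DEVELOPMENT_BUILD
    #include <stdio.h>

    /* Debug console: useful on the bench, a liability in the field. */
    static void debug_shell(void)
    {
        printf("debug> ");
        /* ... interactive diagnostics ... */
    }
    #endif

    void firmware_main(void)
    {
    #ifdef DEVELOPMENT_BUILD
        debug_shell();      /* absent from the production binary */
    #endif
        /* normal operation */
    }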
Secure Programming Practice
 Buffer overflows5
 Highly publicized and attempted
 If interfacing to a PC, an overflow in the driver code could lead to compromise
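A minimal sketch of the bounded input handling implied here, using a hypothetical packet layout; the length claimed by the host is checked against the buffer before any copy is made.

    #include <stdint.h>
    #include <string.h>

    #define MAX_PAYLOAD 64

    /* Reject anything larger than the buffer instead of trusting the
     * length field supplied by the host or the network. */
    int handle_packet(const uint8_t *payload, size_t claimed_len)
    {
        uint8_t buf[MAX_PAYLOAD];

        if (payload == NULL || claimed_len > sizeof(buf))
            return -1;                 /* fail closed on oversized input */

        memcpy(buf, payload, claimed_len);
        /* ... parse buf ... */
        return 0;
    }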
Boot Sequence
[Diagram: common boot model on the host system. Hardware reset → Flash (BIOS, may be ROM) → embedded OS or state machine (loaded from FlashDisk or fixed disk) → applications (loaded from FlashDisk or fixed disk), progressing over time; the diagram marks where new or overloaded functionality can be introduced.]
Trusted Boot Sequence
[Diagram: trusted boot sequence, adding a CSOC to the common boot model. After hardware reset the Bootrom and Flash (holding the POST and security kernel) are verified; the verified code then verifies the embedded OS or state machine on FlashDisk or fixed disk, which in turn verifies the applications before they run. The verification steps are the new or overloaded functionality added to the common model, progressing over time.]
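A sketch of the verify-then-execute step underlying the chain above; the digest routine, the reference digest stored at manufacture, and the image layout are assumptions, not the presenters' implementation.

    #include <stdint.h>
    #include <string.h>

    /* Assumed: digest routine provided by the boot ROM or CSOC, and a
     * reference digest stored at manufacture. */
    extern void rom_sha256(const uint8_t *data, uint32_t len, uint8_t out[32]);
    extern const uint8_t expected_os_digest[32];

    typedef void (*entry_fn)(void);

    /* Verify the next boot stage before transferring control to it. */
    int boot_next_stage(const uint8_t *image, uint32_t image_len, entry_fn entry)
    {
        uint8_t digest[32];

        rom_sha256(image, image_len, digest);
        if (memcmp(digest, expected_os_digest, sizeof(digest)) != 0)
            return -1;          /* do not boot; invoke tamper response */

        entry();                /* image verified, jump to it */
        return 0;
    }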
Run-Time Diagnostics
 Make sure device is 100% operational all the
time
 Periodic system checks
 Failing device may result in compromise
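A sketch of a periodic self-test task; the individual tests, the timing, and the failure hook (reusing the hypothetical on_security_failure() from the Failure Modes sketch) are assumptions.

    #include <stdbool.h>

    /* Hypothetical platform hooks. */
    extern bool test_crypto_engine(void);   /* known-answer test */
    extern bool test_memory(void);          /* RAM/ROM checksum */
    extern bool tamper_lines_ok(void);      /* enclosure sensors */
    extern void on_security_failure(int failure_code);
    extern void sleep_ms(unsigned int ms);

    /* Periodically confirm the device is still healthy; a failing part
     * is treated as a security event, not just a reliability issue. */
    void diagnostics_task(void)
    {
        for (;;) {
            if (!test_crypto_engine())
                on_security_failure(1);
            if (!test_memory())
                on_security_failure(2);
            if (!tamper_lines_ok())
                on_security_failure(3);
            sleep_ms(1000);
        }
    }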
Secret Management
 Never leak unencrypted secrets out
 Escrow mechanisms are a security hazard
 If required, perform at key generation, in the
physical presence of humans
 Physically export Key Encryption Key and protect
 Export other keys encrypted with Key Encryption
Key
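A structural sketch of the export rule: the Key Encryption Key never leaves through the export interface, and other keys leave only wrapped under it. The key-store layout and the kek_wrap() primitive are assumptions.

    #include <stdint.h>

    #define KEY_LEN 16
    #define KEK_SLOT 0

    /* Key slots inside the trusted base; slot 0 holds the Key Encryption Key. */
    static uint8_t key_store[8][KEY_LEN];

    /* Assumed wrap primitive (e.g. a key-wrap mode run on the crypto engine):
     * encrypts 'key' under the KEK into 'out'. */
    extern int kek_wrap(const uint8_t kek[KEY_LEN], const uint8_t key[KEY_LEN],
                        uint8_t out[KEY_LEN + 8]);

    /* Keys only ever leave the base wrapped; the KEK itself is never
     * exported through this interface. */
    int export_key(unsigned int slot, uint8_t out[KEY_LEN + 8])
    {
        if (slot == KEK_SLOT || slot >= 8)
            return -1;
        return kek_wrap(key_store[KEK_SLOT], key_store[slot], out);
    }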
Cryptographic Functions
 If possible, move out of firmware
 …into ASIC
 Difficult to modify algorithm
 Cannot be upgraded easily
 Increased performance
 …into commercial CSOC or FPGA
 Can reconfigure for other algorithms
 May also provide key management
 Increased performance
 Reconfiguration via signed download procedure (CSOC only)
Field Programmability
 Is your firmware accessible to everyone from
your product support web page?
 Encryption
 Compressing the image is not secure
 Encrypting code will limit exposure of intellectual
property
 Code signing
 Reduce possibility of loading unauthorized code
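A sketch of an update handler that enforces code signing before anything is written to flash; the header layout, magic value, and the sig_verify()/flash_write_image() primitives are assumptions standing in for the product's actual scheme.

    #include <stdint.h>

    /* Hypothetical image header prepended to each firmware release. */
    struct fw_header {
        uint32_t magic;
        uint32_t image_len;
        uint8_t  signature[64];     /* over the image body */
    };

    /* Assumed: verification primitive (public key held in ROM) and
     * flash programming routine. */
    extern int sig_verify(const uint8_t *data, uint32_t len,
                          const uint8_t sig[64]);
    extern int flash_write_image(const uint8_t *image, uint32_t len);

    int accept_firmware_update(const struct fw_header *hdr, const uint8_t *body)
    {
        if (hdr->magic != 0x46575550u)              /* arbitrary magic value */
            return -1;
        /* A real handler would also bound image_len against its staging area. */
        if (sig_verify(body, hdr->image_len, hdr->signature) != 0)
            return -1;                              /* reject unauthorized code */
        return flash_write_image(body, hdr->image_len);
    }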
Circuit
PCB Design
 Remove unnecessary test points
 Traces as short as possible
 Differential lines parallel (even if on separate
layers)
 Separate analog, digital & power GND planes
 Alternate power and GND planes
Parts Placement
 Difficult access to critical components
 Proper power filtering circuit as close to input
as possible
 Noisy circuitry (e.g. inductors) compartmentalized
Physical Access to
Components
 Epoxy encapsulation of critical components
 Include detection mechanisms in and
under epoxy boundary
Power Supply & Clock
Protection
 Set min. & max. operating limits
 Protect against intentional voltage variation
 Watchdogs (ex: Maxim, Dallas Semi.)
 dc-dc Converters, Regulators, Diodes
 Monitor clock signals to detect variations
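A sketch of a firmware-side supply check to complement the hardware supervisors above; the ADC hook and the limits for a nominal 3.3 V rail are assumed values.

    #include <stdint.h>

    extern uint16_t read_supply_mv(void);   /* hypothetical ADC channel, millivolts */
    extern void on_security_failure(int failure_code);

    /* Operating limits for a nominal 3.3 V supply (assumed values). */
    #define VDD_MIN_MV 3000u
    #define VDD_MAX_MV 3600u

    /* Intentional under/over-voltage is a classic fault-induction method,
     * so out-of-range readings are handled as a security event. */
    void check_supply_voltage(void)
    {
        uint16_t mv = read_supply_mv();

        if (mv < VDD_MIN_MV || mv > VDD_MAX_MV)
            on_security_failure(6);
    }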
I/O Port Properties
 Use unused pins to detect probing or
tampering (esp. for FPGAs) - Digital Honeypot
 Disable all unused I/O pins
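A sketch of the "digital honeypot" idea: unused pins are held at a known idle level and any activity on them is treated as probing. The GPIO read, pin mask, and failure hook are assumptions.

    #include <stdint.h>

    /* Hypothetical platform hooks. */
    extern uint32_t read_gpio(void);        /* current pin levels */
    extern void on_security_failure(int failure_code);

    /* Pins not used by the design, tied to a known idle level. */
    #define UNUSED_PIN_MASK  0x0000F0F0u
    #define UNUSED_PIN_IDLE  0x00000000u

    /* Called from the diagnostics loop: activity on an unused pin means
     * someone is probing or tampering with the board. */
    void check_unused_pins(void)
    {
        if ((read_gpio() & UNUSED_PIN_MASK) != UNUSED_PIN_IDLE)
            on_security_failure(4);
    }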
Programmable Logic &
Memory
 Make use of on-chip security features
 FPGA design
 Make sure all conditions are covered
 State machines should have default states in place
 Be aware of what information is being stored in memory at all times6 (e.g. passwords, private keys, etc.)
 Prevent back-powering of non-volatile memory
devices
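The default-state rule is really an HDL concern, but the same discipline is shown below as a C sketch: every value of the state variable, including unexpected ones produced by a fault or glitch, resolves to a defined safe state.

    enum dev_state { ST_IDLE, ST_AUTHENTICATED, ST_TRANSFER };

    /* Any state value not explicitly handled falls through to a safe
     * reset, never to undefined behavior. Event codes are assumed. */
    enum dev_state next_state(enum dev_state s, int event)
    {
        switch (s) {
        case ST_IDLE:
            return (event == 1) ? ST_AUTHENTICATED : ST_IDLE;   /* 1: auth ok */
        case ST_AUTHENTICATED:
            return (event == 2) ? ST_TRANSFER : ST_AUTHENTICATED; /* 2: start */
        case ST_TRANSFER:
            return (event == 3) ? ST_IDLE : ST_TRANSFER;          /* 3: done */
        default:
            return ST_IDLE;     /* unknown state -> default, safe state */
        }
    }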
Advanced Memory
Management
 Often implemented in small FPGA
 Bounds checking in hardware
 Execution, R/W restricted to defined memory
 DMA restricted to specified areas only
 Trigger response based on detection of “code
probing” or error condition
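This checking normally lives in the FPGA itself; the C sketch below shows the address-window test it would apply to DMA requests, with the window limits and the response hook as assumptions.

    #include <stdint.h>
    #include <stdbool.h>

    extern void on_security_failure(int failure_code);

    /* Region that DMA and external bus masters are allowed to touch. */
    #define DMA_WINDOW_START 0x20000000u
    #define DMA_WINDOW_END   0x2000FFFFu

    /* Returns true if the access is allowed; anything outside the window
     * is treated as code probing and triggers a response. */
    bool dma_access_allowed(uint32_t addr, uint32_t len)
    {
        if (addr < DMA_WINDOW_START ||
            addr > DMA_WINDOW_END ||
            len  > (DMA_WINDOW_END - addr) + 1u) {
            on_security_failure(5);
            return false;
        }
        return true;
    }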
Bus Management
 COMSEC Requirements
 Keep black (encrypted) and red (in-the-clear)
buses separate
 Data leaving the device should always be black
 Be aware of data on shared buses
Enclosure
Tamper Proofing
 Resistance, Evidence, Detection, Response
 Most effective when layered
 Possibly bypassed with knowledge of method
Tamper Proofing
 Tamper Resistance
 Hardened steel enclosures
 Locks
 Encapsulation, potting
 Security screws
 Tight airflow channels, 90° bends to prevent optical probing
 Side effect: these measures are also tamper evident
Tamper Proofing
 Tamper Evidence
 Major deterrent for minimal risk takers
 Passive detectors - seals, tapes, cables
 Special enclosure finishes
 Most can be bypassed7
Tamper Proofing
 Tamper Detection
 Ex:
Temperature sensors
Micro-switches
Radiation sensors
Magnetic switches
Nichrome wire
Flex circuit
Pressure contacts
Fiber optics
Tamper Proofing
 Tamper Response
 Result of tampering being detected
 Zeroization of critical memory areas
 Provide audit information
RF, ESD Emissions &
Immunity
 Clean, properly filtered power supply
 EMI Shielding
 Coatings, sprays, housings
 Electrostatic discharge protection
 Could be injected by attacker to cause failures
 Diodes, Transient Voltage Suppressor devices (e.g. Semtech)
External Interfaces
 Use caution if connecting to “outside world”
 Protect against malformed, intentionally bad packets
 Encrypt or (at least) obfuscate traffic
 Be aware if interfaces provide access to internal bus
 Control bus activity through transceivers
 Attenuate signals which leak through transceivers with
exposed buses (token interfaces)
 Disable JTAG and diagnostic functionality in
operational modes
In Conclusion…
As a designer:
 Think as an attacker would
 As design is in progress, allocate time to
analyze and break product
 Peer review
 Third-party analysis
 Be aware of latest attack methodologies &
trends
References
1. Maher, David P., "Fault Induction Attacks, Tamper Resistance, and Hostile Reverse Engineering in Perspective," Financial Cryptography, February 1997, pp. 109-121
2. Timing Attacks, Cryptography Research, Inc., http://www.cryptography.com/timingattack/
3. Beck, F., "Integrated Circuit Failure Analysis: A Guide to Preparation Techniques," John Wiley & Sons, Ltd., 1998
4. Gutmann, P., "The Design of a Cryptographic Security Architecture" (Cryptlib), Usenix Security Symposium, 1999, http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html
5. Mudge, "Compromised Buffer Overflows, from Intel to SPARC version 8," http://www.L0pht.com/advisories/bufitos.pdf
6. Gutmann, P., "Secure Deletion from Magnetic and Solid-State Memory Devices," http://www.cs.auckland.ac.nz/~pgut001/secure_del.html
7. "Physical Security and Tamper-Indicating Devices," http://www.asis.org/midyear-97/Proceedings/johnstons.html
Additional Reading
1. DoD Trusted Computer System Evaluation Criteria (Orange Book), 5200.28-STD, December 1985, http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html
2. Clark, Andrew J., "Physical Protection of Cryptographic Devices," Eurocrypt: Advances in Cryptography, April 1987, pp. 83-93
3. Chaum, D., "Design Concepts for Tamper Responding Systems," Crypto 1983, pp. 387-392
4. Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., "An Evaluation System for the Physical Security of Computing Systems," Sixth Annual Computer Security Applications Conference, 1990, pp. 232-243
5. Differential Power Analysis, Cryptography Research, Inc., http://www.cryptography.com/dpa/
6. The Complete, Unofficial TEMPEST Information Page, http://www.eskimo.com/~joelm/tempest.html
Thanks!