CS 325: CS Hardware and Software Organization and Architecture
Introduction 2
1
Computer Architecture
• Understand where computers are going
• Future capabilities drive the (computing) world
• Understand high-level design concepts
• The best architects understand all the levels: devices, circuits, architecture, compiler, applications
• Write better software
• The best software designers also understand hardware
• Need to understand hardware to write fast software
2
Thoughts on Computer Architecture
• Principle of equivalence of hardware and software:
• Anything that can be done with software can also be done with hardware.
• Anything that can be done with hardware can also be done with software.
• (A small software sketch of this idea follows below.)
3
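As an illustration of the equivalence principle, here is a minimal Python sketch (not from the original slides): a 1-bit full adder, normally built in hardware from logic gates, computed entirely in software using the same Boolean operations.

def full_adder(a, b, carry_in):
    # Combine three input bits with the same logic a hardware adder uses.
    s = a ^ b ^ carry_in                         # sum bit (two XOR gates)
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit (AND/OR gates)
    return s, carry_out

# Example: 1 + 1 with no carry-in -> sum 0, carry 1
print(full_adder(1, 1, 0))  # (0, 1)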
Computer Components
• At the most basic level, a computer is a device consisting of four pieces:
• A CPU to interpret and execute programs
• Memory to store both data and programs
• A system interconnection for communication among the CPU, memory, and I/O devices
• Interfaces for transferring data to and from the outside world
4
An Example System
• Consider this advertisement (image shown in the original slide):
• What does it all mean?
5
Measures of Capacity
Data Measurement | Size | Example
Bit | Single binary digit (0 or 1) |
Byte | 8 bits | ASCII value with parity
Kilobyte (KB) | 1,024 bytes | ~2 KB of RAM in the Apollo Guidance Computer (AGC) that landed on the Moon
Megabyte (MB) | 1,024 kilobytes | Average MS Word document size
Gigabyte (GB) | 1,024 megabytes | 4 – 8 GB, typical RAM capacity
Terabyte (TB) | 1,024 gigabytes | 2 – 4 TB, typical large-capacity HDD
Petabyte (PB) | 1,024 terabytes | ~16 PB/week delivered to Steam users
Exabyte (EB) | 1,024 petabytes | 1 gram of DNA holds ~450 EB of data
Zettabyte (ZB) | 1,024 exabytes | ~40 ZB of total digital data by 2020 (about 40 trillion GB!)
Yottabyte (YB) | 1,024 zettabytes | ~$100 trillion for 1 YB of data storage
(A unit-conversion sketch follows after this table.)
6
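A small Python sketch (illustrative only, not part of the original slides) showing how the 1,024-based units in the table relate to raw byte counts:

UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def to_bytes(value, unit):
    # Convert a value expressed in one of the table's units to raw bytes.
    return int(value * 1024 ** UNITS.index(unit))

def human_readable(num_bytes):
    # Report a byte count using the largest unit that keeps the value >= 1.
    value = float(num_bytes)
    for unit in UNITS:
        if value < 1024 or unit == UNITS[-1]:
            return f"{value:.1f} {unit}"
        value /= 1024

print(to_bytes(4, "GB"))            # 4294967296
print(human_readable(2 * 1024**4))  # 2.0 TB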
Historical Development
• Generation Zero:
• Calculating clock – Wilhelm Schickard (1592 – 1635)
• Pascaline – Blaise Pascal (1623 – 1662)
• Difference Engine – Charles Babbage (1791 – 1871)
• Punched-card tabulating machines – Herman Hollerith (1860 – 1929)
• Punched cards were commonly used for computer input well into the 1970s.
7
Historical Development
• Generation One – Vacuum Tube:
• 1945 – 1953
• Electronic Numerical Integrator and Computer (ENIAC) (1946)
• ~1,800 sq ft in size
• 150 kW of power
• 5 kHz operating speed
• ~5,000 additions/subtractions per second
• ~357 multiplications per second
• ~35 divisions per second
8
Historical Development
•Generation One – Vacuum Tube:
•1945 - 1953
• IBM 650 (1953)
• First mass-produced computer
9
Historical Development
•Generation Two – Transistors:
• 1954 – 1965
• Can be thought of as controlled diodes
• A diode is a two-terminal component that conducts current in only one direction
• Why is this important?
• Diodes are used to build AC/DC rectifiers
10
Historical Development
•Generation Two – Transistors:
• 1954 – 1965
• Solid-state semiconductor used for switching electrical signals and maintaining a digital state.
• IBM 7094 (scientific)
• IBM 1401 (business)
• Digital Equipment Corporation DEC PDP-1
11
Historical Development
•Generation Three – Integrated Circuit:
• 1965 – 1980
• Solid-state circuits made of semiconductor material. Much smaller than discrete circuits.
• IBM 360
• DEC PDP-8, PDP-11
• Cray-1 Supercomputer
12
Historical Development
•Generation Four – VLSI:
• 1980 – ?
• Very Large-Scale Integrated circuits; billions of transistors are now common.
• Enabled the creation of microprocessors.
• The first microprocessor was the 4-bit Intel 4004 (1971).
13
Historical Development
• Moore's Law (1965)
• Gordon Moore, Intel co-founder
• "The density of transistors in an integrated circuit will double every year."
• Contemporary version:
• "The density of silicon chips will double every 24 months."
• But this "law" cannot hold forever… (a simple projection sketch follows below)
14
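A simple projection sketch in Python (illustrative numbers, not from the slides), showing what "doubling every 24 months" implies over time:

def projected_transistors(initial_count, years, doubling_period_years=2):
    # One doubling per period, compounded over the given number of years.
    return initial_count * 2 ** (years / doubling_period_years)

# Starting from a hypothetical chip with 1,000 transistors, 50 years later:
print(round(projected_transistors(1_000, 50)))  # 33,554,432,000 -> ~33.6 billion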
Historical Development
•Rock’s Law
• Arthur Rock, Intel financier
• "The cost of capital equipment to build semiconductors will double every four years."
• In 1968, a new semiconductor manufacturing plant cost about $12,000.
• At the time, $12,000 would buy a nice home in the suburbs.
15
Historical Development
•Rock’s Law
• In 2015, Intel's D1X fabrication plant in Hillsboro, Oregon cost over $3 billion.
• $3 billion is more than the GDP of some small countries.
• For Moore's Law to hold, Rock's Law must fall, or vice versa.
• But no one can predict which will give out first.
16
Computer Level Hierarchy
• Writing complex programs requires a "divide and conquer" approach.
• Each software module solves a part of the problem.
• Complex computer systems employ a similar technique through a series of machine layers.
17
Orientation: A Server
18
Orientation: MacBook Air
19
Orientation: iPhone
20
Computer Level Hierarchy
• Components at each level execute their own particular instructions, using components at lower levels to perform tasks as required.
21
Computer Level Hierarchy
• Level 6: The User Level
• Program execution and GUI
• Most familiar level
22
Computer Level Hierarchy
• Level 5: High-Level Language
• Write and interact with high-level languages
• Java
• C
• Python
23
Computer Level Hierarchy
• Level 4: Assembly Language
• A lower-level programming language in which there is a strong correspondence between the language and the CPU's machine code instructions (see the sketch below)
24
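As a rough analogy (not from the original slides), Python's built-in dis module shows the same kind of close correspondence between a high-level statement and the lower-level instructions that implement it; here the lower level is Python bytecode rather than CPU machine code.

import dis

def add(x, y):
    return x + y

# List the low-level instructions behind the high-level statement above.
dis.dis(add)
# Typical output (exact opcodes vary by Python version):
#   LOAD_FAST    x
#   LOAD_FAST    y
#   BINARY_ADD        (BINARY_OP 0 (+) on newer versions)
#   RETURN_VALUE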
Computer Level Hierarchy
• Level 3: System Software
• Controls executing processes on the system.
• Protects system resources.
• OS Kernels
25
Computer Level Hierarchy
• Level 2: Machine Level
• Also known as the Instruction Set Architecture (ISA) Level
• Consists of instructions that are particular to the architecture of the CPU
• Programs written at this level do not need compilers, interpreters, or assemblers.
26
Computer Level Hierarchy
• Level 1: Control Level
• A control unit decodes and executes instructions and moves data through the system.
• Control units can be microprogrammed or hardwired.
• A microprogram is a program written in a low-level language that is implemented by the hardware.
• Hardwired control units consist of hardware that directly executes machine instructions.
27
Computer Level Hierarchy
• Level 0: Digital Logic Level
• Lowest abstraction level, consisting of digital circuits
• Gates and connections
• Implements the mathematical logic of all higher levels.
28
Computer Level Hierarchy
• Level -1: Discrete Component Level
• Resistors, Diodes, Capacitors, Transistors
• Very complex!
• 10 components needed just to create a NOT gate
• Not covered in detail
29
The Von Neumann Model
• On the ENIAC, all programming was done at the digital logic level.
• Programming the computer involved moving plugs and wires!
• A different hardware configuration was needed to solve every unique problem type.
• Configuring the ENIAC to solve a "simple" problem required many days' labor by skilled technicians.
30
The Von Neumann Model
• John von Neumann:
• In 1945, introduced the concept of the stored-program computer.
• A digital computer that keeps programs and data together in read/write memory called Random-Access Memory (RAM).
• Today's computer architecture is still based on the von Neumann stored-program concept.
31
The Von Neumann Model
The von Neumann Stored-Program Computer:
1. Since the device is a computer, it must frequently perform arithmetic operations (+, -, *, /).
• This requires the development of the first specific part: Central Arithmetical (CA).
2. A logical control mechanism must be designed if the device is to be elastic and general purpose.
• Development of the second specific part: Central Control (CC) allows for control of general-purpose program instructions.
3. If the device must carry out long/complicated sequences of operations, it must have considerable memory.
• This requires the third specific part: Memory (M).
• The three specific parts, CA, CC, and M, are the organs of the device.
32
The Von Neumann Model
The von Neumann Stored-Program Computer:
• The device must have the ability to maintain input and output.
• This could be devices for human-computer interaction.
• This medium is called the outside recording medium of the device (R).
4. The device must have a way to transfer information from R into CA, CC, and M.
• This requires development of the fourth specific part: Input (I).
5. The device must have a way to transfer information from CA, CC, and M into R.
• This requires development of the fifth specific part: Output (O).
• With few exceptions, all of today's computers have this same general structure and function.
• Referred to as the von Neumann Architecture (supplemental readings 1 & 2).
33
The Von Neumann Model
• Today’s stored-program computers have the following
characteristics:
• Three hardware systems:
• A CPU.
• A main and secondary memory system.
• An I/O system.
• The capacity to carry out sequential instruction processing.
• A single data path between the CPU and main memory.
• This single path is known as the Von Neumann bottleneck.
34
The Von Neumann Model
• General depiction of a Von Neumann stored-program system (diagram shown in the original slide)
• These computers use fetch-decode-execute cycles to run programs
35
The Von Neumann Model
• The control unit fetches the next instruction from memory, using the program counter to determine where the instruction is located.
36
The Von Neumann Model
• The instruction is decoded into a language that the ALU can understand.
37
The Von Neumann Model
• Any data operands required to execute the instruction are fetched from memory and placed into registers within the CPU.
38
The Von Neumann Model
• The ALU executes the instruction and places results in registers or memory. (A minimal simulation sketch of this cycle follows below.)
39
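A minimal simulation sketch in Python (hypothetical instruction set, not from the original slides) of the fetch-decode-execute cycle just described: program and data share one memory, and the program counter selects the next instruction.

# A toy stored-program machine with four instructions.
memory = [
    ("LOAD", 6),    # 0: copy the word at address 6 into the accumulator
    ("ADD", 7),     # 1: add the word at address 7 to the accumulator
    ("STORE", 8),   # 2: write the accumulator to address 8
    ("HALT", 0),    # 3: stop
    0, 0,           # 4-5: unused
    40, 2,          # 6-7: data operands
    0,              # 8: result is stored here
]

pc = 0   # program counter: address of the next instruction
acc = 0  # accumulator register

while True:
    opcode, operand = memory[pc]   # FETCH the instruction the pc points to
    pc += 1
    if opcode == "LOAD":           # DECODE and EXECUTE
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "HALT":
        break

print(memory[8])  # 42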
The Von Neumann Model
• Conventional stored-program computers have undergone many incremental improvements over the years.
• These improvements include:
• specialized buses
• floating-point units
• cache memories
• etc.
• But enormous improvements in computational power require a departure from the classic von Neumann architecture.
• Adding processors is one approach.
40
The Von Neumann Model
• Late 1960s
• High-performance computer systems were equipped with dual processors to increase computational throughput.
• 1970s
• Supercomputer systems were introduced with 32 processors.
• 1980s
• Supercomputers with 1,000 processors were built.
• 1999
• IBM announced its Blue Gene system containing over 1 million (single-core) processors.
• 2017
• The Sunway TaihuLight in China was the world's most powerful supercomputer, with over 10.6 million CPU cores and 1.3 PB of RAM (1.3 million GB).
41
The Von Neumann Model
• Parallel processing is only one method of providing increased computational power.
• DNA computers, quantum computers, and dataflow systems are also heavily researched.
• At this point, it is unclear whether any of these systems will provide the basis for the next generation of computers.
42