Computer Engineering:
Digital Logic
The input is '1' when the button is pressed. The output is '1' when the LED is ON.
A              B              A AND B    A OR B    A XOR B
0 (unpressed)  0 (unpressed)  0 (off)    0 (off)   0 (off)
0 (unpressed)  1 (pressed)    0 (off)    1 (on)    1 (on)
1 (pressed)    0 (unpressed)  0 (off)    1 (on)    1 (on)
1 (pressed)    1 (pressed)    1 (on)     1 (on)    0 (off)
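If you want to double-check the table, here is a minimal Python sketch (Python is introduced later in this packet) that prints the same outputs. The names a and b are just stand-ins for the two buttons, with 1 meaning "pressed" or "on" and 0 meaning "unpressed" or "off".

# Print A, B, A AND B, A OR B, and A XOR B for every button combination.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, a & b, a | b, a ^ b)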
A    B    Unknown 1    Unknown 2    Unknown 3
0    0    _________    _________    _________
0    1    _________    _________    _________
1    0    _________    _________    _________
1    1    _________    _________    _________
Unknown chip 1 is : ____________________________________
Unknown chip 2 is : ____________________________________
Unknown chip 3 is : ____________________________________
Computer Engineering:
Microcontrollers and Coding
A microcontroller is a tiny computer on a single chip. Engineers use them in lots of
ways, from turning on the lights in your home from your phone to monitoring the
temperature in a weather balloon. Microcontrollers are often connected to sensors or other
devices to help complete these tasks. Then, engineers program the microcontroller to do what
they want. Programming microcontrollers is a really important part of computer engineering.
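To give a feel for what that programming looks like, here is a minimal sketch written in MicroPython, a version of Python that runs on many small boards. It assumes a board such as the Raspberry Pi Pico, where the onboard LED can be opened with the pin name "LED"; on other boards the pin name or number will be different.

from machine import Pin    # lets the program control the chip's pins
import time

led = Pin("LED", Pin.OUT)  # "LED" is the onboard LED on a Raspberry Pi Pico (assumption)

while True:                # repeat forever
    led.toggle()           # flip the LED between on and off
    time.sleep(0.5)        # wait half a second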
Scratch is a cool way to start exploring how to code. You can go to the Scratch website and
start playing with some of the modules! Scratch will help teach you how to think like a
programmer and will let you program your own stories, games, and animations.
https://scratch.mit.edu/
If you feel confident enough to try something harder, explore Python! Python is a high-level
programming language with syntax that is easy to read. Codecademy has tutorials to help you
learn how to use Python effectively!
https://codecademy.com/tracks/python
[Image: an assembled microcontroller board]
[Image: the inside of a microcontroller]
Computer Engineering:
Digital Logic/Binary
Computers aren’t very smart. They can only do exactly what you tell them to do, and they can
only “think” in ones and zeroes.
In our world, we think in letters, numbers, words, ideas, and concepts. So how do we make
computers understand us?
In order for computers to understand what we want them to do, characters have to be
converted into numbers, and ultimately into binary. Computers use a code called ASCII
(pronounced "AS-KEY"), which stands for the American Standard Code for Information
Interchange. ASCII assigns each character a number.
This number can be written in many different number systems. The number
system called "decimal" is the one you learn in school and use to do your math
homework. But we can represent numbers using other systems, too. The two most common
number systems, other than decimal, are "hexadecimal" and "binary".
In hexadecimal, there are 16 different digits we can use to write a number. Our usual
numerals only cover 10 of them (0-9), so letters supply the rest: each digit of a
hexadecimal number is either a numeral from 0-9 or a letter from A-F.
Binary is more restrictive and uses only two digits (0 and 1). This is the system computers
use to carry out their operations.
In decimal: 12
In hexadecimal: C
In binary: 1100
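If you try this in Python (mentioned earlier in this packet), the built-in bin() and hex() functions do the same conversions; the 0b and 0x prefixes simply label which number system is being shown.

print(bin(12))   # prints 0b1100: 12 written in binary
print(hex(12))   # prints 0xc:    12 written in hexadecimal
print(0b1100)    # prints 12:     binary back to decimal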
So, if we use ASCII, we can encode any character into a decimal number, a hexadecimal
number, or a binary number!
Take the letter “R”, for example.
In decimal: 82
In hexadecimal: 52
In binary: 01010010
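In Python, ord() looks up a character's ASCII number and chr() goes the other way, so a short sketch can check the letter "R" and help you verify your own decoded messages.

letter = "R"
number = ord(letter)            # the ASCII code for "R"
print(number)                   # 82 (decimal)
print(hex(number))              # 0x52 (hexadecimal)
print(format(number, "08b"))    # 01010010 (binary, padded to 8 digits)
print(chr(0b01010010))          # R, converted back from binary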
Can you use the ASCII tables to write your name?
Can you use them to write a secret message?
Can you decipher our secret message: 01000111 01101111 00100000 01010100 01101001
01100111 01100101 01110010 01110011?
Try your initials!
First Initial: __________________
Decimal Number: _____________
Binary:
Middle Initial: ________________
Decimal Number:_____________
Binary:
Final Initial:__________________
Decimal Number:_____________
Binary:
ASCII chart: http://1.bp.blogspot.com/-gKRqkSphQY4/Tw5mYJ9E1WI/AAAAAAAAACk/O4jlRIa8x_E/s1600/ascii-chart.png