Digital Imaging
Introduction
Have you ever been to the dentist's office and had a picture of your teeth taken, downloaded to a
computer, and then adjusted to show the results of upcoming dental work? Or taken a picture
with a digital camera and emailed the image to grandparents thousands of miles away in just
seconds? These are examples of digital imaging, and it touches each of us on a daily basis. In
the past ten years, digital imaging has become cheaper and easier to use than conventional
photography. Digital imaging has gone mainstream, but its uses go far beyond what we
normally think of.
How Digital Imaging works
Most of us simply pick up a digital camera, take a picture, and download it to a computer, or
take an old 35mm print and scan it in with a scanner. It seems like a very simple process, but
the technology behind it is remarkable.
The sensor used in most digital cameras and scanners is called a charge-coupled device
(CCD). Early cameras and low-end cameras use a device called a complementary metal oxide
semiconductor (CMOS) sensor. CCD sensors use more power but produce better images. Since
CMOS sensors are not as commonly used, we will concentrate on CCD sensors and how they
work.
[Figures: CCD architecture; CMOS sensor]
In order for a digital device to capture an image, it needs a method of converting light
(transmitted in the form of photons) into an electrical charge. This is done through photosites.
The CCD is a grid of photosites, each corresponding to one pixel. Photosites are set up in
an array, such as 1600 by 1200; the more photosites, the higher the megapixel rating.
When light strikes a photosite, it is converted to an electrical charge; the stronger the light,
the greater the charge. The CCD then shifts the accumulated charges across the chip and reads
them out at one corner of the array. Since the charges are analog at this point, an analog-to-digital
converter turns the signal digital. CCDs are normally very high quality and can transport the
signal with little loss or distortion.
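The photosite-to-digital step above can be sketched as a simple quantization: an analog charge, expressed as a fraction of the photosite's full-well capacity, maps to a digital number. The 8-bit depth and the specific values here are illustrative assumptions, not figures from the text.

```python
def adc(charge_fraction, bits=8):
    """Quantize an analog charge (0.0-1.0 of the photosite's full-well
    capacity) to a digital number, as an on-chip ADC would."""
    charge_fraction = max(0.0, min(1.0, charge_fraction))  # clip to valid range
    levels = 2 ** bits - 1
    return round(charge_fraction * levels)

# Brighter light -> larger charge -> larger digital value.
dim, bright = adc(0.25), adc(0.75)   # 64 and 191 at 8 bits
```

A real sensor pipeline adds gain, black-level offset, and noise handling before this step, but the core idea is the same mapping.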
Photosites cannot differentiate between colors; they only record the total strength of the light.
The incoming light must be broken down into the three primary colors, red, green, and blue, in
order to be recorded. This is accomplished through filters. As we know from computer monitors
and television screens, once all three colors are recorded, they can be recombined into the variety
of colors that we normally see. A practical way of splitting the light is to place a filter over
the photosites, with each photosite assigned red, green, or blue.
The Bayer pattern, one of the more common filters, alternates rows of red and green filters with
rows of blue and green filters. The colors are not represented by equal numbers of pixels because
the human eye is not equally sensitive to each color; more green pixels are needed to produce
what appears to be normal color to our eyes. There are as many green pixels as red and blue
pixels combined.
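The arrangement described above can be sketched as a small grid. This toy sketch assumes the classic RGGB variant of the Bayer pattern; counting the entries confirms the 2:1:1 green-to-red-to-blue ratio the text mentions.

```python
def bayer_mask(rows, cols):
    """Return the Bayer filter color at each photosite: even rows
    alternate R/G, odd rows alternate G/B (the classic RGGB layout)."""
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            if r % 2 == 0:
                row.append('R' if c % 2 == 0 else 'G')
            else:
                row.append('G' if c % 2 == 0 else 'B')
        grid.append(row)
    return grid

mask = bayer_mask(4, 4)
greens = sum(row.count('G') for row in mask)
reds = sum(row.count('R') for row in mask)
blues = sum(row.count('B') for row in mask)
# greens == reds + blues, matching the ratio described in the text
```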
Only one sensor with multiple photosites is required to record an image, and all of the color
information is recorded simultaneously. Cameras using this type of filter and sensor
arrangement are smaller, less expensive, and more versatile.
It may be hard to understand how a camera with 2.1 million pixels can achieve its full resolution
if it takes four separate filtered pixels (one red, one blue, and two greens) to determine the color
of a single output pixel. This is accomplished by using demosaicing algorithms to convert the
separate colors into an equally sized grid of true-color pixels. Each colored pixel can be used
more than once; the true color is decided by averaging the values of the closest surrounding
pixels.
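The neighbor-averaging idea can be sketched directly. This is a toy illustration, not a production demosaicing algorithm: the mask, mosaic values, and the 3x3 averaging window are all assumptions for the example.

```python
def demosaic_at(mosaic, mask, r, c, color):
    """Estimate one color channel at (r, c) by averaging that color's
    recorded values in the surrounding 3x3 neighborhood."""
    vals = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(mosaic) and 0 <= cc < len(mosaic[0]):
                if mask[rr][cc] == color:
                    vals.append(mosaic[rr][cc])
    return sum(vals) / len(vals)

# 4x4 RGGB Bayer mask and the raw value each photosite recorded
mask = [['R', 'G', 'R', 'G'],
        ['G', 'B', 'G', 'B'],
        ['R', 'G', 'R', 'G'],
        ['G', 'B', 'G', 'B']]
mosaic = [[10, 20, 10, 20],
          [20, 30, 20, 30],
          [10, 20, 10, 20],
          [20, 30, 20, 30]]

# The photosite at (1, 1) sits under a blue filter; its missing red
# value is interpolated from the four red neighbors on the diagonals.
red_at_blue_site = demosaic_at(mosaic, mask, 1, 1, 'R')
```

Real demosaicing algorithms weight neighbors and detect edges, but the averaging principle matches what the text describes.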
Because digital sensors are smaller than film negatives, lenses can be smaller. A six-megapixel
camera (currently the high end of professional digital imaging) has a sensor array about the
same size as a 35mm negative. The smaller lens sizes can be used for newer applications such as
photomicrography.
Terms used in Digital Imaging
Some terms used in digital imaging are defined differently than in standard film photography, and
understanding these terms will help the user better grasp the concepts behind digital imaging.
Programs such as Adobe Photoshop or Paint Shop Pro allow the user to manipulate some of these
settings in raw images to produce better photos and correct deficiencies in image quality.
Color Space
Color space describes how colors are mixed together to form a specific color in the spectrum.
The two most commonly used color spaces are Red Green Blue (RGB), used in digital cameras,
video cards, monitors, and televisions, and Cyan Magenta Yellow Black (CMYK), used by
printing devices and some specialty cameras. By adding different color combinations at
different levels, the spectrum of color becomes available to use. It is not possible to create
every color in the visible spectrum perfectly by adjusting red, green, and blue; color spaces
allow us to change the definitions of each primary color so we can get better reproduction.
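The relationship between the two color spaces can be sketched with the common naive RGB-to-CMYK formula. This is a textbook approximation, not what real printer color management does; the formula itself is an addition, not from the text.

```python
def rgb_to_cmyk(r, g, b):
    """Convert 0-255 RGB to CMYK fractions using the common naive
    formula: black (K) absorbs the shared darkness, and the remaining
    cyan/magenta/yellow are scaled relative to it."""
    rp, gp, bp = r / 255, g / 255, b / 255
    k = 1 - max(rp, gp, bp)
    if k == 1:                      # pure black: avoid division by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1 - rp - k) / (1 - k)
    m = (1 - gp - k) / (1 - k)
    y = (1 - bp - k) / (1 - k)
    return (c, m, y, k)

# Pure red in RGB needs no cyan but full magenta and yellow in print.
red_in_cmyk = rgb_to_cmyk(255, 0, 0)   # (0.0, 1.0, 1.0, 0.0)
```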
Dynamic Range
Dynamic range is defined as the ratio between the brightest and darkest parts of a scene. An
image that spans very bright highlights and deep shadows in the same scene has high dynamic
range, while a dim indoor scene has low dynamic range. Digital cameras are unable to capture
the enormous dynamic range that exists in nature, so they must compromise: they capture the
part of the range that is most important to interpretation by the human eye.
[Figures: good dynamic range; poor dynamic range]
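Since dynamic range is a ratio, photographers often express it in stops, where each stop is a doubling of light. The stop notation and the example brightness values are assumptions added for illustration.

```python
import math

def dynamic_range_stops(brightest, darkest):
    """Dynamic range as the brightest-to-darkest ratio, expressed in
    photographic stops (base-2 logarithm of the ratio)."""
    return math.log2(brightest / darkest)

# A scene whose highlights are 1024x brighter than its shadows
# spans 10 stops of dynamic range.
stops = dynamic_range_stops(1024, 1)
```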
Interpolation
Interpolation is the process of increasing the size of a digital image. Many digital cameras and
scanners use interpolation to produce a larger image than the sensor recorded, and Windows
viewers and photo-editing programs use some method of interpolation when resizing images.
How smoothly images enlarge depends directly on the algorithm used.
Nearest-neighbor interpolation takes the color of the closest recorded pixel and copies it into
the new pixels created by enlarging the image. This normally results in pixelation and does not
produce a smooth image.
Bilinear interpolation is much smoother: each new pixel is computed as a bilinear function of
the nearest 2 x 2 block of pixels in the source image.
Bicubic interpolation is more sophisticated still and produces among the smoothest results.
Each output pixel is computed from the 16 pixels in the 4 x 4 neighborhood of the recorded
pixel. This method is the most commonly used by printing and photo-editing software.
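The simplest of these methods, nearest neighbor, can be sketched in a few lines; the blocky output shows why it pixelates. The integer-factor restriction is a simplification for the example.

```python
def nearest_neighbor_resize(img, factor):
    """Enlarge a 2D image by an integer factor by copying each source
    pixel into a factor x factor block -- fast, but blocky."""
    out = []
    for row in img:
        new_row = []
        for px in row:
            new_row.extend([px] * factor)   # repeat horizontally
        out.extend([new_row[:] for _ in range(factor)])  # repeat vertically
    return out

small = [[0, 100],
         [100, 0]]
big = nearest_neighbor_resize(small, 2)
# Each pixel becomes a hard-edged 2x2 block; bilinear and bicubic
# methods would instead blend values across the new pixels.
```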
Fractal interpolation was developed by the Altamira Group as a specialized product for lossless
manipulation and resizing of images. Their software, Genuine Fractals Pro, is excellent for
enlarging images. Fractal interpolation is commonly used in higher-end applications such as
the medical field.
Resolution
Resolution is the level of detail captured by a digital camera, normally measured in pixels. For
most purposes, the higher the resolution, the better the quality of the image, but that is not
always true. For example, if your computer screen is set to 800 x 600 and you are displaying an
image recorded at 2048 x 1600 (3.2 megapixels), the image you see will not be at the full
resolution of the image, just the monitor's interpolation of it. When it comes to reproducing
photos, the higher the resolution, the larger the image size we can reproduce. A 1.3-megapixel
camera can reproduce an image up to 4 x 6 inches with little to no graininess or pixelation. A
4.0-megapixel camera is capable of reproducing stunning 13 x 19-inch photos. The higher the
resolution, the less interpolation needs to be done to reproduce the image. Depending on the
application, the number of megapixels needed can differ significantly. For example, to send an
email or use a webcam for streaming video, a resolution of 300 x 600 to 640 x 480 is
acceptable, but high-end digital photography of scientific subjects will use sensors in the
4.0 to 6.0 megapixel range.
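The print sizes above follow from simple arithmetic: pixels divided by print density give inches. The 200 dpi threshold and the 1280 x 1024 frame for a ~1.3-megapixel camera are assumed figures for illustration, not values from the text.

```python
def max_print_size(width_px, height_px, dpi=200):
    """Largest print (in inches) a resolution supports at a chosen
    print density; 200-300 dpi is a common rule-of-thumb quality
    threshold (an assumption here, not a figure from the text)."""
    return (width_px / dpi, height_px / dpi)

# A ~1.3-megapixel image (1280 x 1024) at 200 dpi:
w, h = max_print_size(1280, 1024, dpi=200)
# roughly 6.4 x 5.1 inches, consistent with the 4 x 6 figure above
```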
Noise
Noise is interference in the image caused by the camera's electronics. Noise can affect
certain colors more than others, with blue usually being the most affected. How well the camera
can remove the interference determines the level of noise that we will see. Compression formats
can increase the apparent noise through their algorithms; JPEG compression, for example, will
amplify noise in a picture.
[Figures: low noise level; high noise level]
White Balance
White balance is the system used to compensate for different lighting conditions. The human eye
compensates automatically for types of lighting, such as fluorescent light or sunlight, but digital
cameras need a way of determining what the white point is. Fluorescent lights will normally
cause a greenish cast in the background, while mercury vapor and halogen lamps will cause an
orange-yellow hue in the picture. Digital cameras normally come with an automatic white
balance: the camera looks at the scene, picks a white point, and sets the level. Some better
camera models allow the operator to adjust the white balance manually or select from a series of
presets.
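One classic automatic approach can be sketched with the gray-world assumption: scale each channel so the scene's average color becomes neutral gray. This is a common textbook method, not necessarily what any particular camera uses; the cast colors are invented example values.

```python
def gray_world_balance(pixels):
    """Auto white balance via the gray-world assumption: compute the
    per-channel averages, then apply gains that pull the scene average
    toward neutral gray."""
    n = len(pixels)
    avg = [sum(p[i] for p in pixels) / n for i in range(3)]
    gray = sum(avg) / 3
    gains = [gray / a for a in avg]
    return [tuple(min(255, p[i] * gains[i]) for i in range(3))
            for p in pixels]

# A greenish cast (as under fluorescent light) is pulled toward neutral:
cast = [(80, 120, 80), (160, 240, 160)]
balanced = gray_world_balance(cast)   # R, G, B now roughly equal
```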
File Formats and Storage
File Formats
A digital image uses memory. A 1600 x 1200 image in compressed format can be around 1.5
MB of information; an uncompressed file can be almost four times larger. Digital imaging
devices use compression to store images more efficiently. Compression is accomplished through
repetition and irrelevancy. In an average image, certain patterns are evident: a picture of a red
car is going to have the color red repeated over and over again. The software recognizes and
takes advantage of this, so there is no loss of data on reconstruction. Irrelevancy means that
digital cameras can record more information than the human eye can perceive. The software in
the camera eliminates details that are not important to the human eye. For smaller images, like
a thumbnail view, a lot of image data can be thrown away without sacrificing apparent quality.
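The repetition idea can be sketched with run-length encoding, the simplest compression scheme that exploits repeated values. Real image codecs are far more sophisticated; this toy example just shows why the "red car" picture compresses well and why the reconstruction is lossless.

```python
def rle_encode(data):
    """Run-length encoding: store each repeated value once with a count."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return runs

def rle_decode(runs):
    """Expand the (value, count) runs back into the original sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# A row of pixels from the 'red car': long runs of the same color
row = ['red'] * 6 + ['black'] * 2 + ['red'] * 4
encoded = rle_encode(row)             # 3 runs instead of 12 values
assert rle_decode(encoded) == row     # lossless: reconstruction is exact
```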
Tagged Image File Format (TIFF)
Tagged Image File Format (TIFF) was designed from the ground up to address problems with
compression normally associated with fixed file formats, combining aspects of existing image
file formats. TIFF was designed to do three things: it had to be extensible, so that older images
and applications could still be read; portable, so that viewing did not depend on the operating
system; and revisable. Too many new technologies are not upgradable, and TIFF was designed
with this in mind. All major office applications and scanning technologies support the TIFF
format. TIFF solves some of the problems of earlier file formats, but because the code is longer,
it results in slower execution when reading and writing files and requires better hardware to
process the images adequately.
Joint Photographic Experts Group (JPEG)
The Joint Photographic Experts Group (JPEG) compression method is probably the most widely
used format today. The purpose of the committee that created it was to reduce the size of
natural, true-color images without affecting the quality as judged by the human eye. The JPEG
format is best suited to natural photography because there are no sharp lines in nature; abrupt
changes and hard edges do not compress well with JPEG. As JPEG compresses files further, the
deterioration in image quality becomes more evident. JPEG works by converting raw data into
luminance (the brightness of the subject) and chrominance (hue, or color). Because luminance
is more important to our vision, JPEG algorithms retain more luminance information.
Storage
The great thing about digital imaging is that if you don’t like the image you can always delete it.
However the down side is that you need to have a method for storing the images. Cameras store
images by using a variety of memory types. Cameras will normally store their images in JPEG
format, with higher quality and lower quality modes available. Higher end cameras will support
the recording of raw data, but it takes up much more memory and needs to be converted to a file
type prior to being viewed. The advantage of this is that all the raw data is saved and the image
can be manipulated using the original data. The disadvantage is that it uses a lot of memory and
you need to have adequate memory for storing the images.
Built-in flash memory can hold a limited number of images but cannot be upgraded. Compact
Flash cards are available in capacities up to one gigabyte and store data in a manner similar to a
hard drive. SmartMedia cards are smaller modules and were one of the first types of cards used
for storing data. Sony cameras use a proprietary storage format called Memory Stick. Some
older and less expensive cameras use a floppy disk drive; these are very limited in the number
of pictures that can be taken, but are easy to use because most computers already have a floppy
drive for transferring the images. Images can be transferred to the computer for storage on a
hard drive or recordable media like DVD-RAM or CD-R/CD-RW through a USB or serial
interface. Some newer cameras use infrared to transmit data, and newer technologies such as
Bluetooth are being incorporated.
Current uses of Digital Imaging
Home Uses
For the average home user, we normally use digital imaging to email pictures to friends or store
images on our computers. More advanced users will create web pages or take several shots to
create panoramic views by stitching the images together. New software makes manipulation of
images extremely easy. You can cut a person out of one image and place them in another, add
text, resize or crop. More and more people are investing in digital cameras as the prices continue
to drop for good quality cameras.
Business Uses
The entire printing industry is moving to digital format. Newspapers and magazines get their
images in digital format and this allows easy manipulation for page layouts and design.
Space and the Hubble Space Telescope
Launched in 1990, the technologies used in the Hubble Space Telescope have revolutionized
digital imaging. Because of its modular design, new components can be added and upgraded on
a regular basis. It produces stunning images of our universe and on any given day transmits ten
to fifteen gigabytes of data to astronomers. Advances in CCD technology have provided for three
recent upgrades to the Hubble Space Telescope. The first is the Advanced Camera for Surveys
(ACS). It employs two 8-megapixel CCDs with enhanced coatings. These coatings allow 85%
of the photons to be absorbed into the detector. The larger field of view and greater sensitivity
increase the recognition of deep-space objects by a factor of ten. The second is the Wide Field
Camera 3 (WFC3), which uses a two-detector system to record the near-ultraviolet to
near-infrared spectrum.
The new sensors have less noise and higher sensitivity. The third is the Cosmic Origins
Spectrograph (COS). This device records ultraviolet light so that the distribution of matter in the
empty spaces between galaxies can be measured.
Medical Uses
Uses in the medical field are endless. Major advances in radiology in recent years can be
attributed to the change from analogue to digital imaging and to advances in electronics and
computing. Common uses such as magnetic resonance imaging (MRI), ultrasound, computed
tomography (CT) and nuclear medicine are digital in format, but until recently have been
displayed using conventional film in an analogue format. Advances in computers and digital
displays have made it possible to show images of the same quality as film, and digitization
allows the user to manipulate images for better processing. Digital images can be easily
transferred within
an internal network or across the country in seconds so a specialist can view the images. This
also provides better coverage for rural areas where small hospitals and clinics cannot afford to
keep specialist staff on duty. Storing images digitally allows for quick recall and comparison of
a patient’s medical condition. Several physicians, not even located in the same areas of the
country, can review a patient’s condition and make recommendations as to treatment.
Government, Law Enforcement and Security
Digital imaging is used in the biometrics field for law enforcement and security. A scan can
quickly be made of an individual and be loaded into a computer for processing. The information
can be shared nationwide in a matter of seconds. No more ink for fingerprinting and visual
identification. Fingerprints can be digitally scanned and compared to thousands (or millions) of
prints in a matter of minutes. The State of Connecticut recently employed a system that required
welfare recipients to be biometrically imaged for identification purposes. Fingerprints are
digitally imaged and scanned into a computer for a reference file. A digital facial image and
signature scan is completed as well. When the welfare recipient applies for benefits, a
comparison scan is done to ensure a match with the prints on file. This system virtually
eliminates fraud and prevents recipients from applying for benefits at several welfare offices
under different names.
Other Uses
Archaeology has found uses for digital imaging as well. Contents of crypts, containers, and even
envelopes can be scanned using CT or MRI technology and the images transferred to a computer
for manipulation. These methods are non-destructive and do not require the breaking open of
artifacts to determine their contents.
Cosmetic Surgeons and hair stylists can take a picture of a person and show the possible results
of different hair styles or cosmetic procedures before the work is even done. Don’t like your
hairstyle or color? You can try to see how different ones look before changing it.
Conclusion
Digital imaging is used in everything from home photography to the study of historical
documents, and advances in the medical and space technology fields. Once considered too
expensive and the quality would never compare to film, it has since proven that it is a much more
effective medium for storage of images. Purists will tell you that digital imaging will never
replace film, but image quality is already at a point that more than rivals 35-mm prints. As
sensors continue to improve, better quality images will result and new uses will be developed.