Sensing for Robotics & Control – Remote Sensors
R. R. Lindeke, Ph.D.

Remote Sensing Systems:
- Radar – uses long-wavelength microwaves for point or field detection
  - Speed and range analysis
  - Trajectory analysis
- Sonar – uses high-energy/high-frequency sound waves to detect range or create images in "conductive media"
- Vision systems – operate in the visible or near-visible light regimes; use structured light and high-contrast environments to control data-mapping problems

Parts of the Remote Sensors – Field Sensors
- The source of information is a structured illumination system (high contrast)
- The receiver is a field detector – in machine vision it is typically a CCD or CID
  - A CCD is a charge-coupled device, in which each detector (phototransistor) stores charge on a capacitor that is regularly sampled/harvested through a field sweep using a "rastering" technique (at about 60 Hz)
  - A CID is a charge-injected device, in which each pixel can be randomly sampled at variable times and rates
- Image analyzers examine the raw field image and apply enhancement and identification algorithms to the data

The Vision System Issues:
- Blurring of moving objects
  - A result of the data capture rastering through the receiver's 2-D array: the sampling system lags the real world as it processes field by field, with parts of the information being harvested while adjacent pixels continue to change in time
  - Limits the speed of response, the speed of objects, and system throughput rates
- Contrast enhancements are developed by examining the extrema in the field of view (a sketch follows the formula below):
  $I_{enh} = \dfrac{I_{rec} - I_{min}}{I_{max} - I_{min}}$, applicable at each pixel
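
A minimal sketch of this per-pixel normalization, assuming the field is held as a NumPy grayscale array (the function name and example values are illustrative):

```python
import numpy as np

def enhance_contrast(field):
    """Stretch pixel intensities so the field extrema span [0, 1]:
    I_enh = (I_rec - I_min) / (I_max - I_min) at every pixel."""
    field = field.astype(np.float64)
    i_min, i_max = field.min(), field.max()
    if i_max == i_min:                       # flat field: nothing to stretch
        return np.zeros_like(field)
    return (field - i_min) / (i_max - i_min)

# A dim, low-contrast 8-bit patch becomes full-range after enhancement
raw = np.array([[100, 110], [120, 130]], dtype=np.uint8)
print(enhance_contrast(raw))                 # values now run from 0.0 to 1.0
```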

Some Additional Issues:
- Must beware of "bloom" in the image
  - Bloom is a problem in which a high-intensity pixel overflows into adjacent pixels, increasing or changing the size of an information set
- Also consider lensing and operational errors:
  - Vignetting – lenses transmit more effectively at their center than at their edges, leading to intensity variation across the image field even without changes in the image information itself
  - Blur – caused by lack of full-field focus
  - Distortion – parabolic and geometric changes due to lens shape errors
  - Motion blur – moving images are "smeared" over many pixels during capture (for a CCD system we typically sample 3 to 5 fields to build a stable image, limiting us to about 12 to 20 stable images/sec)

Data Handling Issues:
- A typical field camera (780 x 640, or 499,200 pixels/image) with 8-bit color means 3 separate 8-bit words (24-bit color) per pixel (32-bit color typically adds a saturation or brightness byte as well)
- Data in one field image as captured during each rastering sweep: 499,200 pixels x 3 bytes of color = 1,497,600 bytes/image
- In a minute: 1,497,600 bytes x 60 frames/s x 60 s/min ≈ 5.4 GBytes (raw, i.e. without compression or processing), or roughly 323 GBytes per hour of video information
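
These raw-data figures can be checked with a few lines (frame size, color depth, and frame rate as above; GB here means 10^9 bytes):

```python
WIDTH, HEIGHT = 780, 640        # pixels per field image
BYTES_PER_PIXEL = 3             # 24-bit color: one byte each for R, G, B
FPS = 60                        # rastering sweeps per second

bytes_per_image = WIDTH * HEIGHT * BYTES_PER_PIXEL   # 1,497,600 bytes
bytes_per_minute = bytes_per_image * FPS * 60
bytes_per_hour = bytes_per_minute * 60

print(f"{bytes_per_image:,} bytes per image")
print(f"{bytes_per_minute / 1e9:.2f} GB per minute of raw video")
print(f"{bytes_per_hour / 1e9:.1f} GB per hour of raw video")
```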

Helping with this 'Data Bloat'
- Do we really need color?
  - If not, the data is reduced by a factor of 3
- Do we really need "shades"?
  - If not, the data set drops by a factor of 8
  - But this requires 'thresholding' of the data field
- Thresholding is used to construct 'bit maps' (a sketch follows this list)
  - After sampling test cases, we set a level of pixel intensity at or above which a pixel counts as 1 or 'on', while below this level the pixel is 0 or 'off', regardless of image difficulties and material variations
  - Consideration is reduced to 1 bit rather than the 8 to 24 bits in the original field of view!
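
A minimal thresholding sketch, assuming an 8-bit grayscale input; the cutoff value would come from sampling test cases as described above:

```python
import numpy as np

def threshold_to_bitmap(gray, cutoff):
    """Reduce an 8-bit grayscale field to a 1-bit map: 1 ('on') at or above
    the cutoff intensity, 0 ('off') below it."""
    return (gray >= cutoff).astype(np.uint8)

gray = np.array([[ 30, 200,  40],
                 [220, 210,  25],
                 [ 35, 190, 230]], dtype=np.uint8)
print(threshold_to_bitmap(gray, cutoff=128))   # 1 bit per pixel instead of 8-24
```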

Analyzing the Images
- Do we really need the entire field – or just the important parts?
  - But this requires post-processing to analyze what is in the 'thresholded' image
- Image processing is designed to build or "grow" field maps of the important parts of an image for identification purposes
- These field maps must then be analyzed by applications that can make decisions, using some form of intelligence applied to the field data sets

Image Building
- First we enhance the data array
- Then we digitize (threshold) the array
- Then we look for image edges – an edge is where the pixel value changes from 0 to 1 or from 1 to 0! (a sketch follows below)

[Figure: raw image before thresholding and image analysis]
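
One way to read "an edge is where the pixel value changes" on a thresholded array is to flag every pixel whose right-hand or lower neighbor differs; a minimal sketch (binary NumPy input assumed, and this particular neighbor convention is an assumption):

```python
import numpy as np

def edge_pixels(bitmap):
    """Mark pixels of a 0/1 array whose value differs from the pixel to the
    right or the pixel below, i.e. where a 0-to-1 or 1-to-0 change occurs."""
    edges = np.zeros_like(bitmap, dtype=bool)
    edges[:, :-1] |= bitmap[:, :-1] != bitmap[:, 1:]   # horizontal transitions
    edges[:-1, :] |= bitmap[:-1, :] != bitmap[1:, :]   # vertical transitions
    return edges

bitmap = np.array([[0, 0, 0, 0],
                   [0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 0, 0, 0]], dtype=np.uint8)
print(edge_pixels(bitmap).astype(int))   # 1s outline the state changes
```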

Working the Array – Hardware and Software

[Figure: bottles for selection, after reorganizing the image field]
[Figure: field image after thresholding]



The final image is a series of
On and Off Pixels (the light
and dark parts of the 2nd
Image as seen on the
previous slide)
The image is then scanned
to detect edges in the
information
One popular method is
using an algorithm “GROW”
that searches the data array
(of 1 and 0’s) to map out
changes in pixel value

[Figure: pixel-neighborhood diagram, cells labeled a through g]

Using Grow Methods
- We begin a directed scan – once a state-level change is discovered, we stop the directed scan and look around the changed pixel to see whether it is just a bit error
- If it is next to changed bits in all "new" directions, we start exploring for edges by stepping forward from the first bit and stepping back and forth about the change line as it "circumvents" the part
- The algorithm is then said to grow shapes from full arrays, but without exhaustive enumeration!

So let's see if it works:
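
The slide only sketches GROW, so the following is a hypothetical minimal version of the idea: raster-scan until a state change is found, confirm it is not an isolated bit error, then grow the connected shape outward from that seed instead of enumerating the whole array. The function names and the 4-neighbor connectivity are assumptions.

```python
import numpy as np
from collections import deque

def find_seed(bitmap):
    """Directed raster scan: return the first 'on' pixel that has at least one
    'on' neighbor (i.e. the state change is not just an isolated bit error)."""
    rows, cols = bitmap.shape
    for r in range(rows):
        for c in range(cols):
            if bitmap[r, c]:
                if any(0 <= nr < rows and 0 <= nc < cols and bitmap[nr, nc]
                       for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))):
                    return r, c
    return None

def grow_region(bitmap, seed):
    """Grow the connected blob of 'on' pixels outward from the seed, visiting
    only pixels adjacent to the shape rather than the full array."""
    rows, cols = bitmap.shape
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and bitmap[nr, nc] and (nr, nc) not in region):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region

bitmap = np.array([[0, 0, 0, 0, 0],
                   [0, 1, 1, 1, 0],
                   [0, 1, 1, 1, 0],
                   [0, 0, 0, 0, 0]], dtype=np.uint8)
seed = find_seed(bitmap)
print(sorted(grow_region(bitmap, seed)))   # the six pixels of the blob
```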

Once Completed:
- An image must be compared to standard shapes
- The image can be analyzed to find centers, sizes, or other shape information
- After analysis is completed, objects can then be handled and/or sorted
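
Finding a center and size from a grown blob is straightforward; a sketch, assuming the blob is represented as a set of (row, col) pixel coordinates such as the region-growing output above:

```python
def blob_center_and_size(region):
    """Return the area (pixel count) and centroid of a blob given as a set of
    (row, col) pixel coordinates."""
    area = len(region)
    cy = sum(r for r, _ in region) / area
    cx = sum(c for _, c in region) / area
    return area, (cy, cx)

# Example: a 2 x 3 block of 'on' pixels
region = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (2, 3)}
print(blob_center_and_size(region))   # (6, (1.5, 2.0))
```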

Sorting Routines:
- Based on conditional probabilities:

  $p(w_i \mid x) = \dfrac{p(x \mid w_i)\, p(w_i)}{\sum_j p(x \mid w_j)\, p(w_j)}$

- This is a measure of the probability that x is a member of class i ($w_i$), given the likelihoods of the measurement x under each of the candidate classes in the study (the $w_j$'s)
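
A sketch of this normalization (the likelihood values are the class-conditional probabilities computed in the worked example later in these notes, and the equal priors are an assumption):

```python
def posterior(likelihoods, priors):
    """Bayes' rule: p(w_i|x) = p(x|w_i) p(w_i) / sum_j p(x|w_j) p(w_j)."""
    weighted = [lk * pr for lk, pr in zip(likelihoods, priors)]
    total = sum(weighted)
    return [w / total for w in weighted]

# Likelihoods of the measured body diagonal under classes A, B, C, equal priors
print(posterior([0.00244, 8.876e-8, 0.3645], [1/3, 1/3, 1/3]))
# Class C dominates the posterior (~0.99)
```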

Typically a Gaussian Approximation is Assumed:
- We perform the characteristic measurement (with the vision system)
- We compute the conditional probability that x fits each class j; the class with the highest value is accepted as the best fit (if the classes are mutually exclusive)

  $p(x \mid w_i) = \dfrac{1}{\sqrt{2\pi}}\, e^{-Z^2/2}$, where $Z = \dfrac{x - \mu_i}{\sigma_i}$
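
A sketch of this form, which evaluates the standard-normal density at the class Z-score (reading the slide's formula literally, without a 1/σ factor; that reading is an assumption, and it is adequate for comparing classes whose σ values are similar):

```python
import math

def z_score(x, mean, sigma):
    """Z = (x - mu_i) / sigma_i for class i."""
    return (x - mean) / sigma

def class_likelihood(x, mean, sigma):
    """p(x|w_i) as on the slide: (1/sqrt(2*pi)) * exp(-Z**2 / 2)."""
    z = z_score(x, mean, sigma)
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

# Values from the worked example below: unknown diagonal 3.681" vs. class C
print(round(z_score(3.681, 3.6912, 0.02401), 4))           # -0.4248
print(round(class_likelihood(3.681, 3.6912, 0.02401), 4))  # 0.3645
```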

Let's Examine the Use of This Technique:
- Step 1: Present a "training set" to the camera system, including representative sizes and shapes of each type across its acceptable sizes
- Step 2: For each potential class, using its learned values, compute a mean and a standard deviation
- Step 3: Present unknowns to the trained system and make measurements – compute the appropriate dimensions and the conditional probability for each potential class
- Step 4: Assign the unknown to the class having the highest conditional probability – if that value is above a threshold of acceptability
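
A hypothetical end-to-end sketch of these four steps (the training measurements, the acceptance threshold, and the use of a plain sample standard deviation in place of the range/d2 or c4 estimators discussed below are all illustrative assumptions):

```python
import math
import statistics

def train(samples_by_class):
    """Steps 1-2: learn a mean and standard deviation per class from training
    measurements."""
    return {name: (statistics.mean(vals), statistics.stdev(vals))
            for name, vals in samples_by_class.items()}

def classify(x, model, accept=0.05):
    """Steps 3-4: compute the conditional probability for each class and assign
    the unknown to the best class if its value clears the acceptance threshold."""
    scores = {}
    for name, (mean, sigma) in model.items():
        z = (x - mean) / sigma
        scores[name] = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    best = max(scores, key=scores.get)
    return (best if scores[best] >= accept else "hand-inspect"), scores

# Illustrative training data: measured body diagonals for three part classes
model = train({"A": [3.592, 3.606, 3.619],
               "B": [3.802, 3.816, 3.830],
               "C": [3.678, 3.691, 3.705]})
print(classify(3.681, model))   # assigns the unknown to class "C"
```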

Using Vision to Determine Class & Quality
- A system to sort by "body diagonals" (BD) for a series of rectangular pieces:
  - A is 2 ± 0.01" x 3 ± 0.01"
  - B is 2 ± 0.01" x 3.25 ± 0.01"
  - C is 1.75 ± 0.01" x 3.25 ± 0.01"
- Body diagonals with the part dimensions at their acceptable limits (see the check after this list):
  - A: $\sqrt{1.99^2 + 2.99^2}$ to $\sqrt{2.01^2 + 3.01^2}$, i.e. 3.592" to 3.619" (mean 3.606")
  - B: $\sqrt{1.99^2 + 3.24^2}$ to $\sqrt{2.01^2 + 3.26^2}$, i.e. 3.802" to 3.830" (mean 3.816")
  - C: $\sqrt{1.74^2 + 3.24^2}$ to $\sqrt{1.76^2 + 3.26^2}$, i.e. 3.678" to 3.705" (mean 3.691")
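
These limits can be reproduced directly (dimensions and tolerances as listed above):

```python
import math

def diagonal_limits(width, height, tol=0.01):
    """Smallest and largest body diagonal of a rectangle at its tolerance
    limits, plus the mean of the two."""
    low = math.hypot(width - tol, height - tol)
    high = math.hypot(width + tol, height + tol)
    return low, high, (low + high) / 2

for name, (w, h) in {"A": (2.00, 3.00), "B": (2.00, 3.25), "C": (1.75, 3.25)}.items():
    low, high, mean = diagonal_limits(w, h)
    print(f'{name}: {low:.3f}" to {high:.3f}" (mean {mean:.3f}")')
```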

Computing Class Variances:
- Can use range techniques: find the range of the samples for a class, then use the statistic d2 to compute σ: σ_class_i = R_sample / d2
- Can also compute an estimate of σ_class using sample standard deviations and the c4 statistic: σ_class_i = s_sample / c4
- d2 and c4 are available in any good engineering statistics text! – see handout

Computing Variances:
- Here, using estimates from the ideal values of the BD range (with a sample size of 2; d2 changes with sample size), σ_class_i is:

  $\sigma_A = 0.0277 / 1.128 = 0.02394$  (from the "range")
  $\sigma_B = 0.02752 / 1.128 = 0.02439$
  $\sigma_C = 0.02709 / 1.128 = 0.02402$

Fitting an Unknown:
- The unknown's body diagonal is measured at 3.681"
- Compute Z and the conditional probability for each class:

  $Z_A = (3.681 - 3.606) / 0.02349 = 3.1928$, so $p(x \mid w_A) = \dfrac{1}{\sqrt{2\pi}} e^{-(3.1928)^2/2} = 0.00244$

  $Z_B = (3.681 - 3.816) / 0.02439 = -5.5351$, so $p(x \mid w_B) = \dfrac{1}{\sqrt{2\pi}} e^{-(-5.5351)^2/2} = 8.876 \times 10^{-8}$

  $Z_C = (3.681 - 3.6912) / 0.02401 = -0.4248$, so $p(x \mid w_C) = \dfrac{1}{\sqrt{2\pi}} e^{-(-0.4248)^2/2} = 0.3645$

- From our analysis we would tentatively place the unknown in class C – but more likely we would place it in a hand-inspect bin!
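
These Z-scores and conditional probabilities can be checked with a few lines (class means and σ values taken from the preceding slides):

```python
import math

classes = {"A": (3.606, 0.02349), "B": (3.816, 0.02439), "C": (3.6912, 0.02401)}
x = 3.681   # measured body diagonal of the unknown

for name, (mean, sigma) in classes.items():
    z = (x - mean) / sigma
    p = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    print(f"{name}: Z = {z:+.4f}, p(x|w_{name}) = {p:.3e}")

# Class C has by far the highest conditional probability, so the unknown is
# tentatively assigned to C (or sent to hand inspection if the value is judged
# too low to accept).
```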