Studying Analog Meter System using LabVIEW-based Vision
Raul G. Longoria
Updated Summer 2014
ME 144L – Prof. R.G. Longoria
Dynamic Systems and Controls Laboratory
Department of Mechanical Engineering
The University of Texas at Austin
The basic analog voltage meter is an electromechanical system. The rotational dynamics are constrained, with the needle deflecting over a range of about 90 degrees.
[Figure: meter front panel and rear terminals, with the electrical circuit model (V_in, series resistor, meter movement) and the rotational system that drives the needle.]
The analog voltage meter is designed to position the needle at an angle proportional to the voltage applied at the input terminals.

For this meter, deflecting the needle 90 degrees (full scale; red arrow in the figure) requires a current of I_m = 1 mA, so to size the series resistor use KVL around the loop (input V_in, series resistor R_series, meter resistance R_m, and the voltage drop V_G across the meter's electromechanical conversion):

V_in − R_series·I_m − R_m·I_m − V_G = 0

Solving for the series resistance,

R_series = (V_in − R_m·I_m − V_G) / I_m = (15 V − (85 Ω)(1 mA) − 0 V) / (1 mA) = 14,915 Ω
When the needle is not moving, V_G = 0 V. Note that this is a static model; no dynamic effects are considered important, since the meter is designed to measure DC voltage.
The static predictive model is thus:

θ = (90 deg / 15 V) · V_in
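As a quick numerical check of the sizing calculation above, here is a minimal Python sketch (not part of the lab VIs) that reproduces the series-resistor value and the static gain; the component values are the ones stated on this slide:

# Numerical check of the series-resistor sizing and the static gain.
# Values from this slide: R_m = 85 ohm, full-scale current 1 mA,
# full-scale input 15 V, and V_G = 0 V for a stationary needle.

def size_series_resistor(v_full_scale, r_meter=85.0, i_full_scale=1e-3, v_g=0.0):
    """Series resistance (ohms) needed for full-scale deflection."""
    return (v_full_scale - r_meter * i_full_scale - v_g) / i_full_scale

R_series = size_series_resistor(15.0)    # -> 14915.0 ohms
K_theta_v = 90.0 / 15.0                  # static gain, deg per volt -> 6.0

print(f"R_series = {R_series:.0f} ohm")
print(f"K_theta/v = {K_theta_v:.1f} deg/V")

The same function answers Pre-Lab 1 if you pass the 10 V full-scale input instead of 15 V.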
See Appendix A for a description of the lab equipment.
Pre-Lab 1. Sizing the series resistor in the analog meter
If you wanted the analog meter used in this lab to have full
scale response with a 10 volt input, what series resistance
would you use? Show and explain your work.
As before, with V_G = 0 when the needle is not moving:

V_in − R_series·I_m − R_m·I_m − 0 = 0,   so   R_series = (V_in − R_m·I_m − V_G) / I_m
As long as there are no physical changes, the needle position will
respond in the manner intended by the design.
This is an open-loop control system design.
[Block diagram: V_in → Analog Meter, θ = (90 deg / 15 V)·V_in → θ]
Think of this as a system that we are trying to control precisely, recognizing that there could be changes or disturbances that were not accounted for in the design.
Later we'll discuss the advantages of a closed-loop control system and show that a key benefit of feedback is making the system insensitive to such effects.
First, however, we’ll study the input-output relation for this system.
A measurement system is needed that provides a measure of θ.
The meter as a system under study: [block diagram: V_in → Analog Meter → θ]
We want to verify the meter model that relates position to the input
voltage – a transfer function.
In this lab, all measurement and control is conducted using:
(a) LabVIEW-based vision to measure position, θ, and
(b) DAQ analog output* (AO) to provide the excitation voltage.
*It is an advantage of this small lab system that the DAQ output has sufficient
current to drive our actuator – no amplifier is needed.
Lab Objectives (1st week)
• Learn how to use LabVIEW for image analysis
and image capture
• Run basic image analysis and image capture
experiments
• Characterize the meter system using a static
gain relating angular position to input voltage
• Demonstrate open-loop control of angular
position
We’ll develop a way to measure the
angular position using machine
vision VIs
[Figure: meter image with measured points (x0, y0) and (xv, yv); the sides a, b, c and the angle θ are labeled, as in Pre-Lab 2 below.]
It should be clear that measuring the position of the needle is difficult if not
impossible using any ‘contact’ sensors, so vision is an excellent option.
Pre-Lab 2. Finding angle from three measured points
Given you can measure three different points from the
image, as shown below, express an algorithm that will
allow you to find needle length, a, the distance, c, and the
angle, θ.
[Figure: measured points (x0, y0), (x, y), and (xv, yv), with the sides a, b, c and the angle θ labeled.]
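One plausible way to organize the Pre-Lab 2 algorithm, sketched in Python rather than LabVIEW. The point roles below are assumptions for illustration: (xv, yv) is taken as the needle pivot, (x, y) as the measured needle tip, and (x0, y0) as a fixed reference point; with those roles, a, b, c are the three side lengths of the triangle they form and θ follows from the law of cosines. Adapt the roles to match your own figure.

import math

def needle_geometry(x0, y0, x, y, xv, yv):
    """Return (a, c, theta_deg) for the triangle formed by the reference point
    (x0, y0), the needle tip (x, y), and the pivot (xv, yv).
    Assumed roles, for illustration only:
      a = pivot-to-tip distance (needle length)
      b = pivot-to-reference distance
      c = reference-to-tip distance
      theta = angle at the pivot between the needle and the reference direction
    """
    a = math.hypot(x - xv, y - yv)      # needle length
    b = math.hypot(x0 - xv, y0 - yv)    # reference side
    c = math.hypot(x - x0, y - y0)      # tip-to-reference distance
    # Law of cosines: c^2 = a^2 + b^2 - 2*a*b*cos(theta)
    theta = math.degrees(math.acos((a**2 + b**2 - c**2) / (2.0 * a * b)))
    return a, c, theta

# Example with made-up pixel coordinates (reference straight above the pivot):
print(needle_geometry(x0=320, y0=80, x=200, y=120, xv=320, yv=240))   # theta = 45 deg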
LabVIEW-based Vision
• LabVIEW Vision enables you to read/create image
files and provides means for managing those files
• There are built-in functions (VIs) for analyzing image
files (select areas of interest, measure intensity, etc.)
• You also need the LabVIEW IMAQ software, which enables you to acquire images from cameras.
• In this course, we want to demonstrate how you can use these software tools to develop a simple vision-based measurement system, particularly for measuring object motion.
Overview of LV-Based Vision Tools
• Image data type
• Analyzing images
• Capturing images
(Palette groups: Vision Utilities, Image Processing, Machine Vision)
Analyzing Images
• Vision Utilities – VIs for creating and
manipulating images, etc.
• Image Processing – provides ‘low level’ VIs
for analyzing images.
• Machine Vision – groups many practical VIs
for performing image analysis. For example,
the “Count and Measure Objects” VI is found
under this group.
Vision Utilities
To create and manipulate images
• Image management (create, dispose, etc.)
• File handling
• Image manipulation
• Pixel editing
• etc.
• Best to learn their use through examples.
Reading an Image File
This VI block diagram opens an existing image file (e.g., a bitmap), reads the
file, and then displays it.
On the Front Panel, place an ‘Image Display’ to get this terminal; then wire the image data to it.
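The block diagram itself is not reproduced in this transcript. As a rough text analogue of the same open-read-display flow (illustration only, using Python and Pillow rather than the LabVIEW Vision VIs):

# Rough analogue of the open -> read -> display flow shown in the VI.
# Pillow (PIL) is used purely for illustration; the lab uses LabVIEW Vision.
from PIL import Image

path = "clamp.bmp"           # hypothetical file name; use any bitmap you have
img = Image.open(path)       # open and read the image file
print(img.size, img.mode)    # image size in pixels and the pixel format
img.show()                   # display the image in the default viewer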
Example – looking at an image
The simple VI in the
previous slide can be
used to open an image
file.
This read-out indicates the
size of the image (pixels).
When moving the cursor
around the image, the
readout shows cursor (x,y)
coordinates and the
‘intensity’ value at that
location.
This is the ‘clamp’ example image file provided with LabVIEW.
Machine Vision Concepts*
• Machine (or computer) vision has six principal areas:
  1. Sensing – which yields a visual image
  2. Preprocessing – noise reduction, image enhancement, etc.
  3. Segmentation – partitioning an image into objects of interest
  4. Description – computation of features for differentiating among types of objects
  5. Recognition – identifying objects (e.g., bolt, wrench, etc.)
  6. Interpretation – assigning meaning to an ensemble of recognized objects
• Levels of processing are divided into low (1, 2), medium (3, 4, 5), and high (6).
• In this course, we’ll primarily be concerned with low-level vision, and will utilize some functions of medium-level vision.
Machine Vision Concepts*
• Low-level vision involves processes considered ‘primitive’ (or automatic) and requiring no ‘intelligence’ (1, 2). This could be thought of as analogous to how a human eye senses and adapts.
• Medium-level vision extracts, characterizes, and labels components in an image.
• High-level vision refers to processes that attempt to emulate perception and cognition.
*From Fu, Gonzalez, and Lee, Robotics: Control, Sensing, Vision, and Intelligence, McGraw-Hill,
New York, 1987.
LabVIEW Machine Vision VIs
• Select Region of Interest
• Light Meter
• Count and Measure Objects
• You’ll learn how to use some of these in the
lab. There are many others you can skim
through to get an idea of what is available.
IMAQ Select Region of Interest
You can focus on regions based on a point, line, rectangle, or annulus.
The output from this VI can be sent to other VIs that
require that bounding information.
IMAQ Light Meter
If you want to examine pixel intensity in a certain region, you need rectangle information.

IMAQ Select Rectangle
Use this VI to specify a rectangular area in the image. The rectangle coordinates are output and can be sent to the next function.

[Block diagram annotations: Image, Rectangle, and histogram data (which can be sent directly to a waveform graph).]
Count and Measure Objects
Another VI that needs rectangle information, and which is very useful for basic segmentation, is the ‘Count and Measure Objects’ VI.
This VI needs several inputs, as shown below.
NOTE: This VI requires that you convert the image to grayscale.
IMAQ Cast Image
Example usage: [block diagram showing IMAQ Cast Image in use]
Example – Finding Objects and Intensities
[Front panel: Image, Rectangle, and Error indicators; 3 objects detected.]
The limit on object size prevented the 3 larger objects in the ROI from being identified.
Define ‘Threshold’
Bright objects have ‘high’ intensity values (e.g., 255 for an 8-bit image).
Dark objects have ‘low’ intensity values (e.g., 0 for an 8-bit image).
[Example image: some objects have an intensity close to zero, others close to 255.]
The ‘Threshold’ must often be specified as an input to some machine vision VIs.
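To make the idea concrete, here is a small Python sketch (NumPy and SciPy standing in for the LabVIEW VIs) that thresholds an 8-bit grayscale array and counts the resulting objects, roughly what ‘Count and Measure Objects’ does once a threshold is specified; the array values and threshold are made up for illustration:

import numpy as np
from scipy import ndimage

# Toy 8-bit "image": two dark blobs on a bright background (values made up).
img = np.full((8, 8), 230, dtype=np.uint8)
img[1:3, 1:3] = 10      # dark object 1
img[5:7, 4:7] = 20      # dark object 2

threshold = 128                  # pixels below this value count as object pixels
binary = img < threshold         # True where a (dark) object pixel is

labels, num_objects = ndimage.label(binary)   # connected-component labeling
print("objects found:", num_objects)          # -> 2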
Generating Intensity Histogram
Within the ROI, the Light Meter VI generates a histogram of the intensity values. Note that most of the image is made up of pixels with intensity greater than about 180; white is 255.
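For intuition, the same kind of intensity histogram can be computed in a few lines of NumPy (illustration only, not the Light Meter VI; the image data here is randomly generated):

import numpy as np

# Stand-in for the 8-bit grayscale pixels inside the region of interest.
img = np.random.randint(150, 256, size=(240, 320), dtype=np.uint8)

counts, bin_edges = np.histogram(img, bins=256, range=(0, 256))
# counts[i] is the number of pixels with intensity i; plotting counts against
# 0..255 shows where the image's intensities are concentrated.
print("most common intensity:", counts.argmax())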
Capturing Images
• PCI cards (for capturing from a streaming source)
• USB cameras* (web cams, etc.)
• Firewire cameras
• Ethernet/wireless cameras
• USB is the approach targeted for this course:
  – Low-cost
  – Relatively easy to use

*IMAQdx refers to VIs that can be used with cameras that interface directly (‘DirectShow’).
USB Cameras
• USB cameras are probably the slowest cameras
available, especially the way they are to be used in this
course.
• Our experience has shown that the maximum
bandwidth we can achieve for image acquisition is
about 10 frames/sec (within LabVIEW).
• Some online sources indicate that ‘hacked’ webcams
can achieve 30 frames/sec.
• So it is likely the software environment we’ve chosen (Windows, LabVIEW, communications, etc.) that restricts the performance.
Example: code to acquire USB
camera image
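The LabVIEW block diagram is not reproduced in this transcript. As a rough analogue of the acquire-and-display loop (illustration only; Python with OpenCV standing in for the IMAQdx VIs):

import cv2   # OpenCV, used here only as a stand-in for the IMAQdx acquisition VIs

cap = cv2.VideoCapture(0)                # open the first USB camera
try:
    while True:
        ok, frame = cap.read()           # grab one frame
        if not ok:
            break
        cv2.imshow("USB camera", frame)  # display it
        if cv2.waitKey(1) & 0xFF == ord('q'):   # press 'q' to stop
            break
finally:
    cap.release()
    cv2.destroyAllWindows()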
Example: Front Panel
[Front panel screenshot: the cursor location readout, an N×M 32-bit RGB image display, and the RGB levels at the cursor.]
In lab: Use NI MAX to test the camera and fine-tune settings
1. NI-IMAQ devices
2. Camera name
3. Acquisition Attributes – sometimes the default settings are not suitable. For example, this camera is actually a doc cam; the default resolution was very high and the frame rate too slow for what is needed in this lab, so it was set to a lower resolution and a higher frame rate.
4. Camera Attributes – lets you adjust settings such as focus, contrast, etc. (as long as you have installed the proper camera drivers).
In lab: Build a VI to measure the relationship between input voltage and needle position
This VI uses vision to measure the meter position and controls the analog output on the myDAQ to apply a specified DC voltage.
[Front panel: a control to specify the analog output voltage; the needle ‘bob’ length and angle are measured from the (0, 0) point.]
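A rough text sketch of what that VI does, with the nidaqmx Python package standing in for the myDAQ analog-output VI and a placeholder function standing in for the vision measurement; the channel name and test voltages are assumptions for illustration, and the lab itself builds this in LabVIEW:

import time
import nidaqmx   # assumes the NI-DAQmx driver and its Python package are installed

def measure_needle_angle():
    """Placeholder for the vision-based angle measurement (degrees)."""
    return 0.0   # dummy value; in the lab this comes from the vision VIs

test_voltages = [0.0, 2.5, 5.0, 7.5, 10.0, 12.5, 15.0]   # DC voltages to apply
results = []

with nidaqmx.Task() as task:
    # "myDAQ1/ao0" is a hypothetical channel name; check yours in NI MAX.
    task.ao_channels.add_ao_voltage_chan("myDAQ1/ao0")
    for v in test_voltages:
        task.write(v)              # apply the DC voltage to the meter
        time.sleep(1.0)            # let the needle settle
        results.append((v, measure_needle_angle()))

print(results)   # (input voltage, measured angle) pairs for the static test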
In lab: Run experiments and measure the static relationship
A linear regression is shown, giving a model

θ_m = K_θ/v · v,   with K_θ/v = 15.9 deg/V

For this example, v_OL is the input voltage you’d need to ‘dial in’ to position the needle at a desired angular position, θ_d:

v_OL = K_v/θ · θ_d

We could define the open-loop (dc) gain for this case as

K_v/θ = (K_θ/v)⁻¹ = 0.063 V/deg

Relating back to the model, K_θ/v = r_m / R_m.
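A minimal sketch of the regression step in Python/NumPy; the voltage-angle pairs below are invented for illustration and simply chosen to be consistent with the 15.9 deg/V gain quoted above:

import numpy as np

# Invented (voltage, measured angle) pairs standing in for the lab data.
v = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])           # volts
theta = np.array([0.0, 16.1, 31.7, 47.9, 63.5, 79.6])  # degrees

# Fit theta_m = K_theta_v * v through the origin (least squares).
K_theta_v = float(np.dot(v, theta) / np.dot(v, v))     # deg per volt
K_v_theta = 1.0 / K_theta_v                            # open-loop gain, V per deg

print(f"K_theta/v = {K_theta_v:.1f} deg/V, K_v/theta = {K_v_theta:.3f} V/deg")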
In lab: Build an open-loop controller
A simple open-loop control system for positioning the needle ‘bob’ at a given angular location might be formulated as follows:
1. Specify a desired angular position, θ_d.
2. Use the static model to compute the open-loop control voltage, v_OL = K_v/θ · θ_d.
3. Send this voltage command to an ‘analog output’ VI in LabVIEW.
4. Measure the actual position of the needle ‘bob’ center using the USB vision measurement.
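Steps 1 and 2 amount to a one-line gain calculation; a minimal Python sketch, using the measured open-loop gain from the previous slide and clipping to an assumed 0-15 V output range:

K_v_theta = 0.063   # V/deg, open-loop (dc) gain from the static calibration

def open_loop_voltage(theta_desired_deg, v_max=15.0):
    """Map a desired angle to the voltage to 'dial in' (clipped to the AO range)."""
    v_ol = K_v_theta * theta_desired_deg
    return max(0.0, min(v_max, v_ol))

for theta_d in (15.0, 45.0, 90.0):
    print(theta_d, "deg ->", round(open_loop_voltage(theta_d), 2), "V")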
The following data was obtained for a few simple trials:
[Table: trial results]
These quick tests show that you can get reasonable results, but it’s hard to judge whether you could position any better.
Open-loop control positioning – specify a path
A formula node can be used to create a desired trajectory, where qd is the ‘desired angle’, T is a period parameter, and t is the ‘time value’. Note that this uses the open-loop control model to specify the voltage command to be generated by the AO device.
[Graph: the desired trajectory and the measured open-loop response. The absolute error is especially large during the dynamic response.]
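The formula-node code itself is not reproduced in the transcript; below is a hedged sketch of one possible trajectory of that form (a smooth sinusoidal sweep in t with period parameter T, which is an assumption) together with the open-loop voltage command:

import math

K_v_theta = 0.063    # V/deg, open-loop gain from the static calibration
T = 10.0             # period parameter, seconds (value assumed for illustration)
theta_max = 90.0     # sweep amplitude in degrees (assumed)

def desired_angle(t):
    """qd(t): one possible smooth trajectory; the actual formula-node code may differ."""
    return 0.5 * theta_max * (1.0 - math.cos(2.0 * math.pi * t / T))

def voltage_command(t):
    """Open-loop voltage sent to the AO device at time t."""
    return K_v_theta * desired_angle(t)

for t in (0.0, 2.5, 5.0, 7.5, 10.0):
    print(f"t={t:4.1f} s  qd={desired_angle(t):5.1f} deg  v={voltage_command(t):4.2f} V")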
Summary
• We begin our study of an electromechanical analog meter.
• Vision VIs in LabVIEW provide a way to add image acquisition and analysis to our existing set of tools (simulation, DAQ).
• The vision VIs alone allow you to use an image as a data type.
• Images can be loaded from a file or acquired using IMAQ
routines.
• Once within LabVIEW, an image can be processed using some
very sophisticated built-in programs.
• Machine vision VIs enable us to build a simple ‘motion capture’
system for studying the analog meter response and control.
Appendix A: 144L analog meter box and circuit
• Both switches in the 'up' position (1) place Rs in series with the meter (a 15 V input gives full-scale deflection).
• Both switches in the 'down' position (2) place the potentiometer and the 5.1 kΩ resistor in series (allowing the properties of the system to be changed).
Appendix B: Image Data Type
Menu: NI Measurements->Vision->Vision Utilities->Image Management
This VI is used to create an
image. It is called prior to,
say, capturing an image
using a camera.
Appendix C: Image Type and Bit Depth
• We know digital images are formed by an array of pixels, and each pixel is quantized into a number of levels based on the number of bits available.
• Depending on whether pixels are black and white, grayscale, or color, pixels have different bit depths. Bit depth refers to the amount of information allocated to each pixel.
• When pixels are either black or white, each pixel needs only one bit of information (black or white), and hence the bit depth is 1.
• For grayscale, the number of levels used can vary, but most systems have 256 shades of gray, 0 being black and 255 being white. When there are 256 shades of gray, each pixel has a bit depth of 8 bits (one byte). A 1024 × 1024 grayscale image would occupy 1 MB of memory.
• In digital color images, the RGB (red, green, blue, for screen projection) or CMYK (printing color) schemes are used. Each color occupies 8 bits (one byte), ranging in value from 0 to 255. Hence in RGB each pixel occupies 8×3 = 24 bits (3 bytes), and in CMYK 8×4 = 32 bits (4 bytes).
• Note, LV uses an ‘alpha’ channel for RGB. The alpha channel stores transparency information: the higher the value, the more opaque that pixel is.
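A quick check of the memory arithmetic in the bullets above (Python; the sizes are just the bit depths restated):

def image_bytes(width, height, bits_per_pixel):
    """Memory needed to store an uncompressed image of the given bit depth."""
    return width * height * bits_per_pixel // 8

print(image_bytes(1024, 1024, 8))    # 8-bit grayscale -> 1048576 bytes (1 MB)
print(image_bytes(1024, 1024, 24))   # RGB, 3 bytes per pixel -> 3 MB
print(image_bytes(1024, 1024, 32))   # RGB with alpha (as in LV) or CMYK -> 4 MB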