UNIVERSITY OF NOVA GORICA
SCHOOL OF APPLIED SCIENCES
LIDAR MEASUREMENTS
REPORT FOR PHYSICAL LABORATORY III
Anže Peternel
Mentor: prof. dr. Samo Stanič
Nova Gorica, 9. 4. 2013
1 CONTENTS
1 Contents
2 What is remote sensing and LIDAR
3 Mobile LIDAR at UNG
4 Experiment
5 Questions
5.1 What are the killers of the LIDAR signal?
5.2 Which parameters are related to the atmospheric optical properties from the LIDAR equation, and what do they mean?
5.3 Analyze the sources of noise and what is your way to eliminate the noise from the return signal?
5.4 Why do we range-correct the return signal and take its natural logarithm?
6 Conclusion
7 References
2 WHAT IS REMOTE SENSING AND LIDAR
Remote sensing is defined as the measurement of object properties on the Earth's surface using data acquired from aircraft and satellites. It is therefore an attempt to measure something at a distance, rather than in situ. Since we are not in direct contact with the object of interest, we must rely on propagated signals of some sort, for example optical, acoustic or microwave. There are two basic types of remote sensing. The first is passive remote sensing, which depends on a natural source, such as radiation emitted by the observed object or radiation from a third source, which we cannot control, reflected by it. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared sensors, charge-coupled devices and radiometers. The second type is more flexible and is called active remote sensing. It depends on an artificial source controlled by those who are doing the observation. The source of the signal can be located near or far from the sensor and can emit either electromagnetic waves or sound waves. Typical examples of active remote sensing are SODAR, SONAR, RADAR and LiDAR.
SODAR stands for SOnic Detection And Ranging. It is an active remote sensing device used to remotely measure the vertical turbulence structure and the wind profile of the lower layer of the atmosphere. For detection it uses sound waves.
Figure 1: Application of SODAR
SONAR stands for SOund Navigation And Ranging and is similar to SODAR. This technique also uses propagating sound waves, usually underwater, to navigate, communicate or detect objects on or under the surface of the water, such as other vessels. SONAR is used mainly by navies. It can be both active and passive: passive SONAR essentially only listens for the sound made by other vessels, while active SONAR emits pulses of sound and listens for the echoes.
Figure 2: SONAR used in navy
RADAR stands for RAdio Detection And Ranging. It is an active remote sensing device which uses the electromagnetic spectrum from radio waves to microwaves. The RADAR dish or antenna transmits electromagnetic pulses which bounce off any object in their path. The object returns a tiny part of the wave's energy to a dish or antenna which is usually located at the same site as the transmitter. The uses of RADAR are highly diverse, including air traffic control, air-defense systems, antimissile systems, astronomy, ocean research, outer space surveillance, meteorological monitoring and geological observations.
Figure 3: NEXRAD weather RADAR
LIDAR works in much the same way as RADAR, except that instead of radio waves it uses ultraviolet, visible or near-infrared light. LIDAR stands for LIght Detection And Ranging or Laser Imaging Detection And Ranging. LIDAR devices can be applied in even more fields than RADAR. It has been used extensively for atmospheric research and meteorology. Downward-looking LIDAR instruments fitted to aircraft and satellites are used for surveying and mapping. It is also used for environmental, space and ocean research. It can also be found in the military, where it is used for defense; it is well suited for detecting biological weapon attacks.
Figure 4: Sondrestrom Rayleigh Lidar
Figure 5: Experimental Advanced Airborne Research Lidar (EAARL)
The LIDAR is made of a few different components: a transmitter, a receiver and a detection system with a controller. The transmitter emits short pulses of light with a wavelength as close as possible to the size of the particles we want to observe. LIDAR uses ultraviolet, visible or near-infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules. The wavelength of the light pulse depends on the laser that is installed in the transmitter. The light hits the particles and some of it is reflected back to the receiver; this is called backscattering. There are different types of scattering that are used for different LIDAR applications. The most common are Rayleigh scattering, Mie scattering, Raman scattering and fluorescence. Based on the kind of backscattering used, the LIDAR is accordingly called a Rayleigh LIDAR, Mie LIDAR, Raman LIDAR, Na/Fe/K fluorescence LIDAR, and so on. The receiver is a set of mirrors which redirects the light towards a photodetector. The photodetector converts the received light into an electrical signal, which is digitized and sent to a computer, where the data can be analyzed. The delay of the received pulse tells us the distance from the instrument to the particle, and the intensity of the reflected pulse tells us the density of the particles.
Figure 6: Schematics of LIDAR measurement
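As a small illustration of the ranging principle described above (a sketch only; the delay value is made up, not taken from the measurement):

C = 299_792_458.0                 # speed of light in m/s

def range_from_delay(delay_s):
    # One-way distance for a measured round-trip time delay of the pulse.
    return C * delay_s / 2.0

print(range_from_delay(6.67e-6))  # a 6.67 us delay corresponds to roughly 1000 m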
More about the measurement and data processing can be found in the section »Experiment«.
3 MOBILE LIDAR AT UNG
In 2007 the University of Nova Gorica developed a mobile LIDAR station which includes two elastic (Mie scattering) channels (at 266 nm and 1064 nm) and a fluorescence channel (at 296 nm). The LIDAR can operate in both daytime and nighttime conditions.
Figure 7: Mobile LIDAR on a parking lot during measurements
The transmitter is a CFR400 pulsed laser by Quantel Big Sky, which is capable of simultaneous emission of light at different wavelengths. The CFR400 emits light at the base wavelength of 1064 nm (IR), the second harmonic (532 nm, green) and the fourth harmonic (266 nm, UV). As the attenuation of UV in air is much larger than that of IR, the IR light is used for regular Mie scattering operation and the UV light for the excitation of tryptophan fluorescence in organic materials.
A 12'' Dobsonian telescope by Guan Sheng Optical serves as the receiver. Its 302 mm parabolic primary mirror collects the backscattered light and the induced fluorescence and focuses it at its focal length of 1520 mm, where a secondary mirror is placed. The secondary mirror redirects the light into the detection system installed outside the telescope.
Dichroic mirrors made by SLS Optics Limited were applied to divide elastic scattering (at
1064nm and 266nm) and induced fluorescence (UV). The first dichroic mirror in the receiver
separates UV from IR. UV light is divided once more with the second dichroic mirror, where
induced fluorescence is separated from elastic scattering. In order to separate laser
backscattering signal from the background, interference filters by BARR Associates were
installed. After the filters, the backscattered beam is focused onto the photomultiplier tubes
Hamamatsu R7400-06 (266 and 296 nm) and an avalanche photodiode Si APD S8890-30 by
Hamamatsu. These sensors convert the received light into measurable electrical signals.
Amplitudes of the electrical signals are proportional to the power of received light.
Digitalization of LIDAR measurements is performed by an analog/digital (AD) converter
(Licel transient recorder) and read out by a Linux based computer for data acquisition and
analysis.
Figure 8: The CFR400 by Quantel Big Sky pulse laser
4 EXPERIMENT
The experiment took place on 1 March 2013 in the parking lot behind the University of Nova Gorica at Rožna Dolina and lasted from 11:26 am to 11:58 am. The weather was sunny with only a few clouds. The LIDAR angle was 49° from the horizon. The data came in twelve ASCII files with two columns, where the first column contained the distance and the second the corresponding returned power. Each file had 2667 measurements. Wolfram Mathematica 8 was used for data analysis and plotting. The plot of the raw data does not reveal much, except that the returned power decreases with distance. The plot below was drawn from the first measurement, taken at 11:26 am.
Figure 9: Raw data plot recorded at 11:26 am
The next step in the data analysis was to remove the noise caused by background light from airglow, starlight and reflected sunlight. To get rid of the noise, the average of the last 10 percent of the returned power signal was taken and subtracted from the whole signal. Then the range correction had to be done for each profile: each value of the returned power is multiplied by the square of the corresponding range. To make the plot clearer, the natural logarithm of the range-corrected signal was calculated. The result for the first measurement is shown below; a short code sketch of these steps is given after the figure.
Figure 10: Range corrected plot at 11:26
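A minimal sketch of these processing steps in Python (the report itself used Mathematica 8; the file name and the two-column layout are assumptions based on the text):

import numpy as np

data = np.loadtxt("lidar_1126.txt")        # two columns: range [m], returned power
z, p = data[:, 0], data[:, 1]

n_tail = len(p) // 10                      # last 10 % of the profile is treated as noise
p_clean = p - p[-n_tail:].mean()           # subtract the average background

rcs = p_clean * z**2                       # range correction
log_rcs = np.log(np.where(rcs > 0, rcs, np.nan))   # natural log of positive values only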
The experiment lasted about thirty minutes, and in that time twelve measurements were made. This is more than enough to combine all the data into a time-dependent density plot, in which the color scale represents the amount of reflected energy.
5 QUESTIONS
5.1 What are the killers of the LIDAR signal?
Signal attenuation happens when the light hits a target, for example rocks, rain, chemical compounds, aerosols, clouds or even single molecules. The attenuation is highly dependent on the wavelength of the emitted light. For instance, the attenuation of UV light is much higher than that of IR light.
5.2 Which parameters are related to the atmospheric optical properties from the LIDAR equation, and what do they mean?
This is the single-scattering LIDAR equation for a monostatic single-wavelength pulsed laser:

P(z) = P0 · (c·τ/2) · A · β(z)/z² · exp[ -2 ∫0^z σ(z') dz' ]

The parameters related to the atmosphere are β(z), the volume backscatter coefficient of the atmosphere, and σ(z), the attenuation (extinction) coefficient of the atmosphere.
5.3 Analyze the sources of noise and what is your way to eliminate the noise from the return signal?
The source of noise is background light from airglow, starlight and reflected sunlight. The device that samples the return signal (LICEL) has a larger working range than the laser itself, so the measurements beyond the working range of the laser contain only noise. To get rid of this noise, we take the mean value of the last roughly 10 percent of the profile, which is pure noise, and subtract it from the measurements.
5.4 Why do we range-correct the return signal and take its natural logarithm?
Range correction compensates for the 1/z² geometric decrease of the returned power, so that the variation of the atmospheric properties with distance becomes visible. Because the range-corrected return signal still falls roughly exponentially with distance due to attenuation, we take its natural logarithm to make the plots clearer and more readable.
6 CONCLUSION
The process of learning how LIDAR actually works took me a while. The first day when I was introduced to LIDAR left me a bit confused, because some things were not trivial to me. But I tried to remember as much as possible so that I would be able to completely understand how LIDAR works. This instrument seems very important to me; I was a bit surprised by the fact that LIDAR can be used in so many different fields of science, and also because until about a year ago I had not even heard of LIDAR. That has certainly changed, and I can say that I have learned quite a lot.
Another thing that took me quite some time was the data analysis. I had some difficulties with plotting in Mathematica 8 and sometimes I was not sure whether the plots I made were correct or not. With the help of my schoolmates Miha Gunde, Simon Lukman, Tine Bavdaž and Gregor Maver, I was able to overcome the problems I encountered. I would also like to thank the teaching assistants Tingyao He and Andrea Sušnik for all their help.
7 REFERENCES
- http://books.google.si/books?id=KQXNaDH0XIC&pg=PA2&redir_esc=y#v=onepage&q&f=true
- http://www.sodar.com/about_sodar.htm
- http://en.wikipedia.org/wiki/Radar
- Cracknell, Arthur P.; Hayes, Ladson (2007) [1991]. Introduction to Remote Sensing (2 ed.).
London: Taylor and Francis.
- James D. Klett; Stable analytical inversion solution for processing lidar returns
- http://sabotin.ung.si/~sstanic/CRA/lidar/mobile/
School of Applied Sciences
Physical Laboratory 3 report
Lidar
Gorenje Nekovo, 15.4.2013
Author: Gregor Maver
Mentor: Tingyao He
Lecturer: Prof.dr. Samo Stanič
Table of contents
1. Introduction
2. Mobile Lidar Systems at UNG
3. Results of experiment
4. Comments
5. References
1. Introduction
What is lidar? Lidar stands for LIght Detection And Ranging. It is a remote sensing method that uses light in the form of a pulsed laser. These light pulses, combined with other data, generate precise, three-dimensional information about the shape of the observed object and its characteristics. The main components of a lidar are: a transmitter (laser), a receiver (telescope), a detector and an analysis part.
LIDAR uses ultraviolet, visible, or near infrared light to image objects and can be
used with a wide range of targets, including non-metallic objects, rocks, rain,
chemical compounds, aerosols, clouds and even single molecules. A narrow laser
beam can be used to map physical features with very high resolution.
Picture 1: Lidar concept
Lidar is used for many purposes: atmospheric, environmental, space, ocean and astronomical exploration, as well as other industrial and military applications.
2. Mobile Lidar Systems at UNG
In 2007 the University of Nova Gorica developed a mobile lidar station which includes two elastic (Mie scattering) channels (at 266 nm and 1064 nm) and a fluorescence channel (at 296 nm). The lidar can operate in both daytime and nighttime conditions. The UNG lidar uses a CFR400 pulsed laser by Quantel Big Sky as the transmitter. It is a pulsed laser capable of emitting multiple wavelengths at once, in our case 1064 nm IR light, 532 nm green light and 266 nm UV light.
Picture 2: UNG lidar
A 12'' Dobsonian telescope is used as the receiver. Its 302 mm parabolic primary mirror collects the backscattered light and the induced fluorescence and focuses it at its focal length of 1520 mm, where a secondary mirror is placed. The secondary mirror redirects the light into the detection system installed outside the telescope. The rolled-steel tube is well suited to carry the transmitter and the detection system.
Dichroic mirrors made by SLS Optics Limited were applied to divide elastic scattering (at 1064 nm and 266 nm) and induced fluorescence (UV). The first dichroic mirror in the receiver separates UV from IR. Ultraviolet light is divided once more with the second dichroic mirror, where induced fluorescence is separated from elastic scattering. In order to separate the laser backscattering signal from the background, interference filters by BARR Associates were installed. After the filters, the backscattered beam is focused onto photomultiplier tubes Hamamatsu R7400-06 (266 and 296 nm) and an avalanche photodiode Si APD S8890-30 by Hamamatsu. These sensors convert the received light into measurable electrical signals. Amplitudes of the electrical signals are proportional to the power of the received light.
Digitalization of lidar measurements is performed by an analog/digital (AD)
converter and read out by a Linux based computer for data acquisition and
analysis.
3. Results of experiment
We measured with the UNG lidar on March 1 for half an hour, from 11:26 to 11:58. We received 12 text files, with data collected roughly every 5 minutes. Each file contains two columns, one representing the height in meters and the other the returned power in watts. Every file contains 2667 measurements, and the height goes up to 10 km.
0 63.256
3.75 63.3071
7.5 63.2667
11.25 63.5643
15 61.1536
18.75 55.9333
22.5 72.5786
This is an example of our data. It is raw data, so we have to process and analyze it. I decided to use Mathematica as my processing software. First I imported one file; from the 12 available I chose the one recorded at 11:46. Then I plotted this raw data, just to see it represented in a graph.
Graph 1: Raw plot
Graph 2: Raw plot all
But these two graphs don’t show us the real picture. The data inside has a lot of
noise and they are not properly range weighed.
First I had to remove the noise. V (z) = P(z) – Pnoise I have decided that above
m (2200 row) is all that we collected background noise. So I have calculated
the average value of returned power above 8246m and then subtract it from my
data. So I get graph looking like this:
Graph 3: Noiseless plot
6
Now we have our noiseless plot, but it cannot yet be correct: the returned power should never be negative, yet after the subtraction some values are. To fix this, let us look at the lidar equation:

P(z) = K · P0 · (c·τ/2) · Ar · Y(z) · β(z)/z² · exp[ -2 ∫0^z σ(z') dz' ]

P(z): received power, K: system constant, c: light speed, τ: pulse duration, P0: transmitted power, Ar: effective telescope area, Y(z): overlap factor, 0 ≤ Y(z) ≤ 1.
σ(z) and β(z) are the main parameters of the atmosphere: σ(z) is the atmospheric optical thickness and visibility, β(z) is the density of aerosols and molecules.
I then used the Klett method for solving the lidar equation, which assumes a power-law relation between backscatter and extinction,

β(z) = const · σ(z)^k,

where k depends on the lidar wavelength and the properties of the aerosol, 0.67 ≤ k ≤ 1.3. For our measurements we chose k = 1, so the solution reads

σ(z) = exp[ S(z) − S(z_c) ] / ( 1/σ(z_c) + 2 ∫z^z_c exp[ S(z') − S(z_c) ] dz' ),

where S(z) = ln[ z² V(z) ] and z_c is a reference range at the far end of the profile.
So I had to calculate z² V(z), i.e. the range correction.
Graph 4: Graph before Log
Then I took the natural logarithm of this result and obtained the following:
Graph 5: Range corrected graph
Here we can actually see what was happening at 11:46 am on March 1. Some interesting things happen around 2000 m: there are some smaller jumps and dips in the graph. There are no larger peaks or drops, so conditions were more or less steady.
But we also have to see the "bigger" picture, so I tried to plot time-series graphs. Now I had to import all 12 files and repeat the whole process as for a single file, but this time plotting all 12 files together.
Graph 6: Noiseless plot for all data
Here we can see all 12 lines, as in the noiseless graph for a single file. Next I had to correct them by range, which gives this graph:
Graph 7: Plot all
From this graph we can see that above 3000 m the power is almost steady, so I decided to take a closer look only at the region between 0 and 2700 m. The range-corrected graph:
Graph 8: Range corrected
Now I had everything necessary for the time-series plot. I used a density plot to represent the conditions changing with time and height; a sketch of how such a plot can be assembled is given below. (Note: I did not know how to change the time scale on the y-axis, so I wrote the time as 11.40; I know this is wrong and it should be written as 11:40.)
This is now a 2D picture of the sky above our lidar during the measurements. We can clearly see an aerosol layer at 1400 m and a larger one below it. Unfortunately we did not measure for a longer time period, otherwise we might have seen more layers. It is clearly seen that the conditions in the sky did not change much during this 30-minute period; it was pretty steady.
Then I went on to calculate the atmospheric extinction coefficient σ(z). We know the Klett relation

β(z) = const · σ(z)^k,   with k = 1 in our case,

and the resulting expression for σ(z):

σ(z) = exp[ S(z) − S(z_c) ] / ( 1/σ(z_c) + 2 ∫z^z_c exp[ S(z') − S(z_c) ] dz' )

Z_c is the maximum detection range, in our case 8246 m, and S(z) = ln[ z² V(z) ]. The boundary value at z_c is taken from the molecular atmosphere according to the U.S. Standard Atmosphere model 1976, evaluated at the laser wavelength λ = 1064 nm.
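As a rough stand-in for the molecular part (this is not the exact U.S. Standard Atmosphere 1976 expression used in the report, just a common approximation with assumed constants), the Rayleigh extinction can be modeled with a λ⁻⁴ dependence and an exponential decrease with height:

import numpy as np

def sigma_molecular(z_m, wavelength_nm=1064.0):
    # Assumed constants: ~1.2e-5 1/m Rayleigh extinction at 550 nm at sea level
    # and an 8 km scale height; both are approximations, not values from the report.
    sigma0_550 = 1.2e-5
    scale_height = 8000.0
    return sigma0_550 * (550.0 / wavelength_nm) ** 4 * np.exp(-z_m / scale_height)

print(sigma_molecular(8246.0))   # molecular extinction near z_c = 8246 m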
I plotted this extinction coefficient for just one measurement, the one at 11:46. Here we can see an almost identical picture as in the time-series plot: at 1400 m there is one big peak, an aerosol layer, and a thick one below it. A small peak is at 2000 m, and above that there is almost nothing.
4. Comments
I think that the main killers of the lidar signal are absorption by gases and aerosol scattering, and of course anything that can block our light. There are two atmospheric optical properties in the lidar equation: σ(z) and β(z) are the main parameters of the atmosphere, where σ(z) describes the atmospheric optical thickness and visibility and β(z) the density of aerosols and molecules. Noise is expected in our data: we were using a laser in the IR spectrum, which is very common in nature, and noise also comes from the absorption and scattering of our light. We had to range-correct our signal, because it is logical that the signal loses strength with distance, so we multiply it by the distance squared. And if we want to extract the "real" signal from it, we take the natural logarithm of the return signal.
I think that this experiment was very good. It is a good example of how to process data: you must do a lot of work to get from the raw signal to proper values, to be able to plot graphs and see what is happening in nature.
5. References
- http://www.ung.si/~htingyao/teaching/lidar/index.html
- http://sabotin.ung.si/~sstanic/CRA/lidar/mobile/index.html
- https://en.wikipedia.org/wiki/LIDAR
- http://www.lidar-uk.com/
University of Nova Gorica
School of Applied Sciences
LIDAR measurements
Experimental report
Miha Gunde
Teaching assistant: Tingyao He
Professor: Prof. dr. Samo Stanič
Nova Gorica, 2013
CONTENTS
1. Introduction
2. Our system
3. Results of experiment
4. Conclusions
5. Questions
6. Sources
1. INTRODUCTION
What is the lidar system?
Lidar, Light Detection And Ranging, is a remote optical sensing technology which exploits
reflection of light to get information about the target it is pointed at. Lidar is somewhat similar to radar
and sonar in that they all use the time delay of the returned signal to determine the range at which the
signal scattered back. They differ, though, in the type of energy they emit.
Radar uses an antenna to emit and receive electromagnetic energy (radio waves), while sonar
emits acoustic energy (sound) through an electric-acoustic converter (speaker), and receives an echo of
it through the headphones.
Sonar
Radar
Lidar emits light via the laser, and receives the returned light through a telescope and a photon detector (photomultiplier). The laser emits light at wavelengths from 250 nm to 10 μm, which then gets scattered by small particles in all directions, some of it directly back into our receiver (telescope). We can use the time delay between the emitted and received signal to determine the range at which our signal was scattered, and through the photomultiplier we can also determine the exact power returned to the telescope.
Different types of lidar applications require the use of different types of scattering. The most common types of scattering used are Rayleigh scattering, Mie scattering and Raman scattering. The laser allows us to use a really narrow beam, which results in very high-resolution mapping of physical features compared to radar and sonar technologies. Also, many chemical compounds interact strongly at wavelengths near visible light, which results in a stronger image of those materials. Using a combination of lasers allows mapping of contents by looking for wavelength-dependent changes in the intensity of the signal. We can keep the lidar system pointed at just one spot, or we can use a motor to move it around and scan a bigger portion of the sky.
By attaching a mobile lidar system to an airplane, one can successfully map out tree layouts in hard-to-access forests and jungles, or create a very precise map of the landscape relief, although different methods of light reflection are then used.
2. OUR SYSTEM
At the University of Nova Gorica we have two lidar systems: one stationary, located at Otlica, and the other mobile, located at the main university building. Our experiment was done with the mobile one. It can transmit two elastic channels with wavelengths 266 nm and 1064 nm, and one fluorescence channel at 296 nm. Backscattered light is collected using a telescope with a parabolic mirror of 302 mm diameter, which focuses the light at the focal point, from where it gets reflected one more time into the spectrometer. The spectrometer filters the received signal by wavelength and sends the desired components into the photomultipliers, after which the signal is digitized and sent to a computer.
3. RESULTS OF EXPERIMENT
Our experiment took place on a nice day with few clouds in the sky. The motor which rotates the lidar system was not operational, so we pointed it in one direction at an angle of 49°. Our data were collected from 11:26 to 11:58, in 4- or 5-minute intervals. The laser repetition rate was 10 Hz and the spatial resolution of the lidar is 3.75 m. The data from the computer come in .txt files, each containing two columns: the first is the range in meters, the second the corresponding return power.
Firstly, I multiplied the range by sin(49°) to get the true height.
After that step, the plot of power as a function of range looks like this:
This makes sense: the returned power decreases rapidly, and at large ranges it is dominated by noise from outside sources, such as the Sun, stars, etc. The theoretical range of the system is 10 kilometers, so we can safely say that the last 10 percent of the data is just noise. I removed the noise by averaging the last 10% of the data and subtracting that value from all the other returned powers. The average noise across the data files is around 64, which is a relatively small value.
Next I did the range correction of the data. This is to get rid of the distance-dependent quantities, so that only the atmospheric properties remain. It follows from the lidar equation:

P(z) = K · P0 · (c·τ/2) · Ar · Y(z) · β(z)/z² · exp[ -2 ∫0^z σ(z') dz' ]

where P(z) is the returned power, K is the system constant, which depends on the lidar system, P0 is the transmitted power, c is the speed of light, τ is the pulse duration, Ar is the effective telescope area, z is the range at which the light has been reflected, Y(z) is the overlap factor, 0 < Y(z) ≤ 1, β is the atmospheric backscattering coefficient, in other words the "density" of the aerosols and molecules in the air, and σ is the atmospheric extinction coefficient, or the "thickness".
After the range correction, the power(range) plot looks like this:
In order to get a better look at what's actually going on, I made a logarithmic-scale plot:
And the same plot for another data file:
We can see that there clearly wasn't much going on in the atmosphere. That is due to the nice weather we had during the experiment. The time difference between the two plots is about 25 minutes. To get a better view of the time-dependent changes in the piece of atmosphere we were pointing our lidar at, I've made a contour plot with the clock time on the x-axis, in the form (hhmm):
4. CONCLUSIONS
From the time-dependent range contour plot, I can conclude that there was not much happening
in the atmosphere at the time of our experiment. The darker patches on the plot represent high reflected energy, while the white patches represent low reflected energy.
That means, there were a few layers of really thin clouds of particles, which didn't change much
with time. They start to appear at around 3 kilometers height, and are quite evenly spaced all the way
up to 10 kilometers. Some layers disappear, while new ones are also emerging.
The darker line at the near-zero range is due to the signals not being totally overlapped and the
random reflecting surfaces near our system, such as the trees and buildings, or other light not coming
from our source.
5. QUESTIONS
What are the killers of the lidar signal?
The extinction of the lidar signal happens when the light hits any target, for example clouds, aerosol particles, trees, rain, chemical compounds, or even single molecules. It depends highly on the wavelength of the emitted laser light: the extinction of shorter wavelengths is much higher than that of longer wavelengths, so the extinction of UV light is much higher than that of IR light.
Which parameters are related to the atmospheric optical properties from the lidar equation,
and what do they mean?
Parameters related to the atmospheric optical properties are β(z), the atmospheric backscattering coefficient or "density", and σ(z), the atmospheric extinction coefficient or "thickness".
Analyze the sources of noise and what is your way to eliminate the noise from the return signal?
Sources of noise are the reflection of sunlight from random surfaces, the starlight and the airglow. The sampling device has a larger working range than the laser, so the data received beyond the
laser range is noise. In order to eliminate the noise, I averaged the last 10 percent of the data and
subtracted it from the measurements.
Why do we range-correct and take natural logarithm of the return signal?
We range correct the return signal in order to get rid of the distance-dependent quantities of the
data, so only the atmospheric properties remain. We take the natural logarithm in order to get a better
view of what is going on at larger changes of range. A logarithmic scale can represent smaller and
bigger changes more easily than a linear function. In a logarithmic scale, the smaller changes get
relatively amplified, while the bigger ones get relatively muted down.
6. SOURCES
– presentations from http://www.ung.si/~htingyao/teaching/lidar2013/index.html
– http://en.wikipedia.org/wiki/LIDAR
– http://sabotin.ung.si/~sstanic/teaching/physlab/Lidar/
– James D. Klett; Stable analytical inversion solution for processing lidar returns
University of Nova Gorica
School of applied sciences
Course: Physical laboratory 3
Teaching assistant: Dr. Tingyao He
Lecturer: Prof.dr. Samo Stanič
Experimental report
LIDAR measurements
15.4.2013
Simon Lukman
Table of Contents
What is LIDAR
    Lidar working principle
LIDAR at UNG
Experiment data
    Data analysis
        Lidar equation
Conclusion
References
What is LIDAR
LIDAR (LIght Detection and Ranging or Laser Imaging Detection and Ranging) is an optical remote
sensing technology that can measure the distance to, or other properties of, targets by illuminating the
target with laser light and analyzing the backscattered light. LIDAR technology has applications in
geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing,
atmospheric physics, airborne laser swath mapping (ALSM) and contour mapping.
Lidar working principle
The principle behind LIDAR is really quite simple. Shine a small light at a surface and measure the
time it takes to return to its source.
The LIDAR instrument fires rapid pulses of laser light at a surface, some at up to 150,000 pulses per
second. A sensor on the instrument measures the amount of time it takes for each pulse to bounce back.
Light moves at a constant and known speed so the LIDAR instrument can calculate the distance
between itself and the target with high accuracy. By repeating this in quick succession the instrument
builds up a complex 'map' of the surface it is measuring.
Generally there are two types of LIDAR detection methods: direct energy detection, also known as incoherent detection, and coherent detection. Coherent systems are best for Doppler or phase-sensitive measurements and generally use optical heterodyne detection. This allows them to operate at much lower power, but at the expense of more complex transceiver requirements. In both types of LIDAR there are two main pulse models: micropulse and high-energy systems. Micropulse systems have developed as a result of more powerful computers with greater computational capabilities. High-energy systems are more commonly used for atmospheric research, where they are often used for measuring a variety of atmospheric parameters such as the height, layering and density of clouds, cloud particle properties, temperature, pressure, wind, humidity and trace gas concentration.
LIDAR at UNG
The University of Nova Gorica has two LIDARs: one is stationed at Otlica and the other is a mobile LIDAR stored at the university. Our measurements were made with the mobile LIDAR, which transmits two elastic channels at wavelengths of 1064 nm and 266 nm and one fluorescence channel at a wavelength of 296 nm. The laser used was the CFR400 pulsed laser by Quantel Big Sky. With a mounted motor it can vary both the azimuth and elevation angles, thus creating a 3D image.
Backscattered light is collected using a parabolic mirror with a diameter of 302 mm, which focuses all received light onto another mirror placed at its focal point, 1520 mm away, where the light is reflected one more time into the spectrometer. The spectrometer filters the received signal, separating light of different wavelengths and sending the desired ones into the photomultiplier. There the signal is digitized and sent to the computer, where the data are stored.
Experiment data
Our experiment took place on 27 February 2013, between 11:26 and 11:58, recording the signal in approximately 3-minute intervals. The output data were gathered in separate files, one for each time interval. The files consisted of two columns, one for the altitude and the other for the corresponding return power. The LIDAR was positioned in a parking lot behind UNG. The weather was sunny with a clear sky. The laser frequency was set to 10 pulses per second, with a laser inclination of 67 degrees.
Data analysis
Lidar equation

P(z) = K · P0 · (c·τ/2) · Ar · Y(z) · β(z)/z² · exp[ -2 ∫0^z σ(z') dz' ]

P(z) represents the return power, K is the system constant dependent on the LIDAR setup, P0 is the transmitted power, c the speed of light, τ the pulse duration, Ar the effective telescope area, z the distance at which the light has been reflected, Y(z) the total overlap factor (0 < Y ≤ 1), and σ(z) and β(z) are the main parameters of the atmosphere: σ(z) is the atmospheric optical thickness and visibility, β(z) the density of aerosols and molecules.
For data analysis and plotting I am using Python with the matplotlib library and the corresponding extensions. Below is a plot of the raw data recorded at 11:34 AM (a minimal loading-and-plotting sketch follows it). We can see that the return power decreases, but after a certain distance it stays the same; the reason is the noise coming from airglow, starlight, etc.
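A minimal loading-and-plotting sketch with matplotlib (the file name is an assumed placeholder):

import numpy as np
import matplotlib.pyplot as plt

z, p = np.loadtxt("lidar_1134.txt", unpack=True)   # altitude [m], return power [W]
plt.plot(z, p)
plt.xlabel("altitude [m]")
plt.ylabel("return power [W]")
plt.title("Raw lidar return, 11:34 AM")
plt.show()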
The next step was getting rid of the noise by averaging the last ten percent of the data and subtracting it from the original return power. We then do the range correction, which follows from the equation above: each returned power value has been scaled by the square of the range at which it was reflected back. Multiplying the equation by the square of z separates the distance-dependent quantities from those which are set only by the atmosphere and the LIDAR setup, so the range correction makes clear how the atmospheric properties vary with distance.
The final step is to take a logarithmic scale. When subtracting the noise, some of the values turned zero or negative, so for the logarithm to take a definite value I used the absolute value of the return power. The final noiseless, range-corrected plot taken at 11:34 AM looks like this:
As our experiment took about 30 minutes, we can also make a 2D time plot from 11:26 AM to 11:58 AM.
Conclusion
Our experiment took place under a quite clear sky, so the results of the plotting were not that interesting from the standpoint of, for example, detecting a cloud. But the whole process of analyzing the data was the most interesting part. The instructions for the report contained some questions, which I believe I answered throughout the report. Thanks to our assistant Tingyao He for all the help with the experiment.
References
- Wikipedia :http://en.wikipedia.org/wiki/LIDAR
- UNG LIDAR page: http://sabotin.ung.si/~sstanic/teaching/physlab/
- http://www.lidar-uk.com/how-lidar-works/
- http://www.umass.edu/windenergy/research.topics.tools.hardware.lidar.php
University of Nova Gorica,
School of Applied Sciences
Physics Laboratory III,
Experimental report:
Lidar measurements
By: Tine Bavdaž
Teaching assistant: Tingyao He
Lecturer: Prof. dr. Samo Stanič
Contents:
1. Introduction to LIDAR
2. Mobile LIDAR system at UNG
3. Data processing and experimental results
4. Conclusion
5. References
Introduction to LIDAR
LIDAR has been used extensively for atmospheric research and meteorology. Downward-looking
LIDAR instruments fitted to aircraft and satellites are used for surveying and mapping – a recent
example being the NASA Experimental Advanced Research Lidar. In addition LIDAR has been
identified by NASA as a key technology for enabling autonomous precision safe landing of future
robotic and crewed lunar landing vehicles.
LIDAR (Light Detection and Ranging) is an optical remote sensing technology that measures
properties of scattered light to find range and/or other information of a distant target. The prevalent
method to determine distance to an object or surface is to use laser pulses. Similar to radar
technology, which uses radio waves instead of light, the range to an object is determined by
measuring the time delay between transmission of a pulse and detection of the reflected signal.
LIDAR uses ultraviolet, visible, or near infrared light to image objects and can be used with a wide
range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols,
clouds and even single molecules. A narrow laser beam can be used to map physical features with
very high resolution.
There are several major components to a LIDAR system:
• Laser — 600–1000 nm lasers are most common for non-scientific applications. They are
inexpensive but since they can be focused and easily absorbed by the eye the maximum power is
limited by the need to make them eye-safe. Eye-safety is often a requirement for most applications.
A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this
wavelength is not focused by the eye, but the detector technology is less advanced and so these
wavelengths are generally used at longer ranges and lower accuracies. They are also used for military
applications as 1550 nm is not visible in night vision goggles unlike the shorter 1000 nm infrared
laser. Airborne topographic mapping lidars generally use 1064 nm diode pumped YAG lasers, while
bathymetric systems generally use 532 nm frequency doubled diode pumped YAG lasers because 532
nm penetrates water with much less attenuation than does 1064 nm. Laser settings include the laser
repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the
laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the LIDAR receiver detectors and electronics have sufficient bandwidth.
• Scanner and optics — How fast images can be developed is also affected by the speed at which it
can be scanned into the system. There are several options to scan the azimuth and elevation,
including dual oscillating plane mirrors, a combination with a polygon mirror, a dual axis scanner.
Optic choices affect the angular resolution and range that can be detected. A hole mirror or a beam
splitter are options to collect a return signal.
• Photodetector and receiver electronics — Two main photodetector technologies are used in lidars:
solid state photodetectors, such as silicon avalanche photodiodes, or photomultipliers. The
sensitivity of the receiver is another parameter that has to be balanced in a LIDAR design.
• Position and navigation systems — LIDAR sensors that are mounted on mobile platforms such as
airplanes or satellites require instrumentation to determine the absolute position and orientation of
the sensor. Such devices generally include a Global Positioning System receiver and an Inertial
Measurement Unit (IMU).
Mobile LIDAR system at UNG
In 2007 the University of Nova Gorica developed a mobile lidar station which includes two elastic (Mie scattering) channels (at 266 nm and 1064 nm) and a fluorescence channel (at 296 nm). The lidar can operate in both daytime and nighttime conditions.
Transmitter
A CFR400 pulsed laser by Quantel Big Sky, which is capable of simultaneous emission of light at different wavelengths, is used as the transmitter. The CFR400 emits light at the base wavelength of 1064 nm (IR), the second harmonic (532 nm, green, blocked in our case) and the fourth harmonic (266 nm, UV). As the attenuation of UV in air is much larger than that of IR, the IR light is used for regular Mie scattering operation and the UV light for the excitation of tryptophan fluorescence in organic materials.
Receiver
A 12'' Dobsonian telescope by the Guan Sheng Optical company serves as the receiver. Its 302 mm parabolic primary mirror collects the backscattered light and the induced fluorescence and focuses it at its focal length of 1520 mm, where a secondary mirror is placed. The secondary mirror redirects the light into the detection system installed outside the telescope. The rolled-steel tube is well suited to carry the transmitter and the detection system.
Spectroscopic filters
Dichroic mirrors made by SLS Optics Limited were applied to divide elastic scattering (at 1064 nm and 266 nm) and induced fluorescence (UV). The first dichroic mirror in the receiver separates UV from IR. Ultraviolet light is divided once more with the second dichroic mirror, where induced fluorescence is separated from elastic scattering. In order to separate the laser backscattering signal from the background, interference filters by BARR Associates were installed. After the filters, the backscattered beam is focused onto the photomultiplier tubes Hamamatsu R7400-06 (266 and 296 nm) and an avalanche photodiode Si APD S8890-30 by Hamamatsu. These sensors convert the received light into measurable electrical signals. Amplitudes of the electrical signals are proportional to the power of the received light.
Data Acquisition
Digitalization of lidar measurements is performed by an analog/digital (AD) converter (Licel transient
recorder) and read out by a Linux based computer for data acquisition and analysis.
From the time delay between the broadcast laser pulse and the received signal the distance to the scatterer (the aerosol layer) is calculated, and from the intensity of the backscattered light the density of the aerosol layer is obtained.
Data processing and experimental results
Killers for the lidar signal:
The major killer for lidar signal is long range; this gives us weak signal (backscattered power
decreases as 1/z 2 ).
As we know lidar works in the spectrum of IR light, which is quite common in nature; this gives us
background noise.
Another problem is the absorbtion of light in the gases in the air; this shortens the lidar range.
Lidar equation:

P(z) = K · P0 · (c·τ/2) · Ar · Y(z) · β(z)/z² · exp[ -2 ∫0^z σ(z') dz' ]

P(z): received signal power
K: lidar system constant
P0: transmitted power
c: speed of light
τ: pulse duration
Ar: effective telescope area
Y(z): overlap factor (0 ≤ Y(z) ≤ 1)
σ(z): atmospheric optical thickness and visibility
β(z): density of aerosols and molecules
Recorded data:
We got the recorded data in twelve .txt files, each containing a single measurement. Each .txt file has two columns, one representing the height and the other the corresponding power.
I decided to use Wolfram Mathematica 8 for data analysis and plotting.
First, I had to import the data into Mathematica in the correct form. Since we have 12 measurements taken at different times and 2667 different heights, the easiest way for me to start analyzing was to create a matrix with 12 columns and 2667 rows.
Elimination of noise:
First of all, we have to be aware that when we measure something in the real world we will always get some errors, because of the limited precision of our measuring devices. The background is subtracted as

V(z) = P(z) − P_noise,   P_noise = (1/n) · Σ (z = 7500 m … 10000 m) P(z),

where P(z) is the received signal power (raw data).
I treated all measurements above 7500 m as background noise, because we have a noise floor at around 63 W. I therefore calculated the average power of all measurements from 7500 m to 10000 m and subtracted it from the received signal power.
Noiseless data plot:
Range-corrected data:
Why do we range-correct the return signal and take its natural logarithm?
We range-correct because the variation of the atmospheric properties with height can then be seen much more clearly; in effect, we normalize the signal. To get the range-corrected data I multiplied the noiseless data by the square of the corresponding height and then calculated the natural logarithm of the result.
Height-Time plot:
From this plot we can see that it was a sunny day with no clouds in the sky.
Extinction coefficient:
Zc = maximum detection range (7500 m)
S(z) = ln[z² P(z)]
σ(z_c) = LR × β(z_c)
LR = lidar ratio (LR = 50)
The boundary value β(z_c) is taken from the molecular atmosphere according to the U.S. Standard Atmosphere Model 1976, evaluated at the laser wavelength λ = 1064 nm. We have to be careful, because the wavelength is in nanometers, but the height is in meters!
The Klett method for solving the lidar equation tells us that β(z) and σ(z) are related by a power law, β(z) = const · σ(z)^k, where k depends on the lidar wavelength and the properties of the aerosol (0.67 ≤ k ≤ 1.3). The assistant told us to set k = 1.
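The far-end boundary condition can then be written down directly; a tiny Python sketch with an assumed molecular backscatter value at z_c (a placeholder, not a measured number):

LR = 50.0                  # lidar ratio, as stated above
beta_zc = 1.5e-7           # assumed beta(z_c) in 1/(m sr) at z_c = 7500 m
sigma_zc = LR * beta_zc    # sigma(z_c) = LR * beta(z_c)
print(sigma_zc)            # 7.5e-06 1/m, the starting value for the inversion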
Conclusion
I decided to include the answers to the four questions from the instructions for this report in the section ''Data processing and experimental results'', rather than writing them down one by one.
Although I had some hard problems finding the right software solutions, and studying the properties of the experiment took me quite a lot of time, I am very happy that I went through it. I have learned a lot about how to process and analyze data, and most of all I have learned a lot about the LIDAR system and how it works.
Many thanks also to our teaching assistant Tingyao He for all the help.
References:
- http://www.usgs.gov/pubprod/aerial.html
- http://en.wikipedia.org/wiki/LIDAR
- http://www.fkaglobal.com/index.php/lidar-services
- http://www.ung.si/~htingyao/teaching/lidar2013/index.html