VISUALIZATION OF MODIS DATA IN THE BLENDER ENVIRONMENT:
OR SCIENCE IN A BLENDER
by
Jonathan D. Wilson
A senior thesis submitted to the faculty of
Brigham Young University - Idaho
in partial fulfillment of the requirements for the degree of
Bachelor of Science
Department of Physics
Brigham Young University - Idaho
July 2012
© 2012 Jonathan D. Wilson
All Rights Reserved
BRIGHAM YOUNG UNIVERSITY - IDAHO
DEPARTMENT APPROVAL
of a senior thesis submitted by
Jonathan D. Wilson
This thesis has been reviewed by the research committee, senior thesis coordinator, and department chair and has been found to be satisfactory.
Date
Todd Lines, Advisor
Date
David Oliphant, Senior Thesis Coordinator
Date
Kevin Kelley, Committee Member
Date
Stephen Turcotte, Chair
ABSTRACT
VISUALIZATION OF MODIS DATA IN THE BLENDER ENVIRONMENT:
OR SCIENCE IN A BLENDER
Jonathan D. Wilson
Department of Physics
Bachelor of Science
MODIS data files are visualized using Blender. Care is taken to maintain
the accuracy of the files' data and geolocation information. The MODIS team's
georeferencing is used as the basis for locating the data within an x, y, z space.
ACKNOWLEDGMENTS
I would like to thank the advisors who have assisted me in the process of
writing this thesis and the code for Blender. I would also like to thank Kyle for
the insights during the editing process. I would also like to thank my family
for their patience while I talked their ears off on this subject.
Contents

Table of Contents
List of Figures

1 Introduction
1.1 Visualization of large amounts of data
1.2 Current methods of displaying satellite data
1.3 Why a new method?

2 History
2.1 Modis
2.2 Blender
2.3 HDF 4 and HDF 5

3 Procedures

4 Results
4.1 Analysis of h4toh5
4.2 Analysis of a Blender rendering of a sweep
4.3 Theory of Data display

5 Conclusion
5.1 Did Blender work to a satisfactory level at displaying the data?
5.2 Did Blender render in a satisfactory amount of time?
5.3 Future Directions

Bibliography

A Instructions for use
A.1 Installation
A.2 Use of Software
A.3 Methods of interacting with the software
A.4 Computer Specifications

B Python Code
B.1 Version 2
B.2 Version 1
B.3 Version 0
List of Figures

4.1 File conversion times
4.2 An example of a rendered image
5.1 An example of a rendered image
5.2 An example of a rendered image
5.3 Times for importing the MODIS file (first version)
5.4 Times for importing the MODIS file (third version)
A.1 The first thing you see
A.2 Results from running the code
A.3 Highlighting how to change to a different view
A.4 The contents of the editor types menu
Chapter 1
Introduction
A long-standing challenge is that of displaying information about a three-dimensional
object on a two-dimensional plane; a map is the classic example. Over the years
there have been many developments that have tried to overcome the challenge of
displaying information about the globe on flat sheets. This has resulted in numerous
types of maps, each answering the call for greater accuracy in one aspect or another,
or for greater ease of use.
The venerable Cartesian coordinate system has been used with great success to
show the relationship between two different sets of data. This does not work as well
with multiple data sets being compared simultaneously. With computer technology
we can start to explore the idea of creating a three-dimensional space that we can
explore and interact with like the real world.
This paper will discuss the methodology and philosophy behind using Blender [1]
as a visualization program for scientific data sets. Blender is a 3-D(three-dimensional)
animation tool that is open source. The benefits of using Blender are that it is free
to use and easily modified to suit a particular purpose. This will be demonstrated
through the specific application of Blender to MODIS [2] data sets taken from the
Aqua [3] and Terra [4] satellites. In addition, the use of the software written to achieve
this visualization will be discussed.
1.1 Visualization of large amounts of data
A significant challenge of the era of satellite-based research is the task of extracting
meaning from the vast amount of data that is gathered. One method is to automate
the detection of events that are significant to the researchers involved. Another
method is to provide a visual representation of the data in which both the events
being sought and the context in which they occur can be clearly seen. The second
method is the one pursued here, because it is the method that aids in communicating
the ideas and evidence found to someone who is not intimately familiar with the
conventions used for that satellite.
This challenge has been faced in the past and this is where we get the Cartesian
coordinate system. This was a method created to show the relationship between two
separate quantities in a clear manner. As we go forward we are seeking meaning
and relationships between larger data sets and larger numbers of quantities. There
are now two-dimensional graphs that can include the relation between three or more
quantities. Being able to represent data accurately within a 3-D space will increase
the clarity of the relationships.
1.2 Current methods of displaying satellite data
At this time there are several pieces of software that can be used to visualize the data
in a satellite data product: ArcGIS [5], Erdas [6], ENVI [7], and Google Earth [8].
ArcGIS, Erdas, and ENVI are predominantly used to generate two-dimensional maps from
the data for visualization purposes, although their developers are working to incorporate
three-dimensional analysis. They come from a history of two-dimensional representation
and are trying to convert to a three-dimensional one; this fundamental shift will require
a great deal of work on their part.
Since the data is three-dimensional, visual warping appears when a large enough
data set is translated from its three-dimensional form into a two-dimensional
representation. Google Earth is better suited, in terms of design, for large volumes of data
because it is inherently three-dimensional; it presents the data in a form that
gives context, and it includes a way to tag data as coming from a certain time.
Unfortunately Google Earth, at least the free edition, is not a good choice for large
data sets because it loads everything into memory; at a certain point it
exhausts its resources and crashes. Another limitation of Google Earth is that it does
not have fine controls for the display of time-sensitive data, which are necessary
for maintaining the time aspect.
Combining data sets from different satellites raises challenges in georeferencing.
Satellite data sets do not use a uniform reference system for defining latitude and
longitude. MODIS data itself comes in various levels of processing. This project uses
the Level 1A geolocation and Level 1B data products, which are georeferenced to the
WGS84 geoid [9] and contain heights relative to that geoid. Levels 2 and beyond use
different coordinate systems that are more appropriate to their intended applications.
When combining data sets, these differences can cause confusion that results in
improperly aligned data.
1.3 Why a new method?
There are several reasons a new method using Blender is needed to display this
information. Using open-source software solves the issues created by evolving software:
as technology changes, the methods remain freely available to be updated. A great deal
of effort has gone into numerical libraries that were programmed in Fortran and highly
optimized, and this project builds directly on that work. The NumPy [10] library is
built on such a collection of optimized Fortran libraries. By using NumPy from within
Python [11], I was able to decrease the time taken by the longest section of code from
77 seconds to a fifth of a second.
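The kind of speedup described above can be sketched as follows. This is an illustrative comparison, not the thesis's actual code: the array sizes are synthetic stand-ins for a granule's worth of points, and the conversion shown is a simple spherical latitude/longitude-to-x, y, z mapping.

```python
import math
import time

import numpy as np

# Synthetic latitude/longitude arrays standing in for a MODIS granule
# (real file sizes and field names are not reproduced here).
lat = np.linspace(-60.0, 60.0, 200_000)
lon = np.linspace(-180.0, 180.0, 200_000)

def to_xyz_loop(lat_deg, lon_deg, r=1.0):
    """Per-element conversion using math.* inside a Python loop."""
    out = []
    for la, lo in zip(lat_deg, lon_deg):
        la_r, lo_r = math.radians(la), math.radians(lo)
        out.append((r * math.cos(la_r) * math.cos(lo_r),
                    r * math.cos(la_r) * math.sin(lo_r),
                    r * math.sin(la_r)))
    return out

def to_xyz_numpy(lat_deg, lon_deg, r=1.0):
    """Same conversion, vectorized over whole arrays at once."""
    la, lo = np.radians(lat_deg), np.radians(lon_deg)
    return np.column_stack((r * np.cos(la) * np.cos(lo),
                            r * np.cos(la) * np.sin(lo),
                            r * np.sin(la)))

t0 = time.perf_counter(); slow = to_xyz_loop(lat, lon); t1 = time.perf_counter()
fast = to_xyz_numpy(lat, lon); t2 = time.perf_counter()
print(f"loop: {t1 - t0:.3f} s, numpy: {t2 - t1:.3f} s")
```

Both functions return the same coordinates; the vectorized version simply moves the per-element loop into NumPy's compiled routines, which is the source of the 77-second-to-0.2-second improvement reported above.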
The next reason this method is needed is that Blender is designed to animate
three-dimensional objects over time and thus contains a large number of tools for working
with time-resolved data sets. All of the separate methods of georeferencing can be coded
to convert to a standard x, y, z coordinate system using whichever references are needed,
and the separate shapes can then be included within the same space. This makes for an
easy-to-use environment in which these data sets can be brought together and viewed
concurrently if desired. It is also possible in Blender to create multiple views of the
different data sets, all viewed from the same universal perspective over time.
The final reason for using Blender is that it contains a number of functions for
smoothing three-dimensional shapes and for coloring them. These tools use linear and
non-linear extrapolation in some fashion. If in the future someone could vet these
functions, they could be applied directly to the data. This final reason is partially
addressed in section 4.3, where future work is described.
Chapter 2
History
This project covers several different areas that benefit from a bit of background
for context. First, there is the sensor itself, which flies on the Aqua and Terra
satellites. Second, there is the piece of software called Blender. Finally, there is the
HDF file format. Each of these elements brings something to the project that makes
the whole work together.
The background for the satellite is complex and deserves a close reading of the
Algorithm Theoretical Basis Documents for the Level 1A geolocation and Level 1B
data products [9] [12]. Blender is also complicated; however, it is more user-friendly
and can be learned by trial and error [13]. HDF is used in this research as a black
box that enables Unix-like directory access to the data.
2.1 Modis
MODIS has been flying on Aqua and Terra for a decade. These satellites are in
sun-synchronous flight at an altitude of 705 km. Terra flies at 10:30 am and 10:30 pm
local time descending, and Aqua flies at 1:30 pm and 1:30 am local time ascending.
The sensor sweeps out an area at a rate of 20.3 rotations per minute perpendicular to
the orbit path. The swath viewed in a pass is approximately 2330 km (perpendicular
to the orbit) by 10 km (parallel to the orbit).
The data that comes from MODIS [12] is processed and packaged into 10-minute
slices. The Level 1A geolocation set contains the latitude and longitude coordinates
at 1 km resolution. The Level 1B set contains the measured radiances of the instrument
and comes in three versions: the 250 m version contains the data from the sensors with
the highest resolution; the 500 m version contains the mid-resolution data as well as
data consolidated from the 250 m set; and the 1 km version contains the lowest-resolution
data, as well as data consolidated from the 500 m and 250 m sets.
In addition, several methods of georeferencing are used by the different data
products of the MODIS instrument. Data products labelled “swath” use WGS84. Other data
products are labelled “SIN” or “SIN Grid” and use as their reference a sphere
6,371,007.181 m in radius.
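The two reference systems above map the same latitude, longitude, and height to different x, y, z positions. The following sketch shows both conversions; the WGS84 ellipsoid constants are the standard published values, and the code is an illustration of the coordinate conversion, not the script from Appendix B.

```python
import numpy as np

# Standard WGS84 ellipsoid constants (not copied from the thesis code).
A = 6378137.0                  # semi-major axis, m
F = 1.0 / 298.257223563        # flattening
E2 = F * (2.0 - F)             # first eccentricity squared

def wgs84_to_xyz(lat_deg, lon_deg, h):
    """Geodetic lat/lon/height above the WGS84 ellipsoid -> ECEF x, y, z (m)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + h) * np.sin(lat)
    return x, y, z

def sphere_to_xyz(lat_deg, lon_deg, h, r=6371007.181):
    """Same conversion on the SIN-grid reference sphere used by other products."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return ((r + h) * np.cos(lat) * np.cos(lon),
            (r + h) * np.cos(lat) * np.sin(lon),
            (r + h) * np.sin(lat))

# The same point lands in measurably different places under the two references,
# which is the source of the misalignment described in section 1.2.
print(wgs84_to_xyz(45.0, 0.0, 0.0))
print(sphere_to_xyz(45.0, 0.0, 0.0))
```

Converting every data set to one such x, y, z frame before building Blender objects is what allows products with different georeferencing to be displayed together.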
2.2 Blender
Blender started its life as an in-house tool for creating 3-D animations for movies
and other forms of entertainment. The tool was created in 1995 and has since been
released as open source. Coming from a field where all revenue is based on the ability
to create compelling images and animations has led to refined processes for taking an
idea and visualizing it. One of the latest refinements, which enabled this project, was
the inclusion of the Python language for scripting and the exposure of most of the
commands in the Blender software to Python scripts. Most of what a user can do in the
Blender environment can be reproduced in a Python script. This allows rapid testing of
methods in an environment designed to work with 3-D objects; once a working method is
found, converting it to a script that repeats the task over and over again is a simple
process.
Blender has decided to use Python version 3.x for its scripting. This choice has
implications for using scientific Python packages, because they are only slowly being
ported from version 2.x to version 3.x. For this project, it prevented the use of PyHDF
[14], which is only compatible with Python 2.x. That library would have allowed direct
use of the HDF4 format, the native format for MODIS data, skipping the step of
translating from HDF4 to HDF5. PyHDF has not released a new version since October 2008,
two months before Python 3.0 was first released. H5py [15] is Python 3.x compatible, as
are NumPy and SciPy. Going forward, additional packages will become compatible,
expanding the libraries that are available.
2.3 HDF 4 and HDF 5
HDF has seen twenty years of development and use in storing and distributing
scientific data, and over the years it has grown to accommodate more generic ideas.
The change from HDF4 to HDF5 marks the addition of the ability to perform parallel I/O.
The HDF format allows for self-description, which for MODIS includes the measurements
of height, range, and angle to the sun, as well as descriptions of what range these
values should have.
Several aspects of HDF make it useful for Blender visualization. Because the data
can be accessed in subsets, the entire file does not need to be loaded into memory,
which has a large impact on performance in the pipeline.
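The subset access works roughly as follows with h5py. This is an illustrative sketch: the file here is built in memory as a stand-in for a converted granule, and the group/dataset path is a hypothetical example, not the exact path in a real MODIS HDF5 file.

```python
import numpy as np
import h5py

# Build a small in-memory HDF5 file standing in for a converted MODIS granule.
# The dataset path below is illustrative only.
f = h5py.File("demo.h5", "w", driver="core", backing_store=False)
f.create_dataset("Data Fields/EV_1KM_RefSB",
                 data=np.zeros((2030, 1354), dtype=np.uint16))

# Slicing the dataset object reads only the requested rows from the file;
# the full array is never materialized in memory.
dset = f["Data Fields/EV_1KM_RefSB"]
subset = dset[100:110, :]          # ten scan lines
print(subset.shape)                # (10, 1354)
```

Because each slice is read on demand, a script can walk through a granule one band or one block of scan lines at a time, which is what keeps the Blender import pipeline within memory limits.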
Chapter 3
Procedures
The procedure is to go to the website http://reverb.echo.nasa.gov/reverb/ and select
the appropriate settings, including satellite, instrument, level of processing, time
period, and geographic location. With these selected, you will get a list of granules to
download via FTP. With the files downloaded, convert the HDF4 files to HDF5 using
h4toh5 [16]. Once the files are converted, set the path parameter in the script to
point to the directory containing the files, and set the file name parameter.
Once the script has run, you can apply further effects from the Blender environment
to smooth the generated object. You have the option to output the results as an image
or video. Image output supports bmp, png, jpeg, and tiff, as well as other formats,
with settings for the file contents. For images, the recommended setting is RGBA output
so that the alpha channel, which is part of the generated object, is included; the alpha
channel carries the information about the transparency of the object. Video output also
comes in a variety of formats: avi, h.264, mpeg, ogg theora, and xvid. The settings for
video are rather complicated, and the defaults should be fine; however, some of these
settings will affect the render time and the visual quality of the output.
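The image output settings above can also be set from a script rather than through the interface. The following is a minimal sketch using the Blender 2.6-era Python API (it runs only inside Blender, and the output path is a placeholder):

```python
import bpy  # available only inside Blender's bundled Python

scene = bpy.context.scene
scene.render.image_settings.file_format = 'PNG'   # bmp/jpeg/tiff also available
scene.render.image_settings.color_mode = 'RGBA'   # keep the alpha channel
scene.render.filepath = '//render_output'         # '//' = relative to the .blend file
bpy.ops.render.render(write_still=True)
```

Setting `color_mode` to `'RGBA'` is what preserves the transparency information that the generated object carries.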
Chapter 4
Results
The result of this research is a robust tool that can handle a known data format.
Multiple satellite data sets can be used within the environment, so long as they use
the HDF4 or HDF5 format and the appropriate code is written to include them. The scripts
included in Appendix B can be used as a template for the steps necessary to import
the data and create an object. The system can be automated to generate video over
specified sections of the earth's surface, or to follow the satellite through its flight
and give an accurate recreation of its view.
The rendering time is good enough for use in a real-time rendering system. This
process is not yet complete; there are additional steps that could be taken to improve
its efficiency.
4.1 Analysis of h4toh5
The tool h4toh5, used to convert the data files from the HDF4 format to HDF5,
worked wonderfully. The conversion completes within the desired time frame, with delays
of under a minute for files of the size MODIS produces.

Figure 4.1 The time taken to convert is consistent with the size of the file
and follows a linear trend.

The conversion does alter the internal file structure, but in a consistent fashion.
The alterations that h4toh5 introduces are relatively minor: the original data path
is still intact and can be used with h5py to access the data. Some artifacts are
generated, but they can be ignored and do not affect the operation of the script at all.
4.2 Analysis of a Blender rendering of a sweep
Blender takes a short time to render the small data sets used. The length and
accuracy of a render can be controlled intimately through several settings, so the
software can be configured for either a fast, close-to-real-time view or a slower, more
accurate view of the scene that is easier on the eyes. This greater control could become
necessary as either the number of imported data sets or the resolution of the imported
data increases. See figure 4.2 for what the default settings produce.

Figure 4.2 This is the resulting image when rendered. This is an early
image where the lighting is not in place.
The NumPy trigonometric functions do return distinct values for the smallest
possible change in the latitude and longitude values. Blender's color values range
from 0.000000 to 1.000000, and so they can accommodate the conversion from the integer
range of 0 to 32767 to which the MODIS data is scaled.
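The integer-to-color conversion can be sketched as below. This is illustrative, not the thesis's script: the sample values are made up, and values above 32767 are simply zeroed here, whereas the real error codes each have a specific meaning.

```python
import numpy as np

# Scaled-integer radiances as stored in the Level 1B file: valid range is
# 0..32767; values above that flag fill/error conditions (codes not listed here).
raw = np.array([0, 16384, 32767, 65500], dtype=np.uint16)

VALID_MAX = 32767
valid = raw <= VALID_MAX

# Map valid samples onto Blender's 0.0..1.0 color range; zero out flagged ones.
color = np.where(valid, raw / VALID_MAX, 0.0)
print(color)   # approximately [0.  0.5  1.  0.]
```

Because the whole valid range divides cleanly into [0, 1], no precision is lost in the mapping, which is the point made above.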
4.3 Theory of Data display
The user is tempted to place within view all of the information available. This
temptation is best avoided because of the increased risk of misrepresenting what is
actually there as others view the image. From Tufte's work analyzing two-dimensional
data representations, the conclusion can be drawn that a simple style is needed for
this complex topic [17]. There are the concepts of space, time, and intensity of light
all to display in the same image. The animation helps to provide context for space and
time through a smoothly changing reference frame. Limiting the displayed data to a
single value for each point, instead of trying to put all of the measured frequencies
into a single point, allows the highest contrast of grayscale to be used. Additionally,
there is error in the data, and error ranges should be conveyed at the same time so
that a false view is not presented and accepted. To achieve this, a transparency layer
is added to the data layer, allowing a third color to show through based on the error
value. A small error lets a small amount of the other color through the data object,
while a large error lets a large amount through and immediately tells the viewer that
something is wrong in the data object.
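The error-to-transparency mapping described above can be sketched as a simple function. The linear mapping and the `err_max` parameter are illustrative choices, not the thesis's actual code.

```python
import numpy as np

def error_to_alpha(err, err_max):
    """Map a per-point error magnitude to an opacity for the data layer.

    Small error -> alpha near 1 (the data color dominates); large error ->
    alpha near 0, letting the warning color underneath show through.
    A linear ramp is assumed here for illustration.
    """
    frac = np.clip(np.asarray(err, dtype=float) / err_max, 0.0, 1.0)
    return 1.0 - frac

alphas = error_to_alpha([0.0, 0.5, 2.0], err_max=1.0)
print(alphas)   # [1.  0.5  0.]
```

Any monotone ramp would serve; the essential property is that the viewer can read the reliability of a region directly from how much of the warning color bleeds through.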
To display information about all of the frequencies, a consistent function needs to
be applied across the view, as in graphs whose axes are labeled on a logarithmic scale.
Tufte points out that a graph with abbreviated axes can exaggerate the differences in a
relationship. Data displayed in a three-dimensional space can have a similar problem
because of the chosen viewpoint. As an example, terrain data viewed from just above the
surface will exaggerate the height differences. A good study of this would be optical
illusions, to find where the presentation method breaks down.
Chapter 5
Conclusion
There is a great deal of potential in exploring Blender for the visualization of
data. The smoothing and subdivide functions could potentially take a data set, estimate
intermediate values for an entire space, and present the results. In addition, the
standardized three-dimensional space can be used to bring in disparate data sets for
display over each other.
With standardized file format access there is the very real possibility that Blender
can perform visualization for a large number of different projects. A supercomputer can
handle the heavy lifting of simulation, and a script can be written to view the data
quickly and easily from multiple angles over time. The data can also be rendered quickly
by pulling only the data that is to be viewed at a given angle and time.
5.1 Did Blender work to a satisfactory level at displaying the data?
Blender did a good job of visualizing the data, though some aspects did not work
out as anticipated. The biggest drawback is that alpha information is currently applied
at the face level rather than the vertex level, which makes it difficult to display the
error as proposed earlier in this paper. That can mostly be overcome by blending the
vertex errors to generate a per-face alpha quantity. This is by no means perfect, and
there is the chance that vertex alpha may be implemented in the future; it should only
be a problem for data with large amounts of space between the data points.
There is a need for a visual reference to anchor where people think they are when
viewing the videos or pictures. This will be the subject of further research and
development. At the time of writing, the image produced is based on wherever you have
moved the camera.
Figures 5.1 and 5.2 are results from renderings of a file. These files are a 5 km
subsampling of an entire granule of 1 km data from the MODIS instrument.
5.2 Did Blender render in a satisfactory amount of time?
The time it took Blender to import the data was a bottleneck in the process. In
the graphs, the script's major actions were benchmarked. Step 1 is where the latitude
and longitude are read in and converted from WGS84 to x, y, z coordinate triplets.
Step 2 is generating the face list using a “for” loop. Step 3 is generating the object
and linking it to the scene. Step 4 is generating the material and adding the vertex
paint layer to the object. Step 5 is reading in the data from the HDF file. Step 6 is
checking for error codes in the data. (The data has a range of values from 0 to 32767,
with values higher than this indicating errors in the data which must be handled.)
Step 7 is painting the vertices with the data, done with a “for” loop.

Figure 5.1 Produced from file MYD02SSH.A2012001.0840.005.2012001181210.2.h5.
This is the red channel taken at night.

Figure 5.2 Produced from file MYD02SSH.A2012001.1945.005.2012002174518.1.h5.
This is the red channel taken during the day.

Figure 5.3 The time taken to import the file and set up the environment
is dominated by the first step.

Figure 5.4 The time taken to import the file and set up the environment
is now dominated by the painting of vertices with the data.
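Step 2 above, the face-list generation, can be sketched as follows. This reproduces the kind of “for”-loop face construction described, but it is an illustration: the real script's indexing conventions and mesh layout may differ.

```python
def grid_faces(rows, cols):
    """Quad faces for a rows x cols grid of vertices, indexed row-major.

    Each face connects a vertex to its right neighbor, the vertex below
    that, and the vertex directly below, winding around the quad.
    """
    faces = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            faces.append((i, i + 1, i + cols + 1, i + cols))
    return faces

faces = grid_faces(3, 4)       # 2 x 3 = 6 quads for a 3 x 4 vertex grid
print(len(faces))              # 6
print(faces[0])                # (0, 1, 5, 4)
```

A loop like this runs once per grid cell, which is why Steps 2 and 7 dominate the import time until they are vectorized or otherwise optimized.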
As can be seen from the graphs, the time taken to import the data can be decreased
significantly. Version 0 did not use the NumPy trigonometric routines, while Version 2
did. Steps 3 and 4 are built-in functions that cannot be sped up with changes to the
script; the remaining steps can be improved in the future, in both time and features.
For rendering to an image, once the data is imported into Blender, the time taken is
three to five seconds. I have not had the chance to benchmark this part thoroughly,
because the render time was previously a small fraction of the time spent. With the
improvements to import time, render time has again become a consideration for whether
this is a good process for real-time rendering. Better hardware would speed this step
immensely. Video rendering has not yet been tested, but should be a nearly linear
extrapolation from the time it takes to render a single image.
5.3 Future Directions
In the future, Blender could automatically take in data and render from cues in
the HDF file. A standardized set of markup formats within the HDF file would allow
parameters to be passed in from an external program. Another goal is gaining an
understanding of how some of the smoothing functions work, so that if some of our
standard data-fitting routines are already programmed in, we can use them with a single
command to process large sets of data. Finally, there is the challenge of integrating
several data sources, either through a layer approach within Blender or in the Python
script before it renders.
There is also the pursuit of efficiency in the pipeline, so that it can handle the
large amounts of data that are out there. The amount of data imported could be limited
by knowing what will be viewed, which would use less memory and less rendering time.
Improvements are possible in the Python code. Additionally, the rendering could be
split among different segments and stitched together afterwards to provide different
views of the same time frame, to render many overlapping time frames, or to view
information in different wavelengths simultaneously.
Bibliography

[1] Blender, http://www.blender.org/ (Accessed July 16, 2012).
[2] Brandon Maccharone, Shannell Frazier, MODIS Website, http://modis.gsfc.nasa.gov/index.php (Accessed July 16, 2012).
[3] Steven Graham, Claire Parkinson, Aqua Project Science, http://aqua.nasa.gov/ (Accessed July 16, 2012).
[4] Holly Zell, Brian Dunbar, NASA - Terra, http://www.nasa.gov/mission pages/terra/index.html
[5] Esri Inc., ArcGIS [Software], http://www.esri.com/software/arcgis
[6] Erdas Inc., Erdas Imagine [Software], http://geospatial.intergraph.com/products/ERDASIMAGINE/ERDASIMAGINE/Details.aspx
[7] Exelis Visual Information Solutions Inc., ENVI [Software], http://www.exelisvis.com/ProductsServices/ENVI.aspx
[8] Google Inc., Google Earth [Software], http://www.google.com/earth/index.html
[9] Nishihama, R. Wolfe, D. Solomon, F. Patt, J. Blanchette, A. Fleig, and E. Masuoka, MODIS Level 1A Earth Location: Algorithm Theoretical Basis Document, Version 3.0 (Report SDST-092, Laboratory for Terrestrial Physics, NASA Goddard Space Flight Center, Greenbelt, MD, USA, 1997).
[10] D. Ascher et al., Numerical Python, tech. report UCRL-MA-128569, Lawrence Livermore National Laboratory, 2001; http://numpy.scipy.org.
[11] G. van Rossum and F. L. Drake (eds), Python Reference Manual (Python Software Foundation, 2006), http://docs.python.org/ref/ref.html.
[12] J. Xiong, G. Toller, V. Chiang, J. Sun, J. Esposito, and W. Barnes, MODIS Level 1B Algorithm Theoretical Basis Document, Version 3.0 (Prepared for: National Aeronautics and Space Administration).
[13] J. van Gumster, Blender for Dummies, 2nd ed. (Wiley, Indianapolis, 2011).
[14] Andre Gosselin, PyHDF, 2008, http://pysclint.sourceforge.net/pyhdf/ (Accessed July 16, 2012).
[15] A. Collette, HDF5 for Python, 2008, http://h5py.alfven.org (Accessed July 10, 2012).
[16] H4toH5 Conversion Library API Reference Manual with H4/H5 Command-line Conversion Utilities, Release 2.2.1 (The HDF Group, 2012), http://www.hdfgroup.org/ftp/HDF5/tools/h4toh5/src/unpacked/doc/h4toh5lib RM.pdf (Accessed July 16, 2012).
[17] E. R. Tufte, The Visual Display of Quantitative Information (Graphics Press, Cheshire, 1983).
[18] Blender 2.6 Python API Documentation, http://www.blender.org/documentation/blender python api 2 63 5/ (Accessed July 10, 2012).
Appendix A
Instructions for use
The procedures outlined here include a list of the packages used in this work, as well
as a step-by-step set of instructions for installing the Python packages. In addition,
the scripts for Blender will also be described. Finally, the basic methods of interacting
will be described.
A.1 Installation
The following is a step-by-step set of instructions for the installation of the software
libraries that will be used. All of the commands to be entered at the command line
are surrounded by double quotes.
1. Install Blender, libatlas-dev, libhdf5-serial-dev, libhdf5-serial-1.8.4, libblas-dev,
libblas3gf, liblapack3gf, liblapack-dev, and gfortran from your distribution's repositories.
2. Download the source code for NumPy and h5py from http://sourceforge.net/projects/numpy/files/NumPy/ and http://code.google.com/p/h5py/downloads/list
3. Unzip the source code packages.
4. Open the terminal and change to the directory you unzipped NumPy to.
5. From the command line run “python3.2mu setup.py build --fcompiler=gnu95”.
6. After the code has compiled (this will take a while if successful the first time),
run “sudo python3.2mu setup.py install”.
7. From the command line run “find /usr -name numpy”. This will tell you where
your software is installed. Take note of the directory structure before the first
occurrence of numpy; this will be used later. An example of the output is
/usr/local/lib/python3.2/dist-packages/numpy
/usr/local/lib/python3.2/dist-packages/numpy/core/include/numpy
/usr/local/lib/python3.2/dist-packages/numpy/numarray/include/numpy
8. Move to the directory in which you unzipped h5py and run “python3.2mu setup.py build”.
9. Run “sudo python3.2mu setup.py install”.
10. Run “find /usr -name h5py” to double check that h5py is installed in the same
directory.
Installation will use the standard Blender install. In Linux, either use a package
manager to install Blender or go to www.blender.org to obtain the latest stable package.
In addition, you will need to install libatlas-dev, libhdf5-serial-dev,
libhdf5-serial-1.8.4, libblas-dev, libblas3gf, liblapack3gf, liblapack-dev, and gfortran.
The directions at http://www.scipy.org/Installing SciPy/Linux should be followed with
one exception: the build commands above use python3.2mu, because Blender uses Python 3
with wide unicode support, which is not the standard setting.
Figure A.1 When you first start Blender this is the starting view.
To find out where Python is searching for libraries, open Blender.
Select the Python console from the editor menu. This menu is found in the lower
left of the workspace, and its icon varies with the editing mode of the
window. From the drop-down menu select the Python console. In the console type
"import sys" and then "sys.path". This will list the directories in which
Blender's Python looks for libraries. The default directories are set by the system.
You can either append the directory where your libraries are stored onto sys.path
or move the libraries into one of the searched directories. If you choose to leave them
in the default location then you will need to set the path in the script. Any changes
made to the sys.path variable are lost when Blender is closed and reopened.
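The append-or-move choice described above amounts to a few lines typed into Blender's Python console; the directory string here is an example and should match the install location found with the find command:

```python
import sys

# Example install location; adjust to match your system
systemPath = '/usr/local/lib/python3.2/dist-packages'

# Append only if the directory is not already searched, avoiding duplicates.
# This change lasts only until Blender is closed.
if systemPath not in sys.path:
    sys.path.append(systemPath)
```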
A.2 Use of Software
Once the program, scripts and libraries are installed you can use them from within
Blender by supplying the script with a list of files. You will also need to give it a
location to track for the rendering. For this you will need to specify a position in
space by giving it latitude and longitude coordinates and a height above the geoid
in WGS84. In addition, you need to supply the viewing parameters for the camera.
Once these are set, run the script and it will generate a video file.

Figure A.2 After the script runs, the object appears colored when the viewing area is switched to vertex paint mode.

Figure A.3 Selecting the icon in the bottom left of a window changes the editor type shown there.

Figure A.4 The editor types used for this work are the Python console, the text editor, and the 3-D view.
There will be several options for controlling the camera once the script is set up.
The video can either follow the satellite's track or remain fixed above a chosen
location for its duration. There will also be settings for the amount of time that
is rendered: the data file contains ten minutes of flight time, and the video could
faithfully recreate that span or generate a time lapse at a set rate.
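The rendered-time settings above amount to a simple relationship between flight time, frame rate, and playback speedup; a hypothetical helper (not part of the thesis code) makes it concrete:

```python
def frames_to_render(flight_seconds, fps, speedup):
    """Frames needed to show flight_seconds of data at fps,
    compressed by a factor of speedup (1 = faithful real time)."""
    return int(round(flight_seconds * fps / speedup))

# Ten minutes of MODIS flight time rendered faithfully at 24 fps:
print(frames_to_render(600, 24, 1))    # 14400 frames
# The same ten minutes as a 10-second time lapse (60x speedup):
print(frames_to_render(600, 24, 60))   # 240 frames
```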
A.3 Methods of interacting with the software
At this time, the only method of interacting with the code is through settings within the script
file itself. This is a little ungainly; ultimately the program should be accessible
through a command-line interface, but that has not been worked out just yet.
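One possible shape for that command-line interface, sketched with the standard argparse module (all flag names here are hypothetical, since no interface has been implemented):

```python
import argparse

def parse_args(argv):
    """Hypothetical CLI: a list of HDF5 files plus the location to track."""
    parser = argparse.ArgumentParser(description='Render MODIS data in Blender')
    parser.add_argument('files', nargs='+', help='HDF5 granules to visualize')
    parser.add_argument('--lat', type=float, default=0.0,
                        help='latitude of the tracked location (degrees)')
    parser.add_argument('--lon', type=float, default=0.0,
                        help='longitude of the tracked location (degrees)')
    parser.add_argument('--height', type=float, default=0.0,
                        help='camera height above the WGS84 geoid')
    return parser.parse_args(argv)

args = parse_args(['granule.h5', '--lat', '43.8', '--lon', '-111.8'])
# args.files holds the granule list; args.lat and args.lon hold the location
```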
There are some challenges to interfacing that have not been addressed. When a
Python script is run in Blender, it locks out the other controls, as intended [18].
There are methods of updating in the middle of a script so that the process can
be viewed; however, this is not recommended by the documentation. It would be
preferable to run this script without the user interface running, both to preserve more
system resources for the rendering itself and because the work is repetitive and
should be left to a machine to crank through. For more information, see the
section "Gotchas" in Blender's Python API documentation.
A.4 Computer Specifications
The following tables describe the hardware of the computer on which all of the
timing tests were performed. A table of the software package versions, as listed
in the software repositories, is also included.
Table A.1 CPU
Brand: AMD
Name: Athlon II X2 260
Model: ADX260OCGMBOX
Core: Regor
Multi-Core: Dual-Core
Operating Frequency: 3.2GHz
HyperTransport: 4000MHz
L1 Cache: 2 x 128KB
L2 Cache: 2 x 1MB
Manufacturing Tech: 45 nm
64-bit Support: Yes
HyperTransport Support: Yes
Table A.2 Hard Drive
Brand: Seagate
Series: Barracuda Green
Model: ST1500DL003
Interface: SATA 6.0Gb/s
RPM: 5900 RPM
Cache: 64MB
Average Seek Time: 12ms
Average Write Time: 13ms
Average Latency: 4.16ms
Table A.3 Graphics Card
Brand: ZOTAC
Model: ZT-40704-10L
Interface: PCI Express 2.0 x16
Chipset Manufacturer: NVIDIA
GPU: GeForce GT 440 (Fermi)
Core Clock: 810MHz
Shader Clock: 1620MHz
CUDA Cores: 96
Effective Memory Clock: 1600MHz
Memory Size: 1GB
Memory Interface: 128-bit
Memory Type: GDDR3
3-D API: DirectX 11, OpenGL 4.1
Features: NVIDIA PhysX, NVIDIA CUDA
Table A.4 Memory
Brand: CORSAIR
Series: Vengeance
Model: CML16GX3M4A1600C9B
Type: 240-Pin DDR3 SDRAM
Capacity: 16GB (4 x 4GB)
Speed: DDR3 1600 (PC3 12800)
CAS Latency: 9
Timing: 9-9-9-24
Multi-channel Kit: Dual Channel
Table A.5 Software
Ubuntu: 11.10 (12.04)
Blender: 2.58-svn37702-1ubuntu1
libblas3gf: 1.2.20110419-2ubuntu1
libblas-dev: 1.2.20110419-2ubuntu1
libatlas-dev: 3.8.4-3build1
liblapack3gf: 3.3.1-1
liblapack-dev: 3.3.1-1
libhdf5-serial-dev: 1.8.4-patch1-2ubuntu4
libhdf5-serial-1.8.4: 1.8.4-patch1-2ubuntu4
gfortran: 4:4.6.1-2ubuntu5
Appendix B
Python Code
B.1 Version 2
#Libraries used

import bpy
import sys

#The following string is used to indicate where the library packages are stored;
# change to match the location on your system. This is necessary to find numpy
# and h5py. If the libraries are already in one of the searched directories then
# comment out this section of code.

systemPath = '/usr/local/lib/python3.2/dist-packages'

#The following checks against each element of the system path list and, if the
# variable systemPath is not found, adds it. This could break if your sys.path
# variable contains elements that are not strings. If the code breaks, this
# is something to check.

if [s for s in sys.path if systemPath in s] == []:
    sys.path.append(systemPath)

import numpy
import h5py

#Configuration values

#Name is the name of the hdf file and the path is the location to it.
#In the case of the MYD02SSH you can use the internal latitude and longitude
# coordinates because they map one to one. The other data set, at
# 1km resolution, does not include one-to-one coordinate points to data
# points, therefore you need to include an additional data file that contains
# those, which is where name2 comes from.

name = 'MYD02SSH.A2012001.1945.005.2012002174518.1.h5'
#name2 = 'MYD03.A2012001.1945.005.2012002170952.1.h5'

#ShortName is to get around the name size limitation in Blender.
#Otherwise I would use the name of the file. The newer version of Blender
# will take longer names up to 63 characters in length. So this will be
# phased out in later versions but will necessitate using Blender 2.62
# and up.
ShortName = 'First'
path = ''

#These are the dimensions of the data set. This data does exist within the
# data file itself but I have not taken the time to learn how to access that
# specific attribute yet. This might be because the hdf-eos extension is not
# recognized because it has nested attributes. Either way these values can
# be found using HDFView and entered here. n2 and m2 give the shape of the
# faces, which is one row and column less, and are referenced so often that it
# makes more sense to compute them once and reference.
n = 406
n2 = n-1
m = 271
m2 = m-1

#Creating the file handle
H5File = h5py.File(path+name,'r')
#H5File2 = h5py.File(path+name2,'r')

#These are the list objects where the data for the mesh frame will be stored
vertex_list = []
face_list = []

#This block of code computes the x,y,z coordinates from the latitude and
# longitude included in the hdf file. Numpy makes this much faster. The dtype
# is included to make the accuracy achieved match my previous method of using
# the math package and a for loop. If this level of accuracy is unnecessary
# then that can be removed for a speed increase of three times for this
# section.

a = 6.378137 #This is the semi major axis of the WGS 84 reference datum
b = 6.356752314245 #This is the semi minor axis
p = ((numpy.array(H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']
                  ['Latitude'],dtype=float)*numpy.pi)/180)
r = numpy.sqrt(1/((numpy.square(numpy.sin(p))/(a*a))
                  +(numpy.square(numpy.cos(p))/(b*b))))
t = ((numpy.array(H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']
                  ['Longitude'],dtype=float)*numpy.pi)/180)

x = numpy.multiply(numpy.multiply(r,numpy.sin(p)),numpy.cos(t))
y = numpy.multiply(numpy.multiply(r,numpy.sin(p)),numpy.sin(t))
z = numpy.multiply(r,numpy.cos(p))
vertex_list = zip(x.flatten().tolist(),y.flatten().tolist(),
                  z.flatten().tolist())

#This section could also be sped up a bit with some work.
#The face list is a set of four numbers that are indexes into the list of
# vertex points and should be in the order of upper left, lower left,
# lower right, and upper right. I don't know which direction this puts the
# norm of the face facing.
for i in range(0,n2):
    for j in range(0,m2):
        face_list.append(((i*m+j),(i*m+j+m),(i*m+j+m+1),(i*m+j+1)))

#Generate the mesh object and link with the object
me = bpy.data.meshes.new(ShortName+'Mesh')
ob = bpy.data.objects.new(ShortName, me)
ob.location = (0,0,0)
ob.show_name = True
# Link object to scene
bpy.context.scene.objects.link(ob)

# Create mesh from given verts, edges, faces. Either edges or
# faces should be [], or you ask for problems
me.from_pydata(list(vertex_list), [], face_list)

# Update mesh with new data
me.update(calc_edges=True)

#The following generates a new material if there is none available
if len(bpy.data.materials.keys())<1:
    bpy.ops.material.new()

#Takes the name of the first material
MaterialName = bpy.data.materials.keys()[0]

#This next line sets the active object to the one we just created, necessary
# for the ops commands to work
bpy.context.scene.objects.active = ob

#Add a new material slot to the active object
bpy.ops.object.material_slot_add()

#Set the Material for the object to the first one
ob.material_slots[''].material = bpy.data.materials[MaterialName]

#Set the first material to use Vertex color painting
bpy.data.materials[MaterialName].use_vertex_color_paint = True

#Finally generate the data structure that stores the vertex paint information
bpy.data.objects[ShortName].data.vertex_colors.new()

#Take in the information about the recorded radiances and scale to between 0
# and 1. This is also where you can set what channel to use.
Edata1 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][7]
          /32767)
Edata2 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][3]
          /32767)
Edata3 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']
          ['EV_500_Aggr1km_RefSB'][0]/32767)

#This for loop structure is to make sure that the elements are between 0 and 1,
# because outside of that indicates error. For now I am zeroing the data for
# testing purposes. This will need to be addressed in greater depth later.
for i in range(0,n):
    for j in range(0,m):
        if Edata1[i][j] > 1:
            Edata1[i][j] = 0
        if Edata2[i][j] > 1:
            Edata2[i][j] = 0
        if Edata3[i][j] > 1:
            Edata3[i][j] = 0

for i in range(0,n2):
    for j in range(0,m2):
        index = i*m2+j
        ob.data.vertex_colors['Col'].data[index].color1 = (Edata1[i][j],
                                                           Edata2[i][j],
                                                           Edata3[i][j])
        ob.data.vertex_colors['Col'].data[index].color2 = (Edata1[i+1][j],
                                                           Edata2[i+1][j],
                                                           Edata3[i+1][j])
        ob.data.vertex_colors['Col'].data[index].color3 = (Edata1[i+1][j+1],
                                                           Edata2[i+1][j+1],
                                                           Edata3[i+1][j+1])
        ob.data.vertex_colors['Col'].data[index].color4 = (Edata1[i][j+1],
                                                           Edata2[i][j+1],
                                                           Edata3[i][j+1])
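The face-list construction in the listing above can be checked in isolation: for a small vertex grid the quads should tile it completely. A standalone sketch of the same indexing, independent of Blender:

```python
def build_faces(n, m):
    """Quads over an n x m vertex grid, indexed as in the listing above:
    upper left, lower left, lower right, upper right."""
    faces = []
    for i in range(n-1):
        for j in range(m-1):
            faces.append((i*m+j, i*m+j+m, i*m+j+m+1, i*m+j+1))
    return faces

faces = build_faces(3, 3)
print(len(faces))   # 4 quads for a 3x3 grid of vertices
print(faces[0])     # (0, 3, 4, 1)
```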
B.2 Version 1
#Libraries used

import bpy
import math
import sys
import time

#The following string is used to indicate where the library packages are stored;
# change to match the location on your system. This is necessary to find numpy
# and h5py. If the libraries are already in one of the searched directories then
# comment it out.

sys.path.append('/usr/local/lib/python3.2/dist-packages')

import numpy
import h5py

#Internal functions

#WGS84toXYZ takes in the latitude and longitude in terms of degrees
# and computes the x,y,z coordinates for them based on
# the WGS84 geoid. This does not account for any rotation of the coordinate
# system.
def WGS84toXYZ(phi,theta):
    a = 6.378137 #This is the semi major axis of the WGS 84 reference datum
    b = 6.356752314245 #This is the semi minor axis
    p = (phi*numpy.pi)/180
    r = math.sqrt(1/((numpy.sin(p)*numpy.sin(p)/(a*a))
                     +((numpy.cos(p)*numpy.cos(p))/(b*b))))
    t = ((theta)*math.pi)/180
    x = r*numpy.sin(p)*numpy.cos(t)
    y = r*numpy.sin(p)*numpy.sin(t)
    z = r*numpy.cos(p)
    return x,y,z

#This function takes in the latitude and longitude and computes the x,y,z
# coordinate for them based on the Sinusoidal projection system used by some
# MODIS gridded data products
def MODISSINtoXYZ(phi,theta):
    r = 6.371007181
    p = (phi*math.pi)/180
    t = (theta*math.pi)/180 #convert the longitude to radians as well
    x = r*math.sin(p)*math.cos(t)
    y = r*math.sin(p)*math.sin(t)
    z = r*math.cos(p)
    return x,y,z

#Configuration values

#Name is the name of the hdf file and the path is the location to it.
name = 'MYD02SSH.A2012001.1945.005.2012002174518.1.h5'
#name2 = 'MYD03.A2012001.1945.005.2012002170952.1.h5'
ShortName = 'First'
path = ''

#These are the dimensions of the data set. This data does exist within the
# data file itself but I have not taken the time to learn how to access that
# specific attribute yet. This might be because the hdf-eos extension is not
# recognized because it has nested attributes. Either way these values can
# be found using HDFView and entered here. n2 and m2 are referenced so often
# that it makes more sense to compute them once and reference.
n = 406
n2 = n-1
m = 271
m2 = m-1

#Profiling of the for loops, to see where the most time is spent, is handled
# through time.clock and written to file every time the script is run.
Time = open('time','at')
Time.write('This is the time for a run of '+name+'\n')
Time1 = 0
Time2 = 0

#Creating the file handle
H5File = h5py.File(path+name,'r')
#H5File2 = h5py.File(path+name2,'r')

#These are the list objects where the data for the mesh frame will be stored
vertex_list = []
face_list = []

#Take the time before starting the loop
Time1 = time.clock()

for i in range(0,n):
    for j in range(0,m):
        vertex_list.append(WGS84toXYZ(
            H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']['Latitude'][i][j],
            H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']['Longitude'][i][j]))

#Take the time after finishing the loop
Time2 = time.clock()

Time.write('Converting from WGS84 to XYZ took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()

for i in range(0,n2):
    for j in range(0,m2):
        face_list.append(((i*m+j),(i*m+j+m),(i*m+j+m+1),(i*m+j+1)))

Time2 = time.clock()
Time.write('Generating the face list took '+str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
me = bpy.data.meshes.new(ShortName+'Mesh')
ob = bpy.data.objects.new(ShortName, me)
ob.location = (0,0,0)
ob.show_name = True
# Link object to scene
bpy.context.scene.objects.link(ob)

# Create mesh from given verts, edges, faces. Either edges or
# faces should be [], or you ask for problems
me.from_pydata(vertex_list, [], face_list)

# Update mesh with new data
me.update(calc_edges=True)
Time2 = time.clock()
Time.write('Generating the object and linking it took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()
#The following generates a new material if there is none available
if len(bpy.data.materials.keys())<1:
    bpy.ops.material.new()

#Takes the name of the first material
MaterialName = bpy.data.materials.keys()[0]

#This next line sets the active object to the one we just created, necessary
# for the ops commands to work
bpy.context.scene.objects.active = ob

#Add a new material slot to the active object
bpy.ops.object.material_slot_add()

#Set the Material for the object to the first one
ob.material_slots[''].material = bpy.data.materials[MaterialName]

#Set the first material to use Vertex color painting
bpy.data.materials[MaterialName].use_vertex_color_paint = True

#Finally generate the data structure that stores the vertex paint information
bpy.data.objects[ShortName].data.vertex_colors.new()

Time2 = time.clock()
Time.write('Generating the material and adding the vertex paint layer took '
           +str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
#Take in the information about the recorded radiances and scale to between 0
# and 1. This is also where you can set what channel to use.
Edata1 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][7]
          /32767)
Edata2 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][3]
          /32767)
Edata3 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']
          ['EV_500_Aggr1km_RefSB'][0]/32767)
Time2 = time.clock()
Time.write('Reading in the data took '+str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
#This for loop structure is to make sure that the elements are between 0 and 1,
# because outside of that indicates error. For now I am zeroing the data for
# testing purposes.
for i in range(0,n):
    for j in range(0,m):
        if Edata1[i][j] > 1:
            Edata1[i][j] = 0
        if Edata2[i][j] > 1:
            Edata2[i][j] = 0
        if Edata3[i][j] > 1:
            Edata3[i][j] = 0

Time2 = time.clock()
Time.write('Checking for error codes in the data took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()
for i in range(0,n2):
    for j in range(0,m2):
        index = i*m2+j
        ob.data.vertex_colors['Col'].data[index].color1 = (Edata1[i][j],
                                                           Edata2[i][j],
                                                           Edata3[i][j])
        ob.data.vertex_colors['Col'].data[index].color2 = (Edata1[i+1][j],
                                                           Edata2[i+1][j],
                                                           Edata3[i+1][j])
        ob.data.vertex_colors['Col'].data[index].color3 = (Edata1[i+1][j+1],
                                                           Edata2[i+1][j+1],
                                                           Edata3[i+1][j+1])
        ob.data.vertex_colors['Col'].data[index].color4 = (Edata1[i][j+1],
                                                           Edata2[i][j+1],
                                                           Edata3[i][j+1])

Time2 = time.clock()
Time.write('Painting the vertices took '+str(Time2-Time1)+' seconds\n')
Time.close()
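As a sanity check on the radius expression inside WGS84toXYZ, note that it collapses to the semi-minor axis when p = 0 and to the semi-major axis when p = pi/2. A standalone sketch using the same constants (units of thousands of kilometers, as in the listings):

```python
import math

a = 6.378137        # WGS84 semi-major axis, thousands of km
b = 6.356752314245  # WGS84 semi-minor axis, thousands of km

def radius(p):
    """The r computed by WGS84toXYZ for an angle p in radians."""
    return math.sqrt(1/((math.sin(p)**2)/(a*a) + (math.cos(p)**2)/(b*b)))

# At p = 0 only the cosine term survives, so r = b; at p = pi/2, r = a
print(abs(radius(0) - b) < 1e-9)           # True
print(abs(radius(math.pi/2) - a) < 1e-9)   # True
```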
B.3 Version 0
#Libraries used

import bpy
import math
import sys
import time

#The following string is used to indicate where the library packages are stored;
# change to match the location on your system. This is necessary to find numpy
# and h5py. If the libraries are already in one of the searched directories then
# comment it out.

sys.path.append('/usr/local/lib/python3.2/dist-packages')

import numpy
import h5py

#Internal functions

#WGS84toXYZ takes in the latitude and longitude in terms of degrees
# and computes the x,y,z coordinates for them based on
# the WGS84 geoid. This does not account for any rotation of the coordinate
# system.
def WGS84toXYZ(phi,theta):
    a = 6.378137 #This is the semi major axis of the WGS 84 reference datum
    b = 6.356752314245 #This is the semi minor axis
    p = (phi*numpy.pi)/180
    r = math.sqrt(1/((numpy.sin(p)*numpy.sin(p)/(a*a))
                     +((numpy.cos(p)*numpy.cos(p))/(b*b))))
    t = ((theta)*math.pi)/180
    x = r*numpy.sin(p)*numpy.cos(t)
    y = r*numpy.sin(p)*numpy.sin(t)
    z = r*numpy.cos(p)
    return x,y,z

#This function takes in the latitude and longitude and computes the x,y,z
# coordinate for them based on the Sinusoidal projection system used by some
# MODIS gridded data products
def MODISSINtoXYZ(phi,theta):
    r = 6.371007181
    p = (phi*math.pi)/180
    t = (theta*math.pi)/180 #convert the longitude to radians as well
    x = r*math.sin(p)*math.cos(t)
    y = r*math.sin(p)*math.sin(t)
    z = r*math.cos(p)
    return x,y,z

#Configuration values

#Name is the name of the hdf file and the path is the location to it.
name = 'MYD02SSH.A2012001.1945.005.2012002174518.1.h5'
#name2 = 'MYD03.A2012001.1945.005.2012002170952.1.h5'
ShortName = 'First'
path = ''

#These are the dimensions of the data set. This data does exist within the
# data file itself but I have not taken the time to learn how to access that
# specific attribute yet. This might be because the hdf-eos extension is not
# recognized because it has nested attributes. Either way these values can
# be found using HDFView and entered here. n2 and m2 are referenced so often
# that it makes more sense to compute them once and reference.
n = 406
n2 = n-1
m = 271
m2 = m-1

#Profiling of the for loops, to see where the most time is spent, is handled
# through time.clock and written to file every time the script is run.
Time = open('time','at')
Time.write('This is the time for a run of '+name+'\n')
Time1 = 0
Time2 = 0

#Creating the file handle
H5File = h5py.File(path+name,'r')
#H5File2 = h5py.File(path+name2,'r')

#These are the list objects where the data for the mesh frame will be stored
vertex_list = []
face_list = []

#Take the time before starting the loop
Time1 = time.clock()

for i in range(0,n):
    for j in range(0,m):
        vertex_list.append(WGS84toXYZ(
            H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']['Latitude'][i][j],
            H5File['MODIS_SWATH_Type_L1B']['Geolocation Fields']['Longitude'][i][j]))

#Take the time after finishing the loop
Time2 = time.clock()

Time.write('Converting from WGS84 to XYZ took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()

for i in range(0,n2):
    for j in range(0,m2):
        face_list.append(((i*m+j),(i*m+j+m),(i*m+j+m+1),(i*m+j+1)))

Time2 = time.clock()
Time.write('Generating the face list took '+str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
me = bpy.data.meshes.new(ShortName+'Mesh')
ob = bpy.data.objects.new(ShortName, me)
ob.location = (0,0,0)
ob.show_name = True
# Link object to scene
bpy.context.scene.objects.link(ob)

# Create mesh from given verts, edges, faces. Either edges or
# faces should be [], or you ask for problems
me.from_pydata(vertex_list, [], face_list)

# Update mesh with new data
me.update(calc_edges=True)
Time2 = time.clock()
Time.write('Generating the object and linking it took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()
#The following generates a new material if there is none available
if len(bpy.data.materials.keys())<1:
    bpy.ops.material.new()

#Takes the name of the first material
MaterialName = bpy.data.materials.keys()[0]

#This next line sets the active object to the one we just created, necessary
# for the ops commands to work
bpy.context.scene.objects.active = ob

#Add a new material slot to the active object
bpy.ops.object.material_slot_add()

#Set the Material for the object to the first one
ob.material_slots[''].material = bpy.data.materials[MaterialName]

#Set the first material to use Vertex color painting
bpy.data.materials[MaterialName].use_vertex_color_paint = True

#Finally generate the data structure that stores the vertex paint information
bpy.data.objects[ShortName].data.vertex_colors.new()

Time2 = time.clock()
Time.write('Generating the material and adding the vertex paint layer took '
           +str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
#Take in the information about the recorded radiances and scale to between 0
# and 1. This is also where you can set what channel to use.
Edata1 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][7]
          /32767)
Edata2 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']['EV_1KM_RefSB'][3]
          /32767)
Edata3 = (H5File['MODIS_SWATH_Type_L1B']['Data Fields']
          ['EV_500_Aggr1km_RefSB'][0]/32767)
Time2 = time.clock()
Time.write('Reading in the data took '+str(Time2-Time1)+' seconds\n')

Time1 = time.clock()
#This for loop structure is to make sure that the elements are between 0 and 1,
# because outside of that indicates error. For now I am zeroing the data for
# testing purposes.
for i in range(0,n):
    for j in range(0,m):
        if Edata1[i][j] > 1:
            Edata1[i][j] = 0
        if Edata2[i][j] > 1:
            Edata2[i][j] = 0
        if Edata3[i][j] > 1:
            Edata3[i][j] = 0

Time2 = time.clock()
Time.write('Checking for error codes in the data took '+str(Time2-Time1)
           +' seconds\n')

Time1 = time.clock()
for i in range(0,n2):
    for j in range(0,m2):
        index = i*m2+j
        ob.data.vertex_colors['Col'].data[index].color1 = (Edata1[i][j],
                                                           Edata2[i][j],
                                                           Edata3[i][j])
        ob.data.vertex_colors['Col'].data[index].color2 = (Edata1[i+1][j],
                                                           Edata2[i+1][j],
                                                           Edata3[i+1][j])
        ob.data.vertex_colors['Col'].data[index].color3 = (Edata1[i+1][j+1],
                                                           Edata2[i+1][j+1],
                                                           Edata3[i+1][j+1])
        ob.data.vertex_colors['Col'].data[index].color4 = (Edata1[i][j+1],
                                                           Edata2[i][j+1],
                                                           Edata3[i][j+1])

Time2 = time.clock()
Time.write('Painting the vertices took '+str(Time2-Time1)+' seconds\n')
Time.close()
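The range check in the double loop above zeroes any scaled value greater than one; with the data already in NumPy arrays, the same behavior can be written without loops. A sketch using boolean-mask assignment, keeping the same zero-for-error convention:

```python
import numpy

def zero_invalid(data):
    """Zero every scaled radiance above 1, mirroring the checking loops."""
    data = numpy.asarray(data, dtype=float)
    data[data > 1] = 0  # mask assignment replaces the nested if statements
    return data

print(zero_invalid([0.2, 1.5, 0.9]))  # the 1.5 entry is zeroed
```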