Report No. 91
Master Thesis
Simulation of B-scan Imaging Using
Linear Acoustic Theory and Photographic Image Data
-A Module in The Image Guided Surgery Training Tool Project
Developed In Cooperation With
The Nuclear and Radiological Engineering Department At
The University of Florida
Simulering av B-scan-bildgivning
med hjälp av linjär akustikteori och fotografisk bilddata
By
Mathias Nygård
Gainesville 2001
ABSTRACT
Tools in the image guided surgery arena are aimed at assisting in the performance and planning of
medical operations. They are designed to aid in decision-making during the pre-operational phase
and navigation of surgical instruments in the intra-operational phase. In this thesis, a computer
program has been developed to simulate the ultrasound B-scans provided by the SiteRite® II image
guidance tool. This equipment constitutes a portable ultrasound imaging system dedicated to needle-guidance procedures for percutaneous vascular access. It is used interactively in the intra-operational phase by the clinical staff during minimally invasive medical operations in the cervical region, but is
also consulted during the pre-operational phase of other procedures. Simulating the B-scans of this
equipment is a step in developing a training tool replica of the SiteRite unit. The surgeon will be able
to use this training tool to gain experience and improve his skills before operating the real equipment.
The IDL programming language for Windows was chosen as the software development
environment. Anatomically correct photographic images of transverse slices of the cervical region
were transformed into gray scale to serve as input to the program. Based on linear acoustic theory
and transmission coefficients of selected human tissue types, the ultrasound transport through, and
echoes from, anatomical structures in the images are calculated. The required acoustic parameters are
read from transform tables where image gray scale values have been mapped to pre-collected
acoustic properties of tissues.
Resulting B-scan images closely resemble real ultrasound images obtained with the SiteRite system. The overall appearance is good, and anatomical structures and view-dependent phenomena such as shadows can be observed, but artifacts can also appear. Where the anatomical structures in the tissue image are represented by very dark regions, undesired reflections can occasionally speckle the simulated B-scan beyond recognition. These artifacts can be traced to the technique used for mapping gray scale to acoustic properties. Improvement of the segmentation technique has therefore been suggested for future work.
Keywords: Ultrasound simulation, image guidance, training tool, cervical region, computer-aided
surgery
SAMMANFATTNING
Tools used for image-guided surgery aim to assist the staff in planning and performing medical operations. They are meant to support decision-making during the pre-operational phase and the navigation of the surgical instruments during the intra-operational phase of the operation. Within the scope of this thesis, a program has been written to simulate the B-scan images generated by the SiteRite® II, an ultrasound-based imaging tool. The equipment is portable and is used for visual guidance in procedures involving percutaneous vascular access with a needle. It is used intra-operationally and interactively in minimally invasive operations, mostly concerning the cervical region, but is also consulted pre-operationally in other types of procedures. Simulating the B-scan images is a step in the development of a replica of the SiteRite. This replica is to become a training tool that allows the surgeon to practice his skills. The surgeon can thereby gain experience in handling the equipment and interpreting its images before entering clinical work with patients and being confronted with reality.
The IDL programming language was chosen as the development environment. Anatomically correct images of the cervical region were transformed to gray scale and used as input to the program. Based on linear acoustic theory and transmission coefficients of different tissue types, the transport through, and echoes from, the anatomical structures in the image are calculated. The required parameters are taken from a table in which the gray values of the image have been mapped to previously compiled acoustic properties.
The resulting B-scan images generally agree well with the images taken with the SiteRite. Anatomical structures and, for example, shadow phenomena appear well, but artifacts also occur in the images. Where tissues in the input image are represented by a very dark gray tone, undesired reflections can appear at the corresponding location in the simulated ultrasound image. As a consequence, a dark anatomical structure may not easily be distinguished from its surroundings because unwanted spots or salt-and-pepper-like noise fill its place. This can be traced to the segmentation technique used to assign acoustic properties to the gray values of the image data. Improvement of the segmentation technique has therefore been suggested for future work.
Keywords: ultrasound simulation, image guidance, training tool, cervical region, computer-aided surgery
ACKNOWLEDGEMENTS
First and foremost I wish to express my appreciation to Dr David Hintenlang for providing me with this master thesis project, and whose personality created a friendly and warm atmosphere of collaboration.
Thanks also to Dr Wesley Bolch for initially inviting me to University of Florida. I don’t think I ever
would have made it here without your help. I put my hands together for Archana Mayani for being my linguistic consultant during long office hours in the lab, which by the way was an exceptional
environment to think and discuss in, away from the sunlight and (the) fresh air (of the tempting
outdoors). I won’t forget your o’so nice biggedibong-sounding music, and I don’t think you’ll forget
mine. A single but well aimed hand clap goes out to Chris, Didier, Paul and Lisa and the other
department students for adding cultural diversity and amusing moments to my stay in Gainesville. I
wish I could have done some more diving with some of you, and maybe eaten some Chinese food
together with others of you. We never got there.
I am grateful to Dr Rune “Texas” Lindgren for accommodating me in “The Swedes’ Nest”
down at Stoneridge Apartments. Since I came, the place has been shaped up with the collective
efforts of fellow students from Sweden who also deserve a great shoulder cuff, they know why. The
flying friends of Fluid Dynamic (Jonas, Marcus and Per) for fetching me from the airport with their
ugly car, all the furiously friendly physicists (Mariwan, Henrik and Gustav) for their corny jokes and
seldom seen earnest gestures of goodwill all the way through my stay, every one of the not so modest
nor mature, but bright and soon good looking, mechanical engineering students (Johan, Maria &
Basse, Maria & Martin) I had the pleasure to make acquaintance with. Key West, Miami, Bikers
Week, Savannah .... Thanks for filling my spare time with singing, cooking, bling bling and other ingtings. Post party hitch hiking, or should I say highjacking, for example!
A special gratitude to all my near and dear back home in Sweden for holding the fortress. And
even though I met with you during Christmas, Jessica, I can't wait to see you now. Moose meat, cloudberry jam, sidewalks, hugs and the whole lot. Go Sweden!!!
TABLE OF CONTENTS
ABSTRACT
SAMMANFATTNING
ACKNOWLEDGEMENTS
TABLE OF CONTENTS
1. LITERATURE REVIEW
1.1. IMAGE GUIDED SURGERY REVIEW
1.1.1. Interactive Pre-operative Planning Tools
1.1.2. Intra-Operative Real-Time Surgery Guidance Tools
1.2. ULTRASOUND SIMULATION REVIEW
2. PROJECT BACKGROUND AND INTRODUCTION
2.1. GOALS OF THE ULTRASOUND IMAGE GUIDED SURGERY TRAINING TOOL PROJECT
2.2. THE DYMAX SITERITE® II ULTRASOUND IMAGING EQUIPMENT
2.3. MASTER THESIS OBJECTIVES
2.4. MINIMAL INVASIVE MEDICAL OPERATIONS REQUIRE TOOLS FOR GUIDANCE
3. PHYSICAL PRINCIPLES OF ULTRASOUND
3.1. ASSUMING LONGITUDINAL ULTRASOUND WAVE FORM
3.2. THE ULTRASOUND TRANSDUCER GENERATES PLANE WAVEFRONTS
3.3. PARAMETERS OF IMPORTANCE TO THE ACOUSTIC PROPERTIES OF TISSUE
3.3.1. Acoustic Impedance
3.3.2. Wave Equation
3.3.3. Intensity
3.3.4. The Attenuation Coefficient
4. PROTOTYPE ULTRASOUND IMAGING TECHNIQUES
4.1. A-SCAN
4.2. B-SCAN
5. PHENOMENA INCORPORATED IN THE ULTRASOUND MODEL
5.1. TRANSMISSION AND REFLECTION
5.2. NEITHER SCATTER NOR SPECKLE
6. METHODS
6.1. BUILDING A TEST IMAGE
6.2. ANATOMICALLY CORRECT TISSUE MODEL FOR EVALUATION
6.3. EXTRACTING ACOUSTIC IMPEDANCE AND ATTENUATION COEFFICIENTS FROM GRAY SCALE IMAGE DATA
6.4. SUPERIMPOSING SEVERAL A-SCANS TO BUILD THE B-SCAN
7. RESULT AND DISCUSSION
7.1. B-SCAN ALGORITHM APPLIED TO THE TISSUE TEST MODEL
7.2. B-SCAN ALGORITHM APPLIED TO A GRAY SCALE IMAGE OF THE CERVICAL REGION
7.3. FILTERED GRAY SCALE IMAGES OF THE CERVICAL REGION FOR FUTURE WORK
8. CONCLUSION
APPENDICES
A. TISSUE MODEL OF THE CERVICAL REGION
B. TRANSFORM TABLES USED
C. TGC-FILTERS
D. ALGORITHM APPLIED TO TEST IMAGE
E. EXAMPLES OF SIMULATED B-SCAN IMAGES
F. ARTIFACT EXAMPLE IN JUGULAR B-SCAN
G. MEDIAN FILTERED TISSUE MODEL
H. SMOOTH FILTERED TISSUE MODEL
I. HISTOGRAM FILTERED TISSUE MODEL
J. LIST OF ABBREVIATIONS AND ENGLISH-SWEDISH TRANSLATIONS OF KEYWORDS
K. PROGRAM MANUAL
L. PROGRAM CODE
M. REFERENCES
Simulation of B-scan Imaging
Using Linear Acoustics and Photographic Image Data
Mathias Nygård
1. LITERATURE REVIEW
There are numerous types of equipment in the field of image-guided surgery. A review of
representative equipment and methods will be given in this section to show what can be achieved in
this field. The rest of this report will deal with the visual guidance provided by the Dymax SiteRite®
II and how to simulate it. A section of this literature review presents previous work done on
ultrasound imaging simulation.
1.1. Image Guided Surgery Review
Image-guided surgery is essentially the art of performing medical operations with guidance from
some advanced system that supports the surgeon in his task. The general concept for such a system
is to provide supportive visual information before and during a medical operation. The medical
operation can generally be divided into two phases, a pre-operational and an intra-operational phase.
The operation is planned during the pre-operational phase and the intra-operational phase is the part
during which the actual surgery is performed.
Image guided surgery tools have been developed to assist in both of these phases and can
therefore be based on very different imaging modalities and serve a wide range of purposes. The type
of information provided varies from equipment to equipment and with whether it contributes to the pre- or intra-operational phase. In the intra-operational phase, this information could concern how to
operate surgical instruments and tools in a region where direct visual control by eye no longer is
achievable. In the pre-operational phase, the information can come from a system used for tracking
and highlighting critical structures in images taken in order to make them more visible to the clinician
planning the treatment.
In this subsection a few examples of image guidance systems will be given along with a
discussion to elucidate the purpose of image-guided systems. To give perspective, it can be mentioned that ongoing research also investigates the possibility of providing information via other
cognitive modalities than vision. Alternative human-machine interfaces, which for example are based
on sound, are under development 1 and will be mentioned briefly.
1.1.1. Interactive Pre-operative Planning Tools
To diagnose and plan medical operations and treatments, images are often acquired and analyzed.
This is of course done before entering the operation theatre and is therefore known as pre-operative
imaging. Analyzing the images and making a diagnosis was first performed in a purely visual manner
by the clinician. He compared what he perceived from the images with his own knowledge of
anatomy and pathology and came up with a surgical plan. In recent years, the increased capabilities of computers and imaging have paved the way for tools that can assist in diagnosis and planning. The
information they provide can later be passed on to the image guided surgery tools used in the
subsequent phase: the intra-operational phase. Here it is used interactively to give the surgeon real-time feedback highlighting deviations from the original surgical plan.
A review of the image guided surgery tools available shows that the different (pre-operative)
imaging techniques used range from ultrasound imaging, which is restricted to soft tissue imaging, to
CT suitable for imaging more bone-like structures like the skull region. However, the purpose of
every image guidance tool in the pre-operational phase is to provide information on the spatial
organization and localization of structures. More precise dosimetry and more precise, well-planned surgery can thereby be achieved.
M. Breeuwer et al. 1 give a summary of image-guided surgery reported within the framework of the European Applications in Surgical Interventions (EASI), a project devoted to improving the effectiveness and quality of image-guided surgery. The project focuses on two image-guided surgery
application areas: Neurosurgery of the brain and vascular abdominal aortic surgery of aneurysms.
Several of the tools are in or have their origin in the product line of Philips Medical Systems.
The Philips EasyVision CT/MR planning station exemplifies a tool used in pre-operative work
and is described in the text by Breeuwer. It consists of a Sun workstation with software for planning
in the intra-operational phase. CT, MR and, in some versions, CT angiography (CTA) are used for imaging and serve as input to this system. The software is dedicated to different tasks. Scanner-induced geometric distortions, coming from imperfect magnetic fields in MRI or imprecise tilting of the gantry in CT, can be corrected with satisfactory results by the software. Accurate images not only help in diagnosis but are also a prerequisite for precise navigation and positioning in the intra-operational phase, which is described in subsection 1.1.2. The EasyVision software is also designed
for visualizing structures in 3D. For this purpose it uses advanced automatic or semi-automatic
segmentation techniques. Here, segmentation is the art of subdividing image data into clearly
different structures. For example, a tumor may be visually discriminated from surrounding critical
tissue, such as healthy brain tissue and major blood vessels. Different segmentation algorithms serve
different purposes. According to Breeuwer, special software has been written that can be used with
the EasyVision for segmentation of tumors in the brain, tracking of abdominal arteries, segmentation
of lumen of abdominal aorta and thrombus in abdominal aorta. Software has also been developed for
different types of planning: With the “craniotomy planner tool” biopsies can be planned in an
interactive way. Entry points in the skull bone can be planned and the path to the target can be
discussed and considered before the medical operation takes place. For abdominal aortic aneurysms, a recent technique is to reinforce the wall of the aorta by placing an endoprosthesis inside it. The dimensions and the optimal localization of the endoprosthesis can be calculated using additional software and a modified EasyVision platform.
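Segmentation in the sense used above can be illustrated with a minimal gray-level thresholding sketch. The Python code below is not the EasyVision algorithm, and the gray-level bands are hypothetical; it only illustrates the idea of subdividing image data into clearly different structures:

```python
def threshold_segment(image, thresholds):
    """Assign each pixel a class label according to gray-level bands.

    image: 2-D list of gray values (e.g. 0-255).
    thresholds: ascending gray-level boundaries between classes.
    Returns a 2-D list of integer labels 0 .. len(thresholds)."""
    def label(v):
        for i, t in enumerate(thresholds):
            if v < t:
                return i
        return len(thresholds)
    return [[label(v) for v in row] for row in image]

# Hypothetical bands: dark lumen, mid-gray soft tissue, bright bone.
image = [[10, 10, 120],
         [10, 200, 120],
         [120, 200, 200]]
labels = threshold_segment(image, [50, 180])
```

Clinical systems use far more advanced region-based and model-based methods; plain thresholding is merely the simplest instance of the concept.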
A review of applications in the field of image-guided surgery shows that almost every reported type of equipment has been developed for one particular purpose or area of application, with some specific feature that makes it particularly suitable for just that type of task. Such equipment also often, but not always, features the possibility to work interactively with the (image) information. This seems
to characterize tools for image guidance; the EasyVision software is written specifically for interactive
planning of craniotomies and with other particular software it can be interactively used in planning
abdominal surgery. Each system is designed for a specific task.
A. Fenster et al. describe another tool, developed in Canada 2. It finds its use in prostate cancer
diagnosis and treatment, where it provides the ability to interact with 3D ultrasound image data of
the prostate during both the pre- and the intra-operational phase. The system consists of
three major components: An ultrasound machine with a transducer for transrectal insertion to get
near the prostate, a microcomputer with software used for imaging, and a motorized assembly to
rotate the imaging angle of the transducer under computer control. The slender ultrasound
transducer is inserted into the rectum and rotated around its axis while a series of 2D ultrasound
images (B-scans) is taken. The procedure takes about 8 seconds and covers an 80° sector for a typical
3D-scan. The tool then reconstructs the series of 2D images into a single 3D image of the prostate
region. Using 3D visualization software the volume can be sliced and viewed from different angles
allowing a better understanding of the internal and external structures of the prostate. An optimal
treatment can thereby be planned in the pre-operational phase. Motives given by Fenster for
developing the 3D tool for transrectal imaging are based on the limitations of conventional 2D hand-held ultrasound scanning. He states that conventional methods like image-guided cryosurgery or image-guided brachytherapy already show good results in destroying tumors and preserving adjacent
tissue, but claims that their outcome is inconsistent and varies a great deal because of the difficulties
related to imaging. The conventionally used hand-held ultrasound scanner is hard to orient to the
optimal image plane due to the patient’s anatomy or position. He argues furthermore that even if the
positioning of the scanner is performed successfully, it is still hard for the surgeon to mentally project
the 3D volume of the prostate when guided by only a single 2D image taken from an awkward angle.
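The rotational acquisition and reconstruction that Fenster describes can be sketched in two dimensions: radial scan lines acquired at known angles are regridded onto a Cartesian image. The Python function below uses nearest-neighbour sampling over a full rotation; it is an illustrative simplification, not the cited system, which interpolates an 80° fan of 2D images into a 3D volume:

```python
import math

def polar_to_cartesian(scans, size):
    """Regrid radial scan lines onto a size x size Cartesian image.

    scans[k][r] is the echo sample taken at rotation angle
    2*pi*k/len(scans) and radius index r, with the rotation axis at
    the image centre.  Pixels outside the scanned disc stay 0."""
    n_ang, n_rad = len(scans), len(scans[0])
    c = (size - 1) / 2.0  # centre of the output grid
    img = [[0.0] * size for _ in range(size)]
    for iy in range(size):
        for ix in range(size):
            dx, dy = ix - c, iy - c
            r = math.hypot(dx, dy) / c * (n_rad - 1)
            if r > n_rad - 1:
                continue  # outside the scanned disc
            a = math.atan2(dy, dx) % (2 * math.pi)
            k = int(round(a / (2 * math.pi) * n_ang)) % n_ang
            img[iy][ix] = scans[k][int(round(r))]
    return img
```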
For image guidance in general, decreasing the surgeon's cognitive workload is, apart from making planning in the pre-operative phase of a medical operation easier, one of the main motives for developing and improving image guidance systems.
Operating alone, MRI, CT or conventional ultrasound imaging such as hand-held 2D
ultrasound scanners for multi-purpose use are in general not considered image guided surgery tools.
The reason is that they usually do not provide the possibility for the user to access and process the
images interactively. They merely provide diagnostic information. An image guidance tool must serve
as an interactive interface with the user.
1.1.2. Intra-Operative Real-Time Surgery Guidance Tools
With the advances of image guidance, the intra-operational phase can not only be significantly
shortened, but also carried out with greater accuracy and precision when following the course of
action planned in the pre-operational phase. Tools in the intra-operational phase often make use of
the information processed during the preceding phase of the medical operation in order to enhance
the spatial perception of the surgeon and inform him on the location of important structures and
instruments. The applications present the information by using different cognitive modalities, of
which visualisation is by far the most common. At the end of this subsection, work investigating the possibility of using sound as a means of guidance in surgery will be discussed.
When reviewing tools used for image guidance, some equipment is far more technically advanced than others. Today's least sophisticated use of image guidance is simply to utilize optical
fibers, X-rays or ultrasound for real time monitoring of medical operations. However, image
guidance not only features visualization of hard-to-reach and obscured regions, but also the
possibility to simultaneously present other valuable data to the surgeon. In the case of a sophisticated
guidance tool a computed view of the ongoing procedure may be presented to the surgeon.
Sometimes a camera view of the real world is enhanced with a superimposed virtual world. The
user’s view of the world is supplemented with objects and items whose meaning is aimed at enriching
the information content of the real environment. The purpose is not only to facilitate the 3D
perception of the surgeon, but also to make him aware of the location of, or distance to, critical
structures such as main arteries, nerves, a tumor etc. The functionality of such sophisticated systems
can be described by giving some examples.
A complement to the EasyVision, described in the previous subsection, is the EasyGuide intra-operational navigation tool. This tool is also part of the Philips Medical Systems product line. It consists of a Sun workstation, software for navigation and guidance, a set of so-called surgical pointers that are used for tracking the surgical instruments during an operation, a system of two infrared cameras for detection of the infrared light emitted by the surgical pointers, and a high-resolution monitor for display of the virtual reality model. The EasyVision and the EasyGuide together make up the so-called EASI-Neuro demonstrator system for image-guided neurosurgery in the skull region. The pre-operation images, provided by the EasyVision, are registered to the patient
anatomy by the EasyGuide equipment. This is done by mapping the images to fiducial markers
(landmarks) that are attached to the patient’s skin during the entire medical procedure. Using the
fiducial markers and the surgical pointers as spatial reference, the system is able to track the spatial
position of the surgical instruments and fit them into a virtual reality model built from the pre-operational images. The surgeon can then follow the procedure interactively on the monitor
displaying the virtual reality images with the instruments incorporated in the view. Only computer
graphics and pre-operationally taken images are featured in this virtual reality view. For example, this
helps in navigating the needle in a biopsy. In addition, structures can be distinguished and better
visualised by software and image processing. Based on the pre-operational image data the EasyGuide
is also able to calculate the distance to anatomical features that cannot be visualised on the monitor
because they lie deeper inside the tissue than can be displayed, e.g. in a plane other than that being
displayed on the monitor.
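The registration step described above, mapping pre-operational images to fiducial markers on the patient, amounts to estimating a rigid transform from corresponding landmark positions. The sketch below solves the two-dimensional least-squares problem in Python; the actual EasyGuide registration is three-dimensional and proprietary, but the 3D case is solved analogously:

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares rigid transform (rotation + translation) that maps
    the src landmarks onto the dst landmarks.  Returns (theta, tx, ty)
    such that dst ~ R(theta) * src + (tx, ty)."""
    n = len(src)
    cax = sum(p[0] for p in src) / n
    cay = sum(p[1] for p in src) / n
    cbx = sum(p[0] for p in dst) / n
    cby = sum(p[1] for p in dst) / n
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax -= cax; ay -= cay       # centre both point sets
        bx -= cbx; by -= cby
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cbx - (c * cax - s * cay)            # translation after rotation
    ty = cby - (s * cax + c * cay)
    return theta, tx, ty
```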
Sophisticated systems for image guidance known as augmented virtual reality equipment extend the virtual reality approach in the intra-operational phase of the operation even further 3.
Additional cameras are mounted in the operation theatre to peer over the surgeon’s shoulder and
observe the region undergoing operation. A 3D reconstruction of the pre-operational images is then
carefully aligned with the corresponding structures in the camera view displayed. Software is used to
augment the camera view with the virtual reality graphics calculated from the model. The result is
displayed in real-time on a monitor. The surgeon can thereby interactively monitor and navigate his
surgical instruments as he creates an incision. The instruments can be tracked and are incorporated in
the displayed view by using computer graphics. Additional information calculated from the model
can of course also be incorporated: a critical structure can be graphically emphasized, the 3D model can be made transparent to facilitate the assessment of underlying tissue, etc.
The examples given have described equipment designed for close range guidance, where the
surgeon is standing next to the patient. However, with advancing technology image guided surgery
systems have emerged that enable surgeons to perform rather difficult surgical operations without even having to be adjacent to the patient. The U.S. military has developed a system in which the surgeon operates his surgical instruments from a remote console, preferably located in a safe location such as a combat army support hospital 3. A robot has replaced him in the field. It receives instructions from the console and performs the operation under the supervision of an assistant. Operating
instruments from a distance is promoted by so-called telepresence systems: 3D visual and haptic* interfaces are principal features in the operating console and are, of course, very important for the
success of the operation. The science of Haptics has made it possible for the surgeon to, for
example, sew stitches almost as well as if manually operating the surgical instruments. A sensory
feedback system enables him to feel the force exerted on the surgical instruments during operation.
Vision and sensation are important human-computer interfaces. To promote the perceptual
presentation of biomedical data even more, sound is being used as an additional new information
display in augmented virtual reality for image-guided surgery 4. The idea is to provide the surgeon
with an additional information interface that enables him to operate his instruments without having
to take his eyes off the patient. Relieving the visual sense for other tasks, supporting multiple data streams, and offering faster perceptual processing than vision are advantages of the sound modality. Difficulties arise
in the lack of persistence that comes with sound and its dependence on individual perception by
different users. The limited human ability to perceive precise quantities and absolute values presents
another challenge.

* Haptics is the use of computers and mechanical arrangements to feel through forces on a tool. The word comes from the Greek for 'touch', haptikos.

1.2. Ultrasound Simulation Review
Although computational methods have been used by scientists for simulation purposes for a long time and in a variety of fields, they have not yet been used to any great extent by engineers solving ultrasound transport problems. The use of computational models seems to be making its way into ultrasound technology, but in general, not much work is based on simulation software as a means of exploring ultrasound physics. This makes it hard to find software dedicated to the intractable task of simulating transport through media as complex as tissue.
In work done by L. Yadong et al. 5, the prospects of assessing bone structure and bone density with ultrasound were evaluated computationally. A software package named Wave2000 was used 6 to
carry out the calculations. This package can simulate the complete 2D elastic wave equation given
parameters, such as density and viscosity, and is used by L. Yadong in transmission calculations
where a transducer and a receiver are used. It solves the equation using a method based on finite
differences (pixels) and the medium’s parameters are set according to its gray-scale values. This
software is a stand-alone application and can be used on a regular PC. A similar software solution, in
terms of its windows-like appearance and stand-alone feature, is the Imagine 3D tool kit 7. It is aimed
at performing ultrasound calculations for industry. In its newer versions, however, it particularly
features the simulation of A- and B-scans and may therefore be of interest to ultrasound simulation
in medicine. It uses a parabolic wave equation for ultrasound transport computation performed for a
voxel model of the propagation media, which is designed by the user to simulate the ultrasound
beam’s environment. It can be constructed from a list of primitive shapes or be imported from a 3D
DXF CAD program. It is not clear to the author of this review exactly how the computational work
is carried out and what hardware is required. Nor is it clear what type of input parameters the
program needs.
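The finite-difference idea behind Wave2000, solving the wave equation on a pixel grid whose material parameters are looked up from gray values, can be illustrated in one dimension. The Python sketch below solves the scalar equation u_tt = c(x)^2 u_xx with an explicit scheme; Wave2000 itself solves the full 2D elastic equation, and the gray-to-speed table here is a hypothetical stand-in:

```python
def propagate_1d(gray_line, c_of_gray, nt, dt=0.1, dx=1.0):
    """Explicit finite-difference solution of u_tt = c(x)^2 u_xx on a
    1-D grid.  The local wave speed c(x) is looked up per grid point
    from a gray value, mimicking the per-pixel parameterisation.
    A unit pulse starts near the left end; the ends are held fixed.
    Stable for c*dt/dx <= 1 (the Courant condition)."""
    n = len(gray_line)
    c = [c_of_gray[g] for g in gray_line]
    u_prev = [0.0] * n
    u = [0.0] * n
    u[1] = 1.0  # initial pulse
    for _ in range(nt):
        u_next = [0.0] * n
        for i in range(1, n - 1):
            r = (c[i] * dt / dx) ** 2
            u_next[i] = 2 * u[i] - u_prev[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        u_prev, u = u, u_next
    return u
```

With c*dt/dx = 1 the scheme transports the pulse exactly one grid cell per time step, which makes its behaviour easy to verify.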
The Imagine 3D software seems more powerful than the Wave2000 software. It has more features that enable its incorporation into medical simulation applications. For example, it is possible to control the program's functions from other software applications by using its built-in ActiveX interface. ActiveX is a Microsoft product and was launched to compete with Java. It enables
programs to communicate and utilize each other’s functionality and can only be used on a system
running Windows.
The only application with purely medical simulation intentions found is a system designed for
practicing ultrasound imaging on a mannequin 8. An ultrasound model of the human body has been
built from authentic ultrasound images using real ultrasound equipment. The transducer held by the
user is spatially tracked when scanning the mannequin. Ultrasound images are then sliced in real-time
from the model, given the orientation and position of the transducer. The software can slice the
model at an arbitrary angle and, as a consequence, the images thereby produced can show deficiencies. Shadows from the image acquisition process are cast in the same direction regardless of the angle of the simulated transducer, and inevitable artifacts caused by body/organ movements are some examples.
some examples. An obvious advantage is that the image simulation can be performed in real-time
and that the images displayed are accurate enough for practice purposes. There is also an advantage
in the possibility to simulate different pathological cases.
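The central operation of such a mannequin trainer, extracting an arbitrarily oriented image plane from a voxel model given the tracked transducer pose, can be sketched as nearest-neighbour resampling. The Python function below is an illustrative assumption about how such slicing can be done, not the cited system's implementation:

```python
def slice_volume(vol, origin, u, v, rows, cols):
    """Extract a rows x cols oblique slice from a 3-D voxel model.

    vol[z][y][x] holds the voxel values; origin is the slice corner in
    voxel coordinates, and u, v are the in-plane step vectors that a
    tracked transducer pose would define.  Out-of-volume samples are 0."""
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            x = origin[0] + r * u[0] + c * v[0]
            y = origin[1] + r * u[1] + c * v[1]
            z = origin[2] + r * u[2] + c * v[2]
            ix, iy, iz = round(x), round(y), round(z)
            row.append(vol[iz][iy][ix]
                       if 0 <= ix < nx and 0 <= iy < ny and 0 <= iz < nz
                       else 0)
        out.append(row)
    return out
```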
In summary, the methods reported in the literature for simulated ultrasound imaging are based either on solving transport wave equations or on slicing a 3D voxel set built from authentic B-scan images. Calculating the transport by solving an equation demands much memory and computational capacity from the computer used. A user looking for a medically applicable solution suited for real-time image simulation can be hindered by this fact. Moreover, if an interface exists for such a program to enable interaction with other (self-made) simulation software, it can not only be too slow for real-time imaging, but may also restrict the user's choice of platform to Windows only. The approach of slicing a voxel model requires pre-acquired ultrasound image data. The image data can be hard to gather and it inherently exhibits shadows that can be attributed to the physical properties of ultrasound and the single sample direction used during the image acquisition process.
2. PROJECT BACKGROUND AND INTRODUCTION
This section outlines the intentions of the ultrasound image guided surgery training tool project, which was initiated at the Department of Nuclear and Radiological Engineering, University of Florida. In addition, technical background on the SiteRite® II equipment and the objectives of the Master's project from which this thesis originates will be presented.
2.1. Goals of The Ultrasound Image Guided Surgery Training Tool
Project
The Department of Radiology at Shands Hospital, FL, owns a Dymax SiteRite® II ultrasound unit. It is designed primarily to monitor percutaneous intravascular operations in the cervical
region and is described further in section 2.2. In the scope of this project, a training tool simulating
this ultrasound imaging equipment is to be developed. The tool will enable the surgeon to gain
experience and practice before performing real surgery on patients, aided by utensils such as the
SiteRite. He shall be able to use the training tool as if it was the real equipment, with the ability to
experience sensation as well as the ultrasound image guidance incorporated. Incorporation of these
features, the image guidance and the sensation of operating (e.g. with a needle for vascular access),
are the major objectives of the image guided surgery training tool project. The use of a training tool
will confidently lead to safer and more precise procedures to the benefit of both patients and surgical
staff.
In a first step, a tissue model will be designed to serve as a database for the feedback systems
comprising the training tool and realizing visual guidance and sensation. The model shall provide
input not only for the program developed in the scope of this thesis, which lays the foundation
for the visual guidance, but also for the hardware to be developed within 4-5 years. The latter will
provide sensation to the surgeon to be trained.
Parameters needed for these tasks include, among others, the acoustic impedance and elasticity of
different types of tissue. This thesis deals with the acoustic properties of tissue and ultrasound
transport calculations. Furthermore, the image guided surgery tool project involves modeling the
elastic properties of tissue subject to deformation caused by the surgeon using the training tool.
These topics currently occupy two Master's students. The project as a whole will draw on many
skills.
2.2. The Dymax SiteRite® II Ultrasound Imaging Equipment
The image guided training tool to be developed is based on the SiteRite® II equipment. It is
manufactured and distributed by the Dymax Corporation 9, and it utilizes the ultrasound B-scanning
technique for imaging. The equipment consists of a stationary control box and a transducer as shown
in Figure 1. A monitor with a 3” sector display, on which the B-scan can be viewed, is built into the
control box. When in operation, the scan is performed over an angle of 25º and is updated with a frequency
of 40 Hz. Ultrasound frequencies of 7.5 MHz or 9.0 MHz can be selected on the front panel of the
control box.
To visualise the underlying tissue, the transducer is placed in direct contact with the skin closest to
the region to be visualised. It is important not to allow any air between the skin and the transducer:
the low acoustic impedance of an air column would reflect the ultrasound beam before it even
reaches the skin, so no useful image data would return to the transducer. This is why it is a
requirement to use coupling gel, which avoids air columns and provides a better environment for the
ultrasound beam.
The surgeon uses the SiteRite for visual guidance when performing surgery where the
procedure cannot be monitored directly by sight. The equipment is designed for use in any type of
soft tissue surgery requiring visual guidance, but it mostly supports surgery for vascular access in the
jugular region. As the surgeon operates the transducer, the image data is gathered by the transducer
and sent to the monitor, where it can be viewed as the fan-shaped 2D image known as a B-scan.
 3” (7.62 cm) sector display
 40 frames per second
 25º imaging sector angle
 Frequencies available are 7.5 MHz and 9.0 MHz
Figure 1.The Dymax SiteRite® II ultrasound imaging equipment.
2.3. Master Thesis Objectives
The Dymax SiteRite® II equipment provides surgical guidance by ultrasound imaging. It is chiefly
intended for medical operations in the jugular region but can be used elsewhere. A training tool
simulating the SiteRite equipment is being developed at the UF Department of Nuclear Sciences. It
will enable surgeons to practice before entering real operations involving a patient.
The purpose of this Master Thesis is to lay the foundation for the image guidance system
simulating the ultrasound guidance given by the SiteRite. Within the framework of this thesis a
program shall be designed, which simulates ultrasound transport through tissue. Input for the
software is chiefly jpg-image data of transverse slices of the cervical region. They will primarily be
collected from The Visible Human Project 10.
Bearing in mind the need for computational speed and the desire to build input data for the future
real-time application quickly and easily, the review of ultrasound simulation indicates that an
approach other than evaluating wave equations or merely slicing a pre-designed model should be
taken. The algorithm to be developed must be accurate enough, and must not be burdened with the
unwanted shadow deficiencies that come with a pre-acquired set of real ultrasound images. To allow
the use of only the readily available image data from the Visible Human Project, and furthermore to
increase computational speed, it was decided to use only ordinary image data as
input and base the calculations only on transmission coefficient data for each pixel. This approach
will be tested and evaluated in this Master Thesis.
2.4. Minimally Invasive Medical Operations Require Tools For Guidance
Image guidance applications are often indispensable when operating under circumstances that do not
allow direct visual control. Many minimally invasive operations exemplify this. A minimally invasive
operation is any operation that aims not to damage the tissue overlying the region subject to the
actual medical procedure. Medical operations of this kind are therefore overall less painful and less
harmful for the patient. Furthermore, the recovery time is shorter than if the operation were
performed in the classic straightforward manner, penetrating tissues (for example, skin) in order to
reach the region of immediate interest. Minimally invasive operations consequently cut costs for the
hospital and decrease the risk of post-operational complications. They are therefore preferred in many
cases. There is subsequently also a need for equipment guiding the surgical staff while performing
these types of operations.
3. PHYSICAL PRINCIPLES OF ULTRASOUND
Ultrasound diagnostics have gained in popularity in recent years. One explanation is that
ultrasound equipment is easy to maintain and not as expensive to purchase as, for example,
MRI equipment. In contrast with imaging methods like CT, which if incorrectly used may
expose the patient to harmful doses of X-rays, ultrasound imaging also constitutes a relatively safe
modality for the patient. Non-audible sound waves with frequencies above 20 kHz are referred to as
ultrasound, and ultrasound is not known to cause any adverse effects 11, 12, 16.
3.1. Assuming Longitudinal Ultrasound Wave Form
In general, acoustic characteristics are functions of the density and elasticity of the medium hosting
the wave. Since there are several different types of tissue, spanning a vast range of acoustic
properties, the problem of designing a model suitable for ultrasound calculations has arisen. Only
phenomena adequate for producing a B-scan simulating program have initially been incorporated in
the model.
For sound (including ultrasound) propagating inside a gas or liquid like medium, the motion of
particles is longitudinal, i.e. in the direction of propagation 13. Waves having these properties are
referred to as longitudinal. If the medium is in bulk form and is extended, shear waves will also be
present 17. The latter would make the calculations more difficult, since both waveforms must be
taken into consideration. For soft tissue, however, the intermolecular forces needed for a shear wave
to propagate cannot be sustained in the liquid-like environment 14. Hence, soft tissue will be
considered a liquid-like medium, and the calculations will only deal with longitudinal waves.
When performing transport calculations through other tissue media that could in fact host shear
waves, such as the hard and rigid skull bone, again only the longitudinal parts of the wave will be
considered. An ultrasound beam reflected by a bone-like tissue structure is bound to travel through
soft tissue before it reaches the transducer again. So even if a reflection were to cause a shear
wave, it would not be able to reach the transducer, since the intermediate soft tissue is not able to
sustain such wave types.
3.2. The Ultrasound Transducer Generates Plane Wavefronts
The transducer generates a narrow burst of ultrasound pulses, which propagate through the tissue.
They are reflected as they encounter tissue with different acoustic impedance. A returning pulse can
then be detected by the transducer through the reciprocity of the piezo crystalline material used in
the transducer head. Since the distance between the transducer and the point of reflection is
proportional to the time it takes for the echoes to return, the distance can easily be calculated and
plotted on a monitor. The transducer is formed as a cylinder, which when vibrating in the direction
of its axis, is known to radiate plane wave fronts 13, 15, 17. The program will therefore only involve
algorithms based on sound propagating with a plane wave front.
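The depth calculation mentioned above, in which the distance to a reflecting interface is derived from the echo return time, can be sketched as follows. This is an illustrative example, not code from the thesis; the constant is the soft-tissue calibration speed quoted in section 3.3.1.

```python
# Sketch (assumption: one-way depth d = c*t/2 for round-trip time t).
C_SOFT_TISSUE = 1540.0  # m/s, soft-tissue calibration speed (see 3.3.1)

def echo_depth(round_trip_time_s, c=C_SOFT_TISSUE):
    """The pulse travels to the interface and back, so depth d = c*t/2."""
    return c * round_trip_time_s / 2.0

depth_m = echo_depth(26e-6)  # an echo returning after ~26 microseconds: ~2 cm deep
```

Plotting such depths against echo amplitude is precisely what the A-scan display in section 4.1 does.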
At each point in the propagation medium (all tissue), the wave front will therefore be
considered plane. The surface of the ultrasound transducer can be modeled as an infinite number of
aligned point radiators 16, each one radiating ultrasound at the same frequency. Huygens' principle
makes it possible to show that the ultrasound radiating from the transducer has a plane wave front 17.
Calculations based on Huygens' principle also allow the ultrasound beam to be
subdivided into a near and a far field 16, 18, 19. The near field, also known as the Fresnel zone, is
characterized by a slightly converging beam, which gains in intensity when approaching its end, where
the far field begins. The greatest intensity in the beam is found just before the transition to the far
field 13. When entering the far field, the beam starts to diverge, which results in a steady loss in
intensity as the beam continues. Another name for the far field is the Fraunhofer zone. The length of
the near field and the angle of divergence at the start of the far field are given in Figure 2.
the near field and the angel of divergence at the start of the far field, are given in Figure 2.
Length of near field 
D
D2
4
Far field
Ultrasound
Transducer

A plane wave front is transmitted


Angle   arcsin 1.22


D
Figure 2. The near field length and the angle of diversion, depend on the diameter of the cylinder shaped
transducer and of the wavelength of the ultrasound radiated
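The two relations in Figure 2 can be evaluated numerically. The sketch below is illustrative only; in particular, the transducer diameter D is an assumed value, not taken from the SiteRite specification.

```python
import math

# Illustrative evaluation of the Figure 2 relations (D is an assumed value).
def near_field_length(D, wavelength):
    """Fresnel-zone length for a circular transducer: D^2 / (4*lambda)."""
    return D ** 2 / (4.0 * wavelength)

def divergence_angle_deg(D, wavelength):
    """Far-field divergence angle: theta = arcsin(1.22*lambda/D)."""
    return math.degrees(math.asin(1.22 * wavelength / D))

c, f, D = 1540.0, 7.5e6, 5e-3   # m/s, Hz, m (D assumed, not from the spec)
lam = c / f                     # ~205 micrometers at 7.5 MHz
N = near_field_length(D, lam)          # ~3.0 cm
theta = divergence_angle_deg(D, lam)   # ~2.9 degrees
```

With these assumed numbers the near field would cover most of the SiteRite's 4 cm imaging depth, which is consistent with treating the wave front as plane in the calculations.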
3.3. Parameters of Importance to The Acoustic Properties of Tissue
Parameters discussed in this section are related to the acoustic impedance. Since the acoustic
impedance will only be considered from a macro perspective and is obtained from tabulated data, the
computer program for ultrasound transport calculations will make no direct use of all the
parameters discussed here. Apart from the acoustic impedance itself, the attenuation coefficient is
the most important parameter used in the program. The intention of this section is to outline
the properties of tissue that determine its acoustic impedance.
3.3.1. Acoustic Impedance
A material’s acoustic properties are determined by its acoustic impedance (Z). Authors often refer to
both specific acoustic impedance as well as to characteristic acoustic impedance, of which the latter is the most
common one. If nothing else is said, acoustic impedance refers to this parameter. It is of interest
here, since it is used to calculate the change in intensity for the ultrasound beam when propagating
into another medium.
By definition 17, the specific acoustic impedance is the complex acoustic pressure (p) divided by the
complex particle velocity (u), resulting in a complex-valued impedance (Z_spec):

Z_spec = p/u        Specific acoustic impedance    (1)
Complex variables are used here to provide information on the phase relation between pressure
and particle velocity at a point in the medium. A phase difference appears when the wave transfers into
a medium with different acoustic properties (impedance), the frequency remaining unchanged.
When the point of calculation is in a continuous medium, i.e. not at an interface between
media of different acoustic impedance, there is no phase difference between pressure and velocity.
Hence, the complex fraction turns into a real value, and by using fundamental concepts from fluid
mechanics 17, it can be shown that the specific acoustic impedance turns into the real-valued expression
called the characteristic acoustic impedance:

Z = ρc        Characteristic acoustic impedance    (2)
This real-valued form of the acoustic impedance, the product of the density of the propagation medium
(ρ) and the speed of sound in the propagation medium (c), is what will be used in the ultrasound beam
reflection and transmission calculations. The acoustic impedance tends to grow with the weight
and rigidity of the material. Acoustic impedances for selected tissues are listed in Table 1.
Tissue          Acoustic impedance (kg m⁻² s⁻¹)
Air             0.0004×10⁶
Fat             1.38×10⁶
Kidney          1.62×10⁶
Liver           1.64×10⁶
Spleen          1.64×10⁶
Muscle          1.70×10⁶
Skull bone      7.8×10⁶

Table 1. (Characteristic) acoustic impedances for different types of tissue 2, 4, 20.
The acoustic impedance will be considered constant throughout each tissue type. Consequently,
the propagation speed (c) of a wave will also be considered constant, although it in fact depends on
the elastic properties of the tissue it propagates in (see 3.3.2). In soft tissue, generally speaking, the
speed will be taken as 1540 m/sec. Most ultrasound scanners are also calibrated for this value of the
speed of sound 11. For bone the speed of sound is about 4000 m/sec 16.
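Equation (2) is simple to apply in code. The sketch below computes characteristic impedances from density and speed of sound; the density figures are illustrative textbook values and are not given in this thesis.

```python
# Sketch of eq. (2): Z = rho * c. Densities are illustrative assumptions.
def acoustic_impedance(density, speed):
    """Characteristic acoustic impedance in kg m^-2 s^-1."""
    return density * speed

z_soft = acoustic_impedance(1060.0, 1540.0)  # ~1.6e6, matching Table 1's soft tissues
z_bone = acoustic_impedance(1900.0, 4000.0)  # ~7.6e6, close to Table 1's skull bone
```

The results fall in the ranges listed in Table 1, which supports using tabulated impedances directly rather than separate density and speed values.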
3.3.2. Wave Equation
The most fundamental properties of a longitudinally propagating ultrasound wave are described by the
longitudinal wave equation. Here it is given in one dimension, the propagation direction. The relation
shown in the wave equation 21 may also be used for determining density, sound pressure, or particle
velocity by simply substituting for the longitudinal particle displacement, ξ(x,t). Here the wave
equation is written in a form that relates the acceleration of a particle in the propagation medium to the
second-order derivative of its displacement. This wave equation describes a propagating wave in a
liquid medium, which is suitable for modeling soft tissue. c is the speed of the propagating wave in
the medium:
∂²ξ(x,t)/∂t² = c²·∂²ξ(x,t)/∂x²        Wave equation    (3)

c = √(B/ρ)        Speed of sound in a liquid-like medium (soft tissue)    (4)
The speed of a wave propagating in a liquid medium, e.g. blood, depends on the adiabatic bulk
modulus (B) and the density of the propagation medium (ρ) 17. In the case of a solid medium, such as
bone, B should be replaced by Young's modulus, Y. This parameter is also referred to as the stretch
modulus and is used whenever shear waves are present in the medium; a metal rod is one example
where this is applicable. Since shear waves cannot propagate in soft tissue (see 3.1), they will not be
taken into consideration.
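Equation (4) can be checked with a quick numerical example. The bulk modulus and density below are illustrative water-like values, chosen only because water resembles soft tissue acoustically.

```python
import math

# Sketch of eq. (4): c = sqrt(B / rho). Water-like values are assumed.
def speed_of_sound(bulk_modulus, density):
    """Longitudinal wave speed in a liquid-like medium."""
    return math.sqrt(bulk_modulus / density)

c_water = speed_of_sound(2.2e9, 1000.0)  # ~1483 m/s, near the 1540 m/s tissue value
```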
3.3.3. Intensity
Energy is transported as an acoustic wave propagates. The instantaneous acoustic intensity is defined
as the instantaneous flow of energy through a unit surface perpendicular to the direction of
propagation 22. The time average of the energy flow yields, by definition, the acoustic intensity, which
has the unit watts/meter². As the sound wave propagates it is attenuated due to thermal losses,
scattering and molecular movement in the medium.
Suppose the ultrasound is a sinusoidal, harmonically oscillating wave. At any point in a
continuous medium (no change in acoustic impedance), the pressure (p) and particle velocity (u) can
then be written:
p(x,t) = P₀·cos(ωt − kx)    [newton/meter²]    (5)

u(x,t) = U₀·cos(ωt − kx)    [meter/sec]    (6)
Note that there is no difference in phase, since the propagation medium is assumed to be
continuous (see 3.3.1). The product of wave pressure and particle velocity yields the instantaneous
intensity at any point along the direction of propagation (x) at any time. The unit becomes
work/(meter²·time) = watts/meter².
When simulating the propagation of ultrasound, it is not crucial to evaluate the intensity
instantaneously. The time average of the intensity is of greater interest, and goes under the name
acoustic intensity:
I = (1/T)·∫₀ᵀ P₀U₀·cos²(ωt − kx) dt = P₀U₀/2        (7)
Recalling the more general definition of (specific) acoustic impedance, Z_spec = p/u, with
zero phase difference and assuming a continuous medium (Z = ρc) yields the expression for acoustic
intensity most commonly used. The relation I ∝ P₀² is important:

I = P₀²/(2Z)        (8)
In ultrasound imaging the beam is pulsed, and therefore not sinusoidal as assumed at the
beginning of this subsection. This would imply that the expression for the acoustic intensity is not
correct and not valid for use here. But since the emitted pulse stretches over a period longer
than one cycle (2-3 cycles) 16, the assumptions can be regarded as accurate enough for ultrasound
transport calculations. Sometimes the acoustic intensity is referred to simply as intensity.
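Equation (8) and the proportionality I ∝ P₀² are easy to demonstrate numerically. The pressure amplitude and the impedance below are arbitrary illustrative values.

```python
# Sketch of eq. (8): I = P0^2 / (2Z). Amplitude and impedance are assumed values.
def acoustic_intensity(p0, z):
    """Time-averaged intensity in W/m^2 for pressure amplitude p0 (Pa)."""
    return p0 ** 2 / (2.0 * z)

Z = 1.63e6                       # kg m^-2 s^-1, a soft-tissue-like impedance
i1 = acoustic_intensity(1e5, Z)  # P0 = 100 kPa
i2 = acoustic_intensity(2e5, Z)  # doubling P0 quadruples the intensity
```

The squared dependence is why the same absorption coefficient can serve both pressure and intensity calculations, as noted in section 3.3.4.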
3.3.4. The Attenuation Coefficient
Due to viscosity, heat transfer, scattering of the beam and, to some extent, molecular movement,
the intensity (I) of the ultrasound beam attenuates exponentially as it propagates 16, 17, 18. The
decrease in intensity for a propagating wave can be described either by the attenuation coefficient or
by the absorption coefficient of the tissue. The attenuation coefficient (a) refers to the actual loss in
intensity and is expressed as the rate at which the intensity decreases per unit length (dB/unit
length), whereas the absorption coefficient is an exponential factor found in the expression for
intensity attenuation (9).
The attenuation coefficient is an important parameter for ultrasound transport calculations and
must be incorporated in the ultrasound transport model. This enables intensity attenuation
calculations to be performed easily. With the intensity deposited in the surface close to the transducer
(I(0)) as reference, the function describing the exponential loss in intensity is 17, 21:

I(x) = I(0)·e^(−2αx)        (9)
In this expression, α is the absorption coefficient, x indicates the depth measured from the
surface and the factor 2 comes from the intensity being proportional to the square of the pressure
(see 3.3.3). The same absorption coefficient can as a consequence be used in pressure calculations.
Regarding the depth (x), the pixel width used will be introduced as unit length in the computer
program.
The absorption coefficient expresses the rate of loss due to viscosity, thermal activity and
molecular relaxation 16, 17, 18, 21. It is used in the calculation of intensity attenuation. Based on the
definition of intensity level17 divided by depth, the attenuation coefficient (a) measured in dB / unit
length can be derived:
a = −10·log₁₀(I(x)/I(0))/x = −10·log₁₀(e^(−2αx))/x ≈ 8.686·α        [dB / unit length]    (10)
The minus sign indicates a decrease in intensity.
The absorption coefficient depends on the frequency used 16, 17. No accurate table of absorption
coefficients for different types of tissue has been found for the frequencies used by the SiteRite® II
equipment (7.5 MHz and 9.0 MHz). Attenuation coefficients at a frequency of 1 MHz are, on the other
hand, given by Bushberg et al 16, and will be used. A rule of thumb 16, 18 also says that the attenuation
increases by 0.5-1 dB per cm per MHz. These guidelines have been used in the intensity attenuation
calculations.
Tissue                    Attenuation coefficient, a (dB/cm) @ 1 MHz
Liver                     0.7-0.94
Fat                       0.6-0.65
Kidney                    0.9-1.0
Brain                     0.8-0.9
Soft tissue (average)     0.5-1.0

Table 2. Attenuation coefficients for tissues 16.
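Equations (9) and (10) combine with the rule of thumb as follows. The 0.7 dB/cm/MHz figure used here is an assumption picked from within the quoted 0.5-1 range, purely for illustration.

```python
import math

# Sketch of eqs. (9) and (10); 0.7 dB/cm/MHz is an assumed rule-of-thumb value.
def attenuated_intensity(i0, alpha_per_cm, depth_cm):
    """Eq. (9): I(x) = I(0) * exp(-2*alpha*x)."""
    return i0 * math.exp(-2.0 * alpha_per_cm * depth_cm)

a_db_per_cm = 0.7 * 7.5              # rule of thumb at 7.5 MHz -> 5.25 dB/cm
alpha = a_db_per_cm / 8.686          # invert eq. (10): a = 8.686*alpha
i_2cm = attenuated_intensity(1.0, alpha, 2.0)   # fraction left after 2 cm
loss_db = -10.0 * math.log10(i_2cm)  # equals a * depth = 10.5 dB, as eq. (10) predicts
```

In the program, depth is measured in pixel widths rather than centimeters, so α must be rescaled by the physical width of one pixel.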
4. PROTOTYPE ULTRASOUND IMAGING TECHNIQUES
The program being developed, which simulates the ultrasound image of the SiteRite II, is a
significant part of the image guided surgery tool project. In this section selected ultrasound imaging
methods will be presented upon which the algorithms of the program are based.
4.1. A-scan
The principle of A-scan imaging is to hold the angle of the ultrasound beam fixed, receive echoes
from one direction only, and at the same time monitor the time elapsed between successive echoes.
The A-scan corresponds to one-dimensional depth scanning, showing the distance from the
transducer to boundaries of tissues having contrasting acoustic impedances.
The ultrasound beam produced by the transducer head propagates into the tissue and is partly
reflected whenever an interface between tissues of different acoustic impedance is encountered. The
reflections are then detected as they return to the transducer. The physically inevitable attenuation of
the ultrasound beam implies that the further the echo has traveled before returning to the transducer
head, the smaller the detected amplitude. As a consequence, tissue interfaces having the same
reflection coefficient, but located at different depths, should appear differently on the monitor, the
closer one having the higher amplitude. Regardless of depth, however, equally reflective interfaces are
presented with the same amplitude in an A-scan. This is achieved by using TGC (time
gain compensation) amplifiers 16, 20, which compensate for the attenuation so that long-distance
echoes are amplified more than close-range echoes. Equally reflective interfaces will therefore be
displayed with the same amplitude regardless of depth. The amplitudes of the reflections still
correspond to the acoustic properties of the encountered interfaces, though: an interface at which
the reflection is strong appears with large amplitude when performing an A-scan.
The TGC filters are integrated in the ultrasound equipment and are often preceded by
another amplification unit, the pre-amplification or overall gain. This step amplifies the electrical
signal from the transducer to a level suitable for the TGC filter. It is needed when the amplitude of
the reflections is low, which can be a consequence of using a transducer emitting low intensities.
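The TGC idea can be sketched numerically: echoes from deeper interfaces are attenuated more, and a depth-dependent gain restores equal display amplitudes. The absorption coefficient and the round-trip attenuation model below are illustrative assumptions, not taken from the SiteRite hardware.

```python
import math

ALPHA = 0.6  # nepers/cm, an assumed effective absorption coefficient

def detected_echo(reflectivity, depth_cm):
    """Echo intensity after the round trip to depth x and back (path length 2x)."""
    return reflectivity * math.exp(-2.0 * ALPHA * 2.0 * depth_cm)

def tgc(echo, depth_cm):
    """Time-gain compensation: amplify by the inverse round-trip attenuation."""
    return echo * math.exp(2.0 * ALPHA * 2.0 * depth_cm)

raw_near = detected_echo(0.05, 1.0)   # same reflectivity at 1 cm...
raw_far = detected_echo(0.05, 3.0)    # ...and at 3 cm: the far echo is much weaker
eq_near, eq_far = tgc(raw_near, 1.0), tgc(raw_far, 3.0)  # both restored to 0.05
```

After compensation, display amplitude depends only on the reflection coefficient, which is exactly the property the A-scan display relies on.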
Some ultrasound imaging applications transform the amplitudes provided by the A-scan into a
row of brightness-value pixels. A pixel related to a certain depth in the A-scan adopts the brightness
value matching the amplitude from the A-scan. Equally bright pixels then correspond to equally
reflective interfaces and can be plotted as a line in 2D space versus another variable. Most common
is to plot the line from the A-scan versus time (M-scan) or versus the orientation of the ultrasound
beam when performing a scanning motion. The latter is what the program simulates: the B-scan.
Figure 3. The A-scan: (a) reflections from tissue interfaces are (b) detected and time-gain-compensation
filtering is applied. This enables reflections from equally reflective interfaces to
be (c) displayed with equal amplitude.
4.2. B-scan
By stepping the angle of an A-scan and simultaneously superimposing the A-scans as a function of
spatial orientation, a fan-shaped 2-dimensional view of the area covered can be achieved. This is
referred to as B-scanning. The letter B originates from the word brightness, since before an A-scan is
superimposed, its amplitude values are transformed into the brightness values that build the
view. For each additional A-scan, the angle of the ultrasound beam is stepped up in such a way that
the sector to be B-scanned is ultimately covered.
One way of achieving a sector sweep is the phased array method. Here, Huygens' principle is
exploited: introducing a phase difference between adjacent transducer elements steers the plane wave
front, without the transducer having to be moved mechanically. For the SiteRite® II, the
imaging sector angle is 25º and the sector sweep is achieved mechanically.
Each B-scan pixel reflects, through its brightness value, the acoustic impedance information
delivered by the A-scan. Thus, by using B-scanning, a tomographic-type slice of the body is
visualised, with each pixel corresponding to the acoustic tissue characteristics of its spatial location in
the body (to be exact, the spatial location of its interface to the surrounding tissue).
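The superposition of angled A-scan lines into a fan-shaped image can be sketched as below. Only the 25-degree sector follows the SiteRite description; the image size, line count and nearest-pixel painting are illustrative choices.

```python
import numpy as np

# Illustrative fan assembly: each A-scan is a 1D brightness line at one beam
# angle, painted into a Cartesian image at its polar (depth, angle) positions.
def assemble_bscan(a_scans, angles_deg, img_size=128):
    img = np.zeros((img_size, img_size), dtype=np.uint8)
    apex_x = img_size // 2                    # transducer apex at top centre
    for line, ang in zip(a_scans, angles_deg):
        theta = np.radians(ang)
        for depth, brightness in enumerate(line):
            x = int(round(apex_x + depth * np.sin(theta)))
            y = int(round(depth * np.cos(theta)))
            if 0 <= x < img_size and 0 <= y < img_size:
                img[y, x] = brightness        # nearest-pixel painting
    return img

angles = np.linspace(-12.5, 12.5, 26)                     # 25-degree sector
scans = [np.full(100, 128, dtype=np.uint8) for _ in angles]
bscan = assemble_bscan(scans, angles)
```

A real implementation would interpolate between beams to fill the gaps that open up at depth; the nearest-pixel scheme is kept here for brevity.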
Figure 4. A B-scan is obtained by superimposing several A-scans taken from different angles. It
gives a tomographic slice image of the underlying tissue.
5. PHENOMENA INCORPORATED IN THE ULTRASOUND
MODEL
Parameters discussed in previous sections will serve to realize the calculations in a way that
represents reality as closely as possible. Knowledge of the values of parameters such as acoustic
impedance and attenuation coefficients lays the foundation for the ultrasound model. Therefore, in
order to correctly simulate B-scans, physical phenomena based on these parameters must be taken into
consideration. Some of them will be incorporated in the B-scan simulation program.
5.1. Transmission and Reflection
When encountering a medium with a different acoustic impedance (Z₂), a sound wave is partly reflected
back into the first medium (Z₁). The remaining part is transmitted into the next medium, with a loss in
intensity as a consequence. The reflection phenomenon is a prerequisite for ultrasound imaging.
The reflection coefficient (R) of the interface separating the two media shows how much of the
incident wave intensity will be reflected. The transmission coefficient (T) is, of course, closely related
to the reflection coefficient, and corresponds to the part of the wave intensity that
continues to propagate in the same direction as before the new medium was encountered. Equations
(11) and (12) are only valid for a plane ultrasound wave perpendicularly incident on a structure
providing a large flat interface. For an oblique incidence, or a rough or curved interface, the reflection
will be less than stated by the equations given here, and only part of the reflected ultrasound beam will
in this case reach the transducer to allow detection. Such a partial reflection does not accurately represent
the properties of the reflecting interface. The program will therefore treat every pixel
encountered as applicable to equations (11) and (12) in order to achieve accurate calculations.
R = ((Z₁ − Z₂)/(Z₁ + Z₂))²        Reflection coefficient for intensity    (11)

T = 1 − R        Transmission coefficient for intensity    (12)
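Evaluating equations (11) and (12) with the impedances of Table 1 shows why a tissue/bone interface echoes strongly and why any air contact ruins the image, as discussed in section 2.2.

```python
# Sketch of eqs. (11) and (12), using impedances from Table 1.
def reflection_coefficient(z1, z2):
    return ((z1 - z2) / (z1 + z2)) ** 2

def transmission_coefficient(z1, z2):
    return 1.0 - reflection_coefficient(z1, z2)

Z_FAT, Z_MUSCLE, Z_BONE, Z_AIR = 1.38e6, 1.70e6, 7.8e6, 0.0004e6

r_fat_muscle = reflection_coefficient(Z_FAT, Z_MUSCLE)    # ~1%: a weak echo
r_muscle_bone = reflection_coefficient(Z_MUSCLE, Z_BONE)  # ~41%: a strong echo
r_muscle_air = reflection_coefficient(Z_MUSCLE, Z_AIR)    # ~99.9%: near-total reflection
```

The near-total reflection at a tissue/air interface is the quantitative reason coupling gel is required between transducer and skin.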
5.2. Neither Scatter nor Speckle
If the wave front encounters objects much smaller than its wavelength, e.g. a blood corpuscle with a
diameter measuring 10 µm, it will be scattered in all directions 13, 18. Some of
the scattered sound will reach the transducer and hence enable the detection of such small structures.
The intensity reflected will, however, in this case be very small compared to that of sound reflected
against a large structure or interface. The backscattered ultrasound also gives rise to coherent
interference, which manifests itself in the ultrasound image as time-varying fluctuations called speckle
20, 23. This was once considered just noise, but is nowadays regarded with great interest; the
characteristics of the backscattered signal depend upon the density and size of the (small) structure
from which it came. Using signal analysis, it may be possible to classify different types of tissue based
on information extracted from the speckle phenomenon. The aim is to find a quantitative method for
this, extending attempts that have already been made to distinguish between healthy and diseased
tissue of the same type 24, 25 using this phenomenon.
In the ultrasound transport calculation model, speckle is not taken into account, because the pixel
(or voxel) will always be a great deal larger than one wavelength. The longest wavelength the SiteRite® II
is able to emit is 205 µm (1540 m/sec / 7.5 MHz), and the pixel width available is much larger.
Considering that a beam from the SiteRite transducer extends to a depth of 4 cm into the tissue and
that this corresponds to about 80 pixels, only structures with a diameter of no less than 500 µm (4.0 cm / 80
pixels) can be visualised in the images used. This sets the limit for detectable structures in the
image-based calculations, and thus the scattering phenomenon is not accounted for in the simulation
program.
6. METHODS
As the future goal is to build a real-time simulator of the SiteRite equipment, the algorithms
simulating the image guidance have to be fast. C++ is a possible choice, since it is a reasonably fast
language. A digital signal processor is also a suitable option to fulfill real-time requirements. For the
initial phase of the project of developing the simulator tool, IDL was chosen for the evaluation
of algorithms to realize image guidance and segmentation of image data. IDL has powerful features
for image and signal processing and was therefore considered a good first choice. Methods used to
realize the ultrasound transport calculations are presented in this section. The programming work
was performed on a standard Pentium III PC running Windows 98.
6.1. Building a Test Image
A test image was designed for the testing and evaluation of the algorithms. The model for tissue
properties was planned to be built from anatomically correct images having a black background (see
6.2). Therefore, black was chosen as the background in the test image. It was also evident that harder
tissues appear in the anatomically correct images as gray or white structures. For that reason, it
was assumed that white corresponds to bone-like tissue, which has the greatest acoustic
impedance in this context, and that darker colors represent softer tissue types with lower acoustic
impedance. Black fields were interpreted as cavities or blood vessels, since the model is built upon
photos taken from slices that had been drained of blood.
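The gray-value interpretation above can be sketched as a lookup from pixel value to acoustic impedance. The linear interpolation between the air and bone impedances of Table 1 is an illustrative assumption, not the thesis's actual calibrated mapping.

```python
# Hedged sketch: black (0) read as air/cavity, white (255) as bone, with a
# linear interpolation in between (an assumption, for illustration only).
Z_AIR, Z_BONE = 0.0004e6, 7.8e6  # kg m^-2 s^-1, from Table 1

def gray_to_impedance(gray):
    return Z_AIR + (Z_BONE - Z_AIR) * gray / 255.0

z_cavity = gray_to_impedance(0)    # black field: cavity or blood vessel
z_bone = gray_to_impedance(255)    # white: skull bone
z_mid = gray_to_impedance(128)     # mid gray: generic surrounding tissue
```

In practice the mapping need not be linear; what matters for the transport calculation is that each gray value yields a definite impedance, from which per-pixel transmission coefficients follow.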
Fields of different gray scale values were arranged in the test image to represent a set of
different anatomical structures. They would represent the gray scale ranged by the different tissue
types possible. To simulate surrounding tissue, a middle value (128) was chosen. The arrangement of
the fields in the test image allows an ultra sound beam to propagate through different media
constellations (fields of different gray scale values) from different positions. This helps in evaluating
the transport algorithms and in adjusting parameters like the length unit used in the attenuation
calculations. In IDL, black is implemented as 0 and white as 255. The gray scale values of IDL are
illustrated in the lower left corner of the test image, which is given in figure 5.
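The construction of such a test image can be sketched as follows (a Python/NumPy sketch rather than the original IDL code; the 256 x 256 image size and the field positions, sizes and gray values are illustrative assumptions):

```python
import numpy as np

def build_test_image(size=256):
    """Build a gray scale test image: black (0) background representing air,
    with rectangular fields of different gray values representing tissues."""
    img = np.zeros((size, size), dtype=np.uint8)   # black background = air
    img[40:80, 40:200] = 128     # mid gray: surrounding soft tissue
    img[100:140, 40:100] = 255   # white: bone-like tissue
    img[100:140, 140:200] = 192  # light gray: harder soft tissue
    img[160:200, 40:100] = 0     # black: cavity / drained blood vessel
    return img

image = build_test_image()
```

A beam traversing such an image from different positions then encounters different constellations of gray scale fields, which is what the evaluation of the transport algorithms requires.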
Simulation of B-scan Imaging
Using Linear Acoustics and Photographic Image Data
Mathias Nygård
21
Figure 5. Gray scale test image. The
black background with pixel values equal
to zero represents air. Fields of different
gray values up to 255 for white were
placed in the picture to simulate different
tissues.
6.2. Anatomically Correct Tissue Model For Evaluation
Joseph Paul Jernigan, a 39-year-old man convicted of murder, was executed in Texas in 1993, having donated his body to science. The same year, his body was transversally sliced and optically photographed in the visible human project [10], after first being imaged with CT and MRI.
Image data from the visible human project was downloaded from the Internet [10] to form a database of anatomically correct images for the image guided surgery tool project. This way, the output of the program could later be compared to the real images given by the SiteRite. The transverse slices selected were given as 2048 x 1216 pixel 24-bit color images of the cervical region (neck). This region was chosen because the SiteRite primarily finds its use in helping the surgeon to, for example, locate the jugular veins, which run longitudinally through this region. Each pixel
corresponds to an anatomical data resolution of 1/3 mm. In order to form a 3D voxel model of the
cervical region the 2D images have to be axially aligned and superimposed. This has not yet been
implemented but will be in the future. At this point, the program only uses 2D images. The image of
the transaxial slice used for input to the program is given in appendix A.
[Figure 6 orientation labels: posterior (y-axis) / anterior, left (x-axis) / right.]
Figure 6. A 3D tissue model will be built from visible human project 2D image data. Currently, only the uppermost 2D image is used for algorithm evaluation purposes. It shows a transverse slice of the cervical region, viewed from the head, and is given in appendix A.
6.3. Extracting Acoustic Impedance and Attenuation Coefficients From Gray Scale Image Data
The pixels comprising an image may correspond to different properties of the imaged tissue. For an image that originates from MRI, each pixel value corresponds to the density of hydrogen in the tissue. For CT, each pixel reflects the X-ray absorption properties of the tissue. Here, however, producing simulated B-scans, the program is only concerned with intensity calculations, and the only tissue properties of interest are the acoustic impedance and the attenuation coefficient. A delicate and intractable problem is to segment the image data, i.e. to map its gray scale values to corresponding tissue properties. This is not the main concern of this thesis, but has proven to be of great importance to the resulting images of the program.
Initially, the acoustic impedance and attenuation coefficient information will be gathered through gray scale evaluation of the image data using interpolated transform table values. In a later step, the color properties of the image will be taken into consideration to achieve more accurate segmentation. At this point, only the gray scale information is utilized: when an image is imported, it is directly transformed into an 8-bit gray scale representation.
Researchers have put effort into explaining how different forms of the same tissue can exhibit different acoustic impedances. Measurements of the acoustic impedance of tissue of the same type but of different form (grained and chunks), of fresh and almost fresh tissue, have been done with various results, none of them clearly revealing why the acoustic impedance differs between different forms of the same tissue type [13]. Based on literature studies, it seemed impossible to find a way to relate a gray scale pixel value to only one tissue type. Tissue property coefficients, collected from tables [2, 4, 16], were therefore first linearly mapped to a corresponding gray scale value so that the relation and
distance between tissue values were preserved. Air was mapped to a gray scale value near black (0) and skull bone was mapped to a gray scale value near white (255). A simple and straightforward method was then chosen to relate intermediate gray scale values to the intermediate tissue property values. The pixel value domain was mapped to the acoustic impedance value domain using the piecewise linear model shown in appendix B. Every known value for the acoustic impedance and attenuation coefficient was plotted versus the gray scale ranging from 0 to 255. A homogeneous region, as seen by the human eye, corresponds to one type of tissue in an image. The region surrounding each known value was therefore made constant, with the purpose of capturing fluctuations in gray scale that are expected within a tissue type. This suppresses reflections from within nearly homogeneous regions with respect to gray scale values and hopefully provides a better simulation of ultrasound imaging. The mapping of several gray scale values to one acoustic impedance value can also be considered a first step towards segmenting the gray scale values into different tissue types. A separate procedure was written to facilitate the mapping of gray scale values to acoustic impedance and to allow other mapping methods to be easily incorporated in the program.
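The piecewise linear mapping with constant plateaus around each known tissue value can be sketched as follows (a Python sketch; the breakpoint gray scale values and impedance values below are illustrative assumptions, not the actual table values of appendix B):

```python
import numpy as np

# Illustrative breakpoints: (gray scale value, acoustic impedance in MRayl).
# Pairs of points with equal impedance form the constant plateaus that
# absorb gray scale fluctuations within one tissue type.
BREAKPOINTS = [
    (0,   0.0004), (10,  0.0004),   # air / drained vessels
    (60,  1.38),   (90,  1.38),     # fat (assumed value)
    (110, 1.70),   (150, 1.70),     # muscle (assumed value)
    (240, 7.8),    (255, 7.8),      # bone (assumed value)
]

def gray_to_impedance(gray):
    """Map an 8-bit gray scale value to acoustic impedance by piecewise
    linear interpolation between the table breakpoints."""
    xs, ys = zip(*BREAKPOINTS)
    return float(np.interp(gray, xs, ys))
```

Two gray values falling on the same plateau map to the same impedance and thus produce no reflection at their interface, which is the intended suppression of intra-tissue fluctuations.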
6.4. Superimposing Several A-Scans to Build The B-scan
With the real-world scanning procedure as a model, in which the ultrasound beam sweeps over a sector while the received echoes are plotted versus the spatial orientation of the beam, the program was designed to do the same: while stepping the angle of propagation of a simulated ultrasound beam in the tissue model images, several A-scans are calculated and superimposed to cover a sector like the one covered by a real B-scanning procedure.
The first step in performing the ultrasound transport simulation is to extract a straight line from
a plane sliced from the tissue image model. The line corresponds to the A-scan. It extends from the
transducer’s position to the B-scan sector’s front. The pixels comprising it will then correspond to
the tissue encountered by the propagating ultrasound beam. The intensity of one of the endpoint pixels of the line, the one corresponding to the transducer, was set to an empirically derived value corresponding to the real ultrasound intensity deposited in the tissue. Altering the intensity parameter has the same effect as altering the pre-amplification or overall gain of the real ultrasound equipment, a measure that has to be considered in, for example, obstetrical ultrasound, where the power deposited by the transducer is low. The pre-amplification gain is raised to a level appropriate for the subsequent TGC-amplification to compensate for the low signal amplitude that returns to the transducer.
Starting at the pixel adjacent to the transducer, the A-scan line was stepped through pixel by pixel while simultaneously calculating the transport of the ultrasound beam. Two adjacent pixels in the line may represent two different media and can consequently form an interface where a reflection of the simulated beam can take place. The ultrasound beam attenuation, reflection and returned intensity were calculated for every interface in the line using equations (10), (11) and (12). When speaking in terms of A-scans, it should be mentioned that the intensity amplitudes returned and detected by the simulated transducer are never explicitly plotted, but instead directly introduced in the B-scan as brightness values.
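Since equations (10)-(12) are not reproduced in this section, the per-pixel transport calculation is sketched below using the standard linear acoustics expressions they are assumed to correspond to: the intensity reflection coefficient R = ((Z2 - Z1)/(Z2 + Z1))^2, the transmission coefficient T = 1 - R, and exponential attenuation over each pixel length. Multiple reflections are ignored as a simplifying assumption (a Python sketch, not the original IDL code):

```python
import math

def ascan(impedances, attenuations, pixel_len=1.0, i0=1.0):
    """Simulate an A-scan along a line of pixels.

    impedances   -- acoustic impedance per pixel along the line
    attenuations -- attenuation coefficient per pixel (1/length unit)
    Returns the intensity echoed back to the transducer from each pixel
    interface (one-way attenuation applied twice for the round trip)."""
    echoes = [0.0] * len(impedances)
    forward = i0      # intensity still propagating away from the transducer
    path_att = 1.0    # accumulated one-way attenuation along the path
    for k in range(len(impedances) - 1):
        a = math.exp(-attenuations[k] * pixel_len)  # attenuation in pixel k
        forward *= a
        path_att *= a
        z1, z2 = impedances[k], impedances[k + 1]
        if z1 + z2 == 0.0:
            continue  # guard against two zero-impedance pixels
        r = ((z2 - z1) / (z1 + z2)) ** 2  # intensity reflection coefficient
        # the echo is attenuated again on its way back to the transducer
        echoes[k + 1] = forward * r * path_att
        forward *= (1.0 - r)              # transmitted intensity
    return echoes
```

An interface between two identical media yields R = 0 and hence no echo, while an interface towards air (Z near zero) yields R near 1, i.e. nearly total reflection, matching the behaviour described for black fields in the test image.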
The algorithms were first applied to the test model image for evaluation purposes. The intensity and beam length of the simulated transducer were calibrated to coincide with the real images obtained from the SiteRite. The next step was to form a B-scan from a set of A-scans. By stepping the angle for every new A-scan line, a fan-shaped sector imitating the real B-scan form was built. The intensity values returned from every A-scan calculation were superimposed and plotted versus spatial orientation. When the B-scan's sector angle has been spanned by A-scans, the resulting image shows the simulated B-scan.
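The assembly of a B-scan from angle-stepped rays can be sketched as follows (a Python/NumPy sketch; for brevity each ray here records the sampled gray values directly as brightness, whereas the actual program runs the transport calculation along every ray):

```python
import math
import numpy as np

def sample_ray(img, origin, angle, length):
    """Nearest-neighbour sampling of the pixels along one ray
    (the straight line corresponding to a single A-scan)."""
    x0, y0 = origin
    values = []
    for s in range(length):
        x = int(round(x0 + s * math.cos(angle)))
        y = int(round(y0 + s * math.sin(angle)))
        inside = 0 <= y < img.shape[0] and 0 <= x < img.shape[1]
        values.append(int(img[y, x]) if inside else 0)
    return values

def bscan(img, origin, start_angle, stop_angle, n_rays, length):
    """Step the ray angle across the sector and superimpose the rays,
    plotting brightness versus the spatial orientation of each ray."""
    out = np.zeros_like(img)
    for i in range(n_rays):
        t = i / max(n_rays - 1, 1)
        angle = start_angle + (stop_angle - start_angle) * t
        for s, v in enumerate(sample_ray(img, origin, angle, length)):
            x = int(round(origin[0] + s * math.cos(angle)))
            y = int(round(origin[1] + s * math.sin(angle)))
            if 0 <= y < out.shape[0] and 0 <= x < out.shape[1]:
                out[y, x] = v
    return out
```

Replacing the sampled gray values with the echoes returned by the transport calculation for each ray gives the fan-shaped simulated B-scan described above.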
7. RESULT AND DISCUSSION
Once the program was implemented and a test environment was designed, the subsequent work consisted of algorithm evaluation and assessment of the resulting B-scan images. The disadvantage of having used an overly simplistic segmentation technique, i.e. the transform tables, will be discussed.
7.1. B-scan Algorithm Applied To The Tissue Test Model
Results from the test image evaluation of a single A-scan were satisfying and clearly showed the attenuation of the ultrasound beam as well as the different reflections from pixels having different acoustic impedance. Several A-scan lines were subsequently superimposed to form a B-scan. The result of the B-scans applied to the test tissue model can be seen in appendix D. Both the attenuation within each pixel and the loss due to pixel-to-pixel transitions cause the dimming of the simulated ultrasound beam's intensity. In order to clearly visualise this, no TGC-filter was applied when scanning the tissue test model. As anticipated, interfaces with great reflection coefficients reflect more than interfaces with small reflection coefficients. For example, the upper right reflections in scan no. 2 (appendix D) are greater than the upper left reflections in the same scan because of the difference in gray scale value between the simulated tissue types encountered.
7.2. B-scan Algorithm Applied To A Gray Scale Image of The Cervical Region
The transducer was positioned to cover regions representing different anatomical structures and gray scale values. Three regions were chosen for evaluation. The position and result of each simulated B-scan is given in appendix E. The lateral scan covers a thin layer of fat and muscle tissue. The posterior scan spans the same tissue types, but the fat layer is much thicker. The last scan was taken from a frontal position and covers a part of the trachea and the tissue anterior to it. The last of the simulated A-scans performed in each B-scan is marked with a brighter shade to illustrate the direction of the scanning motion and the orientation of the result. This is done in the tissue model image as well as in the resulting B-scan images.
The trachea appears in a white shade that corresponds to a bone-like structure according to the simple transform table used (appendix B). The trachea consists of cartilage tissue, which is not as hard as bone and consequently does not have the great acoustic impedance of bone. It is, however, much more rigid than the surrounding soft tissue, and the ultrasound beam should, as shown, be reflected here. The contour of the trachea can hence be distinguished in the lower left corner of the right-most B-scan shown in appendix E.
The large black region closer to the transducer in the same B-scan corresponds to the darker muscle portion in the tissue model (cricothyroid muscle) and is more apparent than the contour of the trachea. This is, of course, a consequence of the greater difference in gray scale value towards its surroundings, which causes a greater difference in acoustic impedance according to the transform table. This points out something that has become evident during the evaluation of the simulated ultrasound images: great amounts of reflection, or even total reflection, appear more often than in
real ultrasound imaging. One contributing factor is that the tissue model constitutes a discrete interpretation of reality. Inherently, there are many more pixel interfaces at which the beam can be reflected. The left and middle B-scans in appendix E show how muscle/fat interfaces can appear. The overall appearance nevertheless resembles the real image, and it should be possible to use the algorithm for simulation purposes in the image guided surgery tool project. The transform tables should be empirically adjusted and a concerted effort should be put into segmenting the pixel data to get more correct reflection coefficients. Artifacts can often be related to the simple segmentation technique used.
Artifacts related to the reflection phenomena, and consequently also to the inferior segmentation technique used, can be seen in appendix F. A real B-scan image taken from a SiteRite unit and a simulated B-scan calculated for roughly the same tissue regions are presented. When comparing these images it must be kept in mind that they are compiled from imaging data of different human beings. As a consequence, the anatomical structures and their spatial relations are not identical. The blood vessels are also drained in the input image for the computed version, which is not the case for the real image. Both images show roughly the same portion of the internal jugular vein and the carotid artery region. When viewing the tissue model image, pixel domains representing blood vessels appear in a relatively dark shade (gray scale values of 10-30) and are not homogeneous. A deeper analysis shows a salt and pepper-like spread of pixels with even darker shades throughout the jugular vein portion of the image. This also appears to some extent in the region representing the carotid artery. In reality, no relevant structure exists inside these vessels, nor are the pixels with different gray scale values clearly visible to the eye, due to the small difference in relation to neighbouring pixels. When consulting the transform table for acoustic impedance, it turns out that the slope is significant for dark gray scale values between 10 and 30. A difference in acoustic impedance results, which allows the calculated ultrasound beam to be falsely reflected in regions that are more or less homogeneous in reality. Accordingly, the boundaries outlining the jugular vein and carotid artery in the simulated B-scan are partly hidden in pixels representing these reflections. Using a better technique for segmentation of the tissue model images is a way to solve this problem.
Another phenomenon can be seen in the simulated image in appendix F: the shadow phenomenon. Every blood vessel is interpreted as air, since they have been drained and are represented in black. They may therefore constitute a very reflective surface towards neighbouring structures if the preceding pixels in the ultrasound beam's pathway have a distinctly whiter shade. A shadow can be cast where the ultrasound beam passes through a gray region followed by a much blacker one. In comparison, the images in appendix E do not contain regions presenting differences in gray scale great enough for distinct shadows to be cast.
7.3. Filtered Gray Scale Images of The Cervical Region For Future Work
Filters were applied to the original gray scale image to prepare for future work that may concern pre-filtering of images. Median, smoothing and histogram filters were used to illustrate the possibilities of filtering.

Median and smoothing filters aim at removing or decreasing the salt and pepper-like noise in homogeneous domains in the image. A smaller number of unwanted reflections from anatomical structures that in reality are homogeneous can be achieved by using these filters. Histogram filtering improves the contrast in an image. For information on how these filters work, see [19] or the IDL programming manual, where these filters are presented in detail. They are often used in image processing and should be familiar to anyone active in the field.
8. CONCLUSION
Little effort was required to learn the IDL programming language, since the author has previously worked with Matlab. In spite of network problems, delays in software shipments, etc., the program was completed successfully and in due time. As a consequence of the simple segmentation technique, some images produced with the program have artifacts. The overall appearance of the simulated B-scan images is nevertheless good compared with real ultrasound images produced by the SiteRite unit.
Reasonably homogeneous structures with edges clearly distinguishable from the surroundings produce a good result and become very visible in a simulated B-scan. The resemblance to a real ultrasound image is evident, and the texture revealing underlying tissue structure is sometimes more easily identified than in real ultrasound imaging. Future work will have to decide whether this is good or bad for practical purposes, since it can give a false impression of reality. An example of a clearly visible structure is the cricothyroid muscle, which appears as a gray, nearly homogeneous domain of pixels embedded in an almost white surrounding (cartilage of larynx). This can be seen in the lower right simulated B-scan image in appendix E. Other calculated B-scans of structures differing considerably in gray scale from their surroundings can be seen in the same appendix. Shadowing phenomena can appear if the difference in gray scale value between different structures is great enough. This can be seen in the simulated B-scan in appendix F.
Some artifacts have been encountered during the assessment of the simulated images. Randomly distributed noise in domains that in reality are homogeneous can cause unwanted reflections in a simulated B-scan. If present, reflections of noise partially speckle the B-scan and may make anatomical structures indistinguishable from surrounding pixel domains representing other tissue types. Consequently, some anatomical structures can be hard to distinguish in the simulated B-scan. The idea of interpreting the reflections as speckle noise came up but was discarded, since real speckle noise originates from very small, but still existing, structures. The salt and pepper-like noise mostly appears inside homogeneous regions that should, at this point, stay homogeneous. The problems causing the artifacts have been localized to the transform tables, which indirectly serve as the technique for segmentation of the tissue model images. At this point, the transform table for acoustic impedance is the source of improper values causing unwanted reflections. Pixels in the lower part of the gray scale are subject to this phenomenon, since a steep slope in the table induces relatively large reflection coefficients for a small difference in gray scale. The naked eye can have difficulties distinguishing between pixels having just a small difference in gray scale, especially for low values. A dark pixel domain differing only little in gray scale from its surroundings appears smooth and homogeneous in the tissue model image, but will be speckled with reflections in the simulated B-scan because of the simple segmentation technique used. This can be seen when imaging the jugular vein in appendix F, where a dark field shall be distinguished from a relatively dark background.
The segmentation process has to be improved in order to achieve more accurate simulated B-scans. A more accurate segmentation procedure will enable the program to be used with fewer artifacts and for all types of different constellations of pixel tissue representation. For example, using color information would make it easier to distinguish between different tissue types, and hence produce more accurate reflection coefficients in regions where they are needed. This is suggested as suitable future work. Supplementary program modules for simulation of power line hum, patient motion and speckle noise are also suggested for future work.
APPENDICES
A. TISSUE MODEL OF THE CERVICAL REGION
B. TRANSFORM TABLES USED
C. TGC-FILTERS
D. ALGORITHM APPLIED TO TEST IMAGE
E. EXAMPLES OF SIMULATED B-SCAN IMAGES
F. ARTIFACT EXAMPLE IN JUGULAR B-SCAN
G. MEDIAN FILTERED TISSUE MODEL
H. SMOOTH FILTERED TISSUE MODEL
I. HISTOGRAM FILTERED TISSUE MODEL
J. LIST OF ABBREVIATIONS AND ENGLISH-SWEDISH TRANSLATIONS OF KEYWORDS
K. PROGRAM MANUAL
L. PROGRAM CODE
M. REFERENCES
A. TISSUE MODEL OF THE CERVICAL REGION
The image guidance provided by the SiteRite can be used to monitor medical operations in the
cervical region. Such operations often concern the jugular vein. A photo of a transaxial slice of the
neck has been taken from the visible human project data to serve as an anatomically correct tissue
model for the program (upper image). It is viewed from above. The photos were taken from a dead
person and some blood vessels have therefore collapsed because of low pressure.
[Figure: transaxial cervical slice viewed from the head, with sagittal and frontal views. Labeled structures: trapezoid muscles, esophagus, sternocleidomastoid muscle, thyroid gland, sternothyroid muscle, internal jugular vein, carotid artery, cartilage of larynx, trachea, cricothyroid muscle. Orientation: posterior / anterior, right / left.]
B. TRANSFORM TABLES USED
The piecewise linear mapping of pixel gray scale values to acoustic impedance is shown in (a). The mapping of gray scale to attenuation coefficients is given in (b). The acoustic impedance corresponding to a gray scale value is used to calculate the transmission coefficient for adjacent pixels. The attenuation coefficient is used for the attenuation calculation of the beam within each tissue type (pixel). The flat portions of the curves aim at giving the same acoustic properties to pixels within visible anatomical structures defined by more or less homogeneous regions. It was anticipated that a whiter shade of a pixel corresponds to a harder and more rigid anatomical structure. The fact that hard and rigid materials often have greater acoustic impedance than soft and flexible materials became the guideline when implementing the Z coefficient transform table. Hence, the whitest shade was mapped to the highest acoustic impedance and the darkest shade to the lowest acoustic impedance, respectively. The attenuation coefficient transform table was implemented with the acoustic impedance table as a model. No physical aspects were taken into consideration here, just a consistent way of mapping.

(a) Transform table used for mapping gray scale to acoustic impedance.
(b) Transform table used for mapping gray scale to attenuation coefficient.
C. TGC-FILTERS
A TGC-filter (Time Gain Compensation) has been programmed to increase the brightness throughout the A-scans. It is employed to compensate for the loss in intensity due to transmission effects between pixels and attenuation within pixels caused by the program's ultrasound transport algorithm. In reality, the ultrasound beam is attenuated exponentially and a filter resembling (b) is used. Filter (b) was used at first, but was replaced by (a) because this filter produced more evenly amplified images in terms of brightness and will be easier to design in future work, which may involve digital signal processor programming. The amplification is performed by multiplying each brightness value in the A-scan, subscripted by the x-axis values below, with the corresponding y-axis value in the filter.

(a) TGC-filter currently used.
(b) An alternative TGC-filter resembling the filter used in real equipment.
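The amplification step, multiplying each brightness value by the corresponding filter value, can be sketched as follows (a Python sketch; the linear ramp stands in for filter (a), and the gain values are illustrative assumptions):

```python
def tgc_linear(n, max_gain=4.0):
    """Linear TGC ramp: gain 1.0 at the transducer, max_gain at depth n-1."""
    return [1.0 + (max_gain - 1.0) * i / (n - 1) for i in range(n)]

def apply_tgc(ascan_brightness, max_gain=4.0):
    """Multiply each A-scan brightness value by the filter value at its
    depth, clipping to the 8-bit range used for gray scale images."""
    gains = tgc_linear(len(ascan_brightness), max_gain)
    return [min(255, int(b * g)) for b, g in zip(ascan_brightness, gains)]
```

An exponential ramp in place of the linear one would correspond to filter (b), which mimics the exponential attenuation of the real beam.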
D. ALGORITHM APPLIED TO TEST IMAGE
The B-scan algorithm was applied to a test image as shown in (a). The resulting B-scans are shown in (b). The purpose of testing the algorithm was to evaluate and adjust parameters, such as the simulated intensity, in order to later get a better result from real images. The purpose was also to see how interfaces with different reflection coefficients would appear. No TGC-filter was used for this imaging session, to make it easier to resolve different gray scale values in the result.

According to the gray scale to acoustic impedance transform table in appendix B and equation (11), the reflection coefficients for the interfaces parting the tissues (gray scale values 255, 192 and 0) from the surrounding tissue (128) are 0.185, 0.167 and 1.0 respectively. Tissue fields no. 1 and 2 let through enough intensity for the beam to be reflected against the upper row of test tissue and be detected again by the transducer.
(a) Test image domain with simulated tissue fields (1)-(3).
(b) Resulting B-scans (1)-(3).
E. EXAMPLES OF SIMULATED B-SCAN IMAGES
The program converts image data into gray scale before processing. Here, the results of three B-scans of tissue regions holding different anatomical structures and gray scale combinations are shown. The brighter edge of the B-scan contours in (a) corresponds to the brighter (left) edge of each resulting B-scan shown in (b).
(a) Anatomically correct tissue model with scan positions (1)-(3); orientation: posterior (y-axis) / anterior, left (x-axis) / right.
(b) Resulting B-scans taken from lateral (1), posterior (2) and anterior (3) positions.
F. ARTIFACT EXAMPLE IN JUGULAR B-SCAN
The problem with unwanted reflections from non-homogeneous pixel clusters in the lower region of the gray scale is visualised. The jugular vein region in the tissue model appears smooth and homogeneous, but the resulting B-scan contains reflections. The jugular vein (and carotid artery) portion of the image has become more or less indistinguishable. This is a consequence of the inadequate segmentation technique used. The imaging is also done on different persons, of whom one was alive. In the case of the dead person, low pressure may have caused anatomical errors such as collapsed blood vessels. Shadows are cast from the internal jugular vein and the carotid artery in the simulated B-scan image.
(a) Anatomically correct tissue model with the internal jugular vein, carotid artery and B-scan position marked; orientation: posterior (y-axis) / anterior, left (x-axis) / right.
(b) Real B-scan image and simulated B-scan image, each with the internal jugular vein and carotid artery marked.
G. MEDIAN FILTERED TISSUE MODEL
This appendix is preparatory for future work and outlines how salt and pepper-like noise can be removed by using a median filter. The result of the median filter is an image with more homogeneous pixel fields. Each pixel in the original image is given the median gray scale value of the pixels in its neighbourhood. Here, a 5 by 5 pixel square neighbourhood has been used to calculate the median value for each pixel.
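The 5 by 5 median filtering can be sketched as follows (a Python/NumPy sketch; the original work used IDL's built-in median routine):

```python
import numpy as np

def median_filter_5x5(img):
    """Give each pixel the median gray value of its 5 by 5 neighbourhood
    (edges handled by clipping the window to the image border)."""
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            # window clipped to the image: rows y-2..y+2, columns x-2..x+2
            win = img[max(0, y - 2):y + 3, max(0, x - 2):x + 3]
            out[y, x] = np.median(win)
    return out
```

An isolated dark noise pixel inside a homogeneous region is replaced by the dominant surrounding value, which is precisely the behaviour that removes the unwanted reflections discussed in section 7.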
(a) Median filtered tissue model with scan positions (1)-(3); orientation: posterior (y-axis) / anterior, left (x-axis) / right.
(b) Resulting B-scans taken from lateral (1), posterior (2) and anterior (3) positions.
H. SMOOTH FILTERED TISSUE MODEL
The result of smooth filtering an image is presented in this appendix to facilitate future work. The filter operates by giving each pixel in an image the average gray scale value of its neighbourhood. A 5 by 5 pixel neighbourhood has been used. The operation smoothens the gray scale structures of an image and softens sharp edges.
(a) Smooth filtered tissue model with scan positions (1)-(3); orientation: posterior (y-axis) / anterior, left (x-axis) / right.
(b) Resulting B-scans taken from lateral (1), posterior (2) and anterior (3) positions.
I. HISTOGRAM FILTERED TISSUE MODEL
The histogram filter is a plausible means for improving the segmentation technique. It is used for contrast enhancement and visualizes structures that are not clearly visible. Small variations in gray scale that can be hard to see in the original image are augmented by using the histogram of the image for the mapping of new gray scale values. For more information, see [19].
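Histogram-based contrast enhancement of this kind can be sketched as classic histogram equalization (a Python sketch of the general technique; the IDL routine actually used may differ in detail):

```python
import numpy as np

def histogram_equalize(img):
    """Remap gray values using the image's cumulative histogram so that
    the output gray levels are spread more evenly over 0..255."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    # normalize the CDF to 0..255, ignoring empty leading bins
    cdf_min = cdf[cdf > 0][0]
    lut = np.round((cdf - cdf_min) / max(cdf[-1] - cdf_min, 1) * 255)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[img]  # apply the lookup table to every pixel
```

Gray values that occupy only a narrow band in the original image are spread over the full range, making small variations visible.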
(a) Histogram filtered tissue model with scan positions (1)-(3); orientation: posterior (y-axis) / anterior, left (x-axis) / right.
(b) Resulting B-scans taken from lateral (1), posterior (2) and anterior (3) positions.
J. LIST OF ABBREVIATIONS AND ENGLISH-SWEDISH TRANSLATIONS OF KEYWORDS
Abbreviation  English                                 Swedish
CT            Computed Tomography                     Datortomografi
IDL           Interactive Data Language               —
IGS           Image Guided Surgery                    Bildledd kirurgi, kirurgi med bildvägledning
JV, the       Jugular Vein                            Jugularvenen
MR            Magnetic Resonance                      Magnetresonans
MRI           Magnetic Resonance Imaging, also        Magnetresonansbildgivning, magnetresonanskamera,
              referred to as MRT for Magnetic         även kallat MRT för magnetresonanstomografi
              Resonance Tomography
Transaxial    With reference to the horizontal/       Avser horisontalplanet/transversalplanet
              transversal plane
Transversal   With reference to the horizontal/       Avser horisontalplanet/transversalplanet
              transversal plane
UF            University of Florida                   —
US            Ultrasound                              Ultraljud
Z             Acoustic impedance                      Akustisk impedans

English                           Swedish
Adjacent                          Angränsande
Assess                            Bedöma, uppskatta
Attenuation coefficient           Dämpningskoefficient
Aneurysm                          Aneurysm, åderbrock
Cartilage tissue                  Broskvävnad
Cervical region, neck region      Halsregionen
Esophagus, the gullet             Esofagus, matstrupen
Gland                             Körtel
Imaging                           Bildgivning
Larynx, region of                 Struphuvudsregionen
Percutaneous, through the skin    Percutant, genom huden
Pixel, 2D surface element         Pixel, 2-dimensionellt ytelement
Tissue                            Vävnad
Trachea, the windpipe             Trakea, luftstrupe
Thyroid gland                     Sköldkörtel
Vascular                          Vaskulär, avser blodkärl
Voxel, 3D volume element          Voxel, 3-dimensionellt volymelement
K. PROGRAM MANUAL
A set of functions and procedures has been written to simulate ultrasound transport and B-scan imaging. This brief manual serves as an overview and explains how to use the different sub-routines. The programming language used is IDL and the code written can be found in c:\rsi\idl54\myprogs. To make this the current folder for IDL, just click at the prompt and type cd, 'c:\rsi\idl54\myprogs\' <enter> or cdmp <enter>. This is a requirement for the program modules to run correctly. The latter command runs a program stored under c:\rsi\idl54 containing the code for changing directories. Once in the right directory, the code for ultrasound simulation can be run. A batch file has been written to show the functionality of the different program modules. Its name is model and it can be run by typing @model <enter> at the prompt. Give the command and open the file model to look at the code. It will guide you through the rest of this manual, and the examples given can be found in this file. The available program modules are listed in Table 3.
Routine      Description
Model        Batch file to operate the test environment used in imaging. It
             contains the transducer's initial settings such as frequency etc.
Initmodel    Reads image data to build an anatomically correct tissue model,
             which can be used by other routines.
Position     Positions the transducer within the tissue model.
Rotate       Rotates the transducer within the tissue model.
Bscan        Performs the simulated B-scan calculation on the tissue model
             and plots the result.
Ascan        Performs an A-scan calculation and transforms the resulting
             detected reflections into brightness values.
Transport    Calculates the transport of the ultrasound beam for the A-scan
             function: transmission and attenuation throughout, and
             reflection from, each pixel scanned by an A-scan.
Look_up      Maps tissue properties to gray scale values. Segmentation.
TGC          Amplifies the brightness of distant pixels in an A-scan.
             Imitates the Time Gain Compensation filter used in real
             equipment.
Rejection    Sets gray scale values of an A-scan to zero if they are below a
             threshold. Eliminates less reflective structures and imitates
             the Rejection filter used in real equipment.

Table 3. Program code developed in the framework of the simulated B-scan imaging project.
Program Modules and Their Functionality
The program uses an image of a transaxial slice of the cervical region as tissue model. It is converted into 8-bit gray scale before processing and will later be replaced by a whole 3D model. This image (240axial.jpg) and others can be found in c:\rsi\idl54\myprogs\images\. Resulting B-scan images can also be found in this folder.
Loading the Tissue Model
Initmodel is the interface for loading different image data. It reads only jpeg images, converts them to an 8-bit gray scale representation and returns a pointer to the array representing the image.
Example:
Domain = Initmodel('imagename.jpg')
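READ_JPEG with the /GRAYSCALE keyword performs the color-to-gray conversion inside Initmodel. Conceptually this is a weighted luminance sum over the RGB channels; a minimal Python sketch under that assumption (the common ITU-R 601 weights are used here for illustration, not necessarily the exact weights IDL applies):

```python
import numpy as np

def to_gray8(rgb):
    """Convert an (H, W, 3) RGB array to 8-bit gray scale using
    the common luminance weights (an assumption in this sketch)."""
    weights = np.array([0.299, 0.587, 0.114])
    gray = rgb[..., :3].astype(float) @ weights
    return np.clip(np.round(gray), 0, 255).astype(np.uint8)
```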
Positioning of The Transducer Head
The position procedure is used to position the transducer within the tissue model. This enables
imaging of different regions of interest and is helpful in assessing the segmentation technique used.
The rotate procedure can also be used to rotate the transducer to an arbitrary angle. Coordinates given to
the position procedure designate the location of the sharp edge of the B-scan sector.
Cartesian coordinates are used to position the transducer within the tissue model. Spherical
coordinates are used to orient it. Following department and general conventions, the x- and y-axes lie in a
transverse plane and the z-axis in the intersection of a sagittal and a frontal plane: The y-axis
represents a posterior direction, the y values increasing with a greater distance to the dorsal side of
the human body. The x-axis points in the left arm direction and is, of course, perpendicular to the
y-axis. The z-axis points in a cranial direction or upwards. The origin of an image is located in the lower left
corner.
Example: Position the transducer on (x,y) = (275, 0) and rotate it to an angle (phi) of π/2, which would be a frontal
scanning position in the y-direction facing the larynx.
position, td, [275, 0]
rotate, td, [!pi/2.]
The command 'rotate, td, [x, y]' would rotate the transducer to an x rad phi angle and a y rad
theta angle, but since only 2D transversally sliced images are used, theta will stay unchanged. Theta
has been implemented for future use only.
The coordinate parameter values given within brackets are added to the current angle or position if the
keyword inc is used with either the position or rotate procedure. The positioning can also be done
manually by giving the keyword mouse and then clicking the tissue model image with the mouse.
The transducer, or sharp edge of the B-scan sector, will then be placed at that location.
Example: Reposition the transducer by increasing the angle by π/12, decrease x by 7 pixels and increase y by 2 pixels.
Then keep the angle but position the transducer by using the mouse. Click in the tissue model window.
rotate, td, [!pi/12], /inc
position, td, [-7, 2], /inc
position, td, /mouse
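Internally, both procedures handle the absolute and incremental cases with a single expression, KEYWORD_SET(inc)*td.pos + pos: with inc unset the keyword evaluates to 0 and the given value replaces the position; with inc set it evaluates to 1 and the value is added. A small Python sketch of that design choice (the function name position here is illustrative, not the IDL procedure itself):

```python
import numpy as np

def position(current, given, inc=False):
    """Mirror of the IDL idiom KEYWORD_SET(inc)*td.pos + pos."""
    return int(inc) * np.asarray(current) + np.asarray(given)

absolute = position([275, 0], [330, 110])            # replaces the position
incremental = position([275, 0], [-7, 2], inc=True)  # adds to it
```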
Performing The B-scan
Calling the Bscan procedure causes the program to simulate a B-scan image. The appearance of the
B-scan sector can be altered by giving another value to the offset variable, which is found in the
Bscan procedure.
Example: Produce the simulated B-scan with the coordinates and angles given earlier for the given transducer type (td)
and do it on the tissue model (domain).
Bscan, td, domain
The result of the B-scan and the tissue model used can be saved to file by giving a filename
parameter. It will be saved in jpeg-format under c:\rsi\idl54\myprogs\images\.
Example: Perform B-scan and save the result along with the tissue domain used in two separate files named
B_scan_anterior and D_org.
Bscan, td, domain, Bscan_file = 'B_scan_anterior', Domain_file = 'D_org’
Ultrasound Transport Routines
Ascan, Transport, Look_up, TGC and Rejection are routines directly or indirectly called by the Bscan
procedure to carry out the ultrasound transport calculations. They work behind the interface
provided by Bscan. Hence, the user never needs to confront these functions. Appendix L contains
the code for all of the functions presented in this manual.
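The physics these routines implement is standard linear acoustics: at each pixel interface the intensity transmission coefficient is T = 1 - ((Z2 - Z1)/(Z2 + Z1))^2, and within each pixel the beam loses a·sp dB to attenuation, where a is the tissue's attenuation coefficient in dB/cm and sp the step size in cm. A minimal Python sketch of the forward (transmit) pass under these formulas; the impedance and attenuation numbers are illustrative, not the program's look-up table:

```python
import numpy as np

def forward_pass(z, a, sp=0.01, I0_dB=30.0):
    """Beam intensity in dB after each pixel along one A-scan.

    z : acoustic impedance per pixel [rayl]
    a : attenuation coefficient per pixel [dB/cm]
    sp: step size per pixel [cm]
    """
    I_dB = np.empty(len(z))
    current = I0_dB
    for i in range(len(z)):
        if i > 0:  # transmission loss at the interface into pixel i
            R = ((z[i] - z[i - 1]) / (z[i] + z[i - 1])) ** 2
            current += 10 * np.log10(max(1.0 - R, 1e-30))
        current -= a[i] * sp  # attenuation loss inside pixel i
        I_dB[i] = current
    return I_dB

# toy muscle / fat / muscle sequence
z = np.array([1.70e6, 1.38e6, 1.70e6])
a = np.array([4.0, 3.2, 4.0])
I = forward_pass(z, a)
```

Transport performs this pass in a dB scale as well, then follows each reflected wave back toward the transducer and scales the detected intensities into gray values.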
L. PROGRAM CODE
The test environment batch file (model) and routines comprising the simulated B-scan program are
presented in this appendix. A manual is given in appendix J.
;******************************************************************************************************
;* Name:        model (run batch file by typing @model at the prompt)
;*
;* Description: Batch file setting the surroundings for functions A-scan and B-scan
;*
;* Annotation:  - sector always in degrees
;*              - sp = size of step in cm per pixel to calculate transport with
;*                (not pixel size on monitor)
;*              - phi and theta always in radians
;******************************************************************************************************
;Change to myprogs directory
cdmp
;----------------------------------------------------------------------------- create SiteRite II equipment
; Geometry
pos=INTARR(3) ;cartesian td-positioning: pos(0)=x, pos(1)=y, pos(2)=z
ori=FLTARR(2) ;td-orientation in sph. coord.: ori(0)=phi, ori(1)=theta
; SiteRite II transducer (td) settings
td=CREATE_STRUCT('pos', pos, 'ori', ori, 'F', LONG(7500000.), 'Depth', 4.0 , 'Sector', 25, $
'sp', 0.01, 'pixel_width', 0.05, 'I0', 1000.)
;-------------------------------------------------------------------------------- set up test environment
DEVICE, RETAIN=2, DECOMPOSED=0
; recommended initial settings
SET_PLOT, 'win'
WINDOW, 0 ; used here to view domain
WINDOW, 0, TITLE='Original Data:'
WINDOW, 1 ; Bscan result
WINDOW, 2 ; For Ztables, a-tables, TGC filter viewing etc.
WSHOW, 2, 0
;-------------------------------------------------------------------------- creating experimental domains
domain = initmodel('240axial.jpg')
medianfiltered_domain = MEDIAN( domain, 5)
smoothened_domain = SMOOTH( domain, 5 )
histogramfiltered_domain = ADAPT_HIST_EQUAL (domain )
med5histf_domain = ADAPT_HIST_EQUAL (medianfiltered_domain )
;--------------------------------------------------------------------------------------- function testing
; Position the transducer to x=330 y=110 z=5
;position, td, [330, 110]
; rotate the transducer to zero
;rotate, td, [!pi/2., 0]
;Test: change position
;position, td, [1,1,1], /inc
;position, td, [-1, -1], /inc
;position, td, /mouse
; Test: rotate
;rotate, td, [!pi]
;rotate, td, [!pi/8,!pi/3], /inc
;rotate, td, /mouse
;------------------------------------------------------------------------------------- Performing JV-scan and saving results
rotate, td, [!pi*1/8] & position, td, [135, 120]
Bscan, td, domain
;------------------------------------------------ Performing three (ex.) Bscans on original domain and save result
;rotate, td, [!pi*2/8] & position, td, [122, 175]
;Bscan, td, domain, Bscan_file = 'B_scanex_org_lateral', Domain_file = 'D_org_domain_ex'
;rotate, td, [!pi*3/2] & position, td, [260, 382]
;Bscan, td, domain, Bscan_file = 'B_scanex_org_posterior'
;rotate, td, [!pi*7/9] & position, td, [362, 40]
;Bscan, td, domain, Bscan_file = 'B_scanex_org_anterior'
; **************************************************************************************************
;* Name:        initmodel
;*
;* Description: initializes and builds the tissue model from images of axial slices
;*
;* Input:       image1 ; Tissue voxel model 2D-slice
;*
;* Returns:     ZDomain ; Gray scale pixel domain used as tissue model
;*
;* Annotation:  positioning: positive z = cranial direction
;*                           positive y = frontal
;*                           positive x = right
; **************************************************************************************************
FUNCTION initmodel, image1
myfile = FILEPATH(image1, SUBDIR=['myprogs\images'])
result = QUERY_JPEG(myfile, info)
PRINT, 'QUERY_JPEG from initmodel ', 'on >>> ', myfile, ' <<< results in: ', result
READ_JPEG, myfile, OrgDomain, /GRAYSCALE
PRINT, ' and its dimensions are: ',info.DIMENSIONS
close,1
ZDomain= OrgDomain
Return, ZDomain
END
; **************************************************************************************************
;* Name:        position
;*
;* Description: positions the transducer within the tissue model
;*
;* Input:       td    ; For transducer specific info
;*              pos   ; new coordinates or increment given as array [x,y,z]
;*              inc   ; keyword set -> increment current coordinates with pos
;*              mouse ; keyword set -> manual positioning by using mouse
;*
;* Result:      The transducer gets a new position within the tissue model
; **************************************************************************************************
PRO position, td, pos, inc = inc, mouse = mouse
CASE N_params() OF
0:Message, 'I got no input at all'
1:
IF Keyword_set(mouse) THEN BEGIN
WSET, 0
CURSOR, x, y, 1, /dev
td.pos(0)=x & td.pos(1)=y & td.pos(2)=0
ENDIF ELSE BEGIN
Message, 'I miss positioning input'
ENDELSE
2:
IF N_elements(pos) EQ 1 THEN BEGIN
Message, 'Either you have missed a / before a keyword, or only given me info on x and not y(,z).'
ENDIF ELSE BEGIN
D3 = N_elements(pos) GT 2 ;xy or xyz ?
IF D3 THEN BEGIN
PRINT, 'Only using 2D right now, so I will set your z to 0'
z=0
ENDIF ELSE BEGIN
z=0;default
ENDELSE
td.pos = D3 ? KEYWORD_SET(inc)*td.pos + pos : $
KEYWORD_SET(inc)*td.pos + [pos, z]
ENDELSE
ELSE: Message, 'I do not think the positioning was correctly done! (check parameters)'
ENDCASE
PRINT, 'Positiontd: Transducer is now positioned at : ', td.pos
END
; **************************************************************************************************
;* Name:        rotate
;*
;* Description: rotates the transducer
;*
;* Input:       td    ; For transducer specific info
;*              myori ; new orientation or increment given in radians
;*              inc   ; keyword set -> increment current orientation with myori
;*              mouse ; keyword set -> rotate manually by using mouse
;*
;* Result:      The transducer gets a new orientation within the tissue model using
;*              spherical coordinate system angles phi and theta.
; **************************************************************************************************
PRO rotate, td, myori, inc = inc, mouse = mouse
CASE N_params() OF
0: Message, 'I got no parameters'
1: BEGIN
IF Keyword_set(mouse) THEN BEGIN
Message, 'Orientation by mouse is not implemented'
ENDIF
IF Keyword_set(inc) THEN BEGIN
Message, 'I need info on the orientation increment too'
ENDIF
Message, 'The orientation info is missing'
END
2: BEGIN
theta = !pi/2
IF N_elements(myori) GT 1 THEN BEGIN
PRINT, 'Your theta will not be used (since 2D now). I will set it to ',$
theta/!pi, ' pi'
td.ori(1) = theta
ENDIF
td.ori = Keyword_set(inc)*td.ori + [myori(0), theta]; later switch 0 for theta
END
ENDCASE
PRINT, 'The transducer is rotated to: phi=', td.ori(0)/!pi, ' pi and theta=', td.ori(1)/!pi, ' pi'
END
; *****************************************************************************************************************
;* Name:        Bscan
;*
;* Description: Extracts and calculates a B-scan from a tissue voxel model 2D-slice
;*
;* Input:       td            ; For transducer specific info
;*              slice         ; Tissue voxel model 2D-slice
;*              NOATTENUATION ; keyword set -> calculation without attenuation in each pixel
;*              Domain_file   ; keyword set -> save Domain
;*              Bscan_file    ; keyword set -> save resulting Bscan
;*
;* Result:      thisBscan ; B-scan image
;*
;* Annotation:  (uu,vv) = position in B-scan image corresponding to the (u,v) position in slice
; *****************************************************************************************************************
PRO Bscan, td, slice, NOATTENUATION=noattenuation, Bscan_file=Bscan_file, Domain_file=Domain_file
u = td.pos(0)
v = td.pos(1)
gamma = td.ori(0)
;new image containing the Bscan
s = size(slice)
nrows = s(2)
ncols = s(1)
thisBscan = MAKE_ARRAY(ncols, nrows, /INTEGER, VALUE = 0)
offset = 50; push line outwards to imitate real view
scl = 1; magnify image
startbeg_diff = 65; mark boundary to increase spatial understanding
framecolor_B = 80 > startbeg_diff
framecolor_D = 255 > startbeg_diff
sector_start = gamma + td.sector*(!pi/180) / 2.00
sector_end = gamma - td.sector*(!pi/180)/2.00
direction = sector_start
n = double(td.depth)/double(td.pixel_width)
dsector = -2*Asin(1./(2*(n+offset)));sector boundary step size.
IF (td.sector EQ 0) THEN direction = sector_end
WHILE direction GT sector_end DO BEGIN
offsetu = offset*cos(direction)
offsetv = offset*sin(direction)
IF keyword_set(noattenuation) THEN BEGIN
pixelresult = Ascan(slice, td, direction, offsetu, offsetv, /noattenuation)
ENDIF ELSE BEGIN
pixelresult = Ascan(slice, td, direction, offsetu, offsetv)
ENDELSE
uu = pixelresult(*,0) & vv = pixelresult(*,1) & colorvekt = pixelresult(*,2)
endpoint = n_elements(colorvekt)-1
uucenter = ncols/2 - (u) ;positioning of the resulting Bscan
vvcenter = nrows/2 - (v)
i = indgen(endpoint+1)
thisBscan(uu+uucenter,vv+vvcenter) = fix(colorvekt(i))
IF (direction EQ sector_start) THEN BEGIN;mark first Ascan, darker
slice[uu,vv]=framecolor_D-startbeg_diff & thisBscan[uu+uucenter,vv+vvcenter]=framecolor_B-startbeg_diff
ENDIF ELSE BEGIN;mark sector boundary in Ascan, darker
slice[uu(endpoint),vv(endpoint)] = framecolor_D-startbeg_diff
thisBscan[ 0 > uu(endpoint)+uucenter > uu(endpoint) $
, 0 > vv(endpoint)+vvcenter > vv(endpoint)] = framecolor_B-startbeg_diff
ENDELSE
direction = direction + dsector
ENDWHILE
;mark last Ascan brighter
slice[uu,vv] = framecolor_D
thisBscan[uu+uucenter,vv+vvcenter]=framecolor_B
;Domain and Bscan screen output
SET_PLOT, 'WIN'
WSET, 0;Domain
WINDOW, 0, TITLE = 'Domain to be scanned : '
TV, slice
IF keyword_set(noattenuation) THEN BEGIN
WINDOW, 1, TITLE = 'B-scan (no attenuation) '+', td.I0 = ' + string(td.I0)
ENDIF ELSE BEGIN
WINDOW, 1, TITLE = 'B-scan '+', td.I0 = ' + string(td.I0)
ENDELSE
; Rotate the image to viewing position and magnify it scl times
thisBscan = ROT(thisBscan, -(270-gamma*180/!pi), scl, /pivot);, /INTERP)
;thisBscan(WHERE(thisBscan EQ 0B)) = 255B ; invert
WSET, 1;Bscan
LOADCT, 0; b&w
TV, thisBscan
;Domain file output
IF keyword_set(Domain_file) THEN BEGIN
jpg_name = Domain_file + '.jpg'
write_jpeg, jpg_name, slice, quality=100
cmd = 'move ' + jpg_name + ' images'
spawn, cmd
ENDIF
;Bscan file output
IF keyword_set(Bscan_file) THEN BEGIN
jpg_name = Bscan_file + '.jpg'
write_jpeg, jpg_name, thisBscan, quality=100
cmd = 'move ' + jpg_name + ' images'
spawn, cmd
ENDIF
END
; ***********************************************************************************************************
;* Name:        Ascan
;*
;* Description: Extracts an array of pixels (line) from input data (tissue_slice) and
;*              calculates reflections from each pixel interface detected at the transducer
;*
;* Input:       tissue_slice     ; Tissue voxel model 2D-slice
;*              td               ; For transducer specific info
;*              direction        ; A-scan's direction in B-scan plane (tissue_slice plane)
;*              offsetu, offsetv ; spatial offset for A-scan in <direction> direction (cosmetic)
;*              noattenuation    ; keyword set -> do not calculate with attenuation
;*
;* Returns:     [[uu], [vv], [thisAscan]] ; coordinates in tissue_slice and calculated
;*              reflections from these positions
;*
;* Annotation:  (u0, v0)   = lower left corner of image
;*              (tdu, tdv) = transducer's (u,v) coordinate
; ***********************************************************************************************************
FUNCTION Ascan, tissue_slice, td, direction, offsetu, offsetv, noattenuation=noattenuation
u0=0 & v0=0
u = td.pos(0) & v = td.pos(1)
n = double(td.depth)/double(td.pixel_width); no of pixels that correspond to td.depth, ~100
s=Size(tissue_slice)
su = s[1] & sv=s[2]
Position1:; Transducer location
tdv = v - v0 + offsetv
tdu = u - u0 + offsetu
IF !order NE 0 THEN tdv = sv - 1 - tdv ;Invert?
IF (tdu LT 0) THEN tdu = 0
IF (tdu GE su) THEN tdu = su
IF (tdv LT 0) THEN tdv = 0
IF (tdv GE sv) THEN tdv = sv
Position2:; Sector Boundary
ru = u + Round(n*cos(direction)) + offsetu ; Maybe Floor or Round?
rv = v + Round(n*sin(direction)) + offsetv
u1 = ru - u0
v1 = rv - v0
IF !order ne 0 then v1 = sv - 1 - v1
IF (u1 LT 0) or (u1 GE su) or (v1 LT 0) or (v1 GE sv) then begin
Message, 'Ultrasound Beam reaches outside tissue_slice and will be cut off.', $
/CONTINUE
ENDIF
IF (u1 LT 0) THEN u1 = 0
IF (u1 GE su) THEN u1 = su-1
IF (v1 LT 0) THEN v1 = 0
IF (v1 GE sv) THEN v1 = sv-1
;Line Extraction
du = Float(ru-tdu) ;delta u
dv = Float(rv-tdv) ;delta v
n = abs(du) > abs(dv)
IF n EQ 0 THEN Message, 'Zero length line.'
r = FLTARR(n+1)
IF abs(du) GT abs(dv) THEN BEGIN
IF u1 GE tdu THEN s=1 ELSE s=-1
sv = (v1-tdv)/abs(du)
ENDIF ELSE BEGIN
IF v1 GE v THEN sv=1 ELSE sv=-1
s = (u1-tdu)/abs(dv)
ENDELSE
uu = Long(Findgen(n+1l)*s+tdu)
vv = Long(Findgen(n+1l)*sv+tdv)
extracted = tissue_slice[Long(vv)*su + uu]
IF keyword_set(noattenuation) THEN BEGIN
tpt = Transport( td, extracted, /noattenuation)
ENDIF ELSE BEGIN
tpt = Transport( td, extracted)
ENDELSE
thisAscan = tpt
thisAscan = TGC(tpt)
;thisAscan = Rejection(thisAscan, threshold = 20)
RETURN, [[uu], [vv], [thisAscan]]
END
; ********************************************************************************************************
;* Name:        Transport
;*
;* Description: Calculates the ultrasound transport and reflection result for an
;*              Ascan vector (g)
;*
;* Input:       td ; For transducer specific info
;*              g  ; An A-scan vector
;*              noattenuation ; keyword set -> do not calculate with attenuation
;*
;* Returns:     Igray_from_I ; gray scale values indicating how strongly each pixel has
;*              reflected the ultrasound beam from these positions
;*
;* Annotation:  Calculation and transformation through (linear) interpolation of alpha
;*              values in Look_up()
; ********************************************************************************************************
FUNCTION Transport, td, g, noattenuation=noattenuation
L=N_ELEMENTS(g);total length of Ascan
white = 255.
properties = Look_up(td);
z = Float(properties[*, 0])
a = Float(properties[*, 1])
m = 1 + indgen(L-1); all but first pixel index for T_into
T_into = [1, 1.-(( ( z(g(m))-z(g(m-1)) )*( 1./ ( z(g(m))+z(g(m-1)) )) )^2 >0.)]
max_values = machar()
minus_infdB = 10*alog10(max_values.xmin)
max_infdB = 10*alog10(max_values.xmax)
I0dB = 10.*alog10(td.I0) < max_infdB
IdB_scale_into_g = DOUBLE(white)/I0dB
I_scale_into_g = DOUBLE(white)/DOUBLE(td.I0)
noatt = KEYWORD_SET(noattenuation) ? 0:1
IF L NE 0 THEN BEGIN; -----------------------------------------------------------calculations performed in dB scale
IdB_at_front_n = MAKE_ARRAY(L, VALUE=minus_infdB);/NOZERO)
IdB_reflected_from_n = MAKE_ARRAY(L, VALUE=minus_infdB)
I_reflected_from_n = MAKE_ARRAY(L, VALUE=0)
I_reflected_from_n(L-1) = 0; no reflection for last pixel
IdB_after_transandpixeltpt = (I0dB > minus_infdB);start condition
;forth
FOR i=0, L-1 DO BEGIN
log_T = 10*alog10(T_into(i)) > minus_infdB
IdB_after_transandpixeltpt = ((IdB_after_transandpixeltpt+log_T) $
- noatt*a(g(i))*td.sp) > minus_infdB
IdB_at_front_n(i) = (finite(10^(IdB_after_transandpixeltpt/10.))) ? $
IdB_after_transandpixeltpt : minus_infdB ;correct if calculating in dB
IdB_after_transandpixeltpt = IdB_at_front_n(i)
ENDFOR
;back
FOR i=L-2, 0, -1 DO BEGIN
log_R = 10*alog10(1-T_into(i+1)) > minus_infdB
IdB_just_reflected = ( IdB_at_front_n(i) + log_R ) > minus_infdB;Reflected
IF (log_R NE minus_infdB) THEN BEGIN;follow the reflected wave
IdB_just_after_beg = IdB_just_reflected
still_intensity_in_beam_and_not_reached_td = 1
j=i
WHILE (still_intensity_in_beam_and_not_reached_td) DO BEGIN
templog_T = 10*alog10(T_into(j)) > minus_infdB
log_T = (templog_T EQ minus_infdB) ? 0 : templog_T
IdB_just_after_beg = (j EQ i) ? $
(IdB_just_reflected - noatt*a(g(j))*td.sp) > minus_infdB : $
( (IdB_just_after_beg + log_T) - noatt*a(g(j))*td.sp) > minus_infdB
IF (IdB_just_after_beg EQ minus_infdB) OR (j EQ 0 ) THEN $
still_intensity_in_beam_and_not_reached_td = 0
j = j-1
ENDWHILE
ENDIF ELSE BEGIN;nothing reflects
IdB_just_after_beg = minus_infdB
ENDELSE
IdB_reflected_from_n(i) = IdB_just_after_beg
ENDFOR; end of transport calculation -----------------------------------------------------------------------
I_reflected_from_n = 10^(IdB_reflected_from_n/10.); anti-log result
Igray_from_I = Round(I_reflected_from_n*I_scale_into_g)
RETURN, Igray_from_I
ENDIF ELSE BEGIN
MESSAGE, 'The Ascan vector to be calculated must contain at least one element'
RETURN, [0]
ENDELSE
END
; ****************************************************************************************************************
;* Name:        Look_up
;*
;* Description: Maps acoustic impedance and attenuation coefficient to each gray scale value
;*              between 0 (black) and 255 (white). Used for tissue/gray scale segmentation
;*
;* Input:       td ; For transducer specific info
;*
;* Returns:     [[z], [a]] ; A 256 by 2 floating-point array holding acoustic impedances [z]
;*              and attenuation coefficients [a]
;*
;* Annotation:  Attenuation coefficients are scaled to td's frequency using a 0.5-1 dB increase per MHz
; ****************************************************************************************************************
FUNCTION Look_up, td
;settings
freqscale = (td.f GT 1000000.) ? 0.5/1000000. : 0. ; [dB / MHz]
white = 255.
gmax = white-20.; represents skull bone middle gray value
Zmax = 7800000. & amax = td.f*freqscale
z_into_gscale = gmax/Zmax;linear
a_into_gscale = gmax/amax;
;look up tables
tissue=['air', 'fat', 'kidney', 'liver, spleen', 'muscle', 'bone']
z0 = 400.     & a0 = 1.0+td.f*freqscale ; air           (estimated)
z1 = 1380000. & a1 = 0.6+td.f*freqscale ; fat           (table)
z2 = 1620000. & a2 = 0.9+td.f*freqscale ; kidney        (table)
z3 = 1640000. & a3 = 0.7+td.f*freqscale ; liver, spleen (table)
z4 = 1700000. & a4 = 0.5+td.f*freqscale ; muscle        (table)
z5 = Zmax     & a5 = amax               ; skull bone    (estimated)
z_table = [z0, z1, z2, z3, z4, z5]
a_table = [a0, a1, a2, a3, a4, a5]
ntableentries = N_ELEMENTS(z_table)
IF not (ntableentries EQ N_ELEMENTS(a_table)) THEN BEGIN
MESSAGE, 'z_table and a_table must have same number of entries'
ENDIF
g_map = Round(z_table*z_into_gscale)
;flat piece design
percentage_from_end = .2
percentage_from_beg = .3
LAST = 0
;first slope
CASE ntableentries OF
1: LAST = 1
0: MESSAGE, 'no entries in z_table/a_table'
ELSE:
ENDCASE
g_slope_left = 0
higher_index_offset = LAST ? (white-g_map(0)) : g_map(0)*(1-percentage_from_end)
g_slope_right = fix(higher_index_offset)
no_el_in_piece = g_slope_right - g_slope_left
next_z = LAST ? Zmax : z_table(0)
next_a = LAST ? amax : a_table(0)
indgen_scale = no_el_in_piece - 1
IF no_el_in_piece GE 2 THEN BEGIN
z = findgen(no_el_in_piece)*next_z/indgen_scale
a = findgen(no_el_in_piece)*next_a/indgen_scale
ENDIF ELSE BEGIN
IF no_el_in_piece EQ 1 THEN BEGIN;1
z = [next_z]
a = [next_a]
ENDIF ELSE BEGIN;0
z = [0]
a = [0]
ENDELSE
ENDELSE
; end of first slope
g_flat_left = g_slope_right
LAST = 0
FOR n=0, ntableentries-1 DO BEGIN; add flat through point thereafter add slope
IF n EQ (ntableentries-1) THEN LAST=1
g_n = g_map(n)
next_g_n = LAST ? white : g_map(n+1)
following_int_size = next_g_n - g_n
;flat piece
higher_index_offset = LAST ? (white-g_n)*percentage_from_beg : $
following_int_size*percentage_from_beg
g_flat_right = fix(g_n + higher_index_offset)
no_el_in_piece = g_flat_right - g_flat_left
IF no_el_in_piece GE 1 THEN BEGIN
z = [z, MAKE_ARRAY(no_el_in_piece, VALUE=z_table(n))];interpol
a = [a, MAKE_ARRAY(no_el_in_piece, VALUE=a_table(n))];interpol
ENDIF
;slope piece
g_slope_left = g_flat_right
higher_index_offset = LAST ? (white-g_map(n)) : following_int_size*(1-percentage_from_end)
g_slope_right = fix(g_n + higher_index_offset)
no_el_in_piece = g_slope_right - g_slope_left
next_z = LAST ? Zmax : z_table(n+1)
next_a = LAST ? amax : a_table(n+1)
indgen_scale = no_el_in_piece - 1
IF no_el_in_piece GE 2 THEN BEGIN
z = [z, z_table(n) + indgen(no_el_in_piece)*(next_z-z_table(n))/indgen_scale]
a = [a, a_table(n) + indgen(no_el_in_piece)*(next_a-a_table(n))/indgen_scale]
ENDIF ELSE BEGIN
IF no_el_in_piece EQ 1 THEN BEGIN
z = [z, next_z]
a = [a, next_a]
ENDIF
ENDELSE
g_flat_left = g_slope_right
ENDFOR
RETURN, [[z], [a]]
END
; ****************************************************************************************************************
;* Name:        TGC
;*
;* Description: Amplifies every brightness value in an A-scan.
;*              Imitates the Time Gain Compensation filter. Pixels further from the transducer
;*              are amplified more than closer ones to compensate for attenuation
;*
;* Input:       myAscan ; The result from an A-scan
;*
;* Returns:     tgcAscan ; A-scan with raised brightness values
; ****************************************************************************************************************
Function TGC, myAscan
white = 255.
; -------------------------------------------------------------------------------------------------- a linear filter
L = n_elements(myAscan)
filter1 = 1 + indgen(L)*(white-1)/(L-1);straightforward: just amplify linearly
tgcAscan = myAscan*filter1 < white; and cut the tops
; -----------------------------------------------------------------------------------------------a non-linear filter
;L = n_elements(myAscan)
;n = indgen(L)
;just_a_factor = 0.01;
;normalizedfunc = ( 1-exp(-n*just_a_factor)) / (1-exp(-(L-1)*just_a_factor))
;filter2 = 1+(white-1)*(normalizedfunc)
;tgcAscan = myAscan*filter2 < white
;set_plot, 'win'
;wshow, 2
;window, 2, title = 'filter2'
;plot, indgen(L), filter2
RETURN, tgcAscan
END
; *****************************************************************************************************
;* Name:        Rejection
;*
;* Description: Imitates the rejection of too small reflections in myAscan.
;*              Sets every gray scale value in myAscan below the threshold t to zero.
;*
;* Input:       myAscan   ; An A-scan
;*              THRESHOLD ; gray scale value threshold
;*
;* Returns:     myAscan ; myAscan with all gray scale values below t set to zero
; *****************************************************************************************************
Function Rejection, myAscan, THRESHOLD=t
B = WHERE(myAscan LT t, count, COMPLEMENT=B_C, NCOMPLEMENT=count_c)
IF count GT 0 THEN myAscan(B) = 0
Return, myAscan
END
M. REFERENCES
1. Breeuwer M et al. The EASI Project - Improving the Effectiveness and Quality of Image Guided Surgery. IEEE Transactions on Information Technology in Biomedicine 1998 Sep;2(3):156-168.
2. Fenster A et al. Three-Dimensional Ultrasound Imaging System for Prostate Cancer Diagnosis and Treatment. IEEE Transactions on Instrumentation and Measurement 1998 Dec;47(6):1439-1447.
3. Angier J (Producer) & Chedd G (Producer). Scientific American Frontiers: #605 [TV series]. Available from Scientific American Frontiers, 70 Coolidge Hill Road, Watertown, MA 02172, USA.
4. Jovanov E et al. Tactical Audio and Acoustic Rendering in Biomedical Applications. IEEE Transactions on Information Technology in Biomedicine 1999 June;3(2):109-118.
5. Yadong L et al. Computational Methods for Ultrasound Bone Assessment. Ultrasound in Medicine and Biology 1999;25(5):823-830.
6. Wave2000 [Software]. Cyberlogic Inc., New York, NY, USA. http://www.cyberlogic.com/about2000.html [Internet web site]. Web page existed 2001 Feb.
7. Imagine 3D [Software]. Utex Scientific Instruments Inc., Mississauga, Ontario, Canada. http://www2.utex.com/webdb/whitepapers.nsf/pages/i3d+features [Internet web site]. Web page existed 2001 Feb.
8. Aiger D, Cohen-Or D. Real-Time Ultrasound Imaging Simulation. Real-Time Imaging 1998 Aug;4(4):263-274.
9. Dymax Corporation. 271 Kappa Drive, Pittsburgh, PA 15238, USA. http://www.dymax-usa.com [Internet web site]. Web page existed 2001 Feb.
10. http://www.npac.syr.edu/projects/vishuman/about.html [Internet web site]. Web page existed 2001 Feb.; http://www.nlm.nih.gov/pubs/factsheets/visible_human.html [Internet web site]. Web page existed 2001 Feb.
11. Martini FH. Fundamentals of Anatomy and Physiology. Upper Saddle River, New Jersey 07485: Prentice Hall; 4th Edition: 26, 685-694.
12. National Council on Radiation Protection and Measurements (NCRP). Biological Effects of Ultrasound: Mechanisms and Clinical Implications. Bethesda, MD 20814: NCRP Publications. Report No. 74: 1, 34-39.
13. Wells PNT. Biomedical Ultrasonics. New York 10003, USA: Academic Press Inc: 15-28, 43f, 120-144.
14. Dr David Hintelang [Interview]. Nuclear and Radiological Engineering Department, University of Florida. 2000 Dec.
15. Kinsler LE, Frey AR. Fundamentals of Acoustics. New York, USA 1962: John Wiley & Sons Inc: 11-25, Chapter 9.
16. Bushberg JT et al. The Essential Physics of Medical Imaging. Baltimore, Maryland 21202, USA: Williams & Wilkins 1994: 367-416.
17. Stumpf FB. Analytical Acoustics. Michigan 48106, USA: Ann Arbor Science Publishers Inc 1990: 96, 101f, 113, 242ff, 261f.
18. Allisy-Roberts PJ, Farr RF. Physics for Medical Imaging. London, Great Britain: Harcourt Publishers Ltd 1999: 196-206.
19. Meyer-Ebrect D. Digitale Bildverarbeitung I [Digital Image Processing I]. Aachen, Germany 1999: Lehrstuhl fuer Messtechnik und Bildverarbeitung der RWTH-Aachen: 68-72.
20. Angelsen B. Ultrasound Imaging vol 1. Trondheim, Norway: Emantec AS 2000: Chapter 1.2.
21. Kleppe JA. Engineering Applications of Acoustics. Norwood, MA 02062, USA: Artech House Inc 1989: 11-25.
22. Fahy FJ. Sound Intensity. Essex IG11 8JU, England: Elsevier Science Publishers Ltd: 46-61.
23. Ault T, Siegel M. In Situ Calibration for Quantitative Ultrasonic Imaging. IEEE Instrumentation and Measurement Magazine 1998 Sep;1(3):9-18.
24. Saio Y, Sasaki H et al. Ultrasonic tissue characterization of diseased myocardium by scanning acoustic microscopy. Journal of Cardiology 1995;25(3):127-132.
25. Oosterveld B, Thijssen J et al. Detection of diffuse liver disease by quantitative echocardiography: dependence on a priori choice of parameters. Ultrasound in Medicine and Biology 1993;19(1):21-24.