Examination Content Specifications and
Clinical Competency Requirements
for ARRT® Certification in
Radiography
Radiography
Practice Analysis
January 2009January 2012
Final Report
January 2012
Copyright © 2011 by The American Registry of Radiologic Technologists. All rights reserved. Reproduction in whole or in part is not permitted without the written consent of the ARRT®.
TABLE OF CONTENTS

Chapter 1: Project Background and Methodology
    Introduction
    Practice Analysis Methods
    Advisory Committee
    Project Schedule
Chapter 2: Survey Methods
Chapter 3: Data Analysis and Results
    Overview
    Data Analysis Techniques
    Staff Survey Results
Chapter 4: Radiography Managers’ Survey
Chapter 5: Use of the Center for Medicare and Medicaid Services (CMS) Data to Supplement the Radiography Practice Analysis
Chapter 6: Overview of Supplemental Data and Follow-up Practice Analysis Advisory Committee Meeting
Chapter 7: Revision of Task Inventory, Content Specifications, and Clinical Competency Requirements

APPENDICES

Appendix A: Radiography Practice Analysis Staff Radiographer Survey Questionnaire
Appendix B: Questionnaire Demographic Results
Appendix C: Questionnaire Results (Unsorted)
Appendix D: Questionnaire Results (Sorted by Percent Responsible)
Appendix E: Radiography – Final Task Inventory
Appendix F: 2012 Content Specifications for Radiography
Appendix G: 2012 Didactic and Clinical Competency Requirements for Radiography
Appendix H: Spring 2010 Radiography Managers Survey Questionnaire
Appendix I: Professional Comment Process
Appendix J: Weighting Exercise
Appendix K: References
CHAPTER 1
PROJECT BACKGROUND AND METHODOLOGY
Introduction
In the past, the content of most certification exams was closely linked to the curriculum of educational
programs or to the table of contents of a prominent textbook. In the late 1970s and early 1980s, certification
boards and testing professionals began to realize that certification requirements should be closely linked to the
requirements of practice. It is now recognized that the content of certification exams should be determined
only after systematically studying and identifying the activities performed in the work setting. Enrichment
topics, such as the history of a profession, should not be tested on a certification exam unless these topics are
clearly job-related (NCHCA, 1979).
The job-relatedness of an examination is generally established through a job or practice analysis (AERA,
APA, NCME, 1999). Practice analysis is useful for determining the topics to be covered by an examination
and the degree of emphasis that each topic receives. The rationale for job and practice analysis is outlined in
the Standards for Educational and Psychological Testing (AERA, APA, NCME, 1999) and in the standards
adopted by the National Commission for Certifying Agencies (NCCA, 2004). Legislative activity and legal
precedent also stress the importance of practice analysis in the development and validation of certification
exams. The Uniform Guidelines on Employee Selection adopted by the U.S. Equal Employment Opportunity
Commission, Department of Labor, and Department of Justice also indicate that practice analysis is critical in
the development of examinations related to employment (EEOC, 1978). Practice analysis is equally critical
for establishing other types of certification requirements such as educational standards, experience
requirements, and other eligibility criteria.
In 1980, The American Registry of Radiologic Technologists® (ARRT®) initiated its first large-scale
effort to systematically document the job requirements of entry-level personnel in the areas of Radiography,
Nuclear Medicine Technology, and Radiation Therapy Technology (Reid, 1983). Since the original project
was completed, the ARRT has conducted practice analyses for those disciplines every six years for the
purpose of updating the task inventory and content specifications. Such updates are important for professions
that continually evolve, due to advances in technology, because they help assure that the content
specifications and other certification requirements (e.g., clinical experience requirements) reflect current
practice.¹

¹ The ARRT now completes an interim update to content specifications and clinical competency requirements every three years; a thorough and comprehensive practice analysis is conducted every six years.
Practice Analysis Methods
Practice analysis studies can be conducted in a variety of ways (Raymond, 2001). These methods include direct observation, the use of work diaries, the use of task inventory surveys, and logical analysis, i.e., convening panels of experts and eliciting their opinions about practice responsibilities. The choice of
practice analysis method can be influenced by a number of factors including, but not limited to, previous
studies, the size of the profession, and the amount of resources available to conduct the study. These factors
affect various decisions on how to conduct the study. Perhaps the two most important decisions pertain to: (a)
the type of practice-related information that is obtained; (b) the source(s) of that information.
Type of Information. Practice analysis involves reducing to words the things people do in work, and
different types of descriptors can be used to accomplish this. On the one hand, work can be described in
terms of behaviors necessary to complete a job, solve some problem, or create output, product, or service. For
example, the statement “Verify that informed consent has been obtained” is a task-oriented descriptor. On the
other hand, person-oriented approaches to job analysis focus on the knowledge, skills, and abilities (KSAs)
that a person should possess to successfully complete the tasks required of a job. “Knowledge of radiation
physics” is an example. Task-oriented descriptors indicate the activities performed on the job, while person-oriented descriptors reflect the KSAs and other personal characteristics presumed to be required for successful
job performance. Practice analyses can be designed to collect information about tasks/activities, about
personal qualities, or both.
Sources of Information. Practice-related information can be obtained from various sources. Physician
requisitions, patient charts, and billing statements all document, to some extent, what occurs in the practice
setting. However, most practice analyses obtain data directly from persons who are knowledgeable about the
work. This could include practitioners, supervisors, managers, educators, or committees of subject-matter
experts (SMEs). The source of practice-related information will influence both the method of data collection
and sample size.
Method for Present Study. The results of this study will be used to develop a task inventory, establish
clinical competency requirements, and develop exam content specifications. These multiple needs require
different types of information – data about actual practice activities and about the KSAs required to carry out
those activities.
Although the study could be completed by a committee of SMEs, we rely on two
independent sources of information. For the present study, we first collect data regarding actual work
activities primarily from entry-level staff radiographers with a task inventory survey. Survey recipients are
asked to rate the frequency with which they perform each task and their responsibility for it. The task inventory is an efficient way to obtain
extensive information about the nature of a profession. It is also conducive to statistical analyses that can help
distinguish among a large number of employees who work in diverse settings.
The task inventory is
consistent with the methodology employed for previous ARRT studies, and will enable changes in practice to
be monitored over time. Once data about specific work activities are collected, a committee of SMEs will
meet to provide judgments regarding the KSAs required to perform those activities. In short, the present
study relies on staff technologists to find out what is done on the job, while SMEs are used to establish
clinical education requirements and to revise exam content specifications.
The report is organized as follows. The remainder of this chapter discusses the establishment of the
Advisory Committee (i.e., SMEs) and summarizes the project schedule. Chapter 2 discusses details related to survey development and administration, and Chapter 3 presents the analysis of the survey results. Subsequent chapters describe the radiography managers’ survey, the supplemental CMS data, and the procedures for translating the results of the task analysis into the content specifications and clinical competency requirements.
Advisory Committee
For most practice analyses, an Advisory Committee is established by the ARRT Board of Trustees for
the purposes of providing guidance to project staff by reviewing the plans for the conduct of the study,
revising documents as required, and evaluating the results of all data collected during the project. Based on
the results of its deliberations, the Advisory Committee makes recommendations to the Board of Trustees
concerning the final composition of the task inventory, content specifications, and clinical competency
requirements. The individuals serving on the Advisory Committee included:
Advisory Committee
ARRT Board Representatives
Robin R. Berke, B.S., R.T.(R)
Kellie S. Cranfill, MSRS, R.T.(R)(BD)
Debra Reese, M.P.H., R.T.(R)
Helena A. Coello, M.Ed., R.T.(R)
Jose L. Martinez, B.S., R.T.(R)(CT)(MR)
Michael DelVeccio, B.S., R.T.(R)
Eileen M. Maloney, M.Ed., R.T.(R)(M)
ARRT Staff
Nance Cavallin, B.A., R.T.(R)(T)
Michael Yoes, Ph.D.
Ben Babcock, Ph.D.
Project Schedule
Projects such as this require a closely monitored time schedule to assure that all activities are completed
in a timely fashion and within budget. The table below presents the general time and task schedule used
to guide this project. This does not include other, more specific timelines which were used to help manage
certain aspects of the project (e.g., survey mailing and data entry).
Time and Task Schedule for Radiography Practice Analysis, January 2009 – January 2012

Schedule of Activities (approximate dates; * indicates a committee meeting)

Fall 2008: Advisory Committee reviews 2005 Task Inventory and other materials and makes notes regarding additions to the new task inventory.
* Jan. 2009: Advisory Committee meets to review and update task inventory, and to discuss survey content and format.
Feb. 2009: Staff prepares first draft of survey and mails to Advisory Committee for review.
Feb. 2009: Advisory Committee members contact staff to discuss survey changes.
Feb. 2009: Staff prepares final draft of survey; submits for internal editorial review.
Mar. 2009: Staff prepares printer-ready copy and sends out for printing.
Mar. – Apr. 2009: Printer mails surveys to large sample of technologists (initial mailing; thank-you/reminder postcard; additional mailing to nonrespondents with survey and cover letter).
May 2009: Data returned to ARRT from printer.
May 2009: Staff analyzes survey data, prepares preliminary report, and mails report to Advisory Committee.
* Jun. 2009: Advisory Committee meets to review survey results and edit task inventory; update clinical competency requirements; and revise content specifications.
July 2009: Draft clinical competency requirements and content specifications mailed to professional community for review and comment.
Aug. 2009: Staff collates comments from professional community.
* Oct. 2009: Advisory Committee meets to review professional community comments and revise clinical competency requirements and content specifications.
Jan. 2010: Board reviews task inventory, content specifications, and clinical competency requirements.
March 2010: Data gathered from CMS and radiology managers.
April 2010: Staff analyzes data, prepares preliminary report, and mails report to Advisory Committee.
* May 2010: Advisory Committee meets to review CMS and radiology manager data, edit task inventory, and revise clinical competency requirements and content specifications.
Aug. 2010: Item writers are notified of new content areas.
Fall 2011: Test items in item bank are reclassified according to new content specifications.
Jan. 2011: Final clinical competency requirements and content specifications mailed to professional and educational community.
Jan. 2012: Revised content specifications and clinical competency requirements become effective.
CHAPTER 2
SURVEY METHODS
The staff and Advisory Committee developed a questionnaire during fall and winter 2008 – 2009. The questionnaire consisted of the procedures, positions, tasks, and equipment maintenance activities thought to be relevant to staff radiographers. It was based primarily on the activities comprising the ARRT task inventory in use since 2005. A copy of the questionnaire is provided in Appendix A.
Staff Radiographer Questionnaire Development. The staff questionnaire consisted of three sections.
Section 1 included 112 procedures, positions, and tasks/activities performed by staff radiographers in clinical
settings. The questionnaire did not include all possible activities, but was limited to those for which the
Advisory Committee felt there was some benefit to obtaining information. Activities known to be performed
by virtually all staff radiographers were excluded as a means to control survey length, and this fact was
explained in the questionnaire instructions.
Section 1 of the questionnaire had a rating scale relating to the frequency with which each clinical
activity or procedure was performed. The rating scale included five response categories: not responsible for
performing, quarterly, monthly, weekly, and daily. Instructions asked respondents to indicate “approximately
how often you perform” each activity. We refer to these 112 items as the task inventory section of the
radiography practice analysis survey.
Section 2 of the questionnaire consisted of a list of 16 quality assurance tasks or procedures performed
by radiographers (i.e., appropriate maintenance and quality checks of clinical equipment). We refer to these
16 items as the equipment section throughout this report. The rating scale for Section 2 asked radiographers
to indicate if they performed each quality assurance task or procedure and, if so, what levels of participation
best matched their workplace experience with each of these equipment maintenance activities. The rating
scale for Section 2 included four response categories: no responsibility for this procedure, delegate or request someone else, personally perform, and review results. Given the manner in which the scale categories were defined, the instructions for this section allowed respondents to select more than one of the three categories of involvement (i.e., excluding the not responsible category).
Section 3 consisted of 12 questions on education, experience, and workplace demographics. These
demographic data are valuable in determining the general characteristics of the individuals in the returned
practice analysis survey sample.
Staff Radiographer Sample. ARRT staff compiled names and addresses for study participants from the ARRT registered technologist database. The criteria defining the population to be sampled were: full-time employment, Radiography listed as the primary discipline, a job title of “Staff Technologist”, and between 1 and 10 years of work experience.
The population of registrants meeting these sampling criteria was identified from the ARRT registry
database in January 2009. This population of interest included 37,881 individuals who listed radiography as their primary discipline, worked full-time, held the job title “Staff Technologist”, and had between 1 and 10 years of work experience. It is important to note that this population is not representative of the distribution of years of experience in the complete radiographer population, which contains many technologists with over 10 years of work experience. Within this population, approximately 43.8% had 1-3 years of experience, approximately 25.3% had 4-5 years, and 30.8% had 6-10 years. The identified target population of 37,881 staff radiographers was divided into three
separate strata based on years of experience. A stratified random sample of 2,000 radiographers was then drawn
such that 60% of the sample had three or fewer years of experience, 20% had 4 to 5 years of experience, and
another 20% had 6 to 10 years of experience.
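The stratified draw described above can be illustrated with a short sketch. This is not ARRT's actual sampling code; the DataFrame and column names (registrants, years_experience), the use of pandas, and the seed value are assumptions made for illustration only.

    import pandas as pd

    # Stratum targets from the study design: 60% / 20% / 20% of a 2,000-person sample.
    STRATA_TARGETS = {"1-3": 1200, "4-5": 400, "6-10": 400}

    def draw_stratified_sample(registrants: pd.DataFrame, seed: int = 2009) -> pd.DataFrame:
        """Randomly draw a fixed number of registrants from each experience stratum."""
        parts = []
        for stratum, n_needed in STRATA_TARGETS.items():
            pool = registrants[registrants["years_experience"] == stratum]
            parts.append(pool.sample(n=n_needed, random_state=seed))
        return pd.concat(parts, ignore_index=True)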
The reason for stratifying the sample on the basis of years of experience was that, for purposes of
developing certification requirements, the ARRT gives emphasis to staff radiographers in the early stages of
their careers, a practice consistent with accepted psychometric principles.
It should be noted that
approximately 40% of all radiographers have more than 10 years of experience. Although more experienced
radiographers were under-sampled, the data should be useful for describing contemporary practice, given that
practice analysis studies typically report that years of experience has little influence on the job responsibilities
of those in staff positions.
The questionnaire was mailed in March 2009 to the sample of 2,000 radiographers. The ARRT
employed a three-stage mailing strategy, which consisted of an initial mailing, a reminder postcard, and a
follow-up questionnaire to those who did not respond after the first two mailings. A total of 1,008 usable questionnaires were returned within a six-week period, for a response rate of 50.4 percent.
Evaluation of Characteristics of the Returned Survey Sample
It may be helpful, at this point, to examine the characteristics of the returned survey sample and compare it
against the original stratified random sample. This evaluation is useful in establishing the representativeness of
the final survey data compared with the originally drawn stratified sample.
Since the returned survey data (N=1,008 respondents) were a subset of the original sample, to whom
surveys were mailed, the returned data set still consisted of radiographers who were working full-time, who listed
radiography as their primary discipline, and who had reported having a job title of Staff Technologist.
As can be seen in Table 2-1 below, the percentages for the categories of the demographic variable ‘Years of
Work Experience’ in the returned data set appear to be very close to the originally targeted sample.
Table 2-1. Comparison of Original and Returned Samples on Years of Work Experience.
Years Exp    Original Sample (N=2,000)    Returned Sample (N=1,008)
1-3 YRS      60.0%                        59.1%
4-5 YRS      20.0%                        19.7%
6-10 YRS     20.0%                        21.2%
Although education level was not a variable that was involved in the stratification or selection process,
Table 2-2 indicates that this demographic variable also appears to be very similar to the original (stratified
random) sample of N=2,000 to whom the survey was mailed.
Table 2-2. Comparison of Original and Returned Samples on Education Level.
Education Level    Original Sample (N=2,000)    Returned Sample (N=1,008)
H.S. + RT          4.3%                         3.2%
Certificate        12.7%                        12.0%
Associates         67.5%                        67.2%
Baccalaureate      14.4%                        15.7%
Masters            0.3%                         0.3%
M.D.               0.2%                         0.2%
Other              0.4%                         0.5%
An examination of the secondary disciplines, listed in the registry database, may also be helpful in
ascertaining whether the returned sample appeared to differ in any significant way from the original sample (to
whom the surveys were mailed). Table 2-3 lists the comparison of the original and returned survey samples
based on Secondary Discipline listed in the ARRT registrant database.
Table 2-3. Comparison of Original and Returned Samples on Secondary Discipline.
Secondary Discipline    Original Sample (N=2,000)    Returned Sample (N=1,008)
None listed             49.2%                        53.1%
RAD                     16.5%                        13.5%
NMT                     0.3%                         0.2%
THR                     0.1%                         0.0%
CT                      17.9%                        18.3%
MRI                     2.5%                         2.5%
MAM                     1.9%                         2.2%
SON                     1.1%                         0.7%
BD                      4.4%                         5.0%
VI                      1.9%                         1.6%
CI                      0.3%                         0.1%
Other                   4.2%                         2.9%
The primary purpose of presenting this information is to validate that the nature of the returned radiography
practice analysis survey sample did not change appreciably from the original stratified random sample based on
the specified sampling criteria.
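One informal way to quantify the "did not change appreciably" conclusion is a chi-square goodness-of-fit test of the returned sample against the original sample proportions. The sketch below uses the Years of Work Experience figures from Table 2-1; it is offered only as an illustration and is not an analysis reported in this study.

    from scipy.stats import chisquare

    # Table 2-1: original sample proportions and returned sample percentages (N = 1,008).
    original_props = [0.60, 0.20, 0.20]                          # 1-3, 4-5, 6-10 years
    returned_counts = [round(p * 1008) for p in (0.591, 0.197, 0.212)]

    expected = [p * sum(returned_counts) for p in original_props]
    stat, p_value = chisquare(f_obs=returned_counts, f_exp=expected)
    print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")         # a large p suggests no drift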
The next chapter presents the results from the radiography practice analysis questionnaire.
CHAPTER 3
DATA ANALYSIS AND RESULTS
Overview
This chapter summarizes the results of the questionnaire completed by staff radiographers. The demographic characteristics of the respondents are presented first, followed by discussion of the results for the task inventory and equipment maintenance sections of the survey. All tables corresponding to the staff questionnaire appear in
Appendix B.
Data Analysis Techniques
This report used three different ways of analyzing the frequency with which each activity was
conducted. The first was to look at the percentage of respondents that responded in the highest category. The
second was to look at the percentage of respondents who indicated that they were not involved at all with an
activity. Finally, the data were analyzed using the Rasch Rating Scale Model (Andrich, 1978).
First, the percentage of people responding in the highest category is a good indicator of the frequency
of conducting various activities. For Section 1, the highest category corresponded to conducting the activity
daily. For Section 2, it is less clear what the highest level of involvement is for performing quality assurance
procedures on equipment. It is presumed that the highest category corresponded to personally performing the
task or procedure. The clinical activities, tasks, positions, and procedures that radiographers marked as being
done quite often should obviously be included in the content specifications.
The percentage of respondents who indicated that they were not involved with an activity is also a
good indicator of whether or not to include an item on the content specifications. If enough respondents do not conduct a specific clinical activity, perform a specific clinical procedure, or participate in quality assurance for a specific piece of equipment, then that activity may not be included in the content specifications. These numbers are also informative as to which procedures should and should not be required for clinical competencies. For Section 1, “not responsible” comprised only the lowest response category. For Section 2, “not responsible” was also the lowest category, though any ordering of the remaining categories may be debatable.
Finally, the data for each section were used to conduct a Rasch Rating Scale Analysis (Andrich,
1978) for exploratory research purposes. The Rasch Rating Scale Model is similar to the Rasch model that
ARRT uses to analyze its large-volume certification exam data. The main difference is that the Rasch Rating
Scale Model accounts for more than two category response curves corresponding to the multiple response
categories for each item on the survey. Each item on the survey has a “location” parameter according to this
model. Where necessary, the ARRT staff combined certain categories with low response frequencies only for
the purposes of the Rasch analysis in order to make the analysis more stable. This analysis used a linear
transformation to place the Rasch location parameters on a scale from 0 to 100, with 100 corresponding to the
most frequently conducted activity and 0 corresponding to the least frequently conducted activity. This made
the results of the Rasch analysis more easily interpretable.
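The 0-to-100 rescaling is a simple linear transformation of the estimated item locations. A minimal sketch is shown below; the location estimates themselves would come from fitting the rating scale model, which is not reproduced here, and whether the raw scale must be reversed depends on the sign convention of the fitting software.

    import numpy as np

    def rescale_locations(locations: np.ndarray) -> np.ndarray:
        """Linearly map Rasch item location estimates onto a 0-100 scale,
        with 100 assigned to the most frequently performed activity and 0 to
        the least frequently performed."""
        lo, hi = locations.min(), locations.max()
        scaled = (locations - lo) / (hi - lo) * 100.0
        # Reverse if higher raw locations correspond to less frequently performed items.
        return 100.0 - scaled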
Because this was the first time that the ARRT used the Rasch Rating Scale in its practice analyses, the
Advisory Committee did not use the results of the Rasch analysis to inform its decisions about which activities to include in or exclude from the content specifications.
The Advisory Committee first looked at the
percentages of people who had some expressed level of responsibility for a specific clinical activity or
procedure (where percentage responsible = 100 – percentage not responsible). Advisory Committee members
debated which items should and should not be included.
Radiography Practice Analysis Staff Survey Results
ARRT certification exams assess the knowledge and skills required to carry out the major tasks typically
required at entry into a specialty or modality. In the primary modalities, entry-level is generally interpreted by
ARRT as 1 to 3 years of experience working full-time in the modality of interest. Because more experienced
radiographers were also included in the sample, it seemed worthwhile to also evaluate their responses. The
patterns of responses between the entry-level respondents (i.e., those with 1-3 years of experience) were
compared against more experienced radiographers (i.e., those with six or more years of experience). None of
the statistical comparisons between entry-level and more experienced radiographers were statistically
significant after adjusting for the number of independent statistical tests that were being made (Bonferroni
adjustment). The differences between the entry-level respondents and the more experienced radiographers
were generally quite small. Therefore, the results are presented for the full group only. Appendix B contains
tables summarizing responses to the questionnaire for the total group.
The following text summarizes the demographic characteristics of the sample based on responses to
Section 3 of the questionnaire (see survey questionnaire in Appendix A). This is followed by analyses of the
task inventory (Section 1) and equipment maintenance (Section 2) parts of the survey. For each of the survey
items pertaining to sections 1 and 2, we report the percentage of respondents who were responsible for that
clinical activity.
Demographics. Tables in Appendix B summarize the demographic responses of those taking the survey.
Note that the questions and responses that appear in the tables have been abbreviated; the survey in Appendix
A presents the full text of each question. Notable findings are discussed below.
• The target group and total group were nearly identical in terms of demographic composition. Because of the high level of similarity between the target and total groups, all demographic statistics are for the total group.

• The majority of radiographers were employed in a hospital setting (71.8%), with the remainder working in physician group practices/clinics (18.5%) and free-standing imaging centers (5.3%). A large percentage of the radiographers work in relatively large hospitals (36.5% in hospitals with over 250 beds).

• Almost 44% of radiographers indicated that they worked in an urban setting, with almost equal numbers (approximately 26% each) indicating either suburban or rural workplace settings.

• Almost half of the radiographers reported working in a department with more than 15 radiographers.

• Target levels of work experience were close to the original sampling (shown in Table 2-1). The demographic responses to this question differed somewhat from the analysis based on ARRT registry renewal form information at the time of sampling, likely because the years-of-experience value on a renewal form may have been close to a year old at the time of the survey, so some individuals crossed over between the categories of 1-3 years, 4-5 years, and 6+ years. The general pattern, however, confirmed that the sample was predominantly representative of entry-level radiographers.

• Almost all respondents (96.9%) reported working more than 30 hours per week. The overwhelming majority of those surveyed were, therefore, full-time employees. The vast majority of respondents (82.8%) had the job title of Staff Technologist.
Evaluation of Continuing Use of Film-Screen Radiography
Responses to question 22 “Use film-screen cassettes and automatic film processing” were used to create
groups for evaluating the ongoing use of film-screen imaging in radiography. Tables 3-1 through 3-3 are based on a subset of survey respondents: only those who marked “Not Responsible” (49% of the total) and those who marked “Daily” (39% of the total).
For purposes of the results summarized in Tables 3-1 through 3-3, the label “Digital” consists
of those individuals who marked “Not Responsible” in response to question 22 and the label “Film” consists
of only those individuals who marked “Daily” in response to question 22 (though some of these individuals
may work on both digital and film-screen equipment).
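A sketch of how these two groups and the comparisons in Tables 3-1 through 3-3 could be assembled from the survey records is shown below. The column names (q22_film_screen, primary_workplace) are placeholders for illustration, not the actual fields of the survey data file.

    import pandas as pd

    def film_vs_digital_crosstab(responses: pd.DataFrame) -> pd.DataFrame:
        """Keep only the respondents at the two extremes of Q22 and cross-tabulate
        primary workplace, expressed as percentages within each group."""
        extremes = responses[responses["q22_film_screen"].isin(["Not Responsible", "Daily"])].copy()
        extremes["group"] = extremes["q22_film_screen"].map(
            {"Not Responsible": "Digital", "Daily": "Film"}
        )
        return pd.crosstab(extremes["primary_workplace"], extremes["group"],
                           normalize="columns") * 100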
Note: Percentages are within a category (i.e., “Digital” or “Film”) for comparison purposes
Table 3-1. Comparing Primary Workplace and Workplace Setting Between Digital and Film

Primary Workplace               Group     Urban          Suburban      Rural/Small Town   Total
Hospital / Medical Center       Digital   175 (36.5%)    79 (16.5%)    108 (22.5%)        362 (75.4%)
                                Film      116 (23.4%)    64 (12.9%)    64 (12.9%)         244 (49.3%)
Physician Group / Clinic        Digital   38 (7.9%)      26 (5.4%)     20 (4.2%)          84 (17.5%)
                                Film      33 (6.7%)      38 (7.7%)     30 (6.1%)          101 (20.4%)
Free-standing Imaging Center    Digital   19 (4.0%)      11 (2.3%)     4 (0.8%)           34 (7.1%)
                                Film      16 (3.2%)      10 (2.0%)     4 (0.8%)           30 (6.1%)
Table 3-2. Comparing Size of Hospital Workplace Facility Between Digital and Film

Primary Workplace: Hospital / Medical Center

          Less than 100 Beds   100 to 250 beds   251 to 500 beds   More than 500 beds   Total
Digital   74 (26.0%)           98 (34.4%)        104 (36.5%)       91 (31.9%)           285
Film      59 (24.0%)           68 (27.6%)        67 (27.2%)        52 (21.1%)           246
Table 3-3. Comparing Size of Department and Primary Workplace Between Digital and Film

Size of Department (number of radiographers)

Primary Workplace               Group     1-5           6-10         11-15         More than 15   Total
Hospital / Medical Center       Digital   19 (3.9%)     47 (9.7%)    65 (13.5%)    233 (48.2%)    364 (75.4%)
                                Film      17 (4.5%)     26 (6.9%)    51 (13.5%)    148 (39.1%)    242 (63.9%)
Physician Group / Clinic        Digital   62 (12.8%)    11 (2.3%)    6 (1.2%)      11 (2.3%)      80 (16.6%)
                                Film      69 (18.2%)    12 (3.2%)    9 (2.4%)      16 (4.2%)      106 (28.0%)
Free-standing Imaging Center    Digital   18 (3.7%)     10 (2.1%)    4 (0.8%)      7 (1.4%)       39 (8.0%)
                                Film      13 (3.4%)     9 (2.4%)     3 (0.8%)      6 (1.6%)       31 (8.2%)
Thus, the data suggest that although most radiography workplaces support digital imaging, film-screen equipment is still in use often enough to warrant integration into the content specifications.
Differences in Target and Non-Target Groups.
Table 3-4 summarizes the results of a test for
significant differences in survey responses between entry-level radiographers (1-3 years) and more
experienced radiographers (6+ years of experience). None of these statistical comparisons were significant
when adjusted for the large number of statistical tests being conducted. That is, none of the comparisons are
statistically significant when controlling the family-wise error rate to 0.05 (N=111 independent statistical
comparisons were made; Bonferroni adjusted significance level for an individual significance test was set to
0.00045 in order to achieve a family-wise significance level of 0.05). A visual inspection of the cross-tabulations of the variables flagged by the individual χ² significance tests (at the nominal level of p < 0.05) did not reveal any notable differences that would support restricting the overall practice analysis to the targeted group of individuals with 1-3 years of experience (with the resulting reduction in the sample size for those analyses). Because of the high similarity of the target and non-target groups, all analyses were
conducted with all of the data combined.
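The comparison summarized in Table 3-4 can be sketched as follows. The column names (exp_group and one column per survey item) are illustrative assumptions; the chi-square test and the Bonferroni-adjusted threshold of 0.05 / 111 ≈ 0.00045 follow the description above.

    import pandas as pd
    from scipy.stats import chi2_contingency

    N_TESTS = 111
    ALPHA_PER_TEST = 0.05 / N_TESTS          # ~0.00045, controls the family-wise rate at 0.05

    def compare_experience_groups(responses: pd.DataFrame, question_cols: list) -> pd.DataFrame:
        """Chi-square test of each item's response distribution across the
        entry-level (1-3 yrs) and experienced (6+ yrs) groups."""
        rows = []
        for col in question_cols:
            table = pd.crosstab(responses["exp_group"], responses[col])
            stat, p, df, _ = chi2_contingency(table)
            rows.append({"question": col, "chi2": round(stat, 3), "df": df, "p": round(p, 3),
                         "significant_after_bonferroni": p < ALPHA_PER_TEST})
        return pd.DataFrame(rows)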
Table 3-4
RAD Practice Analysis: Statistical Comparison of Responses to Survey Questions between Entry-Level (1-3 yrs.) and Experienced (6+ yrs.) Technologists
(Each comparison is a χ² test with df = 5.)

Q1 - Sequence imaging procedures to avoid residual contrast material affecting future exams.  (χ² = 2.638, p = 0.756)
Q2 - Communicate scheduling delays to waiting patients.  (χ² = 6.392, p = 0.270)
Q3 - Verify or obtain patient consent as necessary (e.g., contrast studies)  (χ² = 1.458, p = 0.918)
Q4 - Prior to administration of contrast agent, gather information to determine appropriate dosage.  (χ² = 4.011, p = 0.548)
Q5 - Prior to administration of contrast agent, determine if patient is at increased risk of adverse reaction.  (χ² = 4.379, p = 0.496)
Q6 - Confirm type of contrast media and prepare for administration.  (χ² = 3.719, p = 0.591)
Q7 - Perform venipuncture for contrast administration.  (χ² = 4.148, p = 0.528)
Q8 - Administer IV contrast media.  (χ² = 3.707, p = 0.592)
Q9 - Observe patient after administration of contrast media to detect adverse reactions.  (χ² = 6.820, p = 0.234)
Q10 - Obtain vital signs.  (χ² = 2.510, p = 0.775)
Q11 - Clean, disinfect or sterilize facilities and equipment, and dispose of contaminated items.  (χ² = 5.731, p = 0.333)
Q12a - Document required information in patients’ medical record - on paper  (χ² = 2.973, p = 0.704)
Q12b - Document required information in patients’ medical record - electronically  (χ² = 3.533, p = 0.618)
Q13a - Determine appropriate exposure factors using Fixed kVp technique chart  (χ² = 10.342, p = 0.066)
Q13b - Determine appropriate exposure factors using Variable kVp technique chart  (χ² = 3.769, p = 0.583)
Q13c - Determine appropriate exposure factors using calipers (to determine patient thickness)  (χ² = 6.939, p = 0.225)
Q14a - Select radiographic exposure factors - Automatic Exposure Control (AEC)  (χ² = 5.449, p = 0.364)
Q14b - Select radiographic exposure factors - kVp and mAs (manual or set by hand)  (χ² = 4.461, p = 0.485)
Q14c - Select radiographic exposure factors - Pre-programmed techniques  (χ² = 8.913, p = 0.113)
Q15a - Operate radiographic unit and accessories - Fixed unit  (χ² = 3.769, p = 0.583)
Q15b - Operate radiographic unit and accessories - Mobile unit (portable)  (χ² = 6.598, p = 0.252)
Q16a - Select exposure factors - Digital fluoroscopic unit  (χ² = 6.213, p = 0.286)
Q16b - Select exposure factors - Non-digital fluoroscopic unit  (χ² = 1.468, p = 0.917)
Q16c - Select exposure factors - Fixed fluoroscopic unit  (χ² = 7.764, p = 0.170)
Q16d - Select exposure factors - Mobile fluoroscopic unit (C-arm)  (χ² = 4.146, p = 0.529)
Q16e - Select exposure factors - Mobile vascular fluoroscopic unit (C-arm)  (χ² = 3.203, p = 0.669)
Q17a - Operate specialized units - Dedicated chest unit  (χ² = 8.963, p = 0.111)
Q17b - Operate specialized units - Tomography unit  (χ² = 8.260, p = 0.143)
Q17c - Operate specialized units - Mammography unit  (χ² = 7.929, p = 0.160)
Q17d - Operate specialized units - Bone densitometry unit  (χ² = 6.010, p = 0.305)
Q17e - Operate specialized units - Panorex unit  (χ² = 1.590, p = 0.902)
Q18a - Operate - Computerized Radiography (CR)  (χ² = 14.372, p = 0.013)
Q18b - Operate - Direct Digital Radiography (DR)  (χ² = 3.228, p = 0.665)
Q18c - Operate - Picture Archival and Communication System (PACS)  (χ² = 10.847, p = 0.055)
Q18d - Operate - Film Digitizer  (χ² = 7.266, p = 0.202)
Q18e - Operate - Hospital Information System (HIS)  (χ² = 0.408, p = 0.995)
Q18f - Operate - Radiology Information System (RIS)  (χ² = 4.342, p = 0.501)
Q19 - Perform post-processing on digital images  (χ² = 4.466, p = 0.484)
Q20 - Use laser copy to print hard copy images  (χ² = 3.363, p = 0.644)
Q21 - Add electronic annotations on digital images  (χ² = 13.589, p = 0.018)
Q22 - Use film-screen cassettes and automatic film processing  (χ² = 5.951, p = 0.311)
Q23 - Determine corrective measures if image is not of diagnostic quality  (χ² = 5.401, p = 0.369)
Q24 - Chest  (χ² = 15.310, p = 0.009)
Q25 - Ribs  (χ² = 6.972, p = 0.223)
Q26 - Sternum  (χ² = 12.922, p = 0.024)
Q27 - Soft tissue neck  (χ² = 10.252, p = 0.068)
Q28 - Abdomen  (χ² = 19.207, p = 0.002)
Q29a - Esophagus - Assist with examination  (χ² = 13.399, p = 0.020)
Q29b - Esophagus - Post fluoroscopy radiographs/images  (χ² = 9.454, p = 0.092)
Q30 - Swallowing dysfunction study  (χ² = 13.367, p = 0.020)
Q31a - Upper GI series, single or double contrast - Assist with examination  (χ² = 14.712, p = 0.012)
Q31b - Upper GI series, single or double contrast - Post fluoroscopy radiographs/images  (χ² = 9.925, p = 0.077)
Q32 - Small bowel series  (χ² = 16.421, p = 0.006)
Q33a - Barium enema, single or double contrast - Assist with examination  (χ² = 19.599, p = 0.001)
Q33b - Barium enema, single or double contrast - Post fluoroscopy radiographs/images  (χ² = 18.371, p = 0.003)
Q34 - Surgical cholangiography  (χ² = 11.309, p = 0.046)
Q35 - ERCP  (χ² = 10.913, p = 0.053)
Q36 - Cystography  (χ² = 10.656, p = 0.059)
Q37 - Cystourethrography (voiding)  (χ² = 8.577, p = 0.127)
Q38 - Intravenous urography  (χ² = 15.650, p = 0.008)
Q39 - Retrograde pyelography  (χ² = 6.947, p = 0.225)
Q40 - Cervical spine  (χ² = 5.364, p = 0.373)
Q41 - Thoracic spine  (χ² = 2.185, p = 0.823)
Q42 - Scoliosis series  (χ² = 11.891, p = 0.036)
Q43 - Lumbar spine  (χ² = 3.781, p = 0.581)
Q44 - Sacrum and coccyx  (χ² = 8.715, p = 0.121)
Q45 - Sacroiliac joints  (χ² = 6.007, p = 0.306)
Q46 - Pelvis and hip  (χ² = 15.134, p = 0.010)
Q47 - Skull  (χ² = 11.771, p = 0.038)
Q48 - Facial bones  (χ² = 16.564, p = 0.005)
Q49 - Mandible  (χ² = 18.967, p = 0.002)
Q50 - Zygomatic arches  (χ² = 18.934, p = 0.002)
Q51 - Temporomandibular joints  (χ² = 18.671, p = 0.002)
Q52 - Nasal bones  (χ² = 21.379, p = 0.001)
Q53 - Orbits  (χ² = 11.992, p = 0.035)
Q54 - Orbits for foreign body  (χ² = 13.480, p = 0.019)
Q55 - Paranasal sinuses  (χ² = 16.508, p = 0.006)
Q56 - Toes  (χ² = 16.466, p = 0.006)
Q57 - Foot  (χ² = 3.427, p = 0.634)
Q58 - Calcaneus (os calcis)  (χ² = 5.159, p = 0.397)
Q59 - Ankle  (χ² = 3.692, p = 0.595)
Q60 - Tibia, fibula  (χ² = 11.290, p = 0.046)
Q61 - Knee  (χ² = 4.862, p = 0.433)
Q62 - Patella  (χ² = 10.026, p = 0.074)
Q63 - Femur  (χ² = 6.669, p = 0.246)
Q64 - Fingers  (χ² = 2.177, p = 0.824)
Q65 - Hand  (χ² = 8.016, p = 0.155)
Q66 - Wrist  (χ² = 7.565, p = 0.182)
Q67 - Forearm  (χ² = 6.596, p = 0.252)
Q68 - Elbow  (χ² = 5.960, p = 0.310)
Q69 - Humerus  (χ² = 12.813, p = 0.025)
Q70 - Shoulder  (χ² = 9.428, p = 0.093)
Q71 - Scapula  (χ² = 13.673, p = 0.018)
Q72 - Clavicle  (χ² = 10.056, p = 0.074)
Q73 - Acromioclavicular joints  (χ² = 9.227, p = 0.100)
Q74 - Bone survey  (χ² = 7.941, p = 0.160)
Q75 - Long bone measurement  (χ² = 5.355, p = 0.374)
Q76 - Bone age  (χ² = 6.798, p = 0.236)
Q77 - Soft tissue/foreign body  (χ² = 3.116, p = 0.682)
Q78 - Arthrography (assist)  (χ² = 10.073, p = 0.073)
Q79 - Myelography (assist)  (χ² = 11.480, p = 0.043)
Q80 - Venography (assist)  (χ² = 9.162, p = 0.103)
Q81 - PICC line insertion assistance  (χ² = 3.699, p = 0.594)
Q82 - Position patient and operate MRI scanner to produce diagnostic images  (χ² = 2.762, p = 0.737)
Q83a - Position patient and operate CT scanner to produce diagnostic images - Head  (χ² = 4.453, p = 0.486)
Q83b - Position patient and operate CT scanner to produce diagnostic images - Neck  (χ² = 2.008, p = 0.848)
Q83c - Position patient and operate CT scanner to produce diagnostic images - Chest  (χ² = 6.817, p = 0.235)
Q83d - Position patient and operate CT scanner to produce diagnostic images - Abdomen  (χ² = 2.122, p = 0.832)
Q83e - Position patient and operate CT scanner to produce diagnostic images - Pelvis  (χ² = 2.605, p = 0.761)
Q83f - Position patient and operate CT scanner to produce diagnostic images - Biopsy  (χ² = 1.177, p = 0.947)
Q83g - Position patient and operate CT scanner to produce diagnostic images - Other  (χ² = 6.091, p = 0.298)
Section 1: Task Inventory. Appendices C and D present the details for each of the clinical activities appearing in the questionnaire, indicating the percentage of respondents who said they conduct the
activity (by time frequency category), and the percentage who responded as “not responsible” for that
particular clinical activity. Appendix C presents the questionnaire results in the original (unsorted)
question order. Appendix D presents the questionnaire results sorted by percentage responsible (highest
to lowest). Responses to the task inventory section indicated that radiographers have a broad range of
responsibilities.
Initial recommendations sent to ARRT Board of Trustees in January 2010
At its January 2010 meeting, the ARRT Board of Trustees reviewed the practice analysis Advisory Committee’s recommended changes to the RAD examination (content specifications and task inventory) and to the clinical competency requirements, along with the professional community’s comments.
The Board felt that there was not adequate consensus, at that point in time, and asked for additional
data to be gathered and reviewed prior to adopting the committee’s recommendations.
At the direction of the ARRT Board of Trustees, efforts were made to gather supplemental data regarding current practice in radiography from additional perspectives, and to examine how well this supplemental data corroborated the data resulting from the RAD practice analysis survey.
CHAPTER 4
RADIOGRAPHY MANAGERS’ SURVEY
Overview
As part of a comprehensive investigation into the validity of the survey results from the
Radiography (RAD) exam 2009 Comprehensive Practice Analysis, the ARRT Board of Trustees directed
staff to collect information from department managers (administrators) to see how those data aligned with
the outcomes from the RAD practice analysis (i.e., RAD PA) survey. Hereafter this will be referred to as
the RAD managers’ survey (or simply as the “managers’ survey”).
Method
Instrument
A survey was constructed to gather information on whether specific procedures were being
performed at facilities and how frequently they were being performed. Managers were asked to provide
information regarding whether or not specific procedures were performed in their facility for the 2009
calendar year, how many of each procedure were performed in 2009, whether such a procedure is
routinely performed by entry-level (0 – 3 years) radiographers, and the number of full-time equivalent
(FTE) staff that they had available to perform each procedure. Based on preliminary discussions with a
small pilot sample of managers it was felt that most managers/administrators would have access to this
type of information.
Given that the survey was asking for detailed information (counts of the number of times procedures were done in 2009, and the number of FTE staff available to perform those procedures), there was a concerted effort to simplify the managers’ survey as much as possible. The procedures that were
included were those that appeared as potentially inconsistent in the recommended outcomes from the
RAD practice analysis. A couple of procedures that were frequently performed, however, were placed
onto the managers’ survey in order to provide a baseline measure to help verify results.
A copy of the managers’ survey is included in Appendix H.
Sample
The sample for the RAD Managers’ survey was drawn from the ARRT registry database based on
meeting all of the following criteria.
1. Working full-time
2. RAD certified
3. Primary discipline listed as “Radiography”
4. Job title either “Administrator/Manager” or “Chief Technologist”
5. Institution type either “Hospital”, “Clinic”, or “Private Office”
This resulted in a potential population of 7,397 individuals (although it turns out there were nine of
these individuals who were not currently employed or working in the field). Of these about 63% were
titled ‘Administrator’ or ‘Manager’ and the remaining 37% were titled ‘Chief Technologist’. Sixty-eight percent of those in this population were working in hospitals, with about equal percentages of the remainder working in clinics and private offices. All were RAD certified; 484 also reported that they were AHRA (CRA) certified, and 102 reported that they were Quality-Management certified.
A random sample of 2,500 of this target population was selected for this RAD managers’ survey. A
larger sampling was used because it was felt that the effort required to complete this type of survey might
lower the return rate of completed surveys.
Surveys were returned to a service vendor for scanning and entry of non-scannable information
(including numeric information). Halfway through the allotted survey response time period a reminder
post-card was sent to these individuals to encourage them to complete and return the surveys.
Processing of Returned Surveys
Surveys were scanned, and all non-scannable information was entered and matched with the scanned information. Bar codes on the surveys made it possible to tie each survey form to the most recent ARRT registry database information, enabling staff to access demographic information from the renewal form of that particular manager. It should be noted, however, that analyses involving any of the registry demographic information were conducted only in aggregate form.
The survey response data files, consisting of the merged scanned and data-entry information, were provided to ARRT psychometric staff for analysis.
Upon receipt of the managers’ survey response data files it became obvious that there were some
challenges in terms of data clean-up, data analysis, and interpretation of results. First, the return rate, as
expected, was relatively low for ARRT surveys (although not tremendously low for such types of surveys
in other organizations). A total of 620 surveys were returned (from a mailing to 2,500 managers), a return rate of just under 25%. Additionally, there were
large amounts of missing data in those surveys that were returned. There is no clear way to determine the
meaning of the missing data in any specific situation. The percentage of data missing across the four questions regarding each procedure varied by question, ranging from 20% to 90%. The average percentage of missing data across procedures was just over half (54.4%), so there were concerns both about the nature of the analyses and about how the results should be interpreted given the large amount of incomplete data. Another challenge resulted from cases where
managers had written things like “fewer than 25” in response to a question of number of times a specific
procedure was performed in 2009. There were some necessarily subjective decisions that had to be made
in order to retain such data points as part of the overall data clean-up efforts. Finally, there were also
cases where it seemed obvious that numbers being provided were estimates and not actual numbers (e.g.,
all values ended in either zero or five and this was true across procedures; or in some situations where
counts were preceded with a tilde indicating they were an approximation).
In these instances the questionable data were simply treated as real values (since there was no other information that could easily clarify them). The bottom line was that there was more data cleanup effort involved in this study than is typically needed, there was also a lower survey return rate than is typical for ARRT, and
there was a much higher incidence of missing data than typical. All of these factors have the potential of
combining to make the analysis and results more challenging to interpret.
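The kind of cleanup decision described above can be illustrated with a small parsing helper. The specific rules shown (keeping the stated number from entries such as "fewer than 25" or "~100") are illustrative assumptions that mirror the report's decision to treat estimates as real values; they are not the exact rules applied by ARRT staff.

    import re

    def parse_count(raw):
        """Convert a manager's free-text count (e.g. '~100', 'fewer than 25', '1,250')
        into a number, or return None if nothing usable can be recovered."""
        if raw is None:
            return None
        text = str(raw).strip().lower().replace(",", "")
        match = re.search(r"(\d+(?:\.\d+)?)", text)
        if not match:
            return None
        # Approximations ('~', 'about') and bounds ('fewer than') keep the stated
        # number, mirroring the decision to treat estimates as real values.
        return float(match.group(1))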
The problem of missing data in the returned surveys can be visualized as a pie chart of the managers’ response rate, with one segment for surveys not returned, one for returned surveys with usable data, and one for surveys that were returned but had missing data on any particular question.

[Pie Chart of Managers’ Response Rate: Not Returned / Data We Have / Returned but Missing Data]

Another way to look at these data is that for any given survey question, the returned data had about 50% of the values missing. The percentage of missing data across all questions was considerably higher.
Analysis
Frequencies were determined separately for each procedure and for each question (A through D of
each procedure) in an effort to minimize the impact of all the missing data on the results. Results were
summarized in percentages for ease of interpretation (and because the counts varied so much from
procedure to procedure due to varying amounts of missing data). An estimate of the yearly number of
each specific procedure performed by available staff was computed for each facility and procedure. This
computation involved dividing the number of times a procedure was performed (in 2009) by the number
of available FTE staff to perform that procedure. This estimate is the closest thing to the information that
was collected in the original RAD practice analysis where large numbers of radiographers (primarily
entry-level radiographers) were asked how frequently they performed each procedure/task.
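A sketch of the per-FTE estimate and the 90th-percentile (P90) summary used in Table 4-1 appears below. The column names (annual_count for the count of procedures and fte_staff for the available FTEs) are placeholders, and rows with missing or zero values are simply dropped, which is one of several reasonable ways to handle the missing data described above.

    import numpy as np
    import pandas as pd

    def per_fte_summary(procedure_data: pd.DataFrame) -> dict:
        """Estimate yearly procedures per available FTE for one procedure and
        summarize the distribution at the median and 90th percentile."""
        usable = procedure_data.dropna(subset=["annual_count", "fte_staff"])
        usable = usable[usable["fte_staff"] > 0]
        per_fte = usable["annual_count"] / usable["fte_staff"]
        return {
            "n_usable": int(len(per_fte)),
            "median_per_fte": float(per_fte.median()),
            "p90_per_fte": float(np.percentile(per_fte, 90)),
        }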
Another key element of the RAD managers’ survey was part C of the set of questions regarding each procedure. That question asked whether the procedure was routinely performed
by entry-level (0 – 3 years) radiographers.
The primary analyses were simple frequencies and percentages of these data. One primary purpose
of the study was to determine how the results of the RAD managers’ survey compare with the original
RAD practice analysis survey results.
Results
Demographics
An examination of the demographic data from the RAD managers’ survey indicates that the returned surveys may not be a representative subset of the targeted population to which the surveys were mailed. In other words, the surveys may have been disproportionately returned by certain types of individuals, for unknown reasons. As mentioned previously, 68% of the individuals in the targeted population were working in hospitals.
1. Which of the following best describes your place of employment?

                     Managers’ Survey    RAD PA Survey
   Hospital          56.5%               72.0%
   Clinic            17.6%
   Private Office    19.8%
   Other             5.2%
It is difficult to make more of a comparison with the percentages from hospitals because the RAD
practice analysis survey had slightly different wording on the other options so there may have been
differing interpretations. Still, these results do tend to indicate that proportionally fewer of the managers who completed and returned the survey were working in hospital environments.
2. If you work in a hospital/medical center, what is its approximate size (number of beds)?

                     Managers’ Survey    RAD PA Survey
   Less than 100     47.4%               16.9%
   100 – 250         26.3%               20.4%
   251 – 500         20.0%               21.4%
   More than 500     6.3%                17.0%
Responses to this second question indicate that the managers’ survey data also include a larger percentage of managers from small hospitals than was represented in the RAD practice analysis survey.
3. Which of the following best describes the community where you work?

                       Managers’ Survey    RAD PA Survey
   Urban               26.8%               44.5%
   Suburban            30.0%               27.8%
   Rural/Small Town    40.8%               26.6%
The results from this question indicate that, compared with the RAD practice analysis results, the managers’ survey over-represents individuals working in rural areas or small towns.
Consistent with this pattern of over-representation of rural/small-town workplaces, which tend to be smaller and less often hospitals, the number of radiographers employed in the facility also appeared to be over-represented in the lower categories (i.e., small workplace environments).
4. How many radiographers (FTEs) are employed in the facility where you work?

                     Managers’ Survey    RAD PA Survey
   1 – 5             44.8%               21.3%
   6 – 10            12.7%               11.9%
   11 – 15           8.1%                16.7%
   More than 15      31.7%               49.5%
The RAD managers’ survey defined these categories of radiographers employed in the facility more finely than the RAD practice analysis (by design), but the categories could be aggregated to match those from the RAD practice analysis (as was done here). It may be worth noting
that 24.5% of the respondents in the managers’ survey indicated only one or two radiographers are
employed in their facility.
A related question dealt with number of patients seen on an average day. Although this question
was not a part of the RAD practice analysis, the results also seem to reinforce the overall pattern that
these managers’ survey data may not be as representative as one would desire.
7. About how many patients are seen on an average day in your department?

                     Managers’ Survey
   1 – 50            46.1%
   51 – 100          19.2%
   101 – 250         24.0%
   250 or more       10.6%
Two other questions that are particularly interesting in comparison to the RAD practice analysis, and
potentially in the interpretation of the managers’ survey results, were the following:
8. Are CT procedures being performed in your facility?

          Managers’ Survey
   Yes    61.9%
   No     38.1%
9. Do any entry-level (0 – 3 years of experience) radiographers perform CT procedures in your facility?

          Managers’ Survey
   Yes    36.5%
   No     63.5%
Interestingly, a similar question was asked of all respondents in the RAD practice analysis survey (which, by design, was proportionately representative of entry-level radiographers); the percentage reporting that they spend 0% of their work time performing CT activities was 59.2%, which generally aligns with the 63.5% indicating ‘No’ in the managers’ survey.
A follow-up question was asked of those who answered ‘Yes’ to question nine above.
10. If the answer to #9 was ‘Yes’, about what percent of work time do the entry-level radiographers spend performing CT?

                   Managers’ Survey
   1 – 5%          35.6%
   6 – 25%         33.2%
   26 – 50%        21.2%
   51 – 75%        6.7%
   75 – 100%       3.4%
But again, these percentages are based on the 36.5% who indicated that they did have entry-level
radiographers performing CT.
The results of the remaining demographic questions on the managers’ survey are as follows:
5. How many entry-level radiographers (FTEs) are employed in the facility where you work?

              Managers’ Survey
   0          30.6%
   1          14.8%
   2          12.1%
   3 – 5      16.1%
   6 – 10     6.6%
   11 – 15    1.5%
   16 – 20    1.1%
The remainder of the managers’ survey data on this question was missing. Since this was an open-ended
question it may be that some managers elected not to complete it.
6. How many years have you worked as a manager?

                          Managers’ Survey
   Less than 1 year             1.7%
   1 – 3 years                 13.6%
   4 – 5 years                  9.8%
   6 – 10 years                21.5%
   11 – 20 years               27.0%
   More than 20 years          26.5%
Although it is not clear how representative the findings for the above questions may be of radiography
department managers in general, they do indicate that the managers represented in our survey tend to be
experienced, with 75% of them indicating six or more years of experience as a manager.
Results of Procedural Questions
Table 4-1 summarizes the results of the procedural questions on the RAD managers’ survey. Of particular
note in this table is the very large percentage of missing data. In the portion of each procedural question
that asked whether the procedure was performed by entry-level radiographers, the percentage of missing data
ranged from 15% to 80%. For the computations that estimated the yearly number of procedures for each FTE,
the percentage of missing data ranged from 19% to 90%, with an average of 54.4% missing across all
procedures.
Looking at the right-most column in Table 4-1, we see the 90th percentile (P90) for the estimated number of
procedures performed yearly by each FTE radiographer. Setting the two knee procedures and the CT procedures
aside for a moment, the remaining procedures do not appear to be frequently performed by radiographers: for
all of them, more than 90% of radiographers perform the procedure fewer than 50 times a year (i.e., on
average approximately once a week). Many of these procedures are performed even less frequently. For
example, the P90 values for the mandible, zygomatic arches, and temporomandibular joint procedures indicate
that 90% of FTE radiographers perform only 3 to 7 of those procedures over the course of a year.
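As an illustrative sketch of what a P90 computation involves (using made-up per-FTE yearly counts, not the
managers’ survey data), a standard percentile function is all that is required:

    import numpy as np

    # Hypothetical yearly per-FTE counts for one low-frequency procedure
    # (placeholder values only, not the actual managers' survey data).
    yearly_counts_per_fte = [0, 1, 1, 2, 2, 3, 4, 5, 6, 12]

    # P90: roughly 90% of FTE radiographers perform the procedure this many
    # times per year or fewer.
    p90 = np.percentile(yearly_counts_per_fte, 90)
    print(f"P90 = {p90:.1f} procedures per year")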
The two knee procedures were placed on the managers’ survey to provide a benchmark; since those are
relatively common procedures, it was anticipated that the estimated number of knee procedures per year
would be relatively large for most radiographers. That expectation was borne out in the manager survey
data, where the P90 (90th percentile) values were in the hundreds per year for these two procedures.
The CT procedures also appear to be done with great frequency, but it is not completely clear from these
data whether the radiographers performing those procedures are CT trained, or whether they may also be CT
certified (or CT specialists). Thus the estimated numbers of CT procedures per year (per FTE) appear to
contradict the demographic information provided by the same managers, which indicated that roughly
two-thirds of facilities (63.5%) have no entry-level radiographers performing CT. Note that the CMS data
also provide no information about the level of experience of the radiographers performing a procedure.
Table 4-1. Summary of Results of the Procedural Questions on the RAD Managers’ Survey
Discussion
The RAD managers’ survey was undertaken to provide information that might prove useful in evaluating the
quality of the RAD practice analysis data. Unfortunately, the demographics of the survey indicate that the
respondents may not be representative. A larger issue is the amount of missing data in the returned
surveys. For these reasons it is recommended that the results be interpreted with great caution.
One interesting side note: comments and speculation have suggested that the role of radiographers in rural
or small-town work settings may be different. While this may be the case for some individuals, these data
did not appear to validate those conjectures, since they generally corroborated the RAD practice analysis
findings.
Even with the cautions regarding the missing data and the potential non-representativeness of these
data the results still seem to generally corroborate the RAD practice analysis results. The procedures
included on the managers’ survey were specifically targeted because they had relatively low frequency
rates in the RAD practice analysis survey. The results of the RAD managers’ survey appeared to
corroborate that these procedures (excluding the knee and CT procedures as discussed) did indeed appear
to be less frequently performed by radiographers in the workplace. Most of the procedures are performed
roughly once a week or less by radiographers in the workplace. Note that the frequency of a procedure
does not address its importance, simply how frequently it occurs in the work settings of most
radiographers.
Factoring in the information from managers about the percentages of entry-level radiographers who perform
these procedures also aligns with the RAD practice analysis data: the correlation between the percentage
responsible based on the managers’ reports for entry-level radiographers and the RAD practice analysis
results is very strong (r = 0.92; shown in the following scatter plot).
[Figure: Plot of RAD PA Survey and Managers’ Survey Percentage Responsible, CT procedures in red.
Horizontal axis: Managers’ Survey % Responsible; vertical axis: RAD PA % Responsible.]
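Assuming the reported r = 0.92 is an ordinary (Pearson) correlation of the paired percentage-responsible
values, a minimal sketch of that computation, using made-up percentages rather than the actual survey
values, is:

    import numpy as np

    # Hypothetical percent-responsible values for the same set of procedures
    # from the two surveys (placeholders, not the data behind r = 0.92).
    managers_pct_responsible = [12.0, 18.5, 25.0, 30.5, 41.0, 47.5]
    rad_pa_pct_responsible = [20.0, 31.0, 38.5, 45.0, 60.0, 72.5]

    # np.corrcoef returns the 2 x 2 correlation matrix; the off-diagonal
    # entry is the Pearson correlation between the two variables.
    r = np.corrcoef(managers_pct_responsible, rad_pa_pct_responsible)[0, 1]
    print(f"r = {r:.2f}")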
In summary, although the managers’ survey was not as representative as might have been desired and the data
contained large amounts of missing information, the general trends in the RAD practice analysis results
appear to have been supported by these results, which were obtained in a very different manner. Along with
the procedural information analyzed from the CMS data files (described in the following chapter), the
results all seem to validate the usefulness and quality of the RAD practice analysis data as it was
collected. The managers’ survey provides some unique perspectives on the procedural information gathered
during the RAD practice analysis. If an easier mechanism could be devised to gather similar data, so that
the burden of completing the survey could be reduced, then perhaps the return rate and representativeness
of this type of survey could be improved. The CMS data also have great potential in the conduct of future
practice analyses.
Finally it should be clearly noted that any data collected and summarized for evaluation in practice
analysis studies is generally based on what “most” radiographers are doing. This is the purpose of careful
sampling plans for the survey participants. There will always be some variations among workplace
environments in terms of the kinds of knowledge, skills, and abilities that radiographers may need in
order to be successful in practice. It is important, however, to recognize that certification needs to be
representative of the breadth of potential work environments in which radiologic technologists may
find themselves.
CHAPTER 5
USE OF THE CENTERS FOR MEDICARE AND MEDICAID SERVICES (CMS) DATA
TO SUPPLEMENT THE RADIOGRAPHY PRACTICE ANALYSIS
Description of the Data
The Centers for Medicare & Medicaid Services (CMS) publishes data annually on Medicare and
Medicaid utilization. There are numerous types of data available, but the data available for public
purchase are, for confidentiality reasons, devoid of information that could be used to identify either
patients or individual caregivers. The particular database that the ARRT purchased was the 2008
Physician/Supplier Procedure Summary database. The critical piece of information in this database, for
ARRT’s use, is the number of times that all providers billed a given procedure to the U.S. government. The
main objective of obtaining the CMS data was to compare the number of billed procedures with the
frequency of procedures as indicated by the latest Radiography practice analysis survey (RAD PA), which
was sent out in the first quarter of 2009. There were two reasons to compare the RAD PA results with the
2008 CMS data. First, because the RAD PA data were gathered early in 2009, the ARRT staff felt that
comparing the RAD PA to the 2008 CMS data was the most valid. Second, 2008 was the newest CMS
data available for analysis, because CMS generally has a six to seven month data time lag before a new
database is available.
One must consider several issues when comparing the CMS and RAD PA data. First, the CMS data
are the raw number of procedures billed. The RAD PA data are the relative frequency of procedures as
indicated by radiographers (daily, weekly, etc.). Second, the CMS data do not indicate whether or not
entry-level radiographers are conducting the given procedures for reasons of medical staff confidentiality.
The RAD PA data do indicate which procedures entry-level radiographers conduct, because the ARRT is
directly targeting those technologists and asking them which procedures they conduct. Finally, the CMS
data account only for Medicare- and Medicaid-billed procedures. The RAD PA data include Medicare-,
Medicaid-, and private insurance-billed procedures.
CMS Descriptive Statistics
The ARRT staff compiled the number of times that 55 different radiologic procedures were billed to
CMS. The RAD PA also contained these 55 procedures. Below is a table of basic descriptive statistics
for the CMS data. The mean (arithmetic average) number of billed procedures was much larger than the
median number of times billed. This occurred because of the maximum value of nearly 37 million billings,
which corresponded to chest radiographs. The number of times that chest radiographs were billed was
about 30 million greater than the second most frequently billed procedure. The chest radiograph was
quite clearly an extreme outlier. Because of the great influence of this outlier, all analyses comparing the
CMS and RAD PA data were conducted using the rank ordering of the procedures. Using rankings
minimized the extreme influence of the single outlying procedure.
Table 5-1
Basic Descriptive Statistics for the 2008 CMS Number of Procedures Conducted

   Statistic        Number Billed
   Mean                 1,721,147
   Median                 211,108
   Minimum                  7,105
   Maximum             36,996,100
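To illustrate why ranks were used (a sketch only; the small data set below mixes the minimum, median, and
maximum from Table 5-1 with made-up filler counts), note how a single extreme value dominates the mean but
not the median, and how converting to ranks removes its leverage entirely:

    import numpy as np
    from scipy.stats import rankdata

    # Billing counts with one extreme outlier. The minimum, median, and
    # maximum come from Table 5-1; the other values are placeholders.
    billed = np.array([7_105, 95_000, 211_108, 480_000, 900_000, 36_996_100])

    print(np.mean(billed))    # pulled far upward by the single outlier
    print(np.median(billed))  # barely affected by the outlier

    # Ranks replace magnitudes, so the outlying procedure is simply the
    # "largest" (rank 6) and no longer dominates cross-source comparisons.
    print(rankdata(billed))   # [1. 2. 3. 4. 5. 6.]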
Comparing CMS with the RAD PA
Figure 5-1 is a scatterplot of the RAD PA data, sorted by the percentage of radiographers indicating
responsibility for a procedure, against the CMS data. There is strong agreement between the RAD PA data and
the CMS data for the radiographic procedures (shown in blue). The two data sources disagree concerning the
CT procedures (shown in red): the CMS data ranked CT procedures highly, but it does not appear that
entry-level radiographers are responsible for CT procedures very often. The Spearman correlation, a
statistical index of rank agreement ranging from −1 to 1, tells the same story. The Spearman correlation for
only the radiography procedures is 0.67, but including the CT procedures decreases the correlation to 0.35,
an extremely large drop.
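As an illustrative sketch of how such a Spearman comparison behaves (with hypothetical values, not the
actual 55-procedure data), scipy’s implementation ranks the inputs internally, and adding a pair of
procedures on which the two sources strongly disagree pulls the coefficient down sharply:

    from scipy.stats import spearmanr

    # Hypothetical frequency measures for a few radiographic procedures
    # (placeholders only); spearmanr ranks the values internally.
    cms_billed_counts = [910_000, 720_000, 550_000, 310_000, 120_000, 40_000]
    rad_pa_freq_measure = [0.80, 0.95, 0.60, 0.35, 0.40, 0.10]
    rho_radiographic, _ = spearmanr(cms_billed_counts, rad_pa_freq_measure)

    # Add two CT-like procedures: billed very often in the CMS data but
    # rarely performed by entry-level radiographers in the RAD PA data.
    rho_with_ct, _ = spearmanr(cms_billed_counts + [8_000_000, 6_500_000],
                               rad_pa_freq_measure + [0.05, 0.02])

    print(f"without CT: {rho_radiographic:.2f}, with CT: {rho_with_ct:.2f}")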
[Figure 5-1. Rank plot of the 2008 CMS ranking and the RAD PA “Responsible” ranking, CT procedures in red.
Horizontal axis: 2008 CMS rank; vertical axis: RAD PA “Responsible” rank.]
Figure 5-2 is a scatterplot of the RAD PA data, sorted by the Rasch model overall frequency, against the
CMS data. There again appears to be high agreement between the RAD PA data and the CMS data for the
radiographic procedures but disagreement concerning the CT procedures. The Spearman correlation for the
radiography procedures is 0.82, but including the CT procedures decreases the correlation to 0.52, which is
again an extremely large drop.
[Figure 5-2. Rank plot of the 2008 CMS ranking and the RAD PA Rasch frequency ranking, CT procedures in red.
Horizontal axis: 2008 CMS rank; vertical axis: RAD PA Rasch frequency rank.]
The extremely high agreement between the CMS data and the RAD PA data concerning non-CT
procedures is uncanny. The Spearman correlation of 0.82 for the frequency data is much higher than
typically observed in scale validity studies such as this one, especially considering the sizable differences
in the two types of data as previously discussed. This confirms that the data-gathering and analysis
methodologies of the ARRT’s practice analyses are quite strong. The disagreement between the CMS
and RAD PA data concerning the CT procedures indicates that entry-level radiographers are not the
personnel doing most of the work in CT at this time. CT procedures are undoubtedly among the most
frequently conducted medical imaging procedures today, but these data indicate that CT does not
currently play an equally prominent role in the practice responsibilities of most entry-level radiographers.
The analysis of the spring 2008 CMS data was quite helpful in providing an independent validation
of how radiographic procedures would be rank-ordered. As demonstrated in the analysis described in this
chapter, the relationship between the rank-ordering of procedures based on the original RAD practice
analysis survey data and the CMS data was very strong (when CT procedures were excluded; CMS data
does not have any way to differentiate what type of technologist is performing any procedure). The
relationship between the CMS data and the original RAD PA survey data is shown in Figure 5-1.
An additional future use of longitudinal CMS data will also be as a mechanism to help identify
procedures that are increasing or decreasing in frequency. This could be helpful in identifying procedures
to ask about on interim practice analysis updates.
CHAPTER 6
OVERVIEW OF SUPPLEMENTAL DATA
AND FOLLOW-UP PRACTICE ANALYSIS ADVISORY COMMITTEE MEETING
The supplemental data were helpful in gaining additional perspective on the radiography practice analysis.
In particular, they helped to corroborate the adequacy of the returned sample for the RAD practice analysis
survey. All of these data assisted the RAD practice analysis Advisory Committee in a final consideration of
their recommendations to the ARRT Board of Trustees.
RAD PA Advisory Committee’s May 2010 Meeting
The RAD practice analysis Advisory Committee was reconvened to consider the new data as it related to the
original data collected. The additional data came from two supplementary sources: (1) radiology managers,
and (2) the Centers for Medicare & Medicaid Services (CMS). Attempts were also made to help clarify the
presentation of the original survey results.
The Advisory Committee’s original recommendations regarding the tasks to cover on the examination were
based on how frequently radiographers performed procedures on a daily and weekly basis. At the spring
meeting the Advisory Committee recommended the general guideline of covering on the exam those tasks and
procedures that at least 40% of radiographers indicated they were responsible for performing, with the
caveat that the frequency of performance for these tasks and procedures should be “put on watch” and
revisited during the next interim update to the practice analysis.
Procedures that less than 40% of radiographers reported being responsible for performing were, in most
cases, not included at this time. An example of an exception to this 40% guideline is “vital signs” as the
Advisory Committee felt that in emergency situations it is important for a radiographer to have this skill.
Procedures that more than 40% of radiographers reported as being their responsibility, but that less than
20% reported performing on at least a daily or weekly basis (e.g., AC joints, mandible, zygomatic arches,
TMJs, IVU, cystourethrography, retrograde pyelography, and myelography), were recommended to be covered on
the exam but will be “put on watch” and revisited during the interim update process to determine whether
practice patterns have changed such that the guidelines for responsibility and frequency are met. Other
tasks that were not recommended for inclusion on the exam at this time, but which will also be “put on
watch” and revisited during the interim update, are those for which less than 40% of radiographers
indicated responsibility but which were performed on a daily or weekly basis by more than 20% of those who
did report responsibility. This category included CT procedures.
Decision Guidelines

                                      Less than 40%                Greater than 40%
                                      Responsible                  Responsible
   Frequency less than 20%            Exclude from Exam            Include on Exam but
   daily or weekly                                                 "On Watch"
   Frequency greater than 20%         Exclude from Exam but        Include on Exam
   daily or weekly                    "On Watch"
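A minimal sketch of this guideline expressed as a decision rule follows (a hypothetical helper written for
illustration, not part of the ARRT’s actual tooling; the 40% and 20% thresholds come from the table above):

    def exam_decision(pct_responsible: float, pct_daily_or_weekly: float) -> str:
        # pct_responsible: percent of radiographers reporting responsibility.
        # pct_daily_or_weekly: percent reporting daily or weekly performance.
        if pct_responsible >= 40.0:
            # Above the responsibility cut-off: covered on the exam; lower-
            # frequency items are flagged for the next interim update.
            return "include" if pct_daily_or_weekly >= 20.0 else "include, on watch"
        # Below the responsibility cut-off: excluded, but higher-frequency
        # items are revisited at the next interim update.
        return "exclude, on watch" if pct_daily_or_weekly >= 20.0 else "exclude"

    # A procedure with 45% responsible but performed daily/weekly by only 15%
    # of radiographers would be included on the exam but placed "on watch".
    print(exam_decision(45.0, 15.0))   # include, on watch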
The points below summarize the major activities accomplished at the May 22, 2010 practice analysis
advisory committee meeting.
1. The Advisory Committee discussed new data received from a managers’ survey conducted during the winter
   of 2010, and data received from CMS for 2009. Based on the strength of the original data and the Board’s
   concern over the Advisory Committee’s previous recommendation to remove 19 procedures from the task
   inventory, the Advisory Committee revisited the original 2009 radiographers’ data. The Advisory
   Committee’s previous decisions were based on frequencies, in other words, how often procedures are
   performed by radiographers on a daily and weekly basis. At this meeting the Advisory Committee decided
   to include tasks and procedures that at least 40% of radiographers indicated they were responsible for
   performing. In addition, any such tasks and procedures with frequencies below 20% on a daily and weekly
   basis should be included at this time but reconsidered at the next interim update. The Advisory
   Committee recommends that future practice analyses use the same criteria.
2. Task Inventory: Based on the 40% responsibility cut, the task inventory was revisited. The Advisory
   Committee recommends removing venography from the original 2005 Task Inventory, as only 20.3% indicated
   responsibility.
   a. Three tasks that did not meet the 40% cut-off, but were kept, are:
      • Perform venipuncture – 38.9% responsible: The Advisory Committee felt that entry-level
        radiographers need this skill for IV contrast administration. They may not perform it at their
        current workplace because of laws or institutional requirements, but could later change employment
        and be required to perform venipuncture.
      • Operate tomography unit – 37% responsible: This knowledge is important when performing IVPs, for
        which 49.2% of radiographers indicated that they were responsible.
      • Obtain vital signs – 30% responsible: This is a basic patient care skill that is required in an
        emergency situation.
   b. Several new tasks were added to the new task inventory based on the survey results, such as:
      • Prior to administration of contrast agent, determine if patient is at increased risk of adverse
        reaction (preparatory medication reconciliation).
      • Perform post-processing on digital images in preparation for interpretation (e.g., exposure
        indicator, brightness/contrast, window and level).
      • Add electronic annotations on digital images to indicate position or other relevant information
        (e.g., time, upright, decubitus, post-void).
      • Perform routine maintenance on digital equipment.
   c. Some tasks related to film/screen radiography were dropped, such as:
      • Tasks related to darkroom maintenance, such as daily sensitometry, screen-film contact, and
        safelight fog, which were all well below the 40% responsible cut-off mark.
      • Perform basic evaluations of radiographic equipment and accessories (e.g., beam restriction, beam
        alignment), which was reported by less than 40% of radiographers as their responsibility.
   d. CT procedures were discussed, but the Advisory Committee concluded that, using a 40% responsibility
      cut-off, they could not justify adding them to the task inventory at this time.
e. Interim Update: It is recommended that procedures with less than 20% of respondents
indicating performance on a daily or weekly basis should be included on the interim update
survey: acromioclavicular joints, mandible, zygomatic arches, temporomandibular joints,
bone age, long bone measurement, cystourethrography, intravenous urography, arthrography,
retrograde pyelography, and myelography. CT procedures should also be included in the
interim update survey.
3. Clinical Competency Requirements: Surgical cholangiography and retrograde pyelography were
replaced with two C-Arm procedures, orthopedic and non-orthopedic, to give candidates more
flexibility. The number of mandatory procedures dropped from 36 to 31 and the number of
electives increased from 30 to 35. Candidates must, however, perform at least one elective
procedure from both the head and fluoroscopy sections.
4. Content Specifications: The content specifications were revised to reflect the changes to the task
inventory. The Advisory Committee recommended adding a topic to the content specifications in
section A. Radiation Protection, called “Medical Exposure of Patients” referenced to NCRP
#160. Another area in section E. Patient Care and Education, that has been on the content
specification since 2005 is “respond to inquiries about other health care related services such as
CT, MRI, mammography, etc”. The Advisory Committee recommended that this area include
topics regarding dose differences between CT and radiography.
The major topic weights
remained as previously recommended:
   CONTENT CATEGORY                                  PERCENT OF TEST        NUMBER OF QUESTIONS
                                                     (2005)    (2012)       (2005)    (2012)
   A. Radiation Protection                            20%       22.5%         40        45
   B. Equipment Operation and Quality Control         12%       11.0%         24        22
   C. Image Production Acquisition and Evaluation     25%       22.5%         50        45
   D. Radiographic Image Procedures                   30%       29.0%         60        58
   E. Patient Care and Education                      13%       15.0%         26        30
   Total                                             100%                    200
CHAPTER 7
REVISION OF TASK INVENTORY, CONTENT SPECIFICATIONS AND
CLINICAL COMPETENCY REQUIREMENTS
Overview
The previous chapter presented the data obtained from administering the practice analysis survey to a
national sample of ARRT registered radiographers. This chapter describes the process for using that data to
revise the task inventory, update the content specifications, and revise the clinical competency requirements.
As noted in Chapter 1 of this report, the purpose of conducting the practice analysis survey is to assure that
the content specifications and clinical competency requirements are job related. The first step in drafting the
content specifications and clinical competency requirements is to establish the task inventory based on the
results of the practice analysis.
Table 7-1 lists the key meetings and activities required to complete the project after the initial January
2009 Advisory Committee meeting. The continuing text then summarizes the process for carrying out the
activities.
Table 7-1. Key Meetings Required to Complete Project

   June 2009      Advisory Committee meets to review survey results and edit task inventory; update
                  clinical competency requirements; and revise content specifications.
   July 2009      Draft clinical competency requirements and content specifications mailed to professional
                  community for review and comment.
   October 2009   Advisory Committee meets to review professional community comments, revise clinical
                  competency requirements and content specifications.
   January 2010   Board reviews task inventory, content specifications, clinical competency requirements
                  and requests additional data.
   March 2010     Data gathered from CMS and radiology managers.
   May 2010       Advisory Committee meets to review CMS and radiology manager data, edit task inventory,
                  revise clinical competency requirements and content specifications.
Revision of the Task Inventory
The Advisory Committee initially reviewed the survey results and revised the task inventory at
their January 2009 meeting to include: (a) tasks on the original inventory that were intentionally excluded
from the survey because they are job requirements (e.g., wears a film badge); (b) tasks on the survey
performed by at least 40% of the survey group; (c) tasks not meeting the 40% criterion but which the
Advisory Committee felt should nonetheless be included. The Advisory Committee noted that the data
showed a significant drop in the number of radiographers who use film-screen imaging; however, the
Advisory Committee recommended that—for the time being—those topics should remain.
Based upon the premise of eliminating activities performed by less than 40% of the survey group,
the Advisory Committee recommended the deletion of 19 imaging procedures including: swallowing
dysfunction study, surgical cholangiography, ERCP, cystography, IVU, zygomatic arches, arthrogram,
and myelogram.
Discussion of adding CT procedures to Task Inventory. The Advisory Committee considered
adding CT procedures (head, neck, chest, abdomen, and pelvis) to the task inventory, based on survey
data that showed that these CT procedures were performed daily or weekly by roughly 25%-30% of
radiographers (see Table 7-2). The Advisory Committee also took into consideration recommendations
from the report Computed Tomography in the 21st Century: Changing Practice for Medical Imaging and
Radiation Therapy Professionals, by Sal Martino, Jerry Reid, and Teresa G. Odle, which states:
“There are not enough technologists educated in CT to provide adequate staffing
for CT coverage around the clock, particularly in smaller and rural facilities. This
is further compounded by the increasing number of orders written for CT scans.
The previous consensus statement in education and practice addressed the need for
education and training of radiographers who perform CT procedures. This
statement addresses the consequences of the current lack of education. In fact, it
embraces many of the concerns the panel discussed surrounding education,
certification and availability of radiographers in the present and future to
adequately perform the growing number of CT scans.”
The Advisory Committee felt that many of the radiography procedures that radiographers were no
longer performing were now being imaged with CT and that the number of radiographers being asked to
perform CT is on the rise. Without training in radiation safety and protection that is specific to CT the
public could be at risk when a radiographer performs a scan.
Table 7-2. CT Procedures Being Performed by Radiographers

   2011 Survey Activities                                  % Performing     % Not Responsible
   Position patient and operate CT scanner to produce
   the following diagnostic images:
      a. Head                                                  27.6%              69.4%
      b. Neck                                                  23.8%              71.0%
      c. Chest                                                 24.6%              71.7%
      d. Abdomen                                               26.1%              70.3%
      e. Pelvis                                                25.7%              70.8%
Although the percentage of radiographers performing specific CT procedures fell below the 40% cut-off, the
demographics portion of the survey, as shown in Table 7-3, indicated that the percentage of radiographers
who reported spending some portion of their time in CT (about 41%) was over the initially designated
criterion of 40% of the survey group.
Table 7-3. Radiographers Time Spent in CT

   % of Time in CT        %
   0.0%                  59%
   1 – 5%                 8%
   6 – 25%                9%
   26 – 50%              23%
   51 – 75%               6%
   76 – 100%              7%
The Advisory Committee also considered the reported lack of specific CT education and training
of those radiographers who indicated they performed CT studies. Table 7-4 shows the survey results
regarding education or training. Thirty percent of the radiographers that responded to this question
indicated that they were trained on the job and 12% indicated that they received no training. A mere 3%
indicated some formal training in CT.
Table 7-4. CT Education or Training Received

   Application Specialist in workplace         8%
   Continuing Education on your own            9%
   Formal Course                               3%
   On the job by other CT Technologists       30%
   None                                       12%
New Decision Guidelines Evolve. At their January 2010 meeting, the Board of Trustees reviewed
the recommendations of the Advisory Committee to add CT procedures and remove 19 radiography
procedures that were infrequently performed, and requested that the Advisory Committee look at
additional data to support their recommendation.
The Advisory Committee reconvened in May of 2010 and reviewed additional data from the Centers for Medicare
& Medicaid Services (CMS), data from radiology managers, and feedback from the ARRT Board of Trustees. The
additional data supported their initial recommendations; however, the Committee discussed the issue of
“responsibility” versus “frequency” and reconsidered their previous decision to eliminate the less
frequently performed activities, deciding instead to look at the percent of radiographers responsible for
procedures. The Advisory Committee determined that they would use the decision guidelines outlined in
Table 7-5.
Table 7-5. Decision Guidelines

                                      Performed Less than             Performed Greater than
                                      20% Daily or Weekly             20% Daily or Weekly
   Less than 40% Responsible          Exclude from Task Inventory     Exclude from Task Inventory but
                                                                      resurvey during next interim update
   Greater than 40% Responsible       Include on Task Inventory and   Include on Task Inventory
                                      resurvey during next interim
                                      update
Based on the new decision guidelines, the Advisory Committee ultimately recommended that CT procedures
should not be included in the task inventory at this time; however, the issue should be
readdressed during the next interim update. In addition, they recommended that only venography should
be removed from the task inventory as 79.7% indicated having no responsibility for the procedure.
Final Recommendations. Tasks that did not meet the new decision guidelines but that the Advisory
Committee felt should nonetheless be included, are listed in Table 7-6 along with the rationale used for
retaining them.
Table 7-6. Retained Tasks that did NOT meet the Decision Guidelines

   2012     2011 Survey Activities        Rationale                                   %            % Not
   Task #                                                                             Performing   Responsible
   19.      Perform venipuncture for      May not do because of state laws or           24.7%         61.1%
            contrast administration.      institutional requirements, but could
                                          get a job where it is required
   22.      Obtain vital signs.           Need to know for emergency situations;        18.5%         69.7%
                                          basic patient care
   14b.     Operate tomography unit       This knowledge is important when               1.0%         63.0%
                                          performing IVPs, for which 49.2% of
                                          radiographers indicated that they were
                                          responsible
Four new activities, as shown in Table 7-7, were added to the 2012 Task Inventory. These tasks were
added to the initial survey and met the new decision guidelines.
Table 7-7. New Tasks

   2012     2011 Survey Activities                                            %            % Not
   Task #                                                                     Performing   Responsible
   16.      Prior to administration of contrast agent determine if patient      57.0%         29.1%
            is at increased risk of adverse reaction (preparatory
            medication reconciliation).
   44.      Perform post-processing on digital images in preparation for        81.5%         16.0%
            interpretation (e.g., exposure indicator, brightness/contrast,
            window and level).
   46.      Add electronic annotations on digital images to indicate            87.1%         10.3%
            position or other relevant information (e.g., time, upright,
            decubitus, post-void).
   58.      Perform routine maintenance on digital equipment.
Final Approval. The Board of Trustees, at their July 2010 meeting, approved the new decision
guidelines along with the recommendation that activities and procedures with less than 20% of respondents
indicating performance on a daily or weekly basis should be included on an interim update survey:
acromioclavicular joints, mandible, zygomatic arches, temporomandibular joints, bone age, long bone
measurement, cystourethrography, intravenous urography, arthrography, retrograde pyelography,
myelography, and CT procedures.
Ultimately, the only procedure removed from the previous task
inventory was venography. The 2012 Task Inventory for Radiography is located in Appendix E.
Updating the Content Specifications
The revision of the content specifications was based on changes made to the task inventory. For
instance, since venography was eliminated from the task inventory because less than 40% indicated
responsibility, it was also removed from the procedures section in the content specifications. Other
revisions to the content outline included the reorganization of certain topics and the elimination of some
film-related content. The most notable were:
   • In Section A, Radiation Protection, a topic called “Medical Exposure of Patients,” referenced to
     NCRP #160, was added.
   • Photon ‘Interactions with Matter’ and its subcategories were moved from Section B to Section A,
     ‘Biological Aspects of Radiation’. The Advisory Committee felt this was a more appropriate place for
     this topic.
   • Section B.2.D., ‘Image Display’, was moved to Section C. The Advisory Committee felt this was a more
     appropriate place for this topic.
   • The ‘Image Acquisition’ section (new B.2.D.) was renamed ‘Components of Digital Imaging’ (CR and DR)
     and additional detail was added.
   • Film-screen receptors were removed from Section B.3.C., and ‘display monitor quality assurance’ was
     added.
   • Section C.1.D. was changed from ‘Image Receptors’ to ‘Digital Imaging Characteristics’. Film-screen
     topics were eliminated from this section.
   • Section C.2.B. was renamed ‘Film-Screen Processing’ and over half of the film-processor topics were
     eliminated.
   • Section C.2.D., ‘Image Display’, was moved from Section B.
   • Section E.6, Pharmacology, was created using topics from the ‘Contrast Media’ section.
   • Section E, Patient Care and Education, contains a topic “respond to inquiries about other health care
     related services, such as CT, MRI, mammography, etc.” A topic was added to this section regarding dose
     differences between CT and radiography.
To ensure that the content specifications are job related, the Advisory Committee participated in a
linkage activity. For every activity in the task inventory, the Advisory Committee was asked to consider
the knowledge and skill required to successfully perform that task and to verify that the topic was
addressed in the content specifications. In other words, if one’s knowledge of a topic would have an
impact on the proficiency with which a task is performed, that topic should be included in the content
specifications. Topics were similarly scrutinized for practice relevance. That is, topics that could not be
linked to practice were not included on the content outline. Each task was reviewed and then linked to the
appropriate topic in the content specifications. The task inventory lists the content codes that indicate
their linkage to the content specifications.
Assignment of Weights. As a final step in revising the radiography content specifications, the
Advisory Committee established weights to indicate the percentage of test questions that should be
allocated to each section. The process for establishing the weights involved both independent judgments
and consensus building. Each member was first asked to independently assign weights to each major
section of the content outline and then to the subcategories within those sections. The survey form
presented in Appendix J was used for collecting ratings from each Advisory Committee member. During
this process, members were asked to consider their own experience as well as the Practice Analysis
survey ratings. The weights assigned by individual Advisory Committee members were averaged and
later discussed by the entire Advisory Committee.
While all Advisory Committee members were
encouraged to discuss the weights they had assigned, those providing particularly low or high values for a
given category were specifically asked to explain their rationale. The discussion for a given section of the
content outline continued until the Advisory Committee reached agreement on a set of weights for that
particular section. This process proceeded section by section, until the entire content outline was covered.
In addition, input from radiography program directors who were surveyed during the professional
comment process was considered during the final weighting exercise. Please refer to Appendix I for a
summary of the professional comment process. The weights for each content category, which were
originally expressed in terms of percentage of items, were then converted to numbers of items. The final
number of questions in each content area is noted in Table 7-8, along with the number of questions from
the previous practice analysis.
Table 7-8. Final Number of Questions in Content Categories

   Content Category                       2005 number of questions    2012 number of questions
   A. Radiation Protection                          40                          45
   B. Equipment Operation & QC                      24                          22
   C. Image Acquisition & Evaluation                50                          45
   D. Image Procedures                              60                          58
   E. Patient Care and Education                    26                          30
   Total                                           200                         200
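The percent-to-item conversion described above is simple arithmetic on the 200-item examination; the sketch
below reproduces the 2012 counts in Table 7-8 from the approved category weights:

    # Approved 2012 category weights (percent of test) applied to the
    # 200-item radiography examination.
    weights_2012 = {
        "A. Radiation Protection": 22.5,
        "B. Equipment Operation & QC": 11.0,
        "C. Image Acquisition & Evaluation": 22.5,
        "D. Image Procedures": 29.0,
        "E. Patient Care and Education": 15.0,
    }

    TOTAL_ITEMS = 200
    for category, pct in weights_2012.items():
        items = round(TOTAL_ITEMS * pct / 100)   # e.g., 22.5% of 200 = 45
        print(f"{category}: {items} questions")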
Final Approval. The Advisory Committee reviewed the content specifications, weights, and
comments from the professional community and recommended a final version to the Board of Trustees
for approval in Spring 2010. The Board of Trustees approved the content specifications effective January
2012. The 2012 Content Specifications for the Examination in Radiography, which includes the numbers
of items for each topic, is in Appendix F.
Revision of the Clinical Competency Requirements
The purpose of the clinical competency requirements is to ensure that individuals certified by
ARRT have competently performed a core set of procedures that comprise a modality. When establishing
the clinical competency requirements, the Advisory Committee focused on those procedures in the task
inventory typically performed by most entry-level technologists. The Advisory Committee also added
more flexibility to the document by increasing the number of elective procedures. Some minor changes
were made, most notably:
   • Surgical cholangiography and retrograde pyelography were replaced with two C-Arm procedures,
     orthopedic and non-orthopedic, to give candidates more flexibility.
   • The number of mandatory procedures dropped from 36 to 31 and the number of electives increased from
     30 to 35.
   • Candidates must perform at least one elective procedure from both the head and fluoroscopy sections.
   • Of the electives from the fluoroscopy section, the candidate must perform either an upper GI or a
     barium enema.
Final Approval. The Board of Trustees approved the document effective January 2012. A final
copy of the 2012 Radiography Didactic and Clinical Competency Requirements appears in Appendix G.
Appendix A
Radiography Practice Analysis
Staff Radiographer Survey Questionnaire
RADIOGRAPHY
PRACTICE ANALYSIS QUESTIONNAIRE
Dear Registered Technologist:
The American Registry of Radiologic Technologists is revising the content specifications and clinical
competencies for the examination in radiography. It is our philosophy that a certification exam should be
based on the job responsibilities of practicing technologists. Therefore, we are asking a select group
of technologists to inform us about the typical job duties, types of equipment, and current radiographic
procedures in today’s workplace.
You are one of the carefully selected professionals from whom the ARRT is requesting input. On the
enclosed questionnaire, we have assembled lists of activities that may be performed by radiologic
technologists. Our goal is to focus on various aspects of radiography practice; however, the list of
activities is not exhaustive. In order to shorten the questionnaire, many important activities that may be
part of your day-to-day responsibilities have been omitted. Since this questionnaire is being sent to only a
sample of technologists across the country, rather than to all technologists, it is important that you return it.
Your answers represent hundreds of your colleagues.
Please complete the questionnaire and return it within one week in the enclosed postage paid envelope.
It should take less than 25 minutes to answer the questions. Simply enclose the questionnaire, seal the
envelope, and drop it in the mail.
You may be assured of the complete confidentiality of your responses. Individual responses will not be
released to anyone under any circumstances.
Thank you very much for taking time from your busy schedule to assist the ARRT with this project. Your
participation helps to assure the integrity of the certification process.
Respectfully,
Jerry B. Reid, PhD
Executive Director
March, 2009
• Please use #2 pencil or blue or black pen to complete this survey.
• Do not use red pencil or ink.
• Do not use X's or check marks to indicate your responses.
• Fill response ovals completely with heavy, dark marks.
SECTION 1: TASK INVENTORY
Directions: This section contains a list of tasks and procedures performed by radiographers. Our goal is to determine
how often you perform each task or procedure. Please mark the oval that best approximates how often you perform
each task or procedure. If you are NOT responsible for a procedure, just mark NR and proceed to the next one. Mark
only one oval for each item. Thank you for your valuable input.
Please fill in only one oval per item.
D – Daily: on average, at least once a day
W – Weekly: on average, at least once a week
M – Monthly: on average, at least once a month
Q – Quarterly: on average, quarterly or less often
NR – Not responsible: not responsible for performing
1. Sequence imaging procedures to avoid residual contrast material affecting future exams.
2. Communicate scheduling delays to waiting patients.
3. Verify or obtain patient consent as necessary (e.g., contrast studies).
4. Prior to administration of contrast agent, gather information to determine appropriate dosage.
5. Prior to administration of contrast agent determine if patient is at increased risk of adverse
reaction (preparatory medication reconciliation).
6. Confirm type of contrast media and prepare for administration.
7. Perform venipuncture for contrast administration.
8. Administer IV contrast media.
9. Observe patient after administration of contrast media to detect adverse reactions.
10. Obtain vital signs.
11. Clean, disinfect or sterilize facilities and equipment, and dispose of contaminated items
in preparation for next examination.
12. Document required information on patient’s medical record (e.g., radiographic requisitions,
radiographs).
a. on paper
b. electronically
13. Determine appropriate exposure factors using:
a. Fixed kVp technique chart
b. Variable kVp technique chart
c. Calipers (to determine patient thickness for exposure)
14. Select radiographic exposure factors.
a. Automatic Exposure Control (AEC)
b. kVp and mAs (manual or set by hand)
c. Pre-programmed techniques
15. Operate radiographic unit and accessories.
a. Fixed unit
b. Mobile unit (portable)
16. Select radiographic exposure factors.
a. Digital fluoroscopic unit
b. Non-digital fluoroscopic unit
c. Fixed fluoroscopic unit
d. Mobile fluoroscopic unit (C-arm)
e. Mobile vascular fluoroscopic unit (C-arm)
17. Operate specialized imaging units.
a. Dedicated chest unit
b. Tomography unit
c. Mammography unit
d. Bone densitometry unit
e. Panorex unit
18. Operate electronic imaging and record keeping devices.
a. Computerized Radiography (CR)
b. Direct Digital Radiography (DR)
c. Picture Archival and Communication System (PACS)
d. Film Digitizer
e. Hospital Information System (HIS)
f. Radiology Information System (RIS)
19. Perform post-processing on digital images in preparation for interpretation (e.g., exposure
indicator, brightness/contrast, window and level).
20. Use laser printer to print hard copy images.
21. Add electronic annotations on digital images to indicate position or other relevant
information (e.g., time, upright, decubitus, post-void).
22. Use film-screen cassettes and automatic film processing.
23. Determine corrective measures if radiographic image is not of diagnostic quality and
take appropriate action.
Position patient, x-ray tube, and image receptor to produce the following diagnostic images:
24. Chest
25. Ribs
26. Sternum
27. Soft tissue neck
28. Abdomen
29. Esophagus
a. Assist with examination
b. Post fluoroscopy radiographs/images
30. Swallowing dysfunction study
31. Upper GI series, single or double contrast
a. Assist with examination
b. Post fluoroscopy radiographs/images
32. Small bowel series
33. Barium enema, single or double contrast
a. Assist with examination
b. Post fluoroscopy radiographs/images
34. Surgical cholangiography
35. ERCP
36. Cystography
37. Cystourethrography (voiding)
Position patient, x-ray tube, and image receptor to produce the following diagnostic images
(continued):
38. Intravenous urography
39. Retrograde pyelography
40. Cervical spine
41. Thoracic spine
42. Scoliosis series
43. Lumbar spine
44. Sacrum and coccyx
45. Sacroiliac joints
46. Pelvis and hip
47. Skull
48. Facial bones
49. Mandible
50. Zygomatic arches
51. Temporomandibular joints
52. Nasal bones
53. Orbits
54. Orbits for foreign body
55. Paranasal sinuses
56. Toes
57. Foot
58. Calcaneus (os calcis)
59. Ankle
60. Tibia, fibula
61. Knee
62. Patella
63. Femur
64. Fingers
65. Hand
66. Wrist
67. Forearm
68. Elbow
69. Humerus
70. Shoulder
71. Scapula
72. Clavicle
73. Acromioclavicular joints
74. Bone survey
75. Long bone measurement
76. Bone age
77. Soft tissue/foreign body
Assist radiologist with the following invasive procedures:
78. Arthrography
79. Myelography
80. Venography
81. PICC line insertion assistance
82. Position patient and operate MRI scanner to produce diagnostic images.
83. Position patient and operate CT scanner to produce the following diagnostic images:
a. Head
b. Neck
c. Chest
d. Abdomen
e. Pelvis
f. Biopsy
g. Other (please fill in)
84. What education/training did you receive before performing CT scans? Mark all that apply or leave blank if you do not
perform CT.
Application specialist in workplace
Continuing education on your own
None
Formal course through an educational program
On the job by other CT technologists
SECTION 2: EQUIPMENT MAINTENANCE
Directions: This section contains a list of quality assurance tasks or procedures performed by radiographers. Our
goal is to determine if you perform each task or procedure. If you are NOT responsible for a procedure, just mark
NR and proceed to the next one. You may select more than one oval per item.
May select more than one oval per item.
R – Review results
P – Personally perform
D – Delegate or request someone else
NR – Not responsible for this procedure
EXAMPLE: This is how you would respond if you delegated or requested someone else to perform and you reviewed results.
85. Perform basic evaluations of radiographic equipment and accessories.
a. Beam restriction system
b. Beam alignment
c. Source-to-image receptor distance indicator
d. Radiation protection devices (lead aprons and gloves)
86. Perform routine maintenance on digital equipment.
a. Perform start-up or shut-down
b. Erase CR plate
c. Equipment cleanliness (e.g., imaging plates, CR cassettes)
d. Recognize and report malfunctions
e. Perform laser printer quality control
87. Perform basic evaluations of film-processing equipment and accessories.
a. Darkroom fog (e.g., safelight, light leak)
b. Screen cleanliness
c. Screen-film contact
d. Daily sensitometry
88. The following questions refer to terminology used in the workplace when working with radiologists and other
radiographers. Fill in one oval for each question.
   a. In discussing DOSE, which term do you use more frequently?              gray / rad
   b. In discussing DOSE EQUIVALENT, which term do you use more frequently?   sievert / rem
   c. When referring to SID, which term do you use more frequently?           centimeter / inch
SECTION 3: DEMOGRAPHIC AND WORK EXPERIENCE
Directions: The following questions refer to your workplace(s) in radiography. If you work at one job, consider
it your primary workplace and leave the secondary and other columns blank. If you have a second job, consider
it your secondary workplace and answer each question accordingly. The ‘Other’ column is for those who may
have a third radiography workplace. For each question, mark P (Primary Workplace), S (Secondary Workplace),
or O (Other/Third Workplace).

EXAMPLE: If you are employed 30 hours per week at a hospital, 10 hours per week at a free-standing clinic,
and work occasional weekends at a third hospital, you should complete the form as follows.

0. Which of the following best describes your place of employment?
   a. Hospital/medical center
   b. Physician group practice/clinic
   c. Free-standing imaging center
   d. Other

1. Which of the following best describes your place of employment?
   a. Hospital/medical center
   b. Physician group practice/clinic
   c. Free-standing imaging center
   d. Other

2. If you work in a hospital/medical center, what is its approximate size (number of beds)?
   a. less than 100
   b. 100 to 250
   c. 251 to 500
   d. more than 500

3. Which of the following best describes the community where you work?
   a. Urban
   b. Suburban
   c. Rural/small town

4. How many radiographers are employed in the facility where you work? (include yourself)
   a. 1-5
   b. 6-10
   c. 10-15
   d. more than 15

5. Which of the following best describes your job title?
   a. Staff technologist
   b. Lead or chief technologist
   c. Administrator (manager)
   d. Educator (program director, clinical instructor, staff educator)
   e. Modality technologist (CT, MRI, angiographer, etc.)
   f. Other
SECTION 3: DEMOGRAPHIC AND WORK EXPERIENCE (continued)
Please take a few minutes to answer the following questions.

6. How many years have you worked as a radiographer?
   Less than 1 / 1-5 / 6-10 / 10-15 / More than 15

7. How many total hours per week are you employed as a radiographer?
   Less than 10 / 10-24 / 24-32 / 32-40 / More than 40

8. What type of entry-level educational program in radiologic technology did you complete? Mark only one.
   Hospital certificate / Technical/vocational certificate / Associate’s degree / Bachelor’s degree / Other

9. If you have obtained a degree since your graduation from your RT program, what is the highest level?
   Mark only one.
   Associate degree / Bachelor’s degree / Master’s degree / Doctoral degree

10. About what percent of your work time do you spend performing the following activities?
    (Response categories for each activity: 0%, 1-5%, 6-25%, 26-50%, 51-75%, 76-100%)
    Radiography
    CT
    MRI
    US
    Mammography
    Angiography
    Bone Densitometry
    PACS
    QA/QM
    Clinical Staff Educator
    Other (please fill in)

11. Please estimate the percentage of time you spend with patients in each of the age categories listed below.
    (Response categories for each age group: 0%, 1-5%, 6-25%, 26-50%, 51-75%, 76-100%)
    Children (0-12)
    Adolescents (13-17)
    Adults (18-64)
    Elderly (65+)

12. Does your department employ any of the following personnel on a full-time or part-time basis?
    Mark all that apply.
    Radiology nurse
    Darkroom personnel
    Processor maintenance specialist
    Quality assurance staff
    Radiologic technologist’s assistant
    Radiologist extender (R.R.A./RPA)
    PACS Administrator
    Clinical Staff Educator
Thank you for taking the time from your busy schedule to complete this very important survey.
AMERICAN REGISTRY OF RADIOLOGIC TECHNOLOGISTS
1255 NORTHLAND DRIVE
SAINT PAUL, MINNESOTA 55120
Appendix B
Results of the Staff Radiographer Survey Questionnaire Demographics
Tables B.1.: Demographic Statistics

1. Place of Employment
                                           N        %
   Hospital/medical center                696      71.8
   Physician group practice/clinic        179      18.5
   Free-standing imaging center            51       5.3
   Other                                   28       2.9

2. Hospital Size (approx.)
                                           N        %
   Less than 100                          159      16.1
   100 to 250                             193      19.5
   251 to 500                             202      20.4
   More than 500                          163      16.5

3. Location of Workplace
                                           N        %
   Urban                                  432      43.8
   Suburban                               261      26.4
   Rural/small town                       258      26.1

4. Number of Radiographers
                                           N        %
   1 - 5                                  207      20.9
   6 - 10                                 113      11.4
   10 - 15                                156      15.7
   More than 15                           487      49.1

5. Job Title
                                           N        %
   Staff technologist                     731      82.8
   Lead/chief technologist                 79       8.9
   Administrator                            1       0.1
   Educator                                 1       0.1
   Modality technologist                   47       5.3
   Educator                                 0       0.0
   Other                                    5       0.6

6. Work Experience
                                           N        %
   Less than 1                              3       0.3
   1 - 5                                  687      68.5
   6 - 10                                 249      25.0
   10 - 15                                 51       5.1
   More than 15                            11       1.1
   Missing                                 12
Tables B.1.: Demographic Statistics (continued)

7. Hours Worked per Week
                                           N        %
   Less than 10                             7       0.7
   10 - 24                                  4       0.4
   24 - 32                                 19       1.9
   32 - 40                                738      74.0
   More than 40                           228      22.9
   Missing                                 12

8. Type of Educational Program
                                           N        %
   Hospital certificate                   107      10.7
   Technical/vocational certificate        88       8.8
   Associate’s degree                     730      73.3
   Bachelor’s degree                       58       5.8
   Other                                   10       1.0
   Missing                                 15

9. Highest Degree Attained
                                           N        %
   Associate degree                       276      71.3
   Bachelor’s degree                      108      27.9
   Master’s degree                          3       0.8
   Doctoral degree                          0       0.0
   Missing                                621
Tables B.1.: Demographic Statistics (continued)
10. Percentage of Time in Activities

   Activity                     0%       1-5%     6-25%    26-50%    51-75%    76-100%
   Radiography                  1.2%     1.0%     3.9%      7.9%     13.7%     72.3%
   CT                          59.2%     7.7%     8.9%     11.5%      6.2%      6.5%
   MRI                         92.5%     2.4%     2.1%      0.9%      0.3%      1.8%
   Sonography                  97.4%     0.5%     0.8%      0.6%      0.5%      0.3%
   Mammography                 62.5%     1.3%     0.5%      0.8%      1.0%      0.4%
   Angiography                 88.2%     4.0%     3.2%      1.5%      0.9%      2.2%
   Bone Densitometry           77.8%     9.0%     2.6%      2.3%      1.7%      1.5%
   PACS                        26.1%    14.4%    15.7%      5.5%      5.7%     32.7%
   QA/QM                       62.1%    18.0%     7.5%      3.1%      2.8%      6.4%
   Clinical Staff Educator     82.3%     5.9%     3.7%      2.4%      2.7%      2.8%
   Other (see list)
About what percent of your work time do you spend performing the following activities?
Section 3 – Question 10 – Other (number of write-ins shown in parentheses)

Admin. Duties / Office Work / Ordering, Administrative / Schedules/Secretary, etc. / Scheduling & Phone /
Scheduling, Working with Provider / Clerical/Ordering / Clerical/Reception/Transport / Front Desk,
Scheduling, Clerical, Cleaning, Delegating Exams, Rooms, and Techs (Coordinator's Desk) / Gathering
Supplies, Room Maintenance / Order supplies for Department / Upkeep in department cleaning and
ordering supplies/props used in x-ray department / Billing/Insurance / Data Entry, Phones, Film Printing,
Student Training / Paperwork / Paperwork, Pt. Transportation  (17)

Arthrograms  (1)
Asbestos Medical Surveillance  (1)
Assisting Doctors with Patients and Translating for Patients / Assist with Nurses in Urgent Care  (3)
Cardiac Cath Lab  (1)
C-Arm / C-Arm in OR  (4)
Cast Tech and Medical Assistant  (1)
Cleaning & Meetings  (1)
Clinical Instructor / Clinical Assistant / Clinical Student Educator / Teaching hospital students /
Teaching Students / Train current radiology students / Work with x-ray students  (12)
CMA  (2)
CR  (1)
CT & X-Ray Tech Different Shifts / Help in CT  (2)
Diagnostic  (1)
Epidural Steroid Injections with C-Arm  (1)
Every 3 months I will do 8 hours in diagnostic x-ray to keep up my skills.  (1)
Fluoroscopy  (1)
Interventional Radiography / Interventional with Exception to Angiography / IR - Special Procedures  (4)
IT Support  (1)
Lab Draws & EKGs / Lab Tech / Lab Work / Labs, Vitals, Rooming Patients  (4)
Lead C-arm Operator / Lead  (2)
Lithotripsy / Lithotripsy - C-Arm  (3)
Medical Assistant  (6)
Mobile EKS/ECG Exams  (1)
Myelography  (1)
Nuc Med  (1)
OR (Main or Tech)  (1)
PACS Supervisor / PACS - Philips Isite  (2)
PET - Nuc Med  (1)
Phlebotomy and Cardio Pulmonary Diagnostics  (2)
PICC/PermCath  (1)
Portable C-arm (surgery)  (1)
Portable Radiography  (1)
Process Report Pages  (1)
Radiologist Assistant  (1)
Simulation for Rad Tx, HDR, and Cyber Knife  (1)
Surgery / Surgery - C-Arm / Surgery (OR) Technologist / Surgery Coordinator / Surgery Fluoro (C-Arm) /
Surgery X-ray Tech / Surgical Assist  (11)
Triage Patients, Venipuncture (MA related tasks) / Triage, Lab Work / Triage/In-house Lab  (3)
TOTAL  (99)
Radiography Practice Analysis Report
Tables B.1.: Demographic Statistics (continued)
11. Percentage of Time Spent with Different Types of Patients

   Patient type            0%      1-5%    6-25%   26-50%  51-75%  76-100%
   Children (0-12)        5.5%    29.3%   39.7%   12.3%    7.1%    6.1%
   Adolescents (13-17)    3.4%    18.4%   45.5%   19.0%    7.4%    6.2%
   Adults (18-64)         0.5%     1.3%   10.2%   28.6%   31.6%   27.5%
   Elderly (65+)          1.9%     2.3%   12.1%   22.2%   32.2%   29.2%
12. Other Types of Staff in Dept.                    N       %
   Radiology nurse                                 498    49.4
   Darkroom personnel                               44     4.4
   Processor maintenance specialist                239    23.7
   Quality assurance staff                         317    31.4
   Radiologic technologist's assistant             225    22.3
   Radiologist extender (R.R.A./RPA)                74     7.3
   PACS Administrator                              600    59.5
   Clinical Staff Educator                         344    34.1
B-6
Radiography Practice Analysis Report
Appendix C
Radiography Practice Analysis
Summary of Survey Questionnaire Responses
Unsorted
C-1
Radiography Practice Analysis Report
No
Responsibility
NR
TASKS / PROCEDURES
1.
Sequence imaging procedures to avoid residual contrast material affecting future
exams.
2.
Communicate scheduling delays to waiting patients.
3.
5.
Verify or obtain patient consent as necessary (e.g., contrast studies).
Prior to administration of contrast agent, gather information to determine appropriate
dosage.
Prior to administration of contrast agent determine if patient is at increased risk of
adverse reaction (preparatory medication reconciliation).
6.
Confirm type of contrast media and prepare for administration.
7.
Perform venipuncture for contrast administration.
8.
Administer IV contrast media.
9.
Observe patient after administration of contrast media to detect adverse reactions.
4.
10.
11.
12.
Obtain vital signs.
Clean, disinfect or sterilize facilities and equipment, and dispose of contaminated
items in preparation for next examination.
Document required information on patient’s medical record (e.g., radiographic
requisitions, radiographs).
12a. on paper
12b. electronically
13.
Daily
D
49.6%
18.5%
21.4%
7.6%
4.9%
4.6%
7.3%
7.5%
7.1%
14.5% 20.8%
22.3% 46.6%
15.3% 51.5%
33.9%
6.1%
7.6%
12.8% 39.6%
29.1%
25.5%
61.1%
44.5%
30.4%
69.7%
7.2%
5.6%
5.6%
8.4%
9.8%
6.9%
6.6%
7.0%
4.3%
7.1%
8.8%
4.9%
14.1%
16.0%
8.7%
13.8%
15.8%
4.6%
7.1%
1.3%
1.5%
5.5% 84.6%
16.4%
16.7%
1.6%
0.9%
2.6%
0.5%
3.2% 76.2%
3.5% 78.2%
22.8%
21.9%
50.2%
7.9%
6.7%
20.1%
2.2%
2.4%
7.8%
4.2% 63.0%
4.3% 64.8%
7.2% 14.8%
11.3%
2.2%
15.7%
1.3%
0.6%
2.3%
1.4%
1.0%
1.7%
2.5% 83.3%
4.4% 91.7%
3.9% 76.4%
42.9%
45.7%
16.0%
26.1%
35.2%
13.9%
Determine appropriate exposure factors using:
13a. Fixed kVp technique chart
13b. Variable kVp technique chart
13c. Calipers (to determine patient thickness for exposure)
14.
Quarterly Monthly Weekly
Q
M
W
Select radiographic exposure factors.
14a. Automatic Exposure Control (AEC)
14b. kVp and mAs (manual or set by hand)
14c. Pre-programmed techniques
C-2
Radiography Practice Analysis Report
No
Responsibility
NR
TASKS / PROCEDURES
15.
15b. Mobile unit (portable)
16b. Non-digital fluoroscopic unit
16c. Fixed fluoroscopic unit
16d. Mobile fluoroscopic unit (C-arm)
16e. Mobile vascular fluoroscopic unit (C-arm)
17b. Tomography unit
17c. Mammography unit
17d. Bone densitometry unit
17e. Panorex unit
0.3%
2.0%
1.5%
6.2%
94.5%
69.0%
40.3%
57.9%
42.2%
33.2%
65.0%
5.0%
5.9%
6.3%
7.4%
7.5%
6.5%
4.1%
6.9%
9.8%
7.5%
11.7%
7.3%
11.7%
19.8%
7.3%
36.5%
24.8%
32.7%
29.8%
12.9%
58.4%
63.0%
94.3%
82.9%
74.9%
1.9%
14.8%
0.5%
1.9%
5.8%
1.4%
11.3%
0.6%
3.6%
9.5%
3.1%
6.6%
1.2%
4.5%
6.6%
35.1%
4.4%
3.4%
7.1%
3.2%
15.6%
47.2%
13.0%
57.7%
44.2%
30.1%
0.6%
1.5%
0.5%
7.1%
1.3%
0.9%
1.2%
1.9%
0.6%
6.0%
1.4%
0.8%
2.4%
2.4%
2.0%
5.2%
2.9%
1.4%
80.0%
47.0%
83.6%
23.7%
49.9%
66.7%
16.0%
0.5%
1.7%
5.2%
76.3%
Operate electronic imaging and record keeping devices.
18a. Computerized Radiography (CR)
18b. Direct Digital Radiography (DR)
18c. Picture Archival and Communication System (PACS)
18d. Film Digitizer
18e. Hospital Information System (HIS)
19.
0.4%
1.1%
Operate specialized imaging units.
17a. Dedicated chest unit
18.
3.3%
21.7%
Operate fluoroscopic unit and accessories.
16a. Digital fluoroscopic unit
17.
Daily
D
Operate radiographic unit and accessories
15a. Fixed unit
16.
Quarterly Monthly Weekly
Q
M
W
18f. Radiology Information System (RIS)
Perform post-processing on digital images in preparation for interpretation (e.g.,
exposure indicator, brightness/contrast, window and level).
C-3
Radiography Practice Analysis Report
TASKS / PROCEDURES
20.
21.
22.
23.
Use laser printer to print hard copy images.
Add electronic annotations on digital images to indicate position or other relevant
information (e.g., time, upright, decubitus, post-void).
Use film-screen cassettes and automatic film processing.
Determine corrective measures if radiographic image is not of diagnostic quality and
take appropriate action.
No
Responsibility
NR
27.1%
Quarterly Monthly Weekly Daily
Q
M
W
D
10.0%
13.4%
19.0% 30.5%
10.3%
48.8%
0.7%
3.7%
1.7%
3.9%
6.7%
4.7%
80.4%
38.9%
3.9%
0.8%
2.2%
5.8%
87.3%
3.9%
3.8%
7.7%
8.4%
7.4%
2.0%
5.0%
46.5%
15.9%
2.5%
3.0%
16.4%
24.6%
28.5%
2.7%
1.7%
35.5%
8.6%
26.1%
7.6%
89.3%
39.0%
12.4%
21.2%
79.2%
24.8%
40.8%
42.8%
9.2%
11.0%
8.7%
11.4%
12.0%
10.1%
21.2%
18.1%
17.9%
23.3%
18.1%
20.4%
34.4%
38.9%
34.4%
9.7%
11.2%
9.5%
10.4%
10.8%
12.8%
20.1%
18.2%
23.2%
25.4%
20.9%
20.2%
35.2%
35.3%
15.6%
15.8%
15.7%
15.2%
20.2%
20.2%
13.4%
13.3%
Position patient, x-ray tube, and image receptor to produce the following
diagnostic images:
24.
Chest
25.
Ribs
26.
Sternum
27.
Soft tissue neck
28.
Abdomen
29.
Esophagus
29a. Assist with examination
29b. Post fluoroscopy radiographs/images
30.
Swallowing dysfunction study
31.
Upper GI series, single or double contrast
31a. Assist with examination
31b. Post fluoroscopy radiographs/images
32.
Small bowel series
33.
Barium enema, single or double contrast
33a. Assist with examination
33b. Post fluoroscopy radiographs/images
C-4
Radiography Practice Analysis Report
No
Responsibility
NR
46.5%
57.0%
47.0%
47.8%
50.8%
57.1%
2.3%
2.4%
20.8%
2.5%
2.7%
7.7%
2.6%
8.6%
10.6%
13.6%
27.0%
32.4%
10.7%
16.7%
16.0%
14.0%
3.4%
2.5%
3.2%
2.5%
TASKS / PROCEDURES
34.
Surgical cholangiography
35.
ERCP
36.
Cystography
37.
Cystourethrography (voiding)
38.
Intravenous urography
39.
Retrograde pyelography
40.
Cervical spine
41.
Thoracic spine
42.
Scoliosis series
43.
Lumbar spine
44.
Sacrum and coccyx
45.
Sacroiliac joints
46.
Pelvis and hip
47.
Skull
48.
Facial bones
49.
Mandible
50.
Zygomatic arches
51.
Temporomandibular joints
52.
Nasal bones
53.
Orbits
54.
Orbits for foreign body
55.
Paranasal sinuses
56.
Toes
57.
Foot
58.
Calcaneus (os calcis)
59.
Ankle
C-5
Quarterly Monthly Weekly Daily
Q
M
W
D
13.4%
13.5%
17.2% 9.2%
10.7%
10.0%
14.6% 7.3%
15.8%
13.8%
15.3% 8.2%
19.7%
14.9%
10.8% 6.7%
19.8%
13.5%
10.8% 5.1%
16.7%
9.8%
11.2% 5.0%
0.9%
3.3%
21.7% 71.7%
1.8%
7.0%
27.7% 60.9%
21.5%
22.2%
19.9% 15.6%
1.0%
2.7%
19.6% 74.1%
11.8%
28.8%
33.2% 23.4%
40.3%
26.2%
14.3% 11.3%
0.6%
3.6%
19.2% 73.9%
24.5%
25.3%
25.3% 16.2%
29.7%
29.7%
20.4% 9.4%
41.3%
28.0%
12.1% 4.9%
52.0%
14.0%
4.3% 2.7%
51.8%
9.1%
3.8% 2.6%
26.7%
35.0%
18.5% 9.0%
40.0%
22.5%
13.4% 7.5%
32.2%
19.3%
19.9% 12.7%
24.7%
25.5%
21.3% 14.4%
6.2%
15.1%
32.0% 43.0%
1.0%
3.2%
19.9% 73.2%
13.6%
29.6%
30.1% 23.3%
0.9%
2.7%
17.7% 76.1%
Radiography Practice Analysis Report
No
Responsibility
NR
2.7%
2.4%
5.4%
2.9%
3.3%
2.5%
2.5%
2.7%
2.7%
2.8%
2.4%
4.7%
3.7%
10.7%
22.2%
44.6%
32.9%
9.4%
TASKS / PROCEDURES
60.
Tibia, fibula
61.
Knee
62.
Patella
63.
Femur
64.
Fingers
65.
Hand
66.
Wrist
67.
Forearm
68.
Elbow
69.
Humerus
70.
Shoulder
71.
Scapula
72.
Clavicle
73.
Acromioclavicular joints
74.
Bone survey
75.
Long bone measurement
76.
Bone age
77.
Soft tissue/foreign body
Quarterly Monthly Weekly Daily
Q
M
W
D
0.9%
7.9%
26.6% 61.9%
0.7%
2.0%
16.6% 78.2%
13.5%
18.7%
21.3% 40.9%
3.8%
13.6%
30.0% 49.6%
1.2%
6.5%
24.7% 64.3%
0.8%
2.4%
16.4% 77.9%
0.8%
2.5%
17.1% 77.1%
1.2%
7.3%
28.5% 60.3%
0.7%
6.6%
28.1% 61.9%
2.8%
14.7%
32.5% 47.2%
1.5%
4.1%
22.5% 69.5%
24.2%
31.3%
17.5% 22.1%
12.8%
31.2%
28.4% 24.0%
48.0%
23.4%
8.8% 9.2%
30.7%
22.2%
16.1% 8.6%
33.4%
11.2%
6.3% 4.4%
34.6%
14.8%
10.0% 7.5%
17.4%
29.0%
26.9% 17.3%
Assist radiologist with the following invasive procedures:
78.
Arthrography
79.
Myelography
80.
Venography
81.
PICC line insertion assistance
56.2%
57.3%
79.7%
72.9%
C-6
12.2%
13.9%
11.6%
6.6%
11.0%
9.2%
4.4%
5.7%
14.5%
13.8%
2.9%
6.4%
5.9%
5.8%
1.3%
8.2%
Radiography Practice Analysis Report
TASKS / PROCEDURES
82.
Position patient and operate MR scanner to produce diagnostic images.
83.
Position patient and operate CT scanner to produce the following diagnostic
images:
69.4%
71.0%
71.7%
70.3%
70.8%
86.4%
85.1%
83a. Head
83b. Neck
83c. Chest
83d. Abdomen
83e. Pelvis
83f. Biopsy
83g. Other (see listing)
84.
No
Responsibility
NR
93.8%
CT Education/Training Received
Percentage
Application Specialist in workplace
7.7%
Continuing Education on your own
9.3%
Formal Course
3.4%
On the job by other CT Technologists
28.7%
None
11.9%
C-7
Quarterly Monthly Weekly
Q
M
W
0.9%
0.7%
1.4%
0.9%
1.2%
1.0%
0.7%
0.9%
4.6%
1.5%
2.0%
3.9%
2.7%
2.9%
2.6%
2.7%
2.9%
6.8%
8.7%
7.0%
6.1%
6.0%
3.6%
3.7%
Daily
D
3.0%
20.8%
15.1%
17.6%
20.0%
19.7%
2.5%
6.7%
Radiography Practice Analysis Report
Equipment Maintenance (Section 2)
85.
Perform basic evaluations of radiographic equipment and accessories
85a.
85a. Beam restriction system
85b.
85b. Beam Alignment
85c.
85c. Source-to-image receptor distance indicator
85d.
85d. Radiation protection devices (lead aprons and gloves)
Not
Responsible
NR
Delegate or request
someone else
D
Personally
Perform
P
Review
Results
R
65.3%
62.9%
60.1%
45.8%
14.6%
15.3%
13.2%
12.9%
19.5%
20.8%
23.8%
40.5%
10.2%
10.6%
11.0%
12.8%
23.2%
23.1%
16.3%
14.5%
79.1%
7.1%
6.3%
8.5%
10.0%
8.8%
69.5%
69.5%
75.9%
76.2%
10.2%
11.7%
10.5%
11.1%
12.0%
4.7%
82.4%
68.8%
77.4%
86.7%
4.7%
5.1%
6.3%
5.7%
11.8%
26.3%
15.2%
6.4%
3.7%
4.3%
3.2%
3.1%
gray
4.6%
rad
95.4%
sievert
1.9%
rem
98.1%
centimeter
6.1%
inch
83.8%
86.
86a.
86a. Perform start-up or shut-down
86b.
86b. Erase CR plate
86c.
86c. Equipment cleanliness (e.g., imaging plates, CR cassettes)
86d.
86d. Recognize and report malfunctions
86e.
86e. Perform laser printer quality control
87.
87a.
87a. Darkroom fog (e.g., safelight, light leak)
87b.
87b. Screen cleanliness
87c.
87c. Screen-film contact
87d.
87d. Daily sensitometry
Terminology used in the workplace when working with radiologists
and other radiographers
88.
88a.
88b.
88c.
88a. In Discussing DOSE which term do you use more frequently?
88b. In discussing DOSE EQUIVALENT which term do you use more
frequently?
88c. When referring to SID which term do you use more frequently?
C-8
Radiography Practice Analysis Report
Position patient and operate CT scanner to produce the following diagnostic images:
Section 1 – Question 83g – Other
Number of
Write-ins
All CT Exams
1
All Other Routine Exams (Extremity)
1
Angiogram / Angiography (CTA)(CCTA) / Angiogram Studies
4
Chest Angiography for PE's / CI Angiograms / Cardiac / Cardiac Scoring
2
C-Spine
1
CT Guided Drainages/ Aspirations
1
CT L Spines, Facial Bones, IAC's, Sinuses, Extremities
1
CTA
2
CTA, Runoff, Head, Carotids, Hearts
1
Drainage / Drainage/Aspiration
4
Epidural Injections
Extremities / Drainage, Extremities / Extremities, Joints, Spine Exams – Trauma / Extremity
(Knee/Shoulder) / Extremity, Facial Work, Urography / Extremity, Spines, Facial, Sinus
1
34
Face/Orbits
1
Facet Injections
1
Facial Bones / Facial Bones, Sinus / Facial, Mandible
8
Kidney Stone Protocol
2
Lower Extremity
2
Maxillary, Facial Bones / Maxillofacial
2
C-9
Radiography Practice Analysis Report
Position patient and operate CT scanner to produce the following diagnostic images:
Number of
Write-ins
Section 1 – Question 83g – Other (continued)
PE, AAA-Urogram, Venogram, etc. / PE, CTA / PE, Lower Extremity, Stone Studies
3
PET CT, Sinuses, Neck, C-Spine
1
Post Myelography
1
QA
1
Sims for Radiation Therapy / Simulation for Rad Tx, HDR, and Cyber Knife
2
Sinuses / Sinuses, Facial Bones, Extremities
Spine / Spine, Extremity / Spine, Facial Bones / Spines - C-T-L / L-Spine / Shoulder, Wrist, Ankle,
Knee / Lumbar Punctures / Lumbosacral Spine / Spines, Aspiration, Drainage / Angio / T-Spine
6
20
Trauma / Trauma, Anagram, Extremities
2
Upper & Lower Extremity
2
Whole Body, Cardiac, Ortho
1
TOTAL
C-10
108
Radiography Practice Analysis Report
Appendix D
Radiography Practice Analysis
Summary of Survey Questionnaire Responses
Tasks/Procedures Ranked by Percent Responsible
D-1
Radiography Practice Analysis Report
Tasks/Procedures Ranked by Percent Responsible
Tasks/Procedures
Percent
Quarterly
Responsible
Monthly
Weekly
NR
Q
M
14b.
kVp and mAs
97.8%
2.2%
0.6%
1.0%
4.4%
91.7%
40.
Cervical spine
97.7%
2.3%
0.9%
3.3%
21.7%
71.7%
41.
Thoracic spine
97.6%
2.4%
1.8%
7.0%
27.7%
60.9%
61.
Knee
97.6%
2.4%
0.7%
2.0%
16.6%
78.2%
70.
Shoulder
97.6%
2.4%
1.5%
4.1%
22.5%
69.5%
43.
Lumbar spine
97.5%
2.5%
1.0%
2.7%
19.6%
74.1%
57.
Foot
97.5%
2.5%
1.0%
3.2%
19.9%
73.2%
59.
Ankle
97.5%
2.5%
0.9%
2.7%
17.7%
76.1%
65.
Hand
97.5%
2.5%
0.8%
2.4%
16.4%
77.9%
66.
Wrist
97.5%
2.5%
0.8%
2.5%
17.1%
77.1%
46.
Pelvis and hip
97.4%
2.6%
0.6%
3.6%
19.2%
73.9%
44.
Sacrum and coccyx
97.3%
2.7%
11.8%
28.8%
33.2%
23.4%
60.
Tibia, fibula
97.3%
2.7%
0.9%
7.9%
26.6%
61.9%
67.
Forearm
97.3%
2.7%
1.2%
7.3%
28.5%
60.3%
68.
Elbow
97.3%
2.7%
0.7%
6.6%
28.1%
61.9%
69.
Humerus
97.2%
2.8%
2.8%
14.7%
32.5%
47.2%
63.
Femur
97.1%
2.9%
3.8%
13.6%
30.0%
49.6%
58.
Calcaneus (os calcis)
96.8%
3.2%
13.6%
29.6%
30.1%
23.3%
15a.
Operate Fixed unit
96.7%
3.3%
0.4%
0.3%
1.5%
94.5%
64.
Fingers
96.7%
3.3%
1.2%
6.5%
24.7%
64.3%
56.
Toes
96.6%
3.4%
6.2%
15.1%
32.0%
43.0%
72.
Clavicle
96.3%
3.7%
12.8%
31.2%
28.4%
24.0%
25.
23.
Ribs
Determine corrective measures and take
appropriate action.
96.2%
96.1%
3.8%
3.9%
5.0%
0.8%
16.4%
2.2%
35.5%
5.8%
39.0%
87.3%
24.
Chest
96.1%
3.9%
2.0%
3.0%
1.7%
89.3%
D-2
W
Daily
D
Radiography Practice Analysis Report
Tasks/Procedures Ranked by Percent Responsible (continued)
Tasks/Procedures
Percent
Responsible
NR
Quarterly
Monthly
Weekly
Daily
Q
M
W
D
71.
Scapula
95.3%
4.7%
24.2%
31.3%
17.5%
22.1%
62.
11.
Patella
Clean, disinfect or sterilize facilities /
equip., dispose of contaminated items
94.6%
92.9%
5.4%
7.1%
13.5%
1.3%
18.7%
1.5%
21.3%
5.5%
40.9%
84.6%
28.
Abdomen
92.6%
7.4%
2.5%
2.7%
7.6%
79.2%
26.
Sternum
92.3%
7.7%
46.5%
24.6%
8.6%
12.4%
45.
Sacroiliac joints
92.3%
7.7%
40.3%
26.2%
14.3%
11.3%
27.
Soft tissue neck
91.6%
8.4%
15.9%
28.5%
26.1%
21.2%
47.
Skull
91.4%
8.6%
24.5%
25.3%
25.3%
16.2%
77.
21.
Soft tissue/foreign body
Add electronic annotations on digital
images
90.6%
89.7%
9.4%
10.3%
17.4%
0.7%
29.0%
1.7%
26.9%
6.7%
17.3%
80.4%
48.
Facial bones
89.4%
10.6%
29.7%
29.7%
20.4%
9.4%
52.
Nasal bones
89.3%
10.7%
26.7%
35.0%
18.5%
9.0%
73.
Acromioclavicular joints
89.3%
10.7%
48.0%
23.4%
8.8%
9.2%
14a.
Automatic Exposure Control
88.7%
11.3%
1.3%
1.4%
2.5%
83.3%
18c.
Operate PACS
87.0%
13.0%
0.5%
0.6%
2.0%
83.6%
49.
Mandible
86.4%
13.6%
41.3%
28.0%
12.1%
4.9%
55.
Paranasal sinuses
86.0%
14.0%
24.7%
25.5%
21.3%
14.4%
18a.
Operate (CR)
84.4%
15.6%
0.6%
1.2%
2.4%
80.0%
14c.
19.
Pre-programmed techniques
Perform post-processing (exp. indicator,
brightness/contrast, window/level).
84.3%
84.0%
15.7%
16.0%
2.3%
0.5%
1.7%
1.7%
3.9%
5.2%
76.4%
76.3%
54.
Orbits for foreign body
84.0%
16.0%
32.2%
19.3%
19.9%
12.7%
12a.
Document on paper
83.6%
16.4%
1.6%
2.6%
3.2%
76.2%
12b.
Document electronically
83.3%
16.7%
0.9%
0.5%
3.5%
78.2%
53.
Orbits
83.3%
16.7%
40.0%
22.5%
13.4%
7.5%
D-3
Radiography Practice Analysis Report
Tasks/Procedures Ranked by Percent Responsible (continued)
Tasks/Procedures
Percent
Responsible
81.5%
NR
18.5%
Quarterly
Q
4.9%
Monthly
M
7.5%
Weekly
W
22.3%
Daily
D
46.6%
2.
Communicate scheduling delays to
waiting patients.
42.
3.
Scoliosis series
Verify or obtain patient consent as
necessary (e.g., contrast studies).
79.2%
78.6%
20.8%
21.4%
21.5%
4.6%
22.2%
7.1%
19.9%
15.3%
15.6%
51.5%
15b.
Operate Mobile unit (portable)
78.3%
21.7%
1.1%
2.0%
6.2%
69.0%
13b.
Variable kVp technique chart
78.1%
21.9%
6.7%
2.4%
4.3%
64.8%
74.
Bone survey
77.8%
22.2%
30.7%
22.2%
16.1%
8.6%
13a.
Fixed kVp technique chart
77.2%
22.8%
7.9%
2.2%
4.2%
63.0%
29a.
6.
Esophagus/ Assist with examination
Confirm type of contrast media and
prepare for administration.
75.2%
74.5%
24.8%
25.5%
9.2%
5.6%
11.4%
7.0%
21.2%
16.0%
23.3%
45.7%
50.
20.
Zygomatic arches
Use laser printer to print hard copy
images.
Determine if patient is at risk of reaction
(PMR)
73.0%
72.9%
27.0%
27.1%
52.0%
10.0%
14.0%
13.4%
4.3%
19.0%
2.7%
30.5%
70.9%
29.1%
7.2%
6.6%
14.1%
42.9%
18f.
9.
Radiology Information System (RIS)
Observe patient after contrast to detect
reactions.
69.9%
69.6%
30.1%
30.4%
0.9%
9.8%
0.8%
8.8%
1.4%
15.8%
66.7%
35.2%
51.
Temporomandibular joints
67.6%
32.4%
51.8%
9.1%
3.8%
2.6%
76.
Bone age
67.1%
32.9%
34.6%
14.8%
10.0%
7.5%
16d.
Operate Mobile fluoroscopic unit (C-arm)
66.8%
33.2%
7.4%
9.8%
19.8%
29.8%
4.
Gather info to determine dosage.
66.1%
33.9%
6.1%
7.6%
12.8%
39.6%
31a.
Upper GI Assist with examination
65.6%
34.4%
9.7%
10.4%
20.1%
25.4%
32.
Small bowel series
65.6%
34.4%
9.5%
12.8%
23.2%
20.2%
33a.
BE, assist with examination
64.8%
35.2%
15.6%
15.7%
20.2%
13.4%
33b.
BE radiographs/images
64.7%
35.3%
15.8%
15.2%
20.2%
13.3%
31b.
Upper GI radiographs/images
61.1%
38.9%
11.2%
10.8%
18.2%
20.9%
5.
D-4
Radiography Practice Analysis Report
Tasks/Procedures Ranked by Percent Responsible (continued)
Tasks/Procedures
Percent
Responsible
NR
Quarterly
Monthly
Weekly
Daily
Q
M
W
D
16a.
Operate Digital fluoroscopic unit
59.7%
40.3%
5.0%
6.5%
11.7%
36.5%
29b.
Esophagus/radiographs/images
59.2%
40.8%
11.0%
12.0%
18.1%
18.1%
16c.
Operate Fixed fluoroscopic unit
57.8%
42.2%
6.3%
6.9%
11.7%
32.7%
30.
Swallowing dysfunction study
57.2%
42.8%
8.7%
10.1%
17.9%
20.4%
18e.
Hospital Information System (HIS)
55.8%
44.2%
1.3%
1.4%
2.9%
49.9%
8.
Administer IV contrast media.
55.5%
44.5%
8.4%
7.1%
13.8%
26.1%
75.
Long bone measurement
55.4%
44.6%
33.4%
11.2%
6.3%
4.4%
34.
Surgical cholangiography
53.5%
46.5%
13.4%
13.5%
17.2%
9.2%
36.
Cystography
53.0%
47.0%
15.8%
13.8%
15.3%
8.2%
18b.
Operate (DR)
52.8%
47.2%
1.5%
1.9%
2.4%
47.0%
37.
22.
Cystourethrography (voiding)
Use film-screen cassettes and automatic
film processing.
Sequence imaging procedures to avoid
residual contrast.
52.2%
51.2%
47.8%
48.8%
19.7%
3.7%
14.9%
3.9%
10.8%
4.7%
6.7%
38.9%
50.4%
49.6%
7.6%
7.3%
14.5%
20.8%
13c.
Calipers
49.8%
50.2%
20.1%
7.8%
7.2%
14.8%
38.
Intravenous urography
49.2%
50.8%
19.8%
13.5%
10.8%
5.1%
78.
Arthrography
43.8%
56.2%
12.2%
11.0%
14.5%
5.9%
35.
ERCP
43.0%
57.0%
10.7%
10.0%
14.6%
7.3%
39.
Retrograde pyelography
42.9%
57.1%
16.7%
9.8%
11.2%
5.0%
79.
Myelography
42.7%
57.3%
13.9%
9.2%
13.8%
5.8%
18d.
Operate Film Digitizer
42.3%
57.7%
7.1%
6.0%
5.2%
23.7%
16b.
Operate Non-digital fluoroscopic unit
42.1%
57.9%
5.9%
4.1%
7.3%
24.8%
17a.
7.
Operate Dedicated chest unit
Perform venipuncture for contrast
administration.
41.6%
38.9%
58.4%
61.1%
1.9%
5.6%
1.4%
4.3%
3.1%
8.7%
35.1%
16.0%
17b.
Operate Tomography unit
37.0%
63.0%
14.8%
11.3%
6.6%
4.4%
1.
D-5
Radiography Practice Analysis Report
Tasks/Procedures Ranked by Percent Responsible (continued)
Tasks/Procedures
Percent
Quarterly
Responsible
35.0%
Monthly
Weekly
Daily
NR
65.0%
Q
7.5%
M
7.5%
W
7.3%
D
12.9%
30.6%
69.4%
0.9%
2.0%
6.8%
20.8%
Obtain vital signs.
30.3%
69.7%
6.9%
4.9%
4.6%
13.9%
CT scanner ..Abdomen
29.7%
70.3%
0.7%
2.9%
6.1%
20.0%
CT scanner ..Pelvis
29.2%
70.8%
0.9%
2.6%
6.0%
19.7%
CT scanner ..Neck
29.0%
71.0%
1.2%
3.9%
8.7%
15.1%
CT scanner ..Chest
28.3%
71.7%
1.0%
2.7%
7.0%
17.6%
81.
PICC line insertion assistance
27.1%
72.9%
6.6%
5.7%
6.4%
8.2%
17e.
Operate Panorex unit
25.1%
74.9%
5.8%
9.5%
6.6%
3.2%
80.
Venography
20.3%
79.7%
11.6%
4.4%
2.9%
1.3%
17d.
83g.
Operate Bone densitometry unit
17.1%
82.9%
1.9%
3.6%
4.5%
7.1%
CT scanner ..Other (please fill in)
14.9%
85.1%
1.5%
2.9%
3.7%
6.7%
83f.
CT scanner .. Biopsy
13.6%
86.4%
4.6%
2.7%
3.6%
2.5%
82.
Position patient and operate MR scanner
6.2%
93.8%
0.9%
0.7%
1.4%
3.0%
17c.
Operate Mammography unit
5.7%
94.3%
0.5%
0.6%
1.2%
3.4%
16e.
Operate Mobile vascular fluoroscopic unit
(C-arm)
83a.
CT scanner ..Head
10.
83d.
83e.
83b.
83c.
D-6
Radiography Practice Analysis Report
Appendix E
Radiography Practice Analysis
Final Task Inventory
E-1
Radiography Practice Analysis Report
TASK INVENTORY FOR
RADIOGRAPHY
Publication Date:
August 2010
Implementation Date:
January 2012
Certification requirements for Radiography are based, in part, on the results of a comprehensive
practice analysis conducted by ARRT staff and the Practice Analysis Advisory Committee. In 2009
the ARRT surveyed a large, national sample of radiographers to identify the job responsibilities
typically required of staff technologists at entry into the profession (1 to 3 years of experience). The
results of that practice analysis are reflected in this document. The attached task inventory is the
foundation for both the Clinical Competency Requirements and Content Specifications.
Basis of Task Inventory
The practice analysis survey was used to identify the responsibilities typically required of staff
technologists. When evaluating survey results, the Advisory Committee applied a 40% guideline.
That is, to be included on the task inventory an activity must have been the responsibility of at least
40% of staff technologists at entry into the profession. Occasionally, an activity that did not meet the
40% criterion was retained if there was a compelling rationale to do so (e.g., the task is especially
critical in some settings, or the task is related to an emerging technology).
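
As a rough illustration of this screening step, the 40% guideline amounts to a simple filter over the survey results. The sketch below is illustrative only (the record structure and function name are hypothetical, not ARRT tooling); the sample percentages are taken from the ranked results in Appendix D.

# Python sketch of the 40% responsibility guideline (illustrative only).
SURVEY_RESULTS = [
    {"task": "Chest", "percent_responsible": 96.1},
    {"task": "Operate mobile fluoroscopic unit (C-arm)", "percent_responsible": 66.8},
    {"task": "Operate Panorex unit", "percent_responsible": 25.1},
]
THRESHOLD = 40.0  # minimum percent of staff technologists responsible for the task

def screen_for_task_inventory(results, threshold=THRESHOLD):
    """Split tasks into those meeting the guideline and those needing a committee rationale."""
    retained = [r["task"] for r in results if r["percent_responsible"] >= threshold]
    review = [r["task"] for r in results if r["percent_responsible"] < threshold]
    return retained, review

retained, review = screen_for_task_inventory(SURVEY_RESULTS)
print("Retained:", retained)                   # ['Chest', 'Operate mobile fluoroscopic unit (C-arm)']
print("Needs compelling rationale:", review)   # ['Operate Panorex unit']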
Application to Clinical Competency Requirements
An activity must appear on the task inventory to be considered for inclusion in the Clinical
Competency Requirements. For an activity to be designated as a mandatory requirement, survey
results typically indicated that it was performed by a vast majority of staff technologists.
activities performed by fewer technologists, or which are carried out only in selected settings, were
usually designated as elective. Alternatively, the Advisory Committee sometimes stipulated that
such procedures could be simulated rather than performed on actual patients. Not all activities from
the task inventory were necessarily included as part of the requirements. The Clinical Competency
Requirements are available from ARRT’s website (www.arrt.org) and appear in the Certification
Handbook.
Application to Content Specifications
The primary purpose of the Examination in Radiography is to assess the knowledge and cognitive
skills underlying the intelligent performance of the tasks typically required of staff technologists at
entry into the profession. The Content Specifications identify the topics covered on the exam; every
topic can be linked to one or more activities on the task inventory. Note that each activity on the task
inventory is followed by a code which identifies the section of the Content Specifications
corresponding to that activity. For example, the first activity (confirm patient’s identity) is followed by
the code E.2. Section E.2. in the Content Specifications covers interpersonal communications,
indicating that knowledge of this topic is required to effectively confirm a patient’s identification.
When establishing a linkage between tasks and topics, the Advisory Committee usually listed one or
two key topics, even though successful task performance may cut across many topics. The Content
Specifications are available from ARRT’s website (www.arrt.org) and appear in the Certification
Handbook.
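
The activity-to-topic linkage described above can be pictured as a lookup from each task inventory entry to one or more Content Specifications codes. The sketch below is only a reading aid: the dictionary structure is hypothetical, while the example codes are copied from the inventory in this appendix.

# Python sketch of the task-to-content-category linkage (illustrative only).
TASK_LINKAGE = {
    "Confirm patient's identity.": ["E.2."],
    "Verify or obtain patient consent as necessary (e.g., contrast studies).": ["E.1.A.1.", "E.7.C.1."],
    "Administer IV contrast media.": ["E.7.E."],
}

def topics_for_task(task_text):
    """Return the Content Specifications section(s) linked to a task inventory activity."""
    return TASK_LINKAGE.get(task_text, [])

print(topics_for_task("Confirm patient's identity."))  # ['E.2.']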
E-2
Radiography Practice Analysis Report
Activity
Content Categories
1.
Confirm patient’s identity.
E.2.
2.
Evaluate patient’s ability to understand and comply with requirements
for the requested examination.
E.2.
3.
Explain and confirm patient’s preparation (e.g., diet restrictions,
preparatory medications) prior to imaging examinations.
E.2.C., E.7.C.2.
4.
Examine imaging examination requisition to verify accuracy and
completeness of information (e.g., patient history, clinical diagnosis).
E.1.B.
5.
Sequence imaging procedures to avoid residual contrast material
affecting future exams.
E.6.A.4.
6.
Responsible for medical equipment attached to patients (e.g., IVs,
oxygen) during the imaging procedures.
E.4.B.
7.
Provide for patient safety, comfort, and modesty.
E.4., E.1.A.
8.
Communicate scheduling delays to waiting patients.
E.2.
9.
Verify or obtain patient consent as necessary (e.g., contrast studies).
E.1.A.1., E.7.C.1.
10.
Explain procedure instructions to patient or patient’s family.
E.2.
11.
Practice standard precautions.
E.3.C.
12.
Follow appropriate procedures when in contact with patient in isolation.
E.3.C., E.3.D.
13.
Select immobilization devices, when indicated, to prevent patient’s
movement and/or ensure patient's safety.
D.
14.
Use proper body mechanics and/or mechanical transfer devices when
assisting patient.
E.4.A.
15.
Prior to administration of contrast agent, gather information to determine
appropriate dosage.
E.7.B.
16.
Prior to administration of contrast agent determine if patient is at
increased risk of adverse reaction (preparatory medication reconciliation).
E.6.A.
17.
Confirm type of contrast media and prepare for administration.
E.7.A., E.7.B.
18.
Use sterile or aseptic technique when indicated.
E.3.A., E.7.D.
19.
Perform venipuncture for contrast administration.
E.7.D., E.7.E.
20.
Administer IV contrast media.
E.7.E.
21.
Observe patient after administration of contrast media to detect adverse
reactions.
E.6.B.
22.
Obtain vital signs.
E.4.C.
23.
Recognize need for prompt medical attention and administer
emergency care.
E.5., E.4.C.3, E.6.B.,
24.
Explain post-procedural instructions to patient or patient’s family.
E.2.C., E.7.C.3.
25.
Maintain confidentiality of patient’s information.
E.1.A.2.
26.
Clean, disinfect or sterilize facilities and equipment, and dispose of
contaminated items in preparation for next examination.
E.3.A.
E-3
Radiography Practice Analysis Report
Activity
Content Categories
27.
Document required information on patient’s medical record (e.g., imaging
procedure documentation, images).
a. On paper
b. Electronically
C.2.E., E.1.B., E.6.B.4.
28.
Evaluate the need for and use of protective shielding.
A.2.B.
29.
Take appropriate precautions to minimize radiation exposure to patient.
A.2.
30.
Question female patient of child-bearing age about possible pregnancy
and take appropriate action (i.e., document response, contact physician).
A.1.D., E.2.
31.
Restrict beam to limit exposure area, improve image quality, and reduce
radiation dose.
A.2.C., C.1.A.1.I., C.1.A.2.I.
32.
Set kVp, mA and time or automatic exposure system to achieve optimum
image quality, safe operating conditions, and minimum radiation dose.
a. Use pulse fluoroscopy
b. Document fluoroscopy time
A.2.A., A.2.E.2., C.1.A.1.A.,
C.1.A.1.B., C.1.A.2.B.,
C.1.C.
33.
Prevent all unnecessary persons from remaining in area during x-ray
exposure.
A.4.C.2.
34.
Take appropriate precautions to minimize occupational radiation
exposure.
A.3.B.
35.
Wear a personnel monitoring device while on duty.
A.4.B.
36.
Evaluate individual occupational exposure reports to determine if values
for the reporting period are within established limits.
A.4.C.
37.
Determine appropriate exposure factors using:
a. Fixed kVp technique chart
b. Variable kVp technique chart
c. Calipers (to determine patient thickness for exposure)
C.1.B.2.
38.
Select radiographic exposure factors.
a. Automatic Exposure Control (AEC)
b. kVp and mAs (manual)
c. Pre-programmed techniques (Anatomically Programmed
Radiography)
C.1.C.
C.1.A.
C.1.B.1.
39.
Operate radiographic unit and accessories.
a. Fixed unit
b. Mobile unit (portable)
B.2.A.
40.
Operate fluoroscopic unit and accessories.
a. Fixed fluoroscopic unit
b. Mobile fluoroscopic unit (C-arm)
B.2.C.
E-4
Radiography Practice Analysis Report
Activity
Content Categories
41.
Operate electronic imaging and record keeping devices.
a. Computerized Radiography (CR)
b. Direct Digital Radiography (DR)
c. Picture Archival and Communication System (PACS)
d. Hospital Information System (HIS)
e. Radiology Information System (RIS)
C.2.C., C.2.D.
B.2.D.
B.2.D.
C.2.E.1.
C.2.E.2.
C.2.E.3.
42.
Prepare and operate specialized units.
a. Chest unit
b. Tomography unit
B.2.E
43.
Remove all radiopaque materials from patient or table that could interfere
with the image.
C.3.H., C.3.N.
44.
Perform post-processing on digital images in preparation for interpretation
(e.g., exposure indicator, brightness/contrast, window and level).
C.2.C., C.2.D.
45.
Use radiopaque markers to indicate anatomical side, position or other
relevant information (e.g., time, upright, decubitus, post-void).
C.2.A., C.3.F.
46.
Add electronic annotations on digital images to indicate position, or other
relevant information (e.g., time, upright, decubitus, post-void).
C.2.A., C.3.F.
47.
Use film-screen cassettes and automatic film processing.
C.1.A.1.H., C.1.A.3.H.,
C.2.B.
48.
Select equipment and accessories (e.g., grid, compensating filter,
shielding) for the examination requested.
A.2.B., C.1.A.1.F.,
C.1.A.1.G., C.1.A.2.F.,
C.1.A.2.G.
49.
Explain breathing instructions prior to making the exposure.
C.1.A.3.J., D., E.2.C.,
50.
Position patient to demonstrate the desired anatomy using body
landmarks.
D., C.3.E.
51.
Modify exposure factors for circumstances such as involuntary motion,
casts and splints, pathological conditions, or patient's inability to
cooperate.
C.1.B.3., C.1.A.3.J.,
C.1.A.1.L., C.1.A.2.L.,
C.1.A.3.L., C.1.A.4.L.
52.
Verify accuracy of patient identification on image.
C.3.F.
53.
Evaluate images for diagnostic quality.
C.3.
54.
Determine corrective measures if image is not of diagnostic quality and
take appropriate action.
C.3.
55.
Store and handle image receptor in a manner which will reduce the
possibility of artifact production.
B.3.C., B.2.D.5., B.2.F.3.,
C.2., C.3.H., C.3.I.
56.
Visually inspect, recognize, and report malfunctions in the imaging unit
and accessories.
B.3.B.
57.
Recognize the need for basic evaluations of radiographic equipment and
accessories.
a. Light field to radiation field alignment
b. Central-ray alignment
c. Shielding accessories (lead aprons and gloves)
B.3.A.1.
B.3.A.2.
B.3.D.
E-5
Radiography Practice Analysis Report
58.
Activity
Content Categories
Perform routine maintenance on digital equipment.
a. Perform start-up or shut-down
b. Erase CR plate
c. Equipment cleanliness (e.g., imaging plates, CR cassettes)
d. Recognize and report malfunctions
B.2.D.3.
B.2.D.4.
B.2.D.5.
B.2.D.6.
Position patient, x-ray tube, and image receptor to produce the following
diagnostic images:
59.
Chest
D.1.A.
60.
Ribs
D.1.B.
61.
Sternum
D.1.C.
62.
Soft tissue neck
D.1.D.
63.
Abdomen
D.2.A.
64.
Esophagus
D.2.B.
65.
Swallowing dysfunction study
D.2.C.
66.
Upper GI series, single or double contrast
D.2.C.
67.
Small bowel series
D.2.D.
68.
Barium enema, single or double contrast
D.2.E.
69.
Surgical cholangiography
D.2.G.
70.
ERCP
D.2.H.
71.
Cystography
D.3.A.
72.
Cystourethrography
D.3.B.
73.
Intravenous urography
D.3.C.
74.
Retrograde pyelography
D.3.D.
75.
Cervical spine
D.4.A.
76.
Thoracic spine
D.4.B.
77.
Scoliosis series
D.4.C.
78.
Lumbar spine
D.4.D.
79.
Sacrum and coccyx
D.4.E.
80.
Sacroiliac joints
D.4.F.
81.
Pelvis and hip
D.4.G.
82.
Skull
D.5.A.
83.
Facial bones
D.5.B.
84.
Mandible
D.5.C.
85.
Zygomatic arch
D.5.D.
86.
Temporomandibular joints
D.5.E.
87.
Nasal bones
D.5.F.
E-6
Radiography Practice Analysis Report
Activity
Content Categories
88.
Orbits
D.5.G.
89.
Paranasal sinuses
D.5.H.
90.
Toes
D.6.A.
91.
Foot
D.6.B.
92.
Calcaneus (os calcis)
D.6.C.
93.
Ankle
D.6.D.
94.
Tibia, fibula
D.6.E.
95.
Knee
D.6.F.
96.
Patella
D.6.G.
97.
Femur
D.6.H.
98.
Fingers
D.6.I.
99.
Hand
D.6.J.
100.
Wrist
D.6.K.
101.
Forearm
D.6.L.
102.
Elbow
D.6.M.
103.
Humerus
D.6.N.
104.
Shoulder
D.6.O.
105.
Scapula
D.6.P.
106.
Clavicle
D.6.Q.
107.
Acromioclavicular joints
D.6.R.
108.
Bone survey
D.6.S.
109.
Long bone measurement
D.6.T.
110.
Bone age
D.6.U.
111.
Soft tissue/foreign body
D.6.V.
112.
Arthrography
D.7.A.
113.
Myelography
D.7.B.
E-7
Radiography Practice Analysis Report
Appendix F
2012 Content Specifications for the Radiography Examination
F-1
Radiography Practice Analysis Report
CONTENT SPECIFICATIONS FOR
THE EXAMINATION IN RADIOGRAPHY
Publication Date:
August 2010
Implementation Date:
January 2012
The purpose of the ARRT Examination in Radiography is to assess the knowledge and cognitive
skills underlying the intelligent performance of the tasks typically required of the staff technologist at
entry into the profession. To identify the knowledge and skills covered by the examination, the
ARRT periodically conducts practice analysis studies involving a nationwide sample of staff
technologists1. The results of the most recent practice analysis are reflected in this document. The
complete task inventory, which serves as the basis for these content specifications, is available from
our website www.arrt.org.
The table below presents the five major content categories, along with the number and percentage
of test questions appearing in each category. The remaining pages provide a detailed listing of
topics addressed within each major content category.
This document is not intended to serve as a curriculum guide. Although certification programs and
educational programs may have related purposes, their functions are clearly different. Educational
programs are generally broader in scope and address subject matter not included in these content
specifications.
CONTENT CATEGORY                                     PERCENT OF TEST    NUMBER OF QUESTIONS (see note 2)
A. Radiation Protection                                   22.5%                  45
B. Equipment Operation and Quality Control                11.0%                  22
C. Image Acquisition and Evaluation                       22.5%                  45
D. Imaging Procedures                                     29.0%                  58
E. Patient Care and Education                             15.0%                  30
Total                                                      100%                 200
1. A special debt of gratitude is due to the hundreds of professionals participating in this project as
committee members, survey respondents, and reviewers.
2. Each exam includes up to an additional 20 unscored (pilot) questions. On the pages that follow, the
approximate number of test questions allocated to each content category appears in parentheses.
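
As a quick check of the table above, each category's question count is simply its percentage of the 200 scored questions; the short sketch below reproduces that arithmetic and is illustrative only.

# Python sketch converting category weights to question counts (illustrative only).
TOTAL_SCORED_QUESTIONS = 200
CATEGORY_WEIGHTS = {
    "A. Radiation Protection": 22.5,
    "B. Equipment Operation and Quality Control": 11.0,
    "C. Image Acquisition and Evaluation": 22.5,
    "D. Imaging Procedures": 29.0,
    "E. Patient Care and Education": 15.0,
}
for category, percent in CATEGORY_WEIGHTS.items():
    questions = round(TOTAL_SCORED_QUESTIONS * percent / 100)
    print(f"{category}: {percent}% of test = {questions} questions")
# Totals: 45 + 22 + 45 + 58 + 30 = 200 scored questions, plus up to 20 unscored pilot questions.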
F-2
Radiography Practice Analysis Report
A. RADIATION PROTECTION (45)
1. Biological Aspects of Radiation (10)

A. Radiosensitivity
1. dose-response relationships
2. relative tissue radio sensitivities (e.g., LET, RBE)
3. cell survival and recovery (LD50)
4. oxygen effect

B. Somatic Effects
1. short-term versus long-term effects
2. acute versus chronic effects
3. carcinogenesis
4. organ and tissue response (e.g., eye, thyroid, breast, bone marrow, skin, gonadal)

C. Acute Radiation Syndromes
1. CNS
2. hemopoietic
3. GI

D. Embryonic and Fetal Risks

E. Genetic Impact
1. genetic significant dose
2. goals of gonadal shielding

F. Photon Interactions with Matter
1. Compton effect
2. photoelectric absorption
3. coherent (classical) scatter
4. attenuation by various tissues
a. thickness of body part (density)
b. type of tissue (atomic number)

2. Minimizing Patient Exposure (15)

A. Exposure Factors
1. kVp
2. mAs

B. Shielding
1. rationale for use
2. types
3. placement

C. Beam Restriction
1. purpose of primary beam restriction
2. types (e.g., collimators)

D. Filtration
1. effect on skin and organ exposure
2. effect on average beam energy
3. NCRP recommendations (NCRP #102, minimum filtration in useful beam)

E. Exposure Reduction
1. patient positioning
2. automatic exposure control (AEC)
3. patient communication
4. digital imaging
5. pediatric dose reduction
6. ALARA

F. Image Receptors (e.g., types, relative speed, digital versus film)

G. Grids

H. Fluoroscopy
1. pulsed
2. exposure factors
3. grids
4. positioning
5. fluoroscopy time
(Section A continues on the following page)
F-3
Radiography Practice Analysis Report
A. RADIATION PROTECTION (cont.)
3. Personnel Protection (11)

A. Sources of Radiation Exposure
1. primary x-ray beam
2. secondary radiation
a. scatter
b. leakage
3. patient as source

B. Basic Methods of Protection
1. time
2. distance
3. shielding

C. Protective Devices
1. types
2. attenuation properties
3. minimum lead equivalent (NCRP #102)

D. Special Considerations
1. portable (mobile) units
2. fluoroscopy
a. protective drapes
b. protective Bucky slot cover
c. cumulative timer
3. guidelines for fluoroscopy and portable units (NCRP #102, CFR-21)
a. fluoroscopy exposure rates
b. exposure switch guidelines

4. Radiation Exposure and Monitoring (9)

A. Units of Measurement*
1. absorbed dose
2. dose equivalent
3. exposure

B. Dosimeters
1. types
2. proper use

C. NCRP Recommendations for Personnel Monitoring (NCRP #116)
1. occupational exposure
2. public exposure
3. embryo/fetus exposure
4. ALARA and dose equivalent limits
5. evaluation and maintenance of personnel dosimetry records

D. Medical Exposure of Patients (NCRP #160)
1. typical effective dose per exam
2. comparison of typical doses by modality

* Conventional units are generally used. However, questions referenced to specific reports (e.g., NCRP) will
use SI units to be consistent with such reports.
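
Where a conversion between conventional and SI units is needed, the standard factors are 1 gray = 100 rad (absorbed dose) and 1 sievert = 100 rem (dose equivalent). The helper functions below are illustrative only and are not part of the content specifications.

# Python sketch of conventional-to-SI unit conversions (illustrative only).
def rad_to_gray(rad: float) -> float:
    """Absorbed dose: 100 rad = 1 gray (Gy)."""
    return rad / 100.0

def rem_to_sievert(rem: float) -> float:
    """Dose equivalent: 100 rem = 1 sievert (Sv)."""
    return rem / 100.0

print(rad_to_gray(500))     # 5.0 Gy
print(rem_to_sievert(100))  # 1.0 Sv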
F-4
Radiography Practice Analysis Report
B. EQUIPMENT OPERATION AND QUALITY CONTROL (22)
1. Principles of Radiation Physics (9)

A. X-Ray Production
1. source of free electrons (e.g., thermionic emission)
2. acceleration of electrons
3. focusing of electrons
4. deceleration of electrons

B. Target Interactions
1. bremsstrahlung
2. characteristic

C. X-Ray Beam
1. frequency and wavelength
2. beam characteristics
a. quality
b. quantity
c. primary versus remnant (exit)
3. inverse square law
4. fundamental properties (e.g., travel in straight lines, ionize matter)

2. Imaging Equipment (9)

A. Components of Radiographic Unit (fixed or mobile)
1. operating console
2. x-ray tube construction
a. electron sources
b. target materials
c. induction motor
3. automatic exposure control (AEC)
a. radiation detectors
b. back-up timer
c. density adjustment (e.g., +1 or –1)
4. manual exposure controls
5. beam restriction devices

B. X-Ray Generator, Transformers, and Rectification System
1. basic principles
2. phase, pulse, and frequency

C. Components of Fluoroscopic Unit (fixed or mobile)
1. image intensifier
2. viewing systems
3. recording systems
4. automatic brightness control (ABC)

D. Components of Digital Imaging (CR and DR)
1. PSP, photo-stimulable phosphor
2. flat panel detectors - direct and indirect
3. start up and shut down
4. CR plate erasure
5. equipment cleanliness (imaging plates, CR plates)

E. Types of Units
1. dedicated chest unit
2. tomography unit

F. Accessories
1. stationary grids
2. Bucky assembly
3. image receptors

3. Quality Control of Imaging Equipment and Accessories (4)

A. Beam Restriction
1. light field to radiation field alignment
2. central ray alignment

B. Recognition and Reporting of Malfunctions

C. Digital Imaging Receptor Systems
1. artifacts (e.g., non-uniformity, erasure)
2. maintenance (e.g., detector fog)
3. display monitor quality assurance

D. Shielding Accessories (e.g., lead apron and glove testing)
automatic brightness control (ABC)
F-5
Radiography Practice Analysis Report
C. IMAGE ACQUISITION AND EVALUATION (45)
1. Selection of Technical Factors (20)
A. Factors Affecting Radiographic Quality. Refer to Attachment C to clarify terms that may occur on the exam. (X
indicates topics covered on the examination)
1.
Density/Brightness
a. mAs
X
b. kVp
X
c.
OID
d. SID
2.
Contrast/Gray
Scale
grids*
4.
Distortion
X
X
X
X
X
X (air gap)
X
e. focal spot size
f.
3.
Recorded
Detail/Spatial
Resolution
X
X
X
g. filtration
X
X
h. film-screen
X
i.
beam restriction
X
j.
motion
k.
anode heel effect
X
l.
patient factors (size, pathology)
X
X
X
X
X
m. angle (tube, part, or receptor)
X
X
X
X
* Includes conversion factors for grids
B. Technique Charts
1. pre-programmed techniques – anatomically programmed radiography (APR)
2. caliper measurement
3. fixed versus variable kVp
4. special considerations
a. casts
b. anatomic and pathologic factors
c. pediatrics
d. contrast media

C. Automatic Exposure Control (AEC)
1. effects of changing exposure factors on radiographic quality
2. detector selection
3. anatomic alignment
4. density control (+1 or –1)

D. Digital Imaging Characteristics
1. spatial resolution
a. sampling frequency
b. DEL (detector element size)
c. receptor size and matrix size
2. image signal (exposure related)
a. quantum mottle (noise)
b. SNR (signal to noise ratio) or CNR (contrast to noise ratio)
(Section C continues on the following page)
F-6
Radiography Practice Analysis Report
C. IMAGE ACQUISITION AND EVALUATION (cont.)
2. Image Processing and Quality Assurance (12)

A. Image Identification
1. methods (e.g., photographic, radiographic, electronic)
2. legal considerations (e.g., patient data, examination data)

B. Film Screen Processing
1. film storage
2. components*
a. developer
b. fixer
3. maintenance/malfunction
a. start up and shut down procedure
b. possible causes of malfunction (e.g., improper temperature, contamination, replenishment, water flow)

C. Digital Imaging Processing
1. electronic collimation (masking)
2. grayscale rendition (look-up table (LUT), histogram)
3. edge enhancement/noise suppression
4. contrast enhancement
5. system malfunctions (e.g., ghost image, banding, erasure, dead pixels, readout problems)
6. CR reader components

D. Image Display
1. viewing conditions (i.e., luminance, ambient lighting)
2. spatial resolution
3. contrast resolution/dynamic range
4. DICOM gray scale function
5. window level and width function

E. Digital Image Display Informatics
1. PACS
2. HIS
3. RIS (modality work list)
4. Networking (e.g., HL7, DICOM)
5. Workflow (inappropriate documentation, lost images, mismatched images, corrupt data)

* Specific chemicals in the processing solutions will not be covered (e.g., glutaraldehyde).

3. Criteria for Image Evaluation (13)

A. Brightness/Density (e.g., mAs, distance)
B. Contrast/Gray Scale (e.g., kVp, filtration, grids)
C. Recorded Detail (e.g., motion, poor film-screen contact)
D. Distortion (e.g., magnification, OID, SID)
E. Demonstration of Anatomical Structures (e.g., positioning, tube-part-image receptor alignment)
F. Identification Markers (e.g., anatomical, patient, date)
G. Patient Considerations (e.g., pathologic conditions)
H. Image artifacts (e.g., film handling, static, pressure, grid lines, Moiré effect or aliasing)
I. Fog (e.g., age, chemical, radiation, temperature, safelight)
J. Noise
K. Acceptable Range of Exposure
L. Exposure Indicator Determination
M. Gross Exposure Error (e.g., mottle, light or dark, low contrast)
F-7
Radiography Practice Analysis Report
D. IMAGING PROCEDURES (58)
This section addresses imaging procedures for the anatomic regions listed below (1 through 7). Questions will
cover the following topics:
1. Positioning (e.g., topographic landmarks, body positions, path of central ray, immobilization devices).
2. Anatomy (e.g., including physiology, basic pathology, and related medical terminology).
3. Technical factors (e.g., including adjustments for circumstances such as body habitus, trauma, pathology,
breathing techniques).
The specific radiographic positions and projections within each anatomic region that may be covered on the
examination are listed in Attachment A. A guide to positioning terminology appears in Attachment B.
1. Thorax (10)
A. Chest
B. Ribs
C. Sternum
D. Soft Tissue Neck
2. Abdomen and GI
Studies (8)
A. Abdomen
B. Esophagus
C. Swallowing Dysfunction
Study
D. Upper GI Series, Single
or Double Contrast
E. Small Bowel Series
F. Barium Enema, Single or
Double Contrast
G. Surgical Cholangiography
H. ERCP
3. Urological Studies (3)
A. Cystography
B. Cystourethrography
C. Intravenous Urography
D. Retrograde Pyelography
4. Spine and Pelvis (10)
A. Cervical Spine
B. Thoracic Spine
C. Scoliosis Series
D. Lumbar Spine
E. Sacrum and Coccyx
F. Sacroiliac Joints
G. Pelvis and Hip
5. Head (5)
A. Skull
B. Facial Bones
C. Mandible
D. Zygomatic Arch
E. Temporomandibular
Joints
F. Nasal Bones
G. Orbits
H. Paranasal Sinuses
6. Extremities (20)
A. Toes
B. Foot
C. Calcaneus (Os Calcis)
D. Ankle
E. Tibia, Fibula
F. Knee
G. Patella
H. Femur
I. Fingers
J. Hand
K. Wrist
L. Forearm
M. Elbow
N. Humerus
O. Shoulder
P. Scapula
Q. Clavicle
R. Acromioclavicular Joints
F-8
6. Extremities (cont.)
S. Bone Survey
T. Long Bone Measurement
U. Bone Age
V. Soft Tissue/Foreign
Bodies
7. Other (2)
A. Arthrography
B. Myelography
Radiography Practice Analysis Report
E. PATIENT CARE AND EDUCATION (30)
1. Ethical and Legal Aspects (4)

A. Patient's Rights
1. informed consent (e.g., written, oral, implied)
2. confidentiality (HIPAA)
3. additional rights (e.g., Patient's Bill of Rights)
a. privacy
b. extent of care (e.g., DNR)
c. access to information
d. living will; health care proxy
e. research participation

B. Legal Issues
1. examination documentation (e.g., patient history, clinical diagnosis)
2. common terminology (e.g., battery, negligence, malpractice)
3. legal doctrines (e.g., respondeat superior, res ipsa loquitur)
4. restraints versus immobilization

C. ARRT Standards of Ethics

2. Interpersonal Communication (5)

A. Modes of Communication
1. verbal/written
2. nonverbal (e.g., eye contact, touching)

B. Challenges in Communication
1. patient characteristics
2. explanation of medical terms
3. strategies to improve understanding
4. cultural diversity

C. Patient Education
1. explanation of current procedure
2. respond to inquiries about other imaging modalities (e.g., CT, MRI, mammography, sonography, nuclear medicine, bone densitometry regarding dose differences, types of radiation, and patient preps)

3. Infection Control (5)

A. Terminology and Basic Concepts
1. asepsis
a. medical
b. surgical
c. sterile technique
2. pathogens
a. fomites, vehicles, vectors
b. nosocomial infections

B. Cycle of Infection
1. pathogen
2. source or reservoir of infection
3. susceptible host
4. method of transmission
a. contact (direct, indirect)
b. droplet
c. airborne/suspended
d. common vehicle
e. vector borne

C. Standard Precautions
1. handwashing
2. gloves, gowns
3. masks
4. medical asepsis (e.g., equipment disinfection)

D. Additional or Transmission-Based Precautions
1. airborne (e.g., respiratory protection, negative ventilation)
2. droplet (e.g., particulate mask, restricted patient placement)
3. contact (e.g., gloves, gown, restricted patient placement)

E. Disposal of Contaminated Materials
1. linens
2. needles
3. patient supplies (e.g., tubes, emesis basin)
(Section E continues on the following page)
F-9
Radiography Practice Analysis Report
E. PATIENT CARE AND EDUCATION (cont.)
4. Physical Assistance and Transfer (4)

A. Patient Transfer and Movement
1. body mechanics (balance, alignment, movement)
2. patient transfer

B. Assisting Patients with Medical Equipment
1. infusion catheters and pumps
2. oxygen delivery systems
3. other (e.g., nasogastric tubes, urinary catheters, tracheostomy tubes)

C. Routine Monitoring
1. equipment (e.g., stethoscope, sphygmomanometer)
2. vital signs (e.g., blood pressure, pulse, respiration)
3. physical signs and symptoms (e.g., motor control, severity of injury)
4. documentation

5. Medical Emergencies (5)

A. Allergic Reactions (e.g., contrast media, latex)
B. Cardiac or Respiratory Arrest (e.g., CPR)
C. Physical Injury or Trauma
D. Other Medical Disorders (e.g., seizures, diabetic reactions)

6. Pharmacology (3)

A. Patient History
1. medication reconciliation (current medications)
2. premedications
3. contraindications
4. scheduling and sequencing examinations

B. Complications/Reactions
1. local effects (e.g., extravasation/infiltration, phlebitis)
2. systemic effects
a. mild
b. moderate
c. severe
3. emergency medications
4. radiographer's response and documentation

7. Contrast Media (4)

A. Types and Properties (e.g., iodinated, water soluble, barium, ionic versus non-ionic)

B. Appropriateness of Contrast Media to Exam
1. patient condition (e.g., perforated bowel)
2. patient age and weight
3. laboratory values (e.g., BUN, creatinine, GFR)

C. Patient Education
1. verify informed consent
2. instructions regarding preparation, diet, and medications
3. pre- and post-examination instructions (e.g., discharge instructions)

D. Venipuncture
1. venous anatomy
2. supplies
3. procedural technique

E. Administration
1. routes (e.g., IV, oral)
2. supplies (e.g., enema kits, needles)
F-10
Radiography Practice Analysis Report
Attachment A
Radiographic Positions and Projections
1. Thorax
A. Chest
1. PA upright
2. lateral upright
3. AP Lordotic
4. AP supine
5. lateral decubitus
6. anterior and posterior
obliques
B. Ribs
1. AP and PA, above and
below diaphragm
2. anterior and posterior
oblique
C. Sternum
1. lateral
2. RAO breathing technique
3. RAO expiration
4. LAO
5. PA sternoclavicular joints
6. anterior oblique
sternoclavicular joints
D. Soft Tissue Neck
1. AP upper airway
2. lateral upper airway
2. Abdomen and GI studies
A. Abdomen
1. AP supine
2. AP upright
3. lateral decubitus
4. dorsal decubitus
B. Esophagus
1. RAO
2. left lateral
3. AP
4. PA
5. LAO
C. Swallowing Dysfunction Study
D. Upper GI series*
1. AP scout
2. RAO
3. PA
4. right lateral
5. LPO
6. AP
E. Small Bowel Series
1. PA scout
2. PA (follow through)
3. ileocecal spots
4. enteroclysis procedure
F. Barium Enema*
1. left lateral rectum
2. left lateral decubitus
3. right lateral decubitus
4. LPO and RPO
5. PA
6. RAO and LAO
7. AP axial (butterfly)
8. PA axial (butterfly)
9. PA post-evacuation
G. Surgical Cholangiography
1. AP
H. ERCP
1. AP
3. Urological Studies
A. Cystography
1. AP
2. LPO and RPO 60º
3. lateral
4. AP 10-15º caudad
B. Cystourethrography
1. AP voiding
cystourethrogram female
2. RPO 30º, voiding cystogram
male
C. Intravenous Urography
1. AP, scout, and series
2. RPO and LPO 30º
3. PA post-void
4. AP post-void, upright
5. nephrotomography
6. AP ureteric compression
D. Retrograde Pyelography
1. AP scout
2. AP pyelogram
3. AP ureterogram
4. Spine and Pelvis
A. Cervical Spine
1. AP angle cephalad
2. AP open mouth
3. lateral
4. cross table lateral
5. anterior oblique
6. posterior oblique
7. lateral swimmers
8. lateral flexion and extension
9. AP dens (Fuchs)
10. PA dens (Judd)
B. Thoracic Spine
1. AP
2. lateral, breathing
3. lateral, expiration
C. Scoliosis Series
1. AP/PA scoliosis series
(Ferguson)
D. Lumbar Spine
1. AP
2. PA
3. lateral
4. L5-S1 lateral spot
5. posterior oblique 45º
6. anterior oblique 45º
7. AP L5-S1, 30-35º cephalad
8. AP right and left bending
9. lateral flexion and extension
E. Sacrum and Coccyx
1. AP sacrum, 15-25º cephalad
2. AP coccyx, 10-20º caudad
3. lateral sacrum and coccyx,
combined
4. lateral sacrum or coccyx,
separate
* single or double contrast
F-11
F. Sacroiliac Joints
1. AP
2. posterior oblique
3. anterior oblique
G. Pelvis and Hip
1. AP hip only
2. cross-table lateral hip
3. unilateral frog-leg, non-trauma
4. axiolateral inferosuperior,
trauma (Clements-Nakayama)
5. AP pelvis
6. AP pelvis, bilateral frog-leg
7. AP pelvis, axial anterior pelvic
bones (inlet, outlet)
8. anterior oblique pelvis,
acetabulum (Judet)
5. Head
A. Skull
1. AP axial (Towne)
2. lateral
3. PA (Caldwell)
4. PA no angle
5. submentovertical (full basal)
6. PA 25-30º angle (Haas)
7. trauma cross table lateral
8. trauma AP, 15º cephalad
9. trauma AP, no angle
10. trauma AP, axial (Towne)
B. Facial Bones
1. lateral
2. parietoacanthial (Waters)
3. PA (Caldwell)
4. PA (modified Waters)
C. Mandible
1. axiolateral oblique
2. PA no angle
3. AP axial (Towne)
4. PA semi-axial, 20-25º
cephalad
5. PA (modified Waters)
6. submentovertical (full basal)
D. Zygomatic Arch
1. submentovertical (full basal)
2. parietoacanthial (Waters)
3. AP axial (Towne)
4. axial oblique
5. lateral
E. Temporomandibular Joints
1. lateral (Law)
2. lateral (Schuller)
3. AP axial (Towne)
F. Nasal Bones
1. parietoacanthial (Waters)
2. lateral
3. PA (Caldwell)
G. Orbits
1. parietoacanthial (Waters)
2. lateral
3. PA (Caldwell)
H. Paranasal Sinuses
1. lateral
2. PA (Caldwell)
3. parietoacanthial (Waters)
4. submentovertical (full basal)
5. open mouth parietoacanthial
(Waters)
Radiography Practice Analysis Report
6. Extremities
A. Toes
1. AP, entire foot
2. oblique toe
3. lateral toe
B. Foot
1. AP angle toward heel
2. medial oblique
3. lateral oblique
4. mediolateral
5. lateromedial
6. sesamoids, tangential
7. AP weight bearing
8. lateral weight bearing
C. Calcaneus (Os Calcis)
1. lateral
2. plantodorsal, axial
3. dorsoplantar, axial
D. Ankle
1. AP
2. AP mortise
3. mediolateral
4. oblique, 45º internal
5. lateromedial
6. AP stress views
E. Tibia, Fibula
1. AP
2. lateral
3. oblique
F. Knee
1. AP
2. lateral
3. AP weight bearing
4. lateral oblique 45º
5. medial oblique 45º
6. PA
7. PA axial – intercondylar fossa (tunnel)
G. Patella
1. lateral
2. supine flexion 45º (Merchant)
3. PA
4. prone flexion 90º (Settegast)
5. prone flexion 55º (Hughston)
H. Femur
1. AP
2. mediolateral
I. Fingers
1. PA entire hand
2. PA finger only
3. lateral
4. oblique
5. AP thumb
6. oblique thumb
7. lateral thumb
J. Hand
1. PA
2. lateral
3. oblique
K. Wrist
1. PA
2. oblique 45º
3. lateral
4. PA for scaphoid
5. scaphoid (Stecher)
6. carpal canal
L. Forearm
1. AP
2. lateral
M. Elbow
1. AP
2. lateral
3. external oblique
4. internal oblique
5. AP partial flexion
6. axial trauma (Coyle)
N. Humerus
1. AP non-trauma
2. lateral non-trauma
3. AP neutral trauma
4. scapular Y trauma
5. transthoracic lateral trauma
6. lateral, mid and distal, trauma
O. Shoulder
1. AP internal and external rotation
2. inferosuperior axial, non-trauma
3. posterior oblique (Grashey)
4. tangential non-trauma
5. AP neutral trauma
6. transthoracic lateral trauma
7. scapular Y trauma
P. Scapula
1. AP
2. lateral, anterior oblique
3. lateral, posterior oblique
Q. Clavicle
1. AP
2. AP angle, 15-30º cephalad
3. PA angle, 15-30º caudad
R. Acromioclavicular Joints – AP Bilateral With and Without Weights
S. Bone Survey
T. Long Bone Measurement
U. Bone Age
V. Soft Tissue/Foreign Body
7. Other Procedures
A. Arthrography
B. Myelography
Attachment B
Standard Terminology
for Positioning and Projection
Radiographic View: Describes the body part as seen by the image receptor or other recording
medium, such as a fluoroscopic screen. Restricted to the discussion of a radiograph or image.
Radiographic Position: Refers to a specific body position, such as supine, prone, recumbent,
erect, or Trendelenburg. Restricted to the discussion of the patient’s physical position.
Radiographic Projection: Restricted to the discussion of the path of the central ray.
POSITIONING TERMINOLOGY

A. Lying Down
   1. supine: lying on the back
   2. prone: lying face downward
   3. decubitus: lying down with a horizontal x-ray beam
   4. recumbent: lying down in any position

B. Erect or Upright
   1. anterior position: facing the image receptor
   2. posterior position: facing the radiographic tube
   3. oblique position: erect or lying down
      a. anterior (facing the image receptor)
         i. left anterior oblique: body rotated with the left anterior portion closest to the image receptor
         ii. right anterior oblique: body rotated with the right anterior portion closest to the image receptor
      b. posterior (facing the radiographic tube)
         i. left posterior oblique: body rotated with the left posterior portion closest to the image receptor
         ii. right posterior oblique: body rotated with the right posterior portion closest to the image receptor
The attachment also illustrates the following standard projections and positions: Anteroposterior Projection, Posteroanterior Projection, Right Lateral Position, Left Lateral Position, Left Posterior Oblique Position, Right Posterior Oblique Position, Left Anterior Oblique Position, and Right Anterior Oblique Position.
Attachment C
ARRT Standard Definitions
Each film-screen radiography term below is paired with its digital radiography counterpart.

Recorded Detail (film-screen): The sharpness of the structural lines as recorded in the radiographic image.
Spatial Resolution (digital): The sharpness of the structural edges recorded in the image.

Density (film-screen): Radiographic density is the degree of blackening or opacity of an area in a radiograph due to the accumulation of black metallic silver following exposure and processing of a film. Density = log (incident light intensity / transmitted light intensity).
Brightness (digital): Brightness is the measurement of the luminance of a monitor calibrated in units of candela (cd) per square meter on a monitor or soft copy. Density on a hard copy is the same as film.

Contrast (film-screen): Radiographic contrast is defined as the visible differences between any two selected areas of density levels within the radiographic image. Scale of contrast refers to the number of densities visible (or the number of shades of gray). Long scale is the term used when slight differences between densities are present (low contrast) but the total number of densities is increased. Short scale is the term used when considerable or major differences between densities are present (high contrast) but the total number of densities is reduced.
Contrast (digital): Image contrast or display contrast is determined primarily by the processing algorithm (mathematical codes used by the software to provide the desired image appearance). The default algorithm determines the initial processing codes applied to the image data. Scale of contrast is synonymous with "gray scale" and is linked to the bit depth of the system; "gray scale" is used instead of "scale of contrast" when referring to digital images.

Film Latitude (film-screen): The inherent ability of the film to record a long range of density levels on the radiograph. Film latitude and film contrast depend upon the sensitometric properties of the film and the processing conditions, and are determined directly from the characteristic H and D curve.
Dynamic Range (digital): The range of exposures that may be captured by a detector. The dynamic range for digital imaging is much larger than for film.

Film Contrast (film-screen): The inherent ability of the film emulsion to react to radiation and record a range of densities.
Receptor Contrast (digital): The fixed characteristic of the receptor. Most digital receptors have an essentially linear response to exposure. This is impacted by contrast resolution (the smallest exposure change or signal difference that can be detected). Ultimately, contrast resolution is limited by the dynamic range and the quantization (number of bits per pixel) of the detector.

Exposure Latitude (film-screen): The range of exposure factors which will produce a diagnostic radiograph.
Exposure Latitude (digital): The range of exposures which produces quality images at appropriate patient dose.

Subject Contrast (film-screen): The difference in the quantity of radiation transmitted by a particular part as a result of the different absorption characteristics of the tissues and structures making up that part.
Subject Contrast (digital): The magnitude of the signal difference in the remnant beam.
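As a brief illustration (not part of the ARRT definitions above), the density formula and the bit-depth relationship can be made concrete with a small worked example in Python; the function name is illustrative only.

    # Worked example for the definitions above (illustrative only).
    import math

    def optical_density(incident_intensity, transmitted_intensity):
        # Density = log10(incident light intensity / transmitted light intensity)
        return math.log10(incident_intensity / transmitted_intensity)

    # A film region transmitting 1% of the incident light has a density of 2.0.
    print(optical_density(100.0, 1.0))  # 2.0

    # Quantization: a detector digitizing 12 bits per pixel can represent
    # 2**12 = 4096 distinct gray levels, which bounds its contrast resolution.
    print(2 ** 12)  # 4096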
Appendix G
2012 Didactic and Clinical Competency Requirements
for the Radiography Examination
RADIOGRAPHY
DIDACTIC AND CLINICAL COMPETENCY REQUIREMENTS
Eligibility Requirements Effective January 2012*
Candidates for certification are required to meet the Professional Requirements specified in Article II of
the ARRT Rules and Regulations. This document identifies the minimum didactic and clinical
competency requirements for certification referenced in the Rules and Regulations. Candidates who
complete a formal educational program accredited by a mechanism acceptable to the ARRT will have
obtained education and experience beyond the requirements specified here.
Didactic Requirements
Candidates must successfully complete coursework addressing the topics listed in the ARRT Content
Specifications for the Examination in Radiography. These topics are presented in a format suitable
for instructional planning in the ASRT Radiography Curriculum (2007).
Clinical Requirements
As part of their educational program, candidates must demonstrate competence in the clinical
activities identified in this document. Demonstration of clinical competence means that the program
director or designee has observed the candidate performing the procedure, and that the candidate
performed the procedure independently, consistently, and effectively. Candidates must demonstrate
competence in the areas listed below.
- Six mandatory general patient care activities.
- Thirty-one mandatory imaging procedures.
- Fifteen elective imaging procedures to be selected from a list of 35 procedures.
- One elective imaging procedure from the head section.
- Two elective imaging procedures from the fluoroscopy studies section, one of which must be either an Upper GI or a Barium Enema.
Documentation
The following pages identify specific clinical competency requirements. Candidates may wish to use
these pages, or their equivalent, to record completion of the requirements. The pages do NOT need to
be sent to the ARRT.
To document that the didactic and clinical requirements have been satisfied, candidates must have the
program director (and authorized faculty member if required) sign the ENDORSEMENT SECTION of the
Application for Certification included in the Certification Handbook.
_______________________
* Note: Candidates who complete their educational program during 2012 or 2013 may use either the
previous requirements (effective 2005) or the current requirements (effective 2012). Candidates who
graduate after December 31, 2013 may no longer use the previous competency requirements.
Radiography
Clinical Competency Requirements
The clinical competency requirements include the six general patient care activities listed below
and a subset of the 66 imaging procedures identified on subsequent pages. Demonstration of
competence should include variations in patient characteristics (e.g., age, gender, medical
condition).
1. General Patient Care
Requirement: Candidates must demonstrate competence in all six patient care activities listed below.
The activities should be performed on patients; however, simulation is acceptable (see footnote) if state
or institutional regulations prohibit candidates from performing the procedures on patients.
General Patient Care (record Date Completed and Competence Verified By for each activity)
1. CPR
2. Vital signs (blood pressure, pulse, respiration)
3. Sterile and aseptic technique
4. Venipuncture
5. Transfer of patient
6. Care of patient medical equipment (e.g., oxygen tank, IV tubing)
Note: The ARRT requirements specify that certain clinical procedures may be simulated. Simulations
must meet the following criteria: (a) the student is required to competently demonstrate skills as similar as
circumstances permit to the cognitive, psychomotor, and affective skills required in the clinical setting; (b)
the program director is confident that the skills required to competently perform the simulated task will
generalize or transfer to the clinical setting, and, if applicable, the student will evaluate related images.
Examples of acceptable simulation include: demonstrating CPR on a mannequin, positioning a fellow
student for a projection without actually activating the x-ray beam, and performing venipuncture by
demonstrating aseptic technique on another person, but then inserting the needle into an artificial forearm
or grapefruit.
2. Imaging Procedures
Requirement: Candidates must demonstrate competence in all 31 procedures identified as mandatory
(M). Procedures should be performed on patients; however, up to eight mandatory procedures may be
simulated (see previous page) if demonstration on patients is not feasible.
Candidates must demonstrate competence in 15 of the 35 elective (E) procedures. Candidates must
select one elective procedure from the head section. Candidates must select either Upper GI or Barium
Enema plus one other elective from the fluoroscopy section. Elective procedures should be performed
on patients; however, electives may be simulated (see previous page) if demonstration on patients is not
feasible.
Institutional protocol will determine the positions or projections used for each procedure.
Demonstration of competence includes requisition evaluation, patient assessment, room preparation,
patient management, equipment operation, technique selection, positioning skills, radiation safety,
image processing, and image evaluation.
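The elective-selection rules above amount to a small set of counting constraints. The following minimal sketch is a hypothetical Python helper, not an ARRT tool; the procedure names are abbreviated and the function name is invented for illustration. The imaging procedures themselves are listed after the sketch.

    # Minimal sketch of the elective-selection rules (hypothetical helper).
    HEAD_ELECTIVES = {"Skull", "Paranasal Sinuses", "Facial Bones", "Orbits",
                      "Zygomatic Arches", "Nasal Bones", "Mandible"}
    FLUORO_ELECTIVES = {"Upper GI Series", "Barium Enema", "Small Bowel Series",
                        "Esophagus", "Cystography/Cystourethrography", "ERCP",
                        "Myelography", "Arthrography"}

    def electives_satisfy_rules(selected):
        """Return True if a set of 15 elective procedures meets the rules above."""
        if len(selected) != 15:
            return False                      # exactly 15 electives required
        fluoro = selected & FLUORO_ELECTIVES
        has_head_elective = bool(selected & HEAD_ELECTIVES)
        has_ugi_or_be = bool(fluoro & {"Upper GI Series", "Barium Enema"})
        return has_head_elective and len(fluoro) >= 2 and has_ugi_or_be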
Imaging Procedure (M = mandatory, E = elective; for each procedure, record Date Completed, Patient or Simulated, and Competence Verified By)

Chest and Thorax
1. Chest Routine (M)
2. Chest AP (Wheelchair or Stretcher) (M)
3. Ribs (M)
4. Chest Lateral Decubitus (E)
5. Sternum (E)
6. Upper Airway (Soft-Tissue Neck) (E)

Upper Extremity
7. Thumb or Finger (M)
8. Hand (M)
9. Wrist (M)
10. Forearm (M)
11. Elbow (M)
12. Humerus (M)
13. Shoulder (M)
14. Trauma: Shoulder (Scapular Y, Transthoracic or Axillary)* (M)
15. Clavicle (E)
16. Scapula (E)
17. AC Joints (E)
18. Trauma: Upper Extremity (Non-Shoulder)* (M)
* Trauma is considered a serious injury or shock to the body. Modifications may include variations in
positioning, minimal movement of the body part, etc.
Lower Extremity
19. Toes (E)
20. Foot (M)
21. Ankle (M)
22. Knee (M)
23. Tibia-Fibula (M)
24. Femur (M)
25. Trauma: Lower Extremity* (M)
26. Patella (E)
27. Calcaneus (Os Calcis) (E)

Head – Candidates must select at least one elective procedure from this section.
28. Skull (E)
29. Paranasal Sinuses (E)
30. Facial Bones (E)
31. Orbits (E)
32. Zygomatic Arches (E)
33. Nasal Bones (E)
34. Mandible (E)

Spine and Pelvis
35. Cervical Spine (M)
36. Trauma: Cervical Spine (Cross Table Lateral)* (E)
37. Thoracic Spine (M)
38. Lumbar Spine (M)
39. Pelvis (M)
40. Hip (M)
41. Cross Table Lateral Hip (M)
42. Sacrum and/or Coccyx (E)
43. Scoliosis Series (E)
44. Sacroiliac Joints (E)
Abdomen
45. Abdomen Supine (KUB) (M)
46. Abdomen Upright (M)
47. Abdomen Decubitus (E)
48. Intravenous Urography (E)
* Trauma is considered a serious injury or shock to the body. Modifications may include variations in
positioning, minimal movement of the body part, etc.
Fluoroscopy Studies – Candidates must select either Upper GI or Barium Enema plus one other elective procedure from this section.
49. Upper GI Series (Single or Double Contrast) (E)
50. Barium Enema (Single or Double Contrast) (E)
51. Small Bowel Series (E)
52. Esophagus (E)
53. Cystography/Cystourethrography (E)
54. ERCP (E)
55. Myelography (E)
56. Arthrography (E)

Surgical Studies
57. C-Arm Procedure (Orthopedic) (M)
58. C-Arm Procedure (Non-Orthopedic) (E)

Mobile Studies
59. Chest (M)
60. Abdomen (M)
61. Orthopedic (M)

Pediatrics (age 6 or younger)
62. Chest Routine (M)
63. Upper Extremity (E)
64. Lower Extremity (E)
65. Abdomen (E)
66. Mobile Study (E)
Appendix H
Spring 2010 Radiography Managers Survey Questionnaire
RADIOGRAPHY
PRACTICE ANALYSIS QUESTIONNAIRE
Dear Radiology Manager:
The American Registry of Radiologic Technologists is revising the content specifications and
clinical competencies for the examination in radiography. It is our philosophy that a certification
exam should be based on the job responsibilities of practicing technologists. Therefore we are
asking a select group of managers to inform us about current radiographic procedures in today’s
workplace.
You are one of the carefully selected professionals from whom the ARRT is requesting input. On
the questionnaire, we have assembled a list of procedures that may be performed by radiologic
technologists. This list is not all-inclusive and contains only selected procedures. The survey
takes about 30 minutes to complete. For your convenience, we have included CPT® codes for
most procedures. If you have more than one job, please consider the survey for the workplace in
which you hold a management position, preferably full time. Since this questionnaire is being sent
to only a sample of managers across the country, rather than to all, it is important that you return it.
Your answers represent hundreds of your colleagues.
Please complete the questionnaire and return it within one week. We have included a postage-paid
envelope for your convenience. Simply enclose the questionnaire, seal the envelope, and
drop it in the mail.
You may be assured of the complete confidentiality of your responses. Individual responses will
not be released to anyone under any circumstances. If you have any questions, please call 651-681-3145.
Thank you very much for taking time from your busy schedule to assist the ARRT with this project.
Your participation helps to assure the integrity of the certification process.
Respectfully,
Jerry B. Reid, PhD
Executive Director
March 2010
Section One: Procedures
Directions: Please indicate your answers to the following questions for each procedure in the table below:
A. Was this procedure performed in your facility in 2009? If no, skip ahead to the next question.
B. Indicate how often these procedures were performed in your facility during 2009.
C. Is the procedure performed by entry-level radiographers (0-3 years of experience) or is it only
performed by more experienced radiographers?
D. How many FTEs are available to perform this procedure?
Position patient, x-ray tube, and image receptor to produce the following diagnostic images.
For each procedure (CPT® code), respondents completed columns A through D as described in the directions above; the example row shows a completed entry.

0. Example (99999): A = Yes; B = 350; C = Yes; D = 12
84. Sternum (71130)
85. Esophagus (74220)
86. Swallowing dysfunction study (70371)
87. Barium enema, single contrast (74270)
88. Barium enema, double contrast (74280)
89. Therapeutic enema (74283)
90. Surgical cholangiography (74300)
91. ERCP (74329)
92. Cystography (74430)
93. Cystourethrography, retrograde (74450)
94. Cystourethrography, voiding (74455)
95. Intravenous urography (74400)
96. Retrograde pyelography (74420)
97. Scoliosis series, standing only (72069)
98. Scoliosis series, supine and erect (72090)
99. Sacroiliac joints, less than 3 views (72200)
100. Sacroiliac joints, 3 or more views (72202)
101. Knee, 3 views (73562)
102. Knee complete, 4 or more views (73564)
103. Facial bones, less than 3 views (70140)
104. Facial bones, minimum of 3 views (70150)
105. Mandible, less than 4 views (70100)
106. Mandible, minimum of 4 views (70110)
107. Zygomatic arches
108. Temporomandibular joints, unilateral (70328)
109. Temporomandibular joints, bilateral (70330)
110. Nasal bones (70160)
111. Orbits for MRI screening (70200)
112. Paranasal sinuses, less than 3 views (70210)
113. Paranasal sinuses, 3 or more views (70220)
114. Scapula (73010)
115. Acromioclavicular joints (73050)
116. Bone survey (77075)
117. Long bone measurement (77073)
118. Bone age (77072)
119. Soft tissue/foreign body

Assist radiologist with the following invasive procedures:
120. Arthrography (77002)
121. Cervical myelography (72240)
122. Thoracic myelography (72255)
123. Lumbosacral myelography (72265)
124. Venography unilateral (75820)
125. Venography bilateral (75822)
Position patient and operate CT scanner to produce the following diagnostic images:
43. CT head or brain without contrast (70450)
44. CT head or brain with contrast (70460)
45. CT C-spine without contrast (72125)
46. CT C-spine with contrast (72126)
47. CT thorax without contrast (71250)
48. CT thorax with contrast (71260)
49. CT chest for PE (71275)
50. CT abdomen without contrast (74150)
51. CT abdomen with contrast (74160)
52. CT pelvis without contrast (72192)
53. CT pelvis with contrast (72193)
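Responses to the Section One columns lend themselves to straightforward tabulation. The sketch below is a hypothetical illustration (the field names and the summarize function are invented here, not the ARRT's analysis code) of how columns A and C might be summarized across responding facilities.

    # Hypothetical tabulation sketch; field names are assumptions, not ARRT's.
    def summarize(responses):
        """responses: one dict per facility, e.g.
        {"performed": True, "times": 350, "entry_level": True, "ftes": 12}"""
        performed = [r for r in responses if r["performed"]]
        entry_level = [r for r in performed if r["entry_level"]]
        n = len(responses)
        return {
            "pct_facilities_performing": 100.0 * len(performed) / n if n else 0.0,
            "pct_entry_level_among_performing":
                100.0 * len(entry_level) / len(performed) if performed else 0.0,
        }

    print(summarize([
        {"performed": True, "times": 350, "entry_level": True, "ftes": 12},
        {"performed": True, "times": 40, "entry_level": False, "ftes": 3},
        {"performed": False, "times": 0, "entry_level": False, "ftes": 0},
    ]))  # about 66.7% of facilities performing; 50.0% of those use entry-level staff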
Section Two: Demographics
1. Which of the following best describes your place of employment?
   o Hospital
   o Clinic
   o Private office
   o Other _______________________

2. If you work in a hospital/medical center, what is its approximate size (number of beds)?
   o Less than 100
   o 100 to 250
   o 251 to 500
   o More than 500

3. Which of the following best describes the community where you work?
   o Urban
   o Suburban
   o Rural/small town

4. How many radiographers (FTEs) are employed in the facility where you work?
   o 1
   o 2
   o 3 – 5
   o 6 – 10
   o 11 – 15
   o 16 – 50
   o 51 – 100
   o More than 100

5. How many years have you worked as a manager/administrator?
   o Less than one year
   o 1 – 3 years
   o 4 – 5 years
   o 6 – 10 years
   o 11 – 20 years
   o More than 20 years

6. About how many patients are seen on an average day in your department?
   o 1 – 50
   o 51 – 100
   o 101 – 250
   o 250 or more

7. Are CT procedures being performed in your facility?
   o Yes
   o No (if no, skip questions 8 and 9)

8. Do any entry-level (0-3 years of experience) radiographers perform CT procedures in your facility?
   o Yes
   o No (if no, skip question 9)

9. About what percent of work time do the entry-level radiographers spend performing CT?
   o 1 – 5%
   o 6 – 25%
   o 26 – 50%
   o 51 – 75%
   o 76 – 100%

10. How many entry-level radiographers (FTEs) are employed in the facility where you work? _____________________________
Appendix I
Professional Comment Process
Professional Comment Process
The revised drafts of the proposed Content Specifications for the Examination in Radiography and the
Clinical Competency Requirements were posted on the ARRT website for professional comment from
June 10, 2010 to June 29, 2010. Postcards were sent to 752 radiography educational program directors
inviting them to complete a survey on the newly proposed changes. The survey was also open to other
interested individuals. Below are tables that summarize the results.
Total number of people who commented: 128
Persons who indicated that they were radiography educators: 119
Persons who commented on content specifications: 110
Persons who commented on clinical requirements: 108
Content Specifications
Positive comments (64): 9 approved of removing CT; 3 agreed with reducing film-screen content; 52 generally agreed with the changes (41 persons*, 825 words).
Negative comments (77): 10 disapproved of removing CT; 20 wanted more film-screen content removed or questioned why some areas were removed and not others; 47 addressed various areas of the document (63 persons*, 4,543 words).
Unusable comments: 6
Clinical Requirements
Positive comments (63): 12 approved of removing CT; 7 agreed with the reduction of mandatory procedures; 42 agreed with the changes (52 persons*, 1,465 words).
Negative comments (63): 13 disapproved of removing CT; 31 wanted more mandatory procedures; 19 commented on various areas of the document (53 persons*, 3,913 words).
Unusable comments: 3

* Some persons made comments on more than one area, so the total number of comments does not equal the total number of persons.
Appendix J
Weighting Exercise
ARRT Radiography Content Specifications Topic Weights Survey
Listed below are the five major sections on the Radiography exam content specifications. For each of the five major categories, please indicate the percentage of test questions that you believe should be allocated to that category. The percentages should add to 100%.

A. Radiation Protection
B. Equipment Operation & Maintenance
C. Image Acquisition & Evaluation
D. Imaging Procedures
E. Patient Care and Education
Total = 100%

For each of the subcategories listed below, indicate the percentage of test questions that you believe should be assigned to that category. The percentages should add to 100%.

Radiation Protection
I. Biological Aspects of Radiation
II. Minimizing Patient Exposure
III. Personnel Protection
IV. Radiation Exposure & Monitoring
Total = 100%

Equipment Operation and Quality Control
I. Principles of Radiation Physics
II. Imaging Equipment
III. Quality Assurance of Imaging Equipment & Accessories
Total = 100%

Image Acquisition and Evaluation
I. Selection of Technical Factors
II. Image Processing & Quality Assurance
III. Criteria for Image Evaluation
Total = 100%

Imaging Procedures
I. Thorax
II. Abdomen and GI Studies
III. Spine and Pelvis
IV. Head
V. Extremities
VI. CT
Total = 100%

Patient Care
I. Ethical and Legal Aspects
II. Interpersonal Communication
III. Infection Control
IV. Physical Assistance and Transfer
V. Medical Emergencies
VI. Pharmacology
VII. Contrast Media
Total = 100%

Total Number of Questions
Please indicate the number of questions you believe the exam should have: __________
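Because each set of weights is a percentage distribution summing to 100%, the weights translate directly into per-section question counts once an exam length is chosen. The sketch below is purely illustrative; the weights shown are invented, not survey results.

    # Illustrative only: convert section weights (percent) into question counts.
    def questions_per_section(weights, total_questions):
        if abs(sum(weights.values()) - 100.0) > 1e-6:
            raise ValueError("Section weights must sum to 100%.")
        # Rounding may leave the counts a question off; a real allocation
        # would adjust one section to absorb the difference.
        return {name: round(total_questions * pct / 100.0)
                for name, pct in weights.items()}

    example_weights = {                      # invented values for illustration
        "Radiation Protection": 20.0,
        "Equipment Operation & Maintenance": 12.0,
        "Image Acquisition & Evaluation": 25.0,
        "Imaging Procedures": 30.0,
        "Patient Care and Education": 13.0,
    }
    print(questions_per_section(example_weights, 200))
    # e.g., 40, 24, 50, 60, and 26 questions respectively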
Appendix K
References
References
American Educational Research Association, American Psychological Association, & National
Council on Measurement in Education (1999). Standards for Educational and
Psychological Testing. Washington, DC: American Educational Research Association.
Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor,
& Department of Justice. (1978). Adoption by four agencies of uniform guidelines on
employee selection procedures. Federal Register, 43(166), 38290-38315.
National Commission for Certifying Agencies (2004). Standards for the accreditation of
certification programs. Washington, DC: Author.
National Commission of Health Certifying Agencies (1981). Task force report on education and
certification. Washington, DC: Author.
Raymond, M.R. (2001). Job analysis and the specification of content for licensure and
certification examinations. Applied Measurement in Education, 14, 369-415.
Reid, J.B. (1983). ARRT Job analysis project. Applied Radiology, 12, 27-32.