Baltimore’s Workforce System at Work
Training Investment Analyses
________________________________________________________________________________________________
2.8 Results of the Training Investment Study
In this section we focus on the question “what are we getting for our public investment in
training?” We review the evaluation strategies available, and the results of studies measuring the
dollar value of training. We then give the results of our own analyses of the customized training and
ITA-funded training provided by the local public workforce investment system to 216 individuals in
Baltimore. We present the results of both a cost-effectiveness analysis and a return on investment
analysis. We conclude this section with some recommendations for future action, based on these findings.
Why is this important? There is a need to determine the costs and benefits of
public investments in workforce development in a systematic fashion, so that
comparisons can be made within the same program over time and between
different investment options. Cost-effectiveness approaches and training investment analyses can demonstrate the value of workforce development efforts in
terms that the business and legislative communities understand.
2.8.1 The value of training: what do we know, and how do we know it?
Training magazine has estimated that American businesses budget almost $60 billion a year for
formal training.1 The American Society for Training and Development (ASTD) calculates each
year how that total spending breaks down into several “key ratios of training” (see sidebar).

How much training do U.S. firms do?
Total training expenditures per training-eligible employee: $761
Total training expenditures as % of payroll: 1.9%
Percent of training-eligible employees trained: 78.5%
Training-eligible employees to trainer ratio: 367
Percent of training time via classroom: 77.1%
Percent of training time via learning technologies: 10.5%
Payments to outside companies as % of total training expenditures: 20.9%
Total training hours per training-eligible employee: 23.7
Source: ASTD, 2002

In 2001, U.S. firms provided almost 24 hours of training to each employee, and spent about $761
per employee, which is equivalent to about 2% of their total payroll. Yet for all this activity and
spending, there are only a handful of studies that have attempted to establish the generic “value
of training” and then to put a specific dollar value on it.
Studies of the value of training:
One of the earliest studies of publicly-funded training was by Holzer et al,2 who explored the
effects of the Michigan Job Opportunity Bank Upgrade program. This was a state-financed
training program for manufacturing firms that between 1986 and 1990 administered over 400
grants averaging $16,000 per firm. Using data from both successful and unsuccessful applicants
1. “1997 Industry Report”, Training magazine, October.
2. “Are training subsidies for firms effective? The Michigan Experience”, by Holzer H, Block R, Cheatham M, and Knott J, Industrial and Labor Relations Review, v46(4), July 1993, pp625-635.
to the program, the authors were able to look at total hours of training in the firms and the product
scrap rate. They found receipt of training grants was associated with a large and significant – but
one-time – increase in training hours, and also with a more lasting reduction in scrap rates.
Specifically, they calculated that every additional hour of training stimulated by the grants cost
roughly $6 to $7 in government funds.
Bartel looked beyond these specific indices to consider the impact of training programs on labor
productivity as a whole.3 She looked at personnel policies and economic characteristics of 595
manufacturers from a 1986 Columbia Business School survey, to measure the impact of formal
training programs on labor productivity. She found that businesses operating below their expected
labor productivity levels in 1983, but which also implemented new employee training programs
after that date, realized significant increases in their rate of productivity growth between 1983 and
1986. This increase was enough to bring those businesses up to the labor productivity levels of
comparable organizations.

Employers! Here’s what these research studies have shown training can do for you:
 reduce scrap rates
 increase productivity and the rate of productivity growth
 complement your other investments in new physical capital
 increase your total shareholder return
 increase your profit margins
 increase company income per employee
 increase price-to-book ratios.

Black and Lynch attempted to link human capital investments and productivity, using data from
1990 and 1993 on over 2,900 establishments from the National Center on the Educational Quality
of the Workforce’s “National Employers Survey”.4
Employers Survey”.4 They found that the average educational level within the establishment has
significant positive effects within both manufacturing and non-manufacturing sectors, and they
estimated that a 10% increase in “average educational level” would lead to 8.5%-12.7% increases
in productivity. Their other findings were that past training increases current productivity, that
off-the-job training has a greater impact than other types, that computer-skills development has a
positive impact regardless of industry, and that employers who focus on grade levels when hiring
also experience significantly higher productivity than competitors.
Lynch and Black next used data from a 1994 survey to examine how the incidence, content, and
extent of employer-provided training were linked to workplace practices and characteristics and
physical capital investments.5
physical capital investments.5 They found substantial variation by employer size and industry in
the provision of formal training, and that the percentage of workers given training was highest in
establishments that had made large investments in physical capital, or had adopted new forms of
work organization. They suggested their findings show employer-provided training complements,
rather than substitutes for, investments in physical capital and education.
3. “Productivity Gains from the Implementation of Employee Training Programs”, by Bartel A, Industrial Relations, v33(4), October 1994, pp411-425.
4. “Human Capital Investments and Productivity”, by Black S and Lynch L, AEA Papers and Proceedings, v86(2), May 1996, pp263-267.
5. “Beyond the Incidence of Employer-Provided Training”, by Lynch L and Black S, Industrial and Labor Relations Review, v52(1), Oct. 1998, pp64-79.
Bassi et al6 attempted to answer the question “do firms’ investments in education and training
pay off?” Using 1996-98 data on 575 publicly-traded U.S. firms in the ASTD Benchmarking
Service database, they found that training investments are significantly related to later total
shareholder return (TSR). Specifically, an increase of $680 of training expenditures per employee
was associated with, on average, a 6% improvement in total shareholder return the following
year, even after controlling for other factors like industry type, company size, and prior financial
performance. Firms with above median training investments for the group had a TSR that was
86% higher than firms below, and 45% higher than the stock market average. Firms in the top
quartile also enjoyed significantly higher profit margins, income per employee, and price-to-book
ratios, than the firms in the bottom quartile. The authors suggest these results can be used inside
firms to justify training compared to other potential investments, and outside of firms as a way for
stock market analysts to predict share values.
How do organizations evaluate training?
Human resources, organizational development, and “workplace learning for performance”
professionals, commonly draw on the “Kirkpatrick model” when structuring their evaluations of
training.7 This scheme has four “levels”, and the share of all organizations in the ASTD
Benchmarking Service database8 attempting each different level in 2001 is shown in the bar chart
below.
Use of evaluation strategies for training: share of organizations attempting each Kirkpatrick level
“Reaction” (level 1): 91%
“Learning” (level 2): 36%
“Behavior” (level 3): 17%
“Results” (level 4): 9%
Source: ASTD, 2002
Level 1 is the “Reaction” level, measuring the learner’s satisfaction with the course, its content,
delivery method, and instructor. This is the easiest method of evaluation, requiring only a
simple survey form administered at completion, and so is undertaken in over 90% of
organizations. However, it is also in some ways the least informative method.
6. Profiting from learning: do firms’ investments in education and training pay off? Research White Paper by Laurie Bassi, Jens Ludwig, Dan McMurrer, and Mark Van Buren, for ASTD and SABA, Alexandria, VA, Sep. 2000.
7. Kirkpatrick D (1998) Evaluating Training Programs, 2nd edition, Berrett-Koehler Publishers, San Francisco.
8. In this total, 88% are private corporations and 12% are from the public sector.
The second level is the "Learning" level, focusing on trainee mastery and retention of content.
This usually requires a final exam or rating of the trainee by the supervisor, and is used by just
over one-third of all organizations.
The third level is the "Behavior" level, which focuses on whether the trainee did anything
different or performed better, as a result of the training. This method has seen a surge in the
number of organizations using it in recent years, because of the shift away from passive
acceptance of exam results and towards a more dynamic “demonstration of competencies”
approach, and because of the growing availability of process-monitoring technologies at the
individual work-station level. Level 3 evaluation is now used by about 17% of organizations.
The fourth and final level in Kirkpatrick’s original scheme is the “Results” level. This focuses on
the “bottom-line” implications of training: productivity increases, the value of additional units
made, or sales achieved, the reductions in costs through lower error and scrap rates, reduced staff
turnover, increased customer satisfaction, repeat customer business, and so on. This fourth level
produces the kind of hard business information most often sought by managers, but it also
requires a more sophisticated study design, more detailed measurement and tracking of activities
and costs, and more time and expertise to complete, than do the other three levels. As a result,
Level 4 evaluations are usually only undertaken for large volume/high cost/high visibility training
initiatives, and so appear in under 10% of all organizations.
More recently, Phillips has supplemented this basic four-level Kirkpatrick framework with a new
“Level 5”, which he calls “return on investment”. This focuses specifically on comparing the
financial costs and benefits of training, using the formula shown in the sidebar.

How is ROI calculated?
ROI is “earnings divided by investment”, and can be considered a variant of traditional
cost–benefit analysis, where:

Benefit:cost ratio = Program Benefits / Program Costs

Phillips’s ROI process compares net program benefits and costs:

ROI (%) = (Net Program Benefits / Program Costs) x 100

(where “net program benefits” are “program benefits” minus “program costs”)

The ROI calculation itself is but one piece of a multi-step process (see sidebar).9 First
comes planning the ROI study, including specification of its objectives, scope,
methodology, and the data to be collected. This step involves identifying appropriate
measures, locating data sources, developing collection instruments, and specifying
timeframes and responsibilities. Next, the data collection needs to be operationalized.
Baseline and post-training data can be collected using surveys, questionnaires,
observation, interviews, focus groups, assignments, action plans, performance
contracts, or performance monitoring. The
effects of the training itself then need to be
isolated from the potential contributions of other factors, using, for example, experimental control
group strategies, trend-line analysis, forecasting models, or professional estimation. The data
then need to be converted to monetary values, using strategies like translating output data into
profit contributions, historical cost comparisons, or management estimates. The resulting
monetary values are then compared to program costs, including the costs of design and
development, participant program materials, and cost of instructors, facilities, expenses and
administrative overhead. The final step is to identify the intangible, non-monetary benefits, such
9. Phillips J (1997) Return on Investment, Gulf Publishing/Butterworth-Heinemann, MA.
as increased job satisfaction, improved organizational commitment, improved teamwork,
improved customer service, and reduced complaints and conflicts. These are converted to
monetary value where possible and credible, recognizing that some costs and benefits may have
to be left outside the ROI calculation as “intangibles”.

The Phillips multi-step approach to ROI
1. Plan the evaluation.
2. Collect the baseline, treatment, and post-event data.
3. Isolate the effects of the training.
4. Convert the hard and soft data to monetary value.
5. Tabulate program costs.
6. Calculate the ROI.
7. Identify the remaining intangible benefits.
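To make the sidebar formulas concrete, here is a minimal Python sketch (ours, not Phillips’s; the function names are illustrative, and the figures plugged in at the end are the customized-training totals reported later in Table 2.8.3):

def benefit_cost_ratio(program_benefits, program_costs):
    # "Program benefits divided by program costs", as in the sidebar.
    return program_benefits / program_costs

def roi_percent(program_benefits, program_costs):
    # Net program benefits (benefits minus costs) as a percentage of costs.
    return (program_benefits - program_costs) / program_costs * 100.0

print(benefit_cost_ratio(214297, 237137))  # roughly 0.90
print(roi_percent(214297, 237137))         # roughly -9.6 (percent), as reported in Table 2.8.3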
The Phillips “ROI Process” has grown
in popularity and use recently, as the old
justification for training being inherently
“a ‘good’ thing for its own sake” has
given way to the modern performance
perspective, wherein corporations have to justify every additional dollar and staff position for
training in terms of its return on investment. Phillips’s “ROI Institute” has developed protocols, a
curriculum and certifications, and has built a critical mass of practitioners who now use Phillips’s
process, steps, and operational strategies. This movement has thus brought some consistency to
approaches and some comparability to study outcomes.
Applying this approach directly to training done by the public sector, however, runs into some
additional problems noted by Phillips.10 These include the general absence of true revenues and
profits for government activity, the relative paucity of “hard” data, the need to serve multiple
constituencies when arriving at and communicating results, the restricted range of options open to
managers for acting on any ROI findings, and the belief that government services are “essential”
public goods that may not ever be discontinued even if they show poor financial returns.
While these issues may indeed make the ROI task harder, they are more matters of discipline and
appetite for adaptation than of fundamental structure. We suggest there is a far more serious structural issue with attempting
ROI in the public sector: namely, one of “scope”. The ROI processes developed for private
sector training deal with costs and benefits internalized to the enterprise. This situation gives
natural boundaries for deciding which costs and benefits should go into the calculation. Public
sector activity, in contrast, owes its very existence to the fact of externalized costs and benefits.
These by definition preclude private sector actors from capturing all the benefits and remove
incentive for private provision, leaving the public sector to step in and raise taxes to fund the
activity. In the public sector, therefore, the choice of which -- and whose -- costs and benefits to
include in any ROI calculation is a very much more open one.
We suggest this is a serious enough difference to go beyond the present process, and conceive of
a new “Level 6” type of evaluation, preserving some of the relevant operational steps of Level 5,
but dealing with activities whose impacts are both individual and societal, public and private,
internal and external. For the local public workforce system context, these issues are manifested
in the costs being borne by multiple levels of government (and taxpayers) and the benefits
accruing to both the individual client (through hopefully higher individual income received) and
multiple levels of government – and ultimately the taxpayer -- through the client being in a
different tax and welfare situation.
10. Phillips J and Phillips P (eds, 2002) Measuring ROI in the Public Sector, ASTD Press ‘In Action’ series, Alexandria, VA.
We attempt below the more comprehensive approach needed to account for all these costs and
benefits through a pilot study of two different types of training offered by the local public
workforce system in Baltimore: “customized training”, and “ITA-funded training”. The main
differences between these two types of training are summarized in the sidebar. The key difference
is in the presence of an employer for customized training, versus the individual paths taken by
clients using ITAs to “purchase” their training “retail”.
The tale of two trainings…

FEATURE: CUSTOMIZED TRAINING / ITA-FUNDED TRAINING
Route: Cooperative venture with employer / Individual path
Location: Can be workplace or institution / Usually institution
Content: Developed for particular needs of employer / Developed by training institution
Curriculum: Usually modular, tailored / Usually fixed-length course in structured sequence
Duration: Tailored to needs of employer: can be short (days or weeks) / Usually multiples of 10, 12 or 15 weeks in quarters or semesters
Funding: WIA with 50% employer match / WIA or other state and local funds
Employment: Usually guaranteed, if completed / No direct link to employment opportunity
We obtained data on 216 clients who started WIA-funded training activities and exited the
program at various dates between May 2000 and June 2002. The “customized training” option
was followed by 101 clients (47% of the total), and the “Individual Training Account (ITA)
funded training” by 115 clients (53%).
We investigate the differences between the two groups in terms of their initial demographics,
the training intervention (cost and time), and their job and earnings outcomes. We
look particularly at their earnings change from before the training to after exit from the program,
and compare this to the cost of training, as a measure of the “cost-effectiveness” of these two
types. Finally, we undertake a more comprehensive ROI analysis. (Details of data and sources,
and detailed findings, are given in Appendix D).
Cost-effectiveness of the different training types:
Analysis of the demographics of the two training groups showed that the customized group had a
higher share of females, of people with families, and of employed people. It was also slightly
younger, with a lower median wage, and a higher share receiving food stamps, prior to training.
The ITA group had a higher share of dislocated workers, of unemployed clients, and of
ex-offenders, and also a higher median wage, with a lower share receiving food stamps, prior to the
start of training.
The median cost of training per client was half as much for the customized group as for the ITA
group, and the time spent between start of training and exit was shorter for the customized group
than for the ITA group.
The median hourly wage after training for those with wages was roughly the same for both
groups, but for the full 4 quarters following exit the customized group had a higher median wage
than the ITA group. The customized group also had a higher median absolute wage gain in the 4
quarters post-exit compared to the 4 quarters before training. There is no significant correlation
between the dollar cost of the individual’s training and that individual’s wage gain after training.
About 1 in 4 trainees exhibited a wage decline after training compared to their situation before:
this number is higher for the ITA group than for the customized group.
Our chosen measure of cost-effectiveness for these two training groups is “aggregate dollars of
client wage gain per $1 of aggregate training cost” (not including employer contributions and
MOED staff time). The recommendations that follow assume this indicator is something we are
trying to maximize: there may be good policy, strategic, or equity reasons for over-riding this
consideration in individual cases. Nevertheless, on this variable alone, customized training is 2.3
times more effective (at $3.55) than ITA-funded training ($1.49), which suggests customized
training is a better route for scarce training dollars (see sidebar below).
Aggregate wage change (4 quarters pre-training vs. 4 quarters post-exit) per $1 of aggregate training cost, by type of training:
CUSTOMIZED: $3.55
ITA-funded: $1.49
BOTH TYPES: $2.20
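For readers who want to reproduce this indicator, a minimal Python sketch follows (not the study’s actual code; the column names and the handful of example client records are hypothetical stand-ins for the 216-client file):

import pandas as pd

# Hypothetical client records; the real study used 216 clients.
clients = pd.DataFrame({
    "training_type":         ["CUSTOMIZED", "ITA", "CUSTOMIZED"],
    "wages_4q_pre_training": [6500.0, 9100.0, 7100.0],
    "wages_4q_post_exit":    [19100.0, 17200.0, 15800.0],
    "training_cost":         [1900.0, 4000.0, 2300.0],
})

# Sum wages and training costs within each training type, then take the ratio of
# aggregate wage change to aggregate training cost.
sums = clients.groupby("training_type")[
    ["wages_4q_pre_training", "wages_4q_post_exit", "training_cost"]].sum()
gain_per_dollar = (sums["wages_4q_post_exit"] - sums["wages_4q_pre_training"]) / sums["training_cost"]
print(gain_per_dollar)  # the report's actual figures: $3.55 customized, $1.49 ITA, $2.20 both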
This large difference in cost-effectiveness between the two types of training may not all be
attributable to the training type itself, though, because:
(i) There is no statistically significant correlation between the cost of training and the wages in
the 4 quarters after exit, or between the cost of training and the wage change between the
pre- and post- periods, for either training type considered alone or for both.
(ii) There is a significant correlation between wages in the 4 quarters before training and in the
4 quarters after, and between wages before training and the value of the wage gain. These are
statistically significant for customized training, for ITA-funded training, and for both
groups together.
(iii) There are also differences – and in some cases larger differences than the $3.55 and $1.49
found above – between different client groups: for example, between males ($1.62) and
females ($2.60), and between those in the lowest quartile of the wage distribution before
training ($4.90) versus those in the highest quartile (-$0.97). To the extent that these
different client groups are also disproportionately distributed between the two types of
training, this will account for some of the observed difference between training types on this
variable.
This lack of correlation does not necessarily mean that training is “unimportant”, “not
effective”, or being “poorly done”, or that we cannot learn anything from the data on training type
itself. It could just mean, for example, that the price of training in general in society is not itself
related to training’s potential value for wage outcome, but perhaps more to the type of institution
through which it is delivered. Some “cheaper” training institutions may be performing better than
the more expensive ones, in terms of getting their customers into higher paying jobs. Some types
of client may have to go into one type of training versus the other.
What it does mean is that, in the
“pre-condition” --> “training intervention” --> “post-condition”
sequence, in order to understand the value of training we need to give as much thought and
emphasis to the “pre-” step (assessment and selection of participants going in), and to the “post-”
step (wage, longevity/retention of employment, likelihood of wage gain over time once in the
job), as to the training type itself.
Attention also has to be given to differences within the third, “post-” stage: median hourly wage
at placement (a single point in time just after training) does not differ between training types, but
entire first year wages post-exit do, with the customized group doing better than the ITA group.
This suggests examining the nature of placements obtained through customized training, and
focusing on maximizing those placements with longer job retention over time, and with good
wage trajectories over time, in order to increase effectiveness. This broadens our focus from our
analytic starting point of just “comparing two training types”.
Within the training choice, there are still some interesting findings suggesting what can be
achieved through each type, because cost-effectiveness is found to differ between training type
for the same client group. For example:
 dislocated workers in both training groups have a higher median wage than adults before
training; this wage difference persists afterwards for dislocated workers in the ITA group, but
in the customized group the adults catch up through training, with a median wage gain almost
double that of the dislocated workers;
 females in the customized group have a higher median wage than males both before and after
training, and females also have a higher median wage gain than males, but within the ITA
group, this situation is reversed and males do better than females; this situation could result
from females tending to be represented in different kinds of jobs with different wage levels
than males;
 high school grads have the biggest median wage gain of the three educational status groups
(dropouts, HS grads, and college grads);
 all but the highest quartile of wages achieve higher incomes through both types of training,
but the poorest group of clients going in (those in the lowest quartile of the pre-training wage
distribution), achieves a higher median wage after training, and a higher median gain in
wages, through the customized route than through the ITA route;
 the client group with the highest gain in median wage, out of 11 groups (with some overlap of
individuals across groups) is that with wages in the lowest quartile prior to training – i.e. the
poorest group to enter the system: its median wage gain is over 70% higher than that of the group
with the second highest wage gain, “all adults”, and also higher than that of the next group, the
second lowest quartile of the wage distribution prior to training;
 the client group with the lowest median wage gain, out of 11 groups (again with some overlap
of individuals across groups) is that with wages in the highest quartile prior to training: their
median wage change is actually negative (-$1,064), and almost half of the 1 in 4 trainees who
experience a lower income after training than before, are in this group;
 the “highest yield” client group, in terms of highest wage gain per $1 of training cost, out of
11 groups (again, with some overlap of individuals across groups) is the client group with
wages in the lowest quartile prior to training: these clients earn afterwards $4.90 more per $1
spent on their training than they did before; this group’s figure is also the highest of any client
group’s in either type of training, reaching $7.45 more for customized and $3.28 more with
ITAs;
 the “lowest yield” client group, in terms of having the lowest wage gain per $1 of training
cost, out of 11 groups (again, with some overlap of individuals across groups) is the client
group with wages in the highest quartile prior to training: they earn afterwards 97 cents less
per $1 spent on their training than before, and this figure is also negative for this group in
each type of training (-$1.53 for customized, and -$0.75 for ITA).
This all broadly confirms, and supports with real numbers, that training is associated with wage
gains for almost all groups of clients, and that the gain effect is highest for those who previously
were the poorest.
Recommendations based on cost-effectiveness findings:
The strategic choices that might increase this training effectiveness even further, as
suggested by these findings, are:
1. Work with employers to develop more customized training opportunities in which to train
clients.
2. Save ITAs for: (a) dislocated workers, because they are the only client group whose wage
gain for each training dollar is higher through the ITA type than through customized
training; (b) those in the lowest quartile of the wage distribution coming into the training
stage, because this group is twice as cost-effective as other groups in that training
type.
3. Pick less expensive training providers rather than more expensive ones, for the same kind
of training. This choice would not itself hurt client wage outcomes on average, but it
would improve cost-effectiveness because it would save money for the same wage
outcome.
4. If the workforce system has to “ration” training funds and select participants from a
larger group of applicants, and if there is a choice of trainees, then, in general, cost-effectiveness (wage gain per $1 of training cost) could be maximized through:
 selecting those who are poorest (i.e. in the lowest quartile) before training;
 avoiding those who are already the highest (i.e. upper quartile) earners: use
additional non-training services instead, to help place them in their next job; if
they have to be trained, the ITA route would be more cost-effective than the
customized route;
 picking high school grads over college grads, to receive training;
 routing females, school drop-outs, and those from the lowest income quartile, into
customized training, where their wage gain per $1 of training cost is much higher
than for these same groups in the ITA type;
5. As the above recommendations are based on findings about group medians, recognize
that individuals differ and that these recommendations cannot be blindly applied
prescriptively to everyone, solely on the basis of their group’s characteristics. The few
groupings with data here are demographic and do not fully capture all information about
an individual’s potential. Be prepared to assess individuals for their motivation to
succeed beyond their group medians.
6. Given the low incidence of welfare receipts in these trainees’ records for the 4 quarters
before and after training, combined with the intermittent appearance of benefits in
earlier quarters before the prior year, thought should be given to better coordinating the
timing of income supports with training, so that training progress and success are better
leveraged.
Return on investment (ROI):
The cost-effectiveness findings above show the impact on wages for each $1 spent on training.
This is not an ROI measure, as it shows only the individual’s wage gain in relation to the public’s
cost. A full ROI requires incorporating the other elements, such as taxes paid and welfare
receipts, for both before and after training, and then making some judgment about the persistence
of training benefits over time. Thus we began with the wage record data and the family size and
structure information, and then used TurboTax™ to estimate the following components of a
client’s tax return:
Federal:
 AGI
 Filing status
 Standard deductions
 Exemptions
 Taxable income
 Total tax liability prior to credits
 Child tax credit
 Earned income tax credit
 Total federal income tax paid/refunded
State and local:
 Maryland taxable income
 Maryland income tax liability prior to credits
 Maryland earned income credit
 Maryland poverty level credit
 Local income tax liability prior to credit
 Local earned income tax credit
 Local poverty level credit
 Local refund of earned income credit
 State and local income tax paid/refunded.
We then aggregated the value of these components for all 216 trainee clients in both the tax year
prior to start of training activity, and in the tax year after exit from the program, and calculated
the pre/post differences. These are shown in Chart 2.8.2. Clearly visible there are the spikes in
income, and then in taxable income, after training compared to before. What is perhaps surprising
for this population of clients is just how much the values of the income components overshadow
the values of the credits (EITC and CTC at the federal level, and PLC and EITC at the state and
local levels).
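The aggregation behind Chart 2.8.2 can be sketched in a few lines of Python; the data layout and column names below are assumptions for illustration, shown for just two hypothetical clients and three of the components listed above:

import pandas as pd

# One row per client per tax year: "pre" = year before training, "post" = year after exit.
records = pd.DataFrame({
    "client_id":        [1, 1, 2, 2],
    "period":           ["pre", "post", "pre", "post"],
    "federal_agi":      [6500.0, 19100.0, 9100.0, 17200.0],
    "federal_eitc":     [2200.0, 600.0, 1800.0, 900.0],
    "federal_tax_paid": [0.0, 700.0, 150.0, 650.0],
})

# Aggregate each component over all clients for each period, then take post minus pre.
totals = records.drop(columns="client_id").groupby("period").sum()
change = totals.loc["post"] - totals.loc["pre"]
print(change)  # with these made-up numbers, income and taxes paid rise while the credit falls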
The relevant tax components were then selected and plugged in as the “program benefits” portion
of the ROI calculation formula, alongside the “program costs”, in Table 2.8.3. This table shows
how they were used to calculate the ROI for each type of training. The main result is that neither
of the training types yields a positive return on investment. The customized training ROI is minus
9.6%, and the ITA-funded training is minus 57.9%. Combined, both types have an ROI of minus
41.2%. This contrasts sharply with the high earnings impact of training investment found in the
cost-effectiveness analysis above, where each $1 of training investment yielded over double its
value ($2.20) in wage gain. This difference is because the prime weight in the ROI calculation is
given not to the individual’s wage gain as it was in the cost-effectiveness calculation, but instead
to the increases in taxes paid and the reductions in benefits received, as a result of that wage gain.
For just the one year after exit, the combined additional taxes paid and reductions in benefits do
not themselves outweigh the aggregate cost of investment in the training. Even for the customized
group, training costs of over $237,000 still outweighed the combined value of tax increases and
benefit reductions, at under $215,000.
These negative ROIs have resulted when including just the one year before and one year after
training. Yet clients will likely have their job for more than one year, and thus later years’ costs
and benefits ought to be included in the calculation. Chart 2.8.4 shows the cumulative ROI for
one through ten years post-exit. These calculations assume the client retains this job, or another
with equal earnings, for the years in question, and that retention is a function of training. The
question is how many years to look at.
Different researchers have used varying lengths of time as the period over which it is realistic to take
credit for training: MDRC’s evaluation of JTPA training used five years, and Benson’s analysis
of job training in Ohio used eight years, for example. Both these time-spans seem overly
generous, given the increased churning of the economy since they did their work, and the
common assertion that in the knowledge economy the value of what workers know erodes at 20%
a year. An alternative logic would be to say that training earned clients entry to their job, and the
competent use of their training once in the job allowed them to perform their tasks well enough to
pass their first annual performance review. So in effect, training counted for up to the end of two
years. After that length of time, new situations may emerge in the workplace requiring other
skills, or clients’ jobs may have shifted in nature with changing demand and enterprise needs,
making their training prior to employment less relevant. Thus, two years may be a conservative
but realistic time period over which to credit training with any wage, tax, and benefit changes.
With this assumption, Chart 2.8.4 shows what happens to the ROI as it is calculated on a
cumulative basis over subsequent years. The benefits (additional taxes and welfare reductions)
are assumed to continue at their year-one level, while the training costs are counted in year one only
and do not appear in subsequent years. The net effect is thus to accumulate the benefits while the
costs stay fixed: consequently, the cumulative ROI improves over time. With the second year
after exit, the cumulative ROI for the customized group has changed from a negative 9.6% in the
first year to a positive 81%, but the analogous figure for ITA-funded training is still negative, at
minus 16%. However, by year three, both types have a positive cumulative ROI. Thereafter, the
gap between the customized group’s ROI and the ITA group’s ROI continues to open, reflecting
the direction set by the initially higher wage gains of the customized group. The implications are
that: (1) more than one year in the job is required for either type of training to be a public
investment that pays off, and hence job retention over time is key; (2) time-to-payoff after exit
will be faster for the customized group than for the ITA-funded group.
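Under these assumptions the cumulative series in Chart 2.8.4 reduces to a simple formula: benefits recur each year at their year-one level while the training cost is incurred once. A short Python sketch (ours), using the totals from Table 2.8.3:

def cumulative_roi(annual_benefits, training_costs, years):
    # Cumulative ROI (%) after a given number of years post-exit:
    # benefits repeat at the year-one level; costs are counted in year one only.
    return (annual_benefits * years - training_costs) / training_costs * 100.0

for label, benefits, costs in [("CUSTOMIZED", 214297, 237137),
                               ("ITA-funded", 188649, 447624),
                               ("BOTH", 402946, 684761)]:
    print(label, [round(cumulative_roi(benefits, costs, y)) for y in (1, 2, 3)])
# CUSTOMIZED [-10, 81, 171], ITA-funded [-58, -16, 26], BOTH [-41, 18, 77]:
# this reproduces the year-one and year-two figures quoted above.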
So what have we learned from this section, and what recommendations does it support?
(TBD)
2.8.2 Aggregate value of tax components (pre-training and post-exit)
[Chart: for each tax component, the aggregate dollar value across all 216 clients is shown for the tax year before the start of training (Pre-training) and for the tax year after exit (Post-exit).]
2.8.3 Calculating the ROI by training type…

Basic ROI formula:  ROI = (Program benefits - Program costs) / Program costs x 100

CUSTOMIZED TRAINING:     ROI = ($214,297 - $237,137) / $237,137 x 100 = -9.6%
ITA-FUNDED TRAINING:     ROI = ($188,649 - $447,624) / $447,624 x 100 = -57.9%
BOTH TYPES OF TRAINING:  ROI = ($402,946 - $684,761) / $684,761 x 100 = -41.2%

Where “program benefits” includes…             CUSTOMIZED     ITA-FUNDED     BOTH TYPES
reductions in:
Federal child tax credit                          $13,202        $13,462        $26,664
Federal earned income tax credit                 -$22,219       -$34,958       -$57,177
MD earned income credit                          -$11,919       -$26,789       -$38,708
MD poverty credit                                -$14,500       -$14,296       -$28,796
Local poverty credit                              -$7,035        -$6,952       -$13,987
Refund of earned income credit                    -$6,550        -$6,323       -$12,873
Value of child care vouchers                     -$13,507       -$12,064       -$25,571
Food stamps receipts                             -$54,885       -$16,093       -$70,978
TANF/TCA receipts                                  $1,943           $172         $2,115
(subtotal)                                      -$115,470      -$103,841      -$219,311
...and increases in:
Federal income tax ultimately paid                $46,503        $52,324        $98,827
State and local income tax ultimately paid        $29,094        $55,714        $84,808
(subtotal)                                        $75,597       $108,038       $183,635
...which sum to:                                 $214,297       $188,649       $402,946
2.8.4 Cumulative ROI over time, by training type
[Chart: cumulative ROI (%) for one through ten years post-exit, plotted separately for CUST, ITA, and BOTH.]
Baltimore’s Workforce System at Work
Appendices
________________________________________________________________________________________________
Appendix D
Data for the cost-effectiveness and ROI analyses
Data sources:
The following data were provided for the analyses of cost-effectiveness and return on investment:
(1) Client identification and new file ID set-up, prepared by Donna Safely, MOED and David
Stevens, UB-JFI.
(2) Client demographics and training data in starting SPSS file, prepared by Donna Safely,
MOED.
(3) Wage record information for up to and including 4 quarters prior to starting training, and up
to and including 11 quarters after exiting program, prepared by David Stevens, UB-JFI.
(4) Child Care Voucher data from Maryland Child Care Administration, prepared by Jane
Staveley, Maryland Dept. of Human Resources (DHR) and David Stevens, UB-JFI.
(5) TANF/TCA data from DHR, prepared by Jane Staveley, DHR.
(6) Food Stamps data from DHR, prepared by Jane Staveley, DHR.
(7) Annual/modified “Adjusted Gross Income”; federal, state and local income taxes paid;
federal, state, and local earned income credits (EIC); state and local poverty level credit. All
calculated from starting wage record data and family status and size information, following
IRS tax form rules and tax tables, for 1999, 2000, 2001, 2002, and 2003, using TurboTax™,
by David Bosser, JOTF, and Lindsey Woolsey, JHU-IPS.
Initial demographics of the two training groups:
 “dislocated workers” are twice as prevalent within the ITA group (57%) compared to in the
customized group (28%);
 females make up three-quarters of the customized group, but only just over half of the ITA
group;
 55% of the customized group and 59% of the ITA group are single persons with no
dependents; 45 other members of the customized group who are not single have 91
dependents, and the 47 non-single members of the ITA group have 78 dependents; thus, the
216 trainees are associated with another 169 individuals economically dependent on their
success: 385 people in total are “affected” by how this training works out;
 the customized group is, on average, younger than the ITA group, with a median age of 35
compared to 40;
 around three-quarters of each group has at least a HS/GED;
 college grads are slightly more prevalent in the ITA group (13%) than in the customized
group (10%);
 the customized group has double the level of “employed” clients compared to the ITA group
(21% compared to 10%);
 a higher share of the ITA group is “unemployed” (84%) compared to the customized group
(64%);
 the share of clients “economically disadvantaged” is higher for the customized group (59%)
than for ITA-funded training group (41%);
 the share of ex-offenders in the study population overall is low (under 7%), but they are
slightly more represented in the ITA group (10%) than in the customized group (3%);
 more customized trainees report an annual family income of $0 than do trainees in the ITA group (40
compared to 28);
 the median individual wage (from wage record data) for the four quarters before the start of
training activity was lower for the customized group ($6,526) than for the ITA group
($9,092);
 8% of clients overall had no earnings in any quarter of the four preceding training; each
quarter before training had at least 22% of its clients with a “$0” income;
 about one-third of the customized group received Food Stamps in the four quarters prior to
the start of training, compared to only 14% of the ITA group.
Training intervention: cost and time
 the average cost of training is $2,348 for the customized group, compared to $3,892 for the
ITA group: the ITA option is two-thirds more expensive than the customized; this difference
is even more pronounced if we instead look at the medians ($1,894 for customized vs. $3,992
for ITAs – a 111% difference);
 the aggregate amount of money spent on training (excluding employer contributions to
customized training) for the customized group is $237,000, compared to $448,000 for the
ITAs;
 the median number of days between “start of training activities” and “exit” is 187; this is
shorter for the customized group (with a median of 164 days) than for the ITA group (223
days);
Wages and work-week outcomes:
 the median hourly wage of those reporting any earnings after training was $9.98 for the
customized group and $10.05 for the ITA group;
 the average hours in a work week was lower for the customized group (34.7 hours) compared
to the ITA group (37.9 hours); if we exclude those with “0” hours, this gap is erased (38.9 for
customized and 38.2 for ITAs);
 median earnings in the period for four quarters after exit are $19,120 for the customized
group, compared to $17,191 for ITAs;
Wages before and after training:
 overall, 73% of all clients show a four-quarter wage improvement in the four quarters after
exit compared to the four quarters before start of training activities (no adjustment is made for
average wage gain or inflation, which also could have contributed to the raise); 26% show a
wage decline; the share of clients with wage decline after training is lower for the
customized group (22% with a decline) than for the ITA group (30%);
 the median absolute 4-quarter wage change for all 216 clients is $7,154; the median change is
higher for the customized ($9,669) than for the ITA group ($5,572);
 the aggregate wage gain for all trainees in the customized group (101 cases) was $840,697,
compared to $668,085 for the larger (115 cases) ITA group;
Benefits before and after training:
 TANF/TCA: (TBD)
 Food Stamps: 76 of the 216 clients (35%) had been in receipt of food stamps at some time in
the 37 month period from April 1998 to April 2001. These 76 client recipients averaged 16.3
months of Food Stamp receipts each, with an average monthly receipt of $268.04. The
smallest monthly check was $4, and the largest was $647. During the 37 months, the smallest
total amount received by any individual client in receipt was $25, and the largest was
$18,755. When adding receipts for two periods -- 4 quarters before start of training activity
and the 4 quarters after exit – then in the customized group, 34 clients received Food Stamps,
with a median level of receipts of $1,163, and in the ITA group 16 clients received Food
Stamps, with a median level of receipts of $849. The change in the aggregate value of Food
Stamps by the customized group between pre-training and post-exit was a drop of $54,885,
and for the ITA group the same drop was $16,093. However, because the Food Stamp data
series stops in April 2001, we are missing more post-data than pre-data, so this comparison is
skewed.
 Child Care Vouchers: only 10 of 216 clients had any child care vouchers in either the 4
quarters before starting training or the 4 quarters after exit; the total sum involved before start
of training was $31,948, with individual client child care voucher values varying from $226
to $8,160; 9 of the 10 cases saw a decline in their child care voucher value post-exit, with
the aggregate value falling to $6,377 – an 80% decline.
Taxes before and after training:
(TBD)
Training investment calculations:
 dollars of wage change per dollar of training cost, in aggregate, is $2.20 overall; this is much
higher for the customized group ($3.55) than for the ITA group ($1.49);
What else we have learned:
 Limitations of the self-reported annual family income variable. “Annual family income” or
“AFI” (as reported by clients with varying family sizes) may be a very poor measure of actual
income, compared to the quality of data in the wage record database (“WRD”) (which is
reported by the employer, verified by the state, and organized around the individual’s social
security number). The fact that the data we have from these two series are not for exactly
coincident time periods will inevitably weaken their numeric relationship, but even so, AFI
should generally be equal to, or greater than, the WRD number because potentially more
earners are present in the AFI numbers than in the WRD numbers. However, for the 79 cases
with non-zero values on both variables, 28 (or 35%) showed a family income less than the
individual’s income. One likely conclusion is that applicants for needs-based services and
benefits have an incentive to under-report, and that many do so.
 With increasing distance downstream from exit, the earnings data in each quarter become
“thinner”. Beyond 4 quarters post-exit (i.e. after just the first year), there is considerable
fall-off in earnings information and an increasing number of cases with “missing” earnings.
By quarter #6 post-exit, over half the clients have “missing” earnings data, and by quarter #8
(i.e. two years after exit), 94/101 of the customized group and 91/115 of the ITA group (or
85% of all 216 cases), are listed as “missing” for earnings. Partly this may reflect the
increasing impact of lags in reporting as we move further downstream from exit and closer to
the present day. The implication is we have relatively more information on older cases than
recent ones, and a systematic bias in results. Since the customized training model has been
relatively constant through this period while the “true-WIA ITA” model has evolved more
recently, this timing difference may also bias the results by training group.
 Lack of spousal wage information is a large problem for estimating several components of
the overall budget. Some 44% of all trainees have dependents, and 18% (some of whom may
overlap with the 44%) are likely to be part of a 2-parent family or share the additional
income(s) of other earner(s). Both these variables in turn have large impacts on:
 the type of tax form used (single, joint, separate, 1040/1040A/1040EZ), which we are
trying to replicate;
 the size of “Adjusted Gross Income” for tax purposes, because there may be more than
one income and because of the number of exemptions that can be claimed;
 EITC, where eligibility and dollar value is a function of total family income level and
the number of dependents;
 taxes paid, since the EITC is subtracted from the tax otherwise due rather than from taxable income.
 Different databases have different lags in compilation, introducing systematic bias between
different components of the budget. Clients in this study exited as late as June 2002. Wage
record data is available for all clients for the 4-quarter period after exit, but Food Stamp data
is only available up to April 2001. Of the 216 trainee cases, 53 received Food Stamps in the
4-quarter period before the end of the Food Stamp data set, but only 36 had a “start of
training activities” date during that period (thereby allowing summing of pre-training Food
Stamp dollars). Only 8 had an “exit” date during that period (allowing summing of post-exit
Food Stamp dollars). The reason why this is important for our overall calculations is that
Food Stamps receipt is not tied to workforce program participation. Food Stamps can carry
on after exit, but projecting beyond the April 2001 cutoff to make up for not having these
data would not be reliable, because the 77-case, 36-month Food Stamp database shows the
general pattern of Food Stamp receipt to be “intermittent”: only 26 cases have uninterrupted
periods of receipt. Thus, we have more complete information on wages than on benefits,
more complete information on “pre” than “post” situations, and more complete information
on earlier starters and exiters than we have on later clients. As a result, any pre- versus post- comparisons would inevitably overstate the “savings” to the budget from “reduced” food
stamp payments. We recommend holding this variable to one side until more complete data
is available.
 The importance of the downstream situation. At placement, i.e. just after exit, the median
hourly wage for the customized group is $9.98, while for the ITA group it is $10.05. This is a
difference of just 7 cents, or only 0.7%. Over the whole four quarters after exit, however, the
customized median wages were $19,120, versus $17,191 for the ITA group: this is a
difference of $1,929, or 10.1%. Thus, the relative value of different types of training may be
seen not so much in the ability to get a job at a given wage at placement, but in the ability
to (a) access a job with potential for later earnings growth, and (b) perform in
the job and thereby increase earnings later. This seems to support the case for looking not
only at "placement-based success" measures, but also further downstream at earnings
trajectories, if all the good the workforce system does is to be captured and demonstrated.