A Process Control Model of Legislative
Productivity in the House of Representatives:
Testing the Effects of Congressional Reform
Jeff Gill
Department of Political Science
Department of Statistics
Cal Poly University
E-mail: [email protected]
James A. Thurber
Department of Government
Center for Congressional and Presidential Studies
American University
E-mail: [email protected]
ABSTRACT
We examine the effects of congressional reform on legislative productivity using a completely new methodology in political science based on queueing theory and industrial simulation and control software. The foundation of our analysis is the development of a process control model of legislative development. The model establishes a status quo productivity equilibrium based on empirical data from the first 100 days of the 103rd House of Representatives, then stresses the system using the mandated productivity of the first 100 days of the 104th House of Representatives. We compare the agenda-based distribution of bill assignments from the "Contract with America" with a uniform assignment and find that requiring a stable legislative system to greatly increase productivity has substantial effects on members' allocation of time. In particular, members are likely to reduce time considering legislation and increasingly rely upon partisan cues for vote decisions.

The methodology is sufficiently general that it can be applied to almost any legislative setting. Our application focuses on the feedback response from an electoral shift, but the methodology can address any productivity question. Since all legislative bodies have defined processes by which initiatives flow, the modeling and simulating of these processes can illuminate efficiencies and inefficiencies. Queueing theory addresses the prevalent and generalizable scenario in which demand for legislative outcomes exceeds the short-term capacity of a legislative system.

Prepared for the 1997 Annual Meeting of the American Political Science Association, August 27-31, Washington, DC. Thanks to Scott Desposato, Heinz Eulau, and David George for helpful comments.
1 Introduction
This research extends Easton's (1965, 1966, 1990) systems analysis of
political institutions by analyzing empirical data on Congress. We model
the legislative process as a discrete, modular series of events and transactions simultaneously occurring over consecutive time periods. The system
is defined as a dynamic set of limited-capacity resources which are required at different stages for particular legislation. Certain capacities such as a member's time, staff size, and committee structure are fixed parameters, whereas quantities like time available for bill analysis, number of trips back to the district, constituent service, and bill sponsorship are variables.
Summary statistics of the first 100 calendar days of the 103rd House
of Representatives are used to establish resource allocation and baseline
capacities, whereas the observed productivity of the 104th is used to test
hypotheses about congressional productivity. Productivity in our context
simply means the quantity of legislation that passes through the committee
structure and becomes a floor vote. The outcome of the floor vote is not
considered, nor are the consequences of public laws.
The model is developed using statistical queueing theory and industrial control simulation, which are typically applied to the estimation and control of complex factory workloads and throughputs. Industrial engineers use this approach to identify and assess workflows, bottlenecks, and throughput volumes, all of which have direct legislative equivalents. This new methodological approach allows the testing of hypotheses such as: what are the effects on various parts of the institution when a high volume of legislation is processed in a short period of time (there were 302 floor votes in the first 100 days of the 104th House compared to 135 floor votes in the first 100 days of the 103rd House)? Furthermore, what procedural factors or reforms contribute to a bill's haste or delay? Reforms in the 104th House of Representatives promise to have significant and long-term impacts on the legislative process, as evidenced by similar efforts in the past (Oleszek 1989, Smith and Deering 1990, Thurber 1991b). This research explores the impact of these reforms on institutional efficiency.
We begin by creating a legislative development structure based on summary productivity and scheduling statistics of the first 100 calendar days of the 103rd House of Representatives. These data, collected from the Congressional Record-Daily Digest, establish the baseline for a productivity equilibrium model. Essentially this model represents a status quo level of workload based on a well-established pattern of productivity under Democratic control. Once the baseline parameters are determined, we stress the equilibrium model by requiring it to process the workload from the first 100 days of the 104th House of Representatives, also collected from the Congressional Record Daily Diary. In order to replicate the legislative output (in terms of committee output and floor votes), we expect certain model parameters to change, such as members' personal time, constituent service, sponsorship, caucus activity, or fund raising.
2 Political Structure as a System
David Easton (1965, 1966) developed the idea of a system of politics not from a sociological or economic definition, but rather from a biological perspective. He used the term systems analysis to describe a political system which has distinct environmental boundaries delineating external forces. External forces exert pressure on the elements of the political system, but they do so differently from internal sources. As political systems age and endure they receive feedback from the external environment which affects the intra-system dynamics. This conceptualization of political systems has two dimensions. First, it is an empirical description of political behavior within some distinct environmental unit. Second, it symbolizes the working of the political system and its access points to the external environment. When Fenno (1973) identifies congressional committees as being either corporate or permeable he specifies the quantity and strength of these access points by classifying the porousness of the boundary between a political system and the external environment. One primary benefit derived from delineating political systems from their surrounding environment is that it defines endogeneity within the model. In doing so one can identify forces that remain internal to the political system and are therefore more malleable by decision makers, versus forces from the external environment that are more difficult to control or predict.
Any collection of actions and entities can conceivably be called a set,
but to be a logically constructed set there must be repeated, meaningful
interactions between within-set actors and objects. We therefore impose
a sense of order on a collection of behaviors when defining it as a set.
Congress is a well-bounded political system in American politics. Although
Congress is a permeable political system using Fenno's (1973) description
(non-autonomous, easily observable), it contains sub-units which are relatively corporate. These tend to be the highly specialized, highly technical
policy subsystems (Thurber 1991a) that exist at the least visible level of the
political process. Easton sees feedback loops as central to the operation of
political systems. This cycle of information provides congressional decision
makers with the ability to alter their behavior dependent on perceptions
expressed by the public, president, and opinion leaders.
The first 100 days of the 104th House represent a classic example of a feedback response. The Republican leadership perceived a clear mandate from the electorate for their election agenda, the Contract with America (Cassata 1995). This stimulus served as a primary agenda setter for a large proportion of the legislative input to the system. Conversely, we see the 103rd House as unremarkable in this regard, and therefore an ideal candidate for the "typical" House. This contrast is exploited in the model by requiring the modified structure of the first 100 days of the 103rd House to accommodate the feedback response of the 104th House.
Table 1: Productivity: 1st and 2nd 100 Days, 104th House

                                First 100 Days    Second 100 Days
Days in Session                       58                 41
Hours in Session                     528                343*
Total Votes                          293                248
% Votes on Friday or Monday           25                 13
Bills Passed                          54                 33
Hearings                             688                429
Markups                              134                165

*Through Thursday, July 20.
First 100 days: January 4 - April 13. Second 100 days: April 14 - July 22.
Source: CQ's Congressional Monitor, Monday July 24, 1995.
One basic element of the congressional system is proposed legislation. Bills are assigned probabilistically to committees based on empirically observed assignment ratios. Committees process bills dependent on their respective committee structure. Committee productivity time is a function of the number of subcommittees, committee staff, complexity of legislation, and other parameters. Few bills are assured passage from the committee (Barry 1995), and the model assigns a probability of passage to each bill derived from observed passage ratios. If the bill passes committee then it queues up for the House floor through the Rules Committee. Finally, the model provides time for consideration on the floor under very general conditions. At each phase of the productivity model, work-flow statistics are generated.
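This flow (probabilistic assignment to committees from observed ratios, a probability of passage in committee, then a queue for the floor) can be sketched as a small Monte Carlo pipeline. The following sketch is purely illustrative: the assignment ratios, passage probabilities, and function names are invented for the example and are not the empirically observed values or the GPSS code used in the model.

import random

# Illustrative (invented) assignment ratios and committee passage probabilities;
# the actual model uses ratios observed in the Congressional Record Daily Diary.
ASSIGNMENT_RATIOS = {"Ways & Means": 0.28, "Judiciary": 0.13, "Commerce": 0.14, "Other": 0.45}
PASSAGE_PROB = {"Ways & Means": 0.08, "Judiciary": 0.07, "Commerce": 0.06, "Other": 0.05}

def simulate_bills(n_bills, seed=1):
    """Assign bills to committees probabilistically and collect those reported out."""
    rng = random.Random(seed)
    committees = list(ASSIGNMENT_RATIOS)
    weights = list(ASSIGNMENT_RATIOS.values())
    floor_queue = []
    for bill in range(n_bills):
        committee = rng.choices(committees, weights=weights, k=1)[0]
        if rng.random() < PASSAGE_PROB[committee]:
            # Bills reported out of committee queue up for the floor (via Rules).
            floor_queue.append((bill, committee))
    return floor_queue

print(len(simulate_bills(1714)), "bills reported to the floor queue")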
Table 2: House Floor Activity, First 100 Days

Session            102nd    103rd    104th
Hours                186      208      536
Days                  44       44       58
Mean Hours/Day      4.23     4.73     9.24

Source: Congressional Index, Congressional Register.
We require the productivity of the first 100 days of the 104th House on a productivity structure determined by the 103rd House, including the Republican reforms, and look at the ramifications. Table 1 shows the dramatic difference in productivity in the first 100 days of the 104th compared with the subsequent 100 days. This provides an indication of the personal and legislative adjustments required by introducing a radical feedback response to external electoral stimuli. Representative Frank R. Wolf (R-VA) summarized the impact of the first 100 days of the 104th Congress on the personal lives of members as follows (CQ Congressional Monitor, July 25, 1995, p.6):

"The schedule is atrocious. We cannot continue to go night after night like this. We are frankly going at break-neck speed and that is eventually going to destroy families."

Table 2 illustrates Representative Wolf's point by comparing the floor activity for the first 100 days of the three most recent Congresses. (Our count of Hours in Session, compiled from the Congressional Record Daily Diary and the Congressional Index, differed from CQ's by eight hours, a 1.5% difference.)
3 Classic Queueing Theory
Queueing theory developed around the beginning of the twentieth century, primarily as a tool for analyzing the increasingly complex industrial production environments developing at that time. The intellectual development of the field begins with A.K. Erlang (Erlang 1917, 1935; Brockmeyer, Halstrom, & Jensen 1960) through his influential work in telephony.
3.1 A Simple Example of Complexity
Complexity in queueing models rises quickly as the number and capabilities of resources increase. For instance, if one were to analyze the waiting time of a piece of legislation, the model would be quite simple provided that there was only one deliberative body (committee or floor) serving a single line of bills and each transaction took the exact same amount of time. In this case, waiting time for each bill is simply the product of the transaction time and the number of bills in the queue ahead of this bill. Now let the mean be fixed but allow variance in the length of each transaction. One can now get the mean waiting time for a bill by dividing the number of bills served by the length of the service period, but this is the expected value of a wait, not a determination of any single bill's exact waiting time. Furthermore, suppose that there are multiple service sub-units of the deliberative body or legislature that can serve any item in the queue of bills and each of these bodies has differing efficiencies and therefore differing mean service times. This adds considerable complexity to the analysis as there is no certainty about the order in which bills are paired with deliberative bodies.
Now suppose that there is a preference queue available only for particularly important bills (perhaps authorizations or appropriations): "non-uniform" service queues by function. Analysis of this model enhancement requires an estimate of the proportion of priority bills. Now does the priority queue server process other bills if there are no priority bills waiting? These last two issues address queue discipline: the order in which bills are served. Is the priority queue server selected because of experience and efficiency? If so, this would make the service in this queue more efficient as well as giving it a smaller expected queue (fewer bills to process). Can bills go to other alternate deliberative bodies if the expected wait is too long? Do the servers try to identify bills in line with particular political significance? In some scenarios there is a grocery store type of arrangement where bills can queue up for only one server but select from several alternative queues. As each new feature is added to the model, simple statistics such as expected bill waiting time become less informative about the behavior of the system as a whole. All of these questions raise substantial modeling challenges even though they are actually quite simple legislative situations. It is this problem that led Erlang to attempt to construct a rigorous theory of queueing (which can also be applied to the queueing of bills in Congress).
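A brief numerical illustration of how quickly configuration choices matter: the sketch below compares, under assumed arrival and service rates of our own choosing, the mean wait when all bills join a single queue feeding two deliberative bodies against the "grocery store" arrangement in which each bill commits to one of two single-server queues. It is a generic queueing illustration, not part of the authors' model.

import heapq
import random

def pooled_queue_waits(arrivals, services, c):
    """FIFO waits when every bill joins one queue feeding c identical servers."""
    free = [0.0] * c                      # times at which each server next becomes idle
    heapq.heapify(free)
    waits = []
    for arrive, service in zip(arrivals, services):
        soonest = heapq.heappop(free)     # earliest-available server
        start = max(arrive, soonest)
        waits.append(start - arrive)
        heapq.heappush(free, start + service)
    return waits

def separate_queue_waits(arrivals, services, c, rng):
    """FIFO waits when each bill commits to one of c single-server queues at random."""
    last_finish = [0.0] * c
    waits = []
    for arrive, service in zip(arrivals, services):
        q = rng.randrange(c)
        start = max(arrive, last_finish[q])
        waits.append(start - arrive)
        last_finish[q] = start + service
    return waits

rng = random.Random(3)
lam, mu, c, n = 1.8, 1.0, 2, 20000        # assumed rates (lambda < c * mu keeps the system stable)
clock, arrivals = 0.0, []
for _ in range(n):
    clock += rng.expovariate(lam)         # Poisson arrival stream
    arrivals.append(clock)
services = [rng.expovariate(mu) for _ in range(n)]

pooled = pooled_queue_waits(arrivals, services, c)
split = separate_queue_waits(arrivals, services, c, rng)
print("mean wait, one shared queue: %.2f" % (sum(pooled) / n))
print("mean wait, separate queues:  %.2f" % (sum(split) / n))

At these assumed rates the shared queue roughly halves the mean wait, the kind of configuration effect that quickly outgrows closed-form analysis as more features are added.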
3.2 Basic Principles of Queueing Theory
Queueing theory begins with the analysis of input streams, which is the flow of proposed legislation in this study. This input stream of events over time is homogeneous when there exists a countable (not necessarily finite), discrete set of outcomes, t_n, defined on some probability space such that in any specified interval, a finite number of these outcomes occur. Denote t_n^i as the arrival time of the (i+1)st event in the nth stream that begins with t_n^0. Then m+1, the number of events in the nth time period determined by t_n^m (the (m+1)st and last event in this interval), is a random variable (Gnedenko and Kovalenko 1989).
The most basic approach is to model arrival times by the number of events in a given arbitrary-length time period, such as the first 100 days of a given Congress. This approach is a special case of continuous-time Markov chains called a Poisson process, and it considers the number of arrivals per time period of length t as a random variable. Fry (1965) demonstrates that, under a fairly general set of conditions, these random variables are distributed Poisson(λt) having the pmf:

    f(X \mid \lambda, t) = \frac{(\lambda t)^X e^{-\lambda t}}{X!}

Thus f(X | t, λ) is the probability that exactly X arrivals occur in the t-length time period, where λ is the intensity parameter of the stream. The conditions that lead to this distribution of arrivals are often roughly paraphrased outside of the statistical literature and therefore deserve closer attention. Consider a time t and a time t + Δt where Δt is some small arbitrary interval. The required assumptions can be expressed as (Gross & Harris 1985, p.21; Bunday 1986, p.3; Asmussen 1987, p.64):
Infinitesimal Interval. The probability of an arrival in the interval (t : t + Δt) equals λΔt + o(Δt), where λ is the intensity parameter discussed above and o(Δt) is a quantity with the property: \lim_{\Delta t \to 0} o(\Delta t)/\Delta t = 0. In other words, as the interval Δt reduces in size towards zero, o(Δt) is negligible compared to Δt. This assumption is required to establish that λ adequately describes the intensity or expectation of arrivals. Typically there is no problem meeting this assumption provided that the time interval is adequately granular with respect to arrival rates.

Non-simultaneity of Events. The probability of more than one arrival in the interval (t : t + Δt) equals o(Δt). Since o(Δt) is negligible with respect to Δt for sufficiently small Δt, the probability of simultaneous arrivals approaches zero in the limit.
I.I.D. Arrivals. The numbers of arrivals in any two consecutive or non-consecutive intervals are independent and identically distributed.

Since the number of events within intervals is distributed Poisson(λt) under these assumptions, the expected number of events is simply λt, with variance λt.
Under typical conditions one would like to know the distribution of events in a homogeneous stream for a time period of size t: (T_i : T_{i+1}). The three assumptions above were required for establishing that the number of events for a given interval is distributed Poisson(λt). Now the concern is with the distribution of these arrivals within the interval. Three simple conditions are required to put a parametric form on this stream:
Stationarity of the Stream. The probability of X events occurring in (T_j : T_{j+1}) is independent of j.

No Memory in the Stream. There exists pairwise, mutual independence for any two intervals. More specifically, P(X = x in (T_j : T_{j+1})) does not depend on P(X = x in (T_k : T_{k+1})) for any j ≠ k. This is satisfied under the I.I.D. Arrivals requirement above.

Non-simultaneity of Events. Two events cannot occur (arrive) at the exact same point in time. This was also required above for the parametric distribution of the number of arrivals in a given interval.
A homogeneous stream that meets these three conditions is said to be a
simple stream and can be shown to be uniformly distributed within specified intervals (Gross and Harris 1985). If it has already been established
that the arrival pattern meets the requirements for a Poisson distribution
for the number of arrivals, then to establish the uniform distribution of arrivals within time periods only the stationarity of the stream needs to be
established. So under a fairly general set of conditions one can apply a parametric form to both the number of events in a given time period and the
distribution of their arrival within that time period.
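As a quick empirical check on these two results, one can simulate a simple stream and confirm that the count of events in a fixed window behaves like a Poisson(λt) variable (mean equal to variance) while the event times themselves are spread uniformly within the window. The sketch below uses an arbitrary intensity of our choosing rather than the bill-introduction data.

import random

def simulate_simple_stream(lam, t, rng):
    """One realization of a simple (homogeneous Poisson) stream on (0, t)."""
    times, clock = [], 0.0
    while True:
        clock += rng.expovariate(lam)        # i.i.d. exponential gaps <=> Poisson stream
        if clock > t:
            return times
        times.append(clock)

rng = random.Random(11)
lam, t, reps = 0.5, 100.0, 2000              # arbitrary intensity over a 100-day window
counts, all_times = [], []
for _ in range(reps):
    times = simulate_simple_stream(lam, t, rng)
    counts.append(len(times))
    all_times.extend(times)

mean_count = sum(counts) / reps
var_count = sum((x - mean_count) ** 2 for x in counts) / reps
print("count mean %.1f vs lambda*t = %.1f" % (mean_count, lam * t))
print("count variance %.1f (Poisson implies variance equals the mean)" % var_count)
print("mean event time %.1f vs t/2 = %.1f (uniform within the window)"
      % (sum(all_times) / len(all_times), t / 2))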
We can also put these three conditions into the context of Congressional productivity. Stationarity of the Stream implies that the probability of bill introduction on the 57th day of the session is independent of its being the 57th day. Or simply put, there is not a "special" day in the period of study where bill flow is altered. (Note that this simple scenario provides no under-dispersion or over-dispersion; for such cases, the input stream could be modeled with the binomial (under) or negative binomial (over) pdf's. See King 1989.) The memoryless feature requires that the flow of bill introduction on some given day be independent of the flow of bill introduction on another day. Non-Simultaneity of Events means that two bills cannot be introduced into the system at exactly the same time. In practice, we can tolerate mild deviations from these requirements and still produce valid inferences.
Queueing theory not only models event occurrences, but also the service received upon arrival. A rough but useful method of modeling simple service times is to approximate the length of service using the exponential distribution, f(x \mid \mu) = \mu e^{-\mu x} (Kinchin 1955, 1956). There are two motivations for this approach. First, the exponential distribution is memory-less (Markovian): it contains no serial dependency (for a proof of this property of the exponential distribution, see Gross & Harris 1985, Section 1.9). Specifically, the probability of observing s - t occurrences given t previous occurrences is equal to the probability of observing s - t occurrences before any observations are taken. Thus the time required to service the nth arrival is independent of the time required to service the (n+1)st arrival. Recalling our bill introduction example, the length of time required for a deliberative body to consider a bill (service time) does not affect the length of time required to consider the next bill in the queue. Second, given the mathematical behavior of the exponential distribution, a large proportion of the random variables will have a relatively short service time, whereas a small proportion will have a substantially longer service time. This tends to reflect empirical observation. As any president who has a special interest in particular legislation can attest, most bills are dealt with fairly routinely but periodically there is a case that requires substantial work on the part of Congress (Davidson 1991, 1996). So the exponential distribution for service times is selected because it approximately resembles empirical observation and it meets the three conditions described above for a simple stream.
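The memoryless property that motivates the exponential choice can also be checked numerically: the remaining service length, conditional on a service having already run for some time, has the same distribution as a fresh service length. A minimal sketch, with an arbitrary service rate assumed for illustration:

import random

rng = random.Random(5)
mu, n, elapsed = 0.25, 200000, 3.0           # assumed service rate; condition on 3 units already served

draws = [rng.expovariate(mu) for _ in range(n)]
remaining = [x - elapsed for x in draws if x > elapsed]   # residual length given survival past `elapsed`

print("unconditional mean service length: %.2f (theory: 1/mu = %.2f)" % (sum(draws) / n, 1 / mu))
print("mean remaining length after %.0f units already served: %.2f"
      % (elapsed, sum(remaining) / len(remaining)))
# The two means agree: time already spent in service carries no information about the remainder.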
Suppose the ith bill's waiting time is denoted ω_i. The analyst might be interested in knowing things like P(ω_i > t), the probability that the ith bill waits longer than the time period (possibly not getting served), or P(ω_i > t | k), the probability that the ith bill waits longer than the current time period given k bills ahead of it in the queue. This is obviously dependent in some way on the efficiency of the serving system. This efficiency is denoted μ_t, indicating the number of bills that can be serviced in period t, with E(μ_t) = μ. Feller (1940) showed that the probability that k bills are ahead of an arbitrary entrant is:

    P_k = \frac{\lambda_{k-1}}{\mu_k} P_{k-1}                                    (1)
This can be thought of as the ratio of the arrival rate of k-1 bills times the probability that k-1 bills arrive, to the system's ability to serve k bills, where the kth arrival is our bill of interest. So to establish the state of the system at some arbitrary loading based on the state of the model at the beginning period (bills can queue up before the system starts functioning: the beginning state is not degenerate), look at the product of all of the arrival probabilities prior to the state of interest:

    P_k = \prod_{j=1}^{k} \frac{\lambda_{j-1}}{\mu_j} P_0 = \frac{(\lambda/\mu)^k}{k!} P_0                    (2)

provided that λ < σμ. Since k equals some positive integer with probability one, it is possible to determine P_0 (no waiting time) for given λ, μ, and σ (Gnedenko and Kovalenko 1989):

    P_0 = \left[ \sum_{k=0}^{\sigma} \frac{(\lambda/\mu)^k}{k!} + \frac{(\lambda/\mu)^{\sigma+1}}{\sigma!\,(\sigma - \lambda/\mu)} \right]^{-1}                    (3)

For σ = 1 and σ = 2, note the following forms for P_0:

    \sigma = 1 \Rightarrow P_0 = 1 - \frac{\lambda}{\mu}, \qquad
    \sigma = 2 \Rightarrow P_0 = \frac{2\mu - \lambda}{2\mu + \lambda}

This makes intuitive sense since if μ (the service rate) is much greater than λ (the intensity parameter of arrivals), then the probability of no waiting time approaches one for these two cases.
Erlang (1917) applied the exponential distribution to service time (with rate μ) in the absence of a pre-existing queue to derive E(ω_i) = 1/μ and Var(ω_i) = 1/μ^2, and showed that whenever λ ≥ σμ (σ being the number of uniform servers), the queue tends to infinity as successive time periods follow. For example, suppose our hypothetical legislative body had only one deliberative committee (σ = 1) and that the expected arrival of bills exactly coincides with this committee's mean processing time of 10 bills per month, λ = μ = 1/3, so λ = σμ. Then there will be no queue, provided that the bills arrive over the time period at exactly 3-day intervals. Since this is unlikely to occur for any reasonable choice of t, the queue in this scenario will accumulate without abatement. For this reason, systems designers typically make the restriction 0 < λ < σμ.
The probability that the waiting time exceeds t for an arbitrary ith entrant is expressed as the probability that there will be the same number of or more individuals in the system as there are servers. This is dependent on the sum of the conditional probabilities that meet this criterion: the probability of k in the queue (P_k) times the probability that the ith individual waits longer than the time period given k in the queue (P(ω_i > t | k)):

    P(\omega_i > t) = \sum_{k=\sigma}^{\infty} P_k \, P(\omega_i > t \mid k)                    (4)

provided again that λ < σμ such that the waiting times converge. Erlang's (1917) construction leads to the following results for the expectation and variance of the waiting time (both of which are dependent on P_σ, the probability that all servers are busy at some arbitrarily chosen time):

    E(\omega_i) = \frac{P_\sigma}{\sigma\mu - \lambda}, \qquad
    Var(\omega_i) = \frac{P_\sigma (2 - P_\sigma)}{(\sigma\mu - \lambda)^2},
    \quad \text{where } P_\sigma = \frac{(\lambda/\mu)^\sigma P_0}{(\sigma - 1)!\,(\sigma - \lambda/\mu)}

Continuing with our example for σ = 1 and σ = 2:

    \sigma = 1 \Rightarrow P_\sigma = \frac{\lambda}{\mu}, \qquad
    \sigma = 2 \Rightarrow P_\sigma = \frac{(\lambda/\mu)^2}{2 + \lambda/\mu}                    (5)
It is also important, particularly from a service provider's point of view, to consider the total waiting time in the aggregate for a time period. Since λt events are expected to occur in time period (T_j : T_{j+1}), the total mean expected waiting time and its variance can be expressed as:

    E\left[\sum_i \omega_i\right] = (\lambda t)\,\frac{P_\sigma}{\sigma\mu - \lambda}                    (6)

    Var\left[\sum_i \omega_i\right] = (\lambda t)\,\frac{P_\sigma (2 - P_\sigma)}{(\sigma\mu - \lambda)^2}                    (7)

Because λ and t exist only in the numerator of the total mean waiting time, this total increases rapidly as time increases for systems that begin with a long queue. Such situations often occur when cases are allowed to queue up before service commences. This well-known scenario is seen in the provision of governmental services with limited delivery times (motor vehicles office, welfare distribution, etc.).
3.3 Motivation for a Software Simulation of Queueing
Section 3.2 introduced the basic probabilistic derivation behind simple queueing scenarios. It is clear that the calculation of even the simplest statistic (mean waiting time) can become elaborate and technical as the configuration of the system increases in complexity. At a very early point in the process of making the model resemble some reasonably interesting political system such as Congress, the parameterization of the model is no longer available as a closed-form set of equations (Morgan 1984). Since the objective is to model a very complex political system, Congress, and it is desirable to have more informative statistics than those discussed in this section (intermediate productivity measures, thresholds on re-parameterization), another approach is required. An obvious complication arises since the committees (σ = 22) are not uniform servers: legislation cannot be sent to the next available committee; it is directed by its content and the jurisdiction of the committees. At this point we turn to queueing systems modeling with software explicitly designed to imitate and report on complex queueing configurations that escape closed-form probabilistic analysis. Simulation results and model inferences will be placed in the context of this discussion of basic queueing theory.
(The congressional productivity model of legislation developed here is actually a special case of the very general class outlined above and can be denoted M/E_k/σ=22 (Asmussen 1987, Bunday 1986): a Markovian (Poisson) distribution for arrival times, an Exponential(k) (simple Erlangian) distribution for service times, and 22 servers, i.e., committees.)
4 Discrete-event Simulation Modeling
Simulation of queueing models originates in industrial engineering, where it is often too expensive, complex, or dangerous to experiment with the configuration of some production environment. Typical applications include modeling workloads and throughputs in various parts of a factory with the goal of replacing production bottlenecks with more efficient alternatives. The approach is to take the flowchart of interdependent discrete events required to develop a product, and to determine numerical estimates of the time required at each step and/or station. Once this is done, "transactions" are fed to the model in sequence, and queueing statistics are accumulated.

Simulation is necessarily a simplification and imitation of empirical phenomena, and therefore does not include every feature of an actual system. Well-constructed simulations focus on the key procedures and leave out less significant events. As a result, queueing simulations should incorporate features which specifically affect throughput and waiting times. These simulations leverage the smallest set of modules and descriptions into the most informative description of system performance and limitations possible.

Time-unit models describe changes in subsystem values resulting from a series of consecutive events. Therefore, this approach could be considered a special case of Markov chain models, since system occurrences in the jth time period depend probabilistically on occurrences in the (j-1)th time period (Grimmett and Stirzaker 1992), even though the input stream is a memory-less series of independent, identically distributed random variables. Since elaborate models can produce complex state-change statistics, statistical software is used to incorporate the stochastic component as well as to record these statistics (specialized packages include ACSL, CSMP, DYNAMO, GASP, GPSS, MODSIM, SCERT, SIMAN, SIMFACTORY, SIMNET, SIMPROCESS, SIMSCRIPT, and SLAM). We selected GPSS (General Purpose Simulation System) due to its wide use and well-known features. GPSS is well-suited to the application of legislative workloads with relatively simple, discrete time processes and a large number of inputs (Gordon 1975, Solomon 1983).

Developed by IBM in the early 1960's as a product designed to appeal to large corporate manufacturers, GPSS evolved through numerous extensions and onto a wide range of hardware platforms. GPSS is the most general of the listed simulation packages, and is therefore limited in its ability to model some complex continuous processes (which is not an issue here).
4.1 GPSS Modeling
The fundamental unit in a GPSS simulation is a transaction: a bill in our case. These are generated at the beginning of a predefined sequential structure and generally progress through the model by capturing resources called blocks. Blocks alter or detain transactions much as a body welding station in an automotive factory simultaneously alters and detains unfinished cars. The predefined sequential structure of these events is called the block diagram, and it determines the order of events for the transactions. Complex systems are modeled through the order and attributes of these blocks. In general, blocks are of two types in a GPSS model: facilities resources and storage resources. The chief difference is that, in addition to being identified with a particular activity, storage resources can accommodate multiple transactions. This distinction means that limited resources are often simultaneously in demand by transactions which are queueing up for the service. As the flow of transactions traverses the system, details on block usage and transaction queueing are collected. The primary output of the model is a summary of these statistics over a given time period or the open-ended interval that produces a specific output quantity.

We define system resources in which blocks are committees, subcommittees, members' offices, and the floor of the House. This construct, although greatly simplified from the detailed workings of Congress, provides a basis for understanding legislative flow as a function of time within congressional subunits. Congressional resources that impact the flow of transactions include member and committee staff, time allotted, efficiency of the floor, specialization of committees, and member expertise.
5 The Process Control Model
We are concerned strictly with measuring legislative productivity as a function of the structure of Congress. Our leverage results from focusing on those factors in the House of Representatives which restrict or consume productivity resources in the flow of legislation during the first 100 calendar days of a new Congress. Bills are assigned to appropriate committees by the House leadership. In our model these assignments are counted from the Congressional Record Daily Diary and therefore include multiple and sequential referrals. This is an important consideration since committee workload is a function of assignments, not just the sum of introduced legislation (Davidson and Vincent 1987). Multiple referral of bills has been an increasing trend over the last 20 years (Davidson and Oleszek 1994), and currently about 20% of all bills are referred to more than one committee (Table 4). Multiple referrals adjusted as a percent coming out of the committee module are much higher: 44.27% in the 102nd (Davidson and Oleszek 1994). This trend continues even though the Republican reforms of the 104th House prohibit joint referrals, thus limiting multiple referrals to sequential and split assignments. Bills that pass the committee vote are submitted for scheduling and may finally reach the floor in the first 100 days. Bills that pass committees but are not scheduled during the first 100 days are placed in a storage queue for later consideration.
5.1 Committee Parameters
Summary data on committee parameters form the basis of variance in committee workload and processing capability. Due to the Republican reforms, the committee structure of the House differed from the 103rd to the 104th Congress. The Republican leadership eliminated three standing committees (District of Columbia, Merchant Marine & Fisheries, and Post Office & Civil Service). Table 3 summarizes these changes along with the high-level associated committee parameters used in the model. Table 3 shows the dramatic decrease in committee staff associated with the Republican reforms of the 104th House. However, the number of members on each committee stays relatively constant.

(In some cases the committee transformations were straightforward name changes. For example, it seems natural to think of the Government Reform and Oversight Committee as the natural heir to the Government Operations Committee since William Clinger (R-PA 5) went from ranking minority member of Government Operations to chair of Government Reform and Oversight, and when John Conyers (D-MI 14) left Government Operations after the 103rd, the second-ranking Democrat Cardiss Collins (D-IL 7) became the ranking minority member of Government Reform and Oversight. The jurisdiction and many members of the eliminated Post Office and Civil Service Committee were split between the Subcommittee on the Postal Service and the Subcommittee on the Civil Service, both under Government Reform and Oversight. Similarly, the jurisdiction and some members of the eliminated Merchant Marine and Fisheries Committee were transferred to the Subcommittee on Fisheries, Wildlife and Oceans (under Resources) and the Subcommittee on the Coast Guard and Maritime Transportation (under Transportation and Infrastructure). The District of Columbia Committee became the Subcommittee on the District of Columbia under Government Reform and Oversight.)
There are three basic productivity parameters in the model, identified by the quantity of: proposed legislation, legislation that successfully passes committee, and legislation considered on the House floor. These are fixed values in both stages of the model development.
Table 3: House Committee Attributes

103rd House
Committee                        Size   Sub's   Staff
Prestige Committees
  Appropriations                   60     13      101
  Budget                           43      0       49
  Rules                            13      2       29
  Ways & Means                     38      6       90
Policy Committees
  Banking                          50      6       76
  Education & Labor                43      6      102
  Energy & Commerce                44      6      160
  Foreign Affairs                  45      7       99
  Government Operations            41      6       74
  Judiciary                        35      6       70
Constituency Committees
  Agriculture                      45      6       51
  Armed Forces                     56      6       76
  M.M. & Fisheries                 46      5       60
  Natural Resources                43      5       76
  Public Works                     63      6       81
  Science, Space, & Techn.         55      5       87
  Small Business                   45      5       47
  Veterans Affairs                 35      5       39
Service Committees
  District of Columbia             12      3       34
  House Administration             21      6       62
  Intelligence                     19      3       25
  P.O. & Civil Service             24      5       70
  Stds. of Conduct                 14      0       10

104th House
Committee                        Size   Sub's   Staff
Prestige Committees
  Appropriations                   58      6      109
  Budget                           42      0       39
  Rules                            13      2       25
  Ways & Means                     39      5       23
Policy Committees
  Banking                          51      5       61
  Econ. & Educ. Oppor.             43      5       69
  Commerce                         49      4       79
  International Relations          44      3       68
  Government Reform                51      7       99
  Judiciary                        35      5       56
Constituency Committees
  Agriculture                      49      5       46
  National Security                55      7       48
  Resources                        44      5       58
  Trans. & Infrastructure          61      6       74
  Science                          50      4       58
  Small Business                   43      4       29
  Veterans Affairs                 33      3       36
Service Committees
  House Oversight                  12      0       24
  Stds. of Conduct                 10      0        9
  Intelligence                     16      2       24

Source: Ornstein, Mann, and Malbin, 1994, Vital Statistics on Congress, and CQ Congressional Staff Directory, Volume 2 1994 & Summer 1996.
Notes: 1. Representatives' associate staff not counted toward committee staff. 2. Subcommittee staff included in committee staff. 3. Budget Committee associate staff (not counted) was 41 for the 104th House. 4. Staff number for the Standards of Official Conduct Committee includes two employees of the Office of Advice and Education.
In the process of establishing the equilibrium productivity model of the 103rd House, these three quantities define variable levels that support such productivity. In the second stage, considering the 104th House, these three quantities are changed to reflect that institution's priorities. Table 4 summarizes these fixed values.
The 104th House differed dramatically from previous Congresses in terms of floor activity. While members obviously do not need to be on the floor every hour that the House is in session, increased floor activity will certainly lead to greater demands on each legislator's time. Table 2 shows the dramatic increase in floor activity in the 104th House compared with the two previous Houses. The 104th House had almost three times as many in-session hours as the previous two Houses. Even though the Republican leadership required 14 additional working days, the mean session hours per day was still well over nine; this increased burden is likely to affect committees differently.
Table 4: Bills Assigned, First 100 Days
(Committee Names from 103rd House)

                                 103rd House   104th House
Proposed                             1714          1526
Referred                             1952          2023
Floor Votes                           135           302

Prestige Committees
  Appropriations                        9             6
  Budget                              101            23
  Rules                                54            44
  Ways & Means                        542           355
Policy Committees
  Banking                             101           125
  Education & Labor                   158           131
  Energy & Commerce                   196           271
  Foreign Affairs                      40            63
  Government Operations                72           149
  Judiciary                           168           253
Constituency Committees
  Agriculture                          60            81
  Armed Forces                         75            89
  Natural Resources                    98           142
  Public Works                         69           155
  Science & Technology                 25            27
  Small Business                        9             8
  Veterans Affairs                     41            28
Service Committees
  District of Columbia                  9             -
  House Administration                 78            68
  M.M. & Fisheries                     42             -
  P.O. & Civil Service                104             -
  Stds. of Conduct                      0             1
  Intelligence                          2             4

Source: Congressional Index, Congressional Register.
5.2 Simplifying Assumptions
The previously discussed tradeoff between parsimony and realism is implemented through a series of simplifying assumptions. This section lists each of these assumptions and the associated rationale. Most of these are derived from the special circumstances seen in the first 100 days of a new Congress.
Model Assumptions
1. Constant Expectation. The average time it takes to process a bill in a specific committee does not vary across the Congresses considered in the model. This means that given identical inputs and identical parameter values (staff size, number of hearings, number of subcommittees, hours in chambers) the expected value of the processing time is identical in the 103rd and the 104th Houses. This does not imply a uniform distribution of processing times, nor does it mean that the actual means will be identical, since the two-stage model specifically changes the parameter values listed in Table 3 for the 104th House.
2. Processing Order. The order of bill processing by a given committee is not important since seriality is confined to 100 days. Because the bill flow from the 103rd House is used as a baseline for the productivity equilibrium model, every bill that successfully passes through a committee in the first 100 days does so regardless of the order. The second-stage model reflecting the 104th House is mandated to process bill flow dictated by the observed quantities for that period, regardless of order, to the floor.

3. Corporate Committees. Using Fenno's (1973) definition of corporate, we assume that for the first 100 days exogenous effects on the individual committees are unimportant. This is equivalent to asserting that for the first 100 days productivity effects are driven by endogenous leadership forces and an agenda derived from the recent election. Examples of non-included exogenous effects include: influence from the executive branch, communication with elites in the district, and contact with interest groups.

4. Core Committee Work. For the first 100 days, members restrict their committee activity to standing committees only: no select or special committees. This assumption means that time spent in select or special committees during the 100-day period is negligible. The single exception to this limitation is the inclusion of the Permanent Select Committee on Intelligence.

5. Core Legislation. For the first 100 days, the House concentrates on legislation derived from a desire by leadership to show a feedback response to the recent election. This means that legislation considered by committees and the floor is restricted to bills and joint resolutions only: no concurrent resolutions or simple resolutions.

6. Uniform Generation. Bill generation prior to assignment to appropriate committees is distributed uniformly over the first 100 days (a simple stream). This means that there is no sub-period of heightened activity during the 100-day period. This does not imply that the distribution of bills from the committees to the floor schedule is uniform.

7. Constituent Service. Members restrict constituent service to days that the House is not in session during the first 100 days. For the 103rd House this is 56 days (100 - 44), and for the 104th House this is 42 days (100 - 58).
This set of assumptions outlines the queueing structure of the model. By
restricting the inclusion of details we are attempting to specify the internal
policies and procedures that systematically affect legislative flow. The result
of this set of restrictions thus resembles (with substantially more complexity)
the simple example discussed previously. Therefore we are not asserting that
each of these assumptions is strictly true. Instead we are claiming that they
are accurate to the extent to which empirically observed counterexamples
are either rare or immaterial to the analysis of legislative productivity.
6 Model Results
6.1 Productivity Equilibrium Model
The 103rd House introduced 1714 bills in the first 100 days, making 1952 referrals of bills to committees with 135 floor votes (Table 4). When these productivity requirements were levied on the GPSS model, the mean committee was required to process one bill every 2.0527 days for the period. Committee processing capability varied; these summaries are aggregate results. In addition to the 135 bills sent on to the House floor, 121 bills were rejected in committee, and 1458 bills were still unprocessed when the model finished after 44 in-session days. The GPSS model shows that there was a mean of 909.70 bills in the committee queues, and a mean of 22.86 bills in the floor queue. The committee queues had a maximum capacity of 1801, and the floor queue had a maximum capacity of 71. Interestingly, the floor queue was empty at the conclusion of 44 days run-time. This indicates that the committee system is significantly slower in processing than the floor despite having 22 servers. These quantities, supported by the resources of the 103rd House (staff, subcommittees, bill distribution), form the productivity equilibrium model. The burden of bill processing fell disproportionately by committee type. Prestige Committees processed 30.99% and Policy Committees processed 37.65% of the total bills. In contrast, Constituency Committees processed 19.31% and Service Committees processed 12.05% of the total bills. This is consistent with theories that link policy-oriented committee work and members' concern with reelection (Fiorina 1989, Davidson & Oleszek 1994).
One way to evaluate the relative efficiency of the committee system is to compare the results described above to a simple stream queueing model as described in Section 3.2. This tests the question of whether committee specialization hinders productivity. If the legislative agenda is highly disproportionate relative to the division of specialization in House committees, then certain committees will have longer queues relative to others. Conversely, if the bill flow exactly replicates the specializations of committees, then the committee servers will exactly replicate a set of uniform service queues. In other words, allowing uniform service by committee is functionally equivalent, in terms of productivity results, to an exactly perfect match between proposed legislation and committee assignment. Now we analyze the workload of the 103rd using the observed productivity data outlined above, but assume the existence of σ′ = 22 uniform servers (the prime reminding us of the uniform assumption). This is summarized:
Intensity Parameter
    λt = 1952 (referrals) / 44 (days) = 44.3636;   t = 1 ⇒ λ = 44.3636

Service Time
    μ = 135 (votes) / 44 (days) = 3.0682

Servers
    σ′ = 22

Probability of No Waiting Time
    P_0 = \left[ \sum_{k=0}^{22} \frac{(44.3636/3.0682)^k}{k!} + \frac{(44.3636/3.0682)^{22+1}}{22!\,(22 - 44.3636/3.0682)} \right]^{-1} = 3.1535693 \times 10^{-7}

Probability all Servers are Busy
    P_\sigma = \frac{(44.3636/3.0682)^{22}\,(3.1535693 \times 10^{-7})}{(22 - 1)!\,(22 - 44.3636/3.0682)} = 0.02397182
That the probability of an empty queue for some arbitrary bill in the input stream for the first 100 days is essentially zero should not be surprising. In fact, it would be surprising, given the values for λ and μ, if this was anywhere else but close to zero. The interesting result above is that the probability of all servers being busy is a little over 2%. Conversely, in the simulation the assignment of bills was shown to be highly disproportionate, sending some bills to long queues. This is seen in the GPSS simulation result indicating that the average (total) committee queue was 909.7.
6.2 Stressing the Stable System
If the exact structure and committee productivity capacity of the 103rd House is held constant but the introduction rate (1526 bills, 2023 referrals) of the 104th is applied, then only 141 bills are processed through to the floor. This experiment includes the Republican elimination of 3 Service Committees and 58 days in session, but no other reforms. There was a mean of 911.59 bills in the committee queue, and a mean of 23.22 bills in the floor queue. Unlike the 103rd House, there were two bills waiting for floor action when the time period terminated. Nearly the same number of bills were sent to the floor even though only 19 committees functioned as servers and there were twice as many multiple referrals (497 versus 238). Part of the reason that the 104th committee structure kept relative pace was the increased number of days in session: 58 versus 44 in the 103rd. The increase in multiple referrals is interesting given the Republican reform that disallowed simultaneous multiple (joint) referrals.
The disproportionality of assignment to committee was even more extreme in this simulation: 49.04% of the legislation was assigned to Policy Committees. Prestige Committees (21.16%) and Constituency Committees (26.20%) processed about the same amount, whereas Service Committees were very inactive (3.61%), reflecting the priorities expressed in the Contract with America and the Republican leadership's attitude about the activities of Service Committees.
We now analyze the stressed system with uniform servers as was done with the workload equilibrium model. This provides the following analysis:

Intensity Parameter
    λt = 2023 (referrals) / 58 (days) = 34.8793;   t = 1 ⇒ λ = 34.8793

Service Time
    μ = 302 (votes) / 58 (days) = 5.2069

Servers
    σ′ = 19

Probability of No Waiting Time
    P_0 = \left[ \sum_{k=0}^{19} \frac{(34.8793/5.2069)^k}{k!} + \frac{(34.8793/5.2069)^{19+1}}{19!\,(19 - 34.8793/5.2069)} \right]^{-1} = 0.001234007

Probability all Servers are Busy
    P_\sigma = \frac{(34.8793/5.2069)^{19}\,(0.001234007)}{(19 - 1)!\,(19 - 34.8793/5.2069)} = 7.767363 \times 10^{-5}
This analysis shows that under the assumption of uniform servers, the 104th House was more efficient than the 103rd. The probability of no waiting time was still small, but it was four orders of magnitude better than that of the 103rd. In addition, the probability that all servers are busy for some arbitrary entrant is nearly zero. The reason for this surprising result can be seen by evaluating the ratio of the intensity parameter to the service time. This is essentially a measure of productivity efficiency for the committee system:

    103rd:  λ/μ = 44.3636 / 3.0682 = 14.4592
    104th:  λ/μ = 34.8793 / 5.2069 = 6.69866

So the committee system of the 104th House was driven by the leadership to process a greater proportion of the input stream (bills). Also, this greater efficiency is achieved with 19 uniform servers (committees) instead of the 22 of the 103rd House.
Next the GPSS model is reconfigured to replicate the exact output of the committee system to the floor queue. So the introduction and assignment rates are unchanged from above, but the model must send 302 bills to the floor. In order to replicate this empirically observed output, the non-Service Committees had to be 3.92 times more efficient. This means that despite committee staff cuts (Table 3), the elimination of proxy voting by chairs, the elimination of rolling quorums, and mandatory open meetings, these committees had to dramatically improve their efficiency. This raises an obvious question: how can a committee structure increase its productivity efficiency with decreased productivity resources? One possible explanation is the Republican reform which included a reduction in members' committee assignments in the 104th House. Members sat on a mean of 5.9 committees and subcommittees in the 103rd House compared to a mean of 4.8 in the 104th (Davidson 1995).
7 Conclusion
This research develops a new methodological approach to analyzing the workload of legislatures. By modeling bills as transactions and committees as processing resources, we are able to test hypotheses about limiting resources and mandating productivity. Queueing theory is a tool which takes a throughput structure along with the capabilities of the elements of that structure and tests various scenarios regarding output. Our perspective is to see Congress, the House of Representatives in particular, as a processor of legislation in the classic factory floor sense. While this approach ignores many of the features that make Congress such an interesting political body, it allows the parsimonious investigation of one important aspect of legislation. Productivity is a particularly important aspect of Congress as policy areas become more technical and complex, and by extension, efficiency is the associated measure of productivity given a specific time interval such as the first 100 days.
There are other areas that can be explored with this new methodology. It would be interesting to compare two Democratically controlled Houses with regard to rules changes or committee reconfiguration. Other resources can be included, such as interest groups and the executive branch. Clearly the time period is not limited to the first 100 days of a new Congress; however, many complexities emerge when modeling longer periods. The core output measure provided by this methodology is efficiency with regard to structural changes in the process of legislating. Therefore any aspect of a legislature that is non-independent of efficiency is a potential model in this context.

It has been argued that efficiency and representation are not necessarily at odds (Thurber 1995). We look at efficiency as having some expressed cost. Requiring the committee system to produce a significantly higher output with reduced resources implies a greater burden on members. Members make allocative decisions with respect to their time.
If the burdens of committee work and floor activity increase dramatically, then members are certain to reduce time spent on other activities. Certainly members reduce their personal time during a bounded period of substantially increased legislative activity. However, for a 100-day period, it is unreasonable to think that legislators can function with less than 8 hours per day for personal use (including sleep). So in order for committees to be 3.92 times more productive, other congressional activities must be curtailed, such as campaigning (not a big issue in the first 100 days of a new Congress), constituent service (often delegated), caucus and party activity (not very flexible), and, most importantly, time considering and writing legislation. If legislators decreased their deliberation and bill consideration time to meet a specified timetable, then one would expect partisan voting to increase as each member looks for time-saving cues. This is exactly what occurred. For example, 141 of the 230 Republicans had party unity scores of 100 on the 33 bills identified as the implementation of the Contract with America (Congressional Quarterly, April 1, 1995). The party unity score for freshman Republicans (n=73) during the first 100 days of the 104th was 96, compared with 93 for the entire first term. The effect of requiring a stable legislative system to vastly increase output was described by Minority Leader Gephardt (Congressional Quarterly, April 8, 1995):
"This hundred days is a self-imposed national emergency that made no sense. It's caused all of them to jerk stuff through the procedure much faster than it should be. There hasn't been enough committee consideration or floor consideration."
The model presented here partially agrees with Mr. Gephardt in that we have found evidence to suggest that the committee system was unable to spend nearly as much time considering individual legislation. An efficiency increase of 3.92 times means that a committee spends a mean of 74% less time (in days) on each bill, and less time on the functions of deliberation and education (see Thurber 1995). If legislative performance is judged at least partially by members' careful consideration of policy outcomes, then externally driven legislative agendas that require radical productivity changes in a stable system affect the quality of representation and deliberation, two major functions of Congress.
References
[1] Asmussen, Soren. 1987. Applied Probability and Queues. New York: Wiley & Sons.
[2] Barry, Rozanne M. 1995. "Bills Introduced and Laws Enacted: Selected Legislative Statistics, 1947-1994." CRS Report 95-233C.
[3] Brockmeyer, E., H.L. Halstrom, and Arne Jensen. 1960. The Life and Works of A.K. Erlang. Kobenhavn: Akademiet for de Tekniske Videnskaber.
[4] Bunday, Brian D. 1986. Basic Queueing Theory. London: Edward Arnold.
[5] Cassata, Donna. 1995. "Swift Progress of `Contract' Inspires Awe and Concern." Congressional Quarterly, April 1, 1995.
[6] Congressional Quarterly. Summer 1994. Congressional Staff Directory. Washington: Congressional Quarterly Press.
[7] Congressional Quarterly. Summer 1996. Congressional Staff Directory. Washington: Congressional Quarterly Press.
[8] Davidson, Roger H. 1996. "The Presidency and Congressional Time." In Rivals for Power: Presidential-Congressional Relations, ed. James A. Thurber. Washington: Congressional Quarterly Press.
[9] Davidson, Roger H. 1995. "Congressional Committees in the New Reform Era: From Combat to the Contract." In Remaking Congress: Change and Stability in the 1990's, eds. James A. Thurber and Roger H. Davidson. Washington: Congressional Quarterly Press.
[10] Davidson, Roger H. 1991. "The Presidency and Three Eras of the Modern Congress." In Divided Democracy: Cooperation and Conflict Between the President and Congress, ed. James A. Thurber. Washington: Congressional Quarterly Press.
[11] Davidson, Roger H. and Walter J. Oleszek. 1994. Congress and Its Members. Washington: Congressional Quarterly Press.
[12] Davidson, Roger H. and Carol Hardy Vincent. 1987. "Indicators of House Workload and Activity." CRS Report 87-4975.
[13] Easton, David. 1965. A Framework for Political Analysis. Englewood Cliffs, NJ: Prentice Hall.
[14] Easton, David. 1966. A Systems Analysis of Political Life. New York: Routledge.
[15] Easton, David. 1990. The Analysis of Political Structure. New York: Routledge.
[16] Erlang, A. K. 1935. Fircifrede logaritmetavler og andre regnetavler til brug ved undervisning og i praksis. Kobenhavn: G.E.C. Gads.
[17] Erlang, A. K. 1917. "Solution of Some Problems in the Theory of Probabilities of Significance in Automatic Telephone Exchanges." The Post Office Electrical Engineers Journal 10: 89.
[18] Feller, W. 1940. "On the Integro-differential Equations of Purely Discontinuous Markov Processes." Transactions of the American Mathematical Society 48: 488-515.
[19] Fenno, Richard. 1973. Congressmen in Committees. Boston: Little, Brown.
[20] Fiorina, Morris. 1989. Congress: Keystone of the Washington Establishment. Second Edition. New Haven: Yale University Press.
[21] Fry, T.C. 1965. Probability and its Engineering Uses. Second Edition. New York: Van Nostrand.
[22] Gordon, Geoffrey. 1975. The Application of GPSS V to Discrete System Simulation. Englewood Cliffs, NJ: Prentice Hall.
[23] Gnedenko, B. V. and I. N. Kovalenko. 1989. Introduction to Queueing Theory. Boston: Birkhauser.
[24] Grimmett, G. R. and D. R. Stirzaker. 1992. Probability and Random Processes. Oxford: Clarendon Press.
[25] Gross, Donald, and Carl M. Harris. 1985. Fundamentals of Queueing Theory. New York: Wiley & Sons.
[26] Kinchin, A. Ya. 1960. Mathematical Methods and the Theory of Queueing. London: Griffin.
[27] Kinchin, A. Ya. 1956. "On Poisson Sequences of Random Events." Theory of Probability and Applications 1: 291-7.
[28] King, Gary. 1989. "Variance Specification in Event Count Models: From Restrictive Assumptions to a Generalized Estimator." American Journal of Political Science 33: 762-84.
[29] King, Gary, Robert O. Keohane, and Sidney Verba. 1994. Designing Social Inquiry. Princeton: Princeton University Press.
[30] Morgan, Byron J. T. 1984. Elements of Simulation. London: Chapman and Hall.
[31] Oleszek, Walter J. 1989. Congressional Procedures and the Policy Process. Washington: Congressional Quarterly Press.
[32] Ornstein, Norman J., Thomas E. Mann, and Michael Malbin. 1994. Vital Statistics on Congress 1993-1994. Washington: American Enterprise Institute.
[33] Smith, Steven S. and Christopher J. Deering. 1990. Committees in Congress. Washington: Congressional Quarterly Press.
[34] Solomon, Susan L. 1983. Simulation of Waiting-Line Systems. Englewood Cliffs, NJ: Prentice Hall.
[35] Thurber, James A. 1995. "The 104th Congress is Fast and Efficient, But At What Price?" Roll Call, March 5, p.16.
[36] Thurber, James A. 1991a. "Dynamics of Policy Subsystems in American Politics." In Interest Group Politics, eds. Allan J. Ciglar and Burdett Loomis. Washington: Congressional Quarterly Press.
[37] Thurber, James A. 1991b. "Delay, Deadlock, and Deficits: Evaluating Proposals for Congressional Budget Reform." In Federal Budget and Financial Management Reform, ed. Thomas D. Lynch. Westport, CT: Quorum Books.