IEOR 4106: Introduction to Operations Research: Stochastic Models
Spring 2005, Professor Whitt, Second Midterm Exam
Chapters 5-6 in Ross, Thursday, March 31, 11:00am-1:00pm
Open Book: but only the Ross textbook plus one 8 × 11 page of notes
Justify your answers; show your work.
1. The IEOR Department Ricoh Printer (30 points)
The Columbia IEOR Department has a versatile Ricoh printer that can rapidly print one-sided and two-sided copies, but unfortunately it often goes down. The Ricoh is alternately up
(working) and down (waiting for repair or under repair). The average up time (time until
breakdown) is 4 days, while the average down time (time until repair) is 3 days. Assume
continuous operation. Let X(t) = 1 if the Ricoh is working at time t, and let X(t) = 0
otherwise.
(a) What do we need to assume about the successive up and down times in order to make
the stochastic process {X(t) : t ≥ 0} a continuous-time Markov chain (CTMC)?
——————————————————————————
We need the successive up times and down times to be mutually independent random
variables. In addition, these random variables should have exponential distributions. The
lack-of-memory property of the exponential distribution is critical for getting the Markov
property for the stochastic process {X(t) : t ≥ 0}. The exponential distribution has a single
parameter, which can be taken to be its mean. Since the means are already specified, nothing
more needs to be assumed, beyond assuming that the means agree with the specified averages.
——————————————————————————
Henceforth assume that these extra assumptions are in place, so that indeed the stochastic
process {X(t) : t ≥ 0} is a CTMC.
(b) Construct the CTMC; i.e., specify the model.
——————————————————————————
A CTMC can be specified by its rate matrix, Q. That takes a simple form here because
there are only two states: 0 and 1. The rate matrix here, ordering the two states 0 and 1, is
    Q = | −1/3   1/3 |
        |  1/4  −1/4 |
where the rates are expressed per day. That is, the repair rate is 1/3 per day, while the failure
rate is 1/4 per day.
In this case we have a birth-and-death (BD) process, so that λ0 = Q0,1 = 1/3, while
µ1 = Q1,0 = 1/4. That is an equivalent specification; i.e., we say that it is BD and we specify
these single λi and µi values.
——————————————————————————
(c) Assuming that Ricoh has been working continuously for 7 days, what is the probability
that it will remain working at least 8 more days?
——————————————————————————
The holding time in each state is exponentially distributed. The exponential up time has
mean 4 days, and thus rate 1/4, as indicated above. Let T be the failure time. Then
P (T > 8 + 7|T = 7) = P (T > 8) = e−(1/4)8 = e−2 .
——————————————————————————
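As a sanity check, the lack-of-memory property used above can be verified numerically. This is a sketch; the seed and sample size are arbitrary choices, and the rate 1/4 per day is the failure rate of the exponential up time (mean 4 days) from the problem.

```python
import math
import random

# Monte Carlo check of the memoryless property: estimate
# P(T > 7 + 8 | T > 7) and compare with P(T > 8) = e^{-2}.
random.seed(12345)
rate = 0.25  # failure rate per day (mean up time 4 days)
samples = [random.expovariate(rate) for _ in range(200_000)]

survivors = [t for t in samples if t > 7]
cond = sum(t > 15 for t in survivors) / len(survivors)
exact = math.exp(-2)
print(cond, exact)
```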
(d) Suppose that Ricoh has been working continuously for 12 days. From that moment
forward, let T be the time until the second breakdown. What is the expected value E[T ]?
——————————————————————————
First, by the lack-of-memory property, the history (the 12 days) does not matter. Let Ui
be the ith up time and let Di be the ith down time, starting from the initial time. Then
T = U1 + D1 + U2 ,
which is the sum of three independent exponential random variables. Thus
E[T] = E[U1 + D1 + U2] = E[U1] + E[D1] + E[U2] = 4 + 3 + 4 = 11 days.
——————————————————————————
(e) What is the long-run proportion of time that Ricoh is up?
——————————————————————————
You find the steady-state distribution by solving αQ = 0 or by using the local balance
equation
α0 λ0 = α1 µ1 ,
which here is
α0 (1/3) = α1 (1/4),
used together with α0 + α1 = 1. Solving gives α1 = 4/7, which agrees with intuition:
lim_{t→∞} P(X(t) = 1) = E[U] / (E[U] + E[D]) = 4/(4 + 3) = 4/7 .
That is the formula for an alternating renewal process. As we will see in Chapter 7, the
limiting steady-state distribution holds even if the up and down times do not have exponential
distributions.
——————————————————————————
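The two-state computation above is small enough to check directly; a minimal sketch, using the rates from part (b):

```python
# Solve the local-balance equation alpha_0 * (1/3) = alpha_1 * (1/4)
# together with alpha_0 + alpha_1 = 1.
lam0 = 1.0 / 3.0  # repair rate, transition 0 -> 1
mu1 = 1.0 / 4.0   # failure rate, transition 1 -> 0

alpha1 = lam0 / (lam0 + mu1)  # long-run proportion of time up
alpha0 = 1.0 - alpha1
print(alpha0, alpha1)  # 3/7 and 4/7
```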
(f) Suppose that Ricoh is initially down. Approximately, what is the probability that Ricoh
will break down at least 9 more times within the next 93 days?
——————————————————————————
If Ricoh is initially down, then we have a down time D and an up time U between each
successive new breakdown. The time at which the 9th breakdown occurs is the sum S9 ≡ X1 +
· · · + X9 , where Xi is distributed as Di + Ui . Notice that E[X1 ] = 4 + 3 = 7 days, while
Var(X1) = 4^2 + 3^2 = 16 + 9 = 25. By the central limit theorem, S9 is approximately normally
distributed with mean E[S9] = 9 × 7 = 63 days and variance Var(S9) = 9 × 25 = 225. Hence,
we can use a normal approximation:
P(S9 ≤ 93) = P( (S9 − 63)/√225 ≤ (93 − 63)/√225 ) ≈ P(N(0, 1) ≤ 2) ≈ 0.977 ;
see the normal table on page 81.
——————————————————————————
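The table lookup above can be reproduced with the standard normal cdf, written here via the error function:

```python
import math

# Standard normal cdf via the error function.
def phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

z = (93 - 63) / math.sqrt(225)  # = 2.0
approx = phi(z)                 # about 0.977
print(z, approx)
```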
Bonus Question. (5 points)
Give the moment generating function of T (defined in part (d) above).
——————————————————————————
Continuing from part (d),
E[e^{sT}] = E[e^{s(U1+D1+U2)}] = E[e^{sU1} e^{sD1} e^{sU2}]
= E[e^{sU1}] E[e^{sD1}] E[e^{sU2}]
= ( (1/4)/((1/4) − s) ) × ( (1/3)/((1/3) − s) ) × ( (1/4)/((1/4) − s) ) ;   (1)
see Example 2.4.1 on page 66, which can be found by looking up “moment generating function”
(MGF) in the index. Since the random variables are independent, the MGF of the sum is the
product of the MGF’s; see the middle of page 68.
——————————————————————————
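As a numeric check of formula (1), the derivative of the MGF at s = 0 must recover E[T] = 11 days from part (d):

```python
# MGF of T = U1 + D1 + U2 from formula (1).
def mgf(s):
    return (((1/4) / ((1/4) - s))
            * ((1/3) / ((1/3) - s))
            * ((1/4) / ((1/4) - s)))

h = 1e-6
mean_t = (mgf(h) - mgf(-h)) / (2 * h)  # central-difference estimate of M'(0)
print(mean_t)  # close to 11
```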
2. The Toledo Taxi Stand (42 points)
In the city of Toledo there is a small taxi stand served by three taxis. Prospective groups
of customers arrive at the taxi stand at a rate of 5 per hour. (Assume that each group of
customers can be served by a single taxi.) The groups are served on a first-come first-served
basis by the first available taxi. The taxis take the customers to their desired destinations
and then return to the taxi stand. Suppose that the time of each taxi round trip, from the
taxi stand and back, is an exponentially distributed random variable with mean 10 minutes.
The taxis wait in order of arrival if there are no customers to serve. The groups of customers
also wait if there are no taxis, except that groups will not wait at all if there are already four
groups waiting for taxis. Moreover, the waiting groups of customers have limited patience.
Each group is only willing to wait an exponentially distributed time with mean 15 minutes
before they will leave, without receiving service.
Part I. (16 points)
For this part only, suppose that at some instant of time there is precisely one taxi at the
taxi stand.
(a) What is the probability that nothing happens (no arrivals of taxis or customers) during
the next 10 minutes?
——————————————————————————
When there is one taxi present, there are two things that can happen: the arrival of one
of the other two taxis or the arrival of a group of customers. The time until each of the other
taxis return is an exponential random variable with mean 10 minutes (and thus rate 6 per
hour). The time until the next group arrives is an exponential random variable with mean
12 minutes (and thus rate 5 per hour). The time until one of these three events occurs is the
minimum of three independent exponential random variables, and so is itself an exponential
random variable with a rate equal to the sum of the rates. The rate of this exponential random
variable is thus 6 + 6 + 5 = 17 per hour. Since 10 minutes is 1/6 hour, the time, say T, until
the next event is exponential with rate 17 per hour. Thus
P(T > 1/6) = e^{−17(1/6)} = e^{−17/6} .
——————————————————————————
(b) What is the expected time until one of the other two taxis returns?
——————————————————————————
As indicated in part (a), the time until each of the other taxis returns is exponential with
rate 6 per hour. The time until the first of these two events occurs is exponential with rate
6 + 6 = 12 per hour. The expected time is the reciprocal of the rate, which is 1/12 hours,
which is 5 minutes.
——————————————————————————
(c) What is the probability that one of the other taxis arrives before another group of
customers arrives?
——————————————————————————
The rate for one of the two taxis arriving is 12 per hour; the rate for the group to arrive is
5 per hour. Thus the probability that one of the two taxis arrives before the group arrives is
12/(12 + 5) = 12/17.
——————————————————————————
(d) What is the probability that both the other two taxis arrive before any groups of
customers arrive?
——————————————————————————
We want the probability of the intersection of two independent events, so it is the product of
two probabilities. The first event is that one of the taxis arrives before the group of customers.
Conditioned on that event, the second event is that the other remaining taxi arrives before the
group of customers. Thus the overall probability is (12/17) × (6/11) = 72/187 ≈ 0.385.
——————————————————————————
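The competing-exponentials arithmetic in parts (c) and (d) can be confirmed in exact arithmetic, using the rates above (each taxi returns at rate 6 per hour, groups arrive at rate 5 per hour):

```python
from fractions import Fraction

taxi = Fraction(6)   # return rate of one taxi, per hour
group = Fraction(5)  # group arrival rate, per hour

# (c) one of the two outstanding taxis beats the next group:
p_one_taxi_first = (2 * taxi) / (2 * taxi + group)   # 12/17
# (d) then the remaining taxi also beats the group:
p_second_taxi_first = taxi / (taxi + group)          # 6/11
p_both = p_one_taxi_first * p_second_taxi_first      # 72/187
print(p_one_taxi_first, p_both, float(p_both))
```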
Part II. (10 points)
(e) Construct a Markov stochastic process for the taxi stand that can be used to find the
long-run proportion of time that any specified number of taxis is available to serve arriving
groups of customers.
——————————————————————————
We present two modelling approaches, which are actually equivalent.
The first modelling approach is to recognize that we can regard this as a standard
M/M/s/r + M Markovian queueing model, having a Poisson arrival process (the first M),
IID exponential service times (the second M), s = 3 servers, r = 4 extra waiting spaces
and IID exponential times to abandon for waiting customers (the +M). That gives us the
M/M/3/4 + M model. This model is characterized by three parameters: the arrival rate λ = 5
per hour, the individual service rate µ = 6 per hour and the individual abandonment rate θ = 4
per hour. We get the service rate µ = 6 from the given mean 1/µ = 1/6 hour = 10 minutes;
we get the abandonment rate θ = 4 per hour from the given mean 1/θ = 1/4 hour = 15
minutes. To have this model fit our circumstances, we let X(t) be the number of customers
in the system, either waiting or being served, where we consider the customer to be in service
until the taxi he took returns to the taxi stand. With this interpretation, the possible states
are 0, 1, . . . , 7. For example, state 4 means that three customers are being served and one is
waiting, while state 2 means that 2 taxis are away, while 1 is at the taxi stand, and state 0
means that all three taxis are at the taxi stand. The advantage of this modelling approach is
it uses a familiar model. We draw the rate diagram for the birth-and-death process X(t) in
Figure 1.
Figure 1: A rate diagram showing the transition rates for the birth-and-death process: birth
rates λi on the arrows from state i to i + 1 and death rates µi on the arrows from state i to
i − 1, over the states 0, 1, . . . , 7.
To complete the model specification, we need to specify the birth rates λi , 0 ≤ i ≤ 6, and
the death rates µi , 1 ≤ i ≤ 7. We define these as follows: Since the arrival rate is λ = 5, we
have λi = λ = 5 for all i, 0 ≤ i ≤ 6. We let λ7 = 0 because arrivals cannot occur when the
system is full. The death rates are more complicated, because we need to account for service
completions and abandonments. We have µ1 = 6 because only one taxi is out, available to
return; we have µ2 = 12 because two taxis are out, available to return; µ3 = 18 because three
taxis are out, available to return. Then, for i ≥ 4, we must include abandonment. Each waiting
customer abandons at rate 4. Thus µ4 = 18 + 4 = 22, µ5 = 18 + 8 = 26, µ6 = 18 + 12 = 30
and µ7 = 18 + 16 = 34. We have thus specified all the individual birth rates and death rates.
We now describe the second modelling approach, which is chosen to more directly
describe the system. With this second modelling approach, let the state k designate the
difference between the number of customer groups present and the number of taxis present.
With that scheme the states range from −3 (all three taxis are at the stand) to +4 (there
are 4 customer groups present). (Additional customer groups would balk (or be blocked) and
not wait, by assumption.) Hence there are again 8 states: −3, −2, −1, 0, 1, 2, 3, 4. Alternatively,
we could label the states in the customary way: 0, 1, 2, 3, 4, 5, 6, 7; we then understand that
state j means the number of customers in the system, where we regard a customer as being
“in the system” if that customer is either waiting or its taxi has not yet returned. In other
words, we say that the customer is in the system until its taxi has returned to the taxi stand.
But that is just relabeling the same 8 states. The new state is the original state minus 3. The
corresponding rates are the same.
The associated rate matrix, numbering the states in increasing order (either −3, −2, −1, 0, 1, 2, 3, 4
or 0, 1, 2, 3, 4, 5, 6, 7) is
        | −5.0    5.0    0.0    0.0    0.0    0.0    0.0    0.0 |
        |  6.0  −11.0    5.0    0.0    0.0    0.0    0.0    0.0 |
        |  0.0   12.0  −17.0    5.0    0.0    0.0    0.0    0.0 |
    Q = |  0.0    0.0   18.0  −23.0    5.0    0.0    0.0    0.0 |
        |  0.0    0.0    0.0   22.0  −27.0    5.0    0.0    0.0 |
        |  0.0    0.0    0.0    0.0   26.0  −31.0    5.0    0.0 |
        |  0.0    0.0    0.0    0.0    0.0   30.0  −35.0    5.0 |
        |  0.0    0.0    0.0    0.0    0.0    0.0   34.0  −34.0 |
——————————————————————————
(f) Without performing any detailed calculations, indicate how the limiting steady-state
distribution for this Markov process can be efficiently calculated.
——————————————————————————
In general for a CTMC, we can find the limiting steady-state probability vector α by solving
αQ = 0, which corresponds to a system of 8 linear equations with 8 unknowns. In addition, if
we number the states 0, 1, 2, . . . , 7, then we need to use the equation α0 + · · · + α7 = 1.0. But
here we have a BD process, so we can solve the local-balance equations
αi λi = αi+1 µi+1
for all i, plus the equation α0 + · · · + α7 = 1.0. That leads to the explicit formula

αi = [ (λ0 × · · · × λi−1)/(µ1 × · · · × µi) ] / [ 1 + Σ_{k=1}^{7} (λ0 × · · · × λk−1)/(µ1 × · · · × µk) ]

for 1 ≤ i ≤ 7 and

α0 = 1 / [ 1 + Σ_{k=1}^{7} (λ0 × · · · × λk−1)/(µ1 × · · · × µk) ] ;

see page 371 of Ross.
——————————————————————————
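The product-form solution above is easy to carry out exactly. A sketch, using the rates from part (e) (λi = 5 for 0 ≤ i ≤ 6 and death rates µ1, . . . , µ7 = 6, 12, 18, 22, 26, 30, 34):

```python
from fractions import Fraction

birth = [Fraction(5)] * 7
death = [Fraction(m) for m in (6, 12, 18, 22, 26, 30, 34)]

# r_i = (lambda_0 ... lambda_{i-1}) / (mu_1 ... mu_i), with r_0 = 1.
ratios = [Fraction(1)]
for b, d in zip(birth, death):
    ratios.append(ratios[-1] * b / d)

alpha = [r / sum(ratios) for r in ratios]

# alpha satisfies the local-balance equations and sums to one.
assert sum(alpha) == 1
assert all(alpha[i] * birth[i] == alpha[i + 1] * death[i] for i in range(7))
print([float(a) for a in alpha])
```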
Part III. (16 points)
For this part, assume that the limiting steady-state distribution of the Markov process has
been found, specifying the steady-state probability that the process is in each of the states.
Introduce notation for the states and the steady-state probabilities of those states. Use that
notation to answer the following questions.
(g) Give an expression for the probability that a group of potential customers will be able
to be served by a taxi immediately upon arrival?
——————————————————————————
Let the states be labeled as initially: −3, −2, . . . , 4. Let αi be the steady-state probability
of state i, −3 ≤ i ≤ 4. Then the answer here is α−3 + α−2 + α−1 .
——————————————————————————
(h) Give an expression for the long-run proportion of groups of potential customers that
leave immediately upon arrival without receiving service.
——————————————————————————
α4
——————————————————————————
(i) Give an expression for the long-run average number of taxis waiting at any time.
——————————————————————————
3 × α−3 + 2 × α−2 + 1 × α−1 .
——————————————————————————
(j) Give an expression for the long-run proportion of arriving groups of customers that elect
to wait upon arrival, but abandon before receiving service?
——————————————————————————
This one is more complicated. First, the question is not too clearly worded, so it might
not be clear what is being asked. Suppose that we are looking for the conditional probability
that a customer abandons given that he enters and must wait. We can write
P(Abandon | enters and waits) = (abandonment rate)/(rate of arrivals that enter and wait)
= (α1 θ + 2α2 θ + 3α3 θ + 4α4 θ) / (λ(1 − α−3 − α−2 − α−1 − α4))
= (4α1 + 8α2 + 12α3 + 16α4) / (5(1 − α−3 − α−2 − α−1 − α4)) .
We could also proceed in other ways. For example, we could write
P(Abandon | enters and waits) = P(enters and waits and then abandons) / P(enters and waits) .
We then need to calculate the numerator and denominator. The denominator is
P (enters and waits) = α0 + α1 + α2 + α3 .
The numerator is more complicated.
We first find the probability that there are i customers waiting upon arrival; the arriving
customer then makes one additional customer present. From there we compute the probability
in question.
P(enters and waits and then abandons) = α0 (4/(18 + 4))
+ α1 [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ]
+ α2 [ (4/(18 + 12)) + (26/(18 + 12)) [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ] ]
+ α3 [ (4/(18 + 16)) + (30/(18 + 16)) [ (4/(18 + 12)) + (26/(18 + 12)) [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ] ] ] .
To explain, consider the first term. The arrival finding 0 will himself be the sole waiting
customer. Then 4/(18 + 4) is the probability that an abandonment is the next event, whereas
18/(18 + 4) is the probability that a taxi arrives and the newly arrived customer group enters
service. When the customer group is initially number k in line it can abandon as the first event,
the second event, and so on, up to the k th event. Next consider the second term. Here is what
can happen: Our new customer can abandon, which happens with probability of 4/(18 + 8);
we divide by 18 + 8 now because 3 taxis can arrive (3 × 6 = 18) and 2 customers can abandon
(2 × 4 = 8). Alternatively, one of the other events can occur - the other customer abandons or
one of the taxis returns - which happens with probability of 22/(18 + 8); then our customer
will become first in line. Thereafter he abandons with probability 4/(18 + 4). We then move
on to the α2 term.
——————————————————————————
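The two expressions derived above for P(Abandon | enters and waits) should agree, and they can be checked against each other in exact arithmetic. In this sketch the steady-state vector α is recomputed from the birth-and-death rates of part (e), with indices 0..7 standing for states −3..4:

```python
from fractions import Fraction

birth = [Fraction(5)] * 7
death = [Fraction(m) for m in (6, 12, 18, 22, 26, 30, 34)]
ratios = [Fraction(1)]
for b, d in zip(birth, death):
    ratios.append(ratios[-1] * b / d)
alpha = [r / sum(ratios) for r in ratios]

theta, lam = Fraction(4), Fraction(5)

# First expression: abandonment rate over rate of arrivals that enter and wait.
abandon_rate = theta * sum(k * alpha[3 + k] for k in range(1, 5))
enter_wait_rate = lam * sum(alpha[3 + k] for k in range(4))
expr1 = abandon_rate / enter_wait_rate

# Second expression: nested first-step probabilities q_k that an arrival
# who becomes (k+1)-st in line eventually abandons.
qs = [Fraction(4, 22)]          # arrival finds 0 waiting: total rate 18 + 4
for total in (26, 30, 34):      # arrival finds 1, 2, 3 waiting
    qs.append(Fraction(4, total) + Fraction(total - 4, total) * qs[-1])
expr2 = (sum(alpha[3 + k] * qs[k] for k in range(4))
         / sum(alpha[3 + k] for k in range(4)))

assert expr1 == expr2
print(float(expr1))  # about 0.213
```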
Part IV.
Bonus Questions. (8 points)
(k) Can the Markov process constructed in part (e) above be made time reversible? If so,
how?
——————————————————————————
Yes, the stochastic process can be made reversible, provided that we initialize by the
stationary probability vector α, because it is a birth-and-death process.
To elaborate, first we say that a stochastic process {X(t) : −∞ < t < ∞} is (time)
reversible if {X(−t) : −∞ < t < ∞} has the same probability law as {X(t) : −∞ < t < ∞}.
For any Markov process to be reversible, we require that the Markov process have a
steady-state limiting probability vector α. That always is the case for a birth-and-death process with
a finite state space. (However, that might not be true with an infinite state space. With
an infinite state space, we need to verify that a proper steady-state limiting vector α exists.)
Moreover, we must look at the stochastic process in steady state. That is accomplished by
letting the initial distribution be α; i.e., we let P (X(0) = j) = αj for all j. With that special
initial condition, we have P (X(t) = j) = αj for all j and for all t. If we initialize a Markov
process in that way, there always is a well-defined reverse-time Markov process, also having
steady-state limiting vector α. The question is whether the reverse-time Markov process has
the same probability law as the original (forward-time) Markov process. That is true if and only
if the detailed balance condition holds; see Section 6.6. That is the case for birth-and-death
processes. In that sense, all ergodic birth-and-death processes are reversible; see Proposition
6.5. So we must initialize the process with the steady-state vector α or, equivalently, we must
consider the process in equilibrium (steady state). Then it becomes reversible. If it does not
have the right initial conditions, then it is not reversible.
——————————————————————————
(l) Consider the stochastic process counting the arrival times of taxis to the stand beginning
at some time in steady state. Is that stochastic process a Poisson process? Why or why not?
——————————————————————————
No, the arrival counting process for the arrivals of taxis is not a Poisson process. To
understand why, it is useful to look at the M/M/3/4 model formulation, with states 0 − 7.
With that model formulation, the arrival process of taxis corresponds to the departure process
from the queueing system. We might think that implies Poisson, because we know that the
stochastic process X(t) counting the number of customers in the system is reversible. Since
the external arrival process is a Poisson process, we might expect that reversibility implies
that the departure process is also a Poisson process, by virtue of Corollary 6.6 on page 378.
However, that result does not apply because of the finite waiting room.
For our model, the departure process becomes an arrival process in reverse time, so it has
the same distribution as the process counting arrivals that actually enter the system. But
some of the original arrivals are blocked; indeed all arrivals finding the system full are blocked.
Moreover, the blocking process does not act as an independent thinning of the external arrival
process. Instead, the blocking only occurs in a special state, when the waiting room is full.
Thus, because of reversibility, the departure process does have the same distribution as the
process counting customers entering the system, but that entering counting process is not itself
a Poisson process.
——————————————————————————
3. The Columbia Space Company (28 points)
Columbia University has decided to start the Columbia Space Company, which will launch
satellites from its planned Manhattanville launch site beginning in 2010, referred to henceforth
as time 0. Allowing for steady growth, the Columbia Space Company plans to launch satellites
at an increasing rate, beginning at time t. Specifically, they anticipate that they will launch
satellites according to a nonhomogeneous Poisson process with rate λ(t) = 2t satellites per
year for t ≥ 0. Suppose that the successive times satellites stay up in space are independent
random variables, each uniformly distributed between 3 years and 5 years.
(a) What is the probability (according to this model) that no satellites will actually be
launched during the first three years (between times t = 0 and t = 3)?
——————————————————————————
The arrival process is a nonhomogeneous Poisson process. The mean number of arrivals
between 0 and 3 is
m(0, 3) = ∫_0^3 λ(t) dt = ∫_0^3 2t dt = t^2 |_0^3 = 9 − 0 = 9 .
Let N (t) denote the number of satellites that have arrived in the interval [0, t]. Then
P (N (3) = 0) = e−9 .
——————————————————————————
(b) What is the expected number of satellites launched during the second year (between
times t = 1 and t = 2)?
——————————————————————————
m(1, 2) ≡ E[N (2) − N (1)] = E[N (2)] − E[N (1)] = 4 − 1 = 3 .
——————————————————————————
(c) What is the probability that precisely 7 satellites will be launched during the second
year (between times t = 1 and t = 2)?
——————————————————————————
Using part (b),
P(N(2) − N(1) = 7) = e^{−3} 3^7 / 7! .
——————————————————————————
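Parts (a)–(c) can be checked numerically. With λ(t) = 2t, the mean number of launches in [s, t] is m(s, t) = t^2 − s^2:

```python
import math

def m(s, t):
    # Mean of the nonhomogeneous Poisson count on [s, t] for lambda(t) = 2t.
    return t**2 - s**2

p_a = math.exp(-m(0, 3))                                 # P(N(3) = 0) = e^{-9}
mean_b = m(1, 2)                                         # E[N(2) - N(1)] = 3
p_c = math.exp(-mean_b) * mean_b**7 / math.factorial(7)  # Poisson pmf at 7
print(p_a, mean_b, p_c)
```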
(d) Let S(t) be the number of satellites in space at time t. What is the expected value
E[S(6)]?
——————————————————————————
Now we exploit properties of the Mt/GI/∞ model, as discussed in the paper "The Physics of
the Mt/G/∞ Queue," by Steven G. Eick, William A. Massey and Ward Whitt, Operations
Research, vol. 41, no. 4, 1993, pp. 731-742, posted on the lecture notes web page.
See especially Section 1 up to Remark 1 (about one full journal page). Theorem 1 there states
that S(t) has a Poisson distribution with mean
E[S(t)] = ∫_0^t λ(s)[1 − G(t − s)] ds .
The idea is that the arrivals together with their service times can be represented as a Poisson
random measure in the plane. The intensity at a point (s, x) in [0, ∞) × [0, ∞) is λ(s)g(x)
where g is the probability density function associated with the service-time cdf G. We actually
get the single integral above from the double integral
E[S(t)] = ∫_0^t ( ∫_{t−s}^∞ λ(s) g(x) dx ) ds .
In our case, t = 6 and g(x) = 1/2, 3 ≤ x ≤ 5, while g(x) = 0 elsewhere. Hence, G(t) = (t−3)/2,
3 ≤ t ≤ 5. So the final range of integration is as shown in Figure 2 below. We want to integrate
the intensity λ(s)g(x) = 2s × 1/2 = s over the shaded rectangle and triangle, appearing above
the 45 degree line between 1 and 6.
To explain the rectangle for 3 ≤ s ≤ 6, note that all satellites launched in the interval [3, 6]
will still be in space; the expected number of these is m(3, 6) ≡ E[N (6)]−E[N (3)] = 36−9 = 27.
Any launches before time 1 will have returned, and so need not be considered. So we need to
more carefully consider the interval [1, 3]. That is the shaded triangle in Figure 2 for 1 ≤ s ≤ 3.
To do so, write

∫_1^3 λ(s)[1 − G(6 − s)] ds = ∫_1^3 2s ((s − 1)/2) ds = ∫_1^3 (s^2 − s) ds = 14/3 .
Hence, E[S(6)] = 27 + 14/3 = 95/3.
——————————————————————————
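The value E[S(6)] = 27 + 14/3 = 95/3 can be confirmed by integrating λ(s)[1 − G(6 − s)] numerically; a midpoint-rule sketch:

```python
# Service-time cdf for uniform(3, 5).
def G(x):
    if x <= 3:
        return 0.0
    if x >= 5:
        return 1.0
    return (x - 3) / 2.0

t, n = 6.0, 200_000
ds = t / n
# Midpoint rule for the integral of lambda(s)(1 - G(6 - s)) with lambda(s) = 2s.
total = sum(2 * ((i + 0.5) * ds) * (1.0 - G(t - (i + 0.5) * ds)) * ds
            for i in range(n))
print(total, 95 / 3)
```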
(e) Let R(t) be the number of satellites that have been launched and returned to earth by
time t. What is the covariance between S(6) and R(6)?
——————————————————————————
The covariance is 0, because the random variables R(t) and S(t) are independent Poisson
random variables. To see that independence implies 0 covariance, look at pages 52 and 53 of
Ross. From the Mt /GI/∞ model, these two random variables represent the number of points
in disjoint subsets of the plane (above and below the 45 degree line), when we focus on the
Poisson process in the plane; again see the Physics paper.
Figure 2: A diagram for an infinite-server queue, showing the intensity of arrivals and service
times in the (s, x) plane.
——————————————————————————
(f) Give an expression for the joint probability P (S(6) = 7, R(6) = 8).
——————————————————————————
Since N (6) = S(6) + R(6), we have E[R(6)] = E[N (6)] − E[S(6)] = 36 − (95/3) = 13/3.
By the independence,
P(S(6) = 7, R(6) = 8) = (e^{−a} a^7 / 7!) × (e^{−b} b^8 / 8!) ,
where a = 95/3 and b = 13/3.
——————————————————————————
(g) Give an expression for the probability P(S(6) + R(6) = 15).
——————————————————————————
Since N (6) = S(6) + R(6),
P(S(6) + R(6) = 15) = P(N(6) = 15) = e^{−36} 36^{15} / 15! .
——————————————————————————
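The split into S(6) and R(6) in parts (f) and (g) is internally consistent: two independent Poisson random variables with means a = 95/3 and b = 13/3 must convolve to a Poisson with mean a + b = 36, which is E[N(6)]. A quick check:

```python
import math

def pois(mean, k):
    # Poisson pmf: P(X = k) for X ~ Poisson(mean).
    return math.exp(-mean) * mean**k / math.factorial(k)

a, b = 95 / 3, 13 / 3
joint = pois(a, 7) * pois(b, 8)                                   # part (f)
convolved = sum(pois(a, k) * pois(b, 15 - k) for k in range(16))
direct = pois(36.0, 15)                                           # part (g)
print(joint, convolved, direct)
```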
4. Ultimate Bonus Questions (6 points)
(a) Who is buried in Grant’s Tomb?
——————————————————————————
The main answer is Grant. The more refined answer is Ulysses S. Grant, Civil War general
and U.S. President from 1869 to 1877. But his wife, Julia Dent Grant, is also buried there.
See
http://www.nps.gov/gegr/index.htm
——————————————————————————
(b) Where is Grant’s Tomb?
——————————————————————————
122nd Street and Riverside Drive, about 4 blocks away from Mudd, right across the street
from the International House.
——————————————————————————
(c) Where does this question come from?
——————————————————————————
The “You Bet Your Life” quiz program hosted by Groucho Marx.
——————————————————————————