Computer Design from the Programmer's Viewpoint
... from the designer end of the spectrum; this paper will make some further remarks of introduction along the lines of the user's viewpoint and then further develop ideas of what the user expects in the way of computer design. From the standpoint of the user the principal implication of the impending r ...
GSMDPs for Multi-Robot Sequential Decision-Making
... a suitable (discrete) state and action space, the identification of the stochastic models of the system, T and F, must be carried out. At this point, it is useful to group state transitions into E, as per Definition 2. For example, for a set of identical robots, each with a state factor representing b ...
Direct Demand Models of Air Travel
... Given that the sensitivity of demand to changes in travel variables (elasticities) in logit models depends on the choice probabilities, we would have concerns about using a model which did not closely replicate the base probabilities. Moreover, this approach is limited to the sample of individuals u ...
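The dependence of logit elasticities on choice probabilities can be illustrated with a small sketch. In a multinomial logit model, the direct point elasticity of a choice probability P_i with respect to an attribute x_ik is beta_k * x_ik * (1 - P_i), so the same coefficient implies different sensitivities at different base probabilities. The coefficient, fares, and function names below are invented for illustration, not taken from the paper:

```python
import math

def logit_probs(utilities):
    """Multinomial logit choice probabilities from systematic utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def direct_elasticity(beta_k, x_ik, p_i):
    """Direct point elasticity of P_i w.r.t. attribute x_ik in an MNL model:
    beta_k * x_ik * (1 - P_i), so it depends on the base probability P_i."""
    return beta_k * x_ik * (1 - p_i)

# Hypothetical fare coefficient and fares for two alternatives.
beta_fare = -0.01
fares = [120.0, 150.0]
p_air, p_other = logit_probs([beta_fare * f for f in fares])

# The same coefficient yields different elasticities at different base shares,
# which is why a model that misses the base probabilities misstates demand response.
e_air = direct_elasticity(beta_fare, fares[0], p_air)
e_other = direct_elasticity(beta_fare, fares[1], p_other)
```

This makes concrete why replicating the base probabilities matters: the elasticity is scaled by (1 - P_i), so an error in the predicted share propagates directly into the predicted demand response.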
Data Splitting
... time series [Zhang and Berardi, 2001]. At first, a proper ordering of the dataset T has to be found. For the ordered dataset, a random starting sample is chosen and then every k-th sample is taken, with k = n/n_tr, where n = |T| and n_tr = |T_tr|. Systematic sampling is a very efficient method and it is easy to ...
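The systematic sampling step described above can be sketched in a few lines. The dataset, split sizes, seed, and function name are illustrative; the interval k = n/n_tr and the random starting sample follow the description in the text:

```python
import random

def systematic_split(ordered_data, n_tr, seed=0):
    """Systematic sampling over an ordered dataset: choose a random
    starting sample in [0, k), then take every k-th sample, k = n / n_tr."""
    n = len(ordered_data)
    k = n / n_tr                          # sampling interval
    rng = random.Random(seed)
    start = rng.uniform(0.0, k)           # random starting sample
    train_idx = [int(start + k * i) % n for i in range(n_tr)]
    chosen = set(train_idx)
    train = [ordered_data[i] for i in train_idx]
    test = [ordered_data[i] for i in range(n) if i not in chosen]
    return train, test

# With n = 100 and n_tr = 25, k = 4: every 4th sample enters the training set.
train, test = systematic_split(list(range(100)), n_tr=25)
```

Because the training indices are evenly spaced over the ordered data, both splits cover the full range of the ordering, which is the property that makes the method attractive for time series.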
Document
... • This amount falls short of the minimum required savings of $650 per week. • The savings are estimated to be $1,851.50 - $1,159.50 = $692 and exceed the minimum required savings for the additional investment from a 1000-week simulation. • This result emphasizes the importance of selecting the prope ...
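The arithmetic in the second bullet can be checked directly. Reading the two dollar figures as weekly amounts before and after the additional investment is an assumption about the truncated text, and the variable names are mine:

```python
# Assumed reading of the bullets: the two dollar figures are weekly amounts
# before and after the additional investment (interpretation of truncated text).
cost_before = 1851.50
cost_after = 1159.50
min_required = 650.00      # minimum required weekly savings, from the text

weekly_savings = cost_before - cost_after        # 1851.50 - 1159.50 = 692.00
meets_minimum = weekly_savings >= min_required   # 692 > 650
```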
Segmentation and Fitting using Probabilistic Methods
... Figure from “Color and Texture Based Image Segmentation Using EM and Its Application to Content Based Image Retrieval”, S. J. Belongie et al., Proc. Int. Conf. Computer Vision, 1998, © 1998 IEEE. Computer Vision - A Modern Approach Set: Probability in segmentation. Slides by D. A. Forsyth ...
Statistical classification is a procedure in which individual items are
... binary category, i.e. two classes P and N into which a data collection S needs to be classified, we can compute the amount of information required to determine the class, by I(p, n), the standard entropy measure, where p and n denote the cardinalities of P and N. Given an attribute A that can be use ...
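The entropy measure described above can be sketched as follows. The function names are mine; I(p, n) is the standard binary entropy over the class fractions p/(p+n) and n/(p+n), and the gain for an attribute A follows the usual ID3-style definition, which is assumed to be where the truncated sentence was heading:

```python
import math

def entropy(p, n):
    """I(p, n): expected bits needed to determine the class of an item
    drawn from a collection with p items in P and n items in N."""
    total = p + n
    info = 0.0
    for count in (p, n):
        if count:                  # treat 0 * log2(0) as 0
            frac = count / total
            info -= frac * math.log2(frac)
    return info

def information_gain(p, n, partitions):
    """Gain from splitting on an attribute A, where partitions is a list
    of (p_i, n_i) counts, one pair per value of A (ID3-style definition)."""
    total = p + n
    remainder = sum((pi + ni) / total * entropy(pi, ni)
                    for pi, ni in partitions)
    return entropy(p, n) - remainder
```

For example, entropy(7, 7) is 1.0 bit (maximal uncertainty between the two classes), while entropy(4, 0) is 0.0 because the class is already determined; a split that leaves the class mix unchanged yields zero gain.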
Computer simulation
A computer simulation is a simulation, run on a single computer or a network of computers, to reproduce the behavior of a system. The simulation uses an abstract model (a computer model, or a computational model) to simulate the system. Computer simulations have become a useful part of mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, and biology; of human systems in economics, psychology, and social science; and of systems in engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.

Computer simulations vary from computer programs that run a few minutes, to network-based groups of computers running for hours, to ongoing simulations that run for days. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. Over 10 years ago, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks, and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. Other examples include a 1-billion-atom model of material deformation; a 2.64-million-atom model of the complex protein-maker of all organisms, the ribosome, in 2005; a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level.

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification.