Elements of Applied Stochastic Processes
J. Wiley, 1972 - 414 pages

Stochastic processes: description and definition; The two-state Markov process; Markov chains: classification of states; Finite Markov chains; Markov chains with countably infinite states; Simple Markov processes; Renewal processes; Stationary processes: some general properties; Markov decision processes; Congestion processes; Stochastic processes in reliability theory; Time series analysis; Social and behavioral processes; Some Markov models in business and sports.
Contents
INTRODUCTION  1
DESCRIPTION  7
THE TWO-STATE MARKOV PROCESS  18
Copyright
18 other sections not shown
Common terms and phrases
analysis, arrival, assume, assumption, behavior, birth and death process, called, Chapter, clearly, components, consider, convergence, critical region, death process, defined, DEFINITION, denoted, derived, discrete, discussed, distribution function, distribution with mean, eigenvalues, elements, estimates, Example, expected number, failure distribution, finite Markov chain, given, gives, hence, independent, interval, Laplace transforms, likelihood function, limiting distribution, Markov chain, Markov process, mathematical method, n₁, negative exponential distribution, normal distribution, number of customers, o(Δt), obtained, occur, operation, P₁, P₁(t), parameter, period, Pₙ(t), P₀(t), Poisson process, population, probability density function, probability distribution, probability theory, problem, process X(t), properties, queue length, queueing system, random variables, recurrent, reliability, sample space, Section, server, stationary, statistic, stochastic process, suppose, Theorem, traffic, transient, transition probability matrix, two-state Markov, variance, vector, Wiley, write, X₁, zero, μ, λ, ξᵢ, π₀, Σₙ