Markov processes for stochastic modeling
Saved in:
Main Author: | Ibe, Oliver C. 1947- |
---|---|
Format: | Book |
Language: | English |
Published: | Amsterdam [u.a.] : Elsevier, 2009 |
Subjects: | Markov processes; Stochastic processes; Stochastisches Modell; Markov-Prozess |
Online Access: | Table of contents |
Description: | Includes bibliographical references (p. 451-469) and index |
Physical Description: | XIV, 490 S. Ill., graph. Darst. |
ISBN: | 9780123744517 0123744512 |
Internal format
MARC
LEADER | 00000nam a2200000zc 4500 | ||
---|---|---|---|
001 | BV035173736 | ||
003 | DE-604 | ||
005 | 20090603 | ||
007 | t | ||
008 | 081121s2009 ne ad|| |||| 00||| eng d | ||
010 | |a 2008021501 | ||
020 | |a 9780123744517 |c hardcover : alk. paper |9 978-0-12-374451-7 | ||
020 | |a 0123744512 |c hardcover : alk. paper |9 0-12-374451-2 | ||
035 | |a (OCoLC)633914106 | ||
035 | |a (DE-599)HBZHT015735357 | ||
040 | |a DE-604 |b ger |e aacr | ||
041 | 0 | |a eng | |
044 | |a ne |c NL | ||
049 | |a DE-703 |a DE-824 |a DE-19 | ||
050 | 0 | |a QA274.7 | |
082 | 0 | |a 519.2/33 | |
084 | |a SK 820 |0 (DE-625)143258: |2 rvk | ||
100 | 1 | |a Ibe, Oliver C. |d 1947- |e Verfasser |0 (DE-588)136641784 |4 aut | |
245 | 1 | 0 | |a Markov processes for stochastic modeling |c Oliver C. Ibe |
264 | 1 | |a Amsterdam [u.a.] |b Elsevier |c 2009 | |
300 | |a XIV, 490 S. |b Ill., graph. Darst. | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
500 | |a Includes bibliographical references (p. 451-469) and index | ||
650 | 4 | |a Markov processes | |
650 | 4 | |a Stochastic processes | |
650 | 0 | 7 | |a Stochastisches Modell |0 (DE-588)4057633-4 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Markov-Prozess |0 (DE-588)4134948-9 |2 gnd |9 rswk-swf |
689 | 0 | 0 | |a Markov-Prozess |0 (DE-588)4134948-9 |D s |
689 | 0 | 1 | |a Stochastisches Modell |0 (DE-588)4057633-4 |D s |
689 | 0 | |8 1\p |5 DE-604 | |
856 | 4 | 2 | |m Digitalisierung UB Bayreuth |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016980627&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-016980627 | ||
883 | 1 | |8 1\p |a cgwrk |d 20201028 |q DE-101 |u https://d-nb.info/provenance/plan#cgwrk |
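The MARC fields above are also serialized as MARCXML in the `fullrecord` field further down. As a minimal sketch of how such a record can be read programmatically, using only Python's standard library (the XML snippet is a trimmed excerpt of this record, and the `subfields` helper is illustrative, not part of any catalog API):

```python
# Sketch: pull the title (field 245) and ISBN (field 020) out of a
# MARCXML record with xml.etree.ElementTree. The snippet is a trimmed
# excerpt of the record shown above.
import xml.etree.ElementTree as ET

MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="020" ind1=" " ind2=" ">
    <subfield code="a">9780123744517</subfield>
  </datafield>
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">Markov processes for stochastic modeling</subfield>
    <subfield code="c">Oliver C. Ibe</subfield>
  </datafield>
</record>"""

NS = {"m": "http://www.loc.gov/MARC21/slim"}

def subfields(root, tag, code):
    """Return every value of subfield `code` in datafields with `tag`."""
    return [
        sf.text
        for df in root.findall(f"m:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"m:subfield[@code='{code}']", NS)
    ]

root = ET.fromstring(MARCXML)
title = subfields(root, "245", "a")[0]
isbns = subfields(root, "020", "a")
print(title)   # Markov processes for stochastic modeling
print(isbns)   # ['9780123744517']
```

The namespace map is required because MARCXML places all elements in the `http://www.loc.gov/MARC21/slim` namespace; without it, `findall` matches nothing.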
Record in the search index
_version_ | 1804138343126007808 |
---|---|
adam_text | Contents
Preface xiii
Acknowledgments xiv
1. Basic Concepts 1
1.1. Review of Probability 1
1.1.1. Conditional Probability 4
1.1.2. Independence 4
1.1.3. Total Probability and the Bayes' Theorem 5
1.2. Random Variables 6
1.2.1. Distribution Functions 6
1.2.2. Discrete Random Variables 6
1.2.3. Continuous Random Variables 7
1.2.4. Expectations 8
1.2.5. Expectation of Nonnegative Random Variables 8
1.2.6. Moments of Random Variables and the Variance 8
1.3. Transform Methods 9
1.3.1. The s-Transform 9
1.3.2. The z-Transform 10
1.4. Bivariate Random Variables 12
1.4.1. Discrete Bivariate Random Variables 13
1.4.2. Continuous Bivariate Random Variables 13
1.4.3. Covariance and Correlation Coefficient 14
1.5. Many Random Variables 14
1.6. Fubini's Theorem 15
1.7. Sums of Independent Random Variables 16
1.8. Some Probability Distributions 18
1.8.1. The Bernoulli Distribution 18
1.8.2. The Binomial Distribution 19
1.8.3. The Geometric Distribution 20
1.8.4. The Pascal Distribution 20
1.8.5. The Poisson Distribution 21
1.8.6. The Exponential Distribution 21
1.8.7. The Erlang Distribution 22
1.8.8. Normal Distribution 23
1.9. Introduction to Stochastic Processes 24
1.10. Classification of Stochastic Processes 25
1.11. Characterizing a Stochastic Process 25
1.11.1. Mean and Autocorrelation Function of a Stochastic Process 26
1.12. Stationary Stochastic Processes 27
1.12.1. Strict-Sense Stationary Processes 27
1.12.2. Wide-Sense Stationary Processes 28
1.13. Ergodic Stochastic Processes 28
1.14. Some Models of Stochastic Processes 29
1.14.1. Martingales 29
1.14.2. Counting Processes 32
1.14.3. Independent Increment Processes 32
1.14.4. Stationary Increment Process 33
1.14.5. Poisson Processes 33
1.15. Problems 38
2. Introduction to Markov Processes 45
2.1. Introduction 45
2.2. Structure of Markov Processes 46
2.3. Strong Markov Property 48
2.4. Applications of Discrete-Time Markov Processes 49
2.4.1. Branching Processes 49
2.4.2. Social Mobility 50
2.4.3. Markov Decision Processes 50
2.5. Applications of Continuous-Time Markov Processes 50
2.5.1. Queueing Systems 50
2.5.2. Continuous-Time Markov Decision Processes 51
2.5.3. Stochastic Storage Systems 51
2.6. Applications of Continuous-State Markov Processes 52
2.6.1. Application of Diffusion Processes to Financial Options 52
2.6.2. Applications of Brownian Motion 52
3. Discrete-Time Markov Chains 55
3.1. Introduction 55
3.2. State Transition Probability Matrix 56
3.2.1. The n-Step State Transition Probability 56
3.3. State Transition Diagrams 58
3.4. Classification of States 59
3.5. Limiting-State Probabilities 61
3.5.1. Doubly Stochastic Matrix 65
3.6. Sojourn Time 66
3.7. Transient Analysis of Discrete-Time Markov Chains 67
3.8. First Passage and Recurrence Times 69
3.9. Occupancy Times 72
3.10. Absorbing Markov Chains and the Fundamental Matrix 73
3.10.1. Time to Absorption 74
3.10.2. Absorption Probabilities 77
3.11. Reversible Markov Chains 78
3.12. Problems 79
4. Continuous-Time Markov Chains 83
4.1. Introduction 83
4.2. Transient Analysis 86
4.3. Birth and Death Processes 90
4.3.1. Local Balance Equations 94
4.3.2. Transient Analysis of Birth and Death Processes 95
4.4. First Passage Time 96
4.5. The Uniformization Method 98
4.6. Reversible Continuous-Time Markov Chains 99
4.7. Problems 99
5. Markovian Queueing Systems 105
5.1. Introduction 105
5.2. Description of a Queueing System 105
5.3. The Kendall Notation 108
5.4. The Little's Formula 109
5.5. The PASTA Property 110
5.6. The M/M/1 Queueing System 110
5.6.1. Stochastic Balance 114
5.6.2. Total Time and Waiting Time Distributions of the M/M/1 Queueing System 114
5.7. Examples of Other M/M Queueing Systems 117
5.7.1. The M/M/c Queue: The c-Server System 118
5.7.2. The M/M/1/K Queue: The Single-Server Finite-Capacity System 121
5.7.3. The M/M/c/c Queue: The c-Server Loss System 126
5.7.4. The M/M/1//K Queue: The Single-Server Finite-Customer Population System 128
5.8. M/G/1 Queue 130
5.8.1. Waiting Time Distribution of the M/G/1 Queue 132
5.8.2. The M/Ek/1 Queue 135
5.8.3. The M/D/1 Queue 137
5.8.4. The M/M/1 Queue Revisited 138
5.8.5. The M/Hk/1 Queue 138
5.9. G/M/1 Queue 140
5.9.1. The Ek/M/1 Queue 144
5.9.2. The D/M/1 Queue 145
5.9.3. The Hk/M/1 Queue 146
5.10. Applications of Markovian Queues 147
5.11. Problems 148
6. Markov Renewal Processes 153
6.1. Introduction 153
6.2. The Renewal Equation 154
6.2.1. Alternative Approach 156
6.3. The Elementary Renewal Theorem 158
6.4. Random Incidence and Residual Time 159
6.5. Markov Renewal Process 161
6.5.1. The Markov Renewal Function 162
6.6. Semi-Markov Processes 164
6.6.1. Discrete-Time Semi-Markov Processes 165
6.6.2. Continuous-Time Semi-Markov Processes 170
6.7. Markov Jump Processes 175
6.7.1. The Homogeneous Markov Jump Process 178
6.8. Problems 181
7. Markovian Arrival Processes 185
7.1. Introduction 185
7.2. Overview of Matrix-Analytic Methods 186
7.3. Markovian Arrival Process 191
7.3.1. Properties of MAP 194
7.4. Batch Markovian Arrival Process 196
7.4.1. Properties of BMAP 199
7.5. Markov-Modulated Poisson Process 200
7.5.1. The Interrupted Poisson Process 201
7.5.2. The Switched Poisson Process 203
7.5.3. Properties of MMPP 203
7.5.4. The MMPP(2)/M/1 Queue 205
7.6. Markov-Modulated Bernoulli Process 209
7.6.1. The MMBP(2) 210
7.7. Sample Applications of MAP and Its Derivatives 212
7.8. Problems 213
8. Random Walk 215
8.1. Introduction 215
8.2. The Two-Dimensional Random Walk 217
8.3. Random Walk as a Markov Chain 218
8.4. Symmetric Random Walk as a Martingale 219
8.5. Random Walk with Barriers 219
8.6. Gambler's Ruin 220
8.6.1. Ruin Probability 220
8.6.2. Duration of a Game 222
8.7. First Return Times 224
8.8. First Passage Times 227
8.9. Maximum of a Random Walk 229
8.10. Random Walk on a Graph 231
8.10.1. Random Walk on a Weighted Graph 236
8.11. Markov Random Walk 237
8.11.1. Markov Random Walk in Semisupervised Machine Learning 239
8.12. Random Walk with Correlation 240
8.13. Continuous-Time Random Walk 246
8.13.1. The Master Equation 248
8.14. Sample Applications of Random Walk 251
8.14.1. The Ballot Problem 251
8.14.2. Web Search 254
8.14.3. Mobility Models in Mobile Networks 255
8.14.4. Insurance Risk 257
8.14.5. Content of a Dam 257
8.14.6. Cash Management 258
8.15. Problems 258
9. Brownian Motion and Diffusion Processes 263
9.1. Introduction 263
9.2. Brownian Motion 263
9.2.1. Brownian Motion with Drift 265
9.2.2. Brownian Motion as a Markov Process 265
9.2.3. Brownian Motion as a Martingale 266
9.2.4. First Passage Time of a Brownian Motion 266
9.2.5. Maximum of a Brownian Motion 268
9.2.6. First Passage Time in an Interval 269
9.2.7. The Brownian Bridge 270
9.3. Introduction to Stochastic Calculus 271
9.3.1. The Ito Integral 271
9.3.2. The Stochastic Differential 273
9.3.3. Ito's Formula 273
9.3.4. Stochastic Differential Equations 274
9.4. Geometric Brownian Motion 274
9.5. Fractional Brownian Motion 276
9.6. Application of Brownian Motion to Option Pricing 277
9.7. Random Walk Approximation of Brownian Motion 280
9.8. The Ornstein-Uhlenbeck Process 281
9.8.1. Mean Reverting Ornstein-Uhlenbeck Process 285
9.8.2. Applications of the Ornstein-Uhlenbeck Process 286
9.9. Diffusion Processes 287
9.10. Examples of Diffusion Processes 289
9.10.1. Brownian Motion 289
9.10.2. Brownian Motion with Drift 291
9.10.3. Levy Processes 292
9.11. Relationship Between the Diffusion Process and Random Walk 294
9.12. Problems 295
10. Controlled Markov Processes 297
10.1. Introduction 297
10.2. Markov Decision Processes 297
10.2.1. Overview of Dynamic Programming 299
10.2.2. Markov Reward Processes 302
10.2.3. MDP Basics 304
10.2.4. MDPs with Discounting 306
10.2.5. Solution Methods 307
10.3. Semi-Markov Decision Processes 317
10.3.1. Semi-Markov Reward Model 318
10.3.2. Discounted Reward 320
10.3.3. Analysis of the Continuous-Decision-Interval SMDPs 321
10.3.4. Solution by Policy Iteration 323
10.3.5. SMDP with Discounting 325
10.3.6. Solution by Policy Iteration when Discounting Is Used 326
10.3.7. Analysis of the Discrete-Decision-Interval SMDPs with Discounting 328
10.3.8. Continuous-Time Markov Decision Processes 328
10.3.9. Applications of Semi-Markov Decision Processes 329
10.4. Partially Observable Markov Decision Processes 330
10.4.1. Partially Observable Markov Processes 332
10.4.2. POMDP Basics 334
10.4.3. Solving POMDPs 337
10.4.4. Computing the Optimal Policy 338
10.4.5. Approximate Solutions of POMDP 338
10.5. Problems 339
11. Hidden Markov Models 341
11.1. Introduction 341
11.2. HMM Basics 343
11.3. HMM Assumptions 345
11.4. Three Fundamental Problems 346
11.5. Solution Methods 347
11.5.1. The Evaluation Problem 347
11.5.2. The Decoding Problem and the Viterbi Algorithm 356
11.5.3. The Learning Problem and the Baum-Welch Algorithm 362
11.6. Types of Hidden Markov Models 365
11.7. Hidden Markov Models with Silent States 366
11.8. Extensions of Hidden Markov Models 366
11.8.1. Hierarchical Hidden Markov Model 367
11.8.2. Factorial Hidden Markov Model 368
11.8.3. Coupled Hidden Markov Model 369
11.8.4. Hidden Semi-Markov Models 370
11.8.5. Profile HMMs for Biological Sequence Analysis 371
11.9. Other Extensions of HMM 375
11.10. Problems 376
12. Markov Random Fields 381
12.1. Introduction 381
12.2. Markov Random Fields 382
12.2.1. Graphical Representation 386
12.2.2. Gibbs Random Fields and the Hammersley-Clifford Theorem 388
12.3. Examples of Markov Random Fields 390
12.3.1. The Ising Model 390
12.3.2. The Potts Model 392
12.3.3. Gauss-Markov Random Fields 393
12.4. Hidden Markov Random Fields 394
12.5. Applications of Markov Random Fields 397
12.6. Problems 398
13. Markov Point Processes 401
13.1. Introduction 401
13.2. Temporal Point Processes 402
13.2.1. Specific Temporal Point Processes 404
13.3. Spatial Point Processes 405
13.3.1. Specific Spatial Point Processes 407
13.4. Spatial-Temporal Point Processes 410
13.5. Operations on Point Processes 413
13.5.1. Thinning 413
13.5.2. Superposition 414
13.5.3. Clustering 414
13.6. Marked Point Processes 415
13.7. Markov Point Processes 416
13.8. Markov Marked Point Processes 419
13.9. Applications of Markov Point Processes 420
13.10. Problems 420
14. Markov Chain Monte Carlo 423
14.1. Introduction 423
14.2. Monte Carlo Simulation Basics 424
14.2.1. Generating Random Variables from the Random Numbers 426
14.2.2. Monte Carlo Integration 429
14.3. Markov Chains Revisited 431
14.4. MCMC Simulation 433
14.4.1. The Metropolis-Hastings Algorithm 434
14.4.2. Gibbs Sampling 436
14.5. Applications of MCMC 442
14.5.1. Simulated Annealing 442
14.5.2. Inference in Belief Networks 445
14.6. Choice of Method 447
14.7. Problems 448
References 451
Index 471 |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Ibe, Oliver C. 1947- |
author_GND | (DE-588)136641784 |
author_facet | Ibe, Oliver C. 1947- |
author_role | aut |
author_sort | Ibe, Oliver C. 1947- |
author_variant | o c i oc oci |
building | Verbundindex |
bvnumber | BV035173736 |
callnumber-first | Q - Science |
callnumber-label | QA274 |
callnumber-raw | QA274.7 |
callnumber-search | QA274.7 |
callnumber-sort | QA 3274.7 |
callnumber-subject | QA - Mathematics |
classification_rvk | SK 820 |
ctrlnum | (OCoLC)633914106 (DE-599)HBZHT015735357 |
dewey-full | 519.2/33 |
dewey-hundreds | 500 - Natural sciences and mathematics |
dewey-ones | 519 - Probabilities and applied mathematics |
dewey-raw | 519.2/33 |
dewey-search | 519.2/33 |
dewey-sort | 3519.2 233 |
dewey-tens | 510 - Mathematics |
discipline | Mathematik |
discipline_str_mv | Mathematik |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01762nam a2200445zc 4500</leader><controlfield tag="001">BV035173736</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20090603 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">081121s2009 ne ad|| |||| 00||| eng d</controlfield><datafield tag="010" ind1=" " ind2=" "><subfield code="a">2008021501</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780123744517</subfield><subfield code="c">hardcover : alk. paper</subfield><subfield code="9">978-0-12-374451-7</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">0123744512</subfield><subfield code="c">hardcover : alk. paper</subfield><subfield code="9">0-12-374451-2</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)633914106</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)HBZHT015735357</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">aacr</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">ne</subfield><subfield code="c">NL</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-703</subfield><subfield code="a">DE-824</subfield><subfield code="a">DE-19</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA274.7</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">519.2/33</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">SK 820</subfield><subfield code="0">(DE-625)143258:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Ibe, 
Oliver C.</subfield><subfield code="d">1947-</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)136641784</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Markov processes for stochastic modeling</subfield><subfield code="c">Oliver C. Ibe</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Amsterdam [u.a.]</subfield><subfield code="b">Elsevier</subfield><subfield code="c">2009</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XIV, 490 S.</subfield><subfield code="b">Ill., graph. Darst.</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Includes bibliographical references (p. 
451-469) and index</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Markov processes</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Stochastic processes</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Stochastisches Modell</subfield><subfield code="0">(DE-588)4057633-4</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Markov-Prozess</subfield><subfield code="0">(DE-588)4134948-9</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Markov-Prozess</subfield><subfield code="0">(DE-588)4134948-9</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Stochastisches Modell</subfield><subfield code="0">(DE-588)4057633-4</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="8">1\p</subfield><subfield code="5">DE-604</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Bayreuth</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016980627&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-016980627</subfield></datafield><datafield tag="883" ind1="1" ind2=" "><subfield code="8">1\p</subfield><subfield code="a">cgwrk</subfield><subfield code="d">20201028</subfield><subfield code="q">DE-101</subfield><subfield code="u">https://d-nb.info/provenance/plan#cgwrk</subfield></datafield></record></collection> |
id | DE-604.BV035173736 |
illustrated | Illustrated |
index_date | 2024-07-02T22:55:17Z |
indexdate | 2024-07-09T21:26:41Z |
institution | BVB |
isbn | 9780123744517 0123744512 |
language | English |
lccn | 2008021501 |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-016980627 |
oclc_num | 633914106 |
open_access_boolean | |
owner | DE-703 DE-824 DE-19 DE-BY-UBM |
owner_facet | DE-703 DE-824 DE-19 DE-BY-UBM |
physical | XIV, 490 S. Ill., graph. Darst. |
publishDate | 2009 |
publishDateSearch | 2009 |
publishDateSort | 2009 |
publisher | Elsevier |
record_format | marc |
spelling | Ibe, Oliver C. 1947- Verfasser (DE-588)136641784 aut Markov processes for stochastic modeling Oliver C. Ibe Amsterdam [u.a.] Elsevier 2009 XIV, 490 S. Ill., graph. Darst. txt rdacontent n rdamedia nc rdacarrier Includes bibliographical references (p. 451-469) and index Markov processes Stochastic processes Stochastisches Modell (DE-588)4057633-4 gnd rswk-swf Markov-Prozess (DE-588)4134948-9 gnd rswk-swf Markov-Prozess (DE-588)4134948-9 s Stochastisches Modell (DE-588)4057633-4 s 1\p DE-604 Digitalisierung UB Bayreuth application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016980627&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis 1\p cgwrk 20201028 DE-101 https://d-nb.info/provenance/plan#cgwrk |
spellingShingle | Ibe, Oliver C. 1947- Markov processes for stochastic modeling Markov processes Stochastic processes Stochastisches Modell (DE-588)4057633-4 gnd Markov-Prozess (DE-588)4134948-9 gnd |
subject_GND | (DE-588)4057633-4 (DE-588)4134948-9 |
title | Markov processes for stochastic modeling |
title_auth | Markov processes for stochastic modeling |
title_exact_search | Markov processes for stochastic modeling |
title_exact_search_txtP | Markov processes for stochastic modeling |
title_full | Markov processes for stochastic modeling Oliver C. Ibe |
title_fullStr | Markov processes for stochastic modeling Oliver C. Ibe |
title_full_unstemmed | Markov processes for stochastic modeling Oliver C. Ibe |
title_short | Markov processes for stochastic modeling |
title_sort | markov processes for stochastic modeling |
topic | Markov processes Stochastic processes Stochastisches Modell (DE-588)4057633-4 gnd Markov-Prozess (DE-588)4134948-9 gnd |
topic_facet | Markov processes Stochastic processes Stochastisches Modell Markov-Prozess |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016980627&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT ibeoliverc markovprocessesforstochasticmodeling |