Playing a central role in the theory of probability, the Wiener process is often considered the most important and most studied stochastic process, with connections to many other stochastic processes. Almost surely, a sample path of a Wiener process is continuous everywhere but nowhere differentiable.
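This connection to the random walk can be made concrete: by Donsker's theorem, a simple symmetric random walk, suitably rescaled, approximates a standard Wiener process. A minimal sketch assuming NumPy is available; the function name is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_random_walk(n_steps, t_max=1.0):
    # +1/-1 steps of a simple symmetric random walk.
    steps = rng.choice([-1.0, 1.0], size=n_steps)
    dt = t_max / n_steps
    # Rescale each step by sqrt(dt) so the path's variance at time t_max
    # equals t_max, matching a standard Wiener process.
    path = np.concatenate(([0.0], np.cumsum(steps) * np.sqrt(dt)))
    times = np.linspace(0.0, t_max, n_steps + 1)
    return times, path

times, path = scaled_random_walk(10_000)
# The terminal value path[-1] is approximately N(0, 1) distributed.
```

As n_steps grows, the piecewise path converges in distribution to a Wiener process on [0, t_max].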
It can be considered as a continuous version of the simple random walk. The Poisson process is a stochastic process that has different forms and definitions. The number of points of the process that are located in the interval from zero to some given time is a Poisson random variable that depends on that time and some parameter.
This process has the natural numbers as its state space and the non-negative numbers as its index set. This process is also called the Poisson counting process, since it can be interpreted as an example of a counting process. If a Poisson process is defined with a single positive constant, then the process is called a homogeneous Poisson process. The homogeneous Poisson process can be defined and generalized in different ways. It can be defined such that its index set is the real line, and this stochastic process is also called the stationary Poisson process.
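A homogeneous Poisson process with a given rate can be simulated directly from its i.i.d. exponential interarrival times. A small sketch assuming NumPy; names and parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_process_times(rate, t_max):
    # Arrival times of a homogeneous Poisson process on [0, t_max],
    # built from i.i.d. Exponential(rate) interarrival times.
    times = []
    t = rng.exponential(1.0 / rate)
    while t < t_max:
        times.append(t)
        t += rng.exponential(1.0 / rate)
    return np.array(times)

arrivals = poisson_process_times(rate=2.0, t_max=100.0)
count = len(arrivals)  # N(100) is Poisson with mean rate * t_max = 200
```

The count of arrivals in any interval of length L is then Poisson distributed with mean rate * L, which is exactly the defining property above.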
Defined on the real line, the Poisson process can be interpreted as a stochastic process, among other random objects. There are other ways to consider a stochastic process, with the above definition being considered the traditional one. The state space is defined using elements that reflect the different values that the stochastic process can take. A sample function is a single outcome of a stochastic process, so it is formed by taking a single possible value of each random variable of the stochastic process. An increment of a stochastic process is the difference between two random variables of the same stochastic process.
For a stochastic process with an index set that can be interpreted as time, an increment is how much the stochastic process changes over a certain time period. The law of a stochastic process or a random variable is also called the probability law, probability distribution, or the distribution. The finite-dimensional distributions of a stochastic process satisfy two mathematical conditions known as consistency conditions.
Stationarity is a mathematical property that a stochastic process has when all the random variables of that stochastic process are identically distributed. The index set of a stationary stochastic process is usually interpreted as time, so it can be the integers or the real line.
This type of stochastic process can be used to describe a physical system that is in steady state, but still experiences random fluctuations. A stochastic process with the above definition of stationarity is sometimes said to be strictly stationary, but there are other forms of stationarity. A filtration is an increasing sequence of sigma-algebras defined in relation to some probability space and an index set that has some total order relation, such as in the case of the index set being some subset of the real numbers.
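The notion of strict stationarity described above can be illustrated numerically with a Gaussian AR(1) process started from its stationary distribution, so every marginal has the same law. A sketch assuming NumPy; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def stationary_ar1(phi, sigma, n):
    # Gaussian AR(1): X[k] = phi * X[k-1] + sigma * eps[k].
    # Starting X[0] from the stationary law N(0, sigma^2 / (1 - phi^2))
    # makes every X[k] identically distributed (strict stationarity).
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for k in range(1, n):
        x[k] = phi * x[k - 1] + sigma * rng.normal()
    return x

# Estimate the marginal variance at each time index from many sample paths;
# it should be flat and close to sigma^2 / (1 - phi^2) = 4/3.
paths = np.array([stationary_ar1(0.5, 1.0, 50) for _ in range(5000)])
variances = paths.var(axis=0)
```

If the chain were instead started from a fixed point, the early marginals would differ from the later ones and the process would not be stationary.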
A modification of a stochastic process is another stochastic process, which is closely related to the original stochastic process. Two stochastic processes that are modifications of each other have the same law, and they are said to be stochastically equivalent or equivalent. Instead of modification, the term version is also used; however, some authors use the term version when two stochastic processes have the same finite-dimensional distributions but may be defined on different probability spaces. In that sense, two processes that are modifications of each other are also versions of each other, but not conversely.
If a continuous-time real-valued stochastic process meets certain moment conditions on its increments, then the Kolmogorov continuity theorem says that there exists a modification of this process that has continuous sample paths with probability one, so the stochastic process has a continuous modification or version. Separability is a property of a stochastic process based on its index set in relation to the probability measure. The property is assumed so that functionals of stochastic processes or random fields with uncountable index sets can form random variables.
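The moment condition in the Kolmogorov continuity theorem mentioned above can be stated explicitly. One common form is that there exist constants \(\alpha, \beta, C > 0\) such that

```latex
\mathbb{E}\left[\,|X_t - X_s|^{\alpha}\,\right] \le C\,|t - s|^{1+\beta}
\qquad \text{for all } s, t \ge 0,
```

in which case the process has a modification whose sample paths are, with probability one, Hölder continuous of every order \(\gamma < \beta/\alpha\).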
For a stochastic process to be separable, in addition to other conditions, its index set must be a separable space,[b] which means that the index set has a dense countable subset. The concept of separability of a stochastic process was introduced by Joseph Doob, where the underlying idea is to make a countable set of points of the index set determine the properties of the stochastic process.
Skorokhod function spaces are frequently used in the theory of stochastic processes because it is often assumed that the sample functions of continuous-time stochastic processes belong to a Skorokhod space. But the space also contains functions with discontinuities, which means that the sample functions of stochastic processes with jumps, such as the Poisson process on the real line, are also members of this space.
In the context of mathematical construction of stochastic processes, the term regularity is used when discussing and assuming certain conditions for a stochastic process to resolve possible construction issues.
Markov processes are stochastic processes, traditionally in discrete or continuous time, that have the Markov property, which means the next value of the Markov process depends on the current value, but it is conditionally independent of the previous values of the stochastic process. In other words, the behavior of the process in the future is stochastically independent of its behavior in the past, given the current state of the process. The Brownian motion process and the Poisson process in one dimension are both examples of Markov processes in continuous time, while random walks on the integers and the gambler's ruin problem are examples of Markov processes in discrete time.
A Markov chain is a type of Markov process that has either a discrete state space or a discrete index set (often representing time), but the precise definition of a Markov chain varies. Markov processes form an important class of stochastic processes and have applications in many areas.
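The gambler's ruin problem mentioned above is a simple discrete-time Markov chain and is easy to simulate; the next state depends only on the current fortune. A sketch assuming NumPy, with illustrative names and parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

def gamblers_ruin(start, target, p=0.5):
    # Markov chain on {0, 1, ..., target}: the fortune moves up 1 with
    # probability p and down 1 otherwise, stopping at 0 (ruin) or at target.
    x = start
    while 0 < x < target:
        x += 1 if rng.random() < p else -1
    return x

# For a fair game (p = 0.5), P(reach target before ruin) = start / target.
n_sims = 20_000
wins = sum(gamblers_ruin(3, 10) == 10 for _ in range(n_sims))
win_rate = wins / n_sims  # expected to be near 3/10
```

The classical closed-form answer, start / target for a fair game, gives a check on the simulation.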
A martingale is a discrete-time or continuous-time stochastic process with the property that, at every instant, given the current value and all the past values of the process, the conditional expectation of every future value is equal to the current value. In discrete time, if this property holds for the next value, then it holds for all future values.
Statistical Inference for Stochastic Processes is an international journal publishing articles on parametric and nonparametric inference for discrete- and. Statistical Inference for Stochastic Processes. Theory and Methods. Book • Authors: ISHWAR V. BASAWA and B.L.S. PRAKASA RAO. Browse book.
The exact mathematical definition of a martingale requires two other conditions coupled with the mathematical concept of a filtration, which is related to the intuition of increasing available information as time passes. Martingales are usually defined to be real-valued, but they can also be complex-valued or even more general.
A symmetric random walk and a Wiener process with zero drift are both examples of martingales, respectively, in discrete and continuous time. Martingales can also be created from stochastic processes by applying some suitable transformations, which is the case for the homogeneous Poisson process on the real line resulting in a martingale called the compensated Poisson process.
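The compensated Poisson process M(t) = N(t) - rate * t, where N is a homogeneous Poisson process, is a martingale, so its mean stays at M(0) = 0 for every t, while its variance equals rate * t. A small numerical check assuming NumPy; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Compensated Poisson process: M(t) = N(t) - rate * t, where N is a
# homogeneous Poisson process. Since M is a martingale, E[M(t)] = 0
# for every t; its variance equals rate * t.
rate, t = 3.0, 5.0
m_samples = rng.poisson(rate * t, size=100_000) - rate * t
mean_m = m_samples.mean()  # close to 0
var_m = m_samples.var()    # close to rate * t = 15
```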
Martingales mathematically formalize the idea of a fair game, and they were originally developed to show that it is not possible to win a fair game. Martingales have many applications in statistics, but it has been remarked that their use and application are not as widespread as they could be in the field of statistics, particularly statistical inference.
In general, a random field can be considered an example of a stochastic or random process, where the index set is not necessarily a subset of the real line. Sometimes the term point process is not preferred, as historically the word process denoted an evolution of some system in time, so a point process is also called a random point field. Probability theory has its origins in games of chance, which have a long history, with some games being played thousands of years ago, but very little analysis of them was done in terms of probability. After Cardano, Jakob Bernoulli[e] wrote Ars Conjectandi, which is considered a significant event in the history of probability theory.
In the physical sciences, scientists in the 19th century developed the discipline of statistical mechanics, where physical systems, such as containers filled with gases, can be regarded or treated mathematically as collections of many moving particles. Although there were attempts to incorporate randomness into statistical physics by some scientists, such as Rudolf Clausius, most of the work had little or no randomness.
At the International Congress of Mathematicians in Paris in 1900, David Hilbert presented a list of mathematical problems, where his sixth problem asked for a mathematical treatment of physics and probability involving axioms. In the 1930s, fundamental contributions to probability theory were made in the Soviet Union by mathematicians such as Sergei Bernstein, Aleksandr Khinchin,[g] and Andrei Kolmogorov. In 1933 Andrei Kolmogorov published in German his book on the foundations of probability theory, titled Grundbegriffe der Wahrscheinlichkeitsrechnung,[i] where Kolmogorov used measure theory to develop an axiomatic framework for probability theory.
The publication of this book is now widely considered to be the birth of modern probability theory, when the theories of probability and stochastic processes became parts of mathematics. After World War II the study of probability theory and stochastic processes gained more attention from mathematicians, with significant contributions made in many areas of probability and mathematics as well as the creation of new areas.
Also starting in the 1940s, connections were made between stochastic processes, particularly martingales, and the mathematical field of potential theory, with early ideas by Shizuo Kakutani and then later work by Joseph Doob. In 1953 Doob published his book Stochastic Processes, which had a strong influence on the theory of stochastic processes and stressed the importance of measure theory in probability. Techniques and theory were developed to study Markov processes and then applied to martingales.
Conversely, methods from the theory of martingales were established to treat Markov processes. Other fields of probability were developed and used to study stochastic processes, with one main approach being the theory of large deviations. The theory of stochastic processes still continues to be a focus of research, with yearly international conferences on the topic of stochastic processes. Although Khinchin gave mathematical definitions of stochastic processes in the 1930s, specific stochastic processes had already been discovered in different settings, such as the Brownian motion process and the Poisson process.
The Bernoulli process, which can serve as a mathematical model for flipping a biased coin, is possibly the first stochastic process to have been studied. In 1905 Karl Pearson coined the term random walk while posing a problem describing a random walk on the plane, which was motivated by an application in biology, but such problems involving random walks had already been studied in other fields.
Certain gambling problems that were studied centuries earlier can be considered as problems involving random walks. The Wiener process or Brownian motion process has its origins in different fields including statistics, finance and physics. The Danish astronomer Thorvald Thiele studied such a process in a paper on the method of least squares, but it is thought that the ideas in Thiele's paper were too advanced to have been understood by the broader mathematical and statistical community at the time. The French mathematician Louis Bachelier used a Wiener process in his thesis in order to model price changes on the Paris Bourse, a stock exchange, without knowing the work of Thiele.
It is commonly thought that Bachelier's work gained little attention and was forgotten for decades until it was rediscovered in the 1950s by Leonard Savage, and then became more popular after Bachelier's thesis was translated into English in 1964. But the work was never forgotten in the mathematical community, as Bachelier published a book in 1912 detailing his ideas, which was cited by mathematicians including Doob, Feller and Kolmogorov. In 1905 Albert Einstein published a paper where he studied the physical observation of Brownian motion or movement to explain the seemingly random movements of particles in liquids by using ideas from the kinetic theory of gases.
Einstein derived a differential equation, known as a diffusion equation, for describing the probability of finding a particle in a certain region of space. Shortly after Einstein's first paper on Brownian movement, Marian Smoluchowski published work where he cited Einstein, but wrote that he had independently derived the equivalent results by using a different method.
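In one spatial dimension, the diffusion equation Einstein derived takes the form (writing \(\rho(x, t)\) for the probability density of the particle's position and \(D\) for the diffusion coefficient):

```latex
\frac{\partial \rho}{\partial t} = D\,\frac{\partial^{2} \rho}{\partial x^{2}}
```

Its fundamental solution is a Gaussian density whose variance grows linearly in time, \(2Dt\), matching the marginal distributions of a (scaled) Wiener process.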
Einstein's work, as well as experimental results obtained by Jean Perrin, later inspired Norbert Wiener in the 1920s to use a type of measure theory, developed by Percy Daniell, and Fourier analysis to prove the existence of the Wiener process as a mathematical object. Another discovery occurred in Denmark in 1909 when A. K. Erlang derived the Poisson distribution when developing a mathematical model for the number of incoming phone calls in a finite time interval.
Erlang was not at the time aware of Poisson's earlier work and assumed that the numbers of phone calls arriving in each interval of time were independent of each other.
He then found the limiting case, which is effectively recasting the Poisson distribution as a limit of the binomial distribution. In 1910 Ernest Rutherford and Hans Geiger published experimental results on counting alpha particles. Motivated by their work, Harry Bateman studied the counting problem and derived Poisson probabilities as a solution to a family of differential equations, resulting in the independent discovery of the Poisson process. Markov processes and Markov chains are named after Andrey Markov, who studied Markov chains in the early 20th century. Other early uses of Markov chains include a diffusion model, introduced by Paul and Tatyana Ehrenfest in 1907, and a branching process, introduced by Francis Galton and Henry William Watson in 1873, preceding the work of Markov.
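The binomial-to-Poisson limit that Erlang effectively used can be checked numerically: with p = lam / n, the Binomial(n, p) distribution approaches Poisson(lam) as n grows. A small sketch using only the Python standard library; function names are illustrative:

```python
import math

def binom_pmf(n, p, k):
    # Binomial(n, p) probability of exactly k successes.
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    # Poisson(lam) probability of the value k.
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 4.0
errors = []
for n in (10, 100, 10_000):
    # Largest pointwise gap between Binomial(n, lam/n) and Poisson(lam)
    # over the first few values of k; it shrinks as n grows.
    err = max(abs(binom_pmf(n, lam / n, k) - poisson_pmf(lam, k)) for k in range(10))
    errors.append(err)
```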
Andrei Kolmogorov developed in a 1931 paper a large part of the early theory of continuous-time Markov processes.