©2022 Ada Academica (https://adac.ee)
Eur. J. Math. Anal. 2 (2022) 15, doi: 10.28924/ada/ma.2.15

Quasi-likelihood Estimation in Fractional Levy SPDEs from Poisson Sampling

Jaya P. N. Bishwal
Department of Mathematics and Statistics, University of North Carolina at Charlotte, 376 Fretwell Bldg, 9201 University City Blvd., Charlotte, NC 28223-0001, USA
Correspondence: J.Bishwal@uncc.edu

Abstract. We study the quasi-likelihood estimator of the drift parameter in stochastic partial differential equations driven by a cylindrical fractional Levy process when the process is observed at the arrival times of a Poisson process. We use a two-stage estimation procedure: we first estimate the intensity of the Poisson process, and then plug this estimate into the quasi-likelihood to estimate the drift parameter. We obtain the strong consistency and the asymptotic normality of the estimators.

1. Introduction

Parameter estimation in infinite dimensional stochastic differential equations was first studied by Loges [20]. When the length of the observation time becomes large, he obtained consistency and asymptotic normality of the maximum likelihood estimator (MLE) of a real-valued drift parameter in a Hilbert space valued SDE. Koski and Loges [18] extended the work of Loges [20] to minimum contrast estimators. Koski and Loges [17] applied this work to a stochastic heat flow problem. See the monograph Bishwal [5] for asymptotic results on likelihood inference and Bayesian inference for drift estimation of finite and infinite dimensional stochastic differential equations. Huebner, Khasminskii and Rozovskii [12] initiated the statistical investigation of SPDEs.
They gave two contrasting examples of parabolic SPDEs. In one of them they obtained consistency, asymptotic normality and asymptotic efficiency of the MLE as the noise intensity decreases to zero, under the condition of absolute continuity of the measures generated by the process for different parameters (a situation similar to the classical finite dimensional case); in the other they obtained these properties as the finite dimensional projection becomes large, under the condition of singularity of the measures generated by the process for different parameters. The second example was extended by Huebner and Rozovskii [13], and the first example by Huebner [11], to the MLE for general parabolic SPDEs where the partial differential operators commute and satisfy different order conditions in the two cases.

Received: 19 Feb 2022.
Key words and phrases. Cylindrical fractional Levy process, stochastic partial differential equations, space-time color noise, convoluted Levy field, infinite divisibility, Poisson sampling, quasi maximum likelihood estimator, consistency, asymptotic normality.

Huebner [10] extended the problem to ML estimation of a multidimensional parameter. Lototsky and Rozovskii [21] studied the same problem without the commutativity condition. Small noise asymptotics of nonparametric estimation of the drift coefficient was studied by Ibragimov and Khasminskii [14]. Bishwal [3] proved the Bernstein-von Mises theorem (BVT) and obtained asymptotic properties of regular Bayes estimators of the drift parameter in a Hilbert space valued SDE when the corresponding ergodic diffusion process is observed continuously over a time interval $[0,T]$. The asymptotics are studied as $T \to \infty$ under the condition of absolute continuity of the measures generated by the process.
Results are illustrated for the example of an SPDE. Bishwal [4] obtained the BVT and spectral asymptotics of Bayes estimators for parabolic SPDEs when the number of Fourier coefficients becomes large. In that case, the measures generated by the process for different parameters are singular. Here we treat the case when the measures generated by the process for different parameters are absolutely continuous under some conditions on the order of the partial differential operators. Bishwal [9] studied the asymptotic properties of the posterior distributions and Bayes estimators when one has either the fully observed process or finite-dimensional projections; there the asymptotic parameter is the intensity of the noise. In this paper we treat the more general model with non-Gaussian noise with long memory.

On the other hand, long memory processes, i.e., processes with slowly decaying autocorrelation, and processes with jumps have recently received attention in finance, engineering and physics. The simplest continuous time long memory process is the fractional Brownian motion, discovered by Kolmogorov [15] and later studied by Levy [19] and Mandelbrot and van Ness [27]. A continuous time long memory jump process is the fractional Levy process; hence the fractional Levy process can also be called the Kolmogorov-Levy process. We generalize the fractional SPDE process to include non-normal innovations. We consider Hurst parameter greater than one half. This model is interesting as it preserves both jumps and long memory.

A normalized fractional Brownian motion $\{W^H_t, t \geq 0\}$ with Hurst parameter $H \in (0,1)$ is a centered Gaussian process with continuous sample paths whose covariance kernel is given by
$$E(W^H_t W^H_s) = \frac{1}{2}\left(s^{2H} + t^{2H} - |t-s|^{2H}\right), \quad s, t \geq 0.$$
The process is self-similar (scale invariant) and can be represented as a stochastic integral with respect to standard Brownian motion.
For $H = \frac12$, the process is a standard Brownian motion. For $H \neq \frac12$, the fBm is not a semimartingale and not a Markov process, but a Dirichlet process. The increments of the fBm are negatively correlated for $H < \frac12$ and positively correlated for $H > \frac12$, and in the latter case they display long-range dependence. The parameter $H$, also called the self-similarity parameter, measures the intensity of the long range dependence. The ARIMA($p,d,q$) process, with autoregressive part of order $p$, moving average part of order $q$ and fractional difference parameter $d \in (0, 0.5)$, converges in the Donsker sense to fBm; see Mishura [22].

The fractional Levy Ornstein-Uhlenbeck (fOU) process is an extension of the fractional Ornstein-Uhlenbeck process with a fractional Levy motion (fLM) driving term. In finance, it could be useful as a generalization of the fractional Vasicek model, a one-factor short-term interest rate model which could take into account the long memory effect and jumps of the interest rate. The model parameter is usually unknown and must be estimated from data.

The fractional Levy process (FLP) is defined as
$$M_{H,t} = \frac{1}{\Gamma(H + \frac12)} \int_{\mathbb{R}} \left[(t-s)_+^{H-1/2} - (-s)_+^{H-1/2}\right] dM_s, \quad t \in \mathbb{R},$$
where $\{M_t, t \in \mathbb{R}\}$ is a Levy process on $\mathbb{R}$ with $E(M_1) = 0$, $E(M_1^2) < \infty$ and without Brownian component.

Here are some properties of the fractional Levy process:
1) The covariance of the process is given by
$$\mathrm{cov}(M_{H,t}, M_{H,s}) = \frac{E(M_1^2)}{2\Gamma(2H+1)\sin(\pi H)} \left[|t|^{2H} + |s|^{2H} - |t-s|^{2H}\right].$$
2) $M_H$ is not a martingale. For a large class of Levy processes, $M_H$ is not even a semimartingale.
3) $M_H$ is Hölder continuous of any order $\beta$ less than $H - \frac12$.
4) $M_H$ has stationary increments.
5) $M_H$ is symmetric.
6) In contrast to fBm, $M_H$ is in general not self-similar.
7) $M_H$ has infinite total variation on compacts.

Thus the FLP is a generalization and a natural counterpart of fBm. Fractional stable motion is a special case of the FLP.
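The covariance kernel of fBm given above can be illustrated numerically. The following Python sketch (all function and variable names are ours, not the paper's) builds the covariance matrix on a finite grid and simulates a path exactly via a Cholesky factorization:

```python
import numpy as np

# Sketch: build the fBm covariance E(W^H_t W^H_s) = 0.5*(s^2H + t^2H - |t-s|^2H)
# on a finite grid and simulate a path by Cholesky factorization of that matrix.
def fbm_cov(times, H):
    t = np.asarray(times, dtype=float)
    s, tt = np.meshgrid(t, t)
    return 0.5 * (s**(2 * H) + tt**(2 * H) - np.abs(tt - s)**(2 * H))

def simulate_fbm(times, H, rng):
    C = fbm_cov(times, H)
    # tiny diagonal jitter guards against round-off in the factorization
    L = np.linalg.cholesky(C + 1e-12 * np.eye(len(times)))
    return L @ rng.standard_normal(len(times))

grid = np.linspace(0.1, 2.0, 50)
C = fbm_cov(grid, H=0.7)
path = simulate_fbm(grid, H=0.7, rng=np.random.default_rng(0))
```

The diagonal of the matrix reproduces the variance $E(W^H_t)^2 = t^{2H}$, consistent with the kernel above.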
First we discuss estimation in partially observed models, and then estimation in the directly observed model in the finite dimensional setup. In finance, the log-volatility process can be modeled as a fractionally integrated moving average (FIMA) process, defined as
$$Y_H(t) = \int_{-\infty}^{t} g_H(t-u)\, dM_u, \quad t \in \mathbb{R},$$
where
$$g_H(t) = \frac{1}{\Gamma(H - \frac12)} \int_0^t g(t-s)\, s^{H - \frac32}\, ds, \quad t \in \mathbb{R},$$
which is the Riemann-Liouville fractional integral of order $H$, and the kernel $g$ is the kernel of a short memory moving average process. The log-volatility process then has slow (hyperbolic rate) decay of the autocorrelation function (acf). The process $Y_H(t)$ can be written as
$$Y_H(t) = \int_{-\infty}^{t} g(t-u)\, dM_{H,u}, \quad t \in \mathbb{R}.$$
We assume the following conditions on the kernel $g: \mathbb{R} \to \mathbb{R}$: 1) $g(t) = 0$ for all $t < 0$ (causality); 2) $|g(t)| \leq C e^{-ct}$ for some constants $C > 0$ and $c > 0$ (short memory).

The FIMA process is stationary and infinitely divisible. It has long memory and jumps, which agree empirically with stochastic volatility models. The asset return can be modeled as a COGARCH process
$$dX(t) = \sqrt{e^{Y_H(t)}}\, dL_t,$$
where $(L_t, t \in \mathbb{R})$ is another Levy process and the initial value $Y_H(0)$ is independent of $L$. Consider the kernel
$$g(t-s) = \sigma e^{-\theta(t-s)} I_{(0,\infty)}(t-s), \quad \theta > 0;$$
then
$$g_H(t) = \frac{\sigma}{\Gamma(H - \frac12)} \int_0^{\infty} e^{-\theta(t-s)} I_{(0,\infty)}(t-s)\, s^{H-\frac32}\, ds, \quad t \in \mathbb{R}.$$
Note that
$$U^{H,\theta,\sigma}_t = \int_{\mathbb{R}} g_H(t-u)\, dM_u, \quad t \in \mathbb{R},$$
is the fractional Levy Ornstein-Uhlenbeck (FLOU) process satisfying the fractional Langevin equation
$$dU_t = -\theta U_t\, dt + \sigma\, dM_{H,t}, \quad t \in \mathbb{R}.$$
The process has long memory. Levy driven processes of Ornstein-Uhlenbeck type have been extensively studied over the last few years and are widely used in finance; see Barndorff-Nielsen and Shephard [1]. The FLOU process generalizes the FOU process to include jumps.
Maximum quasi-likelihood estimation in the fractional Levy stochastic volatility model was studied in Bishwal [6]. Berry-Esseen inequalities for the discretely observed Ornstein-Uhlenbeck-Gamma process were studied in Bishwal [7]. Minimum contrast estimation in the fractional Ornstein-Uhlenbeck process based on both continuous and discrete observations was studied in Bishwal [8].

Consider the asset return driven by a fractional Levy process
$$dS_{H,t} = \sigma_{t-}\, dL_{H,t}, \quad t > 0, \quad S_0 = 0,$$
with log-volatility
$$\log \sigma_t^2 = \mu + X_t, \quad t \geq 0,$$
where the Levy driven OU process $X$ satisfies
$$dX_t = -\theta X_t\, dt + dM_t, \quad t > 0,$$
with $\theta \in \mathbb{R}_+$, and the driving process $M$ is a Levy process with Levy symbol
$$\psi_M(u) = -\frac{u^2}{2} + \int_{\mathbb{R}} (e^{iux} - 1)\, \Phi_{0,1/\lambda}(dx),$$
where $\Phi_{0,1/\lambda}$ is the normal distribution with mean 0 and variance $1/\lambda$. This means that $M$ is the sum of a standard Brownian motion $W$ and a compound Poisson process $J_t = \sum_{k=1}^{N_t} Z_k$, $J_{-t} = \sum_{k=1}^{-N_{-t}} Z_{-k}$, $t \geq 0$, where $(N_t, t \in \mathbb{R})$ is an independent Poisson process with intensity $\lambda > 0$ and jump times $(t_k)_{k \in \mathbb{Z}}$; i.e., $M_t = W_t + J_t$. The Poisson process $N$ is also independent of the i.i.d. sequence of jump sizes $(Z_k)_{k \in \mathbb{Z}}$ with $Z_1 \sim N(0, 1/\lambda)$. The Levy process $M$ in this case is given by
$$M_t = \sum_{k=1}^{N_t} (\alpha Z_k + \gamma |Z_k|) - Ct, \quad t > 0, \qquad C := \gamma \int_{\mathbb{R}} |x|\, \lambda\, \Phi_{0,1/\lambda}(dx) = \sqrt{\frac{2\lambda}{\pi}}\, \gamma,$$
and $\{M_{-t}, t \geq 0\}$ is defined analogously. The stationary log-volatility is given by
$$\log \sigma_t^2 = \mu + \int_{-\infty}^{t} e^{-\theta(t-s)}\, dM_s.$$
We observe $S$ at $n$ consecutive jump times $0 = t_0 < t_1 < \dots < t_n < T < t_{n+1}$, $n \in \mathbb{Z}$, over the time interval $[0,T]$. The state process $X$ then has the following autoregressive representation:
$$X_{t_i} = e^{-\theta \Delta t_i} X_{t_{i-1}} + \sum_{k=N_{t_{i-1}}+1}^{N_{t_i}} e^{-\theta(t_i - t_k)} \left[\alpha Z_k + \gamma |Z_k|\right] - \int_{t_{i-1}}^{t_i} e^{-\theta(t_i - s)} C\, ds = e^{-\theta \Delta t_i} X_{t_{i-1}} + \alpha Z_i + \gamma |Z_i| - \frac{C}{\theta}\left(1 - e^{-\theta \Delta t_i}\right),$$
where $\Delta t_i := t_i - t_{i-1}$, $i = 1, 2, \dots, n$, and $N_{t_{i-1}} + 1 = N_{t_i} = i$.

We do the parameter estimation in two steps. The rate $\lambda$ of the Poisson process $N$ can be estimated given the jump times $t_i$; therefore this is done in a first step.
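The autoregressive representation above is straightforward to simulate. Taking expectations on both sides of the recursion (a short calculation of ours, not spelled out in the paper) and using $E e^{-\theta\Delta} = \lambda/(\lambda+\theta)$ for $\Delta \sim \mathrm{Exp}(\lambda)$ gives the stationary mean $\gamma E|Z_1| = \gamma\sqrt{2/(\pi\lambda)}$, which the sketch below checks; all parameter values are hypothetical:

```python
import numpy as np

# Sketch of the state recursion X_{t_i} = e^{-theta*dt_i} X_{t_{i-1}}
#   + alpha*Z_i + gamma*|Z_i| - (C/theta)*(1 - e^{-theta*dt_i}),
# with dt_i ~ Exp(lam) inter-jump times and Z_i ~ N(0, 1/lam) jump sizes.
def simulate_state(theta, alpha, gamma, lam, n, rng):
    C = gamma * np.sqrt(2 * lam / np.pi)
    dt = rng.exponential(1 / lam, size=n)
    Z = rng.normal(0.0, np.sqrt(1 / lam), size=n)
    x = np.empty(n)
    x_prev = gamma * np.sqrt(2 / (np.pi * lam))  # start at the stationary mean
    for i in range(n):
        phi = np.exp(-theta * dt[i])
        x[i] = phi * x_prev + alpha * Z[i] + gamma * abs(Z[i]) - (C / theta) * (1 - phi)
        x_prev = x[i]
    return x

rng = np.random.default_rng(0)
x = simulate_state(theta=0.5, alpha=1.0, gamma=1.0, lam=1.0, n=200_000, rng=rng)
# stationary mean gamma*sqrt(2/(pi*lam)) for gamma = lam = 1:
m_theory = np.sqrt(2 / np.pi)
```

The empirical mean of a long simulated path should settle near `m_theory`, independently of $\theta$.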
Since we observe the total number of jumps $n$ of the Poisson process $N$ over the $T$ intervals of length one, the MLE of $\lambda$ is given by $\hat{\lambda}_n := n/T$.

To estimate the remaining parameters $(\theta, \alpha, \gamma, \mu)$, we use the quasi maximum likelihood estimation procedure for conditionally heteroscedastic time series models developed by Straumann [25]. Assuming that $S^{\Delta t_i}_{H,t_i}$ given $S^{\Delta t_{i-1}}_{H,t_{i-1}}, \dots, S^{\Delta t_1}_{H,t_1}, X_0$ is conditionally normally distributed with mean zero and variance $\sigma^2_{t_i-}/\lambda$, the conditional log-likelihood given the initial value $X_0$ has the representation
$$L(\vartheta \mid S^\Delta_H, \lambda) := -\frac{n}{2}\log(2\pi) - \frac12 \left( \sum_{i=1}^n \log(\sigma^2_{t_i-}/\lambda) + \sum_{i=1}^n \frac{(S^{\Delta t_i}_{H,t_i})^2}{\sigma^2_{t_i-}/\lambda} \right),$$
where $S^{\Delta t_i}_{H,t_i} = S_{H,t_i} - S_{H,t_{i-1}}$ is the return at time $t_i$. Since the volatility is unobservable, this log-likelihood cannot be evaluated numerically. The quasi log-likelihood function for $\vartheta = (\theta, \alpha, \gamma, \mu)$ given the data $S^\Delta_H := (S^{\Delta t_1}_{H,t_1}, S^{\Delta t_2}_{H,t_2}, \dots, S^{\Delta t_n}_{H,t_n})$ and the MLE $\hat{\lambda}_n$ is defined as
$$L(\vartheta \mid S^\Delta_H, \hat{\lambda}_n) := -\frac12 \sum_{i=1}^n \log\left(\hat{\sigma}^2_{H,t_i}(\vartheta, \hat{\lambda}_n)\right) - \frac12 \sum_{i=1}^n \frac{(S^{\Delta t_i}_{H,t_i})^2}{\hat{\sigma}^2_{H,t_i}(\vartheta, \hat{\lambda}_n)/\hat{\lambda}_n},$$
where the estimates of the volatility $\sigma^2_{H,t_i}$, $i = 1, 2, \dots, n$, are given by
$$\hat{\sigma}^2_{H,t_i}(\vartheta, \lambda) := \exp\left(\mu + e^{-\theta \Delta t_i}\, \hat{X}_{H,t_{i-1}}(\vartheta, \lambda) - \hat{C} \Delta t_i\right), \quad i = 1, 2, \dots, n,$$
and, given the parameters $\vartheta$ and $\lambda$, the estimates of the state process $X$ are given by the recursion
$$\hat{X}_{H,t_i} = e^{-\theta \Delta t_i}\, \hat{X}_{H,t_{i-1}} + \alpha\, \frac{S_{H,t_i}}{\hat{\sigma}_{t_i}(\vartheta,\lambda)} + \gamma \left|\frac{S_{H,t_i}}{\hat{\sigma}_{t_i}(\vartheta,\lambda)}\right| - \hat{C} \Delta t_i, \quad i = 1, 2, \dots, n.$$
Note that $E(|W|) = \sqrt{2/(\pi\lambda)}$ for $W \sim N(0, 1/\lambda)$. Here the approximation $1 - e^{-z} \approx z$ for small $z$ is used, and $S_{H,t_i}/\hat{\sigma}_{t_i}(\vartheta,\lambda)$ approximates the innovation $Z_i$. The recursion needs a starting value $\hat{X}_{H,0}$, which will be set equal to the mean value of the stationary distribution of $X$, which is zero.
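The quasi log-likelihood and its accompanying recursions can be sketched in a few lines of Python. This is a reconstruction under the reading above: the function name, the toy data, and the choice $\mu = 0$ are assumptions, not the paper's code.

```python
import numpy as np

# Hedged sketch: evaluate the quasi log-likelihood by running the volatility
# and state recursions forward, given returns S, inter-arrival gaps dt, and
# the first-stage rate estimate lam_hat.  mu is fixed at 0 for illustration.
def quasi_loglik(params, S, dt, lam_hat, mu=0.0):
    theta, alpha, gamma = params
    C_hat = gamma * np.sqrt(2 * lam_hat / np.pi)  # plug-in estimate of C
    x_prev = 0.0                                  # stationary mean of X
    ll = 0.0
    for s_i, d_i in zip(S, dt):
        phi = np.exp(-theta * d_i)
        sig2 = np.exp(mu + phi * x_prev - C_hat * d_i)   # predicted volatility
        ll += -0.5 * np.log(sig2) - 0.5 * s_i**2 / (sig2 / lam_hat)
        z_hat = s_i / np.sqrt(sig2)               # innovation proxy S / sigma_hat
        x_prev = phi * x_prev + alpha * z_hat + gamma * abs(z_hat) - C_hat * d_i
    return ll

# toy data, purely illustrative
rng = np.random.default_rng(1)
n, lam_hat = 500, 2.0
dt = rng.exponential(1 / lam_hat, size=n)
S = rng.normal(0.0, np.sqrt(1 / lam_hat), size=n)
ll = quasi_loglik((0.5, 0.3, 0.2), S, dt, lam_hat)
```

In practice `quasi_loglik` would be handed to a numerical optimizer over $\vartheta$, since the estimating equation has no explicit solution.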
The QMLE of $\vartheta$ is defined as
$$\hat{\vartheta}_n := \arg\max_{\vartheta \in \Theta} L(\vartheta \mid S^\Delta_H, \hat{\lambda}_n).$$

Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t\geq 0}, P)$ be the stochastic basis on which is defined the Ornstein-Uhlenbeck process $X_t$ satisfying the Itô stochastic differential equation
$$dX_t = -\theta X_t\, dt + dM^H_t, \quad t \geq 0,$$
where $\{M^H_t\}$ is a fractional Levy motion with $H > 1/2$ adapted to the filtration $\{\mathcal{F}_t\}_{t\geq 0}$, and $\theta \in \mathbb{R}_+$ is the unknown parameter to be estimated on the basis of directly observed continuous observation of the process $\{X_t\}$ on the time interval $[0,T]$. Observe that
$$X_t = \int_{-\infty}^{t} e^{-\theta(t-s)}\, dM^H_s.$$
This process is stationary and has long memory. It can be shown that $X_{t_i}$ is a stationary discrete time AR(1) process with autoregression coefficient $\phi \in (0,1)$ with the representation
$$X_{t_i} = \phi X_{t_{i-1}} + \epsilon_{t_{i-1}}, \quad \text{where } \phi = e^{-\theta\Delta} \text{ and } \epsilon_{t_{i-1}} = \int_{t_{i-1}}^{t_i} e^{-\theta(t_i - u)}\, dM^H_u.$$
The problem is then AR(1) estimation with non-Gaussian, non-martingale errors. For equidistant sampling, one can study the least squares estimator, which boils down to the study of the error distribution for non-semimartingales. One can specialize to the case when $M$ is either a gamma process or an inverse Gaussian process in order to have an infinite number of jumps in a finite time interval, unlike the compound Poisson case, which has a finite number of jumps in a finite time interval. These fractional gamma and fractional inverse Gaussian Ornstein-Uhlenbeck processes are FLOU processes which include long memory. In the next section we deal with the completely observed process.

The rest of the paper is organized as follows: Section 2 contains the model, assumptions and preliminaries. Section 3 contains the asymptotic properties of the quasi-likelihood estimator.

2. FLSPDE Model and Preliminaries

In order to introduce the fractional Levy stochastic partial differential equation (FLSPDE), we proceed as follows. Let us fix $\theta_0$, the unknown true value of the parameter $\theta$. Let $(\Omega, \mathcal{F}, P)$ be a complete
probability space and let $W(t,x)$ be a process on this space with values in the Schwartz space of distributions $\mathcal{D}'(G)$ such that, for $\phi, \psi \in C_0^\infty(G)$, $\|\phi\|^{-1}_{L^2(G)} \langle W(t,\cdot), \phi(\cdot)\rangle$ is a one dimensional Wiener process and
$$E\left(\langle W(s,\cdot), \phi(\cdot)\rangle \langle W(t,\cdot), \psi(\cdot)\rangle\right) = (s \wedge t)\,(\phi, \psi)_{L^2(G)}.$$
This process is usually referred to as the cylindrical Brownian motion (C.B.M.). We assume that there exists a complete orthonormal system $\{h_i\}_{i=1}^\infty$ in $L^2(G)$ such that for every $i = 1, 2, \dots$, $h_i \in W_0^{m,2}(G) \cap C^\infty(G)$ and
$$\Lambda_\theta h_i = \beta_i(\theta)\, h_i \quad \text{and} \quad \mathcal{L}_\theta h_i = \mu_i(\theta)\, h_i \quad \text{for all } \theta \in \Theta,$$
where $\mathcal{L}_\theta$ is a closed self-adjoint extension of $A_\theta$, $\Lambda_\theta := (k(\theta)I - \mathcal{L}_\theta)^{1/2m}$ with $k(\theta)$ a constant, the spectrum of the operator $\Lambda_\theta$ consists of eigenvalues $\{\beta_i(\theta)\}_{i=1}^\infty$ of finite multiplicities, and $\mu_i = -\beta_i^{2m} + k(\theta)$.

The cylindrical fractional Levy process (CFLP) $M_H(t)$ can be expanded in the series
$$M_H(t,x) = \sum_{i=1}^\infty M_{H,i}(t)\, h_i(x),$$
where $\{M_{H,i}(t)\}_{i=1}^\infty$ are independent one dimensional FLPs; see Peszat and Zabczyk [24]. The latter series converges $P$-a.s. in $H^{-\nu}$ for $\nu > d/2$. Indeed,
$$\|M_H(t)\|^2_{-\nu} = \sum_{i=1}^\infty M^2_{H,i}(t)\, \|h_i\|^2_{-\nu} = \sum_{i=1}^\infty M^2_{H,i}(t)\, \beta_i^{-2\nu},$$
and the latter series converges $P$-a.s.

Consider the parabolic SPDE
$$du_\theta(t,x) = \left(\theta u_\theta(t,x) + \frac{\partial^2}{\partial x^2} u_\theta(t,x)\right) dt + dM_H(t,x), \quad t \geq 0,\ x \in [0,1], \tag{2.1}$$
$$u(0,x) = u_0(x) \in L^2([0,1]), \tag{2.2}$$
$$u_\theta(t,0) = u_\theta(t,1), \quad t \in [0,T]. \tag{2.3}$$
Here $\theta \in \Theta \subseteq \mathbb{R}$ is the unknown parameter to be estimated on the basis of observations of the field $u_\theta(t,x)$, $t \geq 0$, $x \in [0,1]$. For $x \in [0,1]$, we observe the process $\{u_t, t \geq 0\}$ at times $\{t_0, t_1, t_2, \dots\}$. We assume that the sampling instants $\{t_i, i = 0, 1, 2, \dots\}$ are generated by a Poisson process on $[0,\infty)$, i.e., $t_0 = 0$, $t_i = t_{i-1} + \xi_i$, $i = 1, 2, \dots$, where the $\xi_i$ are i.i.d. positive random variables with common exponential distribution $F(x) = 1 - \exp(-\lambda x)$. The intensity parameter $\lambda > 0$ is the average sampling rate, which is estimated in the first stage. It is also assumed that the sampling process $t_i$, $i = 0, 1, 2, \dots$,
is independent of the observation process $\{X_t, t \geq 0\}$. We note that the probability density function of $t_{k+i} - t_k$ is independent of $k$ and is given by the gamma density
$$f_i(t) = \frac{\lambda(\lambda t)^{i-1} \exp(-\lambda t)}{(i-1)!}\, I_t, \quad i = 1, 2, \dots, \tag{2.4}$$
where $I_t = 1$ if $t \geq 0$ and $I_t = 0$ if $t < 0$.

Consider the Fourier expansion of the process
$$u(t,x) = \sum_{i=1}^\infty u_i(t)\, \phi_i(x) \tag{2.5}$$
corresponding to some orthogonal basis $\{\phi_i(x)\}_{i=1}^\infty$. Note that the Fourier coefficients $\{u^\theta_i(t), i \geq 1\}$ are independent one dimensional Ornstein-Uhlenbeck processes:
$$du^\theta_i(t) = \mu^\theta_i u^\theta_i(t)\, dt + \beta_i^{-\nu}\, dM_{H,i}(t), \quad u^\theta_i(0) = u^\theta_{0i}. \tag{2.6}$$
Recall that $\mu_i(\theta) = k(\theta) - \beta_i^{2m}$. Thus
$$du^\theta_i(t) = \left(k(\theta) - \beta_i^{2m}\right) u^\theta_i(t)\, dt + \beta_i^{-\nu}\, dM_{H,i}(t). \tag{2.7}$$
The random field $u(t,x)$ is observed at discrete times $t$ and discrete positions $x$; equivalently, the Fourier coefficients $u^\theta_i(t)$ are observed at discrete time points.

Now we focus on the fundamental semimartingale behind the O-U model. Define
$$\kappa_H := 2H\,\Gamma(3/2 - H)\,\Gamma(H + 1/2), \qquad k_H(t,s) := \kappa_H^{-1}\, (s(t-s))^{\frac12 - H},$$
$$\eta_H := \frac{2H\,\Gamma(3 - 2H)\,\Gamma(H + \frac12)}{\Gamma(3/2 - H)}, \qquad v_t \equiv v^H_t := \eta_H^{-1}\, t^{2-2H}, \qquad \mathcal{M}^H_t := \int_0^t k_H(t,s)\, dM^H_s.$$
For using the Girsanov theorem for Brownian motion, since a Radon-Nikodym derivative process is always a martingale, a central problem is how to construct an appropriate martingale which generates the same filtration, up to sets of measure zero, as the non-semimartingale; this martingale is called the fundamental martingale. Extending Norros et al. [23], it can be shown that $\mathcal{M}^H_t$ is a martingale, called the fundamental martingale, whose quadratic variation $\langle \mathcal{M}^H \rangle_t$ is $v^H_t$. Moreover, the natural filtration of the martingale $\mathcal{M}^H$ coincides with the natural filtration of the FLP $M^H$, since
$$M^H_t := \int_0^t K_H(t,s)\, d\mathcal{M}^H_s$$
holds for $H \in (1/2, 1)$, where
$$K_H(t,s) := H(2H-1) \int_s^t r^{H - \frac12}\, (r-s)^{H - \frac32}\, dr, \quad 0 \leq s \leq t,$$
and for $H = 1/2$ the convention $K_{1/2} \equiv 1$ is used. Define
$$Q_i(t) := \frac{d}{dv_t} \int_0^t k_H(t,s)\, u_i(s)\, ds, \quad i \geq 1.$$
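The inter-arrival law (2.4) admits a simple simulation check: $t_{k+i} - t_k$ is a sum of $i$ independent $\mathrm{Exp}(\lambda)$ gaps, hence Gamma distributed with shape $i$ and rate $\lambda$, so its mean is $i/\lambda$ and its variance $i/\lambda^2$. A short Python sketch (parameter values hypothetical):

```python
import numpy as np

# Sketch: t_{k+i} - t_k is a sum of i i.i.d. Exp(lam) inter-arrival gaps,
# i.e. Gamma(shape=i, rate=lam); check its first two moments by Monte Carlo.
lam, i, m = 3.0, 4, 200_000
rng = np.random.default_rng(2)
sums = rng.exponential(1 / lam, size=(m, i)).sum(axis=1)
mean_hat, var_hat = sums.mean(), sums.var()
# theoretical values: mean i/lam, variance i/lam^2
```

The empirical mean and variance should match $i/\lambda$ and $i/\lambda^2$ up to Monte Carlo error.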
It is easy to see that
$$Q_i(t) = \frac{\eta_H}{2(2-2H)} \left\{ t^{2H-1} Z_i(t) + \int_0^t s^{2H-1}\, dZ_i(s) \right\},$$
where the process $Z_i = (Z_i(t), t \in [0,T])$ is defined by
$$Z_i(t) := \int_0^t k_H(t,s)\, du_i(s).$$
Extending Kleptsyna and Le Breton [16], we have:
(i) $Z_i$ is the fundamental semimartingale associated with the process $u_i$.
(ii) $Z_i$ is an $(\mathcal{F}_t)$-semimartingale with the decomposition
$$Z_i(t) = \mu_i(\theta) \int_0^t Q_i(s)\, dv_s + \beta_i^{-\nu}\, \mathcal{M}^H_t.$$
(iii) $u_i$ admits the representation
$$u_i(t) = \int_0^t K_H(t,s)\, dZ_i(s).$$
(iv) The natural filtrations $(\mathcal{Z}_i(t))$ of $Z_i$ and $(\mathcal{U}_i(t))$ of $u_i$ coincide.

We focus on our observations now. Note that for equally spaced data (the homoscedastic case),
$$v_{t_k} - v_{t_{k-1}} = \eta_H^{-1} \left(\frac{T}{n}\right)^{2-2H} \left[k^{2-2H} - (k-1)^{2-2H}\right], \quad k = 1, 2, \dots, n. \tag{2.8}$$
For $H = 0.5$,
$$v_{t_k} - v_{t_{k-1}} = \eta_H^{-1} \left(\frac{T}{n}\right)^{2-2H} \left[k^{2-2H} - (k-1)^{2-2H}\right] = \frac{T}{n}, \quad k = 1, 2, \dots, n.$$
We have
$$Q_i(t) = \frac{d}{dv_t} \int_0^t k_H(t,s)\, u_i(s)\, ds = \kappa_H^{-1} \frac{d}{dv_t} \int_0^t s^{1/2-H}(t-s)^{1/2-H} u_i(s)\, ds = \kappa_H^{-1}\eta_H t^{2H-1} \frac{d}{dt} \int_0^t s^{1/2-H}(t-s)^{1/2-H} u_i(s)\, ds$$
$$= \kappa_H^{-1}\eta_H t^{2H-1} \int_0^t \frac{d}{dt}\, s^{1/2-H}(t-s)^{1/2-H} u_i(s)\, ds = \kappa_H^{-1}\eta_H t^{2H-1} \int_0^t s^{1/2-H}(t-s)^{-1/2-H} u_i(s)\, ds. \tag{2.9}$$
The process $Q_i$ depends continuously on $u_i$, and therefore the discrete observations of $u_i$ do not allow one to obtain the discrete observations of $Q_i$. The process $Q_i$ can be approximated by
$$\tilde{Q}_i(n) = \kappa_H^{-1}\eta_H n^{2H-1} \sum_{j=0}^{n-1} j^{1/2-H}(n-j)^{-1/2-H}\, u_i(j). \tag{2.10}$$
It is easy to show that $\tilde{Q}_i(n) \to Q_i(t)$ almost surely as $n \to \infty$; see Tudor and Viens [26]. Define a new partition $0 \leq r_1 < r_2 < r_3 < \cdots < r_{m_k} = t_k$, $k = 1, 2, \dots, n$. Define
$$\tilde{Q}_i(t_k) = \kappa_H^{-1}\eta_H t_k^{2H-1} \sum_{j=1}^{m_k} r_j^{1/2-H}(r_{m_k} - r_j)^{-1/2-H}\, u_i(r_j)(r_j - r_{j-1}), \quad k = 1, 2, \dots, n. \tag{2.11}$$
It is easy to show that $\tilde{Q}_i(t_k) \to Q_i(t_k)$ almost surely as $m_k \to \infty$ for each $k = 1, 2, \dots, n$. We use these approximate observations in the calculation of our estimators. Thus our observations are
$$u_i(t) \approx \int_0^t K_H(t,s)\, d\tilde{Z}_i(s), \quad \text{where } \tilde{Z}_i(t) = \theta \int_0^t \tilde{Q}_i(s)\, dv_s + \mathcal{M}^H_t,$$
(2.12)
observed at the Poisson arrivals $t_1, t_2, \dots, t_n$. We observe just one such approximate Fourier coefficient $u_i(t)$, which we denote by $u(t)$; the corresponding observations are denoted by $u_{t_1}, u_{t_2}, \dots, u_{t_n}$, and we let $n \to \infty$. Thus we are in a large time asymptotic framework.

Now we focus on the estimation methodology. Define
$$\rho := \rho(\lambda, \theta) = \frac{\lambda}{\lambda - \kappa(\theta) + \beta_i^{2m}}. \tag{2.13}$$
The quasi-likelihood estimator is the solution of the estimating equation
$$G^*_n(\theta) = 0, \tag{2.14}$$
where
$$G^*_n(\theta) = \beta_i^{2\nu} \lambda\, (\rho(\lambda,\theta))^2\, \rho(\lambda, 2\theta) \sum_{i=1}^n u_{t_{i-1}} \left( (u_{t_{i-1}}\theta\rho(\lambda,\theta))^2 + \lambda \right)^{-1} \left(u_{t_i} - \rho(\lambda,\theta)\, u_{t_{i-1}}\right). \tag{2.15}$$
We call the solution of this estimating equation the quasi-likelihood estimator. There is no explicit solution of this equation. The optimal estimating function for estimation of the unknown parameter $\theta$ is
$$G_n(\theta) = \beta_i^{2\nu} \sum_{i=1}^n u_{t_{i-1}} \left[u_{t_i} - \rho(\lambda,\theta)\, u_{t_{i-1}}\right]. \tag{2.16}$$
The martingale estimating function (MEF) estimator of $\rho$ is the solution of $G_n(\theta) = 0$ and is given by
$$\hat{\rho}_n := \frac{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}{\sum_{i=1}^n u^2_{t_{i-1}}}. \tag{2.17}$$

3. Main Results

We do the parameter estimation in two steps. The rate $\lambda$ of the Poisson process can be estimated given the arrival times $t_i$; therefore this is done in a first step. Since we observe the total number of arrivals $n$ of the Poisson process over the $T$ intervals of length one, the MLE of $\lambda$ is given by
$$\hat{\lambda}_n := \frac{n}{T}. \tag{3.1}$$

Theorem 3.1. We have
$$\hat{\lambda}_n \to \lambda \ \text{a.s.} \quad \text{and} \quad \sqrt{n}\,(\hat{\lambda}_n - \lambda) \to^{\mathcal{D}} N\left(0,\, e^\lambda(1 - e^{-\lambda})\right) \quad \text{as } n \to \infty.$$

Proof. Let $V_i$ be the number of arrivals in the interval $(i-1, i]$. Then $V_i$, $i = 1, 2, \dots, n$, are i.i.d. Poisson distributed with parameter $\lambda$. Since $\Phi$ is continuous, we have $I_{\{0\}}(V_i) = I_{\{0\}}(u(t_i))$ a.s., $i = 1, 2, \dots, n$. Note that
$$\frac{1}{n}\sum_{i=1}^n I_{\{0\}}(u_{t_i}) \to_{a.s.} E\left(I_{\{0\}}(V_1)\right) = P(V_1 = 0) = e^{-\lambda} \quad \text{as } n \to \infty.$$
The LLN, the CLT and the delta method applied to the sequence $I_{\{0\}}(u_{t_i})$, $i = 1, 2, \dots, n$, give the results.

The CLT result above allows us to construct a confidence interval for the jump rate $\lambda$.
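The first-stage estimator $\hat{\lambda}_n = n/T$ of Theorem 3.1 can be illustrated with a small Monte Carlo; the window $T$, the rate, and all names below are chosen purely for illustration:

```python
import numpy as np

# Sketch of the first stage: generate Poisson arrival times on [0, T] from
# i.i.d. exponential gaps and estimate the rate by lambda_hat = n / T.
rng = np.random.default_rng(3)
lam_true, T = 2.0, 500.0
gaps = rng.exponential(1 / lam_true, size=3000)  # enough gaps to cover [0, T]
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= T]
lam_hat = len(arrivals) / T
```

For $T = 500$ and $\lambda = 2$, the estimate concentrates around 2 with standard error of order $\sqrt{\lambda/T} \approx 0.06$.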
Corollary 3.1. A $100(1-\alpha)\%$ confidence interval for $\lambda$ is given by
$$\left[\frac{n}{T} - Z_{1-\frac{\alpha}{2}}\sqrt{\frac{1}{n} - \frac{1}{T}},\ \ \frac{n}{T} + Z_{1-\frac{\alpha}{2}}\sqrt{\frac{1}{n} - \frac{1}{T}}\right],$$
where $Z_{1-\frac{\alpha}{2}}$ is the $(1-\frac{\alpha}{2})$-quantile of the standard normal distribution.

We obtain the strong consistency and asymptotic normality of the MEF estimator.

Theorem 3.2. We have
$$\hat{\rho}_n \to \rho \ \text{a.s.} \quad \text{and} \quad \sqrt{n}\,(\hat{\rho}_n - \rho) \to^{\mathcal{D}} N\left(0,\, \lambda^{-1}(1 - e^{-\rho})\right) \quad \text{as } n \to \infty.$$

Proof. Using the fact that every stationary mixing process is ergodic, it is easy to show that if $u_t$ is a stationary ergodic O-U process and $t_i$ is a process with nonnegative i.i.d. increments independent of $u_t$, then $\{u_{t_i}, i \geq 1\}$ is a stationary ergodic process. Observe that $v_i := u^\theta_i(t_i)$ is stationary ergodic with $v_i \sim N(0, \sigma^2)$, where $\sigma^2$ is the variance of $u_0$. Thus by the SLLN for zero mean square integrable martingales we have, as $n \to \infty$,
$$\frac{1}{n}\sum_{i=1}^n u_{t_{i-1}} u_{t_i} \to_{a.s.} E(u_{t_0} u_{t_1}) = \rho\, E(u^2_{t_0}), \qquad \frac{1}{n}\sum_{i=1}^n u^2_{t_{i-1}} \to_{a.s.} E(u^2_{t_0}).$$
Thus
$$\frac{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}{\sum_{i=1}^n u^2_{t_{i-1}}} \to_{a.s.} \rho.$$
Further,
$$\sqrt{n}\,(\hat{\rho}_n - \rho) = \frac{n^{-1/2}\sum_{i=1}^n u_{t_{i-1}}\left(u_{t_i} - \rho\, u_{t_{i-1}}\right)}{n^{-1}\sum_{i=1}^n u^2_{t_{i-1}}}.$$
Since $E(u_{t_1} u_{t_2} \mid u_{t_1}) = \rho\, u^2_{t_1}$, it follows by Lemma 3.1 in Bibby and Sørensen [2] that
$$n^{-1/2}\sum_{i=1}^n u_{t_{i-1}}\left(u_{t_i} - \rho\, u_{t_{i-1}}\right)$$
converges in distribution to a normal distribution with mean zero and variance equal to
$$E\left[(u_{t_1} u_{t_2}) - E(u_{t_1} u_{t_2} \mid u_{t_1})\right]^2 = \left(1 - e^{2(\theta - \beta_1\delta)}\right)\left\{2(\beta_1 - \theta)(\beta_i + 1)\right\}^{-1}.$$
Applying the delta method, the result follows.

In the next step, we use the estimator of $\lambda$ to estimate $\theta$. Note that
$$\frac{1}{\hat{\rho}_n} = \frac{\sum_{i=1}^n u^2_{t_{i-1}}}{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}.$$
Hence
$$1 + \frac{\beta_1^{2m} - \kappa(\theta)}{\lambda} = \frac{\sum_{i=1}^n u^2_{t_{i-1}}}{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}.$$
Thus
$$\frac{\beta_1^{2m} - \kappa(\theta)}{\lambda} = \frac{\sum_{i=1}^n u^2_{t_{i-1}}}{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}} - 1 = -\frac{\sum_{i=1}^n u_{t_{i-1}}\left[u_{t_i} - u_{t_{i-1}}\right]}{\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}.$$
Now replace $\lambda$ by its MLE $\hat{\lambda}_n$:
$$\beta_1^{2m} - \kappa(\theta) = -\frac{\sum_{i=1}^n u_{t_{i-1}}\left[u_{t_i} - u_{t_{i-1}}\right]}{\frac{T}{n}\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}.$$
Thus
$$\hat{\theta}_n = \kappa^{-1}\left(\beta_1^{2m} + \frac{\sum_{i=1}^n u_{t_{i-1}}\left[u_{t_i} - u_{t_{i-1}}\right]}{\frac{T}{n}\sum_{i=1}^n u_{t_{i-1}} u_{t_i}}\right).$$
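The two-stage plug-in estimator above can be sketched numerically. In the sketch below we take, purely for illustration, $\kappa(\theta) = \theta$ (so $\kappa^{-1}$ is the identity) and $\beta_1^{2m} = 1$; the Fourier coefficient is simulated as a stationary Gaussian OU process with drift $\mu_1 = \kappa(\theta) - \beta_1^{2m} < 0$ sampled at Poisson arrival times (a stand-in for the fractional Levy driver, not the paper's model):

```python
import numpy as np

# Two-stage sketch: simulate a stationary OU coefficient with drift
# mu1 = kappa(theta) - beta1_2m sampled at Poisson arrivals, then recover
# theta by the plug-in formula with lambda replaced by n/T.
# Assumptions for illustration only: kappa(theta) = theta, beta1_2m = 1,
# Gaussian innovations.
rng = np.random.default_rng(4)
theta_true, beta1_2m, lam = 0.5, 1.0, 1.0
mu1 = theta_true - beta1_2m                      # = -0.5
n = 20_000
dt = rng.exponential(1 / lam, size=n)            # Poisson inter-arrival gaps
u = np.empty(n + 1)
u[0] = rng.normal(0.0, np.sqrt(1 / (-2 * mu1)))  # stationary start
for i in range(n):
    phi = np.exp(mu1 * dt[i])
    sd = np.sqrt((1 - phi**2) / (-2 * mu1))      # exact conditional st. dev.
    u[i + 1] = phi * u[i] + sd * rng.standard_normal()
T = dt.sum()
num = np.dot(u[:-1], u[1:] - u[:-1])             # sum u_{i-1}(u_i - u_{i-1})
den = (T / n) * np.dot(u[:-1], u[1:])            # (T/n) sum u_{i-1} u_i
theta_hat = beta1_2m + num / den                 # kappa^{-1} is identity here
```

Under these assumptions $\rho = \lambda/(\lambda - \mu_1) = 2/3$ and the plug-in estimate recovers $\theta$ up to Monte Carlo error.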
Since the function $\kappa^{-1}(\cdot)$ is continuous, by application of the delta method the following result is a consequence of Theorem 3.2.

Theorem 3.3. We have
$$\hat{\theta}_n \to_{a.s.} \theta \quad \text{and} \quad \sqrt{n}\,(\hat{\theta}_n - \theta) \to^{\mathcal{D}} N\left(0,\ (\kappa'(\theta))^{-2}\lambda^2\left(1 - e^{-2\lambda^{-1}(\kappa(\theta) - \beta_1^{2m})}\right)\right) \quad \text{as } n \to \infty.$$
In the second stage, we plug in $\hat{\lambda}_n$ for $\lambda$.

Remark. Sub-fractional Brownian motion, which has the main properties of fractional Brownian motion except stationarity of increments, has the covariance function
$$C_H(s,t) = s^{2H} + t^{2H} - \frac12\left[(s+t)^{2H} + |s-t|^{2H}\right], \quad s, t > 0.$$
One can generalize this to a sub-fractional Levy process by the plug-in method; it would have nonstationary increments, and the corresponding SPDE models could be used for modeling in finance and biology.

References

[1] O.E. Barndorff-Nielsen, N. Shephard, Non-Gaussian Ornstein-Uhlenbeck-based models and some of their uses in financial economics, J. R. Stat. Soc. B 63 (2001) 167–241. https://doi.org/10.1111/1467-9868.00282
[2] B.M. Bibby, M. Sørensen, Martingale estimation functions for discretely observed diffusion processes, Bernoulli 1 (1995) 17–39. https://doi.org/10.2307/3318679
[3] J.P.N. Bishwal, Bayes and sequential estimation in Hilbert space valued stochastic differential equations, J. Korean Stat. Soc. 28 (1999) 93–106.
[4] J.P.N. Bishwal, The Bernstein-von Mises theorem and spectral asymptotics of Bayes estimators for parabolic SPDEs, J. Aust. Math. Soc. 72 (2002) 287–298. https://doi.org/10.1017/S1446788700003906
[5] J.P.N. Bishwal, Parameter Estimation in Stochastic Differential Equations, Lecture Notes in Mathematics 1923, Springer-Verlag, 2008.
[6] J.P.N. Bishwal, Maximum quasi-likelihood estimation in fractional Levy stochastic volatility model, J. Math. Finance 1 (2011) 58–62. https://doi.org/10.4236/jmf.2011.13008
[7] J.P.N.
Bishwal, Berry-Esseen inequalities for the discretely observed Ornstein-Uhlenbeck-Gamma process, Markov Processes and Related Fields 17 (2011) 119–150.
[8] J.P.N. Bishwal, Minimum contrast estimation in fractional Ornstein-Uhlenbeck process: Continuous and discrete sampling, Fract. Calc. Appl. Anal. 14 (2011) 375–410. https://doi.org/10.2478/s13540-011-0024-6
[9] J.P.N. Bishwal, Bernstein-von Mises theorem and small noise Bayesian asymptotics for parabolic stochastic partial differential equations, Theory Stoch. Proc. 23 (2018) 6–17.
[10] M. Huebner, A characterization of asymptotic behaviour of maximum likelihood estimators for stochastic PDE's, Math. Methods Stat. 6 (1997) 395–415.
[11] M. Huebner, Asymptotic properties of the maximum likelihood estimator for stochastic PDEs disturbed by small noise, Stat. Inference Stoch. Processes 2 (1999) 57–68. https://doi.org/10.1023/A:1009990504925
[12] M. Hübner, R. Khasminskii, B.L. Rozovskii, Two examples of parameter estimation for stochastic partial differential equations, in: S. Cambanis, J.K. Ghosh, R.L. Karandikar, P.K. Sen (Eds.), Stochastic Processes, Springer, New York, 1993, pp. 149–160. https://doi.org/10.1007/978-1-4615-7909-0_18
[13] M. Huebner, B.L. Rozovskii, On asymptotic properties of maximum likelihood estimators for parabolic stochastic PDE's, Probab. Theory Related Fields 103 (1995) 143–163. https://doi.org/10.1007/BF01204212
[14] I.A. Ibragimov, R.Z. Khas'minskii, Some estimation problems for stochastic partial differential equations, Dokl. Akad. Nauk 353 (1997) 300–302.
[15] A.N. Kolmogorov, Wiener skewline and other interesting curves in Hilbert space, Dokl. Akad. Nauk 26 (1940) 115–118.
[16] M. Kleptsyna, A. Le Breton, Statistical analysis of the fractional Ornstein-Uhlenbeck type process, Stat. Inference Stoch. Processes 5 (2002) 229–248. https://doi.org/10.1023/A:1021220818545
[17] T. Koski, W. Loges, Asymptotic statistical inference for a stochastic heat flow problem, Stat. Probab. Lett.
3 (1985) 185–189. https://doi.org/10.1016/0167-7152(85)90015-X
[18] T. Koski, W. Loges, On minimum-contrast estimation for Hilbert space-valued stochastic differential equations, Stochastics 16 (1986) 217–225. https://doi.org/10.1080/17442508608833374
[19] P. Lévy, Processus stochastiques et mouvement Brownien, Gauthier-Villars, Paris, 1948.
[20] W. Loges, Girsanov's theorem in Hilbert space and an application to the statistics of Hilbert space-valued stochastic differential equations, Stoch. Processes Appl. 17 (1984) 243–263. https://doi.org/10.1016/0304-4149(84)90004-8
[21] S.V. Lototsky, B.L. Rozovskii, Spectral asymptotics of some functionals arising in statistical inference for SPDEs, Stoch. Processes Appl. 79 (1999) 69–94. https://doi.org/10.1016/S0304-4149(98)00079-9
[22] Y.S. Mishura, Stochastic Calculus for Fractional Brownian Motion and Related Processes, Springer-Verlag, Berlin, 2008.
[23] I. Norros, E. Valkeila, J. Virtamo, An elementary approach to a Girsanov formula and other analytical results on fractional Brownian motions, Bernoulli 5 (1999) 571–587. https://doi.org/10.2307/3318691
[24] S. Peszat, J. Zabczyk, Stochastic Partial Differential Equations with Levy Noise: An Evolution Equations Approach, Cambridge University Press, Cambridge, 2007.
[25] D. Straumann, Estimation in Conditionally Heteroscedastic Time Series Models, Lecture Notes in Statistics 181, Springer-Verlag, Berlin, 2005.
[26] C.A. Tudor, F.G.
Viens, Statistical aspects of the fractional stochastic calculus, Ann. Stat. 35 (2007) 1183–1212. https://doi.org/10.1214/009053606000001541
[27] B.B. Mandelbrot, J.W. Van Ness, Fractional Brownian motions, fractional noises and applications, SIAM Rev. 10 (1968) 422–437. https://doi.org/10.1137/1010093