Quadratic Optimal Control of Fractional Stochastic Differential Equation with Application

This paper studies the quadratic optimal control of fractional stochastic differential equations, with application to an economic model, for different types of stochastic calculus (Itô and Stratonovich). Using the Dynkin formula, the Hamilton-Jacobi-Bellman (HJB) equation and the converse HJB equation are derived. An application is given to a stochastic model in economics.

where x(t), t ∈ [0, T], is a given continuous process, u(t) is a control process, H(t) is an n×n matrix, M(t) is an n×k matrix, b(t) is an n×m matrix, the control u(t) is a k×1 vector, and B_H(t) and B(t) are fractional Brownian motion and Brownian motion, respectively. We present the Dynkin formula, which can be obtained from the Taylor formula for the fractional stochastic differential equations above and their generators. Using the Dynkin formula and properties of expectation, the Hamilton-Jacobi-Bellman (HJB) equation and the converse HJB equation are stated. The stochastic optimal control for the stochastic differential delay equation was found in [1]; here we prove the Dynkin formula, the HJB equation, the converse HJB equation, and the optimal control for each of the equations above. For definitions related to optimal control see [2]. We consider a Ramsey model [4,6] that takes into account randomness in the production cycle. The models are described by the equations

1. dk(t) = [H(t)k(t) + M(t)u(k(t))]dt + b(k(t))dB_H(t)
2. dk(t) = [H(t)k(t) + M(t)u(k(t))]dt + b(k(t)) ∘ dB(t)
3. dk(t) = [H(t)k(t) + M(t)u(k(t))]dt + b(k(t)) ∘ dB_H(t)

where k is the capital, M is the production, u is the control process, and H(t) is an n×n positive matrix. For these stochastic economic models the optimal control for the first and second economic equations is found to be of linear feedback form u(t) = − …

Definition (4), [3] A set function p: F → [0, 1] is called a probability measure if the following conditions hold: i. p(Ω) = 1; ii. p(A) ≥ 0 for every A ∈ F; iii. p(⋃_n A_n) = Σ_n p(A_n) for every sequence of pairwise disjoint sets A_n ∈ F.
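The three capital models above can be sketched numerically in scalar form. Everything below is an illustrative assumption of ours, not the paper's: the parameter values, the linear feedback u(k) = −0.5k, the Hurst index, and the simple Euler stepping (with additive, state-independent noise the Itô and Stratonovich integrals coincide, so one scheme serves all three models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar parameters (not the paper's): drift Hc, production Mc,
# noise level bc, Hurst index Hu, and feedback u(k) = -0.5*k.
T, n = 1.0, 200
dt = T / n
t = np.linspace(dt, T, n)
Hc, Mc, bc, Hu = 0.3, 1.0, 0.2, 0.7

# Fractional Brownian motion on the grid via Cholesky factorization of the
# fBm covariance R(s,t) = (s^2H + t^2H - |t-s|^2H)/2.
S, Tt = np.meshgrid(t, t)
cov = 0.5 * (S**(2*Hu) + Tt**(2*Hu) - np.abs(Tt - S)**(2*Hu))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
BH = L @ rng.standard_normal(n)
dBH = np.diff(np.concatenate(([0.0], BH)))   # fBm increments
dB = rng.standard_normal(n) * np.sqrt(dt)    # ordinary BM increments

def drift(k):
    return Hc * k + Mc * (-0.5 * k)          # H k + M u(k) with u(k) = -0.5 k

k1, k2, k3 = 1.0, 1.0, 1.0
p1, p2, p3 = [k1], [k2], [k3]
for i in range(n):
    # With additive noise the Stratonovich and Ito integrals coincide,
    # so a plain Euler step is valid for all three models.
    k1 += drift(k1) * dt + bc * dBH[i]       # model 1: Ito, fractional BM
    k2 += drift(k2) * dt + bc * dB[i]        # model 2: Stratonovich, BM
    k3 += drift(k3) * dt + bc * dBH[i]       # model 3: Stratonovich, fractional BM
    p1.append(k1); p2.append(k2); p3.append(k3)
```

The feedback makes the drift mean-reverting, so the three paths fluctuate around the initial capital; only the noise source (BM versus fBm) differs.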
Definition (14), [2] A measurable function f: Y⁺ → [0, ∞] is called supermeanvalued if f(x) ≥ E^x[f(x(T))] for all stopping times T and all x ∈ Y⁺.
Remark (1), [2] Let f_1, f_2, …, f_m be bounded Borel functions on Y⁺, let T be a stopping time, and let A_T be the associated σ-algebra. For all 0 ≤ h_1 ≤ h_2 ≤ … ≤ h_m, let g be the set of all real M_t-measurable functions for t ≥ 0, and define the shift operator θ_t by θ_t(f_1(x(h_1)) ⋯ f_m(x(h_m))) = f_1(x(h_1 + t)) ⋯ f_m(x(h_m + t)). For any stopping time u the strong Markov property E^x[θ_u η | M_u] = E^{x(u)}[η] holds for η ∈ g. In particular, if f(x) ≥ E^x[f(x(T))] for all stopping times T, then f is supermeanvalued.

Fractional Stochastic Differential Equation
Let a(x(t)) and b(x(t)) be continuous functionals defined on the metric space K, and let the fractional stochastic process x(t) satisfy the fractional stochastic differential equation

dx(t) = a(x(t))dt + b(x(t))dB_H(t),   (7)

where B_H(t) is fractional Brownian motion. Let H(t) be an n×n matrix, M(t) an n×k matrix, b(t) an n×m matrix, and the control u(t) a k×1 vector. Taking a(x(t)) = H(t)x(t) + M(t)u(t) (8) and b(x(t)) = b(t) (9), the stochastic process x(t) in (7) satisfies the linear fractional stochastic differential equation

dx(t) = [H(t)x(t) + M(t)u(t)]dt + b(t)dB_H(t).

Remark (2), "The Itô Fractional Taylor Formula", [9] Let x(t) be the stochastic process given by (7), where a(x(t)), b(x(t)) are continuous functionals defined on the metric space K, the Hurst parameter H ∈ (1/2, 1), and V is a separable Hilbert space. Let f: V → V be twice continuously differentiable, with first and second derivatives f′: V → S(V, V) and f″: V → S(V, V), for p, q ∈ [0, t]. Then the process f(x(t)) satisfies the Itô fractional Taylor formula

df(x(t)) = f′(x(t)) a(x(t)) dt + f′(x(t)) b(x(t)) dB_H(t) + H t^{2H−1} b²(x(t)) f″(x(t)) dt,

obtained by taking the derivative of both sides and applying (7).

Definition (15), [5] The generator A• of the fractional stochastic differential equation (7) is defined by

A• f(x(t)) = f′(x(t)) a(x(t)) + H t^{2H−1} b²(x(t)) f″(x(t)).
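The correction term in the fractional Itô formula can be checked by Monte Carlo in the simplest case f(x) = x² with dx = dB_H(t), where the formula predicts E[B_H(t)²] = t^{2H}. The sketch below (parameter choices are ours) samples fBm by Cholesky factorization of its covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo check of the fractional Ito correction for f(x) = x^2 and
# dx = dB_H(t): the fractional Taylor formula gives E[B_H(1)^2] = 1^(2H) = 1.
Hu, n, paths = 0.7, 50, 4000
t = np.linspace(1.0 / n, 1.0, n)
S, Tt = np.meshgrid(t, t)
cov = 0.5 * (S**(2*Hu) + Tt**(2*Hu) - np.abs(Tt - S)**(2*Hu))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
BH = L @ rng.standard_normal((n, paths))  # each column is one fBm path
emp = np.mean(BH[-1]**2)                  # empirical E[B_H(1)^2]
theo = 1.0**(2*Hu)                        # = 1
```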

Fractional Martingale Problem
If (7) is an Itô fractional stochastic differential equation with generator A• and f ∈ W(R), then the fractional martingale problem states that the process M_t = f(x(t)) − f(x(0)) − ∫_0^t A• f(x(s)) ds is a martingale.

Dynkin Formula for the Linear Quadratic Regulator Problem
Let h ∈ W(R), let C(t) be an n×n matrix and G(t) a k×k matrix. From equation (11) and equation (17) we obtain the following fractional Taylor formula for the function h(x(t)):

h(x(t)) = h(x(0)) + ∫_0^t A• h(x(s)) ds + ∫_0^t h′(x(s)) b(s) dB_H(s).

Let T be a stopping time for the stochastic process x(t) defined in equation (12) such that E[∫_0^T |A• h(x(t))| dt] < ∞. Taking the expectation of both sides, one gets the following Dynkin formula:

E[h(x(T))] = h(x(0)) + E[∫_0^T A• h(x(t)) dt].
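The Dynkin formula can be sanity-checked numerically in the simplest non-fractional case dx = dB(t) with h(x) = x², where A h = ½h″ = 1 and the formula reduces to E[h(x(T))] = h(x(0)) + T. The following sketch (parameter choices are ours) verifies this by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(2)

# Dynkin-formula check for dx = dB, h(x) = x^2, generator A h = 1:
# E[h(x(T))] should equal h(x0) + E[int_0^T 1 dt] = x0^2 + T.
x0, T, n, paths = 0.5, 1.0, 100, 20000
dt = T / n
X = np.full(paths, x0)
for _ in range(n):
    X += rng.standard_normal(paths) * np.sqrt(dt)  # Brownian increments
lhs = np.mean(X**2)    # Monte Carlo estimate of E[h(x(T))]
rhs = x0**2 + T        # h(x(0)) + E[int_0^T A h dt]
```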

The Quadratic Regulator Optimal Problem
Assume that the cost function of the fractional linear quadratic regulator problem is

h(x, u) = E[∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt + xᵀ(T)R x(T)],

where C(t) is an n×n matrix, G(t) is a k×k matrix, and the control u(t) is a k×1 vector. We assume that C(t) and R are symmetric and non-negative definite, G(t) is symmetric positive definite, and T is the final time of the solution x(t), where x(t) is defined in (3.4), with E|T| < ∞. The problem is to find the optimal control u*(t) such that h(x, u*(t)) = min{h(x, u)}.

Theorem (1) "HJB Equation"
Suppose that h ∈ W(R) and the optimal control u* exists. Then

min_u { xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t) + A• h*(x) } = 0,

where G(t) is a k×k matrix, the control u(t) is a k×1 vector, and the generator A• is given by equation (22). The minimum is achieved when u* is optimal; in other words,

xᵀ(t)C(t)x(t) + u*ᵀ(t)G(t)u*(t) + A• h*(x) = 0.

Proof. We proceed to prove (7.4). Let u = T_v be the first exit time of the solution x(t); the identity follows by using (2.5) and (2.6) together with equation (21).

Theorem (2) "Converse of the HJB Equation". Suppose that for every Markov control u the inequality xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t) + A• h*(x) ≥ 0 holds, with equality when u = u*. Then u* is an optimal control.

Proof. Let u be a Markov control. Then A• h*(x) ≥ −[xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)], and by the Dynkin formula

h*(x) ≤ E[∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt + h*(x(T))] = h(x, u),

with equality when u = u*; hence u* is an optimal control.
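In the classical Brownian special case with deterministic drift, the HJB equation for this quadratic cost with the ansatz h*(x) = P(t)x² reduces to a Riccati ODE, and the minimizing control is the linear feedback u* = −G⁻¹MᵀP x. A minimal scalar sketch follows; all parameter values are our illustrative choices, not the paper's:

```python
import numpy as np

# Scalar classical LQR: dynamics x' = Hc*x + Mc*u, cost
# J = int_0^T (C x^2 + G u^2) dt + R x(T)^2.  The HJB ansatz h*(x) = P(t) x^2
# gives the Riccati ODE  -P' = 2 Hc P - Mc^2 P^2 / G + C,  P(T) = R,
# and the feedback u*(t) = -(Mc/G) P(t) x(t).  Values are illustrative.
Hc, Mc, C, G, R, T, n = 0.3, 1.0, 1.0, 1.0, 1.0, 1.0, 1000
dt = T / n

def riccati_rhs(P):
    return -(2*Hc*P - Mc**2 * P**2 / G + C)   # dP/dt

# Integrate the Riccati ODE backward from P(T) = R with RK4.
P = np.empty(n + 1); P[n] = R
for i in range(n, 0, -1):
    k1 = riccati_rhs(P[i])
    k2 = riccati_rhs(P[i] - 0.5*dt*k1)
    k3 = riccati_rhs(P[i] - 0.5*dt*k2)
    k4 = riccati_rhs(P[i] - dt*k3)
    P[i-1] = P[i] - dt*(k1 + 2*k2 + 2*k3 + k4)/6

def cost(feedback):
    # Euler simulation of the closed-loop system from x(0) = 1.
    x, J = 1.0, 0.0
    for i in range(n):
        u = feedback(i, x)
        J += (C*x**2 + G*u**2) * dt
        x += (Hc*x + Mc*u) * dt
    return J + R*x**2

J_opt  = cost(lambda i, x: -(Mc/G) * P[i] * x)   # Riccati feedback u*
J_zero = cost(lambda i, x: 0.0)                  # uncontrolled comparison
```

The theory predicts the optimal cost from x(0) = 1 equals P(0), which the simulation reproduces up to discretization error, and the uncontrolled cost is strictly larger.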

Application 1 [Economics Model and Its Optimization: Fractional Stochastic Differential Equation]
In 1928 F. P. Ramsey introduced an economics model describing the rate of change of capital K and labor L in a market by a system of ordinary differential equations, with P and C being the production and consumption rates respectively. The model is given by

k̇(t) = p(t) − C(t),    L̇(t) = a(t)L(t),

where a(t) is the rate of growth of labor. The production, capital and labor are related by the Cobb-Douglas formula

p(t) = A k(t)^α L(t)^β,

where A, α, β are some positive constants. In certain markets the dependence of P on K and L is linear, which means α = β = 1; this will be our assumption throughout this section. We shall also assume that the labor is constant, L(t) ≡ L, which is true for certain markets over relatively short time intervals of several years.
Therefore the production rate and the capital are related by p(t) = H(t)k(t) [1]. Another important assumption we make is that the production rate is subject to small random disturbances, i.e., p(t) = H(t)k(t) + b(k(t))dB_H(t), which can be rewritten in the differential form as

dk(t) = [H(t)k(t) + M(t)]dt + b(k(t))dB_H(t),    M(t) = −C(t),

where B_H is fractional Brownian motion and b(k(t)) is a real function characteristic of the noise.
Assume that M(t) can be controlled, so that the capital satisfies

dk(t) = [H(t)k(t) + M(t)u(k(t))]dt + b(k(t))dB_H(t).

Usually one wants to minimize a cost function; let us choose the quadratic cost function introduced above. Applying the generator to h*(x) = xᵀ(T)R x(T) and taking the derivative of both sides, one obtains the linear feedback control that is optimal for the linear quadratic fractional Brownian motion differential equation, together with the optimal cost function.

Stratonovich Stochastic Differential Equation
Let a(x(t)) and b(x(t)) be continuous functionals defined on the metric space K, and let the stochastic process x(t) satisfy the Stratonovich stochastic differential equation

dx(t) = a(x(t))dt + b(x(t)) ∘ dB(t),   (30)

where B(t) is Brownian motion. Let H(t) be an n×n matrix, M(t) an n×k matrix, b(t) an n×m matrix, and the control u(t) a k×1 vector. Taking a(x(t)) = H(t)x(t) + M(t)u(t) and b(x(t)) = b(t), equation (30) becomes the linear Stratonovich stochastic differential equation dx(t) = [H(t)x(t) + M(t)u(t)]dt + b(t) ∘ dB(t).
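The distinction between the Itô and Stratonovich readings of the same equation matters only for state-dependent noise. The toy equation dX = X ∘ dB, whose Stratonovich solution is X(t) = x₀e^{B(t)} with E[X(1)] = x₀e^{1/2}, illustrates this; the sketch below (parameter choices are ours) simulates Stratonovich paths with the Heun predictor-corrector scheme:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stratonovich toy equation dX = X o dB.  Its solution is X(t) = x0*exp(B(t)),
# so E[X(1)] = x0*exp(1/2); the Ito reading would instead give E[X(1)] = x0.
# The Heun (predictor-corrector) scheme converges to the Stratonovich solution.
x0, T, n, paths = 1.0, 1.0, 200, 20000
dt = T / n
X = np.full(paths, x0)
for _ in range(n):
    dB = rng.standard_normal(paths) * np.sqrt(dt)
    pred = X + X * dB                  # Euler predictor with b(X) = X
    X = X + 0.5 * (X + pred) * dB      # Heun corrector (Stratonovich)
emp = np.mean(X)                       # Monte Carlo estimate of E[X(1)]
theo = x0 * np.exp(0.5)
```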

The Martingale Problem
If (30) is a Stratonovich stochastic differential equation with generator A• and f ∈ W(R), then the process M_t = f(x(t)) − f(x(0)) − ∫_0^t A• f(x(s)) ds is a martingale.

Dynkin Formula for Fractional Stochastic Linear Quadratic Regulator Problem with Stratonovich Formula
Let h ∈ W(R), let C(t) be an n×n matrix and G(t) a k×k matrix. From (34) we obtain the Stratonovich formula (44) for the function h(x(t)). Let T be a stopping time for the stochastic process x(t) such that E[∫_0^T |A• h(x(t))| dt] < ∞. Taking the expectation of both sides, one gets the following Dynkin formula:

E[h(x(T))] = h(x(0)) + E[∫_0^T A• h(x(t)) dt].   (45)

The Fractional Stochastic Quadratic Regulator Optimal Problem
Assume that the cost function of the linear quadratic regulator problem is

h(x, u) = E[∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt + xᵀ(T)R x(T)],

where C(t) is an n×n matrix, G(t) is a k×k matrix, and the control u(t) is a k×1 vector. We assume that C(t) and R are symmetric and non-negative definite, G(t) is symmetric positive definite, and T is the final time of the solution x(t), where x(t) is defined in (25), with E|T| < ∞. The problem is to find the optimal control u*(t) such that h(x, u*(t)) = min{h(x, u)}. (47)

Hamilton-Jacobi-Bellman Equation for Quadratic Regulator Problem
Let the optimal control u*(t) ∈ Y, where Y is the set of controls; then the generator in equation (35) becomes

Proof
Now we proceed to prove (43). Let u = T_v be the first exit time of the solution x(t); the result follows by using (4) and (5).

Theorem (4) "Converse of the HJB Equation". Let h*(x) be a bounded function in W(−) ∩ C(CL(G)). Suppose that for all u ∈ Y, where Y is the set of controls, the inequality xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t) + A• h*(x) ≥ 0 holds, with equality when u = u*. Then u* is an optimal control.

Proof. Let u be a Markov control. Then A• h*(x) ≥ −[xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)], and by the Dynkin formula h*(x) ≤ E[∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt + h*(x(T))] = h(x, u), with equality when u = u*; therefore u* is an optimal control.

Application 2 [Economics Model with Brownian Stratonovich Differential Equation]
In 1928 F. P. Ramsey introduced an economics model describing the rate of change of capital K and labor L in a market by a system of ordinary differential equations, with P and C being the production and consumption rates respectively. The model is given by

k̇(t) = p(t) − C(t),    L̇(t) = a(t)L(t),

where a(t) is the rate of growth of labor. The production, capital and labor are related by the Cobb-Douglas formula

p(t) = A k(t)^α L(t)^β,
where A, α, β are some positive constants. In certain markets the dependence of P on K and L is linear, which means α = β = 1; this will be our assumption throughout this section. We shall also assume that the labor is constant, L(t) ≡ L, which is true for certain markets over relatively short time intervals of several years. Therefore the production rate and the capital are related by p(t) = H(t)k(t). Another important assumption we make is that the production rate is subject to small random disturbances, i.e., p(t) = H(t)k(t) + b(k(t)) ∘ dB(t). Therefore

k̇(t) = H(t)k(t) + b(k(t)) ∘ dB(t) − C(t),

where M(t) = −C(t), which can be rewritten in the differential form as

where B(t) is Brownian motion and b(k(t)) is a real function characteristic of the noise. Assume that M(t) can be controlled; then equation (55) becomes

dk(t) = [H(t)k(t) + M(t)u(k(t))]dt + b(k(t)) ∘ dB(t).   (56)
Usually one wants to minimize the cost function (38). Let h*(x) = xᵀ(T)R x(T) with h*(x(T)) ∈ D(A•). By taking the derivative of both sides, one obtains the optimal control for the Stratonovich stochastic linear quadratic differential equation and the optimal cost function.

Fractional Stratonovich Stochastic Differential Equation
Let a(x(t)) and b(x(t)) be continuous functionals defined on the metric space K, and let the fractional stochastic process x(t) satisfy the fractional Stratonovich stochastic differential equation

dx(t) = a(x(t))dt + b(x(t)) ∘ dB_H(t),

where B_H(t) is fractional Brownian motion. Let H(t) be an n×n matrix, M(t) an n×k matrix, b(t) an n×m matrix, and the control u(t) a k×1 vector. Taking a(x(t)) = H(t)x(t) + M(t)u(t) and b(x(t)) = b(t), equation (35) becomes the linear fractional Stratonovich stochastic differential equation.

Remark (6), "The Itô Fractional Stratonovich Taylor Formula"
Let the stochastic process x(t) be defined as in (62), where ã(x(t)) and b(t) ∘ dB_H(t) are defined in equation (58) and equation (59) respectively, and a(x(t)), b(x(t)) are continuous functionals defined on the metric space K. Then x(t) satisfies the Itô fractional Stratonovich Taylor formula for f: R → R; applying equation (58) and taking the derivative of both sides gives the stated formula.
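For Hurst parameter H > 1/2 the Stratonovich-type (pathwise) integral against fractional Brownian motion obeys the ordinary chain rule, which is why no Itô-type correction term appears here. A quick numerical check (parameter choices are ours) for f(x) = x³, where B_H(1)³ = 3∫_0^1 B_H(s)² ∘ dB_H(s), using midpoint Riemann sums:

```python
import numpy as np

rng = np.random.default_rng(4)

# Pathwise chain rule for fBm with H > 1/2:
#   B_H(1)^3 = 3 * int_0^1 B_H(s)^2 o dB_H(s),
# checked on a single simulated path with midpoint Riemann sums.
Hu, n = 0.7, 400
t = np.linspace(1.0 / n, 1.0, n)
S, Tt = np.meshgrid(t, t)
cov = 0.5 * (S**(2*Hu) + Tt**(2*Hu) - np.abs(Tt - S)**(2*Hu))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
B = np.concatenate(([0.0], L @ rng.standard_normal(n)))  # path, B(0) = 0
dB = np.diff(B)
mid = 0.5 * (B[:-1] + B[1:])              # midpoint values of the path
stratonovich = np.sum(3 * mid**2 * dB)    # midpoint sum for 3 B^2 o dB
chain_rule = B[-1]**3                     # f(B(1)) - f(B(0)) with f(x) = x^3
```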

Remark (7)
By substituting equation (66) into equation (16), one gets (67). Let h ∈ W(R), let C(t) be an n×n matrix and G(t) a k×k matrix. From equations (33) and (34) we obtain the following Stratonovich formula for the function h(x(t)), where h(x(t)) is defined as above; then equation (35) becomes

The Fractional Stratonovich Martingale Problem
If (62) is an Itô fractional Stratonovich stochastic differential equation with generator A• and f ∈ W(R), then the process M_t = f(x(t)) − f(x(0)) − ∫_0^t A• f(x(s)) ds is a martingale.

Dynkin Formula for the Fractional Linear Stratonovich Quadratic Regulator Problem
Let h ∈ W(R), let C(t) be an n×n matrix and G(t) a k×k matrix. From equations (33) and (34) we obtain the Stratonovich formula for the function h(x(t)), where h(x(t)) is defined as above. Let T be a stopping time for the stochastic process x(t) such that E[∫_0^T |A• h(x(t))| dt] < ∞. Taking the expectation of both sides, one gets the following Dynkin formula:

E[h(x(T))] = h(x(0)) + E[∫_0^T A• h(x(t)) dt].

The Fractional Stochastic Quadratic Regulator Optimal Problem with Stratonovich
We assume that the cost function of the linear quadratic regulator problem is

h(x, u) = E[∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt + xᵀ(T)R x(T)],

where C(t) is an n×n matrix, G(t) is a k×k matrix, and the control u(t) is a k×1 vector. We assume that C(t) and R are symmetric and non-negative definite, G(t) is symmetric positive definite, and T is the final time of the solution x(t), where x(t) is defined in (12.7), with E|T| < ∞. The problem is to find the optimal control u*(t) such that h(x, u*(t)) = min{h(x, u)}. (76)

Hamilton-Jacobi-Bellman Equation for Fractional Stochastic Quadratic Regulator Problem
Let the optimal control u*(t) ∈ Y, where Y is the set of controls; then the generator in equation (16) becomes

Theorem (5) "HJB Equation"
Define h*(x) = min{h(x, u): u = u(t) a Markov control}. (78) Suppose that h ∈ W(R) and the optimal control u* exists. Then

min_u { xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t) + A• h*(x) } = 0,

where G(t) is a k×k matrix, the control u(t) is a k×1 vector, and the generator A• is given by equation (77). The minimum is achieved when u* is optimal; in other words,

xᵀ(t)C(t)x(t) + u*ᵀ(t)G(t)u*(t) + A• h*(x) = 0.

Proof
Now we proceed to prove equation (81). Let u = T_v be the first exit time of the solution x(t); the result follows by using (4) and (5) together with equation (7).

Theorem (6) "Converse of the HJB Equation". Let h*(x) be a bounded function in W(−) ∩ C(CL(G)). Suppose that for all u ∈ Y, where Y is the set of controls, the inequality xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t) + A• h*(x) ≥ 0 holds. Then h*(x) ≤ h(x, u) for all u ∈ Y; moreover, if xᵀ(t)C(t)x(t) + u*ᵀ(t)G(t)u*(t) + A• h*(x) = 0, then u* is an optimal control.

Proof. Let u be a Markov control. Then A• h*(x) ≥ −[xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)] for all u ∈ Y. By equation (74) and the Dynkin formula,

h*(x) ≤ E[h*(x(T)) + ∫_0^T (xᵀ(t)C(t)x(t) + uᵀ(t)G(t)u(t)) dt] = h(x, u),

with equality when u = u*; therefore u* is an optimal control.
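The verification argument above can be probed numerically in the classical scalar case with additive noise, where the Riccati feedback should give a smaller expected cost than a competing Markov control (here compared against u = 0). All values below are our illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Scalar LQR with additive noise: dx = (Hc*x + Mc*u)dt + bc*dB.
# The Riccati feedback u* = -(Mc/G) P(t) x should have a smaller Monte Carlo
# cost than the uncontrolled process u = 0.  Parameter values are illustrative.
Hc, Mc, C, G, R, bc = 0.3, 1.0, 1.0, 1.0, 1.0, 0.2
T, n, paths = 1.0, 200, 5000
dt = T / n

# Solve the Riccati ODE -P' = 2 Hc P - Mc^2 P^2 / G + C, P(T) = R,
# backward in time with an Euler scheme.
P = np.empty(n + 1); P[n] = R
for i in range(n, 0, -1):
    dP = -(2*Hc*P[i] - Mc**2 * P[i]**2 / G + C)
    P[i-1] = P[i] - dt * dP

def mc_cost(gain):
    # Monte Carlo expected cost under the linear feedback u = gain(i) * x.
    x = np.ones(paths)
    J = np.zeros(paths)
    for i in range(n):
        u = gain(i) * x
        J += (C*x**2 + G*u**2) * dt
        x += (Hc*x + Mc*u)*dt + bc*rng.standard_normal(paths)*np.sqrt(dt)
    return np.mean(J + R*x**2)

J_star = mc_cost(lambda i: -(Mc/G) * P[i])  # Riccati feedback u*
J_zero = mc_cost(lambda i: 0.0)             # uncontrolled comparison
```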

Application 3 [Economics Model with Fractional Stratonovich Differential Equation]
In 1928 F. P. Ramsey introduced an economics model describing the rate of change of capital K and labor L in a market by a system of ordinary differential equations, with P and C being the production and consumption rates respectively. The model is given by

k̇(t) = p(t) − C(t),    L̇(t) = a(t)L(t),

where a(t) is the rate of growth of labor. The production, capital and labor are related by the Cobb-Douglas formula

p(t) = A k(t)^α L(t)^β,

where A, α, β are some positive constants. In certain markets the dependence of P on K and L is linear, which means α = β = 1; this will be our assumption throughout this section. We shall also assume that the labor is constant, L(t) ≡ L, which is true for certain markets over relatively short time intervals of several years.
Therefore the production rate and the capital are related by p(t) = H(t)k(t) [1]. Another important assumption we make is that the production rate is subject to small random disturbances, i.e., p(t) = H(t)k(t) + b(k(t)) ∘ dB_H(t). Therefore

k̇(t) = H(t)k(t) + b(k(t)) ∘ dB_H(t) − C(t),

where M(t) = −C(t), which can be rewritten in the differential form as: