Optimal Control and Hamiltonian Systems

In this paper, an optimal control problem for Hamiltonian control systems with external variables is formulated and analysed. Necessary and sufficient conditions which lead to Pontryagin's principle are stated and elaborated. Finally, it is shown how Pontryagin's principle fits naturally into the theory of Hamiltonian systems. Pontryagin's maximum principle is considered in detail since it is capable of dealing with both unbounded continuous controls and bounded controls which are possibly discontinuous.


Introduction
It has been essential for many physical systems governed by differential equations to be controlled in such a way that a given performance index is optimized; large savings in cost have been obtained from small improvements in performance. The optimal control problem formulated here is the so-called Bolza problem [1], with the added condition that the control variables lie in a closed set. In his paper on optimal control of stochastic dynamical systems, [2] established the existence of stochastic optimal controls for a large class of stochastic differential systems with finite memory. [3] developed a feedback control law for dynamical systems described by constrained generalized coordinates. They revealed that for certain complex dynamical systems it is more desirable to develop the mathematical model using more general coordinates than degrees of freedom, which leads to differential-algebraic equations of motion. [4] developed a computational approach to motor control that offers a unifying modelling framework for both dynamic systems and optimal control approaches. Through discussions of several behavioural experiments and some theoretical and robotics studies, they demonstrated how these computational ideas allow both the representation of self-organizing processes and the optimization of movement based on reward criteria. [5] proposed a new mathematical formulation for the problem of optimal traffic assignment in dynamic networks with multiple origins and destinations. Several researchers have studied optimal control and dynamical systems. [6] studied dynamical-systems-based optimal control of incompressible fluids, proposing a cost functional based on a local dynamical-systems characterization of vortices. However, the connection between optimal control and Hamiltonian systems, especially the necessary conditions of optimality, has not yet been studied.
In this paper, it is intended to focus on the link between optimal control and Hamiltonian systems. The case of Potryagin's maximum principle will be considered in detail since it is capable of dealing with both unbounded continuous controls and bounded controls which are possibly discontinuous.

Formulation of Optimal Control Problem
We consider the state of a control system described by an $n$-vector $x(t) \in \mathbb{R}^n$ whose evolution is governed by a system of differential equations $\dot{x}(t) = f(t, x(t), u(t))$, $x(t_0) = x_0$, where $u$ is a control function taking values in a closed subset $U$ of $\mathbb{R}^m$. The optimal control problem can be stated as follows: minimize the performance index $J(u) = \varphi(x(T)) + \int_{t_0}^{T} L(t, x(t), u(t))\,dt$ over all continuous functions $x$ and measurable functions $u$ satisfying the system above. Here $L$ is called the running cost and $\varphi$ the terminal cost [7].
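As a concrete numerical illustration (the system and costs below are our own choices, not taken from the paper), the performance index of such a Bolza problem can be evaluated for a given control by integrating the dynamics and accumulating the running and terminal costs:

```python
# Illustrative instance (not from the paper): scalar system x' = -x + u
# on [0, T], running cost L = x^2 + u^2, terminal cost phi(x) = x^2.
def performance_index(u_func, x0=1.0, T=1.0, N=1000):
    dt = T / N
    x, J = x0, 0.0
    for k in range(N):
        t = k * dt
        u = u_func(t)
        J += (x**2 + u**2) * dt   # accumulate running cost L(t, x, u)
        x += (-x + u) * dt        # Euler step of x' = f(t, x, u)
    return J + x**2               # add terminal cost phi(x(T))

J_free = performance_index(lambda t: 0.0)    # uncontrolled trajectory
J_ctrl = performance_index(lambda t: -0.2)   # a crude constant control
print(J_free, J_ctrl)  # the constant control lowers the total cost
```

Even this crude constant control reduces $J$ below the uncontrolled value, which is what the optimization over all admissible $u$ formalizes.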
Pontryagin's principle requires the introduction of the Hamiltonian function $H(t, x, u, \lambda) = L(t, x, u) + \lambda^{T} f(t, x, u)$, in analogy with the corresponding quantity in classical mechanics.
In a form similar to the formulation of Hamiltonian systems, the following set of equations holds [8]: $\dot{x} = \partial H / \partial \lambda$, $\dot{\lambda} = -\partial H / \partial x$.
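As a minimal worked example (our own, not from the paper), take the scalar system $\dot{x} = u$ with running cost $L = \tfrac{1}{2}u^{2}$; the Hamiltonian and canonical equations become:

```latex
H(t, x, u, \lambda) = \tfrac{1}{2}u^{2} + \lambda u,
\qquad
\dot{x} = \frac{\partial H}{\partial \lambda} = u,
\qquad
\dot{\lambda} = -\frac{\partial H}{\partial x} = 0.
% Minimizing H over u gives \partial H / \partial u = u + \lambda = 0,
% so the candidate optimal control is u = -\lambda with \lambda constant.
```

Note how the adjoint equation $\dot{\lambda} = 0$ follows simply because $H$ does not depend on $x$ here.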

Necessary Conditions for Optimality
In this section we shall state the necessary conditions for optimality which lead to Pontryagin's maximum principle.

Theorem: The necessary conditions for $x_0$ to be an optimal initial condition and $u$ an optimal control for the optimal control problem stated above are the existence of a nonzero $k$-dimensional vector $\lambda$ with $\lambda_1 \le 0$ and an $n$-dimensional vector function $\lambda(t)$ satisfying (i) the adjoint equations $\dot{\lambda} = -\partial H / \partial x$, and (ii) $H(t, x(t), u(t), \lambda(t)) \le H(t, x(t), v, \lambda(t))$ for all admissible $v$. Condition (ii) above can be written as $H(t, x(t), u(t), \lambda(t)) = \min_{v \in U} H(t, x(t), v, \lambda(t))$. This is called Pontryagin's maximum principle.
The interpretation of this principle is that along the optimal trajectory, $H$ is minimized with respect to the control variables $u_j$, $j = 1, \ldots, m$.
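A small numerical check of this principle (our own example, not from the paper): for the minimum-energy problem $\min \int_0^1 u^2\,dt$ subject to $\dot{x} = u$, $x(0) = 0$, $x(1) = 1$, the Hamiltonian is $H = u^2 + \lambda u$, the adjoint equation gives $\dot{\lambda} = 0$, and minimizing $H$ over $u$ gives the constant control $u = -\lambda/2$; the boundary conditions then force $u \equiv 1$. Any other admissible control should cost more:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1001)
dt = t[1] - t[0]

def integral(f):
    # composite trapezoid rule on the grid t
    return float(np.sum((f[:-1] + f[1:]) * 0.5 * dt))

def cost(u):
    return integral(u**2)                      # J(u) = int u^2 dt

def hits_target(u):
    return abs(integral(u) - 1.0) < 1e-6       # x(1) = int u dt = 1

u_star = np.ones_like(t)                       # PMP candidate: u = 1
u_pert = 1.0 + 0.5 * np.sin(2 * np.pi * t)     # admissible perturbation

assert hits_target(u_star) and hits_target(u_pert)
print(cost(u_star), cost(u_pert))  # the PMP candidate is cheaper
```

The perturbation integrates to the same terminal state but strictly increases the cost, consistent with the pointwise minimization of $H$.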
For simplicity we shall treat the free terminal point problem. This is a special case of the problem of optimal control in which the initial time and final time are fixed and there are no conditions on the final state.
We shall restate Pontryagin's principle so that it fits naturally into our framework of the free terminal point problem.
Theorem (Pontryagin's principle for the free terminal point problem) [1]: A necessary condition for optimality of a control $u$ for the free terminal point problem is that, for each $t$, $H(t, x(t), u(t), \lambda(t)) = \min_{v \in U} H(t, x(t), v, \lambda(t))$.

Pontryagin's principle gives only necessary conditions for optimality; these conditions need not be sufficient. Since each optimal control must be extremal, there may be extremal controls which are not optimal. It is therefore natural to ask for conditions which are both necessary and sufficient for optimality. The necessary and sufficient condition that a control $u$ be optimal for the free terminal point problem is that, for each fixed $t$, $u(t)$ minimizes $H$ over the control set $K$. To fit this to the performance index, it is assumed that $L$ is a real continuously differentiable function, convex in $(x, u)$, and that $\varphi$ is a continuously differentiable convex function of $x$. For simplicity we shall consider a linear system.
Theorem: A necessary and sufficient condition for optimality of a control $u$ for the free terminal point problem with linear system $\dot{x} = A(t)x + B(t)u$ is that $u$ minimizes the Hamiltonian pointwise; moreover, if the performance index integrand $L(t, x, u)$ is strictly convex in $(x, u)$ for each fixed $t$, the optimal control $u(t)$ is unique [1].

Proof
Since the corresponding differential equation is a linear system, the state $x$ depends affinely on the control, and the set $K$ of admissible controls is convex; hence $J(u)$ is a convex function on $K$ and has a minimum at $u = u(t)$ [1].

If $L(t, x, u)$ is strictly convex, the above inequality is strict. Thus $J(u)$ is a strictly convex function on $K$ and the minimum is unique.
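The convexity argument can be checked numerically on a discretized example (our own choice of system and cost, not from the paper): for a linear system with strictly convex running cost, the discretized performance index satisfies the strict midpoint inequality $J\big(\tfrac{u_1+u_2}{2}\big) < \tfrac{1}{2}\big(J(u_1) + J(u_2)\big)$ for distinct controls:

```python
import numpy as np

# Linear system x' = -x + u with strictly convex cost L = x^2 + u^2;
# the state is affine in u, so J is strictly convex in the control.
def J(u, x0=1.0, dt=1e-3):
    x, total = x0, 0.0
    for uk in u:
        total += (x**2 + uk**2) * dt
        x += (-x + uk) * dt          # Euler step of the linear dynamics
    return total

rng = np.random.default_rng(0)
u1 = rng.normal(size=1000)
u2 = rng.normal(size=1000)
mid = J((u1 + u2) / 2)
avg = (J(u1) + J(u2)) / 2
print(mid < avg)  # strict midpoint inequality: J is strictly convex
```

Strict convexity is what rules out two distinct minimizers: if both $u_1$ and $u_2$ attained the minimum, their midpoint would cost strictly less, a contradiction.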

Connections to Hamiltonian Systems
To fit Pontryagin's principle to the theory of Hamiltonian systems, the control system is lifted to the cotangent bundle $T^{*}X$ of the state space $X$. If $T^{*}(T^{*}X)$ is projected onto $T^{*}X$, then there may be some points in $V$ where the projection does not have maximal rank, and thus the solution of the differential equation will not be defined. If $T^{*}X$ is projected onto $X$, singularities and nonuniqueness of the optimal trajectories may occur [10].
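A simple illustration of nonuniqueness under projection (our own, not from the paper): take $X = \mathbb{R}$, so $T^{*}X \cong \mathbb{R}^2$ with coordinates $(x, \lambda)$, and the Hamiltonian $H = \tfrac{1}{2}(\lambda^{2} + x^{2})$:

```latex
\dot{x} = \frac{\partial H}{\partial \lambda} = \lambda,
\qquad
\dot{\lambda} = -\frac{\partial H}{\partial x} = -x.
% Trajectories are circles x^2 + \lambda^2 = \text{const} in T^*X.
% The projection (x, \lambda) \mapsto x maps the two arcs \lambda > 0
% and \lambda < 0 onto the same interval of X, so distinct lifted
% trajectories project to the same segment and uniqueness is lost.
```

The flow is perfectly well behaved on $T^{*}X$; it is only the projection onto $X$ that collapses distinct trajectories together, which is the phenomenon referred to above.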

Conclusion
In this paper, an optimal control problem for Hamiltonian control systems with external variables has been formulated and analysed. Necessary and sufficient conditions which lead to Pontryagin's principle were stated. It was shown how Pontryagin's principle fits into the theory of Hamiltonian systems. Pontryagin's maximum principle was considered in detail because it is capable of dealing with both unbounded continuous controls and bounded controls which are possibly discontinuous.