Identifiability of Quantized Linear Systems



Introduction
On-line parameter identification is a key problem of adaptive control and also an important part of the self-tuning regulator (STR) (Astrom & Wittenmark, 1994). In widely deployed large-scale systems, distributed systems and remote systems, the plants, controllers, actuators and sensors are connected by communication channels which possess only finite communication capability due to, e.g., data loss, bandwidth constraints, and access constraints. From a heuristic analysis perspective, the existence of communication constraints complicates otherwise well-understood control problems, including traditional methods such as H∞ control (Fu & Xie, 2005), and even basic theoretic notions such as stability (De Persis & Mazenc, 2010).
Due to the constraints of the communication channels, it is difficult to transmit data with infinite precision. Quantization is an effective way of reducing the use of transmission resources and thereby meeting the bandwidth constraint of the communication channels. However, quantization is a lossy compression method, and hence the performance of parameter identification, and even the validity or effectiveness of identification, may be changed by quantization, along with which the performance of adaptive control may deteriorate. This problem has attracted plenty of work. System identification with quantized observations was investigated in (Wang et al., 2003, 2008, 2010), covering optimal identification errors, time complexity, optimal input design, and the impact of disturbances and unmodeled dynamics on identification accuracy and complexity.
In the light of the fundamental effect of quantization on system identification, it is necessary to pay attention to the parameter identifiability of quantized systems. The concept of identifiability has been defined by a maximal information criterion in (Durgaryan & Pashchenko, 2001): the system is parameter identifiable by the maximal information criterion if the mutual information between the actual output and the model output is greater than zero. However, the concept of identifiability in (Durgaryan & Pashchenko, 2001) is defined only in principle, and no practical result follows from it. References (Zhang, 2003; Zhang & Sun, 1996; Baram & Kailath, 1988) have discussed the problem of state estimability, which is closely related to parameter identifiability, since the input-output description of linear systems with Gauss-Markov parameters can be transformed into a state-space model, so that the problem of parameter identifiability can be treated as one of state estimability. Reference (Zhang, 2003) has proposed a definition of parameter identifiability under the criterion of minimum mean error entropy (MMEE) estimation, referring to the definition of state estimability, and has also obtained some useful conclusions. Reference (Wang & Zhang, 2011) has studied the parameter identifiability of linear systems under access constraints. However, there is little work on this topic for quantized systems.
This paper mainly analyzes the parameter identifiability of quantized linear systems with Gauss-Markov parameters from an information theoretic point of view. The definition of parameter identifiability proposed in (Zhang, 2003) is reviewed: the linear system with Gauss-Markov parameters is identifiable if and only if the mutual information between the actual values and the estimates of the parameters is greater than zero. This definition is extended to quantized systems by considering the intrinsic property of the system. Then the parameter identifiability of linear systems with quantized outputs is analyzed, and a criterion of parameter identifiability is obtained based on the measure of mutual information. Furthermore, the convergence property of the quantized parameter identifiability Gramian is analyzed.
The rest of the paper is organized as follows. Section 2 introduces the model of interest; Section 3 discusses the existing definition of parameter identifiability, proposes our new definition, and gives analytic conclusions for quantized linear systems with Gauss-Markov parameters; the convergence property of the Gramian matrix of parameter identifiability for quantized systems is discussed in Section 4; Sections 5 and 6 give an illustrative simulation and the conclusion, respectively.

System description
Consider the following SISO linear system expressed by an Auto-Regressive Moving Average (ARMA) model (Astrom & Wittenmark, 1994):

y(k) = Fᵀ(k)θ(k) + e(k),   (1)

where F(k) is the regressor of past inputs and outputs, θ(k) is the parameter vector, and e(k) is zero-mean white Gaussian measurement noise with covariance R. The parameters are modeled as a Gauss-Markov process, where A, B are known matrices with appropriate dimensions; the noise sequence {w(k)} is Gaussian and white with zero mean and covariance Q; the initial value θ(0) is Gaussian with mean θ̄ and covariance Π(0). Suppose that e(k), w(k) and θ(0) are mutually uncorrelated. Hence, the linear system (1) with Gauss-Markov parameters can be described by (4) and (5), i.e.,

θ(k + 1) = Aθ(k) + Bw(k),
y(k) = Fᵀ(k)θ(k) + e(k).   (6)

Due to the bandwidth constraint of the channel, quantization is required. The discussion in the present paper does not focus on a special quantizer, but on general N-level quantization (Curry, 1970; Gray & Neuhoff, 1998), which can be described as

q(y(k)) = uᵢ, if y(k) ∈ (aᵢ, aᵢ₊₁], i = 1, 2, ..., N,   (7)

with thresholds a₁ = −∞ < a₂ < ... < a_{N+1} = +∞ and output levels

uᵢ = E{y(k) | y(k) ∈ (aᵢ, aᵢ₊₁]},   (8)

where E{·} is the operation of expectation.
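A minimal sketch of system (6)-(8) in Python may help fix notation: a scalar Gauss-Markov parameter, a noisy linear measurement, and a general N-level quantizer. The numerical values (a = 0.5, Q = 1, R = 0.1, the 1-bit Lloyd-Max levels ±0.7979, the regressor 2 sin(k) + 3) are borrowed from the simulation section; the code is an illustration, not the paper's implementation.

```python
import math
import random

def quantize(y, thresholds, outputs):
    """General N-level quantizer (7): q(y) = u_i if y in (a_i, a_{i+1}]."""
    for i in range(len(outputs)):
        if thresholds[i] < y <= thresholds[i + 1]:
            return outputs[i]
    return outputs[-1]

random.seed(0)
a, Q, R = 0.5, 1.0, 0.1                    # dynamics and noise covariances
theta = 1.0                                # theta(0), mean of the prior
thresholds = [-math.inf, 0.0, math.inf]    # 1-bit (N = 2) quantizer
outputs = [-0.7979, 0.7979]                # Lloyd-Max levels for N(0, 1)

qs = []
for k in range(5):
    F = 2.0 * math.sin(k) + 3.0                          # known regressor
    y = F * theta + random.gauss(0.0, math.sqrt(R))      # measurement (6)
    qs.append(quantize(y, thresholds, outputs))          # transmitted symbol (7)
    theta = a * theta + random.gauss(0.0, math.sqrt(Q))  # Gauss-Markov step

print(qs)  # five symbols, each one of the two quantizer output levels
```

Only the quantized symbols cross the channel; the receiver must identify θ(k) from them, which is exactly the setting analyzed below.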

Definition of parameter identifiability
Reference (Zhang, 2003) proposed the definition of parameter identifiability for system (6) referring to the definition of state estimability under MMEE.

Let θ̂(k|k−1) and θ̂(k|k) be the MMEE estimates of θ(k) based on the measurements up to time k−1 and k, respectively, and let θ̃(k|k−1) = θ(k) − θ̂(k|k−1) and θ̃(k|k) = θ(k) − θ̂(k|k) be the corresponding estimation errors. Define the prior and posterior mean-square estimation error matrices respectively as

P(k|k−1) = E{θ̃(k|k−1) θ̃ᵀ(k|k−1)},  P(k|k) = E{θ̃(k|k) θ̃ᵀ(k|k)}.

Definition 1 (Zhang, 2003): The linear system (1) with Gauss-Markov parameters (i.e., system (6)) is identifiable, if and only if

I(θ(k); θ̂(k)) > 0,   (9)

where I(·;·) denotes mutual information.
Based on Definition 1, the following conclusion was obtained in (Zhang, 2003).
Lemma 1 (Zhang, 2003): The linear system (1) with Gauss-Markov parameters (i.e., system (6)) is identifiable, if and only if the identifiability Gramian

W_k^id = Σ_{j=0}^{k} A^{k−j} Π(j) F(j) Fᵀ(j) Π(j) (Aᵀ)^{k−j}   (10)

has full rank.

In the present paper, we propose an alternative definition of parameter identifiability for the quantized system (6)-(8) from an information theoretic point of view, as follows.
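Lemma 1's full-rank test is easy to evaluate numerically. The sketch below assumes a 2-dimensional parameter with an illustrative A, Q and regressor F(j), and propagates the prior covariance by the unconditional recursion Π(j+1) = AΠ(j)Aᵀ + Q (an assumption of the sketch, consistent with the Gauss-Markov model (6)).

```python
import numpy as np

# Identifiability Gramian of Lemma 1:
#   W_k^id = sum_{j=0}^{k} A^{k-j} Pi(j) F(j) F^T(j) Pi(j) (A^T)^{k-j}.
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])          # illustrative parameter dynamics
Q = np.eye(2)
Pi = np.eye(2)                      # Pi(0)
k = 10
W = np.zeros((2, 2))
for j in range(k + 1):
    F = np.array([[2.0 * np.sin(j) + 3.0],
                  [1.0]])           # illustrative regressor, varies with j
    Akj = np.linalg.matrix_power(A, k - j)
    W += Akj @ Pi @ F @ F.T @ Pi @ Akj.T
    Pi = A @ Pi @ A.T + Q           # unconditional covariance propagation

print(np.linalg.matrix_rank(W))     # 2: full rank, identifiable by Lemma 1
```

Because the regressor direction changes with j, the rank-one summands span the parameter space and the Gramian is positive definite.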
Definition 2: The quantized linear system with Gauss-Markov parameters (6)-(8) is identifiable, if and only if

I(θ(k); qᵏ) > 0,   (11)

where qᵏ = {q(y(0)), q(y(1)), ..., q(y(k))} denotes the quantized measurement sequence. Two remarks are in order.
1. From information theory (Cover & Thomas, 2006), condition (11) is equivalent to H(θ̃(k|k−1)) > H(θ̃(k|k)), where H(·) denotes entropy, i.e., the prior error entropy is strictly greater than the posterior error entropy;
2. Definition 1 considers the mutual information between the actual value and the estimate of the parameters, so it relies on the estimation principle, while Definition 2 takes into account the intrinsic property of the system, independent of the estimator used. If Definition 2 is adopted to analyze the unquantized system (6), the identifiability condition (11) turns into condition (13), stated for the unquantized measurements, and condition (9) is equivalent to (13) for the linear Gaussian system (6) (Zhang, 2003). Hence, in some sense, Definition 2 is more general than Definition 1.

Identifiability analysis of quantized linear systems
Mutual information I(·;·) (Cover & Thomas, 2006) is a measure of the amount of information commonly contained in, and the statistical dependence between, two random variables. I(·;·) ≥ 0, with equality if and only if the two random variables are independent. Therefore, condition (11) of the information theoretic Definition 2 indicates that the system is parameter identifiable if and only if no direction of the parameter space is orthogonal to all the past (quantized) measurements (Baram & Kailath, 1988), i.e., condition (14).
From the Gauss-Markov property of the parameters, we have (15)-(17), where θ'(j) = θ(j) − θ̂(j), and the time index j of the other relevant time-variant variables (e.g., F(j), Π(j)) is omitted for notational simplicity. Then, based on the Bayes rule, we obtain (18)-(19), where the last equality follows from the fact that θ'(j) and e(j) are stochastically independent; here T(·) is the cumulative distribution function of the standard normal distribution (note that T(·) is different from the tail function defined in (Ribeiro et al., 2006; You et al., 2011)), and G(θ'; 0, Π) denotes the probability density function of the Gaussian stochastic vector θ' with zero mean and covariance Π. Following a similar line of argument as in (Ribeiro et al., 2006; You et al., 2011), we evaluate the integral (20) and obtain (21), where φ(·) is the probability density function of the standard normal distribution.
Substituting (21) into (18), we have (22). Combining (22) with (17) and (14), we obtain condition (23), which is equivalent to requiring that the quantized identifiability Gramian defined below has full rank. We summarize the above analysis as follows.

Theorem 1: The quantized linear system with Gauss-Markov parameters (6)-(8) is parameter identifiable, if and only if the quantized identifiability Gramian

W_k^idq = Σ_{j=0}^{k} ψ²(j) A^{k−j} Π(j) F(j) Fᵀ(j) Π(j) (Aᵀ)^{k−j}   (25)

has full rank,   (26)

where the scalar weight ψ(j) is determined by the quantizer thresholds through (21).

Remarks:
1. The term A^{k−j} Π(j) F(j) Fᵀ(j) Π(j) (Aᵀ)^{k−j}, j = 0, 1, 2, ..., k, which is the same as in the unquantized system, reflects the intrinsic properties of the system. Hence, the full-rank requirement on W_k^idq shows that the parameter identifiability of the quantized system is determined jointly by the quantizer and the intrinsic properties of the system;
2. When the quantization level N = 1, i.e., a₁ = −∞, a₂ = +∞, we have ψ(j) ≡ 0 and hence W_k^idq ≡ 0, so condition (26) is not satisfied and the system is not identifiable, which is consistent with intuition. From (10) and (25), it can be observed that the difference between the unquantized identifiability Gramian W_k^id and the quantized identifiability Gramian W_k^idq is that the latter includes the additional weights ψ²(j), j = 0, 1, 2, ..., k. As a result, besides the case of quantization level N = 1, the matrix W_k^idq may become singular owing to the behavior of ψ²(j), even though W_k^id has full rank;
3. The quantizer in Theorem 1 is time-invariant. However, by the above analysis method, a conclusion similar to Theorem 1 can be derived for a time-variant quantizer, except that the weights in the identifiability Gramian reflect the time-variant property of the quantizer, where c_l and d_l are constants which make (29) the same for every j; thus (30) holds. By comparing (30) with (10), we find that such a time-variant quantizer does not change the parameter identifiability of the original system if and only if (29) is satisfied;
5. The parameter identifiability of the system can be preserved even if the quantization level is low, as long as the quantizer is designed reasonably. In particular, when the quantization level N = 2, a coarse quantizer of only 1 bit can preserve the parameter identifiability of the original system.
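The exact expression for ψ²(j) has not survived extraction here, but a form consistent with the remarks above (zero for N = 1, strictly positive for N ≥ 2, built from the normal pdf φ and cdf) is the information reduction factor used in (Ribeiro et al., 2006; You et al., 2011): ψ² = Σ_l (φ(āₗ) − φ(āₗ₊₁))² / (Φ(āₗ₊₁) − Φ(āₗ)) over the normalized thresholds āₗ. The sketch below uses this formula as an assumption; it reproduces Remark 2 (N = 1 gives 0) and Remark 5 (a 1-bit sign quantizer gives the well-known factor 2/π > 0).

```python
import math

def phi(x):
    """Standard normal pdf, with phi(+-inf) = 0."""
    return 0.0 if math.isinf(x) else math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    if math.isinf(x):
        return 0.0 if x < 0 else 1.0
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def psi_squared(thresholds):
    """Assumed weight: information reduction factor over the quantizer cells."""
    s = 0.0
    for a_lo, a_hi in zip(thresholds[:-1], thresholds[1:]):
        p = Phi(a_hi) - Phi(a_lo)          # probability of the cell
        if p > 0.0:
            s += (phi(a_lo) - phi(a_hi)) ** 2 / p
    return s

inf = math.inf
print(psi_squared([-inf, inf]))                        # N = 1: 0.0
print(psi_squared([-inf, 0.0, inf]))                   # N = 2: 2/pi ~ 0.6366
print(psi_squared([-inf, -0.9816, 0.0, 0.9816, inf]))  # N = 4: ~0.88
```

Scaling the unquantized Gramian terms by these values gives the quantized Gramian of Theorem 1 under this assumed weight: identically zero (singular) for N = 1, and full rank whenever ψ² > 0 and W_k^id has full rank.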

Convergence analysis
In this section, we discuss the convergence property of the Gramian in Theorem 1, i.e., the convergence property of the weights ψ(j), j = 0, 1, 2, ....

By the property of φ(·), ψ(j) can be re-expressed as in (31)-(32). Let Δ = sup_{1 ≤ l ≤ N} Δ_l, where Δ_l denotes the length of the l-th quantization interval. Consider the convergence property of the Gramian W_k^idq as Δ → 0. Note that N → ∞ as Δ → 0, and then ψ²(j) → 1 for every j, so that W_k^idq converges to the unquantized Gramian W_k^id; this is the content of (33)-(34).
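Under the same assumed weight (the information reduction factor of Ribeiro et al., 2006), the limit claimed in (33)-(34) can be checked numerically: refining a uniform threshold grid (Δ → 0, N → ∞) drives ψ² up toward 1, i.e., the quantized Gramian of Theorem 1 converges to the unquantized Gramian of Lemma 1.

```python
import math

def phi(x):
    """Standard normal pdf, with phi(+-inf) = 0."""
    return 0.0 if math.isinf(x) else math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    if math.isinf(x):
        return 0.0 if x < 0 else 1.0
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def psi_squared(thresholds):
    """Assumed weight: information reduction factor over the quantizer cells."""
    s = 0.0
    for a_lo, a_hi in zip(thresholds[:-1], thresholds[1:]):
        p = Phi(a_hi) - Phi(a_lo)
        if p > 0.0:
            s += (phi(a_lo) - phi(a_hi)) ** 2 / p
    return s

# Uniform thresholds with step delta over roughly [-8, 8], plus infinite tails.
for delta in (1.0, 0.5, 0.1, 0.02):
    n = int(8.0 / delta)
    ths = [-math.inf] + [i * delta for i in range(-n, n + 1)] + [math.inf]
    print(delta, psi_squared(ths))  # approaches 1 as delta -> 0
```

The printed values increase toward 1 as the grid is refined, while always staying below 1, matching the role of ψ² as a lossy-compression factor.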

Simulation
In order to illustrate our main conclusion, the following system is simulated in Matlab:

y(k) = b(k) u(k−1) + e(k),

where b(k) is the parameter to be identified and can be modeled as a Gauss-Markov process. The system model can then be transformed to

θ(k + 1) = aθ(k) + w(k),  y(k) = F(k)θ(k) + e(k),

where θ(k) = b(k) and a = 0.5. The noises e(k), w(k) and the initial value θ(0) are mutually statistically uncorrelated; their covariances are Q = 1, R = 0.1 and Π(0) = 1, respectively, and the mean of θ(0) is θ̄ = 1. Here we set F(k) = u(k−1) = 2 sin(k) + 3 as the assumed system input (i.e., the control signal, which can be considered to be generated, for example, by the adaptive controller), where the additive term 3 plays the role of avoiding the "turn-off" problem (Astrom & Wittenmark, 1994).
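A crude numerical illustration of identifiability for this example: propagate an approximate error variance using the assumed 1-bit weight ψ² = 2/π (in the spirit of the sign-of-innovation filter of Ribeiro et al., 2006; this is a stand-in for, not the same as, the Gaussian fit algorithm used in the actual simulation) and check that every quantized measurement strictly shrinks the variance.

```python
import math

a, Q, R = 0.5, 1.0, 0.1          # values from the simulated example
psi2 = 2.0 / math.pi             # assumed 1-bit information reduction factor
P = 1.0                          # Pi(0)
priors, posts = [], []
for k in range(50):
    F = 2.0 * math.sin(k) + 3.0  # F(k) = u(k-1), always >= 1 here
    priors.append(P)
    P = P - psi2 * (P * F) ** 2 / (F * P * F + R)  # quantized measurement update
    posts.append(P)
    P = a * a * P + Q                              # time update

print(all(q < p for q, p in zip(posts, priors)))   # True: variance shrinks
```

The strict decrease of the (approximate) error variance at every step mirrors the strict prior-to-posterior entropy drop that Definition 2 requires.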
To carry out the illustrative simulation, an optimal filter is required, although the analysis of parameter identifiability is independent of the estimator used. The discussed linear system with a Gauss-Markov parameter is transformed to a state-space model, and then the problem of parameter identification can be treated as state estimation. A number of quantized state estimators have been proposed in various areas; we choose the Gaussian fit algorithm (Curry, 1970) as the filter in this section, since this filter, which is based on a Gaussian assumption, is near optimal and convenient to implement. Note that, in this simulated model, F(k) is defined completely by u(k−1), so it is known at the channel receiver; in the general model (1), however, the regressor contains past outputs y(k−i), i = 1, 2, ..., which are available at the receiver only in quantized form. The analysis of parameter identifiability of quantized systems is suitable for any rational quantizer. Here the Max-Lloyd quantizer (Proakis, 2001), generally adopted in the areas of communication and signal processing, is employed. In the following, the cases of quantization level N = 4 and N = 2 in (7) are simulated, respectively. The thresholds of the 4-level Max-Lloyd quantizer are {−∞, −0.9816, 0, 0.9816, +∞} and the outputs of the quantizer are {−1.51, −0.4528, 0.4528, 1.51} when the signal to be quantized has a standard normal distribution. In the case of the 2-level quantizer, the thresholds are {−∞, 0, +∞} and the outputs of the quantizer are {−0.7979, 0.7979}. The thresholds of the corresponding time-variant quantizers with 4 and 2 levels are obtained by scaling these values accordingly. Hence, the parameter identifiability is not changed by the quantization according to Remark 2, i.e., the quantized system is still parameter identifiable, theoretically. The simulation results of the quantized system are shown in Fig. 1 (N = 4) and Fig. 2 (N = 2). For quantized systems, the probability distribution of the estimation error θ̃(k) is unknown, but it is assumed to maximize the entropy of θ̃(k) according to the maximal entropy principle of Jaynes (Jaynes, 1957); namely, the uncertainty of θ̃(k) is taken to be maximal in the absence of prior information. Hence θ̃(k) is assumed to be Gaussian, and (35) can be adopted to calculate the entropy of θ̃(k) in this simulation. We can observe from Fig. 1(c) and Fig. 2(c) that the posterior error entropy is strictly smaller than the prior error entropy from the initial time instant. This indicates that the quantized system is parameter identifiable, and these observations agree with our analysis above. Besides, we can observe that the estimation error for quantization level N = 2 is greater than that for N = 4, although the system is parameter identifiable in both quantization cases. This shows that systems with different quantizers lead to different estimation precision, though all of them are parameter identifiable when rational quantizers are used.
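The quoted Max-Lloyd values can be sanity-checked with the standard optimality conditions for a unit-variance Gaussian source: each output level is the conditional mean of its cell (for 2 levels, ±√(2/π)), and each finite threshold lies midway between the two neighboring output levels.

```python
import math

# 2-level Lloyd-Max output for N(0, 1): the conditional mean E[y | y > 0].
level = math.sqrt(2.0 / math.pi)
print(round(level, 4))           # 0.7979, matching the quoted outputs

# 4-level check: the quoted threshold should bisect the neighboring outputs.
mid = (0.4528 + 1.51) / 2.0
print(round(mid, 4))             # 0.9814, consistent with the quoted 0.9816
```

The small discrepancy in the second check comes only from the rounding of the quoted output levels.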

Conclusion
This paper discusses the parameter identifiability of quantized linear systems with Gauss-Markov parameters from an information theoretic point of view. The existing definition concerning this property is reviewed, and a new definition is proposed for quantized systems. The criterion function, the Gramian of parameter identifiability for quantized systems, is analyzed based on the quantity of mutual information. The derived conclusions are consistent with intuition and also provide an intrinsic perspective for quantizer design. The analysis shows that the Gramian of quantized systems converges to that of unquantized systems when the quantization intervals tend to zero, and that a well designed quantizer can preserve the identifiability of the original system even if the quantizer is as coarse as one bit. The theoretical analysis is verified by an illustrative simulation.

Remark 3:
Equation (34) implies the convergence of Theorem 1 to Lemma 1 as Δ → 0. Fig. 1 (N = 4) and Fig. 2 (N = 2) illustrate the above conclusion. In Fig. 1(a) and Fig. 2(a), actual values of the parameter are denoted by solid lines and the estimates by dotted lines. Estimation errors are shown in Fig. 1(b) and Fig. 2(b). Fig. 1 and Fig. 2 show that the estimate can track the real value of the parameter even when the outputs are coarsely quantized. The curves of prior error entropy and posterior error entropy are shown in Fig. 1(c) and Fig. 2(c). The entropy is calculated by (35), the entropy of a Gaussian vector with covariance C, H = (1/2) log((2πe)ⁿ |C|), where |·| denotes the determinant.
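Formula (35) reduces in the scalar case (n = 1, |C| equal to the variance) to the sketch below; the prior and posterior variances used are illustrative assumptions, standing in for the simulated entropy curves.

```python
import math

def gauss_entropy(var):
    """Entropy of a scalar Gaussian with variance var (formula (35), n = 1)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

prior_var, post_var = 1.0, 0.4   # illustrative prior/posterior variances
print(gauss_entropy(prior_var) > gauss_entropy(post_var))  # True
```

Because the logarithm is monotone, the entropy comparison in Fig. 1(c) and Fig. 2(c) is equivalent to comparing the prior and posterior error variances directly.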