Open access peer-reviewed chapter

# Matrices Which are Discrete Versions of Linear Operations

By Armando Martínez Pérez and Gabino Torres Vega

Submitted: November 22nd, 2017 · Reviewed: January 24th, 2018 · Published: August 29th, 2018

DOI: 10.5772/intechopen.74356

## Abstract

We introduce and study a matrix which has the exponential function as one of its eigenvectors. This matrix represents a finite-differences derivation of vectors defined on a partition, and it leads to new expressions for finite-differences derivatives which are exact for the exponential function. We find some properties of this matrix, of the induced derivatives, and of its inverse. We provide expressions for the derivative of a product, of a ratio, and of the inverse of vectors, and we also find the equivalent of the summation-by-parts theorem of continuous functions. This matrix could be of interest to discrete quantum mechanics theory.

### Keywords

• exact finite differences derivative
• exact derivatives on partitions
• exponential function on a partition
• discrete quantum mechanics

## 1. Introduction

We are interested in matrices which are a local, as well as a global, exact discrete representation of operations on functions of a continuous variable, so that there is congruency between the continuous and the discrete operations and properties of functions. Usual finite-differences methods [1, 2, 3, 4] become exact only in the limit of zero separation between the points of the mesh. Here, we are interested in having exact representations of operations and functions for finite separation between mesh points.

The difference between our method and the usual finite-differences method is the quantity that appears in the denominator of the definition of the derivative. The appropriate choice of that denominator makes it possible for the finite-differences expression for the derivative to give exact results for the exponential function. We concentrate on the derivative operation, and we define a matrix which represents exact finite-differences derivation on a local and a global scale. The inverse of this matrix is just the integration operation. These are interesting subjects in themselves, but they are also of interest in the quantum physics realm [5, 6, 7].

In this chapter, we will consider only the case of the derivative and the integration of the exponential function.

## 2. A matrix with the exponential function as an eigenvector

Here, we consider the $N\times N$ antisymmetric, tridiagonal matrix

$$D_N\equiv\frac{1}{2\chi(v,\Delta)}\begin{pmatrix}-e^{-v\Delta}&1&0&\cdots&0&0\\-1&0&1&\cdots&0&0\\0&-1&0&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&0&1\\0&0&0&\cdots&-1&e^{v\Delta}\end{pmatrix},\tag{1}$$

where $v$, which can be pure real or pure imaginary, is a constant, $\Delta\in\mathbb{R}^+$, and $\chi(v,\Delta)\equiv\sinh(v\Delta)/v=\Delta+v^2\Delta^3/6+O(\Delta^5)$. The function $\chi(v,\Delta)$ is well defined for $v=0$, with value $\chi(0,\Delta)=\Delta$. This matrix is interesting because, as we will see below, it represents a derivation on a partition. A rescaled matrix $\bar D_N$ is defined as

$$\bar D_N\equiv\begin{pmatrix}-1/z&1&0&\cdots&0&0\\-1&0&1&\cdots&0&0\\0&-1&0&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&0&1\\0&0&0&\cdots&-1&z\end{pmatrix},\tag{2}$$

where $z=e^{v\Delta}$, and

$$D_N=\frac{\bar D_N}{2\chi(v,\Delta)}.\tag{3}$$

We are mainly interested in finding the eigenvalues and the corresponding eigenvectors of these matrices.
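As a quick numerical sketch (not part of the original chapter; Python with NumPy is assumed), one can build $\bar D_N$ and confirm that the geometric vector $(1,z,\dots,z^{N-1})^T$ is an eigenvector with eigenvalue $z-1/z$, and that in the original scaling the eigenvalue is exactly $v$:

```python
import numpy as np

N, v, Delta = 7, 0.3, 0.1
z = np.exp(v * Delta)

# Rescaled derivative matrix of Eq. (2): -1 on the subdiagonal,
# +1 on the superdiagonal, corner entries -1/z and z.
Dbar = np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
Dbar[0, 0] = -1.0 / z
Dbar[-1, -1] = z

# The geometric vector (1, z, z^2, ..., z^{N-1}) is an eigenvector
# with eigenvalue z - 1/z = 2 sinh(v*Delta).
e1 = z ** np.arange(N)
assert np.allclose(Dbar @ e1, (z - 1.0 / z) * e1)

# In the original scaling, Eq. (3), the same vector has eigenvalue v exactly.
chi = np.sinh(v * Delta) / v
DN = Dbar / (2.0 * chi)
assert np.allclose(DN @ e1, v * e1)
print("eigenvalue checks passed")
```

The second assertion is the key property: no discretization error appears for the exponential vector, for any finite $\Delta$.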

We start our study with a result about the determinant of $\bar D_N-\lambda I_N$,

$$\left|\bar D_N-\lambda I_N\right|=\left|\bar D_N+\alpha I_N\right|=\begin{vmatrix}\alpha-1/z&1&0&\cdots&0&0\\-1&\alpha&1&\cdots&0&0\\0&-1&\alpha&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&\alpha&1\\0&0&0&\cdots&-1&\alpha+z\end{vmatrix}=\left(\alpha-\frac1z\right)A_{N-1}(\alpha)+A_{N-2}(\alpha),\tag{4}$$

where $\lambda=-\alpha$,

$$A_j(\alpha)\equiv\begin{vmatrix}\alpha&1&0&\cdots&0&0\\-1&\alpha&1&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&\alpha&1\\0&0&0&\cdots&-1&\alpha+z\end{vmatrix}=(\alpha+z)B_{j-1}(\alpha)+B_{j-2}(\alpha),\tag{5}$$

and

$$B_j(\alpha)\equiv\begin{vmatrix}\alpha&1&0&\cdots&0&0\\-1&\alpha&1&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&\alpha&1\\0&0&0&\cdots&-1&\alpha\end{vmatrix}.\tag{6}$$

Strikingly, we recognize the determinant $B_j(\alpha)$ as the Fibonacci polynomial of index $j+1$ [10, 11], i.e., $B_j(\alpha)=F_{j+1}(\alpha)$. Fibonacci polynomials are defined as

$$F_0(x)=0,\qquad F_1(x)=1,\qquad F_j(x)=xF_{j-1}(x)+F_{j-2}(x),\quad j\ge2.\tag{7}$$

Since we have that $B_j(\alpha)=F_{j+1}(\alpha)$, together with the recursion relationship for Fibonacci polynomials, we also have that

$$A_j(\alpha)=(\alpha+z)F_j(\alpha)+F_{j-1}(\alpha)=zF_j(\alpha)+F_{j+1}(\alpha),\tag{8}$$

and then

$$\left|\bar D_N+\alpha I_N\right|=\left(\alpha-\frac1z\right)\left[zF_{N-1}(\alpha)+F_N(\alpha)\right]+zF_{N-2}(\alpha)+F_{N-1}(\alpha)=z\left[\alpha F_{N-1}(\alpha)+F_{N-2}(\alpha)\right]+\left(\alpha-\frac1z\right)F_N(\alpha)=\left(\alpha+z-\frac1z\right)F_N(\alpha).\tag{9}$$

Then, the eigenvalues of the derivative matrix $\bar D_N$ are $\lambda_1=z-1/z=e^{v\Delta}-e^{-v\Delta}=2\sinh(v\Delta)$ and $\lambda_m=-\alpha_m$, where $\alpha_m$ is the $m$-th root of the $N$-th Fibonacci polynomial, which is a polynomial of degree $N-1$ [10, 11].
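The determinant identity of Eq. (9) can be verified numerically; the following sketch (NumPy assumed, not part of the original chapter) compares the determinant of $\bar D_N+\alpha I_N$ with $(\alpha+z-1/z)F_N(\alpha)$ for a few values of $\alpha$:

```python
import numpy as np

N, v, Delta = 6, 0.4, 0.2
z = np.exp(v * Delta)

Dbar = np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
Dbar[0, 0] = -1.0 / z
Dbar[-1, -1] = z

def fib_poly(n, x):
    """Fibonacci polynomial F_n(x) via the recursion of Eq. (7)."""
    f_prev, f = 0.0, 1.0          # F_0 and F_1
    for _ in range(n - 1):
        f_prev, f = f, x * f + f_prev
    return f

# Eq. (9): det(Dbar + alpha*I) = (alpha + z - 1/z) * F_N(alpha).
for alpha in (0.7, -1.3, 2.5):
    lhs = np.linalg.det(Dbar + alpha * np.eye(N))
    rhs = (alpha + z - 1.0 / z) * fib_poly(N, alpha)
    assert np.isclose(lhs, rhs)
print("determinant identity verified")
```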

The system of simultaneous equations for the eigenvector $e_m^T=(e_{m,1},e_{m,2},\dots,e_{m,N})$ corresponding to $\lambda_m$ can be put in a form similar to the recursion relationship for the Fibonacci polynomials, i.e.,

$$e_{m,2}=\lambda_me_{m,1}+\frac{e_{m,1}}{z},\tag{10}$$
$$e_{m,j+1}=\lambda_me_{m,j}+e_{m,j-1},\qquad 1<j<N,\tag{11}$$
$$ze_{m,N}=\lambda_me_{m,N}+e_{m,N-1}.\tag{12}$$

This set of recursion relationships can be written as the matrix equation

$$\begin{pmatrix}e_{m,j}\\e_{m,j+1}\end{pmatrix}=\begin{pmatrix}0&1\\1&\lambda_m\end{pmatrix}\begin{pmatrix}e_{m,j-1}\\e_{m,j}\end{pmatrix},\qquad j=1,\dots,N,\tag{13}$$

where $e_{m,0}\equiv e_{m,1}/z$ and $e_{m,N+1}\equiv ze_{m,N}$. Thus

$$\begin{pmatrix}e_{m,j}\\e_{m,j+1}\end{pmatrix}=\begin{pmatrix}0&1\\1&\lambda_m\end{pmatrix}^j\begin{pmatrix}e_{m,0}\\e_{m,1}\end{pmatrix},\qquad j=1,\dots,N,\tag{14}$$

but

$$\begin{pmatrix}0&1\\1&\lambda_m\end{pmatrix}^j=\begin{pmatrix}F_{j-1}(\lambda_m)&F_j(\lambda_m)\\F_j(\lambda_m)&F_{j+1}(\lambda_m)\end{pmatrix},\tag{15}$$

and then,

$$\begin{pmatrix}e_{m,j}\\e_{m,j+1}\end{pmatrix}=\begin{pmatrix}F_{j-1}(\lambda_m)&F_j(\lambda_m)\\F_j(\lambda_m)&F_{j+1}(\lambda_m)\end{pmatrix}\begin{pmatrix}e_{m,0}\\e_{m,1}\end{pmatrix},\qquad j=1,\dots,N,\tag{16}$$

i.e., the j-th component of the m-th eigenvector is

$$e_{m,j}=\left[F_j(\lambda_m)+\frac{F_{j-1}(\lambda_m)}{z}\right]e_{m,1},\qquad j=1,2,\dots,N.\tag{17}$$

For the case of the eigenvalue $\lambda_1=z-1/z$, we can rewrite Eq. (17) by noticing that if we let $x=w-w^{-1}$ ($w\ne0$), then $F_n(x)+F_{n-1}(x)/w=w^{n-1}$ for $n=1,2,\dots$. This can be proved by induction as follows. For $n=1$, the equality is immediately verified. Suppose next that it holds for $n\le k$, so that $F_{k-1}(x)=w\left[w^{k-1}-F_k(x)\right]$. Substituting this into the expression for $k+1$, and using the recursion relationship of the Fibonacci polynomials, we obtain

$$F_{k+1}(x)+\frac{F_k(x)}{w}=xF_k(x)+F_{k-1}(x)+\frac{F_k(x)}{w}=xF_k(x)+w^k-wF_k(x)+\frac{F_k(x)}{w}=w^k,\tag{18}$$

where the last equality follows from $x=w-1/w$.

Therefore, according to Eqs. (17) and (18), the eigenvector for the eigenvalue $\lambda_1=2\sinh(v\Delta)$ takes the form $e_1=c\,(1,z,\dots,z^{N-1})^T$, where $c$ is a normalization constant. We can take advantage of the normalization constant and write

$$e_1=c\left(e^{vq_1},e^{vq_2},\dots,e^{vq_N}\right)^T,\tag{19}$$

with eigenvalue $\lambda_1=v$ (in the original scaling, i.e., as eigenvalue of the matrix $D_N$), where $q_1$ is an arbitrary constant and $q_j=q_1+(j-1)\Delta$. This means that the exponential function is an eigenvector of the derivative matrix, which is a global representation of the derivative on the partition $\{q_1,q_2,\dots,q_N\}$. Recall that the exponential function is an eigenfunction of the derivative of functions of a continuous variable.

The remainder of the eigenvectors have eigenvalues equal to the negatives of the roots of the $N$-th Fibonacci polynomial, $\lambda_m=-x_m$, $m=1,2,\dots,N-1$, and have the form

$$e_m=c\begin{pmatrix}1\\F_2(\lambda_m)+e^{-v\Delta}\\F_3(\lambda_m)+e^{-v\Delta}F_2(\lambda_m)\\\vdots\\F_{N-1}(\lambda_m)+e^{-v\Delta}F_{N-2}(\lambda_m)\\e^{-v\Delta}F_{N-1}(\lambda_m)\end{pmatrix}.\tag{20}$$

The vector that we will be interested in is the one which is the exponential function (19), with eigenvalue $v$.

## 3. The matrix $D_N$ represents a derivation

Let us consider a partition $P_N\equiv\{q_i\}_{i=1}^N$, $q_i\in\mathbb{R}$, of $N$ equally spaced points $q_i$ of the interval $[a,b]$, $a<b$, with the same separation $\Delta=(b-a)/(N-1)$ between them.

The rows of the result of the multiplication of the derivative matrix $D_N$ and a vector $g\equiv(g_1,g_2,\dots,g_N)^T$ are

$$(D_Ng)_j=\frac{g_{j+1}-g_{j-1}}{2\chi(v,\Delta)},\qquad j=1,2,\dots,N,\tag{21}$$

where $g_0\equiv e^{-v\Delta}g_1$ and $g_{N+1}\equiv e^{v\Delta}g_N$. We recognize these expressions as second-order central-differences derivatives of the function $g(x)$ at the mesh points, but instead of dividing by twice the separation $\Delta$ between the mesh points, the denominator is the function $2\chi(v,\Delta)$. This function makes it possible for the exponential function to be an eigenvector of the matrix $D_N$.

The values $g_0=e^{-v\Delta}g_1$ and $g_{N+1}=e^{v\Delta}g_N$ extend the original interval $[a,b]$ to $[a-\Delta,b+\Delta]$, so that the derivatives are well defined at all points of the initial partition, including the edges of the interval. When $g(x)$ is the exponential function $e^{vx}$, we have $g_0=e^{v(q_1-\Delta)}$ and $g_{N+1}=e^{v(q_N+\Delta)}$, i.e., they are the values of the exponential function evaluated at the points of the extension.

Thus, we define finite-differences derivatives for any function $g(x)$ defined on the partition as

$$(Dg)_1=\frac{g_2-e^{-v\Delta}g_1}{2\chi(v,\Delta)},\tag{22}$$
$$(Dg)_j=\frac{g_{j+1}-g_{j-1}}{2\chi(v,\Delta)},\qquad 1<j<N,\tag{23}$$
$$(Dg)_N=\frac{e^{v\Delta}g_N-g_{N-1}}{2\chi(v,\Delta)},\tag{24}$$

to be used at the first, central, and last points of the partition, respectively.
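These three formulas are exact, to machine precision, for the exponential function at every point of the partition; a minimal sketch, assuming NumPy (not part of the original chapter):

```python
import numpy as np

v, Delta, N = 0.5, 0.05, 11
q = 1.0 + Delta * np.arange(N)        # partition q_j = q_1 + (j-1)*Delta
chi = np.sinh(v * Delta) / v          # the modified denominator chi(v, Delta)

def D(g):
    """Finite-differences derivative of Eqs. (22)-(24) on the partition."""
    dg = np.empty_like(g)
    dg[0] = (g[1] - np.exp(-v * Delta) * g[0]) / (2 * chi)      # Eq. (22)
    dg[1:-1] = (g[2:] - g[:-2]) / (2 * chi)                     # Eq. (23)
    dg[-1] = (np.exp(v * Delta) * g[-1] - g[-2]) / (2 * chi)    # Eq. (24)
    return dg

g = np.exp(v * q)
# Exact at every grid point, edges included, for finite Delta.
assert np.allclose(D(g), v * g, rtol=1e-12)
print("derivative is exact for exp(v q)")
```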

The determinant of the derivative matrix is not always zero; in fact, it is [see Eqs. (4) and (9)]

$$\left|\bar D_N\right|=2\sinh(v\Delta)\,F_N(0).\tag{25}$$

But, since $F_{2j+1}(0)=1$ and $F_{2j}(0)=0$, then

$$\left|\bar D_{2j}\right|=0,\qquad\left|\bar D_{2j+1}\right|=2\sinh(v\Delta).\tag{26}$$

Hence, only the matrices of odd dimension have an inverse.

Next, we will derive some properties of these finite differences derivatives.

### 3.1. The derivative of a product of vectors

There are two equivalent expressions for the finite-differences derivative of a product of vectors defined on the partition. A first set of such expressions is

$$\begin{aligned}(D(gh))_1&=\frac{g_2h_2-e^{-v\Delta}g_1h_1}{2\chi(v,\Delta)}=\frac{\left(g_2-e^{-v\Delta}g_1\right)h_2}{2\chi(v,\Delta)}+\frac{g_1\left(e^{-v\Delta}h_2-h_2+h_2-e^{-v\Delta}h_1\right)}{2\chi(v,\Delta)}\\&=h_2(Dg)_1+g_1(Dh)_1+g_1h_2\,\frac{e^{-v\Delta}-1}{2\chi(v,\Delta)}=h_2(Dg)_1+g_1(Dh)_1+g_1h_2\left(-\frac v2+\frac{v^2}4\Delta+O(\Delta^3)\right),\end{aligned}\tag{27}$$
$$(D(gh))_j=h_{j+1}(Dg)_j+g_{j-1}(Dh)_j,\qquad 1<j<N,\tag{28}$$
$$(D(gh))_N=h_N(Dg)_N+g_{N-1}(Dh)_N+\frac{1-e^{v\Delta}}{2\chi(v,\Delta)}\,g_{N-1}h_N=h_N(Dg)_N+g_{N-1}(Dh)_N+g_{N-1}h_N\left(-\frac v2-\frac{v^2}4\Delta+O(\Delta^3)\right).\tag{29}$$

A second set of equalities is

$$(D(gh))_1=g_2(Dh)_1+h_1(Dg)_1+g_2h_1\,\frac{e^{-v\Delta}-1}{2\chi(v,\Delta)}=g_2(Dh)_1+h_1(Dg)_1+g_2h_1\left(-\frac v2+\frac{v^2}4\Delta+O(\Delta^3)\right),\tag{30}$$
$$(D(gh))_j=g_{j+1}(Dh)_j+h_{j-1}(Dg)_j,\qquad 1<j<N,\tag{31}$$
$$(D(gh))_N=g_N(Dh)_N+h_{N-1}(Dg)_N+g_Nh_{N-1}\,\frac{1-e^{v\Delta}}{2\chi(v,\Delta)}=g_N(Dh)_N+h_{N-1}(Dg)_N+g_Nh_{N-1}\left(-\frac v2-\frac{v^2}4\Delta+O(\Delta^3)\right).\tag{32}$$

### 3.2. Summation by parts

The sum of Eqs. (28) or (31), with weights $2\chi(v,\Delta)$, results in

$$\sum_{j=n}^m2\chi(v,\Delta)\,h_{j+1}(Dg)_j+\sum_{j=n}^m2\chi(v,\Delta)\,g_{j-1}(Dh)_j=\sum_{j=n}^m2\chi(v,\Delta)\,(D(gh))_j=g_{m+1}h_{m+1}+g_mh_m-g_nh_n-g_{n-1}h_{n-1},\tag{33}$$

or

$$\sum_{j=n}^m2\chi(v,\Delta)\,g_{j+1}(Dh)_j+\sum_{j=n}^m2\chi(v,\Delta)\,h_{j-1}(Dg)_j=g_{m+1}h_{m+1}+g_mh_m-g_nh_n-g_{n-1}h_{n-1}.\tag{34}$$

This is the discrete version of the integration by parts theorem for continuous variable functions, a very useful result.
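The summation-by-parts identity, Eq. (33), can be tested with random vectors over an interior range of indices; a sketch assuming NumPy (0-based indices in the code versus the 1-based indices of the text):

```python
import numpy as np

rng = np.random.default_rng(0)
v, Delta, N = 0.3, 0.1, 12
chi = np.sinh(v * Delta) / v
g = rng.normal(size=N)
h = rng.normal(size=N)

def Dc(f, j):
    """Central finite-differences derivative, Eq. (23), at 0-based interior j."""
    return (f[j + 1] - f[j - 1]) / (2 * chi)

# Summation by parts, Eq. (33), over 1-based interior indices j = n..m.
n, m = 2, N - 1                      # 1-based bounds, kept away from the edges
js = range(n - 1, m)                 # corresponding 0-based indices
lhs = sum(2 * chi * (h[j + 1] * Dc(g, j) + g[j - 1] * Dc(h, j)) for j in js)
rhs = (g[m] * h[m] + g[m - 1] * h[m - 1]        # g_{m+1}h_{m+1} + g_m h_m
       - g[n - 1] * h[n - 1] - g[n - 2] * h[n - 2])  # - g_n h_n - g_{n-1}h_{n-1}
assert np.isclose(lhs, rhs)
print("summation by parts verified")
```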

### 3.3. Second derivatives

Expressions for higher-order derivatives are obtained through the powers of $D_N$. For instance, for the first two points, the second derivative is

$$(D^2g)_1=\frac{\left(e^{-2v\Delta}-1\right)g_1-e^{-v\Delta}g_2+g_3}{4\chi^2(v,\Delta)}=\frac{(Dg)_2-e^{-v\Delta}(Dg)_1}{2\chi(v,\Delta)},\tag{35}$$
$$(D^2g)_2=\frac{e^{-v\Delta}g_1-2g_2+g_4}{4\chi^2(v,\Delta)}=\frac{(Dg)_3-(Dg)_1}{2\chi(v,\Delta)}.\tag{36}$$

For inner points we get

$$(D^2g)_j=\frac{g_{j-2}-2g_j+g_{j+2}}{4\chi^2(v,\Delta)}=\frac{(Dg)_{j+1}-(Dg)_{j-1}}{2\chi(v,\Delta)},\qquad 3\le j\le N-2,\tag{37}$$

and for the last two points of the mesh, we find

$$(D^2g)_{N-1}=\frac{g_{N-3}-2g_{N-1}+e^{v\Delta}g_N}{4\chi^2(v,\Delta)}=\frac{(Dg)_N-(Dg)_{N-2}}{2\chi(v,\Delta)},\tag{38}$$
$$(D^2g)_N=\frac{g_{N-2}-e^{v\Delta}g_{N-1}+\left(e^{2v\Delta}-1\right)g_N}{4\chi^2(v,\Delta)}=\frac{e^{v\Delta}(Dg)_N-(Dg)_{N-1}}{2\chi(v,\Delta)}.\tag{39}$$

These derivatives also have the exponential function as one of their eigenvectors, and we can generate expressions for higher derivatives with higher powers of the derivative matrix.
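For instance, since $D_Ng=vg$ holds exactly for the exponential vector, the matrix power $D_N^2$ reproduces $v^2e^{vq}$ exactly; a short check (NumPy assumed, not part of the original chapter):

```python
import numpy as np

N, v, Delta = 9, 0.25, 0.1
chi = np.sinh(v * Delta) / v

# Derivative matrix D_N of Eq. (1).
DN = (np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)) / (2 * chi)
DN[0, 0] = -np.exp(-v * Delta) / (2 * chi)
DN[-1, -1] = np.exp(v * Delta) / (2 * chi)

q = Delta * np.arange(N)
g = np.exp(v * q)

# D_N^2 keeps exp(v q) as an eigenvector, now with eigenvalue v^2.
D2 = DN @ DN
assert np.allclose(D2 @ g, v**2 * g)
print("second derivative exact on the exponential")
```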

### 3.4. The derivative of the inverse of functions

It is possible to give an expression for the derivative of $1/h(q)$, including the edge points. For the first point, we have

$$\left(D\frac1h\right)_1=\frac{1}{2\chi(v,\Delta)}\left(\frac1{h_2}-\frac{e^{-v\Delta}}{h_1}\right)=\frac{1}{2\chi(v,\Delta)}\left(\frac{h_1-h_2}{h_1h_2}+\frac{1-e^{-v\Delta}}{h_1}\right)=-\frac{(Dh)_1}{h_1h_2}+\frac{1-e^{-v\Delta}}{2\chi(v,\Delta)}\left(\frac1{h_1}+\frac1{h_2}\right).\tag{40}$$

For central and last points, we find that

$$\left(D\frac1h\right)_j=-\frac{(Dh)_j}{h_{j-1}h_{j+1}},\qquad 1<j<N,\tag{41}$$
$$\left(D\frac1h\right)_N=-\frac{(Dh)_N}{h_{N-1}h_N}+\frac{e^{v\Delta}-1}{2\chi(v,\Delta)}\left(\frac1{h_{N-1}}+\frac1{h_N}\right).\tag{42}$$

The derivatives for the first and last points coincide with the derivative for central points in the limit $\Delta\to0$.

### 3.5. The derivative of the ratio of functions

Now, we take advantage of the derivative of the inverse of a function and of the derivative of a product of functions, and obtain the derivative of a ratio of functions,

$$\left(D\frac gh\right)_1=\frac1{h_2}(Dg)_1+g_1\left(D\frac1h\right)_1+\frac{g_1}{h_2}\,\frac{e^{-v\Delta}-1}{2\chi(v,\Delta)}=\frac1{h_2}(Dg)_1-\frac{g_1}{h_1h_2}(Dh)_1+\frac{g_1}{h_1}\,\frac{1-e^{-v\Delta}}{2\chi(v,\Delta)},\tag{43}$$
$$\left(D\frac gh\right)_j=\frac{(Dg)_j}{h_{j-1}}-g_{j+1}\,\frac{(Dh)_j}{h_{j+1}h_{j-1}},\qquad 1<j<N,\tag{44}$$
$$\left(D\frac gh\right)_N=\frac1{h_N}(Dg)_N-\frac{g_{N-1}}{h_{N-1}h_N}(Dh)_N+\frac{g_{N-1}}{h_{N-1}}\,\frac{e^{v\Delta}-1}{2\chi(v,\Delta)},\tag{45}$$

expressions which are very similar to the continuous-variable results. Again, these expressions reduce, in the limit $\Delta\to0$, to the corresponding expressions for continuous variables.
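A numerical check of the interior quotient rule, Eq. (44), with random vectors (NumPy assumed; the offset added to $h$ only keeps it away from zero):

```python
import numpy as np

rng = np.random.default_rng(1)
v, Delta, N = 0.2, 0.1, 10
chi = np.sinh(v * Delta) / v
g = rng.normal(size=N)
h = rng.normal(size=N) + 5.0     # keep h bounded away from zero

def Dc(f, j):
    """Central derivative, Eq. (23), at 0-based interior index j."""
    return (f[j + 1] - f[j - 1]) / (2 * chi)

# Quotient rule, Eq. (44), at every interior point.
for j in range(1, N - 1):
    lhs = Dc(g / h, j)
    rhs = Dc(g, j) / h[j - 1] - g[j + 1] * Dc(h, j) / (h[j + 1] * h[j - 1])
    assert np.isclose(lhs, rhs)
print("quotient rule verified")
```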

### 3.6. The local inverse operation of the derivative

The inverse operation to the finite-differences derivative, at a given point, is the summation with weights $2\chi(v,\Delta)$,

$$\sum_{j=n}^m2\chi(v,\Delta)\,(Dg)_j=\sum_{j=n}^m\left(g_{j+1}-g_{j-1}\right)=g_{m+1}+g_m-g_n-g_{n-1}.\tag{46}$$

This equality is the equivalent of the usual result for continuous functions, $\int_a^x dy\,\frac{dg(y)}{dy}=g(x)-g(a)$. Note that the inverse at the local level is a bit different from the expressions obtained by means of the inverse matrix $S_N$ (see below) of the derivative matrix $D_N$. When dealing with matrices, there are no boundary terms to worry about.

### 3.7. An eigenfunction of the summation operation

Because the exponential function is an eigenfunction of the finite-differences derivative, and according to Eq. (46), we can say that

$$\sum_{j=n}^m2\chi(v,\Delta)\,v\,e^{vq_j}=\sum_{j=n}^m2\chi(v,\Delta)\left(De^{vq}\right)_j=\sum_{j=n}^m\left(e^{vq_{j+1}}-e^{vq_{j-1}}\right)=e^{vq_{m+1}}+e^{vq_m}-e^{vq_n}-e^{vq_{n-1}},\tag{47}$$

in agreement with the corresponding continuous-variable equality $\int_a^x dx'\,v\,e^{vx'}=e^{vx}-e^{va}$. However, here we have to deal with two values at each boundary.

### 3.8. The chain rule

The chain rule also has a finite differences version. That version is

$$(Dg(h(q)))_j=\frac{g(h(q_{j+1}))-g(h(q_{j-1}))}{2\chi(v,\Delta)}=\frac{g(h(q_{j+1}))-g(h(q_{j-1}))}{2\chi(v,h(q_{j+1})-h(q_j))}\,\frac{2\chi(v,h(q_{j+1})-h(q_j))}{2\chi(v,\Delta)}=(D_hg(h))_j\,\frac{\chi(v,h(q_{j+1})-h(q_j))}{\chi(v,\Delta)},\tag{48}$$

where

$$(D_hg(h))_j\equiv\frac{g(h(q_{j+1}))-g(h(q_{j-1}))}{2\chi(v,h(q_{j+1})-h(q_j))}\tag{49}$$

is a finite-differences derivative of $g(h)$ with respect to $h$, and the second factor approaches the derivative of $h(q)$ with respect to $q$,

$$\frac{\chi(v,h(q_{j+1})-h(q_j))}{\chi(v,\Delta)}=\frac{h(q_{j+1})-h(q_j)+O\!\left(\left(h(q_{j+1})-h(q_j)\right)^3\right)}{\Delta+O(\Delta^3)}.\tag{50}$$

Thus, we recover the usual chain rule for continuous-variable functions in the limit $\Delta\to0$.

## 4. The commutator between coordinate and derivative

Let us determine the commutator, from a local point of view first, between the coordinate (the points of the partition $P_N$) and the finite-differences derivative. We begin with the derivative of $q$,

$$(Dq)_j=\frac{q_{j+1}-q_{j-1}}{2\chi(v,\Delta)}=\frac{\Delta}{\chi(v,\Delta)}\approx1-\frac{v^2}6\Delta^2.\tag{51}$$

Hence, the finite-differences derivative of the product $qg(q)$ is

$$(D(qg))_j=q_{j+1}(Dg)_j+g_{j-1}(Dq)_j=q_{j+1}(Dg)_j+g_{j-1}\,\frac{\Delta}{\chi(v,\Delta)},\tag{52}$$

i.e.,

$$(D(qg))_j-q_{j+1}(Dg)_j=g_{j-1}\,\frac{\Delta}{\chi(v,\Delta)}.\tag{53}$$

This is the finite-differences version of the commutator between the coordinate $q$ and the finite-differences derivative $D$. The right-hand side of this equality becomes $g_j$ in the small-$\Delta$ limit, i.e., the commutator becomes the identity operator, as expected. An equivalent expression is

$$(D(qg))_j-q_{j-1}(Dg)_j=g_{j+1}\,\frac{\Delta}{\chi(v,\Delta)}.\tag{54}$$

This is again the finite-differences version of the commutator between coordinate and derivative; the right-hand side of this equality also becomes $g_j$, i.e., the identity operator, in the small-$\Delta$ limit.

### 4.1. The commutator between the derivative and coordinate matrices

The commutator between the partition and the finite-differences derivative can also be calculated from a global point of view, using the corresponding matrices. Let $Q_N$ be the diagonal matrix which represents the coordinate partition,

$$Q_N\equiv\operatorname{diag}(q_1,q_2,\dots,q_N).\tag{55}$$

Then, the commutator between the derivative matrix and the coordinate matrix is

$$[D_N,Q_N]=\frac{\Delta}{2\chi(v,\Delta)}\begin{pmatrix}0&1&0&\cdots&0&0\\1&0&1&\cdots&0&0\\0&1&0&\cdots&0&0\\\vdots&&&\ddots&&\vdots\\0&0&0&\cdots&0&1\\0&0&0&\cdots&1&0\end{pmatrix}.\tag{56}$$

This is a kind of nearest-neighbors averaging operator inside the interval. The small-$\Delta$ limit is just

$$[D_N,Q_N]\to I,\tag{57}$$

where $I$ is the identity matrix with the first and last diagonal elements replaced by $1/2$. Thus, coordinate and derivative matrices are finite-differences conjugates of each other.
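The structure of the commutator in Eq. (56) is easy to confirm numerically (NumPy assumed; this sketch is not part of the original chapter):

```python
import numpy as np

N, v, Delta = 7, 0.3, 0.1
chi = np.sinh(v * Delta) / v

# Derivative matrix D_N of Eq. (1).
DN = (np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)) / (2 * chi)
DN[0, 0] = -np.exp(-v * Delta) / (2 * chi)
DN[-1, -1] = np.exp(v * Delta) / (2 * chi)

q = 2.0 + Delta * np.arange(N)
QN = np.diag(q)                  # coordinate matrix of Eq. (55)

# Eq. (56): [D_N, Q_N] equals Delta/(2 chi) times the nearest-neighbour matrix.
comm = DN @ QN - QN @ DN
neighbours = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
assert np.allclose(comm, Delta / (2 * chi) * neighbours)
print("commutator structure verified")
```

The corner diagonal entries of $D_N$ drop out of the commutator because $q_j-q_j=0$, which is why the first and last rows carry a single off-diagonal entry.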

## 5. An integration matrix

Since the determinant of the derivative matrix $D_N$ is not always zero, we expect that an inverse of it exists. At a local level, the inverse of the finite-differences derivation is the summation found in Eq. (46). In this section, we determine the inverse of the derivative matrix, and we find that it is a global finite-differences integration operation.

Once we know the eigenvalues and eigenvectors of the derivative matrix $D_N$, it turns out that we also know the eigenvectors and eigenvalues of the inverse matrix, when it exists. In fact, the equality $D_Ne_m=\lambda_me_m$, with $\lambda_m\ne0$, implies that

$$D_N^{-1}e_m=\lambda_m^{-1}e_m.\tag{58}$$

The inverse matrix $S_N=D_N^{-1}$ is, up to the rescaling of Eq. (3),

$$S_N=\frac{1}{z-1/z}\begin{pmatrix}1&-z&1&-z&\cdots&-z&1\\z&-1&1/z&-1&\cdots&-1&1/z\\1&-1/z&1&-z&\cdots&-z&1\\z&-1&z&-1&\cdots&-1&1/z\\\vdots&&&&\ddots&&\vdots\\1&-1/z&1&-1/z&\cdots&-1/z&1\end{pmatrix}.\tag{59}$$

Its determinant is, in the original scaling $S_N=D_N^{-1}$ and for odd $N$,

$$\left|S_N\right|=\frac{2^{N-1}\sinh^{N-1}(v\Delta)}{v^N}.\tag{60}$$

This matrix represents an integration on the partition, with an exact value when it is applied to the exponential function $e^{vq}$ on the partition. When applied to an arbitrary vector $g=(g_1,g_2,\dots,g_N)^T$, we obtain formulas for the finite-differences integration, including the edge points,

$$(S_Ng)_1=\frac{1}{z-1/z}\left[g_1+\sum_{i=1}^M\left(g_{2i+1}-z\,g_{2i}\right)\right],\tag{61}$$
$$(S_Ng)_{2j}=\frac{1}{z-1/z}\left[z\,g_1+\sum_{k=1}^{j-1}\left(z\,g_{2k+1}-g_{2k}\right)+\sum_{k=j}^M\left(\frac{g_{2k+1}}{z}-g_{2k}\right)\right],\tag{62}$$
$$(S_Ng)_{2j+1}=\frac{1}{z-1/z}\left[g_1+\sum_{k=1}^j\left(g_{2k+1}-\frac{g_{2k}}{z}\right)+\sum_{k=j+1}^M\left(g_{2k+1}-z\,g_{2k}\right)\right],\tag{63}$$
$$(S_Ng)_N=\frac{1}{z-1/z}\left[g_1+\sum_{i=1}^M\left(g_{2i+1}-\frac{g_{2i}}{z}\right)\right],\tag{64}$$

where $N=2M+1$. These are new formulas for discrete integration on a partition of equally separated points, with the characteristic that they are exact for the exponential function $e^{vq}$.
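Both the invertibility criterion of Eq. (26) and the exact action of the inverse on the exponential vector can be verified numerically (NumPy assumed; here the inverse of the rescaled matrix $\bar D_N$ is computed directly):

```python
import numpy as np

v, Delta = 0.3, 0.1
z = np.exp(v * Delta)

def Dbar(N):
    """Rescaled derivative matrix of Eq. (2)."""
    M = np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
    M[0, 0] = -1.0 / z
    M[-1, -1] = z
    return M

# Eq. (26): even dimensions are singular, odd ones are not.
assert np.isclose(np.linalg.det(Dbar(6)), 0.0)
assert np.isclose(np.linalg.det(Dbar(7)), 2 * np.sinh(v * Delta))

# For odd N the inverse acts exactly on the exponential vector:
# since Dbar e = (z - 1/z) e, we get S e = e / (z - 1/z).
N = 7
S = np.linalg.inv(Dbar(N))
e = z ** np.arange(N)
assert np.allclose(S @ e, e / (z - 1.0 / z))
print("integration matrix checks passed")
```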

## 6. Transformation between coordinate and derivative representations

Since one of the eigenvalues of the derivative matrix is a continuous variable, we can talk of conjugate functions with a continuous argument $v$. The relationship between discrete vectors on a partition $\{q_i\}$ and functions with a continuous argument $v$ makes use of continuous and discrete Fourier types of transformations, a wavelet [12]. If we have a function $h$ of continuous argument $v$, a conjugate vector on the partition $\{q_i\}$ is defined through the continuous-type Fourier transform $\mathcal F$ as

$$(\mathcal Fh)(q_j)\equiv\frac{1}{\sqrt{2\Delta L}}\int_{-L/2}^{L/2}e^{-iq_jv}\,h(v)\,dv,\tag{65}$$

and, vice versa, a continuous-variable function is defined with the help of a discrete-type Fourier transform $\mathcal F^\dagger$ as

$$(\mathcal F^\dagger g)(v)\equiv\frac{1}{\sqrt{2\Delta L}}\sum_{j=-N+1}^{N-1}2\chi(v,\Delta)\,e^{iq_jv}g_j.\tag{66}$$

Assuming that the involved integrals converge absolutely, we can say that

$$(\mathcal F\mathcal F^\dagger g)(q_j)=\frac{1}{\sqrt{2\Delta L}}\int_{-L/2}^{L/2}e^{-iq_jv}\,\frac{1}{\sqrt{2\Delta L}}\sum_{k=-N+1}^{N-1}2\chi(v,\Delta)\,e^{iq_kv}g_k\,dv=\frac{1}{\Delta L}\sum_{k=-N+1}^{N-1}g_k\int_{-L/2}^{L/2}e^{i(q_k-q_j)v}\sinh(v\Delta)\,\frac{dv}v=\sum_{k=-N+1}^{N-1}g_k\,K(q_k-q_j;L,\Delta),\tag{67}$$

where

$$K(q_k-q_j;L,\Delta)\equiv\frac{1}{\Delta L}\int_{-L/2}^{L/2}e^{i(q_k-q_j)v}\sinh(v\Delta)\,\frac{dv}v=\frac{1}{\Delta L}\left[\operatorname{shi}\!\left(\frac L2\left(\Delta+i(q_k-q_j)\right)\right)+\operatorname{shi}\!\left(\frac L2\left(\Delta-i(q_k-q_j)\right)\right)\right].\tag{68}$$

The function $K(q_k-q_j;L,\Delta)$ is an approximation to the Kronecker delta $\delta_{k,j}$. The function $\operatorname{shi}$ is the hyperbolic sine integral, $\operatorname{shi}(z)=\int_0^z\sinh(t)\,dt/t$. A plot of it is shown in Figure 1.

$$(\mathcal F^\dagger\mathcal Fh)(v)=\frac{1}{\sqrt{2\Delta L}}\sum_{j=-N+1}^{N-1}2\chi(v,\Delta)\,e^{iq_jv}\,\frac{1}{\sqrt{2\Delta L}}\int_{-L/2}^{L/2}e^{-iq_ju}h(u)\,du=\int_{-L/2}^{L/2}du\,h(u)\,J(v-u;N),\tag{69}$$

where

$$J(x;N)\equiv\frac{2\chi(v,\Delta)}{2\Delta L}\sum_{j=-N+1}^{N-1}e^{iq_jx}=\frac{\chi(v,\Delta)}{\Delta L}\sum_{j=-N+1}^{N-1}e^{ijx\Delta}=\frac{\chi(v,\Delta)}{\Delta L}\,\frac{\sin\left((N-1/2)x\Delta\right)}{\sin(x\Delta/2)},\tag{70}$$

where we have used $q_j=j\Delta$.

The ratio of sine functions in this expression is an approximation to a series of Dirac delta functions located at $(v-u)\Delta=2\pi k$, $k\in\mathbb{Z}$. Thus, the operations $\mathcal F$ and $\mathcal F^\dagger$ are finite-differences inverses of each other.
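The closed form of the Dirichlet-type kernel used in Eq. (70), $\sum_{j=-N+1}^{N-1}e^{ij\theta}=\sin((N-1/2)\theta)/\sin(\theta/2)$, can be checked directly (NumPy assumed):

```python
import numpy as np

N = 5
for theta in (0.3, 1.1, 2.0):
    # Sum of 2N-1 complex exponentials, j = -(N-1), ..., N-1.
    lhs = sum(np.exp(1j * j * theta) for j in range(-(N - 1), N))
    # Closed form: the Dirichlet kernel.
    rhs = np.sin((N - 0.5) * theta) / np.sin(theta / 2)
    assert np.isclose(lhs, rhs)
print("Dirichlet kernel identity verified")
```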

### 6.1. The discrete Fourier transform of the finite differences derivative of a vector

Next, based on Eq. (28), we find that

$$\left(D\left(e^{iqv}g\right)\right)_j=g_{j+1}\left(De^{iqv}\right)_j+e^{iq_{j-1}v}(Dg)_j=iv\,g_{j+1}e^{iq_jv}+e^{iq_{j-1}v}(Dg)_j.\tag{71}$$

If we sum this equality, we get

$$\sum_{j=-N+1}^{N-1}2\chi(v,\Delta)\left(D\left(e^{iqv}g\right)\right)_j=iv\sum_{j=-N+1}^{N-1}2\chi(v,\Delta)\,g_{j+1}e^{iq_jv}+\sum_{j=-N+1}^{N-1}2\chi(v,\Delta)\,e^{iq_{j-1}v}(Dg)_j,\tag{72}$$

i.e.,

$$\mathcal F_N^\dagger(Dg)(v)=iv\,\mathcal F_{N+1}^\dagger(g)(v)+\frac{e^{iv\Delta}}{\sqrt{2\Delta L}}\left(\sum_{j=-N+2}^{N}e^{iq_jv}g_j-\sum_{j=-N+1}^{N-1}e^{iq_jv}g_j\right),\tag{73}$$

where $\mathcal F_{N+1}^\dagger$ denotes the discrete transform of Eq. (66) with the summation range shifted by one site.

Therefore, the discrete Fourier transform of the derivative of a vector $g$ is $iv$ times the discrete Fourier transform of $g$, plus boundary terms.

The Fourier transform of the derivative of a function of the continuous variable $v$ is easily found if we consider the equality

$$\frac{d}{dv}e^{-iq_jv}=-iq_j\,e^{-iq_jv}.\tag{74}$$

The integration of this equality with appropriate weights gives

$$iq_j\int_{-L/2}^{L/2}dv\,e^{-iq_jv}h(v)=\int_{-L/2}^{L/2}dv\,e^{-iq_jv}\frac{dh(v)}{dv}-\left.e^{-iq_jv}h(v)\right|_{v=-L/2}^{L/2},\tag{75}$$

i.e.,

$$\left(\mathcal F\frac{dh}{dv}\right)(q_j)=iq_j\,(\mathcal Fh)(q_j)-\frac{1}{\sqrt{2\Delta L}}\left.e^{-iq_jv}h(v)\right|_{v=-L/2}^{L/2}.\tag{76}$$

Hence, as is usual, the Fourier transform of the derivative of a function $h(v)$ of the continuous variable $v$ is equal to $iq_j$ times the Fourier transform of the function, plus boundary terms.

## 7. Conclusion

We proceed with a brief discussion of the relationship between the derivative matrix $D_N$ and an important concept in quantum mechanics: the concept of self-adjoint operators [8, 9]. In particular, we focus on the momentum operator, whose continuous coordinate representation is $-i\,d/dq$, i.e., a derivative times $-i$, in the case of an infinite-dimensional Hilbert space.

Consider the finite-dimensional complex vector space in which each vector defines a sequence $\{g_i\}_{i=1}^N$ of complex numbers such that $\sum_i|g_i|^2<\infty$. A transformation $A$ is usually called Hermitian when its entries $a_{i,j}$ are such that $a_{i,j}=a_{j,i}^*$ (the asterisk denotes complex conjugation). Our matrix $D_N$ is related to an approximation of the derivative (see Section 3) which uses second-order finite differences. Therefore, we can ask whether the matrix $-iD_N$ is also Hermitian.

Let $P_N=-iD_N$, and let $v=ix$ be the eigenvalue of $D_N$, where $x$ is a free real parameter; the corresponding eigenvalue of $-iD_N$ is then the real value $x$, which is one of the properties of a Hermitian matrix, as is also the case in infinite-dimensional space (for the Hilbert space on a finite interval these values are discrete, and for the Hilbert space on the real line they conform the continuous spectrum, instead of discrete eigenvalues). Another characteristic of $-iD_N$ is that the eigenvector corresponding to $x$ is the same exponential function which is the eigenfunction of $-i\,d/dx$ (see Section 2).

Furthermore, let $P_N^\dagger$ denote the adjoint of $P_N$. If we restrict our attention to the off-diagonal entries $(P_N)_{i,j}=-i\,d_{i,j}$, it is fulfilled that $(P_N^\dagger)_{i,j}=(-i\,d_{j,i})^*=i\,d_{j,i}=-i\,d_{i,j}=(P_N)_{i,j}$, noticing that, with $v=ix$, $\chi(v,\Delta)=\sin(x\Delta)/x$ is real, so that the off-diagonal entries $d_{i,j}$ are real and antisymmetric. Even more, if we do not care about the two entries $d_{i,i}$ for $i=1,N$, we will have a Hermitian matrix. Finally, as was seen in Section 4, we can say that $P_N$ can be considered a suitable approximation to the conjugate matrix of the coordinate matrix.

In conclusion, we have introduced a matrix with the properties that a Hermitian matrix should comply with, except for two of its entries. Besides, our partition provides congruency between the discrete, continuous, and matrix treatments of the exponential function and of its properties.


© 2018 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

## How to cite

Armando Martínez Pérez and Gabino Torres Vega (August 29th, 2018). Matrices Which are Discrete Versions of Linear Operations. In: Hassan A. Yasser (ed.), Matrix Theory - Applications and Theorems. IntechOpen. DOI: 10.5772/intechopen.74356.
