Some Foundational Issues in Quantum Information Science

This chapter has three Parts. Part 1 attempts to analyze the concept "information" (in some selected contexts where it has been used) in order to understand the consequences of representing and processing information quantum mechanically. There are at least three views on 'Information', viz., 'Semantic Naturalism', 'the Quantum Bayesian Approach' and the 'Information is Physical' approach. These are then critically examined, and in the end one is given preference. Part 2 of the chapter then goes on to discuss the manner in which the study and quantification of the "Qubit" (quantum bit), Superposition and Entanglement comprise the three pillars of Quantum Information Science and enable the discipline to develop the theory behind applications of quantum physics to the transmission and processing of information. In Part 3 we take up the issue that, although it might appear bewildering, the physical approach to Quantum Information Science is equally proficient in dealing with its impact on questions of "consciousness" and "freewill," and on biological questions in the area known as "bioinformatics."


Introduction
Quantum Information Science addresses the question as to how the fundamental laws of quantum physics can be exploited in order to explain in what way information is acquired, transmitted and processed, by drawing insights from various subfields of physical sciences, computer science, mathematics, and engineering. Quantum Information Science also combines fundamental research with practical applications.
The history of quantum information began at the turn of the 20th century when classical physics was revolutionized into quantum physics. The field of quantum information bloomed two decades ago when scientists realized that quantum physics could be used to transmit and process information in more efficient and secure ways. The development of quantum algorithms and communication protocols, as well as the possibilities of implementing them with different physical systems, has established quantum information science as one of the most promising fields for the 21st century.
The emergence of Information theory as studies of the transmission, processing, extraction, and utilization of information received immediate worldwide attention in the late forties. It was made possible by the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" [1].
Shannon for the first time introduced the qualitative and quantitative models of communication as statistical processes underlying information theory. Thus, Information theory often concerns itself with measures of information distributions and their application.
There is an urgent need to examine the foundational principles of quantum information and quantum physics in order to understand how we can dramatically improve their applications. The prospective utility of quantum computers has opened the possibility of simulating the complex quantum systems that appear in fields such as condensed matter physics, high energy physics or chemistry. To do this, it is often necessary to build a scalable quantum computer (often called a quantum simulator), and not necessarily an analog one. Quantum information science also has strong connections with quantum sensing and metrology, quantum simulation, quantum networks, and quantum dynamics. Issues in Quantum Information Science have also found applications in areas including statistical inference, cryptography, neurobiology, perception, linguistics, bioinformatics, quantum computing, information retrieval, plagiarism detection, pattern recognition, anomaly detection, biology and many other areas.
Dretske articulates his notion of information and defines it in following way: a state type T carries information of type p if there is a nomological or counterfactual regularity (perhaps a ceteris paribus law) to the effect that if a T occurs, p obtains. So, for example, the height of mercury in a thermometer carries information about the ambient temperature. Dretske's idea is to construct the content of beliefs out of the information that they carry under certain circumstances.
Thus, a signal correlated with p will fail to carry the information that p if the correlation is merely accidental or statistical: my thermometer carries information about the temperature of my room and not somebody else's room, even if the two rooms have the same temperature. It is because the state of my thermometer supports counterfactuals about the temperature of my room but not about the temperature of somebody else's room. Hence, it is a true generalization that if the temperature of my room were different, the state of my thermometer would be different. In contrast, it is not generally true that if the temperature of somebody else's room were different, the state of my thermometer would be different.
According to Dretske the engineering aspects of mechanical communication systems are relevant and he goes on to demonstrate precisely what their relevance is. Dretske's proposal is linking the information theory to the amount of information that an individual event carries about another event or state of affairs. He argues that if a signal is to carry the information that q it must, among other things, carry as much information as is generated by the obtaining of the fact that q. He says: How, for example, do we calculate the amount of information generated by Edith's playing tennis? … [O]ne needs to know: (1) the alternative possibilities … (2) the associated probabilities … (3) the conditional probabilities … Obviously, in most ordinary communication settings one knows none of this. It is not even very clear whether one could know it. What, after all, are the alternative possibilities to Edith's playing tennis? Presumably there are some things that are possible (e.g., Edith going to the hairdresser instead of playing tennis) and some things that are not possible (e.g., Edith turning into a tennis ball), but how does one begin to catalog these possibilities? If Edith might be jogging, shall we count this as one alternative possibility? Or shall we count it as more than one, since she could be jogging almost anywhere, at a variety of different speeds, in almost any direction? ( [6], p. 53) There might be problems in specifying absolute amounts of information; but it is comparative amounts of information with which Dretske is concerned, in particular, with whether a signal carries as much information as is generated by the occurrence of a specified event, whatever the absolute values might be.

Criticisms
In our system of communication and information thus far, there is an apparent failure to provide a satisfactory naturalized account of meaning or semantics. The important reason for this is that language, being a rule governed activity, has an essential normative component that cannot be captured by any naturalistic explanation. The impetus behind this line of thought derives from Wittgenstein's reflections on meaning and rule-following ( [7], p. 53).
Secondly, we sometimes learn certain beliefs under circumstances in which the information the belief states carry is not the beliefs' content. Take for example a child who learns to token a belief with a content about tigers by seeing pictures of tigers. In such cases her belief states carry information about pictures, in spite of the fact that their content is about tigers. Dretske's account will end up assigning the wrong truth-conditional contents to these beliefs.
Thirdly, Dretske offers a teleological characterization of the relevant belief-state tokens: the information they carry is what fixes the content of the beliefs. His main idea is that the informational content fixes the belief's semantic content in those instances of the belief state that are reinforced by the relevant behavior they produce. Although this is a naturalistic characterization of this class of beliefs, it is debatable whether it assigns appropriate contents. One may easily come up with situations in which a false token of a belief produces the relevant behavior.
Finally, it is believed that informational theories are the most promising proposals for reconciling naturalism with intentional realism. However, it remains to be shown that there is an informational theory of content that satisfies the constraint, viz. that 'Ps cause Ss' is a law (where S has the property P as its content). Of course, this does not mean that no informational theory can succeed. However, it does mean that, so far, appeals to information have not resolved the problem of naturalizing content.

The quantum theory as usually presented in terms of Bayesian approach
"To suppose that, whenever we use a singular substantive [e.g. "information"], we are, or ought to be, using it to refer to something, is an ancient, but no longer a respectable, error." [8]. Taking Strawson's stricture against construing 'information' as a singular substantive, a disembodied abstract entity, as "an ancient, but no longer a respectable error," we need to look at the probability approach as a possible alternative.
The fallacy of construing information as a disembodied abstract entity can be avoided by taking information as a range of possible results with varying probabilities. Abstractly, information can be thought of as the resolution of uncertainty. While probability theory allows us to make uncertain statements and reason in the presence of uncertainty, information allows us to quantify the amount of uncertainty in a probability distribution. Let us take an example: suppose we wish to compute the probability that a poker player will win a game given that she holds a certain set of cards; exactly the same probability formulas would be used to compute the probability that a patient has a specific disease when we observe that she has certain symptoms. The reason is as follows: probability theory provides a set of formal rules for determining the likelihood of a proposition being true given the likelihood of other propositions.
Until the second half of the 18th century, there was no branch of the mathematical sciences called "Probability Theory"; the subject was known simply by the rather odd-sounding name "Doctrine of Chances". An article called "An Essay towards solving a Problem in the Doctrine of Chances", authored by Thomas Bayes [9], was read to the Royal Society and published in the Philosophical Transactions of the Royal Society of London in 1763.
In this essay, Bayes described a simple theorem concerning joint probability which gives rise to the calculation of inverse probability. This is called Bayes' Theorem. It shows that there is a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design.
In its simplest form the theorem states that P(H|D) = P(D|H) P(H) / P(D). What is conveyed by this formula is that we update our belief, i.e. the prior probability P(H), after observing the data or evidence D (via the likelihood P(D|H)), and we assign the updated degree of belief the term posterior probability, P(H|D). Our starting point could be a mere belief, but each data point will either strengthen or weaken that belief, and this is how we update our beliefs or hypotheses all the time.
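This updating can be sketched with a small numerical example. The numbers below (a disease's prevalence, and a test's sensitivity and false-positive rate) are assumptions chosen for illustration, not values from the text:

```python
# Sketch of a Bayesian update with assumed numbers: a disease with 1%
# prevalence and a test that is 90% sensitive with a 5% false-positive rate.
prior = 0.01                 # P(disease)
p_pos_given_disease = 0.90   # likelihood P(+ | disease)
p_pos_given_healthy = 0.05   # P(+ | no disease)

# Total probability of a positive result (the evidence P(D))
p_pos = p_pos_given_disease * prior + p_pos_given_healthy * (1 - prior)

# Bayes' theorem: posterior = likelihood * prior / evidence
posterior = p_pos_given_disease * prior / p_pos
assert 0.15 < posterior < 0.16   # one positive test raises a 1% belief to ~15.4%
```

Note how the evidence strengthens the initial belief without making it certain, exactly as the text describes: further data points would update the posterior again.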

Objective certainty in finite probability spaces
In 1935, Einstein, Podolsky, and Rosen proposed the following sufficient condition for reality: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity. [In this sense], can quantum-mechanical description of physical reality be considered complete?" [10]. Rudolf Carnap and Yehoshua Bar-Hillel [11], in 'An Outline of a Theory of Semantic Information', likewise take a probabilistic approach, one that capitalizes on the notion of the uncertainty of a piece of information in a given probability space.

The essential claim of quantum Bayesian approach
Quantum theory (as usually presented with the Born Rule, in its simplest form) states that the probability density of finding a particle at a given point, when measured, is proportional to the square of the magnitude of the particle's wave function at that point. It provides an algorithm for generating probabilities for alternative outcomes of a measurement of one or more observables on a quantum system. Traditionally these probabilities are regarded as objective. On the other hand, a subjective Bayesian or personalist view of quantum probabilities regards quantum state assignments as subjective.

Critical remarks
At the turn of the 21st century Quantum Bayesianism emerged as a result of the collaborative work among Caves et al. [12].
First, although the word "Bayesian" does not by itself carry a commitment to denying objective probability, a "Quantum Bayesian" insists that probability has no physical existence even in a quantum world. The probability ascriptions arising from a particular quantum state are understood in a purely Bayesian manner. Caves, Fuchs, and Schack counter Einstein, Podolsky, and Rosen's argument that the quantum description is incomplete by giving up all objective physical probabilities. They would rather identify probability 1 with an agent's subjective certainty.
Secondly, the quantum state ascribed to an individual system is understood to represent a compact summary of an agent's degrees of belief about what the results of measurement interventions on a system are. Thus, an agent's degree of belief in terms of Quantum Bayesian approach is quite subjective and hence it would be characterized by a non-realist view of the quantum state.

'Information is Physical' Approach: An alternative
The claim that 'information is physical' means that the laws of Quantum Mechanics can be used to process and transmit information in ways that are not possible with classical systems.
Thus, Classical Information Theory is the mathematical theory of information that involves processing tasks such as storage and transmission of information, whereas Quantum Information Theory is the study of how such tasks can be accomplished using quantum mechanical systems.

Foundational issues
Quantum Physics, ever since it was advanced in the 1920s, has led to countless discussions about its meaning and about how to interpret the theory correctly. These discussions relate to the issues like the Einstein-Podolsky-Rosen paradox, quantum nonlocality and the role of measurement in quantum physics and several others. For example, in stating their paradox on the basis of a certain restricted set of correlations for a pair of systems in a particular entangled state (explained below), Einstein et al. [10], claimed that the phenomenon of entanglement conflicts with certain basic realist principles of separability and locality that all physical theories should respect. Otherwise we have to regard quantum states as 'incomplete' descriptions of reality.
Challenging Einstein in 1927 during the fifth Solvay Conference on Electrons and Photons (held from October 24 to 29), which championed Quantum Theory, physicist Niels Bohr argued that the mere act of indirectly observing the atomic realm changes the outcome of quantum interactions. Nevertheless, according to Bohr, quantum predictions based on probability accurately describe reality. The so-called Copenhagen interpretation, a collection of views about the meaning of quantum mechanics principally attributed to Niels Bohr and Werner Heisenberg, also emerged in 1927: Bohr presented his view on quantum mechanics for the first time, and it came to be called the Copenhagen interpretation in honor of Bohr's home city. It combined his own idea of particle-wave complementarity with Max Born's probability waves and Heisenberg's uncertainty principle.
Earlier around 1926, Erwin Schrödinger had already developed a mathematical formula to describe such "matter waves", which he pictured as some kind of rippling sea of smeared-out particles. But Max Born showed that Schrödinger's waves are, in effect, "waves of probability". They encode the statistical likelihood that a particle will show up at a given place and time based on the behavior of many such particles in repeated experiments. When the particle is observed, something strange appears to happen: the wave-function "collapses" to a single point, allowing us to see the particle at a particular position.
In recent years research into the very foundations of quantum mechanics has given rise to the present field, i.e. Quantum Information Science and Technology. Thus the use of quantum physics could revolutionize the way we process and communicate information. The slogan that 'Information is Physical' is often presented as the fundamental insight at the heart of quantum information theory; after all 'information' is an abstract noun referring to something physical, transmitted from one point to another and it is frequently claimed to be entailed, or at least suggested, by the theoretical and practical advances of quantum information and computation.

Claude Shannon
The concept of information, and the technical notions of information, derive from the work of Claude Shannon in his "A Mathematical Theory of Communication" [1]. It is tempting to read Shannon's concept of information as the irreducible meaning content of a message, specified in bits that somehow possess their own intrinsic meaning. However, we must note first that the semantic aspects of communication did not concern Shannon. His notion of "information" is often called "mathematical information", and it names a branch of study which deals with quantitative measures of information. For example, a binary digit, or bit, can represent two different states. Two bits can store four states: 00, 01, 10 and 11. Three bits can store eight states, and so on. This can be generalized by the formula log2(x), which gives the number of bits needed when x represents the number of possible symbols in the system.
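The counting rule above can be sketched in a few lines; the 26-symbol alphabet at the end is an assumed example, not from the text:

```python
import math

# Sketch of the bit-counting rule: n bits distinguish 2**n states,
# so x equally likely states require log2(x) bits.
assert 2 ** 1 == 2 and 2 ** 2 == 4 and 2 ** 3 == 8
assert math.log2(4) == 2.0
assert math.log2(8) == 3.0

# A system with 26 symbols (an assumed example, e.g. an alphabet) needs a
# non-integer number of bits per symbol; a fixed-length code must round up.
assert math.ceil(math.log2(26)) == 5
```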
Secondly, Shannon, in his mathematical theory of information, introduces the term "entropy." Entropy is a key measure in information theory: it quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. We can illustrate it as follows: identifying the outcome of a fair coin flip, with two equally likely outcomes, provides less information, i.e. has lower entropy, than specifying the outcome of a roll of a die with six equally likely outcomes.
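The coin-versus-die comparison can be checked directly with Shannon's entropy formula, H = -Σ p·log2(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]     # fair coin: two equally likely outcomes
die = [1 / 6] * 6     # fair die: six equally likely outcomes

assert math.isclose(entropy(coin), 1.0)            # exactly 1 bit
assert math.isclose(entropy(die), math.log2(6))    # about 2.585 bits
assert entropy(coin) < entropy(die)                # the die outcome is more informative
```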
Shannon borrowed the term "entropy" from John von Neumann. However, in Shannon's undertaking, the notion of 'information entropy' is a measure of the uncertainty, corresponding to the unpredictability, of a piece of information. Thus, it is claimed that information that is highly probable, and hence more predictable, has a lower entropy value than less predictable information, since a highly predictable message discloses less about the world.
Finally, the engineering aspect of communication can be specified in bits, which constitute the physical aspect of the message and yet somehow carry the meaning of the message from one point to another through encoding and decoding. However, in Shannon's mathematical theory of information, the messages in question need not have meaning. For example, while we talk on a telephone, what is transmitted is not what is said into the telephone but an analogue signal. This analogue signal records the sound waves made by the speaker, and is transmitted digitally after an encoding. Thus, a communication system consists of an information source, a transmitter or encoder, a (possibly noisy) channel, and a receiver or decoder. These are the physical aspects of the message, and they are what mainly concern information scientists and engineers.
Jon Barwise and Jerry Seligman [13] identify the 'inverse relationship principle'. The inverse relationship principle says that the informativeness of a piece of information increases as its probability decreases. This position is closely linked to the notion of information entropy. They claim that the quantification of semantic content demonstrates a firm relationship between semantic information and the mathematical quantification of data, previously envisioned by Shannon.

Rolf Landauer
Perhaps the most vociferous proponent of the idea that 'information is physical' was the late Rolf Landauer. In his articles and the work related to them [14-17], Landauer made a very important new observation: that information is not independent of the physical laws used to store and process it. Information is physical, or is a fundamental constituent of the universe. Landauer's point is that whenever we find information, we find it inscribed or encoded somehow in a physical medium of whatever kind.
Although modern computers rely on quantum mechanics to operate, the information itself is still encoded classically.
"Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe … information is inevitably inscribed in a physical medium" ( [17], p. 63, 64).
Moreover, it seems that Quantum Information Theory itself provides an apt illustration of the claim that 'Information is Physical'. But why is it that this claim is being made?
Since Landauer's very first work on the subject, "Dissipation and heat generation in the computing process" [14], it has been argued that information has a physical nature. As Galindo and Martin-Delgado point out in [18], information is normally printed on a physical support, cannot be transported faster than light in vacuum, and abides by natural laws. Moreover, they maintain that the statement that information is physical does not simply mean that a computer is a physical object, but in addition that information itself is a physical entity. In turn, this implies that the laws of information are restricted or governed by the laws of physics, in particular those of quantum physics. Thus, information is not a disembodied abstract entity; it is always tied to a physical representation.
One of the first important results supporting the idea that "information is physical" was Landauer's erasure principle. It concerns the minimum amount of energy that has to be dissipated by a computing device when erasing one bit of information, and it states that the erasure of information is inevitably accompanied by the generation of heat. Bennett states the principle in the following way: Landauer's erasure principle claims that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment" [19]. It must be emphasized that Landauer's principle is valid both in classical and quantum physics.
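The minimum dissipation Landauer's principle refers to is kB·T·ln(2) per erased bit. A quick sketch of the magnitude involved, assuming room temperature (T = 300 K is an assumed value):

```python
import math

# Sketch of the Landauer bound: erasing one bit dissipates at least
# k_B * T * ln(2) of heat, where k_B is Boltzmann's constant.
k_B = 1.380649e-23   # J/K (exact, by the SI definition)
T = 300.0            # assumed room temperature, in kelvin

E_min = k_B * T * math.log(2)   # minimum heat per erased bit, in joules
assert math.isclose(E_min, 2.871e-21, rel_tol=1e-3)   # a few zeptojoules
```

The tiny scale of this bound is why it went unnoticed by engineering for decades, while its universality is what makes it philosophically significant.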
Let us take a look at two ways the experts reacted to this view: The question is: do the truly fundamental laws of nature concern, not waves and particles, but "information"?
According to one view, the truly fundamental laws of Nature concern information, not waves or particles, and this is taken to be the basic postulate. For example, it is known that quantum key distribution is possible but quantum bit commitment is not, and that nature is nonlocal (though not as nonlocal as causality alone would allow).
According to the other view: "Information is information, not matter or energy" ( [20], p. 132).
This view would be supported by Shannon. For Shannon, what a sender transmits to a receiver is not information but a message. In defining information, Shannon is strictly concerned with the potential selections of messages or, more precisely, of the signs available to codify them. Shannon's theory does not come to grips with communication as transmission of meaning or with information as the meaning of a message. His theory is mainly concerned with the codification and transmission of messages, and it equates two terms that are apparently opposed, namely information and uncertainty. What Shannon aims to quantify is not an 'information flow' [6], but the transmission of messages, which can be continuous, discrete or mixed. This transmission is based on a medium or, more precisely, on a messenger, and is understood as a formal relation between messages.

The view that 'Information is Physical' is the Foundation of Quantum Information Theory
Claude Shannon, in a truly remarkable paper [1], laid down the foundations of the subject. In this paper Shannon states the fundamental problem of communication as "that of reproducing at one point either exactly or approximately a message selected at another point" [1], and quantum information theory inherits this same problem for information carried by quantum systems. Quantum Information Theory is much richer and more complex than its classical counterpart, and it is inherently interdisciplinary in nature, since it touches on multiple fields and brings physicists, computer scientists, and mathematicians together on common goals.
It is far from being complete but has already found application areas well beyond the processing and transmission of information. In particular, it provides a new perspective to investigate, characterize, and classify the complex behavior of large quantum systems, ranging from materials to chemical compounds, high energy problems, and even holographic principles.
Nevertheless, even if Quantum Information Theory reinforces the notion that 'Information is Physical,' based on quantum physics, the notion itself is also relevant within classical physics.

Shannon's definition of quantity of information
Shannon defined the 'quantity of information' produced by a source, for example the quantity in a message, by a formula similar to the equation that defines thermodynamic entropy in physics. In classical thermodynamics, entropy is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system: it predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy.
Shannon introduced, as his most basic term, informational entropy: the number of binary digits required to encode a message. This might appear today to be a simple, even obvious, way to define how much information is in a message. However, in 1948, at the very origin of the information age, the digitization of information of any sort was a revolutionary step. Shannon's 1948 paper may have been the first published use of the word "bit," short for binary digit.
Shannon's paper contained two theorems. The first of these is the source coding theorem, which gives a formula for how much a source emitting random signals can be compressed, while still permitting the original signals to be recovered with high probability.
The second theorem, the channel coding theorem, states that, with high probability, n uses of a noisy channel N can communicate about nC bits reliably, where C is the channel capacity.
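The channel coding theorem can be illustrated with the standard textbook example of a binary symmetric channel, whose capacity is C = 1 - H(p) for flip probability p. The specific numbers below are assumed for illustration:

```python
import math

# Sketch: capacity of a binary symmetric channel with flip probability p is
# C = 1 - H(p), where H is the binary entropy function (in bits per use).
def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    return 1.0 - binary_entropy(p)

assert bsc_capacity(0.0) == 1.0                  # noiseless: 1 bit per use
assert math.isclose(bsc_capacity(0.5), 0.0)      # pure noise: nothing gets through

# Assumed example: n = 10000 uses with p = 0.1 carry about n*C bits reliably.
n = 10000
assert round(n * bsc_capacity(0.1)) == 5310
```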
Thus, a new approach emerges as a result of treating information as a quantum concept and to ask what new insights can be gained by encoding this information in individual quantum systems.

Generalizations and Laws in quantum information science
While we often treat information in abstract terms (especially in the context of computer science), it is more correct to think of information as being represented as different physical states and obeying the laws of physics.
However, what does it mean to say that information obeys the laws of physics? In Quantum Information Theory this amounts to claiming that both the transmission and processing of information are governed by quantum laws defined in terms of "qubits" (and not by the classical "bits"). Since qubits behave quantumly, in terms of quantum probabilities, we can also capitalize on them to explain the two most important phenomena of quantum information science, viz. "superposition" and "entanglement."

"Qubit"
The term for a classical physical system that exists in two unambiguously distinguishable states, representing 0 and 1, is a 'bit.' It is commonly acknowledged that the elementary quantity of information is the bit, which can take on one of two values, usually "0" and "1". Any physical realization of a bit requires a system with two well-defined states. For example, in a switch, off represents "0" and on represents "1". A bit can also be represented by a certain voltage level in a logical circuit, a pit in a compact disc, a pulse of light in a glass fiber, or the magnetization on a magnetic tape. For classical systems it is helpful to have the two states separated by a large energy barrier so that the value of the bit cannot change spontaneously.
On the other hand, in quantum information science the basic variable is the "qubit": a quantum variant of the bit. In order to encode information in a two-state quantum system, it is customary to designate the two quantum states |0⟩ and |1⟩. The term "qubit" seems to have been used first by Benjamin Schumacher [21] in his "Quantum coding." Electrons possess a quantum feature called spin, a type of intrinsic angular momentum. In the presence of a magnetic field, the electron may exist in two possible spin states, usually referred to as 'spin up' and 'spin down'.
One of the innovative and unusual features of Quantum Information Science is the idea of "superposition" (explained below) of different states. A quantum system can be in a "superposition" of different states at the same time. Consequently, a quantum bit can be in both the |0⟩ state and the |1⟩ state simultaneously. This new feature has no parallel in classical information theory. Schumacher in [21] coined the word "qubit" to describe a quantum bit.
The job of the symbols "|" and "⟩" (the so-called "bra-ket" notation, introduced by Paul Dirac in [22]) is essentially to remind us that, mathematically, we are talking about vectors that represent qubit states labeled 0 and 1, and that, physically, those vectors represent states of some quantum system. This helps us to distinguish them from things like the bit values 0 and 1 or the numbers 0 and 1. One way to represent this mathematically is to use two orthogonal vectors.
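A minimal sketch of a qubit state as a pair of complex amplitudes, with the Born rule giving the measurement probabilities; the particular amplitudes chosen (an equal superposition with a relative phase) are an assumed example:

```python
import math

# Sketch: a qubit state as amplitudes (alpha, beta) for |0> and |1>;
# the Born rule gives measurement probabilities |alpha|^2 and |beta|^2.
alpha = complex(1 / math.sqrt(2), 0)   # assumed equal superposition
beta = complex(0, 1 / math.sqrt(2))    # with a relative phase of i

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)                    # normalization
assert math.isclose(p0, 0.5) and math.isclose(p1, 0.5)
```

Note that the phase does not change the two probabilities here; it only matters when states are combined or transformed, which is where quantum behavior departs from classical bits.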
It is also claimed that in Quantum Computing a "qubit" carries information. The question is where is the extra information kept. The usual answer is that the extra information lies embedded in a superposition of entangled states (the two terms will be explained below). The peculiar feature of this is that any accessing of the information destroys the superposition and with it the extra information itself.
Suppose that the two vectors |0⟩ and |1⟩ are orthonormal. This means they are both normalized (a normalized vector is a vector in the same direction but with norm, i.e. length, 1) and orthogonal (Figure 1), i.e. at right angles. Now consider a situation in which the two vectors |0⟩ and |1⟩ are linearly independent: we cannot describe |0⟩ in terms of |1⟩, or vice versa. Nevertheless, it is feasible to describe all possible vectors in 2D space using the vectors |0⟩ and |1⟩ together with our rules of addition and multiplication by scalars. The vectors |0⟩ and |1⟩ form a "basis" because (i) they are linearly independent, and (ii) they can be used to describe any vector in 2D space (Figure 2) using vector addition and scalar multiplication. When the basis vectors are, in addition, both orthogonal and normalized, they constitute an "orthonormal" basis.
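The orthonormality and basis-decomposition claims above can be sketched with real 2D vectors (a simplification of the general complex case; the coefficients 0.6 and 0.8 are an assumed example):

```python
import math

# Sketch: |0> and |1> as real 2D vectors forming an orthonormal basis;
# any vector decomposes as a|0> + b|1>, with a and b recovered by projection.
ket0 = (1.0, 0.0)
ket1 = (0.0, 1.0)

def inner(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Orthonormality: unit norms, zero overlap.
assert inner(ket0, ket0) == 1.0 and inner(ket1, ket1) == 1.0
assert inner(ket0, ket1) == 0.0

# Decompose an arbitrary normalized vector in this basis.
a, b = 0.6, 0.8
v = (a * ket0[0] + b * ket1[0], a * ket0[1] + b * ket1[1])
assert math.isclose(inner(v, ket0), a)   # projection recovers the coefficient
assert math.isclose(inner(v, ket1), b)
assert math.isclose(a ** 2 + b ** 2, 1.0)   # 0.6^2 + 0.8^2 = 1: a valid state
```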

ψ function, Schrödinger equation and the dynamics of quantum mechanics
In Quantum Mechanics, the wave function, ψ, plays the central role and represents a variable quantity that mathematically describes the wave characteristics of a particle. At a given point of space and time, the value of the wave function of a particle determines the probability of the particle being at that position at that time. The ψ function may be thought of as an expression for the amplitude of the wave of a particle; the spatial probability density is given by the squared modulus of the wave function, |ψ|². The Schrödinger equation is as follows:

iħ ∂ψ(x, t)/∂t = Ĥψ(x, t) = [−(ħ²/2m) ∂²/∂x² + V(x)] ψ(x, t),

where i is the imaginary unit, ψ(x, t) is the time-dependent wave function, ħ is h-bar (the reduced Planck constant), V(x) is the potential, and Ĥ is the Hamiltonian operator.
The Schrödinger equation is supposed to answer the question as to how the states of a system change with time. It is in the form of a differential equation and it captures the 'dynamics' of quantum mechanics: it describes how the wave function of a physical system evolves over time.
Schrödinger's equation gives an answer to the question: what happens to the de Broglie wave associated with an electron if a force (gravitational or electromagnetic) acts on it. The equation gives the possible waves associated with the particle as a number associated with any position in space at an arbitrary time, i.e. as functions of position and time. The general form of this wave function is ψ(x, t). The essence of Schrödinger's equation is that, given a particle and given the force system that acts on it (say, gravitational or electromagnetic), it yields the wave function solutions for all possible energies. Thus a particle can be described by a state vector or wave function whose evolution is provided by the Schrödinger equation. Hence, the Schrödinger equation, being a time-evolution equation, will make ψ vary with time.
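As an illustrative sketch (not from the source text), one can discretize a free-particle Hamiltonian on a grid and check that Schrödinger evolution is unitary, i.e. preserves the norm of ψ; the units (ħ = m = 1) and grid parameters are assumptions made for the example:

```python
import numpy as np

# Discretize H = -(1/2) d^2/dx^2 on a grid and evolve psi by the unitary
# propagator exp(-i H t), the formal solution of the Schrodinger equation.
n, dx, t = 64, 0.5, 1.0
main = np.full(n, 1.0 / dx**2)            # diagonal of -(1/2) * Laplacian
off = np.full(n - 1, -0.5 / dx**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Build the propagator via eigendecomposition of the Hermitian Hamiltonian.
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

x = dx * (np.arange(n) - n / 2)
psi = np.exp(-x**2)                       # Gaussian wave packet
psi /= np.linalg.norm(psi)                # normalize

psi_t = U @ psi
print(round(float(np.linalg.norm(psi_t)), 6))  # 1.0 -- the norm is conserved
```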

"Qubit" and Schrödinger equation
The time-dependent Schrödinger equation gives the time evolution of ψ. Entangled states are created by distributing qubits between particles so that each particle carries one qubit. By assuming that a freely moving particle is the qubit carrier, it is found that both the particle's position in physical space and the qubit state change in time in accordance with the Schrödinger equation.

What is quantum computing?
Basically, Quantum computing is concerned with processing information by harnessing and exploiting the amazing laws of quantum mechanics. Traditional computers use long strings of "bits," each of which encodes either a zero or a one. In contrast, a quantum computer uses quantum bits, or qubits. The difference can be explained as follows: a qubit is a quantum system that encodes the zero and the one into two distinguishable quantum states. Because qubits behave quantum mechanically, we can capitalize on the phenomena of "superposition" and "entanglement," which is not possible when using "bits" as an encoding device.

Superposition and entanglement
These two concepts might baffle us, since we do not come across the phenomena they describe in our everyday lives. Only when we look at the tiniest quantum particles (atoms, electrons, photons and the like) do we see these intriguing things, like superposition and entanglement [23].

Superposition
Superposition essentially subscribes to the principle that a quantum system can be in multiple states at the same time, that is, something can be "here" and "there," or "up" and "down," at the same time. Thus it is possible for qubits to represent numerous possible combinations of 1 and 0 at the same time. To put qubits into superposition, researchers manipulate them using precision lasers or microwave beams. This possibility of simultaneously being in multiple states is the phenomenon of superposition. In its most basic form, the principle says that if a quantum system can be in one of two distinguishable states |x⟩ and |y⟩, then it can be in any state of the form α|x⟩ + β|y⟩, where α and β are complex numbers with |α|² + |β|² = 1.
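A minimal numerical sketch of this principle, with the amplitudes α = 1/√2 and β = i/√2 chosen purely for illustration:

```python
import numpy as np

# A qubit in the superposition alpha|x> + beta|y> must satisfy
# |alpha|^2 + |beta|^2 = 1; measurement yields x or y with those probabilities.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # complex amplitudes
assert np.isclose(abs(alpha)**2 + abs(beta)**2, 1.0)

# Simulate repeated measurements: outcome 0 with prob |alpha|^2, 1 with |beta|^2.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=10_000, p=[abs(alpha)**2, abs(beta)**2])
print(outcomes.mean())  # close to 0.5: each outcome occurs about half the time
```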

Schrödinger's cat
In 1935, Erwin Schrödinger conjured up a famous thought experiment that places a cat in a superposition of both alive and dead states. He envisioned that a cat, a small radioactive source, a Geiger counter, a hammer and a small bottle of poison were sealed in a chamber. He also imagined that if one atom of the radioactive source decays, the counter will trigger a device to release the poison. This is where Schrödinger invoked the idea of entanglement, so that the state of the cat becomes entangled with the state of the radioactive source. He expected that some time later the cat will be in a superposition of both alive and dead states.
It is certainly counterintuitive to think that an organism could be in such a superposition of both alive and dead states (Figure 3). It also dramatically reveals the baffling consequences of quantum mechanics.

The double-slit experiment
Another well-known example of quantum superposition is the double-slit experiment in which a beam of particles passes through a double slit and forms a wave-like interference pattern on a screen on the far side.
Based on this experiment, quantum interference is explained by saying that the particle is in a superposition of the two experimental paths: one passage through the upper slit and the other through the lower slit. Correspondingly, a quantum bit can be in a superposition of |0⟩ and |1⟩. The implication seems to be that each particle passes simultaneously through both slits and interferes with itself. This combination of "both paths at once" is known as a superposition state [23].
It must be noted here that the particles, after going through the two slits, will turn into two sets of waves (Figure 4), in accordance with quantum mechanical principles. At some points the two sets of waves will meet crest to crest; at other points the crest of one wave will meet the trough of the other. Accordingly, two possibilities arise: (i) where crest meets crest (Figure 5), there will be constructive interference and the waves will reach the viewing screen as a bright spot, and (ii) where crest meets trough, there will be destructive interference, the waves cancel each other out, and a dark spot will appear on the screen. One should see bright lines of light, where the waves from the two slits constructively interfere, alternating with dark lines where the waves destructively interfere (Figure 6). A particle tends to appear more often at some places (regions of constructive interference) and less often at others (regions of destructive interference). However, the likelihood of finding the particle at a particular point can be described only probabilistically.
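The alternation of bright and dark fringes can be sketched numerically: with equal amplitudes for the two paths, the intensity is |(1 + e^(iφ))/2|² = cos²(φ/2), oscillating between 1 (crest meets crest) and 0 (crest meets trough). A hypothetical illustration:

```python
import numpy as np

# Two-path interference: the amplitude at relative phase phi is the sum of the
# amplitudes for the two slits, and the observed intensity is |sum|^2.
phase = np.linspace(0, 4 * np.pi, 9)          # sample relative phases
amp = (1 + np.exp(1j * phase)) / 2            # equal superposition of two paths
intensity = np.abs(amp)**2

# phase 0, 2pi, 4pi: bright fringe (intensity 1); phase pi, 3pi: dark (0).
print(np.round(intensity, 2))  # [1.  0.5 0.  0.5 1.  0.5 0.  0.5 1. ]
```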

Superposition and quantum information science
In the two experiments explained above we have seen one of the features of a quantum system, viz. superposition, whereby several separate quantum states can exist at the same time.
Quantum Information Science claims that each electron exists in a superposition of spin up and spin down until it is measured; before measurement it is not in either definite state. Only when measured is it observed to be in a specific spin state. Common experience tells us that a coin lying face up is in a specific state: it is heads or tails. Even without looking at the coin, one is sure that a tossed coin must be facing either heads or tails.
In the quantum case the situation is not as simple, and more unsettling: according to quantum mechanics, the material properties of things do not exist until they are measured, i.e. until one "looks" at the coin (measures the particular property), it has, as it were, no fixed face.

The problem of measurement
Delving into the issue of quantum measurement, and taking the double-slit experiment as a case in point, the "wave" of a particle, e.g. an electron, should be interpreted as relating to the probability of finding the particle at a specific point on the screen. We cannot detect any wave properties of a single particle in isolation: on each run the wave function "collapses" and the particle is detected as a point particle, and only by repeating the experiment many times does the probabilistic wave pattern emerge. Thus, in Quantum Information Science the problem of wave-function collapse is the measurement problem: how a wave of probabilities yields a definite outcome.

Inherent uncertainty
In 1927 Heisenberg shook the physics community with his uncertainty principle:

Δx Δp ≥ ħ/2,

where ħ is the reduced Planck constant, h/2π. The formula states that the more precisely the position of some particle is determined, the less precisely its momentum can be known, and vice versa.
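As a numerical sketch (an illustration, not from the source): for a Gaussian wave packet the product of the position and momentum spreads comes out at the minimum value allowed by the inequality, ħ/2 (here with ħ = 1):

```python
import numpy as np

# Compute the position spread directly and the momentum spread via the
# Fourier transform of the wave function (hbar = 1).
n, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

psi = np.exp(-x**2 / 4)                       # Gaussian with sigma_x = 1
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize

prob_x = np.abs(psi)**2 * dx
sigma_x = np.sqrt(np.sum(prob_x * x**2))      # <x> = 0 by symmetry

p = 2 * np.pi * np.fft.fftfreq(n, d=dx)       # momentum grid
phi = np.fft.fft(psi)
prob_p = np.abs(phi)**2 / np.sum(np.abs(phi)**2)
sigma_p = np.sqrt(np.sum(prob_p * p**2))

print(round(sigma_x * sigma_p, 3))            # ~0.5, i.e. hbar/2
```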
The uncertainty principle, the wave-particle duality, and the collapse of the wave into a particle when we measure it together lead to the claim that the probability of the same particle being in several places at the same time cannot be ruled out, i.e. the particle is 'smeared out' over multiple positions at a time.
The smiley face shows, Figure 7, the location of the particle in one peak, but then there are many such places as the multiplicity of smiley faces show.

Superposition and the power of a quantum computer
We have already seen (in Section 5.2.1) that whereas classical computing uses "bits" for data processing, quantum computing uses qubits. We have also seen that the practical difference between a bit and a qubit is this: a bit can only exist in one of two states at a time, usually represented by a 1 and a 0, whereas a qubit can exist in both states at one time.
Moreover, we have observed that the phenomenon of "superposition" allows the power of a quantum computer to grow exponentially with the addition of each qubit. For example, two bits in a classical computer provide four possible combinations (00, 01, 10, and 11), but only one combination at a time. Two qubits in a quantum computer provide for the same four possibilities, but, because of superposition, the qubits can represent all four states at the same time, making the quantum computer four times as powerful as the classical computer. So, adding a bit to a classical computer increases its power linearly, but adding a qubit to a quantum computer increases its power exponentially, doubling it with the addition of each qubit.
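The counting argument can be sketched as follows; the uniform superposition and the register sizes are illustrative assumptions:

```python
import numpy as np

# An n-qubit register is described by 2**n complex amplitudes, so its
# description grows exponentially with n, while n classical bits hold
# only one of the 2**n values at a time.
def uniform_superposition(n_qubits):
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim))     # equal weight on every bit string

for n in (1, 2, 3, 10):
    state = uniform_superposition(n)
    print(n, len(state))  # 1 2 / 2 4 / 3 8 / 10 1024
```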

Application of superposition in solving engineering problems
The principle of superposition is useful for solving simple practical problems, but its main use is in the theory of circuit analysis.
For example, in circuit theory the superposition theorem states that the response (voltage or current) in any branch of a linear circuit which has more than one independent source equals the algebraic sum of the responses caused by each independent source acting alone, while all other independent sources are turned off (made zero). Alternatively, a circuit with multiple voltage and current sources is equivalent to the sum of simplified circuits using just one of the sources.
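The theorem can be verified on a hypothetical linear circuit (two voltage sources, each in series with a resistor, feeding a common load; all component values are made up for illustration):

```python
# Superposition theorem check: solve for the node voltage with each voltage
# source acting alone (the other shorted) and confirm the responses add up.
R1, R2, R3 = 10.0, 20.0, 30.0   # series resistors and load, in ohms
V1, V2 = 5.0, 9.0               # the two independent voltage sources, in volts

def node_voltage(v1, v2):
    # Nodal analysis at the single node: (v - v1)/R1 + (v - v2)/R2 + v/R3 = 0
    g1, g2, g3 = 1 / R1, 1 / R2, 1 / R3
    return (v1 * g1 + v2 * g2) / (g1 + g2 + g3)

full = node_voltage(V1, V2)
only1 = node_voltage(V1, 0.0)   # V2 turned off (shorted)
only2 = node_voltage(0.0, V2)   # V1 turned off (shorted)
assert abs(full - (only1 + only2)) < 1e-12   # superposition holds
print(round(full, 3))  # 5.182
```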

Entanglement
Entanglement in quantum mechanics is considered to be an extremely strong correlation and inextricable linkage that may be found between different particles of the same kind with the same physical property. It has been observed that such linkage and intrinsic connection subsisting between quantum particles is so robust that two or more quantum particles, even when separated by great distances (they may be placed at opposite ends of the universe), can still act in perfect unison. This seemingly impossible connection led Einstein to describe entanglement as "spooky action at a distance." This intriguing phenomenon means that scientists and researchers can generate pairs of qubits that are "entangled," which amounts to saying that the two members of a pair exist in a single quantum state. Thus, they claim that if we change the state of one of the qubits, it will bring about an instantaneous change in the state of the other one in a predictable way, even if they are separated by very long distances.
The notion of entanglement was coined by Erwin Schrödinger in order to signify the peculiar properties of quantum correlations. In the classical world, "the whole is the sum of its parts," but the quantum world is very different. Schrödinger [24] says: "the best possible knowledge of a whole does not necessarily include the best possible knowledge of all its parts, even though they may be entirely separate and therefore virtually capable of being 'best possibly known,' i.e., of possessing, each of them, a representative of its own. The lack of knowledge is by no means due to the interaction being insufficiently known, at least not in the way that it could possibly be known more completely, it is due to the interaction itself.
"Attention has recently been called to the obvious but very disconcerting fact that even though we restrict the disentangling measurements to one system, the representative obtained for the other system is by no means independent of the particular choice of observations which we select for that purpose and which by the way are entirely arbitrary. It is rather discomforting that the theory should allow a system to be steered or piloted into one or the other type of state at the experimenter's mercy in spite of his having no access to it." For example, consider a pair of qubits. Suppose that each one is described by a state vector: the first one by |a⟩ and the second one by |b⟩. One might therefore think that the most general state of the two qubits should be represented by a pair of state vectors, |a⟩|b⟩, with one for each qubit. Indeed, such a state is certainly possible, but there are other states that cannot be expressed in this form. States of the pair form |a⟩|b⟩ are called separable (or product states). States that are not separable are said to be entangled. Most vectors are entangled and cannot be written as product states. This shows a peculiar feature of quantum states.
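Whether a given two-qubit state is a product state or entangled can be tested numerically: reshaping its four amplitudes into a 2×2 matrix, the state is separable exactly when that matrix has rank 1 (a single nonzero Schmidt coefficient). A sketch, with a Bell state as the entangled example:

```python
import numpy as np

# Separability test for a two-qubit state via the singular values of the
# reshaped amplitude matrix (the Schmidt coefficients).
def is_product_state(psi):
    m = np.asarray(psi, dtype=complex).reshape(2, 2)
    s = np.linalg.svd(m, compute_uv=False)
    return bool(s[1] < 1e-12)  # only one nonzero coefficient => product state

# |0> (|0> + |1>)/sqrt(2): a product state of the two qubits.
product = np.kron([1, 0], [1 / np.sqrt(2), 1 / np.sqrt(2)])
# (|00> + |11>)/sqrt(2): a Bell state, which cannot be factored.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

print(is_product_state(product))  # True
print(is_product_state(bell))     # False -- entangled
```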
An example of entanglement arises when a measurement is made: a subatomic particle decays into an entangled pair of other particles. Essentially, quantum entanglement suggests that acting on a particle here can instantly influence a particle far away. This is often described as theoretical teleportation. It has huge implications for quantum mechanics, quantum communication and quantum computing.

Entanglement and quantum information science
Quantum entangled states play a crucial role and have become the key ingredient in the field of Quantum Information Science.
It is fair to ask why the effect of entanglement matters. The answer is as follows: the behavior of quantum entangled states gives rise to seemingly paradoxical effects, viz. any measurement of a particle's properties results in an irreversible wave-function collapse of that particle and changes the original quantum state. In the case of entangled particles, such a measurement affects the entangled system as a whole.
Unlike Einstein, who was the most skeptical about entanglement, considered it the fatal flaw in quantum theory, and referred to it as "spooky action-at-a-distance," Schrödinger was much more prepared to accept quantum theory, with the concept of entanglement and all its predictions, no matter how weird they might be. In his paper [24], which introduced quantum entanglement, Schrödinger wrote: "I would not call it one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought".

Application of entanglement
So-called 'entangled states' exhibit peculiar nonlocal (explained below) statistical correlations for widely separated quantum systems, and their interpretation is a central issue in the interpretation of quantum states. For example, the theory underlying the field of quantum information, dealing with "entanglement," has found intriguing connections with different fields of physics, like condensed matter, quantum gravity, or string theory.
Quantum entanglement is more often viewed as a physical resource, one which enables us to communicate with perfect security, build very precise atomic clocks, teleport small quantum objects, and implement dense coding and cryptography.

In what way entanglement enables us to communicate with perfect security
Quantum entanglement offers a new modality for communications that is different from classical communications. It has been claimed that entanglement enhances security in secret sharing.
Quantum cryptography (a method of storing and transmitting data quantum mechanically in such a form that only those for whom it is intended can read and process it) to a great extent revolves around quantum computing. The entanglement concept is one tool used in quantum computing, e.g. in transmitting data via entangled qubits, a qubit being a unit of quantum information stored in a quantum system. Quantum cryptography utilizes photons and depends on the laws of physics rather than on very large numbers and the deciphering of cryptographic codes.
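As an illustrative sketch only (a classical simulation of the basis-sifting step in BB84-style quantum key distribution, not real quantum optics; entanglement-based protocols such as E91 sift in a similar way):

```python
import random

# Alice encodes random bits in randomly chosen bases; Bob measures in
# randomly chosen bases. When the bases match, Bob recovers Alice's bit;
# mismatched positions are discarded during public sifting.
random.seed(1)
n = 20
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases = [random.choice("+x") for _ in range(n)]

key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
print(len(key), key)  # roughly half the positions survive sifting
```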
It appears that we are perched on the edge of a quantum communication revolution that will change transmission of information, information security and how we understand privacy.

Nonlocality
Two central concepts of quantum mechanics are Heisenberg's uncertainty principle and nonlocality. Nonlocality plays a fundamental role in quantum information science.
Quantum entanglement can be traced back to the Einstein-Podolsky-Rosen (EPR) paradox of 1935, in which Albert Einstein, Boris Podolsky and Nathan Rosen argued that the description of physical reality provided by quantum mechanics was incomplete. This argument gave rise to discussions on the foundations of quantum mechanics related to reality and locality, which play crucial roles in quantum information processing.
Quantum theory correctly predicts certain patterns of correlation among spatially separated events, and these manifest non-local influences between some of these events. This is a remarkable feature of the microscopic world prescribed by quantum theory. This idea of nonlocality was described by Albert Einstein rather dismissively as "spooky action at a distance," as mentioned above.
For example, if a pair of electrons is created together, one will have clockwise spin and the other will have anticlockwise spin (spin is a particular property of particles mentioned above). The most important point is that there are two possible states and that the entire spin of a quantum system must always cancel out to zero.
However, under quantum theory, if superposition is possible, the two electrons can be considered to simultaneously have spins clockwise-anticlockwise and anticlockwise-clockwise. If the pair are then separated by any distance (without being observed and thereby decohered; see below) and later checked, the second particle is seen to instantaneously take the opposite spin to the first, so that the pair maintains its zero total spin, no matter how far apart they may be.
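The anti-correlation described above can be sketched by sampling measurement outcomes from the singlet-like state (|01⟩ − |10⟩)/√2; the encoding and sampling are an illustration, not a simulation of real spin measurements at arbitrary angles:

```python
import numpy as np

# Basis order for the amplitudes: |00>, |01>, |10>, |11>.
amps = np.array([0, 1, -1, 0]) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)
probs = np.abs(amps)**2                        # [0, 0.5, 0.5, 0]

rng = np.random.default_rng(7)
for _ in range(5):
    outcome = rng.choice(4, p=probs)
    a, b = outcome >> 1, outcome & 1           # bits for the two particles
    print(a, b)                                # always opposite: 0 1 or 1 0
```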

Decoherence
Quantum coherence presupposes the idea that an individual particle or object has wave functions that can be split into two separate waves. When the waves operate together in a coherent way, it is referred to as quantum coherence. Quantum decoherence means the loss of quantum coherence.
However, a quantum computer needs to operate coherently until the results are measured and read out. In implementing a quantum computer, a qubit and/or many entangled qubits must undergo unitary transformations before decoherence affects the qubit states, since decoherence does not represent a unitary transformation. Decoherence also gives an account of why ordinary macroscopic objects do not exhibit the interference behavior characteristic of quantum "superpositions".

Why do these quantum effects matter?
Simply put, they are extremely useful to the future of computing and communications technology. It is due to superposition and entanglement that a quantum computer can carry out a vast number of calculations simultaneously. We know that a classical computer works with ones and zeros; a quantum computer, however, will have the advantage of using ones, zeros and "superpositions" of ones and zeros. Certain difficult tasks, e.g. code breaking, that have long been thought impossible (or "intractable") for classical computers will be achieved quickly and efficiently by a quantum computer.
Quantum computing is not just "faster" than classical computing; there are many types of problems at which a quantum computer would excel, such as code breaking. The power required for code breaking is derived from quantum computing's use of "qubits" or "quantum bits."

What can a quantum computer do that a classical computer cannot?
It is easy for any computer to multiply two large numbers. Calculating the factors of a very large (say, 500-digit) number, on the other hand, is considered practically impossible for any classical computer. In 1994, Peter Shor, a mathematician from MIT, showed that if a fully working quantum computer were available, it could factor large numbers easily.
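The classical core of Shor's claim can be sketched for a toy number: Shor's algorithm reduces factoring N to finding the period r of aˣ mod N, which a quantum computer can do efficiently; below the period is found by brute force, which is feasible only for tiny N:

```python
from math import gcd

# If r is the (even) period of a**x mod N, then gcd(a**(r//2) - 1, N)
# often yields a nontrivial factor of N.
def find_period(a, N):
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                 # toy example; a must be coprime to N
r = find_period(a, N)        # 7^4 = 2401 = 1 (mod 15), so r = 4
factor = gcd(a ** (r // 2) - 1, N)
print(r, factor)             # 4 3 -- and 15 / 3 = 5 gives the other factor
```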

Areas of application
Many experts divide technologies prompted by Quantum Information Science into three application areas: (1) Quantum Sensing and metrology, (2) Communications and (3) Computing and simulation:

Quantum sensing and metrology and quantum information science
"Quantum sensing" describes the use of a quantum system, quantum properties or quantum phenomena to perform a measurement of a physical quantity. The field of quantum sensing deals with the design and engineering of quantum sources (e.g., entangled) and quantum measurements that are able to beat the performance of any classical strategy. Metrology, on the other hand, is the scientific study of measurement.
Early quantum sensors include magnetometers based on superconducting quantum interference devices and atomic vapors, as well as atomic clocks. Another example of an early quantum sensor is the avalanche photodiode (APD); APDs have been used to detect entangled photons. Entanglement-assisted sensing is sometimes referred to as "quantum metrology" or "quantum-enhanced sensing." More recently, quantum sensing has become a distinct and rapidly growing branch of research within the area of Quantum Information Science and Technology, with the most common platforms being spin qubits, trapped ions and flux qubits.

Communications, its applications and quantum information science
Quantum communications are required to increase the total computing power, especially if only processors with a few qubits are available at each network node. The most advanced application of quantum communication, and in fact of Quantum Information Processing in general, is in security. Moreover, quantum networks provide opportunities and challenges across a range of intellectual and technical frontiers, including quantum computation and metrology.
In classical signal processing, a signal travels over fiber-optic cable for about 60 miles before it must be retransmitted. Quantum repeaters can extend the distance a quantum signal can be sent, but they significantly increase the complexity of the process. Communications not only must be secure, but any eavesdropping attempt will destroy the communication. NASA developed quantum networks to support the transmission of quantum information for aerospace applications (NASA STTR 2020 Phase I Solicitation, T5.04 Quantum Communications); this distribution of quantum information could potentially be utilized in secure communication.
Quantum communication may provide new ways to improve communication links with security, through techniques such as quantum cryptographic key distribution. Another area of benefit is the entanglement of distributed sensor networks to provide extreme sensitivity for applications such as astrophysics, planetary science and earth science.

Computing and simulation and quantum information science
Quantum computers have enormous potential to revolutionize many areas of our society. Quantum computing provides an exponentially larger scale than classical computing, which provides advantages for certain applications.
(a) Quantum simulation refers to the use of quantum hardware to determine the properties of a quantum system, for example, determining the properties of materials such as high-temperature superconductors, and modeling nuclear and particle physics. We have seen that harnessing quantum entanglement can solve problems more efficiently.
(b) The other approach is to simulate the behavior of quantum materials and quantum systems using controlled evolution and interaction of qubits.

Proficiency of the physical approach to quantum information science in dealing with "consciousness," "freewill" and biological questions
In Part 2, we considered the physical approach to Quantum Information Science by characterizing "information" in physical terms and found it viable. A complete physical approach to quantum information requires a robust interface among microwave photons, long-lifetime memory, and computational qubits.
It might appear to be perplexing that this physical approach to Quantum Information Science is equally proficient in dealing with "consciousness," "freewill" and biological questions in the area known as "bioinformatics."

Consciousness, quantum physics and quantum information science
One of the first processes based on which consciousness and quantum physics come together is through the Copenhagen interpretation of quantum physics. The central ideas of the Copenhagen interpretation were developed by a core group of quantum physics pioneers, centered around Niels Bohr's Copenhagen Institute in the 1920s.
According to this theory, the quantum wave function collapses due to a conscious observer making a measurement of a physical system. This is the interpretation of quantum physics that provoked Schrödinger's cat thought experiment, demonstrating some level of the absurdity of thinking that the same cat could be both alive and dead (i.e. two opposite states occurring at the same time, because of such phenomena as superposition and entanglement). Nevertheless, the claim that the quantum wave function collapses due to a conscious observer making a measurement of a physical system does completely match the evidence of what scientists observe at the quantum level. This is one of the reasons why research into consciousness forged ahead in Quantum Physics and Quantum Information Science, and attempts at understanding human consciousness in terms of some physical theory, in this case Quantum Mechanics, came to the fore.

Roger Penrose
Sir Roger Penrose, an English mathematical physicist, mathematician, philosopher of science and Nobel Laureate in Physics, delved deep into at least three areas in mathematical physics: gravitational radiation, the gravitational collapse of matter in the form of black holes and, lastly, the modeling of the universe. He touched on many subjects, such as quantum gravity, twistor theory and a new cosmology. However, a scientist of such repute, with vast knowledge of the fundamental areas of modern physics, also saw the impact and the essential role a physical theory, such as quantum mechanics, plays in the understanding of human consciousness.
The idea of using quantum physics to explain human consciousness caught genuine interest with Roger Penrose's 1989 book, The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics [25]. One of Penrose's motivations for writing the book was to respond to the claim of the old school of artificial intelligence researchers, who believed that the brain can be modeled by Alan Turing's "Universal Turing machine," and hence by digital computers.
According to Penrose, consciousness is not computational and is nonalgorithmic, and the brain is more than a biological computer. Hence, Penrose distinguished his study of consciousness from any potential exploration of consciousness in artificial intelligence. In this book, Penrose argues that the brain is far more sophisticated than the AI researchers would have us believe. The main difference is that the brain does not operate on a strictly binary system of on and off. Instead, the human brain works with computations that are in a superposition of different quantum states at the same time. Moreover, to understand consciousness, one needs to revolutionize our understanding of the physical world.
In the initial part of the book Penrose provides a summary of classical and quantum physics and argues that the physical modeling of the "real world" (from Newtonian mechanics through Einstein's relativity theory up to supersymmetry) is carried out in this physics. However, simulation of the mind will only be possible if we understand how the missing piece of gravity radiation can be consistently included in the standard model of physics.
In the last two chapters of the book Penrose takes up his initial primary problem of modeling the human mind. To begin with, Penrose gives a biophysical description of the brain, what is known about its centers and how it works. At this stage Penrose does not give a precise definition of consciousness, because that is seemingly impossible. To illustrate, take the example of a brain which seems able to register things even when the person is 'unconscious,' e.g. while the person is undergoing an operation. We may indirectly characterize a person's consciousness as linked to, for example, common-sense judgment of truth, understanding, and artistic appraisal, which is exactly opposite to automatic and algorithmic behavior. Penrose says: " … neither classical nor quantum mechanics [ … ] can ever explain the way in which we think;" but "a plausible case can be made that there is a non-algorithmic ingredient to conscious thought processes" ( [25], p. 521), and noncomputability is a feature of our conscious thinking.
Penrose thinks that current computers will never have intelligence because they operate under an algorithmic, deterministic system. This idea is partly inspired by Penrose's experience as a mathematician and rests on Gödel's incompleteness theorems. Mathematicians can know the truth of a proposition by 'insight.' The Gödel incompleteness theorems show that, in any consistent formal system rich enough for arithmetic, there are true propositions that cannot be proved within the system. This indicates that Gödel never lost sight of the importance of the human mind, which has a 'non-mechanical' and noncomputational character. "Moreover, human beings have the ability to "see" and grasp "'truths' without proof" and have visions and intuitions in creating new knowledge and a new way of looking at things." [23]. In mathematics there are, Amitabha Gupta [23, 26] notes, at least two examples of "mathematical conjectures" which have been taken to be true without any proof: (a) Goldbach's Conjecture (1742), which claims that every even integer greater than 2 can be expressed as the sum of two primes, and (b) Cantor's Continuum Hypothesis (1878), which asserts that there is no set whose cardinality is strictly between that of the integers and the real numbers.
In the last two chapters, the main concern has been as to what philosophers call the "mind-body problem". Penrose discusses the computational procedures and the noncomputational activity he assigned to the processes of consciousness, and second, he takes recourse to yet-to-be-discovered quantum-level effects to explain consciousness.
Penrose's first book has a follow-up, Shadows of the Mind: A Search for the Missing Science of Consciousness [27]. In this book Penrose gives examples of scientists who, by a spark of inspiration, came up with a superb result while they were not 'working' on the subject following algorithmic rules. Penrose's Gödelian argument is meant to show that human minds are non-computable; he attempts to infer a number of claims involving consciousness and physics, and ascribes consciousness to the actual physical makeup of the brain.
According to quantum mechanics, a particle can have states in which it occupies several positions at once. When we treat a system quantum mechanically, we have to allow for these so-called superpositions of alternatives. Penrose takes this up and argues that these ingredients form the basis of his follow-up book, Shadows of the Mind. Drawing on research into molecular structures in the brain, Penrose finds suggestions of quantum-level activity that may be influencing the processing of information in the brain. He points to the some ten thousand tubulin dimers formed together into tubes called microtubules, collections of which make up the cytoskeleton, which can be thought of as the neuron's own nervous system.
Penrose's collaborator, the anesthesiologist Stuart Hameroff, suggested that there is some biological analog of quantum computing in the brain that involves microtubules within the neurons. This idea was developed further into the so-called Orchestrated Objective Reduction (Orch-OR) theory. This biological theory of mind postulates that consciousness originates at the quantum level inside neurons, in contrast to the conventional view that consciousness is a product of connections between neurons. Orch-OR relies on a proposed quantum process called objective reduction.
The Penrose-Hameroff view of the evolution that has made our brain the way it is, and of the advantage it brings to creatures capable of conscious thinking, is that " … neither classical nor quantum mechanics [ … ] can ever explain the way in which we think" but "a plausible case can be made that there is a non-algorithmic ingredient to conscious thought processes" ( [27], p. 521); it is this ingredient that the Orch-OR theory is meant to explain.

'Free will' and quantum information science
Examining the significance of 'free will' in quantum tests, in search of a quantum perspective on "free will," leads to the issue of conscious "free will." Yet consciousness as a causal agency, and the brain mechanisms causing consciousness, remain unknown; so do the scientific basis for consciousness and the "self," and any mechanism by which conscious agency may act in the brain to exert causal effects in the world.
However, the brain's electrical activity correlating with conscious perception of a stimulus apparently shows that such perception can occur after we respond to that stimulus, seemingly consciously. Based on this, some scientific and philosophical traditions conclude that we act non-consciously and have subsequent false memories of conscious action. This is why they cast consciousness as an epiphenomenal and illusory phenomenon (e.g., Daniel C. Dennett [28], Consciousness Explained; Wiener Norbert [20]).
Today, there is little doubt that our volitional ability represents the highest form of control of any mechanism or organism. After observing the fantastically complex abilities of animals, such as awareness, cognition, learning, and motor control, some researchers have come to the conclusion that these are the products of the mechanistic operation of their brains. It is claimed that these "mental" abilities emerge from the specific interactions between neurons, molecules, and atoms. In support of this, ample evidence has been gathered by evolutionary biologists, developmental psychologists and computer scientists. In addition, there are indications that these interactions are entirely subject to the known laws of physics and chemistry. It has almost been accepted by modern scientists, and has become an established truth, that, in time, machines will have all the competence and functionalities of animals.
In spite of these developments, a group of practitioners of science and technology strongly believe that while all the wonderful abilities of some animals, including consciousness and goal-directed behavior, are indeed the result of mechanistic processes, there is no way human consciousness and choice (and possibly that of some of the higher animals) can simply be the result of an essentially Newtonian physics.
If scientists were able to genetically modify chimpanzees so that they were endowed with such human abilities as language, intelligence, and free will, these developments and augmentations would completely remove the fiction of the immaterial human mind and the soul. As a matter of fact, these experiments need not be carried out at all, since we already have enough data in the form of healthy babies and brain-damaged adults, who operate pretty much at the level of the higher animals. While this similarity may not provide an irrefutable argument, it strongly suggests that additional neuronal circuits and connections are responsible for, and required by, our extra capabilities. Research by developmental and pathological psychologists, and correlations between DNA and cognitive ability, also provide overwhelming and convincing evidence in favor of a naturalistic account of consciousness and free will.

Quantum indeterminacy and free will
The idea of quantum indeterminacy (the fact that for a quantum system one can never predict an outcome with certainty, but only as a probability distributed among the various possible states) has been put forth by some proponents of quantum consciousness. This view amounts to claiming that quantum consciousness resolves the problem of whether or not humans actually have free will. The argument runs as follows: if human consciousness is governed by quantum physical processes, then it is not deterministic, and humans therefore have free will.
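Quantum indeterminacy can be illustrated with a toy simulation (our own sketch, not drawn from the cited literature): for a qubit in the superposition |ψ⟩ = α|0⟩ + β|1⟩, the Born rule fixes the probabilities |α|² and |β|² of the two outcomes, yet each individual measurement result remains irreducibly random.

```python
import math
import random

# Equal-weight superposition: |psi> = alpha|0> + beta|1>
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

def measure():
    """Sample one measurement outcome according to the Born rule.

    Only the probabilities |alpha|^2 and |beta|^2 are determined;
    the individual outcome is not.
    """
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Normalization: probabilities of all outcomes sum to 1.
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-9

outcomes = [measure() for _ in range(10_000)]
print(outcomes.count(0) / len(outcomes))  # close to 0.5, but each run differs
```

The point of contention in the free-will argument above is visible here: the statistics are lawlike and fixed, while only the single outcomes are undetermined.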

Biological issues/ "quantum bio-informatics" and quantum information science
Bioinformatics, the study of biological information, involves the integration of computers, software tools, and databases in an effort to address biological questions. The wealth of genome sequencing information has required the design of software and the use of computers to process this information.
There are two important large-scale activities that use bioinformatics: genomics and proteomics. Genomics is concerned with the analysis of genomes, where a genome is the complete set of DNA and RNA sequences coding for the hereditary material that is passed on from one generation to the next. Proteomics, on the other hand, refers to the analysis of the complete set of proteins, or proteome, including protein structures and the various synthesis processes. Related recent work extends to metabolomics and transcriptomics.
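The kind of sequence processing that genomic software performs can be sketched in a few lines of Python (a hypothetical illustration; the function names are our own). Two elementary operations are computing GC content, a basic genomic statistic, and transcribing a DNA coding strand into messenger RNA:

```python
def gc_content(dna):
    """Fraction of G and C bases in a DNA sequence."""
    dna = dna.upper()
    return (dna.count("G") + dna.count("C")) / len(dna)

def transcribe(dna):
    """Transcribe a DNA coding strand into mRNA (replace T with U)."""
    return dna.upper().replace("T", "U")

seq = "ATGGCGTACGTT"
print(gc_content(seq))   # 0.5
print(transcribe(seq))   # AUGGCGUACGUU
```

Real genomic pipelines apply operations of exactly this character, scaled up to sequences of billions of bases, which is why software design is central to the field.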
A key research question for the future of bioinformatics is how to computationally compare complex biological observations, such as gene expression patterns and protein networks. Bioinformatics converts biological observations into a model that a computer can process.
Quantum mechanics is, of course, the fundamental theory that describes the properties of subatomic particles, atoms, molecules, and molecular assemblies, and it operates on the nanometer and sub-nanometer scales. These are the scales of fundamental life processes such as photosynthesis, respiration and vision. The fundamental claim of quantum mechanics is that all objects have wave-like properties, and when such objects interact, quantum coherence describes the correlations between the physical quantities describing them.

Conclusion
In Quantum Information Science the physical approach to 'Information' is found to be the most appropriate approach for explaining our understanding of the consequences of representing and processing information quantum mechanically. Quantum Information Science is reinforced by its three pillars, viz. 'Qubit', 'Superposition' and 'Entanglement', and their practical and technological applications. It is facilitated by insights into the physical properties of microscopic systems at the scale of atoms and subatomic particles.
Surprisingly, the physical approach to Quantum Information Science is equally significant and apt in dealing with "consciousness," "freewill" and "bioinformatics." Penrose and his collaborator, Stuart Hameroff, maintained that human intelligence is far more subtle than 'artificial intelligence' and suggested a biological analog to quantum computation involving microtubules. Microtubules, which reside in the brain's neurons, help control the strength of synaptic connections. In the Penrose-Hameroff theory of Orchestrated Objective Reduction, known as Orch-OR, the moments of conscious awareness are orchestrated by the microtubules in our brains, which, they believe, have the capacity to store and process information and memory. The Orch-OR model and biological theories of mind are important in the area known as "bioinformatics."