
Quantum Future Extracts
[blaja95d]
We start by recalling John Bell's opinion on quantum measurements.
He studied the subject in depth and repeatedly emphasized his
conclusion [Bell]: our difficulties with quantum measurement theory
are not accidental; they have a reason. He pointed out this
reason: it is that the very concept of "measurement"
cannot even be precisely defined within the standard formalism.
We agree, and we propose a way out that has not been tried before.
Our scheme solves the essential part of the quantum measurement
puzzle: it gives a unique algorithm generating time series of
pointer readings in a continuous experiment involving quantum
systems. We do not pretend that our solution is the only one that
solves the puzzle. But we believe that it is a kind of minimal
solution. Even if not yet complete, it may help us to find a way
towards a more fundamental theory.
The solution that we propose does not involve hidden variables.
First, we point out the reason why "measurement"
cannot be defined within the standard approach: the
standard quantum formalism has no place for "events".
The only candidate for an event that we could think of, within the
standard formalism, is a change of the quantum state vector.
But one cannot see state vectors directly. Thus, in order to
include events, we have to extend the standard formalism. That
is what we do, and we do it in a minimal way: just enough
to accommodate classical events. We explicitly add a classical
part to the quantum part, and we couple the classical to the quantum.
Then we define "experiments" and "measurements"
within this extended formalism. We can then show that the standard
postulates concerning measurements, in fact in an enhanced and
refined form, can be derived instead of being postulated.
This "event enhanced quantum theory", or EEQT as we
call it, gives experimental predictions that are stronger than
those obtained from the standard theory. The new theory answers
more experimental questions than the old one. It provides
algorithms for numerical simulations of experimental time series
obtained in experiments with single quantum systems. In particular,
this new theory is falsifiable. We are working out its new consequences
for experiments, and we will report the results in due time. But
even assuming that we are successful in this respect, our
program will not be complete. Our theory, in its present form,
is based on an explicit selection of an "event carrying"
classical subsystem. But how do we select what is classical? Is
it our job or is it Nature's job?
When we want to stay on the safe side as much as possible, or for as
long as possible, we tend to shift the "classical"
into the observer's mind. That was von Neumann's way out. But
if we decide to blame the mind, shall we be safe then? For how long?
It seems not for long. This is the age of information. Soon
we will need to extend our physical theory to include a theory
of mind and a theory of knowledge. That necessity will face us
anyhow, perhaps even sooner than we are prepared to admit. But,
coming back to our quantum measurement problem, it is not at all clear
that the cut must reside that far from ordinary, "material"
physics. For many practical applications the measuring apparatus
itself, or its relevant part, can be considered classical. We
need to derive such a splitting into classical and quantum from
some clear principles. Perhaps it is a dynamical process; perhaps
the classical part is growing with time. Perhaps time is nothing
but the accumulation of events. We need new laws to describe the
dynamics of time itself. At present we do not know what these laws
are; we can only guess.
At the present stage the placement of the split is indeed phenomenological,
and the coupling is phenomenological too. Both are simple to handle
and easy to describe in our formalism. But where to put Heisenberg's
cut: that is arbitrary to some extent. Perhaps we need not worry
too much? Perhaps relativity of the split is a new feature that
will remain with us. We do not know. That is why we call our theory
"phenomenological". But we
would like to stress that the standard, orthodox, pure quantum
theory is no better in this respect. In fact, it is much worse.
It is not even able to define what a measurement is. It is not even
a phenomenological theory. In fact, strictly speaking, it is not
even a theory. It is partly an art, and that needs an artist.
In this case it needs a physicist, with his human experience and
his human intuition. Suppose we have a problem that needs
quantum theory for its solution. Then our physicist, guided by
his intuition, will replace the problem at hand with another problem
that can be handled. After that, guided by his experience, he
will compute a Green's function or whatever is required to get formulas
out of this other problem. Finally, guided by his previous experience
and by his intuition, he will interpret the formulas that he got,
and he will predict some numbers for the experiment.
That job cannot be left to a computing machine in an unmanned
spacecraft. We, human beings, may feel proud that we are that
necessary, that we cannot be replaced by machines. But would
it not be better if we could spare our creativity for inventing
new theories rather than spending it unnecessarily on applications
of the old ones?

[blaja95c]
In a recent series of papers (cf. [blaja95a] and references therein)
we enhanced the standard framework of quantum mechanics by endowing it
with event dynamics. In this extension, which will be denoted EEQT
(for Event Enhanced Quantum Theory), we go beyond the Schroedinger
continuous-time evolution of wave packets: we also propose a class
of algorithms generating discrete events. From the master equation that
describes the continuous evolution of ensembles of coupled quantum + classical
systems we derive a unique piecewise deterministic random process
that provides a stochastic algorithm for generating sample histories
of individual systems. In the present contribution we will describe
the essence of our approach. But first we make a few comments on similarities
and differences between EEQT and several other approaches.
The Standard Approach
In the standard approach classical concepts are static. They are
introduced via the measurement postulates developed by the founders
of Quantum Theory. But ``measurement" itself is never precisely
defined in the standard approach, and therefore the measurement postulates
cannot be derived from the formalism. One is supposed to believe
Born's statistical interpretation simply ``because it works".
The standard interpretation alone does not tell us what happens
when a quantum system is under continuous observation (which,
in fact, is always the case).
Master Equation Dynamics and Continuous Observation Theory
Continuous observation theory is usually based on successive applications
of the projection postulate. Each application of the projection
postulate maps pure states into mixed states. Thus repeated application
of the postulate leads to a master equation for a density matrix.
Replacing Schroedinger's dynamics by a master equation is also popular
in quantum optics [...] and in several attempts to reconcile quantum
theory with gravity (for a recent account see [...]). In all these
approaches the authors usually believe that no classical system
is introduced, that all is purely quantum. That is, however, not true.
What is true is just the converse: the largest possible classical
system is introduced, but because it is so large and so close to
the eye, it easily escapes our sight. It is assumed, without any
justification, that jumps of quantum state vectors are directly
observable (whatever that means). These jumps are supposed to constitute
the only classical events. The weak point of this approach lies in
the fact that going from the master equation, which describes statistical
ensembles, to a stochastic algorithm generating sample histories
of an individual system is non-unique. There are infinitely many
random processes that lead to the same master equation after averaging.
One can use diffusion stochastic differential equations or jump
processes; one can shift pieces of dynamics between Hamiltonian
evolution and collapse events.
The reason for this non-uniqueness is simple: there are infinitely
many mixtures that lead to the same density matrix. Diosi [...]
invented a clever mathematical procedure for constructing a special
orthoprocess. It provides a definite algorithm in the special
case of finite degeneracy. It does not, however, remove the non-uniqueness,
and there is also no reason why Nature should have chosen this special
prescription, which causes the quantum state vector always to make the least
probable transition: to one of the orthogonal states.
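The point about mixtures can be checked directly. Here is a minimal numerical sketch (a qubit example of our own choosing, not from any of the cited papers): two visibly different ensembles of pure states produce one and the same density matrix, so an unraveling fixed only by the master equation is necessarily non-unique.

```python
import numpy as np

def ensemble_rho(states, weights):
    """Density matrix of a statistical mixture of pure states."""
    return sum(w * np.outer(s, s.conj()) for s, w in zip(states, weights))

# Ensemble A: equal mixture of |0> and |1>
A = ensemble_rho([np.array([1.0, 0.0]), np.array([0.0, 1.0])], [0.5, 0.5])

# Ensemble B: equal mixture of |+> and |->
plus  = np.array([1.0,  1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
B = ensemble_rho([plus, minus], [0.5, 0.5])

# Different mixtures, identical density matrix: no measurement on the
# ensemble, and no master equation, can tell them apart.
print(np.allclose(A, B))   # True
```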
Bohmian Mechanics, Local Beables, Stochastic Mechanics
In these approaches [...] there is an explicit classical
system. The quantum state vector knows nothing about this classical
system. It evolves according to unmodified Schroedinger dynamics.
It acts on the classical system, affecting the classical dynamics
(which is either causal or stochastic), without itself being acted
upon. There is a mysterious quantum potential: action without
reaction. All such schemes are inconsistent with quantum mechanics.
They can be shown to contradict the indistinguishability of quantum
mixtures that are described by the same density matrix [jad95a].
That it must be so follows from quite general no-go theorems [...].
The fact that the above schemes allow us to distinguish between
mixtures that standard quantum mechanics considers indistinguishable
need not be a weakness. In fact, it may be an advantage, because
it may lead beyond quantum theory: it could provide us with means
of faster-than-light communication, provided experiment confirms
this feature.
How does our approach compare to those above? First of all, as
of today, our approach is explicitly phenomenological. That is
not to say that, for instance, the standard approach is not phenomenological.
In the standard approach we must decide where we end
our quantum description and what we "measure".
That does not follow from the theory; it must be put in from the
outside. However, we have been so thoroughly indoctrinated by Bohr's philosophy
and its apparent victory over Einstein's "realistic" dreams,
and we are today so used to this procedure, that we no longer feel
any uneasiness here. Somehow we tend to believe that the future
"quantum theory of everything" will explain all events
that happen. But chances are that this theory of everything will
explain nothing. It will be a dead theory. It will not even have
a Hamiltonian, because there will be no time. It will be a theory
of a world in which nothing happens by itself. It will answer
our questions about certain probabilities, when these
questions are asked. But it will not explain why anything happens
at all.
Our theory of event dynamics starts with an explicit phenomenological
split between a quantum system, which is not directly observable,
and a classical system, where events happen that can be observed
and that are to be described and explained. In other words, our starting
point is an explicit mathematical formulation of Heisenberg's cut.
The quantum system may be as big as one wishes it to be; the classical
system may retreat more and more, moved as far as we wish: towards
our sense organs, towards our brains, towards our mental processes.
But the further we retreat, the fewer facts we explain. At
the extreme limit we will be able to explain nothing but changes
of our mental states, i.e. only mental events. That state of affairs
may be considered satisfactory by those who adhere to idealistic
or eastern philosophies, but it need not be the one that enriches
our understanding of the true workings of Nature. Probably, for
most practical purposes, it is sufficient to retreat with the
quantum-classical cut as far as photon detection processes, which
can be treated as the primitive events. However, our event mechanics
works quite well when the cut between the quantum and the classical
is expressed in engineering language: as in the example of a quantum
SQUID coupled to a classical radio-frequency circuit, or a quantum
particle coupled to its yes-no position detectors, for instance
to a cloud chamber.
Once the split between the quantum and the classical is fixed,
the coupling between the two systems is described in terms of
a special master equation. Because of its special form there is
a unique random process in the space of pure states of the total
system that reproduces this master equation. The process gives an
algorithm for generating sample histories. It is of piecewise deterministic
character: it consists of periods of continuous evolution interrupted
by jumps and events that happen at random times. The continuous
evolution of the quantum system is described by a non-unitary
Schroedinger equation, modified by the coupling. The jump times
have a Poissonian character, with jump rates that depend on
the actual state of both the quantum and the classical system. The
back action of the classical system on the quantum one shows up
in two ways: first, by modifying the Schroedinger evolution
between jumps with a non-unitary damping; second, by causing the quantum
state to jump at event times. Notice that the master equation describing
statistical properties is linear, while the evolution of an individual
system is nonlinear. This agrees with Turing's aphorism stating
that prediction must be linear, description must be nonlinear
[...].
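The structure of such a process can be sketched in code. What follows is a minimal illustration in the spirit of the piecewise deterministic process, not the coupling of any specific model in the cited papers: a driven two-level quantum system, a classical event counter standing in for the classical part, continuous non-unitary evolution generated by an effective non-Hermitian Hamiltonian between jumps, and a state-dependent rate deciding, Poisson-fashion, when an event fires, records itself classically, and makes the quantum state jump. The operators, rates and parameters are all our own choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

dt, kappa, omega = 0.001, 1.0, 2.0
H = omega * np.array([[0, 1], [1, 0]], dtype=complex)           # coherent drive
L = np.sqrt(kappa) * np.array([[0, 1], [0, 0]], dtype=complex)  # jump operator
H_eff = H - 0.5j * (L.conj().T @ L)   # non-Hermitian: adds non-unitary damping

psi = np.array([1, 0], dtype=complex)     # quantum part: start in |0>
pointer, events = 0, []                   # classical part: event counter, record

for step in range(20000):
    # jump rate depends on the current quantum state
    rate = np.linalg.norm(L @ psi) ** 2
    if rng.random() < rate * dt:
        # event: the classical pointer advances, the quantum state jumps
        pointer += 1
        events.append(step * dt)
        psi = L @ psi
    else:
        # continuous, non-unitary evolution between events (Euler step)
        psi = psi - 1j * dt * (H_eff @ psi)
    psi /= np.linalg.norm(psi)            # keep the state normalized

print(pointer, "events; first few times:", events[:5])
```

The ensemble average over many such runs reproduces a linear master equation, while each individual trajectory evolves nonlinearly, matching the linear/nonlinear distinction noted above.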
Our theory, even if it works well and has practical value, should
be considered not as a final scheme of things, but merely as a step
that may help us to find a description of Nature more satisfactory
than the one proposed by the orthodox quantum philosophy. Pure quantum
theory proposes a universe that is dead: nothing happens, nothing
is real, apart from the questions asked by mysterious "observers".
But these observers are metaphysics; they are not in the equations. In
sharp contrast to the standard approach, our theory of event mechanics
described here makes the universe "run" again, even
before there were any observers. It has, however, acquired an arrow
of time that is driven by a fuzzy quantum clock. It also needs a
roulette. This is hard to accept for many of us. We would like to
believe that Nature is ruled by a perfect order. And to be perfect,
this order must be deterministic. Even if we do not share Einstein's
dissatisfaction with quantum theory, we tend to understand his disgust
at the very thought of God playing dice. But Nature's concept of
a perfect order may not be as simple as we would wish. Perhaps using
probability theory is the only way of describing in finite terms
a universe of infinite complexity. It may be that we
will never know the ultimate secret; nevertheless, the mechanism
proposed by EEQT brings hope of restoring some of the order that we are
seeking. Namely, the quantum-classical clock that we describe below
works "by itself". It is true that it needs a roulette,
but the roulette is a classical roulette. We need only classical
probability and classical random processes. That is good, because
we understand classical probability by heart, while "quantum
probability" we understand only in abstract terms. That is
also some progress, because nowadays we know more about complexity
theory, the theory of random sequences, and the theory of chaotic phenomena.
Each year we find new ways of generating apparently random phenomena
out of deterministic algorithms of sufficient complexity. In fact,
our event generating algorithm is successfully simulated on a
completely deterministic classical computer. The crucial problem
here is the necessary computing power. Moreover, the algorithm is
nonlocal. We do not know how Nature manages to keep its world clock
running with no or little effort. We have yet to learn it.

[blaja95b]
We will talk about a theory of events. To be honest, we should
allow for the adjective "phenomenological". We will explain
later our reasons for this restraint. This new theory enhances and
extends the standard quantum formalism. It provides a solution to
the quantum measurement problem. The usual formalism of quantum theory
fails in this respect. Let us look, for instance, into a recent book
on the subject, The Interpretation of Quantum Theory [Omnes].
There we can see both the difficulties and the methods that
attempt to overcome them. We disagree with the optimism shared by
many, perhaps by a majority of quantum physicists. They seem to believe
that the problem is already solved, or almost solved. They use a magic
spell, and at present the magic spell that is supposed to dissolve
the problems is decoherence. It is true that there are new
ideas and new results in the decoherence approach. But these results
have not quite solved the problem. Real-world events, in particular
pointer readings of measuring apparatuses, have never been obtained within
this approach. Decoherence does not yet tell us how to program a
computer to simulate such events. A physicist, a human being, must
intervene to decide what to decohere and how to decohere. Which basis
is to be distinguished? What must be neglected and what must not?
Which limit to take? That necessity of human intervention is not
a surprise. The standard quantum formalism simply has no resources
that can be called upon when we wish to derive the basic postulates
about measurements and probabilities. These postulates are repeated
in all textbooks. They are never derived. The usual probabilistic
interpretation of quantum theory is postulated from outside.
It is not deduced from within the formalism. That is rather unsatisfactory.
We want to believe that quantum theory is fundamental, but its interpretation
is so arbitrary! Must it be so?
[blaja95a]
I. Introduction
Quantum Mechanics occupies a particular place among scientific theories;
indeed, it is at once one of the most successful and one of the most
mysterious. Its success lies undoubtedly in the fact that, using
Quantum Mechanics, one can predict properties of atoms, of molecules,
of chemical reactions, of conductors and insulators, and much more.
These predictions have been confirmed by precise measurements and by the
technological progress that is based on quantum phenomena. The mystery
resides in the problem of the interpretation of Quantum Theory, which
does not follow from the formalism itself but is left to the discretion
of the physicist. As a result, there is still no general agreement about
how Quantum Mechanics is best understood and to what extent it can
be considered exact and complete.
As emphasized already by E. Schroedinger [...], what is definitively
and completely missing in Standard Quantum Mechanics is an explanation
of experimental facts, as it does not tell us how to generate the time
series of events recorded during real experiments on single individual
systems. H. P. Stapp [...] and R. Haag [...] emphasized the role
and importance of events in quantum physics. J. Bell [...]
stressed the fundamental necessity of distinguishing definite events
from just wavy possibilities.
In 1969 E. B. Davies [...] introduced the space of events
in his mathematical theory of quantum stochastic processes, which
extended the standard formalism of quantum theory. His theory went
beyond standard quantum measurement theory and, in its most general
form, was not expressible in terms of quantum master equations alone.
Later on Srinivas, in a joint paper with Davies [...], specialized
Davies' general and mathematically sophisticated scheme to photodetection
processes. The photon counting statistics predicted by this theory were
successfully verified in fluorescence experiments, which caused R.
J. Cook to revisit the question what are quantum jumps [...].
A related question, are there quantum jumps, was asked by
J. Bell [...] in connection with the idea of spontaneous localization
put forward by Ghirardi, Rimini and Weber [...].
In the eighties quantum optics experiments started to call for
efficient methods of solving the quantum master equations that described
the effective coupling of atoms to the radiation modes. The works of
Carmichael [...], Dalibard, Castin and Moelmer [...], Dum, Zoller
and Ritsch [...], and Gardiner, Parkins and Zoller [...] developed
the Quantum Monte Carlo (QMC) algorithm for simulating solutions of
master equations. (A less general scheme was proposed by Teich and
Mahler [...], who tried to extract a specific jump process directly
from the orthogonal decomposition of the time-evolving density matrix.
On the other hand, already in 1986 Diosi [...] proposed a pure state,
piecewise deterministic process that reproduces a given master equation.
His process, although canonical in the non-degenerate case, is not unique.)
The algorithm emerged from the seminal papers of Davies [...]
on quantum stochastic processes, which were followed by numerous
works on photon counting and continuous measurements [...]. It was
soon realized [...] that the same master equations can be simulated
either by a Quantum Monte Carlo method based on quantum jumps, or
by a continuous quantum state diffusion. Wiseman and Milburn [...]
discussed the question of whether experimental detection schemes
are better described by continuous diffusions or by discontinuous
jump simulations. The two approaches were recently compared
also by Garraway and Knight [...], while Gisin et al. [...] argued
that the quantum jumps can be clearly seen in the quantum
state diffusion plots. Apart from the numerical usefulness of quantum
jumps and the empirical observability of photon counts, the debate over
their reality continued. A brief synthesis of the present
state of the debate has been given by Moelmer in the final paragraphs
of his 1994 Trieste lectures [...]:
The macroscopic collapse has been explained, the elementary collapse,
however remains as an essential and unexplained ingredient of
the theory.
A real advantage of the QMC method: We can be sitting there and
discussing its philosophical implications and the deep questions
of quantum physics while the computer is cranking out numbers
which we need for practical purposes and which we could never
obtain in any other way. What more can we ask for?
In the present paper we argue that more can indeed not
only be asked for, but can also be provided. The picture that
we propose developed from a series of papers [...] where we treated
several applications, including a SQUID-tank model [...] and a cloud chamber
model (with GRW spontaneous localization) [...]. In the sequel we
will refer to it as Event Enhanced Quantum Theory (EEQT). EEQT is
a minimal extension of the standard quantum theory that accounts
for events. In the next three sections we will describe the formal aspects
of EEQT, but we will attempt to reduce the mathematical apparatus
to the absolute minimum. In the final Sect. 4 we will propose to
use EEQT for describing not only quantum measurement experiments,
but all the real processes and events in Nature. The new formalism
raises new questions, and in Sect. 4 we will point out some of them.
One of the problems that can be discussed in a somewhat new light
is that of the role of observers and IGUSes (Information
Gathering and Utilizing Systems, using the terminology of Gell-Mann
and Hartle [...]). We will also make a comment on a possible interpretation
of Connes' version of the Standard Model as a stochastic geometry
à la EEQT, with jumps between the two copies of spacetime. Finally,
we will mention the relevance of EEQT to the theory and practice of
quantum computers.

We have seen that Quantum Theory can be enhanced in a rather simple
way. Once enhanced, it predicts new facts and straightens out old mysteries.
The EEQT that we have outlined above has several important advantages.
One such advantage is of a practical nature: we may use the algorithms
it provides and we may ask computers to crank out numbers that
are needed in experiments and that cannot be obtained in any other
way. For example, in [blaja93c] we have shown how to generate
pointer readings in a tank radio circuit coupled to a SQUID. In [jad94b,c]
the algorithm generating detection events for an arbitrary geometrical
configuration of particle position detectors was derived. As a particular
case, in a continuous homogeneous limit, we reproduced the GRW spontaneous
localization model. Many other examples come from quantum optics,
since QMC is a special case of our approach, namely the case when events
are not fed back into the system and thus do not really matter.
Another advantage of EEQT is of a conceptual nature: in EEQT we
need only one postulate: that events can be observed. All
the rest can and should be derived from this postulate. The entire
probabilistic interpretation, everything that we have learned about
eigenvalues, eigenvectors, transition probabilities etc., can be derived
from the formalism of EEQT. Thus in [blaja93a] we have shown that the
probability distribution of the eigenvalues of Hermitian observables
can be derived from the simplest coupling, while in [blaja94c,blaja95c]
we have shown that Born's interpretation can be derived from the
simplest possible model of a position detector. Moreover, in [jad94a]
it was shown that EEQT can also give definite predictions for nonstandard
measurements, like those involving non-commuting operators (notice
that in our scheme the contributions gab from different, possibly
non-commuting, devices add).
It is also possible that the ideas of EEQT may throw new
light on some applications of noncommutative geometry. Namely,
when C consists of two points, then our V can
be interpreted as Quillen's superconnection (cf. [...] and references
therein). (Cf. also [...] for the relation between superconnections and
classical Markov processes.)

Another potential field of application of EEQT is the theory
and practice of quantum computation. Computing with arrays of coupled
quantum rather than classical systems seems to offer advantages for
special classes of problems (see [...] and references therein). Quantum
computers will, however, have to use classical interfaces; they will have
to communicate with, and be controlled by, classical computers. Moreover,
we will have to understand what happens during individual runs. Only
EEQT is able to provide an effective framework to handle these problems.
It keeps a perfect balance of probabilities without introducing negative
probabilities, and it needs only standard random number generators
for its simulations. For a recent work where similar ideas are considered,
cf. [...]
EEQT is a precise and predictive theory. Although it appears to
be correct, it is as yet incomplete. The enhanced formalism and
the enhanced framework not only give enhanced answers, they also
invite new questions. Indeed, we are tempted to consider
the possibility that the PDP can be applied not only to what we call
experiments, but also, as a world process, to the entire universe
(including all kinds of observers). Thus we may assume that
all the events that have happened were generated by a particular PDP
process, with some unknown Q, C, H and V. Then, assuming that
past events are known, the future is partly determined and partly
open. Knowing Q, C, H, V and knowing the actual state (even
if this knowledge is fuzzy and uncertain), we are in a position to
use the PDP algorithm to generate the probable future series of
events. With such a promotion of the PDP to the role of a universal
world process, questions arise that could not be asked before: what
is C and what is V?, and perhaps also: what is t? and what
are we? Of course we are not in a position to provide answers.
But we can discuss possibilities and we can provide hints.

What is time?
Let us start with the question: what is time? Answering
that time is determined by the thermodynamic state of the system
[...] is not enough, as we would like to know how it happened
that a particular thermodynamic state has evolved; to understand
this we must assume evolution, and thus we are back at the question:
what is time, if not just the counting of steps of this evolution?
We are tempted to answer: time is just a measure of the number of events
that have happened in a given place. If so, then time is discrete,
and there is another time, one that counts the deterministic steps between
events. In that case, tossing a die to decide whether the next step
is to be an event or not is probably uneconomic and unnecessary;
it is quite possible that the Poissonian character of events is
the result of some ergodic theorem, when we use not the true discrete
time but some continuous averaged time (averaged over
a neighborhood of a given place). Thus a possible algorithm for
a finite universe would be discrete, with a die tossed every N
steps, N being a fixed integer, and continuous, averaged
time would appear only in a thermodynamic limit. In fact, in a finite
universe, die tossing should be replaced by a deterministic algorithm
of sufficient complexity. A spectrum of different approaches to
the problem of time, some of them similar to the one presented above,
can be found in Ref. [...]. In a recent paper J. Schneider [...]
proposes that a passing instant is the production of a meaningful
symbol, and must therefore be formalized in a rigorous way as a
transition. He also states that the linear time of physics is the
counting of the passing instants, that time is linked with the production
of meaning, and that it is irreversible per se. We agree only in part, as
we strongly believe that physical events, and the information that
is gained through these events, are objective and primary with respect
to secondary mental or semantic events.
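The discrete picture above, deterministic micro-steps with a die tossed every N-th step and "time" defined as the accumulated count of events, admits a toy formalization. This is purely our own illustration, with arbitrarily chosen parameters, not a model proposed in the text:

```python
import random

random.seed(1)

N = 10            # deterministic micro-steps between chance decisions
p_event = 0.3     # probability that a decision produces an event
micro_steps = 0   # the inner, unobservable counter of deterministic steps
event_time = 0    # "physical" time: the number of events that happened

for _ in range(1000):
    micro_steps += 1
    # every N-th micro-step, toss the die: does an event happen?
    if micro_steps % N == 0 and random.random() < p_event:
        event_time += 1

print(micro_steps, event_time)
```

In such a toy model the two clocks visibly disagree: the inner counter ticks uniformly, while event-time advances irregularly and, on average, N/p_event micro-steps apart, which is the kind of averaged, continuous-looking time the text alludes to.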

What is classical?
We consider now the question: what is classical? In each
practical case, when we want to explain a given phenomenon, it is
clear what constitutes the events that we want to account
for. These events are classical, and usually we can safely extend
the classical system C towards Q, gaining a lot and losing a little.
But here we are asking not a practical question; we are asking a
fundamental question: what is the true C? There are several possibilities
here, each one having advantages and disadvantages, depending on
the circumstances in which the question is being asked. If we believe
in quantum field theory and if we are ready to take its lesson,
then we must admit that one Hilbert space is not enough, that there
are inequivalent representations of the canonical commutation relations,
that there are superselection sectors associated with different phases.
In particular, there are inequivalent infrared representations associated
with massless particles [...]. Then classical events would be, for
instance, soft photon creation and annihilation events. That idea
was suggested by Stapp [...] some ten years ago, and is currently
being developed in a rigorous, algebraic framework by D. Buchholz
[...].
Another possibility is that not only photons, but also long range
gravitational forces, may take part in the transition from the potential
to the actual. That hypothesis has been expressed by several authors
(see e.g. the contributions of F. Karolyhazy et al. and R. Penrose
in [...]; also L. Diosi [...]).
The two possibilities quoted above are not satisfactory when we
think of a finite universe, evolving step by step, with a finite
number of events. In that case we do not yet know what gravity is
and what light is, as they, together with space, are to emerge only
in the macroscopic limit of an infinite number of events. In such
a case it is natural to look for C in Q. We could just define an
event as a non-unitary change of state of Q. In other words, we would
take for the space S of classical pure states the only available
set: the unit ball of the Hilbert space. This possibility has
already been discussed in [jad94]. This choice of S is also necessary
when we want to discuss the problem of the objectivity of a quantum
state. If quantum states are objective (even if they can be determined
only approximately), then the question what is the actual state
of the system is a classical question, since an attempt to quantize
also the position of psi would lead to nonsense. We should perhaps
remark here that the picture of a fixed Q and a fixed C that we have
discussed in this paper is oversimplified. When attempting to use
the PDP algorithm to create a finite universe in the spirit of the spacetime
code of D. Finkelstein (cf. [...] and references therein), or the bit-string
universe of P. Noyes (cf. Noyes' contribution to [...]), we would
have to allow for Q and C to grow with the number of events. Our
formalism is flexible enough to adjust to such a change.

Dynamics and Binamics
Having provided tentative answers to some of the new questions,
let us pause to discuss possible conceptual implications of
EEQT. We notice that EEQT is a dualistic (and even syncretistic)
theory. In fact, we propose to call the part of the time evolution
associated with V by the name of binamics, in contrast to the part
associated with H, which is called dynamics. While dynamics deals with
the laws of exchange of forces, binamics deals with the laws of
exchange of bits (of information). We believe that these two sets
of laws refer to different projections of one reality, and that neither
of these projections can be completely reduced to the other.
Moreover, concerning their reality status, we believe that bits
are as real as forces. That this is indeed the case should
be clear if we apply A. Lande's famous criterion of reality:
real is what can kick. We know that information, when applied in
an appropriate way, may cause changes and may kick, no less than
a force.

What are we?
We have used the term we too many times to leave it without
a comment. Certainly we are partly Q and partly C (and partly
something else). But we are not only subjects and spectators:
sometimes we are also actors. In particular we can gain and
utilize information [...]. How can this happen? How can we control
anything? Usually it is assumed that we can prepare states by manipulating
Hamiltonians. But that cannot be exactly true. It is beyond our
power to change the coupling constants or Hamiltonians that govern
the fundamental forces of Nature. When we say that we can manipulate
Hamiltonians, we really mean that we can manipulate states
in such a way that the standard fundamental Hamiltonians act on
these special states as if they were phenomenological Hamiltonians
with the classical control parameters and external fields that we need
in order to explain our laboratory procedures. So, how can we manipulate
states without being able to manipulate Hamiltonians? We can only
guess what the answer of other interpretations of Quantum
Theory would be. Our answer is: we have some freedom in manipulating C and
V. We cannot manipulate dynamics, but binamics is open. It is through
V and C that we can feed back the processed information and knowledge;
thus our approach seems to leave comfortable space for IGUSes.
In other words, although we can exercise little if any influence
on the continuous, deterministic evolution [...], we may have partial
freedom to intervene, through C and V, at bifurcation points,
when die tossing takes place. It may also be remarked that the fact
that more information can be used than is contained in the master equation
of standard quantum theory may have not only engineering but also
biological significance. In particular, we provide parameters (C
and V) that specify event processes that may be used in biological
organization and communication. Thus in EEQT, we believe, we overcome
the criticism expressed by B.D. Josephson concerning the universality of
quantum mechanics [...]. The interface between Quantum Physics
and Biology is certainly also concerned with the fact that many
biological processes (like the emergence of naturally catalytic
molecules or the evolution of the genetic code) can in principle be
described and understood in terms of physical quantum events of
the kind that we have discussed above.
We believe that our proposal, as outlined in this paper and
elaborated on several examples in the quoted references, is indeed
the minimal extension of quantum theory that accounts for events.
We believe that, apart from its practical applications, it can also
serve as a reminder of the existence of new ways of looking at old but
important problems.

[blaja94c]
The solutions to the quantum measurement problem proposed by e.g.
von Neumann and Wigner are no solutions at all. They merely shift
the focus from one unsolved problem to another. On the other hand,
the predictions for the outcomes of measurements performed on statistical
ensembles of physical systems are excellent. What is, however, completely
missing in the standard interpretation is an explanation of experimental
facts, i.e. a description of the actual individual time series of events
of the experiment. That an enhancement of Quantum Theory allowing
the description of single systems is necessary is nowadays clear.
Indeed, advances in technology make fundamental experiments on quantum
systems possible. These experiments give us series of events for which
there is definitely no place in the original, standard version of
quantum mechanics, since each event is classical, discrete and irreversible.
In recent papers [18] we provided a definite meaning to the concepts
of experiment and event in the framework of mathematically consistent
models describing the information transfer between a classical event
space and quantum systems. We emphasize that for us the adjective classical
has to be understood in the following sense: to each particular experimental
situation corresponds a class of classical events, revealing to us the
Heisenberg transition from the possible to the actual, and these events
obey the rules of the classical logic of Aristotle and Boole. The World
of the Potential is governed by quantum logic and has to account for
the World of the Actual, whose logic is classical. We accept both and
try to see what we gain this way. It appears that, working with
the formalism of quantum theory enhanced in this way, we gain a lot.
We proposed mathematical and physical rules to describe:
- the two kinds of evolution of quantum systems, namely continuous
and stochastic
- the flow of information from quantum systems to the classical
event space
- the control of quantum states and processes by classical parameters.
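The three rules above can be condensed into a single coupled Liouville equation of Lindblad type. The following is a hedged sketch reconstructed from the surrounding discussion, not a quotation: the total state is assumed to be a family of quantum density matrices rho_alpha indexed by the classical state alpha, with Hamiltonians H_alpha and coupling operators g_{alpha beta}:

```latex
\dot{\rho}_{\alpha} \;=\; -\,i\,[H_{\alpha},\rho_{\alpha}]
\;+\;\sum_{\beta} g_{\alpha\beta}\,\rho_{\beta}\,g_{\alpha\beta}^{\ast}
\;-\;\tfrac{1}{2}\,\{\Lambda_{\alpha},\rho_{\alpha}\},
\qquad
\Lambda_{\alpha} \;=\; \sum_{\beta} g_{\beta\alpha}^{\ast}\,g_{\beta\alpha}.
```

In this sketch the commutator term carries the continuous (dynamical) evolution, while the g-terms carry both the stochastic events and the flow of information to the classical label alpha.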

It follows that, as long as the usual locality assumptions [...]
are satisfied, the event statistics seen on the right do not depend
on what is measured on the left, or on whether anything is measured
there at all. We stress that this observation alone should not be
used to conclude that superluminal signalling using EPR is impossible,
for the simple reason that we have been considering a particular
and simplified model above. What we proved above is only that superluminal
communicators must necessarily use more refined methods than the one
considered above.
We are not saying, however, that we are satisfied with our
understanding of Quantum Mechanics, and we agree with Feynman's belief
that no one really understands Quantum Mechanics. In this context
it is perhaps appropriate to quote a statement by Nagel: "The
main event of this century will be the first human contact with the
invisible quantum world". "Real black magic calculus"
is how Einstein described Quantum Mechanics in a letter in 1925. Our
models of coupling quantum systems to classical event spaces can be
rightly criticized as being too phenomenological. But we offer some
new ways of seeing things and new mathematics, providing an additional
perspective on Quantum Theory and links between the old and the new,
the possible and the actual, statistical ensembles and individual
systems, waves and particles, and the deterministic and the random.
Despite exciting results, however, an outstanding challenge remains.


[blaja94b]
It was also John Bell's point of view that "something is rotten"
in the state of Denmark and that no formulation of orthodox quantum
mechanics was free of fatal flaws. This conviction motivated his last
publication [...]. As he says: "Surely after 62 years we should
have an exact formulation of some serious part of quantum mechanics.
By "exact" I do not mean of course "exactly true".
I only mean that the theory should be fully formulated in mathematical
terms, with nothing left to the discretion of the theoretical physicist
...". 
Two options are possible for completing Quantum Mechanics. According
to John Bell [...]: "Either the wave function is not everything
or it is not right ...". 
The class of models we consider aims at providing an answer to the
question of how and why quantum phenomena become real as a result
of interaction between quantum and classical domains. Our results
show that a simple dissipative time evolution can allow a dynamical
exchange of information between the classical and quantum levels of Nature.
Indeterminism is an implicit part of classical physics and an explicit
ingredient of quantum physics. Irreversible laws are fundamental, and
reversibility is an approximation. 
We extend the model of Quantum Theory in such a way that the successful
features of the existing theory are retained but transitions
between equilibria, in the sense of recording effects, are permitted.

To the Liouville equation describing the time evolution of statistical
states of the total system we will be in a position to associate a piecewise
deterministic process taking values in the set of pure states of this
system. Knowing this process one can answer all kinds of questions
about time correlations of the events, as well as simulate numerically
the possible histories of individual quantum-classical systems. Let
us emphasize that nothing more can be expected from a theory without
introducing some explicit dynamics of hidden variables. What we achieved
is the maximum of what can be achieved, which is more than the orthodox
interpretation gives. There are also no paradoxes; we cannot predict
individual events (as they are random), but we can simulate the observations
of individual systems.
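The numerical simulation of such a piecewise deterministic process can be sketched minimally. The following is an illustrative sketch, not the authors' code: the two-level Hamiltonian, the single coupling operator g, and the two-state classical pointer are assumptions chosen for concreteness, and the evolution follows the standard quantum-jump recipe (deterministic non-unitary drift between events, a jump with a pointer flip when the squared norm crosses a random threshold):

```python
import numpy as np

def simulate_pdp(H, g, psi0, t_max, dt, rng):
    """Sketch of a piecewise deterministic (quantum-jump) process.

    Between events psi drifts under the non-Hermitian generator
    K = H - (i/2) g*g, so its squared norm decays monotonically.
    An event fires when the squared norm reaches a uniformly drawn
    threshold; psi is then reset to g psi (normalized) and the
    two-state classical pointer flips, recording the event.
    """
    K = H - 0.5j * (g.conj().T @ g)        # effective generator
    psi = psi0 / np.linalg.norm(psi0)
    classical = 0                           # two-state classical pointer
    threshold = rng.random()                # next jump threshold in (0, 1)
    events = []
    t = 0.0
    while t < t_max:
        psi = psi - 1j * dt * (K @ psi)     # Euler step of dpsi/dt = -i K psi
        t += dt
        if np.vdot(psi, psi).real <= threshold:
            psi = g @ psi                   # event: collapse along g
            psi = psi / np.linalg.norm(psi)
            classical = 1 - classical       # classical pointer flips
            events.append((t, classical))
            threshold = rng.random()
    return events

# Toy run: sigma_x Hamiltonian, projection-like coupling to |0><0|.
H = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
g = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
psi0 = np.array([1.0, 1.0], dtype=complex)
events = simulate_pdp(H, g, psi0, t_max=20.0, dt=0.001,
                      rng=np.random.default_rng(0))
print(len(events))   # a random-looking but seed-reproducible event series
```

The point of the sketch is exactly the one made in the text: given a random number generator and computing power, one obtains a concrete time series of classical events for an individual system, which the master equation alone does not provide.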
It is tempting to use the Zeno effect to slow down the time
evolution in such a way that the state of a quantum system Q can
be determined by carrying out measurements of sufficiently many observables.
This idea, however, would not work, just as the proposal of
"protective measurements" of Y. Aharonov et al. [...] would not work.
To apply Zeno-type measurements, just as to apply a "protective
measurement", one would have to know the state beforehand. Our
results suggest that obtaining a reliable knowledge of the quantum
state may necessarily lead to a significant, irreversible disturbance
of the state. This negative statement does not mean that we have shown
that the quantum state cannot be objectively determined. We believe,
however, that the dynamical, statistical and information-theoretical
aspects of the important problem of obtaining a maximally reliable
knowledge of an unknown quantum state with the least possible disturbance
are not yet sufficiently understood. 
[jad94b]
From a philosophical point of view, it is worth noting that in the
present paper, in sharp contrast to the standpoint taken by H.P.
Stapp in his recent paper [...], we deliberately avoid the concept
of an "observer". Our model aims at being as objective as
the concept of probability allows. A philosophical summary
of our results can be formulated as follows: Quantum Theory, once
invented by human minds and once asked the questions that are of interest
to human beings, needs no "minds" or "observers"
any more. What it needs is a lot of computing power and effective random
number generators, rather than "observers". The fundamental
question, to which we do not yet know the answer, can thus be formulated
as follows: can random number generators be avoided and replaced by
deterministic algorithms with a simple and clear meaning? 
The crucial concept of our approach to quantum measurements is that
of an "event". 
Our aim is to explain the "nice linear tracks" that quantum
particles leave on photographs and in cloud chambers. These tracks
are indeed hard to explain if one assumes that there are no particles
and no events, only Schroedinger's waves. 
We have seen that a simple coupling between a quantum particle and a
classical continuous medium of two-state detectors leads to a piecewise
deterministic random process that accounts for track formation in cloud
chambers and photographic plates. 
As mentioned in the Introduction, to simulate track formation only
random number generators and computing power are necessary. Our model
does not involve observers and minds. This does not mean that we do
not appreciate the importance of the mind-body problem. In our opinion,
understanding the problem of mind also needs quantum theory, and
perhaps even more than that, something still beyond the horizon of
present-day physics.
But our model indicates that quantum theory does not need human minds.
Quantum theory should be formulated in a way that involves neither
observers nor minds, at least no more than any other branch of physics.

[jad94a]
Replacing Schroedinger's evolution, which governs the dynamics of
pure states, by an equation of the Liouville type, which describes
the time evolution of mixed states, is a necessary step, but it does
not suffice for modeling real-world events. One must take,
to this end, two further steps. First of all, we should admit that
in our reasoning, our communication, and our description of facts,
we are using classical logic. Thus somewhere in the final
step of the transmission of information from quantum systems to macroscopic
recording devices, and further to our senses and minds, a translation
between quantum and classical should take place. That such a translation
is necessary is evident also when we consider the opposite direction:
to test a physical theory we perform controlled experiments.
But some of the controls are always of a classical nature: they
are external parameters with concrete numerical values. So we need
to consider systems with both quantum and classical degrees of freedom,
and we need evolution equations that enable communication in both
directions, i.e.:
- flow of information from quantum to classical
- control of quantum states and processes by classical parameters

Our point is that "measurement" is an undefined concept
in standard quantum theory, and that the probabilistic interpretation
must, because of that, be brought in from outside. What we propose is
to define measurement as a CP-semigroup coupling between a
classical and a quantum system, and to derive the probabilistic
interpretation of the quantum theory from that of the classical one.

Recently Aharonov and Vaidman [...] discussed this problem in some
detail. I do not think that they found the answer, as their arguments
are circular, and they seem to be well aware of this circularity.
The difficulty here lies in the fact that we have to discriminate between
nonorthogonal projections (because different states are not
necessarily orthogonal), and this implies the necessity of simultaneously
measuring noncommuting observables. There have been many papers
discussing such measurements, with different authors often taking
different positions. However, they all seem to agree that predictions
from such measurements are necessarily fuzzy, this fuzziness
being directly related to the Heisenberg uncertainty relation for
noncommuting observables. Using the methods and ideas presented in the
previous sections of this chapter it is possible to build models corresponding
to the intuitive idea of a simultaneous measurement of several noncommuting
observables, like, for instance, different spin components.
[blaja93d]
Our results show that a simple dissipative time evolution can result
in a dynamical exchange of information between the classical and quantum
levels of Nature. 
In our model the quantum system is coupled to a classical recording
device which responds to its actual state. We thus give a minimal
mathematical semantics for describing the measurement process in
Quantum Mechanics. For this reason the toy model that we proposed
can be seen as the elementary building block used by Nature in the
communications that take place between the quantum and classical levels.
The model has not only nice mathematical properties but is also
of great practical accessibility, and it is therefore natural to formulate
any practical problem starting from the general structure of the
"Ansatz" we have proposed. 
The exceptionally brilliant calculational successes of Quantum Mechanics
cannot make us forget the degree of conceptual confusion still present.
The essential problem follows from the fact that Quantum Mechanics
is the most fundamental theory we know. But if it is really fundamental,
it should be universally applicable. In particular, quantum physics
should be able to explain also the properties of macroscopic objects
and the occurrence of macroscopic events. But measurement situations show
clearly that it is impossible to apply standard Quantum Mechanics
in a consistent way to all relevant situations. If there is such a
universal theory, it is therefore not Quantum Theory.
Indeterminism is an implicit part of classical physics. Irreversible
laws are fundamental, and reversibility is an approximation. We cannot
refrain from quoting R. Haag's paper "Irreversibility introduced
on a fundamental level": ... once one accepts indeterminism,
there is no reason against including irreversibility as part of the
fundamental laws of Nature. 
A lot of work must still be done, and many prejudices overcome. What
we propose does not aspire to be a magic medicine that will rejuvenate
Quantum Theory and make it Universal-And-True-For-Ever. But, perhaps,
it will help to stop the bleeding from some open scars.
[blaja93a]
According to Niels Bohr [...] there is an indispensable fundamental
duality between the classical and the quantum levels of Nature. Our
approach provides a mathematical form to such a view, and thus transfers
its contents from the realm of philosophy into that of physics. The
model that we present below shows that a mathematically consistent
description of interaction between classical and quantum systems is
feasible. Following Niels Bohr, we believe that the very fact that
we can communicate our discoveries to our fellow men constitutes an
experimental proof that interactions of the type that our model describes
do exist in Nature. 
As a summary of our project, we think that our results reduce the
number of puzzles to one, i.e. that of the arrow of time, whereas
initially we believed that two important ones had to be solved, i.e.
the puzzle of irreversibility and that of quantum measurement (as
exemplified, for instance, by the paradoxes of von Neumann's infinite
chain, Schroedinger's cat and Wigner's friend). We also believe
that this remaining puzzle can be solved only after we have acquired
a radically new understanding of the nature of time. 
We propose to discuss the hypothesis that the yes-no flip mechanism
that we exploit in our model may constitute an elementary building
block used by Nature in the communication between Her quantum
and classical levels. 
The philosophical motivation for this investigation came from the
works of Niels Bohr and Karl Popper. 
We formulated a model that provides an answer to one of the important
conceptual problems of quantum theory, the problem of how and when
a quantum phenomenon becomes real as a result of a suitable dissipative
time evolution. 
Thus, if the universality hypothesis expressed in the Introduction
is to be taken seriously, it will naturally lead to consequences that
are at variance with some elements of the prevailing paradigm.
Space (and thus also time) is to be built out of discrete elements.

