INFORMATION HIDES IN THE GLARE OF REALITY.
“At the heart of everything is a question, not an
answer” – Wheeler.
Or is it the opposite?
The answer is staring at us in all its glory, but its glare blinds us!
Why, of all the number systems in use, do binary systems dominate the information sector? What does reality stand for? How do virtual particles pop out and vanish in the so-called vacuum states? By what mechanism does information (thought) pop out of memory in response to some external stimulus and vanish again? What is an electron or a photon? What would happen if protons or neutrons were used in the double-slit experiment? Does the Big Bang imply creation ex nihilo? What is nothingness? Is there an all-encompassing background structure? Can energy be non-interacting (dark)? As Jesus says, let those who have eyes see.
THE ITS & THE BITS:
Information Theory is based on the concept of writing instructions that make a computer follow and run a program, or on matching the perceptions of the transmitter with those of the receiver. Perception is the processing of the results of measurements of different but related fields of something, together with some stored data, to convey the combined form “it is like that”, where “it” refers to an object (constituted of bits) and “that” refers to a concept signified by the object (a self-contained representation). Measurement returns restricted information related to only one field at a time. To understand all aspects, we have to take multiple readings of all aspects. Hence, in addition to encryption (language phrased in terms of algorithms executed on certain computing machines - a sequence of symbols), compression (quantification and reduction of complexity - grammar) and data transmission (sound, signals), there is a necessity of mixing information (the mass of text, the volume of intermediate data, the time over which such a process will be executed) related to different aspects (readings generated from different fields) with a common code (data structure - strings) to bring it to the format “it is like that”.
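As a loose illustration of this mixing, here is a minimal Python sketch that pools readings from different fields with a common string code to produce the form “it is like that”; the field names, values and stored concept are all hypothetical.

```python
# A hedged sketch of "mixing": readings from different fields are pooled
# with a common code (a string data structure) into "it is like that".
readings = {                    # hypothetical measurements, one field at a time
    "shape": "cylindrical",
    "colour": "blue",
    "sound": "click",
}
stored_concept = "pen"          # the concept the pooled bits signify

message = f"it ({', '.join(readings.values())}) is like that ({stored_concept})"
print(message)  # it (cylindrical, blue, click) is like that (pen)
```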
In communication technology, the mixing is done through data, text, spreadsheets, pictures, voice and video. Data are
discretely defined fields. What the user sees is controlled by software - a collection
of computer programs. What the hardware sees is bytes and bits. In perception,
these tasks are done by the brain. Data are the response of our sense organs to
individual external stimuli. Text is the excitation of the neural network in
specific regions of the brain. Spreadsheets are the memories of earlier
perception. Pictures are the inertia of motion generated in memory (thought)
after a fresh impulse, linking related past experiences. Voice is the
disturbance created due to the disharmony between the present thought and the
stored image (this or that, yes or no). Video is the net thought that emerges
out of such interaction. Software is the memory. Hardware includes the neural
network. Bytes and bits are the changing interactions of the sense organs (string)
with their respective fields generated by the objects evolving in time.
The result of measurement is always related to a time t, and is frozen for use at later times t₁, t₂, etc., when the object has evolved further. All other unknown states are combined together and are called a superposition of states. Hence there is an uncertainty inherent in it, which Shannon calls entropy. In perception, the concept remains in a superposition of states and collapses in response to some stimulus. In information technology, the updating is done by an agent. In perception, it is done by the neural network and memory. All information has a source rate (complexity) that can be measured in bits per second (speed) and requires a transmission channel (mode) with a capacity equal to or greater than the source rate (intelligence or memory level). In perception, these are the intelligence level and the mind.
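As a minimal numerical sketch of this requirement, the following Python fragment computes a source’s entropy and rate and checks them against an assumed channel capacity; all figures are illustrative, not drawn from any real system.

```python
# Source rate (entropy per symbol times symbol rate) must not exceed
# the channel capacity; the numbers below are assumptions.
from math import log2

probabilities = [0.5, 0.25, 0.125, 0.125]           # a hypothetical four-symbol source
entropy = -sum(p * log2(p) for p in probabilities)  # Shannon entropy: 1.75 bits/symbol
symbols_per_second = 100                            # assumed symbol rate
source_rate = entropy * symbols_per_second          # 175 bits per second

channel_capacity = 200                              # assumed capacity in bits per second
print(source_rate <= channel_capacity)              # True: the channel can carry the source
```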
Nature follows this principle for storing and processing information. The nature of energy is to displace, which leads to transformation. Objects are perceived only during such transitions. Such transitions involve standing waves, which also generate sound. Thus, perception includes audio-visual aspects, or e.m. and sound waves. A wave, by definition, is continuous. A particle is discrete. Hence something can be described both as a wave and as a particle only at a point - the interface of two waves. The photon consists of two standing waves of force - one an expansive electric force and the other a contractive magnetic force. When these waves intersect each other perpendicularly, it is called an electromagnetic particle. The particle vanishes as the forces separate in their continuation as standing waves. The photon is the locus of this interface in a direction perpendicular to both. Hence it is called the carrier of e.m. energy and has no rest mass. A wave always requires a medium. Since density plays an important role in momentum transfer, and since the density of space is the minimum, the velocity of a photon in space is the maximum.
A sound wave is a perturbation of density, pressure and velocity, in which sites of maximum density alternate with sites of minimum density to generate and propagate the vibrations. They are distributed periodically and propagate in the medium with the velocity of sound. The wave is a weak perturbation when the relative value of the density amplitude (i.e., the greatest value of density divided by the average density of the medium) is small compared to unity. In a sound wave of large amplitude, the pressure and temperature at the maxima of density prove to be noticeably greater than their average values. The velocity of sound at these maxima is also greater.
Figure 1. Evolution of a sound wave: density profiles at four successive instants. The arrows show the direction of wave propagation.
For this reason, the crests of the wave propagate through the medium faster than the wave as a whole. Similarly, the velocity of sound at the minima of density is less than the average velocity. Hence the troughs move slower than the whole wave, and the crests tend to overtake the troughs. When a crest gets closer to a trough, the layer of the density drop becomes narrower and the wave front becomes steeper. If the crest could catch up with and overtake the trough, the wave front would turn over like waves in the sea. This causes background noise in communication and confusion in perception.
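The steepening described above can be sketched kinematically: if each point of the profile is assumed to move at the base sound speed plus a term proportional to its amplitude, the crest visibly closes on the trough ahead of it. The speed values below are assumptions for illustration only.

```python
# A minimal kinematic sketch: crests (amplitude +1) outrun troughs
# (amplitude -1), so the gap between them shrinks and the front steepens.
import numpy as np

c0, k = 340.0, 50.0                      # base sound speed; assumed amplitude gain
x = np.linspace(0.0, 1.0, 9)             # sample points along one wavelength
amp = np.sin(2 * np.pi * x)              # initial density perturbation
crest, trough = np.argmax(amp), np.argmin(amp)
for t in (0.0, 0.001, 0.002, 0.003):     # four successive instants (cf. Figure 1)
    x_t = x + (c0 + k * amp) * t         # each point moves at c0 plus an amplitude term
    print(f"t={t:.3f} s  gap to the trough ahead: {x_t[trough] - x_t[crest]:.2f}")
```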
MECHANISM OF PERCEPTION:
To understand or solve something is to predict its behavior in a given situation, such that the prediction matches observed behavior. Something makes meaning only if its description remains invariant under multiple perceptions or measurements under similar conditions through a proper measurement system. In communication, as in perception, it is the class or form that remains invariant as a concept. The sequence of sounds in a word or signal ceases to exist, but the meaning remains as a concept. In Nature, the same atoms (or numbers signifying objects) may combine differently to produce different objects. The concept arising out of each combination acquires a name (word, message) that remains invariant through all material changes, and even when the objects cease to exist.
This also defines reality. Reality must be invariant under similar conditions at all times. The validity of a physical theory is judged by its correspondence to reality. In a mirage, what one sees is a visual misrepresentation caused by differential air density due to a temperature gradient. All invariant information consistent with physical laws, i.e., the effects of distance, angle, temperature, etc., is real. Since the perception of a mirage is not invariant from different distances, it is not real.
The concept of measurement changed with the problem of measuring the length of a moving rod. The two possibilities suggested by Einstein were either to move with the rod and measure its length, or to photograph the two ends of the moving rod and measure the length against a scale in the rest frame. However, the second method, advocated by Einstein, is faulty: if the length of the rod is small or the velocity is small, then length contraction will not be perceptible according to his formula; if the length of the rod is big or the velocity is comparable to that of light, then light from different points of the rod will take different times to reach the recording device, and the picture we get will be distorted by different Doppler shifts.
Length contraction is only apparent from the stationary frame and cannot be real for the moving frame. What the man on the platform sees cannot affect the train. The passenger on the train will not notice any length contraction. However, time dilation is real in a different sense. All experiments conducted to prove time dilation are defective. Data from the first experiment, available in US naval archives, prove that it was fudged. Time dilation has meaning only in relative terms of cyclic evolutionary sequences. The evolutionary cycles are different for different categories, or for different species of the same category. Their evolution over universal time (Einstein’s clock at c) can lead to comparative time dilation.
In communication, length contraction or time dilation has no direct bearing on the final outcome. Yet the individual letters in a word, or the individual words in a sentence, submerge their sovereignty into the final meaning. Further, the same concept can be communicated using long or short words or sentences that take different times to pronounce or write. When the compiler translates the code into assembly language, the assembler converts the assembly language into machine code, and the computer executes it as a series of ‘on’s and ‘off’s, the effect of these concepts is evident.
Writing a code means writing a bunch of relatively simple instructions and letting the computer run millions of instructions in a second. Individually, each line of code does very little. The programmer focuses not only on what the end product looks like, but also on how each little piece runs, and then writes all of the little lines of code that enable the whole program to run. Finally, the program objective is broken up into different chunks. Only the chunk that is needed is worked on at a time; those that are not needed are pushed off to be done at a different time, as in the sketch below. This enables writing more complex code while keeping it readable and easy to program.
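A toy Python sketch of this chunking, with a made-up objective (greeting a user), might look like the following; each chunk does very little on its own, and the program is their combination.

```python
# Each function is one "chunk" of the program objective.
def read_name():                     # chunk 1: input (a stand-in for real input)
    return "Ada"

def format_greeting(name):           # chunk 2: processing
    return f"Hello, {name}!"

def show(message):                   # chunk 3: output
    print(message)

show(format_greeting(read_name()))   # the whole program is the chunks combined
```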
The inherent uncertainty induced by the environment necessitates error-correcting codes. This is done by introducing redundancy into the digital representation to protect against corruption (syntax error). Compilation of information (pool) is bound by physical rules, and not all combinations are permitted (eigenvalues). Inside an atom, the number of neutrons cannot exceed a specific ratio. This is what differentiates the wakeful state from the dream state, where, in the absence of external stimuli, no such restrictions (compiler) apply to the stored information in memory. Hence valid source coding is necessary.
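The simplest concrete instance of such redundancy is a 3-fold repetition code, sketched below in Python; it is only an illustration of the principle, not the coding scheme discussed by Shannon.

```python
# Each bit is sent three times; a majority vote recovers it
# as long as at most one of the three copies is corrupted.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(received):
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)   # majority vote
    return out

sent = encode([1, 0, 1])    # -> [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[1] = 0                 # the channel flips one copy
print(decode(sent))         # -> [1, 0, 1]: the error is corrected
```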
In the mechanism of perception, each sense organ perceives a different kind of impulse related to the fundamental forces of Nature. Eyes see by comparing the electromagnetic field set up by the object with that of the electrons in our cornea, which serves as the unit. Thus, we cannot see in total darkness, because there is nothing comparable to this unit. The tongue perceives when the object dissolves in the mouth, which is the macro equivalent of the weak nuclear interaction. The nose perceives when the finer parts of an object are brought into close contact with the smell buds, which is the macro equivalent of the strong nuclear interaction. The skin perceives when there is motion, which is the macro equivalent of the gravitational interaction. Individually, these perceptions have no meaning. They become information and acquire meaning only when they are pooled in our memory.
In the perception “this (object) is like that (the concept)”, one can describe “that” only if one has perceived it earlier. Perception requires prior measurement of multiple aspects or fields and storage of the results of measurement in a centralized system (memory), to be retrieved when needed. To understand a certain aspect, we just refer to the data bank and see whether it matches any of the previous readings or not. The answer is either yes or no. Number is a perceived property of all substances by which we differentiate between similars. Hence numbers are most suited for describing messages concerning everything. Since higher or lower numbers are perceived in a sequence, one at a time, a count can be accumulated or reduced by one at each step, making it equivalent to a binary system.
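A minimal sketch of this yes/no lookup, with a hypothetical data bank of stored concepts, could read as follows.

```python
# A new reading either matches a previous reading (1, yes) or not (0, no).
data_bank = {"pen", "book", "lamp"}    # hypothetical stored concepts

def perceive(reading):
    return 1 if reading in data_bank else 0

print(perceive("pen"))     # 1: "it is like that"
print(perceive("quill"))   # 0: no stored concept matches
```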
The probabilities on which Shannon based his theory came from the objective counting of relative frequencies of definite outcomes. Physics uses an elementary, indivisible entity - the quantum - defined by the act of its observation, to build everything. So does information theory. Its quantum is the binary unit, or bit, which is a message representing one of two choices: 1 or 0, on or off, yes or no. The ‘on’s are coded (written in a programming language) with 1 and the ‘off’s with 0.
CLASSIFICATION OF INFORMATION:
Information is specific data reporting the state of something based on observation (measurements), organized and summarized for a purpose within a context that gives it meaning and relevance, and it can lead to either an increase in understanding or a decrease in uncertainty. Information is not tied to one’s specific knowledge of how particles are created and their early interactions, just as the concepts signifying objects are not known to all. But it should be tied to universal and widely accessible properties. Information theory tries to make the concepts opaque to the less privileged. Two widely used theories are Shannon’s mathematical theory and Chaitin’s algorithmic theory.
Chaitin used a version of Gödel’s incompleteness theorem. Using an information-theoretic approach based on the size of computer programs, he found regions in which mathematical truth has no discernible structure or pattern and appears to be completely random. Hence he used statistical laws to build computable strategies. We dispute his undecidability theorems, which equate the very small with zero, but we are not discussing that now.
Shannon dealt with channel capacity and the noisy-channel coding theorem, digital representation instead of the electromagnetic waveform, efficiency of representation through source coding (data compression), and entropy and information content. The channel capacity can be approached by using appropriate encoding and decoding systems. The noisy-channel coding theorem gave rise to the entire field of error-correcting codes, by introducing redundancy into the digital representation to protect against corruption below a certain threshold - the Shannon limit. Digital representation ensured that once data was represented digitally, it could be regenerated and transmitted with minimal error. Source coding removed redundancy in the information to make the message compact. Shannon showed that information could be sent using high power and low bandwidth (brevity), or high bandwidth and low power (expressiveness).
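This trade-off follows from the Shannon-Hartley law, C = B log₂(1 + S/N). The sketch below evaluates it for two illustrative configurations; the bandwidth and signal-to-noise figures are assumptions chosen to make the capacities comparable.

```python
# Shannon-Hartley: capacity C = B * log2(1 + S/N), in bits per second.
from math import log2

def capacity(bandwidth_hz, snr):
    return bandwidth_hz * log2(1 + snr)

print(capacity(3_000, 1000.0))   # ~29,900 bit/s: low bandwidth, high power
print(capacity(300_000, 0.07))   # ~29,300 bit/s: high bandwidth, low power
```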
Traditional low-bandwidth radios focused all their power into a small range of frequencies. With an increasing number of users, the number of channels used increased, and so did the interference. Since too much power was confined to a small portion of the spectrum, even a single interfering signal in the frequency range could disrupt the communication. Shannon redefined the relationship between information, noise and power. He quantified the amount of information in a signal as the amount of unexpected data the message contains. He called the information content of a message ‘entropy’, or uncertainty.
In digital communication, a stream of unexpected bits is just random noise. Shannon showed that the more a transmission resembles random noise (in the common usage of the term), the more information it can hold, as long as it is modulated to an appropriate carrier - a low-entropy carrier can carry a high-entropy message. He could send a message with low power over a high bandwidth by spreading its power across a wide band of frequencies. One problem with his model, which differentiates it from intelligent models like perception, is that it does not consider the importance or meaning of a message, which concerns the quality of data.
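The first point, that a noise-like stream holds more information per symbol, can be seen by comparing the per-symbol entropy of a repetitive stream with that of a random-looking one, as in this sketch.

```python
# Per-symbol Shannon entropy of a bit stream: predictable streams
# score 0, noise-like streams approach 1 bit per symbol.
from collections import Counter
from math import log2

def entropy_per_symbol(stream):
    n = len(stream)
    return -sum(c / n * log2(c / n) for c in Counter(stream).values())

print(entropy_per_symbol("00000000"))   # 0.0: fully predictable
print(entropy_per_symbol("01101001"))   # 1.0: looks like random noise
```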
The second category of information is factual information, which is the content. Shannon information involves messages represented by symbols (usually binary numbers) and the probabilities of their being chosen. But what is the content of the messages? Not all sounds convey a message. When we say “pen” - a sound symbolizing three letters (symbols) arranged in a particular pattern - what is the content of the message for the receiver? To someone who cannot hear, does not know English, has not seen a pen, or knows the pen by some other name, the word “pen” or the object does not make any sense. Only if he has come across this word earlier and has learned to relate the sound to the object can he think: “It is like the one I had seen earlier, which was called a pen. Hence it is a pen.” Thus the actual content of any word is the concept of a known object.
Particular structured configurations of letters are words that convey a fixed meaning. The binary symbols only confirm or deny whether the perception of the object matches the concept or not, but not the probabilities of their sequence or occurrence. Shannon followed the Morse code, which worked on the probabilities of sequence and occurrence. Several words can be formed from the same set of letters. A particular configuration conveys a particular meaning. The same object may be associated with different words by different receivers. The same word may convey different meanings in different contexts. The probability of any specific word being chosen over others rests with the understanding level of the receiver and depends upon the environment. This is determined by experience, not by uncertainty.
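The Morse principle referred to above can be made concrete with a small sketch: assigning the shortest codes to the most frequent symbols lowers the expected message length. The frequencies and code lengths below are assumptions for illustration.

```python
# Expected code length under two assignments of the same code lengths.
freq = {"e": 0.7, "t": 0.2, "q": 0.1}   # assumed symbol frequencies
good = {"e": 1, "t": 2, "q": 4}         # short codes for frequent symbols
bad = {"e": 4, "t": 2, "q": 1}          # the same lengths, misassigned

def expected_length(lengths):
    return sum(freq[s] * lengths[s] for s in freq)

print(expected_length(good))   # 1.5 units per symbol
print(expected_length(bad))    # 3.3 units per symbol: same alphabet, worse code
```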
Just as the result of measurement is preserved for future requirements, the fixed meaning assigned to words (concepts) is also preserved in Nature. For example, inertia of motion starts after an impulse, which makes a body move in a field, imparting energy to it at the point of contact. It gradually diminishes when other opposing force components act upon it. Since energy cannot be destroyed, the energy of the body in motion is transferred to others. Similarly, when we perceive something due to an impulse, it starts an inertia in our mind, generating a chain of thoughts drawing from memory. This chain ceases if we get the object of our desire, or come to know all about it, or experience some pain due to another, stronger impulse. In this process, the energy that generated the thought is transferred to the field that gave rise to the perception.
Our thoughts consist of words with etymological or fixed meanings (variables and constants), which are preserved in Nature. Thus, across different cultures, we find similarities in the words signifying similar concepts like mother, father, brother, five, seven, etc. When perceptions of such words (sounds) or symbols (visuals) are mixed (array) with the perception of the object, the message is complete. When we see a person singing in a TV program, we perceive it as “the person my eyes see is the person whose songs my ears hear”. This is due to the mixing of the two different perceptions in our brain.
The third category of information is intrinsic semantic information. Semantic means pertaining to the different meanings of words or other symbols. It relates to a configuration that carries intrinsic information, in the sense that different persons can, in principle, deduce the law or process that explains the observed structure. Here, the grammatical meaning has been discarded fully or partially for a different meaning, because of similarities associated with the concept. It is popular usage, or special programs, that may or may not follow general logic, but follow a law of their own.
The fourth category is the control statement. The computer is programmed to go straight from the first line of code to the last. But if we want it to run some code only under some conditions, a control statement transfers control to a different line of code instead of the next. We can even put one control statement inside another, or use pseudo-code.
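A minimal sketch of such a control statement, including one placed inside another, is given below; the reading and the thresholds are arbitrary.

```python
# Control passes to a different line of code, instead of the next,
# depending on the condition; one statement nests inside another.
reading = 42                        # a hypothetical input

if reading > 0:
    if reading % 2 == 0:            # a nested control statement
        print("positive and even")
    else:
        print("positive and odd")
else:
    print("not positive")
```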
The last category belongs
to some axiomatic postulates (operations and operands) that are accepted as
evidently proven (primitive) in communication. This is essential for
programming. For example, unless we accept the concept of numbers and the
binary system as self-evident truths, we cannot start writing a program.
Classically, we used grammar,
dictionary, synonyms, public usage and adages respectively for these categories
of information.
WHAT HAPPENS IN NATURE?
According to the Church-Turing principle, every piece of physical reality can be perfectly simulated by a quantum computer. But there is a difference between Reality and its simulation. Formulating a theory of the observed (or potentially observable) events means building up a network of input-output connections between them. In a causal theory, these connections are causal links. In computer-programming language, the events are the subroutines and the causal links are the registers where information is written and read. In physical terms, the links are the systems and the events are the transformations. The computer does not function naturally; we design and write the algorithm for the computer to follow. Hence it will be a creature of our ideas and limitations - GIGO. The notion of information cannot become the new big paradigm for Physics.
Wheeler’s delayed-choice experiment is a variation of the two-slit experiment, in which the experimenter decides whether to leave both slits open or to close one off after the electrons have already passed through the barrier. The electrons are said to know in advance how the physicist will choose to observe them. This experiment was carried out in the early 1990s and is said to confirm Wheeler’s prediction. But has anyone ever tried the experiment with protons and neutrons, which are also quantum particles?
While conducting experiments, most people exclude the properties of the measuring instrument that affect the outcome. If you throw a pebble onto the surface of a pond, the pebble goes down, but the waves spread perpendicular to its direction. If we throw another pebble a little away from the first, it will likewise sink, but the new waves on the water will show an interference pattern. Something similar happens in the two-slit experiment. A moving electron generates a magnetic field that moves perpendicular to it. We have conducted some experiments in a water body with separated channels. An interference pattern was seen when the waves had access to both channels, but not when one channel was blocked. Our experience also links the interference pattern to the distance of the barrier from the slits. An experiment using protons and neutrons will show similar behavior.
In a recent two-slit experiment, a gold-coated silicon membrane with two slits, each 62 nm wide and 4 μm long, with a separation of 272 nm, was used. To block one slit at a time, a tiny mask controlled by a piezoelectric actuator was slid back and forth across the double slits. The piezoelectric effect is the generation of an electric charge in certain non-conducting materials, such as quartz crystals and ceramics, when they are subjected to mechanical stress. Piezoelectric materials exposed to a fairly constant electric field tend to vibrate at a precise frequency with very little variation. Since the electrons were created at a tungsten filament, accelerated across 600 V and collimated into a beam, and the intensity of the electron source was set so low that only about one electron per second was detected, the mask that controlled the barrier was interfering with the results. Hence the entire setup is faulty.
To date, no one has described what an electron is. To understand it, we have to look at the Solar system as reported by Voyager 1. The solar radiation moves out in all directions, gradually reducing in energy with the passage of different planetary orbits, to face resistance at the termination shock. After passing through the heliosheath and heliopause, it meets a transition region before crossing the heliosphere. In an atom, these are equivalent to the electron orbits. The electrons are the locus of the nucleic radiation at the resistance points of the nucleic field, confined by the negative charge band of the field. Thus, they behave like waves in the sea until they hit the shore. When they pass, or are directed, through only one slit, they show one pattern. When the two slits are open, if the distances on both sides are right, they show an interference pattern. Elsewhere we have explained entanglement with macro examples. There is no quantum weirdness.
How do atoms get instructions about the laws they must obey? Density variation in the field generates different strings that are revealed as the fundamental forces of Nature, just as sequences of letters create words with specific concepts. Most of the “instructions” are really interactions (mechanical reactions, or reactions induced by a conscious agent, which again reduce to mechanical reactions). The second law of thermodynamics proves this. The information lives in the Universe as the physical counterpart of a background structure, maintaining a state of equilibrium. When disturbed, the tendency to maintain equilibrium generates two complementary forces: inertia of motion and inertia of restoration (elasticity). These forces can act linearly, or through a point in the field at equilibrium. This creates non-linear behavior that leads to different confinements, which are experienced as the different forces of Nature.
What is the substrate within which space and time are encoded in this description? Matter itself is patterns of fields in space and time. Particles are nothing but locally confined fields of different densities. Both space and time are related to the order of arrangement in the field, i.e., the sequence of objects and of the changes in them (events) as they evolve. The interval between objects is space, and that between events is time. Both space and time co-exist like the fundamental forces of Nature. Similarly, sequential arrangements of letters form words with different concepts conveying fixed meanings.
Application of force can be of two types: application by a conscious agent, or perpetual application of mechanical force, i.e., temporal evolution. Knowledge is the initial condition for the application of force by a conscious agent. The incompleteness of our knowledge brings in an instability that generates the inertia of motion and the inertia of restoration, inducing the conscious agent to apply force. The reaction, when compared with previous data, becomes knowledge - real or imaginary. Since the result of measurement is fixed, knowledge is in a state of equilibrium. There is a continual pressure, starting from the creation event, to achieve complete knowledge. If we could have full knowledge, there would be no inertia of motion or restoration - hence no application of force, no measurement, no perception and no knowledge to describe anything. This incompleteness propels creation.
In the case of perpetual motion, which forms the background in which all natural information is stored and physical objects evolve, we cannot control it, as it does not interact, but we can have indirect knowledge about it. The Big Bang did not imply creation ex nihilo. If there were no background, what would the Universe be expanding into? When we deny the existence of something, we only deny its physical existence at here-now. We cannot deny the existence of the very concept.
Quantum states give only probabilities, which are determined by observation. The probability is related to the observer’s inability to control the environment, not to the way the quantum world behaves. Virtual particles pop out and vanish following the general rules of momentum transfer. The hidden variables are the characteristics of the background structure on which the fields rest. These are equivalent to the etymological and fixed meanings of words. A field, which is a continuum, cannot exist in a void - nothingness. Even interacting quantum systems - a quantum computer - need a base to exist.
Cosmologists count the number of superclusters of galaxies in volumes of 300 Mpc or more in size to find their average concentration in space. Knowing galactic masses, they estimate the average density of matter in such volumes. This density is the same, about 3 × 10⁻³¹ g/cm³, or about one hydrogen atom per 30 m³, wherever we take a volume in space. Can they exist in a void? The huge range of temperatures found in different locations in the Universe, and the consistency of the background radiation at 2.73 K, show the universal background structure. This is the real dark matter. We have discussed the galaxy rotation problem elsewhere. The galactic clusters spinning around a center appear to be temporarily moving away, just as planets sometimes appear to move away from each other only to close in later. The galaxies are not receding due to dark energy, as the effect is not seen on galactic scales or less. They are not expanding. Energy is perceived only through its interactions; hence it cannot be dark (non-interacting). Similarly, information cannot be dark (without answers). It shines in full glory, blinding us. We should have the eyes to see it.