No one knows how nature constructs information, or even how it processes it. Yet we humans have created a giant resource of scientific formulations. We feel that science would never have attained the degree of complexity it has, had we known the true form of information in nature: what information actually looks like, and how it integrates. And it is not just scientific theories; we often talk about data overflow. What is the origin of this Big Data? Analysts repeatedly argue that we add a world's worth of knowledge every six months, but they count bits, and a “bit” does not quantify knowledge. Our perception is that because we did not measure information properly, it grew in an uncontrolled manner, like a cancer. What was that mistake?
While experimenting with biomaterials, we have seen that nature encodes information as a changing 3D geometric shape encapsulated inside a sphere. If necessary, it adds new geometric shapes inside the corner points of that shape. Driven by the Turing philosophy, existing scientific formulations read any system's information as a linear string of events. Because nature uses topology in a scale-free manner, linearization works at a limited scale, but not once the topology becomes complex. We outline here ten reasons why linearization is fatal. Note that science expressed in terms of topology is a new way of seeing the same science; this is not a rejection of existing science, but a replacement of the old working culture of: (i) adopting basic assumptions, (ii) carrying out derivations with a close eye on producing the expression that would explain the result, (iii) making expressions as complex as necessary to fit the experimental results accurately, and (iv) correcting the basic assumptions only if major changes are required.
- Linear observation is not real; it superimposes two worlds, one below and one above: When a topology is linearized, we assume that the corners of that topology are static and absolute, neither changing nor interacting. In reality, every point has a topology inside it and is itself part of a higher topology. The higher and lower topologies regulate it; any topology we observe is a combined effect of the two, so the observed reality is an illusion. When we measure and linearize, we lose the cause and read only an effect, which contains no information about nature.
- “Bits” should be replaced by elementary topological structures: When we linearize an event that happened in nature and then integrate it linearly, we assume that every single event in nature is fundamentally identical, made of switching between “yes” and “no”. It is highly unlikely that everything is identical at the elementary level. Physics teaches us that there is a fundamental set of topological symmetries (the number of distinct polyhedra we can create is not infinite), and those symmetries now become the units of information.
- The two-body to many-body transition is fatal; we need a topological morphing model to replace it: In creating the information architecture of science, we take a stream of bits, but afterward we use black-box methods to reconstruct scientific models. In any physics theory, we assume that only two elementary particles effectively exist in the universe and that the rest disappear. This is a very non-physical, unrealistic situation that invites human bias to a large extent. For simpler events imagination works, but for complex events it does not, so human bias takes over, different scientists argue, and debates continue for centuries. When we recognize that topological structures, not a stream of bits, are the foundation of information, we make a grand change, because it demands nothing less than a topological version of all the mathematical theories that thousands of scientists developed over centuries. In our 2010 Nature Physics work we argued for creating a new kind of science using topology; now we argue that 3D topology could represent the same theories of science in significantly simpler formulations. For streamed events, equations containing infinite series often yield non-physical, unrealistic results.
- If directionality is lost, it is like losing the spin of an electron, or worse, losing the composition of spins in a spin foam: Topological information is directional; when it is converted into a linear stream of pulses, no point in the resulting data reflects any part of the information completely. Linearization is fractally incomplete: select any point, any part of a linear record, and it is incomplete; we can keep zooming in continuously, and everything remains incomplete.
- No culture of justifying a basic law by fitting experimental observations; derive everything from the pattern of primes: Topological integration follows the pattern of the number system; it does not use fitting anywhere. So this science is born not from elementary basic laws but from a metric of primes. This is one of the most fundamental principles of the new science we want to build. Here, instead of mathematical formulations, twelve different metrics regulate the structural transformation of topologies. A linear stream of data is dead; it gives a user enormous freedom to twist and play with the reality of the system and to explore wild imagination. We do not have that freedom.
- Topology grows inside and above, and it always follows an undefined route: There is no differentiation anywhere, so linearization cannot model it. Imagine a point whose properties are defined by a topology inside it that is structurally evolving under multiple clocks. Obviously, the point has no fixed property. We can replicate or morph its behavior as closely as we like, but never quite accurately.
- The sequence seen in a burst may not be the real sequence of a topology: The bursts that make up a linear string of events could originate from different corners of a topology; the observed order could be an effect of temporal locking between the system and the measuring device.
- Self-resonance and self-oscillation always require additional components, such as distinct forces, but in a topology they do not: Our school textbooks taught that when the second derivative of position is proportional to the negative of the position, we get a pendulum. A derivative is a 90-degree phase regulator, but a topology is an analogue phase regulator, so systems within and above can reflect energy and oscillate periodically in a fractal system. In a linear system we need two perpendicular forces and a topological constraint; linear systems are therefore not robust enough to accommodate oscillations.
- The science of equations always follows a singular path; quantum theory has infinitely many paths but does not detail their specifics: When a topological architecture details a scientific formulation, it tells the topology of the paths at every level; these beautiful pathways are missing entirely from linearized science.
- The topology inside a single point holds another topology by interference in open space: The 3D density of states of a fractal topology vibrates and changes its configuration; this is where the phase information is located. Linearization probes only the single-point burst, so it obtains totally different information from what one would get by measuring topological information topologically.
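Several of the reasons above invoke the textbook pendulum: a linear system oscillates only under the explicit constraint that acceleration is proportional to the negative of the position. That constraint can be checked numerically. A minimal sketch, entirely our own illustration (simple harmonic motion integrated with a leapfrog scheme; function names are hypothetical):

```python
import math

def simulate_shm(omega=2.0, dt=1e-4, steps=200000):
    """Integrate x'' = -omega^2 * x with a leapfrog scheme and
    estimate the oscillation period from successive zero crossings."""
    x, v = 1.0, 0.0          # start displaced, at rest
    crossings = []
    for i in range(steps):
        # leapfrog: half-kick, drift, half-kick
        v += 0.5 * dt * (-omega**2 * x)
        x_new = x + dt * v
        v += 0.5 * dt * (-omega**2 * x_new)
        if x > 0 >= x_new:   # downward zero crossing
            crossings.append(i * dt)
        x = x_new
    # period = spacing between successive downward crossings
    return crossings[1] - crossings[0]

period = simulate_shm()
print(period, 2 * math.pi / 2.0)  # measured vs. theoretical (2*pi/omega)
```

Remove the minus sign from the restoring term and the motion diverges instead of oscillating, which is the point being made: the oscillation is imposed by the constraint, not intrinsic to the linear description.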
While developing the technology for the new-age computer we encountered a severe problem. At a GHz capture rate we generate 10^9 bits of data per second; at THz, 10^12 bits per second; and finally, for PHz or laser-induced capture, 10^15 bits per second. How do we manage them? We simply can't; no existing science or technology can. Of course we have petaflop computers, but those exist mostly to show that we have them. When the time comes to look into various data sets at those speeds, anyone who has operated a petaflop machine knows very well that operating simultaneously at GHz, THz, and PHz is just impossible: we would have to transfer the data at the same rate, synchronize streams with different clocking speeds, and then find useful relations among them. By the time we understood its intelligence, trillions of new bits would be waiting for us, asking “Hi, mister.”
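The figures quoted above are plain arithmetic, and it is worth seeing how quickly they run away. A minimal sketch; the one-hour window and the decimal petabyte convention are our own assumptions:

```python
# Bits generated per second at each capture rate (one bit per cycle assumed)
rates_hz = {"GHz": 10**9, "THz": 10**12, "PHz": 10**15}

for name, rate in rates_hz.items():
    bits_per_hour = rate * 3600
    petabytes = bits_per_hour / 8 / 10**15  # 1 PB = 10^15 bytes (decimal)
    print(f"{name}: {bits_per_hour:.1e} bits/hour = {petabytes:g} PB/hour")
```

At PHz rates a single hour of capture already amounts to hundreds of petabytes, which is the management problem described above.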
I had rigorous conversations with our chief scientist Martin at IIoIR (www.iioir.org). Addressing his concerns, I told him: look, we will not take any data as points; we will add them to a topology, and modify a topology made of a few bits. Then you won't have to worry about these huge data sets. In the 1800s there was a movement in India, “go back to the Vedas”: in the face of massive abuse from the British-Christian invasion, the route was to go back to the basics. Even now, the route is to go back to the cavemen and find inspiration in geometry and topology.
It is easy to say that we will take a topology and start building on it, but doing it in reality is extremely difficult. If we do not know the skeleton of the information architecture of the universe, we cannot start anywhere. This is why we need a phase prime metric: just as astrophysics used a metric to build the space-time correlations of the universe, we need to build a similar metric of patterns, or topologies, to which we can add.
When we look at the patterns produced at the femtosecond time scale, we do not need to see how the pattern would look at the picosecond time scale. We might need to detail some of the patterns we observe over hours (microhertz); then we can look at some of their important topology at the seconds scale; the system zooms in and finds some interesting topology at the microsecond scale, then finds some more. The journey goes on and on, down to the femtosecond time scale. But who decides what is interesting? Two factors: the first is memory, and the second is the phase prime metric, which contains a map of all possible uncertainties.
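The zooming procedure just described, keeping a coarse view and descending only into windows flagged as interesting, can be sketched as a recursive drill-down across time scales. Everything here (the variance-based "interest" score, the branching factor) is an illustrative stand-in of our own, not the phase prime metric itself:

```python
def interest(window):
    """Illustrative stand-in for the 'who decides the interest' step:
    score a window by its spread (variance)."""
    mean = sum(window) / len(window)
    return sum((x - mean) ** 2 for x in window) / len(window)

def drill_down(signal, levels=3, branch=4):
    """At each time scale, split the signal into `branch` windows and
    recurse only into the most interesting one."""
    path = []
    segment = signal
    for _ in range(levels):
        size = len(segment) // branch
        if size == 0:
            break
        windows = [segment[i * size:(i + 1) * size] for i in range(branch)]
        best = max(range(branch), key=lambda i: interest(windows[i]))
        path.append(best)  # which window we zoomed into at this scale
        segment = windows[best]
    return path, segment

# A flat signal with one 'busy' burst hidden in the last quarter
sig = [0.0] * 768 + [(-1.0) ** k for k in range(256)]
path, found = drill_down(sig, levels=2)
print(path)  # the zoom path across scales
```

Only a handful of windows are ever inspected in full detail, which is the "do not see all the data" strategy of the next paragraph.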
So we are going to do just that in conventional electronics hardware. We do not look at all the data, because that is impossible; instead, inspired by the universal links between topologies, we set about building a multi-time-scale topology of our own. Who knows, you may be creating one in your brain at this very moment.
H=Hierarchical (higher level)
H=Heuristic (without programming)
Then, we can:
(i) Search a massive database without searching (spontaneous reply).
(ii) Multiple nested clocks, one inside another, enable virtually instant decision making.
(iii) No programming is required, as cycles self-assemble and disassemble for better sync at all possible time scales simultaneously.
(iv) “Phase space” keeps the “volume intact”: adding resources only increases phase density, not real space.
(v) Perpetual spontaneous editing of slower time cycles (creation/destruction/defragmentation) “prepares for the unknown” = higher-level learning.
(vi) We introduce “fractal resolution”: a complex signal's slowest and fastest time-scale components are absorbed simultaneously, and during expansion the fractal seed delivers the full output from a seed of information (a drastic shrinking of data).
(vii) The superposition of a million simultaneously operating paths assembles into a sphere, enabling “extreme parallelism”. In quantum computing there is only one Bloch sphere; here it is a sphere inside a sphere inside a sphere…
(viii) A time cycle is memory and rotation along the cycle is processing; they are the same event, so “no transport is needed between memory and processing units”: no wiring.
(ix) No logic gates and no reduction of choices, which ensures that “speed” is irrelevant.
(x) All sensory information is converted into one geometric language, which allows “perception”: a yellow color could have a taste. Perception is not programming, as is wrongly believed.
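Feature (ii), the nested clocks, has a compact arithmetic reading: clocks with pairwise coprime periods, one inside another, address p1*p2*p3 distinct instants while storing only p1 + p2 + p3 clock states (a consequence of the Chinese remainder theorem). A minimal sketch; the specific periods are illustrative assumptions of ours:

```python
from math import gcd, prod

def nested_clock_phases(t, periods):
    """Read the phase of each nested clock at global time t."""
    return tuple(t % p for p in periods)

periods = (7, 11, 13)  # pairwise coprime, so the phase tuples are unique
assert all(gcd(a, b) == 1 for a in periods for b in periods if a != b)

span = prod(periods)   # 1001 distinct instants addressable
phases = {nested_clock_phases(t, periods) for t in range(span)}
print(len(phases), "instants from", sum(periods), "clock states")
```

Three small clocks distinguish over a thousand instants, which is one way to read the claim that nesting yields "a virtual instant decision making" without large resources.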
We are often misunderstood as not making a real computer; rather, we are making a user, because there is no user once the computer is constructed: the computer runs by itself. But then, who is the original driver? The metric of primes, created by us exclusively for the computer, runs the show inside its core hardware. What is a metric? And why would a metric have the power to do such remarkable things?
When we work on building a metric, as in the good old days of astrophysics, we become as religious as the Turing machine. In astrophysics, theoreticians used a space-time metric: while doing complex math, students would refer to the metric from time to time and retrieve all the essential data needed to solve planetary problems. Similarly, for artificial intelligence we have introduced a new metric of primes. The idea is to hack nature and build a computer that can generate most of the patterns we see in nature, so that the unknown becomes known. How best to build an effective prime metric architecture will remain a matter of investigation for a long time, but the concept of using a prime metric as the prime decision-maker is a new one altogether.
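We can only guess at what a lookup into such a metric of primes might look like in practice. One illustrative reading, entirely our assumption here, is a precomputed table keyed by the integers, where each entry records how the integer decomposes into factors, and decisions are made by table lookup rather than by derivation:

```python
def divisors(n):
    """All divisors of n (the simplest decomposition record)."""
    return [d for d in range(1, n + 1) if n % d == 0]

# Precompute the 'metric': integer -> its decomposition record
metric = {n: divisors(n) for n in range(1, 49)}

def is_prime_entry(n):
    # Primes are the entries with exactly two divisors: 1 and n itself
    return len(metric[n]) == 2

print([n for n in range(2, 20) if is_prime_entry(n)])
```

The table, not a formula, answers the query; that lookup-first style is the "refer to the metric and retrieve" culture described above.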
The existing information theory is based on the idea of the known. Now we have introduced a new information theory, FIT (Fractal Information Theory), in which we provide tools to bridge two known domains through an unknown path. This is an important change from the information theory that has stood for the last century.
What is the trick by which we come to know the unknown? We can do it if we build a universal metric that holds all possible solutions, just like the space-time metric, which has been used for nearly a century with little modification to keep discovering physical phenomena that were never known before. If we are not surprised that a space-time metric formulated in the 1920s has been delivering new discoveries for a century, we should not be surprised by a similar metric for AI. Of course, this is not a familiar culture in AI, but we feel people will grow accustomed to it through the simple DIY (Do It Yourself) kits we are building now.
Imagine you have two parts of a piece of music, and a kit that combines the two parts by composing a new part in the middle, such that the new music makes sense to your mind. Something similar would hold for handling large data: the kit would generate unseen patterns in the big data. The reason we want a DIY kit is that everyone in the world should get free access to the information revolution we want. This is not about making money, but about transforming the way we live in a world of unpredictability, where economics is worse than astrology and viruses are transforming themselves in patterns for which we do not even have data.
The beauty of our computing is that we get the total picture at once. Then, as time passes and more information arrives, reliability improves: from 50% to 66% to 72% to 76% and onward toward 99%, beyond which no improvement is possible. Absolute reliability is a trademark of the existing computers; for us, the key is “zooming into the unknown” as a function of time and more detailed input.
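The reliability progression above (50% to 66% to 72% to 76%, approaching a 99% ceiling) behaves like an estimate that saturates as evidence accumulates. A minimal sketch, assuming a simple saturating update rule chosen purely for illustration; the gain value is an arbitrary hypothetical:

```python
def reliability(steps, ceiling=0.99, start=0.50, gain=0.35):
    """Each new batch of input closes a fixed fraction of the
    remaining gap to the ceiling; the ceiling is never crossed."""
    r = start
    history = [r]
    for _ in range(steps):
        r += gain * (ceiling - r)
        history.append(r)
    return history

hist = reliability(6)
print([round(100 * r) for r in hist])  # monotone climb toward 99
```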
Our product computer would be a toy that changes one's perspective on this world. It is not just for playing games: if we are in an unknown territory about which we have absolutely no information, our product computer, or user, can instantly provide a good overview with a 66% success rate. Uncharted territories multiply every day with the data explosion. If we humans lack a technology to estimate what lies in the uncharted territory, we can do nothing, and accidents would wreak massive havoc on human society.
- Imagine a virus silently evolving into a dangerous species. Prime metric hardware could perpetually track the development of the virus, perfecting its prediction as it monitors the evolution. Thus it could estimate the terror threat well in advance.
- From the microwave background data of the universe, it could partially estimate the structure of the universe.
- From the massive data flow of the internet, it could find patterns of threats, such as cyber attacks.
- It could monitor and predict climate change in all those cases where future predictions are currently impossible due to complexity.
- It could monitor individual health over years and learn about a person's health crises well in advance. Health problems exclusive to a person could be identified and cared for.
- Economics would become a scientific subject of study, as the computer would build predictive models and perfect them over time.
- Social science and psychology would become scientific subjects, as verifiable, predictive models would exist that could be accepted or rejected by logic.
- General science would get a tool to study the absolute properties of a system rather than a fitted model; thus even scientific studies would get a better cross-check on their conclusions.
- The evolution of life could be tracked scientifically, not just into the past: the future could be predicted as well.
- Life-like machines would arrive, with their own operational lifetimes; after a certain time they would die, just like living systems.
We are starting the design and construction of the artificial brain today, 1st February 2017. I hope that we will be able to achieve: (1) Musiceuticals: curing disease without medicine by applying sets of vibrational frequencies. (2) Increased human sense bandwidth: enhancing consciousness by increasing a human's sensing ability. (3) Significantly halting ageing-related processes. (4) Understanding the language of nature and of every life form, thus understanding nature as it is and talking to every life form. (5) Predicting intelligence where we cannot find any logic, e.g. earthquakes, or beyond-logic intelligence. (6) A science of human behavior, society, and economics: scientifically defining human behaviors and responses that have so far eluded definition, and thus treating brain disorders logically. (7) Simulating beyond the limits of knowledge, extrapolating to domains where no knowledge is available; this is a technology that could generate something it has never encountered, hence we could simulate astrophysics. (8) Noise replacing signal, taking us into the era of ultra-low power. (9) Predicting and scientifically studying evolution (a science of evolution). (10) Harmonic technology: creating a culture and an art of technology that harmonizes with nature, changing the present situation in which every single machine and every processed food is an enemy of nature.
Here are the ten features of the computer that make it unique in the world. (1) It does not have any software program (no software). (2) It runs on white noise; the greater the randomness in the noise, the better. (3) It uses ultra-low power. (4) It runs 24/7, as it evolves its wiring by itself for learning. (5) It never performs a search, yet finds what it seeks (no search). (6) It follows a geometric-clocking language, or the principles of composing music, to do everything. (7) It has a unified, homogeneous fractal hardware for everything; learning changes it. (8) The computer is made of one element only, clocks, and considers only one parameter, phase. (9) It explores singularity and uses fractal mechanics, which has nothing to do with classical or quantum mechanics. (10) No wiring: a completely wireless connection.
SocPros 2016, IEEE conference, Agra, India Dec 2016
The paper describes ten scientific concepts, developed by us over a decade, to understand consciousness.
The first concept is the “colony of the immortals”. The universe is a life inside a life inside a life, from the end of the cosmos down to the smallest scale. Each life is a nest of time cycles (clocks), in which there is a host that lives several orders of time longer than the guests sitting on its perimeter (e.g. a neuron lives much longer than other living cells).
The second concept is the “flute of Krishna”: a network of flutes, kept side by side and one inside another. Each flute is a fractal cavity resonator; so are the neuron and even proteins, which vibrate like a wheel of frequencies.
The third concept is the “big bang balloon”; here we reject Turing and propose our own fractal tape. In the resulting fractal information theory (FIT), which operates by a Geometric Musical Language (GML), the entire universe is a self-assembly of Bloch spheres. Here, the Bloch sphere of a qudit includes a rhythm to expand perpetually.
The fourth concept is the “teardrop of primes”: if the number of tunes that the flute of Krishna described above could play is plotted against the integers, it gives the most fundamental symmetry of the universe. As the numbers increase, it repeats a triplet of triplets, morphing from a teardrop to an ellipsoid.
Fifth, a “chameleon of nested phase”: a fractal network of geometric phase represents everything in the universe: mass, space, time, and all the fundamental constants. Every single force and symmetry of the universe is regulated by the teardrop of primes but implemented by the chameleon of nested phase.
The sixth concept is the “geometries of continued fractions of Brahman”: in all life forms, geometry is a nested cycle of continued fractions. The multiple infinite series governing the universe are evolutions of geometric shapes.
The seventh is the “repentance of morphing”: a nested time cycle always oscillates to settle into the most feasible diameter, a defined clock, but it fails because of the continuous expansion of the information architecture, the self-assembled Bloch spheres (the big bang balloon). Thus an ideal morphing can never be completed.
Eighth, “an imaginary life of three infinities: e, pi, phi and i”. These four infinite series set the grammar of the dynamics of every single system in the universe; they are fundamental to all.
Ninth, “harvesting noise by a magnetic personality”: we have invented a new fourth circuit element, the Hinductor, to mimic a biomaterial's vibrations.
Tenth, “hot dance of proteins in its own Raga”: the absorption band of biomaterials shows how they exploit a triplet-of-triplet band for nested clocks, which dance by themselves to make a pi. The drive to make a pi is the origin of all the rhythms we see in the universe.