Pushpendra Singh and Anirban Bandyopadhyay
If we take a classical computer and replace all of its components with their quantum analogs, will it become a quantum computer? No. It would remain a classical computer. Quantum devices do not make a quantum computer; quantum algorithms do. We must first know how to use quantum features to our advantage in a particular problem. There are a number of unique quantum phenomena, and it is wrong to assume that entanglement-induced collapse and quantum annealing-based minimization are the only two routes to speed. And not just speed: quantum properties emerge in superradiance, where multiple uncorrelated centers become entangled; in birefringence, which can create entangled photons; and in quantum teleportation, which allows exchanging quantum information without breaking entanglement. Many such phenomena exist, and there are other quantum mechanical features from which researchers could devise new ways to exploit the advantages of quantum mechanics. Since quantum computers are designed only to speed up computing, none of these incredible features apart from speed-up has been explored.
Any quantum computing hardware built today faces stern criticism. Logic gate-based architectures need a large number of periodically separated choices as a prerequisite for exploiting the quantum advantage in computing. A purely analog method like quantum annealing can only find global energy minima or map the entire energy profile. Thus, both methods are restricted to limited problems; in reality, both kinds of quantum advantage appear only in narrow applications, and neither provides a time advantage in real computing. Scientists tend to argue that the "instantaneous" feature is not visible because classical processes are also involved in completing a computation.
Builders of quantum computers concede that both logic gate-based computing and analog quantum annealing are limited by the algorithms available to exploit the quantum advantage. When one has to rewrite the entire problem to harvest the quantum advantage on a typical piece of hardware, the applications become restricted. For example, we cannot expect that a logic gate-based quantum computer would only ever be used for the prime factorization of integers; there are other problems a computer needs to solve. Supremacy over classical processing is still a debated term and requires a huge compromise on universality. Building a universal quantum logic gate does not mean we can build a universal quantum computer: think about integrating the computing steps, not about isolated steps. Often, different experiments are performed to show a speed-up over a classical process, with the quantum circuit designed for that single purpose. In large-scale computing, there may be only a few limited-scale operations where entanglement gives a benefit. We are told that quantum means rejecting a large number of unwanted choices at once; what is not told to us is that we can reduce only those choices that are similar. We need to physically address all of the choices using a similar phase gap, similar spin, or some other shared physical entity. Because large sets of choices generally lack such similarity, for most problems a quantum computer would function at the same speed as a classical one. Jozsa and Linden have shown that sharing resources among entangled quantum states is possible when the states have multiple similar features. Thus, only a limited class of problems can deliver a true speed-up advantage.
Therefore, the central problem is that a new computing protocol or algorithm is missing altogether, one that could exploit the advantages of the quantum process irrespective of the problem.
A single device is not the key; think about the large-scale advantage: Whether the hardware is logic gate-based or quantum annealing, what matters is how the advantages of quantum mechanics are used in computing. In logic gate-based quantum computing, a large number of choices with a periodic separation, or a search among similar choices, were considered only after the quantum logic gate was conceived, and it is the quantum Fourier transform (QFT) that delivers the advantage. The terms of the QFT represent choices that are physically self-similar and can therefore be collapsed together, delivering the one that is required. Creating a large number of special points where information can be taken out of the quantum circuit, and where information can be poured in from external sources, only slows the process down. Thus, quantum logic gates are not even the core of the problem. The real concern is that if a problem has only a few similar choices, or several steps whose outputs must be exchanged across multiple different sub-pathways, then gaining a quantum advantage over classical computing becomes a critical, nearly impossible challenge.
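To see why the QFT rewards periodic choices, here is a tiny numerical sketch: a plain matrix QFT in Python applied to a state with period-4 structure over 8 basis states (this is a textbook illustration, not a circuit on real hardware).

```python
import numpy as np

def qft_matrix(n):
    """n x n quantum Fourier transform as a unitary matrix."""
    w = np.exp(2j * np.pi / n)
    j, k = np.meshgrid(np.arange(n), np.arange(n))
    return w ** (j * k) / np.sqrt(n)

# A state whose amplitude is periodic with period 4 over 8 basis states
state = np.zeros(8, dtype=complex)
state[[0, 4]] = 1 / np.sqrt(2)

probs = np.abs(qft_matrix(8) @ state) ** 2
# The QFT concentrates all probability on multiples of 8/4 = 2:
print(np.round(probs, 3))   # 0.25 at indices 0, 2, 4, 6; zero elsewhere
```

If the input choices lack this periodicity, the output probabilities spread out and no single measurement collapses the search, which is exactly the limitation described above.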
For quantum annealing, everything hinges on reaching a global minimum in the energy profile, where the clustering of choices around minima delivers the advantage. If the choices are not periodic, we cannot do much even with a quantum logic gate; similarly, if the choices have one-to-many connections, quantum annealing misses the situation entirely.
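For contrast, the energy-minimization picture can be made concrete with a minimal classical simulated-annealing sketch on a double-well landscape (the landscape, cooling schedule, and parameters are all illustrative assumptions, not a model of any annealing hardware):

```python
import math
import random

def anneal(energy, x0, steps=20000, t0=2.0):
    """Metropolis simulated annealing on a 1D energy landscape."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-3        # linear cooling schedule
        x_new = x + random.uniform(-0.5, 0.5)  # local move
        e_new = energy(x_new)
        # Accept downhill moves always; uphill moves with Boltzmann probability
        if e_new < e or random.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Double well: local minimum near x = +1, global minimum near x = -1
f = lambda x: (x * x - 1) ** 2 + 0.3 * x

random.seed(1)
x_min, e_min = anneal(f, x0=1.0)   # start trapped in the wrong (local) basin
```

The thermal fluctuations let the walker hop the barrier and settle in the global basin; this is the classical baseline that quantum annealing aims to beat by tunneling, and it already shows why one-to-many relations between choices fall outside the single-minimum framing.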
There are three possible criteria for true quantum computer hardware. First, large-scale quantum behavior: making a quantum logic gate is not enough; some feature like "periodic separation" is needed to reduce the choices. Second, suitable fault tolerance: the quantum protocol should survive natural decoherence as the computation progresses. Third, universality in solving problems, the biggest part: the protocol used to harness the quantum advantage must not be limited to a particular class of problems, such as searching among choices or minimizing over them.
A D-Wave quantum computer uses quantum annealing to find the ground state, i.e., the energy minimum, or to map the energy profile. We can do the same thing with a classical computer. Moreover, the quantum wells, their coupling, and the 2D energy-distribution profile are hardware features tuned for a particular class of problems, in which choices are not correlated in one-to-many or many-to-many relations and the temporal evolution of relations is out of the question. A D-Wave machine is not strictly required, as there is no time advantage, even though it is now experimentally established that it exploits quantum effects. A D-Wave machine may sometimes show an improvement in speed over an equivalent classical computer, but criticism mounts over whether there is any real advantage. In summary, logic gate- and annealing-based technologies undoubtedly use quantum mechanics in limited-scale circuits, but scaling up the operations in which quantum mechanics is used remains the key problem. The physical process used to exploit quantum mechanics for a global problem is always a separate physical event from the local circuits, and that protocol is specific to a particular class of problems.
How do we propose to solve these three problems:
1. Fusion of classical thermal annealing and the Hasse diagram to replicate and expand the associated recurring events happening in nature: We use a classical thermal annealing process in which liquid molecules form helical nanowires and a corresponding organic gel made of those nanowires. Classical thermal annealing plays a tremendous role in expanding the composition of symmetries observed in the clock architecture. Expanding the choices and extrapolating the various possibilities is a critical analytic process required in computing, and we do it with a Hasse diagram implemented in the chemical beaker. We have compiled the Hasse diagram and the mathematics of the ordered factors of integers; we call the mathematical structure that we follow in the solution the phase prime metric. There are several advantages to using classical thermal annealing on nanoscale structures.
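The simplest ordered-factor structure behind a Hasse diagram of an integer is its divisor lattice. The sketch below computes the covering relations (the edges one would draw in the diagram) for n = 12; it is a plain mathematical illustration only, and the phase prime metric itself is the authors' construct, not modeled here.

```python
from itertools import combinations

def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

def hasse_edges(n):
    """Covering relations of the divisor lattice of n: (a, b) is an edge
    when a divides b and no other divisor lies strictly between them."""
    ds = divisors(n)
    edges = []
    for a, b in combinations(ds, 2):
        if b % a == 0 and not any(b % c == 0 and c % a == 0
                                  for c in ds if a < c < b):
            edges.append((a, b))
    return edges

print(hasse_edges(12))
# Divisor lattice of 12: 1-2, 1-3, 2-4, 2-6, 3-6, 4-12, 6-12
```

Each path from 1 to n through this diagram is one ordered factorization, which is the sense in which the diagram "expands the choices" latent in a single integer.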
2. Replicate input information into multiple classical annealing beakers to favor a particular geometric symmetry of recurring events in the input data: We can also use several chemical beakers and, in each beaker, expand a particular symmetry and its choices. We could therefore assign a different dimension to each beaker and allow very different aspects of the information structure to unfold.
3. A single photon is given the shape of a 3D geometric hologram: The classical expansion of choices and the chemical quest to find all associated networks of symmetries need to converge. We use quantum optics to read the near-crystal structure formed in the beaker. Conventionally in quantum optics, one uses a single-photon source with a low antibunching value so that all photons in the beam reside in a single state. We convert the single-photon beam into a 3D holographic topology, say a cube, tetrahedron, dodecahedron, or icosahedron, so that the single photon is much more stable than a normal light beam. In light beam-based quantum computing, one uses vector vortex beams, photons with two angular momenta, to compute. We instead use a photon with three angular momenta, so that after the collision of 3D photon structures we get decisive 2D structures.
4. Superradiance, birefringence, quantum cloaking: creation of an astronomically large number of entangled quantum holograms in the H-bonded supramolecular gel: When we send these 3D photon structures through the chemical beaker, a large number of equivalent entangled 3D photon structures form, one from each nanowire, due to superradiance (see below), and birefringence helps in the second generation of quantum photon sources. Earlier researchers used quantum dots; we use the helical nanowires in the gel to produce superradiance. We find quantum cloaking under electromagnetic pumping in the GHz domain (1–25 GHz), which enables hiding some nanowires and amplifying others in the quantum optical projection.
5. Interference of an astronomical number of quantum sources: an orthogonal transformation as an alternative to the quantum logic gate: The resultant 3D shapes made of light are entangled. They overlap, and we can see interference effects in the light structures projected from the beakers. Constructive and destructive interference yield unique topological structures made of light; these are differential structures. The interference of the optical structures actually executes an orthogonal transformation of the interacting structures, and this operation creates the invariant structure of the participating light structures. In Euclidean geometry, orthogonal objects are related by their perpendicularity to one another: lines or line segments that are perpendicular at their point of intersection are said to be related orthogonally, and two vectors are considered orthogonal if they form a 90-degree angle. It means the organic gel could work as a universal logic gate (Kos Ž, Dunkel J. Nematic bits and universal logic gates. Sci Adv. 2022;8(33):eabp8371. doi:10.1126/sciadv.abp8371).
Decomposing an orthogonal matrix into a rotation and a reflection of a single photon is a well-studied subject, and our helical nanowires in the gel carry this out frequently (https://math.stackexchange.com/questions/2388761/decomposing-an-orthogonal-matrix-as-a-rotation-and-reflection-in-mathbb-r4). The single-photon structure that falls on the helical nanowires in the gel is transmitted and refracted by each nanowire. From one single photon, several topologies are born, and many 3D photonic structures carry different symmetry-generated information. From one photon structure, several entangled photon structures arise, owing to the birefringence of the nanowires. In the projection of the single photon after it passes through the local 3D assembly of helical nanowires, the symmetry of the nanowire arrangement and the geometries of the nanowires are all encoded in the photonic structures generated in the transmission and refraction modes.
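The two mathematical ingredients invoked above, interference selecting between choices and an orthogonal transformation preserving invariants, can be shown in a short numerical sketch (purely illustrative toy arrays; none of this models the gel or the photonic structures):

```python
import numpy as np

# Two-beam interference: I(phi) = |1 + e^{i phi}|^2 = 2 + 2 cos(phi)
phi = np.linspace(0, 2 * np.pi, 5)             # 0, pi/2, pi, 3pi/2, 2pi
intensity = np.abs(1 + np.exp(1j * phi)) ** 2  # 4 (constructive) ... 0 (destructive at pi)

# A random orthogonal matrix preserves lengths and angles (the invariants) ...
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
u, v = np.eye(4)[0], np.eye(4)[1]              # two perpendicular unit vectors
assert np.isclose(np.linalg.norm(Q @ u), 1.0)  # length preserved
assert np.isclose((Q @ u) @ (Q @ v), 0.0)      # perpendicularity preserved

# ... and decomposes into a pure rotation R and (possibly) a reflection F,
# with Q = R @ F, as in the Stack Exchange discussion cited above.
F = np.diag([1, 1, 1, np.sign(np.linalg.det(Q))])
R = Q @ F
assert np.isclose(np.linalg.det(R), 1.0)       # R is a proper rotation
assert np.allclose(R @ F, Q)                   # reassemble Q exactly
```

The determinant test is the whole decomposition story: det Q = +1 means Q is already a rotation, det Q = -1 means one reflection must be factored out first.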
6. Long-range global quantum teleportation between the sixteen channels to evolve, affect, and execute higher-dimensional information exchange to reach a conclusion: The 16 channels host independent light-matter interactions and generate 16 independent differential spectra for quantum computing. Classically, there is no way that the 16 channels could communicate with each other, but they are all entangled, so when a particular set of symmetries is selected in one channel, the others modify, and the 16 channels operate collectively. The collective evolution of the 16 dimensions is concrete evidence that quantum states shared between the different matrices are processed together.
Summary of the complete computing protocol: Classical annealing leads to chemical structures that expand the symmetries we derive from the input information; optical structures then find the differential signals and thus carry out the computation. To solve a problem, we rewrite it as a 3D clock assembly, a 3D network of recurring events (GML). This allows any problem to be addressed universally, converting our choices into variables (one recurring event = one variable). Then two quantum steps happen one after another. First, the nanowires generate a cluster of entangled photon structures that reads all the nanowires holding specific features of the problem; these interfere constructively and destructively to accept and reject the topologically restricted choices and thus find the invariants. Second, only the collectively differential symmetries survive among the 16 physically separated channels, an operation that happens without physical contact, like a true quantum process. The outputs of the 16 chemical beakers are read at once using coincidence counters and vortex analyzers, revealing how a 16×16 tensor, in which each of the 16 channels delivers a 12×12 tensor, interacts and delivers an output satisfying Bell's inequality.
How we differ from existing quantum computers: We use quantum technology, first, to execute a generic orthogonal transformation that finds invariants through light-matter interaction and, second, to filter the correlated features from 16 shared matrices of a generic quantum state, a generic tensor in which all the pure states remain. It is not about collapsing choices, as in existing quantum computers where periodic choices are shrunk by collapse or annealing. Here, the correlations between choices are first filtered by quantum interference, and then a pure quantum state of a 16×16 tensor, each element of which is a 12×12 tensor, takes a snapshot of the dynamic behavior of the system instantly. Our quantum computing is therefore very different from existing protocols: we carry out integration and differentiation of the choices, find the invariants, and then map the higher-dimensional, multichannel correlations between the choices. Moreover, a choice for us is a recurring event; instead of reducing a large number of choices, we use quantum technology for multiple jobs, first finding and mapping the relations between many choices and then building a weight map of all the variables as a 3D geometric structure. We therefore do not need to discard choices; we place them at a point near the corner of the topology where they cannot make a significant contribution.
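The nested-tensor bookkeeping can be pictured with a plain array. Only the shapes (16×16 outer, 12×12 inner) come from the text; the random contents, the channel slicing, and the Frobenius-norm "invariant" are illustrative assumptions for the data layout, not a physical model.

```python
import numpy as np

# A 16 x 16 "outer" tensor whose entries are 12 x 12 "inner" tensors,
# stored as a single array of shape (16, 16, 12, 12).
rng = np.random.default_rng(42)
snapshot = rng.standard_normal((16, 16, 12, 12))

# One channel is one row of the outer tensor: sixteen 12 x 12 blocks.
channel_3 = snapshot[3]                  # shape (16, 12, 12)

# A simple scalar invariant of the whole snapshot, e.g. its Frobenius norm,
# is unchanged by any orthogonal transformation applied to the flattened state.
invariant = np.linalg.norm(snapshot)
print(snapshot.shape, channel_3.shape)
```

The point of the layout is that a single snapshot holds both the per-channel structure (the rows) and the cross-channel correlations (the off-diagonal blocks) at once.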
Our quantum computer uses quantum phenomena for ten purposes. First, we need to differentiate and find the topological differences between photonic structures, so we use quantum interference. Second, we need to create many single-photon sources from one, so we use birefringence and superradiance together to read the entire 3D arrangement of nanowires in the solution. Third, we use quantum cloaking to make the nanowires of a certain time domain vanish, zooming into a memory structure to find something new. Fourth, quantum teleportation is used between the 16 beakers to evaluate the best and most complete composition of symmetries in the output. Fifth, a tensor is shared between entangled states across the 16 dimensions; in the orthogonal operation that happens at a single nanowire, the superposition is never broken. Sixth, quantum tomography is used extensively during the evolution of the photonic topology: the 3D structure of light generated at a single nanowire in the gel continuously evolves, and since it is a single-photon structure that undergoes changes in the multi-layer cluster, and even undergoes quantum superposition of eventual photon structures among the different beakers, we obtain a reconstruction of the quantum state, a tomography of the 3D light structure (see reference below). Seventh, all sixteen beakers carry out self-testing as a mode of evaluating the symmetries that enable the collective quantum state to survive (see the Nature Physics reference below). Eighth, the curious case of the 16 beakers is very important. A state exhibiting Bell nonlocality must also exhibit quantum steering, and a state exhibiting quantum steering must also exhibit quantum entanglement; but for mixed quantum states, such as those forming collectively in the 16 beakers, there are examples that lie between these different sets of quantum correlations. This notion was initially proposed by Schrödinger and later made popular by Howard M. Wiseman, S. J. Jones, and A. C. Doherty (see references below).
Ninth, quantum compression of quantum and classical information is carried out to adopt GML. Tenth, quantum key distribution runs among the sixteen channels; instead of qubits, the information exchanged is the symmetry of a topology, and we can watch it live.
Additional notes:
Superradiance laser
In quantum optics, superradiance is a phenomenon that occurs when a group of N emitters, such as excited atoms, interact with a common light field (Dicke, Robert H. (1954). "Coherence in Spontaneous Radiation Processes". Physical Review. 93 (1): 99–110. Bibcode:1954PhRv...93...99D. doi:10.1103/PhysRev.93.99). If the wavelength of the light is much greater than the separation of the emitters, then the emitters interact with the light in a collective and coherent fashion (Gross, M.; Haroche, S. (1 December 1982). "Superradiance: An essay on the theory of collective spontaneous emission". Physics Reports. 93 (5): 301–396. Bibcode:1982PhR....93..301G. doi:10.1016/0370-1573(82)90102-8). This causes the group to emit light as a high-intensity pulse (with rate proportional to N²). This is a surprising result, drastically different from the expected exponential decay (with rate proportional to N) of a group of independent atoms (see spontaneous emission). Superradiance has since been demonstrated in a wide variety of physical and chemical systems, such as quantum dot arrays (Scheibner, Michael; Schmidt, T.; Worschech, L.; Forchel, A.; Bacher, G.; Passow, T.; Hommel, D. (2007). "Superradiance of quantum dots". Nature Physics. 3 (2): 106–110. Bibcode:2007NatPh...3..106S. doi:10.1038/nphys494) and J-aggregates (Benedict, M.G. (1996). Super-radiance: multiatomic coherent emission. Bristol: Inst. of Physics Publ. ISBN 0750302836). This effect has been used to produce a superradiant laser.
D'Ariano, G Mauro; Laurentis, Martina De; Paris, Matteo G A; Porzio, Alberto; Solimeno, Salvatore (2002-06-01). "Quantum tomography as a tool for the characterization of optical devices". Journal of Optics B: Quantum and Semiclassical Optics. 4 (3): S127–S132. arXiv:quant-ph/0110110. doi:10.1088/1464-4266/4/3/366. ISSN 1464-4266. S2CID 17185255
Šupić, I., Bowles, J., Renou, MO. et al. Quantum networks self-test all entangled states. Nat. Phys. (2023). https://doi.org/10.1038/s41567-023-01945-4
Schrödinger, E. (October 1936). "Probability relations between separated systems". Mathematical Proceedings of the Cambridge Philosophical Society. 32 (3): 446–452. Bibcode:1936PCPS...32..446S. doi:10.1017/s0305004100019137. ISSN 0305-0041
Wiseman, H. M.; Jones, S. J.; Doherty, A. C. (2007). “Steering, Entanglement, Nonlocality, and the Einstein-Podolsky-Rosen Paradox”. Physical Review Letters. 98 (14): 140402. arXiv:quant-ph/0612147. Bibcode:2007PhRvL..98n0402W. doi:10.1103/PhysRevLett.98.140402. ISSN 0031-9007. PMID 17501251