Quantum Computers vs. Classical Computers: Who Wins? (2024)

Exploring the Quantum Computer Revolution

Over the past 20 years, quantum computing has moved from theory to tangible hardware, with the potential for practical use in the coming decade.

Quantum computers promise to accelerate the spread of solutions that demand enormous computing power, such as artificial intelligence.

Quantum computers offer distinct advantages over conventional computers: significantly faster computation, the ability to perform many computations simultaneously, and the capacity to process huge amounts of data and deliver results in a short time.

Quantum computers are thus a great resource for the most complex problems facing humanity: problems involving artificial intelligence, machine learning, big data analysis, new materials, biochemistry, genetics, surgery, energy, environmental protection, finance, and astrophysics.

They reveal an unprecedented ability to reproduce, analyze, and model natural phenomena; they can also tackle important problems in cryptography and security, creating encrypted messages that resist intrusion.

Quantum computers can simulate fundamental physical systems and carry out quantum mechanical calculations that cannot be handled by a conventional computer.

The physical and engineering difficulties that must be overcome are still numerous: the machines are large; cooling is pushed to its limits, because the core of a quantum computer is surrounded by superconducting circuits; and the quantum particles must be shielded from external disturbances.

The reliability of the results must be ensured, for example through error-correction mechanisms. The laboratory housing the machine occupies a large area and must be rigorously protected from vibration, noise, electromagnetic fields, radiation, dust, and temperature changes.

Irregularities in the quantum processor's crystal lattice caused by thermal vibrations can be enough to destabilize the computation, as can weak electromagnetic fields, including the Earth's magnetic field.

Quantum computer algorithms

Another difficulty is that a quantum computer cannot be programmed in the same language as classical microprocessor-based computers: it is a revolutionary type of computation, with new computational abilities, and it requires algorithms formulated specifically for this domain.

The emerging solution is a hybrid approach: for specific problems, data is prepared on traditional supercomputers, the calculation is carried out in the quantum domain, and the results are then translated back into classical language to make them understandable. Such machines can be accessed via the cloud, as sketched below.
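As a hedged illustration of this hybrid workflow, here is a minimal sketch using the open-source Qiskit library (the library and circuit are my own example, not mentioned in the original): a circuit is built classically, would normally be submitted to a cloud backend, and returns ordinary classical bits.

```python
# A minimal sketch of the classical -> quantum -> classical pipeline,
# assuming the Qiskit library (API details vary between versions).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)      # 2 qubits, 2 classical result bits
qc.h(0)                        # superposition on qubit 0
qc.cx(0, 1)                    # entangle qubits 0 and 1
qc.measure([0, 1], [0, 1])     # translate the quantum state back to bits

# Submission to a cloud backend is omitted here; the circuit itself is
# the "quantum language" program the article describes.
print(qc.draw())
```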

For now, quantum computing will be restricted to targeted applications; for general-purpose use, it will not be economical on a large scale in the short term.

Traditional computers, as everyone knows, rely on electrical bits that can only take the values 0 and 1, so they operate on sequences of 0s and 1s, and those sequences can reach gigantic lengths in sequential computing. Parallel computing helps, but for some applications the running times are still far too long.

For quantum computers, the quantum bit, or qubit, is the unit of quantum information and the basic unit of memory. Qubits are based on the state of quantum particles, for example photons, electrons, or even atoms or ions, which exhibit the property of quantum superposition.

The spin of a particle, a fermion, or an atom, for example, can serve as an information carrier: it has two states that can encode binary information, with 0 represented by the clockwise spin and 1 by the counterclockwise spin.

A qubit represents the simultaneous existence of all possible states of a particle or physical system before it is measured. This means that, before measurement, a qubit can take the values 0 and 1 at the same time. Only measurement fixes the qubit to a definite value.
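In software terms, a qubit can be modeled as a two-component complex vector. This NumPy sketch (a generic illustration, not tied to any particular machine) puts a qubit into an equal superposition and samples a measurement:

```python
import numpy as np

# A qubit state is a 2-component vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
qubit = H @ np.array([1, 0])                   # |0> -> equal superposition

probs = np.abs(qubit) ** 2
print(probs)                                   # [0.5 0.5]

# "Measuring" collapses the superposition to a definite 0 or 1
outcome = np.random.choice([0, 1], p=probs)
print(outcome)
```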

When several qubits operate coherently, they can hold many possibilities simultaneously, which allows a quantum computer to work on many sets of states at the same time.
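Registers of several qubits combine via the tensor (Kronecker) product, so n qubits carry 2^n amplitudes at once. A minimal sketch:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ np.array([1, 0])        # one qubit in equal superposition

# Two independent qubits combine via the Kronecker product: an n-qubit
# register has 2**n amplitudes, one per classical bit string.
register = np.kron(plus, plus)
print(register)                    # 4 equal amplitudes: 00, 01, 10, 11
print(np.abs(register) ** 2)       # each outcome has probability 0.25
```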

The second property being exploited is entanglement, or quantum correlation.

Looking at the research efforts of physicists in this field from a general point of view, we can consider all teleportation experiments as part of a research program aimed at realizing quantum computers. If two spins are entangled, the information associated with them can exist as a superposition of different possibilities.
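Entanglement can be demonstrated in the same toy simulator: a Hadamard followed by a CNOT turns the state |00⟩ into a Bell state, whose only possible measurement outcomes are the correlated pairs 00 and 11.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
P0 = np.array([[1, 0], [0, 0]])    # |0><0|
P1 = np.array([[0, 0], [0, 1]])    # |1><1|

# Start from |00>, apply H to the first qubit, then a CNOT
state = np.array([1, 0, 0, 0], dtype=float)
state = np.kron(H, I) @ state
cnot = np.kron(P0, I) + np.kron(P1, X)
state = cnot @ state

print(state)  # [0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
# Only 00 and 11 have nonzero probability: measuring one qubit
# instantly fixes the value of the other.
```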

Teleportation used to process information can play a fundamental role in a quantum computing algorithm. In theory, a quantum computer could also use only photons, yielding a purely optical machine. This approach would be useful for small computers performing simple calculations.

Looking at the current state of computer technology, microchips are becoming faster and able to store more and more data every day.

The exceptional and useful miniaturization of integrated circuits follows Moore's Law: "the density of transistors embedded in microchips doubles every year and a half."

This empirical law has held year after year, and it implicitly means that fewer and fewer atoms or electrons are needed to physically realize a single bit.

A quick calculation shows that in about twenty or thirty years this miniaturization will stop when it reaches the atomic level, the fundamental physical limit for conventional chips, where a single bit is carried by a single electron.
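That "quick calculation" can be made explicit. The figures below (a ~10 nm starting feature size and ~0.2 nm atomic spacing) are illustrative assumptions, not from the original:

```python
# Back-of-the-envelope check of the atomic-limit argument, assuming a
# density doubling every 18 months, a ~10 nm starting feature size, and
# silicon atoms roughly 0.2 nm apart (all assumed figures).
feature_nm = 10.0
atom_nm = 0.2
years = 0.0
while feature_nm > atom_nm:
    # Each density doubling shrinks the linear feature size by sqrt(2)
    feature_nm /= 2 ** 0.5
    years += 1.5
print(f"~{years:.0f} years to reach atomic scale")  # ~18 years here
```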

This means that it is the natural evolution of the chip that will lead us toward the quantum limit.

So how long before quantum computers become robust against failure and error?

It will probably take some time to develop: the story is similar to that of fusion reactors, but we will get there step by step. For the moment, one solution, once performance and stability issues are resolved, seems to be to deploy a certain number of quantum computers and link them to the cloud, so that users can access them when needed.

Many companies have been involved in building the first machines: for example IBM, the first to create a prototype, the Canadian company D-Wave, Google, Microsoft, Intel, Honeywell, Chinese research groups, and several start-ups.

Summary of industry developments

The concept of a quantum computer was first developed by physicists Paul Benioff and Yuri Manin in 1980, with further details and ideas contributed by Richard Feynman and others in 1982.

  • In 1997, a first prototype was built at IBM.
  • In 2001, IBM introduced the first 7-qubit quantum computer (a molecule with 7 nuclear spins).
  • In 2005, the first qubyte (8 entangled qubits) was created by scientists at the University of Innsbruck, and the first one-way quantum computer was demonstrated at the University of Vienna.
  • In 2006, Peter Zoller of the University of Innsbruck discovered a method for using cooled polar molecules to make quantum memories stable.
  • In February 2007, D-Wave Systems publicly demonstrated Orion, believed to be the first 16-qubit adiabatic quantum computer.
  • In May 2011, D-Wave Systems announced D-Wave One, the first commercially marketed quantum computer.
  • In April 2012, scientists from the Max Planck Institute were able to create the first working quantum network.
  • In May 2013, Google and NASA presented the D-Wave Two at the Quantum Artificial Intelligence Laboratory in California.
  • In February 2016, IBM launched the IBM Quantum Experience, the first cloud-accessible quantum processor, with 5 qubits.
  • In mid-2017, IBM made 16- and 20-qubit quantum processors available via the cloud.
  • In March 2018, Google Quantum AI Lab introduced the new 72-qubit Bristlecone processor.
  • In January 2019, IBM announced the first quantum computer for commercial use, the IBM Q System One, and the IBM Q Network platform for scientific and commercial use.
  • In January 2020, IBM announced that it had achieved a quantum volume of 32 on a 28-qubit quantum processor, and in August 2020 it announced the largest quantum volume yet, 64, confirming the annual doubling trend in the power of its machines. Quantum volume is a hardware-independent metric for measuring quantum computer performance: it accounts for the number of qubits, their connectivity, and gate and measurement errors, as sketched below. IBM is also building a 53-qubit computer.
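As commonly defined, quantum volume is reported as a power of two, QV = 2^n, where n is the size of the largest "square" circuit (n qubits, n layers of gates) the machine runs reliably; the one-liner below unpacks IBM's figures under that definition.

```python
from math import log2

# QV = 2**n: n is the largest square circuit (n qubits x n layers)
# the machine executes reliably, so qubit count alone is not enough.
for qv in (32, 64):
    n = int(log2(qv))
    print(f"QV {qv} -> square circuit of {n} qubits x {n} layers")
```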

Google, in collaboration with D-Wave and NASA, has invested enormous resources and has come a long way in developing quantum computers.

Its machine solved in 200 seconds a computational task that a conventional supercomputer would need an estimated 10,000 years to complete, a demonstration of so-called quantum supremacy. IBM engineers disputed the claim, but even granting Google some promotional license, the difference in computation times is enormous.
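For a sense of scale, the speedup implied by those two figures is easy to compute:

```python
# Rough speedup implied by the claim above: 10,000 years vs. 200 seconds.
seconds_per_year = 365.25 * 24 * 3600
classical = 10_000 * seconds_per_year
quantum = 200
print(f"speedup ~ {classical / quantum:.1e}x")  # ~1.6e+09 times faster
```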

Google has built a 72-qubit computer, Intel one with 49 qubits, and IBM one with 53 qubits, while D-Wave's systems, which started at 128 qubits, have reached a latest prototype of 5,000 qubits, though these use a different type of connectivity and qubit operation. The need for a standard benchmark is critical: simply counting qubits is not enough.

In quantum computing, computing power grows exponentially with the number of qubits. Microsoft says that to use quantum computers profitably it will be necessary to have a few tens of thousands of logical qubits, with access to hundreds of thousands in the future.
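To see where that exponential growth comes from (and why simulating even modest registers classically is hopeless), note that the state of n qubits needs 2^n complex amplitudes. A back-of-the-envelope, assuming 16 bytes per amplitude:

```python
# Memory needed to merely *store* an n-qubit state classically,
# assuming 16 bytes per complex amplitude.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits: {amplitudes:.2e} amplitudes, "
          f"~{amplitudes * 16 / 1e9:.2e} GB")
# 10 qubits fit in kilobytes; 30 need ~17 GB; 50 need ~18 petabytes.
```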

One of Intel's reports mentions a few million qubits. However, scaling up increases the frequency of errors: qubits are far more unstable than bits, and a quantum processor needs low error rates both in logical operations and in readout.

Some error-detection and correction techniques have been developed and refined. Significant progress has been made in reducing gate errors, extending coherence times, and reducing crosstalk (the unwanted transfer of signals between communication channels), from architectures through to cloud access.
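The simplest way to see the principle behind error correction is the classical 3-bit repetition code sketched below. Real quantum codes must also handle phase errors and cannot clone qubits, so this is an analogy for the majority-vote idea, not how a quantum processor actually corrects errors.

```python
import numpy as np

def encode(bit):
    """Repeat the bit three times: 1 -> [1, 1, 1]."""
    return np.array([bit, bit, bit])

def noisy(codeword, p=0.1, rng=np.random.default_rng(0)):
    """Flip each bit independently with probability p."""
    flips = rng.random(3) < p
    return codeword ^ flips

def decode(codeword):
    """Majority vote recovers the bit if at most one flip occurred."""
    return int(codeword.sum() >= 2)

word = encode(1)
received = noisy(word)
print(received, "->", decode(received))  # a single flip is corrected
```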

As of May 2020, IBM operated 18 quantum computers, Google 5, and Honeywell 6.

Over the past decade, several research groups around the world have worked to develop quantum computers, following different lines of research.

Some used single atoms, ions, or photons as vectors of information, while others used semiconductor technology, modifying it so that they could encode and manipulate single quantum bits.

One idea, for example, is to implant single atoms in silicon and make them talk to each other, thereby creating a quantum processor. This is the approach conceived by physicist Bruce Kane, who proposed in 1998 to build a computer using phosphorus atoms arranged on a layer of silicon just 25 nanometers thick. Other groups have worked with superconducting elements.

How will these technologies evolve in the future?

The core technology of future quantum computers may turn out to be a combination of several of these approaches, all of which rest on the superposition and entanglement principles of quantum physics. The field is changing so rapidly that whatever we say today may not be true next year, and will certainly be incomplete.

The rapid development of studies is illustrated by the following examples:

Intel supports ongoing research on spin qubits: these are based on the spin of a single electron in silicon, controlled by microwave pulses.

This approach is very close to Intel's existing semiconductor manufacturing technologies. Moreover, spin qubits can operate at a "higher" temperature than superconducting qubits.

Microsoft is focusing on topological quantum computing with Majorana fermions. These enigmatic particles, theorized by Ettore Majorana in 1937, are distinguished by being identical to their own antiparticles, and therefore electrically neutral; they could also play a role in error-correcting codes.

D-Wave and others use Josephson junctions: junctions between two superconductors separated by a thin insulating layer, with distinctive properties and a quantum tunneling effect associated with the quantum mechanical penetration of the barrier.

It is clear that at this stage, with research still experimental and not yet standardized, computation algorithms must be adapted and modified according to the machine used.

Even just physically stepping into a lab housing quantum computers is impressive (online you can tour the IBM, Google, and Microsoft labs): great silence, numerous controls, a forest of cables and tubes, and protective shielding of all kinds.

Advanced cooling systems bring circuits and processors to temperatures close to absolute zero: 15 to 30 millikelvin depending on the machine, a tiny fraction of a degree Kelvin, lower than the temperature of intergalactic space.

The various systems use dilution refrigerators based on helium-3, a very rare and expensive single-neutron isotope; magnetic cooling, which exploits the alignment of atoms that absorb energy from the surrounding environment; or absorption refrigeration for "trapped ion" machines.

Experiments under these extreme conditions of temperature, superconductivity, and special magnetic fields highlight new properties of materials at the atomic level, such as topological superconductivity, and advance the theoretical understanding of solid-state physics.

Another important problem to be solved is direct communication between different quantum computers, without passing through decoding to transfer computation results. A possible solution is to use entangled states to link quantum computers together.

Two particles in an entangled state remain quantum-mechanically correlated regardless of the distance between them. A quantum computer's output is a particular quantum state; to transfer this state as fresh input to another computer, the solution could be teleportation.

If the two computers are far apart, we must be able to teleport quantum states over great distances; one problem is that photons can be lost along the way.

Using optical fiber, about one hundred kilometers can be covered, and the same limit holds through open air. For longer distances, a teleportation chain with intermediate stations would be required, to perform Bell-state measurements on the entangled states and to correct for any photon loss.

Since quantum states cannot be amplified, a chain of quantum repeaters is thought to be necessary. This technical solution is not yet feasible.
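To make the teleportation step concrete, here is a pure-NumPy sketch of the textbook protocol (an illustration, not a model of any real link): Alice teleports an unknown qubit to Bob using a shared Bell pair, a Bell-basis measurement yielding two classical bits, and Bob's conditional corrections.

```python
import numpy as np

# Single-qubit gates and projectors
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
P0 = np.array([[1, 0], [0, 0]])
P1 = np.array([[0, 0], [0, 1]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Qubit 0: the unknown state |psi> = a|0> + b|1> to be teleported
a, b = 0.6, 0.8
psi = np.array([a, b])

# Qubits 1 and 2: Bell pair (|00> + |11>)/sqrt(2) shared by Alice and Bob
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)        # full 3-qubit state, ordering q0,q1,q2

# Alice: CNOT (control q0, target q1), then H on q0
cnot01 = kron(P0, I, I) + kron(P1, X, I)
state = kron(H, I, I) @ (cnot01 @ state)

# Alice measures q0 and q1; sample an outcome (m0, m1)
probs = np.abs(state.reshape(2, 2, 2)) ** 2
p = probs.sum(axis=2).ravel()                 # P(m0, m1)
m0, m1 = divmod(np.random.choice(4, p=p), 2)

# Collapse to the measured branch; Bob's qubit remains
bob = state.reshape(2, 2, 2)[m0, m1]
bob = bob / np.linalg.norm(bob)

# Bob's correction, conditioned on the two classical bits
if m1: bob = X @ bob
if m0: bob = Z @ bob

print(np.allclose(bob, psi))  # True: Bob now holds |psi>
```

The two measured bits (m0, m1) are exactly the classical information the article says must be sent alongside the entangled link; without them, Bob cannot recover the state.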

Conclusion: Our future with quantum computers

The next 10 years will be the decade of quantum systems and the birth of a true hardware ecosystem, laying the foundations for improved coherence, gates, stability, cryogenic components, integration, and assembly.

So, for now, we are not talking about quantum laptops, but we should be very optimistic and hope that one day all computers will become quantum computers.
