May 14, 2022 10:00 pm

Quantum revolution: new records in the era of “good enough”

The debate over the technological agenda of the future is, as with economics or everyday decisions, full of biases. A very common one is to believe that a technology's tipping point comes when it matches the best humans in efficiency and quality, when in reality the spillover happens once outcomes are "good enough" for most of the population.

For example, photos taken by professional photographers are better than those taken by amateurs, but the spread of high-resolution cell-phone cameras (a technology popularized in recent years) has forced a complete rethinking of that profession, because photos that are perhaps not perfect but "good enough" for most consumers multiplied by the millions. Whether artificial intelligence (AI) will ever write like Borges or Hemingway is still debated, but we overlook the (disruptive) fact that natural-language systems such as GPT-3 can already write most of the emails we send every day.

A similar dynamic is playing out in the world of quantum technologies, and especially in quantum computing. The headlines are dominated by speculation about quantum "supremacy": the moment when a quantum computer solves a problem that all the traditional computers in the world cannot tackle. That has already happened (first with a Google project, then in China), although so far only on contrived problems with no practical application. "Supremacy" is close to the notion of a "broad advantage." The "narrow advantage," by contrast, is less glamorous and generates fewer headlines: the moment when quantum "Lego pieces" start to be plugged into processes dominated by traditional computing to make them more efficient. To make them "good enough."

Let’s rewind the story for a couple of paragraphs: quantum computing was anticipated in a famous talk by physicist Richard Feynman in the early 1980s. In traditional computing, a bit is either 1 or 0; in quantum computing, qubits can exist in superposition and encode far richer states, so the computational capacity, for some problems, grows exponentially.
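To make that scale concrete, here is a minimal sketch in Python (the function and the NumPy-based simulation are my own illustration, not something from the article): the joint state of n qubits is described by a vector of 2^n complex amplitudes, which is why simulating even a few dozen qubits on a classical machine quickly becomes impossible.

```python
import numpy as np

# Illustrative sketch: the state of n qubits is a vector of 2**n complex
# amplitudes, so the classical memory needed to simulate it grows exponentially.
def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector with every qubit in an equal superposition of 0 and 1."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 10, 20):
    state = uniform_superposition(n)
    print(f"{n} qubits -> state vector of {state.size:,} amplitudes")
# 1 qubit -> 2 amplitudes; 20 qubits -> 1,048,576 amplitudes.
# At 54 or 127 qubits the full vector no longer fits in any classical memory.
```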

For decades, Feynman’s prediction remained purely theoretical, because the instability and "noise" of qubit systems made them an almost impossible engineering challenge. But in 2021 very good news arrived on the hardware side, including announcements of new technologies that can stabilize qubits without the extreme sub-zero temperatures that push the cost of these computers into the tens of millions of dollars.

The landscape has changed drastically in the last three years, to the point that people in this field speak of "the old days" to refer to… 2018!

Google achieved supremacy with a 54-qubit machine, and in November IBM set a new record with a 127-qubit quantum computer ("Eagle").

“Incredible things are happening because many disciplines, such as physics, geology and biology, among others, are massively turning to data science” (which is the language of both quantum computing and AI), Facundo Sapienza tells LA NACION. Sapienza studied Mathematics and Physics simultaneously at the UBA's Faculty of Exact Sciences and is now pursuing his doctorate at Berkeley; his thesis was on quantum computing, and in the United States he works on AI. “Argentina has nothing to envy any country in terms of physics talent; our constraints are economic, for example the lack of budget to buy laboratory instruments,” Sapienza adds.

In addition to IBM and Google, the race also includes, with different approaches, Intel, Microsoft, Honeywell and IonQ, the first startup in the industry to go public, with an initial valuation of US$2 billion.

The Californian firm Rigetti is also expected to go public soon. Its CEO, Chad Rigetti, said in a recent report that interesting commercial applications will start to appear in the 100-200 qubit range, and defended the concept of "narrow advantage" as a genuine level of disruption. Which sector will be the first to "plug" quantum components into its computational processes? Rigetti believes it will be financial derivatives, with the pharmaceutical industry next in the queue of enthusiasts.

Following Google’s supremacy milestone with the 54-qubit Sycamore, the University of Science and Technology of China (USTC) in Hefei carried out a successful 60-qubit demonstration. Progress here should not be read linearly: a 200-qubit computer is not "twice" as powerful as a 100-qubit one, but more powerful (and harder to stabilize) by many orders of magnitude. IBM expects to reach 400 qubits by the end of this year and probably 1,000 qubits by 2023.
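To put that in numbers, a back-of-the-envelope calculation of my own, using the usual 2^n state-space rule of thumb:

```python
# Rough illustration (my own arithmetic, not the article's): the state space
# of n qubits has 2**n dimensions, so going from 100 to 200 qubits multiplies
# it by 2**100, roughly 30 orders of magnitude.
ratio = 2**200 // 2**100
print(f"2^200 / 2^100 = 2^100 ≈ {float(ratio):.2e}")
# 2^200 / 2^100 = 2^100 ≈ 1.27e+30
```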

“It is important to highlight that the technology continues to stabilize and that the performance of quantum processors keeps increasing,” Román Zambrano, IBM CTO for Argentina and Uruguay, tells LA NACION. Asked to name the areas where he expects this new technology to have the greatest impact, he says one should not forget “the performance of machine-learning algorithms to accelerate data mining and the development of AI models; the possibility of recreating certain processes that, because of their scale or complexity, are hard to represent in traditional models; and cryptography and security.”

For Rigetti and other experts, quantum is "the vine" that computing will grab onto in order to keep up Moore's law (the exponential growth in computing capacity) that leveraged the digital revolution of recent decades. This time the path will not be ever-smaller chips, but a completely different format.

The innovation literature speaks of "general purpose technologies" (GPTs) to describe avenues of advance that affect all (or almost all) sectors of the economy. Electricity, the internal combustion engine, the PC, the internet and mobility were GPTs. In these "interesting years" several GPTs are interacting: AI, decentralization (Web3) and biotechnology, which goes beyond the health business and has effects on energy, food, infrastructure and more. Quantum is perhaps the least known GPT, the most cryptic and difficult to decipher, but no less fascinating for that.

Source: www.lanacion.com.ar
