Adventures In Information Technology

Tag: Strong AI

With record investment, a British team hopes to get quantum computing operational “by 2030.”

By the end of the decade, Quantum Motion hopes to produce and market breakthrough computers.

In the midst of a worldwide race to conquer the quantum realm, two university academics have obtained the largest-ever investment for a British quantum computing business.

The UK’s taxpayer-funded technology investment fund and Porsche SE, the German family-owned holding company behind the VW Group, have invested £42 million in London-based Quantum Motion.

Professors John Morton and Simon Benjamin, who have spent the last 20 years studying quantum technologies at Oxford and University College London, founded Quantum Motion.

By the end of the decade, the startup wants to be able to produce and market quantum computers to clients like cloud service providers.

Tech giants like Google and IBM, which are also creating their own quantum computers, are approaching the technology entirely differently from Quantum Motion. The British start-up isn’t attempting to use electromagnetic fields or light signals, but rather is advancing existing microprocessor manufacturing technologies.

It believes that doing this will reduce the cost of creating quantum devices that might be used by scientists, researchers, and governments.

The company marked a significant milestone when it announced that it had successfully integrated thousands of quantum devices on a single microchip.

The investment in Quantum Motion is being led by the venture capital division of the German industrial behemoth Bosch, with state backing coming via Future Fund Breakthrough, a £375 million fund founded by Rishi Sunak in 2021 to aid high-tech businesses.

You can find out more about Quantum Motion here

Scientists extend qubit lifetimes

A team of researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, MIT, Northwestern University, The University of Chicago and the University of Glasgow has shown experimentally that making a surrounding crystal’s structure less symmetric can prolong a molecular qubit’s lifetime.

The asymmetry shields the qubit from noise, allowing it to hold quantum information five times longer than it could in a symmetrical structure: the team attained a coherence time of 10 microseconds, or 10 millionths of a second, compared with the 2-microsecond coherence time of a molecular qubit in a symmetrical crystal host.
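To see what that difference means in practice, here is a rough sketch assuming the usual exponential model of coherence decay (a standard simplification, not something the researchers state), using the two coherence times reported above:

```python
import numpy as np

# Under a simple exponential-decay model, the fraction of a qubit's
# coherence remaining after time t is exp(-t / T2). The T2 values below
# are the coherence times quoted in the article.
T2_symmetric = 2e-6    # seconds, symmetrical crystal host
T2_asymmetric = 10e-6  # seconds, less-symmetric host

t = 4e-6  # probe after 4 microseconds
for label, T2 in [("symmetric", T2_symmetric), ("asymmetric", T2_asymmetric)]:
    print(f"{label:10s}: {np.exp(-t / T2):.3f} of coherence remains")
# symmetric : 0.135
# asymmetric: 0.670
```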

This is a major breakthrough: longer coherence times make for more useful qubits in applications such as long-distance telecommunications, medicine, global navigation, astronomy and computing.

This study was supported by the U.S. Department of Energy Office of Science National Quantum Information Science Research Centers and the Office of Basic Energy Sciences.

You can find out more about the project here

Microsoft to Provide New Funding for OpenAI?

Microsoft’s longtime collaborator, OpenAI Inc., is looking for more cash from the tech giant.

Rumours are circulating that Microsoft is currently in advanced talks with OpenAI about increasing its investment in the organisation. The amount of funding the group is requesting is unknown. It is also not known what, if anything, OpenAI will give Microsoft in return. 

What we do know is that the company is now valued at about $20 billion, after it sold some stock to existing investors in 2021.

Microsoft invested $1 billion in OpenAI in 2019, making Azure the cloud platform that runs the AI company’s services. Since then, Microsoft has obtained an exclusive licence from OpenAI for the GPT-3 API.
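For context, GPT-3 is consumed through a web API. A minimal sketch of such a call using OpenAI’s pre-1.0 Python client might look like the following (the model name, prompt and key handling here are illustrative assumptions, not details from the announcement):

```python
import openai  # pip install "openai<1.0"

# Illustrative only: a GPT-3 completion request. The API key placeholder
# and model choice are assumptions for the sketch.
openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3-family model
    prompt="Summarise quantum error correction in one sentence.",
    max_tokens=60,
)
print(response.choices[0].text.strip())
```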

Many may see the move as a counter to Google, which is ramping up its own AI offerings, and to fast-moving new competitors such as Stability AI and Midjourney, all of which are gaining traction after DALL-E 2 took generative art mainstream.

The field of artificial intelligence is growing quickly: IDC analysts forecast that it will be a $900 billion industry by 2026, with the worldwide market expanding by an average of 18.6 per cent annually. OpenAI is concentrating on artificial general intelligence (AGI), which aims to develop cognitive skills in software so that an AGI-based system could find a solution to a challenging intellectual problem in a manner akin to humans.
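As a quick sanity check on those IDC figures, compound growth of 18.6 per cent a year implies a 2022 market of roughly $455 billion if the industry is to reach $900 billion by 2026 (assuming the growth rate applies over 2022–2026, which the forecast wording leaves ambiguous):

```python
# Back-of-the-envelope arithmetic using only the two numbers quoted above.
future_usd_bn = 900
cagr = 0.186
years = 4  # 2022 -> 2026

implied_2022 = future_usd_bn / (1 + cagr) ** years
print(f"Implied 2022 market: ${implied_2022:.0f}bn")  # ~ $455bn

# Year-by-year trajectory at that growth rate
size = implied_2022
for year in range(2022, 2027):
    print(year, f"${size:.0f}bn")
    size *= 1 + cagr
```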

You can find out more about OpenAI here

If you would like to know more about the Microsoft & OpenAI collaboration, you can visit the Azure site here

Three-Qubit System Quantum Breakthrough

Researchers from RIKEN in Japan have made a critical step toward large-scale quantum computing by showing error correction in a three-qubit silicon-based quantum computing device. This study, which was reported in the journal Nature, may contribute to developing useful quantum computers.

Quantum computers are a hot topic of research because they promise to solve significant problems that conventional computers cannot. In place of the simple 1-or-0 binary bits of conventional computers, they use the superposition states of quantum physics. Their fundamentally different construction, however, makes them extremely vulnerable to background noise and other issues, such as decoherence, and they require error correction to perform accurate calculations.
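To make the contrast with binary bits concrete, a single qubit can be modelled as a normalised two-component vector of amplitudes, with measurement probabilities given by the squared magnitudes. A minimal sketch:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit's state is a normalised 2-vector
# (alpha, beta). Measuring yields 0 with probability |alpha|^2 and
# 1 with probability |beta|^2.
qubit = np.array([1, 1]) / np.sqrt(2)  # equal superposition of 0 and 1

probs = np.abs(qubit) ** 2
print("P(0), P(1):", probs)            # [0.5 0.5]

# Simulate 10 measurements: each collapses to a definite 0 or 1.
rng = np.random.default_rng(0)
print(rng.choice([0, 1], size=10, p=probs))
```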

Choosing the optimal systems to act as “qubits”, the fundamental components required to perform quantum calculations, is a difficult task. Every potential system has benefits and drawbacks of its own. Superconducting circuits and trapped ions are two of the most widely utilised systems today; they have the advantage of having already demonstrated some form of error correction, allowing them to be used in practical applications, albeit on a small scale.

Silicon-based quantum technology, which has only recently begun to be developed, uses a semiconductor nanostructure similar to that routinely used to integrate billions of transistors on a small chip; as a result, it may be able to take advantage of manufacturing technologies already in use.

The lack of technology for error correction is a significant issue with silicon-based approaches. Researchers have already demonstrated two-qubit control, but error correction requires a three-qubit system, which had not been achieved until now.

In the current study, researchers from the RIKEN Center for Emergent Matter Science and the RIKEN Center for Quantum Computing achieved full control of a three-qubit system, one of the largest qubit systems in silicon, and thereby created the first silicon prototype for quantum error correction. They accomplished this by implementing a three-qubit Toffoli-type quantum gate.
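The idea behind three-qubit error correction can be illustrated with the textbook bit-flip repetition code, in which CNOT gates spread the data across three qubits and, after an error strikes, a Toffoli gate flips the data qubit back based on the syndrome the other two qubits carry. The sketch below is illustrative only, not the RIKEN team’s exact protocol:

```python
import numpy as np

# Textbook three-qubit bit-flip code, simulated with plain linear algebra.
# Basis index for |q2 q1 q0> is q0 + 2*q1 + 4*q2; q0 is the data qubit.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

def on_qubit(gate, target):
    """Lift a single-qubit gate onto qubit `target` of a 3-qubit register."""
    ops = [I2, I2, I2]
    ops[target] = gate
    return np.kron(ops[2], np.kron(ops[1], ops[0]))  # q2 most significant

def controlled_flip(controls, target):
    """Permutation matrix flipping `target` when all `controls` are 1.
    One control gives a CNOT; two controls give a Toffoli gate."""
    U = np.zeros((8, 8))
    for i in range(8):
        bits = [(i >> k) & 1 for k in range(3)]
        if all(bits[c] for c in controls):
            bits[target] ^= 1
        j = bits[0] | (bits[1] << 1) | (bits[2] << 2)
        U[j, i] = 1.0
    return U

# Data qubit starts in the superposition a|0> + b|1>, ancillas in |00>.
a, b = 0.6, 0.8
state = np.zeros(8)
state[0], state[1] = a, b

# Encode: CNOTs copy the data qubit's value onto both ancillas.
state = controlled_flip([0], 1) @ state
state = controlled_flip([0], 2) @ state   # now a|000> + b|111>

# Noise: a bit flip hits the data qubit.
state = on_qubit(X, 0) @ state

# Decode: the same CNOTs turn the ancillas into an error syndrome, then a
# Toffoli flips the data qubit back when both ancillas read 1.
state = controlled_flip([0], 1) @ state
state = controlled_flip([0], 2) @ state
state = controlled_flip([1, 2], 0) @ state

# The data qubit is restored: amplitude a sits on |110> (data 0) and
# amplitude b on |111> (data 1), with syndrome 11 flagging the error.
print(np.round(state, 3))
```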

This is a serious breakthrough in the field of quantum computing, and it can only lead to the quantum supercomputer space hotting up, with competing technologies emerging from global research centres.

You can find more info on RIKEN here

Nvidia China AI Chip Ban

As the tech crackdown deepens, the US prohibits the sale of specific AI processors to China.

Nvidia, a company that designs computer chips, said that US officials have ordered it to stop shipping two of its top computing chips for artificial intelligence work to China. The move could seriously impair Chinese companies’ capacity to perform complex tasks like image recognition.

The business stated on Wednesday that the prohibition, which affects its A100 and H100 chips designed to accelerate machine learning tasks, could interfere with completing development of the H100, the flagship chip it unveiled this year.

The new rule “will address the possibility that the covered items may be employed in, or diverted to, ‘military end use’ or a ‘military end user’ in China,” according to Nvidia, which claimed to have heard this from US officials.

The announcement marks a significant escalation of the US campaign against China’s technological might, as tensions rise over the future of Taiwan, where Nvidia and nearly all other big semiconductor companies source their chips.

A representative for rival AMD told Reuters that the company had received additional license requirements that would prevent the export of its MI250 artificial intelligence processors to China, but that it does not anticipate any impact on its MI100 chips.

Without American chips from companies like Nvidia and AMD, Chinese organisations won’t be able to efficiently perform the kind of advanced computation needed for image and speech recognition, among many other tasks.

What Is Artificial Intelligence?

Artificial Intelligence is the use of computers to mimic human thinking. This term is often a buzzword, so it can be hard to define and understand. But artificial intelligence is a broad term that can be used to describe everything from a computer’s ability to play chess to self-driving cars. The truth is that artificial intelligence is a complex field and it’s constantly growing. This article will provide an overview of what artificial intelligence is, what it can do, and how it could impact our future.

Where do we start?

Artificial intelligence is the intelligence exhibited by machines and software, in the form of software agents and intelligent machines. It is an umbrella term for a set of technologies encompassing the theory and practice of making intelligent machines: more specifically, computer software that perceives its environment and takes actions which maximise its chance of achieving its goals.
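That definition of an agent, software that perceives its environment and acts to further a goal, can be made concrete with a deliberately tiny example (the thermostat scenario and every name in it are invented for illustration):

```python
# A minimal "intelligent agent": perceive the environment, then choose the
# action that moves it toward the goal. Purely illustrative.

class ThermostatAgent:
    """Goal: keep the temperature near a setpoint."""
    def __init__(self, setpoint: float):
        self.setpoint = setpoint

    def act(self, perceived_temp: float) -> str:
        # Pick the action that best advances the goal.
        if perceived_temp < self.setpoint - 0.5:
            return "heat_on"
        if perceived_temp > self.setpoint + 0.5:
            return "heat_off"
        return "idle"

agent = ThermostatAgent(setpoint=21.0)
for temp in [18.0, 20.8, 23.5]:
    print(temp, "->", agent.act(temp))
```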

How does Artificial Intelligence work?

Artificial intelligence lets a machine complete tasks that would otherwise require human intelligence, and the term is most commonly used to refer to computers or software with that capability. To achieve this, the machine relies on algorithms as well as pattern recognition.
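As a toy illustration of pattern recognition (all data invented), the sketch below “learns” each class as the centroid of a few labelled points and then classifies new points by which learned pattern they sit nearest to:

```python
import numpy as np

# Labelled "training" points for two classes of 2-D data
class_a = np.array([[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]])
class_b = np.array([[4.0, 4.2], [3.9, 3.8], [4.2, 4.1]])

# The "learning" step: summarise each class by its mean (centroid)
centroids = {"A": class_a.mean(axis=0), "B": class_b.mean(axis=0)}

def classify(point):
    # The "recognition" step: pick the closest learned pattern
    return min(centroids, key=lambda c: np.linalg.norm(point - centroids[c]))

print(classify(np.array([1.0, 1.1])))  # -> A
print(classify(np.array([4.1, 3.9])))  # -> B
```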

What are some applications of Artificial Intelligence?

As artificial intelligence evolves, it will systematically replace jobs and many forms of mainstream labour that require human intelligence, from white-collar offices to construction sites. AI can be used to improve business processes, make scientific discoveries, or help with medical diagnoses. It is a comparatively recent development in computer science and remains a controversial topic, and the field is expanding over time.

This article is an introduction to artificial intelligence and some of the applications of the technology used in the field of AI. If you are interested in learning more about artificial intelligence, I recommend you visit the MIT website for the latest developments in the field.

https://www.mit.edu/

© 2023 Efe Anidi
