Lawrence Livermore National Laboratory’s free public lecture series returns February 7–28 at Las Positas College in Livermore, Calif., with four Saturday sessions for middle and high school students. This year’s theme is “Computing the Future.” The series opens Feb. 7 with “Cosmic Treasure Hunt: Finding Stardust in Meteorites.” That session will dig into how ancient…
Qunova’s HI-VQE quantum chemistry algorithm is now on AWS Marketplace
Daejeon, South Korea–based Qunova Computing has announced that its HI-VQE algorithm is now available on AWS Marketplace as part of the Braket ecosystem of quantum computers. The company says the algorithm gives researchers and industrial users a new way to deploy its hybrid quantum-classical chemistry workflow. Qunova expects that distributing HI-VQE through AWS Marketplace…
Sandia unveils Spectra, a reconfigurable supercomputer for nuclear stockpile simulations
Sandia National Laboratories and NextSilicon, a technology company, collaborated to create a supercomputer designed to prioritize tasks in real time, Sandia announced on Monday. The computer, called Spectra, could alter how the nation conducts high-stakes simulations for its nuclear deterrence mission. In other words, while it won’t top the TOP500 list of supercomputers, the prototype…
Maryland set for first subsea internet cable: AWS’s 320+ Tbps “Fastnet” to Ireland
Maryland is getting its first undersea internet cable, and it’s a monster. Amazon Web Services announced plans for “Fastnet,” a dedicated fiber optic system linking the state’s Eastern Shore to Ireland with enough raw power to stream 12.5 million HD films simultaneously. The project, set to be operational in 2028, represents AWS’s bet that customer…
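As a rough sanity check on that streaming figure, the arithmetic below assumes an HD stream needs about 25 Mbit/s of sustained bandwidth; that per-stream bitrate is our assumption for illustration, not a number from the AWS announcement.

```python
# Back-of-envelope check of the "12.5 million HD streams" figure.
# The 25 Mbit/s per-stream bitrate is an assumed value, not from AWS.
CABLE_CAPACITY_TBPS = 320          # stated minimum capacity of Fastnet
HD_STREAM_MBPS = 25                # assumed sustained bitrate per HD stream

capacity_mbps = CABLE_CAPACITY_TBPS * 1_000_000   # Tbit/s -> Mbit/s
concurrent_streams = capacity_mbps / HD_STREAM_MBPS
print(f"{concurrent_streams / 1e6:.1f} million concurrent HD streams")
# -> 12.8 million, in line with the roughly 12.5 million quoted above
```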
Microsoft’s 4D geometric codes slash quantum errors by 1,000x
Microsoft Quantum has unveiled a family of new four-dimensional (4D) geometric codes that can reduce the error rates of physical qubits by orders of magnitude to reach the level required for reliable quantum circuits. Available in the Microsoft Quantum compute platform, the error correction codes deliver a 1,000-fold reduction in quantum error rates (from 10⁻³…
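Reading the truncated figure as a physical error rate of roughly 10⁻³, the quick arithmetic below shows what a 1,000-fold reduction implies for the logical error rate; this is an illustration of the claim as stated, not Microsoft’s published calculation.

```python
# Illustrative arithmetic only: a 1,000-fold reduction applied to an
# assumed 1e-3 physical error rate, per the figures quoted above.
physical_error_rate = 1e-3      # per-operation error rate of a physical qubit
reduction_factor = 1_000        # improvement claimed for the 4D geometric codes

logical_error_rate = physical_error_rate / reduction_factor
print(f"logical error rate ≈ {logical_error_rate:.0e}")   # ≈ 1e-06
```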
Berkeley Lab’s Dell and NVIDIA-powered ‘Doudna’ supercomputer to enable real-time data access for 11,000 researchers
Designed to help 11,000 researchers “think bigger and discover sooner,” Lawrence Berkeley National Laboratory’s Doudna supercomputer, launching in 2026, will be central to an integrated research fabric. This Dell and NVIDIA-powered system will ingest data from telescopes, fusion tokamaks, and genome sequencers via the Energy Sciences Network (ESnet). It will enable near real-time analysis and…
QED-C outlines road map for merging quantum and AI
The Quantum Economic Development Consortium has released a 28-page report, “Quantum Computing and Artificial Intelligence Use Cases,” setting out why the two technologies should be developed in tandem and what Washington, universities, and industry can do to speed that convergence. The document distills a Seattle workshop held October 29, 2024, that pulled in quantum engineers,…
Quantum computing hardware advance slashes superinductor capacitance >60%, cutting substrate loss
Reducing performance-killing noise from chip substrates is key for advancing quantum computing. Addressing this challenge, Lawrence Berkeley National Laboratory scientists developed a practical chemical etching process that precisely lifts vital superconducting components, superinductors, just above the wafer surface. This suspension method directly targets stray capacitance and substrate-related loss channels by minimizing physical contact. The research…
Hold your exaflops! Why comparing AI clusters to supercomputers is bananas
Okay, deep breaths. Maybe you’ve heard the buzz around Google’s Ironwood TPUs, which generated at least one headline claiming its system offered a 24x performance boost over the world’s most advanced supercomputer, El Capitan. Or perhaps the news about Nvidia’s Blackwell line of GPUs, its forthcoming exaflop Vera Rubin platform, or xAI’s Colossus cluster, which…
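One way to see why that headline comparison is shaky: the two peak figures are quoted in different number formats. The sketch below uses publicly reported specs as we recall them (an Ironwood pod’s 8-bit peak and El Capitan’s 64-bit benchmark result), which are not taken from this article and should be treated as approximate.

```python
# Illustrative only: why "24x El Capitan" is not an apples-to-apples claim.
# Both figures are approximate public numbers, not from this article, and
# they use different precisions.
ironwood_pod_exaflops_fp8 = 42.5     # Google Ironwood pod, FP8 (8-bit) peak
el_capitan_exaflops_fp64 = 1.74      # El Capitan, FP64 (64-bit) HPL result

naive_ratio = ironwood_pod_exaflops_fp8 / el_capitan_exaflops_fp64
print(f"naive ratio: {naive_ratio:.0f}x")   # ≈ 24x, the headline number

# Each FP8 operation carries far less numerical precision than an FP64 one,
# so dividing raw op counts says little about scientific-simulation capability.
```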
Why IBM predicts quantum advantage within two years
Industry analysts from McKinsey to Omdia largely converge on a timeline for initial quantum advantage emerging in the next few years. While the era when quantum computers can routinely tackle large-scale challenges in fields like drug discovery and materials science might still be years away, IBM’s Quantum CTO, Oliver Dial, Ph.D., predicts the threshold of…
Aardvark AI forecasts rival supercomputer simulations while using over 99.9% less compute
A deep learning system known as Aardvark Weather offers accurate weather forecasts that are orders of magnitude quicker to generate than existing systems. Described in a Nature article (currently posted as a preprint), the system can generate predictions on four NVIDIA A100 GPUs that would otherwise take roughly 1,000 node-hours on a traditional supercomputer system…
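To see how a figure like “over 99.9% less compute” can arise, here is a back-of-envelope comparison. The roughly one-minute generation time on four A100s is our assumption rather than a number from the paper, and GPU-hours are not strictly interchangeable with node-hours, so this is a sketch only.

```python
# Rough, illustrative comparison; the one-minute runtime is an assumption.
aardvark_gpu_hours = 4 * (1 / 60)      # assumed: 4 A100 GPUs for ~1 minute
supercomputer_node_hours = 1_000       # figure quoted for a traditional NWP run

# Note: GPU-hours and node-hours are not the same unit, so treat this
# only as an order-of-magnitude illustration.
saving = 1 - aardvark_gpu_hours / supercomputer_node_hours
print(f"compute reduction ≈ {saving:.4%}")   # well over 99.9%
```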
Quantum industry sees rapid growth in 2025, report finds
According to the Quantum Economic Development Consortium’s (QED-C) “2025 State of the Global Quantum Industry” report, the global quantum technology industry is experiencing unprecedented growth and investment. The report reveals a rapidly expanding market driven by advancements in quantum computing, sensing, and communication technologies, fueled by significant public and private funding. The following is a…
Mission accomplished: NVIDIA CEO’s quantum mea culpa brings Microsoft and AWS to GTC table after market-rattling comments
NVIDIA CEO Jensen Huang hosted the company’s first-ever “Quantum Day” at its GTC conference on Thursday, bringing together executives from across the quantum computing industry just two months after dismissing the technology as something that won’t be useful for “15 to 30 years.” In the immediate aftermath of Huang’s skepticism, shares of IonQ, Rigetti Computing,…
Quantinuum joins NVIDIA’s Accelerated Quantum Research Center as founding collaborator
Quantinuum has been chosen as a founding collaborator in the upcoming NVIDIA Accelerated Quantum Research Center (NVAQC), an initiative to advance hybrid quantum-classical computing. Set to open later this year, the center will integrate Quantinuum’s System Model H2, one of the highest-performing quantum systems, with NVIDIA’s CUDA-Q platform and the GB200 NVL72 supercomputer to enhance…
Nanodots enable fine-tuned light emission for sharper displays and faster quantum devices
Penn State and Université Paris-Saclay researchers report a new way to control light by embedding “nanodots” in ultra-thin, two-dimensional (2D) materials. The team says this precision could lead to higher-resolution screens and advances in quantum computing technologies. In a study published in ACS Photonics, the scientists demonstrated how these nanodots — tiny islands of a…
Alice & Bob reports 160-fold improvement in cat qubit error protection
The quantum computing company Alice & Bob has announced a new method for stabilizing its cat qubits. Cat qubits encode quantum information in superpositions of coherent states, echoing the macroscopic superposition of Schrödinger’s cat. The company says this method can achieve up to 160 times better bit-flip error protection. The approach involves “squeezing” cat qubits…
Quantum Brilliance, Pawsey integrate room-temp quantum with HPC on NVIDIA GH200
Imagine no longer needing to stand next to a giant supercomputer to dive into quantum research. Thanks to Quantum Brilliance’s virtual Quantum Processing Unit (vQPU), you can now explore quantum computing applications from wherever you are — whether that’s a standard workstation, a remote HPC cluster, or the cloud. This advancement emulates the experience of…
Frontier supercomputer reveals new detail in nuclear structure
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory has unveiled a new technique to predict nuclear properties with unprecedented precision. By harnessing the Frontier supercomputer, the world’s first exascale system, the scientists modeled how subatomic particles bind and shape an atomic nucleus — work that could open new frontiers in…
Microsoft’s Majorana 1 is ‘world’s first quantum processor powered by topological qubits’
Microsoft has unveiled Majorana 1, a new quantum chip built on what the company calls its Topological Core architecture. Engineers say the design could lead to quantum machines with up to one million qubits, a size necessary to tackle complex problems in fields like chemistry, manufacturing, and environmental sustainability. Moving beyond conventional qubits At the…
RIKEN partners with Quantinuum to develop quantum-supercomputing hybrid platform
RIKEN, Japan’s largest comprehensive research institution, has selected Quantinuum’s H1-Series ion-trap quantum computing technology for its new quantum-supercomputing hybrid platform. The collaboration will see Quantinuum install its hardware at RIKEN’s campus in Wako, Saitama, as part of a project to integrate quantum computers with high-performance computing (HPC) systems like the supercomputer Fugaku. The initiative, commissioned…
Students use machine learning to predict crime at Thunderbird Hackathon
High school students dove into the world of coding and artificial intelligence (AI) at the second annual Thunderbird Hackathon, held earlier this month. Sponsored by Sandia National Laboratories and Explora’s X Studio, the event challenged teams to create machine learning models predicting crime incidents using real data from Albuquerque’s open-data initiative. “At Thunderbird Hacks, we…
AI takes center stage at ORNL, where potential meets risk
In the early 1990s, the internet seemed poised to improve our lives by democratizing knowledge, publishing, and communication. While it did achieve many of these goals, it also introduced security risks ranging from malware to phishing. The online world of 2024 feels more like a war zone than a digital playground. “If you connect a…
This week in AI research: Latest Insilico Medicine drug enters the clinic, a $0.55/M token model R1 rivals OpenAI’s $60 flagship, and more
While OpenAI charges $60 per million tokens for its flagship reasoning model, a Chinese startup just open-sourced an alternative that matches its performance at 95% less cost. Meet DeepSeek-R1, the RL-trained model that’s not just competing with Silicon Valley’s AI giants but, in some configurations, running on consumer laptops rather than in data…
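As a quick sketch using only the two list prices quoted in the headline (which may not refer to the same token type, so the article’s blended “95% less” figure can legitimately differ from this single-number comparison):

```python
# Comparing only the two per-million-token prices quoted in the headline.
r1_price_per_m = 0.55     # USD per million tokens (DeepSeek-R1, as quoted)
o1_price_per_m = 60.00    # USD per million tokens (OpenAI flagship, as quoted)

savings = 1 - r1_price_per_m / o1_price_per_m
print(f"≈ {savings:.1%} cheaper on these list prices")   # ≈ 99.1%
```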
How the startup ALAFIA Supercomputers is deploying on-prem AI for medical research and clinical care
Imagine a hospital spending millions on advanced imaging equipment yet relying on decades-old computers to run the software. That paradox propelled robotics and computer vision veteran Camilo Buscaron—a former systems engineer at NVIDIA and Chief Technologist for AWS Robotics—into action. In 2023, he set out to commercialize an open-source computer vision library known as Kornia,…
R&D Market Pulse: $29B energy mega-merger, new CHIPS Act hub at ASU, and more AI restrictions on China
In this week’s R&D Market Pulse, the $29.1 billion Constellation-Calpine mega-merger promises to reshape U.S. energy, the Commerce Department awards a third CHIPS for America facility to Arizona State University, and new AI export restrictions put China on notice. Meanwhile, Elon Musk’s xAI rolls out a consumer app, BlackRock withdraws from a major climate initiative,…