
The Potential of Stochastic Spintronics for AI

Researchers from Tohoku University and the University of California, Santa Barbara, have developed innovative computing hardware featuring a Gaussian probabilistic bit (g-bit) based on a stochastic spintronics device. This breakthrough offers a promising energy-efficient solution for the computational demands of generative AI.

(a) Configuration of a Gaussian probabilistic bit (g-bit) made of five binary probabilistic bits (p-bits). (b) Photograph of a prototype probabilistic computer with stochastic spintronics devices and an FPGA. (c) Measured Gaussian random numbers with various means and standard deviations (sigma). Image Credit: Shunsuke Fukami and Kerem Camsari

As Moore's Law reaches its limits, domain-specific hardware architectures are emerging to tackle complex computational problems. Among these, probabilistic computing, which utilizes stochastic building blocks, has gained attention.

Probabilistic computers are particularly well-suited for algorithms in combinatorial optimization and statistical machine learning, where inherent randomness plays a critical role. While quantum computers excel in problems grounded in quantum mechanics, probabilistic computers focus on algorithms that leverage probability.
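The building block of such machines is the p-bit, a unit that fluctuates randomly between 0 and 1, with its bias toward 1 controlled by an input. A minimal software sketch of this statistical behavior (the real devices use stochastic magnetic tunnel junctions; the sigmoid form and the `p_bit` function here are illustrative assumptions, not the hardware's implementation):

```python
import math
import random

def p_bit(input_signal: float) -> int:
    """Illustrative binary p-bit: outputs 1 with a sigmoidal
    probability of its (dimensionless) input. This is only a
    software model of the device's statistics."""
    prob_one = 1.0 / (1.0 + math.exp(-input_signal))
    return 1 if random.random() < prob_one else 0

# With zero input the p-bit fluctuates roughly 50/50 between 0 and 1;
# a large positive input pins it near 1.
samples = [p_bit(0.0) for _ in range(10_000)]
print(sum(samples) / len(samples))
```

Tuning the input lets an algorithm steer many such fluctuating bits toward low-energy configurations, which is what makes them useful for optimization and sampling.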

Traditional probabilistic computers rely on binary probabilistic bits (p-bits), which limit their efficiency in applications requiring continuous variables. The collaboration between the University of California, Santa Barbara, and Tohoku University addresses this limitation with the introduction of Gaussian probabilistic bits (g-bits). These g-bits extend the capabilities of p-bits by generating Gaussian random numbers, enabling efficient handling of continuous-variable algorithms.
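One intuition for how binary units can yield a continuous random variable is the central limit theorem: summing several independent binary fluctuations produces an approximately Gaussian value. The sketch below illustrates this idea with five p-bit-like coin flips (matching the five p-bits per g-bit shown in the figure); the actual hardware construction differs in detail, and the `g_bit` function and its scaling are assumptions for illustration only:

```python
import random

def g_bit(n_pbits: int = 5, mean: float = 0.0, sigma: float = 1.0) -> float:
    """Illustrative g-bit: sum several binary +/-1 fluctuations,
    then scale and shift. By the central limit theorem the result
    is approximately Gaussian with the requested mean and sigma."""
    total = sum(random.choice((-1, 1)) for _ in range(n_pbits))
    # Var(total) = n_pbits, so dividing by sqrt(n_pbits) gives unit variance.
    return mean + sigma * total / n_pbits ** 0.5

samples = [g_bit(mean=2.0, sigma=0.5) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to the requested mean of 2.0
```

With only five binary units the distribution is coarse; the point is that a small number of discrete stochastic elements can stand in for a continuous one.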

The development of g-bits has significant implications for machine learning models, such as the Gaussian-Bernoulli Boltzmann Machine (GBM). GBMs can now operate more efficiently on probabilistic computers equipped with g-bits, paving the way for advancements in optimization and learning tasks.
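A GBM pairs binary units with Gaussian ones, so p-bits and g-bits map naturally onto its two variable types. The following toy Gibbs-sampling sweep sketches that structure under a unit-variance convention; the weights, biases, and the `gibbs_step` helper are made-up illustrations, not the researchers' implementation:

```python
import math
import random

def gibbs_step(v, W, b_h, b_v):
    """One Gibbs sweep of a toy Gaussian-Bernoulli Boltzmann
    machine. Binary hidden units play the role of p-bits;
    Gaussian visible units play the role of g-bits."""
    # Sample binary hidden units given the continuous visibles.
    h = []
    for j in range(len(b_h)):
        act = b_h[j] + sum(v[i] * W[i][j] for i in range(len(v)))
        p = 1.0 / (1.0 + math.exp(-act))
        h.append(1 if random.random() < p else 0)
    # Sample Gaussian visible units given the binary hiddens.
    v_new = []
    for i in range(len(b_v)):
        mu = b_v[i] + sum(W[i][j] * h[j] for j in range(len(b_h)))
        v_new.append(random.gauss(mu, 1.0))
    return v_new, h

W = [[0.1, -0.2], [0.0, 0.3]]          # illustrative 2x2 weights
v, h = gibbs_step([0.0, 0.0], W, [0.0, 0.0], [0.0, 0.0])
```

On a probabilistic computer, the binary sampling line would be handled by p-bits and the Gaussian sampling line by g-bits, rather than by pseudorandom software calls.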

One notable application is generative AI, where current models like diffusion models require computationally intensive iterative processes to produce realistic images, videos, and text. By leveraging g-bits, probabilistic computers can perform these iterative computations more efficiently, reducing energy consumption while maintaining high-quality outputs.

Other potential applications include portfolio optimization and mixed-variable problems, where models must process both binary and continuous variables.

Conventional p-bit systems struggle with such tasks because they are inherently discrete and require complex approximations to handle continuous variables, leading to inefficiencies. By combining p-bits and g-bits, the new architecture overcomes these limitations, enabling probabilistic computers to address a much broader range of problems directly and effectively.
