5th Sep 2024

Sustainability-in-Tech : New Device Could Reduce AI Energy Consumption By A Factor Of 1,000+

Engineering researchers at the University of Minnesota Twin Cities in the US claim to have demonstrated a state-of-the-art hardware device that could reduce energy consumption for artificial intelligence (AI) computing applications by a factor of at least 1,000! 

AI’s Massive Energy Consumption 

The issue the researchers were aiming to tackle is the huge energy consumption of AI, which is only increasing as AI becomes more widespread. For example, the International Energy Agency (IEA) recently issued a global energy use forecast showing that energy consumption for AI is likely to more than double, from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026. That figure is roughly equivalent to the electricity consumption of the entire country of Japan!  
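For a sense of scale, here is a quick back-of-the-envelope check of those IEA figures (an illustrative calculation, not taken from the report itself):

```python
# Quick check of the IEA forecast figures quoted above.
consumption_2022_twh = 460
consumption_2026_twh = 1_000
growth = consumption_2026_twh / consumption_2022_twh
print(f"Projected growth: {growth:.2f}x")  # ~2.17x, i.e. more than double
```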

With demand growing from an increasing number of AI applications, the researchers have been looking for ways to create a more energy-efficient process, while keeping performance high and costs low. 

The New Device – Uses The ‘CRAM’ Model 

The new device, developed by University of Minnesota College of Science and Engineering researchers, works using a model called computational random-access memory (CRAM). The hardware device that uses the CRAM model is a ‘machine learning inference accelerator’, used to speed up the process of running machine learning models, specifically during the inference phase. Inference is the phase where a trained machine learning model makes predictions or decisions based on new, unseen data. 
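To make the training/inference distinction concrete, the short Python sketch below (an illustrative example using scikit-learn, not anything from the paper) trains a small model once and then runs inference, the repeated prediction step that accelerators like this one are designed to speed up:

```python
# Minimal sketch of training vs. inference (illustrative only --
# the CRAM device accelerates the inference step, not training).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)

# Training phase: the model learns its parameters from known data (done once).
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Inference phase: the trained model makes predictions on new, unseen data.
# In production this step runs constantly, which is why its energy cost matters.
predictions = model.predict(X_new)
print(predictions)
```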

The researchers claim in a recent paper that a CRAM-based machine learning inference accelerator can achieve an improvement in energy efficiency on the order of 1,000 times. Other examples in the paper showed energy savings of 2,500 and 1,700 times compared with traditional methods. 

What Makes It So Different? 

The difference with the CRAM model is that, whereas current AI processes involve constantly transferring data between logic (where information is processed within a system) and memory (where the data is stored), the CRAM model performs computations directly within the memory cells. Because the data never has to leave the memory array, there is no need for slow and energy-intensive data transfers to take place, which results in much greater efficiency. 
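To illustrate why cutting out data movement matters so much, the toy Python model below compares a conventional move-then-compute loop with an in-memory alternative. The per-operation energy figures are made-up assumptions for illustration, not measurements from the research:

```python
# Toy energy model: conventional architecture (move operands between
# memory and logic, then compute) vs. in-memory computing (compute in place).
# The energy costs below are illustrative assumptions only.

ENERGY_TRANSFER_PJ = 100.0  # assumed picojoules to move one operand
ENERGY_COMPUTE_PJ = 1.0     # assumed picojoules for one arithmetic operation

def conventional_energy(num_ops: int) -> float:
    """Each operation fetches two operands and writes one result back."""
    return num_ops * (3 * ENERGY_TRANSFER_PJ + ENERGY_COMPUTE_PJ)

def in_memory_energy(num_ops: int) -> float:
    """Computation happens inside the memory array: no transfers needed."""
    return num_ops * ENERGY_COMPUTE_PJ

ops = 1_000_000
ratio = conventional_energy(ops) / in_memory_energy(ops)
print(f"Illustrative saving from eliminating transfers: {ratio:.0f}x")
```

With these made-up numbers the saving works out at roughly 300x; the point is simply that when moving data costs far more energy than computing on it, keeping computation inside memory changes the energy picture dramatically.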

Jian-Ping Wang, the senior author of the research paper about CRAM, highlighted how this idea of using memory cells directly for computing “20 years ago was considered crazy”, but that, thanks to the interdisciplinary faculty team built at the University of Minnesota (UMN), they have “demonstrated that this kind of technology is feasible and is ready to be incorporated into technology.” 

Very Flexible Too 

In comments posted on the UMN website, Ulya Karpuzcu, an expert on computing architecture and a co-author on the paper, has also highlighted another reason why CRAM is more energy-efficient than the traditional building blocks of today’s AI systems. Karpuzcu said: “As an extremely energy-efficient digital based in-memory computing substrate, CRAM is very flexible in that computation can be performed in any location in the memory array. Accordingly, we can reconfigure CRAM to best match the performance needs of a diverse set of AI algorithms.” 

Builds On MTJ Research 

This latest discovery builds on Wang and his team’s previous groundbreaking, patented research into Magnetic Tunnel Junction (MTJ) devices. These are the nanostructured devices used to improve hard drives, sensors, and other microelectronics systems, including Magnetic Random Access Memory (MRAM), which has been used in embedded systems such as microcontrollers and smartwatches.  

Semiconductors 

Following their own successful demonstration of the efficiency boost provided by CRAM-based hardware, the research team is now planning to work with semiconductor industry leaders, including those in Minnesota, to provide large-scale demonstrations and produce the hardware needed to advance AI functionality. 

What Does This Mean For Your Business? 

The development of this new CRAM-based machine learning inference accelerator could be a significant breakthrough with far-reaching implications across several industries. For example, for the semiconductor industry, this discovery could bring a new era of innovation. By partnering with the University of Minnesota researchers, semiconductor companies have an opportunity to lead the charge in creating energy-efficient AI hardware, offering a competitive edge in an increasingly sustainability-focused market. The ability to reduce energy consumption by such a vast factor may not only address the growing concern over AI’s carbon footprint but may also align with global initiatives towards greener technologies. 

For AI application makers and users, the introduction of CRAM-based technology could revolutionise the way AI systems are designed and deployed. The drastic reduction in energy consumption may allow developers to create more complex and capable AI applications without being constrained by energy costs and efficiency limitations. This could lead to a surge in innovation, as more businesses could afford to implement advanced AI solutions, knowing that their energy requirements and associated costs will be manageable. Users of these AI applications may benefit from faster, more responsive, and more cost-effective services, as the energy savings translate into enhanced performance and lower operational costs. 

The energy industry, too, stands to benefit from this technological advancement. With AI’s projected energy consumption doubling within a few years, the shift towards more energy-efficient computing is not just beneficial but essential. By adopting CRAM-based hardware, data centres and other large-scale AI operators could significantly reduce their energy demands. This reduction may ease the pressure on energy resources and help stabilise energy prices, which is particularly important as demand continues to grow. For data centre operators in particular, the promise of lower energy consumption translates directly into reduced operating costs, making them more competitive and sustainable. 

Also, this development may support global carbon emission targets, a concern shared by governments, businesses, and consumers alike. By enabling a reduction in energy usage by a factor of 1,000 or more, the adoption of CRAM-based AI technology could substantially cut carbon emissions from data centres and other heavy users of AI. This would align with the goals of the many corporations and nations trying to meet climate commitments and reduce their environmental impact. The widespread implementation of such efficient technology could even become a cornerstone of global efforts to combat climate change, offering a practical and impactful solution to one of the most pressing challenges of our time. 

The advent of CRAM-based machine learning inference accelerators, therefore, may not only transform the AI landscape but could also reshape industries and address critical global challenges. By embracing this technology, businesses could achieve greater efficiency and performance while contributing to a more sustainable and environmentally friendly future.
