Amazon Web Services (AWS) and Amazon-backed AI developer Hugging Face have partnered to offer AWS's custom AI chip, designed to deliver cost-effective performance for AI compute workloads, to Hugging Face developers.
Inferentia2 will be available to Hugging Face users through its developer hub, which is widely used to access open-source models such as Meta’s Llama 3.
The chip will allow developers to use modified open-source AI models to power software.
“… When you find a model in Hugging Face you are interested in, you can deploy it in just a few clicks on Inferentia2,” read a blog post by Hugging Face.
Users will be billed automatically based on the amount of time used.
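For context on what that deployment flow looks like in practice, the sketch below shows a model from the Hugging Face hub being compiled and run on Inferentia2 using Hugging Face's optimum-neuron library on an AWS inf2 instance. The model ID, sequence length and core count are illustrative assumptions, not details taken from the announcement.

```python
# Illustrative sketch only: assumes an AWS inf2 (Inferentia2) instance with the
# optimum-neuron package installed; model ID and compilation settings are examples.
from optimum.neuron import NeuronModelForCausalLM
from transformers import AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"  # any compatible Hub model

# Compile (export) the model for the Neuron cores on first load.
model = NeuronModelForCausalLM.from_pretrained(
    model_id,
    export=True,            # trace and compile for Inferentia2
    batch_size=1,
    sequence_length=2048,
    num_cores=2,            # NeuronCores to shard the model across
    auto_cast_type="fp16",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

inputs = tokenizer("AI chips are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```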
In a 2024 tech sentiment survey by research and analysis company GlobalData, AI was ranked the most disruptive technology by businesses.
More than 40% of businesses participating in the survey stated that AI had already begun to disrupt their industry, and a further 13% believed that AI would begin to disrupt their business within the next 12 months.
By 2030, GlobalData forecasts the total AI market will be worth more than $1.04trn, achieving a compound annual growth rate of 39% from 2023.
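As a rough arithmetic check on those figures (an illustration, not taken from the article), a 39% compound annual growth rate over the seven years from 2023 to 2030 implies a 2023 base of roughly $1.04trn / 1.39^7, or about $104bn.

```python
# Back-of-the-envelope check of the forecast figures (illustrative, not sourced):
# value_2030 = value_2023 * (1 + CAGR) ** years
value_2030 = 1.04e12        # $1.04trn forecast for 2030
cagr = 0.39
years = 2030 - 2023         # 7 years
implied_2023_base = value_2030 / (1 + cagr) ** years
print(f"Implied 2023 AI market size: ${implied_2023_base / 1e9:.0f}bn")  # ~ $104bn
```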
AI chips are also predicted to account for 30% of all next-generation chip manufacturing by 2030, according to GlobalData forecasts.