Amazon partners with Hugging Face to run AI models cost-effectively on its custom chips.
According to Reuters, Amazon Web Services (AWS) announced on Wednesday a partnership with AI startup Hugging Face to launch a solution that lets developers run thousands of AI models more easily and efficiently on Amazon's custom Inferentia2 chip.
Hugging Face, valued at $4.5 billion, has become a central hub for AI research and development worldwide. Through its platform, developers can access, modify, and share open-source AI models such as Meta Platforms' Llama 3.
The collaboration marks an important step in AWS's effort to make AI technology more widely accessible. By leveraging its custom Inferentia2 chip, AWS aims to support AI model inference at lower cost and higher efficiency, meeting growing market demand.
Jeff Boudier, Head of Product and Growth at Hugging Face, said: "We have always been committed to making AI technology more accessible and efficient. Through our collaboration with AWS, we can ensure that more people can easily run AI models and realize their innovative ideas in the most cost-effective way."
Matt Wood, who leads AI products at AWS, also welcomed the partnership: "We are delighted to collaborate with Hugging Face to drive innovation and application of AI technology. The Inferentia2 chip will give developers a powerful platform to run AI models at lower cost and higher efficiency, delivering greater business value and social impact."
As AI technology continues to evolve, more companies and organizations are recognizing its central role in digital transformation and innovation. The collaboration between AWS and Hugging Face lends strong support to the broader adoption of AI and pushes the industry forward.