The artificial intelligence (AI) sector is undergoing a period of diversification, with its markets evolving on multiple fronts.
While the sector is showing signs of maturing, not least with two of modern AI’s early researchers announced this week as Nobel Prize winners for their work, the technology and its applications remain emergent in the grand scheme of things.
The newly published fourth edition of GlobalData’s Artificial Intelligence Executive Briefing takes stock of how the AI industry is variously progressing.
Gemini ranked top
Perhaps surprisingly, it names Google’s Gemini large language model (LLM) as the market leader. OpenAI’s ChatGPT remains arguably the best-known LLM, having been the platform that brought consumer-facing LLMs, generative AI and arguably AI itself into the public consciousness.
However, GlobalData currently ranks it second behind Gemini in a competitive landscape assessment that takes into account six factors – model options, specialised capabilities, AI guardrails, ecosystem integration with third-party providers, professional services and go-to-market strategy.
It puts its view of Google’s current market leadership down to “a combination of highly developed model capabilities in the Gemini family and sophisticated enterprise tooling to build and scale generative AI applications.”
Small language models emerging
In an area closely related to LLMs like Gemini and ChatGPT, GlobalData notes the continued emergence of small language models (SLMs) and says they are expected to have a significant market impact.
SLMs are focussed on more specific purposes than LLMs like Gemini and ChatGPT, and as such they are trained on smaller datasets and are smaller in scale – typically comprising fewer than 10 billion parameters. Their smaller size typically brings faster training times, lower carbon footprints and improved security, and they offer particular potential for organisations looking to deliver targeted applications.
“SLMs are not meant to replace LLMs but rather complement them,” GlobalData’s briefing states. “As competition in the AI market intensifies, companies are under increasing pressure to show a strong business case with demonstrable return on investment. SLMs, with their suitability for industry-specific applications, offer easier scalability across diverse environments.”
Hype slowing on generative AI
Despite the continued maturation of LLMs and SLMs, GlobalData believes the initial hype about their underpinning generative AI technology is cooling. “Related job postings have fallen since Q2 2024,” it states.
Despite a slowdown, however, it does not expect a so-called “generative AI winter”, with the emergence of SLMs among the arguments it offers against such an occurrence. The briefing also points to the diversification of AI vendor revenue streams and to increasingly efficient models as evidence of a slowed market rather than a stalled one.
AI chip competition for Nvidia
Elsewhere, GlobalData notes that, while Nvidia remains the market leader in AI chip design, AMD is in pursuit.
“AMD has traditionally had more of a focus on high-performance computing but since the release of its Instinct MI300 AI chips, it has been challenging Nvidia in AI workloads,” the briefing states. It says that AMD is positioning itself to challenge Nvidia in both power efficiency and scalability.
The briefing also points to Big Tech companies strengthening their in-house chip design capability. “Large tech companies are developing their own processors to limit reliance on costly Nvidia chips,” it explains. Among them are Amazon, Meta, Google and Microsoft.
For the time being, though, Nvidia remains dominant. “A study showed that Nvidia was responsible for 98% of data centre GPU shipments in 2023,” the briefing notes.