AI funding is failing to reap significant returns on investment. Is there a danger in over-investing right now?

AI is a long-term play for us as an organisation, and for the tech industry as a whole. This entails building solutions to business problems that customers and the industry have, and investing ahead of time.

If you take some data points from how Google or Microsoft are thinking, for example, this is probably the first year in which Google has more capital expenditure than operational expenditure. They are putting a lot more into facilities, into power purchase agreements, into running data centres efficiently.

So, that gives us a directional view of where the market is going: generative AI is where the tech industry is investing, and chip-makers are broadly investing there as well. Given this expansion, it’s critical for us to work with an ecosystem of partners in the AI space, across the stack. We work with partners at the chip level – AMD, NVIDIA, Intel – we work with the hyperscalers and IBM as well, and HP too is starting to play in that market.

We also work with private AI stack players such as Dell and HPE, building solutions there. And finally, we are building more AI partnerships with the hyperscalers to have access to LLMs and that kind of technology. Across the stack – at the chip level, at the hyperscaler level, at the data centre level and, on top of it, with the model players – we have relationships and partnerships that we are building.

We want to make all of this accessible through our network of cloud native AI labs across the globe. We have a couple now, one in Noida and another in London, and we are starting others in the US, in New Jersey, and in Singapore, to expand this network.

What is HCL’s position on the appropriate approach to regulating AI?

We are preparing ourselves as a company for AI regulation to become more prominent, especially in regulated industries like healthcare and financial services. The technology will run a little ahead of the regulation, but the regulation will catch up. So if you think about the EU regulation, or what there is today in California, I think it will become more prominent. But it will also become more thoughtful. What I mean by that is that certain high-risk areas, where you don’t want AI to get in, will be prohibited, and there will be many other areas where regulation will be light-touch. Either way, you have to comply with that regulation and those obligations to get products and services to customers. So, we are preparing ourselves as an organisation for multi-tier regulation.


How is AI impacting the HCL products and services that you offer customers? And how do you see that evolving over the next five years?

So, there are two things to distinguish here. One is how customers create business change with AI. There are examples in healthcare and banking, in marketing, across different industry verticals and some horizontals. Call centre technology, and the front end of customer support service technology, is going to advance rapidly. Many industries have their customers landing through this interface. But if a person’s issue is not getting resolved, they need to be able to talk to someone, and those resolution systems will get much better through much faster learning from knowledge bases and from previous human experiences. Those are things that benefit customers and end customers.

And then there are the technologies we are building to help our own internal processes improve. HCLTech, as an organisation, has a little over 110,000 developers, with copilots expected to deliver benefits in coding. We have a solution, called AI Force, that spans the life cycle of our products. We are able to see all the data, right from conceptualising a service or a product to putting it into production. This enables us to find efficiencies beyond just the coding phase. We invested in AI about seven years back, and we built systems using generative AI. These systems have matured really fast in the last 12 months or so.

I think for knowledge industries like ours, the internal efficiencies and the service transformation are equally important while we are transforming customers’ businesses, and we think we can also help our customers think about their internal processes and mature them. So, there are two opportunities. One is for our customers to change their business and create innovative new services. The second is for our internal processes, and for customers’ internal processes, to become much more efficient.

With an efficiency boost from integrating AI, are you able to reduce your 110,000-developer head count?

Our directional view is that we will continue to grow our business, not grow our head count. That’s our aspiration, and it will take us some time to get there. We’ve been delivering industry-leading growth so far, and we expect we will continue to grow our business without changing head count. However, the nature of the people, skills and capabilities that we will require going forward will change significantly. And we expect our cost structures will keep some relationship to our revenue as we grow.

Can you talk about the evolving skillset required to meet the demands of the AI era?

Many people have heard about prompt engineering as a role. Prompt engineering is essentially how to use AI systems. If you prompt well, you will get good answers. If you don’t prompt well, you’ll get poor answers, and the number of iterations required to get the right answer increases. AI systems need to be monitored, and skills are needed for everything that’s required for an AI system to live beyond its inception. Those skills for running LLM operations are new capabilities. But there will be more. Some roles will be new, like prompt engineering. But a lot of existing roles will also transform or change.