High demand for domestic AI infra, Neysa AI CEO says

Neysa AI CEO Sharad Sanghi. File photo: Special Arrangement.
Neysa AI CEO Sharad Sanghi said that “sovereignty, latency and compliance” were driving demand for locally installed AI infrastructure among enterprise clients.
Mr. Sanghi spoke to The Hindu on the sidelines of the AI Impact Summit, hours after news broke that his firm had raised $600 million from Blackstone, Inc., and planned to borrow another $600 million, building a war chest to procure GPUs to run AI on Indian servers.

Neysa’s fundraise is notable because the vast majority of AI “inferencing,” or responses to prompts, comes from datacentres abroad — a situation that causes little pain for retail users, but could be a source of discomfort for enterprises looking to incorporate sophisticated LLMs internally. It is among the largest AI infrastructure investments in an Indian firm.
“Almost all the people who provide development platform[s] want to set up increasing clusters in India,” Mr. Sanghi said. “Now with the tax holiday [for datacentres processing data for foreign entities], they will use India as a hub not only for India but also for the region. That’s why we can be in dialogue with some of these large players and we are expecting to close some of the vendors.”
Mr. Sanghi said he hoped to use some of the cash from the raise, which took the firm to unicorn status, to market its platforms abroad, but that the majority of the fundraise would go to “AI infrastructure”.

Mr. Sanghi said that the scale of non-tech firms seeking to leverage LLMs was “unbelievable,” and that the value he was able to provide was “handholding” them into deploying systems for industries like insurance and for small startups. “Some of them were using hyperscalers,” he said, but added that a “nimble” service was more appealing to users.
“We believe that today there are anywhere from 50[,000] to 60,000 GPUs deployed in the country, and we believe in the next 2-3 years it will be 3 million, so it is going to be a 30-fold explosion,” he said. Much of this infrastructure will be used for inference, and not necessarily training, he said.
Published – February 17, 2026 05:38 am IST