Artificial intelligence has moved past the pilot stage in India. Enterprises across ecommerce, travel, food delivery, fintech, gaming, and real estate are deploying models that handle millions of daily transactions, conversations, and interactions. Yet, as panelists at the Groq × YourStory roundtable themed 'Faster than now: Building India's real-time AI future' in Bengaluru pointed out, the country's most ambitious use cases will not succeed without infrastructure that can respond in real time, at sustainable costs, and at national scale.
The closed-door roundtable, moderated by Sangeeta Bavi, COO of YourStory, brought together a curated group of India's leading technology builders: Sanjay Mohan (CTO, MakeMyTrip), Madhusudhan Rao (CTO, Swiggy), Jithendra Vepa (Co-founder & CTO, Observe.AI), Manish Gupta (Founder & CEO, Rezo.ai), Zaher Abdulazeez (Head of Data Science, NoBroker), Ramesh Gururaja (SVP – Consumer Products & Growth, Flipkart), Samir Deepak Shaw (Head of Data Science, Flipkart), Dr. Arjun Jain (Founder & Chief Scientist, Fast Code AI), and Satish Rao (SVP – Semiconductor BU, Mistral Solutions).
They were joined by members of the Groq leadership team: Jonathan Ross, Founder and CEO; Scott Albin, GM – APAC; Mohsen Moazami, President International (Office of the CEO); and Mehul Patel, Sales – India, who shared global perspectives and demonstrated Groq's ultra-low latency infrastructure in action.
Latency: the invisible deal-breaker
Across industries, participants agreed that latency is the biggest blocker to unlocking real-time AI at scale. Current commercial models often take many seconds to generate responses. For customer-facing applications such as travel planning, ecommerce support, food delivery ETAs, or multiplayer gaming, such delays are deal-breakers.
The ambition, panelists stressed, is to move from seconds to sub-seconds. True adoption will come when AI interactions feel instantaneous, natural, and seamless. Without that, customer trust and business value erode.
Economics: AI must beat human costs
Even if speed improves, leaders cautioned, economics will decide adoption. India's market is unforgivingly price-sensitive. In areas like customer service, panelists noted that a human call center agent costs roughly Rs 9 a minute. For AI to replace or augment that role, the cost per interaction has to fall below that line.
The group emphasised that businesses cannot wait years for ROI. The expectation is that AI must justify itself within the current financial cycle. Speed without affordability is unsustainable; affordability without speed is irrelevant.
Language and voice: India’s unsolved frontier
Several participants pointed out that language diversity is as big a barrier as speed or cost. Models that perform well in English struggle with the code-mixed, accented, and noisy environments that dominate India.
Voice AI was called out as especially challenging. While progress has been made, most systems fail to handle the wide range of Indian dialects and accents in real-world conditions. Solving this frontier, panelists argued, is essential to taking AI beyond the metros and into mass-market adoption.
Another theme that surfaced was the changing nature of consumer decision-making. Leaders noted that consumers are shifting away from search-based discovery and toward conversational interfaces. If customers increasingly rely on AI assistants for shopping, travel, or financial advice, platforms risk losing the traditional top of the funnel.
The consensus: enterprises cannot simply bolt AI onto existing flows. They must design AI-native experiences where real-time intelligence is central to customer engagement.
Edge and constrained environments
Beyond consumer internet, voices from sectors like defense and semiconductors stressed that edge deployments come with unique constraints. In these contexts, workloads cannot rely on the cloud due to security, bandwidth, or latency requirements.
Panelists discussed the need for smaller, more efficient models that can run on-device under strict power and heat limitations. These innovations, they argued, would not only benefit defense but also unlock opportunities in automotive, IoT, and rural connectivity.
Enterprise leaders also reflected on AI's impact on developer productivity. Generative tools can accelerate coding, but they also introduce integration challenges and knowledge gaps. Without governance and evaluation frameworks, short-term gains may become long-term risks.
At the organizational level, panelists noted adoption gaps. Non-technical teams often approach LLMs as "magic wands" and lose momentum when they confront the effort required for integration and scaling. Clear frameworks and change management will be essential to close these gaps.
Data residency and local infrastructure
Infrastructure readiness was another recurring concern. Leaders pointed out that several industries, notably BFSI and telecom, already require customer data to remain in-country. This makes local infrastructure not just a compliance issue, but a practical necessity for adoption.
While global models and cloud deployments remain important, the panel agreed that India's AI ambitions depend on building local capacity, both to meet enterprise requirements and to enable large-scale real-time applications.
Groq's demo: speed in action
The Groq team showcased its ultra-low latency capabilities, running models across text, vision, and video with sub-200 millisecond inference times. Demos included conversational itinerary planning, near-instant code generation, vision-based object recognition, and video rendering scaled to 4K within seconds.
Groq also shared its roadmap: to significantly accelerate the scale of its infrastructure globally, with pricing models that aim to bring down costs dramatically. The leadership confirmed that India is under active consideration for infrastructure deployment, contingent on visible demand signals.
Rethinking the global game board
The discussion closed on a forward-looking note. Panelists agreed that India is no longer a passive adopter of global AI. With its scale, data, and talent base, the country has the potential to set global benchmarks in real-time AI.
Groq's Jonathan Ross left the group with a provocation: the global technology order is being rewritten faster than anyone anticipated. Could the next generation of top global tech companies be Indian? The roundtable agreed that the answer depends on whether India can combine speed, economics, and ecosystem readiness to make real-time AI inevitable.
The roundtable highlighted both the optimism and the urgency surrounding India's AI journey. Optimism, because the market has the scale, talent, and entrepreneurial drive to lead. Urgency, because latency, costs, and infrastructure gaps still stand in the way.

