The AI Engine Room
Behind every seamless AI interaction, from breaking down complex economic concepts to debugging code or drafting emails, lies an immense computational effort. This processing happens at breakneck speed inside data centers – colossal physical structures that are the bedrock of the digital AI world.

As you journey away from the bustling cityscapes of Noida towards Knowledge Park V, an area earmarked for corporate and institutional growth, the landscape transforms. Amidst tranquil buffalo herds and promises of prime office space, a commanding six-story monolith emerges: a data center. The facility is operated by Yotta, an AI and cloud infrastructure firm based in Mumbai, and represents a significant step in India's AI journey. Its current 40 MW capacity, expandable to 250 MW, is supported by a dedicated 220 kV substation. Yotta, backed by the Hiranandani Group, houses one of the nation's most extensive AI computing infrastructures across its data centers, including its Navi Mumbai site, recognized as India's largest.

This infrastructure is pivotal, acting as the 'refineries' for the 'new oil' that is data, as industry leaders often put it. The construction of data centers is arguably the largest infrastructure build-out in human history, yet India, with 152 operational data centers and a total IT load capacity of 1,200-1,300 MW, trails global leaders such as the US (around 14,000 MW across 5,500 centers) and China (7,000 MW).
Fortified AI Sanctuaries
Accessing these AI data centers involves a rigorous security protocol: passage through four checkpoints and the surrender of personal mobile devices. The perimeter is under constant surveillance by cameras and patrolled by a dedicated private security force. Within the facility, even the elevators are built for utility, large enough to transport heavy equipment, as guides explain on tours.

The core computing area, on the fifth floor of the D1 Data Centre, maintains a strictly controlled environment. Visitors must first step onto an adhesive mat that pulls dust from their footwear. Movement within this zone is limited to authorized personnel – customer engineers, facility operations teams, vendors, project managers, IT operations staff, service delivery engineers, and essential 'remote hands and feet' support – and their presence is meticulously managed, with minimal unsupervised activity permitted. The space is characterized by rows of white server racks stretching towards the ceiling, punctuated by the steady blinking of countless indicator lights and an incessant low mechanical hum – the collective sound of machines processing requests from users around the globe. This is the unseen physical infrastructure behind the everyday AI tools we rely on.
The AI Architecture
The construction and operation of AI are often conceptualized as a five-layer structure. At the foundation lies the physical infrastructure: the land, electricity, and sophisticated cooling systems that keep data centers operational. The next layer is computing power – the compute delivered by Graphics Processing Units (GPUs), specialized chips engineered for the immense calculations required to train and run AI models. Above that sits the data layer: the vast repositories of text, images, and signals used to educate these systems. Then come the foundation models – large-scale AI systems capable of interpreting language, images, and code. Finally, the topmost layer consists of applications, where AI is integrated into consumer and enterprise products for practical deployment.

Sunil Gupta, MD and CEO of Yotta Data Services, explains that training a model like ChatGPT involves feeding it colossal amounts of data, which GPUs process to learn patterns and language structures. After training, those same GPUs power real-time responses: every user interaction with an AI system triggers GPUs working at extreme speeds inside a data center. Yotta recently announced a partnership with NVIDIA to deploy 20,000 advanced GPUs at its Greater Noida data center.

The computing zones are designed with alternating hot and cold aisles. In the cold aisles, where the fronts of the server racks face each other, thick water pipelines beneath the raised floor actively cool the air. Behind the locked, mesh-doored racks, only blinking lights and fiber-optic cables hint at the processing power within. Approaching the rear of the rows unleashes a palpable wave of heat.
Evolving Data Center Design
Not all data centers are built for the same purpose. Older facilities primarily served as extensive digital storage and transfer hubs, akin to massive filing cabinets. Contemporary AI data centers, by contrast, are specifically engineered for the demanding tasks of training and running sophisticated AI models. Ashish Arora, CEO of Nxtra by Airtel, explains that AI data centers are purpose-built to enable machines to learn, reason, and analyze, powered by high-performance GPUs. These GPU-accelerated systems can process enormous datasets concurrently, essential for both training and running generative AI models.

Historically, data centers relied on Central Processing Units (CPUs), which are designed to process tasks largely one after another. GPUs excel at parallel processing, handling vast numbers of calculations simultaneously – the pattern fundamental to AI workloads. That capability comes with steep demands: GPUs generate considerably more heat, necessitating advanced cooling solutions, sturdier racks for heavier equipment, and a substantially larger electricity supply. To ensure continuous operation, Yotta's Greater Noida facility, with its 40 MW capacity (expandable to 250 MW), draws power from two separate external substations; the campus's own 220 kV substation can sustain the massive server farms for up to 48 hours if the external grid fails. Tarun Kumar of Invest UP, the UP government's investment promotion agency, notes the government's proactive role in establishing a dedicated substation and ensuring dual grid access for the data center park to meet its significant power requirements.
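The sequential-versus-parallel contrast described above can be illustrated with a short Python sketch. This is an analogy only: NumPy's vectorized arithmetic on a CPU stands in for the bulk parallel arithmetic a GPU spreads across thousands of cores, and none of the numbers reflect real data-center hardware.

```python
import time
import numpy as np

# AI workloads reduce largely to matrix multiplications.
# Compare a CPU-style sequential triple loop (one multiply-accumulate
# at a time) with a single vectorized call, which libraries dispatch
# to many parallel execution units at once.

n = 120
a = np.random.rand(n, n)
b = np.random.rand(n, n)

def matmul_sequential(a, b):
    """Compute a @ b one element at a time, like a single slow core."""
    n = a.shape[0]
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                out[i, j] += a[i, k] * b[k, j]
    return out

t0 = time.perf_counter()
slow = matmul_sequential(a, b)
t_seq = time.perf_counter() - t0

t0 = time.perf_counter()
fast = a @ b  # the whole product computed as one bulk operation
t_par = time.perf_counter() - t0

assert np.allclose(slow, fast)  # same result, vastly different speed
print(f"sequential: {t_seq:.3f}s, vectorized: {t_par:.5f}s")
```

Even on an ordinary laptop the vectorized call is orders of magnitude faster; a GPU extends the same idea to far larger matrices, which is why AI data centers are built around them.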
India's Data Center Push
While India initially lagged in scaling its AI computing capacity, the government has now shifted decisively, placing strong emphasis on data center development. State governments are collaborating with the central government to attract this infrastructure, offering incentives such as preferential power rates, land acquisition assistance, and streamlined approvals. Mumbai currently leads India's data center landscape, accounting for 49% of the nation's roughly 1,300 MW of capacity, followed by Chennai at 18%. This coastal dominance largely reflects their roles as major submarine cable landing points – the primary gateways for global internet traffic into India.

Significant questions persist nonetheless. Is extensive infrastructure development wise when its job creation potential is uncertain? Should a resource-constrained nation invest heavily in facilities that consume so much electricity and water? The relentless pursuit of hyperscaling – the belief that more GPU power inevitably yields superior AI models – is also under scrutiny. Sandeep Mertia, an Assistant Professor at the Stevens Institute of Technology, points out that this paradigm might run into diminishing returns; scholars such as Yann LeCun have cautioned that scale alone may not produce true reasoning and understanding in Large Language Models.

A 2025 S&P report forecasts that India's data center growth could more than triple these facilities' share of national electricity demand, from 0.8% in 2024 to approximately 2.6% by 2030, potentially deepening reliance on fossil fuels. In Mumbai, rising energy demand from data centers has already prompted two coal plants to seek operational extensions.
Sustainability Concerns
While one company claims its Greater Noida facility is entirely powered by renewable energy, others are mandating renewable energy procurement. An executive from NTT Global Data Centres, which holds significant installed capacity in the country, affirmed a commitment to net-zero emissions across operations by 2030 and across the value chain by 2040.

Water is another critical resource. The immense heat generated by GPU clusters during AI model training requires substantial cooling. Anwesha Sen's October report estimated that a 1 MW data center consumes roughly 25.5 million liters of water annually – a significant demand in Indian cities such as Bengaluru, Chennai, and Delhi, which already face water scarcity. Data center companies point to their use of recycled water from sewage treatment plants and their efforts to adopt water-efficient cooling technologies.

Astha Kapoor, co-founder of the Aapti Institute, is skeptical of the industry's job creation claims, questioning the cost-benefit calculus of water, energy, and land use against the employment generated; once constructed, she notes, data centers require very few on-site personnel. Despite these challenges, experts see no turning back from this developmental path. Anjani Kumar, a partner at Deloitte advising on data center investments, likens investment in digital infrastructure to that in roads, airports, and power plants, arguing it is a necessity for an AI-driven future; if India fails to build it, he warns, the country will have to rely on other nations. Conversely, Dwaipayan Banerjee, an Associate Professor at MIT, views the trend as potentially creating new dependencies on a few dominant technology companies.
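The scale implied by that per-megawatt estimate can be checked with simple arithmetic. The back-of-envelope calculation below assumes the figure scales linearly with IT load across India's total capacity – a simplification, since actual usage varies widely with cooling technology:

```python
# Back-of-envelope estimate, assuming the report's ~25.5 million
# litres per MW per year scales linearly with IT load. Real usage
# depends heavily on cooling design, so treat this as a rough figure.

LITRES_PER_MW_PER_YEAR = 25.5e6   # estimate from the report cited above
INDIA_IT_LOAD_MW = 1_300          # upper end of India's 1,200-1,300 MW

annual_litres = LITRES_PER_MW_PER_YEAR * INDIA_IT_LOAD_MW
print(f"~{annual_litres / 1e9:.0f} billion litres per year")  # ~33 billion
```

On those assumptions, India's installed data center base would draw on the order of 33 billion litres of water a year – which is why siting in already water-stressed cities draws scrutiny.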
Realms and Reflections
The uppermost floor of the facility is off-limits to visitors and is reportedly leased to a hyperscaler – a major technology company like Amazon, Microsoft, or Google that runs vast cloud and AI infrastructure networks – though the client's identity is undisclosed. Adjacent to the main building, construction continues on Yotta's second data center, slated to be fully occupied by a hyperscaler for its AI operations.

The data center's site was originally part of Tusiana, a village of approximately 2,000 people. In the village, drains are often clogged with refuse, and residents complain of construction debris and waste being dumped on their land by nearby factories. Dinesh Bhatti, who runs a local ration shop, voices another grievance: the government acquired village land at low prices, promising employment through factories and offices, yet the companies offer no jobs to villagers, and even the construction labor was not sourced locally. Saurabh, a 24-year-old from Tusiana, says he has heard of AI but does not know the nearby building's role in it: 'How would we know? We have never gone inside.' The remark underscores a disconnect between the burgeoning AI infrastructure and the local communities it impacts.