The polling exercise for the 2026 Assembly elections across four states and one Union Territory will conclude on Wednesday evening as Phase 2 voting wraps up in West Bengal at 6pm.
After that, the exit poll results for West Bengal, Tamil Nadu, Assam, Kerala and Puducherry will go live, predicting winning parties, seat projections and likely vote shares. Kerala, Assam and Puducherry voted on April 9, and Tamil Nadu voted on April 23 along with Phase 1 polling in West Bengal, but television news channels and other media are legally barred from publishing exit poll results until the last phase of voting in an ongoing election cycle concludes.
That is why all eyes across these four states and the Union Territory of Puducherry this evening will be on the exit poll results for the 2026 Assembly elections.
But beyond the confident projections is a messy, human process. Here’s a clear, ground-up explainer of what exit polls actually are, when and how they’re done in India, and why they sometimes get it wrong.
What Is An Exit Poll And How Is It Different From An Opinion Poll?
In simple terms, an exit poll is a post-vote survey where voters are asked who they voted for. This survey is conducted immediately after a voter casts their ballot in an election, just after they exit the polling booth.
Willing participants are asked who they just voted for, not who they plan to vote for. This is the main difference between an exit poll and an opinion poll: opinion polls try to predict voter behaviour before polling, while exit polls attempt to capture actual voting behaviour, through a sample.
In India, exit polls are regulated by the Election Commission of India (ECI), which bans their publication until the final phase of voting is over. Media outlets and think-tanks can conduct them, but cannot publish the results until then. In this case, that's today, April 29, after 6pm.
The Exit Poll Process In India: What Happens On The Ground
There’s no single standard template, but most major agencies that conduct exit polls — like Axis My India, CVoter, CSDS-Lokniti — broadly follow the same methodology.
Sampling Constituencies And Polling Stations
These pollsters don’t and can’t cover every booth in every constituency since that would be logistically impossible. Instead, they select a representative sample of constituencies and polling stations based on region, past voting patterns and demographic spread.
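The stratified selection described above can be sketched in a few lines of Python. This is a simplified illustration, not any agency's actual method; the booth list, the strata and the sample size per stratum are all invented for the example.

```python
import random

# Invented booth list: every booth tagged with a region and an urban/rural type.
booths = []
for region in ["north", "south", "east", "west"]:
    for kind in ["urban", "rural"]:
        for i in range(50):
            booths.append({"id": f"{region}-{kind}-{i}", "region": region, "type": kind})

def stratified_sample(booths, per_stratum=5, seed=1):
    """Pick a fixed number of booths from each (region, type) stratum so
    the sample mirrors the state's geographic and demographic spread."""
    rng = random.Random(seed)
    strata = {}
    for b in booths:
        strata.setdefault((b["region"], b["type"]), []).append(b)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(per_stratum, len(group))))
    return sample

selected = stratified_sample(booths)
print(len(selected))  # 8 strata x 5 booths each = 40
```

Real agencies also factor in past voting patterns when choosing strata; the (region, type) pair here stands in for that richer stratification.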
On-Ground Field Survey
On the day of polling, the field representatives of the agencies stand outside selected polling stations and approach voters as they exit after casting their votes.
They usually ask voters to fill a secret ballot-style slip or they record responses verbally on handheld devices.
Participation is entirely voluntary; voters cannot be forced to disclose who they voted for, and many refuse to answer.
Capturing Demographic Data
Apart from asking which party or candidate they voted for, the pollsters also collect certain demographic data from willing voters, like age group, gender, caste or community, and income or occupation.
Weighting The Data
The raw data collected is never published verbatim. Pollsters apply statistical weights to adjust for underrepresented communities or groups, variations in regional turnout, and historical voting patterns.
For example, if fewer women responded in a region, their responses may be scaled up.
This is where methodology varies sharply between agencies — and why different exit polls can show wildly different results.
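The scaling-up step can be illustrated with a toy post-stratification weight: each group's weight is its known population share divided by its share of the sample. All numbers below are invented, and real agencies weight on many variables at once, but the mechanics are the same.

```python
# Invented responses from one region: (group, party) pairs.
responses = [("women", "A")] * 30 + [("women", "B")] * 20 + \
            [("men", "A")] * 60 + [("men", "B")] * 90

# Assumed population shares vs. observed sample shares.
population_share = {"women": 0.5, "men": 0.5}
n = len(responses)
sample_share = {
    g: sum(1 for grp, _ in responses if grp == g) / n
    for g in population_share
}
# Underrepresented women (25% of sample, 50% of population) get weight 2.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Weighted vote share for each party.
totals = {}
for group, party in responses:
    totals[party] = totals.get(party, 0.0) + weights[group]
total_weight = sum(totals.values())
for party, w in sorted(totals.items()):
    print(party, round(w / total_weight, 3))
```

In the raw sample, party B leads 55-45; after weighting, the race is dead even at 50-50. That one adjustment is exactly why two agencies with similar raw data can publish different projections.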
Translating Votes Into Seats
This is the hardest part and India’s electoral system makes this step especially fragile.
Because of the country’s first-past-the-post system, even a small shift in vote share can flip multiple seats. Add alliances, local candidates, regional swings, and converting vote data into seat projections becomes the trickiest and most sensitive part of the whole exercise.
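A toy first-past-the-post model shows how fragile this conversion is. The five seats and their vote shares below are invented; the point is that a uniform one-point swing (two points of margin) flips the majority.

```python
# Invented vote shares (%) for parties X and Y across five seats.
seats = [
    {"X": 51.2, "Y": 48.8},
    {"X": 50.5, "Y": 49.5},
    {"X": 48.0, "Y": 52.0},
    {"X": 51.5, "Y": 48.5},
    {"X": 49.0, "Y": 51.0},
]

def seat_count(seats, swing_to_y=0.0):
    """First-past-the-post: the higher share wins each seat.
    A uniform swing moves votes from X to Y in every seat."""
    wins = {"X": 0, "Y": 0}
    for s in seats:
        x, y = s["X"] - swing_to_y, s["Y"] + swing_to_y
        wins["X" if x > y else "Y"] += 1
    return wins

print(seat_count(seats))                  # {'X': 3, 'Y': 2}
print(seat_count(seats, swing_to_y=1.0))  # {'X': 2, 'Y': 3}
```

X's statewide vote share barely moves, yet the majority changes hands, which is why a small sampling error in vote share can produce a large error in seats.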
Why Exit Poll Results Differ From Agency To Agency
If you’ve ever wondered why one agency or news channel shows a comfortable majority while another predicts a tight contest, the answer lies in how each agency builds its numbers. The differences aren’t random — they stem directly from variations in exit poll methodology.
Different sampling choices: No two agencies pick the exact same booths or constituencies to sample. One may lean more on urban centres, another may have deeper rural coverage. Even small differences in sampling can shift the overall picture.
Size and spread of the sample: Some surveys cover a larger number of respondents, others prioritise geographic spread. A bigger sample doesn’t always mean better — it depends on how representative it is.
How questions are asked: The way a question is framed, or whether a voter fills a secret slip versus answering verbally, can influence responses. Subtle, but it adds up.
Treatment of no-response: A significant number of voters refuse to answer. Agencies handle this differently — some adjust aggressively, others take a more conservative approach. That alone can swing projections.
Weighting formulas: This is the heart of any exit poll. Raw data is adjusted using past election results, demographic balance, and turnout estimates, but each agency uses its own formula. There's no universal standard.
Vote-to-seat conversion models: Two agencies might report similar vote shares — and still project very different seat counts. That’s because their models for translating votes into seats vary widely, especially in India’s first-past-the-post system.
Timing and field conditions: Surveys conducted early in the day may capture a different voter mix than those done later. Add weather, turnout surges, or local mobilisation — and field conditions begin to matter.
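The non-response point can be made concrete with a toy booth tally. Both the refusal count and the 70/30 lean assumed in the second approach are invented; the point is only that the choice of adjustment moves the projection.

```python
# Invented booth tally: 100 voters approached, 20 refused to answer.
answered = {"A": 45, "B": 35}
refused = 20
total = sum(answered.values()) + refused

# Approach 1 (conservative): ignore refusals, project from answers alone.
share_ignore = {p: v / sum(answered.values()) for p, v in answered.items()}

# Approach 2 (aggressive): assume refusals lean toward one party,
# here 70/30 toward B, a pure modelling assumption.
adjusted = {"A": answered["A"] + 0.3 * refused,
            "B": answered["B"] + 0.7 * refused}
share_adjust = {p: v / total for p, v in adjusted.items()}

print({p: round(s, 3) for p, s in share_ignore.items()})
print({p: round(s, 3) for p, s in share_adjust.items()})
```

The first approach gives A roughly 56%, the second brings the race down to 51-49. Same raw data, two defensible choices, visibly different projections.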
Why Exit Polls Sometimes Get It Wrong
You might have noticed over the years that exit polls are not always right come results day. That's because even when data collection is solid, several structural challenges remain:
- Not everyone agrees to answer. And those who refuse may not be random — for example, supporters of certain parties may be more guarded.
- Some voters don’t reveal their true choice — especially if it’s politically sensitive.
- Even carefully designed samples can miss micro-level shifts, especially in a country as diverse as India.
- Exit polls assume sampled turnout reflects actual turnout. That’s not always true.
- Vote share is not the same as seats won. A 1-2% error in vote share can produce a massive seat miscalculation.
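The last point can be quantified with a toy distribution of victory margins. The margins below are invented (spread evenly from 0.2 to 20 points); the mechanism, a small vote-share error flipping every seat closer than twice the error, is the general one.

```python
# Invented margins: winner's lead over the runner-up in 100 seats,
# spread evenly from 0.2 to 20 percentage points.
margins = [i / 5 for i in range(1, 101)]

# A vote-share error of e points against the projected winner flips
# every seat whose margin is below 2*e (e points lost, e points gained).
for error in (0.5, 1.0, 2.0):
    flipped = sum(1 for m in margins if m < 2 * error)
    print(f"A {error}-point error could flip {flipped} of 100 seats")
```

Under these assumed margins, a 0.5-point error flips 4 seats but a 2-point error flips 19: the miss grows far faster than the error itself, because so many contests sit inside the error band.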
Why Exit Polls Are More Difficult In India
Exit polling in India is more complex than in many countries: elections are multi-party contests rather than bipolar two-party races; in some states, regional parties are stronger than national parties; voting patterns vary drastically across castes and communities; and in some constituencies, tactical voting in favour of an alliance rather than a party or candidate complicates analysis.
A survey model that works in one state may fail completely in another.
Why The Election Commission Embargoes Exit Poll Results
So you voted in your state on a particular date but have to wait a few more days, or even weeks, to know who might win. That's because under Election Commission of India rules, exit poll results cannot be published until the last phase of voting in that particular election cycle is over.
The idea is simple: prevent early trends from influencing voters in later phases or other states.
So Should You Trust Exit Polls, And How Much?
The short answer: treat exit poll results as signals, not outcomes. They are good for spotting broad trends, gauging shifts in political momentum, and understanding voter segmentation.
But they are not results, and shouldn’t be read with that level of certainty.
