The Missing Link in India’s Exam Reform: Assessment Literacy
India’s exam reforms stress reasoning and application, but teacher assessment literacy remains the critical gap. Building teacher capacity is key to real change.
India’s education system stands at a critical inflection point. The National Education Policy (NEP 2020) calls for a decisive shift away from rote memorisation toward application, reasoning and real-world problem-solving. The Central Board of Secondary Education (CBSE) has begun reworking board examinations with a greater focus on competency-based questions that test understanding rather than recall.
These are vital steps—but they also raise a core question: as exams change, are teachers equipped to teach and assess for genuine competence?
Reform cannot stop at altering what students are tested on. Equally important is who designs those tests and how. Changing exam formats is merely the visible tip of the iceberg; beneath it lies teacher expertise—the ability to design, interpret and use assessments that capture what students truly understand. Without strengthening this professional capacity, even the most progressive exam formats risk becoming rote learning in a new wrapper.
Teaching for competence—and assessing for it
Developing higher-order thinking begins with the teacher. Two intertwined areas of expertise are central to this shift.
The first is Pedagogical Content Knowledge (PCK)—the ability to explain complex ideas in ways that help students reason, connect and question. Global research shows that teachers with strong PCK anticipate misconceptions and design tasks that encourage analysis, not repetition.
The second is Assessment Literacy—the skill to design high-quality questions, interpret responses and use feedback to guide learning. Even when curricula emphasise application and creativity, weak assessment literacy tends to pull classrooms back toward standardised, recall-based formats.
This is India’s current paradox: policies promote reasoning, but a large proportion of teachers have not been prepared to teach or assess for it effectively.
Why teacher capacity matters
Assessment literacy often appears technical, but at heart it is about professional judgement—knowing which tasks reveal actual understanding. When teachers build this judgement, the quality of both tests and feedback improves.
International studies show that teachers trained to design diagnostic and open-ended tasks identify misconceptions more accurately and adjust instruction more meaningfully. In such classrooms, students engage more deeply and perform better.
Frameworks from bodies like the Australian Council for Educational Research (ACER) and the International Baccalaureate underline the same principle: better assessments do not emerge from software or policy; they come from teachers who understand how to measure learning. Without this expertise, competency-based exams risk testing the outward form of application without capturing the thinking behind it.
The missing link in reform
India’s assessment reforms have rightly focused on what students should know and be able to do—apply, analyse, reason. But far less attention has been paid to how teachers can measure these skills in their own classrooms.
Data from initiatives such as the Centre for Teacher Accreditation (CENTA) show that teachers possess strong subject knowledge, but their comfort with integrating pedagogy and assessment varies widely. National datasets reflect persistent gaps in access to quality professional development.
The consequence is a widening gap between policy intent and classroom practice. Teachers are expected to deliver competency-based assessments yet rarely receive structured training on what such questions look like, how they should be scored or how student evidence can guide instruction. This is not a reflection of teacher inadequacy—it is a design flaw in how the system builds teacher capacity.
Reimagining capacity building
For exam reform to succeed, it must be anchored in teachers’ assessment literacy, and training must go beyond token one-off workshops. Research on professional learning offers clear pathways:
Embed assessment literacy in teacher preparation: Pre-service programmes must include hands-on modules in designing questions, analysing responses and understanding issues of validity and bias.
Make in-service training contextual and data-driven: Effective training uses teachers’ own classroom data. Real student work—not abstract examples—should form the basis of reflection and learning.
Create communities of practice: Teachers learn best from peers. Moderation of student work, peer review of tasks and cross-school learning circles have shown enduring results globally.
Strengthen evidence-backed programmes and partnerships: Professional development itself should be evidence-led—piloted, refined and built through partnerships between academic experts and local training institutions. High-quality exemplars, rubrics and tasks in regional languages can ensure reform reaches every classroom.
Policy implications: aligning policy, research and practice
For exam reforms to deliver real impact, India must look beyond test scores and prioritise how teachers make learning visible. Policy, research and classroom practice must move in tandem.
One, frameworks like the National Professional Standards for Teachers must recognise assessment literacy as a core professional competency—on par with subject expertise.
Two, teacher training must be context-responsive and evidence-based. Large workshops that ignore classroom realities rarely build usable skills.
Three, research–practice partnerships involving universities, State Councils of Educational Research and Training (SCERTs) and examination boards can produce validated question banks, rubrics and moderation guides rooted in India’s diverse learning contexts.
Four, monitoring systems should track not how many teachers were trained but what changed—specifically teachers’ ability to design, interpret and act on assessment evidence.
True exam reform will not be judged by new question papers but by how confidently teachers can design and interpret evidence of learning. Building that capacity is not supplementary—it is foundational to meaningful educational change.
By Pooja Nagpal, Doctoral Researcher, Centre for Educational Measurement and Assessment (CEMA), University of Sydney