Pakistan’s education system appears stuck on a reform treadmill: moving faster and faster, but going nowhere. Every few years, major reforms are announced — a new curriculum, teacher-training drives, revised assessment frameworks, expanding public–private partnerships and increasingly elaborate school monitoring regimes. These initiatives reflect considerable effort and innovation. Yet learning outcomes remain stubbornly low, millions of children are still out of school and most reforms fail to survive political transitions. Despite constant motion, there is little forward movement.
Why do reforms fail to translate into learning gains? A central reason is that the system does not learn. New reform initiatives are rarely informed systematically by evidence from previous reforms. As a result, policies are neither course-corrected nor built cumulatively on what worked and what did not.
Pakistan’s repeated experimentation with curriculum reform illustrates this problem well. Analyses of the national curriculum have identified deep structural and pedagogical flaws: content overload, unrealistic learning expectations and weak alignment with classroom realities in government and low-cost private schools. These criticisms are not new. For years, researchers and practitioners have pointed to rote-learning mindsets, outdated standards and one-size-fits-all approaches that ignore linguistic diversity and resource constraints. Yet this evidence is repeatedly sidelined, and reform cycles continue with little reflection or adjustment.
This is particularly striking given that Pakistan has an abundance of data and research. Over the past two decades, federal and provincial governments have invested heavily in education data systems that track enrolment, infrastructure, staffing and teacher qualifications. These systems have grown increasingly sophisticated, with some provinces introducing classroom observations and large-scale learning assessments. Yet the data generated is rarely used to inform teaching practices or teacher professional development. Large volumes of assessment data are collected each year, but seldom translated into guidance that helps teachers improve instruction.
Research tells a similar story. Universities produce thousands of MPhil and PhD graduates annually, and think tanks and donor-funded programmes commission numerous studies and evaluations. Yet education decisions are often made in haste — shaped more by political urgency, administrative pressures and inherited practice than by systematic evidence.
This is not a failure of well-intentioned individuals. It is a failure of systems and incentives.
On the supply side, research incentives are misaligned. Universities reward academics for the number of papers they publish, not for producing knowledge that informs policy or practice. As a result, research often remains disconnected from the real problems policymakers are grappling with. Consider a careful study on continuous professional development showing that teachers attend workshops, but classroom practice barely shifts without coaching, follow-up and head-teacher support. The study will most likely end up on an institutional website or in a journal, while the department’s teacher training programme continues unchanged — measuring success by numbers trained rather than improvements in teaching or learning. Uneven research quality and weak collaboration between universities, think tanks and government further dilute the relevance and impact of research.
On the demand side, policymakers operate under intense constraints. Education departments are in constant “firefighting mode,” responding to crises, court cases, fiscal pressures and political directives. Electoral cycles reward visible action, not careful analysis. Surveys of government officials reflect this reality. While most have access to administrative data, few request deeper analysis. Decisions about budgeting and priorities often rely on experience or intuition. Nearly three-quarters report that research studies are too technical or too long to be useful. By the time a detailed report arrives, the decision has usually already been taken. On rare occasions when evidence is cited, it is often used symbolically — to justify decisions already made — rather than instrumentally, to inform choices from the outset.
The result is an education sector with islands of evidence but no bridges to policy. Data and research are scattered across institutions, donor-funded programmes and provinces, but rarely connected into a functioning ecosystem that consistently informs policy and practice.
Encouragingly, there are signs that the system is beginning to recognise the gap.
Initiatives such as those led by the Pakistan Institute of Education, the Data and Research in Education–Research Consortium and international partnerships like the What Works Hub for Global Education aim to bridge the gap between evidence, policy and practice. Their focus is not simply on producing more research, but on changing how research is commissioned, translated and absorbed into decision-making.
Three insights from this experience of translating evidence into policy, and vice versa, deserve attention.
First, policy relevance cannot be retrofitted. Commissioning excellent research and hoping dissemination will make it useful rarely works. Evidence is far more likely to shape decisions when policymakers are engaged early and help define research priorities.
Second, translation matters as much as production. Policymakers need synthesis, diagnostics and clear options delivered in short, timely formats aligned with policy cycles — not lengthy technical reports.
Third, institutions matter more than individuals. Sustained evidence use requires dedicated evidence units within ministries, stable partnerships with universities and policy labs embedded in government that test, adapt and learn from reforms in real time.
Together, they point to one conclusion: evidence must live inside government. International experience shows that when evidence is embedded within ministries, its likelihood of informing administrative decision-making increases significantly.
Pakistan has made some progress, particularly through monitoring and reform-support units in provincial departments. But progress remains fragile and uneven across provinces. At the national level, the Pakistan Institute of Education has yet to fulfil its promise, largely due to insufficient resources — itself a reflection of the low value placed on evidence-informed policymaking.
If Pakistan is serious about improving education outcomes, three shifts are essential. First, governments must become the primary clients of education research, jointly setting priorities with researchers and commissioning evidence before reforms are introduced. Second, incentives must change: academics should be recognised not only for publishing research but also for producing demand-driven evidence, and civil servants should be rewarded for successful programme delivery informed by credible analysis — not just compliance or speed. Third, investment must move beyond research production to research use — building evidence labs within government departments to produce research on priority questions, research catalogues and repositories that organise studies and datasets into searchable inventories, and resource centres that provide accessible guidance to officials. Universities and research institutes, in turn, should deploy research funding more strategically towards areas of high policy relevance.
Pakistan does not lack education research. It lacks a system that insists on listening to it. Bridging that gap will not deliver instant transformation. But without it, even the best policies are doomed to repeat a familiar cycle — bold on paper, shallow in practice and quickly forgotten.
If we want durable education reform, we must start building an ecosystem where evidence routinely travels from classrooms and datasets into the decisions that shape children’s futures.
Saima Anwer is the programme director of DARE-RC (Data and Research in Education - Research Consortium). DARE-RC is funded with UK International Development from the UK Government. Its implementation is led by Oxford Policy Management in partnership with its consortium partners. [email protected]