PIDS flags flaws in DepEd’s 75% proficiency standard


Bella Cariaso - The Philippine Star

December 21, 2025 | 12:00am


MANILA, Philippines — The Department of Education (DepEd)’s long-standing practice of classifying students as “proficient” only if they score at least 75 percent in national assessments may not accurately reflect what learners actually know and can do, according to a new study by the Philippine Institute for Development Studies (PIDS).

The paper, “Examining the DepEd’s National Assessments: A Review of Framework, Design, Development, Psychometric Properties, and Utilization,” assessed how national tests are developed and used, and underscored the need to strengthen test quality and ensure closer alignment with curriculum expectations.

The PIDS said DepEd’s fixed 75 percent benchmark for the National Achievement Test (NAT) is not grounded in standard-setting processes that derive cut-off scores from curriculum requirements.

“Generally, more students are reaching the proficient level when using the standard setting cut-offs than the Bureau of Education Assessment (BEA) cut-offs,” the study said, suggesting that the current bar may be set too high and may not represent actual learner performance.

It added that many students who demonstrate the expected skills are still categorized as “nearly proficient” or “low proficient,” highlighting the need for a more evidence-based approach to defining proficiency.

Teachers, school heads and division testing coordinators interviewed for the study pointed to the need for better alignment between national assessments and classroom instruction.

The study noted that system-level tests often emphasize broad 21st-century skills, such as problem-solving and critical thinking, but these skills are difficult to assess properly without clear training, well-developed test items, and a shared understanding of what they look like in practice.

The paper said that teachers found data linked to specific learning competencies more helpful in improving instruction and recommended that national assessments provide more detailed information to help them better understand student learning.

The study’s analysis further highlighted the need for stronger test development and item validation.

Some test items were found to be too easy, too difficult, or not discriminating enough, underscoring the importance of rigorous quality control in item writing, review, and selection.

“Ideally, system and classroom assessments should be aligned, and if ever there is misalignment, these should be intentional, not unintended outcomes,” the study added.

According to the study, stakeholders raised concerns about delays in releasing test results and the lack of clear, skill-based proficiency descriptions.

Teachers said timely and skill-focused reports would help them support student progression more effectively.
