Thursday, March 6, 2014

Data Results…Not the bottom line!


Our teachers are continuing to prepare folders of student work for parent-teacher conferences; however, some of them recently found themselves in a quandary: the results of the mid-year diagnostic tests revealed regression in skill acquisition for a number of their students. Consequently, the data-driven digital content within the i-Ready program had automatically readjusted those students’ learning paths to repeat lessons that had apparently not been mastered. The challenge for many teachers became how to explain to parents that, although their children had previously passed lessons reflecting skill mastery, the diagnostic results proved otherwise… or did they?

Up to this point, the blended learning lead facilitator (me) had met with every teacher individually, once a week, to review the multitude of data reports. By doing so, the teachers grew steadily in their knowledge and analysis of the reports, as well as in their confidence relaying the results to parents.

However, what none of us anticipated was diagnostic data indicating skill regression. Why and how this happened with more than just a few students, across multiple grade levels, required careful analysis and serious scrutiny. The teachers and I reviewed every student’s diagnostic results, seeking immediate answers to the following questions:

1. Did the diagnostic test demonstrate overall advancement?
2. If not, how wide was the score divide?
3. Did any curricula strand show an increase in knowledge?
4. Did the student stay focused?
5. Did the student rush?
6. Is there evidence of learning difficulty?
7. Should the student retake the test?

Firmly believing that numerous students could not have cognitively regressed, and consequently should not be forced to retake previously passed lessons, we dissected the data, conversed with the students, and concluded that much of the diagnostic data resulted from less-than-optimal test-taking conditions and a sincere lack of many students’ best efforts.

Some teachers had allowed students to complete their diagnostic tests in one sitting, rather than opting for shorter sessions spread over a few days. Additionally, many students’ primary goal had been finishing the test rather than the quality of their performance. Once the students realized that their post-diagnostic lessons were directly correlated to their test results (meaning they were repeating lessons previously passed), they eagerly opted to retake the tests, take their time, and try their best. As a result, all but two improved their scores by 20–40 points. The two outliers are suspected of having learning difficulties, which will be investigated further.

The lessons learned:

  • K-5 students require close monitoring while taking diagnostic tests.
  • K-5 students need to be front-loaded with motivational pep talks and goal-setting strategies before taking their diagnostic tests.
  • Developmentally appropriate test-taking time frames must be established and maintained for optimal student performance.
  • Each student’s results, and the newly correlated prescriptive lessons, must be reviewed and possibly teacher-adjusted for accurate alignment with the student’s authentic skill set.

The bottom line: digital data, generated by technological machinery, still requires human partnering in order to fully meet the needs of our students.


But, we all knew that!