News and Tribune

July 30, 2013

Clark, Floyd education officials digest ISTEP+ study

State says test not invalidated, but local districts still have doubts

By JEROD CLAPP
jerod.clapp@newsandtribune.com

SOUTHERN INDIANA — Results of a validity study into interruptions in the second phase of ISTEP+ testing left school districts in Clark and Floyd counties with more questions than answers as they prepare to start the school year.

The results of the third-party study, which determined the interruptions had no discernible effect on test scores, were shared with the state’s Education Commission on Monday. With school and district letter grades at stake, along with teacher evaluations and compensation in some school corporations, district leaders were left unsure about some of the study’s findings as they open the doors to school this week.

John Reed, assistant superintendent of West Clark Community Schools, said the state’s own admission that it can’t know how scores would have looked without interruptions raises some red flags.

“They want to use that data to determine a letter grade for schools,” Reed said. “We’re going to want to question that data to see if that changes our goals set in our school improvement plans. They can say it’s valid and reliable, but I still question it.”

Students across the state experienced problems during April’s phase of the test, which was administered online to about 95 percent of students. Districts argued that login problems and system lockups affected students’ performance.

But confusion over what counts as an interruption, and conflicting reports from the state, districts and test administrator CTB/McGraw-Hill over how many students were affected, still loom as schools try to figure out how to treat the study.



Definition, please

In West Clark, the state reported that about 12 percent of tested students experienced interruptions. But in June, the district reported that more than two and a half times that many students had trouble.

The district estimates and state reports varied widely in Clark and Floyd counties, leaving some administrators — like Sally Jensen, director of assessment for the New Albany-Floyd County Consolidated School Corporation — scratching their heads at how the state calculated those numbers.

“Whatever number I saw today, it wasn’t a number that struck anything in my mind and I usually remember those things pretty well,” Jensen said. “That’s why I emailed them and asked where that number came from.”

She said she wasn’t sure if the state looked only at students who were completely locked out or if long load times were included in that figure.

The state reported 1,268 interruptions in New Albany-Floyd County schools, less than 40 percent of the district’s estimate.

But Reed said the definition of an interruption goes beyond students whose own computers had connection troubles. In some cases, if one student had a problem, the whole class had to stop testing.

“The school districts had a lot more than what they had and what they’re having is based on actual written reports,” Reed said. “We’re saying we were putting in there that a kid may have an interruption and that’s reported, but what about the ones sitting next to them? While their computer didn’t stop, it may have affected them. That’s the kind of thing we were recording.”

Andrew Melin, superintendent of Greater Clark County Schools, said though he respects the study and its findings, he’s still left with questions about how login trouble and lockups affected students at computers.

“I still feel that with our students having to go through the interruptions that they did, I have a hard time fully believing that it did not have more of an impact on scores,” Melin said. “But if this validity study says that the impact was negligible, I respect that. Let’s bring the scores on and look at them. I’m just glad we have numerous other assessments that give us a good idea of where our students are and how to help them.”

Districts don’t have the scores yet, and neither do parents. Glenda Ritz, state superintendent, said parents should have results by the end of August, with an attachment explaining the interruptions. No timetable was given for when districts would get the results for their own review.



Tabulating scores

The delay in getting scores to districts prevents them from submitting their school improvement plans, which are mandated by the state and due on Sept. 15.

Kim Knott, superintendent of Clarksville Community Schools, said that without an extension, corporations would be under undue stress to complete the plans in such a short amount of time.

“By the time you analyze the data and you get all the people together, you don’t have a lot of time remaining,” Knott said. “Basically [under the previous deadline], you’ve got four weeks to get it done and you’d like to be able to spend the summer doing that, so you’re starting the school year with those measurable goals. None of that’s going to happen until after we get those results.”

But the third-party reviewer, Richard Hill, co-founder of the National Center for the Improvement of Educational Assessment, said that while it’s easy enough to apply the study’s results to districts and schools, it’s harder to tie that data to how effective individual teachers were last year.

Reed said if the data can’t be applied on a micro level, it doesn’t make much sense to apply it on a macro level.

“The real problem is when you look at it from an apples to oranges standpoint,” Reed said. “Can you say that when you start giving letter grades to schools, you’re saying that you’re comparing apples to apples? But their own data shows schools had interruptions up to 28 percent, and you’ll compare those to schools with little or no interruptions? They say they can because based on their past performances, before they gave the test, these kids came close to their prediction, so the disruptions don’t matter. I beg to differ with that, I will never agree to that.”

Jensen said without having read the full report, she didn’t want to comment on the outcome of the study. But she said she was left with a number of questions just in listening to Monday’s report.

Melin said once his team has a chance to look at the ISTEP+ results on their own, he’ll have a better sense of how his district actually performed compared with how the state says it performed.

“I guess at first glance, you need to respect the study that was completed,” Melin said. “If the study says that based upon their analysis the interruptions had a negligible impact on ISTEP testing, I guess you need to trust that at first glance. Until we get a chance to see our scores and do an analysis on our own, it’s difficult to be concrete until then.”

Daniel Altman, spokesman for the Indiana Department of Education, did not return phone calls for this story.