Exploring the weaknesses of data analysis in education

Rachel Wheeler

There's a lot of talk in America today about a "data revolution" in public education, as innovators look to use an influx of information to improve their operations from all angles. They're eager to help kids develop basic skills, ensure that they stay in school and eventually graduate, and ultimately put students on track to succeed in college and beyond.

All the hype suggests that education data is moving forward at a ferocious pace. But there are still many weaknesses to address, not least the uncertain quality of the data that education data scientists have to rely on.

According to Wired magazine, there are plenty of improvements still to be made. The news source spoke with David Stewart, founder and CEO of education data startup Tembo, who said that data sources like standardized test scores hold a great deal of potential, but that data scientists need a better idea of what to do with the information.

"The biggest issue to me is that educators are not technologists or data people, data people aren't educators, and neither one is a design-focused person," Stewart said.

Stewart has tangible examples of the weaknesses of education data from his time working in the field. For example, while employed with the New York City public school system, he found that not all gradations of achievement were equally telling about a student's prospects.

Eighth graders who scored in the lower end of the "proficient" range on a test had a 54 percent chance of graduating from high school, Stewart found. For those in the middle of the "proficient" range, the figure jumped to 83 percent. Yet there wasn't enough emphasis on defining what those buzzwords really mean, how they're measured and what education leaders can do to address the very real, tangible problems they're facing.

More specifically, educators are encountering four main problems with data.

It's arbitrary
As in Stewart's example above, there are still situations where the data we compile is confusing or nonsensical. What does it really mean for a student to be "lower-proficient" or "middle-proficient"? Data scientists are setting benchmarks, but those cutoffs are arbitrary at best and misleading at worst. There needs to be more focus on measures that actually matter.
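To make the point concrete, here is a minimal sketch with invented scores and cutoffs (the 300-349 "proficient" range and the 325 split are assumptions for illustration, not Stewart's actual thresholds, and the student records are made up). It shows how a single "proficient" label can blend together students with very different graduation outcomes:

```python
# Hypothetical illustration: grouping eighth-grade test scores under one
# "proficient" label hides large differences in graduation outcomes
# within that band. All numbers here are invented.

# Each tuple: (scale score, graduated high school).
students = [
    (305, False), (308, True), (311, False), (314, True),   # lower "proficient"
    (330, True), (333, True), (336, True), (339, False),    # middle "proficient"
]

def band(score):
    """Assign an arbitrary sub-band within an assumed 'proficient' range (300-349)."""
    return "lower-proficient" if score < 325 else "middle-proficient"

# Tally graduation rates per sub-band.
rates = {}
for score, graduated in students:
    label = band(score)
    grads, total = rates.get(label, (0, 0))
    rates[label] = (grads + int(graduated), total + 1)

for label, (grads, total) in rates.items():
    print(f"{label}: {grads / total:.0%} graduated ({grads}/{total})")

# Reporting a single blended "proficient" rate would mask the kind of gap
# the article describes (54% vs. 83% in Stewart's analysis).
```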

It's uninformative
Identifying that a student has an X percent chance of hitting a target test score or making it to graduation is helpful. But what would be even better is guidance on how to take action. How can education leaders actually make a difference? The data is only the first step. It doesn't tell the full story, and it's unrealistic for anyone to expect it to.

It's used ineffectively
School systems are collecting data galore. Massive amounts of it. But actually doing something with it is the harder part. It takes scientists with deep technical skill and analytical thinkers who can interpret what it all means. Too often, schools don't go the full distance in putting education data to its best use.

It's still unappreciated
Despite all the buzz about education data in progressive places like New York and Washington, there are still large swaths of the United States that don't appreciate it as they should. Education data quality is an important priority for the U.S. in the years ahead, and there needs to be more nationwide awareness of this growing field.