How Should We Evaluate Edtech for Future Generations?

The field of education is one of the most researched disciplines on the planet. We set goals and objectives for our students, we test until our eyes fall out, and we gather data that will prove what works and what doesn't.

One of the tough parts about education is that we are dealing with human beings as subjects, with all of their backgrounds and baggage. They are not lab rats whose experiences we can control 24 hours a day. They are individuals who spend some of their time pursuing education and the rest of it doing many other things.

The newest variable in education is clearly edtech. While education is a field that is slow to embrace change, edtech is forcing educators to re-evaluate their approach to curriculum delivery and to find new ways to evaluate the efficacy of the edtech delivery systems that now operate throughout the educational process.

Before educational systems embrace edtech curricula, they want data that justifies their purchasing decisions. How do they get this data quickly and accurately?

What Are You Measuring?

This is the first question to ask. Educators have curricular goals in all of the content disciplines, although reading and math appear to be the key areas for most institutions. When schools and districts have clearly identified student outcomes, they can measure how effectively their delivery systems, including the edtech programs they adopt, achieve those outcomes.

How Are You Measuring?

Just as we have testing in place to determine student mastery of learner outcomes, companies that develop software for educational environments must have their own evaluative tools in place for measuring efficacy. Unfortunately, in education, people tend to want long-term studies, sometimes lasting as long as five years.

In the meantime, institutions have adopted expensive edtech programming that, after five years, proves not to be the best solution. Finding shorter-term methods of evaluation and targeting those evaluations at more specific student demographics seem to hold promise, and educational software developers are beginning to embrace this approach.

Setting Up Research Designs

According to Bi Vuong, director of a Harvard University initiative with three school districts and 10 charter schools:

“The challenge with regard to evaluating product efficacy across a variety of educational software programs is the variation in the way data are collected by the educational software provider.”

It is not enough to simply deliver a product to an institution or district. The company must set up data-gathering procedures that compare “apples to apples.” This means working closely with an institution to set up experimental and control groups with the same demographic characteristics and then gathering achievement data from both groups for comparison.
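To make that comparison concrete, here is a minimal sketch in Python of how matched groups' assessment scores might be compared once the data have been gathered. The group names, the scores, and the choice of a two-sample t-test are illustrative assumptions, not a method prescribed by any particular provider.

```python
# Minimal sketch: comparing achievement data from demographically matched
# experimental and control groups. The scores below are hypothetical
# placeholders; a two-sample t-test is one common (assumed) way to check
# whether the difference in group means is likely to be meaningful.
from statistics import mean
from scipy.stats import ttest_ind

# Hypothetical end-of-unit assessment scores
experimental_scores = [78, 85, 91, 74, 88, 82, 79, 90]  # used the edtech program
control_scores = [72, 80, 77, 70, 84, 75, 73, 81]       # traditional instruction

# Compare average achievement in the two groups
print(f"Experimental mean: {mean(experimental_scores):.1f}")
print(f"Control mean:      {mean(control_scores):.1f}")

# A low p-value suggests the gap is unlikely to be due to chance alone,
# assuming the groups really were comparable to begin with.
t_stat, p_value = ttest_ind(experimental_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A real evaluation would of course use far larger, more carefully matched samples and an analysis plan agreed upon with the institution, but the structure is the same: clearly defined outcomes, two comparable groups, and a transparent statistical test.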

If schools and districts want to collect data in organized ways, they will need software in place that allows teachers to streamline the process, based upon the criteria that have been established. One such tool is Breeze, a comprehensive school management system that collects data in real time, streamlines the process of synthesizing that data, and presents clear reports from that synthesis. Further, the system can measure non-academic criteria such as attendance and behavior, important data that can speak to student attitudes toward their learning and learning environments.

Sound Research Principles Still Prevail

Sound educational research has not changed, and the evaluation of edtech programs should follow the same research designs and principles. So long as learner outcomes are clearly identified; so long as there is a design in place that compares experimental and control groups; and so long as data is gathered in efficient ways, carefully analyzed, and reported honestly, we should be able to determine efficacy. Edtech software developers and providers need to take a collaborative role with educators to see that this happens.

