Gallup Blog

Better Technology Produced Better Learning Outcomes During Pandemic

by Jonathan Rothwell

One of the unfortunate consequences of the COVID-19 pandemic was that it forced many children to learn from home. It is now well-documented that remote learning resulted in substantial learning loss. The slowdown in academic progress was especially pronounced among Black students, Latino students and those from low-income households.

Yet, evidence from a survey of teachers, parents and students suggests that the quality of digital learning tools (websites, apps, online tutorials, online games and videos, or programs used to teach and support student learning and schoolwork) may have blunted the negative impact of remote schooling. Better student outcomes -- including ease of learning from home, confidence in schools' ability to provide high-quality education, and expectations for learning progress -- are strongly associated with the quality of digital learning tools, as reported by teachers, parents and students.

These results are based on web surveys conducted in July and August 2020, with 1,111 teachers, 2,345 parents and 1,088 students in grades three through 12. Student and parent responses were excluded if parents stated that their child did no distance learning in the spring of 2020 (less than 1% of the sample).

The findings indicate that universal access to digital learning tools is unlikely to generate gains for students, or equitable opportunities across groups of students, unless the tools themselves are of high quality.

About one in five teachers, parents and students rated their digital learning tools as "excellent" -- a smaller share than the proportion who rated them as "fair" or "poor." The most common response, accounting for roughly half of all answers, was that digital learning tools were "good." Ratings were remarkably consistent across teachers, parents and students.

Graphic: About one in five teachers (21%), parents (20%) and students (20%) rated the digital learning tools they used in the spring of 2020 as excellent -- a smaller share than the proportion who rated them as fair or poor.

Teachers working in schools with a higher percentage of children from low-income households were less likely to rate digital learning tools as "excellent" or "good" than were teachers serving higher-income students. In general, schools with a higher percentage of children from low-income families had lower-quality digital learning tools, according to their teachers.[1] When comparing teachers at schools with fewer than 25% of students meeting eligibility criteria for reduced-price lunch to those with at least 75%, the gap in digital learning quality (using the share reporting "excellent" or "good") was 10.9 percentage points in favor of students from higher-income households. This suggests that students in low-income schools had less access to the most useful digital tools.

Graphic: 67.4% of teachers working in schools with a higher percentage of children from low-income households rated the digital learning tools used in the spring of 2020 as excellent or good, compared with 78.3% of teachers serving higher-income students.
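For reference, the gap cited above is simply the difference between these two shares: 78.3% − 67.4% = 10.9 percentage points.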

Across teachers, parents and students, the quality of digital learning tools is strongly associated with several indicators of learning outcomes. These indicators measure the reported ease of learning from home, confidence in schools' ability to provide high-quality education, and expectations for learning progress in the subsequent semester.

When asked whether students found learning from home easy or hard compared with learning at school, teachers, parents and students who reported having high-quality learning technology were all more likely than those without that digital advantage to consider remote learning "easy" or "very easy." For instance, teachers who rated their digital learning tools as "excellent" were 32 percentage points more likely to say remote learning was easy or very easy, compared with teachers who rated their digital learning tools as "poor." The gaps in ease of learning reported by parents and students were 45 points and 13 points, respectively.

When asked about the upcoming fall semester of 2020, each group also expressed greater confidence in their school's ability to provide high-quality education when they reported having high-quality learning technology. To illustrate, we looked at the percentages of respondents who expressed high confidence in their school's ability (a "4" or "5" on a five-point scale) among two groups: those who gave digital learning tools an "excellent" rating and those who gave them a "poor" rating. Confidence in school ability was 31 percentage points higher among teachers who rated digital tools as "excellent" compared with those rating them "poor." The effect was even stronger for parents (38 points) and stronger still for students (44 points). Thus, across all groups, optimism about their school's ability to provide high-quality education was much higher when digital learning tools were perceived as high-quality.

Likewise, when asked whether students would learn more, the same or less than in a typical fall semester, each group was much more likely to expect learning to be on par with -- or even above -- a typical semester when they rated digital learning tools as "excellent" rather than "poor." In this case, the gaps for teachers, parents and students were similar, at 28, 30 and 34 points, respectively.
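To make the comparisons above concrete, here is a minimal sketch (not the study's actual code) of how such percentage-point gaps could be computed from respondent-level survey data. The column names (tool_rating, ease_of_learning) and the sample values are hypothetical placeholders.

    import pandas as pd

    # Hypothetical respondent-level data for one group (e.g., teachers).
    df = pd.DataFrame({
        "tool_rating": ["excellent", "poor", "excellent", "good", "poor", "excellent"],
        "ease_of_learning": ["easy", "hard", "very easy", "neither", "hard", "easy"],
    })

    # Share reporting that remote learning was "easy" or "very easy", by tool rating.
    easy = df["ease_of_learning"].isin(["easy", "very easy"])
    share_by_rating = easy.groupby(df["tool_rating"]).mean()

    # Percentage-point gap between the "excellent" and "poor" rating groups.
    gap = 100 * (share_by_rating["excellent"] - share_by_rating["poor"])
    print(f"Gap: {gap:.0f} percentage points")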

One concern is that the association between digital tool quality and educational outcomes simply reflects a general positivity bias among certain respondents, or differences in household income. We conducted additional analysis to test for these possibilities, and even when controlling for these factors, the results did not change. Thus, we are confident that the association is robust. Details of this analysis are provided in the appendix.

Overall, these results suggest a strong connection between learning during the pandemic and the quality of digital learning tools. Moreover, there is remarkable agreement on the important relationship between technology and learning across students, parents and teachers. Taken at face value, these results provide compelling motivation to identify the best digital learning tools and make them more widely available. At the very least, doing so would likely raise students' and their supporters' confidence in making learning gains, boost expectations for exceeding standards, and ease the difficulty of learning from home.

Yet, there are several important limitations to this analysis, which point to opportunities for further research to test these findings more rigorously. The survey did not directly measure learning using objective, reliable metrics -- such as performance on standardized tests. However, subjective reports and expectations are often correlated with objective progress, so the positive association between digital tool quality and learning would likely hold using objective measures. Still, test score data would clarify the strength of the relationship and allow for comparisons to other well-studied interventions, like tutoring.

The same measurement limitation applies to the subjective evaluation of digital learning tools, which was not tied to objective specifications or to specific software. In future studies, researchers could ask students, parents and teachers to rate specific digital tools, making it possible to link subjective quality ratings to particular products. Those data could then be used to identify the features of digital technology that predict higher ratings.

Beyond measurement challenges, another important limitation is that the association between technology and learning cannot be confidently interpreted as a causal effect because the quality of digital learning tools is not randomly assigned. More ambitious social science research could randomly assign students to use the highest-rated tools and test the effects on objective learning outcomes.

In the absence of those findings, the results here nonetheless should motivate school administrators to solicit feedback from teachers, parents and students about whether the digital tools they currently use are working for them. Even with most districts providing full-time in-person schooling, the quality of digital learning tools is likely to affect learning outcomes for the foreseeable future.

Footnotes

[1] School characteristics -- including the share of students eligible for free or reduced-price lunch -- were matched to teacher responses using data from the U.S. Department of Education. U.S. federal government guidelines show that students are eligible for free or reduced-price lunch if their household income is at or below 185% of the federal poverty line threshold. For a household of two, that was just under $32,000 in 2020.
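As a point of reference, that dollar figure can be reproduced from the published guideline: assuming the 2020 federal poverty guideline of $17,240 for a two-person household in the contiguous U.S., the eligibility threshold is 1.85 × $17,240 = $31,894, consistent with the "just under $32,000" figure above.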

Appendix

As briefly alluded to above, one possible source of bias in these results is that students with more positive attitudes about school generally will be more positive about both technology and learning. To test this, we examined 11 items about student school attitudes not directly related to technology. These items included "I do schoolwork that makes me want to learn more" and "There is someone at my school who cares about me as a person." Students were asked to rate their agreement with these items on a 1-to-5 scale. Using the mean response across items for each student, we created a construct to measure "positive attitudes toward school." This index is positively correlated with ratings for the quality of technology (correlation coefficient is 0.38), measured by whether the student rated digital learning tools as excellent/good or fair/poor. In other words, students who like their digital learning tools are more likely to appreciate other aspects of school.
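As an illustration, the index construction described above could be implemented along the following lines. This is only a sketch: the column names (att_1 through att_11 for the 11 attitude items, tool_quality for the excellent/good-versus-fair/poor indicator) are hypothetical placeholders, not the survey's actual field names.

    import pandas as pd

    # Hypothetical column names for the 11 attitude items (1-to-5 agreement scale).
    attitude_items = [f"att_{i}" for i in range(1, 12)]

    def attitude_index(df: pd.DataFrame) -> pd.Series:
        """Mean response across the 11 attitude items for each student."""
        return df[attitude_items].mean(axis=1)

    def tech_attitude_correlation(df: pd.DataFrame) -> float:
        """Pearson correlation between the attitude index and the excellent/good (1)
        versus fair/poor (0) tool-quality indicator."""
        return attitude_index(df).corr(df["tool_quality"])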

To check for potential bias, we estimated three statistical models that regressed the three outcomes analyzed above (confidence in school's ability to provide high-quality education, expectations about students' learning, and ease of learning from home) on the quality of technology and the positive attitudes index. In all three models, the quality of digital technology predicted significantly better outcomes, even after controlling for attitudes. We added further controls for household income, and the results did not change. These results gave us some confidence that there is a robust association between the quality of technology, as perceived by students, and their own assessment of learning outcomes. Since we do not find evidence for this positive attitude bias in students, we doubt it would be present in parents and teachers, though we lack data on more general attitudes for these groups.
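The regression check described above might look roughly like the following sketch, assuming student-level data with hypothetical variable names (tool_quality, attitude_index, household_income and the three outcome columns); it is not the study's actual code.

    import statsmodels.formula.api as smf

    def fit_outcome_model(df, outcome):
        """Regress one outcome (e.g., confidence in the school, expected learning,
        or ease of learning from home) on tool quality, controlling for the
        positive-attitudes index and household income."""
        formula = f"{outcome} ~ tool_quality + attitude_index + household_income"
        return smf.ols(formula, data=df).fit()

    # The coefficient on tool_quality indicates whether higher-rated tools still
    # predict better outcomes after the controls are included, e.g.:
    # model = fit_outcome_model(survey_df, "confidence_in_school")
    # print(model.params["tool_quality"], model.pvalues["tool_quality"])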

Author(s)

Jonathan Rothwell is Principal Economist at Gallup.

