
Take-aways from Remote Learning — Were Students Engaged?

Updated: Nov 14, 2022

As a former 8th-grade mathematics instructor, I know the proverbial back-to-school question from the typical 14-year-old math student – ‘Why do we need to know this stuff?’


This is where the craft of teaching truly manifests in the classroom. Unless teachers motivate students to engage, they've lost them before the lesson has even begun.



It goes without saying that the challenges K12s faced starting in March 2020 were innumerable. Administrative teams worked to navigate the logistics of remote and hybrid learning models, from providing lunches to remote students, to distributing devices and hotspots, to figuring out how best to take attendance.


As K12s moved deeper into the school year, many also recognized that 1) some of the digital resources they’d invested in weren’t ideally suited to remote learning and 2) simply giving a student technology didn’t necessarily guarantee engagement (or learning, for that matter).


Did students achieve during remote learning?


As of this writing, Texas is the first state to release test score data since the pandemic hit. Data released in July 2021 by the Texas Education Agency (TEA) shows that State of Texas Assessments of Academic Readiness (STAAR) scores were down across the board, in some cases erasing educational gains students had made over the past seven years.

According to TEA, “As a result of the learning disruptions caused by the COVID-19 pandemic, the number of students not meeting grade-level increased from 2019 across all subject areas and grade levels, with English I and English II being the only exceptions. As a subject area, mathematics reflects the largest decline in proficiency across all grade levels. Districts with a higher percentage of students learning virtually experienced a greater degree of decline. Districts with the highest percentage of in-person learners largely avoided any learning declines in reading.”


Granted, these are one state's results. Yet it's worth noting that Texas had one of the highest levels of in-person instruction in the country, which suggests that results from states with higher rates of remote learning could be even worse.


There's no substitute for in-person instruction, but perhaps improvements can be made to increase (and better measure) student digital engagement to prevent such a slide if and when K12 moves to some form of remote instruction in the future.


Did students engage during remote learning?


K12s certainly hoped students would engage and achieve during remote learning. They went to great lengths to purchase and hand out devices, distribute and set up hot spots, and provide staff with last-minute PD to retrofit classroom instructional practices to the new remote instructional model.

The fact of the matter is that, remote instruction or not, students may engage with the technology resources schools provide. Or they may not. In the context of Covid-driven instructional changes, no one truly knew how this would work or whether students would engage a lot, a little, or at all. Nor did anyone have a clear idea of how to measure that engagement if it occurred.


The preliminary data released thus far points to decreased levels of student engagement.

Parallel to TEA's findings, a recent RAND report points to two factors that may have negatively impacted student achievement during the pandemic. Surveyed teachers noted that:

  • Instruction in mathematics decreased by 80 hours in one year (from 440 hours to 360 hours)

  • Almost half of fully remote schools shortened their school days (compared to just one-sixth of in-person schools)

It appears that this significant drop in instructional time, roughly an 18% reduction in mathematics, may have played a role in the achievement decline.


The same report suggests decreased student engagement, specifically citing poor attendance and incomplete assignments in fully remote schools, as a contributing factor to the achievement drop. For example, more than one-tenth of remote students were absent on a daily basis (nearly double the in-person rate), while 25% of remote students failed to turn in assignments (compared to 14% of their in-person counterparts).


The take-away from these initial RAND findings? Student engagement plays a vital role in student achievement.


Why is student engagement so important?


Simply put, student engagement refers to the ability of an instructor to integrate tasks and activities into instruction to increase the likelihood that students will find meaning in the topic.


All of us can probably recall one of our teachers at SOME point in our education proclaiming, 'Pay attention - this will be on the test!' If you were a student who cared about your scores on tests, 'This is on the test' probably caught your attention. But student engagement needs to be integrated into the lesson to tap into deeper meaning and relevance. Teachers need to motivate student engagement that extends beyond high achievement on assessments.

The fact of the matter is that some students are poor test takers or otherwise struggle with tests in general. For those students, the promise of scoring well on an assessment won't motivate them to attend and engage. The three to five minutes prior to instruction give teachers a chance to help students recognize those connections and encourage them to engage in the lesson activity.


Madeline Hunter, whose seven-step lesson design, developed in the latter half of the 20th century, became a widely accepted practice in K12, termed this initial lesson engagement the 'Anticipatory Set.' Hunter's plan has been modified in recent years to better drive instruction toward higher-order thinking. In addition to informing students beforehand what they'll be learning, she emphasized the importance of capturing students' attention to 'bring them in' and providing some measure of intrinsic motivation to engage in the lesson.


Hunter's lesson model stipulates that in order for students to learn, they first need to engage. To engage, they need motivation. Without that motivation, without a rationale or reason to learn, students disengage. When students disengage, learning stops.


Hunter's model was developed before ed tech reached most classrooms. Yet K12s could benefit from applying similar lesson structure elements to help ensure that digital and online content engages students in actual learning. Teacher motivation of students, though important, is only one part of the equation.


It's equally essential that K12s implement digital resources that truly engage students, helping them make those sticky, meaningful, and relevant connections to the world around them. In doing so, they can perhaps begin to minimize the dramatic achievement drops now coming into view in a K12 world where technology has rapidly become the heart of classroom instruction.


What does student engagement look like online?


For the most part, the in-person observational model was thrown out the window during remote instruction, leaving K12s asking themselves the Million Dollar Question: 'What does digital engagement look like?'


Heads down, eyes fixed on their screen, busily tapping at keys solving for "x"? Certainly, anecdotal observation gives instructional teams 'some' data to record. Still, perhaps Schlechty's (2002) student engagement framework is a good starting point in helping to define what online engagement may actually look like.


Think about students in your class as you read the list below. Which levels are you able to observe in your work? Do you see the possible transfer of these in-class student engagement observables to remote learning?

[Infographic: Levels of student engagement (Deledao)]
  1. Authentic Engagement – students immersed in work/activities with personal meaning and value (e.g., discussing a topic of personal interest)

  2. Ritual Compliance – the work/activity has little or no immediate meaning to students, but students engage because of extrinsic outcomes or value (e.g., earning a high grade)

  3. Passive Compliance – students view work/activity as having little or no meaning but participate/expend effort to avoid negative consequences (e.g., failing the course)

  4. Retreatism – students are disengaged from the work/activity (e.g., texting on their phone) and make no attempt to comply, but are not disruptive to others. Learning is unlikely to occur.

  5. Rebellion – students refuse to do the assigned task, act disruptively, and attempt to substitute activities of their own, often disrupting others in the process (e.g., viewing social media sites that distract students within viewing distance). Learning does not occur.
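
Purely as an illustration (this isn't part of Schlechty's framework or any specific product), the five levels could be encoded as a simple typed scale so that in-class and remote observations get tagged the same way. The field names in the TypeScript sketch below are hypothetical.

    // Illustrative only: Schlechty's five engagement levels as a typed scale,
    // so that in-class and remote observations can be tagged consistently.
    type EngagementLevel =
      | "authentic-engagement"  // personal meaning and value; learning likely
      | "ritual-compliance"     // extrinsic motivators such as grades
      | "passive-compliance"    // effort only to avoid negative consequences
      | "retreatism"            // disengaged but not disruptive; learning unlikely
      | "rebellion";            // refuses the task and disrupts; learning does not occur

    // Hypothetical shape for a single logged observation.
    interface EngagementObservation {
      studentId: string;
      timestamp: Date;
      level: EngagementLevel;
      note?: string;
    }

    // Example: tagging a remote-learning observation.
    const observation: EngagementObservation = {
      studentId: "s-1042",
      timestamp: new Date(),
      level: "retreatism",
      note: "logged in, but no activity in the shared doc for 15 minutes",
    };

    console.log(`${observation.studentId}: ${observation.level}`);

Logging observations against a shared scale like this would at least let instructional teams compare in-class and remote engagement notes side by side.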


Can technology generate engagement and impact achievement?


Too often, digital learning tools are all sizzle and no steak. The wow factor of a digital resource can quickly run its course if it lacks substance or relevance, or if students come to see it as repetitious or devoid of any overt connection to the lesson's subject matter.


A perfect example of this is gamification in ed tech. As Louisa Rosenheck, a researcher at the MIT Education Arcade, explained in a recent Harvard Graduate School of Education Usable Knowledge article, "Humans like to learn and get better at things." Rosenheck observes that many games deemed 'educational' tend to focus on lower-level procedural skills rather than sparking intrinsic motivation or authentic engagement. Just as a lack of engagement in conventional classroom instruction can impede learning, the same holds for digital content.

In other words, to be beneficial, digital resources need to stretch a student's mind toward analysis, synthesis, and evaluation. It's not enough to deploy a technology application simply because the graphics or the user interface are attractive. For example, a biology teacher wouldn't purchase a textbook for her students just because the cover looks interesting, without ever examining the author's coverage of cellular respiration, right?


The same holds true for digital resources. Just as with textbooks and other core content, ensuring that the content reaches for higher-level thinking and leads students toward analysis and synthesis is critical to selecting effective digital tools.

It's far easier to monitor student engagement in an in-person instructional setting than in a remote one. However, that's not to say we can't glean more accurate information from remote students' online activity. Doing so requires tools that allow K12 instructional staff to report on, visualize, and analyze actual student activity levels during digital learning.
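
As a rough illustration of what that reporting layer might boil down to, here is a short TypeScript sketch that aggregates hypothetical per-student activity summaries into a class-level view. The data shapes and numbers are made up for this example; they are not any vendor's schema, Deledao's included.

    // Hypothetical per-student activity summary for one digital resource.
    interface StudentActivity {
      student: string;
      resource: string;       // e.g., "reading-app"
      activeMinutes: number;  // minutes with keyboard/mouse input
      idleMinutes: number;    // minutes the resource was open with no input
    }

    // Print a simple class-level report a teacher could scan quickly.
    function classReport(rows: StudentActivity[]): void {
      for (const row of rows) {
        const total = row.activeMinutes + row.idleMinutes;
        const activeShare = total === 0 ? 0 : Math.round((100 * row.activeMinutes) / total);
        console.log(`${row.student} | ${row.resource} | ${activeShare}% active of ${total} min`);
      }
    }

    // Example usage with made-up numbers.
    classReport([
      { student: "Jamal", resource: "reading-app", activeMinutes: 22, idleMinutes: 6 },
      { student: "Susie", resource: "math-assignment", activeMinutes: 9, idleMinutes: 31 },
    ]);

Even a crude "percent active" column tells an instructional team more than raw minutes logged on their own.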


Unquestionably, there ARE effective, research-based digital tools available on the market (AIMSweb and iReady, among others) proven to increase student achievement for core content and intervention students alike. But beyond 'Jamal reported reading for 28 minutes this week' or 'Susie got an 87% on her math assignment', data that measures students' active digital engagement is missing.

Detailed data on a student's activity (e.g., active mouse movements, keystrokes, idle detection) could be helpful as schools search for better ways to monitor and analyze the impact digital resources have on student achievement.
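
To make that concrete, here is a minimal sketch of what browser-side activity capture could look like, assuming a page script or extension that is allowed to listen for input events. The DOM event names (keydown, mousemove, scroll, click) are standard; the idle threshold, reporting interval, and endpoint are placeholders, not a description of how Deledao or any other product actually works.

    // Minimal sketch: classify time as "active" or "idle" based on input events.
    const IDLE_THRESHOLD_MS = 60_000; // treat 60 seconds without input as idle
    const TICK_MS = 5_000;            // account for time in 5-second slices

    let lastInputAt = Date.now();
    let activeMs = 0;
    let idleMs = 0;

    function recordInput(): void {
      lastInputAt = Date.now();
    }

    // Keystrokes, mouse movement, scrolling, and clicks all count as activity.
    ["keydown", "mousemove", "scroll", "click"].forEach((eventName) =>
      window.addEventListener(eventName, recordInput, { passive: true })
    );

    // Every tick, attribute the elapsed slice to active or idle time.
    setInterval(() => {
      if (Date.now() - lastInputAt > IDLE_THRESHOLD_MS) {
        idleMs += TICK_MS;
      } else {
        activeMs += TICK_MS;
      }
    }, TICK_MS);

    // Every five minutes, report a summary a dashboard could aggregate.
    // The endpoint is a placeholder, not a real API.
    setInterval(() => {
      const summary = {
        page: location.hostname,
        activeMinutes: Math.round(activeMs / 60_000),
        idleMinutes: Math.round(idleMs / 60_000),
      };
      void fetch("https://example.invalid/engagement", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(summary),
      });
    }, 5 * 60_000);

Even a rough active-versus-idle split like this is more informative than total minutes logged, though, as the conclusion below argues, it should be treated as one data point rather than the whole story.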


Conclusion


From an ROI standpoint, student engagement within a digital resource is ultimately the measuring stick for gauging that resource's success. Deploying browser-level technology that monitors keystrokes and mouse movements and detects idle periods gives K12s a useful data point for quantifying active student engagement.


Should active student engagement data be the centerpiece in determining the effectiveness of digital applications? Obviously not.


But it does provide a vital metric, a data point schools can use to paint a clearer picture of the impact active student engagement has on achievement within a digital application.


Deploying technology isn't a guarantee of student engagement, nor is it a foregone conclusion that students will learn as a result. Just as with in-person classroom engagement, educators delivering digital instruction, whether remotely or in person, should provide the same level of motivation to engage students' learning in a digital environment.

To stay up to date on helpful tips for K-12 schools, follow Deledao on LinkedIn, Twitter, or Pinterest.

Deledao can help with student engagement in a hybrid or remote classroom environment. Automatically detect if students are on YouTube or Spotify with Deledao's Live Classroom Management software so teachers can focus on teaching instead of policing students.


