Flipped instruction and self-regulated learning

From Robert Talbert’s “Flipped learning skepticism: Can students really learn on their own?”:

[Students’] primary experience is with pedagogy that emphasizes dependence. They are brainwashed through years of instructor-centered pedagogy that they are helpless when it comes to learning. But the fact remains that as human beings, they retain the capacity to learn and to regulate their learning. It’s just difficult and takes time and patience.

I’ve long believed that the most important quality a good teacher must possess is the belief that all students can learn. I’ve also come to believe that self-directed learning is a valuable skill that must be explicitly nurtured and sustained from early childhood education on. As Talbert notes, we all start with it, but it can fade quickly in discouraging environments.


Balancing human-human and human-computer interaction

A fundamental challenge in implementing personalized learning is determining just how much of it should be personal—or interpersonal, to be more specific. Carlo Rotella highlights the tension between the customization afforded by technology and the machine interface needed to collect the data supporting that customization. He zeroes in on the crux of the problem:

For data to work its magic, a student has to generate the necessary information by doing everything on the tablet.

That invites worries about overuse of technology interfering with attention management, sleep cycles, creativity, and social relationships.

One simple solution is to treat the technology as a tool that is secondary to the humans interacting around it, with expert human facilitators knowing when and how to turn the screens off and refocus attention on the people in the room. As with any tool, recognizing when it is hindering rather than helping will always remain a critical skill in using it effectively.

Yet navigating the human-to-data translation remains a tricky concern. In some cases, student data or expert observations can be coded and entered into the database manually, if worthwhile. Wearable technologies (e.g., Google Glass, Mio, e-textiles) seek to shorten the translation distance by integrating sensory input and feedback more seamlessly into the environment. Electronic paper, whiteboards, and digital pens provide alternate data-capture methods through familiar writing tools. While these tools bring the technology closer to the human experience, they require more analysis to convert the raw data into manipulable form, and they raise the question of whether the answer to too much technology is still more technology. Instructional designers will always need to weigh the costs and benefits: when is intuitive human observation and reflection superior, and when is technology-enhanced aggregation and analysis?

On the realistic use of teaching machines

From the perspective that all publicity is good publicity, the continued hype-and-backlash cycle in media representations of educational technology is helping to fuel interest in its potential use. However, misleading representations, even artistic or satirical ones, can skew the discourse away from realistic discussions of the true capacity and constraints of the technology and its appropriate use. We need honest appraisals of strengths and weaknesses to inform our judgment of what to do, and what not to do, when incorporating teaching machines into learning environments.

Adam Bessie and Arthur King’s cartoon depiction of the Automated Teaching Machine conveys dire warnings about the evils of technology based on several common misconceptions regarding its use. One presents a false dichotomy between machine and teacher, portraying the goal of technology as replacing teachers through automation. While certain low-level tasks like marking multiple-choice questions can be automated, other aspects of teaching cannot. Even while advocating for greater use of automated assessment, I note that it is best used in conjunction with human judgment and interaction. Technology should augment what teachers can do, not replace it.

A second misconception is that educational programs are just Skinner machines that reinforce stimulus-response links. The very premise of cognitive science, and thus the foundation of modern cognitive tutors, is the need to go beyond observable behaviors to draw inferences about internal mental representations and processes. Adaptations to student performance are based on judgments about internal states, including not just knowledge but also motivation and affect.

A third misconception is that human presence tracks the quality of teaching and learning taking place. What matters is the quality of the interaction, between student and teacher, between student and peer, and between student and content. Human presence is a necessary precondition for human interaction, but it is neither a guarantee nor a perfect correlate of productive human interaction for learning.

Educational technology definitely needs critique, especially in the face of its possible widespread adoption. But those critiques should be based on the realities of its actual use and potential. How should the boundaries between human-human and human-computer interaction be navigated so that the activities mutually support each other? What kinds of representations and recommendations help teachers make effective use of assessment data? These are the kinds of questions we need to tackle in service of improving education.

Messy personalized learning

Phil Nichols describes his youthful adventures reappropriating the humble graphing calculator to program games:

For me, it began with “Mario” — a TI-BASIC game based loosely on its Nintendo-trademarked namesake. In the program, users guided an “M” around obstacles to collect asterisks (coins, presumably) across three levels. Though engaging, the game could be completed in a matter of minutes. I decided to remedy this by programming an extended version. I studied the game’s code, copying every line into a notebook then writing an explanation beside each command. I sought counsel from online tutorials, message boards, and chat rooms. I sketched new levels on graph paper, strategically placing asterisks in a way that would present a challenge to experienced players. Finally, after a grueling process of trial and error, I transformed my designs into code for three additional stages.

As he summarizes, his non-school-sanctioned explorations of an otherwise school-based tool led to sophisticated discoveries and creations:

[W]ith the aid of my calculator, I’d crafted narratives, drawn storyboards, visualized foreign and familiar environments and coded them into existence. I’d learned two programming languages and developed an online network of support from experienced programmers. I’d honed heuristics for research and discovered workarounds when I ran into obstacles. I’d found outlets to share my creations and used feedback from others to revise and refine my work. The TI-83 Plus had helped me cultivate many of the overt and discrete habits of mind necessary for autonomous, self-directed learning. And even more, it did this without resorting to grades, rewards, or other extrinsic motivators that schools often use to coerce student engagement.

While he positions calculator programming as a balance between the complementary educational goals of “convention” and “subversion,” this also echoes the tradeoffs between routine expertise and adaptive expertise, between efficiency and creativity, or between convergent and divergent thinking. Tipping too far toward convention remains an ongoing risk in overly restrictive learning environments. Standards that dictate the time and sequence of each stage of students’ progression fail to allow for the different paths that personalization accommodates. Yet even adaptive learning systems that seek to anticipate every next step a student might take must be careful not to add so many constraints that they crowd out productive paths the student might otherwise have pursued. Personalized learning needs to leave room for error and open-ended discovery, because some things just aren’t known yet.

Why personalized learning and assessment?

Much of the recent buzz in educational technology and higher education has focused on issues of access, whether through online classes, open educational resources, or both (e.g., massive open online courses, or MOOCs). Yet access is only the beginning; other questions remain about outcomes (what to assess and how) and process (how to provide instruction that enables effective learning). Some anticipate that innovations in personalized learning and assessment will revolutionize both, while others question their effectiveness given broader constraints. The goal of this blog is to explore both the potential promises and pitfalls of personalized and adaptive learning and assessment, to better understand not just what they can do, but what they should do.

Technology and the brain

Examining the benefits of technology on the brain…

Beyond paddling: children and technology

One of the most sensible articles yet published on children, technology and the brain has just appeared in the scientific journal Neuron. It’s titled “Children, Wired: For Better and for Worse” and has been made open-access so you can read it in full online.

…as well as the potential drawbacks of its overuse.

Your Brain on Computers – Studying the Brain Off the Grid, Professors Find Clarity

Five scientists spent a week in the wilderness to understand how heavy use of technology changes how we think and behave.

(Unfortunately, they confounded a natural landscape with being unplugged, but presumably the real research being planned would address that.)