
Dunce’s App

How Silicon Valley’s brand of behaviorism has entered the classroom

About five years ago, a cluster of new technologies began to migrate through the nation’s schools like a gaggle of fall geese. Schools have long devised policies and procedures to manage and shape students’ behavior. Sticker charts. Detentions. Referrals. Rewards. Educators routinely point to classroom management as one of the most important skills of a great teacher, and new teachers in particular are likely to say it is one of their most significant challenges. These novel apps, bearing names like ClassDojo and Hero K12, promised to help by collecting students’ behavioral data and encouraging teachers to project the stats onto their classroom’s interactive whiteboard in order to keep students “on task.” It is, they claim, all part of a push to create a “positive classroom culture.”

The apps come with the assurance of making schools operate more efficiently. But such management technologies don’t simply reflect Taylorism, with schoolwork monitored and fine-tuned for efficiency; they are part of a resurgence of behaviorism in education, and in education technology in particular.

In Ivy League institutions, behaviorism took hold long before the smartphone. Harvard University psychologist B.F. Skinner claimed that he came up with the idea for his “teaching machine” in 1953 while visiting his daughter’s fourth grade class. Skinner believed that all learning was a matter of shaping behaviors, and he contended that, much like the animals he trained in his lab, students should be taught through a system of rewards and reinforcement. Machines, he believed, could do this far more reliably than teachers. His teaching machine, Skinner argued, would address a number of flaws in the education system: it would enable students to move through lessons at their own pace and give them immediate feedback on their work.

Skinner was unsuccessful in convincing schools in the 1950s and ’60s to buy his teaching machines, but anyone who pays attention to the claims made by today’s education technology industry will recognize Skinner’s promises. These are the principles behind much of what gets touted as “personalization” today.

Skinner argued that teachers frequently provided the wrong sort of reinforcement, focusing on punishing students for misbehaving rather than rewarding them for learning something correctly. “Comparable results have been obtained with pigeons, rats, dogs, monkeys, human children, and psychotic subjects,” Skinner wrote in The Technology of Teaching in 1968. “In spite of great phylogenic differences, all these organisms show amazingly similar properties of the learning process. It should be emphasized that this has been achieved by analyzing the effects of reinforcement and by designing techniques which manipulate reinforcement with considerable precision. Only in this way can the behavior of the individual organism be brought under such precise control.”

Skinner’s theories have fallen out of favor in some education circles. Noam Chomsky, for one, wrote of Skinner’s behaviorism that “The tendencies in our society that lead toward submission to authoritarian rule may prepare individuals for a doctrine that can be interpreted as justifying it.”


But behaviorism never really went away. Today it shapes much of how new digital technologies are imagined and built. Stanford psychologist B.J. Fogg calls it “behavior design,” and his Persuasive Technology Lab teaches software engineers and entrepreneurs how to build products that manipulate and influence users by cultivating addiction, encouraging certain actions or behaviors and discouraging others.

According to Jacob Weisberg, some of Silicon Valley’s most successful app designers are alumni of the lab—now doing time at Google or Instagram—so it’s no surprise that investors have come to expect these sorts of habit-forming hooks and nudges in the products they fund, particularly in education. ClassDojo, perhaps the best known of the education startups making behavior management apps, has raised over $30 million in venture funding from some of the Valley’s luminaries: among them, Y Combinator’s Paul Graham, SV Angel, and Yuri Milner. Hero K12, another behavior management company, raised $150 million in private equity funding earlier this summer, one of the largest investments in education startups so far this year.

Hero K12, formerly known as PlascoTrac, allows schools to track student behavioral data—attendance, tardiness, detentions, and the like—with a suite of mobile and desktop apps that, as the website boasts, “track students in and out of anything.” The Hero K12 app can issue hall passes and tardy slips with barcodes that can in turn be scanned by teachers, administrators, and security officers. Behavioral data is aggregated in dashboards for principals to monitor, and parents can receive instant notifications when a “behavior incident” occurs. These apps encourage teachers and administrators to “track the good with the bad,” as the Hero K12 website puts it, “and reinforce positive behaviors that have the potential to ripple through your school.” A promotional video for the company suggests that students accumulate points to use for rewards like being able to cut ahead in the lunch line.


Hero K12 claims its apps have captured some 490 million “behavior scans” from some 2.8 million students. (That works out to roughly 175 scans per student.) ClassDojo, for its part, says that 90 percent of K-8 schools in the US have at least one teacher using its app. All this points to an incredible amount of behavioral data being collected in schools by these two companies alone.

A 2014 story in the New York Times took ClassDojo to task over privacy concerns, criticizing the app for recording sensitive information about students “without sufficiently considering the ramifications for data privacy and fairness, like where and how the data might eventually be used.” ClassDojo responded, listing “what the New York Times got wrong” and asserting that the app was designed to encourage positive feedback, not to serve as that old threat that “this will go down on your permanent record.”

But of course, that has always been the underpinning of behaviorism—an emphasis on positive reinforcement techniques in order to more effectively encourage “correct behavior.” “Correct behavior,” that is, as defined by school administrators and software makers. What does it mean to give these companies—their engineers, their designers—this power to determine “correct behavior”? How might corporate culture, particularly Silicon Valley culture, clash with schools’ cultures and values? These behavior management apps are, in many ways, a culmination of Skinner’s vision for “teaching machines”—“continuous automatic reinforcement.” But it is reinforcement now combined with a level of surveillance and control of students’ activities, in and out of the classroom, that Skinner could hardly have imagined.
