In June 2021, as students and teachers were finishing a difficult school year, Priscilla Chan, wife of Facebook founder and CEO Mark Zuckerberg, made a live virtual appearance on the “Today” show. She announced that the Chan Zuckerberg Initiative (CZI), along with its “partner” Gradient Learning, was launching Along, a new digital tool meant to help students and teachers create meaningful connections in the aftermath of the pandemic.
According to CZI and Gradient Learning, the science behind Along shows that students who form deep connections with teachers are more likely to succeed in school and less likely to show “disruptive behaviors,” resulting in fewer suspensions and lower dropout rates. To help form those deep connections, the Along platform offers prompts such as “What is something that you really value and why?” or “When you feel stressed out, what helps?” Students may then, on their “own time, in a space where they feel safe,” record a video of themselves responding to these questions and upload it to the Along program.
CZI, the LLC set up by Zuckerberg and Chan to give away 99 percent of their Facebook shares, is one of many technology companies that have created software products claiming to address the social and emotional needs of children. And school districts appear to be rapidly adopting these products to integrate students’ social and emotional skills into the school curriculum, a practice commonly called social-emotional learning (SEL).
Panorama Education—whose financial backers also include CZI as well as other Silicon Valley venture capitalists such as the Emerson Collective, founded by Laurene Powell Jobs, the widow of Apple cofounder Steve Jobs—markets a survey application for collecting data on students’ social-emotional state that is used by 23,000 schools serving a quarter of the nation’s students, according to TechCrunch.
Gaggle, which uses students’ Google and Microsoft accounts to scan for keywords and collect social-emotional-related data, has contracts with at least 1,500 school districts, Education Week reports.
Before the pandemic temporarily shuttered school buildings, the demand for tracking what students do while they’re online, and how that activity might inform schools about how to address students’ social and emotional needs, was mostly driven by desires to prevent bullying and school shootings, according to a December 2019 report by Vice.
Tech companies that make and market popular software products such as GoGuardian, Securly, and Bark claim to alert schools of any troubling social-emotional behaviors students might exhibit when they’re online so that educators can intervene, Vice reports, but “[t]here is, however, no independent research that backs up these claims.”
COVID-19 and its associated school closures led to even more concerns about students’ “anxiety, depression and other serious mental health conditions,” reports EdSource. The article points to a survey conducted from April 25 to May 1, 2020, by the American Civil Liberties Union (ACLU) of Southern California, which found that 68 percent of students said they were in need of mental health support post-pandemic.
A major focus of CZI’s investment in education is its partnership with Summit Public Schools to “co-build the Summit Learning Platform to be shared with schools across the U.S.” As Valerie Strauss reported in the Washington Post in 2019, following the release of a critical research brief by the National Education Policy Center at the University of Colorado Boulder, Summit Public Schools spun off TLP Education to manage the Summit Learning program, which includes the Summit Learning Platform, according to Summit Learning’s user agreement. TLP Education has since become Gradient Learning, which has at this point placed both the Summit Learning program and Along in 400 schools serving 80,000 students.
Since 2015, CZI has invested more than $280 million in developing the Summit Learning program. This total includes $134 million in reported contributions revenue to the Summit Public Schools 501(c)(3) from 2015 to 2018; another $140 million in reported awards, posted since 2018, to Summit Public Schools, Gradient Learning, and TLP Education (as well as organizations that helped develop their SEL tools); and a further $8 million given to “partner” organizations listed on the Along website—which include GripTape, Character Lab, Black Teacher Collaborative, and others—and to their evaluations by universities.
One enticement education technology companies use to get schools to adopt Along and other student monitoring products is to offer them for free, at least for a trial period, or for longer terms depending on the level of service. But “free” doesn’t mean without cost.
As CZI funds and collaborates with its nonprofit partners to expand the scope of student monitoring software in schools, Facebook (aka Meta) is actively working to recruit and retain young users on its Facebook and Instagram applications.
That CZI’s success at getting schools to adopt Along might come at the cost of exploiting children became clear when Facebook whistleblower Frances Haugen, a former employee who made tens of thousands of pages of the company’s internal documents public, disclosed that Facebook is heavily invested in creating commercial products for younger users, including an Instagram Kids application intended for children under 13. While Facebook executives discussed the known harms of their products on “tweens,” they nevertheless forged ahead, ignoring suggestions from researchers on ways to reduce the harm. As Haugen explained, “they have put their astronomical profits before people.”
The information gathered from SEL applications such as Along will likely be used to build out the data infrastructure that generates knowledge used to make behavioral predictions. This information is valuable to corporations seeking a competitive edge in developing technology products for young users.
Schools provide a useful testing ground to experiment with ways to hold the attention of children, develop nudges, and elicit desirable behavioral responses. What these tech companies learn from students using their SEL platforms can be shared with their own product developers and other companies developing commercial products for children, including social media applications.
Yet Facebook’s own internal research confirms social media is negatively associated with teen mental health, and this association is strongest for those who are already vulnerable—such as teens with preexisting mental health conditions, those who are from socially marginalized groups, and those who have disabilities.
Although Facebook claimed it was putting the Instagram Kids app “on hold” in September 2021, a November 2021 study suggests the company continues to harvest data on children.
There are legislative restrictions governing the collection and use of student data.
The Family Educational Rights and Privacy Act (FERPA) protects the privacy of student data collected by educational institutions, and the Children’s Online Privacy Protection Act (COPPA) requires commercial businesses to obtain parental consent before gathering data from “children under 13 years of age.” Unfortunately, if a commercial contract with a school or district designates that business a “school official,” the business can extract children’s data, leaving the responsibility to obtain consent with the school district.
While these agreements contain information relating to “privacy,” the obfuscatory language and lack of alternative options mean the “parental consent” obtained is neither informed nor voluntary.
Although these privacy policies contain data privacy provisions, there’s a caveat: Those provisions don’t apply to “de-identified” data, i.e., personal data from which “unique identifiers” (e.g., names and ID numbers) have been removed. De-identified data is valuable to tech corporations because it is used for research, product development, and improvement of services; however, such data is relatively easy to re-identify. “Privacy protection” just means it might be a little more difficult to find an individual.
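To see why stripping names and ID numbers offers only thin protection, consider a minimal sketch of a re-identification attack. The records and field names below are entirely hypothetical, not drawn from any actual SEL product: once a “de-identified” export is joined against another dataset on ordinary attributes such as birth year, ZIP code, and school, individual students can often be singled out again.

```python
# Illustrative sketch: re-identifying "de-identified" records by joining
# on quasi-identifiers. All records and field names here are hypothetical.

# A "de-identified" export: names and student IDs removed, but ordinary
# attributes (quasi-identifiers) remain alongside sensitive survey data.
deidentified_records = [
    {"birth_year": 2009, "zip": "80301", "school": "Lincoln MS", "survey": "reported high anxiety"},
    {"birth_year": 2010, "zip": "80302", "school": "Lincoln MS", "survey": "reported low stress"},
]

# A second, public or commercial dataset that still carries names.
directory = [
    {"name": "Student A", "birth_year": 2009, "zip": "80301", "school": "Lincoln MS"},
    {"name": "Student B", "birth_year": 2010, "zip": "80302", "school": "Lincoln MS"},
]

QUASI_IDENTIFIERS = ("birth_year", "zip", "school")

def reidentify(records, directory):
    """Match de-identified records back to named people by joining on
    the quasi-identifier fields the two datasets share."""
    for record in records:
        key = tuple(record[f] for f in QUASI_IDENTIFIERS)
        matches = [p["name"] for p in directory
                   if tuple(p[f] for f in QUASI_IDENTIFIERS) == key]
        if len(matches) == 1:  # a unique match defeats the "de-identification"
            yield matches[0], record["survey"]

for name, sensitive in reidentify(deidentified_records, directory):
    print(f"{name}: {sensitive}")
```

Run as written, the sketch links both “anonymous” survey responses back to named students. Privacy researchers have long shown that a handful of such quasi-identifiers (famously birth date, gender, and ZIP code) is enough to uniquely identify most people, which is why “de-identified” is a far weaker guarantee than it sounds.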
What privacy protection doesn’t mean is that the privacy of children is protected from the “personalized” content delivered to them by machine algorithms. It doesn’t mean the video of a child talking about “the time I felt afraid” isn’t out there floating in the ether, feeding the machines to adjust their future.
The connections between the Along platform and corporate technology giant Facebook are a good example of how these companies can operate in schools while maintaining their right to use personal information of children for their own business purposes.
Given concerns that arose in a congressional hearing in December 2021 about Meta’s Instagram Kids application, as reported by NPR, there is reason to believe these companies will continue to skirt key questions about how they play fast and loose with children’s data and substitute a “trust us” doctrine for meaningful protections.
As schools ramp up these SEL digital tools, parents and students are increasingly concerned about how school-related data can be exploited. According to a recent survey by the Center for Democracy and Technology, 69 percent of parents are concerned about their children’s privacy and security protection, and large majorities of students want more knowledge and control of how their data is used.
Schools are commonly understood to be places where children can make mistakes and express their emotions without their actions and expressions being used for profit, and school leaders are customarily charged with the responsibility to protect children from any kind of exploitation. Digital SEL products, including Along, may be changing those expectations.
Anna L. Noble is a doctoral student in the School of Education at the University of Colorado Boulder.
___________________
Independent Media Institute