Privacy is key: Holding EdTech accountable

Schools made a quick pivot to online teaching in spring 2020 as the pandemic sent kids home to learn. But educators soon faced a host of data sharing issues, as classrooms moved to platforms neither designed for education nor in compliance with privacy laws. Joe Jerome (Common Sense Media) evaluates popular EdTech platforms with a view to security and the protection of students’ right to privacy.

This article is part of our dossier "Digital classrooms - Transatlantic perspectives on lessons from the pandemic".

[Image: On a computer screen, a cursor points to the word "Security"]

Twenty-first century education increasingly relies on computers, laptops, tablets, and other technology in the classroom and on cloud-computing services for a wide variety of academic and administrative functions. The Covid-19 pandemic has only accelerated this trend. The responsible use of technology can help educators enhance and personalize student learning and better use limited school resources. At the same time, private educational technology, or edtech, companies are becoming stewards of massive amounts of sensitive data about students.

In serving students, edtech serves an especially vulnerable population. Legal experts have deemed both educational information and children’s information to be sensitive data. Such information can lead to inferences, algorithmic or human, about abilities and intelligence. Vulnerable young people may share personal information readily with strangers without understanding the potential consequences. Data processed by edtech platforms, whether generated by children, teens, or adults, deserves special protection and care. It does not always receive this care.

During the pandemic, video conferencing apps and remote teaching became a necessity for teachers around the world. But privacy and security remained an afterthought, and the shift to online teaching has further exposed longstanding issues about privacy protection in schools. Remote learning via social media or other platforms was neither designed for education nor in compliance with (educational) privacy laws. While edtech companies are marketing an array of products as affordable, safe and efficient solutions to provide remote education and socially distant classrooms, the privacy ramifications warrant extra attention. A recent survey of 496 global edtech apps by the International Digital Accountability Council (IDAC), an independent watchdog, uncovered a number of questionable practices, including excessive sharing of location data with third parties and exposure of students’ personal information.[1]

What laws govern student privacy in the U.S.?

There is no comprehensive data protection law in the United States, so a combination of state and federal law governs how student data is processed. These laws vary in scope and cover different types of information; their proliferation also shows the limitations to protecting student privacy.

The basis for student privacy protection in the United States is the federal Family Educational Rights and Privacy Act (FERPA), which aims to protect the confidentiality of student education records.[2] Enacted in 1974, FERPA was a response to concerns that schools were haphazardly and secretly compiling detailed permanent records about their students. The law grants families the right to access and review their own education records, request corrections, and restrict certain disclosures of their records.[3] As a general rule, disclosing student data contained in educational records is prohibited without written consent.

However, FERPA has important limitations. For one, its age reflects a learning environment where “educational records” were held in office filing cabinets and not digitized in the cloud. The law allows for data sharing with “school officials” who have a “legitimate educational interest” in student data.[4] But lawmakers could not have foreseen that edtech platforms such as Google Classroom might one day be designated by schools as “school officials”. A further issue is that FERPA applies only to schools, creating gaps in protection when private edtech companies are involved in education. This has major ramifications for enforcement. The primary penalty for non-compliance is withholding federal funding from schools, but the U.S. Department of Education, which enforces the law, has been reluctant to strip money from resource-starved schools for privacy violations, mindful that those violations may originate with edtech vendors.

Though there have been congressional efforts to amend FERPA, the primary method for addressing student privacy has been via the passage of state laws. Concerns among parents about excessive data collection set off a flurry of legislative activity, with 41 states passing 126 new student privacy laws since 2013. State laws looked to shore up security protections or establish school-based chief privacy officers and other governance mechanisms for student data. Starting with California’s Student Online Personal Information Protection Act (SOPIPA), states enacted laws to regulate edtech vendors themselves.

Common Sense Media worked with lawmakers to draft and introduce SOPIPA in 2014, after hearing from educators that they were overwhelmed by edtech services and practices. These sites and services were increasingly collecting student information, sometimes because teachers had asked students to use a service, and were doing so without contracts with local school districts. Schools faced the choice of developing their own educational software or services, or using vendors who were not obligated to comply with student privacy requirements or take on the burden of securing student information. We worked with legislators in California, including the bill’s author Senator Darrell Steinberg, to make edtech companies responsible for protecting student privacy and keeping student data secure.

SOPIPA applies to edtech apps, sites and services regardless of whether contractual restrictions are in place. It prohibits edtech providers from using any data collected to target ads, create advertising or commercial profiles about students, or sell student information.[5] Somewhat unusually for American privacy law, it establishes flat prohibitions, not rules that companies can maneuver around by obtaining often-confusing “consent” from overburdened parents or students. Many states across the country copied key tenets of SOPIPA.

Other general-purpose privacy laws such as the Children’s Online Privacy Protection Act (COPPA) and the new California Consumer Privacy Act (CCPA) can also regulate edtech. Even the European Union’s General Data Protection Regulation (GDPR) has spurred discussion about how schools and their edtech vendors should be protecting the student data they process. However, comprehensive privacy proposals in the United States often exempt data collected under student privacy laws or, more problematically, can override existing protections.

Legal protections are important, but they do not address all the factors that complicate the student privacy landscape for school officials. Technical issues and legacy systems do not easily comport with a set of complex, confusing, and overlapping laws and regulations -- all against a backdrop of rapidly changing needs and expectations of educators, students, and parents.

Privacy challenges for schools

Whether the task is information security or data protection, schools have long been under-resourced and have faced competing priorities. Most student privacy laws provide neither funding nor training for implementation. Compared to large enterprises, schools have less funding and technical expertise (even though in many instances they hold the same amount of, if not more, sensitive information). Even large school districts are hard-pressed to keep up with the continual security alerts, patches, and updates needed to maintain secure systems of their own. Educators are increasingly reliant on edtech vendors for basic school tasks.

Evaluating the breadth and scope of edtech applications is challenging. Teachers, parents, and students are inundated by app marketing, and school districts already use dozens of different educational platforms. Further, Covid-19 has forced schools, parents, technology providers, and students to seek online alternatives to in-class learning. Privacy protection sometimes fell by the wayside in the rush to adopt easy-to-use, general-audience services that were not intended to handle educational or child data.

Targeted primarily at consumers and corporate users, Zoom was the poster child for this shift. Its ease of use belied the fact that the service was not designed with privacy or security in mind. Users were confused as the company quickly rolled out new service updates, and school districts often reacted by banning the service altogether. Apps and services that could support remote learning quickly had to adapt their products and privacy policies to meet the needs of students and educators. Distance learning technologies ranging from remote test monitoring to new devices were deployed without the safeguards or considerations that schools and districts should normally apply when it comes to student privacy, generating concern among privacy groups across the globe.

Trust but verify 

The reality is that schools are outmatched. They not only need more resources and personnel to devote to privacy and information security, but they need more help overseeing and monitoring edtech companies generally. Civil society organizations have led ongoing efforts to improve the data processing practices of edtech.

While Common Sense worked to establish legal protections, we also launched a comprehensive edtech privacy program in 2016 in response to educators’ repeated privacy concerns. Working with a consortium of school districts, Common Sense evaluates privacy policies to help parents and teachers make informed choices about learning tools, so schools can be active partners in improving the data practices of their edtech vendors.[6] We have routinely heard from schools that the process of negotiating individualized data privacy agreements is difficult and time-consuming. Our evaluations break down complex policies based on a broad range of legal requirements, including COPPA, FERPA, and SOPIPA, and on industry best practices, to help educators assess safety, privacy, security, and compliance for edtech apps, platforms, and services. We believe our evaluations help level the playing field by providing schools and districts of all sizes with better, standardized information about potential edtech products.

Common Sense has identified a set of questions and practices that automatically trigger warning ratings, including selling data, third-party marketing, creating profiles that are not associated with any educational purpose, and using data to target advertisements.

 

In short, we believe students’ personal information should never be used to target them with advertising or ever be sold, rented, or licensed to third parties. Further, an educational product that allows young people to be contacted by third-party companies for advertising or marketing purposes risks exposing children to inappropriate influences that exploit their vulnerability. Companies should not allow third parties to use a child’s or student’s data to create a profile, engage in data enhancement or social advertising, or target advertising. This means edtech companies must ensure their products are not used by third parties to track children’s or students’ actions over time and across online services and devices. To fail our evaluation completely, a company must neglect to provide the public with even a rudimentary understanding of how it protects privacy. We specifically look at whether the edtech product (1) has a publicly available privacy policy, (2) supports and uses encryption when users are logged in, and (3) uses online trackers.
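The baseline checks and automatic warning triggers described above can be sketched as a small rating function. This is a hypothetical illustration, not Common Sense’s actual rubric; the field names are invented for the example:

```python
# Hypothetical sketch of the screening logic described above.
# Field names and the fail/warning criteria are illustrative only,
# not Common Sense's published evaluation methodology.

def rate_product(product: dict) -> str:
    # Fail: the product lacks even a rudimentary public privacy posture,
    # e.g. no public privacy policy or no encryption for logged-in users.
    if not product.get("public_privacy_policy"):
        return "fail"
    if not product.get("encrypts_logged_in_sessions"):
        return "fail"

    # Warning: practices that automatically trigger a warning rating.
    warning_flags = (
        "sells_data",                # selling, renting, or licensing data
        "third_party_marketing",     # third-party marketing contact
        "non_educational_profiles",  # profiling with no educational purpose
        "targeted_ads",              # using data to target advertisements
    )
    if any(product.get(flag) for flag in warning_flags):
        return "warning"

    return "pass"

# A product that is transparent but targets ads gets a warning:
print(rate_product({"public_privacy_policy": True,
                    "encrypts_logged_in_sessions": True,
                    "targeted_ads": True}))  # -> warning
```

The point of the sketch is the two-tier structure: transparency failures are disqualifying outright, while risky business practices downgrade an otherwise transparent product to a warning.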

Of the 150 policies Common Sense assessed in 2019, approximately 20% of apps received passing ratings. Sixty percent came with caveats or concerns, while another 20% of apps failed our assessment completely, suggesting basic privacy and security failings. This is an improvement over 2018, suggesting that emerging norms about limits on student data and new regulatory oversight are having an impact. Still, the majority of edtech products we evaluate continue to raise cautionary flags, even as many are distributed to schools and students across the globe. This is particularly concerning because our evaluations focus on basic transparency and public disclosures, which are fundamental elements of ensuring trust in a company’s privacy practices.

The overall lack of transparency is especially troubling since we have found transparency to be a reliable indicator of quality: applications and services that are more transparent also tend to engage in qualitatively better privacy and security practices.

Trends in edtech platforms

Edtech products serve a wide variety of classroom needs from one-off applications to comprehensive learning platforms. Major technology companies like Apple and Google are increasingly dominant, though traditional education companies like McGraw-Hill and new entrants such as Edmodo[7] also offer teachers platforms for communication, collaboration, learning management, content delivery, and student assessment.

We speak regularly with educators across the United States to identify the most popular edtech platforms deployed school- or district-wide. The following chart shows our full scores for Microsoft Teams, Google Classroom, and Apple School Manager, based on our comprehensive assessments of their policies with respect to the risky privacy practices discussed above. A higher score (up to 100%) means the product provides more transparent privacy policies with better practices to protect user data. The score is best used as an indicator of how much additional work a person will need to do to make an informed decision about a product.

 

Concerns                                                                Microsoft Teams   Google Classroom   Apple School Manager
Data Collection: Protecting personal information                              65                 50                  65
Data Sharing: Protecting data from third parties                              95                 90                  90
Data Security: Protecting against unauthorized access                         95                 95                  95
Data Rights: Controlling data use                                             95                 95                  95
Data Sold: Preventing sale of data                                            45                 70                  60
Data Safety: Promoting responsible use                                        75                 60                  80
Ads & Tracking: Prohibiting the exploitation of users' decision making        60                 65                  85
Parental Consent: Protecting children's personal information                  60                 80                 100
School Purpose: Following student data privacy laws                           35                 70                  70

Statute
California Online Privacy Protection Act (CalOPPA)                            77                 79                  81
Children's Online Privacy Protection Act (COPPA)                              69                 72                  83
Family Educational Rights and Privacy Act (FERPA)                             63                 69                  79
Student Online Personal Information Protection Act (SOPIPA)                   63                 76                  80
General Data Protection Regulation (GDPR)                                     84                 85                  86

 

Apple and Google both received a passing designation, because their policies clearly disclosed that they do not sell student data, and do not use student data for third-party marketing, advertising, tracking, or ad profiling purposes. We have given Microsoft Teams a warning because its policies disclose that it engages in practices designed to profit from user data, such as third-party marketing, targeted advertising, and tracking children and student users.
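Common Sense does not publish a single aggregation formula for these category scores, but as a rough, hypothetical illustration, even a plain average of the nine concern scores in the chart above tracks the designations just described:

```python
# Hypothetical aggregation: a plain average of the nine concern scores
# from the chart above. This is NOT Common Sense's actual scoring formula,
# just an illustration of how the category scores separate the products.

concern_scores = {
    "Microsoft Teams":      [65, 95, 95, 95, 45, 75, 60, 60, 35],
    "Google Classroom":     [50, 90, 95, 95, 70, 60, 65, 80, 70],
    "Apple School Manager": [65, 90, 95, 95, 60, 80, 85, 100, 70],
}

for product, scores in concern_scores.items():
    average = sum(scores) / len(scores)
    print(f"{product}: {average:.1f}")
```

Under this naive average, Apple School Manager scores highest (about 82) and Microsoft Teams lowest (about 69), consistent with the passing and warning designations above.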

 

Overall compliance trends

This small snapshot of some of the major edtech services illustrates the range of transparency as well as the inconsistent or unclear practices of edtech apps and services targeted toward children and students. That said, trends are improving. U.S. state privacy laws like SOPIPA and those that followed have established a legal baseline prohibiting the sale of students’ personal information and the use of this information to display behavioral advertising, or at least prohibiting this without clear, affirmative parental consent.

Since 2018, Common Sense has also seen improved disclosures about, and a resulting ability of users to exercise, their privacy rights in data access, data modification, user deletion, user export, and opt-out consent. We attribute these trends to growing awareness of new data privacy laws, such as the EU’s GDPR and California’s CCPA. The GDPR provides clear data rights and allows people to withdraw consent or object to high-risk data processing. The CCPA incorporates education records, as defined by FERPA, in its definition of personal information, and it broadly covers data linked to individuals. The CCPA also includes some of the individual rights found in the EU and imposes opt-in requirements for the sale of personal information of minors under age sixteen to third parties.

Of the 157 questions we use to assess company policies, 60 are responsive to specific GDPR requirements or provisions.[8] For example, in addition to assessing how policies discuss individual data rights, we assess whether vendors categorize themselves as data controllers or data processors and identify a data protection officer for purposes of GDPR compliance. Questions also cover how companies make disclosures about elements of the GDPR including purpose specification, special data categories, and how profiling and automated decision-making may be used. These subjects make up approximately 40% of Common Sense’s evaluations. Our analysis suggests that the majority of companies we evaluated took steps to update their privacy policies in 2018 to disclose qualitatively better practices that allow users to access, review, modify, delete, and export personal information.

The road ahead

While Common Sense believes greater transparency in edtech would help school officials protect the privacy of students and young people in their classrooms, more must also be done. Part of this is educating teachers about the importance of privacy generally, but schools also need the resources and leverage to deploy privacy-protective technologies in the classroom, particularly when teachers are looking enviously at the basic functionality provided in general audience consumer products.

Our efforts to review the disclosures of edtech companies need to be augmented by investigations, led by technical experts, into the privacy and security practices of edtech platforms. Only a handful of U.S. school districts have this capability, but other privacy researchers have launched efforts, such as the International Digital Accountability Council and AppCensus.[9] Efforts such as Consumer Reports’ Digital Standard could also have utility in the education sector.

Active regulatory oversight and, ultimately, enforcement are also needed. Self-regulatory efforts like the U.S.-based Student Privacy Pledge may have raised awareness of privacy concerns and created an expectation among school officials that responsible vendors have “joined the pledge,” but many questionable business practices, including the licensing of student data and its use in generating anonymous profiles, are still employed by some signatories. Privacy experts and advocacy groups have been calling for regulatory action and further guidance for years. Nearly three years after the U.S. Department of Education and the Federal Trade Commission suggested further guidance was forthcoming, little new regulatory activity has emerged. Furthermore, no U.S. state has announced a public investigation of any edtech provider under its state student privacy laws.

Unfortunately, educators and parents, alongside policymakers and regulators, still need to pay more attention to the privacy and security practices of technology platforms that affect tens of millions of children on a daily basis: educational software and other applications used in schools and by children outside the classroom. The ongoing Covid-19 pandemic is likely to further exacerbate this problem, but it is essential that educators, parents, and policymakers engage in an open dialogue with vendors to build solutions that strengthen our children’s privacy and security protections.


 

[1] IDAC brings together a staff of lawyers and technologists to work with developers, platforms, and regulators to improve privacy practices across the app ecosystem. Its board includes members from the German Marshall Fund, the Brookings Institution, and the Future of Privacy Forum. https://digitalwatchdog.org/about/.

[2] The Protection of Pupil Rights Amendment (PPRA) governs the administration to students of surveys and certain physical examinations of minors. See 20 U.S.C. § 1232h.

[3] 20 U.S.C. § 1232g.

[4] 34 CFR § 99.31.

[5] SOPIPA also puts in place disclosure limitations, data security requirements, data deletion requirements, and controls on data sharing with educational researchers and educational agencies performing school functions. Student Online Personal Information Protection Act, CAL. BUS. & PROF. CODE § 22584.

[6] Today, with the involvement of over 250 schools and districts across the U.S., we are working in collaboration with product developers to bring greater transparency to privacy policies across the board.

[7] Edmodo was acquired by a Chinese firm in 2018, raising additional international privacy issues.

[8] Our assessment cites the GDPR in 175 instances across all questions.

[9] AppCensus is the culmination of several academic research projects focused on mobile app privacy and security led by Nathan Good and Serge Egelman.