Press Release: Study Finds That New Technology, Relaxation of Protections Threaten Student Privacy
Federal government using grants to induce states to construct identical, increasingly sophisticated student-data systems
BOSTON – New technology allows advocates for education as workforce development to accomplish what has always been out of their reach: the collection of information on every child, beginning in preschool or even earlier, and the use of that data to track each child throughout his or her academic career and then through the workforce, according to a new study published by Pioneer Institute.
Cogs in the Machine: Big Data, Common Core, and National Testing
“It is an idea that dates back to the Progressive Era,” says Emmett McGroarty, a co-author of “Cogs in the Machine: Big Data, Common Core, and National Testing.” “It is based on a belief that government 'experts' should make determinations about what is successful in education, what isn't, and what sorts of education and training are most likely to produce workers who will help make the United States competitive in the global economy.”
At a time when privacy violations have become front-page news, the technology presents myriad threats to student privacy.
For many years the federal government has been using grants to induce states to construct identical and increasingly sophisticated student-data systems. More recently, the federal government has partnered with private entities to create and encourage states to participate in initiatives such as the Data Quality Campaign, the Early Childhood Data Collaborative, and the National Student Clearinghouse – all aimed at increasing the collection and sharing of student data. The National Education Data Model, with its suggestion of over 400 data points on each child, offers an ambitious target for the states in constructing their data systems.
None of the privacy protections currently in place reliably protect student data. Regulatory changes in recent years have gutted the federal Family Educational Rights and Privacy Act (FERPA), leaving no reliable protections in place for student data. And with Big Data, anonymizing an individual student's data is difficult.
Initiatives like the Workforce Data Quality Initiative, Unified Data Standards, MyData, ConnectEd, and student-unit records have sprung up to eliminate the technical obstacles to increased data-sharing. Private companies have donated education apps to schools in return for access to student information.
This treasure trove of student data is an attractive target for hackers, who've already begun their assaults.
In Promoting Grit, Tenacity and Perseverance, published last year by the U.S. Department of Education (USED), the authors expressed a strong interest in monitoring students' “beliefs, attitudes, dispositions, values and ways of perceiving oneself” and in measuring non-cognitive attributes such as their “psychological resources.”
The report says that researchers could employ “functional Magnetic Resonance Imaging (fMRI) and physiological indicators [that] offer insight into the biology and neuroscience underlying observed student behaviors.” It goes on to say that students can be hooked up to devices such as cameras that record facial expressions, chairs that record posture and movements, and skin sensors that measure student responses to classroom activities. “Informant reports,” in which a parent, teacher, or other observer judges a student's “grit, tenacity, persistence, and other psychological resources,” can gauge student attitudes and behaviors.
“This sort of character development and monitoring has traditionally been the domain of parents,” says “Cogs in the Machine” co-author Joy Pullmann. “But the Grit report clearly implies that families can't be trusted to inculcate values and attitudes.”
“Cogs in the Machine” also discusses the kinds of “fine-grained” data that can be collected on non-cognitive attributes through students' interaction with certain digital-learning platforms. “The manufacturers of these technologies certainly know what they mean for classrooms,” says “Cogs in the Machine” co-author Jane Robbins, “but few teachers are aware of it, and even fewer parents are.”
These expansive data structures are intimately connected to the Common Core State Standards Initiative and national testing. Any information from the data initiatives mentioned above that is provided to the two federally funded national assessment consortia aligned with the Common Core State Standards will be made available to the USED.
The national standards will also create a unified “taxonomy” that facilitates development of common instructional materials and data-collection technology. Because Common Core focuses not on academic knowledge but rather on “skills” that involve attitudes and dispositions, it paves the way for national assessments and digital platforms that measure such attributes.
The authors make a series of recommendations to protect student privacy. They include urging parents to ask what kinds of information are being collected on digital-learning platforms and whether the software will record data about their children's behaviors and attitudes rather than just academic knowledge. If parents object to such data collection, they should be able to opt out.
The authors also urge state lawmakers to pass student privacy laws, and they suggest that Congress correct the relaxation of FERPA.
Emmett McGroarty is executive director of the Education Project at the American Principles Project. Joy Pullmann is a research fellow at The Heartland Institute and managing editor of School Reform News, a national monthly publication. Jane Robbins is an attorney and a senior fellow with the American Principles Project in Washington, DC.