In a blog post by Diane Ravitch, we found the comments at the end more interesting and informative than the post itself:
August 13, 2016 at 3:06 pm
Another problem with online machine learning (a.k.a. adaptive learning or personalized learning) is that the online programs collect hidden data about the student users in order to get to know them and personalize the experience. This data goes far beyond simple records of correct or incorrect answers, and algorithms can be used to detect personality, behavior, and ability to focus, to name just a few.
Who sees and who uses this highly sensitive and predictive information? (It certainly isn’t transparent to the student, parent, or teacher.) Are the predictive analytics fair or accurate? Will this predictive data be used to predict job placement or future hires, as is already happening per the Wall Street Journal article linked below? Another author asks, “Is Personalized Learning Too Personal?”
“There is a great tension between how software is tracking the outcomes and the learning that students are doing and what happens to that data,” said Betsy Corcoran, chief executive and co-founder of EdSurge, an education technology news site that also sponsors conferences. “Who owns the data? Who is responsible for making sure that data is not abused in some way?”
Privacy advocates are calling for transparency and regulation of these “black box,” or secret, algorithms. The Federal Trade Commission agrees and has called on the industry to regulate itself, asking for algorithmic transparency.
Bottom line: Machines and algorithms should not be allowed to secretly predict people, period; but this is especially heinous when it is happening to children while at school under the guise of “personalized” learning.
Wall Street Journal: Bosses using data brokers to predict applicants. http://www.wsj.com/articles/bosses-harness-big-data-to-predict-which-workers-might-get-sick-1455664940
When Personalized Learning gets too personal. http://www.recode.net/2015/12/9/11621282/when-personalized-learning-gets-too-personal-google-complaint-exposes
FTC Data Brokers- Call for Transparency and Accountability https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf
The report focuses on the first three steps in the life cycle of big data within that industry (collection, compilation, and analytics) and discusses how information gathered for one purpose (e.g., an online assessment or video evidence in a data badge) could be compiled and analyzed for other purposes, such as marketing or risk mitigation in hiring, insurance, or lines of credit.
FTC, Big Data: A Tool for Inclusion or Exclusion? https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf
FTC: A Call for Algorithmic Transparency
Here is another comment, which was later removed:
“Digital learning is designed to segue into algorithm-driven workforce development.
Competency-based education and digital learning approaches are intended to reduce a child’s accomplishments into a series of “badges” that can be stored in online portfolios.
Tomorrow’s workforce can look forward to a “lifelong learning” scenario where people are compelled to accumulate more and more badges (paying for-profit, online companies for the opportunity and racking up piles of student debt in the process) as they chase job postings dictated by coded skill sets.
We need to make everyone aware that it’s not just academic skills that are being tracked. Social, emotional, and behavioral competencies are in the mix, too.
That is why companies like Parchment (allied with Pearson) are working so hard to position themselves as arbiters of education and workforce credentials. http://exchange.parchment.com/extend/
I’m kind of surprised this is news to you, Diane, as your son’s merchant banking company, Raine Group, is one of Parchment’s major investors.
Public School Parent