When Profit Margins Require Children’s Learning Data
As educational technology becomes increasingly embedded in daily learning, students are generating vast amounts of data with every interaction. Behind colourful interfaces and gamified learning experiences lies an uncomfortable reality: children’s educational journeys are being converted into valuable data assets that fuel corporate profit margins.
Examining the terms of service of popular edtech platforms reveals the stark truth. Buried in legal language are broad rights to aggregate, analyze, and monetize the data generated through student interactions. What appears on the surface as an engaging educational tool often functions simultaneously as a sophisticated data harvesting mechanism.
This raises a fundamental ethical question: Is it right for companies to generate profit from data produced by children who cannot meaningfully consent to its collection, let alone comprehend how it might be used?
The typical response from edtech companies is that they’re creating a win-win scenario: students get engaging educational tools, and companies get the data they need to improve those tools. But this framing neatly sidesteps the deeper ethical issues at play.
First, there’s the fundamental power imbalance. Children in primary and secondary schools have no meaningful choice about which educational technologies they use. These decisions are made for them by adults – parents, teachers, administrators – who themselves may not fully understand the data practices they’re agreeing to on children’s behalf. When a ten-year-old uses an educational app in school, they’re not making a consumer choice; they’re following a requirement.
Then there’s the question of temporal vulnerability. Data collected today may be used in ways we cannot currently predict. When companies claim perpetual rights to student data, they’re making claims on a future that neither they nor we can fully envision. A child who generates data in primary school may find that same data influencing algorithms that affect their opportunities as adults.
The profit motive adds another layer of ethical complexity. When a company’s primary obligation is to shareholders rather than students, which interest prevails when they conflict? If a feature that boosts engagement (and therefore data collection) doesn’t actually improve learning outcomes, which path will the company choose?
There’s a troubling parallel to the educational inequalities and the Matthew Effect discussed previously. Just as AI threatens to widen existing educational divides, the commercialization of student data risks creating a two-tier system: one where privileged students enjoy data privacy and ownership, while disadvantaged students become unwitting data sources for corporate profit.
Consider the contrast between wealthy private schools that can negotiate favourable terms with edtech providers (or opt out entirely) and underfunded public schools that depend on “free” tools subsidized by data extraction. Once again, “to those who have, more will be given.”
There’s also a fundamental question about what we’re teaching children through these arrangements. If we allow their learning journeys to be mined for profit without their meaningful consent, what lesson are we imparting about the value of their privacy, their agency, their very experience of learning?
This isn’t an argument for rejecting edtech outright – that would be a form of technological determinism in reverse. Rather, it’s a call for a more critical and ethically nuanced approach that centres children’s long-term interests rather than short-term commercial gains.
What might this look like in practice? Perhaps it begins with genuine transparency about what data is collected and how it will be used. Perhaps it includes temporal limits on data retention rather than perpetual claims. Perhaps most importantly, it requires a shift in our regulatory frameworks to recognize that children’s educational data deserves special protection precisely because it is generated in contexts where meaningful consent is impossible.
The tools offered by edtech companies have real potential to enhance learning, but at what cost? Are we unwittingly turning classrooms into data mines where children’s learning experiences become the raw material for corporate profit?
The future of education technology isn’t predetermined – it’s something we collectively shape through our choices, our policies, and our willingness to ask difficult ethical questions. Let’s ensure that future prioritizes children’s rights and wellbeing over the imperative to convert their learning into corporate assets.
Perhaps it’s time to recognize that some things shouldn’t be for sale – and the intimate details of how a child learns might just be one of them.