In 2018 it was reported that some schools overseas were installing facial recognition software to monitor student attentiveness. One of the comments beneath the article echoed the opinion of many of my colleagues at the time: 'There's technological progress and then there's pure evil.'
And yet, about a month ago I found myself in a room with a dozen school IT managers and some very enthusiastic software developers, debating the merits of facial recognition software for tracking student attendance. They proposed that it was an efficient way to meet duty of care obligations: you would always know the location of vulnerable students. In the face of such an appeal, I seemed to be alone in my misgivings. We wouldn't impinge on the civil liberties of adults by tracking their movements, so why would we permit that for young people? How did opinions shift so drastically in under five years?
I think the answer is that the shift was not drastic, but incremental. Small increments that made life easier. Today, for example, Microsoft emailed me and helpfully suggested that I improve productivity by filing documents in a particular way; it also reported that I spend the bulk of my time collaborating and suggested I add daily focus time to my calendar. When I got in my car, Google advised the fastest route to work via my favourite coffee shop, and my Subaru's steering wheel started turning on its own when I swayed a little too close to the edge of the road. Even walking the dog, my Apple Fitness coach goaded me into travelling just a little bit further. All very helpful, and it's making me a more efficient worker, keeping me safe, improving my health. Isn't it?
In some ways, perhaps. In other ways, not so much. Relinquishing control of decision-making to an algorithm erodes autonomy. This is the critical point to keep in mind when considering the use of artificial intelligence in schools. Schools have long been criticised for a lethargic response to new technologies, but they proceed with caution for good reason. Schools want to ensure that students learn to make decisions themselves – to make mistakes, to understand the consequences of their actions, to develop agency. It takes schools longer than industry to adopt new technologies because they are justifiably conducting continual cost-benefit analyses, always keeping the experience of their students top of mind.
The Ethics Centre suggests that we need to understand exactly what technology is, and that this starts by deconstructing one of the most pervasive ideas going around, 'technological instrumentalism': the idea that tech is just a 'value-neutral' tool. Instrumentalists think there's nothing inherently good or bad about tech because it's all about the people who use it. It's the 'guns don't kill people, people kill people' school of thought – but it's starting to run out of steam.
Many of us are beneficiaries of an education free from unfettered access to technology. We had to learn to spell, to calculate, to organise, without a bot correcting mistakes and proposing superior solutions. We were made to work through problems guided by experienced teachers who knew their subject area, when to intervene, and when to leave us to try on our own. This supported us in developing the discernment to judge the usefulness of an algorithm's suggestions. We must not deny our young people the opportunity to learn these same lessons, and that means education must continue to be delivered by exceptional teachers who promote independence of mind. Big tech might look at our classrooms and see inefficiency; we see space and time for students to grapple with ideas at their own pace so that they might grow in independence.
Each time a school allows students to relinquish control to technology, it does a small, incremental disservice to its students by encouraging dependence – the antithesis of what we want for our girls. If a word processor corrects spelling, how will children learn to spell? If facial recognition software tracks a teen’s every move, how will she learn to find her own way? The answer is: she will learn when the stakes are higher, when she is grown, and there is no soft place to fall.
I am a great proponent of technology when used in the right way for the right purpose. There are wonderful programs that support students in rehearsing key skills covered in class without increasing the marking burden for teachers, or homework supervision commitment for parents. As an Art & Design teacher, I see how technology can fuel creativity and support students in realising extraordinary ideas. I am, however, sceptical of technology that does the work for you – technology in education should help students think, not think for them.
What we’re reading on this topic
- How to Stay Smart in a Smart World by Gerd Gigerenzer
- Teachers vs Tech by Daisy Christodoulou
- Ethical by Design: Principles for Good Technology by Dr Matt Beard & Dr Simon Longstaff AO
- Centre for AI and Digital Ethics, The University of Melbourne