Friday March 15, 2019, started for many like any other weekday in New Zealand.
People’s daily habits would have meant snatching a bit of early-morning news on an iPhone, running a Google search or two for some ready information, and checking for anything juicy on Instagram and Facebook.
The world probably seemed fine as people headed to work or school and, as usual, social media and Google search sat at the ends of a million fingertips. Hardly a thought would have been given to how any of this digital convenience was being delivered, and nary a concern about algorithms, echo chambers and the so-called dark web.
But much of this lack of awareness came to a jarring halt following the brazen actions of a white supremacist on that day.
Using the same online conveniences as thousands of other New Zealanders, the gunman was able to casually record and broadcast his actions on social media while driving to two mosques to gun down scores of worshippers.
What is it about the state of digital media that might have contributed to the Christchurch massacre? And, is it a clear and present danger, or just an aberration and passing issue?
Much has been made, since March 15, of the ease with which a madman could share unsavoury content with millions of others. This, primarily, is what is driving a firmer view among lawmakers that social media companies need to take greater responsibility for the content on their platforms.
The truth is, the social platforms that promulgate this content have much less control over it than they would care to admit. That does not settle the question of whether they should take responsibility for that content, but the brutal truth is that the radical changes required for platforms to self-manage it would fly in the face of the very things that make theirs a multibillion-dollar enterprise. Short of putting themselves out of business, there is little likelihood they will comply unless lawmakers force them to.
But uncontrolled content is just one issue. I believe the greater social problem is the way in which people are unwittingly corralled into isolating bubbles to consume content that, over time, leaves them with a one-dimensional view and a diminished ability to reason in a healthy way.
As algorithms have become more and more sophisticated at channelling repeat purchase behaviour and data to fuel the needs of targeted marketing communications, the unwitting daily actions of individuals build grotesque echo chambers where the mental health of our young people is at risk and, at worst, a lone gunman believes he is doing God’s work because his messages of hate are all he knows.
Then, to top it off, he is emboldened by the knowledge he can instantly share it with millions of applauding like-minded people.
Many would say that the digital age has allowed us to be better connected with unprecedented access to more information than we know what to do with. And didn’t the internet promise freedom from the big media barons of the past? Now it’s possible for a kid in a bedroom to start a global social movement or a business, with minimal resources.
The sad truth is that the new frontiers of a free and democratic internet have been firmly captured by new and even bigger corporations and, through clever algorithms, the quality and diversity of what any individual receives are limited and manipulated.
Diversity is important in any ecosystem, including in the ways we have conversations and share knowledge and ideas. Democracy and a healthy society depend on it. But when we have the overblown capacity to exchange content within an echo chamber of millions, and where individual and collective responsibility is absent, we open the door to an unhealthy society.
Fraser Carson is a member of the Ōtaki College Alumni Trust and the founding partner of Wellington-based Flightdec.com. Flightdec’s kaupapa is to challenge the status quo of the internet to give access to more reliable and valuable citizen-generated content, and to improve connectivity and collaboration.