Moderation systems for kids?
October 15, 2003
A few weeks ago the Washington Post put up an article called Cliques, Clicks, Bullies And Blogs, all about how children and teenagers are using the affordances and limitations of social software and community spaces to assert their dominance (often through bullying) in schools' social shark tanks:
"The Internet has transformed the landscape of children's social lives, moving cliques from lunchrooms and lockers to live chats and online bulletin boards, and intensifying their reach and power. When conflicts arise today, children use their expertise with interactive technologies to humiliate and bully their peers, and avoid reprimand from adults or foes. As parents plead technological ignorance with a my-Danny-hooks-everything-up sort of pride and many schools decline to discipline "off-campus" behavior, the Internet has become a free-for-all where bullying and cruelty are rampant."
These unpleasant but intriguing situations almost directly illustrate some of Clay Shirky's points in his article A Group is its Own Worst Enemy. In the article he talks about the importance of structuring the space for social interactions online, and he gives an example of a community that became overrun with new members who had no respect for the established patterns of behaviour that had evolved:
"The place that was founded on open access had too much open access, too much openness. They couldn't defend themselves against their own users. The place that was founded on free speech had too much freedom. They had no way of saying "No, that's not the kind of free speech we meant." But that was a requirement. In order to defend themselves against being overrun, that was something that they needed to have that they didn't have, and as a result, they simply shut the site down. Now you could ask whether or not the founders' inability to defend themselves from this onslaught, from being overrun, was a technical or a social problem. Did the software not allow the problem to be solved? Or was it the social configuration of the group that founded it, where they simply couldn't stomach the idea of adding censorship to protect their system. But in a way, it doesn't matter, because technical and social issues are deeply intertwined. There's no way to completely separate them."
That space - where technical and social issues are deeply intertwined - is what I consider to be the heart of the issue of moderation, which is almost as much about creating spaces where people have a purpose and an aim as it is about finding ways that those groups can effectively be managed or self-manage. Moderation systems are designed precisely to cool off people's worst excesses and to try - through continual pressure and effort (whether manual or technological) - to find processes and systems that make a community able to survive significant pressures and still achieve useful things.
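As a concrete (and entirely hypothetical) illustration of the "technological" end of that continual pressure, here's about the simplest cooling-off device a community space can apply - a minimum per-user gap between posts, so a flame war has to proceed at walking pace. All the names and numbers here are invented for the sake of the sketch:

```python
import time

COOLDOWN_SECONDS = 60  # minimum gap between posts from one user (invented figure)

last_post_at = {}  # user_id -> timestamp of that user's last accepted post

def try_post(user_id, message, now=None):
    """Accept the post if the user's cooldown has elapsed, else refuse it.
    (The message itself would be stored elsewhere - this only gatekeeps.)"""
    now = time.time() if now is None else now
    previous = last_post_at.get(user_id)
    if previous is not None and now - previous < COOLDOWN_SECONDS:
        return False  # still cooling off - the post is rejected
    last_post_at[user_id] = now
    return True
```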
The most interesting parallel between the two articles is in the specific kinds of behaviour the groups fall into. Clay details some interesting chunks of Bion:
The first is sex talk, what he called, in his mid-century prose, "A group met for pairing off." And what that means is, the group conceives of its purpose as the hosting of flirtatious or salacious talk or emotions passing between pairs of members.
The second basic pattern that Bion detailed: The identification and vilification of external enemies. This is a very common pattern. Anyone who was around the Open Source movement in the mid-Nineties could see this all the time. If you cared about Linux on the desktop, there was a big list of jobs to do. But you could always instead get a conversation going about Microsoft and Bill Gates. And people would start bleeding from their ears, they would get so mad.
The third pattern Bion identified: Religious veneration. The nomination and worship of a religious icon or a set of religious tenets. The religious pattern is, essentially, we have nominated something that's beyond critique. You can see this pattern on the Internet any day you like. Go onto a Tolkein newsgroup or discussion forum, and try saying "You know, The Two Towers is a little dull. I mean loooong. We didn't need that much description about the forest, because it's pretty much the same forest all the way."
These three patterns - groups ostensibly about something that are instead actually obsessed with sex, cliques and "things that an individual associates their identity with" (you could probably make some analogy with brands here without much trouble) - are familiar to all of us. But we're also more than aware that when we were teenagers we were more susceptible to that kind of behaviour than we are as adults. Which brings me to an intriguing question. Given children's uncanny ability to manipulate authority figures, and to feel out the rules of any given situation and attempt to game them - what kind of moderation process might both reduce the incidence of this kind of cliquey, pack-like bullying and implicitly educate the children in question about why that behaviour is counter-productive? Would a distributed moderation system that put power in the hands of all the children be a useful or counter-productive approach? And if it were useful, how should that power scale? Should a clump of twenty people together have radically more power in their community than five or one? Answers on a postcard please...
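By way of a first postcard, here's a minimal sketch in Python of what sublinear power-scaling might look like - all the names, numbers and thresholds are invented for illustration, not a description of any real system. The idea: when a group flags a post together, their combined weight grows as the square root of the group's size, so a clique of twenty still outweighs a lone dissenter, but by a factor of about four and a half rather than twenty:

```python
import math
from collections import defaultdict

HIDE_THRESHOLD = 3.0  # combined flag weight needed to hide a post (invented)

class ModerationQueue:
    """Toy distributed moderation: every child can flag, but piling on
    pays off sublinearly, which takes some of the fun out of pack bullying."""

    def __init__(self):
        self.flags = defaultdict(set)  # post_id -> set of users who flagged it

    def flag(self, post_id, user_id):
        self.flags[post_id].add(user_id)

    def weight(self, post_id):
        # Square-root scaling: n flaggers exert sqrt(n) units of pressure.
        return math.sqrt(len(self.flags[post_id]))

    def is_hidden(self, post_id):
        return self.weight(post_id) >= HIDE_THRESHOLD

queue = ModerationQueue()
for user_id in range(20):          # a clump of twenty gangs up on one post
    queue.flag("post-1", user_id)

print(queue.weight("post-1"))      # sqrt(20), roughly 4.47
print(queue.is_hidden("post-1"))   # True - but only ~4.5x one flagger's weight
```

Whether that kind of damping would actually teach anyone why pack behaviour is counter-productive, rather than simply teaching them to route around it, is exactly the open question above.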