…surely it has to be in everyone's best interest to make places such as Facebook as safe as possible.

No, they haven’t.
Ceop, the Child Exploitation and Online Protection Centre, has seen its calls for Facebook to adopt its "report abuse" button rejected.
They’ve agreed to a button, just not to placing it where Jim Gamble wants it.
The button allows users to immediately report any abusive, threatening, worrying or wholly unacceptable content or behaviour to a range of experts, who can provide specialist help as necessary.

And it still will…
For me, the impact and importance of having the Ceop button on websites populated by young people is clear. Young people love Facebook, but they also have a right to be safe. If they do ever get into trouble online, they want two things: first, to be able to report it to the website so it can take action, and second, to get help. Social networking sites can do the former but not the latter, and therefore need to work with organisations better placed to help their users around specific issues, such as cyberbullying.

Organisations like…?
Oh, go on, Emma-Jane. Don’t be shy!
The Ceop button, already implemented by Bebo and MSN (although haplessly small and ill-conceived in the case of Bebo), links directly to CyberMentors (Beatbullying's online service provision) for reported incidents of cyberbullying…

Aha!
Ultimately, the safety of young people has to be what we must all come back to…

Or to cold, hard cash. Right, Emma-Jane?
…that's why we need to work with the big industry players, the Facebooks and the Googles of this world. Their safety centres must be easy to find from every page, they must refer to support services, while awareness-raising campaigns are used to drive the message home.

Oooh, that’s a lot of ‘musts’ there. Why ‘must’ they do these things, Emma?
Data protection, privacy and civil rights, confidentiality and issues of consent should all be examined if we are to set standards for the safeguarding of young people on social networking sites. These are the issues we have to tackle with CyberMentors, which is in itself a social networking site – it provides young people who are being bullied or are dealing with a variety of wellbeing issues with real-time online mentoring from their peers and counselling from accredited counsellors.

So, you are the standard-setters, are you?
Think of the data we are holding. Imagine holding over 350,000 taped conversations of vulnerable young people, and being bound, as a matter of integrity, law and best practice, to protect them. These are private conversations which must be quarantined, privileged, safeguarded and, if you care about privacy and civil rights, be subject to informed consent if you are to obtain, process and analyse the content. Now think of the data held by the big social networking sites, and question how they set out to safeguard; they can and they should protect data, privacy, and the identities (and locations) of their users, many of whom are under 18.

Ahh, right. Only charities (run as businesses/arms of government) should have access to that sort of data. Only they are to be trusted, eh?
I see where this is going now…
Child safety online goes beyond installing a reporting button or running an ad campaign. It's a great start, and an absolutely critical one, but the debate needs to be widened to include data protection and identifiability – and if we are going to prioritise the safety of our children online, then it's one we need to have now.

It’s a grab for more money, more access to policymakers, and (the key issue) more control over that pesky internet…