
UK to demand social platforms take down abusive intimate images within 48 hours

The UK is bracketing "intimate images shared without a victim's consent" along with terror and child sexual abuse material, and demanding that online platforms remove them within two days. The government announced today that it would add an amendment to the Crime and Policing Bill requiring platforms to "remove this content no …

  1. Anonymous Coward
    Anonymous Coward

    I can't help feeling the message to the public is correct, but that what's really driving this is the obvious tentacle of government censorship of entire foreign websites/domains, rather than dealing with the people posting illegally.

    "We will publish guidance for internet providers setting out how they should block access to sites hosting this content" displays government intent ... where does it mention "we will prosecute illegal posters"?

    This statement would be the equivalent of fencing off a flyover and suing the council for providing the surface for someone's racist graffiti rather than dealing with the person writing it.

    1. Catkin Silver badge

      The sharing part is already a crime, that hasn't changed so why would they comment on it?

      1. Jimjam3 Bronze badge

        They comment on it to get noticed.

        Seriously, the UK government routinely makes statements like this with no idea, intention, or method of enforcing them.

    2. m4r35n357 Silver badge

      Yeah poor innocent "social media". How about IMMEDIATELY?

      1. KittenHuffer Silver badge

        The problem with IMMEDIATELY is that this leaves zero opportunity for any sort of review ..... which means that the take down process would have to be automated ..... which means that it instantly becomes a method by which anyone can have anything taken down that they want removed, regardless of whether it should be taken down or not.

        We have already seen plenty of these automated (or under-reviewed) take down processes being abused by those that want things taken down that should be left in place. Copyright holders have used these sorts of processes on things that qualify as 'fair use'. I'm sure it would not be too difficult to come up with plenty of other examples.

        1. m4r35n357 Silver badge

          How long would you take, personally, to "review" these images before doing something about it?

          1. KittenHuffer Silver badge

            I never said how long the 'review' should take, and I have no preference.

            I merely pointed out the problem with your suggestion.

            I also assume that you down-voted me because I did not 100% agree with your position. If so then that is a really mature response.

            1. m4r35n357 Silver badge

              I do not, normally. I will happily argue with people here without downvoting them; I guess today is an angry day!

              In this case, I do not see why you would object to immediate takedown. It is trivial to restore data AFTER review if appropriate.

              1. KittenHuffer Silver badge

                I've already said that I've had second thoughts about IMMEDIATELY.

                I'm happy for it to happen your way, because within a week someone will have automated (and weaponised) the take down request process. And if 'Social Media' have properly implemented a take down process then it will not be long before they are unable to serve the addicts with their daily pap, and the whole edifice of profit sucking shite will start to collapse.

                There is nothing more that I would like than 'Social Media' as it exists today to become a fading memory that the world can slowly forget and recover from.

                I have experienced the loss of family members to substance abuse, and it is not pleasant. At the moment I am seeing the loss of many members of society to addiction to 'Social Media', and I wish for nothing more than for there to be a way back for them.

                1. LybsterRoy Silver badge

                  So follow up by banning smartphones - social media problem pretty much solved.

                  1. Ken G Silver badge

                    I'm accessing this via a browser and a desktop. Ban HTTP/HTTPS.

              2. Anonymous Coward
                Anonymous Coward

                Downvoted because you whined about being downvoted.

              3. LybsterRoy Silver badge

                If it were not for the fact that your comment -- It is trivial to restore data AFTER review if appropriate. -- is incorrect, I would support immediate takedown. Technically you may be correct, but have you ever tried something simple like getting your email taken off a blacklist?

              4. localzuk Silver badge

                Removing material that should not be removed can also be harmful. Having a deadline allows time for a service to lodge a legal case to object, and if the request is particularly egregious, then they can look for an emergency injunction etc...

            2. m4r35n357 Silver badge

              Update - turns out I DIDN'T downvote you. I wasn't actually sure, so I looked at the post and it looked "unvoted-on". So I tested it by clicking, and now I can only remove the downvote by upvoting (nice web design there!). C'est la vie!

          2. Roland6 Silver badge

            Probably not very long, but there are another hundred thousand takedown requests in the queue...

            1. Adrian The Alchemist

              And not enough police

              My wife's cousin went from CID in their force to reviews of videos flagged in the child protection team.

              It is soul destroying; you just review this stuff all day and then, as a break, go and interview the people whose computers it was on.

              She moved on as soon as she could. These proposals are being added to the workload of the child protection teams, and a stretched system could simply break, bringing no help to children or to the abused women.

              Given that it will be the police who are expected to make this work with no extra resources, I worry about the churn that will most likely happen in these units, even if civilians are recruited to fill in.

              1. Anonymous Coward
                Anonymous Coward

                Re: And not enough police

                Sounds like an ideal job for some people called Andrew.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: And not enough police

                  My name is Andrew, and it sounds like a horrible job, perhaps you need to be specific about which Andrew?

                  1. Rivalroger

                    Re: And not enough police

                    For clarity. The Andrew formerly known as Prince.

                    1. Roland6 Silver badge

                      Re: And not enough police

                      Well given what has been reported, I’m not so sure. I suggest there are grounds to consider his own daughters as victims, albeit not to the same extent as those Epstein and Maxwell trafficked…

        2. KittenHuffer Silver badge

          Thinking about this again I think it is a wonderful idea!!!

          1. Wait for 'Social Media' to put in place automated take down processes.

          2. Create scripts to randomly flag something on their systems for take down.

          3. Repeat randomly but in large quantities.

          4. ???

          5. Profit! Or at least the eradication of a number of 'Social Media' sites as more and more of their content is taken down!

          1. Anonymous Coward
            Anonymous Coward

            RE: 2. Create scripts to randomly flag something...

            Is this a job for AI???

            Not make a script, just flag random stuff.

            ;-)

            1. Sampler

              Re: RE: 2. Create scripts to randomly flag something...

              I think this would be a job for AI recognition at least, for actual implementation: once an image is flagged, stop it from being uploaded again even if I crop it, flip it, or change the colour depth (if they flag the image with a basic CRC then it's trivial to change a file's hash without visibly affecting the image).

              So, as much as I hate AI, having some automated system that can say, with a degree of confidence, that these two images are the same and this one's just been arbitrarily adjusted, would actually be a good thing.

              (appreciate that's not what you were talking about, but started a thought)

              1. doublelayer Silver badge

                Re: RE: 2. Create scripts to randomly flag something...

                They have perceptual hashes to do that. It can be tricky and they're not perfect, but it's probably a more sustainable solution because AI vision systems can be thrown off by many things and are much less efficient to run. I don't think even Facebook has enough servers to run an AI vision model against each uploaded image and each banned image to compare them. If they had to and that put them out of business, so much the better, but likely if they had to they'd find a way to lie about having done it.
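                The perceptual-hash idea mentioned above can be sketched in a few lines. What follows is a toy "average hash" (aHash), assumed here purely for illustration; production systems use far more robust designs such as Microsoft's PhotoDNA or Meta's PDQ:

```python
# Toy perceptual "average hash": one bit per pixel of a downscaled
# grayscale image, set when that pixel is brighter than the mean.
# Similar images produce fingerprints with a small Hamming distance,
# which is why edits that break a CRC/MD5 comparison don't break this.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255), e.g. from a
    downscaled image. Returns a 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; small distance = 'probably the same image'."""
    return bin(a ^ b).count("1")

# Toy 8x8 "images": a bright-left/dark-right gradient...
original = [[255 - 32 * x for x in range(8)] for _ in range(8)]
# ...the same image uniformly darkened (changes every byte of the file)...
darker = [[max(0, p - 40) for p in row] for row in original]
# ...and an unrelated checkerboard pattern.
other = [[255 * ((x + y) % 2) for x in range(8)] for y in range(8)]

print(hamming(average_hash(original), average_hash(darker)))  # → 0 (survives the edit)
print(hamming(average_hash(original), average_hash(other)))   # → 32 (clearly different)
```

                The darkening edit changes every byte of the file, so an exact-hash comparison fails, but the brightness *pattern*, and therefore the fingerprint, is unchanged. Real systems additionally normalise for crops, flips, and re-encoding, and run in a fraction of the compute an AI vision model would need.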

                1. Like a badger Silver badge

                  Re: RE: 2. Create scripts to randomly flag something...

                  I don't think even Facebook has enough servers to run an AI vision model against each uploaded image and each banned image to compare them.

                  All social media platforms have the capacity to offer individually targeted ads, to cross-reference profiles and content across a user's entire extended contact network and timeline, and to feed users highly personalised content, so it is pretty obvious they've easily got the capacity to screen all images. Especially since a preliminary scan can work out with high accuracy whether an image needs testing against known images.

                  1. doublelayer Silver badge

                    Re: RE: 2. Create scripts to randomly flag something...

                    You will notice from the sentence you quoted that the thing I said they didn't have enough servers for was doing that comparison with AI vision models. I specifically indicated I do think they have the capacity to do that with perceptual hashes. AI vision models are much hungrier for RAM than any of the stuff you name. This was about how, not whether, so your counter doesn't make a relevant point.

                2. Sampler

                  Re: RE: 2. Create scripts to randomly flag something...

                  They used to do facial recognition on all images, and that's got to be harder: saying this face is the same as that face in a different orientation and lighting is a tougher problem than matching a similar enough image.

                  We know this because they used to prompt you to tag your friend in the image, so, yeah, I think they have the processing capacity. Plus, Immich is doing the (in my view harder) face stuff for me on an N100 chip, so it can't be that intensive.

            2. Roland6 Silver badge

              Re: RE: 2. Create scripts to randomly flag something...

              I’m sure Meta’s AI will rapidly get guardrails that actually work, preventing it from being used for anything remotely as useful as this.

        3. vtcodger Silver badge

          It's (probably) not that simple

          Dead right unfortunately. A short time limit will likely lead to automated takedown. That probably means "AI" which likely won't be cheap. Or particularly effective. And, for example, it's virtually certain given the number of crackpots in the world that takedown requests will turn up for online images of every nude work of art known to man from "Venus de Milo" to "September Morn."

          There's probably some reasonable compromise. But it's going to take a lot of serious thinking to get the details right. And even then there will be problems. And abuses.

          1. Blazde Silver badge

            Re: It's (probably) not that simple

            For the big providers it'll be automated regardless; I don't think a time limit makes much difference. For the small guys the whole 'section 230' freedom is on borrowed time. They can't do anything in the face of broad and, honestly, inevitable legislative change, which the big providers don't fight because they benefit from the loss of competition and have their own ways of adapting. Somewhere in the middle is the battleground: mid-level services petrified of being labelled big tech companies, running scared on legal advice, chilled, blocking, winding down their services, occasionally shifting the needle of public opinion with an outcry, but mostly trying to cling on to the old regime and adapt as best they can even as the costs and risks mount. Those are the guys who need a reasonable time limit.

        4. jdiebdhidbsusbvwbsidnsoskebid Silver badge

          How about don't put it up in the first place? Perhaps if you can't be sure that you are on the right side of the law, then maybe don't do it at all? Laws on offensive behaviour etc. might be judged on a fine line, but the majority of us manage to get through our lives without straying over it. Facebook et al should be treated no differently in that regard.

          And I have no sympathy for the amount of material they have to check before putting online. The scale of your operations should be no defence.

          1. doublelayer Silver badge

            I feel as little or even less sympathy for Facebook than you do, but remember that laws cover everything. Do you think The Register could or should verify all of your posts before you posted them, just in case one of them was illegal? They could not, so if the law required it, they would end up shutting down the forums. Facebook, on the other hand, is so big it can ignore many laws, because enforcing them against it would result in complaints and lost jobs, so politicians tend to just ignore illegal behavior. Does the suggestion sound as good now?

        5. Tron Silver badge

          Inevitable consequences

          Any reported image would be taken down, regardless of what it is, using an automated system, and you will have to request a review and wait weeks or months to get your content back up.

          This is a licence for the government to log 'awkward' photos of politicians up to no good, and have them removed from the internet in the UK.

          It is also a licence for the UK government to shut off any site from the UK, such as 4chan, for a single instance of posting, and scare other sites from allowing uploads.

          As I have said on here before, governments will progressively restrict your internet to sites in your country and foreign ones which are whitelisted for access through expensive licensing arrangements that most will not bother with. We will eventually lose 95% of the internet in the UK as the net becomes globally siloed for local content that the state can monitor, control and block at will. In short, we are running down the slippery slope to become China.

          Those posting the images should be prosecuted instead, and forced to pay for each appearance. A few well publicised cases would stop this in its tracks. Because it is not worth losing your house, car and savings, and having a lifelong sanction on your income, routing a percentage to your victim, for an image post.

          1. Jimmy2Cows

            Re: A few well publicised cases would stop this in its tracks.

            Nice idea in principle, but the sort of people that post these things aren't known for having filters, impulse control, or thinking it through.

            Certainly fine the hell out of them, but it won't stop other stupid people doing what stupid people do.

      2. rg287 Silver badge

        You realise the OSA doesn’t just apply to billionaire social media, right? It also covers phpBB boards and other traditional forums, often run by an enthusiast for a community at their own expense in their own time.

        12 hours is a minimum since… y’know, people have jobs and need a reasonable opportunity to check their forum for mod notices and action that review. 48 hours is more realistic.

        Now, if you want to specify an “immediate” action for platforms with more than £100m in global turnover, then that’s a different discussion. They should have 24/7 mod teams. But for small sites? Be realistic. If you legislate on the basis that all operators have resources equivalent to Meta's, then the entire internet will be only Facebook/Insta/TikTok, because everyone else will say “I can’t comply with that” and close up shop (the OSA does actually make allowance for “small” services, but doesn’t define what a small service is, and Ofcom have refused to provide meaningful definitions, so everyone has to assume they’re in scope for the whole caboodle).

        1. m4r35n357 Silver badge

          These images are at least potentially illegal. Would you expect 48 hrs notice if they were posted to a site that you hosted? Do you think the police would "understand" your workflow?

          1. rg287 Silver badge

            These images are at least potentially illegal. Would you expect 48 hrs notice if they were posted to a site that you hosted? Do you think the police would "understand" your workflow?

            If I was at work and some images were posted at 2.30pm, and I checked my mod notices when I got home at 6pm, then that's 3.5 hours. But of course I might be on holiday, or asleep in another time zone. Such services are necessarily "best effort". Yes, it would be nice to be "immediate". But deleting them "as soon as I become aware of them", which may actually be 6-24 hours after they are reported/flagged, is hardly unreasonable. And of course if they're illegal, then I shouldn't be deleting them - I should be removing them from public view, but archiving them for the police.

            To criminalise a gap of a few hours for a best-effort service is inane - unless your name is Mark Zuckerberg and you want the entire internet to be four American hyperscalers and the rest of us just bow and scrape and don't host anything of our own ever.

            It's also worth noting that these things scale. On a very small service where I am the sole mod, I probably know everyone on a personal basis. It's highly unlikely that anyone is going to post that amongst people they know (unless I'm deliberately hosting a CP service, in which case the whole question is moot).

            This is a very different risk profile to one like Facebook which is massive, and also has extensive paid moderation. Alas, the OSA doesn't really make a distinction, even though it should.

            1. Adrian The Alchemist

              When I was a mod

              When World Of Warcraft first came out I was a guild officer in a UK-based family guild and eventually became a moderator for our forum.

              25 years ago we had a solicitor in the guild, and so any flagged posts were moved to an officer-only part of the site, the poster was banned for however long, and everything was reported to the police until we were told it was OK to delete.

              Because deleting something may make you an accessory if it's bad enough.

              I got sick of seeing intimate pictures at breakfast, and we had to interview every new applicant to join. When your hobby turns into a job it's time to quit; I was paying just to administer other people's fun.

              Even 20 years ago we got attention from Russians trying to muck us about for giggles

            2. m4r35n357 Silver badge

              IMMEDIATELY means "AS SOON AS YOU ARE INFORMED," not "AS SOON AS THE IMAGE IS POSTED" !!!

              Jesus H Christ.

              1. Roland6 Silver badge

                Setting clear expectations for all parties.

                “Immediately” means very different things to the person reporting the problem and the person charged with dealing with the takedown. As we know in matters of child protection, the media and protestors will always side with the person reporting an issue.

          2. jdiebdhidbsusbvwbsidnsoskebid Silver badge

            If you're hosting illegal images, why should you get any grace period? If you've built a system that allows people to upload stuff to your web site that then publishes it with no review or oversight, that's your look out.

            1. m4r35n357 Silver badge

              There seems to be a lot of resistance to taking stuff down on this topic. I can only speculate why.

              1. Jimmy2Cows

                Not sure what you're smoking, but there's no resistance here to taking stuff down. There is, however, resistance to having "IMMEDIATELY" applied in a legal context. Context where large companies will get away with skirting the boundaries, but a small outfit will be hauled over the coals if they're a nanosecond too slow.

                Define the term. Is it the very instant an infringing post goes up? 5 minutes after posting? 10 minutes? 20? Careful now, because only the first one qualifies as "immediate" by every accepted definition of the word.

                48 hours is far too long for a global org, but 5 minutes is far too short for a one-man-band forum that possibly falls under OSA scope because they don't properly define "small".

              2. Anonymous Coward
                Anonymous Coward

                You and some others have clearly never run a public service, like a BBS, as otherwise you would understand the operational issues.

        2. Dan 55 Silver badge

          The OSA has different categories of online services, the 12 hour limit could have been set for category 1 services.

    3. Headley_Grange Silver badge

      "This statement would be the equivalent of fencing off a flyover and suing the council for providing the surface for someone's racist graffiti rather than dealing with the person writing it."

      If the council could readily identify the artist and also knew that they had history of doing it but they didn't want to deal with it because they made a profit from the coffee bar they'd opened next to the graffiti then I agree - the council should be sued.

      If the council also accosted people in the street, prodded them towards the fence, showed them the racist graffiti, then put a spraycan in their hand and said "go on, see if you can do better, and have a coffee afterwards", then, again, I agree with you that the council should be sued.

    4. logicalextreme

      "People who moan at the council about the streets being full of litter; not stopping to think that it is people who drop litter, not the council"

      Life is a PBR

    5. LionelB Silver badge

      > I can't help feeling the message to the public is correct but the obvious tentacle of Government censorship of entire foreign websites/domains rather than dealing with illegal posters is driving this.

      I don't personally feel that ordering the takedown of illegal, abusive content targeted at individuals amounts to censorship – YMMV.

      And yes, it would be lovely if we could go directly after the illegal content posters rather than the platforms, but how do you propose that might be achieved (without far more intrusive state/legislative intervention)?

      1. m4r35n357 Silver badge

        You get the thing taken down first, THEN track down the poster with the cooperation of the platform/ISP.

        It is called "police work", not "state/legislative intervention".

        1. LionelB Silver badge

          Agreed in principle, but surely the problem lies with the "cooperation of the platform/ISP" part. That doesn't happen unless you legislate for the mandatory cooperation of the platform/ISP which, as we know, runs into all kinds of problems.

          1. m4r35n357 Silver badge

            Are you from the US by any chance? Here in the UK it is not optional to cooperate with law enforcement.

            1. LionelB Silver badge

              No, I'm from the UK; but I was thinking more of the US, where the law seems to have no appetite to take on the tech giants (one might well ask why that is the case).

              But even in the UK and the EU, legal sanctions against the big players seem to amount to a slap on the wrist. Hopefully this will change.

              As for the posters, if they're using VPNs or Tor, identifying them (and the jurisdiction they're in) is hard to impossible even for the platforms and ISPs.

        2. IGotOut Silver badge

          "You get the thing taken down first, THEN track down the poster with the cooperation of the platform/ISP."

          And if the person is using a VPN, or is based in another country where it isn't a crime?

          It's called reality

        3. Ken Hagan Gold badge

          "You get the thing taken down first, THEN track down the poster with the cooperation of the platform/ISP."

          Or, more likely, THEN discover that the poster lives thousands of miles away and there is not a snowball's chance in hell of you ever seeing them in court.

          Even the takedown action depends on the hosting site having enough of a presence in your country that it cares about your laws. I predict that eventually we'll work this out and the global internet will be replaced by several smaller ones, within which there is a reasonable chance of being able to compel compliance with local law. Until then, expect no improvement in social media and no end to politicians sounding off on the subject.

  2. Philip Storry
    Joke

    You missed a bit...

    X's statement is clearly unfairly truncated. I believe that the original stated:

    "We remain committed to making X a safe platform for everyone and continue to have zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content. Unless Elon thinks it's cool, in which case our position is obviously: **** you, we'll do whatever Elon wants us to."

    It's shocking to see such a negative portrayal of the poor, honest, trustworthy, and reliable X. It's not like they've ever done anything to deserve it!

  3. Sloth77

    Why not, why not, why not...

    "But she added: "Why 48 hours and not 24 or even 12? Every hour these images remain online compounds the harm.""

    Why not 1 hour... 1/2 hour.... 10 seconds?!?!?!?

    FFS. Because 48 hours was chosen as a reasonable time limit, and you have to draw a line in the sand somewhere.

    1. m4r35n357 Silver badge

      Re: Why not, why not, why not...

      48 hrs is ample time to "enjoy" and disseminate the image. That is why.

      "there there, sweetie, it will be gone in another 16 hours, I'm sure of it"

      SICK

    2. Anonymous Coward
      Anonymous Coward

      you have to draw a line in the sand somewhere

      Yeah. Though arsehole Starmer's proposed line in the sand is in the wrong place.

      Just turn off all social media forever. Fuck Musk, Facebook, TikTok and the rest of these bottom feeding parasites.

      The planet will be a much nicer place without them.

    3. Joe Gurman Silver badge

      Re: Why not, why not, why not...

      Why is 48 hours reasonable? If the complainant's ID is verified, why indeed should it not be instant?

      1. doublelayer Silver badge

        Re: Why not, why not, why not...

        Ah, so your suggestion is that reporters of inappropriate images need to identify themselves now? It solves one problem, the problem of spraying takedown requests, at the cost of introducing two more, that any victim needs to identify themselves to get something taken down and that unrelated members of the public who might also know that this is inappropriate likely couldn't as they're not the one depicted. This is a hard problem to solve. If only people didn't keep thinking it was simple and suggesting things that would make it harder.

  4. StewartWhite Silver badge
    FAIL

    Oh no they won't!

    "Platforms that do not do so would potentially face fines of 10 percent of qualifying worldwide income..."

    Given that the ICO has similar financial penalty powers that it has never even got close to using, it's safe to assume that the Melon, Zuckertwat and Jeff "Melania" Bezos won't be quaking in their boots over this performative "threat". What even does "qualifying" mean here? If it's anything like Alphabet's tax bill, then that income is likely to be an old threepence and a couple of buttons.

  5. Steve Davies 3 Silver badge

    A better alternative

    would be to take down the social network first. THEN remove the offending item.

  6. Brl4n Bronze badge

    Governments don't have a good record when it comes to decision making. UK government especially.

  7. Anonymous Coward
    Anonymous Coward

    DMCA mk 2?

    We already see wide-scale weaponisation of the DMCA for purposes well beyond removal of your copyrighted material that someone else has posted - the EFF has documented some of the most egregious cases. The way the law is structured disincentivises the service providers from doing anything other than legalistic compliance, and the lack of adequate punishment for incorrect reporting leads to the fast discovery and exploitation of whatever loopholes exist in the blocking services as implemented. None of the benefits of the system accrue to the public.

    Whilst I agree wholeheartedly that internet hosting services must respond properly to these complaints and I definitely see that leaving it up to self-regulation is an absolutely failed approach, I have severe reservations. The potential for gaming this and the well-meaning desire to require removal immediately coincide in what I consider to be a very dangerous place for journalism and self-expression. I think the risk to journalism is clear, but an immediate example which comes to mind is that anyone who wanted to prevent the circulation of something like that photo of Andrew and Virginia, or the one of the other Andrew and the young lady dancing, will be very easily able to accomplish it, and potentially nobody will ever know how it happened.

    When I say self-expression, I mean that the ability of enthusiasts to have any kind of community or group outside of Facebook or Reddit will be hugely diminished. I think that's a real loss to society. The vast majority of them are not hotbeds of CSAM, self-harm encouragement, or repositories of hate. Tackling those which are, whilst still preserving the ability for the rest to exist, is, I think, the balance of our times.

  8. Anonymous Coward
    Anonymous Coward

    They arrest princes don’t they?

    Stop protecting, covering up, apologising and downplaying pedo behaviour.

  9. Anonymous Coward
    Anonymous Coward

    We'll see

    UK gov is very good at making laws, making a law gets good headlines, it's quite easy to do, quite cheap, MPs get to feel important.

    UK gov is not very good at enforcement. Enforcement takes time to do, it is hard, it is expensive, it rarely involves MPs unless their friends/donors are the ones being investigated so enforcement gets underfunded and isn't supported up the food chain.

    Just look at how toothless the UK ICO is: plenty of legislation in place, precious little enforcement, especially against big business.

    And social media slingers are the biggest business.

    I'm sure the social media giants will give this all due lip service and then little thought to actually complying. They know UK Gov will not fund or support the sorts of investigations that will be necessary. Can you imagine the USA extraditing Zuck et al to the UK? Laughable.

  10. Anonymous Coward
    Anonymous Coward

    48 hours allows more time for a government U-turn.

  11. tiggity Silver badge

    Performative BS

    Meanwhile in the UK, plenty of people caught with huge amounts of CSAM get a slap on the wrist & no jail time.

    Conviction rates for rape are abysmally low.

    Many rape gangs have avoided proper investigation / actions due to worry about inflaming religious / racial tensions / people not investigating properly for fear of being labelled racist.

    So how about fixing some of that?

    .. and for these currently discussed images, how about going after the people posting the images.

    Meanwhile a role for "AI"...

    Had your nudes leaked? Got caught up in revenge porn?

    NO PROBLEM!

    With the AI SLOPCANNON, we can generate and upload 5 million nude photographs of you INSTANTLY!

    The chances of malicious actors finding your real nudes amidst the slop are statistically VERY SMALL.

    Sign up today!

    (Shamelessly nicked from Lukas)

  12. Anonymous Coward
    Anonymous Coward

    Call me cynical but

    Ofcom doesn't even manage to do what it was set up to do.

    Expecting them to manage the global internet is just a pipe dream.
