It's not just Big Tech: The UK's Online Safety Act applies across the board

A little more than two months out from its first legal deadline, the UK’s Online Safety Act is causing concern among smaller online forums caught within its reach. The legislation, which came into law in the autumn of 2023, applies to search services and services that allow users to post content online or to interact with each …

  1. Anonymous Coward
    Anonymous Coward

    Just another example...

    ...of clueless government ministers failing to have a basic grasp of technology and the web.

    1. Headley_Grange Silver badge

      Re: Just another example...

      Why is it less important to protect people on a small site than it is on a big site? Sure, it's expensive, but you can make exactly the same argument for health and safety legislation; big chain stores should have to make sure their products are in date and the shop isn't a dangerous place to walk around cos they can afford it but it's too expensive for small shops to do that so they shouldn't have to do anything to make their shops safe and they can sell out-of-date food if they want to.

      1. Anonymous Coward
        Anonymous Coward

        Re: Just another example...

        Shops aren't run by hobbyists doing it out of love and passion. They're businesses. And use by dates are easier to define than "hate speech" or whatever else people are meant to be protected against.

        1. Headley_Grange Silver badge

          Re: Just another example...

          So - why is it OK that people aren't protected when they are on a hobby site?

          1. Anonymous Coward
            Anonymous Coward

            Re: Just another example...

            You think that these new laws are going to meaningfully protect people? They are largely a bureaucratic burden that is more likely to stifle and kill off small positive sites than reduce existing problematic behaviours. We will lose more than we will gain.

            There are already laws protecting against defamation, harassment, hate, etc - so people are not without protection; the problem is getting the existing laws applied. Got a problem, take a crime report number.

            Small shopkeepers are required to ensure that their properties are safe (just as homeowners are required to ensure their homes are safe for visitors - even uninvited ones); shopkeepers are also subject to merchantability laws. However, they might not be required to do all the same paperwork or meet all the same standards as a large store simply because the degree of risk is much less with around 50 visitors a day compared with 2000 a day.

            Unfortunately this looks like typical UK 'gold plated' legislation dreamt up by professional bureaucrats and written in the broadest catch-all terms. Ministers will once again say that "it is not intended to ..." - but these words are worthless as it is only the words written in the statute that will be used in determining the application of the law.

            For the tech giants the penalties might be a rounding error in their legal department's budget; for small concerns it will be a death knell. Why would anyone want to volunteer to do something positive when the consequences could be personal ruination?

            1. Headley_Grange Silver badge

              Re: Just another example...

              You still haven't answered the question - and I suspect that no one will. Small shops have exactly the same health and safety responsibilities as large shops and the law makes no recognition of size of the business in the case of an H&S breach.

              The main problem here is the prevailing view that everything on the internet should be free. If these small hobby sites are so useful to their members then their members won't mind paying a nominal fee to use them and that fee could be used to pay for compliance. You can imagine a Shopify-type setup which provides a compliant set of policy and procedure docs, online training and away you go - follow the rules and you're OK. The problem is, of course, that no matter how "valuable" these sites are claimed to be for their users, when that value is tested then most of the time they come up wanting and about 0% of their users are willing to pay a few quid a year to use them.

              I really like the Reg. It's the only tech site I read and the only internet site I comment btl on, mainly cos it's a nice relatively twat-free zone and there's no stress coming here - not even the downvotes I'll get for these posts. If the Reg started charging me to have an account for commenting I'd stop commenting and if the Reg went behind a paywall I'd almost certainly stop reading it.

              1. Fonant

                Re: Just another example...

                You've answered your own question: "If the Reg started charging me to have an account for commenting I'd stop commenting and if the Reg went behind a paywall I'd almost certainly stop reading it."

                Small hobby forums only exist because they're free to use, and run by volunteers. They provide a great deal of benefit and carry next to zero risk of being used for online harm.

                This is nothing like a shop, which is a business. It's more like requiring all community noticeboards to be risk-assessed and monitored by a named person just in case someone pins up a pornographic image: it's possible that someone could do so, but vanishingly unlikely.

                1. Headley_Grange Silver badge

                  Re: Just another example...

                  ".. It's more like requiring all community noticeboards to be risk-assessed and monitored by a named person.."

                  That's the argument that has got us where we are today with Meta, Google, Xitter and the rest of the disruptive brigade and the shit place they've turned the internet into. They argued from their very start that they and the internet shouldn't be regulated like the real world because, you know, internet.

                  1. Russ T

                    Re: Just another example...

                    "They argued from their very start that they and the internet shouldn't be regulated like the real world because, you know, internet."

                    I think there are two aspects there.

                    One is the "evil" stuff: the incessant tracking, selling of data, targeted ads etc. Just as TV advertising is regulated, so should the giants be.

                    The other is the "forum-esque" stuff, posts and so on. Most forums have moderators. Of course there are forums for nazis and whatever else in the world. But they are niche. Now, if they all want to give each other racist hand jobs in their forum, so be it. We all know to avoid the site if there's a swastika on the homepage. However, if they turn up on the cycling forum to do it, that's not OK. In this case moderators would ban them. Or maybe the law would catch up with them.

                    It feels like most of this stuff is self policing, and the stuff they're trying to stop is already illegal, and the other stuff around kids etc is impossible to police because kids will find a way to access things if they really want.

                    The whole thing is a mess, and not easy, but blanket laws that capture people just trying to make a little bit of the internet the place where you and your niche interest hang out, how is that in any way a good thing?

                    1. flayman

                      Re: Just another example...

                      I will begin by admitting that I have not looked very closely at the requirements that come into effect in March. But my limited reading of the gov.uk's explanation of the Act and how it is to be enforced suggests to me that a small forum that is actively moderated does not have much to worry about. Inappropriate content will be quickly removed and appropriate actions taken. Adequate controls are therefore already in place, and it is just a matter of noting this. It might seem a bit arduous in the first instance, but that would be a one-off cost, as the conditions year on year would remain stable until and unless the forum becomes much larger. GDPR creates a similar nuisance for small data controllers, but with the initial assessment exercise out of the way, it's become mainly a yearly box ticking exercise.

                      The article points to a four year old post in a small forum as an example of how this Act adversely affects small online communities. They're still freaking out about it and are talking about moving to Discord, but one wonders whether this is an overreaction. The fact is that there is this law, and forum providers need to comply with it. The degree to which one must comply is proportionate to the risk of harm. In theory, a risk assessment would show a negligible risk of harm on an efficiently moderated forum which has been operating for many years and has never seen an example of such harm, or has demonstrated swift removal of potentially harmful content. In practice, I hope this risk assessment does not create an undue burden. I can practically guarantee that OfCom are not going to over-prosecute this because they do not have the resources.

                      1. zapgadget
                        Big Brother

                        Re: Just another example...

                        The trouble is that now you have a law that can be used whenever the authorities wish to. The more overbearing laws like this exist, and the more that everyone says "oh don't worry, they'll never enforce it", the more likely it is that eventually someone will abuse the fact that these laws are broken on a daily basis to prosecute someone that they don't like. Either enforce all the laws or remove them, but this halfway house is incredibly bad for society and for respect of the law.

                      2. nijam Silver badge

                        Re: Just another example...

                        > The degree to which one must comply is proportionate to the risk of harm.

                        Fallacious, I'm afraid. Leaving aside any custodial sentences that might yet be built in, one incident would bankrupt our small sports club. From our point of view this is a bulldozer to swat a gnat.

                      3. ThomasDial

                        Re: Just another example...

                        I know I'm quite late to the party, but could not resist this one.

                        The notion that this review (which by my reading of OFCOM's guidance clearly applies, for example, to The Register) will be a one-time exercise shows a far more favorable view of bureaucracy than I gained in 40 years' employment in a (US) federal government agency. OFCOM surely will have to increase staffing to handle the workload associated with requiring, designing, receiving, answering queries about, and evaluating the risk and risk mitigation statements. Staffing requirements will be increased further by the need to query and resolve ambiguities found during reviews as well as verifying corrective actions and hounding organizations who overlooked the requirement or who were found wanting and needed further guidance as to remediation of risk conditions. At the end there will be some residue of enforcement action required, hopefully small, but it will necessitate additional legal staffing or hiring of outside counsel.

                        The executives in charge of the effort will not fail to recognize that new online services will be created on a continuing basis and that existing ones (those which do not close shop) are potential backsliders and that there is a consequent need of periodic review and resubmission of reports of risk and remediation practices, and their review and followup. This, unfortunately, will require that most of the new staff be retained as permanent.

                        A follow on result will be growth of a private sector industry to advise and assist, for appropriate fees, those required to prove the purity of their online services to the government. This will be staffed by a combination of former Online Safety Act enforcers and trainees for future employment in enforcement. After a few years it may not be easy to differentiate between the enforcers and their prey. It will be good for employment statistics, but nearly 100% waste in terms of genuine productivity and public benefit. The sad part is that this has near nothing to do with actually protecting anyone and nearly everything to do with demonstrating the existence of formal procedures that purport to offer protection.

                  2. LybsterRoy Silver badge

                    Re: Just another example...

                    Do you want every post everywhere moderated?

                    1. UnknownUnknown

                      Re: Just another example...

                      No, but do you want every post to be shit, AI shit, and either partially or fully fact-free… without consequence? FB don't even take down stuff that's illegal, fake or a scam… and that's now.

                      1. Catkin Silver badge

                        Re: Just another example...

                        I don't think the Act requires any quality for posts but, if it did, it would be even worse. The sort of person who believes laws can be used to prevent "shit" posts fills me with the same type of dread as someone who, seeing a mouse scurrying around, enthusiastically loads a machine pistol.

                2. UnknownUnknown

                  Re: Just another example...

                  The Register isn’t a hobby forum. Perhaps a gazillion years ago… maybe.

                  Take a look at the footer of your daily mail.

                  Indeed Publisher .. as opposed to FB/X/TikTok and their content lender weaselling.

                  Situation Publishing Ltd, 315 Montgomery Street, 9th & 10th Floors, San Francisco, CA 94104, USA

                  The Register and its contents are Copyright © 2025 Situation Publishing. All rights reserved.

              2. Killing Time

                Re: Just another example...

                'the law makes no recognition of size of the business in the case of an H&S breach.'

                Wrong. The Health and Safety at Work Act applies to all organisations above a certain level of employees. The mantra being ' 5 or more or over 4'

                1. Headley_Grange Silver badge

                  Re: Just another example...

                  The H&S act applies to all businesses. If someone gets killed or injured in a place of business because the business owner is negligent there is no defence of "there are only two employees so we don't have to do H&S".

                  1. Killing Time

                    Re: Just another example...

                    Again, you are wrong. The Act applies as stated previously therefore 'the law does make recognition of size of the business in the case of an H&S breach.'

                    That doesn't mean to say that just because you have fewer employees than specified you are absolved of all responsibilities. In the case of your ludicrous 'selling out of date food' scenario the 'Act' wouldn't apply for any size of organisation; however other legislation would. Most likely in that case it would be covered by Trading Standards legislation.

                    In the case of 'someone gets killed or injured in a place of business because the business owner is negligent' other overlapping legislation would apply if the organisation is smaller than the stated requirements but not the H+S at Work Act.

                2. nobody who matters Silver badge

                  Re: Just another example...

                  <......."The Health and Safety at Work Act applies to all organisations above a certain level of employees. The mantra being ' 5 or more or over 4'".....>

                  I don't know where you are getting this gem from, but if you are referring to the United Kingdom Health and Safety at Work etc. Act 1974 (and its various revisions), then it is you who is wrong.

                  The Health and Safety at Work etc. Act 1974 applies to every employer, every employee and every self employed person.

                  Throughout the Act it repeatedly states "Every employer", and "Every self-employed person", and "Every employee". It applies to everyone in any workplace, and requires that everybody should have regard for their own safety as well as the safety of others working around them, and the safety of anyone present who is not part of the workforce (ie. members of the General Public).

                  There is quite definitely no threshold of it only applying where 5 or more people are employed. I think perhaps you may be confusing it with other pieces of employment legislation, some of which do indeed only apply where there is a specified number of employees, and yes that number is usually 5 or more.

                  If you don't believe me, have a read of the Act and point out to me where it says that the Act doesn't apply to businesses that employ fewer than 5 people: https://www.legislation.gov.uk/ukpga/1974/37/contents

                  1. Killing Time

                    Re: Just another example...

                    Ok, fair spot. Been some time since I was heavily involved in the legislation aspects of safety.

                    Where I have stated Act, substitute Health and Safety Management Regulations (which are effectively the practical instructions for persons or organisations to manage their obligations under the Act) and my posts will be more accurate.

                    This is where the threshold is set and defines the more formal steps required of an organisation in managing H+S. Below this level, nothing is required to be documented so proving negligence would be more problematic. For instance, you can claim to have risk assessed a particular activity but as it is undocumented it's difficult to prove you have not or that it is inadequate. Above the threshold the organisation is required to show (and produce evidence) that they follow POPMAR - Policy, Organisation, Personnel, Management, Audit and Review.

                    My original point still stands though: the law does recognise the size of the business in the case of an H&S breach.

                    1. nobody who matters Silver badge

                      Re: Just another example...

                      The only part where a threshold of five employees is set is in the requirement for a <written> Risk Assessment and a <written> Health and Safety Policy - businesses with fewer than five employees are still required to do risk assessments for all aspects of the work involved, and have a Health and Safety policy, but do not have a specific requirement to have it in writing. In the event of an accident or breach of H+S regulations, they still have to show that they had assessed the potential risks, and a good many small employers do actually make a written assessment and have a written H+S policy anyway (I mainly work for businesses employing fewer than five individuals, so see that first hand).

                      There are other health and/or safety regulations where a threshold of five applies, but the actual Health and Safety Act applies to all regardless, and in the event of breach resulting in death or serious injury, the resulting action by HSE will be the same regardless of how many individuals are employed. The degree of action, prosecution or punishment will only vary due to the number of employees who were/are at risk and the seriousness of the breach. In the event of action needing to be taken, HSE will take the same course of formal warnings (improvement notices), prohibition notices or prosecution as appropriate to the breach, and will take these courses of action irrespective of the number of employees.

                    2. andy gibson

                      Re: Just another example...

                      "Ok, fair spot. Been some time since I was heavily involved in the legislation aspects of safety."

                      Then maybe you shouldn't be commenting if your information is out of date?

                3. nijam Silver badge

                  Re: Just another example...

                  > Health and Safety at Work Act applies to all organisations above a certain level of employees.

                  Again, do you really believe that would prevent an actual prosecution?

                  A breach of the H&S regulations could for example be used as supporting evidence, even if the H&S legislation weren't applicable in a particular case. Servo-assisted prosecuting...

                4. nijam Silver badge

                  Re: Just another example...

                  I recently had cause to look into the 'Working at Height' regulations. They are so draconian that under some readings, even a passer-by who happens to witness a breach might be considered to have some responsibility.

              3. Andy 73 Silver badge

                Re: Just another example...

                I think you're unwilling to accept the answers that have been given. This legislation applies right down to a village run website, which will never have sufficient traffic or 'value' (in your miserable monetary sense) to justify or pay for someone doing a few days worth of legally consequential work.

                And the obligation is so severe and vaguely worded that it is like asking your local village store to provide complete supply chain tracking for anything they sell, or face a multi million pound fine.

                Most of your argument is the usual 'think of the children' nonsense, relying on the fallacy that no amount of busy work is too much if there is some vague threat that it might tangentially prevent.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Just another example...

                  Every time you feed Headley The Troll, God kills one of these children. Please, think of the children.

              4. TheMeerkat Silver badge

                Re: Just another example...

                > You still haven't answered the question - and I suspect that no one will. Small shops have exactly the same health and safety responsibilties as large shops and the law makes no recognition of size of the business in the case of an H&S breach.

                Have you considered that the law is stupid when applied to sites of any size?

                It is just that when the size of a site is small it makes the stupidity of the law more obvious.

                1. nijam Silver badge

                  Re: Just another example...

                  In the limiting case, even a site with zero users still has to jump through the bureaucratic hoops.

              5. LybsterRoy Silver badge

                Re: Just another example...

                Did you actually read the article? It mentions 2000 pages of information from OFCOM. How much would the "nominal fee" be to get a lawyer to read through that, examine the site and make recommendations?

                This law, like many passed, is on the basis of "we've passed the law so that's it, job done" and lets ignore the actual implementation. After all we have a law against murder and we don't see any of that do we?

              6. localzuk

                Re: Just another example...

                Well, no, small shops do not have exactly the same H&S responsibilities as large shops. By their very nature, a small shop will have a lower footprint, and therefore less "stuff" to make compliant - e.g. they likely won't have a toilet, or parking spaces. They often will rent a space only, so they won't be responsible for maintaining the fabric of the building either, their landlord will be...

                These new rules treat all forum sites identically - whether they have that proverbial toilet or not.

              7. nijam Silver badge

                Re: Just another example...

                > If these small hobby sites are so useful to their members then their members won't mind paying a nominal fee to use them and that fee could be used to pay for compliance.

                It wouldn't be a small fee, that is why it won't work. The only way to comply with this legislation (which is itself both malicious, and an unnecessary duplication of existing protections - albeit regulations which the government has decided it's not worth enforcing) is to charge a very substantial amount or shut the site down.

              8. tiggity Silver badge

                Re: Just another example...

                @Headley_Grange

                Do you realise how difficult & time consuming it can be to make sure anything defamatory is removed by analysing things manually?

                If you automate it then you get lots of false positives & irritate your users (& still probably miss some banned stuff) .. e.g. context analysis is key, use of the "N" word in a friendly manner between 2 black users is totally different from it being used by a non black user in an insulting manner toward a black user.

                .. Try automating content analysis on e.g. a rape survivors group forum, would be masses of "red flag" content using automated analysis.

                Then we have the challenge of foreign-language comments (not sure how the admin of a small group is supposed to deal with that) - automated translation is not perfect and the "translated" English version may not seem a problem, but possibly the foreign original did actually have a problem and poor translation meant the English output seemed hassle-free. Have to say, I've not looked at the ruling in detail, but I hope you could have a reasonable argument for not coping with non-English content.

                Judging by many real-world stories, big companies don't care much about false positives - I hear lots of complaints from people about posts unfairly removed / suppressed, but most users of big sites seem to put up with it. Users are more likely to leave a small niche site if their posts are auto-banned than one of the "big social media sites".

          2. John Robson Silver badge

            Re: Just another example...

            "So - why is it OK that people aren't protected when they are on a hobby site?"

            The question is invalid - it presupposes that they *are* protected elsewhere, or that the actions to be taken will meaningfully protect them.

          3. david 12 Silver badge

            Re: Just another example...

            So - why is it OK that people aren't protected when they are on a hobby site?

            For the same reason people aren't "protected" when talking to each other in person.

          4. Anonymous Coward
            Anonymous Coward

            Re: Just another example...

            It's more a case of many hobby sites needing to close because the people running them can't afford the risk of a big fine. It's fine thinking of bigger sites absorbing this (and any insurance needed as a result) as a cost of doing business, but for small special interest groups there's no business to take those costs from. It's all fun and games until you miss someone else's ill-advised comment and you're stuck with an £18m bill.

          5. not.known@this.address
            Big Brother

            Re: Just another example...

            Headley_Grange asked "So - why is it OK that people aren't protected when they are on a hobby site?"

            Protected from what, exactly? People showing pictures or posting comments that someone (*) might find offensive? What about all those horror forums where people regularly discuss dismemberment and other nasty actions? What about wargaming forums where players regularly discuss attacking real places? What about fantasy and science fiction forums where the conversation can cover literally everything from weapons of mass destruction to ways to assassinate a planetary governor or wipe out entire bloodlines of orcs or goblins?

            (*) Just as there will always be someone who will find anything you can think of to be erotic, so there will be someone who will take offence at anything you can think of. And with no need for actual proof of harm, the scope of this legislation is frightening. If you think you have nothing to worry about, just remember how many people have been taken to task for things they said or did when they were just starting college or university...

        2. DuncanLarge

          Re: Just another example...

          > Shops aren't run by hobbyists doing it out of love and passion. They're businesses. And use by dates are easier to define than "hate speech" or whatever else people are meant to be protected against.

          Yet hobbyists using 3D printers can't print guns.

          What's your point?

          Also hobbyists making cars, must make them safe for the road. Your point?

          I could, if I wanted to, build a traction engine. But I still have to get the boiler signed off. Even though I'm not a business, I must get it signed off.

          So what the hell is your point?

      2. Yet Another Anonymous coward Silver badge

        Re: Just another example...

        Yes but this is more like having your corner store get a team of lawyers to confirm in writing with each of their suppliers that their products don't contain weapons of mass destruction.

        Fairly easy if you're Costco, a little expensive for a corner shop.

        1. abend0c4 Silver badge

          Re: Just another example...

          Corner stores have exactly the same liability for the goods and services they provide as do multiple retailers. And, in reality, it's corner stores that are more likely to be selling counterfeit products and knocked-off stock. Yes, it's harder to make a profit on small margins at small volumes but that's not usually an acceptable excuse for lower standards.

          Community websites were previously liable to the same extent as big tech for compliance in areas such as libel and illegal contents. They really ought already to have a policy for dealing with it. In future the responsibilities will be more proportionate to the scale of the risk.

          What's new is that you must conduct a risk assessment - that doesn't require lawyers or, for most cases, a great investment of time. But it does require thinking about the possible consequences of what you're doing. If the risk is low, you need to have a stated content policy, a clear channel for reporting inappropriate content and a mechanism for removing it. That's not much different to your current responsibilities, except it's now explicit.

          There's an entry level of responsibility in providing services of all kinds. In this case it's pretty minimal. If you don't want even that level of responsibility, don't do it. There are plenty of other places you can take your group conversation where it's already taken care of. If you want to volunteer to work with children or vulnerable adults - or even provide community meals in the local church hall - you'll be dealing with considerably higher compliance requirements for activities of considerably greater social value.

          1. Yet Another Anonymous coward Silver badge

            Re: Just another example...

            But this doesn't scale with the goods. The corner shop buys the goods from reputable wholesalers and assumes they are responsible for the quality that they pay for.

            This is like putting all the health and safety requirements for the entire supply chain on the teenager at the counter.

            The reason is obvious, the lobbyists paid for the legislation, small local community PHP forums will close down and you will be forced to move your parish Bell Ringing meetup to Whatever-Facebook-Is-Called-Today and the failed cabinet minister behind this will be given the job of vice-president of community at Whatever-Facebook-Is-Called-Today

            1. nobody who matters Silver badge

              Re: Just another example...

              <........"The corner shop buys the goods from reputable wholesalers and assumes they are responsible for the quality that they pay for"......>

              If the goods turn out to be faulty, it will be the corner shop that will be first in the firing line. Any written guarantee from the wholesaler would be taken into account in their defence, but there is still a legal duty for the shop to ensure that the goods they sell are what they say they are and are "of merchantable quality".

              However, the corner shop situation is rather different from that of a small amateur web forum with a small number of members, most of whom will rarely if ever actually post anything ;)

            2. Anonymous Coward
              Anonymous Coward

              Re: Just another example...

              > the failed cabinet minister behind this will be given the job of vice-president of community at Whatever-Facebook-Is-Called-Today

              Michelle Donelan was the Secretary of State for Science, Innovation and Technology who introduced this bill to Parliament. She lost the seat she contested in the 2024 General Election. No idea what she is doing now although with a young child and a rich husband she is probably sat at home scheming her way back into relevancy.

          2. nijam Silver badge

            Re: Just another example...

            > In this case it's pretty minimal.

            Only in some government fantasy world.

      3. sabroni Silver badge
        Facepalm

        Re: Just another example...

        Headley, think it through, bud.

        This legislation basically bans anyone who doesn't have a massive amount of money from running a web site. The reason Meta et al haven't complained about it is that they know it means they become the only places that can afford to provide these services.

        You've made the beginner's mistake of looking at what the government says the legislation is for and thinking that is what the legislation is for.

        Must try harder.

        (https://www.techdirt.com/2024/12/20/death-of-a-forum-how-the-uks-online-safety-act-is-killing-communities/)

        1. Anonymous Coward
          Anonymous Coward

          Re: Just another example...

          >This legislation basically bans anyone who doesn't have a massive amount of money from running a web site. The reason Meta et al haven't complained about it because they know it means they become the only places that can afford to provide these services.

          It's only a few weeks since JD Vance was threatening the UK over this and Elon Musk has been stirring up discontent with the government. He certainly seems unhappy with it - and the recent changes at Meta make it more likely it will fall into scope for a fine. Which is a good way for the government to fill the black hole with the fines it can impose.

          1. Anonymous Coward
            Anonymous Coward

            Re: Just another example...

            I fail to see why I should be happy about a piece of legislation just because a couple of high profile American arseholes dislike it as well.

        2. Anonymous Coward
          Anonymous Coward

          Re: Just another example...

          From the techdirt link:

          "The act only cares that is it “linked to the UK” (by me being involved as a UK native and resident, by you being a UK based user), and that users can talk to other users… that’s it, that’s the scope."

          So, does this mean...

          - if logs are not kept, say on a Swedish hosted server, for IP info

          - Sites are run anonymously

          - Users are anonymous, or at least aren't allowed to choose their nationality as part of a 'profile'

          that Ofcom and all its massive resources (har har har) wouldn't have reasonable suspicion there's anyone from Blighty using such a site?

          Hell, I might even be tempted to Geoblock UK IP address spaces and invite people to consider their VPN options. Or, start teaching everyone about 'The Dark Web' OoooOOOooo
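
          (For illustration only - a minimal sketch of what such a geoblock might look like at the application layer, assuming you maintain your own list of ranges. The CIDR blocks below are placeholders from the documentation/test ranges, not real UK allocations; a real deployment would pull country ranges from a GeoIP database or the RIR delegation files, and as noted above a VPN walks straight around it.)

```python
# Illustrative sketch only: refuse requests from "UK" address ranges.
# The ranges below are documentation/test networks standing in for real
# UK allocations, which you would normally take from a GeoIP database.
import ipaddress

UK_RANGES = [ipaddress.ip_network(cidr) for cidr in (
    "203.0.113.0/24",   # placeholder (TEST-NET-3)
    "198.51.100.0/24",  # placeholder (TEST-NET-2)
)]

def is_uk(addr: str) -> bool:
    """True if the client address falls inside any configured 'UK' range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in UK_RANGES)

def handle_request(client_ip: str) -> int:
    """Return an HTTP status: 451 (Unavailable For Legal Reasons) if blocked."""
    return 451 if is_uk(client_ip) else 200

if __name__ == "__main__":
    print(handle_request("203.0.113.7"))  # 451 - geoblocked
    print(handle_request("192.0.2.10"))   # 200 - allowed
```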

          Interesting times.

          Anonymous posting because...well you know.

          1. TonyHoyle

            Re: Just another example...

            It applies where there are a 'significant' number of UK users. That number isn't defined anywhere.

            It applies in any country.. so hosting it in Timbuktu wouldn't absolve you of the responsibility for the act (whether the UK gov have the ability to apply their rules worldwide is a separate discussion).

            Theoretically, even if you geoblocked UK users, you could still be on the hook if a 'significant' (undefined) number of UK users used a VPN.

    2. UnknownUnknown

      Re: Just another example...

      Gonna be a load of toothless horseshit, and damages proportionate to the impact. Just like GDPR.

      See British Airways pleading Covid poverty and getting an egregious £190m data breach fine talked down to a £20m sweetheart deal… whereas any regulator with sense and some kahunas would have taken the £20m upfront and the rest - inc. interest - as a deferred 20% profit share until paid off - IN FULL.

      Good for BA, good for the Information Commissioner, good for the UK taxpayer and good for BA's customers (and their data).

      The deal struck was only good for BA.

  2. Mentat74
    Thumb Down

    Something tells me...

    This is all designed to put an extra burden on small players while the big ones just shrug their shoulders and continue doing whatever they were doing...

    (I can definitely see Ofcom getting a poop-emoji from Twatter as a reaction to their request...)

    1. Andy 73 Silver badge

      Re: Something tells me...

      Government has been almost completely captured by the global corporations, who long ago realized that almost all regulation was to their benefit. The rest of the harm has been done by bureaucrats who only want to expand their department.

      1. G40

        Re: Something tells me...

        and by pols who want to point to a train wreck and be able to say 'I did that'.

    2. John Sager

      Re: Something tells me...

      This! Very definitely. We've seen the burden of State control of us proles increasing steadily over at least the last 2 decades. It's not just a Left- wing thing either - the Tories seem to have been just as willing to circumscribe our freedom to do stuff in the name of various Menckenian Hobgoblins.

      1. Andy 73 Silver badge

        Re: Something tells me...

        Why should it be left or right wing, when both lots of short term grifters are advised by the same civil service? There is a reason why some of the (admittedly more reactionary) critics of government refer to them as 'the blob'.

        1. veti Silver badge

          Re: Something tells me...

          It's a textbook example of "we must do something, this is something so let's do it". It's one of the dumbest laws I've ever seen.

          My assumption is that big tech will ignore it, and simply (and loudly) threaten to shut down services in the UK if Ofcom tries to press them on it.

          Small services, on the other hand, may well be screwed. Probably depends how professional their management are/are willing to be.

      2. Anonymous Coward
        Anonymous Coward

        Re: Something tells me...

        >the Tories seem to have been just as willing to circumscribe our freedom to do stuff in the name of various Menckenian Hobgoblins.

        This is a Tory-introduced law. It was passed before the last general election but is only coming into force now.

  3. AVR Bronze badge

    Besides the direct compliance costs there's the risk of trolls shutting down the site and exposing the operators to liability by putting up objectionable material and reporting it. There are possible ways around that, but they either kill engagement on the site or require more investment in tech &/or moderators than a single forum operator is likely to be able to handle. This act is going to kill a lot of UK blogs and forums, and I really doubt it's going to put a dent in the availability of child porn or terrorist info or drugs.

    1. Yet Another Anonymous coward Silver badge

      No but it does help big sites remove the competition. Handy if you're a deputy PM hoping for a job when you lose an election

    2. DuncanLarge

      > trolls shutting down the site and exposing the operators to liability by putting up objectionable material and reporting it.

      The solution is simple. Don't let people post like that. All posts have to be reviewed. Why would you allow instant replies?

      You could also limit the number of posts to one a day. Or have them pay to post; all sorts of mechanisms can be employed to dissuade people from posting what literally just popped into their heads.

      If you force most people to wait an hour or two, most bad ideas and bad posts will be forgotten by the would-be poster. Just like in real life. Thus leaving you with the real bad ones.

      1. Zog The Undeniable

        No-one would use a forum as crippled as that. And who's going to review every post? Get real.

  4. KittenHuffer Silver badge

    I get the feeling that many non-UK websites may end up geofencing the UK so as to not have to worry about this legislation. And the UK websites will end up closing down.

    Guess I'm gonna have to start using my VPNs geolocation changing facility, and start visiting non-UK websites.

    Is TheReg having to go through this BS as well?

    1. Dan 55 Silver badge

      "Below-the-line" comments are exempt, but then again have you seen the comments on the Mail, Sun, BBC Speek Yur Brainz, etc...? Perhaps they should be covered after all.

      1. Tom 38

        El Reg doesn't just have comments on stories, there is a whole user forums section too

        1. Dan 55 Silver badge

          I forgot about them. Probably as good a reason as any to knock them on the head, they're pretty dead.

    2. Anonymous Coward
      Anonymous Coward

      I could see huge backlash from the public if/when sites start blocking the UK leading to the UK gov backtracking hard and bringing the law more in line with the EU DSA.

    3. nobody who matters Silver badge

      ,....."I get the feeling that many non-UK websites may end up geofencing the UK so as to not have to worry about this legislation. And the UK websites will end up closing down......">

      Whilst this is a distinct possibility (particularly bearing in mind the way some non-EU/UK websites effectively block those from the UK/EU rather than comply with GDPR), it also has to be borne in mind that there are a lot of web forums which appear on the face of it to be UK based, which in fact belong to a foreign entity - the main one that springs to mind is VerticalScope; based in Canada and over the last 10 to 15 years has bought up a significant proportion of UK forums, automotive related in particular.

      1. Infused

        Alternatively Ofcom could impose a geoblock themselves (which is also an option).

    4. Infused

      It depends. News organisations naturally have an exemption. (Newspapers were one of the main promoters of the legislation). However, it depends how Ofcom defines a media site. The Register may not qualify. Even if it does, I believe there is some uncertainty if comments under articles are exempt or not.

    5. prh99

      Unless they have some connection to the UK beyond some users, or they have similar laws in their home country, they're probably not going to bother. The trend is to say your law applies globally; in practice that's not the case unless you're a multinational. The UK government will probably end up having to block them if they don't comply.

      They tried to pass a similar law in California with the backing of Baroness Kidron and it got struck down. So Ofcom can sit and spin here when it comes to small forums.

  5. devin3782

    I think there's possibly another way this could be achieved: have a DNS record similar to SPF for email. It could work thus:

    TXT "child-safe:true, min-age=13, country=GB"

    min-age (see film classifications or something similar)

    country (2 letter country code)

    - Any website without this record is classed as child unsafe by default.

    - Perhaps social media sites could be classed as adult only by default (we'll simply single out the big ones here: any owned by publicly traded company )

    - Parents enable parental controls; any time their kid goes to a new website rated, say, PG, or with a min-age above theirs, the parent gets a notification of the URL so they can check the site, and has a log of what they're looking at; then the parent can allow or deny. This should be pretty trivial to implement on all devices and can operate solely on those devices, i.e. doesn't need the cloud to compute whether a site is safe or not.

    Obviously this is still marking your own homework, but so is the current law. Also, who decides what's safe for their children? The current law doesn't define it and it's a moving goalpost; this gives the tools to parents. The nice thing is this would work globally for everyone, although the age classifications differing by territory could be challenging... but DNS zoning does exist so maybe it would work even with that and using a VPN to a different country.
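
    (A minimal sketch, assuming the record format proposed above; the field names and the allow/notify logic are purely illustrative, not any real standard. Fetching the record itself would just be an ordinary DNS TXT lookup - the interesting bit is what a parental-control agent does with it.)

```python
# Sketch of how a parental-control agent might act on the proposed
# "child-safe" TXT record. The record format is hypothetical, per the idea above.
from typing import Optional

def parse_safety_record(txt: str) -> dict:
    """Parse a record like: child-safe:true, min-age=13, country=GB"""
    fields = {}
    for part in txt.split(","):
        part = part.strip()
        if ":" in part:
            key, value = part.split(":", 1)
        elif "=" in part:
            key, value = part.split("=", 1)
        else:
            continue
        fields[key.strip()] = value.strip()
    return fields

def decide(txt: Optional[str], child_age: int) -> str:
    """Return 'allow' or 'notify-parent' for a site's (possibly missing) record."""
    if txt is None:
        return "notify-parent"  # no record: child-unsafe by default, as proposed
    rec = parse_safety_record(txt)
    if rec.get("child-safe", "false").lower() != "true":
        return "notify-parent"
    min_age = int(rec.get("min-age", "18"))
    return "allow" if child_age >= min_age else "notify-parent"

if __name__ == "__main__":
    print(decide("child-safe:true, min-age=13, country=GB", 12))  # notify-parent
    print(decide("child-safe:true, min-age=13, country=GB", 14))  # allow
    print(decide(None, 14))                                       # notify-parent
```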

    1. Scotech

      Connect to a VPN, or even directly to a website, by its raw IP address then, and bypass the whole DNS palaver. Or use DoH. Or a private proxy server running on a domain you control. All stuff I could do at 13 (except for DoH, which mercifully hadn't been invented then, and never should have been, IMO) and presumably many more teens could today.

      There's no technical solution here that works for all cases, or that can't be circumvented somehow. The solution isn't to try to legislate or solutionise the problem away, but to apply the existing laws properly, fund police forces to run proper cyber-crime divisions, and tell parents to stop being lazy and pay more attention to what their kids are doing on the computer.

      On that last note - I was 15 before I got a private computer in my own room. Prior to that, all my screen time was monitored because the family PC was in the living room, with the screen in full sight from there and the kitchen at all times. And I had a mobile, but I didn't get a smartphone until I was away on my own - prior to that, all I got was a dumb Sony Ericsson, and it did me just fine. If parents want to control their kids' access to the web, they don't need fancy technological solutions, they just need to pay attention and put in a little effort!

    2. Yet Another Anonymous coward Silver badge

      Wouldn't a bit field be more flexible?

      Then you can code if your 21 year old can see boobies but not alcohol

      Or your 12 year old can see ads for machine guns but not Kinder Surprise

  6. heyrick Silver badge

    offences related to information likely to be of use to a terrorist

    Oh piss off.

    That's stupidly vague. A train timetable could be information of use to a terrorist, in the right circumstances.

    1. heyrick Silver badge

      Re: offences related to information likely to be of use to a terrorist

      Reading on in the linked document:

      "Possession of extreme pornography offence" - kindly define extreme.

      "Offences related to articles for use in fraud" - to silence criticism of the banking industry? They're the biggest fraudsters around.

      "Foreign interference offence" - so why haven't they demanded Musk be extradited?

      1. Yet Another Anonymous coward Silver badge

        Re: offences related to information likely to be of use to a terrorist

        >kindly define extreme.

        Anything you wouldn't want your wives or servants to see. Especially anything involving game keepers.

      2. Anonymous Coward
        Anonymous Coward

        Re: offences related to information likely to be of use to a terrorist

        >kindly define extreme.

        anything showing flesh?

        1. Yet Another Anonymous coward Silver badge

          Re: offences related to information likely to be of use to a terrorist

          >anything showing flesh?

          You're into kinky skeleton porn ?

    2. Apocalypso - a cheery end to the world
      Joke

      Re: offences related to information likely to be of use to a terrorist

      > That's stupidly vague. A train timetable could be information of use to a terrorist, in the right circumstances.

      One of those circumstances being that the train runs on time? Highly unlikely in the UK.

      1. Yet Another Anonymous coward Silver badge

        Re: offences related to information likely to be of use to a terrorist

        >One of those circumstances being that the train runs on time? Highly unlikely in the UK.

        That's all part of the cunning plan. The terrorists don't know where the trains will be.

        It's like removing all the road signs in WWII

    3. Mishak Silver badge

      Nah...

      Train timetables are far too "wishful thinking" to be of any use to a terrorist.

  7. Licensed_Radio_Nerd
    FAIL

    Can probably be ignored.

    This will be a bit like speed limits on the roads. It is only enforced if you have the resources to catch people. Ofcom lack the ability to police and protect the radio spectrum (outside of safety-of-life services). I cannot see, short of employing thousands of people, how they can police every web forum (and others) in the UK. And what if you set up your forum with a non-UK gTLD and claim it is outside the UK? They are going to have to pester a lot of DNS registrars to find out who owns what in order to chase them.

    1. I could be a dog really Silver badge

      Re: Can probably be ignored.

      But, the corollary is that they can take action against anyone if they have a reason to. So "small forum with no problems" could be trolled & reported, then OfCon prosecute the person who runs it who ends up with a criminal record and their life ruined. That's quite a risk.

      1. Anonymous Coward
        Anonymous Coward

        Re: Can probably be ignored.

        It looked to me that the penalties were civil, not criminal. Perhaps I missed something?

        1. Anonymous Coward
          Anonymous Coward

          Re: Can probably be ignored.

          Unless they decide it comes under some vaguely defined terrorism or computer misuse act - but that would never be misused for trivial offenses

          #include < stories_of_local_council_using_RIPA_fly_tipping_dog_shit_leaving >

        2. Blazde Silver badge

          Re: Can probably be ignored.

          "It looked to me that the penalties were civil, not criminal. Perhaps I missed something?"

          It's even easier than this. OFCOM's first communication with any small or medium provider profoundly unlucky enough to come under their microscope will be "hey we'd like to see this information" and then you either scramble to collect that information or you shutter your site long before the fines (and ultimately, in theory, criminal offences) become part of the enforcement.

          Anything else is an over-reaction, especially if you're already confident you're very low risk like the 'hosted outdoor sports forum' mentioned in the article, and are taking steps to defeat trolls and spam like any forum administrator has had to do since the dawn of time. The vast majority of admins already try to run sites that don't harm their users, so why should they worry?

          Whole areas of UK regulation are like this. There's a massive gap between what the legislation technically says and what everybody does in practice. The gap is used by the regulator to stop the genuine offenders sitting just the right side of the line while making a mockery of the spirit of the law, because it allows them to force suspected offenders to comply way in excess of the average man without the tricky business of first having to prove they are offenders. Is it ideal from an equitable point of view? Not entirely, but the penalty of having to actually do the compliance is not egregious and regulators are usually kept from abusing their powers by being starved of funds.

          1. Anonymous Coward
            Anonymous Coward

            Re: Can probably be ignored.

            This is the entire point of: a jury decides the facts.

            Laws are *meant* to be written vaguely, according to the wording of the people. They *should not* be written specifically, exactly, such that a computer can evaluate the case, or a lawyer has to consider the conditions.

            A jury decides: is this criminal, or is this not criminal? They consider the facts (extreme, violent, unjust, history of defendant, apparent intent), and decide: is this part of a criminal act, or is it not?

            Having things overly-specifically defined prevents the jury from doing this (considering, as _people_, whether this act is criminal), and allows all the loopholes by which (typically the rich) people can sneak through.

            1. Blazde Silver badge

              Re: Can probably be ignored.

              Juries are only used in serious cases. There's an efficiency balance. You can't have 12 people gathering round unpaid for weeks to scrutinise whether or not you possessed a small amount of the controlled substance on the night in question. A senior police officer offers you a caution. If you don't like that a magistrate tuts at you, issues a fine and tells you he doesn't want to see you again, and the fact that the police and the courts are independent from each other makes this work. If you don't like that you submit your appeal to the crown court, the judges there pretend they read your appeal and then they increase your fine. (Unless, rarely you actually had some decent grounds for appeal, because they love nothing more than overruling the muppets in the lower courts on some obscure procedural issue).

              All the Online Safety Act enforcement comes with a right to appeal to the courts. OFCOM are just playing the role of the police, empowered to investigate and issue notices and fines while being careful to adhere to the law themselves. For the more serious enforcement actions like raiding premises and 'access restriction' (blocking) OFCOM first needs to apply to the courts for a warrant, just as the police do.

            2. Missing Semicolon Silver badge

              Re: Can probably be ignored.

              No, vague law is bad law. It is subject to judicial expansion and interpretation. This means that the definition, for example, of "hate offences such as stirring up of racial hatred offence and stirring up of hatred on the basis of religion or sexual orientation" can be blown up to ensure that criticism of certain orthodoxies will be effectively illegal.

      2. Anonymous Coward
        Anonymous Coward

        Re: Can probably be ignored.

        But it would also be a huge PR disaster for Ofcom if they get it very wrong, and Ofcom could be going against the European Convention on Human Rights.

        1. JulieM Silver badge

          Re: Can probably be ignored.

          Considering there are active moves afoot to get the UK out of the ECHR, that sounds like something they might regard as a feature as opposed to a bug.

      3. nobody who matters Silver badge

        Re: Can probably be ignored.

        ,......."But, the corollary is that they can take action against anyone if they have a reason to. So "small forum with no problems" could be trolled & reported"....>

        You just know that OFCOM will go after the small privately run forum because it is an easy target, they will meet little resistance and they can tick another box on their report sheet. Meanwhile, the big professionally run ones will see little if any enforcement because they will be difficult, will resist and it will cost OFCOM a lot of time and money (neither of which they have in the kind of abundance they would need), so they will write them a polite letter giving them a ticking off, and let it pass.

        1. Anonymous Coward
          Anonymous Coward

          Re: Can probably be ignored.

          Going after a small privately run forum over a big tech one would be a PR Nightmare.

          1. nobody who matters Silver badge

            Re: Can probably be ignored.

            How exactly? I can't really see it: a small forum that hardly anyone outside it has heard of (and which those outside it won't miss) vs. a large, well-supported and well (and widely) known forum with a wide membership and a large amount of daily activity. I know which one I see as having the bigger PR impact.

            The owner and members of any small forum may shout loudly for a short while, but I think they and their forum may soon be once again forgotten by the rest of the population.

        2. Anonymous Coward
          Anonymous Coward

          Re: Can probably be ignored.

          The potential fines from Twitter (~£100m) and Meta (~£1bn) make it very worthwhile to pursue them...

          1. nobody who matters Silver badge

            Re: Can probably be ignored.

            Potential, yes. However, how often are the fines likely to be anywhere near their 'potential'? Those organisations have sufficient monetary clout to be able to use interminable legal processes to contest the fines and effectively end up getting away with not paying them. We have actually seen this happening time and again where megacorps have been fined for other things.

        3. PCScreenOnly

          That's all government departments

          Especially HMRC

      4. Infused

        Re: Can probably be ignored.

        So it will become risky to base services in the UK, but not abroad. Ofcom will have other options for foreign-based services like geoblocking or financial sanctions.

    2. keithpeter Silver badge

      Re: Can probably be ignored.

      I suspect the concern is the amount of work needed to have the risk analysis document and the relevant counter-measures ready just in case; that is the issue.

      I gather that OFCOM has to finance all activity under this act through the fines it receives, so I suspect they will initially go after larger concerns that are flagrant in their defiance.

    3. sabroni Silver badge
      Unhappy

      Re: Can probably be ignored like speed-limits on the roads

      That's a dickish thing to say.

    4. Yet Another Anonymous coward Silver badge

      Re: Can probably be ignored.

      >This will be a bit like speed-limits on the roads. It is only enforced if you have resource to catch people.

      So enforcement would be automated to the point that most people regard it as merely a tax rather than a safety measure?

    5. Missing Semicolon Silver badge

      Re: Can probably be ignored.

      Do you want to be the example they choose to make? Thought not.

  8. Tron Silver badge

    Game over for Web 2.0 in the UK.

    Unless you have a corporate level legal dept., you may as well take down all Web 2.0 interactivity.

    So, why is it like this? Three reasons. Some folk in government want to end the ability of individuals to broadcast their opinions or dirt they have on the wealthy and powerful - #MeToo etc. The Horizon scandal took years of concerted pressure from 'Private Eye' to gain traction. The #MeToo movement was much quicker. Public voices on the net make it harder for the rich and powerful to get away with corrupt and evil stuff and hide their failures, so obviously they will suppress Web 2.0. Second, incompetence - the British govt. lacks competence all over, and decent, competent people just don't want to work with them [in part because they do stuff like this]. Finally, activists. This is the age of the screechy activist. Protect the children from the evil internet. Because they were all fine before the net, and their parents have no role in such things. One person suffering from hurt feelings is one too many. Nazi types can leverage this into bans very easily.

    China, Russia and Iran already do this. Australia and the EU will follow the UK. Our internet is being taken away from us piece by piece, because it empowered us at the expense of the idiots who ru(i)n our countries.

    What can we do? Some distributed services may fill in the gap, but we are in the same place as the Chinese and the others - we can do toss all. Just hate your government for what it is doing and what it is taking away from you. Vote whatever regime you have out at the earliest opportunity. What replaces them won't be any better, but seeing a bunch of them forced to remove their snouts from the trough for what they have done, and leave, feels good. It's all we have as the quality of life in our Orwellian dystopia continues to decline.

  9. Brewster's Angle Grinder Silver badge

    I haven't looked at the legislation, yet. But if you are a small, text-based forum, do you even need to bother with the risk assessment? You, ahem, do it when Ofcom write to you and backdate it knowing every case will be low risk.

    1. Anonymous Coward
      Anonymous Coward

      if you are a small, text-based forum,

      I can imagine that the assessment for small, text-based fora might be pretty similar, barring a few bits of customisation here or there. Perhaps it would help if there was a template which could be customised. For many people, writing such a thing starting from a complete blank would be quite onerous.

      1. Brewster's Angle Grinder Silver badge

        Re: if you are a small, text-based forum,

        That's eminently sensible. It will just take a while for those things to emerge. Software could ship with it.

    2. Anonymous Coward
      Anonymous Coward

      Today one needs to think more like a Russian or Chinaman.

      They have had this sort of pettifogging shit since forever, and they get on with ignoring it.

  10. Anonymous Coward
    Anonymous Coward

    I would not be shocked if the implementation is pushed back, seeing how unworkable the law is.

    And if most sites do end up blocking the UK then that's going to be felt right away, and a lot of people are going to become aware of how bad the OSA is. There could also be a huge negative effect on the UK tech economy. I can see a huge backlash from the public and the UK gov backtracking hard with panic amendments.

    1. Brewster's Angle Grinder Silver badge

      I think it's already law, isn't it? But a few hundred people here, and a few hundred people there, don't add up to nearly seventy million. Hopefully, the courts will take a more reasonable line and draw it more tightly. But I don't see much backtracking as likely.

      1. Anonymous Coward
        Anonymous Coward

        I can see a bit of backtracking, but the law will likely face judicial reviews and legal challenges.

  11. Boris the Cockroach Silver badge

    Big quote

    "offences related to information likely to be of use to a terrorist and offences relating to training for terrorism

    hate offences such as stirring up of racial hatred offence and stirring up of hatred on the basis of religion or sexual orientation

    sexual exploitation of adults such as causing or inciting prostitution for gain offence

    human trafficking

    assisting or encouraging suicide offence

    the unlawful supply, offer to supply, of controlled drugs, and the unlawful supply, or offer to supply, of articles for administering or preparing controlled drugs

    weapons offences such as offences relating to firearms, their parts, ammunition, including air guns and shotguns"

    .

    .

    Well that's Facebook up the shitter then

    1. Yet Another Anonymous coward Silver badge

      Re: Big quote

      It also covers the Bible, Scouting for Boys and most of Roald Dahl's children's books

      1. Anonymous Coward
        Anonymous Coward

        Re: Big quote

        I joined NAMBLA and got the free copy of Scouting for Boys.

        It was quite good on knots and stalking, but they seem to have omitted the sections on semaphore and camp cooking.

        1. Sceptic Tank Silver badge
          Flame

          Re: Big quote

          Why do you want to cook a camp?

          1. Yet Another Anonymous coward Silver badge

            Re: Big quote

            Have you tried eating them raw?

          2. PerlyKing
            Joke

            Re: camp cooking

            Not cooking a camp, you're being obtuse.

            Cooking while being camp - fabulous!

  12. J.G.Harston Silver badge

    I notice that the legislation states that the UK state is exempt from the restrictions on publishing harmful content.

  13. Chubango

    RIP cool little corners of the internet

    The homogenization of the internet and its culture continues, as only the big players will be able to comply. My only hope, as someone involved with a few small communities and a (web)admin for two of them, is that the rest of the world does not pursue similar idiocy.

    Never thought I'd get into the habit of checking the obits.

  14. martinusher Silver badge

    How about a ban on spray paint?

    Spray paint cans have legitimate uses but by far their most common use is for spray painting graffiti, something that's mostly offensive to the eye, the devil to get rid of and is completely uncensored.

    This is just a typical example of ineffective government overreach, the sort of "Brazil"-like mindset that makes the place really naff for ordinary people to live in. It's ineffective because it won't stop the material it's ostensibly supposed to stop, won't protect the people that it's supposed to protect, but will provide endless employment for what used to be called in the old days "Little Hitlers" who can harass and suppress at will. (....and I'd guess it's a lot easier than confronting the problem of 'grooming gangs', apparently a very real issue rather than something cooked up by tabloids to boost circulation)

    So I suppose the London Fixed Wheel mob need to open their site for London, Ohio or London, Ontario riders (wink....wink.....)

    1. Yet Another Anonymous coward Silver badge

      Re: How about a ban on spray paint?

      >Spray paint cans have legitimate uses but by far their most common use is for spray painting graffiti,

      My government has taken all possible steps to prevent art and are now reorganising the Department of Education in an effort to wipe out writing by the next election

  15. TheMaskedMan Silver badge

    "That's just the way that Parliament designed the act."

    Which is just what happens when your elected representatives are professional politicians with little to no knowledge of anything other than getting themselves elected.

    Running online fora, even of the text only persuasion, is hard work if you want to keep the trolls under control. So much so that I stopped doing it years ago, and have no intention of doing it ever again. The burden imposed by this legislation will surely be such that many small sites will simply close down, or move to Reddit, Facebook etc. Which is nice for Reddit et al, I guess.

    I assume that the regulations will also apply to comments posted under articles / blog posts, so that's yet another group with extra work to do for no practical gain.

    As others have noted, there are already perfectly adequate laws in place to deal with unpleasant material, but that doesn't allow a government to be seen to be "doing something", does it? There may also be side benefits in making it difficult to operate independent fora, and, perhaps, troll them into extinction by posting - and then complaining about - something unpleasant. It's much easier to keep track of political dissent if you know where to look for it!

    The world goes madder by the day.

    1. AndyBarker

      Commenting on articles, review, and blogs is exempt

      1.17 A U2U service is exempt if the only way users can communicate on it is by posting comments or reviews on the service provider’s own content (as distinct from another user’s content).

      1. Anonymous Coward
        Anonymous Coward

        Re: Commenting on articles, review, and blogs is exempt

        but a comment by a user is that user's content, right?

      2. Anonymous Coward
        Anonymous Coward

        Re: Commenting on articles, review, and blogs is exempt

        >1.17 A U2U .

        So I can go on a movie review site and say that all Buddhists are a bunch of Chihuahua molesting terrorists, but if I say anything unkind about amfm1 on el'reg they are in trouble ?

        1. Killfalcon

          Re: Commenting on articles, review, and blogs is exempt

          Only if you say it in the El Reg forum. Article comments like this are, apparently, not a risk to the public.

  16. Anonymous Coward
    Anonymous Coward

    About time too.

    I wandered into an LFGSS bar and woke up with a sore arse, and chain grease smeared up my inside leg.

    Never again!

    1. Yet Another Anonymous coward Silver badge

      Were you just bicycle-curious ?

  17. Anonymous Coward
    Anonymous Coward

    Hail the Return of the Dial-Up BBS?

    For a good time, call 867-5309.

  18. bernmeister
    Headmaster

    I am all right jack

    I suspect that small niche websites won't be affected by this. I run a small website dedicated to the modification of old radios. I have only received one comment and three views over the past few years. It's a free site and I don't look at it myself very often. I put it together as an experiment. The comment was "you dont seem to be very active".

    1. Anonymous Coward
      Anonymous Coward

      Re: I am all right jack

      And yet you're just as liable for the £18 million fine as Facebook.

    2. Amblyopius

      Re: I am all right jack

      You are most likely exempt anyway as per Schedule 1 Paragraph 4 1a:

      "A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—

      (a)posting comments or reviews relating to provider content;"

      Your (somewhat brief) description seems to imply it's a blog-style site, which is what Paragraph 4 seems to be targeting (aside from comments, it lists other functionality you'd typically see on blog-style sites).

      1. Anonymous Coward
        Anonymous Coward

        Re: I am all right jack

        "Likely exempt" is a bit shaky when the minimum fine is set at £18m.

        1. flayman

          Re: I am all right jack

          There is so much FUD around this. It's not a minimum fine; it's up to £18 million or 10 percent of qualifying revenue. Read this explainer: https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer#how-the-act-will-be-enforced

          I just can't see good faith errors being punished. Wilful failure to engage with OfCom is of course going to incur penalties.

          1. Anonymous Coward
            Anonymous Coward

            Re: I am all right jack

            >I just can't see good faith errors being punished.

            Obviously these measures are only going to be used against terrorists .....

          2. Missing Semicolon Silver badge

            Re: I am all right jack

            Oh, you poor sweet child.

            Of course they will be punished. Otherwise the legislation will be ineffective, as Ofcom don't have time to have a little chat with every forum owner.

            And when they do, you'd better make sure that every single post would never be taken down from the Guardian's comments section.

        2. David Hicklin Silver badge

          Re: I am all right jack

          >> regulator Ofcom could impose a fine of £18 million

          Important word, that "could", which in reality means "up to"... it sounds good, though, just like being seen to do something.

  19. This post has been deleted by its author

  20. StrangerHereMyself Silver badge

    Desert

    IMHO the UK will become an internet desert, with most services being cut back or terminated.

    And why is the OSA (Online Safety Act) needed anyway? Most offenses are already punishable with existing laws. The only thing the OSA does is add more layers of bureaucracy and paperwork.

    1. flayman

      Re: Desert

      I'm sorry, but this is rubbish. We'll see. Give it a year. I'm sure you will be proven wrong. The Act will catch actors who are not being conscientious. It is attempting to ensure that everyone understands what sort of content is proscribed, most of which is common sense, and that providers assess the risk of such content a) getting onto their platforms and b) not being swiftly removed and dealt with in a compliant manner. In so doing, they will look at whether they have adequate procedures in place. The people who are worried about it are already doing the things they are meant to be doing.

      On the other side of the Atlantic, major social media providers are shrugging off their responsibilities and just saying it's the wild west. Maybe Facebook and X.com will stop doing business in Britain. I honestly could not give a shit.

      1. emacca

        Re: Desert

        Catch the actors? They are literally letting out pedos. Not to mention the grooming gangs up and down England.

        Is that your definition of catching?

        1. Evil Scot Bronze badge

          Re: Desert

          If my grandfather were alive today he wouldn't know what to do with a computer...

          An Instamatic camera, no problem.

          This is the issue. The damage is done with the image production not the distribution.

  21. Anonymous Coward
    Anonymous Coward

    It is this sort of law that makes me think that Trump and Musk might actually have a point…

  22. emacca

    Ouch

    I’m starting to hate this country and the way it's going. Speech is dying; everything comes with red tape. Common sense is dead. Why would you want to start a business or try anything innovative here?

  23. Guy de Loimbard Silver badge
    Childcatcher

    The government says it is designed to protect children and adults online

    Would like to see the meeting notes for this pearl of wisdom......

    Bureaucracy for the sake of it!

    The merest mention of "protecting the children" seems to be the only basis for ramming this stuff through.

  24. flayman

    An evidence based approach

    We're tech professionals, mostly. Smart people, I would have thought. We should take an evidence-based approach when talking about this Act and how it is supposed to work. The Online Safety Act has its problems, but nothing like on the scale or scope of what some of you are suggesting. There are some publications explaining what will happen and how it will be enforced. Please read them. Here's one from gov.uk. I've linked to the heading "How the Act will be enforced."

    https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer#how-the-act-will-be-enforced

    It states that criminal charges only apply where information requests are not followed and for non-compliance with enforcement notices.

    OfCom provide a tool to determine whether a service is caught by the Act. The start page for that is here, and it links to some other resources:

    https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/check/

    You answer a few questions, and if you find that the Act applies (for most of what we are talking about, it does), you can click a button "Check how to comply", which brings up this:

    https://www.ofcom.org.uk/Online-Safety/check-how-to-comply/

    Four steps:

    "Step 1 - understand the harms

    Step 1 will help you to know which kinds of illegal content to assess, and to make accurate judgments about your risks.

    Step 2 - assess the risk of harm

    Step 2 will help you use evidence to assess and assign a risk level to: the risk of harm to users encountering each of the 17 kinds of priority illegal content and other illegal content; and also the risk of harm of your user-to-user service being used for the commission or facilitation of a priority offence.

    Step 3 – decide measures, implement and record

    Step 3 will help you identify any relevant measures to implement to address risk, record any measures you have taken, and make a record of your assessment.

    Step 4 - report, review and update

    Step 4 will help you understand how to keep your risk assessment up to date, and put in place appropriate steps to review your assessment.

    Based on your answers to questions asked within the tool, we will provide you with compliance recommendations for your service. It will be of most use to small and medium sized businesses but could be useful to any in scope service provider. "

    The tool mentioned above is still being developed. It will provide recommendations. Following those recommendations would be a good way of ensuring compliance, but the recommendations are not law. I can imagine a recommendation that communities develop and publish a sensible set of rules and that those rules are enforced.

    For the vast majority of online forums to which the Act applies, once you understand what content to look out for, step 2 will result in an assessment of minimal risk, because they are already operating in a manner that catches this type of content and deals with it quickly or there has never been such an incident in years of operation. That cycling forum mentioned in the article would be one such. In the latter case, step 3 will involve working out how to ensure that such incidents are caught and dealt with. For example: "We will moderate our channels by providing a means to flag inappropriate content as well as proactively tackle inappropriate content that moderators discover themselves." If OfCom accepts the risk assessment and the measures, then you're good. There might still be some incidents which slip through the cracks. This is not a violation. It just means that the risk assessment probably underestimated the risk. If OfCom order removal and you don't comply, that would be a violation. There would have to be a process of appeal because a regulator is in a quasi-judicial role. Step 4 is not unlike what we're having to do every year for GDPR.
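    As a purely illustrative sketch (not anything Ofcom publishes or prescribes), the record that comes out of steps 2 and 3 for a small forum might amount to little more than a structured note. Here it is modelled with hypothetical Python dataclasses; every field name and value is my own assumption, not something taken from Ofcom's tool or guidance:

    # Illustrative only: one possible shape for a small forum's written risk
    # assessment, loosely following steps 2-4 quoted above. All names here are
    # assumptions, not anything from Ofcom's tool or guidance.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class RiskEntry:
        content_kind: str   # e.g. one of the 17 kinds of priority illegal content
        risk_level: str     # e.g. "negligible", "low", "medium", "high"
        evidence: str       # why that level was assigned (step 2)
        measures: list = field(default_factory=list)  # what you do about it (step 3)

    @dataclass
    class RiskAssessment:
        service_name: str
        assessed_on: date
        next_review: date   # step 4: keep the assessment up to date
        entries: list = field(default_factory=list)

    # Example: a small, text-only hobby forum with active moderation.
    assessment = RiskAssessment(
        service_name="Example small hobby forum",
        assessed_on=date(2025, 3, 1),
        next_review=date(2026, 3, 1),
        entries=[
            RiskEntry(
                content_kind="hate offences",
                risk_level="low",
                evidence="No incidents in ten years of operation; text-only posts",
                measures=["Report button on every post",
                          "Moderators review flags within 24 hours"],
            ),
        ],
    )

    Something along those lines, kept up to date and reviewed annually, would presumably be the bulk of the paperwork being discussed here.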

    I think there are some legitimate concerns around things like hate speech, and there had better be clear advice on that. Often people report things for hate speech which do not qualify. It could come down to an enforcement notice, and that would need to be complied with. Fines and, at the most extreme, criminal prosecution are reserved for responsible parties who do not carry out the duties imposed by the Act. If you do not perform and submit a risk assessment, you could be fined, but it won't be £18 million or anything like that. Complying with the Act means carrying out steps 1 through 4, responding to information requests, and complying with enforcement notices.

    Some of what the Act is attempting to do may seem pointless (age verification, etc), and perhaps it is. But I really don't think this is armageddon for service providers. If you disagree with any of what I've laid out, then I welcome your thoughtful and polite reply. /ducks

    1. Missing Semicolon Silver badge

      Re: An evidence based approach

      What it is for is to ensure that ordinary people cannot publicise opinions that are seen as "harmful" by the Government. The penalties are so draconian, and the penalty for merely being investigated is so huge (have you ever asked a Solicitor to write a letter?) that most small forums will simply close, to avoid the attention.

      This is one of those things where the "process is the punishment".

      1. flayman

        Re: An evidence based approach

        As I said, compliance is about ensuring that the sort of content that is for the most part already considered illegal is recognised and that there are suitable processes for dealing with it. If you run such a service and you don't have a risk appropriate process or you fail to comply with an enforcement notice, you are liable. Proscribed content making its way onto your service does not constitute a failure to comply. It is to be expected. How you deal with it is what matters. The penalty everyone mentions is the maximum available and would surely be reserved for gross and wilful noncompliance. How about we all just relax a little bit. It hasn't even started yet and some of you are predicting the end of days.

    2. ThomasDial

      Re: An evidence based approach

      "The tool mentioned above is still being developed. It will provide recommendations. Following those recommendations would be a good way of ensuring compliance, but the recommendations are not law. I can imagine a recommendation that communities develop and publish a sensible set of rules and that those rules are enforced."

      The above caught my eye. I'm in the US, and realize that UK law may be quite different to ours. Here, though, such a statement is far from a guarantee. If an agency finds a condition that it thinks violates the law, it may make no difference that the condition arose from an earlier agency recommendation: the agency may decide the recommendation was a mistake, incorrect as a matter of law as it later sees it (or as courts later interpret it), and that it must enforce the law as it is, not as it may once have thought, or as you thought based on its algorithm's output.

      1. flayman

        Re: An evidence based approach

        The recommendations that the tool gives you are simply that. Recommendations tailored for your specific case. You can decide to implement a different policy, as long as it's reasonable. The following year, if that implementation has not proven effective (i.e. you failed to quickly find and remove some proscribed content) then it should probably be amended. That incident does not mean you've broken the law, but ideally it would mean that you've already changed the processes in place.

        OfCom would have to accept your risk assessment and description of the processes in place as well as the review that you carry out every year. If they do, you're fine. If they don't, you need to do it again. If you don't submit the risk assessment and other requirements, then you are non-compliant.

  25. Anonymous Coward
    Anonymous Coward

    FINALLY!

    Finally, a clean internet is within reach.

    In the 90's/00's I would have been right against this but it all went to pot when social media allowed unfettered and unchecked communication between all ages and all cultures from all over the world.

    Instant sharing of violent and incompatible ideas, direct into the pockets of naive and inexperienced kids, who then grew up to have their own kids who are in even deeper than the parents.

    As social media has become so prevalent in life, we have to take action. Just like we have laws and regulation on the roads, and with who can interact with children professionally, well this unfortunately twisted version of the internet we have created needs this too.

    I crave the days when the internet was fun, when it was free and anonymous; I'd fight for such a net. But even I can see that it is far from what we have created. That Net is a memory. It is buried deep underneath the rot of the commercial and twisted net we have today. We have to do something; the underlying true net of the past will return once we sort out, finally sort out, our natural inability to handle social media as it is presented by commercial interests today.

    We let it become centralised. We let it become commercialised. Central commercial walled gardens, fighting for users and doing so by actually playing with our very minds. We made sure TV couldn’t do that. Why should Facebook? Why should the Chinese TikTok? We who grew up in the 90s/00s gained the defences needed to handle using this, this thing, that the net became. But many can’t, and especially the kids; they are swimming in it, day and night. Strange and disgusting foreign ideas and demands, against, even hostile to, the very nature of the society that they are starting to grow up in, are provided to them day and night. There is no off time.

    China created their great firewall to hold this back. That was too extreme, but in a way they were right: there is something innate within us that must be held back in some ways.

    In the past, a bully at school could only chase you home. Inside, you were safe, all weekend even. Back at school on Monday it may be totally different, the Bully has forgotten, or mellowed, I was there. But after 2010, or even just in the 00's with kids getting pay&go mobiles, well, now you have no escape as the bully now can enter your home at all times. With pay&go mobiles, changing the SIM would resolve that but now with social media, you can’t run.

    How many kids have died because of that? Quite a few in fact.

    Many have died because foreign kids, now strangely close as if in the same country, help to goad a kid in a totally different country and culture to take the "X challenge" and we all know what some of those challenges do. Remember that brain dead kid on life support over a year? Forgot him already didn’t you. Last year’s news eh. Well, nobody ever found out why he was found dead. Just was. Assumed to have been in a video call online with other kids, they would have seen it, seen it live, helped make it happen, but did they confess? Did they reach out and explain and provide a witness testimony? Did they f*ck. The cowards watched a kid in another country take a challenge and die and they did what was normal for them, run away and forget they ever saw anything.

    You really want to keep something like this?

    You really want to continue to "just see where it goes next"? All those "sign of the times" mantras I have heard over the years, as if we are following. Don’t make me laugh. We are not following, we are simply refusing to act. Excusing ourselves as if we simply had no control, there was nothing to be done etc.

    Well, this is HOW we actually do it. I'm fed up of hearing the same crap about just letting crap happen and dealing with the problems reactively. When I was a kid, my parents made changes and choices to control what WOULD happen regarding me and my siblings. These days I see mostly everyone just happy to let it all just evolve.

    Well, this is what we evolved.

    And this act is how we regain control. Trust me, other countries will follow.

    If it works well, then in 20-30 years we will be remembering this twisted internet as a distant bad memory. We'll have the old net back, but the controls in place will prevent it being abused to help, even by design, the abuse of people. We won’t have a Great Firewall like China does, no we’ll have our freedom still, but just like absolute power corrupts so does absolute freedom.

    I see that past generations were more strict with their younglings, every generation following being less and less strict and protective. We see much of that as progressive, which is true. But you can’t progress forever, there is an end point, a line you shouldn’t cross and in many ways we crossed it decades ago just to see what happened.

    And we did.

  26. Ex IBMer

    Trivial. Simply remove content

    You don't even have to employ additional staff.

    Simply place a button on any user content that allows people to flag it as illegal, and delete the comment instantly, no questions asked.

    People will go to town on government pages. And it will be fun to watch. Especially if you blur the content and say that "A user in Staffordshire has taken objection to the content, so it has been removed...."

    Extra points for collating a list of removals monthly for people to see.

    Almost worth creating a social media platform with that feature.

    Oh, I forgot.... once removed... it's removed forever...don't want to not comply with the law...
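    For what it's worth, here is a minimal sketch of that idea, assuming a hypothetical Flask app and in-memory storage. None of this is any real forum's API; the route names and fields are made up for illustration:

    # Illustrative only: a hypothetical endpoint that blanks any comment the
    # moment a single user flags it, as suggested above, and keeps a public
    # log of removals.
    from datetime import datetime, timezone

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Stand-ins for real storage, purely for illustration.
    comments = {
        1: {"text": "First post!", "removed": False},
        2: {"text": "Something somebody might object to", "removed": False},
    }
    removal_log = []  # the monthly list of removals mentioned above

    @app.post("/comments/<int:comment_id>/flag")
    def flag_comment(comment_id):
        comment = comments.get(comment_id)
        if comment is None:
            return jsonify({"error": "no such comment"}), 404
        # No review step: one flag is enough to remove the content immediately.
        comment["text"] = ("A user has taken objection to this content, "
                           "so it has been removed.")
        comment["removed"] = True
        removal_log.append({
            "comment_id": comment_id,
            "removed_at": datetime.now(timezone.utc).isoformat(),
        })
        return jsonify({"status": "removed"}), 200

    @app.get("/removals")
    def list_removals():
        # The "extra points" public list of removals.
        return jsonify(removal_log)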

  27. Zog The Undeniable

    I part-run a small non-commercial forum. We did the risk assessment as required but ANYTHING with private messaging is high risk for grooming or harassment, if you're being honest, and we can't prove people's ages so we don't know how many minors we really have. It looks as if we'll have to convert it to static content on 16 March, upsetting a few thousand people.

    Faecebook and Xitter will be untouchable, as will the dark web. We're the low-hanging fruit: with easily-identified UK ownership and no money, we are terribly exposed to someone joining, posting grot, and ringing the alarm bell.

    1. flayman

      "...we are terribly exposed to someone joining, posting grot, and ringing the alarm bell."

      But that doesn't mean you're not compliant. It's how you deal with that situation that matters, and the risk assessment is to ensure that you've thought about it and come up with a process for handling these things, which is likely the same as the one you already have. The risk assessment will either be accepted or you'll be asked to do it again. If it's accepted, then you've done your bit for the year. Follow the process you've specified and you're fine. If you receive an enforcement notice, comply with it. It's understood and accepted that you can't prevent all instances of these things happening. You're just expected to deal with them quickly and efficiently.
