Installing equipment?
Yeah, there's no way that won't be incredibly broadly interpreted.
Just imagine a year in jail for building a general purpose PC for a friend who then uses it to create a porno deepfake.
The UK government has promised to make the creation and sharing of sexually explicit deepfake images a criminal offence. It said the growth of artificially created but realistic images was alarming and caused devastating harm to victims, particularly the women and girls who are often the target. The government promises to …
I'm pretty sure the installing equipment relates to someone putting in a spy camera. While there would be issues for CCTV installers, I expect the purpose is to nab people who set up cameras in locations like changing rooms but are caught before anything personal is recorded.
At the same time, overly broad laws are often abused to the benefit of the rich and powerful.
Politicians like to be seen doing things - so passing a new law that expands on or replaces an existing one looks like they are doing something. It's a shame they don't seem to realise that passing the law is only phase one. Phase two is detecting violations. Phase three, judging by the demonstrated actions of all political parties, is to ignore it. Phase four, if it ever gets that far, is to deny anyone was ever caught while pointing at how good they were at passing a law against it.
It would make photojournalism far more difficult and could be used to prevent reporting of politicians' activities that they would like to keep out of the public eye.
For example, you can't use that picture of me accepting a brown envelope because it infringes my copyright. Or you can't show that film of some event because it has a person clearly visible in the background and you have no way of getting their permission to use their copyright.
>For example, you can't use that picture of me accepting a brown envelope because it infringes my copyright. Or you can't show that film of some event because it has a person clearly visible in the background and you have no way of getting their permission to use their copyright.
We are already at the point where high-quality videos of you accepting a brown envelope can be made in minutes. You might never have even seen the person giving you the envelope, have never been to the building, never seen any money and have 100s of witnesses showing you somewhere else at that time. But videos don't lie... do they?
If it has the effect of reducing the number of pointless images displayed on websites (just maybe I'm thinking of the BBC) which add nothing to the actual content of the page, it has my vote, and I'd extend it to inanimate objects as well.
Please note: A picture is not worth a thousand words every bloody time
Although it should be implemented in a different way, rather than by corrupting copyright.
- CCTV should only be permitted to face inside a building and only be stored on an offline computer, with an overwrite after a few days (ensuring that such footage is never viewed or exploited without a good reason) - alas there are endless spy-cameras absolutely everywhere that upload video to some remote server.
- I can't really think of any quality photojournalism done in the past several years, but maybe that should still be permitted.
- If you are doxing yourself in a holiday photo with a demon rectangle, at least have basic human decency and don't dox other people.
The route you suggested will lead to an immense amount of civil litigation: profitable for lawyers and for some corporate interests. Copyright is a legal principle built upon sand. The digital era is rapidly rendering copyright unenforceable. Therefore, nothing is to be gained by adding to the extant tangle of legislation and case law.
"Is there any reason copyright laws cannot be toughened up to tackle deepfakes. If everyone owns their own likeness ..."
'Owning your own likeness' is not covered by copyright. As its basic principle, it applies only to created works and is vested in the creator. So you can't have copyright in your likeness -- <irony>at a pinch it might be extended so that your mum might own copyright in your likeness but that's seriously stretching a point</irony>.
This could be a true test of the definition of "intelligence" in AI. If I use a non-intelligent tool, like a hammer, to do something criminal then it's my fault, but the hammer can't really be punished. However, if an intelligent person knowingly assists me then they are an accessory. If AI does what it says on the tin then surely it is complicit in any crime and therefore should be punished, shouldn't it? Take it offline for six months every time it does something against the law and see if it learns anything from it.
The problem with legislating on matters 'moral', especially those with sexual overtones, is MPs, including those who generally confine their activities to voting on clauses, all wanting to be 'seen' by their constituents as showing deep concern and contributing to robust legislation.
Offences introduced during the Blair era to target sexual molestation of minors gave considerable attention to images: their 'making', distribution, and possession. Good intentions left a pig's ear of legislation; this resulting from silly attempts to cross every conceivable 't' and dot every 'i', these arising from almost all MPs putting in their pennyworth of moral outrage. It fell to the police and the courts to make sense of it. Following the legislation's enactment, there was a period of confusion. For example, some museums were visited by police officers following complaints by members of the public over paintings of long standing which depicted childhood nudity; some curators of galleries called in the police for advice. Much of the fuss emanated from officials (police and others) fearing to exercise good judgement (aka 'common sense') lest they be condemned for complacency.
A major failing in the legislation was a lack of guidance to the police about prioritising responses to complaints, and about urgency of investigation and court proceedings; ditto for prosecution services. It was recognised by some that simple artistic nudity had differing import to erotic depictions and those portraying sexual activity; indeed, much of the former was entirely innocent. Yet doubt arose over whether possession of a copy of photographs taken of the child Alice Liddell by Charles Dodgson (pen name Lewis Carroll), and circulating on the Internet, was a crime potentially meriting imprisonment. During such 'debate' as took place, it was prudent for public figures to make themselves out to be as prudish as Victorians were wrongly alleged to have been.
Of course, sensible enforcers of law would/should draw a distinction in terms of priority for investigation of images concerning children likely to be currently subjected to exploitation, and those from the past for which wrongs cannot be remedied.
In the context of 'deep fakes', legislators could avert nonsense arising from lack of clear guidance to the police and judiciary. The police admit they cannot investigate all crimes reported to them. For instance, some burglaries, and even assaults, are only noted or receive perfunctory investigation, whereas, people merely 'offending' other people through their choice of words are hounded.
Should the police in all matters work under the principle of 'first come, first served'? Clearly nonsense. Thus, 'deep fakes' must be categorised by their intended, or likely, impact. 'Celebrities' must accept 'deep fakes' as an occupational hazard to be brushed off. Ordinary people subjected to harassment should be the focus of attention.
I think this is not well thought through. The existence of deepfakes is conducive to plausible deniability. Everyone will be able to dismiss any compromising image as false, as the product of deepfake AI. In effect the word "compromising" will lose meaning, and any potential inflammatory value that deepfakes may have at the moment will be lost due to the ease and prevalence with which such imagery can be generated. Furthermore, affected individuals will be in a position to question the validity of real images by flagging them as deepfakes. The whole concept of deepfakes will lose its sensationalism and value. Journalists will have to focus on undertaking real investigative work instead of relying on photographic evidence of "X accepting the potentially compromising envelope". Due to the ease with which garbage content can be generated, the public will place more value on robust outputs, and the whole business of deepfake generation will reduce its own value to that of anonymous slander posted on obscure discussion boards.