Clearview AI fined millions in the UK: No 'lawful reason' to collect Brits' images

The UK's data protection body today made good on its threat to fine controversial facial recognition company Clearview AI, ordering it to stop scraping the personal data of residents from the internet, delete what it already has, and pay a £7.5 million ($9.43 million) fine. The company, which is headquartered in New York, …

  1. Throgmorton Horatio III

    Please fine them until the pips squeak.

    Rightly or wrongly, people knowingly and willingly give their photos to Google, Facebook etc, but not to this bunch.

    1. Sykowasp

      Re: Please fine them until the pips squeak.

      And keep on doing it until the data is removed.

      The problem for Clearview is that this will probably mean re-creating their databases from scratch from the new input datasets (with the offending images removed), which is a huge amount of processing work, assuming they add new images iteratively over time.

      And yes, people giving personal images to Google/Facebook does not mean they ever consented to this use, which is very different from the reasons people upload to social media.

      Which is also why we shouldn't upload to social media unless we have locked things down massively, something the social media businesses unfortunately don't want to make easy.

      1. Barrie Shepherd

        Re: Please fine them until the pips squeak.

        How will the ICO be able to audit that the images have been removed? I doubt Clearview will remove them; it will just move them to some offline backup media, to be reinstated later.

        The ICO should make it clear to UK organisations that they must not use Clearview services either directly or indirectly through agents or subsidiaries in Clearview friendly countries - and that should include the UK Police and Security Services.

        As for "Clearview AI is not subject to the ICO's jurisdiction...." - if the US is able to use its laws to chase people across our borders, then we should be able to reciprocate.

        I'd be interested to know what the US's 3 Alpha organisations do with the data - and I'd not be surprised if they even funded, in some way, Clearview's development, as they would not be allowed to do it themselves directly.

        1. HildyJ Silver badge
          Trollface

          Re: Please fine them until the pips squeak.

          "How will the ICO be able to audit that the images have been removed?"

          Subcontract it to a dark web data wiper.

        2. tip pc Silver badge

          Re: Please fine them until the pips squeak.

          Once they've passed an image through their AI to train it, they likely don't need it again.

          The real issue is that they trained their AI using your images; to comply, they would have to untrain their AI on those same images, which is likely impossible.

          The only way to make this work will be to delete the trained AI, delete the wrongfully gained images, and then start the AI training process all over again.

          That will set them back years.
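The point above can be made concrete with a toy model. This is a hedged sketch, not Clearview's actual pipeline; the names (`features`, `build_template`) are invented for illustration. Even in the simplest imaginable "model", a per-person template averaged from image features, deleting the raw files leaves the derived template untouched; only rebuilding from the cleaned dataset removes an image's influence. In a real deep network, where every image nudges millions of shared weights, there is no per-image contribution to subtract at all, hence the retrain-from-scratch problem.

```python
# Illustrative sketch only (hypothetical names, not Clearview's real system):
# a per-person "template" built by averaging image feature vectors.

def features(image: str) -> list[float]:
    """Stand-in for a feature extractor; deterministic toy values."""
    return [float(ord(c) % 7) for c in image[:4]]

def build_template(images: list[str]) -> list[float]:
    """Average the feature vectors of all training images."""
    vecs = [features(img) for img in images]
    n = len(vecs)
    return [sum(col) / n for col in zip(*vecs)]

training_set = ["a.jpg", "b.jpg", "c.jpg"]
template = build_template(training_set)

# Deleting the raw files does NOT change the already-derived template.
# Actual compliance means rebuilding from the cleaned dataset:
cleaned = [img for img in training_set if img != "b.jpg"]
retrained = build_template(cleaned)
print(template != retrained)  # True: only retraining removes b.jpg's influence
```

Even here, honouring a deletion request means recomputing from what remains; scaling that to billions of images is the "set them back years" cost.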

  2. cawfee

    As good as a win as this is...

    I can see Clearview trying to resist this with a "how are we supposed to know who's British?"

    1. devin3782
      Coat

      Re: As good as a win as this is...

      They'll know, they've been training the AI with our data after all

      1. Anonymous Coward
        Anonymous Coward

        Re: As good as a win as this is...

        "The question of whether computers can think is like the question of whether submarines can swim." - Edsger W. Dijkstra

    2. Filippo Silver badge

      Re: As good as a win as this is...

      Yeah, I don't think that's going to fly. It's their problem. If they can't figure out how to operate their business without breaking the law, then they don't get to operate their business.

    3. Flocke Kroes Silver badge

      Re: How are we supposed to know who's British?

      Don't need to. Just delete the data for everyone you cannot prove is some other nationality.

      The same goes for Google. They do not need to know who I am to delete what they know about me. They can simply delete records for all people they cannot identify. These days, that list must consist almost exclusively of people who do not want to be tracked by Google.

      1. katrinab Silver badge
        Megaphone

        Re: How are we supposed to know who's British?

        It is not about nationality, though. It is about whether the photo was uploaded to a Facebook [etc] account registered while in Britain, taken in Britain, or uploaded while in Britain.

        1. worldtraveller2

          Re: How are we supposed to know who's British?

          Interesting thought. I opened one of my social media accounts, stating country of residence as UK, whilst working abroad. However, future communications were based on the language of the country I was in when I opened the account. After much "shouting" at the company concerned, I believe they have at least altered my preferences (if not the underlying record, which I can't see).

    4. Danny 2 Silver badge

      Re: As good as a win as this is...

      "how are we supposed to know who's British?"

      The images will show our famous stiff upper lips. (Admittedly used to hide our awful teeth).

  3. Filippo Silver badge

    Just because I make a piece of data freely available on the Internet, that does not make it public domain. It can still be legally protected in all kinds of ways, even if it's not protected by technical means.

    Companies that gather data from the Internet ought to be made painfully aware of this fact: you still need a license to use any data that's not yours, and if it's not clear whether there is a license to be had, that doesn't mean the piece of data is up for grabs; it means that if you grab it you're in an ambiguous situation at best, and could very well land in court.

    1. Danny 2 Silver badge

      @filippo

      I cut and pasted your post down below under my name, albeit with a Joke Ahead icon.

      I wasn't mocking your argument or sentiment. My serious point was that the ICO may not be toothless, but it is underfunded and outclassed, and therefore impotent. It's akin to the Serious Fraud Office - plenty of bite on the statute books, but a kitten in reality.

      There is no effective mechanism to make Clearview AI give up our data, the £7.5 million fine is essentially a windfall tax or a slap on the wrist. I'll change my opinion when company directors are extradited here and face prison time, or lists of all their customers are seized and published.

      Once someone has your data, they have your data and you hope in vain they don't pass it along. My mum is on a 'suckers list' and gets scam phone calls all the time.

      And it's not just private companies. I've been arrested a fair few times without being charged (as a peace protestor mostly), and on the initial arrests they'd swab me for DNA. I asked why they stopped doing that and they said because they already had it. I said they weren't allowed to keep it if I wasn't charged, after an ICO ruling, and they replied, "No, that is just in England." - it is not just in England, same rules apply! Same rules are ignored.

  4. Gordon 10 Silver badge
    FAIL

    And the enforceability of this fine is done how?

    Seems a pointless headline grabber by the ICO - I'm assuming that by design this company has no UK/EU assets or presence.

    So the ICO is going to make them stop HOW exactly?

    1. localzuk Silver badge

      Re: And the enforceability of this fine is done how?

      If they ever want to do business in the UK, they'll care.

      1. Zippy´s Sausage Factory

        Re: And the enforceability of this fine is done how?

        In other words, right now they can't legally sell to ANY business in the UK, or to any business with a UK subsidiary that would use their technology, without automatically becoming liable to pay the fine.

        1. Anonymous Coward
          Anonymous Coward

          Re: And the enforceability of this fine is done how?

          I'm sure *they* won't. The entity will be Clearview (British Virgin Islands) Holdings Ltd, who use the Clearview brand under licence from their parent company and act as the middleman between them and UK customers.

      2. Gordon 10 Silver badge

        Re: And the enforceability of this fine is done how?

        Since what they are doing - collecting images without consent - is pretty much illegal in the UK, I doubt they'll care. They obviously don't give a stuff about operating in the EU or UK, or they would never have gone down this route. BUT EU/UK citizens' data will still be used to populate the training sets for their ML models.

        Especially if you consider what they'd actually have to do to comply - which is to hard delete both the photos AND the ML models that were generated from them.

        Never

        Gonna

        Happen

        So I say again: what actual enforcement actions (that will work!) are open to the ICO? The only one I can think of is getting them added to a sanctions list - but that seems a tall ask for the ICO to achieve, and I'm not sure the framework is in place for it. I think they'd have to write to one of the junior ministers in charge of the UK Treasury.

        You downvoters are a bit naive on this one I think.

    2. Barrie Shepherd

      Re: And the enforceability of this fine is done how?

      Extradite the CEO to the UK if the fine is not paid and proof is not given that our images have been destroyed, not just deleted (i.e. moved to an offline backup) - the US does it, so what's good for the goose etc. etc.

      1. Anonymous Coward
        Anonymous Coward

        Re: And the enforceability of this fine is done how?

        I like this idea better. The UK still has more allies around the world than Clearview has friends. Unless other parts of the UK gov are shielding them, Clearview AI's employees, backers and assets are fair game.

        Not that it's likely, but in the US we have draconian civil asset forfeiture laws that can be damn inconvenient.

        Then again Thiel sells the same type of toys, so maybe he wants to pull some strings and cut out the competition stateside...

  5. Anonymous Coward
    Anonymous Coward

    NSA laughs....

    ...and adds data source to existing library

  6. VoiceOfTruth

    I'm in two minds about this, because it ignores the elephant in the room

    -> failing to have a lawful reason for collecting it

    Why is it lawful for Google to collect this and Clearview not? It seems that the ICO is ignoring the elephant in the room. I have never had much time or respect for the ICO. It is a chair-polishing unit of government.

    1. Pink Duck

      Re: I'm in two minds about this, because it ignores the elephant in the room

      The distinction is between indexing for search, and a capturing/saving process that derives a biometric class of identifier. Google could still do the latter, of course - and if they do, it's just not public knowledge yet.

      1. VoiceOfTruth

        Re: I'm in two minds about this, because it ignores the elephant in the room

        Given Google's history I would be surprised if they weren't doing this.

    2. localzuk Silver badge

      Re: I'm in two minds about this, because it ignores the elephant in the room

      Consent. If I put a photo on Google's site, I give consent under Google's terms for its use there. I do not give carte blanche to every random company to take it and use it how they see fit.

      And, if Google were to sell that data, they'd still need explicit, informed consent for that particular use too.

      1. Andy The Hat Silver badge

        Re: I'm in two minds about this, because it ignores the elephant in the room

        Unless, usually, you consented to allow your data to be shared "with trusted and carefully selected third parties".

        A bit like signing up to virtually any UK government system ... to paraphrase: "we will share your data with anyone from any security organisation, local council, pet rescue organisation, private parking mafia and anyone else who knows and will pay us for your data ..."

        1. Anonymous Coward
          Anonymous Coward

          Re: I'm in two minds about this, because it ignores the elephant in the room

          That would fail the informed consent test, because the list of partners is not enumerated...

          1. localzuk Silver badge

            Re: I'm in two minds about this, because it ignores the elephant in the room

            Precisely this. When sharing said data, especially if it is special category data, they would need to be explicit as to who they were sharing it with, and for what purpose it would be processed. A catch all "we share with others" is not enough under GDPR.

        2. VoiceOfTruth

          Re: I'm in two minds about this, because it ignores the elephant in the room

          This is all too true. There is always a 'need' word in there too - if it is necessary we will share your data. Necessary for whom? Not for me.

        3. iron Silver badge

          Re: I'm in two minds about this, because it ignores the elephant in the room

          You forgot they will sell it to Palantir and its evil boss Peter Thiel.

        4. katrinab Silver badge
          Megaphone

          Re: I'm in two minds about this, because it ignores the elephant in the room

          Maybe, but Clearview wasn't one of those carefully selected third parties.

      2. VoiceOfTruth

        Re: I'm in two minds about this, because it ignores the elephant in the room

        Er hello? Google scrapes the web. That is what it does.

      3. Anonymous Coward
        Anonymous Coward

        Re: I'm in two minds about this, because it ignores the elephant in the room

        Consent. If I put a photo on public internet, I give consent to everybody and his cat... erm, dog, to see it and do whatever he wants with it.

        If you don't like it, don't make the photo available on public internet. Use a private cloud under your control.

        1. Missing Semicolon Silver badge

          Re: I'm in two minds about this, because it ignores the elephant in the room

          Rubbish. Otherwise you could create a news site by just scraping and collating other news sites, for example.

          The principle that content on the web can only be consumed in accordance with the license, and not freely copied and republished is well established.

          See for example here.

          1. Anonymous Coward
            Anonymous Coward

            Re: I'm in two minds about this, because it ignores the elephant in the room

            WOOSH!

            The principle that content on the web can only be consumed in accordance with the license, and not freely copied and republished is well established.

            What did I just say? Well, not in so many fancy words, but the idea is the same, right?

        2. fwthinks
          Unhappy

          Re: I'm in two minds about this, because it ignores the elephant in the room

          The flaw with this argument is that it assumes all photos were knowingly uploaded to the internet and only contain a picture of the person who uploaded them.

          There will be lots of people who do not even know their photo is on the internet. Ever been in a group photo taken by somebody else? Did the person who uploaded the photo collect consent from all individuals in the photo?

          1. chivo243 Silver badge
            Stop

            Re: I'm in two minds about this, because it ignores the elephant in the room

            NO, no they did not - and that's before you count the institutions, including schools and churches, not to mention your buddy's wife who lives and breathes social media and posts every photo taken at a gathering...

    3. Filippo Silver badge

      Re: I'm in two minds about this, because it ignores the elephant in the room

      It's a difficult question. As a society, we don't really yet have a universally-accepted consensus on how privacy works on the Internet. And legislation usually trails societal consensus, for good reasons.

      In this case, the difference is that while it's fairly obvious that nobody wants other people to be able to run face recognition on them without their consent, it's not obvious at all what exactly people want Google to index - even though a reverse GIS looks a lot like what Clearview is doing. I'm fairly sure that everyone at the ICO is well aware that this distinction is rather nebulous, but what can they do about that? Come up with arbitrary rules that would likely discontent everyone?

      Until we, the everyday people of the Internet, get into some kind of agreement on exactly what's acceptable and what isn't, I wouldn't blame governments too much for failing to evict elephants from rooms.

    4. jmch Silver badge

      Re: I'm in two minds about this, because it ignores the elephant in the room

      "Why is it lawful for Google to collect this and Clearview not?"

      The simple answer is that it's not lawful for Google to collect this type of data either.

      1. VoiceOfTruth

        Re: I'm in two minds about this, because it ignores the elephant in the room

        I agree. Yet the chair polishers don't do anything about it.

      2. Alumoi Silver badge

        Re: I'm in two minds about this, because it ignores the elephant in the room

        But they're doing it anyway. The law? We've heard of it and we'd gladly pay the fine. It's the cost of doing business.

  7. Rol

    I am not an idiot. Sadly some of my friends are

    I have never uploaded my image to the internet, but I have been tagged in photos by friends and family. They are all very aware now that they were out of order for doing that. The damage was done, though, and cannot be undone.

    It's a bit of a stretch for Clearview to assume that all of the images they gouged from the internet had an implicit agreement behind them, as many people like me had no say in whether they could be uploaded. They assume too much.

    And if they had approached the custodians of the original images offering money for them, I guess £7 million would be a snip. More like £7 billion for that kind of data.

    1. VoiceOfTruth

      Re: I am not an idiot. Sadly some of my friends are

      -> I have never uploaded my image to the internet, but I have been tagged in photos by friends and family.

      The point you are raising here is crucial, and it doesn't even occur to many people. Your name and your photo are now in Clearview's database (and in other databases you are unaware of). The surveillance state is built on this. Just because it is a private company that has this data, do not think that the state does not have access to it. At the drop of a hat, if the state comes round to Clearview and says 'can you match this photo?', Clearview is not going to say 'we do not do that sort of thing'. They are now party to the surveillance state.

      1. Jellied Eel Silver badge

        Re: I am not an idiot. Sadly some of my friends are

        Yup.

        I had a FaceMelta account with a pseudonym. A friend published a picture of me and tagged it with both my real name, and my pseudonym, enabling both to be linked. Assuming Clearview stole that, it could now find me in random images that just happen to contain me. I have no idea if those images exist, so cannot have given consent.

        It's an interesting legal challenge. I remember, years ago, a German fellow was at a motor race with a lady who was not his wife. His wife saw the image, wasn't impressed, and filed for divorce. The man sued the broadcaster, and won for infringement of his privacy. From memory, the case turned on what constitutes a reasonable expectation of privacy at a public, or semi-public, event.

        But this is a good step by the ICO. Now it just has to repeat the process against every data slurper and aggregator that thinks privacy is less important than profit. Personally I think there's a couple of quick fixes. Data controllers are legally responsible for data accuracy, so should automatically send their records to data subjects to check. Obviously this would cost them, and reveal the amount of data held. All executives at the slurpers should also be required to maintain public profiles detailing every class of data held and processed. If they think it's ok to spy on our browsing habits, they should have no problem with publishing their own.

    2. Roj Blake

      Re: I am not an idiot. Sadly some of my friends are

      Yep, it's much like Whatsapp taking my number from other people's Contacts with their permission but not mine.

      1. VoiceOfTruth

        Re: I am not an idiot. Sadly some of my friends are

        Exactly.

      2. chivo243 Silver badge

        Re: I am not an idiot. Sadly some of my friends are

        Or LinkedIn harvesting all contacts from address books, contacts etc. I thought it odd that a colleague I barely knew wanted to connect on LI; I quizzed him about it, and it seems he forgot to untick the box about collecting contact info. He was a bit embarrassed, as I wasn't the first to quiz him on it...

    3. hammarbtyp
      Black Helicopters

      No - I'm sparticus

      "I have never uploaded my image to the internet, but I have been tagged in photo's by friends and family. They are all very aware now that they were out of order for doing that. The damage was done though and cannot be undone."

      That was your 1st mistake. What you should have done was get a totally random picture of a person (preferably deceased), then upload it to as many sites as possible attached to your name. You could even ask your nearest and dearest to add some photoshopped images of themselves with your avatar.

      It may not be possible to beat the man, but you can at least confuse the hell out of him.

      1. Anonymous Coward
        Anonymous Coward

        even in jest

        Fun for a while, but then you get flagged because of the false ID.

        Spam attacks are no panacea; there are too many other places for them to get both a positive ID and an alternate picture. You could expect to get turned away at the border for suspected visa/passport fraud, or run into trouble at the DMV etc., or end up linked to your cover ID's criminal record.

        An alternate and still totally not serious method would be to go full Juggalo for all of your ID and public appearances. If you use temp tats and henna they can't make you wash it off and will have to take your picture. Be prepared to be harassed at every security checkpoint in every country till you die though, even if it fades after a week or two.

        Or just embrace your new identity and live the ICP life. They will appreciate your commitment to sticking it to the man.

        Too bad sticking it to the man usually involves sticking yourself in the eye first.

      2. dajames Silver badge

        Re: No - I'm sparticus

        What you should have done was get a totally random picture of a person (preferably deceased), then upload it to as many sites as possible attached to your name.

        Better still, if you want to poison the well, upload a selection of different images with your name attached, and also upload your own image several times with different names attached. Not too many, in either case, or your ploy will become too easily apparent.

        If your data are to be monetized, all you can do is act to reduce their value.

  8. IGotOut Silver badge

    Public images.

    No problem.

    Let's get a group to follow them, taking photos 24/7 when they're in public, and post saying where they were, who they were with, what they ate, what time they went to the toilet, etc.

    No problem with that, it's public after all.

    1. Cederic Silver badge

      Re: Public images.

      That sounds like a clear case of harassment.

      Photographing people in public without their permission is allowed in the UK and the US. Following them around continually photographing them is materially a different thing.

      1. msknight

        Re: Public images.

        There is, I believe, a caveat to that, which is whether or not you are using the picture for profit. If yes, then you should have the permission of the people in the picture, whether or not you are standing on public ground when you take the image.

        1. msknight

          Re: Public images.

          I don't understand the thumbs-down, but it's a free world I guess...

          "DO I NEED PERMISSION TO TAKE PICTURES FOR COMMERCIAL PURPOSES?

          In order to sell your photos to a media library or to use your photos to promote or sell products or services, you will often be required to obtain a signed model release form from any identifiable person featuring in the image. Although it is not illegal in the UK to take an identifiable photo of a person in a public place, media libraries and agencies often require you to have had permission to take the photos regardless."

          https://www.pauldavidsmith.co.uk/photographers-rights/

          1. MarkTriumphant

            Re: Public images.

            None of which says that you can't do it. What it says is that some companies need signed releases. Not all do, though.

        2. iron Silver badge

          Re: Public images.

          No, you don't. Otherwise there would be no pictures of celebs doing things they don't want pictured in the newspapers.

          Picture of a footballer coming out of a restaurant with his mistress, sorry no permission. Picture of him beating up some bloke on the street because he looked at him funny, sorry no permission. ETC.

          1. msknight

            Re: Public images.

            Yes, you do. Journalism has an exemption under the GDPR.

            https://bookdown.org/fede_caruso/bookdown/the-journalistic-exemption-in-the-gdpr.html

      2. Jellied Eel Silver badge

        Re: Public images.

        That sounds like a clear case of harassment.

        Why would they feel harassed? I think we should crowd fund some private investigators to do this, and publish the results in near-realtime. Sure, it might be considered stalking, but it's not really any different to what these scumbags do. If they're concerned about their privacy being violated, stop violating ours.

  9. Roj Blake

    New York

    What are the odds that they won't pay the fine or do anything to rectify this since they're headquartered in NY?

    1. Michael B.

      Re: New York

      That pretty much means they won't be able to do any business in the UK from now on. They could hide behind that barrier but it would mean that they would have to stay behind that barrier and I'm sure that their investors won't be too happy with a limited marketplace.

  10. John Brown (no body) Silver badge

    What are they being ordered to delete?

    Has anyone read the judgment? Are they being asked to delete the photos/images or are they also being ordered to delete the derived hashes etc that could still be used to later identify people in new photos?
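The distinction the question draws can be illustrated with a toy sketch (all names here, `face_hash`, `database`, are hypothetical; this is not the judgment's wording or Clearview's actual system). If an order covered only the source photos, any retained derived identifiers would still match new photos of the same person:

```python
# Hedged illustration: deleting scraped photos while keeping derived
# "hashes" leaves the identification capability fully intact.

def face_hash(photo: str) -> int:
    """Toy stand-in for a biometric feature hash; same face -> same hash.
    Here we pretend the part of the filename before '_' is the face."""
    person = photo.split("_")[0]
    return sum(ord(c) for c in person)

database = {}      # name -> derived hash (the contentious artefact)
photo_store = {}   # name -> raw scraped image

# Ingest a scraped photo: both the photo AND a derived hash are stored.
database["alice"] = face_hash("alice_2019.jpg")
photo_store["alice"] = "alice_2019.jpg"

# An order that only covers the photos:
del photo_store["alice"]

# A brand-new photo of the same person still matches the retained hash:
probe = face_hash("alice_2022.jpg")
matches = [name for name, h in database.items() if h == probe]
print(matches)  # ['alice'] - deleting the photos changed nothing
```

Which is why the scope of "delete" in the judgment matters: photos alone, or the derived biometric data too.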

  11. Danny 2 Silver badge
    Joke

    Just because I make a piece of data freely available on the Internet, that does not make it public domain. It can still be legally protected in all kinds of ways, even if it's not protected by technical means.

    Companies that gather data from the Internet ought to be made painfully aware of this fact: you still need a license to use any data that's not yours, and if it's not clear whether there is a license to be had, that doesn't mean the piece of data is up for grabs; it means that if you grab it you're in an ambiguous situation at best, and could very well land in court.

  12. Anonymous Coward
    Anonymous Coward

    failing to have a lawful reason for collecting it

    The article says one of the reasons for the fine is "failing to have a lawful reason for collecting it". ICO's own article actually says "failing to have a lawful reason for collecting people’s information".

    It is not 100% clear whether this is a reference to: (a) failing to have any valid/defined legitimate purpose(s) (GDPR Article 5(1)(b)), or (b) failing to have any valid defined lawful bases/conditions (GDPR Articles 6(1) and 9(2)).

    I assume the ICO is referring to GDPR Article 5(1)(b) "legitimate purpose(s)".

    You'd expect the ICO to use the correct legal terminology in their press release so as to be clear as to which of these they were referring to. Then again it is the ICO...

  13. Steve Davies 3 Silver badge
    Mushroom

    Only $7.5M?

    It should be $7.5B. Then they might stop scraping images of all and sundry without a thought to the damage they are doing.

    suck on this--> [see icon]

  14. jollyboyspecial

    Contradiction

    "Clearview AI is not subject to the ICO's jurisdiction, and Clearview AI does no business in the UK at this time."

    "My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts."

    Those two statements contradict each other

    1. Ken Hagan Gold badge

      Re: Contradiction

      Assuming that they gave their assistance for free, that doesn't constitute "doing business" in the UK. Then again, even if the statement is true this judgement is still a problem if they ever want to start (*) doing business in the UK.

      (* Or restart. I note the phrase "at this time" and wonder if ClearView have done business in the UK in the past.)

    2. John Brown (no body) Silver badge

      Re: Contradiction

      I note the use of "think of the children" in their statement. And old people. So they don't care about your average run-of-the-mill "heinous" crimes?

  15. Richard 12 Silver badge

    Double it.

    Actually, no, take it straight to the maximum permitted under the law.

    Then charge the directors for contempt of court - that response is not "we're appealing", it's straight-up contempt.

    Note that it's also illegal everywhere within the EU, and as the UK is no longer a member, an EU member state can bring a separate GDPR action and fine them the maximum too...

  16. Anonymous Coward
    Anonymous Coward

    You know they know…

    …that they've been caught pants down when it's their lawyers, not the regular PR drone, who supply the press releases.

    Given their poor understanding of issues of scope, they need to find better lawyers too.

  17. Sparkus

    company fines are ineffective

    Fines and enforcement actions need to be directed against executives, sized so as to be larger than any possible 'insurance' those individuals may have.

  18. xyz123 Bronze badge

    Hoan Ton-That is a fascist, on record as saying that future governments could use Clearview's face recognition to "round up undesirables".

    Clearview is a company that should be outlawed in the UK, and pressure should be applied to outlaw it everywhere, as its stated goals are basically to round up various ethnic groups etc. into death camps.


    Just before the IRS controversy, ID.me said it uses one-to-one face comparisons. "Our one-to-one face match is comparable to taking a selfie to unlock a smartphone. ID.me does not use one-to-many facial recognition, which is more complex and problematic. Further, privacy is core to our mission and we do not sell the personal information of our users," it said in January.

    Continue reading
  • Research finds data poisoning can't defeat facial recognition
    Someone can just code an antidote and you're back to square one

    If there was ever a reason to think data poisoning could fool facial-recognition software, a recently published paper showed that reasoning is bunk.

    Data poisoning software alters images by manipulating individual pixels to trick machine-learning systems. These changes are invisible to the naked eye, but if effective they make the tweaked pictures useless to facial-recognition tools – whatever is in the image can't be recognized. This could be useful for photos uploaded to the web, for example, to avoid recognition. It turns out, this code may not be that effective.

    Researchers at Stanford University, Oregon State University, and Google teamed up for a paper in which they single out two particular reasons why data poisoning won't keep people safe. First, the applications written to "poison" photographs are typically freely available online and can be studied to find ways to defeat them. Second, there's no reason to assume a poisoned photo will be effective against future recognition models.

    Continue reading
  • Ukraine uses Clearview AI facial-recognition technology
    Controversial search engine being used to identify dead and Russian operatives

    The Ukrainian government is using facial recognition technology from startup Clearview AI to help them identify the dead, reveal Russian assailants, and combat misinformation from the Russian government and its allies.

    Reuters reported yesterday that the country's Ministry of Defense began using Clearview's search engine for faces over the weekend.

    The vendor offered free access to the search engine, which Ukraine is using for such tasks as identifying people of interest at checkpoints and identifying people killed during Russia's invasion, the news organization wrote, citing Lee Wolosky, who currently advises Clearview and formerly worked as a US diplomat under Presidents Barack Obama and Joe Biden.

    Continue reading
  • 1,000-plus AI-generated LinkedIn faces uncovered
    More than 70 businesses created fake profiles to close sales

    Two Stanford researchers have fallen down a LinkedIn rabbit hole, finding over 1,000 fake profiles using AI-generated faces at the bottom.

    Renée DiResta and Josh Goldstein from the Stanford Internet Observatory made the discovery after DiResta was messaged by a profile reported to belong to a "Keenan Ramsey". It looked like a normal software sales pitch at first glance, but upon further investigation, it became apparent that Ramsey was an entirely fictitious person.

    While the picture appeared to be a standard corporate headshot, it also included multiple red flags that point to it being an AI-generated face like those generated by websites like This Person Does Not Exist. DiResta was specifically tipped off by the alignment of Ramsey's eyes (the dead center of the photo), her earrings (she was only wearing one) and her hair, several bits of which blurred into the background. 

    Continue reading
  • Face Off: IRS kills plan to verify taxpayers with facial recognition database
    Uncle Sam takes security, privacy concerns seriously, it says here

    Updated The Internal Revenue Service has abandoned its plan to verify the identities of US taxpayers using a private contractor's facial recognition technology after both Democrats and Republicans actively opposed the deal.

    US Senator Ron Wyden (D-OR) on Monday said Treasury Department officials informed his office that the agency has decided to move away from using the private facial recognition service ID.me to verify IRS.gov accounts.

    "The Treasury Department has made the smart decision to direct the IRS to transition away from using the controversial ID.me verification service, as I requested earlier today," Wyden said in a statement. "I understand the transition process may take time, but I appreciate that the administration recognizes that privacy and security are not mutually exclusive and no one should be forced to submit to facial recognition to access critical government services."

    Continue reading
  • IRS doesn't completely scrap facial recognition, just makes it optional
    But hey, new rules on deleting your selfies

    America's Internal Revenue Service has confirmed taxpayers will not be forced to use facial recognition to verify their identity. The agency also set out rules for which images will be deleted.

    Folks setting up an online IRS account will be given the choice of providing biometric data to an automated system, or speaking with a human agent in a video call, to authenticate. Those who are comfortable with facial recognition tech can upload a copy of their photo ID and then be authenticated by their selfie, and those who aren't can talk to someone to prove they are who they say they are. An online IRS account can be used to view tax documents and the status of payments among other things.

    "Taxpayers will have the option of verifying their identity during a live, virtual interview with agents; no biometric data – including facial recognition – will be required if taxpayers choose to authenticate their identity through a virtual interview," the IRS said in a statement on Monday.

    Continue reading
  • Sri Lanka to adopt India’s Aadhaar digital identity scheme
    Biometric IDs for all, cross-border interoperability not on the table

    Sri Lanka has decided to adopt a national digital identity framework based on biometric data and will ask India if it can implement that nation’s Aadhaar scheme.

    The island nation had previous indicated it would work with the Modular Open Source Identity Platform (MOSIP), an organisation based in India that offers tools governments can use to create and manage digital identities.

    But a list of Cabinet decisions published on Tuesday, Sri Lanka’s government announced its intention to ask India for a grant of its scheme, which has been widely interpreted as meaning India share Aadhaar technology.

    Continue reading
  • UK police lack framework for adopting new tech like AI and face recognition, Lords told
    Governance structure is 'a bush, not a tree' – whatever that means

    UK police forces have no overarching rules for introducing controversial technologies like AI and facial recognition, the House of Lords has heard.

    Baroness Shackleton of the Lords' Justice and Home Affairs Committee said the group had found 30 organisations with some role in determining how the police use new technologies, without any single body to guide and enforce the adoption of new technologies.

    Under questioning from the Lords, Kit Malthouse, minister for crime and policing, said: "It is complicated at the moment albeit I think most [police] forces are quite clear about their own situation."

    Continue reading

Biting the hand that feeds IT © 1998–2022