Fuck off, Clippy.
We can tell it's you hiding under that fake-nose-and-glasses getup.
Like an incontinent hippo on a helter-skelter, Microsoft has flung out yet more Copilot functionality in the form of enhancements to Designer, an AI-infused image generator. Keeping track of Microsoft's Copilot emissions is becoming tricky. Writing in Directions on Microsoft, Wes Miller noted the growing confusion around the …
Working for a finance company that has over 35 years of data, around 300PB, we're looking to feed it into some fun AI models and see what pops out. Just R&D right now but the users and management are keen and that's rare to have them behind a new tech project that might actually be fun for the tech teams too.
We all know that we'll "pay the piper" as it's MS AI technologies and we're not getting "owt for nowt", but if it gives our biz users some fun numbers to play with and keeps them happy, then so be it!
My aged eyes misread "Copilot" as "Coprolith"*, which is probably reality seeping through Microsoft's AI-induced hallucinations.
Leaving Cory Doctorow's verdict on the internet aside, it all really does resemble the contents of the potty or chamber pot.
*Coprolith: a mass of hard fecal matter in the intestine. Merriam-Webster.com Medical Dictionary
Correction: it's a marathon for the clients, who get locked into long-term contracts for this stuff long after its relevance and possible use case have faded away, but it's a drag race for vendors and consultants to sucker people into as many of those contracts as possible before the next new shiny comes along and AI gets chucked off the cliff along with all the other "essential" tech fads of the last couple of decades.
... yet they got kinda embarrassed when I asked how much all this shiny stuff, demonstrated with a Scientologist's fervor, would raise our license costs (and got no answer, of course).
The question I didn't ask (my boss had asked me to be nice) was whether the proclaimed "Copilot keeps your data private" (as opposed to ChatGPT) also kept our data safe from Microsoft. As far as I can tell, it very much does not. If you're friendly with your administrator, ask him or her to show you what Microsoft Actions picks up, and then imagine adding AI to that. You won't sleep afterwards.
From using AI, I guarantee it. We're using it at work, currently testing it, and people like its "summary" of things. We had a meeting with transcription on, and afterwards we got sent the summary of the meeting that CoPilot had created.
I had to point out that, for my part, it stated I said I'd be looking into an issue with an app and fixing it.
Not only is there no app with an issue, I also never said what it claims I said.
And there is the problem. People may not check and will just rely on it. A month later you'll be pulled up for review because you never did what the summary said you'd do. You point out you never said it, and much like the investigators in the Post Office, they'll claim "it was stated in the summary, so you must have said it. CoPilot doesn't make stuff up."
That's terrifying; there is enough trouble already with folks leaving the same meeting with different memories of what happened.
This is like having a forgetful person write a summary of a meeting a day late. Holding people to what it remembered they said will be terrifying.