Anthropic needs eliminating by Reddit
Anthropic has been overtly stealing data. They should be fined $10bn and closed down.
60 publicly visible posts • joined 22 Dec 2018
These AI language models have no real value as their outputs are unreliable and reduce productivity.
I believe Anthropic should be found liable to Reddit for at least $10bn, and OpenAI should owe the New York Times at least $50bn for content theft.
The CEOs of LLM firms should also face Bernie Madoff-style prison sentences.
All I keep hearing from data storage companies is that you need to keep and store all your data, when in fact it needs to be properly managed and curated.
Just dumping garbage data in a data lake is a waste of money and will just produce worse results over time.
In the worst scenario, if casual users of social media just keep posting GenAI Slop, what do people think will end up being used for model training?
Much of GenAI really is dead and a complete waste of time and money. People should just focus on narrow enterprise use cases where there are known required outputs.
If your storage vendor says you need tens of Petabytes to store GenAI output, tell them to go away.
The other side of the AI coin has been the data boom. However, people seem to have learned once again that quality is more important than quantity.
The GPUaaS vendors have overbought both compute and storage, and will slow down this year, having overpaid NVIDIA and seeing their hourly rates plunge by 75%.
Sensible AI adoption will see enterprises using their own data on smaller scale platforms with pretrained / better distilled models.
This will be fine for the Pures and NetApps of the world, but will certainly kill off some of the Storage IPO hopefuls.
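To make that smaller-scale approach concrete, here is a minimal sketch assuming the Hugging Face transformers library and the publicly available distilgpt2 distilled checkpoint (both my own illustrative choices, not anything the vendors are promising):

```python
# Illustrative sketch only: running a small pre-trained, distilled model locally
# instead of renting a large hosted LLM. Assumes the Hugging Face `transformers`
# library and the publicly available `distilgpt2` checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

prompt = "Summarise the quarterly storage capacity report:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation on CPU-class hardware; no GPU farm required.
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```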
Looking at the rest of the world, the UK Government is just following suit.
The trouble is, AI has become a giant Ponzi scheme. Just look at how some of the GPU-as-a-Service providers are doing, with 75 percent price plunges on hourly rental having paid NVIDIA full whack. Now people are moving to smaller, more manageable distilled models and agents, so running the latest high-end GPUs and storing vast quantities of data is proving unnecessary. Quite a lot of the firms in this ecosystem are rushing to IPO because they know they will be worthless once the hype burns out.
These AI features are not remotely useful, other than being a search alternative.
I get that they can generate a template for code, but it still needs tweaking and was probably available on GitHub anyway (whose IP is it, BTW?).
Also, for written correspondence and business plans needing specificity, it is absolutely hopeless and just wastes people's time.
In my opinion, using Copilot for detailed writing reduces productivity by about 40%. A desire to use Copilot or ChatGPT is, in my mind, a sign of low IQ.
Interesting to see so many CEOs hoodwinked by it.
Apple's products have been getting worse for the last decade. I unfortunately have been issued a Mac for work and it is rubbish compared with a Lenovo Carbon running Ubuntu. I've recently ditched my private iPhone for a Samsung and it is much better. Just waiting to change my work phone, where I'll do the same.
The AI chip startup space is littered with firms with unusable architectures - mainly down to inadequate software stacks. NVIDIA has most of the cards, and the only ones who can compete and build their own chips are the hyperscalers - who also have developers en masse and loads of data to train the models.
ChatGPT is doing nothing more than replaying and combining billions of previously ingested examples using clever statistical classification.
It is nothing more than a giant piracy engine. Does ChatGPT actually stand for Content Heist Attribution Theft General Piracy Tool?
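To put the "statistical classification" claim in toy form, here is a deliberately crude sketch of my own (a word-level bigram counter, nowhere near a real LLM) showing text generated purely from counted statistics over ingested text:

```python
# Toy illustration of next-word prediction from ingested text.
# This is a deliberately crude bigram counter, not a real LLM, but it shows
# the basic idea of generating text purely from statistics over training data.
import random
from collections import Counter, defaultdict

corpus = (
    "the model predicts the next word from the previous word and "
    "the next word is chosen from counted statistics over the corpus"
).split()

# Count how often each word follows each other word in the ingested text.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def generate(start, length=10):
    word, out = start, [start]
    for _ in range(length):
        followers = bigrams.get(word)
        if not followers:
            break
        # Sample the next word in proportion to how often it was seen.
        candidates, counts = zip(*followers.items())
        word = random.choices(candidates, weights=counts, k=1)[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))
```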
This approach is just repeating the NHS' bad old mistakes and seems to support the ongoing lobbyists' revolving door, whilst locking in NHS structural inefficiency.
It is also worth noting that Palantir's technology is out of date, and really they should be excluded from the process for trying to circumvent it through their hiring.
Patients will die if the NHS doesn't reform its data handling and analysis processes. Today, too much narrow focus is placed on specialisms rather than a holistic, patient-centric approach to medicine. Modern analytics and AI can reveal new insights, but again the adoption of this is slow.
This really has to be peak GPU.
I have a 32-inch 4K monitor with loads of real estate on it, and I can't actually see the pixelation. This is why NVIDIA is diverting to DPUs: this is the last generation of non-commodity graphics cards. We also know using GPUs for AI is too difficult for most; this feels like a last stab at graphics before the DPU and Omniverse dream.
NVIDIA is all out of stock for A100/H100, you won't be able to get one for 12 months minimum and the software will take 12 months more to work. They have pre-announced this early to get buyers to hold off and not investigate other options. I can't take NVIDIA seriously anymore!
The DPU is also designed to kill off storage vendors.
What many probably don't realise is that a lot of these powerful GPU-enabled systems in Russia are being used to spy on the population - watch out for dual-purpose technology. I also read on a very 'informative' forum that the Vlad Putin we see at the end of the long table isn't the real Vlad Putin, and is actually a hologram controlled by the Sberbank computer... clearly Skynet has taken over.
It isn't silly, it makes sense.
Firstly, it is bad for the GPU market, as NVIDIA may squeeze Intel and AMD in that area because it would prefer its own CPU.
Secondly, it may restrict long-term innovation in areas such as AI. NVIDIA GPUs are clearly substandard for this purpose, and so they shouldn't be allowed to take dominance via blatant commercial piracy.
NVIDIA kicked off this trend with their DPU, and it makes sense for Intel and certainly Cisco to join in. It makes sense at a workload level, as this is an increasing DC overhead workload, but it also makes sense from a business continuity standpoint for NVIDIA. It is no secret, but GPUs probably won't exist ten years from now, as really we do not need bigger monitors at higher resolution. Maybe 8K in some cases, but my 32-inch 4K is already more than I need and can usefully see; at 8K it would either be 64" or the pixels would become smaller, giving no benefit. GPUs will just get built into CPUs again, hence the interest in ARM and why NVIDIA needs DPUs. AI won't cut it for NVIDIA; the specialist vendors have superior technology and GPUs fail at scale.
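A rough back-of-the-envelope check of that pixel-density argument (my own arithmetic, not figures from anyone else):

```python
# Rough pixel-density arithmetic for the 4K-vs-8K point above (my own figures).
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'4K at 32": {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI
print(f'8K at 32": {ppi(7680, 4320, 32):.0f} PPI')  # ~275 PPI, pixels half the size
print(f'8K at 64": {ppi(7680, 4320, 64):.0f} PPI')  # ~138 PPI again, same density as 4K at 32 inches
```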
CUDA is awful and GPU memory management is worse.
Really, NVIDIA GPUs are a dead-end technology.
ARM has great mobile GPUs and Intel/AMD are doing a great job. It really is time to stop buying NVIDIA cards, they are wasteful of power and have the worst programming environment. Any CTO allowing them should be fired.
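For context on the memory-management complaint, here is a small PyTorch sketch (my own illustration, using standard torch.cuda calls) of the explicit host-to-device housekeeping developers end up doing:

```python
# Small PyTorch sketch of the explicit device/memory housekeeping being complained
# about above. Illustrative only; uses standard torch APIs.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Data starts on the host and must be copied to device memory explicitly.
x = torch.randn(4096, 4096)
x = x.to(device)

w = torch.randn(4096, 4096, device=device)
y = x @ w  # runs on the GPU if one is present

# Copy the result back to host memory before using it with normal Python code.
y_host = y.cpu()

if device.type == "cuda":
    # Manual cache management that developers often resort to when VRAM runs out.
    del x, w, y
    torch.cuda.empty_cache()
    print(torch.cuda.memory_allocated())  # bytes still held on the device
```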
This is good news. Also, Intel has a good capability to integrate graphics into processors, so perhaps some firms will respond with Arm chips with integrated Intel GPUs in time. That said, Arm also has some great mobile GPUs. This type of innovation is definitely a reason to block Nvidia taking Arm over. I suspect over time Nvidia and their expensive, energy-inefficient devices will become less relevant.
Yes, I take the point about the chip companies not looking at software. That said, I recently met SambaNova at the AI Summit in London. They were very much talking about the software they support and really leading with their pre-enabled AI models. The framework support for TensorFlow and PyTorch also seems great.
No one this system was intended for is using it properly.
I am at one of the originally named pharmas and we aren't allowed to use the system due to having secure data requirements. I've only seen one output, which looked like render jobs; $40m is a lot for a graphics card!
NVIDIA is fast becoming the new Intel and really needs breaking up. To make matters worse, they also funded a pork-barrel supercomputer project in the UK near Cambridge so they could win UK Government plaudits. As far as I am aware, no useful science has been done on that service, just some AI graphics. Real scientists would do better working with the University. A real pro-British politician would do better blocking this deal and helping ARM float on the LSE.
NVIDIA has a hopeless ecosystem and essentially unprogrammable hardware. Don't waste money on GPUs for AI workloads; the memory aspects are a nightmare. You are way better off looking at some of the new AI startups with AI-specific hardware and PyTorch support. Ignore NVIDIA, they don't make AI platforms, they just loosely join GPUs with wet string.
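On the PyTorch-support point, a short device-agnostic sketch (again my own illustration): the same training step runs on whatever backend torch exposes, so the model code isn't tied to any one vendor's silicon.

```python
# Device-agnostic PyTorch sketch (illustrative): the same training step runs on
# whatever backend torch exposes, so the model code isn't tied to one vendor.
import torch

def pick_device():
    if torch.cuda.is_available():
        return torch.device("cuda")   # NVIDIA (or ROCm builds of torch)
    if torch.backends.mps.is_available():
        return torch.device("mps")    # Apple silicon
    return torch.device("cpu")        # always-available fallback

device = pick_device()
model = torch.nn.Linear(16, 1).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(8, 16, device=device)
y = torch.randn(8, 1, device=device)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()
print(f"ran one step on {device}")
```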