I think this announcement has just killed all the other AI chip startups!
Posts by MikeLivingstone
41 publicly visible posts • joined 22 Dec 2018
Nvidia creates open server spec to house its own chips – and the occasional x86
That old box of tech junk you should probably throw out saves a warehouse
Facial recog system used by Met Police shows racial bias at low thresholds
The ThinkPad X1 Carbon Gen 10 as a Linux laptop
OpenAI's ChatGPT is a morally corrupting influence
Soon to hear from 10,000 lawyers
ChatGPT is doing nothing more than replaying and combining billions of previously ingested examples using clever statistical classification.
It is nothing more than a giant piracy engine. Does ChatGPT actually stand for Content Heist Attribution Theft General Piracy Tool?
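For what it's worth, the "replaying and combining" claim can be sketched with a toy bigram model — a hypothetical illustration of statistical recombination, not how ChatGPT is actually built (real LLMs use neural networks, not lookup tables):

```python
import random
from collections import defaultdict

# A tiny "ingested" corpus; the model can only ever emit word pairs
# that already appear here -- pure replay-and-combine.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count which word follows which in the ingested text.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length):
    """Recombine ingested bigrams into a 'new' sentence."""
    random.seed(0)  # deterministic for the example
    out = [start]
    for _ in range(length - 1):
        choices = follows.get(out[-1])
        if not choices:
            break  # dead end: no ingested continuation exists
        out.append(random.choice(choices))
    return " ".join(out)

print(generate("the", 6))
```

Every adjacent word pair in the output necessarily appeared in the training text — which is the commenter's point, whatever one thinks of applying it to billion-parameter models.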
OpenAI is going to have to answer to thousands of lawyers
ChatGPT is doing nothing more than replaying and combining billions of previously ingested examples using clever statistical classification.
It is nothing more than a giant piracy engine. Does ChatGPT actually stand for Content Heist Attribution Theft General Piracy Tool?
Basecamp details 'obscene' $3.2 million bill that caused it to quit the cloud
Nvidia admits mistake, 'unlaunches' 12GB RTX 4080
NHS data platform procurement delayed for a second time
Concerns that £360m data platform for NHS England is being set up to fail
A scandal in the making!!!
This approach just repeats the NHS's bad old mistakes and seems to support the ongoing lobbyist revolving door, whilst locking in NHS structural inefficiency.
It is also worth noting that Palantir's technology is out of date; really they should be excluded from the process for trying to circumvent it through hiring.
Patients will die if the NHS doesn't reform its data handling and analysis processes. Today too much narrow focus is placed on the specialisms rather than a holistic approach with patient-centric medicine. Modern analytics and AI can reveal new insights, but again adoption is slow.
IDC: Public clouds to surpass non-cloud spending this year
Nvidia releases $1,999, 8K-capable GeForce RTX 3090 Ti GPU
Who can actually see the 8K benefit?
This really has to be peak GPU.
I have a 32-inch 4K monitor with loads of real estate on it, and I can't actually see the pixelation. This is why NVIDIA is diverting to DPUs; this is the last generation of non-commodity graphics cards. We also know using GPUs for AI is too difficult for most, so this feels like a last stab at graphics before the DPU and Omniverse dream.
How Nvidia is overcoming slowdown issues in GPU clusters
Don't even bother trying to get an H100
NVIDIA is all out of stock of A100s/H100s; you won't be able to get one for 12 months minimum, and the software will take 12 months more to work. They have pre-announced this early to get buyers to hold off and not investigate other options. I can't take NVIDIA seriously anymore!
The DPU is also designed to kill off storage vendors.
£5bn up for grabs in UK govt G-Cloud 13 framework
Russia scrambles to bootstrap HPC clusters with native tech
Supercomputer parts the way to hit Putin
What many probably don't realise is that a lot of these powerful GPU-enabled systems in Russia are being used to spy on the population - watch out for dual-purpose technology. I also read on a very 'informative' forum that the Vlad Putin we see at the end of the long table isn't the real Vlad Putin, and is actually a hologram controlled by the Sberbank computer... clearly Skynet has taken over.
US trade watchdog opposes Nvidia's Arm buy, mostly over fears about datacentre innovation
Re: This is Silly... There is plenty of competition in the CPU space
It isn't silly, it makes sense.
Firstly, it is bad for the GPU market, as NVIDIA may squeeze Intel and AMD in that area by preferring its own CPU.
Secondly, it may restrict long-term innovation in areas such as AI. NVIDIA GPUs are clearly substandard for this purpose, and so they shouldn't be allowed to take dominance via blatant commercial piracy.
SmartNICs, IPUs, DPUs de-hyped: Why and how cloud giants are offloading work from server CPUs
This is to replace soon-to-die peripherals like GPUs
NVIDIA kicked off this trend with their DPU, and it makes sense for Intel, and certainly Cisco, to join in: at a workload level this is an increasing datacentre overhead, but it also makes sense from a business continuity standpoint for NVIDIA. It is no secret that GPUs probably won't exist ten years from now, as really we do not need bigger monitors at higher resolution. Maybe 8K in some cases, but my 32-inch 4K is already more than I need and can usefully see; at 8K the screen would either be 64 inches or the pixels would become smaller, giving no benefit. GPUs will just get built into CPUs again, hence the interest in Arm and why NVIDIA needs DPUs. AI won't cut it for NVIDIA: the specialist vendors have superior technology, and GPUs fail at scale.
GPU makers increasingly disengage from crypto miners
Nvidia open to third parties making custom silicon tuned for CUDA applications
Who wants to use CUDA?
CUDA is awful and GPU memory management is worse.
Really, NVIDIA GPUs are a dead-end technology.
ARM has great mobile GPUs, and Intel/AMD are doing a great job. It really is time to stop buying NVIDIA cards; they are wasteful of power and have the worst programming environment. Any CTO allowing them should be fired.
Facebook may soon reveal new name – we're sure Reg readers will be more creative than Zuck's marketroids
Upcoming Intel GPU to be compatible with Arm
Nvidia needs taking down a peg
This is good news. Also Intel has a good capability to integrate graphics into processors, so perhaps some firms will respond in time with Arm chips with integrated Intel GPUs. That said, Arm also has some great mobile GPUs. This type of innovation is definitely a reason to block Nvidia taking over Arm. I suspect over time Nvidia and their expensive, energy-inefficient devices will become less relevant.
Megachips or decoupled approach? AI chip design companies accounting for operating costs
Some of the startups have looked at software.
Yes, I take the point about the chip companies not looking at software. That said, I recently met SambaNova at the AI Summit in London. They were very much talking about the software they support, really leading with their pre-enabled AI models. The framework support for TensorFlow and PyTorch also seems great.
Nvidia launches Cambridge-1, UK's most powerful supercomputer, in Arm's neighbourhood
$40m on a render farm
No one this system was intended for is using it properly.
I am at one of the originally named pharma companies, and we aren't allowed to use the system due to secure data requirements. I've only seen one output, which looks like render jobs; $40m is a lot for a graphics card!
UK's competition regulator fires red flare over Nvidia's $40bn Arm takeover deal
NVIDIA needs breaking up
NVIDIA is fast becoming the new Intel and really needs breaking up. To make matters worse, they also funded a pork-barrel supercomputer project in the UK near Cambridge so they could win UK Government plaudits. As far as I am aware no useful science has been done on that system, just some AI graphics. Real scientists would do better working with the University. A real pro-British politician would do better to block this deal and help ARM float on the LSE.
Wanna use your Nvidia GPU for acceleration but put off by CUDA? OpenAI has a Python-based alternative
Using GPUs is painful
NVIDIA has a hopeless ecosystem and essentially unprogrammable hardware. Don't waste money on GPUs for AI workloads; the memory aspects are a nightmare. You are way better off looking at some of the new AI startups with AI-specific hardware and PyTorch support. Ignore NVIDIA: they don't make AI platforms, they just loosely join GPUs with wet string.
Windows 11: What we like and don't like about Microsoft's operating system so far
What Microsoft's Windows 11 will probably look like
Bless you: Yep, it's IBM's new name for tech services spinoff and totally not a hayfever medicine
150,000 lost UK police records looking more like 400,000 as Home Office continues to blame 'human error'
China blocks access to website hosting code-for-kids tool Scratch and its forums
Windows to become emulation layer atop Linux kernel, predicts Eric Raymond
Microsoft is being more open, that is a good thing!
I think this is a good thing and makes sense. Microsoft was always a user experience company, whereas Linux has always been king of the datacenter. Being king of the datacenter means your kernel is faster and more reliable, and being FOSS means it is more open and maintainable. So running Windows over Linux actually makes sense. Microsoft isn't stupid; there is more Linux in Azure than Windows now, but MS continues to make many great products up the stack.
IBM cuts deep into workforce – even its Watson and AI teams – as it 'pivots' to cloud
Three is the magic number, unless you're Apple. That's how many million iPad shipments it was down in Q1
IBM's outgoing boss Rometty awarded $20m+ in 2019 for growing revenue 0.1%
What's big, blue, and hands out pink slips? IBM on Thursday: Word spreads of job cuts
IBM dropping loads of products / offerings
These actions mean IBM is no longer able to develop or support many products or services.
Clearly, cutting GTS means close-to-the-box services will disappear, so don't expect to get support anytime soon. I can also see loss-making products like the storage range disappearing; there is no money in flogging other vendors' disks/SSDs, and no amount of SDS will cover it. It is hardly surprising IBM is now a minor vendor in the supercomputer Top500 and clings to a few high-margin mainframe sales.
You and me baby ain't nothing but mammals, so let's watch for tech sales VAT weirdness through the channel
Atari finally launches its VCS console. Again.
This is about retro and nouveau combined
Guys, give Atari a break.
I've been signed up since 2017 and I am completely chilled about the timescales, as I want something fun and playable with retro capability.
The point made in the article contrasting the hardware performance is totally irrelevant, because as a rule the developers at most games companies have been completely incapable of exploiting the benefits of faster hardware for the last 10 years.
Games are like films: many older games stand the test of time due to their concept and playability, while many new games, like many new formulaic films, are total rubbish, even if running at 8K.
'Year-long' delay to UK 5G if we spike Huawei deals, say telcos
I think I know what providing the code would reveal!
I think there are two issues here.
1) The spying, which undoubtedly has taken place but which is hard to verify, especially if done in silicon.
2) What inspecting the code might actually reveal.
On point 2, I suspect certain US-based networking manufacturers might find a lot of code that looks similar to their own. Really, we should just ban all Chinese goods and have a massive trade war. China would lose out, especially if the US and Europe withheld the manufacturing technologies that we still control. Without these, China’s technology sector would be left decades behind.