The Register Home Page

* Posts by JacobZ

254 publicly visible posts • joined 26 Nov 2014


Anthropic reveals $30bn run rate and plans to use 3.5GW of new Google AI chips

JacobZ

Lies, damned lies, and annualized run rates.

Anthropic says lots of things in unaccountable press releases. When its CFO Krishna Rao made a court filing under oath, he stated that revenue "has exceeded $5 billion to date."*

So we are supposed to believe that a company with $5B lifetime revenue had an ARR of $9B in 2025, $12B mid-February, and now $30B.

Also, I'll believe that Anthropic is serious about buying those chips when Google files a quarterly financial report that shows a corresponding bump in their revenue forecast.

----

*This was in their lawsuit with the Pentagon. Filing can be found at https://storage.courtlistener.com/recap/gov.uscourts.cand.465515/gov.uscourts.cand.465515.6.5.pdf

'Uncle Larry’s biggest fan' cut by email in early morning Oracle layoff spree

JacobZ

Revenge is the best revenge

Many years ago I "separated" from Oracle. I got paid, including for vacation; got my vested options; got my COBRA.

And then about two weeks later, I got the HR letter asking me to sign that I would never say anything disparaging about Oracle. I sent the letter back with a little note saying "Why would I ever sign this?"

And then I got a job as a tech analyst [not Gartner] and said lots of disparaging things about Oracle.

OpenAI gets $122B to 'just build things' as the world blows them up

JacobZ

Re: Strange times

...and the rest will be just wasted

(Nod to George Best)

Usage pricing leaving software vendors guessing what lands on the invoice

JacobZ

That's an ad

That's an ad for m3ter via a "report" they commissioned from PwC that, astonishingly, concludes that people should be using m3ter's products.

GitHub backs down, kills Copilot pull-request ads after backlash

JacobZ

Nominee for Weasel Word of the Year

Using "tip" for "ad" has to be an early leader in the contest for Weasel Word of the Year.

JacobZ

Here's a tip: Pick a lie and stick to it.

So was this a conscious but tone-deaf decision to insert ads that they then realized was "icky", as stated by GitHub VP of developer relations Martin Woodward?

Or was it an unintentional "programming logic issue with a GitHub Copilot coding agent tip that surfaced in the wrong context", as stated by Martin Woodward, VP of Developer Relations, GitHub?

Here's a "tip" for you, Martin: pick a lie and stick to it.

Staff too scared of the AI axe to pick it up, Forrester finds

JacobZ

Re: mandated AI

Rewarding people based on how good their work is presumes that managers know what their people are doing and understand it well enough to evaluate it. How often is that true?

Mind you, I did have a boss once who told me "I don't really understand what you do, but people keep telling me how good you are and asking for you on their projects, so keep doing that." At least he was honest about it.

SoftBank to build massive AI datacenter on former US nuclear weapons site

JacobZ

Softbank

You know the end is near when Softbank is committing billions.

JacobZ

Re: 10GW

Serious answer: because the available power is the limiting factor in the capability of a datacenter. It's defined initially and baked into the construction, and increasing it is a major construction exercise.

By contrast, the amount of processing power is something you might target at the beginning of construction, but by the time the facility is online, you really don't know what processors will be capable of for a given power consumption. And even if you did, that number is going to change repeatedly over the lifespan of the building.

Similarly, the footprint of the building doesn't tell you much about its capabilities.

Payment biz pulls plug on open source charity after KYC spat

JacobZ

...or worse

In fact, if they wanted to do that check, the LAST thing they should do is to ask FSFE for test credentials.

If FSFE were up to any funny business, they could rig their portal so that the test credentials would be recognized and take the user through acceptable cancellation flows, while continuing to do whatever shenanigans Nexi is worried about for everybody else.

Far better, as you suggest, to anonymously set up an account like a real contributor without announcing their intentions to FSFE.

Everything about this smells.

IBM stock dives after Anthropic points out AI can rewrite COBOL fast

JacobZ
FAIL

One of the problems with COBOL (and other old languages) is not just that the code is old and the programming expertise is scarce; it's that the code often IMPLICITLY implements business requirements that are not EXPLICITLY documented anywhere. The Requirements docs or design docs or functional specs are probably long gone. There are likely to be critical chunks of code where the last person who knew why it did a particular thing - or did it a particular way - retired decades ago.

That is a level of understanding - and a level of caution - that an LLM is never going to achieve. It's a big part of what other posters have referred to as the problem of ensuring that the replacement code is functionally equivalent to the code it is replacing.

And I'm guessing the people proposing this have as little understanding of what it was like to develop in the Olden Days as do the people selling off IBM over this latest scare.

Ring kills Flock partnership amid surveillance scrutiny

JacobZ
Big Brother

Big Brother

Imagine installing a video camera on your front door that you don't control, and that the vendor can turn on or off on a whim.

I can't imagine any way that might be abused.

Amazon can't build AI capacity fast enough, throws another $200B at the problem

JacobZ
Headmaster

Re: Is the return worth it?

"So how do we equate energy consumption to processing power?"

<Sensible>

There is actually a sensible answer to this.

During the lifetime of a datacenter (say, 15 years for a regular non-AI datacenter?), the processors are going to be replaced several times, (usually) increasing the processing power. So while processing power is a good measure of the initial capacity of a datacenter, it is also a constantly changing number, and doesn't tell you anything useful about the capacity, say, five years from now.

The maximum power capacity, however, is a much more stable number, as it's a major engineering exercise to increase it. It's also often the limiting factor, especially in the current madness where the demand for power for datacenters far outstrips the ability to build out more (which is why you have, for example, Microsoft bringing Three Mile Island back online). And third, it's hard to measure the processing power of modern datacenters. If you have a mix of CPUs (some ARM, some x86), GPUs, TPUs, etc., how do you add all those up to a sensible number that is comparable across different datacenters?

</Sensible>

We now return you to your regular El Reg comment thread in progress

Oracle expects investors to pump $50 billion into its cloud this year alone

JacobZ

Fantasy money

That $455B that Oracle claims it has booked is entirely imaginary. $300B of it is from OpenAI, which does not have the money. It doesn't even have $30B. It is burning cash so fast that it will have to borrow more than any company in history has ever borrowed just to pay even part of what it has promised.

But it gets worse. A lot of the remaining book is companies like Nvidia and others buying capacity *on behalf of OpenAI*, which they will then lease to OpenAI -- which can't pay them either.

And worse again: Oracle does not get paid on (most of) these contracts - if it gets paid at all - until the datacenters are up and running, at which point it will be hundreds of billions in the hole.

The best thing that can happen to Oracle is that OpenAI fails quickly, before Oracle spends a lot more on datacenters it is never getting paid for.

JacobZ

Big Tech is delusional, Oracle most of all

Big Tech doesn't want to believe that anything but massive cloud-based LLMs can get the job done, regardless of reality. The promise of Big LLMs is that only a handful of companies have the money and scale to deliver them, and that those companies will make massive profits. And that is so seductive to tech CEOs it has completely clouded their judgement.

What's that Upton Sinclair quote? "It is difficult to get a man to understand something, when his salary depends upon his not understanding it." Now substitute "image of himself as a visionary leader transforming the world and also becoming as rich as Croesus" for "salary" and you get some idea of why tech CEOs collectively have lost their minds.

Yes, you can build an AI agent – here's how, using LangFlow

JacobZ

This again?

For the past 40 years I have been reading press releases that promised that non-programmers could code just by connecting boxes.

For about 20 years, they have promised that non-programmers could configure workflows the same way.

And now they promise that non-programmers can configure something something agent something.

I expect it to end in tears.

Banker claims Oracle may slash up to 30,000 jobs, sell health unit to pay for AI build-out

JacobZ

AI job losses

Finally, a real example of AI adoption causing job losses

'Ralph Wiggum' loop prompts Claude to vibe-clone commercial software for $10 an hour

JacobZ

What about original work?

So, after humans have done all the heavy lifting of soliciting requirements, developing code, testing, fixing, scaling, integrating, making secure, iterating to implement additional requirements because people are really bad at identifying requirements until they can see the thing and say "no, that's not what I meant!", thoroughly documenting the whole pile and then productizing the whole thing...

...an AI can reproduce that for $10 an hour (it doesn't say how many hours).

A photocopier can plagiarize a book for a lot less than $10 an hour, and that doesn't impress me either.

AI adoption at work flatlined in Q4, says Gallup

JacobZ

When somebody tells me...

...that AI helped them with their work, it doesn't raise my opinion of AI. It lowers my opinion of their work.

Microsoft CEO: AI sovereignty isn't where it runs, it's who controls it

JacobZ

Re: Datacenter location is the least of an enterprise's worries

"We are spending HOW much now?! How can we get out of this without losing face to the investors?"

The best thing that these numbnuts can hope for at this point is a major recession so that they can dial back their wild overspending and blame it on the economy.

Meanwhile the politicians will blame the recession on wild overspending on AI.

Everybody wins.

Well, except us, obviously.

JacobZ

corporate pirates the lot of them

"SatNad argued that corporate AI sovereignty hinges on control over models trained on proprietary knowledge"

He neglected to mention that their models are trained on other people's proprietary knowledge

Hyperscalers, vendors funding trillion dollar AI spree, but users will have to pay up long term

JacobZ

Re: Who is getting rich?

Yep. Mostly it's Nvidia, sometimes with systems builders like Dell or Supermicro.

Some DC builders may do OK; however, some are in a risky position where they keep ownership of the building and lease it to the company that ordered it, such as Google, based on the "promise" of massive AI usage that will definitely come any day now. And by "builder" here I mean the prime contractor. I hope that the subs pouring concrete and laying cable are getting paid up front...

Workers should control the means of agentic production, suggests WorkBeaver boss

JacobZ
Holmes

All over by Christmas

If the AI vendors focused on the demand side, they would pack their bags and be home in time for Christmas

Trump Media jumps aboard the speculative nuclear fusion bandwagon

JacobZ

"There is a moonshot opportunity here, a version of the Apollo program."

Such a stupid comparison.

The Apollo program was a massively expensive taxpayer-funded science and engineering project. At the end of it, we did not have rocket ships for the masses, nor was that the goal.

A better comparison might be something like the companies that first gambled on bringing widespread access to electricity.

JacobZ

Breaking ground in 2026

If there's one thing Trump is really good at, it's skimming off a construction project.

Snowflake update caused a blizzard of failures worldwide

JacobZ

Whatever happened to...

...progressive rollouts? Update one region. Let it marinate for a while. Then do some more.

Or even testing before deployment?

"Breaking the database schema" seems like such a fundamental mistake that there surely should be a test case for backwards compatibility of the schema.

Jassy taps 27-year Amazon veteran to run AGI org, which is now definitely a thing that exists

JacobZ

Irony?

It seems weird that AGI (which doesn't exist yet, and will not come from GenAI no matter how much money and how many GPUs they throw at it) is under GenAI (which doesn't work).

However, it seems highly appropriate that all of this sits under a Marketing guy.

Hyperscalers fuel $112B server spending spree in Q3

JacobZ

Bubblenomics

Phase 1: Everybody is afraid to be the last one in. Goldrush!

Phase 2: Everybody is afraid to be first one out. Surely the last one standing will make a fortune?

Phase 3: Everybody is afraid to be last one out. Game over.

We entered phase 2 somewhere around mid-year. It's in the nature of bubbles that nobody can predict when we tip into phase 3 - personally I thought it would come as early as November - and when we do, it will happen extremely rapidly.

Bezos-backed Unconventional AI aims to make datacenter power problems go away

JacobZ

Re: Twaddle

One pedantic point: LLMs are not trying to emulate the analog processes of brains in order to reason like brains. They are merely trying to emulate the outputs of brains in the belief that if you scale that large enough, reasoning - or intelligence - will somehow magically appear. Or as linguists might say, if you imitate surface structure well enough, deep structure will magically emerge.

In other words, it's Cargo Cult Computing.

Or if you prefer a cruder analogy, it's like shoving sh*t up a cow's arse and expecting to get grass out of its mouth.

Cloudflare suffers second outage in as many months during routine maintenance

JacobZ
Headmaster

Re: I couldn't give a.....tinker's damn!

El Reg comment threads are antisocial media.

Campbell's CISO canned after lawsuit alleges hour-long rant against staff and customers

JacobZ

Bio-engineered meat insanity

Just for the record, bio-engineered meat costs around $60/kg. Real chicken costs less than $2/kg wholesale, probably even less in the industrial quantities Campbell's buys it in. It would be absolute insanity for them to use it in soup.

Anthropic is at the heart of the latest billion-dollar circular AI investment bonanza

JacobZ

Re: Hot air (and water)

The limiting characteristic of a datacenter, the one that is reasonably stable over time, is how much power it is supplied with and can distribute to its racks for compute and for cooling. The amount of compute, however you choose to measure it, is going to vary over the life of the datacenter, for example if you upgrade the GPUs (megaflops) or change the model (tokens).

So power capacity is both the only thing that is stable and the only thing that is reasonably predictable from the outset.

JacobZ

Stranded assets

The technical term is "stranded assets". And the people holding the bag are the private capital firms who accepted the GPUs in the bit barns and/or the Big Tech leases that "promised" to use those bit barns as collateral for their loans. And possibly the traditional investment firms who lent money to the private capital firms.

The whole thing is the absolute classic definition of a bubble. I suspect the people investing in it already know this, much like the 2008 housing bubble, and the game now is simply to not be the one holding the bag when the music stops.

Commodity memory prices set to double as fabs pivot to AI market

JacobZ

Just in time...

...for the AI bubble to pop, and the chipmakers to have to switch back again.

OpenAI’s viability called into question by reported inference spending with Microsoft

JacobZ
Facepalm

Microsoft knows best

Microsoft knows exactly what OpenAI's revenues are, because it gets a cut.

And it knows what OpenAI's losses are, because it has to report its share in its own quarterly reports.

And of course it knows how much (little) it is paying OpenAI for reselling its models on Azure (bear in mind that under 2% of Microsoft 365 users are licensing Copilot).

And knowing all that... Microsoft is partnering with Anthropic (https://www.cnbc.com/2025/09/24/microsoft-adds-anthropic-model-to-microsoft-365-copilot.html)

Big Tech's control freak era is breaking itself apart

JacobZ

Re: As irrational as it sounds...

"Experience is that thing you get right after you could have used it."

Agents of misfortune: The world isn't ready for autonomous software

JacobZ

Finally...

...a meaningful comparison between AI and Uber:

"Don't waste time asking for permission. Establish market dominance first, then ask forgiveness."

By the way, is it a requirement to be an AI leader that you throw out blatant lies to promote your product, or is it just a coincidence that OpenAI, Anthropic, and Perplexity all do that?

Famed software engineer DJB tries Fil-C… and likes what he sees

JacobZ

Re: Type checking and compatibility

FWIW, and that's probably not much, I prefer "bool flag := <bool expression>". It is a useful redundancy that ensures that the (potentially hard to read) expression is of the type that the author and later reader think it should be. While this is not a big deal for bool, it can be a big deal for numeric types or for references (how many C bugs occur because something is a pointer to a pointer, rather than just a pointer?)

Also FWIW my own language project [actually a pre-processor to Go since they won't add it to the language] goes even further with compile-time checking. It allows the programmer to define named types, much like any language, and then also "compound types", or as I call them, dimensions. For example, in C-ish pseudo-code it might look something like...

type Meters double;

type Seconds double;

type Velocity (Meters/Seconds);

And then if you have m, a variable of type Meters, s of type Seconds, and v of type Velocity, you can write:

v = m/s;

or

m = v*s;

but not

v = m*s; //wrong dimensions, failed by dimension checker even though underlying types are all double

The dimensions will be checked at compile time, leaving runtime code as efficient as if the dimension checker never existed.

Of course, this is a relatively trivial example. The value becomes more apparent the more complex an expression becomes, and when calculations are chained together.
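For what it's worth, plain Go named types already get you part of the way there, which may be why a pre-processor (rather than a whole new language) is viable. Here's a hypothetical sketch, not the poster's actual pre-processor: Go refuses `m / s` on mismatched named types, so the dimension rule has to be smuggled into a helper function's signature instead.

```go
package main

import "fmt"

// Distinct named types: Go will not let you mix them in arithmetic
// without explicit conversions, unlike a plain typedef in C.
type Meters float64
type Seconds float64
type Velocity float64

// Speed is the one sanctioned way to combine Meters and Seconds into a
// Velocity; the float64 conversions live here and nowhere else.
func Speed(m Meters, s Seconds) Velocity {
	return Velocity(float64(m) / float64(s))
}

func main() {
	m := Meters(100)
	s := Seconds(10)
	v := Speed(m, s)
	fmt.Println(v)

	// Velocity(float64(m) * float64(s)) would still compile -- wrong
	// dimensions, right underlying type -- which is exactly the hole a
	// compile-time dimension checker closes.
}
```

The limitation this illustrates is the pre-processor's selling point: named types catch accidental mixing, but once you cast down to the underlying double, the type system can no longer tell metres-times-seconds from metres-per-second.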

JacobZ

Re: Not quite ...

Is the hammer responsible for bending nails?

Tony Hoare certainly thought so, and he knew a thing or two about writing languages: "The author of a language is responsible for errors commonly made by its programmers" (wording may not be exact, intention is.)

When I was learning Rust and I got to the sentence that began "A common mistake made by Rust programmers..." I threw the book against the wall. (Not my only issue with Rust).

Look at it this way: if you had two hammers, and in the hands of the same skilled carpenter one of them drove straight every time and the other one bent one nail in ten, would you say that the hammer was responsible? Because I sure would.

Microsoft just revealed that OpenAI lost more than $11.5B last quarter

JacobZ

10% of profits is not easily shaken off, even for MS

A loss of $3.1B out of a gross profit of $30.8B is 10%, and that is not "easily absorbed" even by Microsoft. If you had told investors mid-quarter "hey, our profits are going to be 10% less than they would have been without OpenAI", do you think they would have said "this is fine"?

These are seriously big numbers, and they are getting bigger, not smaller.

AI boffins teach office supplies to predict your next move

JacobZ

Ceiling cats

"The system combines ceiling-mounted cameras "

Make that ceiling-mounted cats and objects that throw themselves off the counter, and I'm in.

JacobZ

Re: Could be useful if

I feel like mine do that already. I own three hammers, and I'll be buggered if I can find one when I need one.

OpenAI's ChatGPT is so popular that almost no one will pay for it

JacobZ

Re: Nothing like statistics and creative maths to make the world go around !!!

I don't know why somebody downvoted you. That's *literally* how OpenAI etc. are calculating their Annual Recurring Revenue (ARR). They take their best recent month's revenue and multiply by 12.

Actually, given the lack of transparency, it might not even be a month. It's possible they took their best day and multiplied by 365.
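The annualization trick being described is a single multiplication. A throwaway sketch, with entirely invented figures, of how the headline gets manufactured:

```go
package main

import "fmt"

// Illustrative only, figures made up: an "annualized run rate" takes one
// good month and pretends the other eleven looked the same.
func annualizedRunRate(bestMonthRevenueBillions float64) float64 {
	return bestMonthRevenueBillions * 12
}

func main() {
	// One $2.5B month becomes a "$30B ARR" press release, regardless of
	// what any other month actually brought in.
	fmt.Printf("$%.0fB ARR\n", annualizedRunRate(2.5))
}
```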

JacobZ

Re: One day...

Prescient.

In the classical enshittification model as practiced e.g. by Facebook and Amazon, phase 3 begins when having locked in the business users, they start screwing them over too.

For a preview of what that will be like in AI, look at Anthropic. Anysphere, the maker of Cursor AI, reportedly pays 100% of its revenue to Anthropic for use of its Claude models. Meanwhile, Anthropic is developing Claude Code, its own AI coding assistant. As soon as they judge it good enough, they will increase the prices they charge Anysphere, tipping them into unsustainable losses, and pick up their user base on the premise that it's easier to move to another Claude-based tool than to start over somewhere else entirely.*

*I have no idea whether it really is easier, but that's how it will be sold to the suits.

Some like it bot! ChatGPT promises AI-rotica is coming for verified adults

JacobZ

What does ChatGPT say?

I asked ChatGPT about this, and it adamantly and repeatedly denied that OpenAI would ever do such a thing:

"I haven’t seen any press release or public statement from Sam Altman, OpenAI’s CEO, where he explicitly says OpenAI will create porn chatbots. It’s possible that there’s been some confusion or misinterpretation, as Sam Altman and OpenAI have generally been focused on ethical AI development and ensuring that their models are used for positive societal impact.

OpenAI has been quite clear in its guidelines and efforts to restrict harmful, illegal, or unethical uses of its AI, including adult content. They’ve taken active steps to prevent the misuse of their technology in various harmful areas, including by implementing content filters and moderation features to ensure users don’t engage with inappropriate or damaging material."

It also said that "If such a claim were true, it would likely spark significant ethical debates, as creating or allowing AI to be used in adult or explicit contexts raises major concerns about consent, safety, and harm."

Then I pointed it at this story on El Reg, and it admitted that "Yes, OpenAI (via its CEO) has publicly confirmed an intention to allow erotic/mature content for verified adults starting December 2025, under new age-gated arrangements."

I now feel very dirty.

Not because of the pornbots, but because I used ChatGPT.

18 zettaFLOPS of new AI compute coming online from Oracle late next year

JacobZ

Water, water everywhere

Yep. And on an even more basic level, imagine if that much water were supplied to people who don't have clean water, or to agriculture for small farms. According to a study by the Lawrence Berkeley National Lab, "in 2023, U.S. data centers consumed 17 billion gallons (64 billion liters) of water directly through cooling, and projects that by 2028, those figures could double — or even quadruple. The same report estimated that in 2023, U.S. data centers consumed an additional 211 billion gallons (800 billion liters) of water indirectly through the electricity that powers them."

Fast-forward that number to 2025, and the projections for 2026...

Qualcomm solders Arduino to its edge AI ambitions, debuts Raspberry Pi rival

JacobZ

Re: An obvious choice

And a lot more profitable to start with one that's hugely popular.

Nadella hands Microsoft money machine off to new commercial CEO so he can visioneer the future

JacobZ

Re: Don't Replace The Workers With AI... Flip The Script.

Now that is some out-of-the-box thinking. It's a whole new paradigm.

JetBrains backs open AI coding standard that could gnaw at VS Code dominance

JacobZ

Embrace, Extend, Extinguish

Sure, Microsoft will adopt ACP.

Then it will add some unique proprietary extensions.

And then it will drop or modify some key features so that your agents will only run with VS Code.

And anybody not paying sufficient attention will be locked into Microsoft again.

AI has had zero effect on jobs so far, says Yale study

JacobZ

Things that make me go "hmmm"

If AI is producing such tremendous efficiency gains for companies, why can't any of the AI services - OpenAI, Anthropic, Anysphere (Cursor) - charge enough to even cover the cost of a query?

If these claims were true, you'd think that would be something really valuable that companies would pay a lot of money for.
