CVSS 10 eh?
Lucky it wasn’t in the kernel then.
Stack Overflow content is generated by people. It has lots of human entropy in its raw form. The whole idea of machine learning prediction is to create a model that outputs one statistically valid central value. So it takes all those Stack Overflow 'answers' and aggregates them. The AI doesn't give you all the Stack Overflow answers it was trained on that match the coding parameters you give it. It gives you one answer.
So AI will generate far more predictable code than a bunch of people using Stack Overflow would. This plays into the hands of people who benefit from predictable patterns. Like the people who (legally or not) step through binary execution in a debugger to figure out where the creds are cached, for example.
It's not just devs producing code who might use AI tools. When your handy LLM produces code, it will have a kind of fingerprint, characteristic of the model that produced it. It will be repeatable; there won't be the entropy associated with a random factor like a dev's headache, a deadline, or a personality quirk. The structure of data and objects will likely share the common characteristics of the model that produced them. For the people who want to unwind that code creation process to study its weaknesses, this kind of information is very useful. Just like the way a compiler organises code into a predictable, familiar executable format. If anyone can access the same code generation tools as the "bank" did because it used "open" AI, then its code patterns will not be private. That will make hacking its code that much easier. The information on exactly whose AI was used, and which version, will become a new source of income for vendors on the Tor network.
Have I got this right? Some artificial code challenges were completed more efficiently when using an AI helper. Yep, that makes sense. But that's not solving any real-world problems that the programmers might encounter in the course of the work the bank actually employs them for.
So the AI helper hasn't really been proven to make ANZ programmers more efficient at their work. Here is an alternative take: Microsoft is actually just trying to show investors that devs are using its tools, so it can announce a return on investment for all those LLMs it built with VC money. Shareholder value is the benefit being sold here, not real-world development.
If it's very mathematical, then the real challenge is to point to something tangible as a prediction. This is why we refer to General Relativity on its own, and not as General Relativity Theory: a number of predictions of GR have been successfully tested. String Theory, on the other hand, is only ever referred to as a theory. Good on them for having a crack. Maths is important, but it's not as important for framing the question as it is for explaining the answer.
"proactively powered down a small subset of selected compute and storage scale units, to avoid damage to hardware."
So the combined penalties Azure might fork out for an outage on its cloud are less than the cost of a selection of compute and storage? Nah. I reckon they powered down that hardware because the data on it might have become irretrievable. Hardware can be replaced; data is a different liability altogether.
Sounds like we haven't heard the whole story. Mind you, after Azure token-signing keys were exfiltrated earlier this year, we are getting used to not hearing the whole story from Microsoft.
Prevention of insecure code has to be built into the software development life cycle. The org has to prevent devs from pushing insecure configuration and design; devs will take the shortest path to get the job done, especially when they are measured on commit or release frequency. So what's to stop them building bad designs?
But building that secure development pipeline costs. The org needs pen testers, security teams, and tools. None of which adds directly to shareholder value.
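For what "built in" could look like in practice, here's a minimal, purely illustrative sketch of a pre-merge gate in Python. The patterns, file handling, and the idea of wiring it into git/CI are my own assumptions for illustration, not anything from the article; a real pipeline would run proper SAST tooling, not a regex script.

```python
#!/usr/bin/env python3
"""Hypothetical pre-merge gate: fail the build if obvious insecure patterns land in a commit.

Illustrative sketch only; not a substitute for real SAST tools, pen testing, or review.
"""
import re
import subprocess
import sys

# Deliberately simple, made-up patterns; a real pipeline would use a proper scanner.
INSECURE_PATTERNS = {
    "hard-coded credential": re.compile(r"(password|secret|api_key)\s*=\s*['\"][^'\"]+['\"]", re.I),
    "TLS verification disabled": re.compile(r"verify\s*=\s*False"),
    "weak hash for secrets": re.compile(r"\bmd5\s*\("),
}

def changed_files() -> list[str]:
    """Python files staged for commit (assumes a git checkout)."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    )
    return [f for f in out.stdout.splitlines() if f.endswith(".py")]

def main() -> int:
    findings = []
    for path in changed_files():
        try:
            text = open(path, encoding="utf-8", errors="ignore").read()
        except OSError:
            continue
        for name, pattern in INSECURE_PATTERNS.items():
            for lineno, line in enumerate(text.splitlines(), 1):
                if pattern.search(line):
                    findings.append(f"{path}:{lineno}: {name}")
    for finding in findings:
        print(finding)
    # Non-zero exit blocks the merge, so the shortest path becomes the secure one.
    return 1 if findings else 0

if __name__ == "__main__":
    sys.exit(main())
```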
Cue company directors. They say they are answerable to shareholders. If they aren't directly liable when data breaches occur, what's to stop them spending less on security and returning more profit to shareholders in its place?
Good governance should be at the core of good security. But instead company directors maximise profits and pay themselves tens or hundreds of times more than the developers who build their code. They can get their pals in one of the Big Four accounting firms to sign off on the audits in return for a guarantee to conduct next year's review. We are audited, job done.
I don't see this circus ending any time soon.
"Mirochnik’s calculations included the cost of electricity, internet service providers charges, IP Transit costs, dark fiber charges, and the cost of operating internal network hardware."
Network hardware, that's like a switch, right? So not the servers themselves, or the support staff, or rack space?
What about installation? Patching? The cost of downtime? The cost of snafus? Sure, screw-ups happen in the cloud too, but when you're managing your own kit, the number of places for shit to go wrong where you have to own it is so much bigger. Surely El Reg has left a few things off this person's list, or maybe it wasn't that big to begin with?
If you're a hardware operations manager in a software company, you need to tell a good story. When you save the company so much money, surely you deserve a raise.
'Plasma gas'? Plasma is formally a different state from the other three: solid, liquid, gas, and then plasma. Plasma is modelled using MHD (magnetohydrodynamics), whereas a gas that is not ionised is just modelled with ordinary fluid dynamics (Navier-Stokes). Good on those Korean engineers and physicists, though.
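For anyone curious about the distinction, a rough sketch of the governing equations (my own summary, leaving out gravity, viscosity on the MHD side, and resistivity):

```latex
% Neutral gas: Navier--Stokes momentum equation
\rho\left(\frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v}\cdot\nabla)\mathbf{v}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{v}

% Ideal MHD: the same fluid picture plus the Lorentz force on the ionised fluid ...
\rho\left(\frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v}\cdot\nabla)\mathbf{v}\right)
  = -\nabla p + \frac{1}{\mu_{0}}\left(\nabla\times\mathbf{B}\right)\times\mathbf{B}

% ... and an induction equation coupling the flow to the magnetic field
\frac{\partial \mathbf{B}}{\partial t} = \nabla\times\left(\mathbf{v}\times\mathbf{B}\right)
```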
A 70 kWh battery in each car (on average) x 10 million cars. That gives 7 followed by 11 noughts in watt-hours: 700 GWh.
In 20 years the storage capacity per EV will likely be higher.
But even if you only allow for 50 kWh in each car battery, distributed storage is still compelling. It's also much more democratic than relying on a few profit-oriented power companies to deliver for the future.
Are you sure the UK needs more than 11 hours of peak winter storage? Peak usually only lasts for a few hours each day. Besides that, you seem to be assuming that the UK would have no other storage or generation, and that it would be running entirely off V2G for 11 hours each day, all year round, while everyone in the UK has all the heaters and lights on while cooking dinner all night, 365 days a year.
Even if 1/3 of the vehicles in the UK are not connected to the grid all the time, homes will still have batteries as well. In any case it's clear that distributed storage like V2G is going to make large-scale power generation a thing of the past. That includes nuclear, coal and gas.
For all that nuclear power is safe and renewable energy has a storage challenge, new nuclear is just not looking viable. When the majority of the population has a V2G EV, the storage problem will not be an issue. There were 32 million cars in the UK in 2022. If only 1/3 of those are 70 kWh V2G-capable EVs connected to the grid, that's 700 GWh of storage (or 11 hours for the entire UK at peak winter use). The point at which that could happen might still be 20-30 years away. But who is going to invest in a nuclear power station when, 25 years (or less) after it comes online, it will be obsolete because storage has been democratised?
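A quick back-of-the-envelope check of those figures (the ~60 GW peak winter demand number is my own rough assumption, not from the post):

```python
# Rough V2G storage estimate from the figures above (all numbers are approximations).
uk_cars = 32_000_000        # cars in the UK, 2022
v2g_share = 1 / 3           # fraction assumed to be V2G-capable EVs connected to the grid
battery_kwh = 70            # assumed average capacity per EV, kWh

storage_gwh = uk_cars * v2g_share * battery_kwh / 1_000_000   # kWh -> GWh
peak_demand_gw = 60         # assumed UK peak winter demand in GW (my assumption)

print(f"Distributed storage: {storage_gwh:.0f} GWh")                  # ~747 GWh, i.e. roughly 700 GWh
print(f"Hours at peak demand: {storage_gwh / peak_demand_gw:.1f} h")  # ~12 h, in line with the 11 h claim
```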
Perhaps some assumptions are being made about what speculative capital might be required to make said technology viable?
Maybe one of the reasons Small Modular Reactors are still not viable is that investment in wind and solar is much cheaper and not nearly so speculative. Wind and solar don't solve the storage problem, but V2G has a lot of potential as a future storage technology. Most of the internal combustion vehicles on the road today will have been replaced by 2050. If they have been replaced with V2G-capable EVs, storage will simply not be an issue by then.
"Overall, the company made a net loss of $26.7 million, down from the $15.9 million loss of the same period last year"
If the loss this year is greater, or bigger, or larger than the loss of the previous year, doesn't that mean the change in the loss is the opposite of down?
While I laud your optimism for carbon neutrality, I reckon fusion will be important for next-gen space travel. In interstellar space the only viable power source is one that travels with you. Fission reactors will only last a few decades, so the thing that powers propulsion and life support has to be fusion using interstellar hydrogen. Maybe not the original problem the tokamak was envisaged to solve, but there are plenty of examples of discoveries and technologies that ended up being used for completely different purposes from those they were originally conceived for. The internet being a prime example.
The original bug in the Harvard Mark II was also not part of the design, but gave its name to what we now refer to generically as a flaw in the operation of the device/software, whatever the cause. That said, since posting the above I realised I omitted a line from the original (early 90s) joke.
I am Pentium of Borg.
Division is futile.
You will be approximated.
Botnets are still ubiquitous, if my web logs are anything to go by. You know they are part of the same botnet because even though one failure is from Brazil, another from France, and yet another from Ukraine, etc., they all cause the exact same number of failed logins in a 24-hour period.
Hint to SVR: try to introduce some randomisation with your low and slow.
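As a sketch of how you might spot that "same number of failed logins" tell in practice: the log format, field names, and threshold below are all made up for illustration; real sshd or web-server logs would need their own parsing.

```python
"""Rough sketch: flag source IPs that produce identical failed-login counts per day.

Assumes a toy log format of "<ISO timestamp> <ip> FAILED_LOGIN" per line.
"""
from collections import Counter, defaultdict
from datetime import datetime

def failed_login_counts(lines):
    """Count failed logins per (day, source IP)."""
    counts = Counter()
    for line in lines:
        try:
            ts, ip, event = line.split()
        except ValueError:
            continue  # skip malformed lines
        if event == "FAILED_LOGIN":
            day = datetime.fromisoformat(ts).date()
            counts[(day, ip)] += 1
    return counts

def suspected_botnet_groups(counts, min_ips=5):
    """Group IPs that hit the exact same daily count -- the 'same number of failures' tell."""
    by_count = defaultdict(set)
    for (day, ip), n in counts.items():
        by_count[(day, n)].add(ip)
    return {key: ips for key, ips in by_count.items() if len(ips) >= min_ips}

if __name__ == "__main__":
    sample = [
        "2024-01-02T03:04:05 203.0.113.7 FAILED_LOGIN",
        "2024-01-02T10:11:12 198.51.100.9 FAILED_LOGIN",
        "2024-01-02T21:22:23 192.0.2.44 FAILED_LOGIN",
    ]
    print(suspected_botnet_groups(failed_login_counts(sample), min_ips=2))
```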
Once upon a time the pilot of an aircraft was responsible for a pre-flight check covering the airworthiness of the plane, the takeoff weight, the fuel load, and the maintenance history. They would also have to submit a flight plan, all before takeoff. It was part of being a pilot, and not necessarily anything to do with knowing how to actually fly the plane in the air. Several disasters later, we know it's simply not possible any more for the pilot to gather all the info they need to know, so it's the responsibility of the airline to ensure that the required checks have been completed and the information is accurate. Pilot error is often cited when things go wrong, but it should be obvious from the above that modern pilots of larger aircraft are completely blameless when things like takeoff weight are wrongly calculated.
Further to the above mental rambling: from the point of view of looking toward a black hole, the farthest one can see is the event horizon. At the event horizon, time has stopped from an outside observer's point of view. All the stuff that has ever fallen toward the black hole is jammed up together, moving slower and slower the closer it gets to the horizon. If there are any magnetic fields being generated by moving charges, would they be in that wad of almost unmoving, almost timeless stuff near the event horizon?
In a plasma, by definition, there is charge separation. Magnetic monopoles do not exist in the known universe; the lowest-order field is a dipole, and that requires circulating currents. Circulating currents require charge separation; however small the electric field, a current implies moving or unbound charges. So if a black hole is generating a magnetic field, where are the moving charges? If light can't escape a black hole, does a black hole's magnetic field come from inside the event horizon?
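For what it's worth, the two textbook statements behind that reasoning are just Maxwell's equations (standard physics, nothing specific to this observation):

```latex
% No magnetic monopoles: field lines always close, so the lowest-order far field is a dipole
\nabla \cdot \mathbf{B} = 0

% Ampere--Maxwell law: a quasi-static magnetic field requires a current density \mathbf{J},
% i.e. moving charges somewhere
\nabla \times \mathbf{B} = \mu_{0}\,\mathbf{J} + \mu_{0}\varepsilon_{0}\,\frac{\partial \mathbf{E}}{\partial t}
```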
We have plasma streaming out from the Sun at some velocity. The density of the plasma is constantly reducing as it spreads out over a larger area, and so the momentum flux and the pressure it exerts are reducing too. If the heliopause represents the boundary where the pressure of the interstellar plasma equals that of the solar wind, then unless the interstellar plasma is travelling toward the Sun with the same or greater velocity (and let's face it, it won't be unless we were closer to some other star), the only way you can have a point where the two pressures are equal and the combined momentum is zero is if the interstellar medium is more dense. It has to be more dense because the solar wind is travelling at a greater velocity than the interstellar plasma. The solar wind is travelling away from the Sun faster than the interstellar plasma is travelling towards the Sun because the solar wind comes from a star close by; by comparison, the interstellar plasma is likely to be in thermal equilibrium, which makes it cooler, and thus also more dense than the solar wind, because it has been sitting in interstellar space for billions of years.
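A crude way to write that balance, ignoring magnetic pressure and treating each side as ram plus thermal pressure (my simplification):

```latex
% Pressure balance at the heliopause (very rough):
\rho_{\mathrm{sw}} v_{\mathrm{sw}}^{2} + n_{\mathrm{sw}} k_{B} T_{\mathrm{sw}}
  \;\approx\;
\rho_{\mathrm{ism}} v_{\mathrm{ism}}^{2} + n_{\mathrm{ism}} k_{B} T_{\mathrm{ism}}

% With v_sw much larger than v_ism, the interstellar side can only hold the line
% by being denser and/or by its thermal (and magnetic) pressure.
```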
https://3parug.com/viewtopic.php?f=25&t=2515
And
https://support.hpe.com/hpsc/doc/public/display?docId=emr_na-a00022399en_us
Apparently, if you had a few of these drives in your StoreServ, once the error rate started to increase (read: inevitable pre-failure), array performance was sooo bad it was effectively an outage. Everyone had to leave the room while the regional manager from HPE discussed options with your account exec. Definitely a situation HPE wanted to 'manage' carefully. Give me an XP any day.
In a dream sequence featuring David Bowie, Flight of the Conchords have already established that pretending your bed mate is female does not count as gay.
" - Hey, David Bowie? - Yes, Bret? Do you mind if I ask you a personal question? Of course, Bret.
That's what I'm here for.
Mm, if a friend of yours puts a wig on you when he's Ionely, pretends you're a woman, is that gay? He was pretending you're a woman? No, that's not gay.
Are you sure? Totally fine.
Mm, okay.