
Vista?
Does this mean Apple is going to license the Windows Vista Design Language?
Fresh talk of Linux on the desktop is misplaced... the boat sailed when the 64-bit migration happened with Windows 7. Even the polished turd that was Windows 8 couldn't drive users off Windows.
The next Microsoft OS is likely to be called something like "Frame", built on a hyperlight Rust micro-kernel/hypervisor with Windows and Linux sub-systems; it might even be open source
Do we blame Chelsea Football Club for the invasion of Ukraine? No; just because the club was owned by a sanctioned associate of Vladimir Putin doesn't mean they're culpable.
Before being sold to Fujitsu, ICL was owned by the UK Government, and encouraged to develop capabilities in Northern Ireland. Do we blame CFM Belfast for the Horizon system developed in Bracknell? I'm going to go with probably not.
With a large Northern Ireland footprint, it might just be that the Northern Ireland Executive took the view that local knowledge counterbalanced "mainland" failings.
While the announcement is geared to the upgrade cycles of key customers, the reality is that it uses standard rack sizes, very fast microprocessors and PCIe connection cards. When running Linux it can be considered just a very, very big box, with a price to match its level of reliability.
the report "Don had inadvertently made his server the machine most likely to respond first. But as it was built to keep traffic in the lab" is a little too generous. DHCP uses (udp) broadcast messages, so there is no "most likely to respond". "was built to keep traffic in the lab" is also too generous - if there was a firewall policy to keep traffic in the lab Don's server wouldn't be a problem. What (likely) happened, was that campus PC's received a lab IP address complete with a lab gateway address that wasn't reachable.
The root cause is likely that Don kept UDP open to retain access to network printers.
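To see why there's no "most likely to respond first" with DHCP: every host bound to UDP port 67 on the broadcast domain receives every DHCPDISCOVER, and whichever OFFER lands on the client first wins. A minimal listener sketch (illustrative only - nothing to do with Don's actual setup):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;

// Illustrative: any process bound to UDP 67 sees every broadcast
// DHCPDISCOVER on the segment - there is no "preferred" responder.
public class DhcpListener {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(67)) { // needs root/admin
            byte[] buf = new byte[1500];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet); // blocks until a broadcast arrives
                System.out.printf("DHCP frame from %s (%d bytes)%n",
                        packet.getAddress(), packet.getLength());
            }
        }
    }
}
```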
Depending on your deployment scenario, it might be worth coughing up for an Oracle license - GraalVM is only available for a fee, and might save on cloud hosting fees...
.. but if you wanted to save on hosting costs, you might be better off migrating to C#
https://benchmarksgame-team.pages.debian.net/benchmarksgame/fastest/java-graalvm.html
https://benchmarksgame-team.pages.debian.net/benchmarksgame/fastest/java.html
At the inception of electronic legal due diligence, the first step was always to scan all documents for keywords like "Bhopal" (for exposure to the chemical disaster) and words like "unlimited". It was quickly discovered that blanket scorecards have limited value, given the number of false positives they highlighted.
The reason to mention it is that the lesson was not learned by Facebook, which censored references to the Austrian town of Fucking, the Spanish chocolate Negro and the English food faggots. It seems that despite the huge investment in "AI", they never really got to grips with context, and Californian censors were (frankly) too stupid to comprehend that words had an established definition before they were adopted as profanities.
While most normal people find Elon Musk's posting of court transcripts (technically pornographic text) disturbing, it is still impossible to adequately censor content.
We should be honest: Facebook failed on many fronts
'C' is a difficult language to parse because of exception cases and implicit conversions. "Mini C" parsers have existed for a very long time (Haskell used a mini-C intermediate language for over a decade before LLVM rendered it redundant); there is even an ANTLR grammar for parsing C in Java. It would be news if it produced safe Rust code, but it doesn't.
The principle of parsing a language into an Abstract Syntax Tree (AST), then rendering the AST into another language, is not rocket science; Hiperspace.SQL is an example that translates SQL into C#
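For anyone who hasn't built one, the shape of such a transpiler is roughly this toy sketch (all names invented; nothing to do with Hiperspace.SQL's internals):

```java
// Toy "render an AST into another language" sketch - the parse step
// (e.g. via an ANTLR grammar) is omitted; the tree is built by hand.
sealed interface Expr permits Num, Add {}
record Num(int value) implements Expr {}
record Add(Expr left, Expr right) implements Expr {}

public class AstDemo {
    // Walk the tree and emit C-style source for it.
    static String render(Expr e) {
        return switch (e) {
            case Num n -> Integer.toString(n.value());
            case Add a -> "(" + render(a.left()) + " + " + render(a.right()) + ")";
        };
    }

    public static void main(String[] args) {
        Expr tree = new Add(new Num(1), new Add(new Num(2), new Num(3)));
        System.out.println(render(tree)); // prints: (1 + (2 + 3))
    }
}
```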
Watching evidence to the inquiry, there were a large number of discrepancies reported in the migration from Horizon (store & forward via dial-up ISDN) to Horizon-NG, highlighted by a whistleblower.
The case that was brought to the Old Bailey was not in scope for Horizon-NG, so no reasonable expert witness would reference bugs from that rollout.
The often-ignored fact is that Horizon was primarily an EPOS and network system, with replication of all transactions to the Post Office each night, but POCL did not implement management information to support counter management or investigation (the cash-account report was initially a transitional system). You shouldn't need an engineering expert witness for a criminal case: the evidence should be clear.
Post Office Counters brought a case against a sub-postmaster at the Old Bailey (Central Criminal Court) with the express intent of making an example, fielding a full team of barristers and solicitors for the prosecution, while the sub-postmaster chose to represent himself.
The sub-postmaster highlighted an example that he believed to be conclusive proof that the system was defective, but there were scenarios where it could be normal:
1) Messages transmitted (via UDP) between counters 1 & 2 were written to disk first by counter 2 (the idle device) while counter 1 (single CPU) was busy with counter transactions
2) The session-move key was used to transfer the session to another counter (e.g. to use the weighing scales), but was then cancelled.
Gareth's evidence related to the specific scenarios mentioned in the defence; he was not cross-examined or asked about other defects. The UK courts rely on the adversarial principle: if you don't cross-examine, doubt is not considered. The judge could have asked questions, but it is likely they were peeved by the professional discourtesy of not engaging a barrister.
Had the case been reported to the police, a solicitor would have been appointed (who would have sought legal funds from the Federation of Sub-Postmasters and/or professional indemnity insurance), and the CPS would have considered the evidence before seeking prosecution. This didn't happen because POCL used its crown-agency status to investigate and prosecute, choosing to bypass the crown court and seek trial at the highest court (only an appeal to the supreme court can override the Old Bailey, and only if a legal precedent is needed). Leave to appeal was not sought.
It was a miscarriage of justice, but not because Gareth didn't volunteer an answer to a question that was not asked.
Anybody who didn't switch from Oracle when it bought Sun Microsystems deserves it, but if you're tied into Oracle Database, WebLogic or Oracle Financials, the cost of the Java subscription is discounted and a small part of the overall cost - Oracle has a mantra that "prices never go down". If you switch to OpenJDK, your discount tier will go down to compensate, and any financial institution will be forced to pay "support" subscriptions to another provider to offset operational risk "insurance" (that, after all, is how RedHat makes money).
Sure, no one ever needs the "support", but that doesn't make it a complete waste of money
You might want to reflect on your own racism - it was foolish of someone to start with "it might seem racist", but the caste system is one of the factors holding India back (relative to China), and that was the target of the criticism.
Elon Musk is not a paragon of inclusive management, but he does pay his staff (highly) and is in a hurry: Twitter eclipsed Facebook, but lost its way and needs to recover lost ground.
The Telum processor is a CMOS processor like AMD64; the latest AMD and Intel chips are also multi-chip processors. The difference is volume, but that applies to the POWER processor too.
Given that you can run RedHat Linux on {Telum, POWER, AMD64, ARM}, the price ratio should reflect value rather than legacy premium.
If they certified it as 100 percent compatible with WSL2, it could be very good. Azure Linux and WSL2 both use the Microsoft SMB client rather than Samba.
It would be nice if they included loadlibrary in the kernel for DLL loading
The undock procedure has been changed to push it away from the ISS faster, so that [1] if the thrusters fail completely it will drift away from the ISS fast enough to burn up in the atmosphere, and [2] if it explodes, it will be far enough away to spare the ISS from shrapnel. If there are any heat problems during the first maneuver, NASA can target a sea landing to avoid cities.
Worth noting that "NASA has approved the Starliner for a crew return as a contingency" can be translated as "in the event of a Gravity-film-like scenario (where the ISS must be abandoned), it can be used as a life-raft - better than certain death".
What nobody is mentioning is that the propellant leak on the way up might have continued, adding to the risk - at some point a Boeing manager would have highlighted the growing risk to expedite a decision.
It's entirely possible that SpaceX has vetoed sending an empty suit since it might not fit, and insists they return on a Soyuz spacecraft instead.
Other technical questions include the software patch and reboot needed to allow an unmanned undocking.
Either way, Boeing is going to have to issue a bond for uninsured liability: the astronauts will not have signed a waiver for a SpaceX ride
I thought Racketeering was illegal in the US?
Rust is a great language, and reminds me of the debate we had thirty years ago about whether to use PL/1 (or PL/S, the systems-programming subset of PL/1 without a runtime) or C/370.
The debate didn't change much for most of that time, other than the suggestion of Pascal/Delphi rather than PL/M, or of using Lint (itself a 40-year-old static analysis tool).
While it seems like a great suggestion, the main value of using an LLM is to add "AI" to the description - so much sexier than using a static analysis tool to solve the problem.
The problem is that Rust achieves type safety by applying ownership and immutability concepts that are difficult for an LLM - what you'll get is:
1. Syntax conversion
2. Compilation errors
3. Managers wondering how long it's going to take to complete
4. Managers complaining that "the AI has done the hard part", so why are the developers taking so long?
eBPF includes a verifier, where shoddy software does not.
As someone who has written kernel-level code, I can assure you that it is possible to write formally provable code using Formal Methods - I don't anymore because the levels of code review, profiling, verification and {unit, system, integration, regression, performance} testing are prohibitive. Duff data is only an issue if you don't verify it.
I know my surveillance code was 100% reliable, not because I'm some kind of genius or used VDM mathematical proof; it was 100% reliable because it checked every pointer and fell back to a read-discard loop (after a WTO instruction) that ensured it did no harm. CrowdStrike's code is surveillance code - their first work-around "fix" was to delete csagent.sys
To translate your comment: "It is certainly possible for shoddy invalid data to trigger a shoddy logic bug in the shoddy code that interprets it. Indeed, that happens all the time in application code"
This article says they couldn't: https://www.theregister.com/2024/07/22/windows_crowdstrike_kernel_eu/
There is simply NO EXCUSE for what CrowdStrike has done: they've taken advantage of a signed kernel driver to side-load code into the kernel in contravention of the license agreement. We can expect a future Patch Tuesday to black-list csagent as malware.
To describe these sys files as a "broken configuration file" insults the intelligence of readers: either they include binary executable code, or they are a virtual environment like https://en.wikipedia.org/wiki/EBPF but inferior.
This really has nothing to do with the Windows security model - it's the history: when the internet became a retail experience in the early 1990s, Windows had a TCP/IP stack based on DCE and wasn't secure enough to prevent attack - compounded by the MS decision to add IIS remote debugging on the server and ActiveX on the client. Windows rightly got a bad reputation for poor protection, creating a market for 3rd-party virus protection.
MacOS is better protected than Windows because [1] it shares the micro-kernel design of the NT Executive (using messages rather than stack frames to call kernel functions), [2] almost nobody used it when virus attacks exploded, and [3] Apple had early experience of virus attacks when it was especially vulnerable to floppy-disk viruses, and learned the lesson.
CrowdStrike's problems are only just starting: it would be reasonable to blacklist their csagent.sys kernel driver as malware, destroying their business. That we're not (yet) talking about bankruptcy is a testament to two things: [1] their well-funded legal department, [2] their well-funded marketing department.
While MBA-led tech companies are poor at engineering, the modules on legal and marketing are well used in stopping the truth from getting out
Cost-Based Optimization takes a parse tree in a language we like to call "structured" query language - an LLM does the same (the prompt is parsed into a language tree)
It generates permutations of query plans, using a stop check that the estimated cost is lower than the cheapest plan generated so far
Cost is estimated using heuristics (generated from hard simulation of the cost of access paths) and statistics on the cardinality of the data and where the predicates fall within it
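As a sketch, the search loop amounts to this (invented names and cost numbers, nothing vendor-specific):

```java
import java.util.List;

// Toy cost-based optimizer: enumerate join orders, estimate each one,
// and keep only the cheapest plan found so far (the "stop check").
public class CboSketch {
    record Plan(List<String> joinOrder, double cost) {}

    // Stand-in for the heuristics + cardinality statistics described above.
    static double estimateCost(List<String> joinOrder) {
        double cost = 0, rows = 1_000; // assumed driving-table cardinality
        for (String table : joinOrder) {
            cost += rows;                 // access-path cost proportional to rows
            rows *= 0.1 * table.length(); // fake per-join selectivity
        }
        return cost;
    }

    public static void main(String[] args) {
        Plan best = null;
        for (List<String> order : List.of(
                List.of("orders", "items"), List.of("items", "orders"))) {
            double cost = estimateCost(order);
            if (best == null || cost < best.cost()) { // prune costlier plans
                best = new Plan(order, cost);
            }
        }
        System.out.println("chosen: " + best.joinOrder() + " @ " + best.cost());
    }
}
```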
Oracle (prior to v6) used rule based optimization that favored unique indexes, then non-unique indexes, then order of tables in the query - not particularly intelligent.
Granted, CBO is more like the amygdala than higher cerebral functions, but so is current AI.
Alas, we live in an age where "an algorithm" that looks at recent price history to make buy/sell decisions is branded as "an AI" - if you can call a price algo AI, or a shopping agent ("people who bought toys also bought batteries") AI, then you can certainly call CBO AI.
IBM spent years training CBO to be reliable.. but (as with Oracle RBO) early punters were happy to go with "it's like magic"
the "out of support" includes every version of SQL/Server up to 2017, since then Microsoft has been pushing clients to cloud subscriptions.
Unless you're using graph tables (which don't really work) or ledger tables (a me-too demo searching for a use-case), the main differences are scalability, security patches (even EOL SQL/Server has fewer issues than Oracle) and in-place version upgrade (page table format)
Anybody who ever used Microsoft Outlook Journal will find this feature somewhat familiar, if a little sad in the sheer stupidity of the technology.
Journal used to look at the applications you were using and what files were open, and record basic information - it wasn't very useful, but didn't use much power to process. Recall could have reinvigorated Journal by applying AI to what you're working on, but instead uses daft technology. The idea of taking a screen print of a document, then using AI to convert it back to text, is so stupid you'd only do it for a demo.
A US company that happens to sell software gives $250k to a university, and the research doesn't confirm whether the company's products were used in the study.
The fact that Neo4J has funded the study highlights that its products are not used by any web giant, and weren't used for the "graph problem of our time" (pandemic contact tracing).
When writing historical stories about the old roller mice, you really should reference the IBM field service engineering instructions for cleaning mice balls.
Back in the day, IBM had instructions for everything - but the mouse-ball cleaning instructions went viral (in as much as anything could in the 1980s) because they were hilarious when taken out of context
The Intel 80386 introduced configurable-size segments, which all the OS vendors used to set segments to 32-bit and use a single address space, with the segment selector registers pointing at this one large segment. Without much change it would have been possible to increase the address space from 4Gb to 16Gb by using separate {code, stack, data, extra} segments. All the effort would have fallen to OS and compiler vendors.
AMD didn't just increase the addressing from 32 to 64 bits; they remapped the segment registers as general-purpose registers to ensure that AMD64 programs would always be faster than whatever Intel called their version (was it x86-64 or ia32-64? I can't remember). Fortunately Intel was persuaded to build AMD64-compatible chips
Unrelated to Chelsea Manning's big dump, Julian got impatient with a DoD hacker's caution and accessed servers himself without using a VPN - that's not journalism
The clarifications are a formality - constitutional rights in the US are universal (that's why people sign visa waivers on planes to the US).
It's likely that he'll be convicted, and allowed to serve his time in "spider land" (which will release him on licence). Aus doesn't want any blow-back if he kills himself, but also doesn't want him using his celebrity to run for political office
Back when Generic Java (GJ) was mooted as a successor to Java 2, there was much debate about whether to bite the bullet and amend the Java bytecode to support runtime generics; instead they stuck with GJ type-erasure: at runtime ArrayList<T> is the same as a Java 2 non-generic ArrayList. .NET 2 did bite the bullet and include runtime generics, because it was thought that you could never replace C++ with arrays scattered across the heap.
Net result: betting that "everything will eventually be re-written in Java" was a mistake, because there is no equivalent of native arrays packed in contiguous memory, and no support for primitive data types that were not understood three decades ago
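The erasure point is easy to demonstrate (a minimal sketch, standard Java):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> numbers = new ArrayList<>();
        // Type arguments are erased at compile time: at runtime both are
        // plain ArrayList, exactly as in non-generic Java 2.
        System.out.println(strings.getClass() == numbers.getClass()); // true
        // Contrast .NET 2+, where List<string> and List<int> are distinct
        // runtime types and List<int> packs unboxed ints contiguously.
    }
}
```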
The paid version includes GraalVM, which "promises" higher performance as long as you don't use any of the stupid APIs that can't work
Java 22 still includes RMIClassLoader, 22 years after it was near-universally agreed to be a bad idea. RMIClassLoader is responsible for most security violations (including log4j & struts)
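If you're stuck with RMI in the estate, the usual mitigation (as I understand it - check your JDK's release notes) is to refuse remotely supplied codebases before anything touches RMI:

```java
// java.rmi.server.useCodebaseOnly=true makes the RMI class loader ignore
// codebase annotations in the remote stream (the default since JDK 7u21,
// but older deployments sometimes override it).
public class RmiHardening {
    public static void main(String[] args) {
        System.setProperty("java.rmi.server.useCodebaseOnly", "true");
        // ... start the registry / export remote objects here ...
    }
}
```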
The distinction between GeForce and Quadro GPUs used to be that the latter sported full FP64 double-precision floating point, while GeForce only supported FP32 single precision - fine for games and mundane traditional calculations like ballistic ranges and hydrogen bombs
FP16, and later FP8, were billed as the advance that set Google's Tensor Processing Unit (TPU) apart from GPUs, which were thought to be useful only for training.
With nVidia support for FP8 and FP4 in microcode, it's difficult not to see the end of any argument for dedicated TPU processors.
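For reference, the bit budgets (sign + exponent + mantissa; the FP8 layouts are the common E4M3/E5M2 variants - my addition, not from the article):

```latex
\begin{aligned}
\mathrm{FP64} &: 1 + 11 + 52\\
\mathrm{FP32} &: 1 + 8 + 23\\
\mathrm{FP16} &: 1 + 5 + 10\\
\mathrm{FP8\,(E4M3)} &: 1 + 4 + 3\\
\mathrm{FP8\,(E5M2)} &: 1 + 5 + 2
\end{aligned}
```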
If you find your budget for Oracle licences is going down, expect an audit.
If your user count is going down, expect your discount to be reduced to keep the bill up.
If you fail the audit you lose your discount: there was an incentive to over-provision licences.
When an audit is scheduled, shut down any servers not currently in use, and spin up Postgres on at least one of them
I thought I was doing well remembering back 35 years; citation is difficult because none of the publications at the time were on the Internet.
I developed on OS/2 (1.1-1.3) - it was rubbish for services unless you disabled real-mode applications (a processor reset was used to switch from protected mode back to real mode, and lost interrupts).
I have no evidence that IBM wanted OS/2 to remain single-CPU, but the 80486 was the first x86 processor to have CMPXCHG, needed for multi-CPU critical sections
The 8086 could address 1Mb of memory in real-mode (20-bit addressing), while 80286 could address 16Mb (24-bit addressing) in protected-mode, and 80386 4Gb (32-bit addressing).
That 20-bit addressing consisted of a 16-bit offset (for 8080 compatibility) plus a 16-bit segment register shifted left 4 bits, so segments overlap every 16 bytes.
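The arithmetic is standard textbook stuff, worth spelling out:

```latex
\mathrm{physical} = 16 \times \mathrm{segment} + \mathrm{offset}
```

So F000:FFFF = 16 x 0xF000 + 0xFFFF = 0xFFFFF (the top of 1Mb), while FFFF:FFFF = 0x10FFEF needs 21 bits: the 8086 wrapped that back to low memory, but the 286 (with the A20 line enabled) does not - which is the kind of surprise described below.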
Through experimentation, MS engineers found that the 80286 in real mode could address 20 segments, giving a usable address space of 4Mb - a surprise to MS/IBM and Intel! Because the PC-AT could take a maximum of 4Mb anyway, the whole of 286 protected mode (the original reason for OS/2) became completely redundant.
Windows 3 didn't start as a break-up with IBM, but as a way to push Win-Word and Excel (which had been ported from the Apple Macintosh) and get a head start over IBM DisplayWrite, WordPerfect and Lotus.
Windows/386 (the variant of Windows 2) was a faster GUI platform because it used a top-left origin for the CRT (the standard for TV) rather than the bottom-left origin of IBM terminals. Translation of screen addresses meant OS/2 was always going to be a little bit slower than Windows 3.
The break-up came because IBM didn't want OS/2 to be a multi-processor OS to protect its mainframe platform. Windows NT was originally going to be OS/2 3.
Anyone who's watched (or listened to the original Radio 4) Hitchhiker's Guide to the Galaxy will know that Zaphod Beeblebrox had sunglasses that turned black when something frightening happened to him.
The Apple glassholes are unique in that they are not "augmented reality" in the conventional sense, but VR where the camera image is overlaid behind the VR images... and they consequently go black during an accident. The other feature is that you don't "see" someone's eyes, but an OLED display that projects an image taken from inside the glassholes, for that authentic second-head appearance.
Going to message the Douglas Adams estate, in case they fancy a few million quid for "prior art" patent infringement: Douglas would love that!
Sounds like there is going to be quite a market for after-market paint jobs - you know, that clear outer layer of paint you find on normal cars.
There might even be some "survivalists" who choose camouflage transfers to match the assault rifles they use for "legitimate deer hunting" (clearly unaware that several hundred bullets ruin the texture of the meat)
Worth noting that the job advert is for the web substrate for Office - running in a browser. Rust is better than C# for WebAssembly because it doesn't use a garbage collector (C# downloads the Mono runtime). Rust is better for edge security than C++ because it blocks unallocated reads (that could reveal keys).
Rust has a bright future, but its biggest impact is likely to be the open release of the C++ code-analysis tools that MS uses internally, and an alias for std::unique_ptr<T> (T^ anyone?).
C# is now the undisputed fastest managed language, and TIOBE language of the year for good reason, but there are still situations where managed languages are not the best fit.
If you were a device developer, that is exactly what you could do.. All operating systems have source licences for device developers - what you couldn't do is ship the product.
I loved "the land before Linux" story, but couldn't work out which episode of star-wars it was a parody for, but am looking forward to the the "return of the jedi" edition. The only thing missing is the plot hint about Xenix for the "I'm your father Luke" line.
Those familiar with UNIX SVR4 will remember the SunOS boot sequence where the Microsoft (C) is included alongside AT&T's. The original "Unix on the desktop" outfit got the gig for PC-DOS because [1] IBM was already in the room shopping for a Basic interpreter, [2] the CP/M author had snubbed IBM, and [3] MS had pitched Xenix to IBM for the desktop - IBM thought "if you can do Xenix, you can do DOS".
MS was a Unix shop before OS/2 (they developed and ran office apps on DEC Unix) - but changed as PCs became more powerful. Nobody should forgive them for the Ballmer years, but we shouldn't rewrite history either.
I've designed database schemas for DB2, Teradata, Oracle and SQL/Server. Raw volumes were introduced for Sybase, Oracle and others because (at the time) Unix only provided buffered synchronous IO. DB2 never needed raw volumes because VSAM datasets were always asynchronous, and translation from a VSAM page address to cylinder/track/sector imposes little overhead.
I was skeptical of SQL/Server dropping raw volumes in the migration to NTFS (from OS/2 to NT), but direct async IO and list IO reduced the performance advantage. The biggest distinction now is that databases don't need a journaled file-system – Oracle ASMLib and MS Storage Spaces Direct avoid this but are still logically file-systems.
You’re right that NAS storage and S3 will never be as fast as a raw volume.
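To make the buffered-versus-async point concrete, here's a minimal sketch using Java's AsynchronousFileChannel (illustrative only - not how any of these engines is actually written, and the file name is invented):

```java
import java.nio.ByteBuffer;
import java.nio.channels.AsynchronousFileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.Future;

// Async reads let an engine overlap IO with other work - the benefit raw
// volumes once delivered and buffered synchronous Unix IO didn't.
public class AsyncReadDemo {
    public static void main(String[] args) throws Exception {
        Path dataFile = Path.of("datafile.dbf"); // hypothetical data file
        try (AsynchronousFileChannel channel =
                 AsynchronousFileChannel.open(dataFile, StandardOpenOption.READ)) {
            ByteBuffer page = ByteBuffer.allocateDirect(8192); // one 8Kb page
            Future<Integer> pending = channel.read(page, 0);   // returns immediately
            // ... the engine could service other sessions here ...
            System.out.println("read " + pending.get() + " bytes");
        }
    }
}
```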