* Posts by guyr

73 posts • joined 25 Oct 2012


Developing for Windows 11: Like developing for Windows 10, but with rounded corners?


Re: Meh to both Windows 11 and mobile 5G

Haha, I like that picture. I had totally forgotten that I went to the midnight release of Windows 95. :)


Meh to both Windows 11 and mobile 5G

I actually am getting old, but I can't get excited about either Windows 11 or the next mobile standard, 5G. My main computer is still running Windows 7 (I have a secondary on Windows 10, but rarely use it), and my 3-year-old 4G phone on Android 10 does everything I require. Microsoft positioning Windows 11 not to run on hardware older than about 2 years makes my (non-)decision about Windows 11 very easy.

Samsung reveals DDR5 memory module that’s ready for Compute Express Link


Re: Don't get it

This is why USB has replaced older parallel connections

And why PCIe (serial) replaced PCI (parallel).

More Linux love for Windows Insiders with a kernel update


Re: Windows Tax

Well, it's very convenient for developers running in a corporate network that uses Active Directory and Windows clients. You can have several entire Linux distros of your choice on that client, and you need never bother IT for permission to put a Linux machine on the corporate network.

I've been doing that for well over a decade using VirtualBox. And VirtualBox has supported GUI apps - including seamless mode, which displays Linux windows directly on the Windows desktop - for many years. So, what unique capability does WSL add?


Re: Windows Tax

"Anything "Windows for Linux" is nothing but a cancer out to get at the Linux ecosystem from inside out."

So you're flipping Steve Ballmer's famous quote now? Strange world.

Absolutely fab: As TSMC invests $100bn to address chip shortage, where does that leave the rest of the industry?


U.S. chip companies need to cooperate

TSMC has stated they are spending $20 billion on their next-generation fab. I would imagine that is a significant hurdle for U.S. companies. TSMC gets all sorts of subsidies, which the US doesn't provide. I think this will require some law changes as well, to allow US companies to cooperate without fear of being sued for anti-competitive practices. I know Intel has said they are looking to build two fabs in Arizona; I don't believe it. I think they are just trying to exert economic pressure on TSMC and others. Intel has already said their lower-end processors are being farmed out to TSMC. These fabs are just too expensive for individual US companies to undertake without subsidies or other financial support.

Canonical: Flutter now 'the default choice for future desktop and mobile apps'


try installing Ubuntu Server on a box, and then try to upgrade to use the NVidia drivers.

Why would you be installing NVidia drivers on a server box? Typically, servers don't employ graphical tools.

SK Hynix boss predicts CPUs and RAM will merge, chipmakers will hold hands to make it happen



I thought HBM was designed to address this problem. You'll never be able to solve this problem in general by merging memory into the CPU, because different applications need different amounts of memory. The latest Intel Xeon Cascade Lake has "only" 38.5 MB of onboard cache. So, figuring out how to expand HBM seems to be a better expenditure of time, energy and money.

Devuan adds third init option in sixth birthday release


I don't have a fax, I'm Internet only these days

If you have Internet, you have fax. :) I've been using fax1.com for years because there is no set monthly cost, and when you buy credits for faxing at 12 cents (US) per page, they never expire. It would likely get expensive if you fax in high volume, but I bought $10 of credit probably 8 years ago and haven't used it up yet.


President Biden to issue executive order on chip shortages as under-pressure silicon world begs for help


the US used to HAVE fab plants of their own

This is the point people need to acknowledge. All the major semiconductor companies - IBM, Intel, AMD, etc - did their manufacturing in the US. All those plants actually still exist, and are churning out chips today, but not the latest and greatest. Why? State-of-the-art chip manufacturing is not for the faint of heart. TSMC's current 7 nm fab cost $10B, and the next one is estimated to cost $20B. For comparison, AMD had a great year in 2020, with a net income of $2.4B, but that included a one-time income tax benefit of $1.3B, so more like $1.1B. That's still great for AMD, which had lost money for 15 years straight.

So, how does a company that makes $1.1B in its best year afford a $20B manufacturing plant every 10 years? It can't; the math doesn't work out. AMD outsourced manufacturing to TSMC for its latest chips because it had no other choice, not because it just wanted to make a couple more bucks. Building those chips on their existing (last-generation) production line would have been unsuccessful - too big, too much heat. And they didn't have the time or money to build a new current-gen fab.

So, how does TSMC do it? Government subsidies and indirect government support (tax breaks, low-interest loans, etc.). TSMC can easily get that $20B loan and go 10 years showing little or no profit. No US company can do that. So how do we fix this? I'm not an economist or a miracle worker, but I doubt we can mass-hypnotize American investors into accepting zero profit on their investments for 10 years at a time. So, off the top of my head, I'd say we need to rethink how massive investments can be made while still showing profits. Public funding? Accelerated tax write-offs? Smarter minds than mine need to think about that.

Microsoft tells Biden administration to adopt Australia’s pay-for-news plan


Despite Microsoft pouring billions into loss-making media ventures, creating new options for advertisers inside Xbox games and boasting annual search-driven ad revenue in the billions, Smith blamed others for the media’s problems.

None of which has anything to do with news. I'm not a shill for Microsoft, Google or Facebook, but CNet could aid understanding by focusing on the topic at hand.

In Rust we trust: Shoring up Apache, ISRG ditches C, turns to wunderkind lang for new TLS crypto module


Re: But///but...this is routine programming

"My guess as to what's gone wrong isn't 'malloc' as such but 'new'. This can create quite large objects either from pool memory or on the stack"

"new" does not allocate objects on the stack. But the larger point is that while I haven't looked at the httpd code recently, most code written in the last 10-15 years doesn't use "new" directly, but instead relies on libraries to manage dynamically allocated memory. The problem of memory not being freed has been well-known for decades, so projects of any significant size use libraries to ensure that doesn't happen.

KDE maintainers speak on why it is worth looking beyond GNOME


Re: The "Problem" with Linux

At least Mate (based on GTK 2 last I checked)

Mate moved to GTK 3 several versions ago. I use it exclusively on all my Linux systems. Its original design goal, straight from the MATE website https://mate-desktop.org/:

"The MATE Desktop Environment is the continuation of GNOME 2."

So, the look and feel of GNOME 2 is the essential ingredient, rather than a particular generation of GTK.

'This was bigger than GNOME and bigger than just this case.' GNOME Foundation exec director talks patent trolls and much, much more


Re: I'm just rebuilding my desktop ...

I've been running Ubuntu MATE (after starting with Mint MATE) for several years, and it has been trouble-free. You can opt for the LTS (Long Term Support) releases, which only come out every 2 years and thus provide a more stable platform. In reality, I've found that MATE provides its own stability, so I don't find the 6-month releases that disruptive. But I've adopted LTS on all my systems except for one VM I use expressly to see what's currently going on, simply to cut down on the update volume.

Third time's still the charm: AMD touts Zen-3-based Ryzen 5000 line, says it will 'deliver absolute leadership in x86'


Why skip a number series?

So the last generation of desktop processors was the 3000 series; why has AMD jumped to 5000? I see on this page:


AMD appears to have allocated the 4000 series to mobile chips. Is that a one-time thing, or are they going to allocate even-numbered series to mobile and odd-numbered series to desktops?

And then there is this head-scratcher. On that page: "AMD Ryzen™ 3000 Series Mobile Processors with Radeon™ Graphics", under which is ... wait for it ...

AMD Ryzen™ 9 4900H, AMD Ryzen™ 7 4800H, etc. How's that for clear marketing???

Teracube whips out cheap, fixable phone with removable battery and four-year warranty


Re: A suggestion for a long life

Took a look at the specs - missing bands 66 and 71, so forget about 5G. Unfortunately, this phone doesn't even have newer 4G bands. I like the idea, and my needs are pretty basic; I especially like the replaceable battery, which every phone before my current one had. My current phone is an LG G7, which was engineered in 2018 and has both 66 and 71. So it doesn't seem like including these bands in 2020 should be a difficult engineering challenge.

Cross-platform app toolkit Flutter lead Tim Sneath aims Dart at an ambient computing future


Re: Cross platform?

Java is cross-platform, and is by far the most widely used language for enterprise applications because it can get you where you need to be in a reasonable amount of time and resources. But Java GUI apps never really caught on because, as you say, they tend to look terrible and non-native, so people don't like them.

If Flutter can get an app on desired platforms in a short amount of time to get a feel for potential user interest, that seems like a desirable approach. Most organizations either can't afford or don't have the time to simultaneously develop for multiple native platforms. Then if sufficient interest exists, developers can weigh the cost and benefits of a native approach for specific platforms.

Oracle adds Arm-powered servers with up to 160 cores to its cloud – must be why it sunk millions into Ampere


workloads that can thrive on modest resources

"AMD is also playing nice with Oracle, with the chip slinger's Epyc Milan silicon on the way. Oracle plans to rent individual cores of the new processors to those running microservices and similar workloads that can thrive on modest resources."

This is an odd statement. AMD is presently outperforming all available Intel options by a wide margin. So why is Oracle targeting AMD cores only to workloads requiring modest resources?

Party like it's 2004: Almost a quarter of Windows 10 PCs living with the latest update


I've tried to update my single Windows 10 system to 2004 at least 10 times now. *Still* failing. Thankfully, my primary system is on Windows 7, which is safe because Microsoft abandoned it. :(

Relying on plain-text email is a 'barrier to entry' for kernel development, says Linux Foundation board member


Re: But for accurate communication between geeks at long distances, plain text wins every time.

I worked in software development for 40 years, now mostly retired but dabble in open source projects to keep my brain working. In the early years, we used rudimentary editors and makefiles. Got the job done, but the toolset was not really helpful to comprehension.

For the last 25+ years, we've used IDEs for the development task, which greatly facilitate understanding. Tools don't replace the need for a developer to understand what the code is doing, or even more important, fundamental software concepts. But they certainly help competent developers do their jobs, with fully cross-referenced source, source-level debuggers, etc.

To your point, plain text facilitates accurate communication between remote systems. Today we have tremendous tooling that enables easier communication between people. Think distributed reviews.

WSL2 is so last year: Linux compatibility layer backported to older Windows 10 versions


"Rather than requiring Hyper-V, which is resolutely not part of Windows Home"

Ye of little faith. Please see my comment on the page in the following link. Hyper-V can be enabled on Home with some easy-to-follow steps.


Linus Torvalds pines for header file fix but releases Linux 5.8 anyway


#ifdef/#define wrappers

Came to say exactly this. The problem of including a header more than once was solved at least 30 years ago, back when I was doing C/C++ development regularly, using this #ifdef/#define pattern. I can't imagine any code passing a code review today without it.

Battle for 6GHz heats up in America: Broadcasters sue FCC to kill effort to open spectrum for private Wi-Fi


NAB is worried about it because they use it for production

Yes, that's it exactly. I found this topic interesting enough to do some searching, and found an article that explains some of it. Apparently that band is used by wireless microphones as well as microwave backhaul links. To those of us not in the industry, WiFi in the home wouldn't appear to impact either of those applications. However, I can see a scenario where someone is trying to do an on-site news report in a suburban neighborhood, and every house is leaking these WiFi signals, rendering the spectrum unusable for licensed users.


CompSci student bitten by fox after feeding it McNuggets


It's likely the foxes will be euthanised

No, that's what we tell ourselves we do to our dogs, so we can sleep after the deed. The foxes will simply be killed.


*WE* computer folks are supposed to be smart enough to know how to use English

Same thought - apparently English is not a prerequisite for writing Reg columns.

Intel outside: Chip king Keller quits x86 giant immediately 'for personal reasons'


Smart but restless

Jim Keller is without doubt a very smart man, given his stints at every major CPU maker. But he doesn't appear to like staying in one place for very long. Given how complex the task is, I'm a little surprised the big names - Intel, AMD - are willing to make the investment. Keller joined Intel in April 2018, meaning he was there a little over 2 years. How quickly can anyone absorb an architecture with billions of transistors and start making significant contributions? I worked in software development, not hardware, and we would allow new people a couple of months to get familiar with the architecture of any project before we expected major independent work.

This'll make you feel old: Uni compsci favourite Pascal hits the big five-oh this year


Re: pascal was simply useless.

>> "Meh. Pascal appealed because it was strongly typed."

Indeed, others here are criticizing Pascal for being strongly typed. They fail to recognize that this one feature is what made Pascal the preferred teaching language in the 70s. I went to a pretty rigorous engineering university, and it employed Pascal in both intro and advanced language classes. If you are instructing people who know nothing about structured programming, you want to start them thinking in very methodical techniques. Pascal provided that structure.

More than a billion hopelessly vulnerable Android gizmos in the wild that no longer receive security updates – research


Both manufacturers and Google are to blame really.

In my experience, the responsibility goes further than that, to the carrier. For an Android update to appear on my phone requires the cooperation of all three: Google, the phone manufacturer, and the carrier (T-Mobile in my case). Unless all three are on board, the update will not show up on the phone.

Android owners – you'll want to get these latest security patches, especially for this nasty Bluetooth hijack flaw


Re: "you'll want to get these latest security patches"

I came here to say the same thing. We may *want* to get updates, but unless the maker of our phone and our carrier both provide the updates, then it ain't happening. Telling phone users to get an update accomplishes about as much as telling your dog to get a driver's license.

LibreOffice 6.4 nearly done as open-source office software project prepares for 10th anniversary


Re: "Has LibreOffice succeeded?"

"In LibreOffice, you can change chart type and hide or show the legend and that's just about it."

I've only used LO at home, where my needs are not that great. But I did add a chart to my retirement spreadsheet, and tweaked it to get what I wanted. The chart capability actually does provide quite a bit of customization, though it might not be as easy to find and apply as you'd wish. As one example off the top of my head: to tailor how individual lines in a data series appear, you can't just right-click the line and modify its properties. Instead, you need to open the data series editor and change the properties there.

Waity K8-y no more Pivotal: We'll unhook Application Service from VMware


We're now at a scale that is too massive for humans to touch.

Hmmm, why did I immediately think of SkyNet upon reading this?

There once was a biz called Bitbucket, that told Mercurial to suck it. Now devs are dejected, their code soon ejected


Re: Git

To think that we're going to be forever lumbered with this sad example of "eh, good enough" design saddens me when there are so many better systems that already exist.

Forever? Not if somebody thinks of something better. Remember, Torvalds developed git as a 1-month side project when he couldn't find an available version control system that worked as he liked, without unbearable licensing terms. The entire open-source universe has proven its meritocracy-based roots. Torvalds himself would not hesitate to throw out git if something provably better comes along.

Bit of a time-saver: LibreOffice emits 6.3 with new features, loading and UI boosts


Re: 32 bit removal

"32-bit code will actually run a bit FASTER than equivalent 64-bit code, if for no other reason than the instructions and pointers are smaller, which means LESS memory has to be read into the cache, and so on."

Hogwash. If you've ever looked at Intel assembler, loading an address into a register is a single instruction. That's true regardless of whether the address is 32-bit or 64-bit. Next, loading a memory location *from* the address in that register is *also* a single instruction, again regardless of whether the address is 32-bit or 64-bit. So, the only possible difference would be in how that 32-bit or 64-bit address got into memory in the first place. And since memory is loaded in multiples of pages, and NOT individual 32-bit or 64-bit values one at a time, there will be no difference in the amount of time spent loading the 32-bit or 64-bit addresses into memory (which may be zero in either case, if the address is derived from a previous operation and already in a register).

Now, having said all that, if you've got some degenerate case in which for some reason millions of addresses are stored in memory, then sure, each additional million 64-bit addresses will use an extra 4 MB over their 32-bit equivalents. I can't conceive of how this would happen, but don't deny the possibility. That could have a measurable impact, though if you are dealing with such a case, you are probably anticipating the issues involved.

Airbus A350 software bug forces airlines to turn planes off and on every 149 hours


Turn off and back on - patented process

I think Microsoft patented the process of turning the machine off and back on again to fix any random problem. Some serious fees are going to be paid for this.

El Reg sits down to code with .NET for Linux and MySQL, hitting some bumps along the way


Re: MS Access for Linux

In a corporate software development environment, I've never been in a group that seriously considered MySQL for production work. If open source is preferred for a database, PostgreSQL is a natural choice.

Delphi RAD tool (remember that?) gets support for Linux desktop apps – again


Re: Communicates with Delphi on Windows?

"This article is exactly about now being able to develop GUI application for Linux and no longer only console ones."

Thanks for the correction. I went back and reread this part:

"FmxLinux was developed by a third party, Eugene Kryukov. It has been licensed under “a long term distribution agreement,” says Embarcadero’s Marco Cantu in the announcement this week." I followed the link to the announcement, and it makes more sense now. But I also read this on the announcement page: "active subscription to the Enterprise or Architect editions". What's up with Embarcadero? They really like to put the screws to their customers. Less pain-seeking customers can just license FmxLinux directly at a much lower total cost.


Communicates with Delphi on Windows?

"On the Ubuntu side, you have to install an agent which communicates with Delphi on Windows."

What's this? I took a 15-minute look at Kylix when it came out, and honestly I don't remember anything about it. But this quote has me scratching my head. Am I understanding correctly that to run a Delphi app on Linux, you need to shuttle work over an agent to a process running on Windows? Sounds like the mother of all kludges. I really can't see the niche for this. Perhaps if you have a staff of experienced Delphi developers and you want to deploy a bespoke app onto a Linux desktop? But if you also need a full Windows install to do the grunt of the work, why bother? Just run the app on that installation of Windows.

Okay, found this article on the Embarcadero site that explains things:


The agent is only required during development, not deployment. You develop the app on Windows and cross-compile it for Linux. GUI apps are not supported, only console apps. Meh, very niche market. If you want to run a full app on both Linux and Windows from a single source, several options are available: Java of course, but for native look and feel, wxWidgets gets you there.

Halleluja! The Second Coming of Windows Subsystem For Linux blesses Insider faithful


OS/2 failed for different reasons

I was inside the castle for pretty much the entire run of OS/2, from pre-1.0 to 3.0 Warp. From my perspective, what killed OS/2 was IBM's defiant adherence to the original design decisions, and a refusal to modernize. For example, IBM insisted it run on the 286, even though by the time it reached any measurable market, the 386 was ubiquitous. That led to numerous architectural restrictions that messed up the works. From a developer's viewpoint, the single input queue (as in all Windows versions prior to NT) was a nightmare; this was the old cooperative multitasking model, where a single badly designed program could hang the entire OS.

Microsoft addressed all these issues in NT, but IBM refused to budge with OS/2. When Microsoft demonstrated they were serious about NT, everyone gave up on OS/2. It held onto that darned single input queue to the very end!

All nodes lead to Rome: Epyc leak spills deets on second-gen Zen 32-core AMD server chippery


Re: Mostly harmless upgrades

I doubt mainstream businesses upgrade CPUs very often. If a company has a rack of dual CPU servers supporting their business and they are working acceptably, they don't risk disrupting their business by tearing apart that rack. Instead, they'll just keep those servers going until they are depreciated, then replace them completely with newer ones.

The only large scale upgrades I've ever read about were supercomputers, where they bring the system offline and replace (e.g.) 8096 processors at once. They can do that because time on that system is scheduled, and it's not used for daily purposes like processing credit card transactions.

Mozilla tries to do Java as it should have been – with a WASI spec for all devices, computers, operating systems


Re: If it happens

"Java has notoriously bloated, verbose code which offends many programmers' sense of elegance."

I don't know what you are referring to. I programmed professionally in Java for 20 years and never felt offended. Java has the same control structures as all current C-like languages. When you say bloated, are you referring to the library load pulled in by programs of any significance? Then you should compare that to the full set of shared objects required to run, e.g., C/C++ executables.

Nokia 9: HMD Global hauls PureView™ out of brand limbo


Pictures of fingers

Geezer warning. When I see these phones with all these lenses on the back, I can only think one thing: I would clumsily put my fingers all over them. Maybe these multiple lenses produce stunning photos, and would make a good *camera*. But as I'm talking on the phone and my mind starts to drift and I start fidgeting, the lenses would get frequent contact. I have an LG G5 with only 2 horizontal lenses, and I have trouble staying clear of those!

Clearly, I'm not the target market.

Core blimey... When is an AMD CPU core not a CPU core? It's now up to a jury of 12 to decide


Buyer wake up

I bought two Opteron 4234 processors for a workstation. I was fully aware of the Bulldozer architecture, and that a module shared an FPU. As others have said, processors change over time. I hope this case goes down in flames. We don't want to hamper innovation by punishing vendors after the fact. AMD did not hide Bulldozer's architectural details. If it did, *that* would be grounds for a lawsuit. They were up front about the architecture, so I can't see where they did anything deceptive.

Man drives 6,000 miles to prove Uncle Sam's cellphone coverage maps are wrong – and, boy, did he manage it


British English then?

"Those results doesn't correlate exactly"

We would say that differently in the states.

Talk about beating heads against brick walls... Hard disk drive unit shipments slowly spinning down


New math

"The third category was nearline and other high-capacity (3.5-inch) drives at 11 million supplied, down 9.5 per cent on the year from about 10 million."

10 million last year to 11 million this year is ... "down 9.5 per cent"???

Poor people should get slower internet speeds, American ISPs tell FCC


Rich man poor man

Rich people have nicer stuff than poor people? Say it isn't so! When did this start to happen?

This is not a new issue. This exact same thing happened during the telephone era. Running a telephone wire 90 miles out into the country to serve 5 farmers just wouldn't pay for itself, so the early telephone companies wouldn't do it. The solution was a government access tax on everyone to subsidize service to those the profit-making companies otherwise refused to serve. If we (society) deem that in our time everyone should have internet access, then we'll need to do something similar.

I'm as much a liberal as anyone, but I see this issue more as a capitalist one, not a fairness one. We can't take a profit-making company that has to earn its own way, and force it to provide money-losing services.

Who fancies a six-core, 128GB RAM, 8TB NVMe … laptop?


Re: What does it run?

However my ubuntu laptop was ubuntu because I tried windows 10 and was officially told windows 10 once installed can't dual boot because of UEFI (I was asking for the ubuntu install disk) So probably because of Dell not windows

Perhaps Dell did something nasty to impose that limitation. I bought a refurb HP Pavilion 500-314 that came with Windows 8 pre-installed, which I then upgraded to 10. This is a UEFI system. I then installed Ubuntu in its own partitions. Without any work from me, Ubuntu made itself the bootloader and added Windows as a boot option.

I wouldn't be shocked if Dell tried to prevent that, but you probably would have been successful if you tried.

Is this cuttlefish really all that cosmic? Ubuntu 18.10 arrives with extra spit, polish, 4.18 kernel


Re: "the system has a more modern and 'flatter' look"

I came to the comments to weigh in on this same topic. I don't get the drift toward flat interfaces. 5-year-old graphics chips are perfectly capable of rendering 3-D graphics. I prefer icons with some depth; the appearance is visually pleasing. So who decided that we all need flat interfaces? Is this change simply for change's sake?

Sync your teeth into power browser Vivaldi's largest update so far


Re: thunderbird.. I would like to see something that continues refinement

I had to give up on Thunderbird when it corrupted our Google Calendar at work. I needed to replace it with something quickly, and Windows was the corporate desktop. I found eM Client:


Windows only, unfortunately. I would prefer a cross-platform solution, but have found Evolution meets my needs on Linux. eM Client interacts well with Google Calendar and Google Contacts, and allows multiple IMAP accounts.

WLinux brings a custom Windows Subsystem for Linux experience to the Microsoft Store


VirtualBox seamless mode

VirtualBox has had seamless mode for years. In seamless mode, Linux app windows appear on the Windows desktop just like native Windows apps. No need to download anything from an app store, plus it runs on any version of Windows, including Home and Windows 7. And of course, you can run any flavor of Linux (or BSD or ...). In short, I don't see the selling point of WSL. I guess if you have a version of Windows 10 that already has it, then why not. Other than that, meh...



Biting the hand that feeds IT © 1998–2021