* Posts by juice

870 posts • joined 16 Nov 2010


Warning: That new AMD Ryzen 7000 laptop may not be as fresh as you think

juice Silver badge

Re: Power to the processors

> So if we accept that my first generation Ryzen with graphics card (gaming PC) draws 500 watts then running it for 12 hours a day would cost £4.32. If I spent lets say a thousand quid on a more efficient box that only draws 250 watts then the cost saving would be £2.16 per day that I used the PC for 12 hours.

Your maths might be a bit off; according to https://www.omnicalculator.com/everyday-life/electricity-cost, your 500W Ryzen would cost £2.16 per day if run for 12 hours a day at full pelt, versus £1.08 for a 250W machine.

Which to be fair, helps to strengthen your argument!

Though equally: would you actually need to spend £1000 on a replacement? A quick peek at pcspecialist.co.uk suggests you could spec up a basic Ryzen 3 machine with 16GB ram from around £400, depending on how much stuff you want to bring over from your old machine (e.g. GPU, HDDs, OS licence, etc). Or you can buy a motherboard/CPU/RAM combo from scan.co.uk from around £250...

Also (and depending on the use-case), you could use something far smaller, lighter and cheaper. E.g. a Raspberry Pi 4 draws a maximum of 5W, which is handily just 1% of your Ryzen's power draw, or around 2.16p per day.

And at £57.50 for the 4GB model[*], plus another tenner or so for a case, that'd break even in about 5 weeks...
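(For anyone who wants to check my working, the sums are trivial; here's a minimal sketch, assuming the ~36p/kWh unit price implied by the figures above - your tariff will vary:)

```python
KWH_PRICE = 0.36  # £/kWh - an assumption, back-derived from the £2.16/day figure

def daily_cost(watts, hours=12, price=KWH_PRICE):
    """Cost in pounds of running a constant load for `hours` per day."""
    return watts / 1000 * hours * price

ryzen = daily_cost(500)  # ~£2.16/day
pi4 = daily_cost(5)      # ~£0.02/day (about 2.16p)

outlay = 57.50 + 10      # Pi 4 (4GB model) plus a tenner for a case
print(outlay / (ryzen - pi4))  # ~32 days to break even, i.e. about five weeks
```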

Admittedly, it'd entirely depend on whether a Pi 4 could perform the tasks that your GPU-equipped Ryzen can do. And that's not taking into account any peripherals (e.g. hard drives) which need to be plugged into your machine.

Anecdotally, I'm just about to switch from a dual-X5570 Xeon machine to a single E5-1650 which I picked up off Ebay for £165. Partly because even before the recent energy price hikes, the ancient dual-CPU beast was chewing through about 50p a day when idling.

It'll probably still take a year or so for the new machine to break even, but in the meantime, I also get a machine which is both faster and quieter. So it's generally a good thing :)

[*] Assuming you can find them anywhere...

Creatives up in arms over claim that AI is killing human art

juice Silver badge

Re: How many times have we heard "art is dead" since the '50s?

> See also "The Art of Kenny Who?" circa 1980's

There's plenty of earlier prior art (pun unintentional); there was a short story by Harry Harrison [*] which probably dates back to the 60s or 70s, about a comic artist who commits suicide after years of being demoted to just feeding scripts into an automated drawing machine [**], having first drawn his final comic which shows him cleaning up his drawing board before jumping out of the window.

Only for the comic editor to then mock his drawings, as the final panel doesn't exactly match up with how his body is lying on the ground below...

[*] I think - I've not read this story for several decades. Harry Harrison was himself a comic artist before he became a sci-fi writer, so it's not the easiest short story to try and do a web-search for!

[**] Oddly prophetic in some ways, given that's pretty much what we're doing with the current AI tools!

BOFH: It's Friday, it's time to RTFM

juice Silver badge

LOL

Is it time to go old school and bring the roflcopter back?

(Also: do [pre] tags work here? Time to find out...)

ROFL:ROFL:ROFL:ROFL
         _^___
 L    __/   [] \
LOL===__        \
 L      \________]
         I   I
        --------/

Arm sues Qualcomm over custom Nuvia CPU cores, wants designs destroyed

juice Silver badge

Re: RISC-V

> Well, I respectfully disagree : I think it could potentially happen.

That's the nice thing about opinions: we can all differ ;)

It'd be mildly interesting to see just who has joined the RISC-V foundation.

But in truth, I'd guess that's being driven more by the US embargoes on Russia and China than by any of the ARM/Nvidia shenanigans, plus maybe a bit of bet-hedging.

I mean, does anyone remember when IBM set up a press office in Second Life? And all those blockchain projects which were announced with great fanfare have since fallen distinctly silent...

Beyond that, I can't see any major manufacturers wanting to take the risk of having both ARM and RISC-V product lines. Because it's not just about the CPU design; it's also about the SoC hardware design, and the software.

And all the support and patching overheads thereof.

That's a huge set of extra costs, and the margins on low/mid-tier smartphones are already pretty thin.

Plus, the market is currently contracting for everything other than flagship phones, which reduces the amount of money people are willing to invest.

Pragmatically, the most likely scenario I can see is that we'll get a bunch of small OEMs (most likely from China, though Vietnam and India are both gearing themselves up to get a slice of the ePie) making budget RISC-V handsets, with budget hardware and shonky software and support.

Stuff like this isn't exactly going to set the market on fire; if anything, there's a risk that we'll just end up with another flood of "landfill android" devices.

On the other hand, it'll be interesting to see how the US tech embargoes affect things; that may well have a serious effect on the rate at which RISC-V evolves...

So, yeah. Let's meet back here in 5 years, and the person whose prediction was the furthest out gets to buy the beers ;)

juice Silver badge

Re: RISC-V

> If Qualcomm were to release a half decent RISC-V processor

Realistically, that ain't gonna happen.

In the first instance, the RV64 RISC-V chip which is supposedly coming to smartphones Real Soon Now is claimed to be roughly equivalent to a Snapdragon 662 chip.

https://www.notebookcheck.net/First-RISC-V-smartphones-could-launch-in-2022.581456.0.html

And even if we believe the manufacturer's claims about performance, that still puts it at roughly a third of the performance of Qualcomm's flagship Snapdragon 888 Plus chip.

https://www.notebookcheck.net/Qualcomm-Snapdragon-662-Processor-Benchmarks-and-Specs.497538.0.html

It's also worth noting that even the 888P is slower than Apple's A14 chip.

RISC-V is not currently an ARM killer when it comes to mobile phones. It's definitely not fit for purpose when it comes to flagship devices, or even for mid-tier handsets.

It can be an ARM killer where power-consumption trumps the need for processing power (e.g. the IoT eco-system), but it'll take years before RISC-V can compete with ARM's chips, and neither ARM nor Apple are standing still in the meantime.

Secondly, there's the question of how Qualcomm would go about using RISC-V. There are three main possibilities:

1) Just use a standard RISC-V design

2) Have an in-house team produce a customised RISC-V design

3) License a customised RISC-V design

The first option gives them zero competitive advantage over other manufacturers. The second option requires them to build, maintain and fund a CPU design team. The third option is pretty much the same as licensing CPU designs from ARM...

Plus, they'd then have to maintain two separate software stacks for their ARM and RISC-V ecosystems.

And thirdly, this is Qualcomm. A company which has spent the last 20 years using its massive patent war-chest to shake down all other mobile phone manufacturers for as much money as it could soak from them.

I.e. Qualcomm is institutionally averse to the concept of open-source principles.

So even if Qualcomm did decide to adopt RISC-V, with its permissive open-source licence, does anyone really think they'd actually be willing to share their own designs, or contribute anything back to the RISC-V community?

Braking news: Cops slammed for spamming Waze to slow drivers down

juice Silver badge

Re: feels valid to me

> I had not realised quite how effective the grey backed trendy number plates were at preventing ANPR detection

Wowsa. It only took a few seconds of searching to discover a .co.uk website selling licence plate covers which block IR detection. They're not even trying to position it as a legally defensible "anti-dust" cover or somesuch.

It's definitely a step up from the cars you see zooming around which have somehow got mud splattered all over their boot, but nowhere else - mud which has conveniently happened to completely cover the licence plate...

About that $1b... IBM says Watson Health assets fetched $230m in pre-tax profits

juice Silver badge

Re: Answers

Where's amanfrommars when you need them? ;)

juice Silver badge

Re: Perhaps the bloom is off the AI rose

> Are we seeing a concrete example that AI/ML cannot produce the results the hype machine promised

I suspect that it's the same issue as per self-driving cars, in that it's a mix of legal and technical concerns.

Or to put it another way: the legal costs involved with any technical solution which isn't 99.999% perfect are simply going to be too high!

Leaked Uber docs reveal frequent use of 'kill switch' to deactivate tech, thwart investigators

juice Silver badge

Two wrongs don't make a right

> Reality is that the legacy taxi cartels at least verge on organised crime, where they are not well-known to be entirely controlled by it - many cities around the world, from New York to relatively small towns.

Cartels are generally a bad thing. But I'm struggling to see how replacing a local cartel with an international cartel would ever be a good idea, especially when companies like Uber have literally billions of dollars to spend (legally or otherwise) on maintaining their market dominance.

> Uber is orders of magnitude safer for passengers, and cheaper, and preferred by drivers. Legacy minicab firms were primarily known for tax dodging and treating their drivers like dirt, driving pay well below minimum wage. It's really hard to see what people complain about Uber for.

This would be the same Uber who was routinely paying their drivers below the minimum wage, until eventually forced by legal action to change things? The same Uber who was doing that across the entire world? E.g.

The USA: https://www.theguardian.com/technology/2018/mar/01/uber-lyft-driver-wages-median-report

The UK: https://www.bbc.co.uk/news/business-56412397

... and the same Uber which has yet to make a profit, since they've been continuously burning through their investor funding by subsidising passenger costs in an effort to drive out rivals and monopolise the market?

I'll grant that Uber is nominally safer for passengers than the traditional "hail from the kerb" black cab, but they're far from the only company to offer an internet-based hailing service. E.g. in Sheffield, the local firm City Taxis has been offering much the same service, at a similar price, and this was rolled out about a decade ago.

And truth to tell, City Taxis have generally been more reliable than Uber, and they don't have those insane surge-pricing periods, during which it's cheaper (if ironically, less safe) to just flag down the next black cab you see.

Beyond that, my (admittedly simplistic) understanding of Uber was that it was essentially a massive high-payout gamble based on the theory that self-driving cars were just a few years away.

I.e. first they intended to capture as much market share as quickly as possible, both to drive out any rivals and also to outpace any legal action which could be levelled at their business practices.

Secondly, once they had market dominance, they were going to roll out self-driving taxis, which would then allow them to do away with taxi drivers altogether, alongside all of their pesky wage requirements.

And with that lovely market dominance to protect them from rivals and politicians alike, they'd then be able to behave in the traditional monopolist way: arbitrarily raising prices, crushing potential upcoming rivals and generally resting on their laurels while letting things stagnate.

Sadly for them, self-driving cars turned out to be far trickier than expected, and that meant that the world's legal systems were able to catch up to them and force them to change their behaviour.

It'll be interesting to see just how things play out in the next year or two. Though at the very least, Uber has somewhat acted as a mechanism to redistribute wealth, by taking all that investor money and using it to subsidise passenger fares and driver costs...

One of the first RISC-V laptops may ship in September, has an NFT hook

juice Silver badge
FAIL

Re: CPU

> If you want to make modifications to ARM chips, you aren't allowed to

Apple seem to have had little or no problem modifying their ARM chips to make them significantly faster than any other ARM chips out there. And they've raked in vast amounts of money as a result...

Arrogant, subtle, entitled: 'Toxic' open source GitHub discussions examined

juice Silver badge

Re: Eh?

> What about the above opinion is entitled, demanding or insulting? Sounds to me like somebody knows a better way to do something, and yet is being forced into doing it in a way that drastically slows down the entire process

It depends.

Something which may be convenient for person A may turn out to be highly inconvenient for person B. And it's definitely a prima-donna move to wildly exaggerate just how inconvenient the current situation is.

Or to put it another way: is the new way actually better for everyone, or just for person A?

Equally, I'm not particularly fond of the "either put up and shut up, or fork and build your own" attitude which can be quite common when discussing open source technologies; in its own way, that's as toxic a form of gatekeeping as any other.

However, open source contributors are all too often both under-resourced and under-appreciated. And there's a huge difference between making suggestions which could improve things, and throwing a hissy fit because something isn't 100% perfect for your specific needs.

And for me, the above comment definitely falls into the latter category!

Consultant plays Metaverse MythBuster. Here's why they're wrong

juice Silver badge

What about the biggest myth?

> If so, you can add VR headsets to the list of things that people won't rush towards.

To be honest, this is the key "myth" that this little puff-piece conspicuously failed to mention: how many people are willing to wear VR headsets for extended periods, in order to interact with this metaverse?

juice Silver badge

Re: Holodeck

> They actually shoot the actors on green screen and use a computer to replace the green (color humans don't have) with whatever CG decor they want. And if you feed the real camera's moves to that same computer you can make it copy the exact same camera movements in your fake 3D decor space, making it look even more real (check any and all recent movies).

It's a mix of things. I suspect what the original author was referring to was the tech pioneered by Industrial Light & Magic for The Mandalorian, where the actors are basically stood inside a giant ring of screens which project the background for the scene.

https://www.redsharknews.com/production/item/6963-the-mandalorian-is-totally-redefines-cgi-for-television

The problem there is that when all is said and done, it's still just a flat projection on a screen, with no 3D attributes at all; anything the actors need to directly interact with has to be either a physical prop or given a placeholder in the shape of a man holding a tennis ball at the right height, who can then be replaced with some post-processed CGI.

Which works well enough for filming a TV show, but isn't quite as useful for interacting with a metaverse...

Will this be one of the world's first RISC-V laptops?

juice Silver badge

Re: Some serious questions.

> Just about everything in your post is wrong

Oh noes!

> RISC-V was not introduced 12 years ago, some students and their professor had a crazy idea in a pub to START it 12 years ago. It was essentially introduced to the world a little under 7 years ago.

You might want to go and update Wikipedia with your detailed knowledge then, since that's where I sourced my timescales from. Or indeed, the official RISC-V history page.

Personally, I'd differentiate between the initial development of RISC as a concept, and the actual implementation of RISC-V. Since as the name suggests, RISC-V is actually the fifth generation of RISC design!

Beyond that...

https://en.m.wikipedia.org/wiki/RISC-V

https://riscv.org/about/history/

major RISC-V milestones were the first tapeout of a RISC-V chip in 28nm FDSOI [...] in 2011, publication of a paper on the benefits of open instruction sets in 2014, the first RISC-V Workshop held in January 2015, and the RISC-V Foundation launch later that year with 36 Founding Members.

For me, the fact that the design was open-sourced and taped out in 2011 is the key date; it may have then taken 5 or 6 years for RISC-V to be publicly debuted, but that doesn't change when the "1.0" specification was released.

> Dave Patterson invented the term "RISC" and the first RISC I CPU around 1980-1981, not 1990. I can only assume you weren't born at those times and consider them prehistoric.

Ooo. It's always a pleasure to be considered younger than I actually am ;) Sadly, while I was a little young to be using computers in 1980, I did start poking buttons on a ZX Spectrum in 1983 or so.

And again, I was quoting Wikipedia:

The term RISC dates from about 1980 [...and academic research...] resulted in the RISC instruction set DLX for the first edition of Computer Architecture: A Quantitative Approach in 1990 of which David Patterson was a co-author, and he later participated in the RISC-V origination.

RISC may have been "named" in 1980 - ARM originally stood for Acorn RISC Machine back in 1983 - but again, for me, the release of the DLX paper in 1990 is where RISC-V was born, especially since the author of that paper - David Patterson himself - had a huge hand in designing RISC-V.

But then, the nice thing about opinions is that everyone has one!

> ARM does NOT allow you to add or remove things from their CPU core or the instruction set. Of course you can add whatever you like else in the SoC, as you don't license that from ARM and ARM doesn't make such IP

Odd. I must have dreamed up the link which I added to my post, about how ARM lets you add custom instructions to your CPUs.

No, wait...

https://www.arm.com/technologies/custom-instructions

Certainly, "Arm Custom Instructions support the intelligent and rapid development of fully integrated custom CPU instructions" sounds like exactly what you're talking about, unless I'm missing something major.

> The Raspberry Pi is very far from standard. There are simply a lot of them. (Compared to other SBCs, not compared to phones or tablets)

This one is a bit more subjective. But for me, the argument would be that once something has a significant market share, it's effectively a standard, as also happened with Apple's iOS devices. Certainly, there's a very large and healthy eco-system for both iOS and RPi devices, with far more peripherals, expansions, etc available than for any other devices - or indeed, all other similar devices combined.

I'd also note that the RPi is based on a fairly standard ARM "mobile phone" SoC, which means it's able to hook into standard Linux/Android/Windows toolings and libraries.

So while I can see some merits to your argument, I still think that from a practical perspective, the RPi is a standard!

juice Silver badge

Re: Some serious questions.

> However x86 and to a lesser extent also ARM carries some baggage due to their age. Furthermore they both contain everything you could dream of in a general purpose processor, and more

It's worth bearing in mind that RISC-V was introduced 12 years ago, and is based on David Patterson's RISC designs, which he first drew up in 1990 while in academia. So it's not *that* new, though at least RISC-V was effectively a "clean slate", since it didn't have to carry any significant baggage over from its school days.

> RISC-V is super simple and most functionality, even floating point operations, is optional extensions

You can't add or remove things from an x86 chip, but ARM very much lets you pick and choose what you want in your SoC.

In fact, that's part of the reason why Apple's ARM chips are dominating things at present, since they've picked the bits they want and then done their own engineering and custom design work atop that!

https://www.arm.com/products/custom-socs

https://www.arm.com/technologies/custom-instructions

It's also worth bearing in mind that all that extra flexibility carries costs of its own. One reason why ARM took so long to make any inroads against x86 dominance of the "PC" market was that while there were millions (if not billions) of ARM devices out there, each one was based on a unique design, and OS/software had to be custom tailored to each one.

It wasn't until we got Android (and to a degree, iOS) that we started to get properly standardised ARM hardware designs which you could build standardised software packages for. And that then led to things like the Raspberry Pi and its ilk.

> It’s also open, so companies wouldn’t have to pay for a licence, in theory, because they would in most cases still have to buy a design somewhere… Maybe there would be companies offering custom designs to the industry.

It'll be interesting to see how pricing pans out over time; for most "commercial" Open Source stuff, the main charges come from support and training, and I wouldn't be surprised if those costs ended up roughly on par with those for x86 and ARM chips.

> Also because it is so new and carry so little baggage, general purpose versions for PC’s have the potential to have very good power/performance. It should at least be able surpass x86 to rival ARM.

Perhaps. The last article I saw about such things indicated that while a prototype RISC-V chip was significantly beating the competition in the power stakes, it was also only running at about a quarter of the speed.

And even then, that comparison was based on benchmark results provided by the manufacturer, using a deliberately simplified single-process benchmark, since Ars didn't actually have a sample of the RISC-V chip to test.

And we all know just how reliable and unbiased manufacturer-supplied benchmark figures can be!

https://arstechnica.com/gadgets/2020/12/new-risc-v-cpu-claims-recordbreaking-performance-per-watt/

Fundamentally, RISC-V may be improving faster than its competition, but that's partly because it's so far behind them. And it remains to be seen whether it'll be able to become truly competitive on both price and performance, especially given that x86 and ARM both get a lot more design-money thrown at them, and are able to licence patents etc.

On the other hand, there's plenty of niches where low-power (and/or patent-free/politically unencumbered) CPUs can be slotted into. So I think RISC-V is here to stay, regardless!

The sad state of Linux desktop diversity: 21 environments, just 2 designs

juice Silver badge

Re: The curse of overchoice

> But what are you familiar with?

I'll say that when it comes to "modern" software, I'm generally used to commercial offerings, such as Microsoft Office and Windows. OTOH, I also spent a few years using a Macbook as my primary environment, and I'm currently using Ubuntu for my job.

And generally, they're fairly consistent, give or take little fun things like alt-tabbing behaviour, or having to use ctrl-shift-c and ctrl-shift-v in Ubuntu when doing copy-pasta in a terminal.

> Proprietary stuff is apt to keep changing the UI.

Not sure I'd agree with that. From my experience, proprietary stuff is generally far more conservative - at least when it comes to "business" technologies.

And as ever, it's all down to following the money. Significant changes mean more calls into support; they invalidate existing training/certifications (and mean that your lucrative training courses need to be rewritten and your trainers re-educated); and they increase the risk of user churn.

After all, if you're going to have to retrain to use the new version of your current software, why not just move to some other vendor who is cheaper and/or offers a more familiar UX?

As such, Microsoft, Apple, etc tend to go for well-spaced out "revolutionary" changes, which are heavily telegraphed in advance, and with plenty of support offered to ease people into the new way of things.

Conversely, FOSS and smaller "independent" commercial companies are more inclined to experiment; for the former, there's no financial implications, and for the latter, they're likely to have a dedicated userbase who are willing to accept significant variation from standard UX.

Admittedly, you do get exceptions to every rule, and for complex and popular stuff such as Gimp which is effectively "built by committee", there's plenty of people who will actively resist UI changes.

But there's also plenty of FOSS stuff which is driven by small development teams, who may either be enthusiastically exploring a "new paradigm", or are being driven by some ideological commitment...

And that's not a bad thing; some of those experiments may actually be better, and may even end up incorporated into other software.

But at the same time, they can be an active hindrance to efficient working, especially if you regularly switch between multiple platforms.

juice Silver badge

Re: The curse of overchoice

> Thank you, yes - I just want to be pointed to an ISO that will install a version of linux that looks vaguely Windows-like on my old laptop. That is how I (and others) will be persuaded to make the change.

I'd go a bit further: I just want something which works in a way which I'm familiar with.

Having lots of choice is great when it comes to the consumption of things - food, drink, music, movies, etc.

It's not as great when it comes to productivity tools, especially those sourced from multiple providers. I don't want to spend any great amount of time learning how to use the latest shiny: I want to be able to bring at least 90% of my existing knowledge and "muscle memory" over with me from the last shiny.

In fact, I'd say that there's an argument to be made that the market will generally trend towards just two or three operational modes. See operating systems (Windows vs Mac), mobile phones (Android vs Apple), or even online shopping (Amazon vs Ebay) and social media (Facebook vs Twitter, or even Youtube vs TikTok).

Fundamentally, if a new paradigm is revolutionary rather than evolutionary, then it has to be so much better than the competition that it completely supplants it before the competition can respond. Otherwise, it'll either just fade away completely or just end up serving a niche userbase.

The end of the iPod – last model available 'while supplies last'

juice Silver badge

Re: Ol' (mostly) reliable

Probably a bit late now, but yep - if you had to change the batteries, you had to do it next to the computer, so you could re-sync your songs onto it.

As Doublelayer said, early PDAs had similar issues, though with at least some (e.g. Palm), I'm pretty sure they had a capacitor "UPS" built in, which gave you about 10 seconds to get the batteries swapped out. Even worked, sometimes!

There was also a period where some PDAs were still using RAM for internal storage, but also had memory-card slots built in. At which point, apps started to spring up which would let you back up/restore the internal memory from said memory card.

Which at least meant that you could swap out the batteries while travelling...

juice Silver badge

Re: Still use my ipod

> Toyota's player doesn't support album shuffle.

Could be worse.

At one point, I picked up a car stereo with a sd card reader, which could play MP3s.

Which was all well and good... except that the built in RNG reset itself when the car was switched off.

So the "shuffle" feature always played the same songs in the same order...

After a while, I got into the habit of deleting random files from the card, just to force a bit of a "reshuffle"!
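(For the curious, it's the classic unseeded-PRNG failure: if the firmware never seeds its random number generator from something that varies between power-ups - a clock, say - then every boot replays exactly the same "random" sequence. A quick illustrative sketch in Python - obviously not the stereo's actual firmware:)

```python
import random
import time

PLAYLIST = [f"track_{n:02d}.mp3" for n in range(1, 11)]

def shuffle_on_power_up(seed):
    """Simulate the stereo booting up and shuffling the card's contents."""
    rng = random.Random(seed)  # a fixed seed mimics firmware with no entropy source
    order = PLAYLIST[:]
    rng.shuffle(order)
    return order

# Hard-coded (or never-initialised) seed: the same "shuffle" every single trip
print(shuffle_on_power_up(0) == shuffle_on_power_up(0))  # True

# Seeding from something that varies between power-ups fixes it
print(shuffle_on_power_up(time.time_ns())[:3])
```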

juice Silver badge

Re: Ol' (mostly) reliable

> There are plenty of companies making nice standalone music players, so when your iPhone dies you can migrate to one of those.

Possibly. In some ways, it's the integration with iTunes which is the key thing for me. Because I do actually sync from both iTunes -> iPod and vice versa.

E.g. I'll put some new songs onto the iPod, listen to them for a day or two, and then rate each individual song.

Then, when I sync the iPod with iTunes, I can use those ratings to update my "master" playlists. E.g. I can delete any 1-star songs, or move any 5-star songs over to one of my permanent "genre" playlists.

Similarly, iTunes has the option to mark songs as "skip while shuffling".

Because, as much as I love leaving things up to the Gods of Random, I've also got a number of comedy and seasonal songs (or sometimes both - the HP Lovecraft themed A Very Scary Solstice, which rewrites Xmas carols to be about eldritch horrors...) which I don't particularly want to listen to on a regular basis, but which I'd like to keep around for the appropriate occasions.

And it's also got the ability to soft-trim songs. Which is great for tunes which have spoken-word intros (e.g. live recordings).

And as far as I know, there's no MP3 player (hardware or otherwise) which supports these features.

Though I'm more than willing to be pleasantly surprised :)

Having said all that, I'm going to have a look at Music Bee, since I'd quite like to have all my tunes on my Samsung S21+, and for some reason, iSyncr is bringing over all the songs, but not the playlists. Which makes the Gods of Random happy, but complicates everything else!

juice Silver badge

Ol' (mostly) reliable

I've had a long history with digitally compressed music; I can remember encoding CDs on a P100 machine, and friends who had RAM-based MP3 players; if the battery died, then you lost your music!

However, back in the day I mostly stuck with Sony's MiniDisc players; the later models gave you up to 320 minutes per disk (at around a quid per disk), Sony grudgingly let you digitally copy music over to the later models via USB, via the abomination called SonicStage, and you could get over 50 hours playback from a single AA battery.

And the MP3 players at the time frankly couldn't hold a candle to either the battery life or song-storage capacity, especially since I could easily carry a half-dozen MiniDiscs with me!

Still, once solid-state capacities started to grow past 1GB, I did start using MP3 players.

I can't remember exactly when I fell into the Apple eco-system, but for me, the iPod peaked with the 3rd-gen iPod Nano. The "shortened credit card" form factor was far easier to keep in a shirt pocket than the tall/thin form factor favoured for the other Nano variants, and 8GB gave the Gods of Random (tm) a fair amount of stuff to play with.

Plus, at that point, iTunes was measurably better than SonicStage.

I did eventually drift away from iPods, but ended up being roped back in entirely by accident; my old MP3 player died, the day before I was due to go on a 20-hour coach trip to Germany, so I nipped into a local Cash Converters and grabbed a used 120GB iPod Classic.

And that thing lasted for over a decade; it was a bit dented and scratched when I finally parted with it, but functionally, the only real issue with it was a diminished battery life. And even then, I think I still got about 50 quid for it, via Ebay...

Alas, things haven't been quite as smooth sailing since then. I did pick up a "refurbished" iPod Classic with a 128gb SD card, but it was prone to locking up. I suspect the battery wasn't quite up to snuff, as it was generally fine when charging/playing via a dock.

I also dabbled with iPod Touches, but all the ones I bought experienced the same issue; after a while, the headphone jack-port would fail. I did try to get a couple of them repaired, but this was never successful.

In the end, I bit the bullet and upgraded... to a 128gb 1st-gen iPhone SE, from CEX. Since it was both cheaper than an iPod Touch and is the last iPhone model to have a headphone port.

And I'm still using said beast daily, while sat at my work desk.

Smeg knows what I'll do when that finally dies (and/or I can't source a good replacement from Ebay or similar).

For all that iTunes is a bit pants, it's still better than most options, especially since I'd have to deal with the joys of exporting all the stuff I've set up; I carry around 10,000 songs with me, split across 50+ playlists, even if I do usually just let the Gods of Random pick the songs for me ;)

I have dabbled with porting stuff over to Android - e.g. via iSyncr, which uses AppleScript (or somesuch) to query iTunes and then transfers the given playlists and songs to your Android device - but it's an extremely clunky process and Android is fundamentally a far poorer host for locally stored music, especially with the number of tunes I carry around.

And as far as I know, there isn't any Android player which lets you tag/rate songs, which still remains my favorite way to process new tunes: stick 'em in a playlist, sync to the player, and then mark them as 1-star if they're crap.

So, yeah. I've had a good run with the classic iPods, if not so much the more modern iPod Touches. It's been one of the few places where Apple's It Just Works ethos properly clicked with me!

BOFH: You'll have to really trust me on this team-building exercise

juice Silver badge

Re: Takes me back

At one point, I was working in a ground floor office, which had full height glass "walls" directly by the pavement running alongside the building.

Thankfully, said glass was silvered, so people generally couldn't see into the office during the day.

Which meant that one day, attention was called to a young lady, who'd clearly had quite an enjoyable afternoon in the pub, and had decided to use the corner of our building to relieve herself. While clearly being completely unaware that said silvered glass was effectively a one-way mirror, and there was a load of tekkie types staring at her...

ZX Spectrum: Q&A with some of the folks who worked on legendary PC

juice Silver badge

Re: Screen memory layout

> Why was that so whacky?

The short and simple answer is that it saved memory.

At 256*192, a simple black-and-white display uses 6KB of memory. Conversely, a display using 8 bits of colour per pixel would use 48KB. Which wouldn't have left any memory to do anything else, unless Sinclair was willing to bump the RAM up to 64KB[*]!

Conversely, the colour overlay only used 768 bytes, since it only needs one byte per 8*8 block on screen.

And that meant that even the 16K Spectrum still had about 9KB of RAM free, which was enough to do something useful. Or to play games like Jetpac ;)
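(The arithmetic is easy to check - a quick back-of-the-envelope sketch, in Python rather than Z80, obviously:)

```python
WIDTH, HEIGHT = 256, 192

mono_bitmap = WIDTH * HEIGHT // 8           # 1 bit per pixel   -> 6144 bytes (6KB)
full_colour = WIDTH * HEIGHT                # 8 bits per pixel  -> 49152 bytes (48KB)
attributes  = (WIDTH // 8) * (HEIGHT // 8)  # 1 byte per 8x8 cell -> 768 bytes

print(mono_bitmap, full_colour, attributes)  # 6144 49152 768
```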

It's also worth noting that the ZX Spectrum wasn't the only computer to use this sort of technique, though most of its rivals did offer other display modes.

https://en.wikipedia.org/wiki/Attribute_clash

Still, you gets what you pays for, and the ZX Spectrum was significantly cheaper than its more flexible competition...

[*] 8-bit CPUs can generally only address up to 64KB of memory, which can be a mix of RAM and ROM. The basic Spectrum models had a 16KB ROM and then either 16KB or 48KB of RAM. Conversely (and to somewhat simplify), the Commodore 64 had a 16KB ROM and 64KB of RAM, and the programmer could choose to either use the ROM + 48KB of RAM, or replace it with their own code and use the entire 64KB of RAM!

Fancy a remix? Ubuntu Unity and Ubuntu Cinnamon have also hit 22.04

juice Silver badge

Re: Might be worth moving with the times...

> Interesting opinion, but in my opinion it basically misses the point.

The nice thing about opinions is that we all have one...

And in my opinion, you're perhaps missing the point as well. Or at least: we're perhaps talking from two different perspectives.

> Likewise, if what you need is something that is genuinely 'open' then use that, and accept the costs inherent in that.

And this is where I think our perspectives are differing. The above line suggests that you're looking at it from the perspective of someone implementing a piece of functionality with OSS. Whereas I'm looking at it from the perspective of the end-users who are then trying to make use of that functionality.

Because let's face it, my hypothetical great aunt doesn't care if the software on her laptop is "open". She just wants it to work.

And that brings me back to my pathological obsession with standards. Because the key thing which has driven scientific, logistical and engineering evolution is the introduction of consistent standards. Even where those standards were arguably flawed (e.g. Fahrenheit, or PRINCE2), they give people something they can work against.

And when all is said and done, Open Source is effectively just a codification of a given process or standard. Whether it's the GNU C compiler, the linux kernel or the BSD network stack: here's a tool to do a set of things in a standardised and agreed way.

(If you're lucky, there might even be some documentation to go alongside it...)

Equally, it's one thing for a back-end developer to (say) choose Postgres over MySQL, or even to opt for MariaDB over MySQL. Because that has no impact on the end-user.

But it's an entirely different thing when it comes to "end-user" OSS implementations. Ubuntu, Mint, Redhat, Debian, etc, etc: they all do nominally the same thing, but with visibly different implementations. And as per this very article, many of them include multiple ways to mostly do the exact same thing.

Hell, just look at how much of a mess the "internet of things" has become; it's riddled with exploits, incompatibilities, etc. Despite all nominally being built on FLOSS.

For better or worse, this is arguably fundamental to the very nature of OSS: people are free to pick stuff up and tweak/mutate/evolve it to fit their needs.

But by the same token, it also seriously hinders the wider uptake of OSS, especially when it comes to things like the oft promised "year of Linux on the desktop".

juice Silver badge

Might be worth moving with the times...

> Nobody is going to hold your hand, either dive in or, to be honest, STFU.

I've been tinkering with Linux ever since I downloaded Slackware onto a stack of floppies back in 1997 and tried to get it up and running on a 486 with 8mb of ram. And my working life is spent sitting in front of a laptop running Ubuntu.

But in general, I am more in agreement with VoiceOfTruth's position.

And this kind of "It's free, so STFU" comment has two fundamental issues with it.

The first is that Open Source stuff - especially Linux - is no longer just a playground for hobbyists. It's being used as both the back end and front end of systems aimed at non-technical people. And not only do non-technical people vastly outnumber technical ones, but they generally just want something which Just Works.

The idea that (to take an arbitrary example) my great aunt is going to dive into the documentation and source code to try and figure out why DHCP isn't working on her linux router is frankly farcical.

And that also leads into the second point.

Because the key thing about Open Source is not that it's free (as in beer). It's arguably not even that it's free (as in speech)[*].

Instead, it's that Open Source defines and implements a standardised set of tools and processes which anyone can use. Thereby reducing both waste and cost, since there's no need to reinvent the wheel.

It just so happens that sort of standardisation works best when it's given away for free. But at the same time: it only works if there is a single standard - or at least, a single standard with multiple implementations.

However, when it comes to things like Linux package-managers and GUIs, that's not what we have. They all look slightly different, all work slightly differently and are generally poorly documented; fixing issues can be a nightmare unless you have the knack of hunting through Stack Overflow for nuggets of information.

This isn't to say that Linux should freeze and stagnate. Because that's the other nice thing about Open Source; people are free to take it, change it and improve it. Evolution in action, baby.

But again, my great-aunt doesn't care about any of that. She wants something which Just Works. And as long as we're in a situation where there's lots of different (and mostly compatible) solutions for the same problem, we're going to get people throwing their hands up and walking away, especially if they're non-technical.

So for me, snarking about how users should just STFU is frankly both elitist and counter productive when it comes to promoting and improving OSS.

[*] There's also the point that while open-source is generally free at the point of use, a lot of mainstream stuff is commercially funded, whether that's through donations, advertising, charging for training/support, etc. But that's a wider discussion...

Netflix to crack down on account sharing, offer ad-laden cheaper options

juice Silver badge

> We've just ditched Sky because of the deluge of ads and constant attempts to rip us off

I'm about to say some very rude things to Virgin, but that's mostly because they've hugely bumped up the monthly price, and I haven't actually switched on their TV box for the last 6 months.

Youtube is the main annoyance for me atm - the deluge of adverts they're forcing on "free" viewing is frankly obnoxious. Especially since they seem to be overly repetitive and woefully poorly targeted, to boot.

I wouldn't mind as much if YT was actually producing the content, or if any measurable amount of the monies from said adverts went to the people actually creating said content. But alas, it all seems to stream straight into Google/Alphabet's pockets.

I may have to see if I can figure out some way of getting a filtering-proxy set up for my Roku TV...

juice Silver badge

Re: GREEEEEED

> I think I prefer series that only have one or two seasons

It can be nice having properly self-contained and properly wrapped up stories.

The problem these days is that series finales tend to be deliberately written with some unfinished threads/deliberate cliffhangers, so that they've got something to kick start the next series off.

Supernatural being perhaps the prime example/chief culprit.

Unfortunately, I think this model is starting to backfire for Netflix, since they've started doing the Google thing of cancelling shows after one or two seasons.

Anecdotally, several friends have stopped watching new shows on Netflix, because they know that said shows are likely to be cancelled at a "cliffhanger" point.

Which then leads to a catch-22: if people aren't watching the shows unless they've reached a natural conclusion, then Netflix are more likely to cancel said shows, since no-one's watching them...

Shanghai lockdown: Chinese tech execs warn of supply-chain chaos

juice Silver badge

Re: Get out

> Read the article again. 372,000 cases and three deaths.

The problem with that is that those statistics are official government ones, and Chinese officials aren't exactly renowned for transparency and honesty. Especially when it comes to anything which their higher-ups have an active interest in.

Admittedly, there's no doubt that a zero-tolerance lockdown is a proven way to halt Covid transmissions. But at the same time, the long-term social and economic impacts are huge.

And with Shanghai being China's most densely populated city (3925 people per square km), I do also have to wonder just how effective even a military-enforced lockdown can be.

Then too, not only are China's "inactivated" vaccines quantifiably less effective than the western mRNA vaccines (e.g. 51% vs 90% for CoronaVac vs Moderna, and a significantly faster drop-off in antibody levels), but it also looks like they've had significantly less uptake among their older population, who also happen to be the most vulnerable.

https://www.nature.com/articles/d41586-021-02796-w

https://arstechnica.com/science/2022/04/cdc-study-spotlights-utter-failure-of-chinas-covid-zero-policy-in-hong-kong/

So, yeah. I'd take those statistics with a grain of salt. Possibly one as large as the iceberg which sunk the Titanic...

'Bigger is better' is back for hardware – without any obvious benefits

juice Silver badge

Re: Softie !

> I feel that on the software side we have had minimal return since Windows 3.11, the primary reason for future Microsoft bloat being to replenish sales of both hardware and software

Linux Mint requires 20GB of hard drive space

Mac OS requires at least 35GB of space.

Even the Pi's Raspbian distro takes up 4.7GB of space.

Yes, there's arguably a lot of pointless bloat in modern OSes, and if you're brave and/or have lots of time, you can no doubt manually trim them down significantly.

But it's not a purely "Microsoft" issue.

What do you do when all your source walks out the door?

juice Silver badge

I am kinda wondering...

What Dave was thinking when they did this:

> "During the afternoon of that day, Dave's manager looked out the window to see Dave loading boxes and boxes of floppy disks into his car," said Al.

I'm guessing he wasn't feeling particularly rational at the time, but unless there was some sort of special contract involved, both the floppy disks and the IP they contained belonged to the company. So even in the best-case scenario he'd have been absolutely hammered in court and forced to return them.

Equally, given how fragile floppies can be, I'd have hated to be the person trying to restore all the data from said disks...

RIP: Creators of the GIF and TRS-80

juice Silver badge

If you really want to set the flaming cat among the petrol-soaked pigeons, try innocently asking what the correct term is to describe bread which has been baked into a small, rounded shape.

Wars have been started over less.

(Which is silly, as it's clearly a barm cake... ;) )

Russian IT pros flee Putin, says tech lobby group

juice Silver badge

> Back in the cold war days, the western countries had what amounted to an embargo of exporting high-tech equipment like computers to the USSR (CoCom), with the covert goal of sabotaging their economy as without IT, it was unable to keep developing its economy to the extent the western countries could.

As far as I know, it actually was an embargo, and it was primarily aimed at making it harder for the USSR to get access to high tech which could be weaponised, especially when it came to things like high-precision tooling, or fast computers which could help with things like encryption or nuclear weapon design. I'm not sure these embargoes were ever primarily viewed as an economic weapon, though it was no doubt a useful secondary effect.

> The question is how self-sufficient Russia is today in terms of high-tech goods and systems - can they make their own computers, servers switches, phones, etc?

I think the bigger question is: do they actually need to be? China is at the very least an ally of convenience, and with the USA having already applied similar embargoes to companies such as Huawei, there's already a lot of Chinese technology which should generally be fit for purpose, and which the USA has little or no ability to embargo.

Admittedly, the situation is pretty complicated, especially when it comes to how much trust Russia is willing to put into Chinese tech.

And it'll be interesting to see what happens in the long term, as all of these embargoes (and the ongoing development of open-source hardware/software) are effectively weaning the rest of the world off US-controlled technology.

But in the short to medium term, a lot of Russians - and their entire economy - are going to be put through a lot of hardships.

How experimental was Microsoft's 'experimental banner' in File Explorer?

juice Silver badge

Re: Usual answer

> I really do hate them all. It is 2022 and the interface of Windows NT 4.0 from ~1996 was arguably more crisp, more consistent and used less than 16MB of RAM.

Nostalgia ain't always what it used to be. Personally, I run a lot more stuff than I did back in 1996. At a much higher resolution than the 1024*768 CRT I had at the time.

And across three monitors and two graphics cards, to boot. And then there's all the stuff in the background, like realtime audio mixing, etc.

Modern GUIs may chew up a lot more resources than they used to, but they're also dealing with a lot more - and adding plenty of value-add features to boot. Even if (as with office software) everyone uses a different subset of said features.

OTOH, my preferred dev environment is a Terminal window with a dozen tabs on it, most of which are running Vi. Swings and roundabouts, or somesuch...

Machine-learning models more powerful, toxic than ever

juice Silver badge

Re: So much for copyright!

> If it is being used for training there is no copyright issue

I'm not sure how well that argument holds up. Certainly, the books I had to use at university weren't made available for free!

114 billion transistors, one big meh. Apple's M1 Ultra wake-up call

juice Silver badge

Re: I'm holding off

It's not too bad for noise. OTOH, between the dual CPUs and the many SATA drives crammed into it, it does add about 50p/day to the electricity bill when switched on.

Which is pretty much going to double come April.

So I am thinking about upgrading to something newer! Since some £200 jobbie off Ebay will pay for itself within a year or so...

juice Silver badge

> I'm not sure the author is suggesting it's harder to learn, but that it's harder to excite potential learners.

People are excited about doing things - it's just that all the extra memory and processing power has enabled us to create tools which abstract away the underlying hardware and processes.

And the newer generations are excited about using those tools and abstractions to create new things - as well as new tools and layers which will then be used in turn by the generations which follow them.

Admittedly, this also means that it's much harder for each new generation to peel back the layers of the onion. And that could mean that finding - and training - people to do things with those deeper layers is going to get more difficult.

On the other hand, it also means that the older generation of tekkies will generally always be able to find work. COBOL 4 LIFE, or somesuch ;)

juice Silver badge

Re: I'm holding off

... maybe.

Personally, I'd say that we've been seeing diminishing returns on new hardware for the best part of the last decade. Once we got past 4 cores and more than 8GB of ram - on both computers and mobile phones - then we hit the point where 99% of tasks can be done in memory, and on a secondary process/thread, thereby keeping the OS responsive.

Which isn't to say that there aren't things which can bring computers to their knees, even in the consumer market; video encoding, video games and VR are all power and memory hogs. But they're very much a tiny percentage of what people use their machines for.

And at the risk of being hoisted on my own "640kb should be enough for anyone" petard, it's difficult to think of what might come along to change that. Not least because our network infrastructure has also improved in both bandwidth and latency, to the point where most tasks (barring video encoding) can potentially be offloaded to the cloud. Voice processing for Siri, Google and Alexa are prime examples of this.

Still, that's the good - and sometimes bad - thing about the future. It always has ways to surprise us ;)

juice Silver badge

Re: I'm holding off

The problem is that we've hit the law of diminishing returns when it comes to making use of all that processing power, especially now that we've moved a lot of the more complex stuff to either parallel-processing algorithms or pre-trained AI models.

Because the parallel-processing stuff will generally be Fast Enough on any halfway decent CPU produced in the last decade; I'm running a 2012-vintage dual-CPU Xeon (16 cores total) with 24GB of RAM, and that's more than capable of churning out 1080p video encoding in realtime.

And the AI models require little or no processing power, since all the hard work was done at the training stage.

For the most part, unless you're some sort of power user running half a dozen docker containers, while also rendering 4K live video streams, all those architectural improvements are arguably just reducing the machine's power usage and thereby improving the battery life.

Because 99% of the time, those machines will be ticking along and using less than 10% of their CPUs capabilities.

juice Silver badge

Not just for the skilled or experienced...

> What the personal supercomputer has become is a divinely powerful construction set for ideas in any medium, technical and artistic, but only for the skilled and the experienced. You have to push it: golden age IT pulled you along behind it, if you just had the wit to hold on.

I think that's completely wrong.

Modern computers - backed by the cloud - have made it incredibly easy for the "unskilled and inexperienced" to do stuff. Because they're powerful enough that we've been able to build tools to help people do what they want.

Whether that's programming, video editing, audio mixing, writing, digital painting, live streaming, 3d modelling or any of the millions of other things which people are using their machines for.

Whatever you want to do, there's a tool for it. And thousands - if not millions - of examples, tutorials and prior art to use while learning.

Fundamentally, more people are doing more things than ever. And they don't need to get down to the metal to do it, either.

(and that has a few knock-on effects, particularly when it comes to discoverability and the perceived/actual value of said activities. But that's a separate topic...)

Huawei UK board members resign over silence on Ukraine invasion

juice Silver badge

Re: Another Possible Take on the He/She/IT/They being Questioned ‽

> A Q2A [Question to Answer] .... Do Bots Smoke Dope?

Depends if there's any sheep to dream about...

juice Silver badge

Re: Thank you for your service. Don’t let the door hit you on the way out.

> After a decade of scratching my head over their posts I'm largely convinced it's a human, albeit one that should smoke a bit less pot.

Why not both? ;)

My take is that it's a guy who had a bit of fun setting up an "AI" (probably trained on past Register forum posts for extra giggles), and left it to spew random comments out for a while.

Now either he's vastly improved his AI (in which case I for one welcome our new El Reg overlord), or he's simply turned it off and is now posting his own comments...

Where are the (serious) Russian cyberattacks?

juice Silver badge

Re: Maybe we've got it all wrong

> I think there's a good chance Ukraine can hold out on the defensive - even if Russia commit fully. But that means a few thousand dead civilians a week

Aye - I think we're generally in broad agreement on this one; once the "short victorious war" failed to occur, the only realistic outcome was a long, dragged-out and messy slugfest which is unlikely to end until one side runs out of resources and is forced to compromise.

And a whole lot of people will suffer or die along the way.

This morning, a friend mentioned The Battle of Grozny, which I hadn't previously looked into. And having read that, there's definitely a worrying number of parallels between what happened when Russia tried to invade Chechnya in 1994, and what's happening now.

And it's perhaps telling that Grozny was nearly 30 years ago, which is long enough for virtually all of the surviving Russian soldiers and commanders to have retired...

https://en.wikipedia.org/wiki/Battle_of_Grozny_(1994%E2%80%931995)

One key point from that article is what I alluded to above, about the possibility that Russian soldiers will decide to take a more brutal approach:

The Russians proceeded to bombard Grozny with artillery, tank, and rocket fire as the rest of the battle centered on new tactics in which the Russians proceeded to destroy the city block by block. White phosphorus rounds and fuel-air explosive Shmel rockets were used by the Russian forces. They would then send in small groups of men sometimes spearheaded by special forces, making effective use of sniper teams. Two long weeks of costly bitter fighting ensued as the Russians moved to take the Presidential Palace.

It also notes that the Russian forces repeatedly broke ceasefires and agreements to limit the use of heavy weapons. All of which I'm sure the Ukrainian leadership has taken very careful note of...

juice Silver badge

Re: Maybe we've got it all wrong

I'll agree with most of that, except for...

> And is taking losses of close to 3% a week - meaning that in 6 weeks time the best units of the Russian army will be 20% dead and have had 20% of its equipment either blown up or stolen by Ukrainian farmers.

It's a sad truth that losses are always higher at the start of a war; Darwin's law is ruthlessly applied, and the weak, incompetent, inexperienced and plain unlucky are blown away like chaff.

And the same applies to their equipment; anything unreliable or nearing EOL will be winnowed out.

But the survivors learn. New doctrine evolves. Losses will drop. And they'll get better at protecting their equipment and vehicles, because their lives literally depend on it.

Admittedly, the Ukrainians are undergoing the exact same process, and learning the exact same lessons.

Which means that the war is likely to bog down even more, as both sides adopt more cautious approaches to taking on the opposition.

That is, unless Russia can persuade its conscripts to keep running headfirst into situations where they're likely to get killed. After all, a war of attrition worked in WW2... but this isn't WW2, and they haven't got any allies willing to send them money and war materiel to keep the grinder working.

And that also gives rise to another concern, especially when it comes to the Russians, since it makes them increasingly likely to adopt more brutal approaches to dealing with enemy attacks and ambushes. Sniper in a building? Call in an artillery strike, and to hell with collateral damage or any nearby civilians.

We just have to hope that some sort of diplomatic solution can be found sooner rather than later.

Rate of autonomous vehicle safety improvement slowing – research

juice Silver badge

How's the old saying go?

"The last 10% of building anything takes 90% of the effort"

So far, the various autonomous vehicle initiatives have been able to pick off some relatively low-hanging fruit. Motorway driving, where vehicles are generally going at the same speed, lanes are clearly marked and rules are clearly defined. Supermarket car parks, where everything is slow moving and clearly marked. Etc.

But now, we're hitting the complex stuff. Town and city driving, where pedestrians and cyclists can - and do - behave irrationally or unsafely. Road layouts (at least in Europe) inherited from the Romans and then mutated through centuries of gradual evolution and hasty modern hacks to deal with population growth and the mass adoption of motorised vehicles. And so on.

And dealing with all these scenarios is exponentially more difficult than what's already been solved. And they all need to be solved to the level where 99.99% of situations can be safely dealt with, because neither the public nor the various government car-safety departments will accept anything less...
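To put some very rough numbers on why "99.99% per situation" still isn't good enough, here's a quick back-of-the-envelope Python sketch. Every figure in it (decisions per mile, miles driven per week) is purely an illustrative assumption, not measured data:

# Back-of-the-envelope: how per-situation reliability compounds over a journey.
# Every number here is an illustrative assumption, not a measured figure.
per_situation_success = 0.9999  # assume 99.99% of situations handled safely
situations_per_mile = 10        # assume 10 tricky decisions per urban mile
miles = 100                     # assume ~100 miles of driving in a week

decisions = situations_per_mile * miles
p_trouble_free = per_situation_success ** decisions
print(f"{decisions} decisions -> {p_trouble_free:.1%} chance of a trouble-free week")
# 0.9999 ** 1000 works out at roughly 90.5% - i.e. about one incident every
# ten weeks, which is why the per-situation bar has to sit well above 99.99%.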

Russia acknowledges sanctions could hurt its tech companies

juice Silver badge

> We're all now rebuilding our Cold War state institutions (having had this shock) and I bet they'll be turned on China next. For Russia and China, this has been a policy disaster.

It certainly doesn't seem to be going too well for Russia.

One thing which doesn't seem to get mentioned too much is that there are a number of financial, economic and political implications if Russia successfully takes over Ukraine. E.g. 80% of Russian gas exports go through Ukraine (which was meant to be negated by Nord Stream 2), and Russia and Ukraine between them account for almost a third of the world's wheat exports.

Put simply, taking ownership of those gas pipes and wheat fields would give Russia a hell of a lot more revenue and a lot more political clout.

Meanwhile, China is stuck in a sanctions war with the USA, and is always looking for ways to legitimise its claims over Taiwan, so it was quite happy to side with Russia.

However, they've now seen what happens when a large but relatively inexperienced and underequipped conscript army attempts to invade a country with a modern military which has had time to prepare.

And how the West has reacted far more strongly to this invasion than most people would have predicted, both in terms of sanctions and by providing Ukraine with lots of weapons and other forms of indirect support.

And how Putin has decided to up the ante by threatening to use nukes.

And all of the global economic impacts that this war is having, at a time when we're still dealing with the shocks caused by the coronavirus pandemic. Which in turn could devastate China's export industries.

Overall, I don't think it's been a full-blown policy disaster for China, but I do suspect that they're quietly shifting from tacit approval of Russia's actions to a more neutral stance.

And the more that Putin waves his magic nuke card, the more they'll sidestep away from him...

Russia is the advanced persistent threat that just triggered. Ready?

juice Silver badge

> Frantic much? A lot of fear, uncertainty and delusion in this one.

Fear, yes. Uncertainty, yes. Delusion? You might want to look in the mirror...

Russia has been dabbling in cyber hacking, espionage, propaganda and theft for the last few decades, while tacitly supporting "private" hacking groups - so long as they were attacking and stealing from non-Russians.

Now, with Russia actively being at war and with embargoes slamming into place against them, we're going to get a lot of things happening.

First, a lot more of the tools being used for state-level hacking are going to leak into the public domain, simply because they're being used more. I'd expect a flurry of new zero-day exploits to appear Real Soon Now. And once those are picked up by commercial black hats, there won't be a safe server in the world.

Secondly, as with North Korea, Russian hackers are going to be targeting anything with financial value that can either be stolen or ransomed. Because their economy is about to seriously tank and they'll be desperate for money in general and foreign currency in particular.

Thirdly, you're going to get lots of amateur "affiliated" people on both sides going on hacking and defacement sprees.

And so on.

It's all going to get very, very messy, and thinking "it won't affect me" is frankly more delusional than expecting trouble.

BOFH: All hail the job cuts consultant

juice Silver badge
Pint

"Gerard's going to recommend firing the board."

Most cost effective move ever.

Work chat app Slack suffers services outage

juice Silver badge

It's been b0rked for the last 18 months...

... if not longer.

For some reason, it seems significantly worse when you use the mobile app, as compared to the desktop/web version.

During Christmas 2020, the app was effectively completely unusable: channels wouldn't load, messages wouldn't send - or would send, but still show up as a draft after being sent.

And the same during Christmas 2021. Which makes me think that there's some sort of bottleneck in their "app" infrastructure, which has a bad tendency to show itself over the festive holidays when a lot of people switch over to using their mobile to access Slack.

Certainly, things improved straight after the festive holidays for both 2020 and 2021.

But over the last 2 weeks, the app's performance has been rapidly deteriorating again.

And a messaging system where you can't reliably send or receive messages isn't really that useful...

Beware the techie who takes things literally

juice Silver badge

Re: this got me thinking.

> Also turns out that I'm running V5.00 Beta 8. How is it even possible that's still running and working perfectly on a daily basis? I think it came out in 2013. I should probably fix that

I've had my licence.rar (or whatever it's called) for WinRAR for at least a decade, and all credit to the company for still letting me use it to upgrade to the latest version.

... and that's probably worth doing, since there was that fuss a while ago about a major security hole in the bundled ACE library (CVE-2018-20250 in UNACEV2.dll, IIRC). OTOH, the odds of anyone throwing around ACE-compressed files are pretty low these days ;)

Amazon, Visa strike global truce on credit card charges

juice Silver badge

Re: Still Avoiding Them

> OK I'l bite - HDDs ?? WTF !?!

Prices are starting to drop for solid-state drives - after wading through Amazon's[*] surprisingly clunky search mechanisms, I can see 2TB SATA SSDs for as little as £132.

But a 2TB HDD is just £45, or roughly a third of the cost.

Similarly, a 10TB SATA HDD is £245, whereas the largest SSD I can spot is just 8TB and £560.

Spinning rust is still the most cost effective storage, and it's still ridiculously easy to fill local storage, especially if you're doing things like 4K video editing/rendering.
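For anyone who wants the cost-per-terabyte sums spelled out, here's a quick Python sketch using only the prices quoted above:

# Cost per terabyte, using the GBP prices quoted above.
drives = {
    "2TB SATA SSD": (132, 2),
    "2TB HDD":      (45, 2),
    "8TB SSD":      (560, 8),
    "10TB HDD":     (245, 10),
}
for name, (price_gbp, capacity_tb) in drives.items():
    print(f"{name}: £{price_gbp / capacity_tb:.2f}/TB")
# £66/TB vs £22.50/TB at the 2TB mark, and £70/TB vs £24.50/TB at the
# top end - so spinning rust comes in at roughly a third of the price.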

... though I must confess that I'm tempted to grab one of those cheap Crucial 2TB SSDs for my photo/video storage. Should make editing and thumbnail-rendering a whole lot faster!

Though maybe from Scan, rather than Amazon...

[*] Using them is perhaps ironic, given the article, but hey.
