Musk's people "Unbelievable at computers"
https://en.wikipedia.org/wiki/List_of_predictions_for_autonomous_Tesla_vehicles_by_Elon_Musk
For me, any flavour of Linux has been a non-starter so far. This is because I have a completely virtual home recording studio running on a Windows 10 desktop box.
Almost all of the top-drawer music applications and hardware have issues installing or running on Linux.
Out of the mainstream DAWs (Digital Audio Workstations), only Reaper and Bitwig have decent native support for Linux. I settled on Reason (Windows / Mac only) as my weapon of choice many years ago, and it would be a real blow to lose the internal 'devices' which I have made integral to my workflow.
Mainstream VSTs / plug-ins are another area with little in the way of Linux support. The big names in the field like Arturia, G-Force, Cherry Audio, Native Instruments and u-he simply don't provide native builds or installers for Linux. Some people have managed to create workarounds by running the appropriate Windows installer under Wine, but it seems to be touch-and-go whether it will work at all, or whether a working setup will survive the next update. On Windows, it's (pretty much) a trouble-free process, and most of the big-name companies have decent support departments and user forums.
Finally, even when you're running all your instruments, effects and recording 'devices' in software, the input and output sides of the studio (keyboards, knob boxes, audio I/O, MIDI I/O) require real hardware. Once again, finding a way to get hardware boxes to play nicely with Linux is a bit of a crap shoot. Sometimes it will be genuine seamless plug-and-play, while at other times you can be trawling through forums for literally weeks. By contrast, if hardware devices have a Windows installer, then you're (pretty much) home-and-dry.
Yes, my situation is a corner case and has no bearing on those who perform 'generic' tasks with a PC. But Linux is currently a very bad fit for hosting a virtual home studio. I'm content to put up with the slight grubbiness of the 'Windows Experience' for my day-to-day computing needs in order to have a smooth experience when I'm relaxing with my creative hobby.
I admit that I have a minority use case, but here goes:
I currently run a Windows 10 desktop machine exclusively at home. My wife uses a Chromebook. I have had good experiences installing Linux onto PC hardware over the past 15-20 years. I really like the streamlined simplicity of an OS which provides the interface layer between applications and the hardware, without any fuss.
However, my main hobby is creating music. In the past, this meant hardware synths / effects and perhaps recording and mixing in a DAW. Recently, I have made the transition to working completely In-The-Box, and I'm loving it. The trouble is that the best selection of (high-quality) DAWs and plug-ins is targeted exclusively at the Windows and Apple platforms, with no Linux support. For various reasons, I'm not going to switch to the Apple universe, so I'm stuck with using Windows on PC hardware.
My DAW is Reason 12 (no Linux support).
I use paid-for VST instruments and effects from Arturia (no Linux support), G-Force (no Linux support) and Cherry Audio (no Linux support).
I also use many free-of-charge VSTs, which generally don't support Linux, and some are even exclusive to the Windows platform.
Now, I'm getting as fed up as anyone with how terrible an experience the Windows ecosystem has become over the past few years. Unfortunately, however much I'd like to change over to Linux, I'm really, really tied to Windows for the foreseeable future.
As I said at the start, I have a minority use case. However, even a tiny minority of the millions of Windows users throughout the world is a significant total in absolute terms.
This reminds me of the old joke about the retired engineer who fixes a huge problem for a company by turning a single screw through 90 degrees. He presents his client with a bill for £10,000. The client is shocked at the cost and asks for a complete itemised bill for the job. The engineer issues another bill in response:
1. Turning the screw - £1.00
2. Knowing which screw to turn - £9,999.00
I suspect that "knowing which prompts to ask" will similarly require a level of skill and wisdom which can only be gained through years of experience.
As part of my job, I did some prototyping work on a Silicon Labs Busy Bee development board. It has an 8051-based MCU at the heart of it. It was great fun to program a device somewhat similar to a turbo-charged ZX81. The board had its own black-and-white LCD display, 64 KB of Flash and 4.5 KB of RAM, with a pipelined processor core running at up to 50 MHz! It could even be powered by a button cell for portable use. Specs you could only dream about in the early '80s!
"A 1K tape-based monochrome machine with no lower case, sound or graphics doesn't look like much. But compared to no computer at all, it was magnificent."
And that's it in a nutshell. The 1K ZX81 actually gave you the ability to program a genuine computer and watch it carry out your instructions. Unless you lived through that period, it's difficult to imagine the thrill of that experience. A few brave souls (myself included) decided to roll up their sleeves and get into machine-code programming in order to overcome the speed and space limitations imposed by Sinclair BASIC.
To this day, I've been more drawn towards low-level coding. I'm fortunate to work for a company that specialises in embedded designs, mostly featuring low-power processors or microcontrollers. But without that early advantage of ACTUALLY owning a REAL COMPUTER in my early teens, who knows what path my career might have taken?
BASIC was the native language of most home micros. However, more importantly, it was also the Operating System and Command Line Interpreter / Shell of those same home micros. If you wanted to load a game written in assembly language from tape, you would perform those actions through the BASIC command line.
An interpreted language like BASIC is ideal for getting your head round programming concepts for the first time. You can break into programs, examine the contents of variables and GOTO whichever line you want. All without recompiling or running under a debug environment. And generally speaking, you can't crash the system if you don't POKE about or use a machine code CALL.
IF you decide to get into programming, and start to feel the limitations of BASIC, THEN you should GOTO a software catalogue for your machine and READ about what alternative languages are available. Only a tiny subset of micro users would have done this back in the day.
It would be interesting to know what percentage of 1980s home micro owners went on to have a career in software engineering. LESS interesting to know is how many Reg readers started off their career by being exposed to BASIC on a 1980s home micro. A bit like comparing how many heroin addicts started off smoking pot vs how many pot smokers 'graduated' to using heroin.
That would never be hacked by miscreants, would it?
"- I only looked at pictures and - "
"And beat one off on camera? That's what they got, yeah? Your hot little face, blurred fist, dick burping f_cking spunk everywhere?
"Your mum's gonna love that on Facebook, Twitter, Insta-f_cking-whatever.
"And her friends.
"All eyes on you, giving it that.
"Toss in the c_nts at work, calling you Spurty McGoo.
"Laughing at your come face, making it their desktop wallpaper"
My brother worked for a bank before retiring. During the '90s, he would sometimes have to demonstrate something on another department's PC (running some variant of MS-DOS and 16-bit Windows). I was visiting him one evening and he explained what he was going to demonstrate the next day, which involved editing a text file through the command line (it may have been CONFIG.SYS or AUTOEXEC.BAT). He was reasonably familiar with the MS-DOS commands and EDIT, having a PC at home in those "pre-internet" days. However, I warned him that older versions of MS-DOS wouldn't have EDIT available - he'd have to use EDLIN instead. I gave him a quick tutorial on the basics.
Turns out that he did in fact have to revert to EDLIN. Thereafter, he was held in awe as some sort of programming guru by his banking colleagues.
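For anyone who never had the pleasure, the basics I showed him were roughly these (from memory, so treat the exact session with due suspicion):

C:\>EDLIN CONFIG.SYS
End of input file
*L          (list the file with line numbers)
*3          (display line 3, then retype it to change it)
*4I         (insert new lines before line 4; Ctrl+C stops inserting)
*5D         (delete line 5)
*E          (save and exit; Q abandons your changes)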
Using an integer number of bits per cell already leads to inefficiencies.
For example, in order to read a random page of data from a 3BPC flash device, the controller firmware has to perform the following steps (there's a toy sketch of this after the list):
1. Read the wordline at 3 different comparison levels
2. Store these intermediate wordlines in RAM
3. Perform a series of Boolean logic functions on the intermediate wordlines to extract the noisy page of data and its noisy ECC bits.
4. Combine the noisy page and ECC data to extract the original, uncorrupted written data.
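To make steps 1-3 concrete, here's a toy model in Python. The 2-3-2 Gray mapping, the names and the read polarity are all my own illustrative choices, not any vendor's actual scheme, and the ECC handling of step 4 is left out:

# Toy model of reading one logical page from a TLC (3BPC) wordline.
# cell level (0..7) -> 3 Gray-coded page bits (upper, middle, lower)
GRAY = [0b111, 0b110, 0b100, 0b000, 0b010, 0b011, 0b001, 0b101]

# comparison levels at which each page's bit flips as the level rises
FLIPS = {"lower": (1, 5), "middle": (2, 4, 6), "upper": (3, 7)}
BIT = {"lower": 0, "middle": 1, "upper": 2}

def sense(levels, t):
    # Step 1: one read of the wordline at comparison level t
    # (simplified polarity: 1 means the cell sits at or above t).
    return [int(v >= t) for v in levels]

def read_page(levels, page):
    # Step 2: hold each intermediate wordline in RAM; step 3: XOR
    # them together to recover the (still noisy, pre-ECC) page.
    init = (GRAY[0] >> BIT[page]) & 1        # bit value at level 0
    bits = [init] * len(levels)
    for t in FLIPS[page]:
        bits = [b ^ s for b, s in zip(bits, sense(levels, t))]
    return bits

cells = [3, 0, 7, 5, 2]                      # programmed cell levels
for page in ("lower", "middle", "upper"):
    expected = [(GRAY[v] >> BIT[page]) & 1 for v in cells]
    assert read_page(cells, page) == expected

With this particular mapping the three pages cost 2, 3 and 2 comparison reads respectively; other codings balance the counts differently.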
If your storage system spreads the ECC data across the three stored pages, the whole sequence gets heavier: in step 1 you must read the entire wordline at 7 different comparison levels; in step 2, store 7 intermediate wordlines in RAM; in step 3, perform three times as many Boolean logic functions to extract all three noisy pages of data and their ECC bits; and in step 4, combine all three noisy pages with their ECC to recover the three pages as originally written.
Extending this to fractional numbers of bits per cell would necessitate reading MULTIPLE WORDLINES at MULTIPLE THRESHOLD LEVELS and applying bizarre Boolean logic functions to extract your noisy data+ECC pages. This ignores the complex maths involved in applying something like BCH to generate ECC in a fractional bit scenario. Yes, it's possible in theory, but in practice it's ugly beyond belief. The complexity (and hence cost) that this would add to firmware design and testing would swamp any possible gain in storage.
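To put rough numbers on the fractional case: a cell with 6 levels holds log2(6) ≈ 2.585 raw bits, so you'd have to group cells and round down. A back-of-envelope sketch (entirely my own illustration, not any real scheme):

from math import floor, log2

LEVELS = 6                      # hypothetical 6-level cell
print(log2(LEVELS))             # ~2.585 raw bits per cell

# Group k cells and use floor(k * log2(LEVELS)) bits out of the
# LEVELS**k available states; the remainder are simply wasted.
for k in (1, 2, 4, 8):
    usable = floor(k * log2(LEVELS))
    print(f"{k} cells -> {usable} bits, "
          f"{LEVELS**k - 2**usable} states wasted")

And every bit in such a group is now a function of several cells at once, which is exactly where the multiple-read, bizarre-Boolean-logic ugliness comes from.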
In short, you'd have to be certifiably insane to propose introducing fractional bits to commercial NAND devices.
Possibly because there was a single /threshold/ voltage comparison level to determine whether a cell represented a '0' or '1' on read.
MLC (for 2 bits per cell) covers the use of three comparison levels to fully decode two stored pages.
This, however, does not justify extending the nomenclature to TLC for 3-bits-per-cell technology.
Why the industry didn't just call it BPC (Bits-Per-Cell) from the beginning, I'll never know.
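The relationship everyone was dancing around is simply comparisons = 2^n - 1 for n bits per cell. For the avoidance of doubt (illustrative only):

for n, name in [(1, "SLC"), (2, "MLC"), (3, "TLC"), (4, "QLC")]:
    print(f"{name}: {n} BPC, {2**n} levels, {2**n - 1} comparison levels")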
To Harry Stottle. A wonderful piece of work. Have a thousand upvotes. I had the exact same thoughts about a month or so ago, but I do not have the gumption or the talent to put them together in such a way. Thank you for articulating the concept.
My extension to the idea was to have a central AI which had access to everyone's recordings, and which could 'join the dots' between various experiences of the same situation. If judgement was requested by any of the participants in an incident or dispute, then the AI would reveal the encrypted evidence as required.
I think most people would allow 24/7 recording, because every time an injustice was resolved, the benefits would be seen to outweigh the doubts.
A twist in the tale: A Messiah figure who decides from an early age that he wants no further personal recordings. Haven't thought through the implications, yet. Of course, there will be a million EXTERNAL recordings of his actions from other citizens, which could be stitched together by the AI. Unless he lives as a recluse.
If you feel that £20 is fair payment for years of free use, then you shouldn't reclaim.
I've paid into the fundraiser for at least the past three years, and I will keep doing so in order to keep it ad-free, if nothing else. It's one of my (if not THE) most used sites on a daily basis.
SanDisk had a 43nm 4-bits-per-cell (x4) die in production in the noughties.
https://www.sandisk.co.uk/about/media-center/press-releases/2009/2009-10-13-sandisk-ships-world%E2%80%99s-first-flash-memory-cards-with-64-gigabit-x4-(4-bits-per-cell)-nand-flash-technology
Don't normally 'do' online campaigns, but this one struck me as a significant cause. Very few people in my extended family have ever smoked, and neither have I. However, I truly sympathise with those who attempt to quit for good, or who want to continue whilst reducing the harm of their addiction.
If I could give you 100 upvotes for all your posts on this topic, I would.
Each block in an MLC die can be erased as desired (MLC or SLC) by the firmware on board. The same goes for TLC. It usually makes sense to permanently partition the blocks so that the important stuff (or stuff which is going to be re-written many, many times) is stored in SLC. Whilst it makes more sense to keep this partitioning static, as you say there's nothing in the physics to prevent you doing it dynamically. It's just FAR more difficult to keep track of wear-levelling etc. in the dynamic case, with very little benefit to be gained in real-world use cases.
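A toy illustration of the bookkeeping problem (the names and the wear weighting are my own invention, not real controller firmware):

from dataclasses import dataclass

@dataclass
class Block:
    slc_erases: int = 0
    tlc_erases: int = 0

    def erase(self, mode):
        # Dynamic partitioning: any block may be erased in either
        # mode, but the two modes wear the cells very differently.
        if mode == "SLC":
            self.slc_erases += 1
        else:
            self.tlc_erases += 1

    def wear(self):
        # An SLC cycle stresses the oxide far less than a TLC
        # cycle; the 0.2 weighting here is purely illustrative.
        return 0.2 * self.slc_erases + self.tlc_erases

blocks = [Block() for _ in range(4)]
blocks[0].erase("SLC")
blocks[1].erase("TLC")
# Wear-levelling must now rank blocks with mixed histories:
print(sorted(b.wear() for b in blocks))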