* Posts by DuncanLarge

652 posts • joined 10 Apr 2017

Do you want speed or security as expected? Spectre CPU defenses can cripple performance on Linux in tests

DuncanLarge

Re: The Foundation of Computational Trust...

> Without Speed there is no reason to uses computers instead of pen and paper

I think you vastly overestimate the speed of human computation using such methods.

Try reading up on the creation of Colossus and of the Bombe during WW2 and you may find out that humans are shit at doing computation fast. Even a 486 (running appropriate software) will knock the socks off pen and paper.

Apple announces lossless HD audio at no extra cost, then Amazon Music does too. The ball is now in Spotify's court

DuncanLarge

Re: Lol, round the bend

>CD oversampling is a method of anti-aliasing, nothing more. A sharp-edged digital anti-aliasing

Yes, that's EXACTLY what I said!

DuncanLarge

Re: Since it is lossless

All I tend to use are CD players.

Even the car has one.

DuncanLarge

Re: Yay!

> I am curious how material that was re-mastered for CD at 16 bit a good while ago is going to be 'converted to 24 bit' without remastering from the original. Stuff that has already been re-mastered for SACD is available, but the rest?

It will sound exactly the same. The bits only affect where the noise floor is.

However, if they have methods to process the noise down to the 24-bit level then yes, that would be an improvement. You won't hear it though; the CD at 16 bits already has the noise way below where anyone will hear it. If you add dither to it, it goes even lower!

If you hear noise on a CD, then the original had that noise, or something added it at AD conversion time. A 24-bit remaster won't sound better because it's 24 bit; it may sound better because they were able to move or eliminate the noise. But if you take that 24-bit sample and convert it to 16, you won't hear the difference, unless the CD is a recording of pins dropping and you have to turn the volume up loads.
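
The "bits only move the noise floor" point can be checked with the usual back-of-envelope formula. A sketch (the figures assume a full-scale sine and uniform quantisation, the textbook 6.02N + 1.76 dB case):

```python
import math

def dynamic_range_db(bits):
    """Theoretical SNR of an N-bit quantiser with a full-scale sine:
    equivalent to 6.02 * N + 1.76 dB (quantisation noise power q^2 / 12)."""
    return 20 * math.log10(2 ** bits * math.sqrt(1.5))

print(round(dynamic_range_db(16), 1))  # 98.1  -- CD, already below audibility
print(round(dynamic_range_db(24), 1))  # 146.3 -- "hi-res", deeper still
```

Both floors sit far below the noise in any real listening room; the 24-bit figure just pushes an already inaudible floor further down.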

DuncanLarge

Re: Yay!

> Converting an analog medium into a digital representation and then back to analog again is, to put it mildly, less than ideal.

> If you really want to hear it as it was live you need a good quality analog recording & quality playback equipment.

> Digital is convenient but it's always going to be a compromise.

No, this is plain wrong. CD quality audio or higher will record and reproduce the original waveform perfectly.

I said PERFECTLY.

The Nyquist-Shannon theorem guarantees that a band-limited signal (that's very important) can be captured and reproduced perfectly.

All the signals we use are band-limited to human hearing, thus 44.1 kHz @ 16 bit will reproduce the original waveform perfectly.
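
A minimal sketch of the theorem in action: sample a 1 kHz tone at 44.1 kHz, then rebuild the waveform *between* the stored samples with Whittaker-Shannon (sinc) interpolation. The only error left is from truncating the sum to a finite window:

```python
import math

fs = 44_100      # CD sample rate (Hz)
f = 1_000        # test tone, well below the 22.05 kHz Nyquist limit
N = 8_000        # finite sample window

# Sample the band-limited signal on the CD grid
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t):
    # Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc(t*fs - n)
    return sum(s * sinc(t * fs - n) for n, s in enumerate(samples))

# Evaluate BETWEEN sample points, in the middle of the window
t = 4000.25 / fs
print(abs(reconstruct(t) - math.sin(2 * math.pi * f * t)))  # tiny truncation error
```

The reconstructed value lands on the original sine to within the truncation error of the finite window; there are no "stair-steps" anywhere in the process.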

Analogue recording methods can't do that. They simply can't; they add noise to the recording itself. The dynamic range of a cassette is only about 6 bits deep, if you have a very good cassette. 6 bits is a lot lower than 16. Reel-to-reel can go deeper, but nowhere near as deep as CD. Luckily for reel-to-reel, you don't need to go much deeper; CD really has a lot of unused dynamic range, but more is better as it lowers the noise floor to below normal listening levels.

All of the superiority of CD audio is entirely dependent on the analogue-to-digital conversion itself and the amplifiers etc. Again, all analogue stuff. The analogue stages, on the input or output, can distort the source or add noise to the output. The CD will PERFECTLY reproduce the crap that is put into it via a crappy input, and of course there is plenty of crappiness on the output, from cheap amplifiers to cheap transducers, not to mention unshielded cables that may pick up any of the EMI shit we all live in these days.

Digital audio reproduces the original waveform, band-limited to the human hearing range. At a minimum of 44,100 Hz and with a bit depth of 16 bits it has a noise floor lower than anything ever made in the analogue domain. We could even reduce the number of bits; it just raises the noise floor, that's all it does. We could use 6 or 8 bits and it would sound just as good as any nicely recorded tape.

The character, warmth etc that everyone goes on about is nothing more than us noticing the imperfections added to the audio during recording and playback. That's why people like analogue: because it adds that imperfection. The great thing is, we can add that imperfection to the CD just by choosing devices that are not perfect, but we have a perfect original, every time.

Of course, I am talking about lossless audio here. I have not even considered any lossy codec, no matter what the bitrate or where it becomes "transparent". There are plenty of codecs that will, while being lossy, produce a waveform that is perfect "to the ear".

DuncanLarge

Re: Yay!

> precisely bugger all to help the musicians

A lot of those CDs are out of print; the musicians may not get anything anyway.

The same argument can be said for second hand books, which obviously fails as second hand books have been a thing for a long long time.

Instead of riding on the royalties of a couple of works, artists are supposed to be encouraged to create new works, constantly.

DuncanLarge

Re: Yay!

Fortunately, smart decent people who use common sense ignore it ;)

DuncanLarge

Re: Yay!

Well, they are.

They really are.

It's just that most players/devices are lazy with the error correction. My Blu-ray writer, I recently discovered, works miracles on heavily scratched CDs, even on a 1984 disc that HAS A HOLE IN IT!!

Sure, CD error correction is nothing when compared to the error correction used in DVD, that thing is complex! But they really can take a beating.

It's just that the players that are supposed to fix the issues are not doing a good job.

DuncanLarge

Re: Yay!

That only happens in the edge cases.

Unless it was a UK-made disc in the '80s, which all came from a factory that was found to have a severe defect in the line. Those will rot.

I have only had/seen one disc that has rotted, a cheap unknown-brand CD-R. Just one.

Environmental factors play a big part. A hotter humid environment will attack the disc faster.

I'm in the UK so don't have that issue. Well, not most of the year anyway.

My oldest disc has damage from an impact. I scanned it recently; the C1 and C2 errors were well within the Red Book spec, until the damaged area obviously. I scanned a brand new disc and it was generally lower than the 1984 disc, but not by much. Also, the damaged area in the 1984 disc (it's a hole in the reflective material) was fully repaired during ripping by my Blu-ray writer. I could only just tell that the repair had happened (you can slightly hear it). Normal CD players apply much less error correction and skip at that point.

CDs, when paired with decent full error correction, are pretty resilient. Even my Pink Floyd The Wall disc, which I found on the pavement in the '90s, used as a hockey puck by some kids, scratches all over the place, ripped flawlessly in that writer. I hadn't even tried resurfacing the disc!

You just need the right device.

DuncanLarge

Re: Yay!

> My hearing isn't that good

Nobody's is.

Only a bat would be able to hear anything that required that sample rate.

DuncanLarge

Re: Buy the music/film

> Spielberg factor

Does he avoid putting out the older versions like Lucas then?

I have no issue with edits and recuts etc, until you get Lucas saying that the latest version is the ONLY version and anything you have that's older should be trashed, and he would love to come and trash it for you.

DuncanLarge

Re: Misguided effort, compared to Bluetooth connection strength and codecs

> 44,1 kHz used to impact low-pass filtering

Many if not most players use oversampling to go way beyond 48kHz to solve that very issue.

My '90s CD player is a 6x oversampling one, for example, upsampling the 44.1 kHz to 264.6 kHz, which is then used with shaped dither to push the quantisation noise above human hearing, dropping the noise floor and thus increasing the dynamic range from 96 dB to around 116 dB.

You can get away with a very gentle filter with that. You just need to have removed most frequency content that the speakers will have trouble with.
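
The arithmetic behind those figures, as a rough sketch (the exact gain depends on the filter and the dither shaping, but the textbook numbers are ~6.02 dB per bit + 1.76 dB, plus 10·log10(R) dB from plain R-times oversampling):

```python
import math

# Plain oversampling spreads the quantisation noise over R times the
# bandwidth, so the in-band share drops by 10*log10(R) dB.
base = 6.02 * 16 + 1.76            # 16-bit dynamic range: ~98.1 dB
gain = 10 * math.log10(6)          # 6x oversampling: ~7.8 dB
print(round(44.1 * 6, 1))          # 264.6 -- the oversampled rate in kHz
print(round(base + gain, 1))       # 105.9 dB before any noise shaping
```

Getting from there to the ~116 dB quoted above is the job of the shaped dither, which pushes what noise remains up above the audible band.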

DuncanLarge

Lol, round the bend

Funny.

We got CD. Ignoring issues with the actual recording, issues with lazy mastering and issues with the listener's equipment, CD gives us perfect audio, lossless (albeit only stereo). Some improvements were made here and there, oversampling being the main one, and boom: perfect high-res audio, barring the issues stated, which are beyond the scope of any playback medium to control.

Then we got MP3, which threw away loads of audio to compress something like a CD down to something the fledgling home internet could stand to distribute. There was also the move to solid state devices, but I'm not counting that, considering my first MP3 player held only 32MB, which a CD simply laughed at.

We got other codecs etc, then we got faster internet and capacious solid state devices that could happily hold an uncompressed CD track, if not a flac track.

We have gone from lossless, perfect audio (no, there are NO stair-steps, NO phase issues; there is only aliasing, which was resolved by the '90s with oversampling). Perfect because it perfectly reproduces any humanly hearable frequency (yes, you in the back, that DOES include multiple frequencies mixed together and the harmonics; if a HUMAN can hear it then it's perfectly reproducible by CD technology), with a dynamic range that, if used fully, would make the listener deaf! We even extend that dynamic range further, just because we can, for just a little better noise management.

We then went to lossy audio.

Now we are back with lossless audio. Funny that: we were already there. OK, people wanted it in their pocket and the tech we had took a while to outperform a CD, but now, instead of just sitting back and enjoying a ripped CD or a non-ripped one, we have those who think the CD is lossy, that it is not perfect. So off they go wasting bandwidth and storage space buying 192 kHz sample rate files with no understanding that EVERY player will oversample to something as high, if not higher, than that, on the fly, at playback. My '90s CD player has 6x oversampling; that means it will upsample the 44.1 kHz audio to 264.6 kHz.

Why? Well, what they don't know is that there is NO audio supplied in the 192 kHz file above 22 kHz or so; it is stored exactly the same as on a CD, just with more samples, which are surplus. The reason we oversample, on the fly at playback, is that we can then move the quantisation noise above the 22 kHz limit. We need the higher sampling rate to hold that JUNK audio above 22 kHz, otherwise it will appear under 22 kHz as aliasing/distortion. EVERYTHING above 22 kHz CANNOT be heard and CAN'T be reproduced by speakers/headphones etc. To prevent that JUNK noise from being a problem and distorting the audio (because the dumb speaker will TRY to reproduce the waveform), we filter it out!

Oversampling is thus the method used since '90s CD players to push artificial quantisation noise above the human hearing range; we do this by ADDING in shaped noise (dither) that the player generates. The higher sampling rate thus allows for a filter design which is very simple and effective. The older CD players tried to filter HARD at 44.1 kHz, but they were never perfect and making them so was expensive. By oversampling, or even just jumping to 48 kHz as with DVD audio, we can have a much better and cheaper filter. It's all filtered out above 22 kHz or so, and what was there was SHIT that you don't need.

All of this is done during playback, on the fly. It's merely maths, and you only need a 44.1 kHz sample rate to do it. Selling people "hi-res" audio is nothing more than selling people an ALREADY OVERSAMPLED file. The player does not need assistance; it can do it on the fly. There is no effing reason why the hell anyone would want to store such a file (for playback; recording and editing have other benefits here). For playback, the final sellable file need never be at a sample rate greater than 44.1 kHz or 48 kHz. The player will create the 192 kHz version on the fly during playback, no storage or bandwidth needed.
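
The dithered-requantisation step above can be sketched in a few lines (assumed parameters: a 997 Hz tone, a 16-bit target, TPDF dither of ±1 LSB, the textbook choice). The point is that the error becomes benign noise at a predictable level, ~q/2 RMS for TPDF, rather than distortion correlated with the signal:

```python
import math
import random

random.seed(42)                 # deterministic run for the figures below
bits = 16
q = 2.0 ** -(bits - 1)          # quantisation step for a +/-1.0 full-scale signal

def quantise(x, dither=True):
    # Non-subtractive TPDF dither: sum of two uniforms, +/-1 LSB total
    d = (random.random() + random.random() - 1.0) * q if dither else 0.0
    return round((x + d) / q) * q

# Requantise a sine and measure the residual error power
n = 50_000
err = [quantise(math.sin(2 * math.pi * 997 * i / 44100)) -
       math.sin(2 * math.pi * 997 * i / 44100) for i in range(n)]
rms = math.sqrt(sum(e * e for e in err) / n)
print(rms / q)  # ~0.5: the error floor sits at half an LSB, signal-independent
```

Noise shaping (not shown here) then filters that flat noise so most of its power lands above the audible band.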

Hi-res is snake oil. A reason to have fast broadband, a reason to pay again, a reason to get a player with the same kind of "wank features" that used to be put all over CD players.

It's a wank feature.

Unfortunately, to help force you to purchase the wank feature, the mastering of the CD version is left to the idiots whilst the "hi-res" version is given the care that should have been there in the first place! The CD version thus sounds shit because they made it sound like shit, so you get the hi-res version and bingo: the sucker now thinks he/she is hearing "hi-res" and that CDs sounded like shit all this time.

So, everyone. Lets be clear. Anyone thinking the hi-res version sounds better is only correct because:

1. They THINK they hear the difference.

2. The recording is a newer better one

3. The CD was mastered shitty

There are no other reasons. Its all down to the quality of the source material and mastering.

So buy the hi-res ones if you know that's the better recording/mastering etc. Then do the smart thing and downconvert it all back to 44.1 kHz and save the space. It won't sound any different, in any way.

Stealthy Linux backdoor malware spotted after three years of minding your business

DuncanLarge

Re: What a supsirse.

Yesterday I tried to reboot my system and systemd decided to take 3 minutes to do so. First it was waiting on a job for 1:30. Well, I thought, OK systemd, wait then kill it. Then, once the 1:30 timer was over, it added another 1:30 to it.

I know what caused the issue: a filesystem that wouldn't unmount, due to a kernel bug that I triggered when I was testing something. But I remember the days when, if the system was told to reboot, it rebooted.

DuncanLarge

Re: Disguising it as Systemd is cunning

> A simple bash alias could redirect your call to tripwire

In that case, always use the full path for such binaries.

Attack of the cryptidiots: One wants Bitcoin-flush hard drive he threw out in 2013 back, the other lost USB stick password

DuncanLarge

Re: Best practices

> dumpster diver (or whatever you call them in the UK)

Skip raider

Debian 'Bullseye' enters final phase before release as team debates whether it will be last to work on i386 architecture

DuncanLarge

Re: i386 Support

> For newer kit, this makes sense as there is not that much i386 gear that is online. Most of the still running kit is for the CNC machine (and the like) that probably never was online on even on a network.

Everything you just said contradicts the article. Tell me, if these non-networked CNC machines are the only 32-bit systems, why do Debian see them as a huge number of systems running recent Debian versions? Did you read the article? Debian said themselves that there is a massive number of 32-bit installs, but they want to drop 32-bit because they are finding it hard to test the builds.

What you say about a non networked machine continuing to run any OS it needs is correct, basically common sense (in the IT form at least). If a German car garage can still run a C64 to help test and tune Ferrari engines I'm sure there are plenty of Windows 3.1 and OS/2 machines doing stuff.

I know of an American county that was using an Acorn machine to run the county's school heating systems over a wireless link. The UK rail network also used similar machines to display the live train information on the screens on the platforms. I think they finally migrated when they got to the point where they found it hard to find reliable sources of error-free floppies.

DuncanLarge

Re: I'm finding this hard to believe...

Read it again mate, it's not about dropping support for the 386 processor but for x86 entirely!

Targeting 386 is the lowest common denominator for any 32-bit system. They are talking about dropping support for ALL 32-bit CPUs (for booting; 32-bit code execution can still be supported).

Btw mate. I still run old and brand spanking new 8-bit systems ;)

DuncanLarge

Re: Debian Bullseye 32 bit

> I think I typed the above nearly verbatim 25 years ago,

And you learned nothing.

You do know that these new 64-bit instructions you speak of tend to operate on larger operands? I.e. ADDing two 64-bit numbers together on a 64-bit machine requires fewer cycles because the REGISTERS are bigger. On a 32-bit machine you will need to load words from four separate memory locations and then perform 64-bit maths using 32-bit registers. Thus much 32-bit code will stick to 32-bit maths, for speed.

So when that code is compiled for 64-bit, it will be loading the same values as 64-bit operands, into 64-bit registers. This practically halves the time taken to perform the calculation, and thus is faster.

But now all your maths is using 64-bit operands. See the problem? Unless you specifically use 32-bit maths instructions, you will be loading TWICE as much data from RAM and saving TWICE as much data to RAM. This does not affect all instructions, and 32-bit systems were doing 128-bit maths too, but when your basic operations now use 64 bits of data:

IT MEANS YOUR SYSTEM USES MORE RAM FOR THE SAME JOB AS USED BY A 32BIT SYSTEM.

Sorry for shouting but, my god, when adults don't get basic maths over 25 years... I'm lost for words.
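
The operand-size point is easy to see from Python's `array` module, which stores fixed-width machine integers (a sketch; real-world memory growth on 64-bit builds comes mostly from pointers and pointer-heavy data structures doubling in the same way):

```python
from array import array

values = list(range(1000))
a32 = array('i', values)   # 32-bit signed integers
a64 = array('q', values)   # 64-bit signed integers

print(a32.itemsize, a64.itemsize)                  # 4 8 bytes per element
print(len(values) * (a64.itemsize - a32.itemsize)) # 4000 extra bytes for the same numbers
```

Same thousand values, twice the RAM when every operand is 64 bits wide.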

DuncanLarge

Re: Debian Bullseye 32 bit

> Running really old machines is actively bad for the planet, beyond a certain point.

No, it's the opposite. Dumping functional old machines is way worse.

Think about it. I have a Ryzen system as my main machine. It does way more per watt than my 586 laptop. But I use the laptop to run stuff that does not need performance: word processing, coding (depending on language and compilation time), reading etc. So, is it better for the planet if I send that machine to landfill and use the Ryzen instead? Or if I have the Ryzen off while I play with regular expressions?

You're basically saying: use the efficient Ryzen to do barely anything, while burying the 586 to leach out its poisons into the water table. Yep, so good for the planet.

DuncanLarge

Re: Debian Bullseye 32 bit

> Would you really choose to do something as major as installing a new OS on a system that old, or would you use it as an incentive to replace it with something newer? I'd certainly go for the latter.

In a corporate IT environment, where the company operations rest on it, I will agree with you.

But, in most other environments, I'd consider the environment before throwing away a perfectly usable machine. My car is 10 years old, it has rust, but it will run for the next 10 years without much going wrong.

People still use Victorian, or even much older houses. Crazy.

I get new shoes when the old ones start actually falling apart.

I have books published before 1920, one is a science book that has an article on the possibility of the existence of Unicorns and the discovery of the Gulf Stream.

Computers should last for decades.

Signal boost: Secure chat app is wobbly at the moment. Not surprising after gaining 30m+ users in a week, though

DuncanLarge

I have been doing the same.

I found out about Signal before it was named Signal. Knowing that there was a very low chance I'd ever see anyone else use it, I have been using it as my default SMS app for the last 8 years, simply because I prefer having Signal read my SMSes rather than the default app. Not for privacy reasons, but back then there were many security vulnerabilities in Android surrounding SMS and MMS, and the default app was the usual target.

DuncanLarge

> can't set different alert tone for personal messages and group messages

Do people still do this? I thought that was a thing people played with when camera phones with MIDI ringtones came out. Seriously, I have never been able to figure out how to set different notifications for specific apps on most of my recent phones; they just have ONE sound, and options for enabling or disabling the permission for that app.

DuncanLarge

> What's better? Signal or telegram?

Signal. Telegram rolled their own crypto; Signal uses standard protocols that are actually being researched for vulnerabilities and further development by the crypto community. Also, as Telegram's server side is not open source, nobody can start doing that on Telegram's use of crypto. We know HOW they do it, they described it openly, and even then the old adage of "don't roll your own crypto" rang the warning bells.

If you don't care about licensing and security and only about numbers then Telegram wins.

Linux developers get ready to wield the secateurs against elderly microprocessors

DuncanLarge

Re: Dear me

> Am I to understand that there are still 486s that are in working order

Yes, plenty in all sorts of control systems.

You don't need an i7 to monitor the valves in a SCADA network in a brewery.

Windows 10 ends the year with more than half of PCs on a 2020 flavour

DuncanLarge

Re: Consumer versions

> I don't think it's wrong for Microsoft to use their own browser to display system state information and documentation.

Tell that to the international courts that punished Microsoft when they did this with internet explorer.

DuncanLarge

Re: Consumer versions

I'm glad I only boot win 10 every month or so when I want to play a Windows game. I've been on Debian for years.

Explained: The thinking behind the 32GB Windows Format limit on FAT32

DuncanLarge

Re: Whaddabout CDs?

> Must have interesting authoring a 650MB CD in the early 1980's...

You didn't. Orange Book did not come out till 1990, and even though CD-R-like burners existed in 1988, they were washing-machine-sized and cost $35,000.

It wasn't till 1995 that CD burners came about that were less than $1000 and by then we were starting to get win 95.

As far as authoring a CD-DA disc goes, well, you stored the data on video tape as digital signals, much like barcodes. As the machines were used in NTSC and PAL regions, it was found that a sample rate of 44,100 Hz would work with both standards. This is why CD audio is 44.1 kHz and not 48 kHz as originally intended.
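
The arithmetic working out identically for both TV standards is the whole trick. A sketch, using the figures from the PCM adaptor story (3 samples per usable video line, 245 of NTSC's lines per field and 294 of PAL's):

```python
# PCM adaptors stored audio as pseudo-video, a few samples per scan line
ntsc = 3 * 245 * 60   # 3 samples/line x 245 usable lines/field x 60 fields/s
pal  = 3 * 294 * 50   # 3 samples/line x 294 usable lines/field x 50 fields/s
print(ntsc, pal)      # 44100 44100 -- the same rate falls out of both standards
```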

Didn't matter: 44,100 Hz, 16 bit still allowed perfect reproduction of the original signal. Shame about audiophiles who think that hi-rez is worth it (beyond getting an unabused VERSION of a recording; they won't give you that on CD because they want to cheat you).

DuncanLarge

Re: What about FAT file transfer?

Yes, I never thought about that.

These days burning while mastering is the norm, even for BD-R.

New year, new rant: Linus Torvalds rails at Intel for 'killing' the ECC industry

DuncanLarge

Re: Don't forget the “Tech Press”

> every moron has a PC with 16 GB RAM

Mine's 12, and before I upgraded to a new machine I had 8.

> SmartPhones come with 12 GB of RAM

Sure they do. You must be thinking of the overpriced flagships. My phone and tablet have 2GB. I have 16GB of storage, of which I can use about 4 if I have nothing installed.

DuncanLarge

Re: Memory Compression?

Do your research, memory compression is used to COMPRESS the contents of RAM.

It has been in the kernel since 3.14 and is used by Android 4.4 and above. Win 10 does it too.
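
The idea can be sketched with `zlib` in a few lines (the kernel's zram typically uses LZO or zstd rather than zlib, but the principle is the same: typical memory pages compress well, so "swapping" to a compressed RAM store trades a little CPU for a lot of effective capacity):

```python
import zlib

# A fake 4 KiB page: all zeros, like much of a freshly allocated heap
zero_page = bytes(4096)
print(len(zlib.compress(zero_page)) < 64)   # True: shrinks to a handful of bytes

# Repetitive content, common in real pages, also shrinks dramatically
text_page = (b"GET /index.html HTTP/1.1\r\n" * 160)[:4096]
print(len(zlib.compress(text_page)) < 256)  # True
```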

DuncanLarge

Re: Who is this "Intel" ?

> There's probably some emulators running FORTRAN on VMS somewhere in NASA as well, to give the maintainers a sandbox.

It annoys me when people use their imagination to fill in the blanks, lol, thinking that Fortran is so old you need a VMS emulator.

Well, the truth is that Fortran 2018 is the latest version; it integrates with .NET and C, compiles to portable code and, although not the most popular language, is still used extensively in scientific circles.

Presumably your point was that NASA would have an emulator for old versions of hardware, like the Voyager probes. Maybe they do; I can't think why. Those probes are way too far away to patch. As for NASA and development hardware, well, they usually just chuck the stuff in the tip when the project ends.

https://ourcodingclub.github.io/tutorials/fortran-intro/

DuncanLarge

Re: Who is this "Intel" ?

> Most of the world does not use a desktop

Citation needed, but going with the assumption, I suspect most of the world is indeed using laptops, the vast majority of which are based on x86-64, mostly using Intel chips.

Oh you were thinking of tablets? Yeah some people may do banking on those when they are in bed or caught short in the shop, so they need ECC too.

> I bet there are still machines running FORTRAN on VMS in some corner of NASA.

Why would they need that? Fortran 2018 is the latest version and integrates with .NET.

X.Org is now pretty much an ex-org: Maintainer declares the open-source windowing system largely abandoned

DuncanLarge

> RDP is a little better across the network

How? Last time I used RDP I had a full desktop with notifications and scrollbars because it couldn't fit on my monitor. All I needed to do was add a machine to DNS; RSAT tools were broken on my laptop for some reason, and I probably could have used PowerShell. It would have been nice to simply launch the DNS dialog on the Domain Controller and have it work as a normal window on my machine. But no, I had to have a HUGE RDP window and wait for RDP to log me in.

DuncanLarge

Re: The power of open-source

> I like to think of that "ancient cruft" as "well tested bug-free code".

Totally agree.

Besides security issues that come along and need patching, code does not degrade with age, and so "ancient cruft" is a newbie coder's way of saying "I don't want to learn how it works because it's hard getting up to speed, so I will just write my own version that breaks all the cool stuff and implements only 50% of the original features, because that's all that's needed to run Steam, then find ways to force everyone to use that while convincing everyone that it's progressive."

There is a case for chucking everything out and starting from scratch, feature for feature.

There is a case for deciding whether a bit of functionality SHOULD be in this code or outside it. This is part of the Unix design philosophy, where you can strip out features into their own programs, thus making everything simpler and more modular. Does Wayland have this? A Wayland networking daemon that gets started on demand when networking comes into play? It could allow anything from single windows to whole-desktop networking (basically a whole X session over the network) and could leverage another program to set up encrypted tunnels. Nope, that doesn't run Steam, so nope. All it needs to be is a proxy, forwarding Wayland events to and from each system's compositor. The compositor need not care that it's over a network; it just serves the standard Wayland protocol.

"Ancient cruft" that may actually be there is most likely hacks put in to work around an issue with hardware that nobody uses anymore, even Linux has that.

DuncanLarge

Re: The power of open-source

> Linus took Unix

No, he didn't. That had mostly been done before he got involved, so he only added a few bits and polished it off.

DuncanLarge

Re: Nobody likes X11

> Nobody likes X11

What ancient distro are you using?

DuncanLarge

My requirement

I don't mind using Wayland. Well, it's just a protocol; what I really mean is I don't mind using a Wayland compositor as long as:

- That compositor never makes itself a requirement. Any application should be able to connect to it.

- It supports networking like X did but maybe in cooler ways.

The one thing I will hold on to X.Org for is to avoid stepping back into 2002 and having to use VNC just because I want to run a graphical program on a headless server or a VM.

What will you do with your Raspberry Pi 4 this week? RISC it for a biscuit perhaps?

DuncanLarge

Yay

All my Pis run RISC OS. I also have a couple of original machines (a Risc PC 600 and an A3020).

On the Pi you can write BBC BASIC programs, which includes an ARM assembler if you want to add machine code. The BBC BASIC is the latest version and can provide very low-latency use of the GPIO pins.

DuncanLarge

Re: Zarch

Yes, but you need to run the Archimedes emulator, as the Pi CPU is incompatible.

RIAA DMCAs GitHub into nuking popular YouTube video download tool, says it can be used to slurp music

DuncanLarge

Re: Let's be honest

> Did the EFF's claim of, "a world of lawful usages," include any examples perhaps?

We don't have to tax our imagination too hard. Here are just a few examples of very commonly practised reasons for downloading from YouTube:

1. Your internet is crap most of the time, but it gets better during the day when you are at work. When you get home you can barely watch anything: buffering all the time, kids screaming for bandwidth. Well, you have cron on your Linux box run youtube-dl to download the latest vids from your subscriptions. Yes, many people have very slow internet; I still go to work with people who think having 1Mb/s is normal.

2. You wish to debunk some crap that some idiot said about 5G. You want to use the DMCA's own "fair use" exemptions to allow you to incorporate reasonable-length segments of the offending video. Somehow you need to import it into your NLE. How????

3. The video is licensed under a CC licence permitting redistribution and perhaps modification. I wonder how we can pull it out of YouTube.

4. You are a datahoarder. You download everything, can't help it. Your kind will be the saviours of human culture after the zombie apocalypse.

5. You are downloading a public domain work.

6. You are downloading your own videos, as the YouTube method is now slow and inconvenient after their yearly UI update that everyone hates.

7. You are an archivist: see datahoarder, only without the need to hoard.

Shall we do cars next? They are used to kill people, kidnap children and run drugs, but I think we can find a few legitimate uses for a car. If we try really hard, we may be able to keep using them, don't you think?

US govt wins right to snaffle Edward Snowden's $5m+ book royalties, speech fees – and all future related earnings

DuncanLarge

Oh the irony

You get caught spying on an unsuspecting public, breaking laws some of which are hundreds of years old, while lying under oath that you are not doing it.

Then you use other laws to get paid for having done that very thing.

I bet none of this money will even make it into any charities or public projects. It'll find its way to "the right people".

Thunderbird implements PGP crypto feature requested 21 years ago

DuncanLarge

Re: Encryption should be automatic

Correct; the MITM has to hope you don't verify the key validity outside of that conversation too.

How the hell you got 5 downvotes just shows that hardly anyone seems to understand decades-old technology!

Even those who know how TLS works should be able to understand a MITM attack.

It's very basic stuff.

DuncanLarge

Re: identity and encryption

> No, email is sent end to end, it passes through servers along the way.

That is incorrect.

Email is sent point to point. Usually each point is at each end, but if there is a server in between, you are forwarding your email TO that server, and that server will initiate a separate connection to move it onwards WHEN it decides to do so.

That means that the email may be stored on that server for later delivery.

That is not end to end.

DuncanLarge

Re: identity and encryption

> you don't need identify assurance beyond the email address

Because everyone is born with an email address tattoo on their foot?

There is nothing in an email address that goes anywhere to prove identity.

Anyone can create a key for any address, even the one you use. Once they make a key for that address, and then get hold of your account because your terrible password choice was leaked in a breach that matched the rainbow table they have for unsalted hashes of common words, they can literally impersonate you and just have to blag about how your key has changed etc. Savvy GPG users will then contact you via other means to confirm the key, and if you use a keyserver they will think something is up if you have not revoked the old key.

We've had public key crypto for decades; it's not that hard to understand.

Ultimately, the only way to prove identity is to meet face to face, exchange public keys, and sign them, at a key signing party. Unfortunately that is a barrier, but it is the ONLY way to confirm beyond a doubt that it is YOU behind that address and, if you do key signing correctly, behind ANY address and ANY key you sign.

The tech is sound; the web of trust is the problem, as it's not always used.

No other way exists to prove you own an email address, not without confirming other factors or confirming you have access to certain devices. I could email you a random string, then call you and have you read it out; that would work. But nothing lets you prove identity simply by giving an email address.
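That email-plus-phone-call check is trivial to set up; a sketch of generating the one-time string with standard tools (nothing PGP-specific here):

```shell
# Generate a 32-hex-digit one-time challenge. Email it to the other
# party, then have them read it back over a phone call you initiated.
challenge=$(head -c 16 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "$challenge"
```

The phone call is the second factor: it binds the mailbox that received the string to the voice you already know.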

DuncanLarge

Re: identity and encryption

As long as you have backups of your home directory, including the .gnupg directory, you are fine, provided you don't forget your key's passphrase.

Of course most people barely back up anything, so I think your question is just one of many similar questions that people only read once they have lost all their data and gone to reddit to ask how to recover it off a dead HDD.
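Everything GnuPG needs, secret keys, trust database, configuration, lives in that one directory, so the backup can be as simple as this sketch (paths are the usual defaults; adjust to taste):

```shell
# Archive the GnuPG home directory; restoring it on a new machine brings
# back your keys, trust db and configuration in one go.
GNUPGHOME="${GNUPGHOME:-$HOME/.gnupg}"
if [ -d "$GNUPGHOME" ]; then
    tar -czf gnupg-backup.tar.gz -C "$(dirname "$GNUPGHOME")" "$(basename "$GNUPGHOME")"
fi
# Restore later with:
#   tar -xzf gnupg-backup.tar.gz -C "$HOME" && chmod 700 "$HOME/.gnupg"
```

The chmod matters: gpg refuses to trust a home directory that other users can read.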

DuncanLarge

Re: identity and encryption

> will automatically be encrypted after the first email exchange.

That can be abused, and it also won't get used: there are loads of email clients and you basically need this in all of them.

I use Signal on my mobile for sending SMS, and if I ever find someone in my contacts who uses Signal too I can send secure messages. However, I doubt I will ever find another Signal user, as everyone else is still using WhatsApp et al, who also implemented the Signal protocol. Now if only we could trust that they implemented the Signal protocol properly and honestly, then Signal and WhatsApp could interoperate.

That would be great.

> Without the need for a third party key holder

There is no third party key holder in PGP. Well, there isn't the NEED for one. Put your public key on your website, attach it to your unencrypted emails, anything will do. Savvy users will then confirm your key is valid, while more trusting users will simply TOFU. The key servers are useful only if the owner of the keys bothers to use them, which is probably a good idea as it allows key revocation.

> I do not like third party key holders

There aren't any, but yes, I don't like them either. Keyservers are not key holders, well, not in the way you think, and they don't do any tracking (though they could track you by browser fingerprint). What you are thinking of is key escrow, where you must give up your private key.

> I do not like any protocol with a "revoke"

Why? If my key has been compromised and I'm no longer in control of it, then I most certainly want to tell anyone sending me encrypted stuff not to use that key. I also want those people to know that an email "I" sent them could not have come from me because I revoked the key, so when "I'm" telling my stock broker to transfer all my shares to some guy in South Africa, maybe they will think it's probably best not to do that. Or maybe "I" send my solicitor, who has never seen my face, a scan of my driving licence to prove ID on a house purchase.

Unfortunately no solicitor I looked at when I bought a house in 2012 used PGP/GPG, so I had to send a colour scan of my ID documents IN THE CLEAR FFS. I seriously would have preferred to FAX it. Oh well, my risk to take. And no, no encrypted zips either; I only found out about that limitation while in the middle of exchanging contracts.

> Want to change your public key? Then change it. People you communicate with will get a big fat warning that something is wrong because the key has changed. That's as it should be!

That's how it is.

DuncanLarge

Re: Encryption should be automatic

WTF are you on about?

The key servers are not involved in encrypting the message. They just let you find someone's public key, and there are plenty of other ways to do that!

One very modern method is to serve your keys via DNS.
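That DNS method is OPENPGPKEY records (RFC 7929): the key is published at a DNS name derived from the local part of the address, SHA-256 hashed, truncated to 28 octets and hex-encoded. A sketch of computing the lookup name (alice@example.com is a placeholder; the actual query needs a DNSSEC-validating resolver):

```shell
# RFC 7929: the DNS owner name for alice@example.com is
# <hex(sha256("alice") truncated to 28 octets)>._openpgpkey.example.com
label=$(printf '%s' 'alice' | sha256sum | cut -c1-56)
echo "${label}._openpgpkey.example.com"
# The key itself would then be fetched with something like:
#   dig +dnssec OPENPGPKEY "${label}._openpgpkey.example.com"
```

The point of DNSSEC here is that the answer is signed, so the record doubles as a (weak) attestation that the domain owner vouches for the key.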

The significant issue with public key crypto is building the "web of trust". Technically it's optional: you can confirm the key fingerprint and mark the key as trusted yourself, or you can opt for TOFU (Trust On First Use), which, if you are careful to tick the right boxes rather than clicking through without caring, should be adequate for most people and situations.

SSL sorted out the trust problem by implementing the third-party infrastructure you seem to be thinking of. PGP doesn't have that; it's totally independent, and only as strong as the people who use it.

The Battle of Britain couldn't have been won without UK's homegrown tech innovations

DuncanLarge

Re: The war is over, the empire is gone

> while it's nice to believe that

:D "believe"

:D

Who needs to "believe"?
