Android System WebView
Thanks - will tweak the article.
C.
It's true that people like watching bad stuff happen to other people. Getting a good look at something awful. Russian car crash dash cams are all the rage on YouTube. I dunno if that's possible to stop, or even a good thing to tackle.
OTOH while sites like LiveLeak have existed for ages and had loads of visitors, they're not on the scale of Facebook and YouTube, and also if you go to LL, you know you're getting gore and snuff. I suspect if LL had the reach of Facebook or YT, it would have been singled out early on.
I guess it boils down to this: censorship and moderation are harmful. Massive unedited and unpoliced platforms are harmful. There must be an in-between solution that keeps smaller platforms independent, with checks and balances kicking in when audiences start getting huge.
C.
FWIW government-level censorship is a terrible thing, and stripping unpleasant stuff from the internet is not great - OTOH it would be nice if FB took some responsibility for the content they are disseminating.
I highly suspect a lot of Register headlines would be deemed unpleasant by a large number of people and I'd hate for us to be thrown off the internet as a result. OTOH if The Reg had the same reach as Facebook, I don't think our headlines would be quite the same.
C.
No one's putting the blame on technology. As the article says, 'this murderous racist knew exactly what he was doing when he pulled the trigger'.
The problem is, how to contain viral murderous exploitative propaganda without stamping out other forms of expression. I'm all for individual outlets catering for all sorts of cultures and interests and people, all making their own free decisions on what to publish. What I'm, personally, not happy with, is a huge Mad Max platform that doesn't care a jot what is shared as long as it makes billions of dollars.
There are no easy answers. Tiered moderation, based on audience reach, might be one way forward.
C.
That is a problem that is difficult to solve without fundamentally changing Facebook - though funnily enough not a problem major, professional broadcasters have. Wonder why that is.
Facebook needs to grow up and realize what its platform is being used for. And it's not just livestream murders. It's anti-vaxx, flat earth, conspiracy theory nonsense that is suddenly given an immense platform.
I don't like any form of government censorship, heavy handed moderation, and similar - which is part of the reason why we try to push boundaries with headlines and writing.
On the other hand, it's not a black and white issue of freedom or no freedom. It's one thing to share stuff with friends or small groups privately that others may or may not like. It's quite another to have access to a huge potential audience.
Do I have the answers? No, and no one does. Though, thinking about it, maybe one approach would be tiered moderation. After the first 10,000 views, a post is flagged for increasing levels of moderation as its audience grows in stages (10k, 50k, 100k).
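To make the idea concrete, here's a minimal sketch of what view-count-tiered moderation could look like. The thresholds come from the comment above; the tier names and the function are entirely made up for illustration, not any real platform's policy.

```python
# Hypothetical tiered-moderation sketch. Thresholds are from the
# suggestion above; tier names are invented for illustration only.
TIERS = [
    (10_000, "flagged for basic review"),
    (50_000, "flagged for human moderator review"),
    (100_000, "flagged for senior review"),
]

def moderation_tier(views: int) -> str:
    """Return the highest review tier a post's view count has triggered."""
    level = "unmoderated"
    for threshold, tier in TIERS:
        if views >= threshold:
            level = tier  # each threshold crossed escalates the review level
    return level
```

The point of the design is that nothing shared privately or with a small audience is touched at all; scrutiny only ramps up as reach does.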
C.
Without the same reach as a vid streamed on Facebook, though.
Look, you can't stop small / niche / dark web platforms hosting this stuff, and I dunno if full-blown suppression of anything deemed nasty is the answer. I'm uncomfortable with heavy handed moderation. I don't want all bad stuff stamped out because it's v hard and there's the potential for certain views to be swept away.
OTOH I can think of a few things FB could spend some of that $22bn profit it made in 2018 on. The FB platform is too big and unmoderated. Would you live in a city with no police?
C.
I am dead against the government dictating what we can and can't see. If Fox News, NBC, BBC, etc decide it's too graphic to show real people being gunned down live, though, why is it beyond Facebook? Because of scale? Which is code for 'because we love making $$$$$$$$s from adverts with no consequences'.
If you make a TV show or documentary, and people refuse to broadcast it, or write a paper and a journal refuses to publish it, is it censorship or the application of standards? Don't get me wrong: this can be abused, and stuff can get suppressed for being uncool, unfashionable, or counter-cultural. That's why smaller platforms sprout up.
But if you have the reach of Facebook or YouTube, can't someone apply some kind of standards before a snuff livestream is disseminated? It's not black and white, freedom or zero freedom, it's not letting a platform with 1bn+ people just descend into Mad Max territory.
If there are riots in London, for instance, I expect and hope to see videos appear on the web. We don't need to see someone stave another person's head in with a mallet in real-time, though.
C.
"the nutter would have used a different service"
One with far fewer viewers and virtually no impact, hopefully, yes. There's no denying there are other platforms - in fact, why not create your own. It's still a free country in that respect.
The trouble, IMHO and what Kieren was getting at, is that if you're going to have as vast a reach as Facebook, YouTube, etc, cripes, take some actual effective steps to prevent your systems being wielded as a deadly propaganda weapon.
Apologies for the cliche, but: with great power, comes great responsibility. And Silicon Valley has shrugged off all but the bare minimum of responsibility.
Again, IMHO.
C.
FWIW WhatsApp and Facebook Messenger do voice and video calls. The only people who phone me via the traditional phone system are PR people, restaurants confirming bookings, and robo-callers. Everyone else uses WhatsApp (or Signal) voice and messages.
Edit: I don't mind downvotes, people are free to vote how they want, but I get the feeling it was something I said. Anyone want to help me out and explain? Cheers.
C.
As we said a few times in the article, it's not a big deal for normal folk. There are still 63 bits of certificate serial number space.
It's just a bit - get it? - embarrassing for the usually by-the-book world of cryptography. And an interesting or amusing bug that we thought Reg readers would appreciate.
C.
'1' is a perfectly valid cert serial number, yes. There is no problem with it. The problem is that no serial number would be generated with the top bit set, halving the number of available serial numbers and increasing the chance of collision.
C.
To be clear, the problem is all about certificate serial numbers, and nothing to do with keys. I've cleared out any mention of keys to avoid any confusion.
The issue is that serial numbers must be at least 64 bits long and positive integers. To ensure this, the generation software kept the top bit clear, effectively reducing the default 64-bit random serial to 63 bits of entropy.
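A rough sketch of the flaw, for the curious. This is my own reconstruction of the approach described above, not the actual CA software's code; the "fix" shown is just one common way to satisfy the positive-integer rule while keeping a full 64 bits of randomness.

```python
import secrets

def naive_serial() -> int:
    # Reconstruction of the flawed approach: draw 64 random bits, then
    # clear the top bit so the serial is always a positive integer.
    # Net result: only 63 bits of entropy, half the intended space,
    # which roughly doubles the birthday-collision risk.
    return secrets.randbits(64) & 0x7FFF_FFFF_FFFF_FFFF

def wider_serial() -> int:
    # One common fix: draw more bits than required (here 72), so even
    # with the sign constraint the serial carries well over 64 bits
    # of CSPRNG output.
    return secrets.randbits(72)
```

Note that `naive_serial()` can never produce a value at or above 2^63, which is exactly the halving of the space described above.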
C.
It makes perfect sense: fewer desktop Intel CPUs, fewer desktop PCs, fewer orders for RAM, more RAM building up in warehouses, prices drop as supply outstrips demand.
We're talking about the price of RAM, not the supply of RAM. Supply is outstripping demand. No shit you found RAM in your computers - it's cheap as, er, chips at the moment ;)
Hope this helps
C.
FWIW our publishing system uses multi-factor authentication. It is mandatory: you cannot log in to write, edit, publish, and manage articles without it.
So there's hope yet it'll be rolled out to comments.
C.
PS: We get our little red Reg badge when we post or reply to comments via the publishing backend.
FAN is being vague about the means - but it sounds as though the updates were intercepted or meddled with to allow the news org to be infected.
Here's verbatim from the news article - take with a pinch of salt.
"After connecting the Apple iPhone 7 Plus mobile device to the personal computer, not only the automatic launch of iTunes and the synchronization of user data were performed, but also Internet access was obtained from the Windows operating system and some system update files were downloaded that were installed automatically.
After that, the computer was actually managed remotely and all the necessary procedures were carried out to fully invade the local area network. It is worth noting that the intrusion into the local network was carried out from IP addresses controlled by American companies, including Amazon servers, which are usually used by hackers to sweep their tracks and hide the real source of attack."
C.
It went nowhere. The officials' demands for internal information were deemed overly broad by the judge.
C.
On client machines, DRM and cryptography. For servers, allowing you to upload code to run in an enclave in the cloud using remote attestation to prove the software hasn't been meddled with in transit or prior to execution.
That is SGX working as expected and intended.
C.
(See the 'read more' article in the piece on how SGX can be abused.)
See the paper, it discusses non-Intel technology.
"We focus on the Intel and AMD IOMMUs in our study. In the mobile space, ARM’s System MMU (SMMU) applies broadly the same concepts, and a natural extension of our work would consider use of the SMMU."
Intel may just be the start - it was the focus of the study.
C.
Yeah, we know the difference. Key thing is, the abbreviation didn't fit nicely in the headline space, which is the most important thing ever for us headline writers. The story is correct.
Now it's worked its way down the front page, where space isn't so limited, happy to switch it to the correct word.
Don't forget to email corrections@theregister.com if you spot a problem.
C.
In case anyone thinks we're misreporting this, here's the quote from NASA (in the linked-to webpage):
"During their last year or so of life, the Van Allen Probes will continue to gather data on Earth's dynamic radiation belts. And their new, lower passes through Earth's atmosphere will also provide new insight into how oxygen in Earth's upper atmosphere can degrade satellite instruments — information that could help engineers design more resilient satellite instruments in the future."
C.