Proper protocol
Nice that they launched both the QAS and PROD instances of the probe.
A few friends of mine worked in call centers for the Italian "market" (one of them, for example, was in tech support for HP for a while). All of them have actual four-year university degrees in Italian but, of course, they don't have native accents. It was not uncommon for call centers to have them pick false identities: they would be instructed to insist that they were Italian when challenged by the customer about it, or, if pressed really hard, to admit to not being Italian but to lie that they were from some Eastern EU country (Slovakia, Romania) and never say that they were from Serbia. And here the friend who told me this made a remark: "They (Italians) say that 'extracomunitari' (the term for people from outside of the EU) is not a slur, but that's not true, it is". So, I can imagine this having a legit use.
I think that, in this case, the people behind the token might be suable. They performed a quick cash-in, and that pretty much inevitably results in the token becoming dead and worthless. If that was the plan all along, it's fraud. Also, if, for instance, the celebrity promoters failed to mark their social media posts as paid promotion, I think that they are culpable. Other than that, as some have said, I don't know if these "investors" realize how hypocritical their position is. "Crypto is the future; it's decentralized and deregulated, it's freedom; if it's on a blockchain, it's yours, whereas money in the bank can be frozen etc; regulating it would defeat its purpose." Then, after the deregulated nature of crypto bites them in the proverbial, they run to the central authority and regulator.
Additionally, a co-worker of mine recently got into crypto, after a grand total of several hours of research. That amount of research was enough for him to conclude that "celebrity endorsement is proportional to the likelihood of a token being a fraud". So, how many of the investors actually intended to exploit the pump and dump that was about to happen, only to find that their pull-out game wasn't good enough?! And now they are playing the victims. I'm not claiming that all of the people who got burned in this affair are in this group, but I expect that some are, because people do buy s*itcoins knowing that they are s*itcoins, as a short term "investment", based on the amount of buzz around them, and with the intention of selling them when they start slowing down.
From what I've read on the matter, Bridgefy was originally meant as an app for communicating or passing emergency information in places and situations where mobile coverage and internet were scarce. With that purpose of a "text-based walkie-talkie" in mind, perhaps it didn't have to be super secure and anonymous. However, when it started being used in situations where the communication and users were expected to be cybersecurity targets, its rudimentary and flawed encryption became woefully inadequate.
"Fun" fact: according to ArsTechnica, the encoding method utilized by the app was introduced in 1993 and deprecated in 1998! How on Earth?!?!?! How does a modern app end up using something that was deprecated over 20 years ago?!
The frustrating thing is that for txt files there is a nice import wizard on open that allows you to set, among other things, the type of each column, so you can label the sensitive ones as text, but CSVs just open and you don't get to stop Excel from doing nasty things to them.
The workaround is to change the extension to txt before opening, but I'm sure that a lot of people in those research institutes aren't aware of it, don't know how, or don't even have the permission to make Windows show extensions for known file types.
The additional problem with CSVs and Excel is regional formatting, because a lot of the time a CSV gets exported under one set of locale settings and then opened on another computer with different ones, such as reversed decimal and thousands separators (probably more prevalent in the Balkans, where half the computers run with US settings and half with local ones, but any large international collaboration is likely to bump into something like that).
Excel and CSVs, definitely not a match made in heaven, but if MS simply added an import wizard such as the one for txt files, it would almost make things bearable.
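For anyone who ends up pre-processing these files outside Excel anyway, here's a minimal sketch of the kind of handling I mean, assuming Python with pandas is available (the file names and the idea that you'd do this at all are mine, not anything from the article):

```python
# Minimal sketch: read a CSV without letting anything auto-convert values.
# File names here are hypothetical examples.
import pandas as pd

# Force every column to plain text so identifiers that happen to look like
# dates or numbers (the classic gene-name problem) stay exactly as written.
df = pd.read_csv("genes.csv", dtype=str, keep_default_na=False)

# Files exported under European regional settings often use different
# separators: ';' between fields and ',' as the decimal mark.
df_eu = pd.read_csv("measurements_eu.csv", sep=";", decimal=",")

print(df.head())
print(df_eu.dtypes)
```

Which is roughly what Excel's txt import wizard lets you do by hand, and what its default CSV handling doesn't.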
Seriously, how hard is it to wash your hands a few times a day without being reminded, especially under the circumstances? It's 3-4 minutes a day of my time so, ever since all this began, every morning when I wake up, I immediately go to the bathroom and wash my hands 6-7 times and I know I'm good for the day.
I know they kind of relaxed UAC in 7 and later, but, AFAIK (and that's what I saw from skimming through that article), all they did was tweak it so that changes to some settings in Control Panel do not trigger UAC. IMO that isn't such a big deal (how often do you tweak stuff in CP after you've finished the initial set-up in the first several days after a fresh OS installation?), and I'm actually a freak who goes and puts UAC on the highest setting (which I think is the Vista equivalent) instead of turning it off.
Even so, I still see the UAC prompt less often than once a week on average, although one's mileage may vary: back when I did more development, running an IDE such as Visual Studio triggered UAC (the debugger required elevated access), but nowadays, in general use, one sees it very rarely, except when installing new software. That's mostly thanks to the fact that developers have learnt to make applications that don't require admin rights unless actually needed, which wasn't always the case.
By contrast, back in the XP days, developers assumed that we were all logged in as administrators (which was almost 100% true in the home environment), so, if it was easier to do something in a way that required admin access, they did it that way. Then Vista came along and suddenly some random application or game (for instance, some Football Manager from the era, if I remember correctly) would trigger UAC.
So, as far as I could gather, the fact that we now see UAC a lot less than when Vista came out has more to do with developers changing their practices than anything else.
Yup, those were two of the major issues that turned the public against Vista.
Regarding 1), adding UAC was a fine idea on a conceptual level (after all, Linux and MacOS have equivalent features). Where it failed practically was with the OEMs who installed bucketloads of bloatware on their PCs, most (if not all) requiring admin access, triggering UAC in some cases up to 30 times on each boot up (how anyone thought that that was ok is beyond me). Naturally, users hated the new OS and the UAC.
Regarding 2), the 512MB requirement was a concession from MS to (once again) OEMs, who could save a few $ (less than 10, if I remember correctly) per machine and still call it Vista compatible. As much as I understand that MS and large OEMs live in almost symbiosis, this was a bad call.
Another problem was the graphics cards of the day. Most PCs at that time had Intel integrated graphics, yet Intel had no GPU capable of driving Vista Aero at the time the then-new OS was released (DX9/SM2.0 capability was required). There were parts that had the capability on paper, but in practice (due to horrendous drivers and "shortcuts" made in hardware), none worked. Those users would buy brand spanking new "Vista compatible" laptops and get the ugly-as-sin Vista Basic GUI instead of what they saw in all of the ads and, whatever they did, they couldn't get it to work. Good luck trying to explain to a non-technical person how Aero not working on their brand new, expensive machine was anything but MS's fault.
There were also problems with Nvidia, who were still unprepared at the time of release and whose drivers caused the vast majority of Vista's crashes in the first months after release. But whom did the average person blame, Nvidia (who, in the eyes of the broad public, could do no wrong with their drivers, ever) or Windows (whose instability was a meme before that term existed)? Plus there were problems with some older NV parts (at least some FX series GPUs) which had similar HW and SW shortcuts and now didn't work properly with Aero.
So, mix all that together, add the usual "if you want to sound tech savvy, just say that MS is crap", and people were hating on the new OS to no end. Meanwhile, my at-the-time already vintage single-core Athlon64 machine with 1GB RAM and a Radeon 9800 Pro ran Vista like a champ, although the OS did take maybe 200MB more RAM than Windows 2000 did when run on the same machine (Win2000 is a lot lighter on its own, but add GPU and other drivers, with their new .NET control panels, and an AV of the day, and the difference was not as drastic).
I remember, when we were 12-13, my maths professor throwing the chalk at a classmate on one occasion, hitting him straight on the forehead. Then she asked for the piece of chalk to be returned and when she got it, she nailed him once more (she was a former handball player). The poor sod never said a word, he just went under the desk to fetch it again, but she said there was no need, she wouldn't do it anymore.
Sounds terrible when I write it like this, but it was a one-time thing and kind of done in jest. Also, I remember that guy from preschool and he was trouble even back then. We had a class reunion recently and at some point, as we were reminiscing about school mischief, people started remembering stuff involving him like "Remember that time you tried to throw me in the dumpster?!" It got a little awkward after 3 or 4 such stories.
Then again (while he was the class bully and did a number of really bad things over the years), by the time we turned 14-15, I actually realized on my own that he really wanted to fit in and not always be the bad guy. My guess is that he had been bullied himself a bit before he became the bully (he had a limp) and also that his family situation wasn't the greatest (I remember his aunt more than his parents, and his aunt's son was the worst kid in my sister's class). Sadly, no one at school managed to get through to him, although I believe that a couple did try, but probably with the wrong methods (our principal, for instance, whom I also remember picking on a mate and threatening him with scissors because of his "girly" hairstyle; well meaning, but a tad old-fashioned). The positive is that in the end he seems to have turned out a decent, regular guy.
Geesh, have I gone on a tangent or what?!
Here you still have to go in person, both to have your photo taken and to pick up the passport. The last time I went, they called me again the next day because the photo they took was later rejected. :/ Even the lady working there didn't know why exactly, but we assumed that it was probably glare or bad exposure because I shave my head, so she tried extra hard (with their really crappy light and camera) to make a decent photo. Similarly, something completely threw off the camera when they took the picture for my ID card, as the photo shows very little except a very faint silhouette of a head, a pair of eyes, a pair of nostrils and a mouth (but that one was somehow accepted). Fortunately I don't have to show it very often, because there's always a reaction.
There are two approaches to this problem:
1) What isn't allowed by the law is forbidden.
2) What isn't banned by the law is permitted.
In which of the two systems would you like to live? Because unless you opt for number 1, you can't bring legal consequences against someone for doing something that isn't legally defined as illegal. You can judge them on a personal level, you can advocate banning their practice, but you can't institutionally punish them for something that isn't illegal, because if we allow that, we have essentially abolished the law: we'll just have those in power making judgements based on their personal sense of justice, and a system that's wide open for abuse.
I would agree with the spirit of your post if it weren't for the fact that Google makes employees' calendars visible to their co-workers and provides the option to create just such alerts. It can certainly be argued that this isn't why Google created the option, but if we combine the fact that Google did give employees the ability to track their co-workers' activities when they find it worthwhile with its telling employees to act "if they see something that they think isn't right", it can be said that the fired employees were using company-provided tools to act upon a company policy, checking up on a project of dubious morality.
Opening smelly food on public transport is rude, I would agree, but how smelly is a freshly boiled egg?! It may accumulate some smell while sealed in a piece of Tupperware, but still. This part from the end has me further puzzled: "My commute to Vulture Central is frequently blighted by people opening up their stench boxes or unwrapping some greasy rancidity emblazoned with an 'M'." I would hardly consider the M food foul smelling. I would also hardly consider it great food, but here we are just talking odours and, while it naturally has a smell, I would definitely not classify it as an offensive one. Short of banning food from all public transport entirely, I think people are getting rather sensitive. Apparently, they'd like to use public transport and be alone on it.
All I've ever heard from Brexiters is (in no particular order):
1) EU makes everything complicated;
2) EU costs us too much;
3) we've allowed too many of *insert minorities of choice, essentially, the non-British*.
I direct your attention to point no. 3. The pesky non-Brits are a thorn in Brexiters' side because they supposedly mooch off the British social protection and healthcare system and because they drive the price of labour down (!) as they are used to a lower living standard. So I will be honestly shocked if, post-Brexit, instead of explaining why you skipped that Portuguese taking a gap year, you'll have to do the same for a guy from Bradford who's done some Open University.
I'm one of the people who have tried it and still found that it too often produces inferior results to Google, despite the fact that I also agree with the criticism of Google results that others expressed: sometimes the entire front page is stuffed with results that stopped being relevant ages ago, and additional options and switches that were very useful no longer work correctly, or at all. Bing used to be a bit better at some things, but it too has started to show a penchant for antiquated results. One "engine" that I find somewhat interesting, strictly from the tinfoil hat perspective, is Start Page, which provides anonymized Google search results, for better and worse (for example, the fact that Google uses the country you're in as one of the parameters can be both desirable and undesirable at different times).
While hardware manufacturers always loved MS because new Windows versions helped them move new kit, the only way MS could have prevented the issue with the missing drivers was to never change the driver model, or to keep supporting the old one perpetually (and my guess is that there's a reason those things change, on all OSes; would Mac OS X and Linux drivers written for their 6-7 year old versions work on the latest ones?).
As for the network issues, maybe it was a Vista problem, but I remember having a Vista machine together with a Windows 2000 machine in my home network and they played along just fine, even sharing the printer that was hooked to the Vista box. On the other hand, I see weird network issues every week, regardless of the OS.
Maybe you meant to say "three decades ago", but I'm not sure about that either. Two decades ago, building a new PC or performing CPU and RAM upgrades was as trivial as it is now and GPU upgrades were fairly similar to what they look like today as well (uninstall driver, swap cards, install new driver, cross your fingers). Replacing everything but the hard drive on a live system would have probably required an OS reinstall, whereas now Windows pretty much sucks it up, but it wouldn't be rocket science, it would just take an afternoon. And heck, I'm not even sure about three decades ago. In the early days, you often just popped expansion cards into the PC and software that was designed to support it just worked with it, without any drivers being required. Sure, maybe you had to set a few jumpers here and there, like finding an IRQ/DMA/IO combination that was available, but that wasn't as difficult a task as some suggest today, and with that you were often good to go.
In short, while I'm not denying some improvement has occurred since those early days, PCs have always (by design) been fairly approachable and upgradeable, it's just that they used to be new and people were more mystified by them than they are now.
Something is relatively good because it is a vast improvement over what was in place before it. And it's not like things are not progressing still. Have you been around Europe over the past few years and noticed the big increase in the number of Chinese tourists everywhere? The purchasing power of the Chinese people is increasing constantly. Not yet as highly paid as their European and US counterparts? Maybe, but they are already a far cry from the stereotypical oppressed mass that works for a handful of rice a day, and the trend is in their favour.
I like how the ICJ can claim to have jurisdiction over something that happened in the '60s, while claiming not to have jurisdiction over the things that happened in the '90s on the basis of country X not having been a member of international body Y at the time of the events.
Word of warning - that's not coffee on the keyboard (see title).
Well, maybe you don't care whether the Avengers films were ever made or not, and I know many would agree with that (though for different reasons, mainly that the films had zero artistic value), but a whole lot of people would disagree. Many of them wouldn't pick up a comic. And a similarly large (if not larger) number wouldn't bother with some indie-production superhero film (the niche in which those have a cult following is a tight one). So, arguably, it would be a loss.
I like a good read, even without the pictures, as much as the next guy, but the superhero stuff is partly fun because of the superhuman acts, and being able to see them is a major part of it - that's why the genre got so well established in comic book form (that, or the fact that both are, or were originally, meant for about the same mental age). You can get the human drama elsewhere, but the superhuman stuff is what makes this genre special. And if pictures are good, moving pictures that seamlessly combine live action and flawless CGI into a visually believable spectacle are better. And it's not just that. Any proper epic, with mass scenes, good costumes etc., needs a budget to be done properly.
Therefore, all joking aside, losing the ability to make large productions would be a loss to the film art. Though not necessarily because of the Avengers. As far as those are concerned, the Lumière brothers needn't have bothered.
I don't think that you're fixing any problems by transferring copyright from a studio to a committee of a thousand people. If someone is to invest tens of millions of dollars into a film (and that's the only way 1000-people productions can happen), they are going to want a (big) percentage of any earnings from the film. In the end, you have the same thing, except the titular holder of the copyright is different.
Shortening the copyright period and restricting what exactly is protected by the copyright law may have some benefits.
A buddy of mine worked at a popular place which catered to various tastes, though mostly rock (from "folk rock" to proper metal and gothic stuff). His experience: the darker the music, the more polite the people ("They actually say please!"). I've noticed the same thing, big guys in black leather and spikes apologizing if they slightly bump into me, the crowd breaking up a mosh pit to search for someone's contact lens... :D. At mainstream events, stay the whole night and you will see at least one incident (a fight, a bottle thrown...).
A kid "borrowing" a credit card from one of their parents' wallet is the same as a kid "borrowing" some cash from one of their parents' wallet. A business should have enough sense and conscience to have it's attention raised by an excessive amount being spent, but ultimately it is a matter that really needs to be resolved between the child and the parents. I doubt I'm the only kid who always returned the exact change home without anyone having to explain it to me, ever - it was the logical thing to do. And I was much younger than twelve when they started sending me to the grocery store. A twelve year old ought to have more than enough of understanding how money works as well as sufficient moral scruples to not do c*ap like this.
It's something in between. AMD's module (2 cores) has two execution pipelines, Intel's hyperthreaded core has one. So, assuming a four-stage pipeline, in Intel's case, that would be
thread1_instr1 | thread2_instr1 | thread1_instr2 | thread2_instr2
whereas AMD's case would ideally look like this:
thread1_instr1 | thread1_instr2 | thread1_instr3 | thread1_instr4
thread2_instr1 | thread2_instr2 | thread2_instr3 | thread2_instr4
The problem appears when both threads need a shared resource at the same time, forcing one of the threads to skip a beat. That's not always too bad. If 10% of instructions are conditional, then on average both active threads will need branch prediction on the same cycle only once every 100 "steps" (0.1 × 0.1 = 1 in 100), so over those 100 "steps" one core will be utilized fully and one 99/100, making it a very small loss and still practically a lot closer to two cores than to one hyperthreaded one. The problem was that both cores needed the fetch and decode unit pretty much all the time, and that apparently did hurt the performance, but not to the point that it got reduced to hyperthreading. Indeed, in well threaded tests of the time, the eight-core AMDs compared well to Intel's quad-core, hyperthreaded i7s, despite generally significantly lower single-core performance (and especially if the price-to-performance ratio was considered, although that is an economic and not a technical parameter).
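If you want to see where that 99/100-ish figure comes from, here's a toy simulation of two threads contending for one shared branch-prediction unit (purely illustrative, using the 10% branch rate assumed above; it is not a model of the actual hardware):

```python
import random

# Toy model: two threads share one branch-prediction unit. Each cycle, a
# thread issues a conditional instruction with probability P_BRANCH; if both
# do so in the same cycle, one of them stalls for that cycle.
P_BRANCH = 0.10
CYCLES = 1_000_000

random.seed(42)
stalls = 0
for _ in range(CYCLES):
    if random.random() < P_BRANCH and random.random() < P_BRANCH:
        stalls += 1  # both threads want the shared unit; one skips a beat

completed = 2 * CYCLES - stalls
print(f"collision rate: roughly 1 in {CYCLES // stalls} cycles")
print(f"throughput vs. two ideal cores: {completed / (2 * CYCLES):.4f}")
```

With those assumptions it reports a collision roughly every 100 cycles and a combined throughput of about 0.995 of two ideal cores - nowhere near being "just hyperthreading".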
Sun UltraSPARC T1 had, if memory serves, 6 cores (with 4 logical cores each) all using one shared FPU. Nobody sued them and I don't remember anyone saying that it wasn't a 6-core CPU.
It's worth saying that Sun was fairly up-front about the limitations, and I believe the official guidance was that performance suffered if the share of floating point instructions in the code exceeded something like 6%. In its intended application scenarios - the so-called enterprise loads that mostly just shuffle data around for a bunch of concurrent users - it ran circles around the Xeon and Itanium competition with a comparable number of sockets, and that was good enough for people.
But it wasn't that different in AMD's case either. The Bulldozer (and Piledriver) CPUs performed very well under specific workloads and so-so in others, and that too was well known, as a huge array of benchmarks and reviews was widely available.
It's also hard to claim to have paid a premium for the chips when they were cheaper than Intel's mid-range (i5), not to mention higher end (i7), CPUs. They were pretty much budget CPUs; some of them even had a launch price as low as $110.
The only possible exceptions are the 9000 series models, as those were expensive, but it's hard to claim that buyers didn't know what they were getting: they were merely factory overclocked models which launched almost two years after the first Bulldozers, and they were also reviewed fairly extensively on their own.
Additionally, FWIW, with AMD's share in pre-built systems being what it is (and what it was at the time), the people who bought FX-8000 (and 9000) series CPUs were generally the people who build their own systems, not some uninformed poor souls who bought a box because AMD slapped it and said "This bad boy can fit so many threads!". So, IMO, this is either some buyer's remorse, or someone smelling free money.
Gah! I guess going back to Firefox will be the way to go, but I so dislike the idea that it hadn't even occurred to me until I read the comment. Nothing personal, but, for several years, it just hasn't sat well with me. I'm using Opera, BTW, and also have some love for what the team behind Vivaldi is doing, but since those too use Chromium, my guess is that they will be just as affected.
EDIT: I had been wondering for quite some time how long Google was going to keep allowing ad blockers. I guess I got my answer.
Since we're discussing the presence of non-IT articles on El Reg: generally speaking, I have always appreciated that, from time to time, the Reg covers topics and angles related to society (civil rights, policy changes...) that the mainstream media outlets often mishandle, or miss altogether, and does so with much better research and writing than the vast majority of "non-mainstream" sources, which are mostly junk.
Those articles may not be an obvious fit for an IT site, but they have always been quite a welcome bonus for me.
I think I have participated in that survey and, from the statistical breakdown I got at the end, which included the mentioned things such as gender, age, social status, etc., I don't think it captured the rules by which I was making the decisions:
- my own autonomous vehicle should never decide to kill me;
- humans matter more than animals;
- the vehicle should not swerve into people who didn't step in front of it in order to save those that did.
The end.
A more general form of the third rule actually makes the first one redundant (assuming that self driving cars obey the regulations): the car should not put people who aren't violating traffic rules at risk in order to protect those who are.
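Just to show how the generalised third rule subsumes the first, here's a toy encoding of the rules above (a sketch only; the predicates are hypothetical placeholders, not anything a real vehicle could actually evaluate):

```python
# Toy encoding of the rules above, purely to show the redundancy argument.
# The fields (is_human, is_violating_rules) are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Party:
    is_human: bool
    is_violating_rules: bool  # e.g. stepped in front of the moving car

def may_endanger(candidate: Party, to_protect: Party) -> bool:
    """May the car put `candidate` at risk in order to protect `to_protect`?"""
    if candidate.is_human and not to_protect.is_human:
        return False  # humans matter more than animals
    if not candidate.is_violating_rules and to_protect.is_violating_rules:
        return False  # don't sacrifice the rule-abiding for the rule-breaking
    return True

occupant = Party(is_human=True, is_violating_rules=False)
jaywalker = Party(is_human=True, is_violating_rules=True)
print(may_endanger(occupant, jaywalker))  # False: the occupant is protected
```

If the occupants aren't breaking any rules, the generalised third rule already refuses to sacrifice them, so the first rule never has to fire.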
"If the Hubble team can implement solutions from the ground to compensate for the problem, the space 'scope will return to three-gyro operations – if not, one of the gyros now in service will be parked and the telescope will be put into single-gyro mode."
So, three gyroscopes are good, one is ok, but two are not an option?
More proof that all the academic titles in the world aren't enough to protect a frail ego, which will still be hurt even by things such as a tongue-in-cheek jab primarily aimed at the frail ego's preferred gadget maker.
BTW, I'm not sure if mentioning Microsoft is at all relevant in this context, unless your work (presumably) in the field of non-linear control engineering limits your personal phone options to either Apple or Microsoft.
I have a few female coworkers at the moment and I appreciate them very much. Smart, hard working, dependable, well mannered, and I really mean all those things (which also generally apply to my male colleagues, it's a really nice team). I absolutely wouldn't turn anyone's job application down based on gender, race etc, but that is one thing.
A completely different thing is: you have an offer for which you will pay to be displayed to people on Facebook; you can't have it displayed to everyone as it's prohibitively expensive; therefore you pick parameters of a population that's most likely to respond positively so that you can get the best response for your money; if the job is in the US, you probably don't need to show that message to people in Germany; if 16/17 professionals in that branch are male, you target men with that paid ad. Because if you, say, pay $100 to have the job offer shown to 10000 random people, the expected number of truckers that you reached is 100. If you pay the same $100 to show the offer to 10000 random men, you've most likely reached somewhere around 200 truckers (supposedly, there are around 3.5M truckers in the US). The yield doubles.
That's unless you have reliable access to much more private information about everyone, such as previous job experience (in which case, of course, you pick those who have worked in the field), and that still doesn't mean that you turn down women who apply for the job.
The point of targeted ads is that they are to be shown to people more likely to click on them and go for what you are advertising. If one in 100 guys is a truck driver and just one in 1600 women is (numbers roughly correct for the US), then, since you're paying to have your job ad shown, you only pay to show it to guys, because otherwise you're wasting half of your money by paying to show the ad to people who are 16 times less likely to go for it.
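Back-of-the-envelope, using those rough 1-in-100 / 1-in-1600 figures (they're only the estimates quoted above, so treat the sketch accordingly):

```python
# Expected truck drivers reached per ad budget, using the rough rates quoted
# above (1 in 100 men, 1 in 1600 women drive trucks in the US).
P_MALE_TRUCKER = 1 / 100
P_FEMALE_TRUCKER = 1 / 1600
IMPRESSIONS = 10_000

# Random audience: roughly half men, half women.
random_audience = IMPRESSIONS * (0.5 * P_MALE_TRUCKER + 0.5 * P_FEMALE_TRUCKER)

# Men-only audience.
men_only = IMPRESSIONS * P_MALE_TRUCKER

print(f"random audience: ~{random_audience:.0f} truckers reached")   # ~53
print(f"men-only audience: ~{men_only:.0f} truckers reached")        # ~100
print(f"yield ratio: ~{men_only / random_audience:.1f}x")            # ~1.9x
```

So a men-only audience reaches roughly twice as many potential applicants for the same spend; the exact factor obviously depends on how good the underlying rates are.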
I'm not sure I follow. Bulldozer and related microarchitectures were a flop because neither performance nor power efficiency was competitive (except under very specific workloads). Ryzen is a completely different case. As for ARM, I'm a little sceptical.
In the past, performance and compatibility weren't there. As those improve, the gap in power consumption dwindles. I fear that in the end, if it does become an option, there will be no particular reason to go with ARM except for the sake of getting something that's not x86. Which is cool in a way, but it would be even better if we could get a practical advantage. And they do need to go the full distance on performance and compatibility, as the market has already rejected products such as Windows RT (power efficient, cheap, not really compatible) and Transmeta Crusoe and the like (low power consumption, low performance, and the laptops weren't exactly cheap).
The problem is that they have a legitimate case. The sites in question are porn sites. Their business is showing porn to visitors (and making money from it the same way almost everyone else does). They have extensive collections of material - not just a random pile of stuff, but catalogued, curated, searchable content, often with extended metadata. And it's all pirated. They've made a business from "broadcasting" (streaming) pirated content. And their defence is essentially that they outsourced the procurement of the material to people whom they aren't paying. If that falls under safe harbour, then it's an obvious legal loophole that the rights holders will demand be closed.