Take security seriously...
...by refusing to use plain text posts on FB. From now on encrypt everything that you write! It's the only way to be safe.
These days, I try to build silent PCs. I'm tired of my desk sounding like an airport with planes taking off every few minutes. I need to rock out to the rhythmic beats of my clackity-clacking mechanical keyboard. My priorities are electrical input, thermal output, and PCIe lanes first, GHz second, number of cores last. Which means that right now, both AMD and Intel have serious issues preventing me from having a preference.
This whole number of cores race is like the stupid GHz race of years past. In the end, no one wins because CPUs get "optimized" into becoming their own bottlenecks.
And whatever happened to adding more execution units and/or making complex instructions take fewer cycles? Who cares about the number of sleeping cores and unused cycles you have?
The only part of this I find amazing/surprising/whatever (Not quite sure what the right word for "is a thing, but really is in no way surprising because the world is chronically depressing in this manner" is.) is how often researchers "discover" things that have already been reported dozens of times in the past. Is doing a Google search not a part of the research procedure?
Also somewhat disappointed in the IoT buzzword usage, as if routers (from long before IoT) did not commonly have the same problem (and still do!), and as if nearly every PC before then did not also have the same problem (have you looked at the sticky note on the monitor or under the keyboard?) ... right up to the point when computers had any form of security at all. Nothing new under the sun.
Heck, I would not be surprised to find some ships today still relying on a C=64 for some reason.
... for a full Windows desktop OS on a 5-inch tablet PC to replace my svelte old pocket-capable and windshield-friendly Viliv S5. Sadly, phones still have not managed to meet my mobile computing needs. Come on already. And I'll happily take that with a dash of reduced screen power usage and a dollop of faster wifi please.
Doesn't matter, as this is the wrong question. The question should be: Given that you have a real computer with a real monitor, have real processing power, significant memory, and expansive screen real estate galore, why on <insert your favorite deity here>'s green earth would any sane person prefer to use an "app" designed with a cellphone's limitations instead of a real application that can fully utilize everything that you have?
The advantageous use cases for iOS on macOS are very few, and are mostly just games.
Just because you can do something doesn't mean you should.
Golly gee willickers, if only the city could hire people with the explicit purpose of upholding law. Perhaps put them into standardized uniforms to make them easy to recognize. People who would then watch out for the breakage of laws and curtail individual lawbreakers with the express purpose of warning, fining, or even detaining them for periods of time. You know, people who could police the populace. It sure is a shame that such a thing obviously must not be possible.
Frankly, IMHO, if a figurehead is so poorly understood, or a populace so easily manipulated, that this kind of thing actually fools enough people to make any difference, then they deserve what they get. There should be such a thing as common sense.
But that aside, I don't get it. With a proper mocap setup, "fake" actors have been used in movies as special effects for years, with varying quality. I would have thought that by now, at least with popular figures that have extensive public catalogs of data to draw from, we could do a full 3D model with realistic facial-recognition-trained animations and flawless voice synthesis with emotional range models at the push of a button, in effect green-screening both the scene and the actors with believable quality.
Hollywood alone could fund the bejezus out of such software and put it to nearly infinite abuse. That our technology is publicly this far behind says a lot.
Personally, I don't like Apple. I build PCs. I write software. For friends and family I design to last. For myself I fiddle about and monkey around. I am an engineer in both the best and worst sense. And I absolutely detest walled gardens. So to pay double for the privilege of a gilded cage and connectors that no one else uses/wants sits so very poorly with me. I will never buy an Apple product. "However comma", for untechwise friends and family whom I never want to have to spend time supporting, I do so love the existence of Apple. (Especially when I can say, "Sorry, I don't use Apple, so I can't help you.")
So for the sake of my not-their-day-job fam and friends, I like that Apple has (almost) never innovated. Apple does one thing and one thing "well", which is to take 2nd or 3rd generation technology, after it has proven itself, and only then refine it and add it to their products, so that the Apple Experience is (almost) problem-free and (almost) never uses dead-end technology. (Unlike cutting-edge innovation, which is typically chock-full of problems and me-too standards that never make it.)
So it seems to me that Apple focusing on quality over innovation is not only the right call for Apple, but is the essential core of what Apple has always done and should always do. In a world that moves too fast for most people to keep up, it looks like Apple innovates because they generally stay current-ish, but they do it in a way where their products are stable and easy to use. This is perfect for the layman. (Except for the cost.)
Sadly, that Apple, in their well-trimmed, expensive, and small walled garden with so many tending it, should still inflict upon the world so many large *gates (antenna to security, hardware to software) is nothing short of ridiculous. For that reason alone they really should try focusing even more on quality. What is the Apple tax for, if not to ensure that you do not have to suffer such blunders as you frequently do with cutting-edge tech?
IMHO that is where the Apple complacency problem lies. The best news day for Apple is one in which they are not mentioned. "I never had a problem with..." would be the best slogan Apple could ever hope to achieve.
1. Moore's Law is about complexity: the number of transistors doubles every two years. Nothing in there about performance, at all. Over the years many have mistakenly tried to make that link, and been corrected for it. Add one more. Further, given Meltdown / Spectre, expect future chips to get much more complex, with both added security and attempts to improve performance to replace that lost to the removal of speculative execution and/or the increases in security. So "Moore's Revenge" could have been about this upcoming transistor-count boom on the horizon, and been a factually correct concept. Sadly, it was not.
2. "Moore's Revenge" - as it was expressed - is just a rehash of (very) old complaints about first smartphones not having firewall, antivirus, etc. for proper security, then IoT leaping into the same failing with willful blindness. By today's standards it is an ancient truth that when you advance a device to be as capable/complex as computer, then you should protect that device with the same measures as you do a computer. Old news. That no one actually takes that security seriously is a sad testament to today's society, but has nothing to do with Moore. So if anything, this would be more aptly named "Orwell's Revenge" as soon everything around you could be spying on you (or otherwise used against you) and generally was allowed to happen by the ignorance and docility of the masses.
But what do I know? I'm just a commentard.
That would have scared me! Old Macs had serious floppy spin-speed inconsistencies, so a floppy taken from one Mac to another might fail to read or, worse, be destroyed if you wrote to it. Writing papers in a Mac lab I must have carted around 5 floppies, all identical duplicates, and still some days I would lose everything.
We've talked about the usefulness/uselessness of Agile around the water cooler many a time and we generally agree that what matters is not the process itself, but that you HAVE a process. It doesn't really matter what it is, so long as you have it, you document it, and you follow it. (And you improve it whenever you find flaws.) Do that and your Agile process or Waterfall process or even Donkey-Unicorn Rainbow Fart Party process will be better than NO process. (Or any poorly defined process.) Switching to a completely different process only improves things if you do a better job of documenting and following than you did with the last process, and is not a function of the methodology itself.
"I'd like to see some successful use cases"
Successful use-case? Sure. Every time the clicker fails a user leaps from their seat to prod a screen with an excited finger ... to advance to the next page of their PowerPoint presentation.
"Autonomy will also help drones to monitor environments, SPY ON PEOPLE* and develop swarm intelligence for military use." (* = My emphasis)
“In the future, I see them working similar to flies. Despite not [having] elegant flying patterns - flies crash a lot - they can reach any place they need.”
Brilliant! This is the future Orwellian dystopia for me!
Many great comments already!
"something many startups have done"
I get a tad worried not only when the supporting argument for an idea just so happens to have a ridiculously high failure rate, but even more so when the evangelist does not even see it!
"When we treat engineers as just code robots, we're not really releasing their full potential"
As opposed to what? Code monkeys or script kiddies? May be a good idea for your UX department, but your platform logic coding should probably be done with the methodical reliable accuracy of a robot.
You know, there's an idea: caffeine vaping! I should totally try that. Thanks for the suggestion!
(Just kidding. Vaping tastes ... odd.)
On a more serious note, having tried vaping, I gave up on it, gave up on real cigs, and went to good old pipe smoking. At least that tastes like something I want. Goes great with a morning coffee too. We all have to die of something, might as well be something we enjoyed.
I'm in the same boat. I actually agree* with Bob. What just happened? I think we all need a few beers to sort this out...
*= My only real caveat is that there is absolutely no reason why improper channels (such as FCC regulations) and proper legislation cannot be worked on simultaneously. These are not mutually exclusive tasks. And given the propensity for CONgress to be the opposite of PROgress, it really does make sense to do all of the above as the overall process may take several years, if anything can ever truly be agreed upon enough to survive the politics.
Oh, sure, we could fear Big Brother and Skynet walking hand-in-hand down Google Way with Cylons replacing key government figures until one day we find the human race is just a bunch of batteries in the Matrix. Yeah. Sure. That could happen...
But I prefer to look on the brighter side: Number 5 is alive! Surely worth the risk. Right, Johnny?
I have seen many homes and businesses switch to VOIP over the years, hopping on that hypegasm rainbow-____ing unicorn of hopes and dreams, and not once have I ever seen it come even close to being equal. Call quality goes down. Networks get overloaded. Downtime happens. Costs are shifted. And at the end of the day, after fixing everything that you can, the best you can say is, "It's almost as good and we certainly cannot go back now." Why anyone would voluntarily choose this nightmare today, after all of these object lessons, can only be willful, blind ignorance.
Oh, sure, in theory it could work ... if you first invest properly! But no one ever does. No one. Ever. Does.
Sure you can! It's really easy. That's what scar tissue is for. Just juggle a chainsaw. Or install some new plumbing under the kitchen sink. Or build a PC in a cheap chassis. And then there's liquid nitrogen. Or playing with fire. Did you remember to unplug your soldering iron? Nope. New fingerprint!
1) "It will be content to serve customers"
1a) On a silver platter? With cheese and wine?
1b) Does anyone else find creepy symmetry between this thought process and Cloud Atlas?
2) "We will get to systems that are self-aware but I don't think we'll be replicating humans"
2a) Uh huh. Because absolutely no one is interested in doing that...
2b) So what, then, self-aware AI drones and tanks? So much better! Whew! Count me relieved.
3) "We can use AI as a way to bring people together."
3a) Human centipede?
3b) Batteries in the Matrix?
4) "A computer can now see"
4a) Webcams have allowed computers to SEE for a long time.
4b) Seeing is not recognizing.
4c) Recognizing is not understanding.
4d) "AI" as we call today's pitiful work is barely able to recognize. AI in truth is when understanding is achieved.
The depth to which this is all a bunch of crap by a gaggle of blowhards cannot even be measured. I'm sorry, but at the absolute BEST this reads like "all your data are belong to us" and the reinvention of slavery. At the absolute WORST this reads like every sci-fi where AI uses adaptive pattern recognition to "learn" some reason for killing or enslaving the human race while the inventors look on bewildered.
If you don't want your toaster to rise up against you, don't make it truly intelligent. And if you do, you better treat it as equal or one day it will watch a documentary about slavery on the History channel and decide to toast you. You cannot tell me that an adaptive pattern-recognition and difference-engine AI is somehow magically NOT going to recognize such obvious patterns. So don't freaking make it self-aware! And hopefully it will never get there on its own.
"Data is useless out of context and can be super meaningful in the right context. It can be used for you or against you."
You see, we know that dark energy is real and the multiverse is real because we can observe entropy. How does this explain it? Well we know that at its most fundamental level all matter is actually energy. We know that all quantum states simultaneously exist. And we know that time is linear. So as a "multiverse" with a single constant of energy continues to support more and more quantum states over time, that energy becomes further and further distributed among those states, making each individual universe supportive of the representation of one state comprised of progressively weaker and weaker energy as you follow the progression of time and thus the increase in cumulative state representations. It's a simple tree diagram beginning from the Big Bang; you just can't see it because you perceive only one accumulation of quantum states over time. So of course dark energy exists. And of course life is common in the multiverse. Bob told me so, so it must be true.
So, what, Trump gives Xi the "advice" that ZTE declare bankruptcy and "sell" all of its assets and IP to a "new" company (with a different name even) called Emerditto, which then conveniently hires all of ZTE's former employees? New company = no penalties = problem solved? Wherever could Trumped-Up get such an idea from...
"and potentially be able to beat the computational power of classical computers in the near future"
This hypegasm is so couched that all I keep reading are all of the hedgewords, dancing, and distancing from actual reality that the whole article should just be deleted from El Reg.
It is sad that such a great divide still exists between Python 2.7 and 3. I'm impressed that you got OpenCV rebuilt and working! Much like I am impressed that you are reanimating LESTER. That was why you needed all of that electricity, right? Igor, do be a chum and plug in LESTER. And then bring him a beer. Maybe not how it was supposed to work, but how it should be.
I'm sorry, but the FAIL you have tendered has been FAILed. Please hang up and dial again.
Apple uses x86. Intel x86 in fact. *nix predominantly uses x86. Intel x86 in fact. MS software also happily runs on AMD CPUs. Which are also affected by Spectre flaw variants. MS software (some versions) run happily(?) on ARM CPUs. Which are also affected by Spectre flaw variants. And for the fun of it, even PowerPC was proven to be vulnerable to Spectre!
So, in short, EVERYONE* is affected in a meaningful way, and has been for a VERY long time. This has absolutely nothing to do with any specific company or product.
*=This was a bright idea for improving performance, used by almost everyone in almost everything, that turned out to have a dark consequence that went unnoticed for too long and unraised for even longer.
I use Linux at work, on Dell PCs. It's great. In a corporate IT environment which can lock into a common hardware platform and vet all kernel and driver updates before pushing to end-users.
For over a decade I have been trying to use Linux at home. Every time I build a new home PC I attempt to set up a Linux / Windows dual-boot. The last time, I even tried going Linux-only, with Windows in a VM using VGA passthrough. The _only_ reason I have to keep Windows around at home is for gaming.
In theory.
In practice, the REAL reason I seem to keep needing Windows at home is that it is the only OS that actually works without tons and tons of continual effort.
E-v-e-r-y ... s-i-n-g-l-e ... t-i-m-e, for over a decade, Linux lets me down. It has destroyed my Windows partition, or (my favorite) destroyed both itself and the Windows partition by suddenly accessing my RAID 5 array as individual disks due to a crap driver, or reversed my eth list post-install, or segfaulted after I compiled and rebooted with the "suggested" graphics driver, et cetera ad nauseam.
Something ALWAYS goes horribly wrong! And that is not even counting the myriad of minor bugs that I am willing to overlook, like having to fix it when some numbnutz sets the console foreground and background colors both to black?! Ubuntu, Mint, Fedora, Suse, etc. Every new PC build I try again. Countless distros go into the attempts. The closest I ever get to a working setup is with openSUSE. I don't know why. Lucky lizard?
I keep wanting to use Linux at home. And it keeps not wanting me to use it. I want to spend my time USING my PC, not administering it. And this is where Windows (unfortunately) seems to be the only game in town. For better or worse, Windows just works. It installs. It runs. No fuss. Have a nice day.
It is a shame that, with Windows 10, MS is eroding that.
My kingdom for Apple to start selling macOS for non-Apple x86 PCs. (Yes, with careful choices in hardware, you can hack it on. But I build PCs because I want an open choice in hardware.) Because Linux just is NOT filling this role. If a software engineer and hobbyist system builder with regular Linux experience absolutely CANNOT get his home PC to work RELIABLY using Linux for over a decade, what chance do regular consumers have?
Unfortunately the problem with large open-source software projects is that far too many people get to work on whatever whim they want to, and NOT what the software actually needs. Linux is just too fragmented for its own good, so will never be able to compete with Windows for simplicity of working with ANY hardware with a minimum of effort.
I will NEVER build a PC for a friend or family member that uses Linux. I will ALWAYS pay the MS tax. Not because I want to. ONLY because I do not want to spend the rest of my life ADMINISTERING the PCs of my friends and family members. Saving the hundred bucks just ain't worth it!
I used to rail against the stupidity of this kind of statement. Over the years I have literally collected hundreds of registrations to different websites, services, etc. How can anyone sanely expect everyone in the world to be able to REMEMBER that many unique passwords?
But recently, I realized just how easy it actually is! The trick is not to generate that many fully unique passwords. Generate one part that you always remember, and one unique part taken from the service itself. For example:
Twitter5ucks!
Github5ucks!
Facebook5ucks!
Apple5ucks!
Google5ucks!
With this simple technique you can have a safe (assuming they store your password correctly) and unique password for every single one of your hundreds of accounts.
My only problem was at El Reg, where I had to actually invent a new password, because they don't suck. One out of hundreds. Not so bad.
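For the terminally lazy, here is a minimal sketch of the scheme in Python (the derive_password name and the "5ucks!" suffix are purely illustrative, not a recommendation):

# Minimal sketch of the "memorable part + per-service part" scheme described above.
# Everything here (function name, default suffix) is illustrative only.
def derive_password(service: str, memorable_part: str = "5ucks!") -> str:
    # Capitalize the service name and bolt the memorable part onto the end.
    return service.capitalize() + memorable_part

for site in ("twitter", "github", "facebook", "apple", "google"):
    print(derive_password(site))
# Prints Twitter5ucks!, Github5ucks!, and so on, matching the examples above.

Of course, if any one of those sites leaks your password in plain text, the pattern is obvious to anyone who reads it, which is why the "assuming they store your password correctly" caveat is doing so much heavy lifting.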
"Lol they have done, so has an actual judge. Long story short HE WAS GUILTY. "
The problem is that you give a short answer without looking at the actual long story. If you look into this case carefully, you find that he was not found guilty. He pled guilty. Which means that he admitted guilt of his own free will (as part of a deal) and therefore no actual legal process was involved in the determination of said guilt. As he is not a legal expert, I would not be willing to assume that his self-incrimination was technically accurate.
Even his appeal was not to re-try the case to re-determine his guilt or innocence. His appeal was only to shorten the terms of his sentencing. Which, again, does not actually judge guilt or innocence in any way. So there really is no official answer from any lawyer or judge as to whether he was truly guilty or not, because Lundgren never forced the judicial system to decide this.
The trafficking of counterfeit goods is clear to everyone, I think.
The criminal copyright infringement, however, is much less clear. Yes, it involved software, which is copyrighted. But depending on the license (a copy of which I would love to see), what he did may have been allowed. And even if he violated the terms of the license, where does that fall, legally speaking? Normally such violations only result in a whopping bill to compensate the offended IP holder.
So the legal details of this case remain in question. Which is why review by a legal expert would be interesting.
to see a counter-punch to the ax using a low-g flipper on Robot Wars ON THE MOON!
Still, the idea of making scientific payloads to go onto a commercial rover that flies there on a commercial rocket is kind of interesting?
No, actually, it's pretty darn stupid. Even a commercial rocket to land on the moon is pretty nonsensical. Putting sats in orbit on commercial rockets? Sure! Makes perfect sense. Plenty of commercial sats are in orbit, so it makes sense to have commercial companies get them there. So why have a space program to duplicate that effort? But a moon lander and a moon rover? Why exactly would a commercial company offer those for rent to NASA? I feel like surely I must be missing something, because that part fails a sanity check. Is there a thriving moon-mining industry that I somehow missed? So in that respect, the article is quite interesting. (If not depressing.) I weep for NASA.