Re: The "Cloud" is great until it's not
"On the other hand (I'm running out of hands)"
The phrase for this would be "On the gripping hand"
:-)
"QA pick up each bolt, carefully measure it, verify it was within spec - then grab a large hammer and a stamp to mark the part as "inspected"..."
Sounds just like the place I worked, where the shipping department added corporate asset tags to precision weights... (before my time, though, so who knows how accurate the story is)
A lot of what is in orbit is junk from old launches. Only a fraction of it is currently working and controllable satellites. Think about explosive bolts used to separate stages, empty upper stages that blew up, or just wandered off, dead satellites whose orbits wander over time, and so on. Anything not active that isn't in a stable Lagrange point will drift from irregularities in the earth's mass distribution, effects of the moon and other planets, atmospheric drag, and even uneven solar heating/illumination.
>>One wonders why Linux was ever ported to them in the first place
>Porting Linux is something you can do to prove it actually works.
Exactly - same reason I once worked on a port of X-Windows to the write-only frame buffer of the F22 Cockpit display subsystem.
Although it really sucked if the login prompt showed up while flying the full-up simulator...
"Good luck doing that in London, even the pros at this make mistakes.
I remember them drilling to lay the new supports on Bridge in London (I am not allowed to name it), slight issue in that they drilled through a secret tunnel and flooded it! Not just any old tunnel either, special tunnel that links downing street to whitehall!"
Pretty close to what happened in the Chicago Flood - https://en.wikipedia.org/wiki/Chicago_flood. They were putting in new pilings for a bridge and hit a tunnel. From Wikipedia: "cost the city an estimated $1.95 billion"
"Does no-one else have a big box, lined with anti-static bags, full of every SIMM and DIMM module from every computer that you ever broke up and scrapped?"
Convince your SO that they are for a Christmas Wreath when you have enough of them. https://www.pinterest.com/pin/235383517998244985/
A written set of laws that covers as many bases as possible (as intended) - plus a section that gives an idea of what the law is, and is not, intended to cover in the general sense.
It would be nice to have a written letter of the law, with layman-friendly explanations of intent, examples, and all that. Not that I ever see it happening.
It could be interesting though; imagine the laws starting to include the usual: "The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in RFC 2119"
"No, that's not how it works. Developers with access to the systems use their own test data and are forbidden to view real data, while developers with access to user data are granted it on a strictly controlled test basis, just like any other telco engineer."
Depends on the specific company. That usually holds until there is some critical bug, a bunch of data gets replicated for debugging, or the developer does end up touching the production system.
Not all developers have access, and they might not have regular access to all the data, but plenty of data and access can leak out, with very little tracing.
"create the illusion that the software was able to access large databases"
"lying about the software's ability to transfer records between doctors and audit transfers"
It wasn't able to access databases and wasn't able to transfer records or audit said transfers. It might be possible to fake that in regulatory tests, but how did the people actually using the software not notice it couldn't do any of the things they actually needed it to do?
This was likely just enough to fool minimal security testing, like checking that the patient name or diagnosis doesn't show up in network traffic. Once the test is over, they go back to storing plain text in the DB, because that is easier. You can fake "transferring" records by simply giving the other user access to the same data: it appears to work for users, except the information was never taken away from the original person, and it wouldn't actually work across different installations or to anyone else's system. Or you do brain-dead serialization/deserialization and worry about moving records between different software versions later. And as for real users actually looking at audit reports - hardly...
There are just so many ways of doing a crappy job that just tick a checkbox, as shown by "agile" in all sorts of places...
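As a purely illustrative sketch (every name and field below is invented, nothing here is from the actual product), a checkbox-ticking "transfer" of the sort described above could be as little as this:

# Hypothetical example only: the record never leaves the original owner,
# nothing is sent to another installation, and the "audit" is a log line
# that no real user will ever read.
records = {
    "patient-123": {"owner": "dr_smith", "shared_with": set(), "notes": "stored as plain text"},
}

def audit_transfer(record_id, from_doctor, to_doctor):
    # The entire "audit trail".
    print(f"AUDIT: {record_id} transferred {from_doctor} -> {to_doctor}")

def transfer_record(record_id, from_doctor, to_doctor):
    # Looks like a transfer to the person clicking the button: the second
    # doctor is merely granted access to the same row in the same database.
    records[record_id]["shared_with"].add(to_doctor)
    audit_transfer(record_id, from_doctor, to_doctor)

transfer_record("patient-123", "dr_smith", "dr_jones")

It passes a demo, and even "works" for users on the same installation, while doing none of the things a real transfer with a real audit trail would have to do.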
It looks like they decided they needed to document the analytics.
This is probably region specific - in the US, with V1.0.7.10_1.2.3 listed at the top of the UI, the firmware update assistant was showing something about 1.0.7.12, but it looks like there has been another update since. The new features were shown as New Features and Enhancements:
Supports Dynamic QoS.
Supports Dynamic QoS database update
Bug Fixes:
Fixes for security issues.
Note: Firmware starting 1.0.7.12 will not include Arlo functionality
---
It now seems to be referencing 1.0.8.34 for whatever reason - specifically it shows:
Current GUI Language Version: 1.0.7.10_2.1.38.1
New GUI Language Version: 1.0.8.34_2.1.38.1
Current Firmware Version 1.0.7.10
New Firmware Version 1.0.8.34_1.2.15
Release Notes:
1. [New Feature] Supports collection of router analytics data.
2. NOTE: It is strongly recommended that after the firmware is updated to this version, log back in to the router's web GUI and configure the settings for this feature.
May as well also prohibit any internal software development, or any workers with even the slightest disability (or who are just not a generic cog), since you are going to end up prohibiting them from getting any work done anyway.
Rules like "20. Ban the use of USB devices" lead to policies requiring a doctor's note (and custom computer that doesn't have the USB ports epoxied) to use a trackball or vaguely ergonomic keyboard for carpal tunnel problems.
"The only way to protect yourself is avoid the security nightmare that is Windows. If you only use web, buy a Chromebook. No malware, no key loggers, no constant intrusive updates, no need for antivirus, 2 second boot, it just gets the job done."
While I mostly agree, Conexant (etc.) drivers with built-in "debug" keyloggers are equally possible in a Chromebook. TLAs have disk drive firmware and lots of other vectors available, and more will come out. Avoiding Windows is a good step, but should be considered just one layer of many.
It didn't decide to do anything. The input photo is all the data collected about you. The output photo might be a single pixel describing your credit rating. And the filter is the entirety of the program.
"One of us isn't getting this, and I don't think it's me!"
Sorry, but I'm afraid not. You are indeed missing it. There are no decision trees or state machines in machine learning / neural nets (in the way you appear to be thinking).
Your retina and brain are made up of neurons. How do we ask them how you decided you just saw a cat? Big hint: it's not the way you might think. We can only partially describe how it actually happens, in terms of layers that look for vertical edges, horizontal edges, motion, image convolutions, and so on. That's about all you can get out of machine learning.
An even better comparison: how do you recognize someone's voice? Can you describe an average friend's voice well enough that someone who has never met them would uniquely identify them the first time they hear it? If you magically tracked all the neural activity, you would have near-worthless information about relative weights of harmonics, frequencies, and time delays, yet it still produces recognition, a vague familiarity, or nothing. Even with all the details, it doesn't tell us which voices might be easily misidentified, or who could do a good impression of that person.
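To make the "layers that look for vertical edges" remark a little more concrete, here is a minimal sketch (assuming only numpy; the kernel is the standard Sobel vertical-edge operator, not anything pulled from a real trained network) of the kind of low-level feature an early layer responds to:

import numpy as np

def convolve2d(image, kernel):
    # Naive 2D convolution (really cross-correlation), enough for the demo.
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# Toy image: dark on the left, bright on the right, i.e. one vertical edge.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

sobel_vertical = np.array([[-1, 0, 1],
                           [-2, 0, 2],
                           [-1, 0, 1]])

print(convolve2d(image, sobel_vertical))  # large values along the edge, ~0 elsewhere

The individual weights are perfectly inspectable, just like the neural activity above - they still don't add up to an explanation of why the whole network called something a cat.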
"Why would you class this as important as the Pentium bug?
It's a bug that prints numbers.... Shouldn't you check numbers when printed?"
So someone should individually check every digit of every number in that spreadsheet / online bank statement / online receipt every time they save to a file?
I'd say this is a lot worse.
Just last month I saved my annual earnings tax statement (W2 in the US), which is only available online, and what I saw online was completely different from what I saved for backup and printed (a different issue - a stupid special font problem, with lots of "?"s instead of the real numbers when printing from any other machine).
I couldn't file online without making up a number for an empty field, and if I had tried to file that paper copy it would have been rejected. If it hadn't been such a large screw-up, but just a couple of wrong digits, it could have been a huge hassle. So many things wrong in that sequence, but anyway.
Looks like I might have misremembered where I did this, or at least I can't find the reference for VirtualBox at the moment, but https://communities.vmware.com/thread/394665 shows CPUID masking in VMware...
I think I ran into this trying to run an older Mac image on a new (non-Mac) machine, required some hacking, but nothing major.
"Why did they make no attempt to fix the situation the first, second or third times?"
Your imagination just isn't up to the level of incompetence out in the field. They probably have something hard-coded into the bears or apps that are already deployed... We just haven't heard about it because it doesn't happen with every access, just with something like initial setup or a reset (and seriously, why spend more time investigating their level of security).
Current ML seems to be at roughly the biological equivalent of a retina, perhaps with a couple of layers of neurons above it.
Now think about all the optical illusions that we can be tricked with, and how hard it is to discover some of them - every new instance of ML is going to have its own set of illusions/false outputs. They may get better and better, but every one of them is going to have its blind spots and ways to be fooled.
And that isn't counting any of the basic coding, memory, etc. bugs that can crash things, rather than "just" provide the wrong output. They may be incredibly useful, or even better than a human, but they will never be perfect.
"Corporations STILL allow numpties who have zero clue about Internet[0] security un-fettered access to the corporate email system? The mind absolutely boggles ..."
Yes, some corporations are still primarily composed of people. Some are bright at certain things and not so bright at other things... OK, and some are really, really not so bright...
I got flown from the States to Germany to check the version number of video card firmware, and install the update. Back in the day when 1280x1024 was high end for CAD workstations...
That was after a couple of days of back and forth making "sure" they already had it.
Of course, I had another trip where my largest suitcase contained a server, padded with some clothes, so I could run tests.
I fully agree with "The cynical among us suggest foreigners are more than willing to work in the US on lower wages than citizens will accept, cutting Americans out of jobs and saving bosses a pretty penny."
They could also stop getting rid of anyone who starts to get older, more expensive, or unwilling to work insane hours. The cult of the young brogrammer à la Facebook is so prevalent that people publish ACM articles directly stating of developers: "Just like competitive athletes, they simply burn out by the time they reach their mid-30s." http://cacm.acm.org/magazines/2014/12/180776-the-responsive-enterprise (presumably requires membership; also in the December 2014 issue, Vol. 57, No. 12, "The Responsive Enterprise: Embracing the Hacker Way").
Instead of "Give me your tired, your poor, Your huddled masses yearning to breathe free...", it is "Let me trap your STEM graduates for a decade, and then go back for someone fresh."
Move heat around in a reasonable way, and generate electricity from it in the end. I can't stand paying for air conditioning to dump heat to the hot outside air while at the same time paying to heat the cold water coming in on the water line. Or, if I have to pay to heat the home, at least generate electricity first. Nothing at the consumer level is compatible with anything else, and efficiency standards are always written for standalone operation.
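As a rough back-of-envelope illustration of the point (every number below is an assumption, not a measurement): heating the incoming cold water with a resistance element versus moving heat into it with a heat pump at a modest coefficient of performance:

# Illustrative assumptions only: daily hot water use, temperature rise,
# electricity price, and the heat pump COP are all made-up round numbers.
litres_per_day = 150          # hot water drawn per day (1 litre of water ~ 1 kg)
delta_t = 40.0                # temperature rise in degrees C
specific_heat = 4186.0        # J/(kg*K) for water
joules_per_kwh = 3.6e6
price_per_kwh = 0.25          # assumed electricity price

heat_needed_kwh = litres_per_day * specific_heat * delta_t / joules_per_kwh

resistive_kwh = heat_needed_kwh / 1.0    # resistance heating: COP of 1
heat_pump_kwh = heat_needed_kwh / 3.0    # heat pump moving heat: COP of ~3

print(f"Heat needed:        {heat_needed_kwh:.1f} kWh/day")
print(f"Resistance heating: {resistive_kwh:.1f} kWh/day (~${resistive_kwh * price_per_kwh:.2f})")
print(f"Heat pump, COP 3:   {heat_pump_kwh:.1f} kWh/day (~${heat_pump_kwh * price_per_kwh:.2f})")

Under those assumptions, roughly two thirds of the heat going into the water would come from the surrounding air - i.e. heat the air conditioner would otherwise be paying to pump outside.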