Oh, no. Just think about all those poor F-16 pilots that will need to find a new job!
Not to mention all the other pilots we won't need once drone warfare takes over.
Life (and death) just aren't what they used to be...
There may be occasional lapses in IP mappings, etc. We call that "friendly fire."
The biggest problem in the US, at least, is a totally wild-west capitalist system where the biggest Vulture Capitalist can buy multiple health-care facilities and strip them of any costly IT departments that may try to deter these attacks.
I've witnessed two of my health-care providers in the little state of Vermont be rendered helpless by these attacks - in one case causing multi-week denial of services. These small (by international norms) organizations just don't have, and don't want to pay for, the resources to protect themselves.
Pity the small practices/practitioners across this poor country (the US).
then the EPA could never take action against that company?
If the head of the FBI once flew on a United airliner manufactured by Boeing, then the FBI could never investigate those two companies?
Lewis Carroll - more material for you!
and trying to figure out if a single change will impact thousands of other lines of code - and, given this is usually DoD, how it would impact our national defense.
I've spent too many years looking at huge projects with so many dependencies, and hundreds of devs who come and go and don't understand the entire code-base - they can't!
I understand that fears of AI failures are real and have been seen. But the same or worse happens with a developer who doesn't have the mental bandwidth to be perfect (that's me).
Hard to get some new whiz-kid to look at a COBOL-68 program, though.
I'm guessing you could ask some AI engine to examine the whole corpus of a mainframe inventory/billing/HR system and have it spit out at least the basic functions the system was supposed to provide, written in some high-level pseudo-code to be turned into Rust, Go, whatever.
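Something like this quick Python sketch is what I have in mind - the model name, prompt, and source tree are all placeholder assumptions on my part, and any capable LLM endpoint would do:

```python
from pathlib import Path
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = ("Summarize the business functions this COBOL program implements "
          "as high-level pseudo-code, suitable for re-implementation in "
          "Rust, Go, whatever.")

for src in Path("mainframe_corpus").glob("*.cbl"):  # hypothetical source tree
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; use whichever model/endpoint you trust
        messages=[{"role": "user",
                   "content": f"{PROMPT}\n\n{src.read_text()}"}],
    )
    # One pseudo-code summary per program, written next to the source
    src.with_suffix(".pseudo.txt").write_text(response.choices[0].message.content)
```

Whether the summaries would be trustworthy enough to rebuild a billing system from is, of course, the whole argument.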
Totally agree with your points. I have written way too much code in B (predecessor to C), C/C++, and Perl (along with many years of various assemblers).
All of the languages that require a separate set of definitions, independent of the implementations, are a real PITA.
I'm writing mainly Python now because of the very rich ecosystem. When I see most of the libraries/frameworks available in Rust, I'll start my move.
That's possible too, but I think and hope that much of the original code has been refactored many times over the past years.
I know one of my more pleasant tasks (really!) is to go back and look at my older code base that is still in use and see if I can make it cleaner, more effective, more error-proof.
of an existing package?
Assuming that much of the conversion uses translations of existing functions, algorithms, and methodologies, which would have been well debugged in the earlier implementation.
I'll still guess that Rust may be slightly easier to develop in for new projects, given its very good compile-time error checking.
""With respect to documentation, if the door plug removal was undocumented there would be no documentation to share. We will continue to cooperate fully and transparently with the NTSB's investigation.""
I'd forbid Boeing to sell any more planes into the commercial market until they come up with the documentation.
Military/black programs - we'll never know.
Just like trying to rein in "AI", it's impossible to stop the flow of data/information once it is in the wild.
Huge penalties might work if you can catch the perps. But many of these types live in countries that don't exactly care what other nations think.
We need to look towards a future where all known data points are public. Every person, every location, every bank account, every personal interaction, etc. If it isn't this, then it'll be massive state firewalls blocking data transfers.
What will our world look like then?
Wouldn't crims be able to see a lot of client interactions?
This seems like one of those "weakest link in the chain" attacks. Once a crim can install malware on the remote desktop, they essentially have access to all of the customer's information.
I imagine there are a lot more of these kinds of in-the-middle applications with too much access to too many credentials. When I've looked at integrating various cloud-based applications, I'm encouraged to use things like Zapier - but I have to hand that software my credentials on both ends for it to function.
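A toy sketch of the pattern that bothers me (both service endpoints here are entirely made up) - the relay necessarily holds live credentials for both ends, so anyone who owns the relay owns both accounts:

```python
import os
import requests  # pip install requests

# The relay holds *both* sets of credentials. (The endpoints below are
# hypothetical stand-ins for the real SaaS APIs.)
CRM_KEY = os.environ["CRM_API_KEY"]
MAILER_KEY = os.environ["MAILER_API_KEY"]

def relay_new_contacts():
    contacts = requests.get(
        "https://api.example-crm.com/v1/contacts",
        headers={"Authorization": f"Bearer {CRM_KEY}"},
        timeout=10,
    ).json()
    for contact in contacts:
        # Every record passes through the middleman in the clear
        requests.post(
            "https://api.example-mailer.com/v1/subscribers",
            headers={"Authorization": f"Bearer {MAILER_KEY}"},
            json={"email": contact["email"]},
            timeout=10,
        )

if __name__ == "__main__":
    relay_new_contacts()
```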
Of course the leakage of very critical development data into the www is horrible. But those fine MBAs who have taken a CompSci course and think they understand enough about software and security and client relations - they get to tell the C-suite to jump onto the AI bandwagon.
I'm a frequent user (past 8 years) of several of JetBrains' products. I don't really want any more Clippy-style prompts jumping onto what I'm typing. I know how to make my own mistakes, thank you - and learn by correcting them.
I was planning to get them fed into a card reader, then transferred to 800 BPI tape for subsequent upload to a rotating IBM 360 drum, and finally downloaded to a PC-XT that accepted my 5 1/4" disks.
Guess those tax authorities won't get to see my 1970s returns after all.
(I've had to use all of those technologies and many more ancient and novel. Still the same game: ETL.)
It seems the US sentencing guidelines have become rather relaxed in recent times.
200,000 federal employees whose data was sent (probably for a price) to India. That's like 3 minutes behind bars per stolen ID. Each affected person will be spending months/years trying to clean up their lives.
"Indeed Redmond itself admitted: "If the same team were to deploy the legacy tenant today, mandatory Microsoft policy and workflows would ensure MFA and our active protections are enabled to comply with current policies and guidance, resulting in better protection against these sorts of attacks.""
I guess I don't understand how an MFA challenge/response test framework works that doesn't allow some automation of the MFA. I mean, if the bot is performing the test, the bot can answer the second/third/etc. request. CAPTCHA?
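For what it's worth, if the test bot holds the TOTP seed, answering the challenge is trivial - a minimal sketch using the pyotp library, with a made-up seed:

```python
import pyotp  # pip install pyotp

# Hypothetical test-account seed; a real harness would provision this
# alongside the test credentials.
TOTP_SEED = "JBSWY3DPEHPK3PXP"

totp = pyotp.TOTP(TOTP_SEED)
# Same 6-digit code an authenticator app would show for this seed
print(f"Bot answers the MFA challenge with: {totp.now()}")
```

Which is why only something the bot can't hold - a human, a hardware token, a CAPTCHA - actually proves anything.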
The ivory-tower assumption that "Computer Science" is not really part of IT is the basis for the whole problem.
I know Knuth's brilliant "Art of Computer Programming" series didn't worry about buffer overflows or re-using unreferenced memory or being able to read data outside of your assigned space. But because our current implementations use languages and models that don't worry about these things, we are spending a huge amount of time and effort to clean up sloppy-but-effective code.
The same arguments will be made against automated testing environments, rigorous documentation, pen testing, etc. It's sort of like asking an economics major to take charge of a real banking system. Real life is hard.
Not sure why you got 3 negative votes before my positive one.
Google services are different and in some ways better, some ways not as good.
For my current needs, if I need to drop into a Word/Excel world, LibreOffice works splendidly. No license, no clouds, and rarely (if ever) any targeted malware.
"Nuance matters: Much can turn on a shaking hand, a quivering voice, a change of inflection, a bead of sweat, a moment’s hesitation, a fleeting break in eye contact,"
This is from the chief honcho of the US Supreme Court? No wonder the SCOTUS is now a joke. (Pardons to the few remaining intelligent members!)
No court should be looking at body language and other physical presentation in deciding the cases before it. Remember failed US president #45, who mocked individuals with physical disabilities? Would those disabilities lose you a case in front of the Supreme Court?
Roberts (and, I'm sure, others) doesn't understand that many new cases will be presented through virtual media. Unless great(?) progress is made, beads of sweat and eye-contact length will not be obvious.
I can't remember the codename for the MS effort, but I thought it foundered because of, again, performance issues.
MUMPS is a forgotten multi-user language/platform that saw a lot of usage in the 70s-90s and is reported to still be used in a large percentage of US hospital systems.
While most groups involved with anything to do with "responsibility" at the x-shitter organization have been ground into soylent meal, perhaps some still-whole humanoids are roaming the lifeless corridors. I'll guess that the trons are out searching for them for further distribution.
Still waiting for that musk-zuck no-holds-barred spitting fest.
You'll get back the standard boilerplate, sometimes with the actual agency/corporate names swapped into the template.
"We take your security concerns very seriously. Our customers are our most important product and we will protect their secret information as if it is our own."
Foxes guarding hen houses.
Not trying to be too flippant, but nobody really wants to be in charge of these "hot potatoes".
There's no sure-fire winning strategy that shows up on a staff member's resume - actually, this is all hush-hush.
Worked on a few DoD reports that were also deep-sixed after presentation. I think this is probably par for the course around the world.
This seems to be the real limiting factor.
I can see the pulses going along quite well as each bronto (using the familiar) bites the next one's tail. But do they need to reverse direction to make it full duplex? Can the force of the bite convey additional non-binary data? Inquisitive minds, and all that rot....
https://en.wikipedia.org/wiki/Total_Information_Awareness
It's still kicking and probably has many of the same corporations and players supporting it - and more importantly being supported by it.
"DHS envisions the system, initially projected to cost about $4.2 billion and to be completed by 2021." (Guess that has slipped by already.)
"Last year, a subsequent schedule adjustment and budget increase of $354 million left the project without a planned date of completion." (Nice work if you can get it!)
This seems to be the common refrain in the Ballad of the Microsoft Excuses.
And, of course, we'll never know that it has been corrected. Until the next refrain is sung on the same problem.
Perhaps their culture could embrace a better response than saying "sorry". Suggesting https://en.wikipedia.org/wiki/Seppuku
Java must be at least 30 years old already. The J2EE libraries that I worked with 15 years ago were creaking with their baggage.
Not sure I'd really like to see the output product from COBOL-68 (last version I knew) in a modern language. DATA DIVISION's OBVIOUS-123, meet obfuscation_xyz.
It made a bit of sense to me - a bunch of analog devices to control flow and digital to supply the inputs and interpret the outputs. I think one of the prime examples was solving the traveling salesman problem almost instantaneously while the digital beast (IBM 7094) was still slogging along.
But the allure of working with just clean bits (on/off) was too much and the Pacer went out the door.
https://www.analogmuseum.org/english/collection/eai/pacer_700/
would guarantee that you'll never need to worry about heat or 'leccy again! Under several thousand tons/tonnes of equipment you'll be insulated from any temperature fluctuations (well, except the upcoming solar supernova).
Not to sound apocalyptic about this stuff or anything.
I know you brits had your fling with BoJo but I think he's winged his way into oblivion (other than various lordships, etc.)
The US colonies still have a solid remnant suffering from partial genetic brain damage who will vote for a trump or bojo or their dead heroes like Strom Thurmond or Mitch McConnell (is he dead?)
The pustule-faced trump seems to be able to milk his low-lying partisans for every $5 they have. One would hope that they couldn't pay for necessities like toilet paper if they didn't have those $5. No worries - there's shrubberies outside.
Content is generated by a whole host of methods. Very little (if any) is original. Using an AI to craft a segment of a melody or even the vocals won't really be detectable, especially once it's gone through the normal blending and transformations. A fool's errand, just to make the "professional" societies look like they have an important part in the process.
Of course this applies to all forms of art - text, visual, music. And it's just going to get more intense.
(Speaking solely as a programmer whose artistic output is wonderfully exceeded now by the AI "helpers".)
Back in the old days, the memory bits were represented by single magnetic donuts - my recollection is about 3mm across. These lil bits were woven into a large plane with cross-wires. Apparently, flipping a single bit on and off fast enough (milliseconds?) would heat the poor thing up until it expired.
In the only slightly less old days, we were able to achieve the same thing (destruction of memory) by flipping a unit of memory (then called a "word") - probably using the wonderful XOR operation.
I guess there wasn't much use for something like Rowhammer back then.
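For the youngsters, a toy Python illustration of that word-toggle (the word size is my assumption; adjust to taste):

```python
# XORing with an all-ones mask flips every bit; an even number of
# passes restores the original word.
WORD_BITS = 36                 # assuming a 36-bit word, common on old iron
MASK = (1 << WORD_BITS) - 1

word = 0o123456701234          # arbitrary 36-bit pattern (octal, naturally)
for _ in range(1_000_000):     # hammer the same word over and over
    word ^= MASK               # every bit flips on each pass
assert word == 0o123456701234  # even number of flips == no net change
```

On the old iron, of course, the interesting part wasn't the arithmetic - it was the smoke.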