Decision process FUD as always
This iteration: "Nobody ever got fired for choosing Exchange"
31 publicly visible posts • joined 30 Jan 2021
>I can easily imagine instructing such a system to render a group of Roman legionaries making testudo formation being eminently feasible. The interface might request a specific period and location by way of clarification.
Definitely. And with AI we are already at a point where such a scene might be automatically generated and even look pretty decent.
But as you mention further on you need architecture for a complete product (or in this case a 'work of art'):
A game (or a film for that matter) is not a single scene. It's the whole context which someone invented with good and bad characters, events that advance the plot of the story etc. And that's basically storytelling architecture. To really make a 'work' stand out we also want some novel elements in there that makes it noteworthy and different from other 'works'. And current AI is terrible at being novel.
I.e. a beautiful Roman formation is one thing, but the scenes where the hero inspires his men, the bad guy taunts the opposing army, and people who share a past exchange glances: that's the storytelling, not just beautiful and realistic fighting scenes.
Even the Commodore 64 had S.E.U.C.K (Shoot Em Up Construction Kit) and other programs aimed at constructing working games without extensive programming knowledge.
While there are several interesting games that came out of it (https://www.c64-wiki.com/wiki/S.E.U.C.K.#Overview_of_good_SEUCK_games), the market for shoot-em-up games didn't become flooded and it didn't remove the possibility of revenue for software houses to continue to create quality games in this genre.
Turns out that to complete a valuable product one still needs to put in a whole lot of learning, trying and failing, and do a lot of drawing and planning to create these games, even if such a tool exists.
AI (as we have today) simply upped the game from "Hello, World" to "Nice app/site prototype with a few working buttons".
Low/no code has been around forever. Expect similar results - even if one supposedly does not need to know how to code, the amount one will have to learn to create something useful and valuable is still immense.
We are just going through increasing levels of devaluation through automation here (not entirely negative btw) - everything that can be fully automated (or generated) becomes a very cheap commodity and will be valued as such.
I've built my own SmartHome devices to control LEDs (I mean LED lighting for whole rooms) with custom animations, automatic curtains, the garage gate, garden lights, a UPS for my router rack, etc.
I have everything integrated via my own hub that talks MQTT with Home Assistant (OSS Smart Home hub software), using long-range radio modules (such as LoRa) to reach much further than normal ZigBee (a commercial radio standard for SmartHomes) can.
Seriously, the amount of devices, robots and automations you can build yourself today with microcontrollers and modules in your own home lab is practically unlimited. Add a 3D printer and you can make even more.
This isn't rocket science - it's just about having the imagination to see what's possible.
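To make the hub idea concrete, here is a minimal sketch of how a reading from a LoRa-connected sensor might be mapped to an MQTT topic and JSON payload for Home Assistant to consume. The topic layout, device names, and field names are my own invention for illustration, not anything prescribed by Home Assistant or MQTT.

```python
import json

# Hypothetical mapping: one state topic per device/sensor pair, with a
# small JSON payload. Adjust the topic scheme to taste; MQTT itself does
# not prescribe one.
def to_mqtt_message(device_id: str, sensor: str, value: float, unit: str):
    topic = f"home/{device_id}/{sensor}/state"
    payload = json.dumps({"value": value, "unit": unit})
    return topic, payload

# Example: a temperature reading relayed from a garden node over LoRa
topic, payload = to_mqtt_message("garden-node-1", "temperature", 21.5, "C")
# topic -> "home/garden-node-1/temperature/state"
```

In a real hub, the returned topic and payload would be handed to an MQTT client library for publishing; this sketch only shows the translation step.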
This yet again underlines why servers should NOT have direct internet access.
They should only have whitelisted connections to distribution repositories and update servers to stay current with security patches. Letting servers freely chat with the internet is asking for trouble.
In this case, proper egress filtering would have prevented both the C2 connections and the Dropbox exfiltration.
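As a sketch of what such egress filtering could look like on an OpenBSD packet filter, assuming placeholder addresses for the whitelisted mirrors and DNS servers (the real addresses would come from your distro's update infrastructure):

```
# /etc/pf.conf sketch: default-deny egress, allow only update mirrors
# and DNS. Addresses below are documentation placeholders (RFC 5737),
# not real mirror IPs. In pf, the last matching rule wins.
table <mirrors>     { 192.0.2.10, 192.0.2.11 }
table <dns_servers> { 192.0.2.53 }

block out on egress all
pass  out on egress proto tcp to <mirrors> port { 80, 443 }
pass  out on egress proto { tcp, udp } to <dns_servers> port 53
```

The same policy can of course be expressed with iptables/nftables on Linux or on a perimeter firewall; the point is that outbound C2 and Dropbox traffic would simply have nowhere to go.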
>However, I do think Apple works on privacy, for a very simple and very American reason: it makes them money.
That's exactly what I have argued before.
Now I think "Customer Privacy" just goes on the same spreadsheet as "Selling customers out to Ad Networks", and if the latter is larger than the former, the latter wins.
Sadly, I have argued this myself many times.
That's why I naively chose iOS over Android.
I believed that if I paid this much for the device, surely Apple wouldn't need to sell me out.
*Man who had already lost all faith in corporate morals somehow discovers he could still lose a little more.*
While global warming is real and can influence hurricane patterns, attributing any specific hurricane directly to climate change is complex.
People often mistakenly link individual weather events to climate change, like claiming a hot summer or cold winter proves or disproves global warming. Similarly, pointing to a specific hurricane as having been "contributed to" by climate change oversimplifies a complex issue.
We're watching a pretty wild economic show play out on the world stage. Big companies are still chasing those cheap labor hotspots, but now they're also going all-in on AI. It's like they're playing with fire on both ends. You've got workers in places like India saying "enough is enough."
But here's the thing that's got me scratching my head: these corporate giants might be outsmarting themselves. Sure, they're all about cutting costs now, but what happens when they've squeezed out so many jobs that few can afford their fancy gadgets anymore? (yes, very relevant for Samsung, which sells mostly "want to have" rather than "must have" products) In the long run, it's like they're sawing off the branch they're sitting on.
From a management perspective the promise of the cloud was that they could eventually get rid of those pesky, expensive, admin types and save a pretty penny, so why on earth would they listen to those guys in the first place? "They would say anything to save their job"
We all know how that ended up, but history will repeat itself, as management is almost never held responsible for anything further out than a quarter or a year-end.
If someone tries to strongly convince me to choose an option, I usually react so negatively that I end up choosing anything else except the forced alternative.
JetBrains pushed their stupid AI extension so hard in their IDEs (with a subscription naturally) that it made me completely ignore it. It might be the best thing since sliced bread, but I wouldn't know because after they strong-armed me, I wouldn’t touch it with a 10-foot pole.
I can highly recommend Nextcloud. I've been using it for 5 years, both in the company and privately, and it works really well.
The fine-grained access control, the ability to add accounts each with their own or shared spaces, and even creating a temporary download (or upload!) link that you can text to a friend or paste into an email are all great.
You can open a markdown document in two browsers at the same time to share text between devices, with live updates on both.
I use it on Linux, iOS (iPhone) and Windows. Naturally you'll have to verify if it works on the devices you use.
Microsoft's practice of leveraging their extensive platform to aggressively promote their products is not an exception but rather their modus operandi.
They have, at times, forcefully installed Windows 10 on Windows 7 computers. "Oh, look how quickly Windows 10 usage has increased this last quarter!"
They bundled Teams with Office, leading to claims like, "Oh, look how much better Teams must be than the competition! Customers are fleeing from the competitors!"
They are also promoting Edge in ways that could be seen as anti-competitive.
For example, when attempting to download Firefox via Edge on a newly installed Windows 10 computer, users are presented with a full page of information claiming that Edge is superior to Firefox.
Monopolists will monopolize.
Even the services that do offer online unsubscription often hide the unsubscribe button in a color that closely matches the background, while the "Wait, I actually want to continue my subscription" button is in a super visible green.
Not as bad as being unable to unsubscribe at all, but super-unfriendly to users with vision disabilities.
The expected effort for a prototype will stay the same. What will change is the expected quality and functionality of the prototype.
Back in the 90s and early 2000s we prototyped using Delphi or VB, just "drawing" a few GUIs and then if investors bit, we started implementation. That level obviously doesn't cut it today.
I think you are mistaken if you think you will stay competitive with prototypes that took a day or less to create. Always remember that if you can do it today, hundreds of others will know how to do it tomorrow.
A prototype usually takes about 1-2 weeks, because 80% of that time goes into polishing things until they feel smooth. Polishing always takes the most time.
It should be mentioned that installing OpenBSD on modern hardware with a fast internet connection takes like <10 minutes. Assuming some script to orchestrate the servers after install (which is a good practice anyway) it should be possible to have a server running in like 30 mins.
Of course I agree with you that there are tradeoffs to choose OpenBSD, but I'd rather take the inconvenience of a little extra work during normal work hours than waking up to YET ANOTHER El Reg story and having to go through all systems to make sure they are patched AGAIN ... or worse, being called in in the night because someone broke in and now all the filesystems are encrypted.
In my company I have initiated a project to route all incoming SSH connections to our clusters through OpenBSD servers. We are not looking to replace our Linux infrastructure but to secure the entrance and then "jump" from there.
OpenBSD has a fantastic track record for being unaffected by remote exploits. Even when exploits are discovered, the rigid security consciousness in OpenBSD very often makes them impractical to abuse on an OpenBSD system.
Example: OpenBSD re-links your kernel with its components in randomized order (at install time and again on every reboot), so each machine ends up with a unique kernel layout. Trying to exploit buffer overflows by "guessing" jump addresses thus becomes almost impossible.
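The jump-host setup itself is a few lines of client configuration. This is a sketch with placeholder hostnames and usernames, assuming the hardened OpenBSD box sits in front of the cluster:

```
# ~/.ssh/config sketch -- hostnames and users are placeholders
Host jump
    HostName jump.example.com      # the hardened OpenBSD gateway
    User ops

Host cluster-*
    ProxyJump jump                 # all cluster SSH traffic hops via the gateway
    User ops
```

With this in place, `ssh cluster-01` transparently tunnels through the OpenBSD box, and the Linux machines behind it never need to accept SSH from the internet directly.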
"The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at and repair."
- Douglas Adams
How true this is for the Cloud. The man was totally ahead of his time.
Today, there are almost no consequences for having your customer database stolen. While bad press exists, with the frequency of new breaches, it is becoming less impactful, and an organization is unlikely to lose much goodwill. The cost of experiencing a breach is, therefore, severely diminished.
It's time for big governments to enact laws that hold the entity collecting personal information responsible for its theft. This could be as simple as a flat damage fee per affected person, per breach. The fees don't need to be exorbitant, just significant enough to place them on the risk and budget spreadsheets of companies. Imagine a damage fee of $5 per person affected for personal information and $10 for sensitive information (credit card details, passwords, correspondence, health information).
This financial risk is all that is needed for companies to start prioritizing this issue. They need to weigh this risk against possible investments in security for their data systems. Consequently, insurance companies would begin offering packages to cover firms against personal information theft damages. These insurance companies would then create a list of security requirements their customers need to meet to be eligible for insurance payouts.
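A toy calculation shows how quickly the hypothetical fee schedule above ($5 per person for personal information, $10 for sensitive information) turns into a number a CFO has to take seriously. The breach sizes are made up for illustration:

```python
# Hypothetical fee schedule from the proposal above
FEE_PERSONAL = 5    # USD per person, personal information
FEE_SENSITIVE = 10  # USD per person, sensitive information

def breach_liability(n_personal: int, n_sensitive: int) -> int:
    """Flat damage fee owed for a single breach."""
    return n_personal * FEE_PERSONAL + n_sensitive * FEE_SENSITIVE

# Example: a breach exposing 1,000,000 personal records, of which
# 200,000 also contained sensitive data
liability = breach_liability(1_000_000, 200_000)
# -> 7,000,000 USD
```

Even at these modest per-person rates, a mid-sized breach lands in the millions, which is exactly the kind of figure that makes security investment and breach insurance show up on the budget.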
This is not in any way targeted specifically at Rite Aid, but rather a commentary on the PI-theft situation as a whole.
TL;DR: Putting a price tag on losing personal information would almost instantly create a new market for the standardization of best practices and auditing for systems and infrastructure handling Personal Information, reducing the risk of fines.
Came to support this. Windows 7 was the last Windows that was a real operating system, with the OS User in mind as the main consumer of the product.
Windows 8 was ... well, it was Windows 8 (Windows ME's bigger brother).
And by Windows 10 they publicly declared that the consumer was no longer the OS User but rather the ones they sold the OS User's data to, by GIVING the OS away and even, in some cases, forcibly upgrading Windows 7 / 8 to Windows 10 without the OS User's consent. I.e. digitally predatory behavior.
Additionally, Windows 10 was geared towards the "Cloud" and all that OneDrive crap, and they made it clearer than ever that their intention was to own the OS User's experience.
Surface laptops are in a tough spot because they are positioned between MacBooks and other professional laptop series, and their reputation for Linux compatibility is not good.
In contrast, other PC laptop brands like ThinkPads, HP EliteBooks, ASUS, and Acer all seem to work quite well with Linux, which is a significant point for tech professionals and enthusiasts.
Meanwhile, the average user often opts for a MacBook due to its elegance and ease of use.
Lots of comments here discuss whether or not the government should own data centers.
This is completely irrelevant!
What matters is to build and procure solutions that can run on any cloud or data center (And I've been preaching this for years as an Enterprise Architect):
1) Make products built on Kubernetes/KVM or other open virtualization technology.
2) Use OpenTofu (or other Terraform successors) to orchestrate infrastructure.
3) Build products that use open-source storage/queues.
When you do this and only rent lots of generic CPU/IOPS/Storage/Network, THEN you are, as an enormous client like the government, able to really play all the vendors against each other and force them to lower their prices till it hurts.
People are so incredibly naive - "I wrote these massive services based on proprietary tech like AWS DynamoDB, or Azure Service Bus, and now they massively inflated the price? HOW COULD I EVER IMAGINE THAT WOULD HAPPEN." </facepalms all around>
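The vendor-neutral approach above can be sketched in OpenTofu/Terraform HCL. Everything here, the module paths, variables, and outputs, is illustrative and made up; the point is the shape, not the specifics:

```hcl
# Sketch: the workload depends only on a generic "cluster" module
# interface. Switching vendors means swapping the module source
# directory, not rewriting the services that run on top.
module "cluster" {
  source     = "./modules/aws"   # or ./modules/azure, ./modules/hetzner, ...
  node_count = 6
  node_size  = "4cpu-16gb"       # generic CPU/RAM shape, mapped per vendor
}

output "kubeconfig" {
  value     = module.cluster.kubeconfig
  sensitive = true
}
```

Each vendor directory contains a thin module that translates the generic interface into that provider's resources; the rest of the stack (Kubernetes manifests, open-source storage and queues) never sees which cloud it is on.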
In the generative society the "generated" news and media has already started to eradicate all trust we can have about anything.
The previous internet cycle had "fake news", now we have 99% pure bulls*it.
It is a huge democracy threat that the generative technologies + advertisers + special interests just spew so much info that no one can be sure of anything.
As an old timer it is so sad to see what the internet has become: the internet that was supposed to democratize access to knowledge for all is just becoming a huge stinking pile of trash!
I registered to support this motion: my company only paid for GitLab Bronze as a kind of 'thank you for providing this service', not using any features above the free tier.
We hadn't even registered at their support site until yesterday, so we were basically giving them free money.
Anyway, we have already migrated (to a self-hosted Gitea repo) this Friday.