Re: Stop writing software for Microsoft for free
Stop testing software (Windows 10, 11) for Microsoft for free
179 posts • joined 28 Dec 2012
We've been running ESXi 5.x, 6.x for close to 10 years w/ the internal SD card booting. NO issues.
But, on every machine, any tmp/temp files, all log files get redirected to external (Enterprise) storage where possible. This is done to reduce the exact problem they have (finally?) started thinking about: SD card writes.
Rather than making such an idiotic move (eliminating/not supporting SD boot), they should provide some guidance on how to reduce writes to the SD card.
Really VMWARE? Stop being lazy.
An alternative is to copy/replace the SD card every once-in-a-while (every year or two?) during a maintenance cycle.
Although, I recently moved to SSD boot for my RPi4s for the exact same reason, now that this capability is easier to set up and reversible; there's also no convenience penalty, like there is on an enterprise system.
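For reference, the log/scratch redirection described above can be done from the ESXi shell. A minimal sketch (the datastore paths are examples only; adjust for your environment):

```shell
# Point syslog output at persistent (datastore) storage instead of the
# SD card, then reload the syslog service:
esxcli system syslog config set --logdir=/vmfs/volumes/datastore1/logs/esx01
esxcli system syslog reload

# Relocate the scratch/temp area as well (takes effect on next reboot):
vim-cmd hostsvc/advopt/update ScratchConfig.ConfiguredScratchLocation \
    string /vmfs/volumes/datastore1/.locker-esx01
```

The same settings are also exposed in the vSphere client as the advanced options Syslog.global.logDir and ScratchConfig.ConfiguredScratchLocation.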
"right now it's time to... kick out the Chome"
Chrome on Pi is the new IE on Pi.. Except for using Chrome to download Firefox ESR, I have been (happily) using Firefox ESR on Pi for quite a few years now.
This includes installing/running my favorite plugins: Ghostery, uBlock Origin, AdBlock, NoScript.
Yes, yes I do change the user-agent string to x86, otherwise there are just too many websites (i.e. all of them) that, when they detect ARM, almost universally serve up mobile sites - web designers are friggin' idiots... (I'm lookin' at you, Amazon).
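For anyone wanting to do the same, the override lives in Firefox's general.useragent.override preference; a user.js sketch (the exact UA string below is just an example of a desktop x86_64 string, match it to your installed ESR version):

```js
// user.js - spoof a desktop x86_64 UA so sites stop serving mobile pages.
// The string is an example; keep it in sync with your Firefox ESR version.
user_pref("general.useragent.override",
    "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0");
```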
"I never disposed of anything ever since"
- I was chuckling to myself before I read the following line: "Oh, yes, I've had all the beardies chuckling"
Re: "512 kilobyte Compact Flash cards" - I have a bunch of 512MB CF cards because I use them in an older (very expensive Canon 1DsMII) camera that doesn't recognize cards > 2GB. At 16MB imager - the camera is still in service and takes awesome images. I just used it last month to take pictures of my niece's junior prom.
Re: Old 80GB (40-120GB) SATA SSD drives - I have just recently started using this pile to add SSD USB 3.0 boot drives to my Pi 4b's. They make a great pair - performance is 10x over any SD card I've tried.
I just inventoried my "card" drawer and the smallest CF card I could find is 4MB, and the smallest SD card I found is 16MB, along with numerous CF->PCMCIA adapters - remember those? Cards aren't worth much nowadays, but they're small and don't take up much room, so I keep them...
"I've surrendered on that front'.
I haven't yet. As previous posts have stated, it takes a little bit of tuning. And I've also found that preventing access to twitter, facebook, doubleclick almost never causes any issues with displaying pages for the majority of websites that I visit. These are the top websites on my "no-go" list. noscript is actually fairly flexible for tuning.
The biggest issue I have is that I generally, temporarily, disable NoScript while I am purchasing something. E-commerce websites get very gnarly if they can't access something, and re-enabling NoScript in the middle of a transaction can result in double charges... Ask me how I know...
I'm fairly comfortable writing software in C, a little less so in C++ and C#. I've been coding for 30+ years and I don't always feel the need to use the latest whiz-bang feature introduced in the latest iterations of C++ and C# (I only use templates in every other C++ project). But I'm getting a little tired of Microsoft's "3" year development kill lifecycle.
I've been looking to make the move to Rust over the last couple of months. With this latest C#/memory alloc feature likely leading to disaster, I think I will be accelerating that move. At least for now, I like the idea that MS doesn't completely control the fate of Rust. There seems to be momentum with Rust and I would like to contribute to the momentum. I'll likely never stop coding in C/C++, but I think Rust will be the path forward - at least for cross-platform work and C# will get left behind (for me).
I agree re: Moore's Law, but Moore's Law was framed around the transistor physics known at the time. At the simplest level, the transistor is just a switch, 1 or 0. That's it. Yes, there are billions of them and they switch very fast, but it is still just a 1 or a 0..
I think (hope) that in the future (hopefully years or decades, not centuries) a new technology will be discovered that can provide a switch that holds a 1 or 0 state. Maybe somewhere along the way, the (fundamental?) building blocks of atoms (quarks?) will be able to be manipulated, and their states/spins then used. I suspect that at that point Unified Field Theory will be reality, or at least better understood.
I don't think "quantum" computers are the answer right now either. Biologic computers - maybe, but I suspect that solving/understanding the Unified Field Theory will come first.
Of course, there is always the possibility of ternary (trinary) computers becoming mainstream, along with their theoretical efficiency improvement, but industry manufacturing inertia will likely prevent that.
If I had to do it all over again, I would have entered the material sciences: room-temp superconductors, graphene, carbon nanotubes - cool stuff, with much more yet to be "discovered"/invented.
"While this remains in preview, it is yet another demonstration of Microsoft's determination to embrace a cross-platform world"
Sure, as long as:
- you purchase a Windows OS license from MS
- you purchase a Visual Studio license from MS
or better yet, purchase a Visual Studio subscription.
EEE (embrace, extend, extinguish)
One of the most important lessons that I learned early in my (software) Engineering career is:
"Just because you can do some thing, doesn't mean you should do that thing".
This applies to so many things in life.
It is especially applicable to technology. Unfortunately, I've yet to encounter a graduate of a university where this was taught.
KISS, when possible.
Make it only as complex as needed, and no more.
Needless to say, I'm not a fan of:
And realize that just because this early release runs in a VM doesn't mean that the final released OS will be allowed to run in a VM.
I've often wondered how VMware emulates a TPM for a VM, because one would think this would be a major source of security issues if the VM were running in a production environment. It also seems contrary to the TPM concept, since simply cloning the VM clones the TPM info as well. That doesn't seem consistent with security.
At some point, I think future versions of Windows will not be allowed to run in VMs. Since Windows 8, the OS is VM aware (the info is shown right in the task manager).
Once MS requires every citizen to have an on-line account to be able to log on, all sorts of things become possible (and a real danger) such as:
- Maybe your account has a special flag (that you need to purchase) that allows you to log into a VM
- Maybe your account has a special flag (that you need to purchase) that allows you to log into an OS that doesn't find a camera
- Maybe your account has a special flag (that you need to purchase) that allows logging into an OS w/o using facial recognition
- Maybe, as a politician, you have special privileges that disable logging of certain user activities
- Need more? I can think of many more..
The seeds are being laid now (and have been since forced updates in Win 10).
These are some of the reasons why Windows will never be on any of my primary machines (for personal or professional use).
"Nothing prevents cameras with hardware disabling features, either to cover them or disconnect them entirely."
Are you sure about that? What happens when the OS finds a mobile CPU/Chipset and doesn't find a camera? Maybe facial recognition is the trojan horse that MS is trying to institutionalize. Maybe the OS won't let you log on/use your machine at all w/o a camera in such circumstances.
But I do agree, I think the camera requirement is more significant than the TPM requirement.
Now, imagine what would happen to the price of a precious metal (gold, silver, palladium, etc.) if 90% of its mining capacity had been removed?
I can guarantee that its price would not be going down!
It's interesting to note the dichotomy between a pure concept and something physical. Recognize that neither cryptocurrencies nor gold have any intrinsic value, but being able to take (and presumably own) physical possession might have advantages. Both are speculative in nature and not an "investment" in the classical sense of the word, since neither produces anything - the expense is in the mining operation and the price is based on that + an artificial "profit". The latter contributes to the volatility.
I don't know a lot about crypto, but it seems like it can be manipulated (if a single entity owns enough of the processing/mining nodes), just like fiat monies can be.
I wonder how many years (and updates) it will take before all of the security holes that this will introduce are patched.
Hackers are going to have a field day with this.
MS had better have a damn-secure way of controlling when this is allowed and when it is not.
MS's track record for security isn't exactly stellar...
I learned a long time ago:
That every position/job within a private/public company whose goal is to be profitable (so, excluding government positions) has a certain value to the company. If the company is at all organized, the value of that position is known.
This is, for the most part, completely independent of where a potential candidate lives - assuming the basic requirements of remote access, ability to perform, and security are met. So when an employer doesn't list a salary range, I call bullshit. It's the oldest game in the world: whoever (you or the employer) gives a number first, loses. This is the same tactic used by car salesmen.
But I don't think it's all bad news for the employers, because what they get is a pool of applicants who are presumably qualified for the job and who can (or have arranged their situation to) live on the salary specified. The applicant may reside in a modest location, with a modest lifestyle and a modest pedigree, or not - that's up to the individual, and it determines what an individual can accept in terms of compensation. If the employer doesn't get any responses to the range, then perhaps the value of the position is too low, or there isn't anyone available. Either way, the employer needs to re-assess the position's worth, or go without.
I think the spirit of the law has good intentions, we'll see how companies will react.
As other commenters have said, it is a two-way street - each is evaluating the other and both have to accept, there is freedom of choice involved.
It seems like this effort is akin to converting a PLC's (programmable logic controller) ladder logic into another programming language, or vice-versa. I don't quite see the utility of this. If they're just trying to convert X number of inputs to Y number of outputs, then it is just a state machine. State machines can be very elegant (but those are usually quite obfuscated) or very inelegant (and usually easier to understand). Computers are very good at predictable behaviour (even without AI). Granted, most correctly written software (excluding AI) can be distilled down to gigantic (predictable) state machines. Lucky for us humans.
To extend this thought, this is what FPGA tools already do. Take verilog / VHDL and turn it into a set of bits that define a huuuuge state machine that runs in the FPGA logic gates. Again, this has been done.
Taking examples of human programming as models of (good) security just seems..... wrong. We (humans) aren't very good at that.
And lastly, why would computer-(AI-)generated source, in a language designed by humans to run on a computer, be desirable? Once it's generated, humans are going to review and comment on its correctness, after the AI has already generated it from learned examples (potentially billions of inputs, both good and bad)?
We go from:
problem -> human -> (programming) language source -> preprocessed -> compiled machine code of choice
And with AI:
problem -> AI -> (programming) language source -> preprocessed -> compiled machine code of choice
Why not just:
problem -> AI -> compiled machine code?
We exist to serve our AI overlords.
"Python's decision to stab 20 years of Python codebases in the back was an exceedingly poor one, which may take at least another decade to get somewhat over."
As a 25+ year C/C++ developer on everything from 8-bit micros to Enterprise software, I can't imagine rewriting substantial amounts of code every time the C++ standard is updated. (Although lately MS is trying its best to break shit from VS2019 update to update.) Java seemed to get it (backwards compatibility) right, at least for a while.
I suspect one or more of the following is true of the Python deities that decide such things:
1) They aren't involved in large-scale project development, where 100,000+ LOC and 100s of developers are working on a project
2) They aren't involved in projects where the code life is expected to span 5,10 or even 20 years
3) They are lazy, because it's much more difficult to add/fix functionality w/o breaking backwards compatibility than it is to just break the APIs. I know this because I just finished spending 2 months fixing an issue in a library w/o API changes, when it would have taken me 1 day if I had changed an existing API. That change would have forced a recompile/relink by the multitude of applications that rely on the library (and yes, my paying customers appreciated not having to do that).
4) They aren't involved with developing software that runs on FDA (medical) equipment, where a single change can force the requalifying of the entire software chain that can take months of testing and cost $$$$
I actually don't have much respect for any programming language that relies on non-printing characters to format source code so that it will compile successfully.
I usually allow one mulligan (redo) for major version changes; the real question is what Python 4.x will be - will it break compatibility with 3.x? Hopefully the language designers learn from their mistakes.
This idea that all code, everywhere, has to be rewritten every time a spec changes is just lunacy, IMO.
I know.... TL;DR
Users using the CD-ROM tray, extended, to hold their coffee cup. This was in the early days of CD-ROMs - when Starbucks and "el Grande" sizes didn't exist yet. The CD-ROM tray, when opened, would suspend a styrofoam cup of coffee nicely.
Also ran into someone that kept having their 5-1/4" floppy disks fail after taking them home and then bringing them back to work. Turns out the spouse was putting them on the fridge door (with a magnet) so that they wouldn't forget to take them back to work the next morning....
I have a Cricut, but it's been packed away since my mother passed about 1-1/2 years ago. It's an older model, judging by the fact that it uses cartridges, of which there are about 30 with the printer. She used it a lot, and I know it worked without being attached to any computer and without accessing the internet. I know this because I maintained her laptop (and it was/is running Linux Mint the entire time).
I don't know what it's worth (if anything).. I should probably let it go so someone can get some (possible) use out of it.
Re: Cricut's business model. Most of my Mom's friends who did the whole "maker" scene did it to 1) save money and 2) be creative. 100% of them were also on social security (fixed) income. I don't know if that demographic would support a very lucrative business. Profitable? Yes. Lucratively (obscenely) profitable? Maybe not, but what do I know? I've not made millions in the investment scene. Of course, if the central banks keep printing 240 thousand million USD per month, inflation will demand that everyone become a millionaire, because a loaf of bread will cost $5000. Those who ignore history are doomed to repeat it..
"used to put grotty keyboards through a dishwasher"
Yep, same here. In the mid-90s I worked for a company and personally saw the crew that maintained the computer systems use the dishwasher to clean up Sun 3/110 workstation keyboards (optional and very expensive in their day). Worked like a charm, after ruining the first 2 by not removing them before the "dry" (heat) cycle..
By experimentation, they found the best results:
- wash with keys facing down
- no detergent (just hot water)
- remove before heat cycle
- blow off w/ compressed air
- let air dry for a couple of days
My best support story:
Back in the late '90s, the (East Coast) consulting company I was working for had been engaged by a (West Coast) ASIC manufacturer to create a dev kit for their product. This involved several cycles of them sending us new hardware; we developed the SDK for it and shipped the software back to them to test/integrate with their solution.
After a couple of cycles, I get a desperate call from the mid-level manager (we call him "Grendle") and Grendle was completely distraught that their developer couldn't debug the latest release we had sent. I asked repeatedly that I be able to talk with the Dev and Grendle let loose with a stream of expletives and denied my request. He demanded that I immediately fly out there in person and fix the problem. After getting a written request from him and confirmation from my manager, I booked the next flight out the following day.
I landed around 10AM (local), took a cab, and arrived on site. I couldn't find Grendle, so I just went and talked with the developer. He showed me what was happening; I looked down at the hardware, plugged in the 2nd serial port cable (which was lying right next to the kit) and said "please try again" - and it worked, just as it always had in the past (yes, the 2nd cable had always been required and in use). I asked the dev if there were any other problems; he said no. About that time, Grendle showed up.... I explained what the problem was. He got very quiet and red-faced as I walked over to his manager's office, explained why I was there, and said that I was now leaving. I caught the next flight home - running through the terminal and making it JUST as the flight attendant was closing the door to the walkway...
A couple of months later, Grendle was no longer working for the company.. Twit..
On a serious note, I thought there were enough space suits (that fit?) for every member on board at any given time.
If there was an emergency (air leak via meteor strike), wouldn't SOP dictate that everyone suits up? What - the women would draw straws?
This seems like more than just a casual oversight.. In fact I would suspect that NASA is likely just putting on a good face while much more serious discussions are taking place privately.
This^^. Why would anyone want to start someone young with outdated concepts such as BASIC? I have to think that the entire "team" for small basic is like.... 1 person. Porting to a new .NET framework counts as an entire release? Wow.
Grow into VB? Really? Get a clue MS -> you should be concentrating on hiring more people to test/QA Windows 10, instead of putting out half-baked crud like small basic. MS is so messed up right now, I really have to wonder what Windows will be like 5 years from now...
They're both solutions in search of a problem to solve.
I have a difficult time believing that carriers are going to sell this to John Q. Public, when faster data will just make the cost of "unlimited" data plans and data caps more obvious to the end-user.
The carriers need this because they dread the prospect of having to compete with each other on bandwidth pricing - but hopefully that's exactly what will occur, at least until industry consolidation leaves a single carrier: a monopoly, with no competition to control prices.
I have a difficult time believing that mobile users require this high speed - for what? For updating facecrook pages, twitt messages, sending pics/video?
I just don't see a killer app that needs this.
Perhaps someone can clue me in?
"30 day password policy"
I don't know why this continues to be considered good practice in the industry, because it's NOT. All this does is encourage writing the password on a post-it and sticking it on the bottom of one's keyboard.
Forcing someone to remember a new password every 30 days is ridiculous - in this age of smart phones, most (99% of) people can't even remember a new phone number every 30 days.
And why 30 days? Why not every day, why not every year, why not every 5 years? Where's the proof that this does anything to improve overall security?
This policy actually results in less security - find a better way; this one has got to go.
To the original poster: (AC indeed is appropriate).
Any time a commercial entity (in this case Apple) proposes a change to a current, working system (such as SIM cards), it's usually to the monetary benefit of that entity (lower manufacturing cost & end-user lock-in to the vendor). One could say the same regarding the elimination of the headphone jack, or the Lightning connector; the list goes on...
Any possible perceived benefit (real or unreal) to the end-user is purely coincidental and secondary to the previous statement - and is used to launch a publicity war on the users, to get them to accept the change without understanding its true impact and the reason it is being made. Basically, most companies don't give a shit about their customers; they only care about extracting maximum profit from the user's personal worth.
You see, it really is just about profit for most companies nowadays - it's not about being the best at something, or providing the best value, it's just about profit anyway they can get it - it doesn't matter how, when, why, who or what.
Do you really think that any cost savings that Apple gets will be passed on to the consumer? I don't think so. More than likely, Apple will turn this into a profit center - after all, specialized software, infrastructure will be needed along with a surcharge - all under the guise of "security".
(Non-e)SIM cards work just fine; in fact, they work too well for Apple's liking..
(As a side note, I keep reading about how millennials can't afford to purchase a home.. I wonder if not spending the $2500-3000/year that each one spends on having a phone, premium cable, Netflix, Hulu, CBS and Disney subscriptions, plus the $400/month lease on that brand-new car, would help towards a mortgage - you can't have everything - grow up and choose what you really need, and not just something you "want".) For those who are being financially responsible, I apologize.
Flame away, I stand by my convictions..
Top Bing searches:
1. "google", "chrome download"
2. "firefox", "firefox offline installer"
3. "midget porn"
4. "how do i clear history"
In the "old" days I used to call IE the "firefox download/installer", it looks like Bing has assumed that role. I run it once only if I don't have a thumb-drive handy with firefox offline installer on it. It's okay if it's out of date, I will update after the install.
I have to admit that lately I have to "hold my nose" with Firefox's latest crap, because it stinks so much. If there's a Pale Moon available for the OS I'm on, I will use that instead.
Chrome is not my first choice for a browser, due to the auto-updates and Google's data-gobbling attitude toward privacy - I also don't use Google for searching (at least not directly); it's either DuckDuckGo or Startpage. DuckDuckGo actually honors your search terms, unlike Google, which ignores them whenever there's a paid-advertiser keyword match, at which point all of their products magically override your terms and show up in the first 5 pages of results..
Using Google has become VERY irritating, not to mention very time-consuming, having to read and then ignore their "search-vertisements" (you heard the term coined here first :) in the first 10 pages of returned results. Hey Google, do you think nobody notices this crap? And I've never liked the fact that the returned result link goes back through a Google server first before going to the site, and they hide this by not showing the true URL in the link. "Do no evil" - I call BS.
Is this where they use the "internet of tubes" to deliver the actual ink over the internet (which would actually be truly useful)? Or is it just another subscription service that replaces the lower cost (to the consumer) of just purchasing the stuff?
inkjet ink has to be one of the worst/best inventions ever.
I am so glad that I use pdfs and tablets now. No paper & no printing costs.. And did I forget to mention that storage, organization and searching of electronic documents is a lot easier than physical media?
"I use all versions of Windows from Vista onward every day - and my conclusion is that, from a point of view of getting things done from hour to hour, there's very little difference. Each one has its quirks and they move the various system function controls around but basically, as an end-user, they are all the same."
I think this is kind-of the problem (from Microsoft's point of view, anyways)...
MS needs/wants a continuous income stream from Windows, and they are trying to get it by:
- Subscription-based Office 365
- Microsoft Store (buying apps, games)
- Inside-OS advertising
In order to enforce this, the Windows 10 OS is being used as the jailer (it keeps everyone in a narrow user profile). By enforcing updates, Microsoft can change their marketing at will, and Windows 10 users have no choice but to accept "updates" and upgrades, all under the guise of better security.
But the problem for MS is the age-old one of being able to lead a horse to water, but not being able to force it to drink (the Kool-Aid)..
How many "features" has MS put into 10 - Groove, photos, etc? and how many people use them? Hell, I don't even know what 90% of them do and I have no desire to even find out. I just need the OS to run the (mostly non-MS sponsored) apps to get my work done or play games.
And that's really all an OS should do - which is my primary objection to Windows 10: it's being used as a hammer to beat its users over the head, and to spy on them.
I may have to use Win10 at my current job, but Windows 10 is not on any of my 6 machines at home (not even in a dual-boot configuration, and that includes dozens of VMs). Right now it's 50/50: 3 Linux boxes, 2 Win 7 and 1 Win 8.1. The Win 7 machines are laptops with special HW devices that have no Linux support.
I currently have no plans to ever have a Windows 10 machine within my home network.
"because the download is at least four gigabytes"
It kind of makes you wonder.. 4 gigabyte update...
So: updates typically aren't introducing new features, and they remove older functionality and older drivers - why so big?
I guess collecting more telemetry and fixing security bugs would be the majority of 4GB of updates. That's a lot of fixin'... Either way it's not a good thing.
One wonders where this will be in a couple of years. What's good enough, for MS, for an OS? MS still says Windows 10 is the last version of Windows. How much is our personal data worth to MS & their customers? Is MS's collection of data about everyone worth more than Google's, or Apple's, or Amazon's, or Netflix's? Methinks at some point there will be saturation, and at that point these companies will start charging hard currency for access to their systems.
Want to create a Word doc? - 15p per doc or .01p per each word in the doc. Spreadsheet? - 0.1p charge for every cell used. Database? - 0.1p per row of data. 1p for each field. Talk via Teams? 1p per conversation...
Every aspect of every person's life will be owned and we'll have to pay to do anything/everything.
They're already controlling (via fines) water consumption in CA. If someone can figure out how to control the air we breathe, they will.
Just because you don't agree with the conclusion, doesn't mean it can't happen.
Land -vs- Sea
I am imagining 1000 ft. tides every 18 hours washing inland, colliding with volcanoes/lava, and the resulting vapors being put into the atmosphere each time.
Also, with that kind of tidal action, there must have been a significant portion of the sea beds that got exposed to the atmosphere at the same time.
Wow, this would be something to witness (not personally of course).
I wonder if some boffin somewhere has put together a video simulation of what the earth would actually look like - it might be kind of cool to watch.
BTW - why are the Earth's oceans salty? - Where the heck did all of the salt come from?
"I prefer the one in the title: 'Most of our ideas suck'"
I think that most would agree (that most ideas suck), the very big problem is:
Who determines that the idea sucked and should die a violent death? Because it's certainly not the customers, at least in the systemd case (or GNOME 3, Unity, TIFKAM... the list goes on). As far as I can determine, it's usually someone with a vested (i.e. financial) interest who decides whether an idea remains or not.
I'm very glad that we (as consumers and computing professionals) still have at least some level of choice, but that is rapidly disappearing - sadly, that choice may be completely gone within a couple of years.
"Apple recieved a lot of slagging off for wanting to remove 32bit support from MacOS (even though it has been on the cards for years). Will MS get the same angst and even hatred from the user community?"
I'm not completely sure, but wouldn't removing 32-bit support from Windows completely kill VB6 apps? If so, then yes, a lot of (legacy) apps would immediately be orphaned - and some of those apps are so old that the code/developers for them may no longer be available to port them.
If you still do work on a PC, you might be surprised at how many apps are VB6-based. I was very surprised, when Windows 10 was in preview, that VB6 apps were still supported.
Also, it's interesting, if you look at Task Manager on a running Win 10 x64 system, how many of the core OS services are still 32-bit - I wonder why? It seems a little strange - probably due to having to support 3rd-party 32-bit apps?
Heating up some popcorn to watch this one unfold over the next (5-10?) years.
"It has recently come to our attention that a certain number of users have been using their MacBook Pros in a manner that is not consistent with Apple's
profit design parameters. This incorrect usage is causing a complete and total failure due to a design/manufacturing defect an anomaly with their keyboard. To reiterate, MacBook Pros are designed to be only used in environments that have fewer than 83 particles/cubic meter (https://en.wikipedia.org/wiki/Cleanroom). Using them in anything less clean is not covered under warranty"
I read that link.
I challenge anyone, anywhere, at any McDonald's in the world, to purchase a burger, open it, and find that it looks anything like those shown in the pictures in that link.
Pictures, or it didn't happen.
Not that any other fast-food joint is any different, but I get sick of the crap that companies are allowed to get away with.
I call MS (Marketing Speak), which means the same as BS in more common speak.
>>"Do you have a current full time career in software development or are you retired or more of a hobbyist perhaps? "
25 year career. 12 as a consultant, 13 as an employee. Hopefully with another 10+ to go.
For me, the difference is: as a consultant, if someone wants to change something, they pay for my time either way. As an employee, I have some say in the matter, with input into requirements, and I also make it clear how any changes will affect delivery times.
Re: requirements - I version them frequently, eventually they do get tracked back to a version that's released.
I don't really do too much with direct IT type of activities anymore.
I wonder how many organizations that use an agile style of development use contract employees - it's expensive to do so. Agile seems to work for smaller projects, where co-location is almost a given; I've never seen it scale well to larger projects, nor to projects where certifications are required. I once worked on a project that involved a combined 600 SW & HW engineers across the globe. Doing that without formal requirements would not have worked.
IMO agile appeals to younger coders because it gets them coding almost immediately and away from having to work on (the boring) requirements docs and it appeals to management because they don't have to worry about them as much either. So the end result is code early and recode/refactor often. I kind of get tired of recoding and throwing away code on a monthly basis.
I can point to products that have been out in the field for 20 years that I contributed code to, and I get a very large sense of personal satisfaction from that - not something that's going to disappear from the web in a couple of months. But that's just me; some people just want a paycheck. Not that there's anything wrong with that at all - we all need to eat..
>>"Version numbers are an artefact of the pre-internet era way of doing software development."
Spoken by someone who has done a very narrow range of software development, or none at all..
>>"With today's continuous release and cloud platforms (github, etc) they should just dump out the version numbers completely and go for YYYYMMDD-style date stamps."
Out of all the possible reasons for versioning software this has to be the absolute least useful information and format.
Software versioning is used by other software developers, not the end-user (except possibly to identify & report bugs/problems with a specific version).
Software versioning is important if one is maintaining other software packages that may have to be backwards/forwards compatible and may take a long time to release and get certified for use.
The concept that every piece of software is constantly undergoing change and release along with every other piece of software is just plain idiotic, not to mention wasteful of developers' time.
Conflating the release mechanism (continuous delivery) with the reasons version numbers exist is meaningless.
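The compatibility point above is the crux: a major.minor.patch number lets a dependent package express a compatibility range, which a date stamp cannot. A minimal sketch (the functions and the compatibility rule here are illustrative, not taken from any particular project or library):

```python
# Hypothetical sketch: why "4.15.2"-style versions carry information
# that a "20180416" date stamp does not - a dependent package can
# reason about backwards compatibility from the number alone.

from typing import Tuple

def parse(version: str) -> Tuple[int, int, int]:
    """Split a 'major.minor.patch' string into comparable integers."""
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

def is_compatible(installed: str, required: str) -> bool:
    """Common backwards-compatibility rule: same major version, and the
    installed minor.patch is at least what was certified against."""
    inst, req = parse(installed), parse(required)
    return inst[0] == req[0] and inst[1:] >= req[1:]

# A package certified against 4.15.0 still works on 4.17.2 ...
assert is_compatible("4.17.2", "4.15.0")
# ... but a major-version bump signals a breaking change:
assert not is_compatible("5.0.1", "4.15.0")
# A date stamp answers neither question: was 20180416 a breaking
# release relative to 20170301? The name alone can't tell you.
```

This is exactly the property that matters when maintaining packages that take a long time to release and certify: the number encodes a contract, not a calendar.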
You can call me old-fashioned, but I'm not the one rewriting the same code every other week because someone decided to change the story-board. I have better things to do with my time. Get the requirements right the first time..
>>"e.g. "release 20180416-S" has a lot more information included in the version than "release 4.15.whatever-rc3" - it indicates right there in the name WHEN it was the last release and whether it was a Stable or Release Candidate."
Who the fuck cares when a software package was released? Again, useless information. Your statement indicates that you really haven't a clue as to why / how version numbers came into existence.
Obviously, you have never written software for an embedded system, where everything is not always upgradeable in the field, nor should it be.
Not all software developed is the weeny-wanky web-based crap apps used on phones whose only real purpose is to mine your personal data for free.
For those who didn't quit reading (tl;dr): thank you.
I'll start out by saying: I know nothing about writing/designing software for the Apple ecosystem, so I have no credibility there. Mostly because I don't like walled-garden (authoritarian) ecosystems.
But I will say that almost any software design that employs loosely coupled components with a separation of concerns (i.e. GUI != App), communicating via events between the GUI and the business logic, has lower overall effort and cost during its product lifecycle - not to mention, usually, fewer total defects.
That's not to say that it's easier to design, because it's not, but in the long run it's much easier to maintain and to move to newer technologies (NT, anyone?). Hurd notwithstanding, most consumer (i.e. not real-time) apps would benefit from this approach, IM(25+ years of experience)O.
Running on a common core set of software services (OS) on a common core set of hardware (ARM) is a good goal, as long as the GUI-side (another common set of services) of things is left independent so it can provide the best End-user experiences for the specific platform being targeted.
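The GUI != App separation above can be sketched with a tiny event bus - all names here (EventBus, OrderLogic, the event strings) are illustrative inventions, not from any real framework; the point is only that the two sides share event names, never types or toolkit calls:

```python
# Hypothetical sketch of loose coupling via events: the business logic
# knows nothing about any GUI toolkit, so the front end can be swapped
# (console today, Qt or web tomorrow) without touching it.

from collections import defaultdict
from typing import Any, Callable, DefaultDict, List

class EventBus:
    """Publishers and subscribers share only event names, not objects."""
    def __init__(self) -> None:
        self._handlers: DefaultDict[str, List[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: Any = None) -> None:
        for handler in self._handlers[event]:
            handler(payload)

class OrderLogic:
    """Business logic: reacts to events, emits events, no GUI imports."""
    def __init__(self, bus: EventBus) -> None:
        self._bus = bus
        bus.subscribe("order.submitted", self.handle_order)

    def handle_order(self, item: str) -> None:
        # ... validation, persistence, etc. would go here ...
        self._bus.publish("order.confirmed", item)

# A throwaway console "GUI" wired up purely through the bus:
bus = EventBus()
logic = OrderLogic(bus)
bus.subscribe("order.confirmed", lambda item: print(f"Confirmed: {item}"))
bus.publish("order.submitted", "widget")
```

Because the coupling is only through event names, the "common core of services" can stay platform-independent while each platform ships its own GUI layer on top.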