Indeed
This is the point where some clever BOFHs post "Why don't you just use xxxxx; it does everything, and it's free. It even orders your round in at the bar while you wait for the green ticks of completion."
/sits and waits....
Like most of you reading this article, I neglect good patching hygiene. There are very good reasons why all of us should obsessively test every patch and patch our systems immediately, but patching is a pain in the ASCII. The tools suck, rebooting sucks and, most damning of all, something usually breaks. Each PC, and most …
Still, Linux fans refuse to understand that the same applies to Linux. Not everything can be updated from the distro repositories or others; a lot of commercial software running on Linux *is not* available from the repositories or in a distro package, and that software will still require its own patching procedure (which can sometimes be even more difficult than on Windows, if instead of running a setup you need a more complex procedure). Try, for example, patching Oracle on a Linux machine...
Sure, you can be a Linux purist (or Taliban...) and refuse any "commercial" software not available from repos, but that's not how sensible businesses are run, and without commercial applications there would be far fewer Linux servers running.
And while Linux doesn't ask you to reboot explicitly, it does need a reboot if the kernel was updated, unless you use a tool like Ksplice and accept the inherent risks - which is not the default mechanism. And if you don't reboot, the kernel is not updated and the patched vulnerabilities are still there... just giving you a false sense of security.
Not a Linux user, but it seems even to me that you're bitching at the wrong party. If the volunteers working on the kernel (you know, the DIFFICULT part of building the OS) can build a system that updates itself seamlessly, Oracle ought to be able to follow the model. Particularly as the DNA is already in the system.
Oracle will happily update Oracle Linux for you, and its hw+sw systems do it (at a nice subscription price...). But not other Linuxes (especially since it wants to sell you its own).
But updating a complex database - maybe in a cluster - is a little more complex than updating the kernel, which is a complex piece of software, but relatively small and compact.
Sure, everything can be automated - SQL Server is far easier to patch, and even MySQL under Windows now - but don't expect Oracle packages in any Linux repository soon.
Other products may not want to invest in packaging for each supported distro, and instead let you manage the updates yourself; nor may they want to publish their own repositories, because they want to control who gets the patches and when (you may need to pay for a support contract for them).
The distribution model that works so well for open source software unluckily doesn't work for commercial software. Because open source doesn't cover every need, it doesn't really matter whether you run Windows or Linux when it comes to patching.
No, it looks like a lot of Linux "admins" don't know Windows (it stopped being a consumer-only OS a long time ago, but you probably didn't use anything past Windows 98), and don't know Linux either.
If you're convinced Linux doesn't need to be rebooted, well, you don't know how Linux works. Your "conscious" decision looks to be based on faith, not on knowledge or experience. And of course you don't like to discover you don't really know Linux at all...
@LDS
I think the freetards, myself included, believe that the Metro or WhateverIt'sCalledThisWeek interface is a consumer interface that has no place in a work environment. Windows 8+ and Windows Server 2012 (ROFLMAO) are consumer OSes.
Linux only needs to be rebooted when either:
* kernel gets updated
* glibc gets updated
Those two are about it ... compare that to the three reboots required to update a printer driver or browser on Windows.
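For what it's worth, both big families will even tell you when one of those reboots is actually due. A minimal sketch (stock Debian/Ubuntu paths; needs-restarting comes with newer yum-utils):

    # Debian/Ubuntu: package scripts drop a flag file when a reboot is needed
    if [ -f /var/run/reboot-required ]; then
        echo "Reboot needed for:"
        cat /var/run/reboot-required.pkgs
    fi

    # RHEL/CentOS with yum-utils: exits non-zero if a reboot is due
    needs-restarting -r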
I have PostgreSQL on my Linux system; guess what, it gets updated/patched via the repository. Same for MariaDB, as well as a number of other useful databases. Oracle DB is a "pain" to update: just like the Windows version of Oracle, you have to go and download the patch unless you use the XE version (there are repositories with the XE version). HOWEVER, no rebooting-three-times bullshit. The same goes for Sybase, BTW, and again, no reboots required.
The problem is Windows file locking and VERY flaky driver load/unload support, Plug&Pray ... they completely fucked those up.
I have trained MCSEs; I know Windows.
OK, so their employers volunteered them for the job. Whether you're talking about the start of Linux way back in the mesolithic age or right this instant, Linus isn't paying people to develop the OS so they are volunteers one way or another.
But the primary point still stands: the OS inherently provides an easy to use upgrade path. Oracle are CHOOSING not to enable the easy upgrade path on their commercial software. So the fault for Oracle software not being easy to update on Linux as compared to Windows is not the fault of Linux but Oracle.
> So the fault for Oracle software not being easy to update on Linux as compared to Windows [...]
WTF, it is the same process for both: you download and install it. Oracle XE is available in repositories, so updating that is easy.
You sir, have no clue.
Link to Oracle Upgrade documentation:
http://docs.oracle.com/cd/E11882_01/server.112/e23633/upgrade.htm#UPGRD105
a lot of commercial software running on Linux *is not* available from the repositories or in a distro package
...Which is why the very first thing you do on receipt of such a patch is to *put* it in a package. It's not hard.
You then use yum or apt to upgrade the commercial app just as easily as if it were a distro package. All the metadata is stashed away in the usual manner, and all the rollback options are immediately available. In short, it's a small amount of effort for a *vast* improvement in usability. It pays for itself very quickly...
Vic.
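For anyone wondering what that looks like in practice, here's a minimal sketch using the third-party fpm tool - the app name, version and paths are invented for illustration:

    # unpack the vendor tarball
    tar xzf acmeapp-2.1.tar.gz

    # wrap its install tree in an RPM (use -t deb on Debian-family boxes)
    fpm -s dir -t rpm -n acmeapp -v 2.1 \
        --prefix /opt/acmeapp \
        -C acmeapp-2.1 .

    # now it installs, upgrades and rolls back like any distro package
    sudo yum localinstall acmeapp-2.1-1.x86_64.rpm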
just to start the theme... but it is free with many server licences and the server technology is working towards reduced reboots. Often it is poor application design that leads to the locks that cause the reboots though.
The writer says he keeps everything open all the time - it's no wonder he is required to restart though!
I think I will put up with the occasional reboot rather than be pwned though
I have to agree with that. Not rebooting for more than 4 months on your home PC isn't lazy, it's obstinate. Granted, my home PC is not fully protected, but that's not for failure to reboot. In fact, my home PC gets shut down every time I log off.
Last time I looked it was reporting 6 vulnerabilities: 1 for a patch it was busily installing and 5 for EOL programs. At the moment, I doubt any of those EOL programs will be updated in the near future. I simply don't have the available cash flow to do so.
"Not rebooting for more than 4 months on your home PC isn't lazy, it's obstinate."
A reboot is the occasion you get to test that startup scripts and related gubbins are working correctly. You have more chance of sorting out problems if you are not changing a whole load of stuff at once.
To those who are proud of long uptimes, especially on servers, consider this:
Your uptime represents the amount of time since you last tested your startup.
"Your uptime represents the amount of time since you last tested your startup."
And my long term health represents the amount of time since I last tested my immune system. I don't want to get sick to test how good I am at getting better thank you very much.
I also don't want to lose all my data to test my backup/recovery process. Sure, I'll schedule it in on my own planned schedule; when it's convenient for me. Maybe I'll eat something a little bit "iffy" tonight to test my digestive system...
Sorry, not buying that one either. We're talking Windows specifically here, not Linux and on a home PC.
And if it IS a work server instead of the home PC, you OUGHT to be double-checking those startup scripts and testing them regularly, so the company's cash flow isn't at risk if you're forced into a restart by some other issue. Because, well, I've been the technician trying to rapidly answer the phones and calm the frantic users when some damn fool on the network side trusted that the contractor who came in to configure the array had properly written out the final configuration to the BIOS - and he hadn't. Nope, nobody had a hardcopy of the configuration either. Yep, they went to restore from backup. Unfortunately it was also the first day the regular backup guy was back from his two-week vacation. And guess what? Yep, that's right, the stand-in for the regular backup guy missed some critically important error messages in the backup agents. Yeah, the ones that said the job hadn't finished. So 350 people at the company lost two weeks' worth of work.
We repeated the exercise a few months later when two drives in the array failed at the same time because of excessive heat in the server room. Thankfully that time they had both updated the BIOS and kept a hard copy of the information, plus the backups were current. It still was not a pleasant experience coming so close to the prior failure.
I am SO glad I don't work for that outfit any more.
Most software today was designed with one goal in mind: to get it out the door and money coming in, in the shortest possible time.
Hence, Version 1.0 is almost always Beta 0.1 and there are few major packages that are anywhere near usable before version 3. After that, it's a case of slapping patches on zaps on top of updates in a vain effort to plug the design holes that a faraway hacker-schoolchild with some free time discovered within a few minutes of installing a pirated copy.
What we need is a recognition that every patch we are required to install is a message from the vendor saying "this software (that we took your money for) is not fit for purpose". We need software companies to be held responsible for their shortcomings and their irresponsible attitude of "we'll fix that in the next release". Maybe the answer is a "bug tax" where package makers are charged 1% of their revenues (not profits - that's too easy to manipulate, and revenue is what the customers have paid) for every major weakness discovered by a third party, and that money is then held for the customers as a sort of discount against the cost of maintenance & support contracts and future upgrades.
Well said, Pete 2.
For most users (even large enterprises) the overriding problems are:
1. Patches and updates are done in the background, invisible to the customer/end user.
2. Because the software is proprietary, technical information adequate to understand what the changes actually do, functionally, is not available.
3. This makes it impossible to predict the effect of the changes on user applications.
4. Fully automated regression testing can only identify whether the software is functioning, not whether what it does still meets the user's requirement.
5. From an end user standpoint, testing is an exercise in futility. I can't test something without having at least a clue what it is I'm testing. And any results are only valid until the next patch/update.
This is not likely to end particularly well for customers.
While my gut agrees completely with your sentiment, my head says it doesn't work that way. There will always be bugs in the software. The question is more how much needs to be allocated in resources for the benefit derived from finding the bug.
Back when I were a wee lad and HP was actually a decent engineering company, I had the good fortune to be working at a company they had partnered with to develop some software. Being a good engineering firm, they had a formula for predicting bug discoveries. If you really are working on patching/fixing your code, as opposed to adding new features/bling, discovery asymptotically approaches zero. In the initial stages you find bugs easily and fix them quickly. The further along you get, the longer they take to discover. I don't recall if they also got increasingly harder to fix. At some point you don't expect to find another bug even if you throw another 30 man-months at testing. What they did was categorize how critical the bugs were, and when they didn't expect to find more bugs above a certain level within the next 30 days, they called the software good and shipped it.
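For the curious, one standard way of formalising that asymptotic curve is a software reliability growth model such as Goel-Okumoto - a sketch only, since I have no idea which formula HP actually used:

    % a = total latent defects, b = per-defect detection rate
    \mu(t) = a\left(1 - e^{-bt}\right), \qquad
    \lambda(t) = \mu'(t) = a b\, e^{-bt} \to 0 \text{ as } t \to \infty

The 30-day rule above is then just "ship when lambda(t), restricted to bugs above severity X, drops below the agreed threshold".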
The problem in the current environment is that companies aren't even doing that level of testing any more. They seem to be driven entirely by the marketing schedule, not the engineering reports. I do think your proposal is a good starting point to address that problem. Just don't expect it to eliminate all bugs.
It's buried somewhere in folder preferences.
Apparently any Mac running Lion or above will save all running program states for you on logging out and reopen them all again on logging in again, but I haven't been brave enough to try it.
I can live with Windows for the desktop, as so much useful software only runs under Windows. An automatic reboot at 4am wouldn't be too much of an issue, as I only have to remember the half-dozen tasks I've been working on and left open. Probably even less of an issue for normal people.
But servers, whatever the OS, don't get patched & rebooted automatically if you value your job.
Even for servers, it depends on the server. You have to assess which servers are critical and need more care, which applications they run, what the risk of delaying patches is, and so on. Then you can create different groups to manage different update policies (what, when and where...).
One size doesn't fit all. That's why you are paid as a sysadmin - if your aim is being lazy, well, that's not the job for you.
"Still not ready for the desktop.
Discuss."
The very slight disagreement with this would be the word "still". I think XP was OK as a desktop, but all the rest after, including 7 has been utter bollocks as far as desktop goes.
Heck, I built a Win 7 VM a while ago, and it took me a stunning one week (day and night) to bring it up to current patching level ! One freaking week !
If you screw your Mac's OS, it takes no time to restore from backup (yes, I know, a couple of bugs with Mavericks as far as restore goes, but still), and only 1.5 days to install from scratch via the internet (on a 2.5 Mbps link). And patching occurs only every 6 months or so, and the patches are bundled, not distributed in hundreds of sets that take forever to download. Updating a Mac takes no time and a single reboot, while Windows is now a permanent download/reboot thing.
I still remember when this bloke brought me a Win 7 laptop, not booted for 6 months, as a tool for a jogging race. Upon plugging it into the WiFi, five hours later, after everything was finished (on my Mac), it was still updating and still unusable!
Redmond will need to change their OS update scheme or they will lose even more to OS X and/or Linux.
"If you screw your mac's OS, it takes no time to restore from backup "
The same applies to W7 if you take the precaution of backing up a system image to external storage. I gave someone a new laptop last week. The first thing they did was to install Chrome - from what appeared in retrospect to be a very sticky malware site. The W7 system image was reprimed from DVDs while they waited.
W7 also has incremental back-ups - but the users tend to ignore such facilities.
Linux user since 2004.......my son convinced me to install Win 7 on the SSD of the computer we just built (that took 20 minutes, the build that is.) 3 and a half hours later (ladies in the family tapping feet, consulting watches giving us dirty looks as they worked out that they would not be home in time for Emmerdale...) Win 7 seemed installed.
Back at home I booted up, and Win7 wanted a further three-quarters of an hour of patches before it allowed me to work on installing some Line6 software.
I then installed Linux Mint 17 MATE to the traditional spinning iron disk in the machine ... 6 minutes..
and a further 25 to download and install updates (and remember that also includes an Office suite and sundry software)..
My son assures me Win 8.1 is quicker... I guess MS have not bothered to do anything about this as they expect Windows to be installed on brand new machines at the manufacturers.
Oh yes, ready to leave the house for an appointment, shut down Windows... Windows is now applying updates... do not turn off machine... paranoid missus will not leave until it is finished, so as to turn off the electricals... ARGHHHHH!!
I've run Vista and Windows 7 for the last several years. If it is taking you more than an hour to patch, you have no clue what you are doing. Yes, if you built an XP image after they'd released SP3 and allowed all of the patches to download from Windows Update, it was going to take a bloody long time. But then, nobody competent ever updated from XP through to current patches via Windows Update. You ALWAYS had an SP3 disk and ran that before connecting to the update server. Same thing for the Office SPs if you were dependent on them.
I bitch as much about Windows as the next guy. But I don't make stuff up about its problems, or blame my incompetence on Microsoft.
You ALWAYS had an SP3 disk and ran that before connecting to the update server. Same thing for the Office SPs if you were dependent on them.
Ohh, and where do I download the Windows 7 Service Pack 2 installer from?
On Debian-based Linux distributions, if it downloads an update, you'll find it in /var/cache/apt/archives until it's cleaned up by a cron job (assuming the cron job exists). I can back that up before a re-install, then do a dpkg -i *.deb on the contents after install to get back the updates I had.
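In shell terms the whole round trip is just this (assuming the cron job hasn't cleaned the cache yet):

    # before the re-install: squirrel away the cached packages
    mkdir -p ~/deb-backup
    cp /var/cache/apt/archives/*.deb ~/deb-backup/

    # after the fresh install: replay them, then let apt tidy up any ordering issues
    sudo dpkg -i ~/deb-backup/*.deb
    sudo apt-get -f install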
What's the Microsoft way of doing this without involving additional servers? I note the instructions for WSUS explicitly require some version of Windows Server. That's great for enterprise that uses it but what about home users? Are we expected to shoehorn a server version of Windows onto a laptop just so we can save time deploying fixes to the neighbour's computer when it needs a reload?
I'll admit I'm ignorant about some aspects of Windows as I rarely use it myself; however, I'm unaware of any patch bundle released by Microsoft that is equivalent to the old service packs (which they no longer seem to produce), or of any tool able to generate one from the currently published lists of patches. At best you have to download each and every one separately, then you spend a good hour just double-clicking on .exe files to install them.
Bear in mind, of course, that I've seen less need to rebuild a Linux box than a Windows one. There's a good reason I can still remember two Windows 95 OEM keys off by heart despite not having used the OS in over a decade. Windows has improved over the years, of course, but sometimes things get so bent out of shape that the only option is to bulldoze the lot and rebuild.
Checkout https://help.ubuntu.com/community/Repositories/Personal
Your own repository, put in there whatever you want to distribute to your Debian-based clients (Debian, Ubuntu, Mint ...). For Suse/Redhat you have similar instructions.
Takes less time to set up than find out the cost of the Windows equivalent.
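The flat-repo version of that is about four commands - server name and paths here are placeholders:

    # on the machine hosting the repo (dpkg-scanpackages comes with dpkg-dev)
    cd /srv/repo                     # directory full of .deb files
    dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

    # on each client
    echo "deb [trusted=yes] http://myserver/repo ./" | \
        sudo tee /etc/apt/sources.list.d/local.list
    sudo apt-get update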
"If it is taking you more than an hour to patch, you have no clue what you are doing"
Please explain?
I have had a fresh install of Vista (and recent installs of Win7) that took hours to get updated, rebooted, updated and that was simply following what MS offered. Are you saying that a consumer OS should need some special magic to make it less painful than just clicking 'OK' on the update option?
With Linux it is usually 10-30 minutes for all patches, then one reboot and that is it, up to date. OK, it might not run certain special applications, but I can get an XP VM I prepared earlier up and running in less than 10 minutes... so still less pain than a typical fresh installation of Windows.
Bah, pass me the can of Tennent's brain damage please...
"So where the fuck do I get the latest SP disc for Windows 7?"
As any fool knows, what you're looking for is WSUS Offline. Works a treat, saves many hours and a lot of bandwidth. It's a lot better than a service pack disk, being totally up to date all the time.
If you actually tried searching for a solution you might just find one out there. It's even free!
Having to reboot is one pain in the ASCII. Having Windows tell you more patches are available after just installing the latest patches is even more of a pain in the rear.
I just built a Win2K12 server the other day. How many patch/reboot cycles did I have to go through before it was fully patched? Three or four, if I remember correctly. (I believe a re-install of Win2K8 requires even more.) How many reboots to patch to current levels after installing Linux or MacOS? One (usually).
There will soon be a post here berating you for not using automated deployments/patching, eg WSUS.
IMHO, WSUS is fine for some places. Others (eg the 'S' in 'SME') really don't have the time/inclination/money/experience to use tools like WSUS.
If MS would get their patching act together there wouldn't be the need for all the rebooting.
My pet PITA is the frequent re-appearance of Slitherlight (Silverlight) in the list of optional patches. Despite hiding it, like a bad penny, it comes back. If someone can tell me how to remove (forever) this bit of nastiness (why do I need Slitherlight on a server that is never going to display anything other than a few help pages, and certainly not via IE {rant, rant, rant}), I'll be very happy.
I wonder why Windows Update has no option "update as long as important patches are available, and reboot automatically when needed". At least when you need to set up a machine and don't have a slipstream installation available, it wouldn't require you to attend the machine for a while.
Relevant article from Raymond Chen:
http://technet.microsoft.com/en-us/magazine/2008.11.windowsconfidential.aspx
In short - Windows can replace files in use, but the potential side effects for cross-process communication are considered too problematic for it to be worth it. I'd be interested to learn how Linux copes with this scenario if anyone would care to indulge me?
The file handles to the .so files are opened at program start-up, so the original file is kept open even though its filename has disappeared from the directory. When the last program using the original file has exited, the file itself will be deleted.
If it's a dynamically loaded library that's repeatedly opened with dlopen() and closed then there might be a problem though. I suppose it's relatively rare.
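You can watch the inode trick happen on any Linux box. A quick demo (file names made up):

    cp /bin/sleep ./myprog
    ./myprog 60 &               # the running process holds the original inode open
    cp /bin/echo ./myprog.new
    mv ./myprog.new ./myprog    # atomic rename swaps in the "update"
    ls -l /proc/$!/exe          # shows ".../myprog (deleted)" - the old copy lives on
    wait                        # old instance finishes; the next launch gets the new file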
If the update includes a new kernel (or possibly a few other critical things), then it needs a reboot to replace the running kernel. There are a few people who have a way of patching a running kernel without a reboot, but it doesn't seem to have caught on in a big way.
For desktop use, most Linux distros send their patches out as they become available (although Ubuntu will batch up the non-security related stuff). This means the patches come out in smaller chunks (keeping in mind that this includes all the applications), and so download and install quickly while you carry on with your work. Even if you need a reboot, the updater will ask you whether you want to do that now or just carry on working and do it later. All in all it's pretty painless.
The package manager can handle third party repos, not just those belonging to the distros. The software provider just has to publish their repo and the user has to add the vendor's repo to their list. There's nothing special about the distros repos, they just happen to be already in the list by default.
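Mechanically, "adding the vendor's repo" is a one-file affair, e.g. on an RPM-based distro (all URLs here are hypothetical):

    sudo tee /etc/yum.repos.d/vendor.repo <<'EOF'
    [vendor]
    name=Vendor applications
    baseurl=https://repo.example.com/rhel6/$basearch
    gpgcheck=1
    gpgkey=https://repo.example.com/RPM-GPG-KEY-vendor
    EOF

    # from then on the vendor's app updates along with everything else
    sudo yum update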
It's an organization issue. I have no issue shutting down my machine every evening and booting it up in the morning. I can easily get back to where I was in a few minutes. It's just a matter of organization - if you have a "clean desk" policy, you get used to storing away what you are working on in the evening and getting it back quickly in the morning. I may choose one tool over another because it lets me access recent data quicker, but it's also a matter of organizing your tools and data; using a computer should not be an excuse to be lazy about that just because it can look for data faster...
Why do I shut down? Because it is more secure (and hibernate is disabled for the same reason - I do not want my memory stored on disk), it is safer (less risk of damage to the machine and everything else if something bad happens in the environment), and because I do not want to waste power if I don't actually need the machine. Also, devices not designed for 24x7 workloads last longer.
Me too. Odd thing is, if you ever saw the inside of my house, you'd swear I couldn't keep a clean desktop. But then my first real job was as a DTP specialist. Files all over the place and so many they didn't fit on the network without compressing the hell out of them. So I had to be organized on the computer just to work. The habits mostly stuck after that. Except for GMail, which encourages you not to file things away.
"Looking at my own desktop, it has been pending updates for about four months now, even though I know full well how important patching is.:"
It was better in the old days when your computer crashed several times a week, relieving you of the necessity of pressing the Restart button....
And by the way, don't you love the apps that tell you you need to upgrade only when you open them to use them?
Paint.Net handles this latter scenario particularly well - it offers the eminently sensible option to update when you close the application (ie after you've done whatever you launched it to do). So the application gets updated when you're not trying to use it, but you don't have to remember to do it yourself. I've not seen anything else which takes this approach unfortunately.
I like Adobe's approach to patching. Yes we have an auto-updater. What that does is to pop a window up that says can I update. You click yes. Then it downloads an updater. You then have to find that, and click on it. That then downloads the actual software. Then a thing pops up trying to get you to miss a tickbox so it can add some crapware from McAfee. Then it FINALLY installs. Then it launches your browser to prove to you it's installed. Then it installs and runs the McAfee security scan you forgot to untick earlier. Heaven knows what that does next.
I love unobtrusive updates...
The startup scripts and their configuration are another "thing" to be managed. If you are undisciplined and start up services by hand, on a reboot you end up with a non-functioning platform. Backing out the patch won't help, because it is the manual service-starting (and possible fiddling) that is untested.
So even if you no-reboot patch you must still schedule down time for a reboot - when you are ready to handle the potential non-availability of the server - in order to test the start-up process.
Yep, more than once I've been bitten having got a machine up, left it running, then it's had to be rebooted and then needs hand-holding once more to get it working properly again.
My workplace's mail server (Zarafa atop Zentyal Small Business Server) is one in my care that comes to mind.
For Windows 8, it's 105 patches, before you're allowed to install Windows 8.1 from the Marketplace. Then a few more after that. Unusually it allows you to do them all in one go, without multiple reboots in between. This will inevitably crash the PC during the patching or reboot though, so best to do it in smaller batches.
The Secunia scanner updater will check for many apps and provide the updates. Free for home use. It is not quick but it does work and updates lots of open source packages.
http://secunia.com/vulnerability_scanning/personal/
What I'm not sure of is how much info they collect on the application usage and then sell it to the world.
"None of us can do much about the need for Windows to reboot."
Yeah, unfortunately. The big reason for this: POSIX (UNIX standards) permits deleting an *in-use* file and replacing it with a new one (the old file's space is not freed up until the last program closes the old file). So, *generally*, Linux only requires restarting to start into a new kernel. (And I saw a patch that allowed going straight from one running kernel to another without a reboot; how, I have no idea - I would think any changed data structures would break it...) On Windows, these files must be updated while the system is shutting down or booting up.
I must get in my 2 cents... doing the regular updates in any given Ubuntu install, I have not had problems with updates introducing bugs. Following Debian's roots, the updates to a LTS (long term support) version are very conservative and mainly fix bugs and security holes.
Ksplice patches kernel memory while the kernel itself is running (kexec, by contrast, boots straight into a new kernel without going back through the firmware). Whether it works or not depends on many factors, including device status. That's why it is an optional kernel update system, and not the default one. It's up to you to assess whether your configuration works with such a system, or not.
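For completeness, kexec is driven like this on a systemd distro - note it still restarts userspace, it just skips the firmware/BIOS stage, so it's a faster reboot rather than a no-reboot patch (kernel version strings are illustrative):

    # stage the new kernel and initrd, reusing the current kernel command line
    sudo kexec -l /boot/vmlinuz-3.16.0-generic \
        --initrd=/boot/initrd.img-3.16.0-generic --reuse-cmdline

    # jump into it cleanly (or 'kexec -e' for an immediate, unclean jump)
    sudo systemctl kexec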
As another post pointed out (with the relevant Raymond Chen explanation), it was a design decision to forbid open files being replaced in Windows, because the risks were judged bigger than the benefits (Windows and its applications usually rely heavily on shared libraries).
But, for example, recent versions of Windows can replace drivers like the video ones without requiring a reboot any longer.
As another post pointed out (with the relevant Raymond Chen explanation), it was a design decision to forbid open files being replaced in Windows, because the risks were judged bigger than the benefits (Windows and its applications usually rely heavily on shared libraries)
Actually, the problem is a concern over ABI changes breaking message passing between threads. The TechNet article linked above explains the problem quite clearly.
The problem is exacerbated by the fact that DLLs in Windows do not carry any version information in the file name (for historical reasons: DOS only supported 8-character file names), so a library is likely to be called something like "foo.dll", and an update simply replaces that file.
On a Unix-like system, it'd be called "libfoo.so.2", where the .2 is the ABI version number of libfoo. Thus allowing multiple parallel instances of the library. The application requests whichever version it was linked against, and so it's possible to have different applications linked against different versions.
Handling message passing ABIs is the developer's problem and in my observation, isn't a "problem" that occurs all that often.
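Concretely, the scheme looks like this on disk - libfoo is hypothetical, but the symlink layout is the standard one:

    $ ls -l /usr/lib/libfoo*
    libfoo.so -> libfoo.so.2.1.0      # dev symlink, used when linking new builds
    libfoo.so.2 -> libfoo.so.2.1.0    # soname symlink, resolved at run time
    libfoo.so.2.1.0                   # the actual current library
    libfoo.so.1 -> libfoo.so.1.0.9    # older ABI kept for binaries linked against it
    libfoo.so.1.0.9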
At the same time that we're trying to make patching painless - which is a very admirable goal, we should also be looking at how to make not patching as inconvenient as it is insecure. Provide both the carrot (easy, even automatic updates that preserve application state) and the stick (mandatory security prompts for everything until you apply the updates).
Address the psychology of not updating technologically - make users hate using unpatched applications.
...but +1 for Secunia PSI! Although it doesn't avoid having to reboot, it does a reasonable job of automatically identifying obsolete applications and managing their updating on Windows PCs for personal use (I believe there are non-free, corporate-suitable versions too), so that likely every time you DO reboot, something gets updated. It has become the first thing I install on anyone's "...it seems a bit slow and it's asking me to do things, would you mind having a look at it?" computer. You can also set it up to update everything and keep out of the way. This generally significantly reduces the frequency of this question being asked.
This article reads like the Windows monologue. I really feel sorry for you, really, but you should just do that thing I have been telling you to do for the past 14 years, switch to Linux.
I read through most comments and yes, mostly the same crap you read every time Tuesday patchday approaches.
Who has NOT experienced the dreaded update loops where Windows re-installs the same updates every time it boots up, forever, until you manually fix the issue... Of course, this means that other important patches that are due to be installed after these patches never get installed. And of course, you only get a warning in Event Viewer.
Imagine this happening on your server: it remains unpatched until you need to reboot it again and somebody notices that the three patches it is installing were already supposed to be installed (good luck with that). OK, you could guess (LOL) from the event log entry, although that just moans about a manifest not being available.
"I just want my computer to work and I don't want to take the better part of an hour saving, sorting, closing, figuring out what needs to be patched, patching, rebooting and then opening everything again."
With most Linux desktops there is a session manager that will remember which programs are running when you log out/shut down and will restore them next time you log in. It's something that has been available for $deity knows how many years.
int main(enter the void)
...