Re: Illegal bootleg of the 'Pirates of the Caribbean' theme
Even copyright infringement is not theft.
Yes, I would totally copy that car. Then drive away in it and feel pretty smug about it.
284 posts • joined 3 Mar 2017
A quick Google search suggests that VS Code uses upwards of 700 MB, and a few GB in the more extreme cases. That puts it in the range of e.g. Java-based monstrosities like IntelliJ and Eclipse.
Meanwhile Notepad++ with a few dozen documents open and a stack of plugins installed is using less than 60 MB for me. That's room for over ten Notepad++ instances in a single VS Code footprint, apparently. Add a few MB for the mintty terminals and off you go. Ditto with Vim and a few Bash terminals on Linux/BSD.
VS Code doesn't seem very 'lightweight' to me. What's the appeal here?
When Scaled Composites flew the first-generation version of this craft in 2004 with SpaceShipOne, it was a pretty big achievement that had the world talking about space tourism and Space Shuttle-like access to orbit, but for the common man. Now, 17 years later (do you feel old yet?), it feels like things have changed dramatically.
Yes, after watching the livestream yesterday of Branson et al. skirting right below the Kármán line, I can see how this could be a fun joyride for anyone with the ~250,000 Yankee bucks that it'd cost per seat. As Tim Dodd (Mr. Everyday Astronaut) pointed out, the hybrid rocket engine they're using isn't particularly clean, so hopefully that can be improved somewhat.
And yet, it feels like something that would have been more relevant a decade ago. When SpaceX is working towards launching paying crews to the ISS, as well as on a multi-day sightseeing excursion in LEO in a Dragon capsule, the main feeling I'm left with is one of being underwhelmed.
It's probably a good thing that I feel this way, as it shows just how much and how quickly SpaceX has upset the status quo with their rapid development strategy, which has left all their competitors choking in the dust. Ultimately it will be us, the 'common man', who benefit from this rapid progress, and Branson got his joyride.
Win-win, basically :)
(Anyone tuning in for Jeff "Who?" Bezos' ride?)
I do not use Teams, and by the gods I pray that I never have to. I have used Slack on occasion, and it has given me a glimpse into the madness that lies that way.
Having Teams or Slack or other such tools essentially forced on you by your employer because of... reasons that may involve kickbacks or glorious ignorance is bad enough. Having every single system out there come with what is a work tool preinstalled, prominently placed where you are unlikely to miss it, is borderline dystopian.
Heck, even back when MSN Messenger was still alive, MSFT didn't dare to throw it into people's faces like that.
What's an OS? An operating system that is application-neutral and just happy to accept new drivers and run applications, because that makes you happy as the user? Or a cynical intrusion into your private life that requires you to log into the MSFT mainframe using your MSFT-provided ID before logging into your MSFT Teams account for the working day?
Yes, I am one of those Windows 7 holdouts, why do you ask? :)
The ISA is only a small part of a processor's performance. As China has demonstrated with its MIPS and x86 processors so far, it simply does not have CPU designs refined enough to compete with offerings from Arm, Intel or AMD. Slapping another ISA onto the same anemic core isn't going to make it magically run faster, especially when RISC-V and MIPS are essentially identical RISC ISAs.
This is just a Loongson-style CPU with a slightly different ISA.
The best code bases follow the KISS principle. Adding another language for no apparent reason and the whole tooling that comes with it introduces an exciting new level of potential issues. Why would you split a code base up into multiple languages, anyway?
Normally you'd pick a language for a project and stick with it, rewriting it later on if it turns out to have been a poor choice. I don't mix C++ or C with Ada unless I have very good reasons to do so (compatibility, where the Interfaces module in Ada can be useful), but the better way is still to bite the bullet and rewrite the whole thing in Ada, so that you're not mixing and matching different paradigms and toolchains needlessly.
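As a sketch of what that compatibility boundary can look like from the C side (the names here are hypothetical, and the Ada binding shown in the comment is only rough; Ada's Interfaces.C package provides the matching types):

```c
#include <stddef.h>

/* Hypothetical C routine exposed to Ada. On the Ada side it
 * could be bound roughly like this, via Interfaces.C:
 *
 *   function Checksum (Data : Interfaces.C.char_array;
 *                      Len  : Interfaces.C.size_t)
 *                      return Interfaces.C.unsigned;
 *   pragma Import (C, Checksum, "checksum");
 */
unsigned checksum(const char *data, size_t len)
{
    unsigned sum = 0;
    for (size_t i = 0; i < len; ++i)
        sum += (unsigned char)data[i];
    return sum;
}
```

Which illustrates the point, really: every such binding is one more seam where two toolchains and two sets of conventions can disagree.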
Since Linux won't be adding Ada support any time soon for whatever random reason Linus dreamed up, I guess that means that if I feel the itch to contribute to an OSS OS, I might as well pick a BSD or something fun like ReactOS, who don't seem stuck in monolithic kernel land, either.
One may hope that one day humankind can be just that, without artificial divisions along imaginary borders and so on, much like in Carl Sagan's 'Pale Blue Dot'. Who, after all, when looking upon Earth from space for the very first time, would have even the slightest inkling of this 'human history' and these 'traditions' which we so desperately cling to?
Hearing ISS astronauts talk about their experiences during spacewalks, when they realise that between them and the infinite reaches of space there is nothing but this spacesuit, created by humankind's brightest minds and most skilful hands, as they float hundreds of kilometres above the pale blue surface of Earth. Only from there do you realise that you cannot, in fact, see countries or borders.
Just a beautiful, unique planet whose biosphere is all that protects humankind from oblivion, much like an astronaut or taikonaut or cosmonaut's spacesuit. The fragility and beauty of something that is apparently so hard to appreciate when standing on the Earth's surface.
Is MSFT expecting that its victi^Wusers have already forgotten about the Aero Glass experience it touted from Vista onwards, which gave Win7 its smorgasbord of translucent elements, rounded corners, drop shadows and the ability to actually identify where an individual window begins and ends? Because to me it sounds like they're beating a miserable retreat from the 'Modern UI' AKA 'Metro' experiment they embarked on with Win8 and rammed down people's throats throughout Win10.
Imagine, an OS that's both pretty and useful. That must be something that will take Neo-MSFT some time to get used to, I'm sure.
Not that any of that matters to me until MSFT sorts out the system requirements for Win11. With them dropping the 'soft limits', my CPU (6700K, OG Skylake) isn't even remotely on the list and TPM modules are sold out until roughly 2024 anyway.
I'll just be forced to keep using Win7, which is totally adequate for my needs and already looks very much like Win11, while not forcing Teams and an online MSFT account onto me. The horror.
While my 2015-era desktop system could be coaxed into running Win11 (possibly...) by the addition of a TPM 2.0 module - all of which are sold out for the foreseeable future - I am still not feeling the love here.
I use Windows 10 on my (2019-era) laptop and will likely upgrade it to Win11 just because Win10 is so terrible, but for my desktop PC I'm not really seeing the point. To go from the last Windows OS that is literally just an OS and not a way for MSFT to experiment with useless features and sell its online services, to something which is actively user-hostile is not an attractive proposition.
Perhaps ironically, I was one of the Win2k holdouts when XP had been out for a while already and only jumped in around SP2 or so. XP to Win7 was a bit rough in some ways, but everything I didn't like there (new start menu mostly) was fixable.
Yet after using Win10 on my new (SSD-equipped, tricked out) laptop for a few years now, I won't be touching that with a barge pole on my desktop system, and it doesn't seem like Win11 is going to change that equation. Is Windows 7 the last of the 'just an OS' from the MSFT stables? That'd be somewhat tragic.
Unlike polysilicon for solar panels, US nuclear plants can be made with 100% US concrete and steel, and fuelled with uranium fuel from conflict zones like Canada and Australia.
The required amount of controversial rare-earth minerals is also far lower this way. Nuclear power is better overall for the environment, better for people, terrible for authoritarian regimes.
And with new Made-in-the-US reactors like TerraPower's Natrium, there won't even be spent fuel for people to get upset about any more, while massively reducing the need for uranium imports.
Assuming that the ISS will truly be deorbited a few years from now, it seems rather obvious at this point that this would mean having only a Chinese space station in LEO by the end of this decade. With likely a Chinese moon station to follow by the next decade. Meanwhile the US is wasting NASA's money on SLS and Russia's own space station seems unlikely to materialise.
This would mean a near future in which China would be essentially sharing space with SpaceX, while cordially inviting ESA and Roscosmos astronauts/cosmonauts over for a jolly good time at their facilities. But not NASA astronauts, as US Congress has decreed that this is inconceivable.
Love it or hate it, China's highly focused and well-funded space programme provides a glimpse into what NASA (and ESA...) could be today if they had the same level of vision and funding behind them. Something which SpaceX is hammering home as well. Just having money isn't enough when you're going to waste it on the retirement plans of congress and other critters.
Considering that Windows 7 has all of the rounded corners, drop shadows, transparency and other fluff that people are gawking over right now with Win11, along with an actually themeable UI that's not the trainwreck of inconsistency that is Win10's burning pile-up of the Settings app and the Control Panel, the question that comes to mind is whether this would finally motivate me to upgrade from Win7 on my main desktop to Win11.
Don't get me wrong, I got Win10 on my laptop (thanks, UEFI), and I hate every moment of it, even after doing my best to fix some of the worst flaws in its interface. When, after a recent service pack update, Win10 then also has the nerve to try and get me to 'upgrade' to an online MSFT account instead of just a local account... yeah, that's not me using the OS, or it enabling me to use the system.
That's me and this 'OS' locked into an eternal war, in which it attempts to coerce, beat and otherwise abuse me into giving in to things I'd otherwise not consider. When I put that next to my Win7 desktop, which is... just a Windows OS that runs applications, and cannot even do this 'online account' thing or pull random system updates that can wipe one's hard drive contents... yeah, I'll stick with Win7.
Maybe if Win11 turns out to be just an OS again, and not some trojan horse for MSFT's web services. Maybe then it might be a consideration, but I somehow doubt that's going to happen.
I had the honor of sprucing up the HVAC system at the office of a former employer. That system had a lot of problems: drains clogged with construction debris, defective water valves for the fan-coil units installed in the ceiling, thermostats mounted on a dark wall that the sun would blast with IR for hours on end, and thermostats mounted next to the entrances of the large central room, in blissful ignorance of the conditions near the windows, where anyone seated there would be blasted with ice-cold air all day long.
Made me think that most HVAC systems are put in as a kind of placebo in the best case, and as a human rights violation in the worst case.
So what advantages does any of this have over OpenRISC (open ISA & open reference design), OpenPOWER (open ISA, multiple reference designs based on older Power silicon) or OpenSPARC?
These glowing self-congratulatory pieces are getting somewhat tiresome, to be honest. Where's the journalism?
Might this mean that in the US, Facebook would finally be coerced into giving two figs about privacy and data security, instead of harvesting its victim^Wusers for every scrap of personal information and selling it to the highest bidder?
Seems like the ideal environment for a 'foreign adversary' to tap into the brain space of a nation's citizens. See e.g. Russia's influencing of public opinion in the US ahead of the 2016 and 2020 presidential elections. Going by the listed criteria, this would make FB a prime threat to the US, whereas TikTok, WeChat and others seem rather innocent in comparison.
Between OpenRISC, OpenSPARC and OpenPOWER the number of open and royalty-free ISAs is pretty large. OpenRISC even comes with a fully open source reference CPU design, unlike RISC-V.
This article read more like a sponsored piece by SiFive and other RISC-V parties with vested commercial interests.
I have been using Notepad++ ever since Crimson Editor stopped being developed, and haven't regretted it for one moment. While arguably NP++ has some rough spots, it does an amazing number of tasks really well, with minimal CPU & RAM usage. I generally use NP++ in split-screen mode, with side-by-side editing of documents. Together with the Explorer plugin, it's pretty much my go-to for anything on Windows that needs editing or programming.
Over the years I have drifted away from behemoths like Visual Studio (VS 2012 was a kick in the teeth) and don't get me started on monstrosities like websites pretending to be real desktop applications, like VS Code. These days I find myself comfortably switching between NP++ on Windows and a Vim session in a (remote) Linux Bash session.
It's maybe not flashy or cool, but when you just want to get stuff done, you go for something that does the job without causing a fuss, and that's definitely what NP++ is. It's not supposed to be exciting, because it just works and has a load of plugins that make it work even better :)
After the whole Code of Conduct trash fire a while back, and the senseless firing of a long-time moderator during that period, many figured that StackOverflow/StackExchange had already changed in an irreversible way, and not for the better. This purchase, if it goes through, would basically cement the sense that it's time to move on.
Not sure it's as easy to move on from SO/SE as it was to move away from Freenode, though. Anyone got any good programming forums they'd want to recommend?
There are two hugely exciting aspects about this Natrium project: the first is that it comes with its own energy storage (molten salt, much like with e.g. CSP), the second is that it's a fast neutron reactor. For those unaware, a fast neutron reactor employs fast neutrons (instead of slower, 'thermal' neutrons in LWRs) which are able to fission not just U-235 fuel like most fission reactors, but also transuranics and actinides.
This is similar to Russia's BN-series of reactors, of which the BN-600 and BN-800 types are currently being used to test an aspect that's central to Russia's nuclear power program: closing the uranium fuel cycle. This involves processing spent fuel from conventional LWR (thermal neutron) reactors and using it as MOX fuel for full burn-up. This would leave just some short-lived isotopes at the end, with no significant levels of uranium (U-235, U-238 or otherwise), plutonium isotopes, etc. remaining.
In essence, this Natrium reactor has the potential for not only lightning-fast load following due to its molten salt buffer, but also the ability to use up all the spent fuel in the US that's stored around the country.
My primary browser is Pale Moon, which in terms of UI and add-on features is roughly on par with Firefox 36 (LTS). For me, getting out of the rat race of Mozilla's 'redesign everything' approach, with more and more useless features (and telemetry) being crammed into Firefox, was the primary reason. As far as I'm concerned, Firefox had already stopped being a viable browser by the time Quantum rolled around.
I like Pale Moon, as it has all the features I want and need, a UI that works for me, and support for the add-ons I care about. I also use Basilisk as a secondary browser, and Opera when I need something that absolutely has to work with certain websites (hi, Netflix). Even so, Pale Moon has given me a good insight into how it's Google (with Chrome) that's leading the 'new web standards', usually when some basic functionality on sites like GitHub breaks.
Since Mozilla is essentially a wholly owned subsidiary of Google at this point, I fail to see how using Mozilla is somehow sticking it to Google. If anything, Brave, Opera and others (even MSFT) have been giving Google the finger, such as with FLoC. If everyone who uses Chrome today were to switch to Edge, Brave, Opera and Safari, imagine the pressure that would put on Google, as their Chrome browser could no longer be used to strong-arm new 'standards' into existence.
My go-to solution on Windows (my primary development platform) is MSYS2, with the Pacman package manager (same as Arch). This, together with a Manjaro (friendly Arch distro) VM, gives me a very coherent way to develop both for Windows and Linux, without having to resort to hacks like WSL(2).
On Windows I prefer to get an EXE or MSI installer package for large software packages (like LibreOffice), but I like using MSYS2's pacman to get libraries and tools I use during development, also because of the Linux-style filesystem layout this gives me in addition to a Bash shell.
I think I may have a USB-C port on the laptop I bought in 2019, but I'd have to check to make sure as I haven't used it yet. I'm honestly more bothered about having enough USB-A ports on my laptops so that I can actually plug in USB sticks, mice and external keyboards without dumb dongles.
I think I'd prefer a standard barrel-jack configuration for laptops over using USB-C with USB-PD, but whatever incinerates your USB cable, I guess?
As a product of the original internet, IRC easily routes around damage, whether it's a server instance going offline, or some dystopian tin-pot dictator taking over a network. What's happening to Freenode isn't the first time, nor will it be the last time that a network keels over and dies for such a reason.
Over the past days I have seen a number of Freenode channels I'm in pack up and leave, and even large channels like ##electronics are slowly bleeding users, even as the new channel on Libera grows steadily. I'm currently on Freenode, OFTC and Libera, waiting for the day Freenode offers me no reason to hang around any more.
I hope this Andrew Lee jerk enjoys having a minuscule IRC network with almost no users on which to act out his dystopian fantasies, and that this Christel person who 'sold' Freenode is never let near another IRC network again. She should be k-lined out of sheer principle.
Whereas up till Windows 7 rolled around, Windows' UI seemed rather consistent, following a 'lessons learned' evolutionary path, Windows 8 was probably the point where MSFT's management got a taste for 'revolution' over 'evolution' and figured that they could take the industry by storm, just like Apple. Thus we got Modern UI ('Metro') and 'touch everywhere' Windows 8: no Start button, a flat, dystopian nightmare with zero regard for, or lessons learned from, decades of Windows UIs, and a literally half-aborted 'Settings' 'App'. Because everything is an App now, fellow kids~
The development of end-user software should first and foremost be driven by what those end-users want and need. MSFT can then take this feedback and incorporate it into their next release, maybe along with a couple of original ideas of their own. Whatever doesn't work out (hi, Clippy) then gets tossed the next release cycle. Lather, rinse, keep end-users happy.
I was happy to move from Windows version to Windows version. Win98 SE over Win95? Win2k over Win98 SE? Heck yes. WinXP improved a lot of UI stuff over Win2k, and Win7 (we don't talk about Vista) improved on that again, providing a user experience based on the accumulated experience of decades.
But it seems that MSFT is increasingly less interested in such a development path. They want to surprise the world, apparently, just like their secret love: Apple. The same Apple that can apparently do no wrong. Put a charging port on the wrong side of a mouse? Call it innovation! And yet Apple's macOS doesn't stray that far from its classic Mac OS roots, back in the PowerPC days and before.
Perhaps ironically, it appears that while Apple looks revolutionary, they're actually quite evolutionary, avoiding major disruptions and smoothing things over in a transparent manner when disruption is inevitable. Meanwhile MSFT has shifted from that same model to one that is happy to alienate customers with every new release and increasing patch level.
This is not how you build a sustainable business, MSFT. Just FYI.
While the Soviet Union (in 1971) and Europe have attempted to land equipment on Mars before, those landers didn't survive long enough to be called successful missions. That makes China's triple whammy of a Mars orbiter, lander and rover seemingly working out so well even more impressive.
Also happy for NASA's two rovers & drone to have a bit more company on that planet now. Must be getting old, cruising past all those defunct rovers, landers and craters from unsuccessful landing attempts.
Ultimately this shows what the end game for 'browsers' will be, in so far as they aren't yet: a single runtime that provides access to the hardware, including 3D acceleration in the case of WebGL. This effectively means that 'web applications' can be written in a manner that's independent of the underlying OS, albeit with significant limitations and loss of performance. This is something which e.g. those who have experience with PhoneGap and Electron can attest to.
While this may seem like an obvious benefit, to have this level of portability, there are some major complications. The first is that HTML & CSS obviously weren't designed with this kind of abuse in mind, and the split nature of rendering engines between being a presentation engine and a full-featured CLR (or JSVM, if you wish) causes major optimisation issues.
The second problem is that of security. WebGL allows essentially unrestricted access to one's GPU driver by any random bit of code that runs in the browser. Whereas the .NET CLR and JVM have certain restrictions in place, as well as the benefit of only running local applications, browsers are designed to run every bit of application code they come across while browsing the WWW.
None of this feels right, and Google's dominance over this whole 'common web runtime' (CWR?) is worrying at the very least.
Just set up Bitcoin mining operations at all the nuclear plants that are being turned off because fossil methane plants are currently cheaper. That keeps those plants running, makes Bitcoin a lot greener, and preserves those NPPs for when they're needed again to decarbonise the grid.
An important distinction that situations like these highlight is between source code being open and freely available, and the value of the work being put into said source code. For some reason people seem to mix the two up, even though closed source projects can just as well be unpaid hobby projects, and open source projects can be driven almost entirely by commercial interests.
An interesting example is the Linux kernel, which by Linus' own admission could not have grown to where it is today if it weren't for countless paid employees at companies around the world investing time and effort into developing the kernel and its associated infrastructure.
When the expectation is that an open source project must by definition be developed by unpaid volunteers, then we end up at what is basically exploitation-ware. If a person is not duly compensated for their time and effort in some fashion, then this amounts to exploitation at best, and slavery at worst.
As someone who runs a couple of OSS projects which are moderately successful, I am well aware of the pressure that a project's success creates, but also the dangers that come with it. People may like the project, and gladly use it. Yet one should never lose sight of what is most important there: one's own happiness and health.
If you're not being paid for work, it's a hobby. Hobbies by definition have to be fun and bring some sense of reward. Nobody can force you to 'do' a hobby, because at that point it is work, and the person who pushes you to do such work should compensate you for it.
Which then raises the interesting question of how much a project is truly worth, and how much of the modern technology stack is based on such cynical exploitation of what are or were essentially hobby projects?
I think the elephant in the room with Android (and iOS) devices is the fact that they are not generic computing platforms like e.g. a PC. You can take a 2005-era or even 1990s PC or laptop and you can install on it whatever you want. Heck, write your own OS if that's your thing.
Meanwhile, smartphones are locked-down platforms where 'flashing firmware' is generally an arcane procedure that requires 'unlocking' the bootloader and jumping through various other non-intuitive hoops, hopefully ending up without a bricked device at the end. While some are brave enough to install e.g. LineageOS on their older Android device, it's neither enjoyable nor guaranteed to result in a functional device, even if nothing breaks.
While Intel and Apple are pushing their whole 'signed bootloader' nonsense, so far end users on these platforms still have a modicum of freedom, allowing them to keep using a platform long after the manufacturer has given up on it. See e.g. Apple's PowerBooks & MacBooks. Meanwhile, pity anyone who is still using an iPhone 4 or 4s.
Another consideration is that of e-waste and upcycling. Although those smartphones often have SoCs in them that are pretty darn powerful, they're coupled with pitiful cooling solutions (if any) so that by the time they are tossed, those SoCs have barely had time to stretch their legs. Imagine if every one of those millions of smartphones that get tossed each year could be upcycled into a decent desktop/laptop/RPi-like toy system for general use?
A lot of this mashes up with the whole Right to Repair movement. If you can repair a device, you can likely also reconfigure and upcycle it. And everyone wins, except for the investors as the piles of e-waste grow ever smaller.
If AMD buys Xilinx, and NVidia gets approval to buy Arm, then NVidia only has to buy up Lattice and we'll have three major companies producing CPUs, GPUs and FPGAs.
That can be either a terrific or a terrible thing, depending on your perspective. Regardless, AMD buying Xilinx seems like a done deal, following Altera's acquisition by Intel. After AMD's proclamations back in the K8/K10 days about running FPGAs in sockets alongside server CPUs - which never really became a thing - this might lead to a revival of that concept.
In Scott Manley's video on the topic, he mentions speculation that the intention with this LM-5B launch was to do a controlled deorbit. As China showed with the controlled deorbit of their previous space station, it's not as though they cannot do it or have no interest in doing so.
It's quite possible that the LM-5B in question had the necessary equipment to do a controlled deorbit, but that it didn't work. We'll never know, however, as the Chinese state space program (including Long March) is highly secretive. Maybe with upcoming launches we'll be able to tell what their intent there is.
Since Audacity is no longer supported on Windows 7, I'm only using a crufty old version and was already looking for alternatives.
I get the 'it costs resources to support a platform' part, but I cannot imagine that dropping a platform the moment it's in extended support is very reasonable. Also looking at KiCad here.
What other languages have enforced such breaking changes? When I look at 30+ year old languages like C, C++, Ada, VHDL, COBOL, FORTRAN, etc. they somehow managed to get refinements and new features continuously over that time period, without abandoning existing codebases, and forcing developers to port over big stacks of code.
Heck, one can still compile pre-ANSI (<1989, K&R) C today in any modern compiler. Had to do that recently with the Dhrystone benchmark, for which the code predates ANSI C. Integrates fine in a C++17 codebase, too.
From my perspective as a developer - whose main Python experience admittedly consists of porting Python code to C++ and debugging Python-based build systems - Python's decision to stab 20 years of Python codebases in the back was an exceedingly poor one, and it may take at least another decade to get over it.
Based on the chatter among the tank watchers so far, it appears that the landing flip was begun a bit earlier than before, presumably to give the craft more time to iron out any issues. Even so, it landed right on the edge of the landing pad. Any closer to the edge and it might have toppled over.
Still, this landing shows that SpaceX wasn't exaggerating the new & improved features of SN15, which warranted scrapping SN12 through SN14. The entire flight, from what we could see (and mostly hear), was controlled, and the landing flip & burn was textbook. It relit its three engines without sputtering or flaming out and then, for the first time, landed with two engines active all the way down, for the slowest landing yet.
In the case of SN10, it nearly made it as well, but one of the engines failed to re-ignite, causing the craft to come in too hard and too fast, which ultimately damaged and ruptured internal structures, causing the ground-based RUD. After the low header tank pressure issues of SN8 (with engine-rich burn as a result), the re-ignition issues of SN9 (fail and swinging through the landing flip), the re-ignition issues of SN10 and the explosive turbo pump failure of SN11 (Starships In The Fog), I think that the SpaceX engineers had a fairly good idea of all that could go wrong.
So for the low-low price of four full-scale prototypes, they got a lot of crucial information that allowed them to fix issues in the Raptor engines and the Starship prototype as well.
Fail early, fail fast, then clear the debris field and try again. It's not crazy if it works, and you're based in a county which is more than happy to allow you to mess around like that :)
Managing remote computers is a pretty common thing in today's IT world, whether it's some industrial kit or a stack of servers running in a data center half-way across the country or globe.
Those happily have the option of a call-of-shame to a colleague or support person whenever the gear in question does not come back online in a healthy state after a reboot. Here it sounds like the nearby rover is the closest thing that JPL's engineers have to a 'call support to push the power button on the server' fallback :)
I found the aforementioned Jenkins CI setup with DSL automation plugin for setting up tasks from a Git repository (task updated when pushing a new commit) to be a reasonably effective setup.
Not that it was perfect, mind you. As with any CI system, there has to be an easy way to test and validate a CI task before adding/updating it, which was an unsolved problem then.
Have a staging/production division of the CI system as well, perhaps?
After a run-in with YAML for CI purposes in the context of Travis, I can't argue with the view that the use of YAML has all the appeal of sticking one's hand into a meat grinder. Heavens know that after the Nth cryptic YAML parse error from somewhere in the build pipeline I wanted nothing more than to rage quit.
While I had some not unpleasant experiences with the Groovy-based DSL for an automation plugin with Jenkins CI, I'm not convinced that Kotlin is more suited somehow. To me it feels more like a 'We use Kotlin because we created it' kind of deal, but that's something one has to find out in practice, I guess.
Either way it's not YAML, which on its own is already reason enough to use it :)
Comparing the sardines-in-a-can configuration of a Soyuz mission with the seas of space and elbow room in a Crew Dragon capsule, the latter must feel positively luxurious to those who have had the Soyuz experience before.
Not to mention the ability to seat four and all their carry-on luggage. No wonder there was talk of a 'Gray Dragon' version that'd allow for travel to the Moon. Pretty sure they got more space than the astronauts on the Saturn V moon missions. Plus touch screens :)
As bad as things got back in the Internet Explorer days and MSFT & Netscape rather forcing their idea of standards on the Web, neither of them were an advertising giant like Google, with a fierce interest in milking and selling your (tracking) info.
FLoC literally has zero uses for website owners (beyond selling ads) and end users (beyond being targeted with ads). With the outrage that e.g. China/Huawei got regarding their 6G plans (user connection tracking and severing if 'in violation'), I would argue that Google's FLoC is from the same dystopian scenario, just with a more ad-related flavour.
Is it better to have one's every move tracked on the Web for dystopian nationalist totalitarianism reasons, or so that one's movement data and browser history can be sold to anyone with an interest in targeting you and you specifically with ads and worse? I'd argue neither.
In short, FLoC takes what could have been benign (third-party cookies, should be disabled in every browser) and turns it into the kind of tool which the Stasi would proudly have used if such technology existed back then.
Maybe I'm just a paranoid nut, though. I run a non-Google browser (Pale Moon) with uBlock Origin, NoScript and more addons that nuke anything that isn't pure HTML/CSS. On the average site I can leave the JS sources for half a dozen clearly ad-related domains disabled without affecting site functionality. At the very least it makes the browser run a lot snappier :)
Everyone seems to be blaming crypto miners for the GPU shortage at least, with Nvidia even releasing dedicated 'mining cards' in a PR-heavy attempt to stem the shortages.
Meanwhile it seems the shortages are mostly due to demand for PCs and new laptops having gone through the roof. Who could have figured that happening during a pandemic that's confined people to their homes and boosted work-from-home like never before? Totally out of left field, that.
On the bright side, things appear to be moving in the right direction now at least, with SpaceX leading the way towards fully reusable rockets and others working on extending the life of satellites, or at least moving them into safe graveyard orbits.
If SpaceX's Starship concept works out, then that'd be the first time since the Space Shuttle that a satellite can be launched or even recovered/repaired without adding to the pile of old hardware up there in terms of discarded second stages and what not.
As for cleaning up the current trash in orbit, the main problem is that of velocity and thus kinetic energy. When even a fleck of paint is zipping along at a few km/s, simply putting something in its path won't do more than cause, worst case, a very spectacular explosion, or best case a puncture hole and a paint fleck that's now hurtling either into space or towards the atmosphere.
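To put some rough numbers on that kinetic energy point, here's a quick sketch. The masses and closing speed are illustrative assumptions, not measured values:

```python
# Back-of-envelope kinetic energy of small orbital debris.
# Masses and closing speeds below are illustrative assumptions.

def kinetic_energy_joules(mass_kg: float, speed_m_s: float) -> float:
    """E = 1/2 * m * v^2"""
    return 0.5 * mass_kg * speed_m_s ** 2

# A 1 g paint fleck at an assumed ~10 km/s closing speed in LEO:
fleck = kinetic_energy_joules(0.001, 10_000)   # 50,000 J
# A ~10 g rifle bullet at ~800 m/s, for scale:
bullet = kinetic_energy_joules(0.010, 800)     # 3,200 J

print(f"fleck: {fleck:,.0f} J, bullet: {bullet:,.0f} J "
      f"(~{fleck / bullet:.0f}x the bullet)")
```

Even with generous rounding, the paint fleck carries more than an order of magnitude more energy than a rifle bullet, which is why 'just put something in its path' doesn't work.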
While it may seem hopelessly starry-eyed to believe all the hype about Starlink, one has to admit that wholesale pessimism is hardly warranted either. It's essential to remember that SpaceX has launched only a fraction of their planned satellites so far, and only the last handful actually carry the laser links for inter-sat communication. These are still at the 'testing phase' level, not unlike the first attempts to lay telegraph cables across the Atlantic Ocean.
Lots of companies back then tried to make the 'transatlantic cable' idea stick, while others scoffed and just sent letters by ship as had always been done. Plenty of transatlantic cable companies went bankrupt and service was incredibly spotty for decades, yet somehow the issues were resolved one by one, and today transatlantic fibre-optic cables quite literally form the backbone of modern society, along with communication satellites.
As an engineering challenge, Starlink is fairly straightforward, as we didn't have to invent new technology to make it possible. It's just that nobody had tried to use all of these technologies, like Ka/Ku band transceivers, ion thrusters and laser-links in tens of thousands of sats zipping at only a few hundred km above the Earth's surface before.
Will Starlink work as advertised? Probably. They're realistic about how much bandwidth they can provide per cell (area covered by at least one satellite at any given time), and each new generation of Starlink satellite is likely to increase the bandwidth even more as lessons are learned and applied, and new manufacturing techniques make certain optimisations possible.
At the end of the day Starlink won't be the end-all, be-all of internet access. But providing solid broadband coverage to areas where wired broadband doesn't exist in any acceptable fashion? That seems eminently realistic.
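As a back-of-envelope sketch of that per-cell bandwidth point: the per-satellite capacity, subscriber count and contention ratio below are illustrative guesses on my part, not SpaceX figures.

```python
# Rough nominal per-user bandwidth in one Starlink cell.
# All numbers here are illustrative assumptions, not SpaceX figures.

def per_user_mbps(sat_capacity_gbps: float,
                  subscribers_in_cell: int,
                  contention_ratio: float = 20.0) -> float:
    """Shared capacity with a contention (oversubscription) ratio:
    not every subscriber saturates their link at the same time."""
    capacity_mbps = sat_capacity_gbps * 1000
    return capacity_mbps * contention_ratio / subscribers_in_cell

# Assume ~20 Gbps usable capacity per satellite and 5,000 subscribers:
print(f"{per_user_mbps(20, 5000):.0f} Mbps nominal per subscriber")
# prints: 80 Mbps nominal per subscriber
```

Under those guessed numbers a cell supports solid-broadband speeds, and every knob in the formula (satellite capacity, cell size, contention) improves with each hardware generation.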
Real shame that Rust's pervasive type inference lets types vanish from sight, inviting the kind of logic errors one associates with Python's dynamic typing.
Not to mention its more abstract syntax, a complex alternative to OOP (traits and lifetimes) that seems to throw all beginners for a loop, and the fact that it violates basically every tenet of the Steelman requirements that underlie truly safe languages like Ada.
Basically, Rust is unlikely to be approved for DoD safety-critical work any time soon, whereas Ada has been since the 1980s, and C++ for over a decade now.
Sometimes I wonder how us old fogies ever made it through the internet of the 90s and 00s in one piece without the Invisible Hand of Privacy guiding our every move, or alternatively beating us into submission if we dare stray off the cordoned-off path.
Oh right, I think they told us to not give out any personal information and always use a nickname online. Don't talk to strangers, basically.
When doing literature research, I rarely use Sci-Hub. Most of the time I can get the paper directly from the researchers' ResearchGate page or (department) homepage. Either Google Scholar or directly googling the paper's title will more often than not get one the final version of the paper, with Nature markings or equivalent prominently visible.
Other times I end up using the arXiv version, which usually does not differ much from the final, peer-reviewed version.
I'd argue that there's incentive on the side of the researchers to get their paper out in public, as ultimately a researcher's career lives or dies by the number of citations their papers get. Restricting access to only (some) students and academics (assuming their university even has access to that particular journal), would seem rather counter to that notion.
That said, I think that most academics can probably agree that torching down Elsevier would massively improve their respective fields.