Indeed. Certainly exceeded my expectations!
Reckon it is the pictures of monkeys or frogs that make up that 5%? Or perhaps even those coordinates (virtual land?) in ratty Unity3D games.
> Stallman's baby still not ready for prime time
Actually, upon reflection, it kinda is "ready". Think back to the situation then: if we didn't have Linux and BSD was still faffing with licensing, we would have absolutely jumped onto GNU/Hurd as it is today. It isn't missing so much as to render it a non-starter.
So really, it is great to know that we have BSD and Hurd ready to pick up the slack if Linux ever became problematic.
Where I used to work I wrote a C++ clone of the Unity 4.x API (so we could target the pluginless web via Emscripten a full 5 years before Unity): https://github.com/osen/mutiny
I have just been told by some of the developers still working there that they have been maintaining / improving an internal copy over the years and are strongly considering going forward with it.
Quite cool news!
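For flavour, a minimal sketch of what a Unity-style component API can look like in plain C++ (hypothetical names throughout; not the actual mutiny interface):

    // Hypothetical Unity 4.x-flavoured component API; illustrative only.
    #include <memory>
    #include <string>
    #include <vector>
    #include <iostream>

    struct Component
    {
      virtual ~Component() = default;
      virtual void update() { } // called once per frame, like Unity's Update()
    };

    struct GameObject
    {
      std::string name;
      std::vector<std::unique_ptr<Component> > components;

      template <typename T>
      T* addComponent()
      {
        components.push_back(std::unique_ptr<T>(new T()));
        return static_cast<T*>(components.back().get());
      }

      void update() { for(auto& c : components) c->update(); }
    };

    struct Spinner : Component
    {
      void update() override { std::cout << "spinning...\n"; }
    };

    int main()
    {
      GameObject player;
      player.name = "Player";
      player.addComponent<Spinner>();
      player.update(); // one iteration of the game loop
    }

The same source can then be fed through Emscripten's em++ to target the pluginless web.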
> beware the open-source developer bringing gifts
What about Unity was ever open-source? Their test suite and a few sample projects perhaps.
I hope those Adobe Flash developers who jumped onto Unity and now have to jump onto the next thing *actually* choose open-source rather than a strong marketing department this time!
I wish them luck! ;)
Very cool.
My entire PhD thesis was looking at ways of passing graphics calls out from the VM to the host and injecting the rasterised image back:
https://eprints.bournemouth.ac.uk/36043/1/PEDERSEN%2C%20Karsten_Ph.D._2021.pdf
If a modern "old" GPU was available to just do the right thing on older hardware, I would have just used that haha!
> Weird, I know, but nobody can revoke it, and there's no question of licensing. If it doesn't survive an EMP strike and you can't hold it in your hand, you don't own it.
I am of the same opinion. However, I assume you don't mirror the entire package repository for the Linux distro you are running, do you? Ultimately, slurping from package servers has the same issues (other than revocation): unless you have it on your physical disk, it could be made inaccessible at any time.
> which keeps the screen working by disabling power management.
Hah, yes. Full brightness blasting out the LCD, generating loads of heat via the efifb fallback driver and CPU rendering. Classic!
It is sad to see this with most ports to foreign hardware. They get 75% of the way there but often can't get any further. I am actually rather surprised this hasn't happened with the Apple aarch64 platform.
We all know how this works in ARM-land. The day it is properly supported by a useful OS, is the day that Lenovo drops support for the laptop and Qualcomm drops support for the chip.
Then we put it in our tech gizmo cupboard, never to be used again, and go back to using our decade-old Intel ThinkPads.
In a similar boat. I recommend disabling encryption in the individual daemons (but making them listen on localhost only, enforced by the firewall) and then using e.g. stunnel (or, with some fiddling, OpenSSH) to create secure tunnels between them which act like "proxies".
In some ways I am considering doing this for more recent installs too, in order to allow configuration of certificates in one central location rather than in individual daemons.
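As a rough sketch of the idea with stunnel (the service name, ports and paths here are made up for illustration), the daemon stays plaintext on 127.0.0.1 while stunnel owns the one central certificate:

    ; hypothetical stunnel.conf sketch: TLS terminates here while the
    ; daemon itself listens unencrypted on localhost only
    ; one central certificate location for all tunnelled services
    cert = /etc/stunnel/bundle.pem

    [web]
    ; public TLS endpoint
    accept = 443
    ; plaintext daemon, bound to 127.0.0.1 and enforced by the firewall
    connect = 127.0.0.1:8080

OpenSSH can do much the same with its -L style port forwards, at the cost of some fiddling.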
Actually quite interested in buying a copy. Arca Noae just has a "Contact us", but does anyone know of a hardware compatibility database? As a bonus, one providing a listing of laptops.
I have a decent ThinkPad T23 (actually 5 of them) for this kind of stuff, but the S3 Savage GPU is a bit of an unknown.
Agreed. This culture of dragging in technical debt (via NPM, PIP, CPAN, crates.io, etc) is problematic.
I feel it comes from the fact that the entire computing platform is C, and that bindings and binding generators for any non-C-based language are never perfect; thus these package stores arise.
I think either a semi-C superset or bolting a C compiler onto a language is the route to go in the long term. C++ is a (close) superset of C, and its success and popularity possibly reflect this.
As others have said, this is just Java bytecode / VM technology (e.g. Dis/Limbo, JVM/Java, .NET/C#).
However, where this might be different is that it represents a Java-style VM where other languages can be used and are officially supported. C++ (Emscripten), Go and Rust are liked by many, and having to use Java or JavaScript would be a dealbreaker for some.
In some ways, the popularity of .NET is also because different languages could be used, including C++/CLI, IronPython, F# and.... er, VB.NET
Off-topic: I also just found these old articles about the early COOL/C#, quite charming:
https://www.theregister.com/1999/09/08/microsoft_set_to_unleash_javakiller/
and:
https://www.theregister.com/2000/09/12/official_microsofts_csharp_is_cool/
One potential issue I see is that RISC-V is an open standard, so there is no corporation behind it to protect against things like ARM's riscv-basics.com smear campaign.
Similar in theory to C and C++: Microsoft was able to misuse the C "branding" with COOL/C# because there was no corporation to pay the lawyers to waste everyone's time.
They seem to only ever mention the tritium, which suggests to me it is the carbon-14 that is the most damaging. 2% over regulatory standards, so 102%.
My question is, how often do other radioactive releases reach near to 100% of these maximum standards? Are they quite high bars to reach already?
Why so many downvotes?
I also think it is a good idea; I even put my family photos in a private GitHub repo (mirrored to GitLab and Bitbucket as a bonus).
I also taught my father (eventually...) how to use TortoiseGit just so he can store his music collection there.
Sure, I am effectively misusing the "good graces" of Microsoft but I don't exactly lose sleep over that :)
In the Linux kernel, the big players pretty much employed the "little guys" to facilitate those stats.
But if you look at the actual packages that Red Hat is withholding SPEC files for, things like libpng, bash, vim, dia, brasero, gedit, etc are all very much little guys. You wouldn't get a company like IBM developing this stuff. There is no money in it individually.
All in, I imagine that RHEL as a complete product only contains < 5% code developed by Red Hat and IBM.
> If they have people on a steering council? They control releases? They contribute more than 50%?
All of those things to be fair.
Compare a company such as Red Hat, or even less obvious ones like the one behind NPM, to community-led software like Debian. It is very easy to identify whether an open-source bit of software is run by a company.
> Company after company has had their start in open source software, and then gone on to dump their open source licenses once they've achieved a measure of success. It's time to stop it.
It is quite simple. Regardless of whether it is open-source or not, just avoid anything run by a commercial company. This isn't the 90s anymore; GNU/FOSS has reached critical mass, we don't need them.
Xbox Live Arcade was annoying. Even code that *we* wrote is no longer runnable because the console needs to connect to their DRM / XBLA server before it executes.
I used to be quite active with Microsoft back then (my team won the UK finals of a fun little game development competition called the Imagine Cup (2009)). However, I simply refused to port it to XBLA until they removed that restriction. They never did with the Xbox 360, but they *did* with Windows 8/RT (when domain joined) and Windows 10.
Honestly life is too short for this kind of bullshit.
Pretty much every SoC board uses Debian: Raspbian, Armbian, etc.
The others also tend to "use Debian" by proxy, via downstream distributions like Ubuntu: e.g. Jetson Nano, pcDuino, etc.
Even strange experimental things like Intel Galileo tend to get a Debian port first.
I personally find Debian's base (and minbase) to be a bit... well, random compared to the BSDs. But there is no denying, it is a really safe bet when putting together an appliance.
TL;DR; Great job Debian team! ${BEER}
Native messaging is fairly rare in plugins (and the manifest specifically has to ask for this permission).
https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/Native_messaging
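For reference, the permission has to be declared up front in the extension's manifest.json; a minimal sketch (name and version made up):

    {
      "manifest_version": 2,
      "name": "example-extension",
      "version": "1.0",
      "permissions": ["nativeMessaging"]
    }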
Plugins are JavaScript and run in the same sandbox that executes in every single page you visit. So unless you turn off JavaScript, I assume you trust it?
That said... Firefox, Chrome and Android itself are built upon "dodgy extras". Welcome to open-source; you can only audit so much.
I feel it's going the wrong way; we want *more* separate hierarchies.
The BSD /usr/local approach is very nice to separate ports/packages from the base install. Linux distros often lack a coherent concept of a base and now with this merge, it will get even harder.
Weirdly, I really like the Solaris (e.g. 10) approach of /opt/csw, /usr/sfw, /usr/ucb, etc. And the OpenBSD Xenocara in /usr/X11R6 is just much neater.
One of my projects is designed to take this even further, and create small self-contained / encapsulated hierarchies out of any package and its dependencies: https://github.com/osen/pkg_bundle
TL;DR; I basically don't like packages and dependencies and all their cruft sprawled all over my systems. For one, it makes them difficult to audit.
Indeed.
No matter how cool a browser might be, if it relies on technical debt like the heavy legacy behemoth Chromium engine, it will sooner or later be pulled down into the cesspit along with it.
I would honestly be more interested if Arc was based on e.g. Hubbub and LibCSS (two of NetSurf's backends). At least then I would know it was maintainable in the long term. Plus I know it would not be contributing to annoying web developers making naff, overconsuming web pages.
> The rules effectively banned the sale of Nvidia, AMD, and Intel's top spec GPUs
Isn't Intel partially an Israeli company?
https://www.intel.com/content/www/us/en/corporate-responsibility/intel-in-israel.html
It is apparently their manufacturing centre, so, strange US/Israel relations aside, I don't see why a US ban would affect the supply to China at all.
> In fact, you can activate a modern iPhone without an internet connection as long as it's been properly reset and doesn't still have an activation lock from the last user
Unfortunately not; it always needs to connect to Apple's servers for *some* purpose (I suspect DRM).
They disguise it as anti-theft or (in the newer macOS Ventura) firmware fetches, but these are really just excuses for Apple to enforce planned obsolescence if needed.
The next war of the "glorified chat rooms" begins!
Ironically, in many ways I am glad they exist. They do a great job of attracting all the low-effort twits, leaving the rest of the internet's communities relatively free of trolls.
There is a fine balance between a useful "chat room" and one that is simply too large. These companies obviously go for the latter due to monetisation, but ultimately they will never be as productive or as inclusive as a reasonably sized, focused community (e.g. a Linux forum, an MS-DOS gaming forum, etc). So my theory is that no matter how good the intentions are for a large-scale community system (fediverse, distributed, privacy-oriented, etc), it will always become horrible.
I have a box of ~200 Sun Crossbow mice. So it is weirdly annoying that I can't justify ever buying another mouse.
They are USB (yay!) but ball rather than optical (boo!). The middle button has no scrollwheel (boo!) but is really pleasant to click as a result (yay!).
A true rollercoaster of emotions!
> And under the terms of their contracts with the Hat, that means that they can't publish it
This is allowed by the GPL. However, the worst Red Hat can do is terminate their contract with the one who published it. They can't, for example, sue them.
What this probably means, though, is that downstream projects are going to have to create bots that sign up for the free RHEL developer accounts for access. Which has a couple of knock-on effects:
1) People like me, who hate being tied to a server and simply mirror the entire repo for offline backup, will probably be seen as a risk, and RH will put download restrictions on us.
2) RH will simply terminate the free developer access programme.
Commercially: I can't see why companies would jump through these hoops. There are loads of companies who offer equivalent Linux support; just look around rather than sticking with that 90s-style inertia (as charming and nostalgic as it is).
Community: Do people *really* want these guys dictating the direction of Linux via Systemd, Wayland, NetworkManager, etc, etc?
I don't think open-source licenses need to change at all. There have *always* been criminals who have tried to exploit the code without adhering to its license. This is not a new thing.
Companies (including AI companies) are just waking up to this wealth of functionality that has been slowly and steadily growing (whilst the proprietary stuff has burned out due to corporate lifespans or over-monetisation). The responsibility is on them not to break license terms. But this is what companies do; they push legal boundaries for maximum wealth.
If anything, what the open-source movement needs is a way of *mass*-detecting when e.g. GPL-licensed code has been used, and perhaps a concept of crowd-funded lawyers to tangle up the company in breach. Perhaps we should also consider AI GPL lawyers to churn through all the many companies I am sure are in breach.
Indeed.
GSA - "Glorified Search Algorithm"
This seems to be all we software developers have managed to achieve. It starts with A* pathfinding at school and, whilst things get more complex, it is really just a search. But.... it is enough to fool the general public, and that is where the money comes from :)
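To make "it is really just a search" concrete, here is a minimal A* over a hard-coded grid in C++ (the grid, goal and unit step costs are all made up for illustration):

    // Minimal A* grid search: the school version of a Glorified Search Algorithm.
    #include <queue>
    #include <vector>
    #include <cstdlib>
    #include <cstdio>

    struct Node { int x, y, g, f; }; // g = cost so far, f = g + heuristic

    // order the open set by lowest f
    struct Cmp { bool operator()(const Node& a, const Node& b) const { return a.f > b.f; } };

    int main()
    {
      const int W = 5, H = 5, GX = 4, GY = 4;
      int grid[H][W] = { // 1 = wall, 0 = walkable
        {0,0,0,0,0},
        {0,1,1,1,0},
        {0,0,0,1,0},
        {1,1,0,1,0},
        {0,0,0,0,0}};

      // Manhattan distance heuristic to the goal
      auto h = [&](int x, int y) { return std::abs(GX - x) + std::abs(GY - y); };

      std::vector<std::vector<bool> > seen(H, std::vector<bool>(W, false));
      std::priority_queue<Node, std::vector<Node>, Cmp> open;
      open.push({0, 0, 0, h(0, 0)}); // start at the top-left corner

      while(!open.empty())
      {
        Node n = open.top(); open.pop();
        if(seen[n.y][n.x]) continue;
        seen[n.y][n.x] = true;

        if(n.x == GX && n.y == GY)
        {
          std::printf("reached goal, path cost %d\n", n.g);
          return 0;
        }

        const int dx[] = {1,-1,0,0}, dy[] = {0,0,1,-1};
        for(int i = 0; i < 4; ++i)
        {
          int nx = n.x + dx[i], ny = n.y + dy[i];
          if(nx < 0 || ny < 0 || nx >= W || ny >= H) continue;
          if(grid[ny][nx] || seen[ny][nx]) continue;
          open.push({nx, ny, n.g + 1, n.g + 1 + h(nx, ny)});
        }
      }
      std::printf("no path\n");
    }

Swap the grid for a web index and the heuristic for a relevance score and, squinting a bit, you have the "GSA" above.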