
Re: Arrgh
My next computer will definitely be a Linux machine
Welcome Aboard!
10515 publicly visible posts • joined 1 May 2015
why does a man climb a mountain? Or go to the South Pole?
Or why go to the moon? Or Mars? Or the bottom of the sea?
Same reason, yeah. Exploration is a good thing.
Back in the 60's, astronauts were HEROES. Then space became mundane, even boring. A couple of accidents and suddenly we have no space shuttles any more. And no HEROES. Not REAL ones, anyway.
It might be a REALLY GOOD thing to get back to that. And of course, NASA, ESA, UK, Russia's space agency, Japan, Israel, and anyone else who wants to come along. Should be fun.
there are various absurd things (negative energy, causality violation, singularities) which everyone kind of knows are implausible but we can't yet say why
yeah good stuff, but keep in mind that for every particle there is an anti-particle, except maybe for gamma [which you could call 'energy' I suppose]. gamma is the result of the annihilation of a particle and an anti-particle. However, if anti-particles are actually particles moving backwards through time [maybe disproved, maybe not] the need for anti-energy wouldn't be there - only anti-time.
And... consider the multi-verse interpretation, where causality is not a fixed thing. A paradox is resolved through time travel the moment the time traveller "arrives", because a new universe gets created in the quantum event of his arrival. Or something like that.
I've actually considered a sci fi plot surrounding that. Man invents FTL drive, and after testing it, arrives in an alternate universe where history is messed up, like the Nazis won WW2 or it's still Rome in modern day, or everyone's a communist because the USSR won the cold war. Many 'Sliders' plots were like this. Anyway, the protagonist must now use his knowledge of history, discover how to move backwards in time, and find the point at which his arrival screwed things up, and then "fix it" so he can get back to his own time line. Not a new concept, really, but a new twist on an old trope. Could become an entire series... [and I can think of many that had history as a plot device, from Dr. Who and Time Tunnel to Sliders and Highlander]
(but I guess this article is kinda old now, and maybe nobody will see this...)
"So, since they've seen emissions for longer than they expected, one possibility is that the central object is indeed a neutron star and not a black hole, and this would be interesting because it would tell us things about neutron stars I think."
More like 'fusion' then - and being inelastic, kinetic energy has to go someplace... and so it spins so fast that it must slow down somehow, so emits gamma [and maybe absorbs a good deal of that, too, like you suggested]. But when the neutron star absorbs it, what does it cause to happen? I was assuming that it's behaving like an excited atom, which must either emit gamma or fission to remain stable. On a macro scale.
I hate to spoil your joke, but in reality, a "tenth thickness" of a shielding material is the approximate thickness (along the path of the gamma) needed to reduce the radiation level to 1/10 of what it was. So in theory SOME X-rays will get through lead, but depending on its thickness, most probably won't.
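To put rough numbers on it, a quick shell sketch with bc (the "3" is just an example count of tenth-thicknesses, not a real lead figure):
# each tenth-thickness multiplies the intensity by 1/10, so n of them give 10^-n
echo "scale=4; 1/10^3" | bc   # three tenth-thicknesses -> .0010 of the original level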
/me ducks rotting veggies and half-cups of liquid
quantum entanglement appears to be faster in some situations
Quantum entanglement seems to follow a different set of rules [as I understand it].
Gravity should follow the same rules of space/time as do both light and EM fields. But yeah, quantum entanglement is something else entirely.
NOTE: if you use super-strong gravity + EM (and maybe light) to create some kind of warp bubble, the rules of physics would apply normally within the bubble, and outside of the bubble, but not at its border, and so moving the bubble would move you along with it "faster than light", or so the theory goes. Miguel Alcubierre, a physicist from Mexico, came up with a really good description of how it might be done back in 1994.
The speed of light is really the speed of causality: it's the fastest way any information about an event can get between any two points
That's a nice summary!
One of the explanations (in the article) of the 3 years of X-rays is the creation of a 3rd neutron star. In short, they're suggesting that, like a nuclear reaction, on a macro scale the neutron stars collided to form a "massive unstable star-particle" which then "fissioned" into multiple parts... and when you study how nuclear fission actually works, this makes for a VERY interesting analogy. Keep in mind a neutron star is a bunch of neutrons held together by massive gravity, similar to a black hole. But inside a black hole, matter is even denser and has a "6th state", sort of a non-particle "mush" containing everything that WOULD be atoms and neutrons and quarks and electrons and all of that, except it's so dense it just "mushes" and all of the borders between things go away. Or something like that.
(as I understand it, there are 6 states of matter: solid, liquid, gas, plasma, gamma, and that mushy stuff you find inside a black hole - and only the first 4 states actually keep atomic structure somewhat intact)
A neutron is (as a particle, in terms of its decay products) the combination of a proton, electron, and an anti-neutrino. Reason is that anti-particles and particles must balance. And so those recently-proven-to-exist neutrinos provide the anti-particle to make up the neutron. And then you crush the neutrons under high gravity, to form a neutron star, which [in theory] would really be 'neutron mush' that can still act like neutron particles [until you crush it even further into a black hole]. In a way, it's like a ginormous atom.
NOW, you combine them in a collision, not unlike nuclear fusion. The new neutron star is unstable, and splits apart. Gamma is released, and quite possibly, additional "smaller bits" (i.e. 3rd neutron star), much like what you see in atomic fission reactions, only on a lot bigger scale.
And that 3rd star now has too much kinetic energy, quite possibly spinning at close to relativistic speeds. It's a fair bet that gravity forced an inelastic collision, which means kinetic energy had to go someplace. I suspect that the 3rd star is spinning like hell, WAY too fast to be stable, and is slowing down by emitting gamma. [I also have to wonder whether or not the 3rd neutron star would actually be a black hole created by the collision, with the same point about kinetic energy and inelastic collisions applying]
Anyway I could be as right or wrong as anyone based on the analysis here. But it would be fun to see, if we could get up close without being X-ray'd to death.
politicians asking for a magic unicorn to solve all their problems
You assume they MEAN WELL.
I suggest they are playing "think of the children" to MANIPULATE people into letting them take more freedom and put us ALL under surveillance. This is the kind of thing that gummints do, in a POWER GRAB.
I love the "real police work" example by French police with the use of warrants to crack the phones of people they wanted to spy on. WELL! DONE! FRENCH! COPS!!!
The rest of the world (including USA) needs to get a CLUE, and stop being LAZY. _REAL_ police work, please, and NOT blanket surveillance. [that violates the 4th amendment anyway]. GET a WARRANT.
(followed by the usual 'genie out of the bottle' 'open source encryption examples' 'de-CSS example for DVD playback' 'PGP and IDEA' 'when encryption is illegal, only criminals will have it' and so on)
as long as the remote connectivity works, any competent IT pro can put up with crappy baud rates, limited UIs, and unnecessarily slow response times, and STILL get the job done faster than actually BEING there... while in his underwear, sipping adult beverages and/or caffeine sources.
icon, because [answered my own question, heh]
Space - it's the realm of TRUE hackers, steely-eyed missile-men, the sparkiest of sparks, and the maddest of mad scientists.
studying Mars this way should reveal a lot, including the presence of water in the soil, deeper down where it's not sublimating so much.
Mars' atmosphere is 95% CO2 but extremely thin. Temperatures (below -60F) are such that CO2 is probably as effective as it can be at absorbing IR being radiated out into space, but being so thin, it is unlikely to do a whole lot of good. The partial pressure of CO2 is around 600 pascals, if I deduce it correctly (from various sources), as compared to less than 50 pascals for earth (0.04%). Obviously there's more going on than just the partial pressure of CO2, which affects how well it can react with IR radiation escaping into space. 1 atmosphere on earth is around 101,325 pascals (for those who didn't know already, _I_ had to look it up, I'm used to psi, torr, and bars). The rest is just maths.
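And here are those maths, done the lazy way in a shell (using the approximate total-pressure figures above):
# partial pressure = total pressure x mole fraction
echo "610 * 0.95" | bc        # Mars: ~580 Pa of CO2 (the "around 600" above)
echo "101325 * 0.0004" | bc   # Earth: ~41 Pa of CO2, i.e. "less than 50"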
So as a result you have a case where the atmosphere probably is NOT going to have a great effect on temperature of the soil. And so the probe can measure deeper down and figure out what's happening at the "permafrost" layer, if there is one [or would be one] and get a really good picture of what's going on planet-wide. And I think they may find that there's water or ice 'down there' making a HUGE difference in temperatures. [probably what they're looking for, I say]. and of course, instruments topside to relay the data and take their OWN measurements.
Should be fun.
occasionally I've needed to use exFAT to transfer very large files to a windows system on a USB drive, or if someone hands me a USB drive formatted from a windows system [which apparently defaults to exFAT unless you tell it NOT to, and that may not be very easy...]
In the past I've used FUSE. but as I use FreeBSD, is it part of OIN? Or are the people who make the FUSE driver part of it? In any case, I'll just use it anyway and ignore anyone peeking over my shoulder...
(I do not want my birthday taken away, no no, seriously, stop looking over my shoulder)
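For anyone wanting to do the same, a minimal sketch of the FUSE route on FreeBSD (the package name and device node here are my assumptions, adjust for your box):
kldload fusefs                   # the FUSE kernel module's name on FreeBSD 12+
pkg install fusefs-exfat         # userland exFAT driver
mount.exfat /dev/da0s1 /mnt/usb  # mount the stick read/write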
ext2 has no journal. not sure you would want that. But I think ext4 has differences with ext3 that make it NOT compatible with ext2 for a few things. In any case, if I'm on FreeBSD and I attempt to use an ext2 built-in driver to read ext4, it fails. I need to use the appropriate FUSE driver for ext4. but the ext2 driver "works" with ext3, as long as you don't care about journaling.
so yeah depending on what you need, ext4 is a good thing, but not very compatible outside of the Linux world.
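Same drill for read-only ext4 access from FreeBSD, roughly (again a sketch; the port name and device node are assumptions):
pkg install fusefs-ext4fuse      # assumed package name for the ext4fuse port
ext4fuse /dev/da1p1 /mnt/ext4    # read-only ext4 mount via FUSE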
hmmm... double-checked my firewall config, looks like I'd already disabled incoming ICMPv6 for types 133 through 137, which includes all of the NDP protocol stuff, to the best of my recollection...
heh, dodged a bullet there. /me wipes sweat from brow
https://en.wikipedia.org/wiki/ICMPv6
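For anyone curious, the gist of those rules in ipfw terms (a sketch; 'em0' is a stand-in for your WAN interface, and you do NOT want to block these on the LAN side or you'll break neighbor discovery for your own segment):
ipfw add deny ipv6-icmp from any to any icmp6types 133,134,135,136,137 in recv em0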
Ping of Death was the best bug ever.
an accidental script ran in response to the firewall detecting certain kinds of intrusion activity... accidental. Allegedly. Heh.
Your post reminded me of Code Red. it opened up a port on the intruding/probing server that had direct access to a CMD shell. Sending commands via that port COULD cause IIS to shut down, thereby stopping the probing for vulnerabilities... and maybe (allegedly) put a file called "IDIOT.TXT" on the logged-in desktop, and MAYBE pop up a dialog box that announces the machine is infected with a virus and then name the virus and tell them to patch their system or shut off IIS ... (allegedly)
the primary issue I see is occasionally getting only quad-A responses out of DNS instead of A and AAAA responses when ::1 is listed as a DNS server.
just did a test on a windows 7 box - with FBSD running bind (as 'named') and serving up requests for IPv4 and IPv6, using 'ping' got me the IPv6 address, and nslookup showed both IPv6 and IPv4, with IPv6 listed first.
when I told nslookup to look specifically at the name server's ::1 address, the results were the same. But DHCP tells the windows 7 box that the DNS server has an IPv4 address on the LAN. So I'm not entirely sure how to reproduce that on my network... maybe manually set up the DNS with an IPv6 address? Or it just may be a matter of which one's specified first in the list o' DNS servers for DHCP/DHCPv6 or however it is that Windows 7 is grabbing its IPv6 info [I got DHCPv6 and 'auto address' and other support on the network, so Apple AND Android devices have no trouble with it]
Also, in my case, the ::1 DNS server returns the same A and AAAA records that the x.x.x.x one does. So maybe it's just a 'Micros~1 quirk' ? I'd be interested in what nslookup results look like for your domain controller, especially when you explicitly tell it which name server to use.
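i.e. something along these lines, substituting your own zone for the hypothetical example.com:
nslookup -type=A example.com ::1      # A records, asking the ::1 server explicitly
nslookup -type=AAAA example.com ::1   # AAAA records, same server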
it's a fair bet that Windows 7 is vulnerable, right??
Good thing I don't web surf (especially via IPv6) with it. In case anyone forgot, an IPv6 address is NEARLY ALWAYS routable from 'teh intarwebs'.
I may have to adapt my (FreeBSD) firewall rules to block incoming ICMPv6 packets, just in case.
This IPv6/ICMPv6 vulnerability sounds as bad as 'WinNuke'.
NOTE: I believe tracking should be opt-in only, above board, and you should be able to view it and manage it yourself on the tracker's web site, but I doubt any legislation will really help, so I use my own mitigations anyway. it doesn't mean I won't support such legislation, I just don't have any hope in it. That being said...
For anything other than simple web surfing (including El Reg) I have a special non-priv logon that I use, and the browsers that I run get their caches and history dumped, every time.
Firefox is simple, just tell it to delete history on exit.
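If you'd rather script that than click through Preferences, these are the real pref names; drop them into the profile's user.js:
user_pref("privacy.sanitize.sanitizeOnShutdown", true);
user_pref("privacy.clearOnShutdown.history", true);
user_pref("privacy.clearOnShutdown.cookies", true);
user_pref("privacy.clearOnShutdown.cache", true);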
Chrome is not so simple, but works better with CAPTCHA [firefox fails a lot for some reason, probably by design] but you can run a script to wipe out everything in the following directories to clear chrome's cache:
rm -rf ~/.cache/chromium/Default/*
rm -rf ~/.config/chromium/BrowserMetrics*/*
etc. - there are others, too - my script is pretty long, and rather thorough
But as a hint, there are many files in ~/.config/chromium/Default/ that are created by chrome and if you wipe them out, they just re-appear. Some of them have persistent "things" in them. YMMV.
In any case, getting a handle on how to purge your cache and history (while keeping any important settings 'intact') might make a topic of its own someplace. In the meantime you can experiment a bit.
What's good about using Linux or FreeBSD: if you set your X server up to allow connections to localhost via the DISPLAY environment variable, you can use a shell to log in to a very unprivileged user account, then run firefox or chromium on the current desktop by setting DISPLAY via 'export' (or similar), and be in a COMPLETELY different user context. It works for video playback, too. then when you are done, wipe away ALL history. Hard to track you with NO history, NO cookies, NO persistent data, yadda yadda. Other settings like 'private browsing' and whatnot can't hurt, either. And of course THIS would be for any site where script is unavoidable, like the DMV or certain electronics parts retailers that I can't avoid using.
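A minimal sketch of that dance, assuming your display is :0 and a throwaway account called 'websurf' (both names hypothetical):
xhost +si:localuser:websurf                # let ONLY that user talk to your X server
su -l websurf -c 'env DISPLAY=:0 firefox'  # browser runs in the other user's context
xhost -si:localuser:websurf                # revoke access when you're done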
worth pointing out, windows 7 had 'run as' which could be used in a similar way, so that you have a special user context JUST for web surfing that really doesn't do anything else... and you can auto-delete history and cache and so on with no consequence to YOU.
But... if you EVER log into certain sites, that 'icon' on half the pages you visit is part of their tracking. Its very presence probably tracked you opening up that web page... unless you do NOT have login information stored in a cookie [which is where purging the persistent data comes into play]. So if you did use FB or twitter or reddit or google login, you'd do that from the "web surfing only" user account, or maybe even a special "FB only" user account, and "flush" when you're done, so they don't know it's you.
and for everything else, on your 'normal surfing' user account, you NEVER log into google, FB, twitter, reddit, or ANY of those other "they will track you" web sites.
FYI - I grew up in San Jose (and the general area) decades ago, and left before it became truly "Silly". There was this one highway to nowhere they called "Stonehenge", a short section of a future bridge that stuck way up in the air and could be seen for MANY miles, and took something like 10 years to complete (while it stood there as a monument to gummint inefficiency and cost overruns). A city council dude (Joe Colla) put a car on it for a joke, took photos. Then someone actually ticketed it. This was back in the 70's, though, when the "culture" still had sane elements, so people could still take the time to point out the silliness with good humor.
https://en.wikipedia.org/wiki/Joe_Colla_Interchange
(good for some more laughs)
the fact that it uses Node.JS is bad enough, but when I played with it early on it was ALL 2D FLATTY FLATSO FLATASS (instead of 3D Skeuomorphic) and I _REALLY_ hate that. Nevermind the inherent slowness of JAVASCRIPT.
(this is why I still use DevStudio 2010 for C/C++ on Windows, which is happening less and less often these days anyway...)
From the article: Is there anything that can disrupt it?
Yes. Micros~1 bloat. When a JAVASCRIPT EDITOR ultimately takes A MINUTE (or more) to load (give them some time, it'll happen), because of all of the bloat and extensions and 'did I mention bloat', performance goes out the window, and this is the problem Eclipse and IntelliJ (which are actual JAVA, which is better) and other non-native-compiled editors have had. That, and a potential "must have 16GB of free RAM available or it won't even load" [do not doubt me, if unchecked, this is coming, we've seen this trend before].
I've DEFINITELY seen something like this with IntelliJ already, though not as bad as the 16GB I suggested here. I think I managed to get IntelliJ to work acceptably with only 4GB (or maybe 5GB) for the VM that was running it by carefully tweaking its settings (unfortunately IntelliJ's garbage-collection memory MISmanagement is piggy, and Gradle made it worse). Later I allowed the VM to have 8GB when I installed more RAM on my new workstation. And IntelliJ is compiled p-code Java, NOT JAVASCRIPT, which would be WORSE. So I have reason to make this future claim of hideous bloat and gluttonous memory hogging.
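For reference, the tweaking boils down to a couple of lines in idea64.vmoptions (Help > Edit Custom VM Options); the numbers below are the sort of caps I mean, not a recommendation:
-Xms512m    # initial heap
-Xmx4096m   # hard ceiling on the heap, roughly the 4GB that worked for me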
Why not just make an open source, natively compiled C/C++ editor that uses a commonly accepted toolkit like Qt or GTK or wxWidgets so that it PERFORMS WELL on embedded systems, and can also use X11 remotely to display on a local computer which is what I've been doing with pluma for YEARS now... and before that, gedit [until gedit's interface turned into SUCK, but thanks to Mate, pluma was forked from its earlier version and improved]. (Of course, I've been working on my own for several years but I still need to earn money to live on, too... which is why it's not finished yet)
Last year I built a PC using a 6 core Ryzen which has hyperthreads.
I left hyperthreading on. Works great. It's also running FreeBSD. Since I don't download and run windows binaries with viruses and trojan horses in them, and generally disable scripting in browsers, I should be protected from side-channel attacks.
Mitigation can be done in BETTER ways than disabling SMT and/or hyperthreads. You do not remove your toe because it has an ingrown nail.
icon, because, facepalm.
[and for a laptop I'd do the same thing - Ryzen, multi-core, and either Linux or FreeBSD. If I need windows I can run it in a VM and make it single core if need be]
Warrants to obtain details of everyone who uses particular Google search terms already exist in the wild
Ok, what are those terms, and just how hard would it be to publish that list so that nobody uses them, except for people (possibly like me) who do it in a bash script in the background to make those requests several hundred times per day, in protest, via the Tor network... thus filling their database with SO much crap it becomes WORTHLESS.
It's a fair bet that for an individual, the amount of bandwidth this would generate would be small. If a few THOUSAND people do this, it might become large enough to make such "search term" investigations IMPOSSIBLE. It really would not take very much to frustrate them into silence.
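A sketch of what I mean (assumes tor and torsocks are installed and running, and that a hypothetical terms.txt holds the flagged search terms, should the list ever leak):
#!/usr/bin/env bash
# fire off a junk search at a random interval, forever, via Tor
while true; do
  term=$(sort -R terms.txt | head -n 1)
  torsocks curl -sG "https://www.google.com/search" --data-urlencode "q=${term}" -o /dev/null
  sleep $(( (RANDOM % 600) + 60 ))   # one to eleven minutes between requests
done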
It also makes you wonder how the specific search terms were figured out... any MASS SURVEILLANCE involved in that process?
It would simply cause new comms tools to be developed that are simply off the radar.
Easily done. Remember how PGP emerged? IDEA? OpenSSL? And the STUPID attempts by gummints to limit "strong encryption" exports. The defense: it went OPEN SOURCE.
Too many other examples of outright REBELLION against government control over encryption happened back in the 90's, and some bad fallout (Korean government requiring an ActiveX component for online banking as one example). Just remember PGP, IDEA, OpenSSL, and those PGP T shirts... (when shipping the binary compiled code was "illegal", but putting the math behind it on a T shirt was NOT)
They really will never learn.
My point is that they HAVE learned, and the result of their "learning" is far more sinister than anyone wants to admit. They ignore facts, focus on FEEL, and the media helps them.
And they know DAMN WELL what the truth is, and blatantly LIE about it ANYWAY.
Legislating against mathematics is fruitless.
Not when you a) manipulate with *FEEL* in every election, b) have willing accomplices in the vast majority of the news media, and c) rely on your electorate being a bunch of "Sheeple".
(sadly, unfortunately, with deep regret)
icon, because, 'for the children" was mentioned early on in the article as a primary reason for justifying this, but we know what *THEY* _REALLY_ want: POWER. It's _ALWAYS_ about POWER. And to do that, they MUST "dis-empower" US.
this can be especially hard for a new/growing org
eBay is full of used, reconditioned, and old stock computers, and I've never had trouble buying old hardware on eBay [except maybe having to replace a power supply or hard drive earlier than usual]. But that's way less cost than a new machine. If you get inexpensive monitors locally, and the rest of the system in a marketplace venue like eBay, you can avoid excessive 'capex' and get started on a much lower budget. Then, upgrade/repair/replace as needed later on... with minimal 'opex'.
(welcome to 'small business and startup' world, where I prefer to be anyway)
the only time leasing (over buying) makes sense is when the hardware is prohibitively expensive, like those old IBM mainframe and mini-computers (and the lease no doubt came with a service contract).
Modern computer hardware is generally inexpensive, doesn't really need constant maintenance, and lasts for YEARS if you're not addicted to the "always upgrade" hype.
Might as well put a coin box on the side, like a payphone. [but the coin box and hopper would probably cost more than the computer, and you'd need someone to collect from it periodically]
Is THAT something they might consider? The extra cost associated with subscriptions and leases, vs the (alleged) benefit to customers?
I use a pre-paid cell. It costs me $100/year to do this, less than $10/month. I always forget to re-charge it until after the minutes expire. I never use them all anyway. And it's the cheapest I know of, and I rarely use the phone anyway. It makes sense for ME to do a "per-usage" because of the obvious COST SAVINGS.
Would COST SAVINGS _ACTUALLY_ happen for customers with per-use charges on HARDWARE? I think it would be like MS wanting a subscription-based Windows OS, so that it would be a constant funneling of money from you to them...
at least the NASA money BUYS SOMETHING (like a rocket). There is a *bit* of 'trickle down' benefit to making something _like_ a rocket, as opposed to _OTHER_ kinds of spending, which might as well be dumping public money into an incinerator... like pretty much anything with "subsidy" (or similar) in its name.
Also history shows many consequential products and scientific developments come out of the space program, from integrated circuits to "space food" (Tang doesn't count, it apparently existed before the 1st U.S. launch but gained fame for being included as a refreshment during John Glenn's Mercury flight). Although these things would probably exist without the space program, the use of such things BY the space program [and the otherwise prohibitive costs associated with them] often resulted in mass production and rapid drop in cost/price [particularly ICs]. PCs and cell phones COULD have taken a decade longer to show up in modern society were it not for the use of their core technology by NASA (and the U.S. military).
Can AMD management handle all three (CPU, GPU, FPGA) at the same time?
I doubt it
What if they were to integrate their existing products AND their senior management in a way that leverages ALL of those technologies and the best skilled people to manage them?
Yeah, I have a bit more confidence in AMD than maybe Intel... especially THESE days!
FPGAs are interesting [I still need to get myself a dev kit of some kind to play with].
It's very likely that AMD would benefit greatly by inserting this kind of tech into their existing products, primarily video adapters and network-related stuff.
Hybrid solutions (i.e. FPGA plus CPU) might be the best ones, when any kind of 'signal processing' or decoding happens. this goes double for WiFi and MPEG. And encryption, in general.
IANAL. Here's a real world example: an electronic circuit board.
a) patent the circuit itself, its overall design and uniqueness to solving a problem or providing a product. The overall design, excluding how it's presented, would not be upheld under copyright.
b) copyright the board layout itself, which may also contain a presentation of design [such as the size of copper or unique layout, very important in the RF world].
The first protects your basic design. The 2nd protects your board layout from plagiarism. Significant changes to both would be needed for a competitor to NOT license the design from you and legally produce a competing product.
No solution is perfect, but each part has its use. And IANAL. yet WRONGLY interpreting what copyright means, when a patent might be needed, could throw the entire system into CHAOS with various forms of abusive legal trolling, or outright piracy going unchecked.
but they also didn't want to comply with the GPL and open source their implementation
And, consistent with your observation, if Oracle "wins", it means that closed-source APIs (say Win32, SMB, etc.) could THEN become threats to freely available open source equivalents.
And yes, that WOULD threaten future (and current, and even past) software development, where the monopolistic big-boys get to KEEP their monopolies INDEFINITELY.
as more people decide that they ALSO need an updated PC, they'll get one.
But Lenovo could improve this even more by offering pre-installed Linux on ALL systems!
I still insist that it's Win-10-nic that has driven much of the decline in "new PC" sales. A large number of people would rather fix the old one than be FORCED into using Win-10-nic.
iPhone is a somewhat artificial "upgrade your hardware" environment, and to some extent, Android as well. So "replacement sales" are a big factor, not so much "new customer".
WHEN PC sales are driven by "replacement" (since pretty much everyone who wants a PC already has one) and there are no artificial dis-incentives (read: requiring Win-10-nic pre-installed) you'll see a rebound in new PC sales as the economy continues to improve, and more people are adopting a "new normal" of frequent work-from-home.
But certainly a LINUX OPTION on ALL PCs would be "a good start". Especially if you get a PRICE DISCOUNT for it!!!
An alternative: pre-install VirtualBox, and Win-10-nic in a VM, with a Linux host. yeah.
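The OEM could even script that with VirtualBox's own CLI; a two-line flavor of it (VM name and sizes made up):
VBoxManage createvm --name "Win10" --ostype Windows10_64 --register
VBoxManage modifyvm "Win10" --memory 8192 --cpus 4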
while it's a bit difficult to replicate a search engine locally
for UN-BIASED global internet search, you'd need to have some pretty good storage and bandwidth available.
I suggested to Fox News that they do a competing service to Google and Bing, to offer unbiased searching of news-related articles. Still waiting. [if I had the funding _I_ would do a no-track no-filter service of this nature]
I did - by purchasing the individual parts. Did it about a year ago, a 6-core AMD Ryzen based system.
It's running FreeBSD 12 with Mate desktop.
(most of the parts were from Amazon though... a few were things I had already (DVD, case), and some parts came from Fry's [power supply, extra fans])
tablets and phones are now our 24/7.
While I agree with many of your points, this one falls short. Most people seem to use phones as phones, except for that percentage that a friend of mine calls "4 inchers" because they see EVERYTHING through a 4 inch screen, in super-tall portrait mode, as if they are wearing a set of blinders designed for horses. [it's why I despise videos and photos taken that way - my eyes are side to side, and so that's how I view the world, in 70mm >2:1 aspect, and NOT 1:2.4 !!!]
ahem. anyway, phones are phones, slabs are slabs, and real work gets done on a PC because you have an actual keyboard and mouse to do it with. And 'new computer sales' is NOT the same as USAGE. StatCounter shows Android being a few percentage points ahead of windows when comparing OSs. And iOS is about 1.5 times what OSX is (grossly approximate). So mobile is a bit ahead for internet content consumption, but those numbers do not include other uses of PCs, from games to content creation.
And I think the limited screen size and lack of keyboard on phones and slabs is the primary reason.
A reminder from history: Teddy Roosevelt was elected in part to deal with the robber barons. He was a REPUBLICAN. He went against MANY in his party, who were willing to put business interests ahead of people's (and small businesses') interests. "Bully!" [his face belongs on Mt. Rushmore along with the other 3]
from the article: the recommendation that Amazon et al be restructured was a surprise.
To the Dems that signed onto anti-trust action against "big tech", I say "welcome aboard".
assessing "merit" in a way that turns out to embody an element of indirect discrimination
The worst kind, perhaps, being "the soft bigotry of low expectations". Something to think about.
best thing to do: Just treat everyone the same regardless of "whatever characteristic", ESPECIALLY when hiring. That means that political conservatives make good employees, too.
I would guess that MOST of what is done using Javascript is COMPLETELY unnecessary. I generally avoid it, unless I'm coding a UI for an embedded system that uses a web-based UI. In such a case, to get the kinds of performance you might need, resorting to javascript becomes the easier solution (like maybe coding a popup detail editing thing that acts like a dialog box, by unhiding nested 'div' sections to display it, and re-hiding when you press 'ok' or 'cancel' to make it go away). Otherwise, it's pure HTML and CSS, with all of the work done server side whenever possible. (yeah I do UIs too, when I have to. I prefer device control, but one-man-banding it means doing the UI so...)
I really hate working with scripty HTML devs, though, primarily because I will probably end up cleaning their mess (using LOTS of profanity more often than not while doing so), and THEM locking things into a particular monolithic library (or style sheet from hell) just makes it worse. Such "developers" need to be "educated" properly, often with a Cat-5-o-nine-tails, clue-bat, or rubber chicken. Just kidding. (no I'm not)
Agile sounds like it has too much bureaucracy in it, and an oxymoronic name.
What you really need is a bunch of mad scientist types (like me), a decent engineer as a manager, a clear goal, and a sufficient slot of time. Generally can get it all done under budget that way with as much as 10:1 productivity (or even better, depending), so long as you have a competent engineer dividing up the tasks in a sane manner [and as few meetings as possible].
You know, OLD school! And NO rapid/radical direction changes. Those go into "rev 2".
the thought of using layers of libraries which you aren't in control of, fills me with horror
it fills me with nausea. yeah, same idea.
"What's up? My LUNCH, that's what!"
icon, because, that.
On a related note, looks like they were only scanning container thingies. So I guess all of us C and C++ devs are left out of their security scans...
you have the actual evidence to justify your claim?
It's my understanding that possessing illegal copies of someone's tax returns is a federal offense. So NYT can _CLAIM_ they have them, but actually having them could get them arrested. And should. Because if they DID have them, they would have been OBTAINED ILLEGALLY. Publishing them would not only prove that point, but would also expose THEM to being prosecuted for violating U.S. law. It would be the journalistic equivalent of a suicide jacket.
https://www.law.cornell.edu/uscode/text/26/7213
It shall be unlawful for any person to whom any return or return information (as defined in section 6103(b)) is disclosed in a manner unauthorized by this title thereafter willfully to print or publish in any manner not provided by law any such return or return information.
Proof, please. Or else your claim would appear to be part of the LIBEL the NYT would allegedly have engaged in by claiming they've seen the returns and then just making stuff up and publishing it as if it were true.
"Cancel Culture" notwithstanding, I'm both disappointed AND relieved at Stallman's absence.
I disagree with Stallman on his rigid interpretation of what the GPL should be. Copy-left indeed, because if it intends to FORCE people, it is about as left as it can be.
If the FSF is *truly* about freedom, free as in freedom, then MAXIMIZING freedom is what it should do. These alleged "license incompatibilities" between the GPL and MIT, BSD, and others, could simply be resolved by allowing for copyright statements to comply with the other licenses. Instead, GPLv3 emerged.
I tend to 'dual license' my open source stuff, so that you can use a BSD-like or MIT-like license if you want, OR a GPLv2 or later license, YOUR CHOICE. [I got quite a bit over on github, easily found if you want it, lots of W.I.P. though]
In many cases you need to have 'closed source' for at least some of the code, to protect a trade secret, to comply with legal regulations, and so on. BLOBs are often used for this in Linux kernel drivers. But it's my understanding that GPLv3 *eliminates* that possibility. Also it was necessary to adapt the gcc library licensing to allow for linkage into closed-source applications, because [L]GPLv2 was ambiguous on this possibility, favoring the NON-publishing of source compiled with gcc as binary-only. Clang doesn't have this problem, just to mention. [nor does gcc any more, to my best understanding]
So Stallman had a good idea, to make software open, and keep anything you license under GPL "open". But it also attempts to drive ALL software into becoming "open", which is not practical, and shoots its own foot in the process. [but without Stallman directly driving, this may be improving]
So if Stallman really IS "pro freedom", and wants software to be "free as in freedom", he shouldn't try to CONTROL things so much. And I think the FSF is doing that. Without him.
(profit is good - it is NOT evil to make money! Who said that? *ME*)