
I agree on the national security part. Not the rest.
(there are many costs in manufacturing, and from my viewpoint, a LOT of these are government-caused, like regulations and taxes, which is why REDUCING those becomes "the incentive")
smartphone users are idiots who need protecting from themselves
I wouldn't go THAT far. Even the average LUser can be educated.
All you need to do is make the default security as tight as possible, and allow people to turn things off if they don't want them. This MUST include both side-loading AND installing without code-signing, for TRUE freedom for the user.
This way open source and independent developers can more easily distribute their stuff (without paying the 'Apple Tax' or hiding their needle within Apple's ginormous HAY STACK).
But if the end-user doesn't want that, he can just leave all of the security options ON. So simple.
(and if a system reset can wipe everything and restore the phone/slab to factory state from a ROM, so much the better, for dealing with viruses and malware, in case they show up more frequently)
Yeah, PERSONAL CHOICE. Who knew?
I would expect that generating the deepfake in a very high resolution, and then using a standard method of shrinking it down to something with less resolution (cubic interpolation, let's say), or using JPEG vs PNG even, might be just enough to fool the deep-fake spotter-bots.
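A minimal sketch of the idea in Python, using plain 2x2 block averaging as a stand-in for cubic interpolation (the function and the toy "image" are invented for illustration): a single-pixel artifact gets diluted into its neighbors, which is exactly the kind of high-frequency detail a spotter-bot keys on.

```python
# Sketch: downsampling a grayscale image (2D list of pixel values) by
# averaging 2x2 blocks. A real pipeline would use cubic interpolation via
# an imaging library, but the effect is similar: fine-grained artifacts
# that detectors look for get smeared out by the averaging.

def downsample_2x(pixels):
    """Shrink a 2D grid of pixel values by 2x in each dimension."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            total = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(total / 4.0)
        out.append(row)
    return out

# A single-pixel artifact (the 255 spike) gets diluted into its block:
img = [[0, 0, 0, 0],
       [0, 255, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
print(downsample_2x(img))  # [[63.75, 0.0], [0.0, 0.0]]
```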
I like to use 'gimp' and a hand-done "fuzzy" technique around the borders of, let's say, a face, surrounded by transparency, that melds right into another photo when proper re-sizing and perspective is done (no pr0n though, just funny things).
in 'Little Nicky' (Adam Sandler), Nicky's brother pasted his face over Al Pacino's in a clip from 'Scarface', and did it poorly. But oftentimes it's funnier if poorly done, which is then obvious to everyone (including algorithms, I hope).
this reminds me of a Dr. Who episode - in space, O2 is expensive, after all! (in that episode the suits literally took over)
Along with a zillion other things I'm not an expert on, I have kicked around a design concept for a one-size-fits-all space suit, mostly for emergencies. In short, a vinyl bubble [made of thick material similar to a waterbed] with an integral helmet and radio, pockets for important things like an O2 bottle with a regulator on it, a couple of hose connections (for the O2 bottle etc.), and a bunch of oversized SHOE STRINGS that you'd use to obtain a proper fit.
For working it may not be tough enough, but if you put something _like_ this on, then put some stronger clothing over it (like pants and a jacket, only SPACE pants+jacket) and gloves, it might be pretty effective, fit ANYONE, be relatively easy to put on, and maybe even cheap enough to be DISPOSABLE.
And you could put these in boxes at various places inside the people compartment, for emergency use, in case of sudden decompression.
(you'd have to strap on the extra A/C when working outside but that could be more like a backpack thrown on after everything else, connect up hoses, etc.)
yeah - I think in terms of doing it "on the cheap" yet being extremely effective and reliable. But it would look like you were wearing a balloon tied with shoelaces.
de-orbit of small things can be absorbed by the atmosphere as long as they don't have anything REALLY toxic or dangerous in them. The large amounts of things like Selenium and Cadmium in solar panels might cause some level of contamination depending on where the particles end up falling. Over the ocean, not so bad. Over a large body of water that supplies drinking water for people and water for farming, not so good.
And de-orbit of BIG things (Skylab, Mir) hasn't gone so well in the past...
(keeping the ISS alive and expanding may prove to be the better alternative, at least in MY bombastic opinion)
The ISS is a good test environment for solar panels in a worst-case environment.
* frequently heating/cooling every 2.5 hours [or whatever it is] as they orbit
* exclusive source of power for ISS and must be reliable
* "up there" for years, not so easy to replace or repair
So far they seem to be doing very well. Hopefully the replacement/upgrade panels will outperform and outlast these, as the tech develops.
(I would like to see a method of producing power via solar wind, especially for interplanetary craft)
Solar cell disposal, however, is another problem entirely. Recycling is best (due to things like CE-prohibited materials, etc.) but can it be broken into panels and returned via Dragon capsules? That's where a Space Shuttle would be more effective... (ok, these are staying up there for now, but eventually)
And while they're at it, they should send up some additional modules to go with the extended power availability... and maybe some (laser, TIG) welding equipment and laser or plasma cutters (if not already there). I'd like to see them get a head start on orbital construction on a much larger scale.
then they'll need MORE panels, MORE modules, and MORE trips to/from ISS. But when the ISS is all connected, you have one "thing" orbiting, i.e. easier to control the orbit.
Just saying...
Like every OTHER framework (and language developed by Micros~1) before it, I'm sure Next.js will turn out to be
* bloated and inefficient (except when run on/by "the overlord" application/OS that will miraculously be able to run it well)
* attempts to be all to all and do all with all
* requires ridiculous hacks to actually USE it (in at least some cases)
* breaks those hacks at random with all-too-frequent updates
My opinion, of course.
and establishment enforcement of 'rules' can't POSSIBLY cover 100% of use cases. Which means something important WILL break. Or require ridiculous hacks. Eventually.
Something like ESLint (let's say) could become a way of spotting "code smells" that do not comply with well established standards, but they should be COMMUNITY standards, and not "just Google".
I would guess that this is a part of it (i.e. prevent someone withdrawing their submission and expecting that to be honored). But I think the "work for hire" concept would be more practical.
* as a contributor, you are "working for" the project
* As 'work for hire', the project has ownership
* They can, and should in my opinion, allow you to also distribute things on your own terms that are derived from your work [but I don't think this is part of it] to avoid YOU being sued later if you copy/pasta your own code into some other project
* They are free to license it consistently with the rest of the project (as owners)
In short, you gave them the contribution, so it's theirs now. Plus, if you GPL it, a derived work can always be made from the source (so no withdrawing it later).
So yeah, that and IANAL and my understanding of these things is limited to my own experiences.
(I think most contributions take the form of patches to existing things, though I somewhat recently contributed a userland application to the FreeBSD project - it may still have my copyright but it is under a BSD license anyway. It had to meet their somewhat tough standards, too, or they would not accept it)
you'd think they could afford an actual SCIENTIST who would tell them that orbits below 600 km altitude decay in a relatively short time frame.
According to NASA "Debris left in orbits below 370 miles (600 km) normally fall back to Earth within several years."
I'd say a collision at THAT altitude is much less dangerous "to the environment" than advertised...
I put 'ransomware' in quotes (followed by a 'd') because it's a similar concept: to threaten something based on a data leak and to potentially want money to NOT release it [unauthorized data encryption/decryption being another variant].
But it was released, nonetheless, and the details about whether money exchanged hands (or did not) weren't in the article... and if "police are investigating", it implies something a bit worse than your average data theft intrusion.
I suppose I could have said "data-leaked" or "cyber-burglared" or similar and been more accurate
more like their definition of "smart" does not include "savvy"
(and as a result, one of the 'smartest' organizations in the world gets 'ransomwared')
It's like the 'ivory tower' mentality in its most irritating form. From my observations, I think their I.Q. tests may be oriented towards making themselves look smart at the expense of everyone else [especially older people, and "non-college-students" in general].
Perhaps with practice any reasonably smart person could 'ace' their IQ tests, but anyone NOT accustomed to "what they expect for an answer" will be at a serious disadvantage [making their IQ results _and_ membership requirements completely out of touch with reality].
A friend of mine (back in the 80's) who had at one time been a member warned me about them. He was pretty smart, but had not been able to complete one of the most difficult military schools [one that I had done pretty well with, the U.S. Navy nuclear power program] where half the students typically drop out. But he was smart enough to have joined Mensa. I have to wonder how many OTHER Mensa members could pass that school...
(If I saw a resume/CV with Mensa membership on it, I'd accidentally round-file it)
if Wayland weren't trying to re-invent things [for the lulz apparently] while SIMULTANEOUSLY becoming more Windows-like _AND_ removing the one capability that makes X11 superior to all [display and interact on remote desktop over network or even on the same machine with a different login context, simply by assigning the 'DISPLAY' environment variable] I might actually consider using it. But I don't.
So if Mint/Cinnamon devs want to FOCUS RESOURCES ON THINGS THAT MATTER, and NOT waste time re-re-doing things JUST for Wayland, I'm in agreement with them.
I tried to read his comments by following the link, and that LIGHT GREY TEXT on OFF-WHITE was _SO_ HIDEOUS I had to expand it to 150 percent to even TRY to read it... it was like trying to read badly faded print in candlelight on yellowed paper.
I have to wonder what desktop the *ahem* web author of that page uses. THAT page is just *HORRIBLE*!!! WORST! WEB! DESIGN! EVAR!!!
Open Source can't innovate, only duplicate.
Seriously? Gnome and KDE had multiple desktop support around 2005-ish, as I recall. 'vtwm' even had it, a short time later [if I remember correctly], and all of the 'box' managers after that. This same feature took 10 years to show up in Windows. That is *ONE* example where innovation (in this case, for usability and productivity) came to open source FIRST.
Countless other examples exist. Where did that (obviously inaccurate) concept even COME from?
(I do recall some kind of 'powertool' for XP that came AFTER the multi-desktop support in gnome and KDE and it _attempted_ to provide multi-desktop support in XP, but it was brittle and sloppy and generally unusable)
it always seems to me to be a mistake in generalising and assuming that what works for you is the only way to do things.
THIS is what is WRONG with _SO_ _MANY_ _UI_ _DESIGNS_ these days!!!
The thing that _I_ noticed right away: screenshot of dialog box had 3D skeuomorphic buttons and borders!!!
And to me, the 2D FLATTY is an IMMEDIATE DEAL BREAKER!
So, THANKS to the UI makers for getting *THAT* part *RIGHT* !!
(and using FreeBSD was a cool idea, too)
well if "NOT boring" means "constantly using intarweb bandwidth (and wall time) to 'move fast and break things' and automatically surprise you with unwanted changes and unnecessary bloatware" then I _DEFINITELY_ approve of 'boring'.
"bleeding edge" is SO overrated... [why do people do this to themselves?]
I prefer being able to get work done, and abruptly changing the rules and/or creating instability just slows me down. It's why I like Debian for a lot of things. But I admit, I use Devuan, which is mostly like that too, except no systemd.
as for lipstick on a pig - the oinky end would have been better
Let's hope Micros~1 gets a clue and AT THE VERY LEAST lets US choose 3D Skeuomorphic and a "Classic" Start Menu/Button over 2D FLATTY McFLATFACE FLATSO [which they're apparently TRIPLING down on now] and a Mac-like look on the task bar...
Granted I can get that with certain desktop managers but I use Mate and want my WINDOWS SYSTEMS to be equally CONFIGURABLE BY ME. And 3D Skeuomorphic, not 2D FLATTY.
Not expecting much else. Too much disappointment and "not listening to customers" LAST time around. It is my strong belief that this practice isn't going to change no matter HOW hard the end users complain.
"a buildup of noble gasses"
There is only ONE thing that can cause this: Fuel Element Failure. (the article does describe this)
Fuel rod containment failed and allowed fission products into the primary coolant. Not only could this corrupt the physics parameters [because fuel can ALSO circulate], it can also greatly increase the radiation hazards when working on the plant while it is shut down ['crud' traps with dangerous levels of gamma radiation, as opposed to something you could work nearby for a few hours without it endangering your life]. F.E.F.'s are BAD.
Any leaks between primary and secondary systems (even tiny ones) can cause fission product gases to end up in the 'air ejector' system of the secondary plant, which then radioactively decay into particulate matter (like Cs and Rb) which ends up in the lungs and causes longer term damage. They'd also increase background radiation levels in the steam plant.
The only thing worse than a gross F.E.F. is a MELTDOWN [which is also a type of F.E.F.]
If this gas accumulation is SO bad that they have to release it (via a 'de-gas' operation which should be infrequent) into the atmosphere at levels above French safety standards, it HAS to be at least SERIOUS.
As I mentioned, fission product gases (typically Xe and Kr) decay into particulates (typically Cs and Rb), which (as particulate) stay in your lungs for a while after you breathe them in, and create "longer" term radiation damage. It's not like you can go outside and breathe fresh air to get rid of it. There is a known "biological half life" for removal of the radioactive particulates. I think it is a month or so.
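For what it's worth, that retention point can be sketched with the standard effective-half-life formula (1/T_eff = 1/T_phys + 1/T_bio); the specific numbers below are illustrative assumptions, not medical data.

```python
# Sketch: the dose you retain falls off with an EFFECTIVE half-life
# combining radioactive decay and biological elimination:
#   1/T_eff = 1/T_phys + 1/T_bio
# Numbers here are illustrative assumptions, not medical data.

def effective_half_life(t_phys_days, t_bio_days):
    """Combine physical and biological half-lives (both in days)."""
    return 1.0 / (1.0 / t_phys_days + 1.0 / t_bio_days)

def fraction_remaining(days, half_life_days):
    """Fraction of the original activity left after 'days'."""
    return 0.5 ** (days / half_life_days)

# For a long-lived isotope (say a ~30-year physical half-life) that the
# body clears in ~70 days, biological elimination dominates:
t_eff = effective_half_life(30 * 365.25, 70)
print(f"effective half-life ~{t_eff:.0f} days")                     # ~70 days
print(f"after 30 days, ~{fraction_remaining(30, t_eff):.0%} left")  # ~74% left
```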
The article points out that TINY F.E.F.s are acceptable (still bad but you can operate). This is why limits exist. If you are above the limit, your problem is SERIOUS and you need to SHUT DOWN and FIX it. And everything I have read in the article suggests that it's SERIOUS enough to SHUT IT DOWN, at the very least. Then they can replace or repair the affected rod and start it up again. That could take weeks, though [you would probably have to wait for decay heat and radiation to be sufficiently low, etc. beforehand].
(you KNOW the N.R.C. in the U.S. would have shut them down!)
it appears, though, that all eggs may be in one basket and there's no other source of electricity that can make up for the temporary loss of this power plant. Lack of proper planning aside, the pressure is ON to KEEP IT RUNNING ANYWAY.
We can expect further information leaking out about this event to be "filtered" accordingly. Unlike the gas releases...
I think they need to send a few more (inflatable?) "Hotel Modules" up there to accommodate the guests.
"Hey can I get some more towels? And another bathrobe? And will there be cappuccino at the continental breakfast? You can add a bit of liqueur to mine, please."
(Don't forget to tip the bellman)
or maybe it's more like camping out under the stars... "roughing it" for fun
changing over to the Linux kernel
as cool as that would be, particularly for kernel drivers (assuming they don't taint the kernel nor require signing certificates), our hopes were dashed LAST time, when we all probably believed that Windows 10 would revert back to 7's UI...
(and a big big frowny face for that)
an international standard for batteries would be less important than a universal standard for charging connections. Sorta like using USB-C connectors except it's a car. The power source would know its own limits, and query the limits of the car, and give you a voltage+current that is appropriate.
the car would be responsible for converting that power into battery voltage and charging correctly, like your phone.
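A toy sketch of that negotiation, assuming a USB-C Power-Delivery-style handshake; all class and field names here are invented for illustration.

```python
# Hypothetical sketch: the charger (source) knows its own limits, queries
# the car's limits, and offers the highest voltage/current BOTH sides can
# handle -- much like USB-C Power Delivery. The car then does its own
# DC-DC conversion to actual battery voltage, like a phone does.

from dataclasses import dataclass

@dataclass
class PowerLimits:
    max_voltage: float  # volts
    max_current: float  # amps

def negotiate(source: PowerLimits, car: PowerLimits) -> PowerLimits:
    """Agree on the largest voltage and current both sides support."""
    return PowerLimits(
        max_voltage=min(source.max_voltage, car.max_voltage),
        max_current=min(source.max_current, car.max_current),
    )

charger = PowerLimits(max_voltage=800.0, max_current=250.0)
vehicle = PowerLimits(max_voltage=400.0, max_current=300.0)
print(negotiate(charger, vehicle))
# PowerLimits(max_voltage=400.0, max_current=250.0)
```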
@tfb - ok - here's some math
According to one source, 173,000 terawatts of semi-usable and otherwise unusable energy strikes the sunny side of the earth continuously. Only a fraction of this can be practically turned into electricity, when you consider solar panel efficiency, infrastructure, cabling, conversion and transmission losses, and unpredictable weather. And we can't cover every square meter with solar panels, so most will be unavailable for use. So we're limited by the flux (energy per square meter), and the ability of our tech to turn that into electricity [when the weather is nice].
It takes about 30-50 kWh of energy (let's say) to charge a typical electric car daily to support average commutes (let's say an hour each way) in California, based on claims of battery capacity and range and some ballpark atmospheric extrapolation.
A solar panel that is one square meter (on a good day, at a good time) could produce 200 W (and in 8 hours, 1.6 kWh). Angle of the sun and day/night limits this effective time period.
with 10 million commuters in California, for 300 million kWh, you would need at LEAST 200 million square meters of panel if you get 8 hours of usable sun each day (or equivalent). More than likely you'll need at least TWICE that and some means of backup power for extended bad weather.
For a single family house, it would mean 20-40 square meters of panels per car JUST to power the cars, assuming good weather all of the time and hyper-efficient storage to handle the peak demand of having your car plugged in every day. That's a lot of panels for one house, bad weather notwithstanding. Apartment buildings with multiple floors would be even LESS practical.
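The arithmetic above can be checked in a few lines of Python (all inputs are the rough assumptions stated in the text, not measured data):

```python
# Back-of-the-envelope check of the figures above. Every input is one of
# the rough assumptions from the text.

commuters = 10_000_000          # California commuters (assumed)
kwh_per_car_per_day = 30        # low end of the 30-50 kWh estimate
panel_watts = 200               # 1 m^2 panel, good conditions
usable_sun_hours = 8            # generous "good day" figure

daily_demand_kwh = commuters * kwh_per_car_per_day            # total kWh/day
kwh_per_m2_per_day = panel_watts * usable_sun_hours / 1000    # per-panel yield
panels_m2 = daily_demand_kwh / kwh_per_m2_per_day

print(f"{daily_demand_kwh:,} kWh/day")   # 300,000,000 kWh/day
print(f"{panels_m2:,.0f} m^2 of panel")  # 187,500,000 m^2
# ...call it 200 million m^2 (about 190 km^2) at 100% availability;
# weather and angle-of-sun margins are what push it toward TWICE that.
```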
Many places (mountains, lakes, forests, roads, agriculture areas, etc.) could NEVER have solar panels on them. Power loss in conversion and transmission forces you to have the generators reasonably close to the point of demand.
In short, claiming that the sun shines 10,000 times the energy needed by human electrical demand onto the planet is EXTREMELY short sighted and does NOT take into consideration what it would require to harness it (nor the efficiency). You just can't put solar panels EVERYWHERE and expect it to work. It's just not practical. [I used to run a nuclear reactor on a submarine, including the electrical power plant, and worked in the power industry for a while, so I have a pretty good idea about a lot of this].
My conclusion: there is NOT enough energy being produced by the sun in order to meet the demands of human electrical power, if you include all of the cars and our needs (and desires) for transportation, and factor in the needs to convert, transmit, and store this power to meet demand.
Hence, we need to use Nuclear, Fusion, or even fossil fuels until we have a technology that's better than what's available now. Keep in mind, with fast-charge high capacity batteries (which we do not have yet) and some form of nuclear power (which is being resisted on every level from what I see) to charge them, electric cars make sense. Without all that, they do not.
(I really didn't want to get this wordy, but it looks like I had to)
This is ALL very true. Very often fossil fuel plants are the ONLY ones capable of handling that kind of electrical demand. There is NOT enough solar radiation on the planet to power cars, whether it is collected via solar panels, wind, or falling water.
The only real solution (if you are trying to eliminate fossil fuels, a position I wholeheartedly disagree with) is FUSION energy, which is the only thing that would make electric vehicles practical AND universal. The second possibility is FISSION energy, until we have working fusion plants.
(Other implications are obvious)
I would like to see worldwide fusion energy. I see it as the cleanest and most abundant form of electrical production. But nothing comes without cost. The question is whether the extremists are willing to shut up about their extremism long enough to get it done.
when high speed rotors are spinning, a slight out-of-balance condition is all you need to cause catastrophic failure in a short period of time: severe vibration, blades striking shrouds, or blades breaking off. All bad for Mr. Drone, who will soon have a very, very bad day.
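A rough illustration of why: the centrifugal force from an unbalanced mass grows with the SQUARE of rotational speed (F = m * r * omega^2). The specific numbers below (1 gram at 10 cm, 10,000 RPM) are invented for illustration.

```python
# Sketch: centrifugal force on an off-axis mass. Because force scales
# with omega squared, doubling the RPM quadruples the shaking force.

import math

def imbalance_force(mass_kg, radius_m, rpm):
    """Centrifugal force (newtons) from an unbalanced mass at a given RPM."""
    omega = rpm * 2 * math.pi / 60  # angular velocity in rad/s
    return mass_kg * radius_m * omega ** 2

# A 1-gram nick out of a blade at 10 cm radius, spinning at 10,000 RPM:
f = imbalance_force(0.001, 0.1, 10_000)
print(f"{f:.0f} N")  # 110 N -- about 11 kg-force of shake, once per revolution
```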
a while back someone did an online mini-series of videos called "Salmon Days". it had its funny moments, especially when the BOFH wrestled with the paperclip because it was repeating "It looks like you're writing a letter", and there was a lady that sounded like a modem or a FAX, and 'Microshaft' customer support froze up in mid phone-answer with the floating paper overhead... but unfortunately a lot of it just wasn't that funny, and it disappeared into the ether.
just worth pointing out... vaccination _is_ a way of establishing herd immunity...
(either that or do what parents did in the 60's for mumps, measles, chicken pox, etc. - expose your kids when they're young and get over it faster, whenever there's an outbreak)
I prefer vaccines, myself. Only reason I haven't gotten one (yet) is I probably had the virus in early Jan of 2020, when a co-worker came back from China and then had to leave work due to fever etc., and then a couple of weeks later I had the symptoms, which went away in a day or so, and came back just once a week later (but milder). And a relative that lives with me got similar symptoms in that time frame. And things with sugar in them taste funny, now (instead of sweet). And early on, I thought "I'll let others get it first" since I'd most likely had the virus, when the vaccine was in ultra-high demand. But eventually, when it's convenient, I'll probably get the jab(s) just to make sure I'm immune.
So yeah, I'm not even remotely close to being an anti-vaxxer, but I still don't like seeing people turn into extremists on either side of this, especially when the arguments deviate from actual science.
the problem with saying "the truth" is that both sides of an issue often make claim to exclusive ownership of it, often with no proof other than "feelz" and "wantz" and "afraidz". On BOTH sides.
science demands peer review and repeatability and modified theories when the results do not come up as expected (and there were no lab mistakes that might have caused it). Over time, something close to "the truth" becomes possible. I would expect this to be true with a LOT of things. Over time, the truth will eventually be known. June 15th comes to mind on this one... (see icon)
But of course mRNA had nothing to do with kernels and so Linus was 100% right.
hmmm - that actually makes a LOT of sense depending on how the date/time math was being done.
more reason to ALWAYS store and work with date+time info as time_t (as GMT), or something very similar, to avoid [most] date+time math issues (then just tolerate any others).
I've done a LOT of date+time kinds of calculations with databases, etc. over the years, for decades even (from business analysis tools to capturing electric power waveform data to millisecond motion data capture) and the idea that a date+time calculation that crosses 0:00 might be responsible for a system-wide outage sounds VERY plausible.
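A minimal Python illustration of why epoch-style seconds (a la time_t) sidestep the midnight-crossing trap: naive wall-clock arithmetic goes negative across 0:00, while a difference of epoch timestamps is always just seconds.

```python
# Broken approach vs. epoch arithmetic for an interval crossing midnight.

import calendar

def naive_elapsed_minutes(start_hhmm, end_hhmm):
    """BROKEN: subtract wall-clock (hour, minute) pairs directly."""
    return (end_hhmm[0] * 60 + end_hhmm[1]) - (start_hhmm[0] * 60 + start_hhmm[1])

# 23:50 to 00:10 the next day is really 20 minutes...
print(naive_elapsed_minutes((23, 50), (0, 10)))  # -1420  (wrong!)

# Stored as epoch seconds (UTC), the same interval is unambiguous:
start = calendar.timegm((2021, 6, 1, 23, 50, 0, 0, 0, 0))
end   = calendar.timegm((2021, 6, 2,  0, 10, 0, 0, 0, 0))
print((end - start) // 60)                       # 20
```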
I think AWS (and others) had a similar problem once (or maybe MORE than once) due to a leap second and its effect on the world-wide synchronization of data...
sorta mentioned in the article, BTC apparently dropped quite a bit in value... and if it is THAT easy for the FBI to play 'follow the money' to find crooks, BTC has significantly less value at hiding them from Johnny Law.
(yet another reason why I don't invest in it)