
Re: The Hall Effect, Polarized Light and Instantaneous Universal Communication
Blahblahblah. Another pile of pseudoscientific bullshit waffle from FSS.
Two Antec PSUs that went wonky, first one after just under a year, the other about half a year later. And one motherboard (some ASUS socket 754 iirc) that started to dislike half its memory. Apart from that, just external influences like a lightning strike across the street killing a soundcard and modem.
That's hardly an endorsement. Kit like that should last at least five years, else I'd consider it 'not fit for purpose' with a subsequent claim against the manufacturer.
The most intensively used systems I can call my own are all Thinkpads of varying age, none younger than those five years. The only problems I have to deal with are a reluctant chipset fan (not the primary CPU fan, and once it's past POST the system apparently doesn't care about it stalling again) on an X61, and the batteries on two 701Cs being, quite understandably, rather expired. Somewhere in the past I had a T23 and an A21 joining the choir invisible at age 7+, and an X22 that was loaned out, subjected to a puddle of soft drink, improperly cleaned and only handed back after several days. It did keep going for about three months, but finally ceased to be. An X30 is still in use 24/7.
When other people are using your WiFi, you are to a certain degree acting as their 'ISP by proxy'.
My router, my rules. Also applies to ad networks trying to route packets in.
If people are paying for connectivity, they can expect sites to be blocked or not according to their wishes. If they don't, tough shit.
You missepled "thrustworthy". Worthy of thrusting onto the scrapheap.
Which, apropos of not very much at all, reminds me of the prank a couple of friends pulled off at an event. It started with a Cisco 2900 and a pickaxe, which turned the 2900 into a very dented 2900, nearly split down the middle. The circuit board was removed and two 8-port desktop switches fitted, so that 14 ports could still work. A few strips of sticking plaster were applied to cover the more ragged edges of the case, and then they casually walked into the event NOC to have an uplink activated.
As the network team tended to use 2900s as field distribution switches, they understandably assumed it was one of theirs and collectively went rather pale. Demonstrating that the switch still worked when plugged in added a good pile of incredulity to the paleness.
Yeah, and we were limited to 640k in the 286-486 days, then came Extended Memory pushing it up to 1 MB
The 286 was already capable of 16MB, but DOS, running in 8086 real mode, could only deal with 1MB, of which the area from 640k to 1M was used for the BIOS, controllers and a window into expanded memory if you had that. Everything over 1M was designated as extended memory.
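For the curious, the arithmetic behind that map is trivial; here's a quick sketch in Python (the 640K/1MB/16MB boundaries are the standard PC memory map, the rest just follows from the address-bus widths):

    # 8086 real mode: 20-bit addresses; the 286 has 24 address lines.
    real_mode_space = 2 ** 20                           # 1 MB total in real mode
    conventional    = 640 * 1024                        # what DOS programs actually get
    upper_area      = real_mode_space - conventional    # 384 KB: BIOS, video, EMS window
    full_286_space  = 2 ** 24                           # 16 MB on a fully decked-out 286
    extended        = full_286_space - real_mode_space  # everything above 1 MB

    print(f"conventional: {conventional // 1024} KB")
    print(f"upper memory area: {upper_area // 1024} KB")
    print(f"extended memory, maxed-out 286: {extended // 2 ** 20} MB")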
Several other OSes of that time were capable of using all available memory without having to move back and forth between real and protected mode.
And of course there were also other architectures, not saddled with icky design decisions made for the sake of backwards compatibility, that few people actually used.
Memory DOES come in power-of-2 sizes; always has, always will. Width, as seen from the bus, will be a multiple of 8, or 32, or 64 bits, and the addressable size will be a power of 2 because of the way address decoding and address mapping between memory banks work.
None of this disk drive sizes marketroid malarkey.
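The power-of-2 part is nothing more than address decoding: n address lines select exactly 2^n locations, so a chip or bank can't sensibly be any other size. A trivial illustration in Python (the line counts are arbitrary examples):

    # n address lines decode to exactly 2**n selectable locations,
    # which is why memory capacities always land on powers of two.
    for address_lines in (10, 16, 20, 24, 32):
        locations = 2 ** address_lines
        print(f"{address_lines} address lines -> {locations:,} locations")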
"I don't want to live on this planet anymore"
When tackling problems, any problems, one should start at the source. In this case the joss stick burners and whale song hummers, plus any that have commissioned them to burn and hum, plus those that have accepted the result.
'B' ark, hold 17.
If the power LED (turning from green to red to indicate 'off', instead of just plain turning off) draws 300mA, that cam would make a nice table lamp.
Indicator LEDs don't need to draw more than a couple of milliamps, 10mA if you want a nice bright one. Whatever the NEST cam is doing, it's not actually turning off.
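For comparison, the usual series-resistor sums, with assumed but typical values (a red indicator LED with a ~2V forward drop on a 5V rail):

    # Series resistor and dissipation for an indicator LED:
    #   R = (V_supply - V_forward) / I,  P_resistor = (V_supply - V_forward) * I
    v_supply, v_forward = 5.0, 2.0      # assumed rail and typical red-LED drop, volts
    for current_ma in (2, 10, 300):
        i = current_ma / 1000
        r = (v_supply - v_forward) / i
        print(f"{current_ma:>3} mA -> {r:>5.0f} ohm, "
              f"{(v_supply - v_forward) * i:.2f} W in the resistor, "
              f"{v_forward * i:.2f} W in the LED")

At 300mA the LED alone would be dissipating over half a watt, which rather supports the table-lamp quip.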
I would so like to see this message written in fire in letters thirty feet high on the far side of the Quentulus Quazgar Mountains in the land of Sevorbeupstry on the planet of Preliumtarn, which orbits the star Zarss, which is located in the Grey Binding Fiefdoms of Saxaquine.
And made out of correctly stacked MSFT executives doused in petrol.
Windows 10 is making me seriously consider getting a linux for the home pc
A bit slow on the uptake, eh?
not the place to rant about MS being total bastards because it just shows them as incompetent.
So it's OK because they're occasionally merely incompetent bastards, and full-blown proficient bastards at other times?
It's not that they messed up and had to reset the Advertising ID setting to what it was before; it's the fact that such an identifier is assigned to W10 users in the first place. Google Anal-ytics has the "decency" to work via a separate domain, which can be blocked. Making it part of the OS turns it into a different kettle of fish.
Whoever thought that up should be fed, genitalia first, into a meat grinder.
Recently, Windows Update on my dad's laptop (this one, and his desktop PC are the only W7 systems in my care) b0rked. Several MSFT fixing tools chewed on the problem, but failed to correct it. So finally I put a new disk in, installed W7+SP1 (and Mint next to it), told it to fetch updates, AND VETTED THEM ONE BY ONE. All updates mentioned in one of the replies here or in the other W10 articles were unchecked and hidden, as well as any of the other updates that smelled even vaguely suspicious. The Do Not Want settings were applied to the registry, and only then were the remaining updates applied.
Still, powering up the lappie to hand it back last Monday, it showed that &^%*&$^*^!! "Get W10" icon in the notification area. Investigating, it turned out that there were a couple of entries in the Task Scheduler related to GWX, which were terminated with extreme prejudice.
I didn't have time to investigate the source of this particular GWX invasion, but told my dad not to click on it if it ever reappeared (he fully understands why, so I don't expect problems there) and to let me know.
Arrgh.
It's nice if you know the content of the messages, but knowing who sends what, and when, is a good start. The British were already using it in WW2 for Axis radio messages, with unusual traffic levels or message lengths being indicators of higher priority for deciphering.
The Da'esh know this, and therefore use a phone just a few times before dumping it, well before any of the spook organisations would become aware of it being used for nefarious purposes. The fault there wasn't using plain-text messaging; the fault was dumping the phone where it could be recovered, and too close to where the attack took place. Recovering the phone led to checking when it was used and where the messages were sent from and to, which in turn led to the raid in Saint-Denis.
I think an interesting voting system would be one that allocates a number of points to each voter, who can then assign any number of those points in favour of or against any of the candidates. So, with a 10-point quota as an example, one might vote
Candidate Not Just No But Hell No: -5
Candidate Fairly Sensible: 1
Candidate Monster Raving Loony: 4
And with it, proportional representation, not first-past-the-post
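As a sketch of how such a ballot might be validated and tallied: the 10-point quota and the assumption that points are spent by absolute value (so -5, +1 and +4 together exhaust the quota) come from the example above; everything else is made up for illustration.

    # Hypothetical point voting: each voter gets QUOTA points to spend,
    # for or against candidates; spend is counted in absolute value.
    QUOTA = 10

    def valid(ballot):
        return sum(abs(p) for p in ballot.values()) <= QUOTA

    def tally(ballots):
        totals = {}
        for ballot in ballots:
            if not valid(ballot):
                continue                      # overspent ballots are discarded
            for candidate, points in ballot.items():
                totals[candidate] = totals.get(candidate, 0) + points
        return totals

    print(tally([{"Not Just No But Hell No": -5,
                  "Fairly Sensible": 1,
                  "Monster Raving Loony": 4}]))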
sounds outside of human hearing in the first place?
It is generally accepted that HiFi reproduction should extend to 20kHz with a maximum roll-off of 3dB. Sure, a lot of people won't be able to hear frequencies that high, but quite a few do, even when over 25. I once had a housemate banging on my door when I was testing my speakers' frequency response and had left the generator running at about 23kHz afterwards. My own hearing went up to 19.45kHz back then. And it's not just straight high-frequency reproduction that matters; there are also all kinds of step-response matters that come into play.
Anyway, even if an average TV sound system started to roll off at, say, 15kHz, it would still be possible to send info to a phone at 17..18kHz, just at higher levels so that it can still be picked up. Only with digital filtering can you effectively create a sharp-ish cutoff at a particular frequency, and that has to be built into the TV's sound processing system. And why should the manufacturer do that?
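To put a rough number on that "higher levels" bit: modelling the TV's output as a simple first-order roll-off with a 15kHz corner (a deliberately crude assumption), the beacon frequencies are only a few dB down:

    from math import log10, sqrt

    # First-order low-pass: attenuation in dB at frequency f for corner frequency fc.
    def rolloff_db(f, fc):
        return 20 * log10(1 / sqrt(1 + (f / fc) ** 2))

    fc = 15_000.0                       # assumed corner of the TV's audio chain
    for f in (17_000.0, 18_000.0, 20_000.0):
        print(f"{f / 1000:.0f} kHz: {rolloff_db(f, fc):.1f} dB")

That's roughly 3.5 to 4.5 dB of loss, easily made up by embedding the beacon a little louder, which is rather the point.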