Re: Dear me
I have a working original IBM PC from about 1982 with original monitor in my office at work.
This is the company that switched from Motorola 680x0 to PowerPC to Intel to Apple's own take on ARM pretty painlessly*. They seem quite good at dealing with major changes to me. Pretty ridiculous article if you ask me.
* I'm not counting the 6502 - switching from that also involved the user switching to a quite different OS/Interface.
I think the new font is the BBC's very own 'Reith' font. It was, at one point, available to download in case you were going to create any content for the BBC. It is horrible. I can't understand why on earth they thought they needed their own font, let alone one as ugly as this. They were planning to move everything over to 'Reith' in the future.
Quarantine in response to notification from the app is voluntary, so you will not be fined if you ignore it (although you will likely be an idiot). Quarantine in response to a call from Serco's track and trace is not voluntary and you may be fined if you ignore the instruction.
Barrett said back when the project was launched, "but by partnering today with stakeholders across industries and agencies, we can set up the United States for this aerospace phenomenon."
I read this as "but by partnering today with SKATEBOARDERS across industries and agencies..."
I think they might have more success if they did partner with skateboarders for this one.
A friend of mine, who was doing his PhD and programming PCs in Prolog to do something to do with analysing people's understanding of skin diseases (shades of The Singing Detective), took it upon himself to 'tidy up' a SPARCstation 2 that was about to replace something ancient we'd been using as a fileserver and host for some early experiments in website design (this is 1990 or so). For some reason he decided (as root) to delete /dev as it seemed to be full of lots of useless empty files. Not a good idea.

The machine was connected to a network and the console was running the SunOS 4.1 GUI with a terminal open. I did not know much more than my friend, but I did have my own Sun 3/60 and I'd been on a short course for scientists who had to deal with new-fangled workstation things. I have forgotten how I did it, but armed with my trusty SERC 'how to be a unix system admin' manual that came with the two-day course I'd done, I managed to retrieve everything. In the land of the blind, the one-eyed man is king.
What is the new full definition of the candela? It is an interesting SI unit because it is not defined wholly in physical terms. It is a psychophysical unit, in that it takes into account the function relating the sensitivity of the human visual system to lights of different wavelengths, V(lambda). There should be only one wavelength at which the candela can be defined wholly physically; at every other wavelength the physical value (in watts per steradian - that's where the caesium atom transitions and fractions of the speed of light can come in) must be multiplied by V(lambda). Someone just has to measure V(lambda) with real people, by asking them to make judgements about the relative brightness of lights of different colours (wavelengths). How is that bit - the measurement of V(lambda) - now defined?
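For what it's worth, my own reading of the 2019 SI redefinition (my summary, so treat with caution) is that the psychophysics is confined to one fixed constant plus the conventional CIE table:

```latex
% Candela after the 2019 redefinition, as I understand it.
% The luminous efficacy of monochromatic radiation at 540 THz
% (about 555 nm, the peak of V(lambda)) is fixed by definition:
\[
  K_{cd} = 683\ \mathrm{lm\,W^{-1}}
  \quad \text{at } \nu = 540 \times 10^{12}\ \mathrm{Hz}
\]
% At any other wavelength, luminous intensity comes from radiant
% intensity (W/sr) by multiplying by the CIE-tabulated V(lambda):
\[
  I_v(\lambda) = K_{cd}\, V(\lambda)\, I_e(\lambda)
\]
```

So, as far as I can tell, V(lambda) itself is not redefined at all: it remains a conventional function adopted by the CIE from exactly the sort of brightness-matching experiments described above, and the SI simply fixes the exchange rate at the one wavelength where V = 1.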
See the incredibly beautiful study of Hecht, Shlaer, and Pirenne (1942) Energy, quanta, and vision. J. Gen. Physiol. 25, 819-840. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2142545/ Dark-adapted human rod cells require between 5 and 8 quanta on average to produce a neural signal. After the effects of absorption by the parts of the eye between the outside world and the retina, this corresponds to an ability of the humans tested to detect between 54 and 148 quanta (i.e. that's all you need to see a light under optimal conditions - the integration time is about 100 ms, so all the quanta have to arrive within 100 ms for optimal performance).
Here is the relevant section of the paper's abstract:
"With these three corrections, the range of 54 to 148 quanta at the cornea becomes as an upper limit 5 to 14 quanta actually absorbed by the retinal rods. 3. This small number of quanta, in comparison with the large number of rods (500) involved, precludes any significant two quantum absorptions per rod, and means that in order to produce a visual effect, one quantum must be absorbed by each of 5 to 14 rods in the retina. 4. Because this number of individual events is so small, it may be derived from an independent statistical study of the relation between the intensity of a light flash and the frequency with which it is seen. Such experiments give values of 5 to 8 for the number of critical events involved at the threshold of vision."
A Vision Scientist.
I'm amazed at how many of the tin-foil hat brigade showed up for this one. If you are that paranoid about surveillance and the lengths the higher powers will go to in order to get it, then surely you realise that the NSA have 'special' code inserted into every smartphone OS and every network adapter driver for every operating system used by more than five people. Despite all the guff about the security of open source, no-one actually checks everything (and they don't know the 'one special trick').
That's a great move. I like them. I (genuinely) only use TeamViewer for home stuff. At one point, not because of usage, but because I clicked a button to try some new feature for free, they decided I must be using it commercially. I emailed them. They checked their records. I was put back to 'free' and they even apologised. All also done very quickly. Quite amazing.
The reviewer said "At nearly £300, it's almost as much as a pair of Sony WH-1000XM3 headphones – which is one of this reviewer's daily drivers". Those are *headphones*. They seem to be being confused with WF-1000XM3s in the comments, which are Sony earbuds. I now don't know which I've missed out on buying for $280 from Orange-Idiot Land. On a related note (again risking bud and phone mix-ups), I had some Sennheiser Momentum Wireless headphones which sounded great but were quite incapable of maintaining a connection to my phone when there were other Bluetooth devices nearby - something which does occasionally happen in places like aeroplanes. I replaced them with the Sony WH blah blah blah, which might not sound quite as good, but do work wirelessly.
Can't someone just recreate the simplicity of Borland Builder as it was in the late '90s? It made coding the interface trivial, so you could concentrate on the stuff behind. I'm not a professional programmer, but I program a lot (I'm an academic), and nothing seems to have replicated Builder's combination of simplifying the building of the interface while still letting you do whatever you want behind the scenes (I program experiments in visual perception, so I don't care about things like database integrity, but sometimes I want to write programs that other people - students - can use easily).
Isn't writing some code to normalise image stats and provide a GUI to tag regions of pixels a 10-minute job in Matlab, if you know its GUI builder and have the Image Processing Toolbox? I did something like this a few years ago, to let someone tag regions of a scene and generate a recoloured version for eye-movement analysis, and it was trivial. My understanding is that it is this front end that gets your software classified as a munition or whatever, not the use of the tagged results to train some network. Absolutely crazy.
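Roughly that job, sketched here in Python/matplotlib rather than Matlab (the file name and the normalise-then-rescale recipe are just illustrative assumptions on my part):

```python
import matplotlib.pyplot as plt
from matplotlib.widgets import PolygonSelector

# Load an image (hypothetical file), reduce to greyscale, and normalise
# its stats: zero mean, unit variance, then rescale to [0, 1] for display.
img = plt.imread("scene.png")[..., :3].mean(axis=2)
img = (img - img.mean()) / img.std()
img = (img - img.min()) / (img.max() - img.min())

regions = []  # vertex lists for each tagged region

def on_select(vertices):
    regions.append(vertices)
    print(f"tagged a region with {len(vertices)} vertices")

fig, ax = plt.subplots()
ax.imshow(img, cmap="gray")
selector = PolygonSelector(ax, on_select)  # draw a polygon to tag a region
plt.show()
```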
Do these need a wire to connect to a sound source? Despite being a bit of a Senn fan (the HD25 and HD26 Pro are fantastic, indestructible headphones), I have just sent back a pair of Sennheiser Momentum 3 wireless noise-cancelling headphones because, although the sound quality, noise cancellation, and comfort were great, the Bluetooth performance was terrible - on an aeroplane where lots of other people were using wireless headphones, they kept cutting out horribly. I got a pair of Sony WH-1000XM3s (such a romantic name) instead.
Out of genuine interest, what is currently wrong with iOS 13.2.3? In addition, has anything been done about MacOS Catalina yet, or is it still a disaster zone? Whatever is/was wrong with iOS seems minor compared to what I read about the current MacOS...
Can't algorithmic programs learn by example? Isn't Doug Lenat's Cyc (https://www.cyc.com/) algorithmic rather than an artificial neural net? (So far as I can see, the book is really about ANN-based AI, not AI as it was practised in the days of LISP machines and '60s cognitive science.)
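To be clear about what I mean by learning by example, here is a deliberately trivial, entirely algorithmic learner (my own toy, nothing to do with Cyc or the book) that induces a threshold rule from labelled examples with no neural net in sight:

```python
def learn_stump(examples):
    # examples: (value, label) pairs with boolean labels. Returns the
    # threshold t for which the rule "value >= t" gets the most right.
    best_t, best_correct = None, -1
    for t in sorted(v for v, _ in examples):
        correct = sum((v >= t) == label for v, label in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

data = [(1, False), (2, False), (5, True), (7, True), (9, True)]
print(f"learned rule: x >= {learn_stump(data)}")  # learned rule: x >= 5
```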
Partly it is down to IT service management policies where I work. If you use a Windows machine for office stuff it ends up being centrally managed, which means you lose admin rights. If you have a Mac, none of this happens. I used to work with Macs in the late 1980s and came to hate them, but I thought I'd try giving them another go a few years ago because the admin-rights business was so frustrating. The terrible instabilities of MacOS 7.5.3 etc. from days gone by were no more. Everything worked without a hitch. I could run enormous displays and use a nice keyboard at work (and even run a nice graphics card in an eGPU) through a dock, then take the MacBook home by just unplugging a single Thunderbolt 3 cable. I could have set all this up from Windows, but I suspect it would have been harder to do. So, all in all, I don't think Macs are leaps ahead of Windows, but they are easy to use, and once I'd made the switch I liked the experience.
Bollocks. Here I sit with an enormous Windows box with a rather good Nvidia RTX 2080 Ti running an HTC Vive Pro Eye that I program in C# in Unity, and next to that an HP dual-Xeon server running Linux on which I develop (not use, develop) multispectral physics-based raytracers for vision research. But I'm typing this on a MacBook Pro, because that's what I like using as an office machine. So piss off with your "People who buy Apple computers are either not IT literate or idiots, no ifs buts or maybes". Idiot.
Is this the same OPPO who sell audio gear? Planar-magnetic headphones which are too expensive for me to buy (a rich friend has some, though), a very good portable headphone amp, which I do possess, and a desktop headphone amp which is way out of my price range. All decidedly at the high (but not totally, totally insane) end of things. On that basis I'd expect them to be selling £1000+ phones, with maybe one lesser option.
(The megaphone is the closest icon I could find for 'audio')
The current NN craze (CNNs, deep learning, etc.) has pretty much nothing in common with the real thing, in which individual neurons have complicated time-dependent dynamics even before cyclic connectivity gives the network itself complicated dynamics (difficult things which are intentionally avoided in artificial neural networks).

Many years ago I was in Santa Fe at the complex-systems institute, where I met a mathematical physicist (ex-string theorist, I think) who was trying to work out how the different modes of 'chewing' in the stomach of a lobster were produced by the 12 neurons in the lobster's stomatogastric ganglion - repeat, just TWELVE neurons. He was part of a whole team of heavy-duty applied mathematicians, and they hadn't made a great deal of progress in understanding the 12-neuron system when I heard about it (but they did do experiments, and got to eat their subjects once the experiments were over, which was good). It is relatively simple to construct a model that approximates the neural network in a nematode (although figuring out how much detail you need in the models of individual neurons to adequately mimic the behaviour of the real thing is tricky), but that is utterly trivial compared with understanding how the properties of the network give rise to particular patterns of neural behaviour, and why those patterns are of use to the organism as a whole.
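To make the contrast concrete, a toy sketch (entirely my own, and nothing like as rich as a real neuron): a standard ANN unit is a static function of its current inputs, while even the crudest spiking model - leaky integrate-and-fire, itself a drastic simplification - carries internal state that evolves in time.

```python
def ann_unit(inputs, weights):
    # Static ANN unit: the output depends only on the current inputs.
    return max(0.0, sum(w * x for w, x in zip(weights, inputs)))  # ReLU

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0):
    # Leaky integrate-and-fire: membrane voltage v is state that leaks
    # and integrates input over time; a spike fires when v crosses
    # threshold, then v resets. Returns the list of spike times.
    v, spikes = 0.0, []
    for step, i_t in enumerate(input_current):
        v += (dt / tau) * (-v + i_t)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = 0.0
    return spikes

print(ann_unit([0.5, -0.2], [1.0, 2.0]))  # always the same answer
print(lif_neuron([1.5] * 100))            # spike times depend on history
```

And even that still has none of the complicated time-dependent dynamics or cyclic connectivity that make the twelve-neuron ganglion so hard.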
The idea that everyone in the 50% of the school-leaver population who were encouraged to go to university would actually benefit from it is crazy. A lot of people were essentially conned into spending a lot of money on something they probably didn't enjoy (the learning bit, not the social life bit) and that, in truth, wasn't going to land them better jobs than they could have got without a university education.
All this business about needing PGCEs or other teaching qualifications to be hired to teach in universities is odd, weird, or just wrong. These days in my hallowed institution (in the worldwide top 100, etc.) lecturers are hired regardless of whether they have any teaching qualifications (we do care about track record in research, though), but they have to do a teaching course during their first two years of employment. This came in some time after I started, so, as a member of the old guard, I have no teaching qualifications of any kind yet still get to be Prof. and teach at all levels. If, however, I retired from the university and wanted to do some school teaching, then I really would need a PGCE or something (although I think I could teach at a private school without any teaching qualifications).
Suppose I want to get an input n times, and each time I get it I test its value and then choose what to do on the basis of that value before returning to get the next input. How do I do this in a language like Bosque?
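To be concrete, here is the shape I mean, in ordinary imperative Python (the handler names and input source are made up for illustration); my question is how this get/test/branch/repeat structure comes out in Bosque's loop-free style:

```python
def handle_negative(v): print("negative:", v)
def handle_zero():      print("zero")
def handle_positive(v): print("positive:", v)

def process(n, get_input):
    for _ in range(n):            # get an input n times
        value = get_input()       # fetch the next input
        if value < 0:             # test its value and branch on it
            handle_negative(value)
        elif value == 0:
            handle_zero()
        else:
            handle_positive(value)
        # ...then come back round for the next input

process(3, lambda: int(input("value? ")))
```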
[This isn't some cynical trick Reg-comments-esque question - I don't know the answer and I'd genuinely like someone to explain it to me.]