Re: wut?
You get Intel Inside (TM)
Has anyone considered they might be using something like a PenTile matrix to achieve the resolution/pixel density of the "new iPad"? That would certainly explain why an image could look worse when compared to the iPad 2 since the pixel quality on the new iPad would not be equal to that of the iPad 2.
If not, then Apple has only its own programming to blame, since it should just be a simple case of drawing what was one larger pixel on the iPad 2 over four of the pixels on the new iPad - surely the simplest method of image up-sampling possible?
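For anyone curious, here is a minimal sketch of that pixel-replication up-sampling (not Apple's actual code - the numpy-based function and sample array are just illustrative):

```python
# Pixel-replication (nearest-neighbour) 2x up-sampling: each source pixel
# is simply repeated over a 2x2 block of the higher-density display --
# the "one pixel becomes four" case described above.
import numpy as np

def upsample_2x(image):
    """Repeat every pixel twice along both axes."""
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)

# e.g. a 2x2 greyscale 'iPad 2' patch becomes a 4x4 'new iPad' patch
old = np.array([[10, 20],
                [30, 40]])
new = upsample_2x(old)  # each value now covers a 2x2 block
```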
The long:
Amazon already have their own app store. It's only available in the US at the moment, although you can browse it (but not download anything) from elsewhere. It's better categorised and easier to browse or search than the Android Market, and it also has daily offers where a paid-for app is given away free for the day (e.g. today they are offering QuickOffice Pro, worth $14.99, for free).
No Android Market, and the other Google apps will not be included on the device either.
Lots of high-end GPS devices actually include a barometer. Although GPS is very good at locating your position on the surface of the Earth (x, y), it is pretty bad at determining your altitude.
A barometer can be used to get a quite accurate altitude reading which can then be used to speed up the GPS lock.
At least that's one of the reasons the Galaxy Nexus and Galaxy Note have them; I'm not certain whether the Xoom and other tablets use it for the same thing but I would guess so.
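For the curious, the pressure-to-altitude conversion is just the standard international barometric formula rearranged; a rough sketch below (the 1013.25 hPa sea-level pressure is an assumed default - real devices calibrate it, e.g. from local weather data):

```python
# Approximate altitude from a barometric pressure reading (hPa), using the
# standard international barometric formula. The sea-level pressure default
# is an assumption; phones/GPS units normally calibrate it.
def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Return approximate altitude in metres for a pressure reading in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(pressure_to_altitude(1013.25))  # ~0 m at sea level
print(pressure_to_altitude(899.0))    # roughly 1000 m
```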
Maybe because Universities have things called mailing lists, and considering the amount of email people already get about careers/conferences etc. at these places, they don't really feel the need to look elsewhere for even more.
So have you/they contacted Edinburgh? AFAIK here at the ICC in Durham we have had no publicity about the event -- this is the first time I've seen it. If we had been told then I'm sure some of us would come! I know a bunch of us are heading down to Edinburgh for the DEISA/EPRACE Symposium at the end of this month.
However a good point is made by @AC: barely any of us know hardware in-depth: there's no time. It takes long enough to build a simulation, and then there's GBs/TBs of data to analyse when you're done. This does seem to be aimed more at the computer science people who build/design supercomputers rather than the scientists/engineers/mathematicians that use them most of the time.
We have sysadmins and IT staff to install the new supercomputers, tune them, deal with email and network issues etc., so researchers only really play a part in the procurement of new supercomputers - i.e. is X good enough for our simulation Y?
And this is generally cost-limited: the performance gains by tweaking our supercomputer to the maximum config (over what has already been done) would not be gigantic - for several months spent improving the system we might cut a 9-month job down to 8, or maybe 7.
Also the issue with these jobs is not that they take 8-9 months to run in the first place. It is that a simulation 10x the size takes 8-9 years+ to run, and for it to be a feasible use of departmental computing resources it generally has to take less than a year or so. No one cares about running a simulation with 1.1x the resolution of somebody else's: it has to be around an order of magnitude if you want to shout about it.
I think we could see larger improvements by making better use of the supercomputer and computer resources we already have - i.e. better batch queue systems, being able to use a bunch of idle workstations for smaller computations, and better connections to other University systems so that if we need 100 extra cores just for a week/month we can get them easily. Sadly UK Universities have not set up an on-demand grid to be shared between everyone. Still, I guess we'll all be moving to platforms like Amazon's EC2 in the future, which should improve this aspect.
Anyway I'll ask the folks around the dept if anyone's interested.
There are 2 major flaws in the Nexus S:
1) Prior to 2.3.2 it has an SMS bug
2) In 2.3.2 there is a bug causing it to randomly restart (especially on calls longer than 3 mins)
one of which *must* currently be present in the handset when sold (it will have 2.3.2 or a previous version), yet I see both CPW and Vodafone failing to advertise as much (despite Google having publicly acknowledged these bugs).
Thus AFAIK (IANAL) all current sales technically breach the Sale of Goods Act (1999) - a product must be as advertised and of 'satisfactory quality'.
PS: Was Nexus S owner. Might get one again when they've sorted out this mess. Wish I'd never let my N1 go.
@AC: I digress; psychologists and biologists *are* scientists really - we physicists just love to tease the others. It gives us a sense of superiority that many of us physicists (a larger proportion than in psychology etc.) fail to achieve outside of scientific practice.
AFAIK no one actually uses h-indices to select or differentiate candidates for a position; they will look at the total number of citations your papers have relative to others in the field, and the total number of papers you have published relative to their length and the time you have been researching.
The h-index is really a measure of how popular you are as a scientist *overall*, not how popular a certain piece of work is in its field.
For instance if you worked in a really obscure field with only 3 other researchers in total and had produced 2 papers both cited by all 3 others, then those pieces of work would clearly be very significant (espec. if you were being hired to do something like that), but none of this would be revealed by your h-index.
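To be concrete, the h-index is just the largest h such that h of your papers have at least h citations each - a quick sketch (illustrative code, not any ranking service's implementation):

```python
# h-index: the largest h such that h papers have at least h citations each.
def h_index(citations):
    """Compute the h-index from a list of per-paper citation counts."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The obscure-field example above: 2 papers, each cited by all 3 other
# researchers, gives an h-index of just 2 -- significant work, tiny number.
print(h_index([3, 3]))  # 2
```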
So within their field, homoeopathists etc can get lots of citations and thus jobs in homoeopathy, since their employer really only cares about their performance in that field (quite rightly so).
As a researcher at the Institute for Computational Cosmology in Durham, I wholeheartedly agree with the move away from the discredited 'scientist' term, and would like to second 'boffin' for the replacement.
Furthermore I hereby propose a test which any boffin may apply to distinguish a scientist from a boffin:
1) They can handle basic maths: integration, differentiation, logarithms (removes biologists/wasters)
2) Every number they quote has an error attached (removes mathematicians, economists)
3) They never, ever use the word 'believe' when talking about science (this one removes the amateur scientists)
Your subject is a boffin if (and only if) they pass all three stages.
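In the spirit of rigour, a tongue-in-cheek sketch of the test (the field names are made up for illustration; the national centre for boffinry can define the official schema):

```python
# The three-stage boffin test: a subject is a boffin if (and only if)
# they pass all three stages.
def is_boffin(candidate):
    passes_stage_1 = candidate.get("can_do_basic_maths", False)        # integration, logs, etc.
    passes_stage_2 = candidate.get("quotes_errors_on_numbers", False)  # every number has an error bar
    passes_stage_3 = not candidate.get("says_believe_about_science", True)
    return passes_stage_1 and passes_stage_2 and passes_stage_3

print(is_boffin({"can_do_basic_maths": True,
                 "quotes_errors_on_numbers": True,
                 "says_believe_about_science": False}))  # True: a boffin
```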
I propose we set up a national centre for boffinry, and only once you have completed the required testing will you be allowed to call yourself a boffin.
Whilst to most of you it may seem to be an error that the application identifies a Compaq server as serving coffee, those of us who dedicate our lives to improving coffee drinking know different.
Computational coffee preparation is an actively researched topic - the basics were formally defined in RFC 2324 (HTCPCP/1.0) - including the PROPFIND method required to find out metadata about brewed resources.
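For the uninitiated, a rough sketch of what an HTCPCP/1.0 BREW request looks like on the wire (the coffee-pot hostname is hypothetical, and a pot asked to brew tea is of course entitled to answer 418 I'm a teapot):

```python
# Send an RFC 2324-style BREW request to a (hypothetical) networked coffee pot.
import http.client

conn = http.client.HTTPConnection("coffee-pot.example.com", 80)
conn.request("BREW", "/pot-1",
             body="start",  # coffee-pot-command body: "start" or "stop"
             headers={"Content-Type": "application/coffee-pot-command"})
resp = conn.getresponse()
print(resp.status, resp.reason)
```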
However we all know that the future of coffee lies in the cloud like everything else*, so I expect to see a specification on "large-scale distributed coffee preparation using HTCPCP" in the near future.
I hate to say it, but M$ should probably just can XP. It was good, but half the people who want it now haven't actually tried Vista; they just like to insult it. I'm not saying it's brilliant, but it's certainly better than using an OS that's 7 years old: more secure, more stable with SP1 (providing you don't run a load of legacy apps), and it actually uses your GPU.
Also, another comment I've heard about Vista is 'it's different' - well, what do you expect? It'd be pretty poor if Microsoft came back after 5 years and gave you XP with a new name. Things have changed to make it better, people.
Not worth upgrading if you're running an ancient comp though.