Re: Wow
The VAIO is not so good for portability and battery life though, is it? Four hours of light use on a charge? Seriously?
Similarly teaching them the principles of programming on the Pi or Arduino is a good thing.
Given that, would kids playing with Lego be worthy of a news article? Would complaining about such a news article be expected on any news site that doesn't deal with children's play as a major theme?
There are some powerful and interesting toys out there. People are coming up with interesting uses for them on a daily basis. Comparatively this phone isn't special (it's not even the first phone to be built out of similar bits), but it involved the Pi, which somehow elevated it to newsworthy.
Daniel's not far wrong, in the world of hacking this kind of thing is considered to be the equivalent of playing with Lego. I'm pretty sure I have all the bits I'd need to do the same kind of thing on the shelf at home, but just because you can do something doesn't mean you should.
What is considered clever in that world is either coming up with a use for the parts that no one has thought of or done yet (I remember one project that implemented a Harry Potter style clock, that used the clock hands to show where you were, based on a GSM module) or are technically very complicated or intricate (one guy reverse-engineered a failed custom IC in a HP logic analyser, and built a replacement from surface mount parts on a tiny board that plugged into the original socket). The fact that this uses a Pi for a controller may get it a mention on El Reg, but it's small beer compared to the kinds of things that are presented every day in that world.
That the food most folk buy for these things seems to come mainly from supermarkets. If you're really on a budget then scouring street markets or, at a push, the likes of Aldi and Lidl (mostly for what's on special offer) seems a better bet. Shopping direct from farms, foraging etc. also seem to be ignored. Your third-world inhabitant doesn't have to feed the same number of middlemen that we do.
Erm, how about you wait until (or if) Apple release the thing, look at what it actually offers, see if the price and features match up with your needs and then decide? Saying you won't buy an unannounced, mythical thing because you think it will work in a certain way is only setting yourself up for ridicule.
You're kidding, right? Latency is not an issue for HFT? Why on earth are the banks paying millions to co-locate their servers in the same machine room as the exchange systems, then going the extra step of using FPGA-based devices to work out whether any given piece of data arrived first on the primary or backup network link?
This would be opposed to Microsoft, who habitually run public betas?
It doesn't matter how many you have in your QA team, there will always be someone using things in a way you hadn't expected. If you can find and fix those bugs before releasing the final product then people will be happier. At the same time there are those who actively want to try out new software. This addresses both sides of the equation. Why you should feel the need to moan about it is beyond me.
iMacs in sleep mode use about 1W of power; MacBook laptops get down to about two-thirds of a watt. Even when off or hibernating they consume a small amount of power (about a quarter of a watt). If you think that amount is going to have a significant impact on your green footprint then I've got a bridge I can sell you.
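To put a number on it, a quick back-of-the-envelope sum (the £0.15/kWh electricity price here is my assumption for illustration, not a figure from the post):

```python
# Rough annual energy use of an iMac left asleep at 1W.
# The 0.15 GBP/kWh electricity price is an assumed figure, not a quote.
watts = 1.0
hours_per_year = 24 * 365
kwh_per_year = watts * hours_per_year / 1000   # 8.76 kWh/year
cost_per_year = kwh_per_year * 0.15            # ~1.31 GBP/year

print(kwh_per_year, cost_per_year)
```

Pennies per year, in other words, whichever local tariff you plug in.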
How much money do you think they could sell a licence for? Now how much money do you think they made selling a Mac? (Hint: look at what Microsoft can charge for Windows.) Almost every clone sale was a lost sale to Apple, and netted them less money. The market wasn't growing to any significant extent and Apple was competing against their own OS. I'll say this again: it was a bloody stupid idea. Microsoft could license software because that was their business model. They didn't care about OEMs competing against each other because they'd already been paid. They can't, however, start selling their own machines without significant backlash.
Microprocessors have always been subject to economies of scale. Apple and the Apple clones would have had to increase their market share many times over for PowerPC to compete with x86 on cost, but the reason Apple decided to move was power consumption. There was no roadmap forward for PowerPC that offered both a significant improvement in speed and a reduction in power. The G5 was pretty hot, and late, and it didn't look like Motorola or IBM were going to be able to meet Apple's future requirements. Because of this they jumped to Intel, and the resulting machines are much faster on a lower power budget. That's a win for users, whatever religious view you hold about x86.
You are kidding? The biggest mistake they ever made was in allowing them in the first place. Apple has always been in the business of selling hardware. Licensing their OS out, then having to compete with the clones on price nearly finished them. The clones weren't good pieces of hardware either, so they tarnished the brand.
Microsoft grew from the opposite direction, licensing software and making little in the way of hardware. They have had precisely the same kind of problem trying to get into the hardware manufacturing business: Surface and Windows RT did badly in part because they were competing with their own OEMs.
1) technically Woz never left the company, he's still on the payroll
2) the aircraft crash in 1981 (which messed up his memory) and not wanting to get involved in management were more the cause of his wanting to move on to other things, like teaching.
3) Woz wasn't interested in the money part of the business, why would he have been concerned over which products made money?
He's talking pure rubbish, that's why. If you've got 50% or less of something then you need a 100% improvement or better to reach 100% or more of the total. If you have 99% of that thing then you need just over a 1% improvement to reach 100%.
In this case the math is 0.6 + 0.6 * 0.7 = 1.02 = 102%
So either the panels were previously running at about 362W out of their fully rated 1000W and are now running at 615W (just over 60%) and El Reg misquoted, or someone has seriously mucked up their math.
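The arithmetic can be sanity-checked in a few lines (pure illustration, using the figures quoted above):

```python
# Taking the claimed 70% improvement at face value:
base_fraction = 0.60                 # panels previously at 60% of rated output
improvement = 0.70                   # the quoted improvement
new_fraction = base_fraction * (1 + improvement)
print(new_fraction)                  # 1.02, i.e. 102% of rated: impossible

# Working backwards from a 1000W rated panel to something plausible:
old_watts = 362.0                    # the starting figure that makes it work
new_watts = old_watts * (1 + improvement)
print(new_watts)                     # 615.4W, just over 60% of rated
```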
Eh? I'm not following your logic. It seems to be similar to that used by old mainframe users when the microprocessor came out ("but if it breaks how will you repair it?"). More and more gets integrated onto the CPU because (1) there is now space for it, (2) it makes little difference to the price and (3) the propagation delays inherent in going off-chip kill performance.
As to whether you need 8, 16, 32, 64 or whatever size of register, that's down to the performance wall in clock speeds. You can't (easily) make the chip go any faster, but you can make it do more in one clock cycle by making the processor wider. You keep doing that until diminishing returns set in and the extra width ceases to be useful: in practice that means 64-bit integers and (possibly) 128-bit floating point. Once you've hit those limits you can only multiply the number of compute cores, of which HSA is one variant.
It's not always faster to build wider ALUs; there is, for example, a propagation delay for the carry bits from one binary digit to the next. There are clever tricks that can be used to reduce or minimise that delay, but they cost silicon real estate. With GPU computation you have many cores, so you want to keep each one as small as possible, and what you end up with is a compromise between size and performance.
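A toy model of why that carry chain matters — a sketch of a ripple-carry adder, not a claim about any real silicon (fast adders use carry-lookahead and similar tricks to shorten the chain, at the cost of more gates):

```python
def ripple_carry_add(a, b, width):
    """Add two unsigned integers bit by bit, counting carry-chain stages."""
    carry, result, stages = 0, 0, 0
    for i in range(width):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        total = bit_a + bit_b + carry
        result |= (total & 1) << i   # sum bit for this position
        carry = total >> 1           # carry ripples into the next position
        stages += 1                  # one more stage of worst-case gate delay
    return result, stages

# Doubling the width doubles the worst-case carry chain:
print(ripple_carry_add(0b1111, 0b0001, 8))    # (16, 8)
print(ripple_carry_add(0x00FF, 0x0001, 16))   # (256, 16)
```

The worst-case delay grows linearly with width in this naive design, which is exactly why wider isn't automatically faster.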
What evidence do you have that this type of sensor is any more secure? The techniques used to bypass TouchID were developed first for this type of scanner. Fingerprints aren't a very secure authentication method. They're better than a 4 digit pin, which is why the way that Apple uses them makes some kind of sense, but hooking them to a cash payment system?
The fingerprint scanner is something that other sites are saying Samsung got badly wrong. It needs to be used two-handed, requires you to drag your finger across it and is sensitive to the angle of your finger. On paper it sounds good; the actual implementation is definitely lacking.
OS X is actually fairly easy to port (it started life on PowerPC, migrated to x86, was cut down to make iOS etc.), but the prediction is complete crap. Current A-series CPUs have something like a third to a quarter of the computing power of Core i5 CPUs. The law of diminishing returns also cuts in, so the chances of Apple being able to repeatedly double CPU performance in the near to medium term are slim.
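To put a rough number on the gap (using the "third to a quarter" figures above, which are ballpark estimates rather than benchmarks):

```python
import math

# If an A-series part delivers 1/3 to 1/4 of an i5's performance,
# closing the gap takes roughly this many full performance doublings:
for gap in (3.0, 4.0):
    doublings = math.log2(gap)
    print(f"{gap:.0f}x gap -> {doublings:.2f} doublings")
```

Two full doublings at the worst case — not impossible, but each one gets harder as process gains shrink.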
Untrue, lots of people were asking the question. The US was working on its own supersonic design, the Russians pinched much of the Concorde data for themselves, etc., but the introduction of the 747 by Boeing changed the economics of flying, and that came midway through the design process.
The second factor was the rapid improvement in telecoms: there was much less need for passenger travel at those speeds.
Those two together are what killed it, not that the idea at the time it was proposed was bad.
There was a story of a pilot from one of the Arabic countries who came across to evaluate the Lightning. There was no two seat version so he had to read the manual and be given a pilot briefing. He was told "don't use reheat during your takeoff run", but chose to ignore that. He reached 20,000 feet before he managed to get the landing gear up.
I'm looking with interest at the Altera Cyclone V SE at the moment: a twin-core ARM Cortex-A9 @ 800MHz combined with an FPGA on one chip. More IOs than you can shake a stick at and lots of logic resources. You can get a complete dev board, including 1GB of RAM connected to the A9s and 64MB to the FPGA, 85K logic elements, Gigabit Ethernet, VGA, 24-bit audio in/out, two USB host ports and a USB slave port, a micro SD slot, two 40-pin IO headers and a built-in USB JTAG programmer, for less than £170.
As with any medical procedure there is a risk. It's also not as perfect as they would have you believe. That said, I had my eyes laser-treated when it reached the point that if I couldn't remember where my glasses were I was in trouble (about -3.5 dioptres short-sighted, with an astigmatism). I still use glasses for reading (age makes it harder to focus on near objects), but I'm good to drive without them and my distance vision is fine. In my case it was a good choice; YMMV.
As usual Wikipedia isn't a comprehensive source. See for example http://books.google.co.uk/books?id=q2w3JSFD7l4C&pg=PA139&lpg=PA139&dq=ibm+360/168+virtual+memory&source=bl&ots=i3OQQExn_i&sig=MTqFlizLAFWINMmVkqgr_OhdbsY&hl=en&sa=X&ei=ZvtCU-mbHMmd0QX6vIHgAw&ved=0CFEQ6AEwBQ#v=onepage&q=ibm%20360%2F168%20virtual%20memory&f=false
The 360/168 had a proper MMU and thus supported virtual memory. I interviewed at Bradford University, where they had a 360/168 that they were doing all sorts of things with that IBM hadn't contemplated (like using conventional glass teletypes hooked to minicomputers so they could emulate the page-based, and more expensive, IBM terminals).
I didn't get to use an IBM mainframe in anger until the 3090/600 was available (where DEC told the company that they'd need a 96 VAX cluster and IBM said that one 3090/600J would do the same task). At the time we were using VM/TSO and SQL/DS, and were hitting 16MB memory size limits.
Data Detectors is an old patent, dating back to the 1990s, when it was used in MacOS. Contrary to belief around here, sticking something on a mobile device doesn't mean it isn't subject to the original patent. The only bit that is patentable is how you adapt it to work on mobile, and as far as I can see this required no adapting.
Is the Marantz MCR510 network streamer plus a decent set of stand-mount speakers. That gives you TOSLINK optical playback, Internet radio, Spotify et al, DLNA and AirPlay. As it sells for £250 you can get a pretty damned good set of speakers from the £600 the Sonos costs and still have change.
Either tap the HDMI audio channel directly (device -> sound bar -> TV) or take the digital out provided by most modern TV sets and play that.
For rather less money I feed the TV via TOSLINK optical into a HiFi DAC, through a Marantz amp and out of a pair of floor-standing speakers. The result is only stereo, but it's head and shoulders above any TV speakers.