* Posts by ROlsen

14 posts • joined 18 Jul 2008

US ATM fraud surges despite EMV


The US delayed chip adoption because it was a big change and our fraud rate was relatively low. Europe pushed forward because its fraud rate was significantly higher, and EMV has since brought it down to match US rates.

In 2004 the US card fraud rate was 0.05 while Europe's was 0.11, more than double. By 2010, EMV had brought Europe's rate down to 0.06 while the US rate had risen to 0.08.

Buffett would take a $2bn hit on IBM shares … if he cashed in now


Future Chips

I'm guessing that Warren Buffett retains his investment in IBM in the hope of a large payoff as the industry switches to graphene/photonics/spintronics/etc. IBM's R&D is still good; whoever solves the silicon problem should be able to generate some big profits.

Neuroboffins use supercomputer to partially build DIGITAL RAT BRAIN


Neuron and Synapse Oversimplified?

@David Lester: I'm not a neuroscientist, but I like to follow the research, read articles, etc. It seems like there are complexities in synapse and neuron function that would need to be accounted for to make the model valuable.

I'm sure you are aware of many more examples, but two I've been reading about recently:

1. Dendritic preprocessing of information: it seems like there is a lot more going on there than previously thought, and I'm not sure if the models take that into account.

2. Neurons switching between slow- and fast-firing types depending on conditions.

Are the models being used (for synapse and neuron activity) good enough to believe the overall simulation will be a reasonable approximation of the real thing?

SHOCK and AWS: The fall of Amazon's deflationary cloud


Re: Fundamental limits

There is room for differentiation. There are workloads that benefit from different types of computing (e.g. search on FPGAs) that the cloud providers could offer.

D-Wave disputes benchmark study showing sluggish quantum computer


I think I know what's up

The computer is simultaneously fast and slow

Inside Nvidia's GK110 monster GPU


Re: So many cores!


You are correct. I ported my neural net program to CUDA and found that it was not the best match. Either the reads are in order and the writes are random, or the reverse, depending on which direction you decide to slice and dice the calculation and how the data structure is sorted in advance.
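A minimal NumPy sketch of the trade-off described above (toy shapes and names are hypothetical, not from the original port): a layer computation y = W @ x can be sliced by output neuron, so each "thread" streams one row of W in order, or by input, so each "thread" walks one column of a row-major W with strided reads while contributing to every output. One direction gives in-order reads and scattered writes; the other reverses it.

```python
import numpy as np

# Hypothetical toy layer: 3 outputs, 4 inputs.
W = np.arange(12, dtype=float).reshape(3, 4)  # row-major weight matrix
x = np.ones(4)

# Slice by output neuron: each "thread" i streams row i of W
# sequentially (coalesced reads), writing only y[i].
y_by_row = np.array([W[i, :] @ x for i in range(W.shape[0])])

# Slice by input: each "thread" j reads column j of W (strided reads
# in row-major storage) and scatters its contribution to every output.
y_by_col = np.zeros(3)
for j in range(W.shape[1]):
    y_by_col += W[:, j] * x[j]

# Both slicings compute the same result; only the access pattern differs.
assert np.allclose(y_by_row, y_by_col)
print(y_by_row)  # [ 6. 22. 38.]
```

Either way the arithmetic is identical; the choice only moves the non-sequential access from the read side to the write side, which is why pre-sorting the data structure (or transposing W) can't fix both at once.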

Can Larrabee Lazarus stunt Nvidia's Tesla?


Other advantages than x86

One of the things I was looking forward to with the Intel chip is that it is less parallel than GPUs but more parallel than a few CPU cores. I have a simulation that has gained some speed from porting to a GPU, but it really requires an intermediate amount of parallelism and better random memory access. Intel's chip looked like it would be in the sweet spot for me, and I can't be the only one in this situation.

Big Blue juices OS formerly known as 400



In another stroke of genius, IBM's marketing department will be renaming the i operating system to the . operating system. Yep, just one single little .. (The OS name was the first dot; the second one was the period at the end of the sentence.)

Microsoft: Oracle will take us back to 1970s hell



Hmmm. Let's count how many people we have running our Citrix farms, the number of people required just to solve our various printing problems, etc., and Muglia thinks minis are a step backward? For enterprise business apps, what exactly is the advantage of using a PC server? If your answer is cost, I would counter that it is more than made up for in the headcount needed to support the environment.

Ballmer: People don't 'get' Microsoft's Yahoo! marriage



At first, Yahoo makes some money on this deal. Then, as MS presents search results, it also provides all of the other things Yahoo currently provides, and gives the user the option to reset their home page. Voila - the Yahoo "customer" base "purchased" at a fraction of the cost.

Microsoft to EU: Cut me down, and Google will rule the world!



"I am happy to have IE on my system"

I know it says James Harrington, but is that really you Bill?

US court sides with Janet Jackson's breast


Nine Sixteenths was Long Enough...

Many are discounting such a short showing, but I am here to tell you that was more than enough time for me to...well..I'm not going to say because this is a public forum, but I think you get the idea...

Here's the multi-core man coding robots, 3-D worlds and Wall Street


Can Sun Keep Up?

"I certainly consider Sun the leader in the multi-core era, as it was the first mainstream vendor with a very, very aggressive design thanks to Afara."

I wouldn't disagree that Sun has been the most aggressive. At the same time, though, my thinking has really changed on all of this over the last few months as I've looked for an inexpensive way to get lots of processing power for a highly parallel application. I looked at multiple x86 chips, Cell, Sun, FPGAs, etc., and finally found the answer (for me): the GPU, in this case NVIDIA CUDA.

Clearly the battle is on as CPUs and GPUs head in each other's direction. I seriously wonder if Sun can compete, given its lack of volume compared to Intel and the GPU manufacturers.


Sun Not First

"Sun led with the Afara-based Niagara line of processors, and now every major chip company has both "regular" multi-core chips and research underway into far more radical designs"

IBM's multi-core POWER4 was out years ahead of any multi-core CPU from Sun/Intel/AMD.


Biting the hand that feeds IT © 1998–2021