In a nutshell: quantum computing algorithms work with different operations from classical systems. In classical systems you can set a bit to either 1 or 0, and then you can perform gate logic with those bits. In quantum computing you can do most gate operations except copying (all quantum gate operations must be reversible, because quantum mechanics only permits reversible "unitary" evolution, and the no-cloning theorem rules out copying an unknown qubit state). In addition to the regular gate operations, there's also a set of operations that can "shift the balance" between the 0 and 1 states of a qubit so that it's, say, 50% 0 and 50% 1. Any further gate operation then effectively operates on both those states: negating the qubit makes it 50% 1 and 50% 0, for instance. As long as you don't read the qubit before your algorithm is done, this is great: it means you can apply one operation to a series of qubits and, if your algorithm is well designed (and they are emphatically not easy to design), get out a qubit sequence in which every possible correct answer is represented and none of the wrong ones.
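You can see both ideas on a single simulated qubit. This is a minimal numpy sketch, not real quantum hardware: a qubit is just a two-entry vector of amplitudes, the Hadamard gate H does the "shift the balance" trick, and the X (NOT) gate does the negation. Squaring the amplitudes gives the measurement probabilities.

```python
import numpy as np

# Single-qubit state vector: [amplitude of |0>, amplitude of |1>].
ket0 = np.array([1.0, 0.0])

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# X (NOT) gate: swaps the |0> and |1> amplitudes. Both gates undo
# themselves when applied twice -- that's the reversibility requirement.
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

superposed = H @ ket0
probs_before = np.abs(superposed) ** 2
print(probs_before)   # [0.5 0.5] -- 50% chance of 0, 50% chance of 1

negated = X @ superposed
probs_after = np.abs(negated) ** 2
print(probs_after)    # still [0.5 0.5]: negation turns 50/50 into 50/50
```

Note that applying X twice (or H twice) gives back the exact state you started with, which is what "reversible" buys you: no information is ever destroyed mid-algorithm.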
The problem is you can only "select" one of these answers, and you can't pick which one. Once you measure the qubits, you get only one of the numbers. So as long as you only want "a right answer" rather than "a specific right answer" (factoring a number into its prime constituents, for instance, where any correct factor will do), or you designed your algorithm in such a way that the final qubit pattern is the only possible answer, job's done.
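Measurement can be sketched the same way. Suppose (purely as an illustration) that a three-qubit register ends up with its amplitude spread evenly over two "correct" basis states, 3 and 5, and zero everywhere else. Measuring picks one of them at random, weighted by the squared amplitudes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical final state of a 3-qubit register (8 basis states),
# with amplitude split evenly between the answers 3 and 5.
state = np.zeros(8)
state[3] = state[5] = 1 / np.sqrt(2)

probs = np.abs(state) ** 2   # measurement probabilities, summing to 1

# Each measurement collapses the register to ONE basis state, chosen
# with these probabilities -- you get *a* right answer, never both.
for _ in range(5):
    print(rng.choice(8, p=probs))   # prints 3 or 5, you can't pick which
```

Since the wrong answers carry zero amplitude, every measurement lands on a right answer; you just don't control which one.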
The real problem is that qubits are unreliable. Unlike classical bits, they tend to wobble about a bit and don't always contain what you want them to contain, so quantum algorithms operate on the principle of "it's bloody unlikely that if I run it ten times and it gives the same answer ten times, that answer is wrong". Unlike a classical algorithm, you have to run a quantum algorithm quite a number of times before you can say you're 99.999% confident the answer is correct. However, since a single run can take orders of magnitude less time than the classical approach, that's actually perfectly feasible.
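The "run it ten times" arithmetic is straightforward. If a single run is wrong with some probability p (the 1/3 below is just an illustrative figure, not a property of any real algorithm), then the chance that n independent runs all produce the same wrong answer is at most p to the power n, so confidence climbs very quickly with repetition:

```python
# Illustrative per-run error rate -- an assumption, not a measured value.
p_wrong = 1 / 3

# P(n runs all agree on the same wrong answer) <= p_wrong ** n,
# so after n matching runs the confidence is at least 1 - p_wrong ** n.
for n in (1, 5, 10):
    confidence = 1 - p_wrong ** n
    print(f"{n:2d} matching run(s) -> at least {confidence:.6f} confident")
```

Even with a per-run error rate as high as one in three, ten matching runs already push the residual doubt below two in a hundred thousand, which is why re-running the whole algorithm is a perfectly workable error strategy.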
If we had an actual quantum computer, that is. Because so far, we don't. We just have books on the theory and math behind it, showing that it will kick ass if we could ever build one, but the inability to copy (and thus duplicate) any bit pattern makes quantum algorithms hilariously complex. Hell, the most famous quantum algorithm (Shor's algorithm) seems to have been discovered mostly through a stroke of luck (or, in engineering terms, randomly hitting the keys and discovering you've written a web browser).