Years ago I thought about trying to do what YaCy is doing*, but I don't see any practical way to do it with existing hardware and connections. How big is Google's index? Petabytes? Maybe more these days? Even if you can distribute that across enough different systems, how do you give everyone access to that data fast enough to make it usable? A huge part of Google's engineering effort seems to go into making their server farms able to process that data and get it back to the user quickly. And that's with data centers they can control and optimize. How do you do that over flaky internet connections, on the crappy, spyware-crippled systems most users have?
While I think the concept is great, I just don't think it will work well enough. I'd love to be wrong, and if they pull it off it will be an immense triumph, but I don't see it happening.
*And I'm sure I wasn't the first or the smartest to come up with the idea, so the fact that it hasn't been done yet lends credence to the rest of my post IMHO. As do the experiences reported in the comments so far.