Heat pumps pulling heat in from "outside" reach roughly 300% efficiency (a COP of about 3) because they move existing ambient heat, continually replenished by the sun, rather than generating it from electricity.
Servers, by contrast, would be 100% efficient at best, same as a storage heater, but far less power-dense.
You'd probably need 10 servers to heat a 3 bed house, and you'd also need to vary the workloads to match the heating demand in the home.
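That "10 servers" figure is easy to sanity-check with round numbers. Both inputs below are illustrative assumptions, not measurements: a typical 3-bed house is often quoted at something like 8 kW of peak heating demand, and a busy server dissipates somewhere under 1 kW.

```python
# Back-of-envelope sizing, using assumed round figures:
house_peak_heat_kw = 8.0   # assumed peak heating demand for a 3-bed house
server_heat_kw = 0.8       # assumed heat output of one server at high load

servers_needed = house_peak_heat_kw / server_heat_kw
print(f"~{servers_needed:.0f} servers to cover peak demand")  # ~10
```

And that's sized for the cold-day peak, which is exactly the capacity that sits idle (or needs active cooling) for most of the year.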
In midsummer demand across the entire network would be zero most of the time, only rising when water needs heating after running taps/showers. If the servers run at all, they'd need active cooling running to dump the heat outside!
In a cold snap, everyone would want all "their" servers running at full whack.
Datacentre load doesn't follow the annual heating demand curve, let alone the daily/hourly swings.
So everyone needs both supplementary heating and a full cooling system for the servers.
Put simply, I don't see how it could be done economically. There's a reason why datacentres exist, and it's not so servers can have a chat over a coffee in the break room.