Dual feed power line, one for home electrics, one for immersion heater. Store excess renewably generated energy in household hot water tanks during the day, to be used in the evening in place of fossil-fuel generated supply, which also evens out load peaks as a side effect. This also works well with home solar.
I think grid load management for water heaters is already fairly common. I've seen basic versions of the concept in use by a co-op in South Carolina, and I can't imagine them being anywhere near the bleeding edge.
The Seebeck effect will definitely convert at less than 8% efficiency with 80C water. Perhaps GP means that the hot water will already be available for hygiene and laundry, which, for households with an electric water heater, is a large portion of the total power draw.
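That "less than 8%" claim is easy to bound with the standard thermoelectric-generator efficiency formula (Carnot factor times a material figure-of-merit factor). This is a sketch, not a device datasheet: ZT = 1 is an optimistic assumption for commodity bismuth-telluride modules, and 20C ambient is assumed for the cold side.

```python
# Rough upper bound on Seebeck (thermoelectric) conversion efficiency
# for an 80C hot side. Assumptions: ZT = 1 (optimistic for commodity
# modules) and a 20C cold side.

def teg_efficiency(t_hot_k: float, t_cold_k: float, zt: float = 1.0) -> float:
    """Standard TEG efficiency: Carnot factor times the ZT material factor."""
    carnot = (t_hot_k - t_cold_k) / t_hot_k
    m = (1 + zt) ** 0.5
    return carnot * (m - 1) / (m + t_cold_k / t_hot_k)

t_hot = 273.15 + 80    # 80C water, in kelvin
t_cold = 273.15 + 20   # 20C ambient, in kelvin

carnot_limit = (t_hot - t_cold) / t_hot
eta = teg_efficiency(t_hot, t_cold)

print(f"Carnot limit: {carnot_limit:.1%}")  # ~17%
print(f"TEG at ZT=1:  {eta:.1%}")           # ~3%
```

Even the ideal Carnot limit at that temperature delta is only about 17%, and a real thermoelectric module captures a small fraction of that, so "well under 8%" holds comfortably.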
I'm not sure whether a household-sized tankful of hot water could provide heat over the course of a cold night, or whether a heat exchanger feeding air ducts or water pipes would be more efficient. I suppose it depends on the insulation, the placement of the ducts and pipes, and how much of the heat makes it to and through a wall.
I did some very crude calculations, but assuming 50 gallons at 60C and 1,000 W of heat loss from a moderately insulated house on a 50F night, a full water heater could keep the house at 70F for about 5.14 hours. Someone with more recent practical physics experience is welcome to check this figure.
Not at all. Peak shaving with approaches like this is fairly common with utilities. Lots of them will give huge rebates on smart meters in exchange for this.
The energy required to take water from ambient temperature to steam is much larger than the energy required to take it from ambient temperature to frozen. I would think that this affects scalability.
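That asymmetry is straightforward to quantify from textbook constants (the latent heat of vaporization is roughly seven times the latent heat of fusion), assuming "ambient" means 20C:

```python
# Energy per kg to take 20C water down to ice at 0C versus up to steam
# at 100C. Standard textbook constants; 20C ambient is an assumption.

C_WATER = 4186        # J/(kg*K), specific heat of liquid water
L_FUSION = 334_000    # J/kg, latent heat of fusion
L_VAPOR = 2_257_000   # J/kg, latent heat of vaporization

to_ice = C_WATER * 20 + L_FUSION    # cool by 20 K, then freeze
to_steam = C_WATER * 80 + L_VAPOR   # heat by 80 K, then boil

print(f"to ice:   {to_ice / 1000:.0f} kJ/kg")    # ~418
print(f"to steam: {to_steam / 1000:.0f} kJ/kg")  # ~2592
print(f"ratio:    {to_steam / to_ice:.1f}x")     # ~6.2x
```

So per kilogram of water, steam storage moves roughly six times the energy of ice storage, which is why the two don't scale the same way.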
I live in a city center that uses district chilled water for some use cases, but it certainly does not seem scalable enough to be an "easy technical remedy" to the problem of expensive distribution.