Of course it is not as pretty as the one from Tatiyants, but I use it now and then, and it has become a standard explain visualisation tool for many PostgreSQL users.
The table format from http://explain.depesz.com/ is very useful, and one can understand a lot of details about an execution plan without having to visualise it as a graph.
Also the default execution plan visualiser in pgAdmin looks really cool.
This library makes it really easy to write REST services in Python. It saved me a lot of time when prototyping simple (and not-so-simple) services.
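The comment doesn't name the library, so as a neutral sketch of the kind of service such libraries streamline, here is a minimal read-only JSON endpoint built with nothing but the Python standard library (the `ITEMS` store and the `/items/<id>` route are made up for illustration; a microframework would make this much shorter):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory "database" for the sketch.
ITEMS = {1: {"id": 1, "name": "example"}}

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route GET /items/<id> to a JSON response; everything else is 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "items" and parts[1].isdigit():
            item = ITEMS.get(int(parts[1]))
            if item is not None:
                body = json.dumps(item).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

    def log_message(self, fmt, *args):
        # Keep the demo quiet instead of logging every request.
        pass

def serve(port=0):
    # port=0 lets the OS pick a free port; the real one is in server_address.
    server = HTTPServer(("127.0.0.1", port), ItemHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = serve()
    url = "http://127.0.0.1:%d/items/1" % srv.server_address[1]
    with urllib.request.urlopen(url) as resp:
        print(json.loads(resp.read()))
    srv.shutdown()
```

The boilerplate above (routing, headers, serialization) is exactly what a REST microframework collapses into a couple of decorated functions.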
Actually, stolon is more comparable to Patroni, the core component used by Spilo.
One could call Patroni the HA machinery, and Spilo an AWS-based infrastructure that uses it. That means you can use Patroni on its own to manage Postgres in your own data center.
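For a self-hosted setup, the starting point is a small YAML file per node. The sketch below follows Patroni's documented configuration layout, but every hostname, path, and password is a placeholder, and the available keys vary by Patroni version, so treat it as an outline rather than a working config:

```yaml
# patroni.yml sketch for one node of a self-managed cluster
scope: my-cluster          # cluster name, shared by all nodes
name: node1                # unique per node

restapi:
  listen: 0.0.0.0:8008
  connect_address: node1.example.com:8008

etcd:                      # the DCS; Patroni also supports others
  host: etcd.example.com:2379

postgresql:
  listen: 0.0.0.0:5432
  connect_address: node1.example.com:5432
  data_dir: /var/lib/postgresql/data
  authentication:
    superuser:
      username: postgres
      password: secret
    replication:
      username: replicator
      password: secret
```

Each node runs `patroni patroni.yml`; leader election and failover happen through the DCS (etcd here).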
There are definitely many more interesting videos from PGConf US 2015, especially the one from Robert. My talk was a 'keynote' and not really a conference talk :)
Actually, if you really need to update large JSON documents efficiently, probably the only suitable technology is ToroDB: http://www.8kdata.com/torodb/
Actually, without some benchmark results, the statement that PostgreSQL has performance problems because a JSONB update copies the whole document is groundless.
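For what it's worth, producing such numbers is cheap with a custom pgbench script. The table `docs(id int primary key, body jsonb)` below is hypothetical; the script updates one key inside a JSONB document so you can measure the cost of rewriting the value:

```sql
-- Save as jsonb_update.sql and run, e.g.:
--   pgbench -f jsonb_update.sql -c 8 -T 60 mydb
-- (assumes a pre-populated docs table with ids 1..100000)
\set id random(1, 100000)
UPDATE docs
   SET body = jsonb_set(body, '{counter}', to_jsonb(:id))
 WHERE id = :id;
```

Comparing this against an equivalent plain-column `UPDATE` would show whether the whole-document rewrite is actually the bottleneck for a given document size.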