You don’t have the ability to “erase it from their servers”. There is no way to be sure they actually delete anything when you request erasure; they could just be hiding the access.
Speaking as someone who has worked at a large company (although perhaps not Google-sized) and several smaller ones: companies are very motivated to reduce storage costs wherever possible. Sure, they could be "just hiding the access" immediately after you request deletion, but that storage will likely be overwritten by something else soon.
Training on data you published for public consumption, e.g. pretty much any user-generated content on social media or anything publicly accessible on the web, is one thing. Training on private conversations is a whole different thing. I doubt any major company is doing the latter; it would be a PR and legal firestorm, which doesn't serve the interests of companies training AI models either.