Haha wow this is a great post. I am a lawyer and you may have solved a problem I recently encountered. So you think this is saying that generic language in the Zoom BAA constitutes permission to de-identify?
Are there examples of healthcare AI chatbots trained on de-id data, btw? If you're familiar with any, I'd love to see them.
What's your line of work out of curiosity?
> Haha wow this is a great post. I am a lawyer and you may have solved a problem I recently encountered. So you think this is saying that generic language in the Zoom BAA constitutes permission to de-identify?
Not that I'm an expert on the nuance here, but I think it gives them permission to use PHI, especially if spun the correct way, which then gives them permission to de-identify it and do whatever they like with the result.
My experience has been that it's pretty easy to spin something into QI (quality improvement).
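Tangential, but to make the de-id step concrete, here's a toy sketch of what naive Safe Harbor-style scrubbing looks like. Purely illustrative: the patterns and the `deidentify` helper are my own, and real pipelines use dedicated tooling plus expert determination rather than a handful of regexes.

```python
import re

# Toy Safe Harbor-style scrub: redact a few of the 18 HIPAA identifier
# categories from free text. Illustrative only -- real de-identification
# also has to handle names, geographic subdivisions, dates, and more.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def deidentify(text: str) -> str:
    """Replace matched identifiers with bracketed category tags."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(deidentify("Pt Jane Doe, MRN 4821, cell 555-867-5309, jdoe@example.com"))
# -> Pt Jane Doe, [MRN], cell [PHONE], [EMAIL]
```

Note the patient's name sails straight through, which is exactly why regex-only scrubbing doesn't cut it for real PHI.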
> Are there examples of healthcare AI chatbots trained on de-id data, btw? If you're familiar with any, I'd love to see them.
https://loyalhealth.com/ is one I’ve recently heard of that trains on de-id’d PHI from customers.
> What's your line of work out of curiosity?
Previously founded a health tech startup; now working primarily as a clinician and NLP researcher, with some side work advising startups and VCs.
Happy to help. Let me know where to send the invoice for my non-legal legal expertise; if your rate is anything like my startup's lawyer's, you'll find me a bargain! Haha.