If I were building this in a system-design interview, I would use Whisper, NLP, and "classic ML" classifiers with deterministic results. I would not want an LLM in the loop at all. Facebook and Google have been able to target you more precisely than you could even perceive for years.
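To make the determinism point concrete, here is a minimal sketch of the kind of rule-based scoring stage that could sit behind a speech-to-text model like Whisper. The term list, weights, and threshold are all invented for illustration; the point is only that the same transcript always produces the same score, with no sampling involved.

```python
# Hypothetical deterministic scoring stage: transcript in, flag decision out.
# Terms and weights are made up for illustration.
RISK_TERMS = {
    "guarantee": 3,
    "off the books": 5,
    "wire it personally": 4,
}

def risk_score(transcript: str) -> int:
    """Sum the weights of every watch-list term found in the transcript."""
    text = transcript.lower()
    return sum(w for term, w in RISK_TERMS.items() if term in text)

def should_flag(transcript: str, threshold: int = 4) -> bool:
    """Same input, same output, every time -- trivially auditable."""
    return risk_score(transcript) >= threshold
```

A real deployment would replace the dictionary with a trained linear classifier or a compliance-maintained lexicon, but the audit story is the same: you can print the exact terms and weights that triggered a flag.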
LLMs are slow, expensive, and inconsistent. More importantly, they're not the right tool for the job.
Really feels like more “oohhh look at how important and scary LLMs are”.
*edit* PS: my company does marketing, communication, and trade surveillance for FINRA-registered broker-dealer firms. If the CCO or anyone else with admin access wanted to monitor for someone talking badly about them, they absolutely could update their list. No LLMs in the loop; very scalable, affordable, auditable, and reliable. LLMs are just an interface, not a solution for analysis.
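A rough sketch of what an updatable, auditable watch list might look like, per the setup described above. Every class and field name here is invented; the idea is just that admin edits and matches are plain data you can review after the fact.

```python
# Hypothetical admin-editable watch list with an audit trail.
# All names are illustrative, not from any real product.
import datetime

class WatchList:
    def __init__(self):
        self.terms = set()
        self.audit_log = []  # list of (timestamp, actor, action) tuples

    def _log(self, actor: str, action: str) -> None:
        ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self.audit_log.append((ts, actor, action))

    def add_term(self, actor: str, term: str) -> None:
        """An admin adds a term; the change itself is logged."""
        self.terms.add(term.lower())
        self._log(actor, f"added term: {term!r}")

    def scan(self, message: str) -> list[str]:
        """Return every watch-list term found in the message."""
        text = message.lower()
        return sorted(t for t in self.terms if t in text)

wl = WatchList()
wl.add_term("cco_admin", "the CCO is incompetent")
hits = wl.scan("Honestly, the CCO is incompetent and should resign.")
```

No model weights, no prompts: the whole surveillance rule set is a list a compliance officer can read, diff, and defend.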
I was asking myself the same question this morning and came to roughly the same conclusion as you. It makes much more sense to design an automated drone-piloting system around a decision-making algorithm than to use an LLM. For mass surveillance I can see the use case a bit more: you can use classical methods to process the information about a specific person, but use the LLM to generate summaries or synthesize material when it isn't well organized or comes from different sources.
I definitely think there is some overconfidence among government decision makers about what these tools can be used for. Also maybe some wishful thinking about what LLMs will be able to do in a few years?