Also agree. I spent so much time messing with fuzzy matching libraries and NERs for various entity resolution tasks, collecting and cleaning lists of various entity types, and so forth. IMO you really need a model with the encoded world knowledge of an LLM to reliably and flexibly determine that "WMT" and "wally world" refer to the same corporate entity.
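To make the point concrete, here's a minimal sketch of why surface-level fuzzy matching fails on this pair: a string-similarity metric (Python's stdlib `difflib` here, as one stand-in for the fuzzy matching libraries mentioned) has no world knowledge, so the ticker and the nickname score near zero even though they name the same company.

```python
from difflib import SequenceMatcher

ticker, nickname = "WMT", "wally world"

# Character-level similarity: the only overlap is the single "w",
# so the score lands far below any sensible match threshold.
score = SequenceMatcher(None, ticker.lower(), nickname.lower()).ratio()
print(round(score, 3))  # well under 0.5
```

An LLM, by contrast, resolves both strings to Walmart from memorized world knowledge rather than character overlap.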
A knowledge base - something where the LLM knows how to find the knowledge it needs for a given task. I am working on this idea in https://zby.github.io/commonplace/
>AAP sits above transport protocols. MCP is a valid execution backend — so are function calling, REST, and CLI. AAP defines the what. Transport protocols define the how.
>[MCP] needs to die a SOAP death so REST can rise in its place.
So is AAP a replacement for MCP or something that sits above it? If the latter, why use AAP over Skills?
Also, IMO the biggest problem with MCP is that it's over-specced and over-constrained. This feels even more over-specced.
You're right that I say both "above and not in place of" and "MCP needs to die"... I should fix that (but can't edit anymore); it's unclear as written. Someday I do see MCP being replaced by something else, but my intention isn't to replace MCP outright. It's to solve the problem above it today, and I think that will be sufficient for now.
fun fact: when baking, you can use blood in many of the same places where you might use eggs, since the albumin proteins in blood coagulate the same way. There's untapped potential here for some very interesting "red velvet" cakes.
The idea that it's harder to query and delete everything relating to a person from a well-organized graph than from the typical corporate patchwork of data systems seems very improbable. The post also reads like a barely tweaked Gemini output. I'm not a Palantir fan, but this feels flimsy.
Fair point. I realize that I oversimplified. My main argument is that the Ontology isn't a clean internal graph. It ingests from sources Palantir doesn't own or control, so deleting your node doesn't touch the upstream data. And inferred edges (risk scores, behavioral patterns) were never stored as discrete objects. You can't delete an inference.
And, I will hold my hand up to say I did use an LLM (Claude, actually). But only to make the text read and flow better (something I definitely won't do again). The underlying research is my own and something I am very passionate about. Thank you for your feedback! I appreciate it. :)
I appreciate the work you're doing! Speaking as someone who had to manually execute CCPA right to delete requests in a past life, the state of this capability in most enterprises is pretty lacking. I hope to read stronger/clearer content from you on this topic in the future.
Well, it's not exactly a fair comparison, since they're comparing a volume number with GDP, which is total value produced in a year. Volume numbers are usually much bigger than production numbers, since money moves around a lot.
If I pay a restaurant $200 for dinner and my three friends each venmo me $50 for their share, then the exchanged volume was $350, but only $200 worth of value was generated.
good argument, bad example. GDP is a net revenue number but Stripe is using a gross revenue number (their equivalent of "GMV"). so the numerator/denominator are as different as possible to make it impressive.
If I buy something worth $17 from an SMB's website, the SMB buys their ingredients from their merchants via Stripe for $10, and those merchants in turn pay their suppliers $8, then that's still only $17 of GDP, but $35 of Stripe transactions.
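The supply chain above can be tallied in a few lines. This is just an illustrative sketch of the hypothetical numbers in the comment: gross volume counts every payment in the chain, while the GDP contribution is only the final sale price (equivalently, the sum of value added at each stage).

```python
# Payments down the chain: consumer -> SMB, SMB -> merchants, merchants -> suppliers
payments = [17, 10, 8]

# A "GMV"-style number counts every payment.
gross_volume = sum(payments)  # 35

# GDP counts each stage's value added: what it sells for minus what it paid in.
value_added = [p - n for p, n in zip(payments, payments[1:] + [0])]  # [7, 2, 8]
gdp_contribution = sum(value_added)  # 17, i.e. just the final sale price

print(gross_volume, value_added, gdp_contribution)
```

Double-counting intermediate transactions is exactly why the volume-to-GDP comparison flatters the volume number.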
I think the parent is correct in saying that's economic activity, though it's not GDP (Gross Domestic Product). It's difficult to do circular transactions with no "economic value" in Stripe the same way you can with bank accounts/wallets.
You're absolutely right, I didn't think of that. Isn't this what they call velocity of money? So we'd need to calculate the "velocity of Stripe" (the flow of dollars within the system).
With all due respect, this comment demonstrates a misunderstanding of basic economic concepts. Gross domestic product is not simply “economic activity” but a measure of value creation. The mere movement of existing value through transactions does not, by itself, constitute economic production, which is why comparing transaction volume to GDP is misleading.
The idea of having the model create a plan/spec, which you then mark up with comments before execution, is a cornerstone of how the new generation of AI IDEs like Google Antigravity operate.
Claude Code also has "Planning Mode" which will do this, but in my experience its "plan" sometimes includes the full source code of several files, which kind of defeats the purpose.