I guess so. I wonder if Nostr has an LLM implementation? Does it even really need one?
Maybe a replybot would be kind of great: you call the LLM by tagging or @-mentioning it with a question, and it replies with the generated content.
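A minimal sketch of that mention-and-reply flow, assuming plain NIP-01 event dicts. `BOT_PUBKEY`, `ask_llm`, and the example IDs are placeholders, and event signing plus relay I/O are left out:

```python
import json
import time

# Placeholder: in practice this is the bot's hex-encoded Nostr public key.
BOT_PUBKEY = "bot_pubkey_hex"

def is_mention(event: dict, bot_pubkey: str) -> bool:
    """True if a kind-1 text note tags the bot's pubkey (NIP-01 'p' tag)."""
    if event.get("kind") != 1:
        return False
    return any(tag[0] == "p" and tag[1] == bot_pubkey
               for tag in event.get("tags", []) if len(tag) >= 2)

def build_reply(event: dict, answer: str, bot_pubkey: str) -> dict:
    """Build an unsigned kind-1 reply threading back to the question
    (NIP-10 'e'/'p' tags). Schnorr signing and publishing to a relay
    are omitted from this sketch."""
    return {
        "pubkey": bot_pubkey,
        "created_at": int(time.time()),
        "kind": 1,
        "tags": [
            ["e", event["id"], "", "reply"],  # which note we're replying to
            ["p", event["pubkey"]],           # notify the original asker
        ],
        "content": answer,
    }

def ask_llm(question: str) -> str:
    # Stand-in for a real LLM call; swap in whatever model you're running.
    return f"(model answer to: {question})"

# Example: an incoming mention gets turned into a reply event.
incoming = {
    "id": "abc123",
    "pubkey": "asker_pubkey_hex",
    "created_at": 0,
    "kind": 1,
    "tags": [["p", BOT_PUBKEY]],
    "content": "what is Nostr?",
}
if is_mention(incoming, BOT_PUBKEY):
    reply = build_reply(incoming, ask_llm(incoming["content"]), BOT_PUBKEY)
    print(json.dumps(reply["tags"]))
```

The relay side would just be a websocket subscription filtered on `{"kinds": [1], "#p": [BOT_PUBKEY]}`, feeding each matching event through this pipeline.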
@TheGrinder what do you think?
You want to build on your LLM models and hook it up?