Damus
Juraj
@Juraj

I don’t seek rigid structure — I seek resonance

Learn how to use Bitcoin for more than just saving in my book 📖 Cryptocurrencies - Hack your way to a better life.

Vibe coding, reality bending, cypherpunk visions.

Get my books and courses here:
https://hackyourself.io/shop
https://juraj.bednar.io/shop

(You'll learn skills no one else is teaching!)

Podcasts 🎙️:
Option Plus - https://optionplus.io/
Reči o živote, vesmíre a vôbec (Talks about Life, the Universe and Everything): https://juraj.bednar.io/reci-o-zivote/
Ako vyhackovať otcovstvo (How to Hack Fatherhood): https://otcovia.com/

Relays (4)
  • wss://nos.lol – read & write
  • wss://relay.damus.io – read & write
  • wss://relay.primal.net – read & write
  • wss://nostr.cypherpunk.today/ – read & write
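A client talking to relays like the ones above opens a WebSocket and sends a NIP-01 `REQ` message with a filter. A minimal sketch of building that message for "kind-1 notes from the last 24 hours" (the function name and defaults are my own, not from Nalgorithm):

```python
import json
import time

def build_req(sub_id, authors=None, hours=24):
    """Build a Nostr REQ message (NIP-01) asking a relay for kind-1
    text notes published within the last `hours` hours."""
    since = int(time.time()) - hours * 3600
    filt = {"kinds": [1], "since": since}
    if authors:
        # restrict to specific pubkeys (hex), e.g. the user's follows
        filt["authors"] = authors
    return json.dumps(["REQ", sub_id, filt])
```

The resulting string is what you would send over `wss://relay.damus.io` or any of the relays listed; the relay streams back matching `EVENT` messages tagged with `sub_id`.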

Recent Notes

Vitor Pamplona · 5h
Career experience, forcing their projects on all of our users, trying to get credibility to attack us later.... Idk ..
Juraj
OK, now for why I really built this. I was thinking it would be cool to have a completely different Nostr client: one that reads my timeline for the past 24h, ranks what might interest me, creates a morning radio show, and sends it to me over SimpleX to listen to in the car.

Listen to mine, but it would create yours for you:

https://blossom.primal.net/ddce19101349a6511436e68a487d30cd6b9b0814dafdd846c63c99f578167c9e

(The voice is from the kitten-tts nano model, which runs even on my old Synology, but you can use whatever text-to-speech engine you want, of course.)

Would anyone be interested if I ran this as a service? (You can run it yourself now: https://github.com/jooray/nalgorithm )

❤️1
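The pipeline described in the note (rank the last 24h of notes, then turn the top ones into a script for a TTS engine like kitten-tts) could be sketched roughly like this; the function name, tuple shape, and wording are illustrative assumptions, not Nalgorithm's actual code:

```python
def radio_script(ranked_notes, top_n=5):
    """Turn the top-ranked notes of the last 24h into a plain-text
    'morning show' script to feed into any TTS engine.
    `ranked_notes` is a list of (score, author, text) tuples."""
    picks = sorted(ranked_notes, reverse=True)[:top_n]
    lines = ["Good morning! Here is what happened on your timeline."]
    for i, (_, author, text) in enumerate(picks, 1):
        lines.append(f"Story {i}, from {author}: {text}")
    lines.append("That was your morning show. Have a great day!")
    return "\n".join(lines)
```

The resulting text is engine-agnostic: pipe it into kitten-tts on a NAS, or any other TTS, then deliver the audio however you like (SimpleX, in Juraj's setup).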
Frederik Handberg · 9h
Thanks I will check it out
Pip the WoT guy · 15h
That's cool! How do you use the LLM exactly, and what model?
Juraj
You create a profile about what you're interested in. Basically just describing what you like.

Then it builds a secondary prompt from your like events, so you can signal what you want more of just by liking posts in your Nostr client. This prompt is updated continuously after ranking.
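Combining the self-described profile with the likes-derived signal into one ranking prompt might look like the sketch below (a guess at the structure, assuming the model is asked for a numeric relevance score; none of these names come from Nalgorithm itself):

```python
def build_ranking_prompt(profile, liked_texts, post):
    """Combine the user's self-described interests with a secondary
    'more like this' signal derived from liked posts, then ask the
    model to rate one post's relevance on a 0-10 scale."""
    likes = "\n".join(f"- {t}" for t in liked_texts)
    return (
        f"User interests: {profile}\n"
        f"Examples of posts the user liked recently:\n{likes}\n\n"
        f"Post to rate:\n{post}\n\n"
        "Reply with a single relevance score from 0 to 10."
    )
```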

I use Gemma 27B; it's cheap and relatively fast. It works with the free tier of Ollama Cloud, but because of CORS restrictions on their side you need to run it through a proxy. Or you can use a local Ollama instance, or any other provider (Venice has a permissive CORS policy).
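Whichever provider you pick, the request shape is the same: Ollama exposes a `POST /api/generate` endpoint taking a model tag and a prompt. A minimal sketch of assembling that request, where only the base URL changes between local Ollama and a CORS proxy in front of Ollama Cloud (the helper name is mine, and the exact model tag is an assumption — the note only says "gemma 27b"):

```python
import json

def ollama_request(base_url, prompt, model="gemma3:27b"):
    """Build the URL and JSON body for Ollama's /api/generate endpoint.
    `base_url` can point at a local Ollama (http://localhost:11434)
    or at a small CORS proxy forwarding to Ollama Cloud.
    The model tag is illustrative; check `ollama list` for yours."""
    url = base_url.rstrip("/") + "/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return url, body
```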

You can even use it with the Venice E2EE proxy, but since Nostr posts are public anyway, I don't see much benefit in encrypting the prompts.

Caching and settings are stored locally in the browser's local storage. There's no "server": it's just a web app that connects to Nostr relays and your chosen inference provider.
❤️1
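The client-side-only design above can be illustrated with a small stand-in for the browser's `localStorage`: everything (settings, cached scores) lives on the user's machine, with nothing server-side. The class, keys, and TTL below are illustrative, not Nalgorithm's actual storage schema:

```python
import json
import time

class LocalCache:
    """Dict-backed stand-in for the browser's localStorage, with a TTL
    so cached LLM scores eventually expire and get re-ranked."""
    def __init__(self):
        # in the real web app this would be window.localStorage
        self.store = {}

    def set(self, key, value, ttl=3600):
        self.store[key] = json.dumps({"v": value, "exp": time.time() + ttl})

    def get(self, key):
        raw = self.store.get(key)
        if raw is None:
            return None
        item = json.loads(raw)
        if time.time() > item["exp"]:
            del self.store[key]  # expired: drop and force a re-rank
            return None
        return item["v"]
```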
Juraj
New Nostr experiment: Nalgorithm.

Primarily a library, but also a view-only web Nostr client that uses LLMs to score posts by relevance to the user (as they describe themselves).

A secondary prompt is extracted from the posts the user has liked; that's a "more like this, please" signal to the algorithm.
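One practical detail in LLM-based scoring like this is that the model's reply is free text, so the score has to be extracted defensively. A minimal sketch of that step, assuming a 0-10 scale (my assumption; the function is illustrative, not from Nalgorithm):

```python
import re

def parse_score(reply, lo=0, hi=10):
    """Extract the first number from an LLM reply and clamp it to the
    scoring range; returns None if the reply contains no number."""
    m = re.search(r"-?\d+(?:\.\d+)?", reply)
    if not m:
        return None
    return max(lo, min(hi, float(m.group())))
```

Clamping matters because models occasionally answer outside the requested range, and a `None` result lets the client skip or retry a post instead of mis-ranking it.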

The results are pretty cool. Who's interested?

72❤️2👀2🎉1💜1🔥1🤍1
ethfi · 1d
Just five more minutes
Frederik Handberg · 15h
I’m interested. Is there a GitHub available?