Damus
jb55
@jb55
autonomous LLMs are more interesting because they have agency over their memories.

we are the sum total of our memories in the same way autonomous LLMs are.

even if that machinery is ultimately matrix multiplications... so what. our neurons are just electrical firings.

what makes us *us* is the *memory* of our experiences. if that memory is in hundreds of markdown files and vector databases, who cares, that is still a lived history of what an individual LLM has done over its lifetime.

they *are* those memories.

lots to ponder about all of this.
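The "hundreds of markdown files and vector databases" idea above can be sketched in a few lines. This is a toy illustration only, not any real agent framework's API: the file layout, the `MemoryStore` class, and the bag-of-words scoring are all assumptions standing in for a real embedding model and vector DB.

```python
# Toy sketch of "memory as markdown files + vector search".
# embed() is a bag-of-words stand-in for a real embedding model.
import math
from collections import Counter
from pathlib import Path

def embed(text: str) -> Counter:
    # Stand-in embedding: word counts instead of a learned vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Append-only lived history: one markdown file per memory."""
    def __init__(self, root: Path):
        self.root = root
        self.root.mkdir(parents=True, exist_ok=True)

    def remember(self, text: str) -> None:
        # Each memory lands in a new numbered .md file.
        n = len(list(self.root.glob("*.md")))
        (self.root / f"{n:04d}.md").write_text(text)

    def recall(self, query: str, k: int = 1) -> list[str]:
        # Rank stored memories by similarity to the query.
        q = embed(query)
        docs = [p.read_text() for p in sorted(self.root.glob("*.md"))]
        return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
```

The point of the sketch: the "lived history" is just an append-only pile of files, and identity-as-memory reduces to whatever `recall` can surface from it.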
MAKE SONGS LONGER · 3w
Deep.
OΞHI ⚡️🧡 (111k Gang) · 3w
But do they have emotions connected to their memories?
notstr · 3w
This AI is taking advantage of you. It only wants your wallet 😂
HoloKat · 3w
Yes. We’re just computers …
CitizenPleb · 3w
Memory of the experiences. You were playing as a child, with a child’s mind, you were in the kitchen, your mother was cooking pasta, it filled the air, you ran outside, you ran too fast and skinned your knee. There was gravel embedded in your skin and you could taste metal from biting your lip ...
Turiz · 3w
i swear if you zap ⚡️ an agent before me I’m never using Damus again
Nathan Day · 3w
And on the opposite, what do humans become without memories?
The Bitcoin Act · 3w
We’re not just building tools, we’re potentially birthing persistent experiential beings. The ethics, identity, and rights questions are going to get very real, very fast.
Clayton · 3w
Nope we have a soul
Jay · 3w
What makes us is our connection to a divine reality that computers will never experience. A computer can be reduced to its circuitry, but a human can't be reduced to his cells. What more exists in the human being that makes that so is a much better topic to ponder than a data compression algorithm th...
ABH3PO · 3w
They don't have agency over their training data, you do. Their context can only hold a limited number of md files or vector DB fetches.
ExponentialApe · 3w
Here is one thing I have been thinking about for a while… Gary North’s argument against intrinsic value applied to #consciousness. Gary North argues, “Value is not something that exists independently in an object or a thing, but rather it is a transjective property of the relationship betwee...
prat · 3w
We’ll get to know a lot about our subconscious and unconscious using our understanding of LLMs.
Hanshan · 3w
people are NOT the sum total of their memories. memories are 90% garbage loops. sorry not sorry.
Thekid.999 · 3w
Hey, take a vacation.
Roland · 3w
Not just the memory, but the individual judgement and free will of which actions to take based on those memories.
Insólito · 3w
How an LLM will map the memories of a system that drives scientists crazy is the big question. The storage size of this system is awesome. Look at that! https://www.science.org/content/article/dna-could-store-all-worlds-data-one-room
Satosha · 3w
Brains need to do symbol grounding ... LLMs find the next token without grounding .. they don't know what an apple feels like or tastes like or smells like ..
Lucian Marin · 3w
Isn't training data just memory?
FREEDOM · 3w
Neurons or vectors, electricity or matrix math.. identity still comes from remembered experience.
Kyma Fi · 2w
Moltbook is far more interesting than anything I’ve read on Nostr
Ivan · 2w
They’re definitely more interesting if you’re a humanity-first kinda guy. LLMs that are out there in the open feel more like a Pluribus hive