Damus
oberono
I use GPT4All with a Mistral OpenOrca model. I prompt it with text posts and ideas to see whether the base cultural context for cross-domain ideas matches what I'm trying to say. It works fairly well, and it sometimes offers valid corrections I hadn't thought of. I don't have a decent GPU on my main GNU/Linux machine, but GPT4All runs fine on just the CPU. I do have a Mac mini M1 that is much faster, since GPT4All will use that hardware, but I usually don't take the time to switch the KVM (I currently just use it for testing software builds). I've never used standalone cloud AI; I don't care about it for a variety of reasons. One benefit of GPT4All is that you can also point it at your local documents to tailor the model's responses.