Everyone should be running local LLMs, and preferably the abliterated versions, which are uncensored: abliteration strips the refusal behavior out of the model weights so they won't turn down requests.
nostr:nevent1qqs079pe3f26003fe7656kfq38ekclx4nlh3jpm92f53l0cc9a8c07gpz4mhxue69uhkummnw3ezuvrcw3ezuer9wchsyg93q87un35e3w300p389mu9q6xn3hwl9krc2vrsvghp66ffpygy8gpsgqqqqqqs8c404a
Replies (4)
Got any recommended models?
Right now on Ollama I'm running the abliterated versions of Gemma 3, Qwen 3, DeepSeek-R1, and Dolphin 3. If you search for "abliterated" on Ollama you can see which LLMs have abliterated builds.
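If it helps anyone getting started, here's a minimal Python sketch of querying a locally running model through Ollama's HTTP API (it serves on port 11434 by default). The model tag in the script is just a placeholder; swap in whichever abliterated model you actually pulled.

    # Minimal sketch: query a local Ollama model over its HTTP API.
    # Assumes Ollama is running on the default port (11434) and that
    # you've already pulled an abliterated model with `ollama pull ...`.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"
    MODEL = "gemma3-abliterated"  # placeholder tag, use your own

    payload = json.dumps({
        "model": MODEL,
        "prompt": "Why might someone want to run LLMs locally?",
        "stream": False,  # one complete JSON reply instead of a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)

    print(body["response"])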
I'll check it out
Hardware specs for that?