StableLM looks promising as well. This space is moving so fast that any attempted censorship will just get you overtaken by a slew of competitors. At this rate we'll be running unfiltered LLMs better than ChatGPT locally on our own GPUs by the end of next year, and giving them any personality we want too.