26 points by c990802 2 days ago | 5 comments
It's an open-source ChatGPT Teams alternative made with the needs of businesses and organizations in mind.
It has everything you'd expect from ChatGPT, plus first-class support for user and role management, as well as advanced collaboration features. We've made it very easy to self-host anywhere you like, and we also provide a cloud version. So there's no excuse not to try it out!
Why use Llama Workspace instead of ChatGPT Teams?
- You get access to all the major large language models in one place, including GPT-4, Claude Sonnet, and Gemini.
- It brings savings of about 70%-85% compared to ChatGPT Teams (see the website for more details).
- You can integrate it with your own code, such as AI agents, and provide a single platform for accessing AI.
Here's the GitHub link: https://github.com/llamaworkspace/llamaworkspace
I'd love to hear your ideas, experiences, and feedback about the product! What should we implement next?
janderson215 2 hours ago
What is the recommended resource usage for self-hosted? I don’t see it in that section on your site.
aaronharnly 1 day ago
How (in your opinion) does this project compare to https://www.librechat.ai?
(My company selected LibreChat to host our internal “GPT Teams” implementation after surveying and evaluating the current state of tools as of about six months ago, but I’m sure it’s a rapidly evolving space.)
c990802 13 hours ago
It is indeed a rapidly evolving space, and I think choosing LibreChat was a great choice. We have nothing but good things to say about them!
In our case, we'd like to focus our efforts on addressing the specific needs of businesses and organizations that aren't always well covered by similar open-source projects. Foundational things like granular permissions, advanced auth, usage reports, spend control, and content moderation are top priorities on our roadmap. Beyond that, we'd like to invest heavily in data integrations and RAG.
Finally, we'd like to make it extremely easy to self-host by making the deployment and upgrading process a breeze.
Hope this takes us to a place where people find value in our work!
ganeshkrishnan 1 day ago
I would debate this one. As a "normal" user of ChatGPT asking around 10-20 questions a day, I can exceed this $30 per month because:
1) It's not words per se but tokens, and many words are split into multiple tokens.
2) For each question in a chat, all the preceding context is re-sent and counted toward the cost.
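The effect of point 2 compounds quickly. A minimal sketch of the idea, with purely illustrative token counts (the per-question and per-answer sizes are assumptions, not figures from the thread):

```python
def total_input_tokens(turns, q_tokens=50, a_tokens=400):
    """Total billed input tokens across a chat where every turn
    re-sends the entire prior conversation as context."""
    total = 0
    context = 0
    for _ in range(turns):
        context += q_tokens   # new question joins the context
        total += context      # the whole context is billed as input
        context += a_tokens   # the model's answer joins the context too
    return total

# One standalone question bills only its own tokens:
print(total_input_tokens(1))   # 50
# Ten turns in one chat bill far more than 10 * 50 = 500 input tokens,
# because earlier questions and answers are re-sent every turn:
print(total_input_tokens(10))  # 20750
```

So a long-running chat can cost an order of magnitude more input tokens than the same questions asked independently, which is the commenter's point.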
I think pivoting towards self hosting llama models and charging for them might be more profitable to you and to the users.
c990802 13 hours ago
So for example, your scenario of 20 questions a day for 30 days:
- Monthly queries: 20 × 30 = 600 queries/mo
- Model: gpt-4o ($2.5/M input tokens, $10/M output tokens)
- Average input tokens: 1,000 (actuals are much lower)
- Average output tokens: 400

=> Cost per run: $0.0065
=> Monthly cost: $3.90
PS: True, we deliberately used "words" rather than "tokens" in some contexts so that anyone can easily understand it.