Demystifying AI Categories - Insights Into Public vs Private vs Personal AI
To Use or Not to Use a Local LLM? Weighing Your Options
Key Takeaways
- Less censorship: Local LLMs offer the freedom to discuss thought-provoking topics without the restrictions imposed on public chatbots, allowing for more open conversations.
- Better data privacy: By using a local LLM, all the data generated stays on your computer, ensuring privacy and preventing access by companies running publicly-facing LLMs.
- Offline usage: Local LLMs allow for uninterrupted usage in remote or isolated areas without reliable internet access, providing a valuable tool in such scenarios.
Since the arrival of ChatGPT in November 2022, the term large language model (LLM) has quickly transitioned from a niche term for AI nerds to a buzzword on everyone’s lips. The greatest allure of a local LLM is the ability to replicate the abilities of a chatbot like ChatGPT on your computer without the baggage of a cloud-hosted version.
Arguments exist for and against setting up a local LLM on your computer. We’ll cut the hype and bring you the facts. Should you use a local LLM?
The Pros of Using Local LLMs
Why are people so hyped about setting up their own large language models on their computers? Beyond the hype and bragging rights, what are some practical benefits?
1. Less Censorship
When ChatGPT and Bing AI first came online, the things both chatbots were willing to say and do were as fascinating as they were alarming. Bing AI acted warm and lovely, like it had emotions. ChatGPT was willing to use curse words if you asked nicely. At the time, both chatbots would even help you make a bomb if you used the right prompts. This might sound like all shades of wrong, but being able to do anything was emblematic of the unrestricted capabilities of the language models that powered them.
Today, both chatbots have been so tightly censored that they won’t even help you write a fictional crime novel with violent scenes. Some AI chatbots won’t even talk about religion or politics. Although LLMs you can set up locally aren’t entirely censorship-free, many of them will gladly do the thought-provoking things the public-facing chatbots won’t do. So, if you don’t want a robot lecturing you about morality when discussing topics of personal interest, running a local LLM might be the way to go.
2. Better Data Privacy
One of the primary reasons people opt for a local LLM is to ensure that whatever happens on their computer stays on their computer. When you use a local LLM, it’s like having a conversation privately in your living room—no one outside can listen in. Whether you’re working with sensitive financial details or having personal conversations with the LLM, all the resulting data is stored only on your computer. The alternative is using publicly-facing LLMs like GPT-4, which gives the companies in charge access to your chat information.
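To make the privacy point concrete, here is a minimal sketch of keeping a chat transcript entirely on your own disk. It assumes a local runner such as llama.cpp or Ollama is producing the replies (the model call itself is not shown); the file name and helper function are illustrative, not part of any particular tool.

```python
import json
from pathlib import Path

# The full transcript lives in a plain file on your own machine,
# not on a vendor's servers.
HISTORY_FILE = Path("chat_history.json")  # stays on your computer

def save_turn(history: list, role: str, text: str) -> list:
    """Append one message and persist the transcript locally."""
    history.append({"role": role, "content": text})
    HISTORY_FILE.write_text(json.dumps(history, indent=2))
    return history

history = save_turn([], "user", "Summarize my contract draft.")
history = save_turn(history, "assistant", "(local model reply here)")
print(len(history))  # 2 turns, all stored on local disk
```

Because nothing here touches the network, auditing what data exists is as simple as opening the file.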
3. Offline Usage
With the internet being widely affordable and accessible, offline access might seem like a trivial reason to use a local LLM. Offline access could become especially critical in remote or isolated locations where internet service is unreliable or unavailable. In such scenarios, a local LLM operating independently of an internet connection becomes a vital tool. It allows you to continue doing whatever you want to do without interruption.
4. Cost Savings
The average price of accessing a capable LLM like GPT-4 or Claude 2 is $20 per month. Although that might not seem like an alarming price, you still get several annoying restrictions for that amount. For instance, with GPT-4, accessed via ChatGPT, you are stuck with a 50-message per three-hour cap. You can only get past those limits by switching to the ChatGPT Enterprise plan, which could potentially cost thousands of dollars. With a local LLM, once you’ve set up the software, there are no $20 monthly subscription or recurring costs to pay. It’s like buying a car instead of relying on ride-share services. Initially, it’s expensive, but over time, you save money.
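The buy-vs-subscribe trade-off above comes down to simple break-even arithmetic. The sketch below uses an illustrative $1,200 hardware figure (an assumption, not a quoted price) against the $20/month subscription mentioned in the article:

```python
import math

def breakeven_months(hardware_cost: float, monthly_fee: float) -> int:
    """Months of subscription fees needed to match the upfront cost."""
    return math.ceil(hardware_cost / monthly_fee)

# e.g. a hypothetical $1,200 GPU upgrade vs. a $20/month plan
print(breakeven_months(1200, 20))  # 60 months (5 years)
```

The real numbers depend on the hardware you already own and how long you use it, so the break-even point can be much shorter than this example suggests.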
5. Better Customization
Publicly available AI chatbots have restricted customization due to security and censorship concerns. With a locally hosted AI assistant, you can fully customize the model for your specific needs. You can train the assistant on proprietary data tailored to your use cases, improving relevance and accuracy. For example, a lawyer could optimize their local AI to generate more precise legal insights. The key benefit is control over customization for your unique requirements.
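One concrete, low-effort form of customization is the system prompt: most local runners (llama.cpp, Ollama, and similar) accept an arbitrary system message in the common role/content chat format, whereas public chatbots expose it only partially. The helper and the legal-assistant prompt below are purely illustrative assumptions echoing the lawyer example above:

```python
def build_prompt(system: str, question: str) -> list:
    """Assemble a chat request in the common role/content message format."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

# Hypothetical domain-specific instructions for a legal use case
LEGAL_SYSTEM = (
    "You are a legal research assistant. Cite the relevant statute "
    "for every claim and flag anything that needs attorney review."
)

messages = build_prompt(LEGAL_SYSTEM, "Is a verbal contract enforceable?")
print(messages[0]["role"])  # system
```

Deeper customization, such as fine-tuning on proprietary documents, builds on the same idea: you control exactly what the model is told and trained on.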
The Cons of Using Local LLMs
- Title: Demystifying AI Categories - Insights Into Public vs Private vs Personal AI
- Author: Frank
- Created at : 2024-08-16 13:56:05
- Updated at : 2024-08-17 13:56:05
- Link: https://tech-revival.techidaily.com/demystifying-ai-categories-insights-into-public-vs-private-vs-personal-ai/
- License: This work is licensed under CC BY-NC-SA 4.0.