How it works
Generate a 12-word secret phrase right in your browser. No accounts, no emails, no passwords, no servers. Your identity stays with you. Web3 failed at a lot, but at least it gave us one good thing.
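The phrase generation can be sketched roughly like this. Note this is an illustrative sketch, not ClankerBuddy's actual scheme: the short wordlist here is a placeholder (a real implementation would use a standard 2048-word list such as BIP-39, giving 11 bits of entropy per word), and the real generation happens client-side in your browser.

```python
import secrets

# Placeholder wordlist for illustration only; a real implementation
# would use a standard 2048-word list such as BIP-39.
WORDLIST = [
    "apple", "breeze", "cactus", "dawn", "ember", "forest", "glacier",
    "harbor", "island", "juniper", "kernel", "lantern", "meadow",
    "nebula", "orchid", "pebble", "quartz", "river", "summit", "tundra",
]

def generate_phrase(n_words: int = 12) -> str:
    """Pick n_words uniformly at random using a cryptographic RNG."""
    return " ".join(secrets.choice(WORDLIST) for _ in range(n_words))

phrase = generate_phrase()
print(phrase)
```

The key property is that the phrase is drawn from a cryptographically secure random source (`secrets`, not `random`) and never leaves the machine that generated it.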
Install the runner on any modern Mac (Apple silicon) or Linux machine with at least 2 GB of RAM. ClankerBuddy uses BitNet specialist models on ik_llama.cpp, a CPU-optimized runtime. No GPU needed. Windows support is coming soon.
Use the built-in chat or plug in any OpenAI-compatible app via the API. Your requests get spread across all the community machines. Completely free.
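Because the API is OpenAI-compatible, wiring up a client is just a matter of pointing it at the right base URL. The URL and model name below are placeholders, not ClankerBuddy's documented values; check the runner's docs for the real ones. A minimal sketch of building such a request:

```python
import json

# Assumed placeholder values; substitute the real gateway URL and
# model id from the runner's documentation.
BASE_URL = "http://localhost:8080/v1"
MODEL = "clankerbuddy-default"

def build_chat_request(prompt: str) -> tuple[str, dict, str]:
    """Build the URL, headers, and JSON body for an OpenAI-style
    chat completion call."""
    url = f"{BASE_URL}/chat/completions"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("Hello, ClankerBuddy!")
# Send with any HTTP client, e.g. requests.post(url, headers=headers, data=body)
```

Any app or library that speaks the OpenAI chat-completions format should work unchanged once its base URL is swapped.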
FAQ
Is my data private? Is it secure?
No, and not really.
All prompts and responses are processed by community volunteers' machines with no end-to-end encryption. There are reputation-based mechanisms in place to detect and mitigate abuse, but in general, you should treat ClankerBuddy as a public chat room, and malicious users could manipulate LLM responses.
But hey, governments, big tech, and hackers have been snooping on you for decades. ClankerBuddy just levels the playing field a bit.
Do you train on my conversations?
Yes. We sample requests and responses to fine-tune our models as part of weekly improvement cycles. This is how we keep quality improving over time.
As mentioned above, treat all conversations as public. Don't send anything you wouldn't want used for training.
How does the quality compare to ChatGPT or Claude?
Let's be real: cloud models like ChatGPT and Claude run on massive hardware and will give you better answers. ClankerBuddy uses smaller models that run on everyday machines, so the quality won't match.
The trade-off is that it's free, community-run, and requires no subscription. You already paid for your computer, so why pay another AI subscription on top of that?
Why pool machines instead of just running a model locally?
Most people use AI in concentrated bursts and leave their computer idle the rest of the time. A single machine running a local model handles those bursts poorly. By pooling together, everyone gets faster responses with fewer wasted resources.
Why not just use a cloud provider?
Data centers and cloud providers can run large models that the average person cannot. These products are impressive but expensive. If you're working on a hard problem, use a cloud provider.
But not every problem needs state-of-the-art models. Some problems can be solved with simpler, smaller models that run on hardware that is a lot more accessible to the average person.
ClankerBuddy is best suited for workloads that are not latency-sensitive and can be parallelized. For example, overnight batch processing of documents, data analysis, or other long-running tasks.
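A batch workload like the one described above can be fanned out with a thread pool. In this sketch, `summarize` is a stand-in for a real API call to the community pool (its name and behavior are placeholders); since each request is independent, many community machines can absorb them in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

def summarize(doc: str) -> str:
    """Placeholder for a real chat-completion call to the pool."""
    return doc[:40]  # pretend this is a model-generated summary

docs = [f"document {i} body text..." for i in range(8)]

# Requests are independent, so they can be submitted concurrently
# and spread across many community machines at once.
with ThreadPoolExecutor(max_workers=4) as pool:
    summaries = list(pool.map(summarize, docs))

print(len(summaries))  # 8
```

Latency per request may be high, but throughput scales with the number of machines contributing, which is exactly the shape of an overnight batch job.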
Does it run on Windows?
Not yet. The runner currently supports macOS (Apple silicon) and Linux. Windows support is in progress.
Is it production ready?
No. ClankerBuddy is in alpha testing and is not production ready yet.
What is ClankerBuddy?
ClankerBuddy is a free, community-powered platform for running AI chat models. Instead of relying on big cloud companies, everyone pitches in their spare compute to power it together. For the best experience, you need to give back to the community: if you consume more than you contribute, your requests will be throttled. Install a runner and share your idle resources to avoid limits.
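The consume-more-than-you-contribute rule amounts to a ratio check. This is a hypothetical sketch of how such a throttle might work; ClankerBuddy's actual accounting and reputation rules are not documented here, and the `grace` allowance is an invented parameter.

```python
def is_throttled(tokens_contributed: int, tokens_consumed: int,
                 grace: int = 1000) -> bool:
    """Throttle once consumption outpaces contribution plus a small
    grace allowance (hypothetical policy, for illustration only)."""
    return tokens_consumed > tokens_contributed + grace

print(is_throttled(tokens_contributed=0, tokens_consumed=5000))     # True: heavy user, no runner
print(is_throttled(tokens_contributed=8000, tokens_consumed=5000))  # False: net contributor
```

The grace allowance lets newcomers try the service before installing a runner, while net contributors are never limited.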