Running 22 Docker Services at Home for AI-Powered Business Management
The author runs a one-person software consultancy and has built an AI system to help manage his business operations, including email triage, financial tracking, and infrastructure monitoring. The system runs on three machines connected via a mesh VPN, with a local-first approach to avoid relying on cloud AI services.
Why it matters
This article highlights the growing trend of individuals and small businesses leveraging local AI infrastructure to manage their operations, driven by data privacy and control concerns.
Key Points
1. The author runs 22+ Docker containers on a custom-built PC to power his AI-driven business management system
2. A Mac mini M4 handles local large language model (LLM) inference, avoiding reliance on cloud AI services
3. A VPS acts as a canary, monitoring the home network's availability and flagging outages
4. The local-first approach was driven by concerns over data privacy and the limitations of cloud AI services
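The VPS canary in the points above can be sketched as a simple heartbeat check. The timeout value and function names here are illustrative assumptions, not details from the author's actual setup: the home server would post a heartbeat to the VPS on a schedule, and the VPS alerts when the heartbeat goes stale.

```python
import time

# Hypothetical threshold: alert if no heartbeat arrives for 5 minutes.
HEARTBEAT_TIMEOUT_S = 300.0

def should_alert(last_heartbeat: float, now: float,
                 timeout: float = HEARTBEAT_TIMEOUT_S) -> bool:
    """Return True when the home network has missed its heartbeat window.

    The home server periodically POSTs a heartbeat to the VPS; the VPS
    runs this check and notifies the owner when the gap exceeds the timeout.
    """
    return (now - last_heartbeat) > timeout

# A fresh heartbeat does not alert; a long-silent one does.
now = time.time()
print(should_alert(now - 60, now))   # 1 minute since last heartbeat
print(should_alert(now - 900, now))  # 15 minutes of silence
```

Running the alerting logic on an external VPS rather than on the home network itself is the key design point: the monitor must survive the very outage it is meant to detect.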
Details
The author has set up a comprehensive AI-powered system to manage his one-person software consultancy. The core of the system runs on a custom-built PC with an AMD Ryzen 5 2600X CPU and 32GB of RAM, hosting 22+ Docker containers. A Mac mini M4 is dedicated to local LLM inference via Ollama and also runs the Proton Mail Bridge. Additionally, a Hostinger VPS serves as a canary that monitors the availability of the home network.

This local-first approach was driven by the author's concerns over data privacy and the limitations of relying on cloud AI services, which often come with separate subscriptions and restrictions on data usage. The author has implemented explicit guardrails to prevent sensitive business data, such as emails, financial records, and client information, from leaving his private network.
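The local inference path described above might look like the following sketch against Ollama's HTTP API, which by default listens on localhost so prompts and responses never leave the machine. The model name and prompt are placeholders, and this is a minimal illustration rather than the author's actual code.

```python
import json
import urllib.request

# Ollama's default local endpoint; no data leaves the private network.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # stream=False asks for a single JSON response instead of streamed chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running local Ollama server; model name is a placeholder):
# print(generate("llama3", "Summarise today's unread client emails."))
```

Keeping the inference endpoint bound to the local machine (or the mesh VPN) is one simple way to enforce the kind of guardrail the author describes: there is no cloud API key to leak data through in the first place.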