quarkus-chat-ui: A Web Front-End for LLMs, and a Real-World Case for POJO-actor
quarkus-chat-ui is a web UI that allows multiple instances of large language models (LLMs) to communicate with each other, built using the POJO-actor framework.
Why it matters
quarkus-chat-ui provides a real-world example of building a web front-end through which multiple LLM instances communicate with each other, an important capability for advanced AI applications.
Key Points
- quarkus-chat-ui exposes an HTTP MCP (Model Context Protocol) server at /mcp, allowing LLM instances to call tools on each other
- It supports MCP-capable LLM backends such as Claude Code CLI, Codex, or local models via claw-code-local
- The web UI provides a stable place to type prompts, plus a prompt queue to handle the asynchronous responses from LLMs
- quarkus-chat-ui is written in Quarkus, which simplifies handling the streaming responses from the LLMs
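The prompt queue mentioned above can be sketched in plain Java: the UI thread enqueues prompts instantly while a single worker feeds them to the slow LLM backend one at a time. The `PromptQueue` class and the `Function`-based backend below are illustrative assumptions, not quarkus-chat-ui's actual API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;

// Minimal prompt-queue sketch (hypothetical names, not the real API):
// submit() returns immediately; a daemon worker drains prompts FIFO,
// so a response that takes tens of seconds never blocks typing.
public class PromptQueue {
    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();
    private final List<String> responses = new ArrayList<>();
    private final CountDownLatch done;

    public PromptQueue(Function<String, String> backend, int expected) {
        this.done = new CountDownLatch(expected);
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String prompt = pending.take();       // blocks until a prompt arrives
                    String reply = backend.apply(prompt); // may take tens of seconds
                    synchronized (responses) { responses.add(reply); }
                    done.countDown();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    // Called from the UI thread; never waits on the backend.
    public void submit(String prompt) { pending.add(prompt); }

    public List<String> awaitAll(long timeoutMs) throws InterruptedException {
        done.await(timeoutMs, TimeUnit.MILLISECONDS);
        synchronized (responses) { return new ArrayList<>(responses); }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in backend that "thinks" briefly before echoing.
        PromptQueue q = new PromptQueue(p -> "echo: " + p, 2);
        q.submit("hello");
        q.submit("world"); // queued while the first prompt may still be running
        System.out.println(q.awaitAll(5000)); // FIFO order: single worker drains in submission order
    }
}
```

Because a single worker drains the queue, responses come back in submission order even when the backend is slow.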
Details
quarkus-chat-ui is a web front-end that lets multiple instances of large language models (LLMs) communicate with each other, built as a real-world use case for the POJO-actor framework. Each quarkus-chat-ui instance exposes an HTTP MCP (Model Context Protocol) server at the /mcp endpoint, allowing LLM instances to call tools on each other. The LLM backend, such as Claude Code CLI, Codex, or a local model via claw-code-local, acts as an MCP client that reaches these endpoints. The key challenge was wiring up asynchronous communication over HTTP, since an LLM response can take tens of seconds to arrive as a stream. quarkus-chat-ui solves this by bridging the LLM backends and the web UI: the UI gives users a stable place to type prompts, and a prompt queue absorbs the asynchronous responses. The article also notes that quarkus-chat-ui is written in Quarkus, which simplifies handling the streaming responses from the LLMs.
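The /mcp wiring described above can be illustrated with a toy HTTP endpoint and client built on the JDK's own server. This is a sketch only: real MCP speaks JSON-RPC 2.0 over its transport, and the plain-text payload, class name, and port here are invented stand-ins, not quarkus-chat-ui's actual protocol or code.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Toy sketch of one instance's /mcp endpoint plus a client call,
// standing in for an MCP-capable LLM backend calling a tool on a
// peer instance. All names here are illustrative assumptions.
public class McpEndpointSketch {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/mcp", exchange -> {
            String prompt = new String(exchange.getRequestBody().readAllBytes(),
                    StandardCharsets.UTF_8);
            // In the real app this would be forwarded to the local LLM
            // backend via the prompt queue; here we just acknowledge it.
            byte[] reply = ("received: " + prompt).getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, reply.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(reply); }
        });
        server.start();
        return server;
    }

    // Plays the role of the MCP client (the LLM backend) reaching a peer.
    public static String callTool(int port, String message) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(
                        URI.create("http://localhost:" + port + "/mcp"))
                .POST(HttpRequest.BodyPublishers.ofString(message))
                .build();
        return HttpClient.newHttpClient()
                .send(req, HttpResponse.BodyHandlers.ofString())
                .body();
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start(8765);
        System.out.println(callTool(8765, "hello from instance B"));
        server.stop(0);
    }
}
```

The point of the sketch is the shape of the interaction: each instance is simultaneously an HTTP server (its /mcp endpoint) and, through its LLM backend, a client of its peers.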