Building a Dynamic AI Chat UI with Token-Based UI Toggling in Remix

Exploring the interplay between AI-generated tokens and dynamic UI updates


🤔 The Challenge: Creating a Fluid and Adaptive AI Chat UI

When building an AI chat system, real-time interaction is key to a great user experience. We want the UI to respond dynamically to incoming AI tokens, adjusting elements like:

  • Loading indicators (when AI is “thinking”)
  • Message bubbles that expand dynamically
  • Progressive rendering (token-by-token)
  • Graph and table integration (when data-driven responses are detected)
  • File generation buttons (when AI suggests downloadable content)
  • Code blocks with syntax highlighting

Instead of waiting for an entire response to arrive, we want elements to appear as soon as tokens arrive. The solution? Token-based UI state management.


๐Ÿ› ๏ธ Key Architectural Concepts

  1. Token Streaming from AI Model → The AI sends tokens incrementally.
  2. State Management for UI Toggles → UI elements enable/disable based on token patterns.
  3. Dynamic Rendering of UI Components → Real-time expansion of messages, code snippets, tables, and graphs.
  4. User Experience Enhancements → Typing indicators, collapsible sections, and animated loading states.
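Concept 2 can be sketched as a small, framework-agnostic reducer before any Remix wiring. This is a minimal sketch: the `UIState` type and `applyToken` name are my own, but the marker strings match the ones used throughout this post.

```typescript
// Pure reducer mapping streamed token patterns to UI-toggle state.
type UIState = {
  tableMode: boolean;  // true between [START_TABLE] and [END_TABLE]
  codeMode: boolean;   // flipped by each ``` fence
  fileReady: boolean;  // set once [DOWNLOAD_FILE] is seen
};

const initialUIState: UIState = { tableMode: false, codeMode: false, fileReady: false };

// Apply one streamed token to the current UI state without mutating it.
function applyToken(state: UIState, token: string): UIState {
  const next = { ...state };
  if (token.includes("[START_TABLE]")) next.tableMode = true;
  if (token.includes("[END_TABLE]")) next.tableMode = false;
  if (token.includes("```")) next.codeMode = !next.codeMode;
  if (token.includes("[DOWNLOAD_FILE]")) next.fileReady = true;
  return next;
}

// Folding a token stream through the reducer yields the final UI state.
const tokens = ["Here is", " a table:", "[START_TABLE]", "| a | b |", "[END_TABLE]"];
const finalState = tokens.reduce(applyToken, initialUIState);
console.log(finalState); // tableMode flipped on by [START_TABLE], back off by [END_TABLE]
```

Keeping the toggling logic pure like this makes it trivially unit-testable; the React component can then forward each decoded chunk into it.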

📌 Step 1: Pseudocode for UI State Management with Tokens

Before writing actual code, let's outline how token-driven state toggling works.

1. Core Chat Flow

Python
initializeChatUI()

onUserInput(message):
    disableInputField()  # Prevents multiple requests
    showTypingIndicator()  # Shows "AI is typing..."
    requestID = generateUniqueID()

    startStreamFromAI(message, requestID)

startStreamFromAI(userMessage, requestID):
    openConnectionToLLM()  # Establish token stream
    buffer = ""
    
    for token in streamFromLLM():
        buffer += token
        updateChatUI(buffer)

        # Handle special UI triggers
        if token == "[START_TABLE]":
            enableTableMode()
        if token == "[END_TABLE]":
            finalizeTableRendering()

        if token.startswith("```"):  # Detects code blocks
            toggleCodeSnippetMode()

        if "[DOWNLOAD_FILE]" in token:
            enableFileDownloadButton()

    finalizeResponse(buffer)
    hideTypingIndicator()
    enableInputField()
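One subtlety the pseudocode glosses over: streamed chunks rarely align with marker boundaries, so a marker like `[START_TABLE]` can arrive split across two tokens. A chunk-boundary-safe scanner can handle this by buffering a short tail between chunks. The class and method names below are my own illustrative choices:

```typescript
// UI control markers this chat protocol uses (assumed, matching the pseudocode).
const MARKERS = ["[START_TABLE]", "[END_TABLE]", "[DOWNLOAD_FILE]"];

class MarkerScanner {
  private buffer = "";

  // Feed one streamed chunk; return any complete markers detected so far.
  push(chunk: string): string[] {
    this.buffer += chunk;
    const found: string[] = [];
    for (const marker of MARKERS) {
      let idx: number;
      while ((idx = this.buffer.indexOf(marker)) !== -1) {
        found.push(marker);
        // Remove the matched marker so it is not reported twice.
        this.buffer = this.buffer.slice(0, idx) + this.buffer.slice(idx + marker.length);
      }
    }
    // Keep only a tail long enough to hold a partial marker across chunks.
    const maxLen = Math.max(...MARKERS.map((m) => m.length));
    if (this.buffer.length > maxLen) {
      this.buffer = this.buffer.slice(this.buffer.length - maxLen + 1);
    }
    return found;
  }
}

const scanner = new MarkerScanner();
console.log(scanner.push("Hello [START_")); // → []
console.log(scanner.push("TABLE] world")); // → ["[START_TABLE]"]
```

A naive per-chunk `includes()` check would silently miss the split marker in the first call above.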

📌 Step 2: Implementing Streaming in Remix

We'll use Server-Sent Events (SSE) for real-time token streaming in Remix.

Backend Streaming Handler (app/routes/chat.tsx)

TSX
import type { ActionFunctionArgs } from "@remix-run/node";

// Backend: proxy the streamed AI response through to the client
export async function action({ request }: ActionFunctionArgs) {
  const body = await request.json();
  const userMessage = body.message;

  const aiResponse = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4",
      messages: [{ role: "user", content: userMessage }],
      stream: true, // Token streaming enabled
    }),
  });

  const reader = aiResponse.body?.getReader();
  if (!reader) throw new Response("Upstream stream unavailable", { status: 502 });

  const stream = new ReadableStream({
    async start(controller) {
      // Forward each upstream chunk to the client as it arrives.
      while (true) {
        const { value, done } = await reader.read();
        if (done) break;
        controller.enqueue(value);
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
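Note that the OpenAI streaming API delivers Server-Sent Events: each event is a line of the form `data: {json}`, and the stream terminates with `data: [DONE]`. The handler above forwards those events verbatim, so whichever side wants plain text tokens needs a small parser that extracts the `choices[0].delta.content` pieces. A minimal sketch (the function name is my own):

```typescript
// Extract the text deltas from one decoded SSE chunk of an OpenAI
// chat-completions stream. Lines look like: data: {"choices":[{"delta":{...}}]}
function parseSSEChunk(chunk: string): string[] {
  const tokens: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") continue; // end-of-stream sentinel
    try {
      const parsed = JSON.parse(payload);
      const delta = parsed.choices?.[0]?.delta?.content;
      if (typeof delta === "string") tokens.push(delta);
    } catch {
      // Ignore incomplete JSON; a production parser would buffer it
      // until the rest of the event arrives.
    }
  }
  return tokens;
}

const sample =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n' +
  "data: [DONE]\n";
console.log(parseSSEChunk(sample).join("")); // → "Hello"
```

Running this on the backend (and streaming only the extracted text) keeps the frontend simple; running it on the frontend keeps the backend a thin proxy. Either placement works with the code in this post.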

📌 Step 3: Frontend with UI Toggle Logic

Key Features:

  • Token-based dynamic updates
  • Enables/disables UI elements dynamically
  • Manages state for different response types (text, tables, code, files, etc.)

👉 app/routes/chat.tsx


TSX
import { useState } from "react";

export default function Chat() {
  const [messages, setMessages] = useState<{ role: string; content: string }[]>([]);
  const [currentMessage, setCurrentMessage] = useState("");
  const [aiResponse, setAiResponse] = useState("");
  const [isStreaming, setIsStreaming] = useState(false);
  const [isTableMode, setIsTableMode] = useState(false);
  const [isCodeMode, setIsCodeMode] = useState(false);
  const [fileDownloadLink, setFileDownloadLink] = useState<string | null>(null);

  const sendMessage = async () => {
    if (!currentMessage.trim()) return;

    const userMessage = currentMessage;
    setMessages((prev) => [...prev, { role: "user", content: userMessage }]);
    setCurrentMessage("");
    setIsStreaming(true);

    // useFetcher buffers the whole response before exposing it, so we call
    // fetch() directly to get access to the underlying ReadableStream.
    const response = await fetch("/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: userMessage }),
    });

    const reader = response.body?.getReader();
    if (!reader) {
      setIsStreaming(false);
      return;
    }

    const decoder = new TextDecoder();
    let buffer = "";

    while (true) {
      const { done, value } = await reader.read();
      if (done) break;

      const token = decoder.decode(value, { stream: true });
      buffer += token;

      if (token.includes("[START_TABLE]")) setIsTableMode(true);
      if (token.includes("[END_TABLE]")) setIsTableMode(false); // Finalize table rendering

      if (token.includes("```")) setIsCodeMode((mode) => !mode); // Detects code fences

      if (token.includes("[DOWNLOAD_FILE]")) {
        setFileDownloadLink("/path/to/generated/file");
      }

      setAiResponse(buffer);
    }

    setIsStreaming(false);
    setAiResponse("");
    // Functional update avoids the stale-closure bug of `[...messages, ...]`.
    setMessages((prev) => [...prev, { role: "assistant", content: buffer }]);
  };

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map((msg, index) => (
          <div key={index} className={`message ${msg.role}`}>
            {msg.content}
          </div>
        ))}

        {isStreaming && <div className="message assistant">{aiResponse}</div>}
        
        {isTableMode && <div className="table-container">Loading table...</div>}
        {isCodeMode && <pre className="code-snippet">Rendering Code...</pre>}
        {fileDownloadLink && (
          <a href={fileDownloadLink} download>
            Download File
          </a>
        )}
      </div>

      <div className="input-container">
        <input
          type="text"
          value={currentMessage}
          onChange={(e) => setCurrentMessage(e.target.value)}
          placeholder="Type a message..."
        />
        <button onClick={sendMessage} disabled={isStreaming}>Send</button>
      </div>
    </div>
  );
}
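One refinement worth making: the control markers drive UI toggles but should not appear in the rendered chat text. A hypothetical helper to strip them before display:

```typescript
// Control markers used by this chat protocol (assumed, matching the post).
const CONTROL_MARKERS = ["[START_TABLE]", "[END_TABLE]", "[DOWNLOAD_FILE]"];

// Remove every occurrence of each control marker from the display text.
function stripControlMarkers(text: string): string {
  return CONTROL_MARKERS.reduce(
    (acc, marker) => acc.split(marker).join(""),
    text
  );
}

console.log(stripControlMarkers("Data ready [DOWNLOAD_FILE] here")); // marker removed
```

In the JSX above you would then render `{stripControlMarkers(msg.content)}` instead of `{msg.content}`.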

📌 Step 4: Dynamic Styling

app/styles/chat.css


CSS
.table-container {
  background: #f8f9fa;
  padding: 10px;
  border: 1px solid #ddd;
}

.code-snippet {
  background: #282c34;
  color: #61dafb;
  padding: 5px;
  font-family: monospace;
}

a {
  color: #007aff;
  text-decoration: none;
}

🚀 The Final Product

✅ Live-streaming AI responses (token-by-token)
✅ Dynamic UI toggling (tables, code, file downloads)
✅ Smooth state management with Remix


🔮 What's Next?

  • Enhance graph rendering for AI-generated analytics.
  • Persist chat history using databases (PostgreSQL, Firebase).
  • Improve typing animations for a smoother UX.

For more insights and the latest updates, explore our blog archives or visit nomadule.com.