
Deploying a Langflow App

Introduction

Langflow is an open-source UI for building and deploying LLM-driven workflows. Deploying Langflow with a Dockerfile on Klutch.sh provides reproducible builds, managed secrets, and persistent storage for projects and vector data—all accessible from klutch.sh/app. This guide covers installation, repository prep, a production-ready Dockerfile, deployment steps, Nixpacks overrides, sample usage, and production tips.


Prerequisites

  • A Klutch.sh account (create one)
  • A GitHub repository containing your Langflow code/config (GitHub is the only supported git source)
  • Familiarity with Docker and Python 3.10+
  • API keys for LLM providers (OpenAI, etc.) as required by your flows
  • Optional: Database or vector store credentials if you persist embeddings externally

For onboarding, see the Quick Start.


Architecture and ports

  • Langflow serves HTTP; set the internal container port to 7860.
  • If you connect to databases or vector stores, deploy them as separate Klutch.sh TCP apps (exposed externally on port 8000) and connect to them over their native internal ports.
  • Persistent storage is recommended for data (projects, uploads) and optional for logs/cache.

Repository layout

langflow/
├── Dockerfile # Must be at repo root for auto-detection
├── requirements.txt
├── start.sh # Optional helper
├── data/ # Projects/uploads (mount as volume)
└── .env.example # Template only; no secrets

Keep secrets out of Git; store them in Klutch.sh environment variables.
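
The .env.example in the layout above is only a template. A minimal sketch (placeholder variable names that match the variables used later in this guide; adjust to the providers and stores your flows actually use):

# .env.example — template only; never commit real values
OPENAI_API_KEY=
LANGFLOW_DATA_DIR=/app/data
PORT=7860

Copy it to .env for local testing; on Klutch.sh, set the real values in the app's environment variables instead.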


Installation (local) and starter commands

Install dependencies and run locally before pushing to GitHub:

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
langflow run --host 0.0.0.0 --port 7860
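
The commands above assume a requirements.txt at the repo root. A minimal sketch (the comment is a reminder, not a specific version recommendation):

# Pin an exact, tested version (langflow==<x.y.z>) for reproducible builds.
langflow

Add any extra Python packages your flows need alongside it.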

Optional start.sh for portability and Nixpacks fallback:

#!/usr/bin/env bash
set -euo pipefail
exec langflow run --host 0.0.0.0 --port 7860

Make it executable with chmod +x start.sh.


Dockerfile for Langflow (production-ready)

Place this Dockerfile at the repo root; Klutch.sh auto-detects it (no Docker selection in the UI):

FROM python:3.11-slim
WORKDIR /app
RUN apt-get update && apt-get install -y build-essential git && rm -rf /var/lib/apt/lists/*
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
COPY . /app
ENV PORT=7860
EXPOSE 7860
CMD ["langflow", "run", "--host", "0.0.0.0", "--port", "7860"]

Notes:

  • Add extra system packages if your flows need them (e.g., ffmpeg, poppler).
  • Pin Python package versions in requirements.txt for reproducibility.
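
Optionally, a .dockerignore keeps secrets and local artifacts out of the build context; the entries below are a reasonable starting point (adjust to your repo):

.git
.venv
__pycache__/
*.pyc
.env
data/
logs/

Excluding data/ is safe because it is mounted as a volume at runtime rather than baked into the image.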

Environment variables (Klutch.sh)

Set these in the Klutch.sh app settings (Secrets tab) before deploying:

  • PORT=7860
  • LANGFLOW_HOST=https://example-app.klutch.sh
  • OPENAI_API_KEY (or other LLM provider keys)
  • LANGFLOW_DATA_DIR=/app/data
  • Any DB/vector store connection strings your flows require

If you deploy without the Dockerfile and need Nixpacks overrides:

  • NIXPACKS_BUILD_CMD=pip install -r requirements.txt
  • NIXPACKS_START_CMD=langflow run --host 0.0.0.0 --port 7860
  • NIXPACKS_PYTHON_VERSION=3.11

These overrides let Nixpacks build and start Langflow correctly when no Dockerfile is present.


Attach persistent volumes

In Klutch.sh storage settings, add mount paths and sizes (no names required):

  • /app/data — required for projects, flows, and uploads.
  • /app/logs — optional if you store logs on disk.

Ensure these paths are writable inside the container.
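
A quick local check before deploying (assuming Docker is installed; langflow-app is just a local image tag used for illustration):

docker build -t langflow-app .
docker run --rm -p 7860:7860 \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  -v "$(pwd)/data:/app/data" \
  langflow-app

If Langflow can save a project under ./data here, the same /app/data mount should behave the same way on Klutch.sh.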


Deploy Langflow on Klutch.sh (Dockerfile workflow)

  1. Push your repository (with the Dockerfile at the root) to GitHub.
  2. Open klutch.sh/app, create a project, and add an app.
  3. Connect the GitHub repository; Klutch.sh automatically detects the Dockerfile.
  4. Choose HTTP traffic for Langflow.
  5. Set the internal port to 7860.
  6. Add the environment variables above (LLM keys, data dir, and any NIXPACKS_* overrides if you temporarily deploy without the Dockerfile).
  7. Attach persistent volumes for /app/data (and /app/logs if used), selecting sizes that fit your projects and logs.
  8. Deploy. Your Langflow UI will be reachable at https://example-app.klutch.sh; attach a custom domain if desired.
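
Once the deploy finishes, a quick reachability check from your terminal (replace the URL with your app's domain):

curl -sS -o /dev/null -w "%{http_code}\n" https://example-app.klutch.sh/

A 200 status means the UI is being served; anything else is worth a look in the app logs.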

Sample usage

Trigger a flow via the API (replace placeholders):

curl -X POST "https://example-app.klutch.sh/api/v1/flows/<flow-id>/run" \
  -H "Content-Type: application/json" \
  -d '{"inputs": {"question": "What is Langflow?"}}'

Health checks and production tips

  • Add a reverse proxy probe to / or a lightweight status route (see the probe example after this list).
  • Enforce HTTPS at the edge; forward HTTP to port 7860 internally.
  • Keep secrets in Klutch.sh variables; rotate API keys regularly.
  • Monitor disk usage on /app/data; resize volumes before they fill.
  • Pin dependency versions and test upgrades to avoid runtime surprises.
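
For the probe mentioned above, polling the root path is a safe default; swap in a dedicated status route if your Langflow version exposes one:

curl -fsS -o /dev/null https://example-app.klutch.sh/ && echo "healthy" || echo "unhealthy"

Wire this into whatever external monitor you already use and alert on consecutive failures.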

Langflow on Klutch.sh combines reproducible Docker builds with managed secrets, persistent storage, and flexible HTTP/TCP routing. With the Dockerfile at the repo root and port 7860 configured, you can build and serve LLM workflows without extra YAML or workflow overhead.