Deploying Evidence

Introduction

Evidence is an open-source, code-based business intelligence tool that lets you build reports and dashboards using SQL and markdown. Unlike traditional drag-and-drop BI tools, Evidence treats your analytics as code, enabling version control, code review, and collaboration through familiar development workflows. Evidence generates a static website from markdown files containing SQL queries, charts, and components, making it perfect for data teams who want to deliver beautiful, fast, and maintainable analytics applications.

With Evidence, you write SQL statements directly in markdown files to query your data sources, then use built-in components to render charts, tables, and visualizations. The platform supports templated pages, loops, conditional rendering, and a rich component library that makes creating professional data products straightforward and enjoyable. Evidence is built on modern web technologies including SvelteKit, making it incredibly fast and responsive.

This guide walks you through deploying Evidence on Klutch.sh with Docker, from repository setup through production deployment. You’ll learn how to connect data sources, create your first report, and implement best practices for production analytics workloads.

Why Deploy Evidence on Klutch.sh

  • Automatic Dockerfile Detection - Klutch.sh detects and builds your Dockerfile automatically without manual configuration
  • GitHub Integration - Direct integration with GitHub for continuous deployment from your repository
  • Persistent Storage - Attach volumes for DuckDB files, cache, and static assets
  • Environment Variables - Securely manage database credentials and API keys for data sources
  • Custom Domains - Connect your own domain with automatic HTTPS certificates
  • HTTP Routing - Built-in load balancing and SSL termination for web access
  • Scalable Infrastructure - Start small and scale as your analytics needs grow
  • Zero Downtime Deployments - Rolling updates keep your reports accessible during deployments
  • Version Control Workflow - Deploy from Git branches with full version history

Prerequisites

Before deploying Evidence on Klutch.sh, ensure you have:

  • A Klutch.sh account
  • A GitHub account with a repository for your Evidence project
  • Basic understanding of SQL and markdown
  • Access to data sources (PostgreSQL, MySQL, DuckDB, BigQuery, Snowflake, etc.)
  • Node.js 18 or higher installed locally for development
  • Familiarity with command-line tools

Understanding Evidence Architecture

Evidence is built on several key technologies and concepts:

Core Technologies

  • SvelteKit - Modern JavaScript framework for building the frontend
  • Node.js - Server runtime for the Evidence application
  • DuckDB - Embedded database for local data processing (optional)
  • SQL - Primary interface for querying data sources
  • Markdown - Content format for reports and documentation

How Evidence Works

  1. Markdown Files - Write reports in markdown with embedded SQL queries
  2. SQL Queries - Queries run against configured data sources (PostgreSQL, BigQuery, etc.)
  3. Components - Built-in components render query results as charts and tables
  4. Static Generation - Evidence builds a static website from your markdown files
  5. Templating - Create dynamic pages from data using loops and conditionals

Data Source Support

Evidence connects to multiple data source types:

  • PostgreSQL - Relational database queries
  • MySQL/MariaDB - MySQL-compatible databases
  • DuckDB - Embedded analytics database
  • BigQuery - Google Cloud data warehouse
  • Snowflake - Cloud data platform
  • Redshift - AWS data warehouse
  • SQLite - Lightweight embedded database
  • CSV Files - Local data files
  • Parquet Files - Columnar data files

Creating Your Evidence Project

Step 1: Initialize a New Evidence Project

Start by creating a new Evidence project locally:

Terminal window
# Create project directory
mkdir my-evidence-app
cd my-evidence-app
# Initialize npm project
npm init -y
# Install Evidence
npm install --save-exact @evidence-dev/evidence

Step 2: Initialize Evidence

Create the Evidence project structure:

Terminal window
# Initialize Evidence project
npx evidence init

This creates the following structure:

my-evidence-app/
├── pages/
│   └── index.md             # Your first Evidence page
├── sources/
│   └── connection.yaml      # Data source configuration
├── components/
│   └── custom/              # Custom Svelte components
├── static/
│   └── assets/              # Static assets (images, fonts)
├── evidence.config.yaml     # Evidence configuration
├── package.json
└── .gitignore

Step 3: Create Your First Page

Edit pages/index.md with a simple query:

# My First Evidence Report

Welcome to your Evidence dashboard!

## Sample Data Query

\`\`\`sql demo_data
SELECT
  'January' as month,
  1000 as revenue,
  100 as customers
UNION ALL
SELECT 'February', 1200, 120
UNION ALL
SELECT 'March', 1400, 140
\`\`\`

## Revenue Chart

<BarChart
  data={demo_data}
  x=month
  y=revenue
  title="Monthly Revenue"
/>

## Data Table

<DataTable data={demo_data} />

Step 4: Test Locally

Run Evidence locally to verify setup:

Terminal window
npm run dev

Visit http://localhost:3000 to see your report. You should see a bar chart and data table with the sample data.
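Before committing, you can also run the same production build the Dockerfile performs later in this guide to catch build errors early (the output directory may vary by Evidence version):

Terminal window
# Run the production build locally
npm run build
# Inspect the generated static site
ls build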

Configuring Data Sources

PostgreSQL Connection

Create sources/postgres.connection.yaml:

name: postgres
type: postgres
options:
  host: ${POSTGRES_HOST}
  port: ${POSTGRES_PORT}
  database: ${POSTGRES_DATABASE}
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
  ssl: true

For PostgreSQL deployment on Klutch.sh, see our PostgreSQL guide.

DuckDB Connection

For local analytics with DuckDB:

name: duckdb
type: duckdb
options:
  filename: data/analytics.duckdb
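If you don't have a DuckDB file yet, one way to seed it is with the DuckDB CLI — a minimal sketch, assuming a local sales.csv export and the filename configured above:

Terminal window
# Create data/analytics.duckdb and load a CSV into a sales table
duckdb data/analytics.duckdb \
  "CREATE TABLE sales AS SELECT * FROM read_csv_auto('sales.csv');"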

BigQuery Connection

Create sources/bigquery.connection.yaml:

name: bigquery
type: bigquery
options:
  project_id: ${BIGQUERY_PROJECT_ID}
  credentials: ${BIGQUERY_CREDENTIALS}

Snowflake Connection

Create sources/snowflake.connection.yaml:

name: snowflake
type: snowflake
options:
  account: ${SNOWFLAKE_ACCOUNT}
  username: ${SNOWFLAKE_USERNAME}
  password: ${SNOWFLAKE_PASSWORD}
  database: ${SNOWFLAKE_DATABASE}
  warehouse: ${SNOWFLAKE_WAREHOUSE}
  schema: ${SNOWFLAKE_SCHEMA}

Creating the Dockerfile

Create a production-ready Dockerfile at your repository root:

FROM node:20-alpine

WORKDIR /app

# Install system dependencies
RUN apk add --no-cache \
    python3 \
    make \
    g++ \
    git

# Copy package files
COPY package*.json ./

# Install all dependencies (dev dependencies are needed for the build)
RUN npm ci --include=dev

# Copy application files
COPY . .

# Build Evidence project
RUN npm run build

# Remove dev dependencies
RUN npm prune --omit=dev

# Create directory for DuckDB files
RUN mkdir -p /app/data

# Set environment
ENV NODE_ENV=production
ENV PORT=3000

# Expose port
EXPOSE 3000

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=40s --retries=3 \
    CMD node -e "require('http').get('http://localhost:3000/', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"

# Start Evidence
CMD ["npm", "run", "start"]
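Before pushing, you can verify the image locally, assuming Docker is installed and your data-source variables live in a local .env file:

Terminal window
# Build and run the image locally
docker build -t evidence-app .
docker run --rm -p 3000:3000 --env-file .env evidence-app
# Then browse to http://localhost:3000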

Dockerfile with Custom Build

For advanced configurations:

FROM node:20-alpine AS builder

WORKDIR /app

# Install build dependencies
RUN apk add --no-cache python3 make g++ git

# Copy package files
COPY package*.json ./
RUN npm ci

# Copy source files
COPY . .

# Build application
RUN npm run build

# Production stage
FROM node:20-alpine

WORKDIR /app

# Install runtime dependencies only
RUN apk add --no-cache dumb-init

# Copy built application (destination must match the build output path)
COPY --from=builder /app/.evidence ./.evidence
COPY --from=builder /app/package*.json ./
COPY --from=builder /app/node_modules ./node_modules

# Create data directory
RUN mkdir -p /app/data && chown -R node:node /app

USER node

ENV NODE_ENV=production
ENV PORT=3000

EXPOSE 3000

HEALTHCHECK --interval=30s --timeout=3s --start-period=40s --retries=3 \
    CMD node -e "require('http').get('http://localhost:3000/', (r) => {process.exit(r.statusCode === 200 ? 0 : 1)})"

ENTRYPOINT ["dumb-init", "--"]
CMD ["npm", "run", "start"]
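Whichever Dockerfile you use, a .dockerignore keeps the build context small and prevents local artifacts (including secrets in .env) from being copied into the image — a minimal sketch:

.dockerignore
node_modules
.evidence
.env
.git
*.log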

Environment Configuration

Create a .env.example file to document required environment variables:

Terminal window
# Evidence Configuration
NODE_ENV=production
PORT=3000
EVIDENCE_HOST=0.0.0.0
# PostgreSQL Data Source
POSTGRES_HOST=postgres-app.klutch.sh
POSTGRES_PORT=8000
POSTGRES_DATABASE=analytics
POSTGRES_USER=analytics_user
POSTGRES_PASSWORD=your_secure_password
# BigQuery Data Source (Optional)
BIGQUERY_PROJECT_ID=your-project-id
BIGQUERY_CREDENTIALS={"type":"service_account","project_id":"..."}
# Snowflake Data Source (Optional)
SNOWFLAKE_ACCOUNT=your-account
SNOWFLAKE_USERNAME=your_username
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_DATABASE=ANALYTICS
SNOWFLAKE_WAREHOUSE=COMPUTE_WH
SNOWFLAKE_SCHEMA=PUBLIC
# DuckDB Configuration
DUCKDB_PATH=/app/data/analytics.duckdb
# Evidence Settings
EVIDENCE_BUILD_DEV=false
EVIDENCE_STRICT=true

Deploying to Klutch.sh

    1. Prepare Your Repository

      Commit all files to Git:

      Terminal window
      git init
      git add .
      git commit -m "Initial Evidence deployment"
      git branch -M main
      git remote add origin https://github.com/your-username/evidence-app.git
      git push -u origin main
    2. Log in to Klutch.sh

      Navigate to klutch.sh/app and sign in to your account.

    3. Create a New Project

      • Click “New Project”
      • Enter a project name (e.g., “Evidence Analytics”)
      • Select your organization or personal account
    4. Create a New App

      • Within your project, click “New App”
      • Give your app a name (e.g., “evidence-reports”)
    5. Connect Your GitHub Repository

      • Select GitHub as your Git source
      • Authorize Klutch.sh to access your repositories if prompted
      • Choose the repository containing your Evidence project
      • Select the branch to deploy (typically main)
    6. Configure Network Settings

      Klutch.sh will automatically detect your Dockerfile. Configure the following:

      • Traffic Type: Select HTTP (Evidence is a web application)
      • Internal Port: Set to 3000 (Evidence’s default port)
    7. Set Environment Variables

      Add the environment variables for your data sources in the Klutch.sh dashboard:

      Required Variables:

      NODE_ENV=production
      PORT=3000
      EVIDENCE_HOST=0.0.0.0

      Data Source Variables (PostgreSQL example):

      POSTGRES_HOST=your-postgres-app.klutch.sh
      POSTGRES_PORT=8000
      POSTGRES_DATABASE=analytics
      POSTGRES_USER=analytics_user
      POSTGRES_PASSWORD=your_secure_password

      Important: Mark sensitive variables like POSTGRES_PASSWORD and BIGQUERY_CREDENTIALS as secret.

    8. Attach Persistent Volume (Optional)

      If using DuckDB or storing cache files:

      • Click “Add Volume” in the storage section
      • Mount Path: /app/data
      • Size: Start with 5GB (adjust based on data volume)
    9. Configure Additional Settings

      • Region: Choose the region closest to your users
      • Compute Resources: Minimum 512MB RAM, 1GB+ recommended
      • Instances: Start with 1 instance
    10. Deploy the Application

      Click “Create” or “Deploy” to start the deployment. Klutch.sh will:

      • Detect your Dockerfile automatically
      • Build the Docker image
      • Install dependencies
      • Build the Evidence project
      • Start the application server
      • Assign a URL (e.g., https://evidence-app.klutch.sh)
    11. Wait for Deployment

      Monitor the build logs. The initial deployment may take 3-5 minutes as it:

      • Installs Node.js dependencies
      • Builds the Evidence application
      • Compiles SvelteKit assets
      • Starts the server
    12. Verify Deployment

      Once deployed, visit your application URL (https://your-app.klutch.sh). You should see your Evidence reports and dashboards.
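      A quick check from the command line confirms the site is serving (substitute your assigned URL):

      Terminal window
      # Expect an HTTP 200 response
      curl -I https://your-app.klutch.sh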

Building Your First Report

Creating a Sales Dashboard

Create pages/sales-dashboard.md:

# Sales Dashboard

## Overview Metrics

\`\`\`sql summary
SELECT
  SUM(revenue) as total_revenue,
  COUNT(DISTINCT customer_id) as total_customers,
  AVG(order_value) as avg_order_value,
  COUNT(*) as total_orders
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
\`\`\`

<BigValue
  data={summary}
  value=total_revenue
  title="Total Revenue (30d)"
  fmt="$#,##0"
/>

<BigValue
  data={summary}
  value=total_customers
  title="Total Customers"
/>

## Revenue Trend

\`\`\`sql revenue_by_day
SELECT
  DATE(order_date) as date,
  SUM(revenue) as daily_revenue
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY DATE(order_date)
ORDER BY date
\`\`\`

<LineChart
  data={revenue_by_day}
  x=date
  y=daily_revenue
  title="Daily Revenue (90 days)"
  yFmt="$#,##0"
/>

## Top Products

\`\`\`sql top_products
SELECT
  product_name,
  SUM(quantity) as units_sold,
  SUM(revenue) as product_revenue
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY product_name
ORDER BY product_revenue DESC
LIMIT 10
\`\`\`

<DataTable data={top_products} rows=10>
  <Column id=product_name title="Product" />
  <Column id=units_sold title="Units Sold" fmt="#,##0" />
  <Column id=product_revenue title="Revenue" fmt="$#,##0.00" />
</DataTable>

Adding Customer Analysis

Create pages/customer-analysis.md:

# Customer Analysis

## Customer Segmentation

\`\`\`sql customer_segments
SELECT
  CASE
    WHEN total_spent >= 10000 THEN 'Premium'
    WHEN total_spent >= 5000 THEN 'Gold'
    WHEN total_spent >= 1000 THEN 'Silver'
    ELSE 'Bronze'
  END as segment,
  COUNT(*) as customer_count,
  AVG(total_spent) as avg_customer_value
FROM (
  SELECT
    customer_id,
    SUM(order_value) as total_spent
  FROM sales
  GROUP BY customer_id
) customer_totals
GROUP BY segment
ORDER BY avg_customer_value DESC
\`\`\`

<BarChart
  data={customer_segments}
  x=segment
  y=customer_count
  title="Customers by Segment"
/>

## Cohort Analysis

\`\`\`sql cohort_data
SELECT
  DATE_TRUNC('month', first_purchase_date) as cohort_month,
  DATE_TRUNC('month', order_date) as order_month,
  COUNT(DISTINCT customer_id) as active_customers
FROM (
  SELECT
    customer_id,
    order_date,
    MIN(order_date) OVER (PARTITION BY customer_id) as first_purchase_date
  FROM sales
) cohort_base
WHERE first_purchase_date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY cohort_month, order_month
ORDER BY cohort_month, order_month
\`\`\`

<Heatmap
  data={cohort_data}
  x=order_month
  y=cohort_month
  value=active_customers
  title="Customer Retention Cohorts"
/>

Using Templated Pages

Create pages/[product_id].md for dynamic product pages:

# {params.product_id} Performance

\`\`\`sql product_details
SELECT
  product_name,
  category,
  SUM(quantity) as units_sold,
  SUM(revenue) as total_revenue,
  AVG(rating) as avg_rating
FROM sales
WHERE product_id = '${params.product_id}'
GROUP BY product_name, category
\`\`\`

## Sales Performance

<BigValue
  data={product_details}
  value=total_revenue
  title="Total Revenue"
  fmt="$#,##0"
/>

## Daily Sales Trend

\`\`\`sql daily_sales
SELECT
  DATE(order_date) as date,
  SUM(quantity) as units,
  SUM(revenue) as revenue
FROM sales
WHERE product_id = '${params.product_id}'
  AND order_date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY DATE(order_date)
ORDER BY date
\`\`\`

<LineChart
  data={daily_sales}
  x=date
  y=revenue
  y2=units
  title="Sales Trend"
/>
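Templated pages are only reachable once something links to them. A common companion — sketched here against the same sales table, e.g. appended to pages/index.md — is an index that loops over the available IDs and links to each generated page:

\`\`\`sql all_products
SELECT DISTINCT product_id FROM sales
\`\`\`

{#each all_products as row}
- [{row.product_id}](/{row.product_id})
{/each}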

Advanced Features

Custom Components

Create custom Svelte components in components/custom/:

components/custom/MetricCard.svelte
<script>
  export let title;
  export let value;
  export let change;
  export let format = "#,##0";
</script>

<div class="metric-card">
  <div class="title">{title}</div>
  <div class="value">{value}</div>
  {#if change}
    <div class="change" class:positive={change > 0} class:negative={change < 0}>
      {change > 0 ? '↑' : '↓'} {Math.abs(change)}%
    </div>
  {/if}
</div>

<style>
  .metric-card {
    padding: 1.5rem;
    border-radius: 8px;
    background: white;
    box-shadow: 0 2px 8px rgba(0,0,0,0.1);
  }
  .title {
    font-size: 0.875rem;
    color: #666;
    margin-bottom: 0.5rem;
  }
  .value {
    font-size: 2rem;
    font-weight: bold;
    color: #333;
  }
  .change {
    font-size: 0.875rem;
    margin-top: 0.5rem;
  }
  .positive { color: #10b981; }
  .negative { color: #ef4444; }
</style>

Use it in your markdown:

<MetricCard
  title="Monthly Revenue"
  value="$125,430"
  change={15.3}
/>

Loops and Conditionals

Use loops to generate dynamic content:

## Product Performance

\`\`\`sql products
SELECT
  product_id,
  product_name,
  SUM(revenue) as revenue
FROM sales
GROUP BY product_id, product_name
ORDER BY revenue DESC
LIMIT 5
\`\`\`

{#each products as product}

### {product.product_name}

Revenue: ${product.revenue}

{/each}

Use conditionals to show/hide content:

\`\`\`sql sales_target
SELECT
  SUM(revenue) as actual,
  1000000 as target
FROM sales
WHERE EXTRACT(MONTH FROM order_date) = EXTRACT(MONTH FROM CURRENT_DATE)
\`\`\`

{#if sales_target[0].actual >= sales_target[0].target}

## 🎉 Target Achieved!

Great work! We've exceeded our monthly target.

{:else}

## Keep Pushing

We're at {Math.round(sales_target[0].actual / sales_target[0].target * 100)}% of our target.

{/if}

Parameterized Reports

Add date pickers and filters:

---
title: Sales Report
---

<DateRange
  name=date_range
  start='2024-01-01'
  end='2024-12-31'
/>

\`\`\`sql filtered_sales
SELECT
  DATE(order_date) as date,
  SUM(revenue) as revenue
FROM sales
WHERE order_date BETWEEN '${inputs.date_range.start}' AND '${inputs.date_range.end}'
GROUP BY DATE(order_date)
ORDER BY date
\`\`\`

<LineChart data={filtered_sales} x=date y=revenue />
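Evidence's input components share the same inputs.<name> pattern, so swapping the date picker for a Dropdown works the same way — a sketch assuming the sales table has a region column; check the component docs for the exact props in your Evidence version:

\`\`\`sql regions
SELECT DISTINCT region FROM sales ORDER BY region
\`\`\`

<Dropdown data={regions} name=selected_region value=region />

\`\`\`sql regional_sales
SELECT
  DATE(order_date) as date,
  SUM(revenue) as revenue
FROM sales
WHERE region = '${inputs.selected_region.value}'
GROUP BY DATE(order_date)
ORDER BY date
\`\`\`

<LineChart data={regional_sales} x=date y=revenue />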

Production Best Practices

Security Configuration

1. Secure Database Credentials

Never commit credentials to Git. Use Klutch.sh environment variables:

sources/postgres.connection.yaml
name: postgres
type: postgres
options:
  host: ${POSTGRES_HOST}
  port: ${POSTGRES_PORT}
  database: ${POSTGRES_DATABASE}
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
  ssl: true
  ssl_reject_unauthorized: true

2. Use Read-Only Database Users

Create dedicated read-only users for Evidence:

-- PostgreSQL
CREATE USER evidence_readonly WITH PASSWORD 'secure_password';
GRANT CONNECT ON DATABASE analytics TO evidence_readonly;
GRANT USAGE ON SCHEMA public TO evidence_readonly;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO evidence_readonly;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO evidence_readonly;

3. Implement Row-Level Security

Use database views or row-level security to restrict data access:

-- Create view with filtered data
CREATE VIEW sales_filtered AS
SELECT * FROM sales
WHERE region = current_setting('app.user_region');
-- Grant access to view only
GRANT SELECT ON sales_filtered TO evidence_readonly;

Performance Optimization

1. Query Optimization

Write efficient SQL queries:

-- Bad: Selecting all columns and filtering in JavaScript
SELECT * FROM sales;

-- Good: Select only needed columns and filter in SQL
SELECT
  product_id,
  SUM(revenue) as total_revenue
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY product_id;

2. Use Materialized Views

Pre-compute expensive queries:

-- Create materialized view
CREATE MATERIALIZED VIEW daily_sales_summary AS
SELECT
  DATE(order_date) as date,
  product_category,
  COUNT(*) as order_count,
  SUM(revenue) as total_revenue
FROM sales
GROUP BY DATE(order_date), product_category;

-- Refresh periodically
REFRESH MATERIALIZED VIEW daily_sales_summary;

3. Enable Caching

Configure Evidence caching in evidence.config.yaml:

build:
  cache: true
  cacheDuration: 3600 # 1 hour in seconds
queries:
  defaultCacheDuration: 300 # 5 minutes

4. Optimize Data Sources

Use connection pooling for databases:

sources/postgres.connection.yaml
name: postgres
type: postgres
options:
  host: ${POSTGRES_HOST}
  port: ${POSTGRES_PORT}
  database: ${POSTGRES_DATABASE}
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
  pool:
    min: 2
    max: 10
    idle_timeout: 30000

Monitoring and Logging

1. Application Monitoring

Monitor Evidence application health:

Terminal window
# Check application status
curl -f https://evidence-app.klutch.sh/ || echo "Application down"
# Monitor response times
time curl -s -o /dev/null https://evidence-app.klutch.sh/

2. Query Performance Monitoring

Log slow queries in your database:

-- PostgreSQL: Enable slow query logging
ALTER DATABASE analytics SET log_min_duration_statement = 1000; -- 1 second

-- View slow queries
SELECT
  query,
  mean_exec_time,
  calls
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
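Note that pg_stat_statements ships as an extension and is not always enabled by default; it must also appear in shared_preload_libraries, which typically requires a server restart:

-- Enable the extension once per database
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;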

3. Error Tracking

Implement error tracking in Evidence:

// Add to a custom component or layout (evidence.config.yaml is YAML and cannot hold JavaScript);
// the /api/log-error endpoint is a placeholder for your own logging service
window.addEventListener('error', (event) => {
  fetch('/api/log-error', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message: event.message,
      filename: event.filename,
      lineno: event.lineno,
      timestamp: new Date().toISOString()
    })
  });
});

Backup and Disaster Recovery

1. Version Control

Keep all Evidence code in Git:

Terminal window
# Regular commits
git add .
git commit -m "Update sales dashboard with new metrics"
git push origin main
# Tag releases
git tag -a v1.0.0 -m "Initial production release"
git push origin v1.0.0

2. Data Source Backups

Backup database data regularly:

backup-postgres.sh
#!/bin/bash
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups"

pg_dump -h postgres-app.klutch.sh -p 8000 -U analytics_user -d analytics \
  | gzip > ${BACKUP_DIR}/analytics_${DATE}.sql.gz

# Keep last 7 days
find ${BACKUP_DIR} -name "*.sql.gz" -mtime +7 -delete

echo "Backup completed: ${DATE}"
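To run it unattended, schedule the script with cron on whichever host handles backups (the path and schedule below are examples; pg_dump will also need credentials, e.g. via a ~/.pgpass file):

Terminal window
# Run the backup daily at 02:00 and append output to a log
0 2 * * * /opt/scripts/backup-postgres.sh >> /var/log/evidence-backup.log 2>&1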

3. DuckDB Data Backup

If using DuckDB with persistent volumes:

Terminal window
# Backup DuckDB file
docker cp evidence-container:/app/data/analytics.duckdb ./backups/analytics_$(date +%Y%m%d).duckdb
# Compress backup
gzip ./backups/analytics_$(date +%Y%m%d).duckdb

Scaling Strategies

1. Read Replicas

Use database read replicas for analytics queries:

sources/postgres_replica.connection.yaml
name: postgres_replica
type: postgres
options:
  host: ${POSTGRES_REPLICA_HOST}
  port: ${POSTGRES_REPLICA_PORT}
  database: ${POSTGRES_DATABASE}
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
  ssl: true

2. Horizontal Scaling

Increase Evidence instances in Klutch.sh dashboard:

  • Navigate to your app settings
  • Increase instance count to 2-3 instances
  • Klutch.sh automatically load balances traffic

3. CDN Integration

Serve static assets through a CDN:

evidence.config.yaml
build:
  staticPath: /static
  assetHost: https://cdn.example.com

Troubleshooting

Build Failures

Problem: Build fails with dependency errors

npm ERR! Could not resolve dependency

Solutions:

  1. Delete package-lock.json and node_modules:

    Terminal window
    rm package-lock.json
    rm -rf node_modules
    npm install
  2. Update Evidence to the latest version:

    Terminal window
    npm install @evidence-dev/evidence@latest
  3. Check Node.js version in Dockerfile matches local version

Data Source Connection Errors

Problem: Cannot connect to PostgreSQL

Error connecting to data source: Connection refused

Solutions:

  1. Verify environment variables are set correctly:

    Terminal window
    echo $POSTGRES_HOST
    echo $POSTGRES_PORT
  2. Check database is accessible from Klutch.sh:

    Terminal window
    psql -h postgres-app.klutch.sh -p 8000 -U analytics_user -d analytics
  3. Verify SSL settings match database requirements:

    options:
      ssl: true
      ssl_reject_unauthorized: false # Only for testing
  4. Check firewall rules allow connections from Klutch.sh

Query Errors

Problem: SQL query fails with syntax error

SQL Error: syntax error at or near "SELECT"

Solutions:

  1. Validate SQL syntax in your database client first

  2. Check data source type matches query dialect:

    • PostgreSQL vs MySQL have different syntax
    • DuckDB has unique functions
  3. Escape special characters in queries:

    SELECT * FROM products WHERE name LIKE '%O''Brien%'
  4. Use proper date formatting for your database:

    -- PostgreSQL
    WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
    -- MySQL
    WHERE order_date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)

Performance Issues

Problem: Pages load slowly

Solutions:

  1. Check query execution time:

    EXPLAIN ANALYZE
    SELECT * FROM large_table;
  2. Add indexes to frequently queried columns:

    CREATE INDEX idx_order_date ON sales(order_date);
    CREATE INDEX idx_customer_id ON sales(customer_id);
  3. Enable query result caching:

    queries:
      defaultCacheDuration: 300
  4. Use aggregated tables or materialized views for heavy queries

Chart Rendering Issues

Problem: Charts don’t display correctly

Solutions:

  1. Verify query returns expected data structure:

    <DataTable data={my_query} /> <!-- Debug with table first -->
  2. Check column names match chart configuration:

    <BarChart data={sales} x=date y=revenue /> <!-- Must match query columns -->
  3. Ensure data types are correct:

    SELECT
      CAST(order_date AS DATE) as date,          -- Ensure date type
      CAST(revenue AS DECIMAL(10,2)) as revenue  -- Ensure numeric type
    FROM sales

Updating Evidence

To update your Evidence deployment:

    1. Update Dependencies Locally

      Terminal window
      cd my-evidence-app
      npm update @evidence-dev/evidence
      npm run build   # verify the project still builds
    2. Test Changes

      Terminal window
      npm run dev
      # Verify reports work correctly
    3. Commit and Push

      Terminal window
      git add package.json package-lock.json
      git commit -m "Update Evidence to v2.0.0"
      git push origin main
    4. Monitor Deployment

      Klutch.sh automatically rebuilds and deploys the updated application. Monitor the deployment logs for any issues.

    5. Verify Production

      • Test all reports and dashboards
      • Check data source connections
      • Verify charts render correctly
      • Test any custom components

Sample Code Examples

Creating a KPI Dashboard

# Executive Dashboard

## Key Performance Indicators

\`\`\`sql kpis
SELECT
  (SELECT SUM(revenue) FROM sales WHERE order_date >= CURRENT_DATE - INTERVAL '30 days') as revenue_30d,
  (SELECT SUM(revenue) FROM sales WHERE order_date >= CURRENT_DATE - INTERVAL '60 days' AND order_date < CURRENT_DATE - INTERVAL '30 days') as revenue_prev_30d,
  (SELECT COUNT(DISTINCT customer_id) FROM sales WHERE order_date >= CURRENT_DATE - INTERVAL '30 days') as customers_30d,
  (SELECT AVG(order_value) FROM sales WHERE order_date >= CURRENT_DATE - INTERVAL '30 days') as avg_order_30d
\`\`\`

\`\`\`sql calculations
SELECT
  *,
  ROUND(((revenue_30d - revenue_prev_30d) / revenue_prev_30d * 100), 1) as revenue_change
FROM ${kpis}
\`\`\`

<Grid cols=3>
  <BigValue
    data={calculations}
    value=revenue_30d
    title="Revenue (30d)"
    fmt="$#,##0"
    comparison=revenue_prev_30d
    comparisonTitle="vs Previous Period"
  />
  <BigValue
    data={calculations}
    value=customers_30d
    title="Active Customers"
    fmt="#,##0"
  />
  <BigValue
    data={calculations}
    value=avg_order_30d
    title="Avg Order Value"
    fmt="$#,##0.00"
  />
</Grid>

Geographic Sales Analysis

# Geographic Performance

\`\`\`sql sales_by_region
SELECT
  region,
  state,
  SUM(revenue) as total_revenue,
  COUNT(*) as order_count,
  COUNT(DISTINCT customer_id) as customer_count
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY region, state
ORDER BY total_revenue DESC
\`\`\`

<AreaMap
  data={sales_by_region}
  areaCol=state
  value=total_revenue
  title="Revenue by State"
  fmt="$#,##0"
/>

## Regional Breakdown

<BarChart
  data={sales_by_region}
  x=region
  y=total_revenue
  title="Revenue by Region"
  swapXY=true
/>

Funnel Analysis

# Conversion Funnel

\`\`\`sql funnel_stages
SELECT
  'Website Visits' as stage,
  1 as stage_order,
  COUNT(DISTINCT user_id) as users
FROM website_events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days'
UNION ALL
SELECT
  'Product Views' as stage,
  2 as stage_order,
  COUNT(DISTINCT user_id) as users
FROM website_events
WHERE event_type = 'product_view'
  AND event_date >= CURRENT_DATE - INTERVAL '30 days'
UNION ALL
SELECT
  'Add to Cart' as stage,
  3 as stage_order,
  COUNT(DISTINCT user_id) as users
FROM website_events
WHERE event_type = 'add_to_cart'
  AND event_date >= CURRENT_DATE - INTERVAL '30 days'
UNION ALL
SELECT
  'Purchase' as stage,
  4 as stage_order,
  COUNT(DISTINCT customer_id) as users
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
ORDER BY stage_order
\`\`\`

<FunnelChart
  data={funnel_stages}
  nameCol=stage
  valueCol=users
  title="User Conversion Funnel"
/>

Advanced Configuration

Custom Domain Setup

    1. Add Domain in Klutch.sh

      • Go to your app settings
      • Add your custom domain (e.g., analytics.yourcompany.com)
    2. Configure DNS

      Add a CNAME record pointing to your Klutch.sh app:

      analytics.yourcompany.com CNAME your-evidence-app.klutch.sh
    3. Update Evidence Configuration

      Update base URL in evidence.config.yaml:

      deployment:
        url: https://analytics.yourcompany.com
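      You can confirm the DNS change has propagated from the command line before testing in a browser:

      Terminal window
      # Should print your Klutch.sh app hostname
      dig +short analytics.yourcompany.com CNAME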

Authentication Integration

While Evidence doesn’t include built-in auth, you can add it with a reverse proxy:

nginx.conf
server {
    listen 80;
    server_name analytics.example.com;

    auth_basic "Analytics Access";
    auth_basic_user_file /etc/nginx/.htpasswd;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
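The .htpasswd file referenced above can be generated with the htpasswd utility from apache2-utils (the username here is an example):

Terminal window
# Create the password file with an initial user (prompts for a password)
htpasswd -c /etc/nginx/.htpasswd analyst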

Or use an external authentication service like Auth0 or Clerk.

Multi-Environment Setup

Create separate configurations for development, staging, and production:

evidence.config.yaml
environments:
  development:
    build:
      cache: false
  staging:
    build:
      cache: true
      cacheDuration: 600
  production:
    build:
      cache: true
      cacheDuration: 3600

Conclusion

You’ve successfully deployed Evidence on Klutch.sh! Your code-based business intelligence platform is now ready to create beautiful, fast, and maintainable reports from your data. Evidence combines the power of SQL with the simplicity of markdown, enabling you to build sophisticated analytics applications that integrate seamlessly with your development workflow.

With Evidence running on Klutch.sh, you benefit from automatic deployments, secure environment variable management, persistent storage options, and scalable infrastructure. Your reports are version controlled, code-reviewed, and deployed through the same workflows you use for application code.

Start creating reports by writing SQL queries in markdown files, use the rich component library to visualize your data, and leverage templating for dynamic pages. As your analytics needs grow, you can add custom components, implement caching strategies, and scale your deployment to handle increasing traffic.

For questions or issues, join the Evidence community on Slack, consult the documentation, or explore the example projects to see what’s possible with code-based BI.