
Deploying Dawarich

Dawarich is a self-hosted location tracking and history visualization application that gives you complete control over your location data. Instead of relying on commercial services like Google Timeline or Apple’s Significant Locations, Dawarich lets you own and manage your location history on your own infrastructure. The application imports GPS data from various sources including GPX files, Google Takeout archives, OwnTracks, and other location tracking apps, then visualizes your travels on interactive maps with timeline views, statistics, and detailed trip analysis.

What makes Dawarich particularly valuable is its focus on privacy and data ownership. All your location data stays on your server, and you have complete control over who can access it. The application provides beautiful visualizations of your travel history, showing places you’ve visited, routes you’ve taken, and statistics about your movements. Whether you’re tracking personal travel, analyzing commute patterns, or simply want to maintain a private record of your journeys, Dawarich offers the tools to explore and understand your location data without surrendering privacy to third-party services.

Why Deploy Dawarich on Klutch.sh?

Deploying Dawarich on Klutch.sh offers several advantages for hosting your location tracking platform:

  • Automatic Docker Detection: Klutch.sh recognizes your Dockerfile and handles containerization without manual configuration
  • Persistent Storage: Built-in volume management ensures your location database and imported data persist across deployments
  • HTTPS by Default: Secure access to your location data with automatic SSL certificates
  • Database Support: Easy integration with PostgreSQL or SQLite for storing location history
  • Environment Management: Securely configure database credentials, API keys, and application settings through environment variables
  • Privacy Control: Keep your location data on infrastructure you control
  • Always-On Availability: Access your location history 24/7 from anywhere

Prerequisites

Before deploying Dawarich to Klutch.sh, ensure you have:

  • A Klutch.sh account (sign up here)
  • A GitHub account with a repository for your Dawarich deployment
  • Basic understanding of Docker and containerization
  • Location data to import (GPX files, Google Takeout, or OwnTracks data)
  • Optional: PostgreSQL database (for production deployments)
  • Git installed on your local development machine
  • Familiarity with location data formats (GPX, GeoJSON)

Understanding Dawarich Architecture

Dawarich follows a Ruby on Rails architecture designed for location data management:

Core Components

Ruby on Rails Application

Dawarich is built with Ruby on Rails, providing a robust MVC framework for handling location data. The application manages user authentication, data import pipelines, visualization rendering, and API endpoints for location tracking devices. Rails’ Active Record ORM handles database interactions efficiently, supporting both PostgreSQL and SQLite backends.

Database Layer

Location data is stored in a relational database:

  • PostgreSQL: Recommended for production with better performance for large datasets
  • SQLite: Suitable for personal use or small deployments

The database schema stores:

  • Location points (latitude, longitude, timestamp, accuracy)
  • Places (frequently visited locations)
  • Trips (connected location points forming journeys)
  • User preferences and settings
  • Import metadata and processing status

Location Data Importer

The import system processes various location data formats:

  • GPX Files: Standard GPS exchange format from fitness apps and GPS devices
  • Google Takeout: Location history from Google services
  • OwnTracks: Real-time location tracking protocol
  • GeoJSON: Geographic data interchange format
  • CSV: Custom formatted location logs

Background jobs process imports asynchronously, parsing coordinates, validating data, and organizing points into meaningful trips.

Map Visualization Engine

Dawarich uses Leaflet.js and OpenStreetMap tiles to render interactive maps. The visualization system:

  • Displays location points as markers or heatmaps
  • Draws routes connecting sequential points
  • Highlights frequently visited places
  • Provides timeline navigation
  • Supports multiple map layers and styles

Statistics and Analytics

The analytics engine processes location data to generate:

  • Total distance traveled
  • Countries and cities visited
  • Time spent in different locations
  • Travel patterns and trends
  • Most frequent routes
  • Speed and elevation profiles
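
Several of these metrics reduce to summing great-circle distances between consecutive points. A minimal sketch of that calculation in plain Ruby (illustrative only, not Dawarich's actual code):

```ruby
# Haversine distance between two coordinates, accumulated over a route
# to approximate "total distance traveled". Sketch only; Dawarich's own
# calculation may differ.
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2)
  to_rad = ->(d) { d * Math::PI / 180 }
  dlat = to_rad.(lat2 - lat1)
  dlon = to_rad.(lon2 - lon1)
  a = Math.sin(dlat / 2)**2 +
      Math.cos(to_rad.(lat1)) * Math.cos(to_rad.(lat2)) * Math.sin(dlon / 2)**2
  2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a))
end

route = [[37.7749, -122.4194], [37.7750, -122.4195], [37.7751, -122.4196]]
total = route.each_cons(2).sum { |(a, b)| haversine_km(a[0], a[1], b[0], b[1]) }
puts format("%.1f m", total * 1000)
```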

API Layer

RESTful APIs enable:

  • Real-time location updates from tracking apps
  • Data export in various formats
  • Third-party integrations
  • Mobile app connectivity
  • Webhook notifications for new locations

Background Job System

Sidekiq handles asynchronous tasks:

  • Processing large import files
  • Generating statistics
  • Cleaning up old data
  • Sending notifications
  • Reverse geocoding locations

Data Flow

  1. User uploads location data file or device sends real-time update
  2. Data is validated and queued for processing
  3. Background job parses location data and extracts coordinates
  4. Location points are saved to database with timestamps
  5. Points are grouped into trips based on time gaps
  6. Reverse geocoding adds place names to coordinates
  7. Statistics are updated with new data
  8. Visualizations are generated for map display
  9. User views processed data through web interface
  10. API provides access for external applications
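
The grouping in step 5 can be sketched as follows. The 30-minute threshold and data shapes are illustrative assumptions, not Dawarich's actual implementation:

```ruby
require "time"

# Split time-ordered points into trips wherever the gap between
# consecutive fixes exceeds a threshold (30 minutes here).
def group_into_trips(points, max_gap: 30 * 60)
  points
    .sort_by { |p| p[:timestamp] }
    .slice_when { |a, b| b[:timestamp] - a[:timestamp] > max_gap }
    .to_a
end

points = [
  { lat: 37.7749, lon: -122.4194, timestamp: Time.parse("2025-12-05T08:00:00Z") },
  { lat: 37.7750, lon: -122.4195, timestamp: Time.parse("2025-12-05T08:01:00Z") },
  { lat: 37.7800, lon: -122.4100, timestamp: Time.parse("2025-12-05T12:30:00Z") }
]

puts group_into_trips(points).length # => 2 (the 4.5-hour gap splits the trips)
```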

Storage Requirements

Dawarich requires persistent storage for:

  • Database: Location points, trips, and metadata (grows with usage)
  • Uploaded Files: GPX files and import archives before processing
  • Assets: Map tiles cache and generated images
  • Logs: Application and job processing logs

A typical deployment needs 1GB-10GB initially, with storage growing based on location data volume. One year of detailed tracking might use 2-5GB.
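
A back-of-envelope estimate behind those figures, where both inputs are assumptions rather than measured Dawarich values (one fix every ~13 seconds around the clock, roughly 1 KB stored per point for the row, indexes, and geocoding metadata combined):

```ruby
# Rough yearly storage estimate; adjust the assumptions for your
# tracking frequency and data density.
points_per_day  = (24 * 60 * 60) / 13        # ~6,646 points per day
bytes_per_point = 1024                        # assumed, not measured
yearly_gb = points_per_day * 365 * bytes_per_point / 1024.0**3

puts format("~%.1f GB per year", yearly_gb)
```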

Installation and Setup

Let’s walk through setting up Dawarich for deployment on Klutch.sh.

Step 1: Create the Project Structure

First, create a new directory for your Dawarich deployment:

Terminal window
mkdir dawarich-deployment
cd dawarich-deployment
git init

Step 2: Create the Dockerfile

Create a Dockerfile in the root directory:

FROM ruby:3.2-alpine

# Set environment variables
ENV RAILS_ENV=production \
    RACK_ENV=production \
    NODE_ENV=production \
    RAILS_SERVE_STATIC_FILES=true \
    RAILS_LOG_TO_STDOUT=true

# Install system dependencies
RUN apk add --no-cache \
    build-base \
    postgresql-dev \
    postgresql-client \
    sqlite-dev \
    tzdata \
    nodejs \
    npm \
    git \
    curl \
    imagemagick \
    gcompat

# Create app directory
WORKDIR /app

# Create app user
RUN addgroup -g 1000 dawarich && \
    adduser -D -u 1000 -G dawarich dawarich

# Clone Dawarich repository
RUN git clone https://github.com/Freika/dawarich.git /app && \
    chown -R dawarich:dawarich /app

# Copy the entrypoint script (created in Step 9)
COPY --chown=dawarich:dawarich bin/docker-entrypoint.sh /app/bin/
RUN chmod +x /app/bin/docker-entrypoint.sh

# Switch to app user
USER dawarich

# Install Ruby dependencies
COPY --chown=dawarich:dawarich Gemfile* ./
RUN bundle config set --local deployment 'true' && \
    bundle config set --local without 'development test' && \
    bundle install --jobs 4

# Install JavaScript dependencies
RUN npm install

# Precompile assets
RUN SECRET_KEY_BASE=dummy bundle exec rake assets:precompile

# Create necessary directories
RUN mkdir -p tmp/pids tmp/sockets log public/uploads

# Expose port
EXPOSE 3000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
    CMD curl -f http://localhost:3000/health || exit 1

# The entrypoint waits for the database and runs migrations, then
# hands off to Puma
ENTRYPOINT ["bin/docker-entrypoint.sh"]
CMD ["bundle", "exec", "puma", "-C", "config/puma.rb"]

Step 3: Create Gemfile for Dependencies

Create a Gemfile:

source 'https://rubygems.org'
ruby '~> 3.2.0'
gem 'rails', '~> 7.1.0'
gem 'puma', '~> 6.0'
gem 'pg', '~> 1.5'
gem 'sqlite3', '~> 1.6'
gem 'redis', '~> 5.0'
gem 'sidekiq', '~> 7.0'
gem 'gpx', '~> 1.0'
gem 'devise', '~> 4.9'
gem 'pundit', '~> 2.3'
gem 'kaminari', '~> 1.2'
gem 'carrierwave', '~> 3.0'
gem 'mini_magick', '~> 4.12'
gem 'geocoder', '~> 1.8'
gem 'rgeo', '~> 3.0'
gem 'rgeo-geojson', '~> 2.1'
gem 'leaflet-rails', '~> 1.9'
gem 'chartkick', '~> 5.0'
gem 'groupdate', '~> 6.0'
gem 'rubyzip', '~> 2.3'
gem 'nokogiri', '~> 1.15'
gem 'httparty', '~> 0.21'
gem 'dotenv-rails', '~> 2.8'
gem 'bootsnap', require: false
gem 'sprockets-rails', '~> 3.4'
gem 'importmap-rails', '~> 1.2'
gem 'turbo-rails', '~> 1.5'
gem 'stimulus-rails', '~> 1.3'

Step 4: Create Environment Configuration

Create a .env.example file:

Terminal window
# Application Settings
RAILS_ENV=production
RACK_ENV=production
NODE_ENV=production
SECRET_KEY_BASE=your-secret-key-base
# Server Configuration
PORT=3000
RAILS_SERVE_STATIC_FILES=true
RAILS_LOG_TO_STDOUT=true
# Database Configuration (PostgreSQL - recommended for production)
DATABASE_URL=postgresql://dawarich:password@postgres-host:5432/dawarich_production
# Host and port used by the entrypoint's wait-for-database loop
DB_HOST=postgres-host
DB_PORT=5432
# Or SQLite (for development/small deployments)
# DATABASE_ADAPTER=sqlite3
# DATABASE_PATH=/app/storage/dawarich.db
# Redis Configuration (for Sidekiq)
REDIS_URL=redis://redis-host:6379/0
# Application URLs
APPLICATION_HOST=example-app.klutch.sh
APPLICATION_URL=https://example-app.klutch.sh
# File Upload Settings
MAX_UPLOAD_SIZE=100MB
STORAGE_PATH=/app/storage
# Map Settings
DEFAULT_MAP_CENTER_LAT=0
DEFAULT_MAP_CENTER_LNG=0
DEFAULT_MAP_ZOOM=2
MAP_TILE_PROVIDER=OpenStreetMap
# Geocoding Settings (optional)
GEOCODING_ENABLED=true
GEOCODING_API_KEY=your-geocoding-api-key
REVERSE_GEOCODING_PROVIDER=nominatim
# Import Settings
AUTO_PROCESS_IMPORTS=true
IMPORT_BATCH_SIZE=1000
MAX_IMPORT_FILE_SIZE=500MB
# Analytics and Statistics
ENABLE_STATISTICS=true
STATISTICS_CACHE_TTL=3600
# Security
FORCE_SSL=true
SESSION_TIMEOUT=1440
# Email Configuration (optional)
SMTP_ADDRESS=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=your-username
SMTP_PASSWORD=your-password
SMTP_DOMAIN=example.com
SMTP_FROM=noreply@example.com
# Background Jobs
SIDEKIQ_CONCURRENCY=5
SIDEKIQ_TIMEOUT=25

Step 5: Create Database Configuration

Create config/database.yml:

default: &default
  adapter: <%= ENV['DATABASE_ADAPTER'] || 'postgresql' %>
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  timeout: 5000

production:
  <<: *default
  url: <%= ENV['DATABASE_URL'] %>

Step 6: Create Puma Configuration

Create config/puma.rb:

#!/usr/bin/env puma

max_threads_count = ENV.fetch("RAILS_MAX_THREADS") { 5 }
min_threads_count = ENV.fetch("RAILS_MIN_THREADS") { max_threads_count }
threads min_threads_count, max_threads_count

port ENV.fetch("PORT") { 3000 }
environment ENV.fetch("RAILS_ENV") { "production" }
pidfile ENV.fetch("PIDFILE") { "tmp/pids/server.pid" }

plugin :tmp_restart

workers ENV.fetch("WEB_CONCURRENCY") { 2 }
preload_app!

on_worker_boot do
  ActiveRecord::Base.establish_connection if defined?(ActiveRecord)
end

before_fork do
  ActiveRecord::Base.connection_pool.disconnect! if defined?(ActiveRecord)
end

Step 7: Create Sidekiq Configuration

Create config/sidekiq.yml:

:concurrency: <%= ENV.fetch("SIDEKIQ_CONCURRENCY") { 5 } %>
:timeout: <%= ENV.fetch("SIDEKIQ_TIMEOUT") { 25 } %>
:queues:
  - [critical, 5]
  - [default, 3]
  - [low, 1]

production:
  :concurrency: 10

Step 8: Create .dockerignore

Create a .dockerignore file:

.git
.gitignore
.env
.env.local
*.md
README.md
.DS_Store
Thumbs.db
log/
tmp/
node_modules/
public/uploads/
storage/
*.log
.byebug_history
.ruby-version
.bundle
vendor/bundle
coverage/
test/
spec/

Step 9: Create Startup Script

Create bin/docker-entrypoint.sh:

#!/bin/sh
set -e

# Wait for database to be ready
echo "Waiting for database..."
until nc -z -w30 "$DB_HOST" "$DB_PORT" 2>/dev/null; do
  echo "Waiting for database connection..."
  sleep 2
done
echo "Database is ready!"

# Run database migrations
echo "Running database migrations..."
bundle exec rake db:create db:migrate

# Start Sidekiq in background (optional)
if [ "$START_SIDEKIQ" = "true" ]; then
  echo "Starting Sidekiq..."
  bundle exec sidekiq -C config/sidekiq.yml &
fi

# Clear cache
bundle exec rake tmp:cache:clear

# Start the main process
exec "$@"

Make the script executable:

Terminal window
chmod +x bin/docker-entrypoint.sh

Step 10: Create Documentation

Create README.md:

# Dawarich Location Tracking
This repository contains a Dawarich deployment configured for Klutch.sh.
## Features
- Self-hosted location tracking
- Privacy-focused data ownership
- GPX file import
- Google Takeout support
- OwnTracks integration
- Interactive map visualization
- Travel statistics and analytics
- Timeline views
- Place detection
## Configuration
Set environment variables for:
- Database connection
- Redis for background jobs
- Application URLs
- File upload limits
- Map settings
## Importing Data
Supported formats:
- GPX files from fitness apps
- Google Takeout location history
- OwnTracks JSON
- GeoJSON files
- CSV location logs
## Deployment
This application is configured to deploy on Klutch.sh with automatic Docker detection.

Step 11: Initialize Git Repository

Terminal window
git add .
git commit -m "Initial Dawarich setup for Klutch.sh deployment"
git branch -M master
git remote add origin https://github.com/yourusername/dawarich-deployment.git
git push -u origin master

Deploying to Klutch.sh

Now that your Dawarich application is configured, let’s deploy it to Klutch.sh.

  1. Log in to Klutch.sh

    Navigate to klutch.sh/app and sign in with your GitHub account.

  2. Create a New Project

    Click “New Project” and select “Import from GitHub”. Choose the repository containing your Dawarich deployment.

  3. Configure Build Settings

    Klutch.sh will automatically detect the Dockerfile in your repository. The platform will use this for building your container.

  4. Configure Traffic Settings

    Select “HTTP” as the traffic type. Dawarich serves its web interface on port 3000, and Klutch.sh will route HTTPS traffic to this port.

  5. Set Environment Variables

    In the project settings, add the following environment variables:

    • SECRET_KEY_BASE: Generate using openssl rand -hex 64
    • RAILS_ENV: production
    • PORT: 3000
    • APPLICATION_HOST: Your Klutch.sh domain (e.g., example-app.klutch.sh)
    • APPLICATION_URL: https://example-app.klutch.sh
    • RAILS_SERVE_STATIC_FILES: true
    • RAILS_LOG_TO_STDOUT: true

    For PostgreSQL database (recommended):

    • DATABASE_URL: postgresql://username:password@host:port/dawarich_production

    For SQLite (simpler setup):

    • DATABASE_ADAPTER: sqlite3
    • DATABASE_PATH: /app/storage/dawarich.db

    For Redis (background jobs):

    • REDIS_URL: redis://redis-host:6379/0

    Optional settings:

    • MAX_UPLOAD_SIZE: 100MB
    • GEOCODING_ENABLED: true
    • ENABLE_STATISTICS: true
  6. Configure Persistent Storage

    Dawarich requires persistent storage for database and uploaded files:

    • Storage Volume:
      • Mount path: /app/storage
      • Size: 10GB (adjust based on expected data volume)
    • Uploads Volume:
      • Mount path: /app/public/uploads
      • Size: 5GB

    These volumes ensure your location data and uploaded files persist across deployments.

  7. Deploy the Application

    Click “Deploy” to start the build process. Klutch.sh will:

    • Clone your repository
    • Build the Docker image using your Dockerfile
    • Install Ruby and Node.js dependencies
    • Precompile Rails assets
    • Deploy the container
    • Provision an HTTPS endpoint

    The build process typically takes 5-8 minutes due to Ruby gem compilation.

  8. Access Your Dawarich Instance

    Once deployment completes, Klutch.sh will provide a URL like example-app.klutch.sh. Your Dawarich location tracking platform will be available at this URL.

  9. Create Admin Account

    On first visit, you’ll be prompted to create an admin account. Fill in your details to set up the initial user.

Getting Started with Dawarich

Once your Dawarich instance is deployed, here’s how to use it:

Initial Setup

Create Your Account

Visit your deployment URL and create an account. The first registered user becomes the admin.

Configure Preferences

Navigate to Settings to configure:

  • Default map center and zoom level
  • Timezone for date/time display
  • Distance units (kilometers or miles)
  • Privacy settings
  • Notification preferences

Importing Location Data

Import GPX Files

GPX files from fitness apps, GPS devices, or tracking applications:

  1. Go to “Import” → “Upload File”
  2. Click “Choose File” and select your GPX file
  3. Add optional tags or description
  4. Click “Import”
  5. Processing happens in background
  6. Notification when complete

Example GPX structure:

<?xml version="1.0"?>
<gpx version="1.1" creator="YourApp">
  <trk>
    <name>Morning Run</name>
    <trkseg>
      <trkpt lat="37.7749" lon="-122.4194">
        <time>2025-12-05T08:00:00Z</time>
        <ele>10</ele>
      </trkpt>
      <trkpt lat="37.7750" lon="-122.4195">
        <time>2025-12-05T08:01:00Z</time>
        <ele>12</ele>
      </trkpt>
    </trkseg>
  </trk>
</gpx>
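
The importer has to pull latitude, longitude, time, and elevation out of each `<trkpt>`. Dawarich's Gemfile pulls in the `gpx` gem for this; the sketch below uses only Ruby's stdlib REXML to show what gets extracted:

```ruby
require "rexml/document"
require "time"

# Parse a minimal GPX document and collect track points.
gpx = <<~XML
  <?xml version="1.0"?>
  <gpx version="1.1" creator="YourApp">
    <trk><trkseg>
      <trkpt lat="37.7749" lon="-122.4194">
        <time>2025-12-05T08:00:00Z</time><ele>10</ele>
      </trkpt>
    </trkseg></trk>
  </gpx>
XML

doc = REXML::Document.new(gpx)
points = doc.get_elements("//trkpt").map do |pt|
  {
    lat:  pt.attributes["lat"].to_f,
    lon:  pt.attributes["lon"].to_f,
    time: Time.parse(pt.text("time")),
    ele:  pt.text("ele")&.to_f
  }
end

puts points.first[:lat] # => 37.7749
```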

Import Google Takeout

Extract your location history from Google:

  1. Go to Google Takeout
  2. Deselect all, then select “Location History”
  3. Choose JSON format
  4. Export and download the archive
  5. In Dawarich: “Import” → “Google Takeout”
  6. Upload the JSON file
  7. Wait for processing (may take several minutes for large files)

Connect OwnTracks

For real-time location tracking:

  1. Install OwnTracks on your mobile device
  2. In Dawarich: Go to “Settings” → “API”
  3. Generate an API token
  4. In OwnTracks app:
    • Mode: HTTP
    • Host: example-app.klutch.sh
    • URL: /api/v1/locations
    • Authentication: Add token in header
  5. Enable tracking in OwnTracks

Bulk Import CSV

For custom location logs:

Create CSV file with format:

latitude,longitude,timestamp,accuracy
37.7749,-122.4194,2025-12-05T08:00:00Z,10
37.7750,-122.4195,2025-12-05T08:01:00Z,8
37.7751,-122.4196,2025-12-05T08:02:00Z,12

Upload via “Import” → “CSV File”
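
Before uploading, rows in this format can be sanity-checked locally. A sketch using Ruby's stdlib CSV (the validation rules here are illustrative, not Dawarich's import rules):

```ruby
require "csv"
require "time"

# A row is valid if both coordinates parse, are in range, and the
# timestamp is ISO 8601.
def valid_row?(row)
  lat = Float(row["latitude"])
  lon = Float(row["longitude"])
  Time.iso8601(row["timestamp"])
  lat.between?(-90.0, 90.0) && lon.between?(-180.0, 180.0)
rescue ArgumentError, TypeError
  false
end

data = <<~CSV
  latitude,longitude,timestamp,accuracy
  37.7749,-122.4194,2025-12-05T08:00:00Z,10
  91.0,0.0,2025-12-05T08:01:00Z,8
CSV

rows = CSV.parse(data, headers: true)
puts rows.count { |r| valid_row?(r) } # => 1 (latitude 91.0 is out of range)
```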

Viewing Your Location History

Map View

The main interface displays your location history on an interactive map:

  • Markers: Individual location points
  • Lines: Routes connecting sequential points
  • Heatmap: Frequently visited areas
  • Clusters: Grouped nearby points for better visualization

Navigation:

  • Zoom with mouse wheel or +/- buttons
  • Pan by dragging
  • Click markers for details (time, accuracy, elevation)

Timeline View

Browse your history chronologically:

  • Day/Week/Month/Year views
  • Calendar navigation
  • Quick jump to specific dates
  • Filter by tags or places

Places

Dawarich automatically detects frequently visited locations:

  • Home, work, and other regular places
  • Time spent at each location
  • Visit frequency
  • Custom place names and categories

Trips

Connected location points form trips:

  • Trip list with dates and durations
  • Distance traveled
  • Route visualization
  • Start and end points
  • Export trip as GPX

Statistics and Analytics

Dashboard Overview

View high-level statistics:

Total Distance: 12,450 km
Countries Visited: 15
Cities Visited: 89
Days Tracked: 365
Total Points: 2,450,000

Travel Statistics

Detailed breakdown:

  • Distance by year/month/week
  • Most visited places
  • Travel patterns (weekday vs weekend)
  • Average daily movement
  • Speed analysis
  • Elevation profiles

Country and City Analytics

See where you’ve spent time:

  • Countries visited with entry/exit dates
  • Cities with visit counts and duration
  • Interactive map highlighting visited areas
  • Export country/city list

Time Analysis

Understand your movement patterns:

  • Time of day activity
  • Most active hours
  • Seasonal trends
  • Year-over-year comparisons

API Usage

Authentication

Generate API token in Settings:

Terminal window
curl -H "Authorization: Bearer YOUR_TOKEN" \
  https://example-app.klutch.sh/api/v1/locations

Submit Location

Send location updates:

Terminal window
curl -X POST https://example-app.klutch.sh/api/v1/locations \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "latitude": 37.7749,
    "longitude": -122.4194,
    "timestamp": "2025-12-05T08:00:00Z",
    "accuracy": 10,
    "altitude": 15,
    "speed": 1.5,
    "heading": 90
  }'

Get Recent Locations

Retrieve your location history:

Terminal window
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/locations?limit=100&since=2025-12-01"

Response:

{
  "locations": [
    {
      "id": 12345,
      "latitude": 37.7749,
      "longitude": -122.4194,
      "timestamp": "2025-12-05T08:00:00Z",
      "accuracy": 10,
      "altitude": 15,
      "speed": 1.5
    }
  ],
  "total": 100,
  "page": 1
}

Export Data

Download your data:

Terminal window
# GPX export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/gpx?start_date=2025-01-01&end_date=2025-12-31" \
  > my_data.gpx

# GeoJSON export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/geojson?start_date=2025-01-01" \
  > my_data.geojson

# CSV export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/csv" \
  > my_data.csv

Privacy and Sharing

Privacy Settings

Control your data visibility:

  • Make profile public or private
  • Share specific trips with others
  • Generate shareable links with expiration
  • Revoke access anytime

Sharing Trips

Share a specific trip:

  1. Navigate to trip details
  2. Click “Share”
  3. Set permissions (view only, download)
  4. Set expiration date (optional)
  5. Copy share link
  6. Send to recipients

Data Export and Backup

Regularly backup your data:

  1. Go to “Settings” → “Export”
  2. Choose export format (GPX, GeoJSON, CSV)
  3. Select date range
  4. Click “Export”
  5. Download archive

Advanced Configuration

PostgreSQL Database Setup

For production deployments, use PostgreSQL for better performance:

Step 1: Create Database

CREATE DATABASE dawarich_production;
CREATE USER dawarich WITH PASSWORD 'secure_password';
GRANT ALL PRIVILEGES ON DATABASE dawarich_production TO dawarich;

Step 2: Enable PostGIS (for geographic queries)

\c dawarich_production
CREATE EXTENSION postgis;
CREATE EXTENSION postgis_topology;

Step 3: Configure Connection

Update environment variable:

Terminal window
DATABASE_URL=postgresql://dawarich:secure_password@postgres-host:5432/dawarich_production

Redis Configuration

Use Redis for background job processing:

Deploy Redis

Deploy a Redis instance on Klutch.sh or use external Redis service.

Configure Dawarich

Set environment variable:

Terminal window
REDIS_URL=redis://redis-host:6379/0

Start Sidekiq Worker

Update Dockerfile CMD to include Sidekiq:

CMD ["sh", "-c", "bundle exec sidekiq -C config/sidekiq.yml & bundle exec puma -C config/puma.rb"]

Reverse Geocoding

Convert coordinates to place names:

Nominatim (Free)

Terminal window
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=nominatim
NOMINATIM_URL=https://nominatim.openstreetmap.org
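
As a sanity check, the request a reverse geocoder sends can be reproduced by hand. Nominatim's public `/reverse` endpoint takes `lat`, `lon`, and `format` parameters; the helper name below is hypothetical, and the public instance is rate-limited, so point `NOMINATIM_URL` at your own instance for bulk geocoding:

```ruby
require "uri"

# Build a Nominatim reverse-geocoding URL for one coordinate.
def nominatim_reverse_url(base, lat, lon)
  query = URI.encode_www_form(lat: lat, lon: lon, format: "jsonv2")
  "#{base}/reverse?#{query}"
end

url = nominatim_reverse_url("https://nominatim.openstreetmap.org", 37.7749, -122.4194)
puts url
```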

Google Maps Geocoding API

Terminal window
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=google
GOOGLE_MAPS_API_KEY=your-api-key

MapBox Geocoding

Terminal window
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=mapbox
MAPBOX_API_TOKEN=your-mapbox-token

Custom Map Tiles

Change map appearance:

OpenStreetMap (Default)

Terminal window
MAP_TILE_PROVIDER=OpenStreetMap

Mapbox

Terminal window
MAP_TILE_PROVIDER=mapbox
MAPBOX_STYLE_URL=mapbox://styles/mapbox/streets-v12
MAPBOX_ACCESS_TOKEN=your-token

Stamen

Terminal window
MAP_TILE_PROVIDER=stamen
MAP_TILE_STYLE=terrain

Custom Tile Server

Terminal window
MAP_TILE_PROVIDER=custom
CUSTOM_TILE_URL=https://your-tiles.com/{z}/{x}/{y}.png

Automated Backups

Schedule regular backups:

Create bin/backup.sh:

#!/bin/bash
set -e

BACKUP_DIR="/app/backups"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="dawarich_backup_${DATE}.sql"

mkdir -p "$BACKUP_DIR"

# Backup database
pg_dump "$DATABASE_URL" > "$BACKUP_DIR/$BACKUP_FILE"

# Compress backup
gzip "$BACKUP_DIR/$BACKUP_FILE"

# Keep only the last 7 days of backups
find "$BACKUP_DIR" -name "dawarich_backup_*.sql.gz" -mtime +7 -delete

echo "Backup completed: ${BACKUP_FILE}.gz"

Schedule with cron or external scheduler.

Email Notifications

Configure email for notifications:

Terminal window
SMTP_ADDRESS=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=your-username
SMTP_PASSWORD=your-password
SMTP_DOMAIN=example.com
SMTP_FROM=dawarich@example.com
SMTP_AUTHENTICATION=plain
SMTP_ENABLE_STARTTLS_AUTO=true

Performance Tuning

Database Connection Pooling

Terminal window
RAILS_MAX_THREADS=5
DB_POOL=10

Puma Workers

Terminal window
WEB_CONCURRENCY=2
RAILS_MAX_THREADS=5

Caching

Enable Rails caching:

Terminal window
RAILS_CACHE_STORE=redis
REDIS_CACHE_URL=redis://redis-host:6379/1

Asset Caching

Terminal window
RAILS_SERVE_STATIC_FILES=true
RAILS_STATIC_CACHE_CONTROL=public,max-age=31536000

Production Best Practices

Follow these recommendations for running Dawarich in production:

Security

Strong Secret Keys

Generate secure keys:

Terminal window
# Secret key base
openssl rand -hex 64
# API tokens
openssl rand -base64 32

Never commit secrets to version control.
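
The same values can be generated from a Rails console or rake task with Ruby's stdlib:

```ruby
require "securerandom"

# Ruby equivalents of the openssl commands above.
secret_key_base = SecureRandom.hex(64)    # 128 hex characters
api_token       = SecureRandom.base64(32)

puts secret_key_base.length # => 128
```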

Database Security

  • Use strong passwords
  • Enable SSL connections
  • Restrict database access by IP
  • Regular security updates
  • Automated backups

HTTPS Only

Klutch.sh provides automatic HTTPS. Ensure:

Terminal window
FORCE_SSL=true

User Authentication

  • Enforce strong passwords
  • Enable two-factor authentication (if available)
  • Regular session timeout
  • Account lockout after failed attempts

API Security

  • Rate limit API endpoints
  • Rotate API tokens regularly
  • Use token expiration
  • Monitor API usage for anomalies

Data Management

Regular Backups

Implement automated backup strategy:

  • Daily database backups
  • Weekly full backups
  • Off-site backup storage
  • Test restore procedures regularly

Data Retention

Configure automatic cleanup:

Terminal window
DATA_RETENTION_DAYS=730 # 2 years
OLD_DATA_CLEANUP_ENABLED=true

Storage Monitoring

Monitor storage usage:

  • Database size
  • Uploaded files
  • Logs and temporary files
  • Backup archives

Set alerts for storage thresholds.

Data Privacy

  • Regular audit of user data
  • Honor data deletion requests promptly
  • Document data retention policies
  • Comply with privacy regulations (GDPR, CCPA)

Performance Optimization

Database Indexes

Ensure proper indexes on location queries:

CREATE INDEX idx_locations_timestamp ON locations(timestamp);
CREATE INDEX idx_locations_user_id ON locations(user_id);
-- Spatial expression index using PostGIS (enabled earlier)
CREATE INDEX idx_locations_coords ON locations
  USING GIST ((ST_SetSRID(ST_MakePoint(longitude, latitude), 4326)));

Query Optimization

  • Use pagination for large datasets
  • Cache frequent queries
  • Optimize slow queries
  • Use database EXPLAIN for analysis

Background Jobs

  • Monitor job queue size
  • Adjust worker count based on load
  • Set appropriate timeouts
  • Retry failed jobs with backoff

Asset Optimization

  • Compress images
  • Minify JavaScript and CSS
  • Use CDN for static assets
  • Enable browser caching

Monitoring

Application Monitoring

Track key metrics:

  • Response times
  • Error rates
  • Memory usage
  • Database query performance
  • Background job processing

Health Checks

Implement comprehensive health checks:

# config/routes.rb
get '/health', to: 'health#index'

# app/controllers/health_controller.rb
class HealthController < ApplicationController
  def index
    health = {
      status: 'healthy',
      timestamp: Time.current,
      checks: {
        database: check_database,
        redis: check_redis,
        storage: check_storage
      }
    }
    status = health[:checks].values.all? { |c| c[:status] == 'ok' } ? :ok : :service_unavailable
    render json: health, status: status
  end

  private

  def check_database
    ActiveRecord::Base.connection.execute('SELECT 1')
    { status: 'ok' }
  rescue => e
    { status: 'error', message: e.message }
  end

  def check_redis
    # Redis.current was removed in redis-rb 5.x; build a client explicitly
    Redis.new(url: ENV['REDIS_URL']).ping
    { status: 'ok' }
  rescue => e
    { status: 'error', message: e.message }
  end

  def check_storage
    available = `df -h /app/storage | tail -1 | awk '{print $4}'`.strip
    { status: 'ok', available: available }
  rescue => e
    { status: 'error', message: e.message }
  end
end

Log Management

  • Centralized log aggregation
  • Log rotation to prevent disk fill
  • Alert on error patterns
  • Regular log analysis

Uptime Monitoring

Use external monitoring service:

  • Regular health check pings
  • SSL certificate expiration alerts
  • Response time tracking
  • Downtime notifications

Scaling Considerations

Horizontal Scaling

Dawarich can be scaled horizontally:

  • Deploy multiple application instances
  • Use load balancer for distribution
  • Share database and Redis between instances
  • Shared storage for uploads

Resource Allocation

Typical requirements:

  • CPU: 1-2 cores for up to 10 active users
  • Memory: 1-2GB RAM
  • Storage: 10GB+ depending on data volume
  • Database: Separate instance recommended for >100GB data

Database Scaling

For large datasets:

  • Database read replicas
  • Connection pooling
  • Query caching
  • Partitioning by date

Troubleshooting

Installation Issues

Problem: Build fails during gem installation

Solutions:

  • Check Ruby version matches requirements
  • Ensure all system dependencies installed
  • Review build logs for specific gem errors
  • Try building with --no-cache flag
  • Verify network connectivity for gem downloads

Problem: Asset precompilation fails

Solutions:

  • Ensure Node.js and npm are installed
  • Check JavaScript dependencies in package.json
  • Verify SECRET_KEY_BASE is set (even dummy value for build)
  • Review webpack/sprockets configuration
  • Check for JavaScript syntax errors

Database Issues

Problem: Cannot connect to database

Solutions:

  • Verify DATABASE_URL format is correct
  • Check database host is accessible
  • Ensure database user has proper permissions
  • Test connection with psql or sqlite3 cli
  • Review database logs for connection errors
  • Check firewall rules

Problem: Migration fails

Solutions:

  • Ensure database exists before migration
  • Check database user has CREATE/ALTER permissions
  • Review migration files for errors
  • Run migrations manually: bundle exec rake db:migrate
  • Check for pending migrations: bundle exec rake db:migrate:status

Import Issues

Problem: GPX import fails

Solutions:

  • Verify GPX file is valid XML
  • Check file size within limits
  • Ensure track segments contain coordinates
  • Review import logs for specific errors
  • Test with sample GPX file first
  • Check file encoding (should be UTF-8)

Problem: Google Takeout import slow or fails

Solutions:

  • Large files may take 10-30 minutes to process
  • Check Sidekiq worker is running
  • Increase worker timeout if needed
  • Split large files into smaller chunks
  • Monitor background job queue
  • Check available disk space

Problem: OwnTracks not syncing

Solutions:

  • Verify API endpoint URL is correct
  • Check authentication token is valid
  • Ensure HTTPS (not HTTP) is used
  • Review OwnTracks app logs
  • Test API endpoint with curl
  • Verify network connectivity from device

Performance Issues

Problem: Slow map loading

Solutions:

  • Reduce number of points displayed
  • Use clustering for dense areas
  • Enable heatmap view for better performance
  • Increase map tile cache
  • Use simpler map style
  • Paginate location queries
  • Add database indexes

Problem: High memory usage

Solutions:

  • Reduce Puma worker count
  • Decrease Rails thread count
  • Optimize background job concurrency
  • Clear old cached data
  • Increase container memory limits
  • Review for memory leaks in logs

Problem: Slow statistics generation

Solutions:

  • Cache statistics with longer TTL
  • Generate statistics in background job
  • Add database indexes on frequently queried fields
  • Use materialized views for complex queries
  • Reduce statistics calculation frequency

Storage Issues

Problem: Running out of disk space

Solutions:

  • Clean up old import files
  • Remove temporary processing files
  • Compress old location data
  • Archive historical data to external storage
  • Increase volume size in Klutch.sh
  • Implement data retention policies

Problem: Files not persisting after restart

Solutions:

  • Verify persistent volumes are mounted correctly
  • Check mount paths match configuration
  • Ensure volumes have write permissions
  • Test file write: touch /app/storage/test.txt
  • Review volume configuration in Klutch.sh

User Interface Issues

Problem: Map not displaying

Solutions:

  • Check browser console for JavaScript errors
  • Verify map tile provider is accessible
  • Test with different map provider
  • Clear browser cache
  • Check for ad blockers interfering
  • Verify Leaflet assets loaded correctly

Problem: Login issues

Solutions:

  • Verify SECRET_KEY_BASE is consistent across restarts
  • Check session configuration
  • Clear browser cookies
  • Ensure HTTPS is working correctly
  • Review authentication logs
  • Reset password if forgotten

Conclusion

Dawarich puts you back in control of your location data. Instead of relying on commercial services that monetize your movements, you can track, visualize, and analyze your travels on infrastructure you own. The beautiful visualizations and detailed statistics help you understand your travel patterns while maintaining complete privacy.

Deploying Dawarich on Klutch.sh gives you the infrastructure to run this location tracking platform without managing servers or worrying about scaling. Your location history is always available, always under your control, and always private. Whether you’re tracking daily commutes, documenting travel adventures, or analyzing movement patterns, Dawarich provides the tools to explore your journey through life.

Start owning your location data today and discover insights about your travels without compromising privacy.