Deploying Dawarich
Dawarich is a self-hosted location tracking and history visualization application that gives you complete control over your location data. Instead of relying on commercial services like Google Timeline or Apple’s Significant Locations, Dawarich lets you own and manage your location history on your own infrastructure. The application imports GPS data from various sources including GPX files, Google Takeout archives, OwnTracks, and other location tracking apps, then visualizes your travels on interactive maps with timeline views, statistics, and detailed trip analysis.
What makes Dawarich particularly valuable is its focus on privacy and data ownership. All your location data stays on your server, and you have complete control over who can access it. The application provides beautiful visualizations of your travel history, showing places you’ve visited, routes you’ve taken, and statistics about your movements. Whether you’re tracking personal travel, analyzing commute patterns, or simply want to maintain a private record of your journeys, Dawarich offers the tools to explore and understand your location data without surrendering privacy to third-party services.
Why Deploy Dawarich on Klutch.sh?
Deploying Dawarich on Klutch.sh offers several advantages for hosting your location tracking platform:
- Automatic Docker Detection: Klutch.sh recognizes your Dockerfile and handles containerization without manual configuration
- Persistent Storage: Built-in volume management ensures your location database and imported data persist across deployments
- HTTPS by Default: Secure access to your location data with automatic SSL certificates
- Database Support: Easy integration with PostgreSQL or SQLite for storing location history
- Environment Management: Securely configure database credentials, API keys, and application settings through environment variables
- Privacy Control: Keep your location data on infrastructure you control
- Always-On Availability: Access your location history 24/7 from anywhere
Prerequisites
Before deploying Dawarich to Klutch.sh, ensure you have:
- A Klutch.sh account (sign up here)
- A GitHub account with a repository for your Dawarich deployment
- Basic understanding of Docker and containerization
- Location data to import (GPX files, Google Takeout, or OwnTracks data)
- Optional: PostgreSQL database (for production deployments)
- Git installed on your local development machine
- Familiarity with location data formats (GPX, GeoJSON)
Understanding Dawarich Architecture
Dawarich follows a Ruby on Rails architecture designed for location data management:
Core Components
Ruby on Rails Application
Dawarich is built with Ruby on Rails, providing a robust MVC framework for handling location data. The application manages user authentication, data import pipelines, visualization rendering, and API endpoints for location tracking devices. Rails’ Active Record ORM handles database interactions efficiently, supporting both PostgreSQL and SQLite backends.
Database Layer
Location data is stored in a relational database:
- PostgreSQL: Recommended for production with better performance for large datasets
- SQLite: Suitable for personal use or small deployments
The database schema stores:
- Location points (latitude, longitude, timestamp, accuracy)
- Places (frequently visited locations)
- Trips (connected location points forming journeys)
- User preferences and settings
- Import metadata and processing status
Location Data Importer
The import system processes various location data formats:
- GPX Files: Standard GPS exchange format from fitness apps and GPS devices
- Google Takeout: Location history from Google services
- OwnTracks: Real-time location tracking protocol
- GeoJSON: Geographic data interchange format
- CSV: Custom formatted location logs
Background jobs process imports asynchronously, parsing coordinates, validating data, and organizing points into meaningful trips.
Map Visualization Engine
Dawarich uses Leaflet.js and OpenStreetMap tiles to render interactive maps. The visualization system:
- Displays location points as markers or heatmaps
- Draws routes connecting sequential points
- Highlights frequently visited places
- Provides timeline navigation
- Supports multiple map layers and styles
Statistics and Analytics
The analytics engine processes location data to generate:
- Total distance traveled
- Countries and cities visited
- Time spent in different locations
- Travel patterns and trends
- Most frequent routes
- Speed and elevation profiles
API Layer
RESTful APIs enable:
- Real-time location updates from tracking apps
- Data export in various formats
- Third-party integrations
- Mobile app connectivity
- Webhook notifications for new locations
Background Job System
Sidekiq handles asynchronous tasks:
- Processing large import files
- Generating statistics
- Cleaning up old data
- Sending notifications
- Reverse geocoding locations
Data Flow
- User uploads location data file or device sends real-time update
- Data is validated and queued for processing
- Background job parses location data and extracts coordinates
- Location points are saved to database with timestamps
- Points are grouped into trips based on time gaps
- Reverse geocoding adds place names to coordinates
- Statistics are updated with new data
- Visualizations are generated for map display
- User views processed data through web interface
- API provides access for external applications
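The trip-grouping step above can be sketched in a few lines of Ruby. This is an illustrative sketch, not Dawarich's actual implementation: it splits a time-ordered list of points into trips wherever the gap between consecutive timestamps exceeds a threshold (the 30-minute cutoff is an assumed value).

```ruby
require 'time'

# Split time-ordered points into trips wherever the gap between
# consecutive timestamps exceeds max_gap (seconds).
# Illustrative only; Dawarich's real grouping logic may differ.
def group_into_trips(points, max_gap: 30 * 60)
  trips = []
  points.each do |point|
    if trips.empty? || point[:timestamp] - trips.last.last[:timestamp] > max_gap
      trips << [point]    # gap too large: start a new trip
    else
      trips.last << point # continue the current trip
    end
  end
  trips
end

points = [
  { lat: 37.7749, lon: -122.4194, timestamp: Time.parse('2025-12-05T08:00:00Z') },
  { lat: 37.7750, lon: -122.4195, timestamp: Time.parse('2025-12-05T08:05:00Z') },
  { lat: 37.7800, lon: -122.4300, timestamp: Time.parse('2025-12-05T12:00:00Z') }
]

trips = group_into_trips(points)
puts trips.length # the four-hour gap starts a second trip
```

The same idea generalizes to distance-based gaps; time gaps are just the simplest signal that one journey ended and another began.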
Storage Requirements
Dawarich requires persistent storage for:
- Database: Location points, trips, and metadata (grows with usage)
- Uploaded Files: GPX files and import archives before processing
- Assets: Map tiles cache and generated images
- Logs: Application and job processing logs
A typical deployment needs 1GB-10GB initially, with storage growing based on location data volume. One year of detailed tracking might use 2-5GB.
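To sanity-check volume sizing for your own usage, a back-of-the-envelope estimate helps. The figures below are assumptions for illustration (a GPS fix every 5 seconds, roughly 500 bytes per point including row and index overhead), not measured Dawarich numbers:

```ruby
# Rough storage estimate for one year of continuous tracking.
# All figures are illustrative assumptions; measure your own data.
points_per_day  = (24 * 60 * 60) / 5 # one fix every 5 seconds
bytes_per_point = 500                # row + index overhead, assumed
yearly_bytes    = points_per_day * bytes_per_point * 365

puts "#{points_per_day} points/day"
puts "#{(yearly_bytes / 1024.0 ** 3).round(1)} GB/year"
```

Under these assumptions the result lands around 3 GB per year, in line with the 2-5GB range above; sparser tracking intervals shrink it proportionally.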
Installation and Setup
Let’s walk through setting up Dawarich for deployment on Klutch.sh.
Step 1: Create the Project Structure
First, create a new directory for your Dawarich deployment:
```shell
mkdir dawarich-deployment
cd dawarich-deployment
git init
```

Step 2: Create the Dockerfile
Create a Dockerfile in the root directory:
```dockerfile
FROM ruby:3.2-alpine

# Set environment variables
ENV RAILS_ENV=production \
    RACK_ENV=production \
    NODE_ENV=production \
    RAILS_SERVE_STATIC_FILES=true \
    RAILS_LOG_TO_STDOUT=true

# Install system dependencies
RUN apk add --no-cache \
    build-base \
    postgresql-dev \
    postgresql-client \
    sqlite-dev \
    tzdata \
    nodejs \
    npm \
    git \
    curl \
    imagemagick \
    gcompat

# Create app directory
WORKDIR /app

# Create app user
RUN addgroup -g 1000 dawarich && \
    adduser -D -u 1000 -G dawarich dawarich

# Clone Dawarich repository
RUN git clone https://github.com/Freika/dawarich.git /app && \
    chown -R dawarich:dawarich /app

# Switch to app user
USER dawarich

# Install Ruby dependencies
COPY --chown=dawarich:dawarich Gemfile* ./
RUN bundle config set --local deployment 'true' && \
    bundle config set --local without 'development test' && \
    bundle install --jobs 4

# Install JavaScript dependencies
RUN npm install

# Precompile assets
RUN SECRET_KEY_BASE=dummy bundle exec rake assets:precompile

# Create necessary directories
RUN mkdir -p tmp/pids tmp/sockets log public/uploads

# Expose port
EXPOSE 3000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
    CMD curl -f http://localhost:3000/health || exit 1

# Start application
CMD ["sh", "-c", "bundle exec rake db:migrate && bundle exec puma -C config/puma.rb"]
```

Step 3: Create Gemfile for Dependencies
Create a Gemfile:
```ruby
source 'https://rubygems.org'

ruby '~> 3.2.0'

gem 'rails', '~> 7.1.0'
gem 'puma', '~> 6.0'
gem 'pg', '~> 1.5'
gem 'sqlite3', '~> 1.6'
gem 'redis', '~> 5.0'
gem 'sidekiq', '~> 7.0'
gem 'gpx', '~> 1.0'
gem 'devise', '~> 4.9'
gem 'pundit', '~> 2.3'
gem 'kaminari', '~> 1.2'
gem 'carrierwave', '~> 3.0'
gem 'mini_magick', '~> 4.12'
gem 'geocoder', '~> 1.8'
gem 'rgeo', '~> 3.0'
gem 'rgeo-geojson', '~> 2.1'
gem 'leaflet-rails', '~> 1.9'
gem 'chartkick', '~> 5.0'
gem 'groupdate', '~> 6.0'
gem 'rubyzip', '~> 2.3'
gem 'nokogiri', '~> 1.15'
gem 'httparty', '~> 0.21'
gem 'dotenv-rails', '~> 2.8'
gem 'bootsnap', require: false
gem 'sprockets-rails', '~> 3.4'
gem 'importmap-rails', '~> 1.2'
gem 'turbo-rails', '~> 1.5'
gem 'stimulus-rails', '~> 1.3'
```

Step 4: Create Environment Configuration
Create a .env.example file:
```shell
# Application Settings
RAILS_ENV=production
RACK_ENV=production
NODE_ENV=production
SECRET_KEY_BASE=your-secret-key-base

# Server Configuration
PORT=3000
RAILS_SERVE_STATIC_FILES=true
RAILS_LOG_TO_STDOUT=true

# Database Configuration (PostgreSQL - recommended for production)
DATABASE_URL=postgresql://dawarich:password@postgres-host:5432/dawarich_production

# Or SQLite (for development/small deployments)
# DATABASE_ADAPTER=sqlite3
# DATABASE_PATH=/app/storage/dawarich.db

# Redis Configuration (for Sidekiq)
REDIS_URL=redis://redis-host:6379/0

# Application URLs
APPLICATION_HOST=example-app.klutch.sh
APPLICATION_URL=https://example-app.klutch.sh

# File Upload Settings
MAX_UPLOAD_SIZE=100MB
STORAGE_PATH=/app/storage

# Map Settings
DEFAULT_MAP_CENTER_LAT=0
DEFAULT_MAP_CENTER_LNG=0
DEFAULT_MAP_ZOOM=2
MAP_TILE_PROVIDER=OpenStreetMap

# Geocoding Settings (optional)
GEOCODING_ENABLED=true
GEOCODING_API_KEY=your-geocoding-api-key
REVERSE_GEOCODING_PROVIDER=nominatim

# Import Settings
AUTO_PROCESS_IMPORTS=true
IMPORT_BATCH_SIZE=1000
MAX_IMPORT_FILE_SIZE=500MB

# Analytics and Statistics
ENABLE_STATISTICS=true
STATISTICS_CACHE_TTL=3600

# Security
FORCE_SSL=true
SESSION_TIMEOUT=1440

# Email Configuration (optional)
SMTP_ADDRESS=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=your-username
SMTP_PASSWORD=your-password
SMTP_DOMAIN=example.com
SMTP_FROM=noreply@example.com

# Background Jobs
SIDEKIQ_CONCURRENCY=5
SIDEKIQ_TIMEOUT=25
```

Step 5: Create Database Configuration
Create config/database.yml:
```yaml
default: &default
  adapter: <%= ENV['DATABASE_ADAPTER'] || 'postgresql' %>
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>
  timeout: 5000

production:
  <<: *default
  url: <%= ENV['DATABASE_URL'] %>
```

Step 6: Create Puma Configuration
Create config/puma.rb:
```ruby
#!/usr/bin/env puma

max_threads_count = ENV.fetch("RAILS_MAX_THREADS") { 5 }
min_threads_count = ENV.fetch("RAILS_MIN_THREADS") { max_threads_count }
threads min_threads_count, max_threads_count

port ENV.fetch("PORT") { 3000 }
environment ENV.fetch("RAILS_ENV") { "production" }

pidfile ENV.fetch("PIDFILE") { "tmp/pids/server.pid" }

plugin :tmp_restart

workers ENV.fetch("WEB_CONCURRENCY") { 2 }
preload_app!

on_worker_boot do
  ActiveRecord::Base.establish_connection if defined?(ActiveRecord)
end

before_fork do
  ActiveRecord::Base.connection_pool.disconnect! if defined?(ActiveRecord)
end
```

Step 7: Create Sidekiq Configuration
Create config/sidekiq.yml:
```yaml
:concurrency: <%= ENV.fetch("SIDEKIQ_CONCURRENCY") { 5 } %>
:timeout: <%= ENV.fetch("SIDEKIQ_TIMEOUT") { 25 } %>

:queues:
  - [critical, 5]
  - [default, 3]
  - [low, 1]

production:
  :concurrency: 10
```

Step 8: Create .dockerignore
Create a .dockerignore file:
```
.git
.gitignore
.env
.env.local
*.md
README.md
.DS_Store
Thumbs.db
log/
tmp/
node_modules/
public/uploads/
storage/
*.log
.byebug_history
.ruby-version
.bundle
vendor/bundle
coverage/
test/
spec/
```

Step 9: Create Startup Script
Create bin/docker-entrypoint.sh:
```shell
#!/bin/sh
set -e

# Wait for database to be ready
echo "Waiting for database..."
until nc -z -v -w30 $DB_HOST $DB_PORT 2>/dev/null
do
  echo "Waiting for database connection..."
  sleep 2
done

echo "Database is ready!"

# Run database migrations
echo "Running database migrations..."
bundle exec rake db:create db:migrate

# Start Sidekiq in background (optional)
if [ "$START_SIDEKIQ" = "true" ]; then
  echo "Starting Sidekiq..."
  bundle exec sidekiq -C config/sidekiq.yml &
fi

# Clear cache
bundle exec rake tmp:cache:clear

# Start the main process
exec "$@"
```

Make the script executable:

```shell
chmod +x bin/docker-entrypoint.sh
```

Step 10: Create Documentation
Create README.md:
```markdown
# Dawarich Location Tracking

This repository contains a Dawarich deployment configured for Klutch.sh.

## Features

- Self-hosted location tracking
- Privacy-focused data ownership
- GPX file import
- Google Takeout support
- OwnTracks integration
- Interactive map visualization
- Travel statistics and analytics
- Timeline views
- Place detection

## Configuration

Set environment variables for:

- Database connection
- Redis for background jobs
- Application URLs
- File upload limits
- Map settings

## Importing Data

Supported formats:

- GPX files from fitness apps
- Google Takeout location history
- OwnTracks JSON
- GeoJSON files
- CSV location logs

## Deployment

This application is configured to deploy on Klutch.sh with automatic Docker detection.
```

Step 11: Initialize Git Repository
```shell
git add .
git commit -m "Initial Dawarich setup for Klutch.sh deployment"
git branch -M master
git remote add origin https://github.com/yourusername/dawarich-deployment.git
git push -u origin master
```

Deploying to Klutch.sh
Now that your Dawarich application is configured, let’s deploy it to Klutch.sh.
1. Log in to Klutch.sh

   Navigate to klutch.sh/app and sign in with your GitHub account.

2. Create a New Project

   Click “New Project” and select “Import from GitHub”. Choose the repository containing your Dawarich deployment.

3. Configure Build Settings

   Klutch.sh automatically detects the Dockerfile in your repository and uses it to build your container.

4. Configure Traffic Settings

   Select “HTTP” as the traffic type. Dawarich serves its web interface on port 3000, and Klutch.sh will route HTTPS traffic to this port.

5. Set Environment Variables

   In the project settings, add the following environment variables:

   - `SECRET_KEY_BASE`: generate with `openssl rand -hex 64`
   - `RAILS_ENV`: `production`
   - `PORT`: `3000`
   - `APPLICATION_HOST`: your Klutch.sh domain (e.g., `example-app.klutch.sh`)
   - `APPLICATION_URL`: `https://example-app.klutch.sh`
   - `RAILS_SERVE_STATIC_FILES`: `true`
   - `RAILS_LOG_TO_STDOUT`: `true`

   For a PostgreSQL database (recommended):

   - `DATABASE_URL`: `postgresql://username:password@host:port/dawarich_production`

   For SQLite (simpler setup):

   - `DATABASE_ADAPTER`: `sqlite3`
   - `DATABASE_PATH`: `/app/storage/dawarich.db`

   For Redis (background jobs):

   - `REDIS_URL`: `redis://redis-host:6379/0`

   Optional settings:

   - `MAX_UPLOAD_SIZE`: `100MB`
   - `GEOCODING_ENABLED`: `true`
   - `ENABLE_STATISTICS`: `true`

6. Configure Persistent Storage

   Dawarich requires persistent storage for the database and uploaded files:

   - Storage Volume: mount path `/app/storage`, size 10GB (adjust based on expected data volume)
   - Uploads Volume: mount path `/app/public/uploads`, size 5GB

   These volumes ensure your location data and uploaded files persist across deployments.

7. Deploy the Application

   Click “Deploy” to start the build process. Klutch.sh will:

   - Clone your repository
   - Build the Docker image using your Dockerfile
   - Install Ruby and Node.js dependencies
   - Precompile Rails assets
   - Deploy the container
   - Provision an HTTPS endpoint

   The build typically takes 5-8 minutes due to Ruby gem compilation.

8. Access Your Dawarich Instance

   Once deployment completes, Klutch.sh provides a URL like `example-app.klutch.sh`. Your Dawarich location tracking platform will be available at this URL.

9. Create Admin Account

   On first visit, you’ll be prompted to create an admin account. Fill in your details to set up the initial user.
Getting Started with Dawarich
Once your Dawarich instance is deployed, here’s how to use it:
Initial Setup
Create Your Account
Visit your deployment URL and create an account. The first registered user becomes the admin.
Configure Preferences
Navigate to Settings to configure:
- Default map center and zoom level
- Timezone for date/time display
- Distance units (kilometers or miles)
- Privacy settings
- Notification preferences
Importing Location Data
Import GPX Files
GPX files from fitness apps, GPS devices, or tracking applications:
- Go to “Import” → “Upload File”
- Click “Choose File” and select your GPX file
- Add optional tags or description
- Click “Import”
- Processing happens in background
- Notification when complete
Example GPX structure:
```xml
<?xml version="1.0"?>
<gpx version="1.1" creator="YourApp">
  <trk>
    <name>Morning Run</name>
    <trkseg>
      <trkpt lat="37.7749" lon="-122.4194">
        <time>2025-12-05T08:00:00Z</time>
        <ele>10</ele>
      </trkpt>
      <trkpt lat="37.7750" lon="-122.4195">
        <time>2025-12-05T08:01:00Z</time>
        <ele>12</ele>
      </trkpt>
    </trkseg>
  </trk>
</gpx>
```

Import Google Takeout
Extract your location history from Google:
- Go to Google Takeout
- Deselect all, then select “Location History”
- Choose JSON format
- Export and download the archive
- In Dawarich: “Import” → “Google Takeout”
- Upload the JSON file
- Wait for processing (may take several minutes for large files)
Connect OwnTracks
For real-time location tracking:
- Install OwnTracks on your mobile device
- In Dawarich: Go to “Settings” → “API”
- Generate an API token
- In the OwnTracks app:
  - Mode: HTTP
  - Host: `example-app.klutch.sh`
  - URL: `/api/v1/locations`
  - Authentication: add the token in a header
- Enable tracking in OwnTracks
Bulk Import CSV
For custom location logs:
Create a CSV file with this format:

```
latitude,longitude,timestamp,accuracy
37.7749,-122.4194,2025-12-05T08:00:00Z,10
37.7750,-122.4195,2025-12-05T08:01:00Z,8
37.7751,-122.4196,2025-12-05T08:02:00Z,12
```

Upload it via “Import” → “CSV File”.
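Before uploading, you can sanity-check a custom CSV against this format with a short script. This sketch uses Ruby's standard csv library; the column names match the example above, but the validity rules (latitude within ±90, longitude within ±180, parseable ISO 8601 timestamp) are common-sense checks, not Dawarich's exact import validation.

```ruby
require 'csv'
require 'time'

# Return the rows of a latitude,longitude,timestamp,accuracy CSV
# that fail basic sanity checks. Illustrative pre-flight check only.
def invalid_rows(csv_text)
  CSV.parse(csv_text, headers: true).reject do |row|
    lat = row['latitude'].to_f
    lon = row['longitude'].to_f
    lat.between?(-90, 90) &&
      lon.between?(-180, 180) &&
      (Time.iso8601(row['timestamp']) rescue nil)
  end
end

sample = <<~CSV
  latitude,longitude,timestamp,accuracy
  37.7749,-122.4194,2025-12-05T08:00:00Z,10
  999.0,-122.4195,2025-12-05T08:01:00Z,8
CSV

puts invalid_rows(sample).length # the 999.0 latitude row fails
```

Running a check like this before import saves waiting on a background job only to discover malformed rows.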
Viewing Your Location History
Map View
The main interface displays your location history on an interactive map:
- Markers: Individual location points
- Lines: Routes connecting sequential points
- Heatmap: Frequently visited areas
- Clusters: Grouped nearby points for better visualization
Navigation:
- Zoom with mouse wheel or +/- buttons
- Pan by dragging
- Click markers for details (time, accuracy, elevation)
Timeline View
Browse your history chronologically:
- Day/Week/Month/Year views
- Calendar navigation
- Quick jump to specific dates
- Filter by tags or places
Places
Dawarich automatically detects frequently visited locations:
- Home, work, and other regular places
- Time spent at each location
- Visit frequency
- Custom place names and categories
Trips
Connected location points form trips:
- Trip list with dates and durations
- Distance traveled
- Route visualization
- Start and end points
- Export trip as GPX
Statistics and Analytics
Dashboard Overview
View high-level statistics:
```
Total Distance: 12,450 km
Countries Visited: 15
Cities Visited: 89
Days Tracked: 365
Total Points: 2,450,000
```

Travel Statistics
Detailed breakdown:
- Distance by year/month/week
- Most visited places
- Travel patterns (weekday vs weekend)
- Average daily movement
- Speed analysis
- Elevation profiles
Country and City Analytics
See where you’ve spent time:
- Countries visited with entry/exit dates
- Cities with visit counts and duration
- Interactive map highlighting visited areas
- Export country/city list
Time Analysis
Understand your movement patterns:
- Time of day activity
- Most active hours
- Seasonal trends
- Year-over-year comparisons
API Usage
Authentication
Generate API token in Settings:
```shell
curl -H "Authorization: Bearer YOUR_TOKEN" \
  https://example-app.klutch.sh/api/v1/locations
```

Submit Location
Send location updates:
```shell
curl -X POST https://example-app.klutch.sh/api/v1/locations \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "latitude": 37.7749,
    "longitude": -122.4194,
    "timestamp": "2025-12-05T08:00:00Z",
    "accuracy": 10,
    "altitude": 15,
    "speed": 1.5,
    "heading": 90
  }'
```

Get Recent Locations
Retrieve your location history:
```shell
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/locations?limit=100&since=2025-12-01"
```

Response:
```json
{
  "locations": [
    {
      "id": 12345,
      "latitude": 37.7749,
      "longitude": -122.4194,
      "timestamp": "2025-12-05T08:00:00Z",
      "accuracy": 10,
      "altitude": 15,
      "speed": 1.5
    }
  ],
  "total": 100,
  "page": 1
}
```

Export Data
Download your data:
```shell
# GPX export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/gpx?start_date=2025-01-01&end_date=2025-12-31" \
  > my_data.gpx

# GeoJSON export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/geojson?start_date=2025-01-01" \
  > my_data.geojson

# CSV export
curl -H "Authorization: Bearer YOUR_TOKEN" \
  "https://example-app.klutch.sh/api/v1/export/csv" \
  > my_data.csv
```

Privacy and Sharing
Privacy Settings
Control your data visibility:
- Make profile public or private
- Share specific trips with others
- Generate shareable links with expiration
- Revoke access anytime
Sharing Trips
Share a specific trip:
- Navigate to trip details
- Click “Share”
- Set permissions (view only, download)
- Set expiration date (optional)
- Copy share link
- Send to recipients
Data Export and Backup
Regularly backup your data:
- Go to “Settings” → “Export”
- Choose export format (GPX, GeoJSON, CSV)
- Select date range
- Click “Export”
- Download archive
Advanced Configuration
PostgreSQL Database Setup
For production deployments, use PostgreSQL for better performance:
Step 1: Create Database
```sql
CREATE DATABASE dawarich_production;
CREATE USER dawarich WITH PASSWORD 'secure_password';
GRANT ALL PRIVILEGES ON DATABASE dawarich_production TO dawarich;
```

Step 2: Enable PostGIS (for geographic queries)

```sql
\c dawarich_production
CREATE EXTENSION postgis;
CREATE EXTENSION postgis_topology;
```

Step 3: Configure Connection

Update the environment variable:

```
DATABASE_URL=postgresql://dawarich:secure_password@postgres-host:5432/dawarich_production
```

Redis Configuration
Use Redis for background job processing:
Deploy Redis
Deploy a Redis instance on Klutch.sh or use external Redis service.
Configure Dawarich
Set environment variable:
```
REDIS_URL=redis://redis-host:6379/0
```

Start Sidekiq Worker

Update the Dockerfile CMD to include Sidekiq:

```dockerfile
CMD ["sh", "-c", "bundle exec sidekiq -C config/sidekiq.yml & bundle exec puma -C config/puma.rb"]
```

Reverse Geocoding
Convert coordinates to place names:
Nominatim (Free)
```
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=nominatim
NOMINATIM_URL=https://nominatim.openstreetmap.org
```

Google Maps Geocoding API

```
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=google
GOOGLE_MAPS_API_KEY=your-api-key
```

MapBox Geocoding

```
GEOCODING_ENABLED=true
REVERSE_GEOCODING_PROVIDER=mapbox
MAPBOX_API_TOKEN=your-mapbox-token
```

Custom Map Tiles

Change map appearance:

OpenStreetMap (Default)

```
MAP_TILE_PROVIDER=OpenStreetMap
```

Mapbox

```
MAP_TILE_PROVIDER=mapbox
MAPBOX_STYLE_URL=mapbox://styles/mapbox/streets-v12
MAPBOX_ACCESS_TOKEN=your-token
```

Stamen

```
MAP_TILE_PROVIDER=stamen
MAP_TILE_STYLE=terrain
```

Custom Tile Server

```
MAP_TILE_PROVIDER=custom
CUSTOM_TILE_URL=https://your-tiles.com/{z}/{x}/{y}.png
```

Automated Backups
Schedule regular backups:
Create bin/backup.sh:
```shell
#!/bin/bash
set -e

BACKUP_DIR="/app/backups"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="dawarich_backup_${DATE}.sql"

mkdir -p $BACKUP_DIR

# Backup database
pg_dump $DATABASE_URL > "$BACKUP_DIR/$BACKUP_FILE"

# Compress backup
gzip "$BACKUP_DIR/$BACKUP_FILE"

# Keep only last 7 days of backups
find $BACKUP_DIR -name "dawarich_backup_*.sql.gz" -mtime +7 -delete

echo "Backup completed: ${BACKUP_FILE}.gz"
```

Schedule the script with cron or an external scheduler.
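For a cron-based schedule, a nightly run at 03:00 could look like the entry below. The script path and log location are assumptions based on the layout used in this guide; adjust them to wherever the script actually lives in your container or on your host:

```shell
# crontab entry: run the backup script every night at 03:00,
# appending output to a log file (paths assumed from this guide)
0 3 * * * /app/bin/backup.sh >> /app/log/backup.log 2>&1
```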
Email Notifications
Configure email for notifications:
```
SMTP_ADDRESS=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=your-username
SMTP_PASSWORD=your-password
SMTP_DOMAIN=example.com
SMTP_FROM=dawarich@example.com
SMTP_AUTHENTICATION=plain
SMTP_ENABLE_STARTTLS_AUTO=true
```

Performance Tuning
Database Connection Pooling
```
RAILS_MAX_THREADS=5
DB_POOL=10
```

Puma Workers

```
WEB_CONCURRENCY=2
RAILS_MAX_THREADS=5
```

Caching

Enable Rails caching:

```
RAILS_CACHE_STORE=redis
REDIS_CACHE_URL=redis://redis-host:6379/1
```

Asset Caching

```
RAILS_SERVE_STATIC_FILES=true
RAILS_STATIC_CACHE_CONTROL=public,max-age=31536000
```

Production Best Practices
Follow these recommendations for running Dawarich in production:
Security
Strong Secret Keys
Generate secure keys:
```shell
# Secret key base
openssl rand -hex 64

# API tokens
openssl rand -base64 32
```

Never commit secrets to version control.
Database Security
- Use strong passwords
- Enable SSL connections
- Restrict database access by IP
- Regular security updates
- Automated backups
HTTPS Only
Klutch.sh provides automatic HTTPS. Ensure:
```
FORCE_SSL=true
```

User Authentication
- Enforce strong passwords
- Enable two-factor authentication (if available)
- Regular session timeout
- Account lockout after failed attempts
API Security
- Rate limit API endpoints
- Rotate API tokens regularly
- Use token expiration
- Monitor API usage for anomalies
Data Management
Regular Backups
Implement automated backup strategy:
- Daily database backups
- Weekly full backups
- Off-site backup storage
- Test restore procedures regularly
Data Retention
Configure automatic cleanup:
```
DATA_RETENTION_DAYS=730  # 2 years
OLD_DATA_CLEANUP_ENABLED=true
```

Storage Monitoring
Monitor storage usage:
- Database size
- Uploaded files
- Logs and temporary files
- Backup archives
Set alerts for storage thresholds.
Data Privacy
- Regular audit of user data
- Honor data deletion requests promptly
- Document data retention policies
- Comply with privacy regulations (GDPR, CCPA)
Performance Optimization
Database Indexes
Ensure proper indexes on location queries:
```sql
-- The ll_to_earth index requires the cube and earthdistance extensions
CREATE INDEX idx_locations_timestamp ON locations(timestamp);
CREATE INDEX idx_locations_user_id ON locations(user_id);
CREATE INDEX idx_locations_coords ON locations USING GIST(ll_to_earth(latitude, longitude));
```

Query Optimization
- Use pagination for large datasets
- Cache frequent queries
- Optimize slow queries
- Use database EXPLAIN for analysis
Background Jobs
- Monitor job queue size
- Adjust worker count based on load
- Set appropriate timeouts
- Retry failed jobs with backoff
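Retry-with-backoff can be illustrated with a simple delay schedule. The polynomial formula below mirrors the general shape of Sidekiq's default retry delay, but treat it as an illustrative assumption rather than Sidekiq's exact implementation (which also adds jitter):

```ruby
# Delay (seconds) before the nth retry: grows polynomially, so
# transient failures retry quickly while persistent ones back off hard.
# Illustrative formula, loosely modeled on Sidekiq's default schedule.
def retry_delay(retry_count)
  (retry_count ** 4) + 15
end

(0..4).each do |n|
  puts "retry #{n}: #{retry_delay(n)}s"
end
```

The practical upshot: a flaky geocoding call retries within seconds, while a misconfigured import job stops hammering the queue within a handful of attempts.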
Asset Optimization
- Compress images
- Minify JavaScript and CSS
- Use CDN for static assets
- Enable browser caching
Monitoring
Application Monitoring
Track key metrics:
- Response times
- Error rates
- Memory usage
- Database query performance
- Background job processing
Health Checks
Implement comprehensive health checks:
```ruby
# config/routes.rb
get '/health', to: 'health#index'
```

```ruby
# app/controllers/health_controller.rb
class HealthController < ApplicationController
  def index
    health = {
      status: 'healthy',
      timestamp: Time.current,
      checks: {
        database: check_database,
        redis: check_redis,
        storage: check_storage
      }
    }

    status = health[:checks].values.all? { |c| c[:status] == 'ok' } ? :ok : :service_unavailable
    render json: health, status: status
  end

  private

  def check_database
    ActiveRecord::Base.connection.execute('SELECT 1')
    { status: 'ok' }
  rescue => e
    { status: 'error', message: e.message }
  end

  def check_redis
    Redis.current.ping
    { status: 'ok' }
  rescue => e
    { status: 'error', message: e.message }
  end

  def check_storage
    available = `df -h /app/storage | tail -1 | awk '{print $4}'`.strip
    { status: 'ok', available: available }
  rescue => e
    { status: 'error', message: e.message }
  end
end
```

Log Management
- Centralized log aggregation
- Log rotation to prevent disk fill
- Alert on error patterns
- Regular log analysis
Uptime Monitoring
Use external monitoring service:
- Regular health check pings
- SSL certificate expiration alerts
- Response time tracking
- Downtime notifications
Scaling Considerations
Horizontal Scaling
Dawarich can be scaled horizontally:
- Deploy multiple application instances
- Use load balancer for distribution
- Share database and Redis between instances
- Shared storage for uploads
Resource Allocation
Typical requirements:
- CPU: 1-2 cores for up to 10 active users
- Memory: 1-2GB RAM
- Storage: 10GB+ depending on data volume
- Database: Separate instance recommended for >100GB data
Database Scaling
For large datasets:
- Database read replicas
- Connection pooling
- Query caching
- Partitioning by date
Troubleshooting
Installation Issues
Problem: Build fails during gem installation
Solutions:
- Check Ruby version matches requirements
- Ensure all system dependencies installed
- Review build logs for specific gem errors
- Try building with the `--no-cache` flag
- Verify network connectivity for gem downloads
Problem: Asset precompilation fails
Solutions:
- Ensure Node.js and npm are installed
- Check JavaScript dependencies in package.json
- Verify SECRET_KEY_BASE is set (even dummy value for build)
- Review webpack/sprockets configuration
- Check for JavaScript syntax errors
Database Issues
Problem: Cannot connect to database
Solutions:
- Verify DATABASE_URL format is correct
- Check database host is accessible
- Ensure database user has proper permissions
- Test connection with psql or sqlite3 cli
- Review database logs for connection errors
- Check firewall rules
Problem: Migration fails
Solutions:
- Ensure database exists before migration
- Check database user has CREATE/ALTER permissions
- Review migration files for errors
- Run migrations manually: `bundle exec rake db:migrate`
- Check for pending migrations: `bundle exec rake db:migrate:status`
Import Issues
Problem: GPX import fails
Solutions:
- Verify GPX file is valid XML
- Check file size within limits
- Ensure track segments contain coordinates
- Review import logs for specific errors
- Test with sample GPX file first
- Check file encoding (should be UTF-8)
Problem: Google Takeout import slow or fails
Solutions:
- Large files may take 10-30 minutes to process
- Check Sidekiq worker is running
- Increase worker timeout if needed
- Split large files into smaller chunks
- Monitor background job queue
- Check available disk space
Problem: OwnTracks not syncing
Solutions:
- Verify API endpoint URL is correct
- Check authentication token is valid
- Ensure HTTPS (not HTTP) is used
- Review OwnTracks app logs
- Test API endpoint with curl
- Verify network connectivity from device
Performance Issues
Problem: Slow map loading
Solutions:
- Reduce number of points displayed
- Use clustering for dense areas
- Enable heatmap view for better performance
- Increase map tile cache
- Use simpler map style
- Paginate location queries
- Add database indexes
Problem: High memory usage
Solutions:
- Reduce Puma worker count
- Decrease Rails thread count
- Optimize background job concurrency
- Clear old cached data
- Increase container memory limits
- Review for memory leaks in logs
Problem: Slow statistics generation
Solutions:
- Cache statistics with longer TTL
- Generate statistics in background job
- Add database indexes on frequently queried fields
- Use materialized views for complex queries
- Reduce statistics calculation frequency
Storage Issues
Problem: Running out of disk space
Solutions:
- Clean up old import files
- Remove temporary processing files
- Compress old location data
- Archive historical data to external storage
- Increase volume size in Klutch.sh
- Implement data retention policies
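To see where space is going before cleaning up, a quick check from a shell inside the container might look like this. The default path is the storage mount point used in this guide; pass another directory as the first argument to inspect it instead:

```shell
#!/bin/sh
# Report the largest entries under a data directory and the free
# space left on its volume. Path defaults assume this guide's layout.
TARGET="${1:-/app/storage}"
du -sh "$TARGET"/* 2>/dev/null | sort -rh | head -10
df -h "$TARGET" | tail -1
```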
Problem: Files not persisting after restart
Solutions:
- Verify persistent volumes are mounted correctly
- Check mount paths match configuration
- Ensure volumes have write permissions
- Test a file write: `touch /app/storage/test.txt`
- Review volume configuration in Klutch.sh
User Interface Issues
Problem: Map not displaying
Solutions:
- Check browser console for JavaScript errors
- Verify map tile provider is accessible
- Test with different map provider
- Clear browser cache
- Check for ad blockers interfering
- Verify Leaflet assets loaded correctly
Problem: Login issues
Solutions:
- Verify SECRET_KEY_BASE is consistent across restarts
- Check session configuration
- Clear browser cookies
- Ensure HTTPS is working correctly
- Review authentication logs
- Reset password if forgotten
Additional Resources
- Dawarich GitHub Repository
- Dawarich Wiki
- Leaflet.js Documentation
- Ruby on Rails Guides
- OpenStreetMap
- OwnTracks Documentation
- Klutch.sh Documentation
- Persistent Volumes Guide
Conclusion
Dawarich puts you back in control of your location data. Instead of relying on commercial services that monetize your movements, you can track, visualize, and analyze your travels on infrastructure you own. The beautiful visualizations and detailed statistics help you understand your travel patterns while maintaining complete privacy.
Deploying Dawarich on Klutch.sh gives you the infrastructure to run this location tracking platform without managing servers or worrying about scaling. Your location history is always available, always under your control, and always private. Whether you’re tracking daily commutes, documenting travel adventures, or analyzing movement patterns, Dawarich provides the tools to explore your journey through life.
Start owning your location data today and discover insights about your travels without compromising privacy.