tigris-backup-export
# Tigris Backup & Export
Back up databases, export application data, and archive files to Tigris. Covers automated backup pipelines, retention policies, and restore workflows for all major frameworks.
## Prerequisites
Before doing anything else, install the Tigris CLI if it's not already available:

```bash
tigris help || npm install -g @tigrisdata/cli
```

If you need to install it, tell the user: "I'm installing the Tigris CLI (@tigrisdata/cli) so we can work with Tigris object storage."
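The CLI and the SDK snippets below authenticate with S3-compatible credentials. A sketch of the environment variables involved, assuming a Tigris access key created in the dashboard (the key values are placeholders; the variable names follow the AWS conventions used elsewhere in this document):

```shell
# Tigris speaks the S3 protocol, so the standard AWS variables apply.
# The key values here are illustrative; use a real Tigris access key.
export AWS_ACCESS_KEY_ID="tid_example_key_id"
export AWS_SECRET_ACCESS_KEY="tsec_example_secret"
export AWS_ENDPOINT_URL_S3="https://t3.storage.dev"  # Tigris S3 endpoint
export AWS_REGION="auto"                             # Tigris uses a single "auto" region
```

Per the critical rules below, keep the backup bucket's key separate from the application's key.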
## Overview
| Component | Purpose |
|---|---|
| Backup script | Dump database, compress, upload to Tigris |
| Scheduler | Run backups on a cron schedule |
| Retention | Delete old backups automatically (lifecycle rules) |
| Restore | Download and restore from a specific backup |
Naming convention: `backups/db/YYYY-MM-DDTHH:MM:SSZ.sql.gz`
## General Pattern
```bash
#!/bin/bash
# scripts/backup.sh
set -euo pipefail  # abort on any failure, including inside the dump pipeline

TIMESTAMP=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
BACKUP_FILE="backup-${TIMESTAMP}.sql.gz"
BUCKET="my-app-backups"

# Dump and compress
pg_dump "$DATABASE_URL" | gzip > "/tmp/$BACKUP_FILE"

# Upload to Tigris
tigris cp "/tmp/$BACKUP_FILE" "t3://$BUCKET/backups/db/$BACKUP_FILE"

# Clean up local file
rm "/tmp/$BACKUP_FILE"
echo "Backup uploaded: backups/db/$BACKUP_FILE"
```

For MySQL, replace the `pg_dump` line with:

```bash
mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "/tmp/$BACKUP_FILE"
```
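The Overview table lists a scheduler component; for the plain bash script, a classic crontab entry is enough. A sketch (the script path and log location are illustrative):

```cron
# crontab -e — run the backup daily at 2 AM (UTC if the server clock is UTC)
0 2 * * * /opt/my-app/scripts/backup.sh >> /var/log/backup.log 2>&1
```

Make sure `DATABASE_URL` and the Tigris credentials are available in cron's environment, which does not inherit your shell profile.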
## Next.js / Express (Node.js)
```typescript
import { exec } from "child_process";
import { promisify } from "util";
import { createReadStream } from "fs";
import { unlink } from "fs/promises";
import { put } from "@tigrisdata/storage";

const execAsync = promisify(exec);

async function backupDatabase() {
  const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
  const filename = `backup-${timestamp}.sql.gz`;
  const localPath = `/tmp/${filename}`;

  // Dump and compress
  await execAsync(
    `pg_dump "${process.env.DATABASE_URL}" | gzip > "${localPath}"`,
  );

  // Upload to Tigris
  const stream = createReadStream(localPath);
  const result = await put(`backups/db/${filename}`, stream, {
    contentType: "application/gzip",
    multipart: true,
    config: { bucket: "my-app-backups" },
  });

  // Clean up
  await unlink(localPath);

  if (result.error) throw result.error;
  return result.data?.path;
}
```
### Schedule with node-cron
```typescript
import cron from "node-cron";

// Daily at 2 AM UTC
cron.schedule("0 2 * * *", async () => {
  try {
    const path = await backupDatabase();
    console.log(`Backup complete: ${path}`);
  } catch (error) {
    console.error("Backup failed:", error);
  }
});
```
## Rails

### Rake Task
Note: No native Tigris Ruby SDK exists yet; this uses `aws-sdk-s3` pointed at Tigris.
```ruby
# lib/tasks/backup.rake
namespace :db do
  desc "Backup database to Tigris"
  task backup: :environment do
    require "aws-sdk-s3"

    timestamp = Time.now.utc.strftime("%Y-%m-%dT%H:%M:%SZ")
    filename = "backup-#{timestamp}.sql.gz"
    local_path = "/tmp/#{filename}"

    # Dump and compress
    db_config = ActiveRecord::Base.connection_db_config.configuration_hash
    system("pg_dump '#{db_config[:url] || build_pg_url(db_config)}' | gzip > '#{local_path}'")

    # Upload to Tigris
    s3 = Aws::S3::Client.new(
      endpoint: "https://t3.storage.dev",
      region: "auto",
      access_key_id: ENV["AWS_ACCESS_KEY_ID"],
      secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"],
    )
    File.open(local_path, "rb") do |file|
      s3.put_object(
        bucket: ENV["TIGRIS_BACKUP_BUCKET"],
        key: "backups/db/#{filename}",
        body: file,
        content_type: "application/gzip",
      )
    end

    File.delete(local_path)
    puts "Backup uploaded: backups/db/#{filename}"
  end
end
```
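The rake task calls a `build_pg_url` helper that is not defined in the file; a minimal sketch, assuming the standard keys of ActiveRecord's configuration hash (credential URL-escaping is left out for brevity):

```ruby
# Hypothetical helper: assemble a PostgreSQL URL from an ActiveRecord
# configuration hash when no :url key is present.
def build_pg_url(config)
  host = config[:host] || "localhost"
  port = config[:port] || 5432
  "postgresql://#{config[:username]}:#{config[:password]}@#{host}:#{port}/#{config[:database]}"
end
```

If passwords can contain URL-special characters, escape them before interpolating.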
### Schedule with whenever
```ruby
# config/schedule.rb (whenever gem)
every 1.day, at: "2:00 am" do
  rake "db:backup"
end
```
## Django

### Management Command
Uses `tigris-boto3-ext` (install: `pip install tigris-boto3-ext`).
```python
# myapp/management/commands/dbbackup.py
import gzip
import os
import subprocess
from datetime import datetime, timezone

import boto3
from botocore.client import Config
from django.conf import settings
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Backup database to Tigris"

    def handle(self, *args, **options):
        timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
        filename = f"backup-{timestamp}.sql.gz"
        local_path = f"/tmp/{filename}"

        # Dump and compress
        db = settings.DATABASES["default"]
        dump_cmd = (
            f'pg_dump "postgresql://{db["USER"]}:{db["PASSWORD"]}'
            f'@{db["HOST"]}:{db["PORT"]}/{db["NAME"]}"'
        )
        with gzip.open(local_path, "wb") as f:
            result = subprocess.run(dump_cmd, shell=True, capture_output=True)
            f.write(result.stdout)

        # Upload to Tigris
        s3 = boto3.client(
            "s3",
            endpoint_url="https://t3.storage.dev",
            aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
            aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
            region_name="auto",
            config=Config(s3={"addressing_style": "virtual"}),
        )
        s3.upload_file(
            local_path,
            settings.TIGRIS_BACKUP_BUCKET,
            f"backups/db/{filename}",
        )

        os.unlink(local_path)
        self.stdout.write(f"Backup uploaded: backups/db/{filename}")
```

Run it with:

```bash
python manage.py dbbackup
```
### Schedule with celery-beat
```python
# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "daily-backup": {
        "task": "myapp.tasks.backup_database",
        "schedule": crontab(hour=2, minute=0),
    },
}
```
## Laravel

### Artisan Command
```php
<?php
// app/Console/Commands/BackupDatabase.php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class BackupDatabase extends Command
{
    protected $signature = 'db:backup';
    protected $description = 'Backup database to Tigris';

    public function handle()
    {
        $timestamp = now()->utc()->format('Y-m-d\TH:i:s\Z');
        $filename = "backup-{$timestamp}.sql.gz";
        $localPath = "/tmp/{$filename}";

        // Dump and compress
        $dbUrl = config('database.connections.pgsql.url')
            ?? sprintf('postgresql://%s:%s@%s:%s/%s',
                config('database.connections.pgsql.username'),
                config('database.connections.pgsql.password'),
                config('database.connections.pgsql.host'),
                config('database.connections.pgsql.port'),
                config('database.connections.pgsql.database'),
            );
        exec("pg_dump \"{$dbUrl}\" | gzip > \"{$localPath}\"");

        // Upload to Tigris
        Storage::disk('tigris')->put(
            "backups/db/{$filename}",
            file_get_contents($localPath),
        );

        unlink($localPath);
        $this->info("Backup uploaded: backups/db/{$filename}");
    }
}
```
### Schedule in Kernel
```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    $schedule->command('db:backup')->dailyAt('02:00');
}
```

Or with spatie/laravel-backup:

```bash
composer require spatie/laravel-backup
php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider"
```

Configure the `tigris` disk in `config/backup.php`.
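Both the artisan command and spatie/laravel-backup resolve the `tigris` disk from Laravel's filesystem config. A sketch of that disk in `config/filesystems.php`, assuming S3-style credentials in the environment (the env variable names are illustrative):

```php
// config/filesystems.php — inside the 'disks' array
'tigris' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => 'auto',
    'bucket' => env('TIGRIS_BACKUP_BUCKET'),
    'endpoint' => 'https://t3.storage.dev',
],
```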
## Restore
```bash
# List backups
tigris ls t3://my-app-backups/backups/db/

# Download and restore
tigris cp t3://my-app-backups/backups/db/backup-2026-03-18T02:00:00Z.sql.gz /tmp/restore.sql.gz
gunzip /tmp/restore.sql.gz
psql "$DATABASE_URL" < /tmp/restore.sql
```
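Because the filenames embed ISO-8601 UTC timestamps, lexicographic order equals chronological order, so the newest backup can be selected with a plain sort. A small demonstration (`pick_latest` is a hypothetical helper; the listing format of `tigris ls` may need extra parsing before piping into it):

```shell
# ISO-8601 UTC timestamps sort lexicographically in chronological order,
# so the newest backup is simply the last line of a sorted listing.
pick_latest() { sort | tail -n 1; }

printf '%s\n' \
  "backup-2026-03-17T02:00:00Z.sql.gz" \
  "backup-2026-03-18T02:00:00Z.sql.gz" \
  "backup-2026-01-05T02:00:00Z.sql.gz" | pick_latest
# → backup-2026-03-18T02:00:00Z.sql.gz
```

In practice that would look like `tigris ls t3://my-app-backups/backups/db/ | pick_latest` before the download step above.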
## Retention Policy
Use Tigris lifecycle rules to auto-delete old backups:

```bash
tigris buckets lifecycle set my-app-backups --config lifecycle.json
```

```json
{
  "Rules": [
    {
      "ID": "delete-old-backups",
      "Filter": { "Prefix": "backups/db/" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
```
Recommended retention:
| Type | Retention |
|---|---|
| Daily backups | 30 days |
| Weekly backups | 90 days |
| Monthly backups | 1 year |
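Lifecycle rules match on prefixes, so the tiers above only work if each tier is written under its own prefix. A sketch of a multi-rule `lifecycle.json` under that assumption (the `daily/`, `weekly/`, and `monthly/` sub-prefixes are illustrative and must match what your backup scripts write):

```json
{
  "Rules": [
    { "ID": "daily",   "Filter": { "Prefix": "backups/db/daily/" },   "Status": "Enabled", "Expiration": { "Days": 30 } },
    { "ID": "weekly",  "Filter": { "Prefix": "backups/db/weekly/" },  "Status": "Enabled", "Expiration": { "Days": 90 } },
    { "ID": "monthly", "Filter": { "Prefix": "backups/db/monthly/" }, "Status": "Enabled", "Expiration": { "Days": 365 } }
  ]
}
```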
## Critical Rules
**Always:**
- Compress before uploading (gzip)
- Use timestamps in filenames
- Set up retention/lifecycle rules
- Test restore periodically
- Store backup bucket credentials separately from app credentials

**Never:**
- Store backups in the same bucket as application data
- Skip compression for database dumps
- Forget to clean up local temp files after upload
## Related Skills
- `tigris-lifecycle-management` — Auto-delete old backups
- `tigris-security-access-control` — Separate backup credentials
- `file-storage` — CLI setup and access key management
## Official Documentation
- Tigris CLI: https://www.tigrisdata.com/docs/