BC-bak/bc-backup.conf.template
Malin 3bad3ad171 feat: add incremental backups, S3 cleanup, and cron scheduling
Incremental backups using BC API's lastModifiedDateTime filter to only
export records changed since the last successful run. Runs every 15
minutes via cron, with a daily full backup for complete snapshots.

bc-export.ps1:
- Add -SinceDateTime parameter for incremental filtering
- Append $filter=lastModifiedDateTime gt {timestamp} to all entity URLs
- Exit code 2 when no records changed (skip archive/upload)
- Record mode and sinceDateTime in export-metadata.json
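The incremental filter described above can be sketched as a small URL-construction helper. This is a hedged shell sketch (bc-export.ps1 itself is PowerShell); the helper name `build_entity_url` and the handling of pre-existing query strings are assumptions, not code from the script:

```shell
# Append an OData lastModifiedDateTime filter to a BC entity URL.
# OData v4 datetimeoffset literals are unquoted; spaces must be
# percent-encoded when embedded in a URL.
build_entity_url() {
  local base_url="$1" since="$2"
  if [ -z "$since" ]; then
    # No state yet: full export, no filter appended
    printf '%s\n' "$base_url"
    return
  fi
  # Use '&' if the URL already carries a query string, '?' otherwise
  local sep='?'
  case "$base_url" in *\?*) sep='&' ;; esac
  printf '%s%s$filter=lastModifiedDateTime%%20gt%%20%s\n' "$base_url" "$sep" "$since"
}
```

With an empty `-SinceDateTime`, the URL passes through unchanged, which matches the full-backup fallback behaviour.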

bc-backup.sh:
- Accept --mode full|incremental flag (default: incremental)
- State file (last-run-state.json) tracks last successful run timestamp
- Auto-fallback to full when no state file exists
- Skip archive/encrypt/upload when incremental finds 0 changes
- Lock file (.backup.lock) prevents overlapping cron runs
- S3 keys organized by mode: backups/full/ vs backups/incremental/
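The state-file and lock-file mechanics can be sketched as below. This is an illustrative shell sketch under stated assumptions: the JSON key `last_successful_run` is hypothetical, and an atomic `mkdir` stands in for whatever locking primitive bc-backup.sh actually uses on `.backup.lock`:

```shell
STATE_FILE="last-run-state.json"
LOCK_FILE=".backup.lock"

acquire_lock() {
  # mkdir is atomic, so it doubles as a cross-process mutex;
  # a second cron run fails here instead of overlapping
  if ! mkdir "$LOCK_FILE" 2>/dev/null; then
    echo "another backup run is in progress" >&2
    return 1
  fi
  trap 'rmdir "$LOCK_FILE"' EXIT
}

last_run_timestamp() {
  # Empty output means no state file -> caller falls back to a full backup
  if [ ! -f "$STATE_FILE" ]; then
    echo ""
    return
  fi
  # Extract the timestamp without requiring jq (assumed key name)
  sed -n 's/.*"last_successful_run"[: ]*"\([^"]*\)".*/\1/p' "$STATE_FILE"
}
```

An empty timestamp drives the auto-fallback to full mode listed above; a populated one becomes the `-SinceDateTime` passed to bc-export.ps1.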

bc-cleanup.sh (new):
- Lists all S3 objects under backups/ prefix
- Deletes objects older than RETENTION_DAYS (default 30)
- Handles pagination for large buckets
- Gracefully handles COMPLIANCE-locked objects
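A sketch of the retention pass follows. ISO-8601 timestamps compare correctly as plain strings, so the age check needs no date parsing; the `aws s3api` invocation is shown for shape only (the AWS CLI auto-paginates when `--query`/`--output text` is used), and the GNU `date -d` cutoff computation is an assumption about the runtime:

```shell
is_older_than_cutoff() {
  # $1 = object LastModified, $2 = cutoff; both ISO-8601,
  # which sorts lexicographically
  [ "$1" '<' "$2" ]
}

cleanup_expired() {
  local cutoff
  cutoff=$(date -u -d "-${RETENTION_DAYS:-30} days" +%Y-%m-%dT%H:%M:%SZ)
  aws s3api list-objects-v2 --bucket "$S3_BUCKET" --prefix "backups/" \
    --endpoint-url "$S3_ENDPOINT" \
    --query 'Contents[].[Key,LastModified]' --output text |
  while IFS=$'\t' read -r key modified; do
    if is_older_than_cutoff "$modified" "$cutoff"; then
      # COMPLIANCE-locked versions refuse deletion; log and continue
      aws s3api delete-object --bucket "$S3_BUCKET" --key "$key" \
        --endpoint-url "$S3_ENDPOINT" || echo "skipped (locked?): $key" >&2
    fi
  done
}
```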

bc-backup.conf.template:
- Add BACKUP_MODE_DEFAULT option

cron-examples.txt:
- Recommended setup: 15-min incremental + daily full + daily cleanup
- Alternative schedules (30-min, hourly)
- Systemd timer examples
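The recommended schedule might look like the following crontab (install paths, times, and log location are assumptions, not taken from cron-examples.txt):

```
*/15 * * * *  /opt/bc-backup/bc-backup.sh --mode incremental >> /var/log/bc-backup/cron.log 2>&1
0 1 * * *     /opt/bc-backup/bc-backup.sh --mode full        >> /var/log/bc-backup/cron.log 2>&1
0 4 * * *     /opt/bc-backup/bc-cleanup.sh                   >> /var/log/bc-backup/cron.log 2>&1
```

If the 01:00 full run overlaps a 15-minute incremental, the `.backup.lock` file makes the later invocation exit instead of running concurrently.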

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 10:22:08 +01:00


# Business Central SaaS Backup Configuration
# Copy this file to bc-backup.conf and fill in your values
# IMPORTANT: Keep this file secure! It contains sensitive credentials.
# ===================================
# Azure AD Application Configuration
# ===================================
# Create an Azure AD App Registration with the following:
# 1. Navigate to https://portal.azure.com
# 2. Go to Azure Active Directory > App registrations > New registration
# 3. Name: "BC-Backup-Service" (or your preferred name)
# 4. Supported account types: "Accounts in this organizational directory only"
# 5. After creation, note the following:
# Your Azure AD Tenant ID (Directory ID)
AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
# Application (client) ID from the app registration
AZURE_CLIENT_ID="00000000-0000-0000-0000-000000000000"
# Client secret (create under Certificates & secrets > New client secret)
# IMPORTANT: Save this immediately - it won't be shown again!
AZURE_CLIENT_SECRET=""
# ===================================
# Azure AD API Permissions
# ===================================
# Add the following API permission to your app:
# 1. Go to API permissions > Add a permission
# 2. Select "Dynamics 365 Business Central"
# 3. Select "Application permissions"
# 4. Check "API.ReadWrite.All"
# 5. Click "Grant admin consent" (requires Global Admin)
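# Once consent is granted, the scripts exchange these credentials for a
# token via the OAuth2 client-credentials flow. A hypothetical one-off
# check (not part of the backup scripts) to verify the app registration:
#   curl -s -X POST \
#     "https://login.microsoftonline.com/$AZURE_TENANT_ID/oauth2/v2.0/token" \
#     -d "grant_type=client_credentials" \
#     -d "client_id=$AZURE_CLIENT_ID" \
#     -d "client_secret=$AZURE_CLIENT_SECRET" \
#     -d "scope=https://api.businesscentral.dynamics.com/.default"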
# ===================================
# Business Central Configuration
# ===================================
# Your BC environment name (e.g., "Production", "Sandbox")
# Find this in BC Admin Center: https://businesscentral.dynamics.com/
BC_ENVIRONMENT_NAME="Production"
# Optional: Limit export to a specific company name
# Leave empty to export all companies in the environment
BC_COMPANY_NAME=""
# ===================================
# Encryption Configuration
# ===================================
# Strong passphrase for GPG encryption
# Generate a secure passphrase: openssl rand -base64 32
# IMPORTANT: Store this securely! You'll need it to decrypt backups.
ENCRYPTION_PASSPHRASE=""
# Alternative: Use GPG key ID instead of passphrase (leave empty to use passphrase)
# GPG_KEY_ID=""
# ===================================
# S3 Storage Configuration
# ===================================
# S3 bucket name (must already exist with Object Lock enabled)
S3_BUCKET="bcbak"
# S3 endpoint URL
# AWS S3: https://s3.amazonaws.com or https://s3.REGION.amazonaws.com
# MinIO: http://minio.example.com:9000 or https://minio.example.com
# Wasabi: https://s3.wasabisys.com or https://s3.REGION.wasabisys.com
# Backblaze: https://s3.REGION.backblazeb2.com
S3_ENDPOINT="https://s3.palmasolutions.net:9000"
# AWS Access Key ID (or compatible credentials)
AWS_ACCESS_KEY_ID=""
# AWS Secret Access Key (or compatible credentials)
AWS_SECRET_ACCESS_KEY=""
# S3 region (for AWS, required; for others, may be optional)
AWS_DEFAULT_REGION="eu-south-1"
# S3 tool to use: "awscli" (recommended) or "s3cmd"
S3_TOOL="awscli"
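# A quick way to verify the endpoint and credentials before the first run
# (assumes awscli is installed and the variables above are exported):
#   aws s3 ls "s3://$S3_BUCKET/" --endpoint-url "$S3_ENDPOINT"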
# ===================================
# Backup Configuration
# ===================================
# Default backup mode when --mode is not specified on command line
# "incremental" = only export records changed since last run (fast, for cron)
# "full" = export everything (complete snapshot)
BACKUP_MODE_DEFAULT="incremental"
# Object lock retention period in days (must match or exceed bucket minimum)
# Also used by bc-cleanup.sh to determine which S3 objects to delete
RETENTION_DAYS="30"
# Maximum retry attempts for failed operations
MAX_RETRIES="3"
# Clean up local files after successful upload? (true/false)
CLEANUP_LOCAL="true"
# ===================================
# Optional: Email Notifications
# ===================================
# Enable email notifications on failure? (true/false)
# ENABLE_EMAIL_NOTIFICATIONS="false"
# Email address to send notifications to
# NOTIFICATION_EMAIL=""
# ===================================
# Advanced Configuration
# ===================================
# Local temporary directory (default: ./temp)
# WORK_DIR="/var/tmp/bc-backup"
# Log directory (default: ./logs)
# LOG_DIR="/var/log/bc-backup"