feat: add incremental backups, S3 cleanup, and cron scheduling

Add incremental backups that use the BC API's lastModifiedDateTime filter
to export only records changed since the last successful run. Incrementals
run every 15 minutes via cron, alongside a daily full backup for complete
snapshots.

bc-export.ps1:
- Add -SinceDateTime parameter for incremental filtering
- Append $filter=lastModifiedDateTime gt {timestamp} to all entity URLs
- Exit code 2 when no records changed (skip archive/upload)
- Record mode and sinceDateTime in export-metadata.json

bc-backup.sh:
- Accept --mode full|incremental flag (default: incremental)
- State file (last-run-state.json) tracks last successful run timestamp
- Auto-fallback to full when no state file exists
- Skip archive/encrypt/upload when incremental finds 0 changes
- Lock file (.backup.lock) prevents overlapping cron runs
- S3 keys organized by mode: backups/full/ vs backups/incremental/
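The lock/state/fallback handling above can be sketched roughly as follows. This is a simplified illustration, not the script's exact contents: it uses a positional argument instead of the real `--mode` flag, a lock directory instead of a plain lock file, and assumes the state file stores a `lastSuccessfulRun` key (that key name is an assumption).

```shell
#!/usr/bin/env bash
set -euo pipefail

STATE_FILE="${STATE_FILE:-last-run-state.json}"
LOCK_FILE="${LOCK_FILE:-.backup.lock}"
MODE="${1:-incremental}"

# Prevent overlapping cron runs: mkdir is atomic, so only one process
# can take the lock.
if ! mkdir "$LOCK_FILE" 2>/dev/null; then
    echo "Another backup run holds $LOCK_FILE; exiting." >&2
    exit 0
fi
trap 'rmdir "$LOCK_FILE"' EXIT

# Auto-fallback: without a state file there is nothing to be incremental from.
if [ "$MODE" = "incremental" ] && [ ! -f "$STATE_FILE" ]; then
    echo "No $STATE_FILE found; falling back to full backup." >&2
    MODE="full"
fi

SINCE=""
if [ "$MODE" = "incremental" ]; then
    # Assumed state-file shape: {"lastSuccessfulRun": "<ISO 8601 timestamp>"}
    SINCE=$(sed -n 's/.*"lastSuccessfulRun": *"\([^"]*\)".*/\1/p' "$STATE_FILE")
fi

echo "mode=$MODE since=${SINCE:-<none>}"
# The PowerShell exporter would be invoked here; exit code 2 from it
# (success, but zero changed records) means skip archive/encrypt/upload.
```

On a first run there is no state file, so the sketch ends up in full mode with no `since` timestamp, matching the auto-fallback bullet.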

bc-cleanup.sh (new):
- Lists all S3 objects under backups/ prefix
- Deletes objects older than RETENTION_DAYS (default 30)
- Handles pagination for large buckets
- Gracefully handles COMPLIANCE-locked objects
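The retention check at the heart of a cleanup like this can be sketched as a pure filter (an assumed decomposition, not bc-cleanup.sh's actual code): given lines of `<ISO 8601 LastModified> <key>` from an S3 listing, print the keys older than the retention window. Listing, pagination, and deletion via the aws CLI are left out.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Reads "<timestamp> <key>" lines on stdin; prints keys whose timestamp
# is older than now minus $1 days (default 30).
filter_expired() {
    local retention_days="${1:-30}"
    local cutoff modified key
    # GNU date: "now minus retention_days", formatted as ISO 8601 UTC.
    cutoff=$(date -u -d "-${retention_days} days" +%Y-%m-%dT%H:%M:%SZ)
    while read -r modified key; do
        # ISO 8601 UTC timestamps compare correctly as plain strings.
        if [[ "$modified" < "$cutoff" ]]; then
            printf '%s\n' "$key"
        fi
    done
}
```

Such a filter could be fed from `aws s3api list-objects-v2 --bucket "$BUCKET" --prefix backups/ --query 'Contents[].[LastModified,Key]' --output text`, with each resulting delete wrapped in a failure-tolerant check so COMPLIANCE-locked objects are skipped rather than aborting the run.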

bc-backup.conf.template:
- Add BACKUP_MODE_DEFAULT option

cron-examples.txt:
- Recommended setup: 15-min incremental + daily full + daily cleanup
- Alternative schedules (30-min, hourly)
- Systemd timer examples
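For reference, the recommended setup amounts to a crontab along these lines (install paths, times, and log locations are illustrative placeholders, not the exact contents of cron-examples.txt):

```crontab
# m    h dom mon dow  command
*/15 * *   *   *      /opt/bc-backup/bc-backup.sh --mode incremental >> /var/log/bc-backup.log 2>&1
0    1 *   *   *      /opt/bc-backup/bc-backup.sh --mode full        >> /var/log/bc-backup.log 2>&1
30   2 *   *   *      /opt/bc-backup/bc-cleanup.sh                   >> /var/log/bc-cleanup.log 2>&1
```

Staggering the daily full backup and the cleanup away from each other (and relying on the .backup.lock to fence off any overlapping incremental) keeps the jobs from competing.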

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 10:22:08 +01:00
parent b407e2aeb7
commit 3bad3ad171
6 changed files with 334 additions and 92 deletions


@@ -6,7 +6,8 @@
 param(
     [Parameter(Mandatory=$true)]
-    [string]$OutputPath
+    [string]$OutputPath,
+    [string]$SinceDateTime = ""  # ISO 8601, e.g. "2026-02-15T00:00:00Z" for incremental
 )

 # Get configuration from environment variables
@@ -190,6 +191,9 @@ function Export-EntityData {
     )

     $entityUrl = "$baseUrl/companies($CompanyId)/$EntityName"
+    if ($SinceDateTime) {
+        $entityUrl += "?`$filter=lastModifiedDateTime gt $SinceDateTime"
+    }

     $maxEntityRetries = 5
     for ($entityAttempt = 1; $entityAttempt -le $maxEntityRetries; $entityAttempt++) {
@@ -255,6 +259,9 @@ function Export-DocumentWithLines {
     # Step 1: Fetch document headers page by page (no $expand)
     # BC API default page size is ~100, with @odata.nextLink for more
     $currentUrl = "$baseUrl/companies($CompanyId)/$DocumentEntity"
+    if ($SinceDateTime) {
+        $currentUrl += "?`$filter=lastModifiedDateTime gt $SinceDateTime"
+    }

     while ($currentUrl) {
         $response = Invoke-BCApi -Url $currentUrl
@@ -337,10 +344,16 @@ function Export-DocumentWithLines {
 # Main execution
 try {
+    $exportMode = if ($SinceDateTime) { "incremental" } else { "full" }
     Write-Log "========================================="
     Write-Log "BC Data Export Script (API v2.0)"
     Write-Log "========================================="
     Write-Log "Environment: $environmentName"
+    Write-Log "Mode: $exportMode"
+    if ($SinceDateTime) {
+        Write-Log "Changes since: $SinceDateTime"
+    }
     Write-Log "Output Path: $OutputPath"
     Write-Log "Entities to extract: $($entities.Count + $documentEntities.Count) ($($documentEntities.Count) with line items)"
@@ -434,6 +447,8 @@ try {
     $metadata = @{
         exportDate = (Get-Date -Format "yyyy-MM-dd HH:mm:ss UTC" -AsUTC)
         environment = $environmentName
+        mode = $exportMode
+        sinceDateTime = if ($SinceDateTime) { $SinceDateTime } else { $null }
         companies = @($targetCompanies | ForEach-Object { $_.name })
         entitiesExported = $totalEntities
         totalRecords = $totalRecords
@@ -443,6 +458,7 @@ try {
     Write-Log "========================================="
     Write-Log "Export completed"
+    Write-Log "Mode: $exportMode"
     Write-Log "Companies: $($targetCompanies.Count)"
     Write-Log "Entities: $totalEntities"
     Write-Log "Total records: $totalRecords"
@@ -450,6 +466,12 @@ try {
         Write-Log "Failed/empty: $($failedEntities.Count) entities" "WARN"
     }
     Write-Log "========================================="

+    # Exit code 2 = success but no records (used by bc-backup.sh to skip empty incrementals)
+    if ($totalRecords -eq 0 -and $exportMode -eq "incremental") {
+        Write-Log "No changes detected since $SinceDateTime"
+        exit 2
+    }
     exit 0
 }
 catch {