# Business Central SaaS Automated Backup System
Comprehensive backup solution for Microsoft Dynamics 365 Business Central SaaS that:
- Exports database via BC Admin Center API every hour
- Encrypts backups with GPG (AES-256)
- Uploads to S3-compatible object storage
- Enables immutability with 30-day delete prevention
- Maintains timestamped backup history
## Features
- **Automated Hourly Backups**: Uses cron/systemd to run backups on schedule
- **Secure Encryption**: GPG encryption with AES-256 cipher
- **Immutable Storage**: S3 Object Lock (WORM) with 30-day retention
- **Multiple S3 Providers**: AWS S3, MinIO, Wasabi, Backblaze B2
- **Comprehensive Logging**: Detailed logs for troubleshooting
- **Error Handling**: Retries and proper error reporting
- **Clean Architecture**: Modular bash + PowerShell scripts
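The retry behavior can be sketched as a small bash helper (illustrative only; the actual logic in `bc-backup.sh` may differ):

```shell
# Hypothetical retry wrapper: run a command up to N times before giving up.
retry() {
  local attempts=$1; shift
  local n=1
  until "$@"; do
    if [ "$n" -ge "$attempts" ]; then
      return 1                  # exhausted all attempts
    fi
    n=$((n + 1))
    # sleep $((n * 10))         # a real script would back off between attempts
  done
}

# Example usage: retry a flaky upload step up to 3 times
# retry 3 aws s3 cp ./backup.gpg "s3://$S3_BUCKET/backups/"
```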
## Architecture
```
┌─────────────────────┐
│    Cron/Systemd     │
│  (Hourly Trigger)   │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    bc-backup.sh     │ ◄─── Main orchestration script (bash)
│  - Orchestrates     │
│  - Encryption       │
│  - S3 Upload        │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    bc-export.ps1    │ ◄─── BC export logic (PowerShell)
│  - Azure AD auth    │
│  - API calls        │
│  - Download BACPAC  │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    BC Admin API     │
│    (Microsoft)      │
└─────────────────────┘
```
## Prerequisites
### 1. System Requirements
- **Linux server** (Ubuntu 20.04+, Debian 10+, CentOS 7+, or similar)
- **PowerShell 7+** (installed automatically by setup script)
- **GPG** (for encryption)
- **AWS CLI v2** or **s3cmd** (for S3 uploads)
- **Root or sudo access** (for initial setup)
### 2. Business Central Requirements
- Active Business Central SaaS subscription (paid, not trial)
- Production environment (exports only available from Production)
- Admin access to BC Admin Center
### 3. Azure AD App Registration
You need an Azure AD application with API permissions to access the BC Admin Center API.
### 4. S3-Compatible Storage
- S3 bucket with **Object Lock enabled** (immutability)
- Access credentials (Access Key ID + Secret Access Key)
## Quick Start
### 1. Download and Setup
```bash
cd /home/malin/c0ding/bcbak
chmod +x setup.sh
./setup.sh
```
The setup script will:
- Check and install dependencies (PowerShell, GPG, AWS CLI)
- Create directory structure
- Copy configuration template
- Set proper permissions
### 2. Create Azure AD App Registration
#### Step-by-Step:
1. Navigate to [Azure Portal](https://portal.azure.com)
2. Go to **Azure Active Directory** > **App registrations** > **New registration**
3. Configure:
- **Name**: `BC-Backup-Service`
- **Supported account types**: Accounts in this organizational directory only
- **Redirect URI**: Leave empty
4. Click **Register**
5. Note the following from the Overview page:
- **Application (client) ID**
- **Directory (tenant) ID**
#### Create Client Secret:
1. Go to **Certificates & secrets** > **New client secret**
2. **Description**: `BC Backup Key`
3. **Expires**: Choose appropriate duration (6 months, 1 year, etc.)
4. Click **Add**
5. **Copy the secret value immediately** (shown only once!)
#### Add API Permissions:
1. Go to **API permissions** > **Add a permission**
2. Select **Dynamics 365 Business Central**
3. Choose **Application permissions** (not Delegated)
4. Select: `Automation.ReadWrite.All` or `API.ReadWrite.All`
5. Click **Add permissions**
6. **Important**: Click **Grant admin consent for [Your Organization]**
- Requires Global Administrator role
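With the registration in place, the token request that `bc-export.ps1` performs is an OAuth 2.0 client-credentials grant. Shown here with `curl` for illustration (the IDs and secret are placeholders; the request itself is commented out because it needs live credentials):

```shell
# Placeholder values — substitute the IDs from your app registration's Overview page.
TENANT_ID="your-tenant-id"
CLIENT_ID="your-client-id"
CLIENT_SECRET="your-client-secret"

# Microsoft identity platform v2.0 token endpoint
TOKEN_URL="https://login.microsoftonline.com/${TENANT_ID}/oauth2/v2.0/token"

# curl -s -X POST "$TOKEN_URL" \
#   -d "grant_type=client_credentials" \
#   -d "client_id=${CLIENT_ID}" \
#   -d "client_secret=${CLIENT_SECRET}" \
#   -d "scope=https://api.businesscentral.dynamics.com/.default"
echo "$TOKEN_URL"
```

The JSON response contains an `access_token` that is then sent as a `Bearer` token on Admin Center API calls.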
### 3. Configure S3 Bucket with Object Lock
#### AWS S3 Example:
```bash
# Create bucket with Object Lock enabled
aws s3api create-bucket \
  --bucket my-bc-backups \
  --region us-east-1 \
  --object-lock-enabled-for-bucket

# Configure default retention (30 days, COMPLIANCE mode)
aws s3api put-object-lock-configuration \
  --bucket my-bc-backups \
  --object-lock-configuration '{
    "ObjectLockEnabled": "Enabled",
    "Rule": {
      "DefaultRetention": {
        "Mode": "COMPLIANCE",
        "Days": 30
      }
    }
  }'
```
#### MinIO Example:
```bash
# Create bucket with Object Lock enabled
mc mb myminio/my-bc-backups --with-lock

# Set default retention (30 days, COMPLIANCE mode)
mc retention set --default COMPLIANCE "30d" myminio/my-bc-backups
```
**Important**: Object Lock can **only be enabled when creating a bucket**. You cannot add it to existing buckets.
### 4. Configure the Backup System
Edit the configuration file:
```bash
nano bc-backup.conf
```
Fill in the required values:
```bash
# Azure AD Configuration
AZURE_TENANT_ID="your-tenant-id-here"
AZURE_CLIENT_ID="your-client-id-here"
AZURE_CLIENT_SECRET="your-client-secret-here"
# Business Central
BC_ENVIRONMENT_NAME="Production"
# Generate a strong encryption passphrase ONCE (e.g. with `openssl rand -base64 32`)
# and paste the literal value here. Do NOT use a command substitution such as
# "$(openssl rand -base64 32)" in this file — it would produce a new passphrase
# on every run, making earlier backups undecryptable.
ENCRYPTION_PASSPHRASE="paste-generated-passphrase-here"
# S3 Configuration
S3_BUCKET="my-bc-backups"
S3_ENDPOINT="https://s3.amazonaws.com" # or your S3 provider
AWS_ACCESS_KEY_ID="your-access-key"
AWS_SECRET_ACCESS_KEY="your-secret-key"
AWS_DEFAULT_REGION="us-east-1"
# Backup settings
RETENTION_DAYS="30"
S3_TOOL="awscli"
CLEANUP_LOCAL="true"
```
**Security**: Store your `ENCRYPTION_PASSPHRASE` securely in a password manager!
### 5. Test Manual Backup
Run a test backup:
```bash
./bc-backup.sh
```
Monitor the logs:
```bash
tail -f logs/backup.log
```
Expected output:
```
[2026-01-07 10:00:00] =========================================
[2026-01-07 10:00:00] Starting Business Central backup process
[2026-01-07 10:00:00] =========================================
[2026-01-07 10:00:00] Environment: Production
[2026-01-07 10:00:00] S3 Bucket: my-bc-backups
[2026-01-07 10:00:00] Retention: 30 days
[2026-01-07 10:00:01] Step 1: Initiating database export via BC Admin Center API
...
```
### 6. Set Up Automated Hourly Backups
#### Option A: Using Cron (Simpler)
```bash
crontab -e
```
Add this line for hourly backups:
```
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1
```
See `cron-examples.txt` for more scheduling options.
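Because an export can take up to an hour, hourly cron runs may overlap. One common guard (assuming util-linux `flock` is available) is to serialize runs with a lock file:

```
0 * * * * flock -n /tmp/bc-backup.lock /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1
```

With `-n`, a new run exits immediately if the previous one still holds the lock, rather than queueing behind it.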
#### Option B: Using Systemd Timers (More Reliable)
Create service file:
```bash
sudo nano /etc/systemd/system/bc-backup.service
```
```ini
[Unit]
Description=Business Central Database Backup
[Service]
Type=oneshot
User=malin
WorkingDirectory=/home/malin/c0ding/bcbak
ExecStart=/home/malin/c0ding/bcbak/bc-backup.sh
StandardOutput=append:/home/malin/c0ding/bcbak/logs/backup.log
StandardError=append:/home/malin/c0ding/bcbak/logs/backup.log
```
Create timer file:
```bash
sudo nano /etc/systemd/system/bc-backup.timer
```
```ini
[Unit]
Description=Run BC Backup Every Hour
[Timer]
OnCalendar=hourly
Persistent=true
[Install]
WantedBy=timers.target
```
Enable and start:
```bash
sudo systemctl daemon-reload
sudo systemctl enable bc-backup.timer
sudo systemctl start bc-backup.timer
sudo systemctl status bc-backup.timer
```
## File Structure
```
bcbak/
├── bc-backup.sh              # Main orchestration script
├── bc-export.ps1             # PowerShell BC export logic
├── bc-backup.conf            # Your configuration (gitignored)
├── bc-backup.conf.template   # Configuration template
├── setup.sh                  # Installation script
├── cron-examples.txt         # Cron scheduling examples
├── README.md                 # This file
├── logs/                     # Backup logs
│   ├── backup.log
│   └── cron.log
└── temp/                     # Temporary files (auto-cleaned)
    └── bc_backup_*.bacpac
```
## How It Works
### 1. Database Export (bc-export.ps1)
- Authenticates to Azure AD using client credentials (OAuth 2.0)
- Calls BC Admin Center API to initiate database export
- Polls for completion (exports can take 15-60 minutes)
- Downloads BACPAC file to local temp directory
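The poll-until-complete pattern can be sketched in shell (the status source here is simulated; the real script queries the Admin Center API from PowerShell):

```shell
# Simulated status source: reports "Running" twice, then "Succeeded".
# A real implementation would make an authenticated API call here.
poll_count=0
get_export_status() {
  poll_count=$((poll_count + 1))
  if [ "$poll_count" -ge 3 ]; then
    STATUS="Succeeded"
  else
    STATUS="Running"
  fi
}

MAX_POLLS=60          # with a 60-second sleep, roughly a one-hour ceiling
get_export_status
polls=1
while [ "$STATUS" = "Running" ] && [ "$polls" -lt "$MAX_POLLS" ]; do
  # sleep 60          # a real script waits between polls
  get_export_status
  polls=$((polls + 1))
done
echo "$STATUS"
```

If the loop exits with the status still `Running`, the script treats it as a timeout (see `MAX_EXPORT_WAIT_MINUTES` in the troubleshooting section).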
### 2. Encryption (bc-backup.sh)
- Uses GPG with AES-256 symmetric encryption
- Encrypts the BACPAC file with your passphrase
- Original unencrypted file is deleted
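The encryption step amounts to GPG symmetric mode with the AES-256 cipher. A runnable sketch with throwaway names (the real script reads the passphrase from `bc-backup.conf` and works on the downloaded BACPAC):

```shell
set -e

PASSPHRASE="example-only-passphrase"   # real script: $ENCRYPTION_PASSPHRASE
workdir=$(mktemp -d)
printf 'fake bacpac contents' > "$workdir/db.bacpac"

# Encrypt with GPG symmetric mode, AES-256 cipher (gpg 2.1+ syntax)
printf '%s' "$PASSPHRASE" | gpg --batch --yes --pinentry-mode loopback \
  --passphrase-fd 0 --symmetric --cipher-algo AES256 \
  --output "$workdir/db.bacpac.gpg" "$workdir/db.bacpac"

# Round-trip check: decrypt and compare before discarding the plaintext
printf '%s' "$PASSPHRASE" | gpg --batch --quiet --pinentry-mode loopback \
  --passphrase-fd 0 --decrypt "$workdir/db.bacpac.gpg" > "$workdir/roundtrip.bacpac"

rm -f "$workdir/db.bacpac"             # real script deletes the unencrypted original
```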
### 3. Upload to S3
- Uploads encrypted file with timestamp in filename
- Format: `backups/bc_backup_Production_20260107_100000.bacpac.gpg`
- Sets Object Lock retention (COMPLIANCE mode, 30 days)
- Files are **immutable** and **cannot be deleted** until retention expires
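Per-object retention can also be set explicitly at upload time. A sketch of that call (uses GNU `date` syntax; the `aws` invocation is commented out because it needs live credentials and a real bucket):

```shell
# Compute a retain-until timestamp 30 days out (GNU date)
RETAIN_UNTIL=$(date -u -d "+30 days" +%Y-%m-%dT%H:%M:%SZ)
KEY="backups/bc_backup_Production_$(date -u +%Y%m%d_%H%M%S).bacpac.gpg"

# aws s3api put-object \
#   --bucket my-bc-backups \
#   --key "$KEY" \
#   --body bc_backup.bacpac.gpg \
#   --object-lock-mode COMPLIANCE \
#   --object-lock-retain-until-date "$RETAIN_UNTIL" \
#   --endpoint-url https://s3.amazonaws.com
echo "$RETAIN_UNTIL"
```

When the bucket already has a default retention rule (as configured earlier), the explicit flags are optional; setting them per object makes the retention window independent of the bucket default.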
### 4. Verification & Cleanup
- Verifies upload success
- Removes local encrypted file (optional)
- Logs all operations
## Restoring from Backup
### 1. Download Encrypted Backup
```bash
# Using AWS CLI
aws s3 cp \
  s3://my-bc-backups/backups/bc_backup_Production_20260107_100000.bacpac.gpg \
  ./backup.bacpac.gpg \
  --endpoint-url https://s3.amazonaws.com
```
### 2. Decrypt the Backup
```bash
# Enter your ENCRYPTION_PASSPHRASE when prompted
gpg --decrypt backup.bacpac.gpg > backup.bacpac
```
### 3. Restore to Azure SQL Database
```bash
# Using SqlPackage (download from Microsoft)
sqlpackage /a:Import \
  /sf:backup.bacpac \
  /tsn:your-server.database.windows.net \
  /tdn:RestoredDatabase \
  /tu:admin \
  /tp:password
```
### 4. Connect BC to Restored Database
Contact Microsoft Support to point your BC environment to the restored database.
## Monitoring and Maintenance
### Check Backup Logs
```bash
# View latest backup log
tail -100 logs/backup.log
# Follow live log
tail -f logs/backup.log
# Check for errors
grep ERROR logs/backup.log
```
### List S3 Backups
```bash
# AWS CLI
aws s3 ls s3://my-bc-backups/backups/ --endpoint-url https://s3.amazonaws.com
# s3cmd
s3cmd ls s3://my-bc-backups/backups/
```
### Check Object Lock Status
```bash
aws s3api get-object-retention \
  --bucket my-bc-backups \
  --key backups/bc_backup_Production_20260107_100000.bacpac.gpg \
  --endpoint-url https://s3.amazonaws.com
```
### Verify Cron/Timer Status
```bash
# Cron
crontab -l
grep CRON /var/log/syslog | tail
# Systemd
systemctl status bc-backup.timer
journalctl -u bc-backup.service -n 50
```
## Troubleshooting
### Issue: "Authentication failed"
**Solution**: Verify Azure AD credentials
- Check `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`
- Verify API permissions are granted with admin consent
- Ensure client secret hasn't expired
### Issue: "Database export failed - not authorized"
**Causes**:
- Only Production environments can be exported
- Trial subscriptions don't support exports
- Missing API permissions
**Solution**: Verify environment is Production with paid subscription
### Issue: "Export timeout exceeded"
**Solution**: Increase timeout
```bash
# In bc-backup.conf
MAX_EXPORT_WAIT_MINUTES="180" # 3 hours
```
### Issue: "Object lock not supported"
**Solution**: Recreate bucket with Object Lock
- Object Lock can only be enabled at bucket creation
- Migrate existing backups to new bucket
### Issue: "Upload failed - access denied"
**Solution**: Check S3 credentials and permissions
```bash
# Test AWS CLI configuration
aws s3 ls --endpoint-url https://s3.amazonaws.com
# Verify bucket policy allows PutObject and PutObjectRetention
```
### Issue: "Decryption failed"
**Solution**: Verify encryption passphrase
- Ensure you're using the correct `ENCRYPTION_PASSPHRASE`
- Check for special characters that might need escaping
## Security Best Practices
1. **Protect Configuration File**
- Set proper permissions: `chmod 600 bc-backup.conf`
- Never commit to version control (use `.gitignore`)
2. **Rotate Credentials Regularly**
- Azure AD client secrets (every 6-12 months)
- S3 access keys (annually)
- Encryption passphrase (when staff changes)
3. **Use Separate Service Account**
- Create dedicated Linux user for backups
- Run with minimal permissions
4. **Encryption Key Management**
- Store `ENCRYPTION_PASSPHRASE` in password manager
- Document in secure runbook
- Test decryption regularly
5. **Monitor for Failures**
- Set up log monitoring/alerting
- Test restore process monthly
6. **Network Security**
- Use HTTPS for S3 endpoints
- Consider VPN for sensitive environments
## Limitations
1. **BC API Limits**
- Maximum 10 database exports per month (Microsoft limit)
- This script tracks recent exports to avoid unnecessary duplicates
2. **Export Restrictions**
- Only Production environments
- Only paid subscriptions
- Exports can take 15-60 minutes
3. **Object Lock Immutability**
- Files cannot be deleted until retention expires
- Ensure adequate S3 storage capacity
- Plan for storage costs
4. **Bandwidth**
- Large databases require significant bandwidth
- Consider S3 transfer costs
## Cost Considerations
### S3 Storage Costs (Example: AWS)
For a 50GB database with hourly backups:
- **Storage**: 50 GB × 720 retained backups (30 days) ≈ 36,000 GB; at $0.023/GB-month that is ~$830/month
- **Uploads**: 720 requests × $0.005/1000 = ~$0.004/month
- **Data Transfer Out** (for restores): $0.09/GB
**Recommendation**: Consider daily backups instead of hourly to reduce costs.
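The storage figure above can be reproduced with a quick back-of-envelope calculation:

```shell
DB_GB=50
BACKUPS=720            # hourly backups retained for 30 days
PRICE_PER_GB="0.023"   # example AWS S3 Standard rate, USD per GB-month

TOTAL_GB=$((DB_GB * BACKUPS))    # 36,000 GB held at any one time
COST=$(awk -v gb="$TOTAL_GB" -v p="$PRICE_PER_GB" 'BEGIN { printf "%.0f", gb * p }')
echo "${TOTAL_GB} GB -> ~\$${COST}/month"
```

Rerunning the same arithmetic with daily backups (`BACKUPS=30`) shows the cost dropping by a factor of 24, which is why reducing frequency is the first optimization below.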
### Optimization Strategies
1. **Reduce Frequency**: Daily or every 6 hours instead of hourly
2. **Lifecycle Policies**: Move older backups to cheaper storage tiers
3. **Incremental Backups**: Consider BC's built-in continuous backup for point-in-time recovery
## Support and Contributing
### Getting Help
1. Check logs: `logs/backup.log`
2. Review troubleshooting section above
3. Check BC Admin Center for export status
4. Verify S3 bucket configuration
### Reporting Issues
When reporting issues, include:
- Relevant log excerpts
- BC environment type (Production/Sandbox)
- S3 provider (AWS/MinIO/etc.)
- Error messages
## License
This backup solution is provided as-is without warranty. Use at your own risk.
## References
- [BC Admin Center API Documentation](https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/administration-center-api)
- [BC Data Extraction](https://github.com/microsoft/BCTech/tree/master/samples/ExtractData)
- [AWS S3 Object Lock](https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lock.html)
- [GPG Documentation](https://gnupg.org/documentation/)
## Changelog
### v1.0.0 (2026-01-07)
- Initial release
- Hourly automated backups
- GPG encryption with AES-256
- S3 Object Lock support
- AWS CLI and s3cmd support
- Comprehensive logging