Initial commit: BC backup project

.gitignore (vendored, new file, 34 lines)
# Business Central Backup - Git Ignore File
# Prevents sensitive data from being committed to version control

# Configuration file with secrets
bc-backup.conf

# Log files
logs/
*.log

# Temporary files
temp/
*.bacpac
*.gpg
*.tmp

# Backup downloads
backups/
*.bak

# System files
.DS_Store
Thumbs.db
*~

# IDE files
.vscode/
.idea/
*.swp
*.swo

# Environment variables
.env
.env.local
QUICKSTART.md (new file, 167 lines)
# Quick Start Guide - BC Backup System

## 5-Minute Setup

### 1. Install Dependencies

```bash
./setup.sh
```

### 2. Create Azure AD App

1. Go to [Azure Portal](https://portal.azure.com) → Azure AD → App registrations → New
2. Name: `BC-Backup-Service`
3. Note the **Application ID** and **Tenant ID**
4. Create a **Client Secret** (save it immediately!)
5. Add API permission: **Dynamics 365 Business Central** → **Automation.ReadWrite.All**
6. Click **Grant admin consent**

### 3. Create S3 Bucket with Object Lock

**AWS:**
```bash
aws s3api create-bucket \
  --bucket my-bc-backups \
  --region us-east-1 \
  --object-lock-enabled-for-bucket

aws s3api put-object-lock-configuration \
  --bucket my-bc-backups \
  --object-lock-configuration '{
    "ObjectLockEnabled": "Enabled",
    "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}
  }'
```

**MinIO:**
```bash
mc mb myminio/my-bc-backups --with-lock
mc retention set --default COMPLIANCE "30d" myminio/my-bc-backups
```

### 4. Configure

```bash
nano bc-backup.conf
```

Minimum required:
```bash
AZURE_TENANT_ID="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
AZURE_CLIENT_ID="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
AZURE_CLIENT_SECRET="your-secret-here"
BC_ENVIRONMENT_NAME="Production"
ENCRYPTION_PASSPHRASE="$(openssl rand -base64 32)"  # Generate a strong key
S3_BUCKET="my-bc-backups"
S3_ENDPOINT="https://s3.amazonaws.com"
AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY="your-secret-key"
AWS_DEFAULT_REGION="us-east-1"
```
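A quick sanity check on the generated key: 32 random bytes always encode to exactly 44 Base64 characters, so a truncated or mangled passphrase is easy to spot (assumes `openssl` is installed, as the setup script requires):

```bash
# 32 bytes -> Base64: 10 full 3-byte groups (40 chars) + 2 leftover bytes (3 chars) + 1 '=' pad = 44
pass=$(openssl rand -base64 32)
echo "${#pass}"   # prints 44
```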
**IMPORTANT**: Save your `ENCRYPTION_PASSPHRASE` in a password manager!

### 5. Test Configuration

```bash
./test-config.sh
```

### 6. Test Backup

```bash
./bc-backup.sh
```

Watch the logs:
```bash
tail -f logs/backup.log
```

### 7. Schedule Hourly Backups

```bash
crontab -e
```

Add:
```
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1
```

## Done!

Your backups will now run every hour automatically.

---

## Common Commands

```bash
# View latest backup log
tail -100 logs/backup.log

# List backups in S3
aws s3 ls s3://my-bc-backups/backups/ --endpoint-url https://s3.amazonaws.com

# Test configuration
./test-config.sh

# Decrypt a backup
./decrypt-backup.sh backup.bacpac.gpg

# Check cron jobs
crontab -l

# View cron logs
tail -f logs/cron.log
```

## Restore Process

1. Download the encrypted backup from S3
2. Decrypt it: `./decrypt-backup.sh backup.bacpac.gpg`
3. Import to Azure SQL with SqlPackage
4. Contact Microsoft to connect BC to the restored database

See [README.md](README.md) for detailed instructions.

## Troubleshooting

| Issue | Solution |
|-------|----------|
| Authentication failed | Check Azure AD credentials; verify API permissions are granted |
| Export not authorized | Only Production environments with paid subscriptions can export |
| Object Lock error | The bucket must be created with Object Lock enabled |
| Upload failed | Verify S3 credentials and bucket name |

Full troubleshooting guide in [README.md](README.md).

## Important Notes

- **Encryption passphrase**: Store it securely! Backups can't be decrypted without it
- **API limit**: BC allows at most 10 exports per month (the script reuses recent exports)
- **Export time**: Database exports take 15-60 minutes
- **Immutability**: Files can't be deleted for 30 days (by design)
- **Cost**: Monitor S3 storage costs (hourly backups ≈ 720 files/month)

## File Structure

```
bcbak/
├── bc-backup.sh        # Main script (run this)
├── bc-export.ps1       # BC export logic
├── bc-backup.conf      # Your config (secret!)
├── decrypt-backup.sh   # Decrypt backups
├── test-config.sh      # Validate setup
├── setup.sh            # Install dependencies
├── README.md           # Full documentation
└── logs/               # Backup logs
```

## Need Help?

1. Check `logs/backup.log` for errors
2. Run `./test-config.sh` to validate setup
3. Review the [README.md](README.md) troubleshooting section
README.md (new file, 561 lines)
# Business Central SaaS Automated Backup System

Comprehensive backup solution for Microsoft Dynamics 365 Business Central SaaS that:
- Exports the database via the BC Admin Center API every hour
- Encrypts backups with GPG (AES-256)
- Uploads to S3-compatible object storage
- Enables immutability with 30-day delete prevention
- Maintains a timestamped backup history

## Features

- **Automated Hourly Backups**: Uses cron/systemd to run backups on schedule
- **Secure Encryption**: GPG encryption with the AES-256 cipher
- **Immutable Storage**: S3 Object Lock (WORM) with 30-day retention
- **Multiple S3 Providers**: AWS S3, MinIO, Wasabi, Backblaze B2
- **Comprehensive Logging**: Detailed logs for troubleshooting
- **Error Handling**: Retries and proper error reporting
- **Clean Architecture**: Modular bash + PowerShell scripts

## Architecture

```
┌─────────────────────┐
│    Cron/Systemd     │
│  (Hourly Trigger)   │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    bc-backup.sh     │ ◄─── Main orchestration script (bash)
│  - Orchestrates     │
│  - Encryption       │
│  - S3 Upload        │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    bc-export.ps1    │ ◄─── BC export logic (PowerShell)
│  - Azure AD auth    │
│  - API calls        │
│  - Download BACPAC  │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│    BC Admin API     │
│    (Microsoft)      │
└─────────────────────┘
```
## Prerequisites

### 1. System Requirements

- **Linux server** (Ubuntu 20.04+, Debian 10+, CentOS 7+, or similar)
- **PowerShell 7+** (installed automatically by the setup script)
- **GPG** (for encryption)
- **AWS CLI v2** or **s3cmd** (for S3 uploads)
- **Root or sudo access** (for initial setup)

### 2. Business Central Requirements

- Active Business Central SaaS subscription (paid, not trial)
- Production environment (exports are only available from Production)
- Admin access to the BC Admin Center

### 3. Azure AD App Registration

You need an Azure AD application with API permissions to access the BC Admin Center API.

### 4. S3-Compatible Storage

- S3 bucket with **Object Lock enabled** (immutability)
- Access credentials (Access Key ID + Secret Access Key)

## Quick Start

### 1. Download and Setup

```bash
cd /home/malin/c0ding/bcbak
chmod +x setup.sh
./setup.sh
```

The setup script will:
- Check and install dependencies (PowerShell, GPG, AWS CLI)
- Create the directory structure
- Copy the configuration template
- Set proper permissions
### 2. Create Azure AD App Registration

#### Step-by-Step:

1. Navigate to [Azure Portal](https://portal.azure.com)
2. Go to **Azure Active Directory** > **App registrations** > **New registration**
3. Configure:
   - **Name**: `BC-Backup-Service`
   - **Supported account types**: Accounts in this organizational directory only
   - **Redirect URI**: Leave empty
4. Click **Register**
5. Note the following from the Overview page:
   - **Application (client) ID**
   - **Directory (tenant) ID**

#### Create Client Secret:

1. Go to **Certificates & secrets** > **New client secret**
2. **Description**: `BC Backup Key`
3. **Expires**: Choose an appropriate duration (6 months, 1 year, etc.)
4. Click **Add**
5. **Copy the secret value immediately** (it is shown only once!)

#### Add API Permissions:

1. Go to **API permissions** > **Add a permission**
2. Select **Dynamics 365 Business Central**
3. Choose **Application permissions** (not Delegated)
4. Select `Automation.ReadWrite.All` or `API.ReadWrite.All`
5. Click **Add permissions**
6. **Important**: Click **Grant admin consent for [Your Organization]**
   - Requires the Global Administrator role

### 3. Configure S3 Bucket with Object Lock

#### AWS S3 Example:

```bash
# Create the bucket with Object Lock enabled
aws s3api create-bucket \
  --bucket my-bc-backups \
  --region us-east-1 \
  --object-lock-enabled-for-bucket

# Configure default retention (30 days, COMPLIANCE mode)
aws s3api put-object-lock-configuration \
  --bucket my-bc-backups \
  --object-lock-configuration '{
    "ObjectLockEnabled": "Enabled",
    "Rule": {
      "DefaultRetention": {
        "Mode": "COMPLIANCE",
        "Days": 30
      }
    }
  }'
```

#### MinIO Example:

```bash
# Create the bucket
mc mb myminio/my-bc-backups --with-lock

# Set retention
mc retention set --default COMPLIANCE "30d" myminio/my-bc-backups
```

**Important**: Object Lock can **only be enabled when creating a bucket**. You cannot add it to existing buckets.
### 4. Configure the Backup System

Edit the configuration file:

```bash
nano bc-backup.conf
```

Fill in the required values:

```bash
# Azure AD Configuration
AZURE_TENANT_ID="your-tenant-id-here"
AZURE_CLIENT_ID="your-client-id-here"
AZURE_CLIENT_SECRET="your-client-secret-here"

# Business Central
BC_ENVIRONMENT_NAME="Production"

# Generate a strong encryption passphrase
ENCRYPTION_PASSPHRASE="$(openssl rand -base64 32)"

# S3 Configuration
S3_BUCKET="my-bc-backups"
S3_ENDPOINT="https://s3.amazonaws.com"  # or your S3 provider
AWS_ACCESS_KEY_ID="your-access-key"
AWS_SECRET_ACCESS_KEY="your-secret-key"
AWS_DEFAULT_REGION="us-east-1"

# Backup settings
RETENTION_DAYS="30"
S3_TOOL="awscli"
CLEANUP_LOCAL="true"
```

**Security**: Store your `ENCRYPTION_PASSPHRASE` securely in a password manager!

### 5. Test Manual Backup

Run a test backup:

```bash
./bc-backup.sh
```

Monitor the logs:

```bash
tail -f logs/backup.log
```

Expected output:
```
[2026-01-07 10:00:00] =========================================
[2026-01-07 10:00:00] Starting Business Central backup process
[2026-01-07 10:00:00] =========================================
[2026-01-07 10:00:00] Environment: Production
[2026-01-07 10:00:00] S3 Bucket: my-bc-backups
[2026-01-07 10:00:00] Retention: 30 days
[2026-01-07 10:00:01] Step 1: Initiating database export via BC Admin Center API
...
```
### 6. Set Up Automated Hourly Backups

#### Option A: Using Cron (Simpler)

```bash
crontab -e
```

Add this line for hourly backups:
```
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1
```

See `cron-examples.txt` for more scheduling options.

#### Option B: Using Systemd Timers (More Reliable)

Create the service file:
```bash
sudo nano /etc/systemd/system/bc-backup.service
```

```ini
[Unit]
Description=Business Central Database Backup

[Service]
Type=oneshot
User=malin
WorkingDirectory=/home/malin/c0ding/bcbak
ExecStart=/home/malin/c0ding/bcbak/bc-backup.sh
StandardOutput=append:/home/malin/c0ding/bcbak/logs/backup.log
StandardError=append:/home/malin/c0ding/bcbak/logs/backup.log
```

Create the timer file:
```bash
sudo nano /etc/systemd/system/bc-backup.timer
```

```ini
[Unit]
Description=Run BC Backup Every Hour

[Timer]
OnCalendar=hourly
Persistent=true

[Install]
WantedBy=timers.target
```

Enable and start it:
```bash
sudo systemctl daemon-reload
sudo systemctl enable bc-backup.timer
sudo systemctl start bc-backup.timer
sudo systemctl status bc-backup.timer
```
## File Structure

```
bcbak/
├── bc-backup.sh              # Main orchestration script
├── bc-export.ps1             # PowerShell BC export logic
├── bc-backup.conf            # Your configuration (gitignored)
├── bc-backup.conf.template   # Configuration template
├── setup.sh                  # Installation script
├── cron-examples.txt         # Cron scheduling examples
├── README.md                 # This file
├── logs/                     # Backup logs
│   ├── backup.log
│   └── cron.log
└── temp/                     # Temporary files (auto-cleaned)
    └── bc_backup_*.bacpac
```
## How It Works

### 1. Database Export (bc-export.ps1)

- Authenticates to Azure AD using client credentials (OAuth 2.0)
- Calls the BC Admin Center API to initiate a database export
- Polls for completion (exports can take 15-60 minutes)
- Downloads the BACPAC file to a local temp directory
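The authentication step can be sketched in shell. The token endpoint and `.default` scope are the standard Azure AD client-credentials values; the helpers and the commented `curl`/`jq` call are illustrative only — the real script does this in PowerShell:

```bash
# Azure AD token endpoint for a given tenant
token_endpoint() {
  printf 'https://login.microsoftonline.com/%s/oauth2/v2.0/token' "$1"
}

# Form body for the client-credentials grant; the BC Admin Center API is
# covered by the api.businesscentral.dynamics.com/.default scope
token_body() {
  printf 'grant_type=client_credentials&client_id=%s&client_secret=%s&scope=https://api.businesscentral.dynamics.com/.default' "$1" "$2"
}

# Live call (commented out - requires network access and jq):
# TOKEN=$(curl -s -X POST "$(token_endpoint "$AZURE_TENANT_ID")" \
#   -d "$(token_body "$AZURE_CLIENT_ID" "$AZURE_CLIENT_SECRET")" | jq -r .access_token)
```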
### 2. Encryption (bc-backup.sh)

- Uses GPG with AES-256 symmetric encryption
- Encrypts the BACPAC file with your passphrase
- The original unencrypted file is deleted

### 3. Upload to S3

- Uploads the encrypted file with a timestamp in the filename
- Format: `backups/bc_backup_Production_20260107_100000.bacpac.gpg`
- Sets Object Lock retention (COMPLIANCE mode, 30 days)
- Files are **immutable** and **cannot be deleted** until retention expires
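The key format and retention stamp described above can be reproduced with two small helpers (GNU `date` assumed, matching the Linux branch of the script):

```bash
# Timestamped object key, matching the format bc-backup.sh uploads
s3_key_for() {  # $1 = environment, $2 = timestamp (YYYYmmdd_HHMMSS)
  printf 'backups/bc_backup_%s_%s.bacpac.gpg' "$1" "$2"
}

# Retain-until stamp: start time plus N days, as ISO 8601 UTC (GNU date)
retain_until() {  # $1 = start datetime, $2 = days
  date -u -d "$1 + $2 days" '+%Y-%m-%dT%H:%M:%SZ'
}
```

For example, `retain_until '2026-01-07 10:00:00 UTC' 30` yields `2026-02-06T10:00:00Z`, the value passed to `--object-lock-retain-until-date`.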
### 4. Verification & Cleanup

- Verifies upload success
- Removes the local encrypted file (optional)
- Logs all operations
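One inexpensive verification is comparing the local byte count with the `ContentLength` S3 reports for the uploaded object. This is a sketch of that idea, not necessarily how `bc-backup.sh` verifies (GNU `stat` assumed):

```bash
# Local byte count (GNU stat)
local_size() { stat -c '%s' "$1"; }

# True when the local file matches the size S3 reported for the object
sizes_match() { [ "$(local_size "$1")" -eq "$2" ]; }

# Live check (commented out - requires network access):
# remote=$(aws s3api head-object --bucket "$S3_BUCKET" --key "$S3_KEY" \
#   --endpoint-url "$S3_ENDPOINT" --query ContentLength --output text)
# sizes_match "$ENCRYPTED_FILE" "$remote" || log_error "Upload size mismatch"
```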
## Restoring from Backup

### 1. Download the Encrypted Backup

```bash
# Using the AWS CLI
aws s3 cp \
  s3://my-bc-backups/backups/bc_backup_Production_20260107_100000.bacpac.gpg \
  ./backup.bacpac.gpg \
  --endpoint-url https://s3.amazonaws.com
```

### 2. Decrypt the Backup

```bash
# Enter your ENCRYPTION_PASSPHRASE when prompted
gpg --decrypt backup.bacpac.gpg > backup.bacpac
```

### 3. Restore to Azure SQL Database

```bash
# Using SqlPackage (download from Microsoft)
sqlpackage /a:Import \
  /sf:backup.bacpac \
  /tsn:your-server.database.windows.net \
  /tdn:RestoredDatabase \
  /tu:admin \
  /tp:password
```

### 4. Connect BC to the Restored Database

Contact Microsoft Support to point your BC environment to the restored database.

## Monitoring and Maintenance

### Check Backup Logs

```bash
# View the latest backup log
tail -100 logs/backup.log

# Follow the live log
tail -f logs/backup.log

# Check for errors
grep ERROR logs/backup.log
```

### List S3 Backups

```bash
# AWS CLI
aws s3 ls s3://my-bc-backups/backups/ --endpoint-url https://s3.amazonaws.com

# s3cmd
s3cmd ls s3://my-bc-backups/backups/
```

### Check Object Lock Status

```bash
aws s3api get-object-retention \
  --bucket my-bc-backups \
  --key backups/bc_backup_Production_20260107_100000.bacpac.gpg \
  --endpoint-url https://s3.amazonaws.com
```

### Verify Cron/Timer Status

```bash
# Cron
crontab -l
grep CRON /var/log/syslog | tail

# Systemd
systemctl status bc-backup.timer
journalctl -u bc-backup.service -n 50
```
## Troubleshooting

### Issue: "Authentication failed"

**Solution**: Verify the Azure AD credentials
- Check `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`
- Verify API permissions are granted with admin consent
- Ensure the client secret hasn't expired

### Issue: "Database export failed - not authorized"

**Causes**:
- Only Production environments can be exported
- Trial subscriptions don't support exports
- Missing API permissions

**Solution**: Verify the environment is Production with a paid subscription

### Issue: "Export timeout exceeded"

**Solution**: Increase the timeout
```bash
# In bc-backup.conf
MAX_EXPORT_WAIT_MINUTES="180"  # 3 hours
```

### Issue: "Object lock not supported"

**Solution**: Recreate the bucket with Object Lock
- Object Lock can only be enabled at bucket creation
- Migrate existing backups to the new bucket

### Issue: "Upload failed - access denied"

**Solution**: Check the S3 credentials and permissions
```bash
# Test the AWS CLI configuration
aws s3 ls --endpoint-url https://s3.amazonaws.com

# Verify the bucket policy allows PutObject and PutObjectRetention
```

### Issue: "Decryption failed"

**Solution**: Verify the encryption passphrase
- Ensure you're using the correct `ENCRYPTION_PASSPHRASE`
- Check for special characters that might need escaping

## Security Best Practices

1. **Protect the Configuration File**
   - Set proper permissions: `chmod 600 bc-backup.conf`
   - Never commit it to version control (use `.gitignore`)

2. **Rotate Credentials Regularly**
   - Azure AD client secrets (every 6-12 months)
   - S3 access keys (annually)
   - Encryption passphrase (when staff changes)

3. **Use a Separate Service Account**
   - Create a dedicated Linux user for backups
   - Run with minimal permissions

4. **Encryption Key Management**
   - Store `ENCRYPTION_PASSPHRASE` in a password manager
   - Document it in a secure runbook
   - Test decryption regularly

5. **Monitor for Failures**
   - Set up log monitoring/alerting
   - Test the restore process monthly

6. **Network Security**
   - Use HTTPS for S3 endpoints
   - Consider a VPN for sensitive environments
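The `chmod 600` advice from item 1 can also be enforced at startup. A small guard like this (GNU `stat` assumed; a hypothetical helper, not part of the shipped script) refuses configs that are group- or world-readable:

```bash
# Accept the config only when it is readable by its owner alone.
# Hypothetical guard - the shipped bc-backup.sh does not include this check.
check_config_perms() {
  local mode
  mode=$(stat -c '%a' "$1")
  [ "$mode" = "600" ] || [ "$mode" = "400" ]
}

# Example use at the top of a script:
# check_config_perms "${CONFIG_FILE}" || { echo "Run: chmod 600 ${CONFIG_FILE}" >&2; exit 1; }
```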
## Limitations

1. **BC API Limits**
   - Maximum of 10 database exports per month (Microsoft limit)
   - The script tracks recent exports to avoid unnecessary duplicates

2. **Export Restrictions**
   - Only Production environments
   - Only paid subscriptions
   - Exports can take 15-60 minutes

3. **Object Lock Immutability**
   - Files cannot be deleted until retention expires
   - Ensure adequate S3 storage capacity
   - Plan for storage costs

4. **Bandwidth**
   - Large databases require significant bandwidth
   - Consider S3 transfer costs
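The export-tracking idea from item 1 can be sketched with a one-date-per-line state file; the file layout here is hypothetical and may differ from the script's actual bookkeeping:

```bash
# Count exports already recorded this month in a one-date-per-line state file.
exports_this_month() {  # $1 = state file, $2 = month prefix (YYYY-MM)
  [ -f "$1" ] || { echo 0; return; }
  grep -c "^$2" "$1" || true   # grep -c prints 0 when nothing matches
}

# A caller could then skip the API call once the 10-export budget is spent:
# [ "$(exports_this_month exports.state 2026-01)" -lt 10 ] || reuse_latest_backup
```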
## Cost Considerations

### S3 Storage Costs (Example: AWS)

For a 50GB database with hourly backups:
- **Storage**: 50GB × 720 backups (30 days) = 36TB; at $0.023/GB-month ≈ $830/month
- **Uploads**: 720 requests × $0.005/1000 ≈ $0.004/month
- **Data Transfer Out** (for restores): $0.09/GB
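The storage figure is plain arithmetic; a one-liner helper makes it easy to recompute for other database sizes or retention windows:

```bash
# size_gb x retained_backups x price_per_gb_month, rounded to whole dollars
monthly_storage_cost() {  # $1 = DB size (GB), $2 = retained backups, $3 = $/GB-month
  awk -v gb="$1" -v n="$2" -v p="$3" 'BEGIN { printf "%.0f\n", gb * n * p }'
}
```

`monthly_storage_cost 50 720 0.023` prints `828` — the ≈$830/month estimate above; halving the backup frequency halves the figure.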
**Recommendation**: Consider daily backups instead of hourly to reduce costs.

### Optimization Strategies

1. **Reduce Frequency**: Daily or every 6 hours instead of hourly
2. **Lifecycle Policies**: Move older backups to cheaper storage tiers
3. **Incremental Backups**: Consider BC's built-in continuous backup for point-in-time recovery

## Support and Contributing

### Getting Help

1. Check the logs: `logs/backup.log`
2. Review the troubleshooting section above
3. Check the BC Admin Center for export status
4. Verify the S3 bucket configuration

### Reporting Issues

When reporting issues, include:
- Relevant log excerpts
- BC environment type (Production/Sandbox)
- S3 provider (AWS/MinIO/etc.)
- Error messages

## License

This backup solution is provided as-is without warranty. Use at your own risk.

## References

- [BC Admin Center API Documentation](https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/administration-center-api)
- [BC Data Extraction](https://github.com/microsoft/BCTech/tree/master/samples/ExtractData)
- [AWS S3 Object Lock](https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lock.html)
- [GPG Documentation](https://gnupg.org/documentation/)

## Changelog

### v1.0.0 (2026-01-07)
- Initial release
- Hourly automated backups
- GPG encryption with AES-256
- S3 Object Lock support
- AWS CLI and s3cmd support
- Comprehensive logging
bc-backup.conf.template (new file, 118 lines)
# Business Central SaaS Backup Configuration
# Copy this file to bc-backup.conf and fill in your values
# IMPORTANT: Keep this file secure! It contains sensitive credentials.

# ===================================
# Azure AD Application Configuration
# ===================================
# Create an Azure AD App Registration with the following:
# 1. Navigate to https://portal.azure.com
# 2. Go to Azure Active Directory > App registrations > New registration
# 3. Name: "BC-Backup-Service" (or your preferred name)
# 4. Supported account types: "Accounts in this organizational directory only"
# 5. After creation, note the following:

# Your Azure AD Tenant ID (Directory ID)
AZURE_TENANT_ID=""

# Application (client) ID from the app registration
AZURE_CLIENT_ID=""

# Client secret (create under Certificates & secrets > New client secret)
# IMPORTANT: Save this immediately - it won't be shown again!
AZURE_CLIENT_SECRET=""

# ===================================
# Azure AD API Permissions
# ===================================
# Add the following API permission to your app:
# 1. Go to API permissions > Add a permission
# 2. Select "Dynamics 365 Business Central"
# 3. Select "Application permissions"
# 4. Check "Automation.ReadWrite.All" or "API.ReadWrite.All"
# 5. Click "Grant admin consent" (requires Global Admin)

# ===================================
# Business Central Configuration
# ===================================

# Your BC environment name (e.g., "Production", "Sandbox")
# Find this in BC Admin Center: https://businesscentral.dynamics.com/
BC_ENVIRONMENT_NAME=""

# BC Admin API version (default: v2.15, adjust if needed)
BC_API_VERSION="v2.15"

# ===================================
# Encryption Configuration
# ===================================

# Strong passphrase for GPG encryption
# Generate a secure passphrase: openssl rand -base64 32
# IMPORTANT: Store this securely! You'll need it to decrypt backups.
ENCRYPTION_PASSPHRASE=""

# Alternative: Use a GPG key ID instead of a passphrase (leave empty to use the passphrase)
# GPG_KEY_ID=""

# ===================================
# S3 Storage Configuration
# ===================================

# S3 bucket name (must already exist with Object Lock enabled)
S3_BUCKET=""

# S3 endpoint URL
# AWS S3:    https://s3.amazonaws.com or https://s3.REGION.amazonaws.com
# MinIO:     http://minio.example.com:9000 or https://minio.example.com
# Wasabi:    https://s3.wasabisys.com or https://s3.REGION.wasabisys.com
# Backblaze: https://s3.REGION.backblazeb2.com
S3_ENDPOINT=""

# AWS Access Key ID (or compatible credentials)
AWS_ACCESS_KEY_ID=""

# AWS Secret Access Key (or compatible credentials)
AWS_SECRET_ACCESS_KEY=""

# S3 region (required for AWS; may be optional for other providers)
AWS_DEFAULT_REGION="us-east-1"

# S3 tool to use: "awscli" (recommended) or "s3cmd"
S3_TOOL="awscli"

# ===================================
# Backup Configuration
# ===================================

# Object Lock retention period in days (must match or exceed the bucket minimum)
RETENTION_DAYS="30"

# Maximum retry attempts for failed operations
MAX_RETRIES="3"

# Clean up local files after successful upload? (true/false)
CLEANUP_LOCAL="true"

# ===================================
# Optional: Email Notifications
# ===================================

# Enable email notifications on failure? (true/false)
# ENABLE_EMAIL_NOTIFICATIONS="false"

# Email address to send notifications to
# NOTIFICATION_EMAIL=""

# ===================================
# Advanced Configuration
# ===================================

# Maximum time to wait for BC export completion (minutes)
# MAX_EXPORT_WAIT_MINUTES="120"

# Local temporary directory (default: ./temp)
# WORK_DIR="/var/tmp/bc-backup"

# Log directory (default: ./logs)
# LOG_DIR="/var/log/bc-backup"
bc-backup.sh (executable, new file, 237 lines)
|
||||
#!/bin/bash
|
||||
#
|
||||
# Business Central SaaS Automated Backup Script
|
||||
# Downloads BC database export, encrypts, and uploads to S3 with immutability
|
||||
#
|
||||
|
||||
set -euo pipefail
|
||||
|
||||
# Script directory
|
||||
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||
CONFIG_FILE="${SCRIPT_DIR}/bc-backup.conf"
|
||||
LOG_DIR="${SCRIPT_DIR}/logs"
|
||||
WORK_DIR="${SCRIPT_DIR}/temp"
|
||||
|
||||
# Ensure log directory exists
|
||||
mkdir -p "${LOG_DIR}"
|
||||
mkdir -p "${WORK_DIR}"
|
||||
|
||||
# Logging function
|
||||
log() {
|
||||
echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" | tee -a "${LOG_DIR}/backup.log"
|
||||
}
|
||||
|
||||
log_error() {
|
||||
echo "[$(date '+%Y-%m-%d %H:%M:%S')] ERROR: $*" | tee -a "${LOG_DIR}/backup.log" >&2
|
||||
}
|
||||
|
||||
# Load configuration
|
||||
if [[ ! -f "${CONFIG_FILE}" ]]; then
|
||||
log_error "Configuration file not found: ${CONFIG_FILE}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
source "${CONFIG_FILE}"
|
||||
|
||||
# Validate required configuration
|
||||
required_vars=(
|
||||
"AZURE_TENANT_ID"
|
||||
"AZURE_CLIENT_ID"
|
||||
"AZURE_CLIENT_SECRET"
|
||||
"BC_ENVIRONMENT_NAME"
|
||||
"ENCRYPTION_PASSPHRASE"
|
||||
"S3_BUCKET"
|
||||
"S3_ENDPOINT"
|
||||
"AWS_ACCESS_KEY_ID"
|
||||
"AWS_SECRET_ACCESS_KEY"
|
||||
)
|
||||
|
||||
for var in "${required_vars[@]}"; do
|
||||
if [[ -z "${!var:-}" ]]; then
|
||||
log_error "Required configuration variable not set: ${var}"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Set defaults
|
||||
RETENTION_DAYS="${RETENTION_DAYS:-30}"
|
||||
S3_TOOL="${S3_TOOL:-awscli}"
|
||||
MAX_RETRIES="${MAX_RETRIES:-3}"
|
||||
CLEANUP_LOCAL="${CLEANUP_LOCAL:-true}"
BC_API_VERSION="${BC_API_VERSION:-v2.15}"

log "========================================="
log "Starting Business Central backup process"
log "========================================="
log "Environment: ${BC_ENVIRONMENT_NAME}"
log "S3 Bucket: ${S3_BUCKET}"
log "Retention: ${RETENTION_DAYS} days"

# Generate timestamp for backup filename
TIMESTAMP=$(date '+%Y%m%d_%H%M%S')
BACKUP_FILENAME="bc_backup_${BC_ENVIRONMENT_NAME}_${TIMESTAMP}"

# Step 1: Export database using PowerShell script
log "Step 1: Initiating database export via BC Admin Center API"

export AZURE_TENANT_ID
export AZURE_CLIENT_ID
export AZURE_CLIENT_SECRET
export BC_ENVIRONMENT_NAME
export BC_API_VERSION
export WORK_DIR

BACPAC_FILE="${WORK_DIR}/${BACKUP_FILENAME}.bacpac"

if ! pwsh -File "${SCRIPT_DIR}/bc-export.ps1" -OutputPath "${BACPAC_FILE}"; then
    log_error "Database export failed"
    exit 1
fi

if [[ ! -f "${BACPAC_FILE}" ]]; then
    log_error "BACPAC file not found after export: ${BACPAC_FILE}"
    exit 1
fi

BACPAC_SIZE=$(du -h "${BACPAC_FILE}" | cut -f1)
log "Database export completed successfully (${BACPAC_SIZE})"

# Step 2: Encrypt the backup
log "Step 2: Encrypting backup file with GPG"

ENCRYPTED_FILE="${BACPAC_FILE}.gpg"

if ! echo "${ENCRYPTION_PASSPHRASE}" | gpg \
    --batch \
    --yes \
    --passphrase-fd 0 \
    --symmetric \
    --cipher-algo AES256 \
    --compress-algo none \
    --output "${ENCRYPTED_FILE}" \
    "${BACPAC_FILE}"; then
    log_error "Encryption failed"
    exit 1
fi

ENCRYPTED_SIZE=$(du -h "${ENCRYPTED_FILE}" | cut -f1)
log "Encryption completed successfully (${ENCRYPTED_SIZE})"

# Remove unencrypted BACPAC
if [[ "${CLEANUP_LOCAL}" == "true" ]]; then
    rm -f "${BACPAC_FILE}"
    log "Removed unencrypted BACPAC file"
fi

# Step 3: Upload to S3 with object lock
log "Step 3: Uploading encrypted backup to S3"

S3_KEY="backups/${BACKUP_FILENAME}.bacpac.gpg"
S3_URI="s3://${S3_BUCKET}/${S3_KEY}"

# Calculate retention date
if [[ "$OSTYPE" == "darwin"* ]]; then
    # macOS date command
    RETENTION_DATE=$(date -u -v+${RETENTION_DAYS}d '+%Y-%m-%dT%H:%M:%S')
else
    # Linux date command
    RETENTION_DATE=$(date -u -d "+${RETENTION_DAYS} days" '+%Y-%m-%dT%H:%M:%S')
fi

upload_success=false

if [[ "${S3_TOOL}" == "awscli" ]]; then
    log "Using AWS CLI for upload"

    # Upload with object lock retention
    if aws s3api put-object \
        --bucket "${S3_BUCKET}" \
        --key "${S3_KEY}" \
        --body "${ENCRYPTED_FILE}" \
        --endpoint-url "${S3_ENDPOINT}" \
        --object-lock-mode COMPLIANCE \
        --object-lock-retain-until-date "${RETENTION_DATE}Z" \
        --metadata "backup-timestamp=${TIMESTAMP},environment=${BC_ENVIRONMENT_NAME},encrypted=true"; then
        upload_success=true
    fi

elif [[ "${S3_TOOL}" == "s3cmd" ]]; then
    log "Using s3cmd for upload"

    # Upload file first
    if s3cmd put \
        --host="${S3_ENDPOINT#*://}" \
        --host-bucket="${S3_ENDPOINT#*://}" \
        "${ENCRYPTED_FILE}" \
        "${S3_URI}"; then

        log "File uploaded, attempting to set object lock retention"
        # Note: s3cmd may not support object lock natively
        # Fallback to aws cli for setting retention if available
        if command -v aws &> /dev/null; then
            aws s3api put-object-retention \
                --bucket "${S3_BUCKET}" \
                --key "${S3_KEY}" \
                --endpoint-url "${S3_ENDPOINT}" \
                --retention Mode=COMPLIANCE,RetainUntilDate="${RETENTION_DATE}Z" || \
                log_error "Warning: Could not set object lock retention via AWS CLI"
        else
            log_error "Warning: s3cmd doesn't support object lock. Install aws-cli for full functionality"
        fi
        upload_success=true
    fi
else
    log_error "Invalid S3_TOOL: ${S3_TOOL}. Must be 'awscli' or 's3cmd'"
    exit 1
fi

if [[ "${upload_success}" == "true" ]]; then
    log "Upload completed successfully: ${S3_URI}"
    log "Object lock retention until: ${RETENTION_DATE}Z"
else
    log_error "Upload failed"
    exit 1
fi

# Step 4: Verify upload
log "Step 4: Verifying upload"

if [[ "${S3_TOOL}" == "awscli" ]]; then
    if aws s3api head-object \
        --bucket "${S3_BUCKET}" \
        --key "${S3_KEY}" \
        --endpoint-url "${S3_ENDPOINT}" > /dev/null 2>&1; then
        log "Upload verification successful"
    else
        log_error "Upload verification failed"
        exit 1
    fi
elif [[ "${S3_TOOL}" == "s3cmd" ]]; then
    if s3cmd info "${S3_URI}" --host="${S3_ENDPOINT#*://}" > /dev/null 2>&1; then
        log "Upload verification successful"
    else
        log_error "Upload verification failed"
        exit 1
    fi
fi

# Step 5: Cleanup
if [[ "${CLEANUP_LOCAL}" == "true" ]]; then
    log "Step 5: Cleaning up local files"
    rm -f "${ENCRYPTED_FILE}"
    log "Local encrypted file removed"
else
    log "Step 5: Skipping cleanup (CLEANUP_LOCAL=false)"
    log "Encrypted backup retained at: ${ENCRYPTED_FILE}"
fi

# Log rotation - keep last 30 days of logs
find "${LOG_DIR}" -name "backup.log.*" -mtime +30 -delete 2>/dev/null || true

log "========================================="
log "Backup completed successfully"
log "Backup file: ${S3_KEY}"
log "Size: ${ENCRYPTED_SIZE}"
log "========================================="

exit 0
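The Step 3 retention-date calculation in bc-backup.sh can be exercised on its own; a minimal sketch, where `retention_date` is an illustrative name and the trailing `Z` marks UTC as S3 Object Lock's retain-until-date expects:

```bash
#!/usr/bin/env bash
# Portable "now + N days" in ISO-8601 UTC, mirroring the script's $OSTYPE branch.
retention_date() {
    local days=$1
    if [[ "$OSTYPE" == "darwin"* ]]; then
        date -u -v+"${days}"d '+%Y-%m-%dT%H:%M:%S'       # BSD date (macOS)
    else
        date -u -d "+${days} days" '+%Y-%m-%dT%H:%M:%S'  # GNU date (Linux)
    fi
}

# The backup script appends "Z" before passing this to --object-lock-retain-until-date.
echo "$(retention_date 30)Z"
```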
295
bc-export.ps1
Executable file
@@ -0,0 +1,295 @@
#!/usr/bin/env pwsh
#
# Business Central Database Export via Admin Center API
# Authenticates to Azure AD and exports BC database as BACPAC
#

param(
    [Parameter(Mandatory=$true)]
    [string]$OutputPath
)

# Get configuration from environment variables
$tenantId = $env:AZURE_TENANT_ID
$clientId = $env:AZURE_CLIENT_ID
$clientSecret = $env:AZURE_CLIENT_SECRET
$environmentName = $env:BC_ENVIRONMENT_NAME
$apiVersion = $env:BC_API_VERSION

if (-not $apiVersion) {
    $apiVersion = "v2.15"
}

$baseUrl = "https://api.businesscentral.dynamics.com/admin/$apiVersion"

function Write-Log {
    param([string]$Message, [string]$Level = "INFO")
    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    Write-Host "[$timestamp] [$Level] $Message"
}

function Get-AzureADToken {
    param(
        [string]$TenantId,
        [string]$ClientId,
        [string]$ClientSecret
    )

    Write-Log "Authenticating to Azure AD..."

    $tokenUrl = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"

    $body = @{
        client_id = $ClientId
        client_secret = $ClientSecret
        scope = "https://api.businesscentral.dynamics.com/.default"
        grant_type = "client_credentials"
    }

    try {
        $response = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"
        Write-Log "Successfully authenticated to Azure AD"
        return $response.access_token
    }
    catch {
        Write-Log "Failed to authenticate: $_" "ERROR"
        throw
    }
}

function Start-DatabaseExport {
    param(
        [string]$Token,
        [string]$EnvironmentName
    )

    Write-Log "Initiating database export for environment: $EnvironmentName"

    $headers = @{
        "Authorization" = "Bearer $Token"
        "Content-Type" = "application/json"
    }

    $exportUrl = "$baseUrl/applications/businesscentral/environments/$EnvironmentName/databaseExports"

    try {
        $response = Invoke-RestMethod -Uri $exportUrl -Method Post -Headers $headers
        Write-Log "Database export initiated successfully"
        return $response
    }
    catch {
        Write-Log "Failed to initiate export: $_" "ERROR"
        Write-Log "Response: $($_.ErrorDetails.Message)" "ERROR"
        throw
    }
}

function Get-ExportStatus {
    param(
        [string]$Token,
        [string]$EnvironmentName
    )

    $headers = @{
        "Authorization" = "Bearer $Token"
    }

    $statusUrl = "$baseUrl/applications/businesscentral/environments/$EnvironmentName/databaseExports"

    try {
        $response = Invoke-RestMethod -Uri $statusUrl -Method Get -Headers $headers
        return $response
    }
    catch {
        Write-Log "Failed to get export status: $_" "ERROR"
        return $null
    }
}

function Wait-ForExport {
    param(
        [string]$Token,
        [string]$EnvironmentName,
        [int]$MaxWaitMinutes = 120
    )

    Write-Log "Waiting for export to complete (max $MaxWaitMinutes minutes)..."

    $startTime = Get-Date
    $pollInterval = 30 # seconds

    while ($true) {
        $elapsed = ((Get-Date) - $startTime).TotalMinutes

        if ($elapsed -gt $MaxWaitMinutes) {
            Write-Log "Export timeout exceeded ($MaxWaitMinutes minutes)" "ERROR"
            return $null
        }

        $status = Get-ExportStatus -Token $Token -EnvironmentName $EnvironmentName

        if ($null -eq $status -or $status.value.Count -eq 0) {
            Write-Log "No export found, waiting..." "WARN"
            Start-Sleep -Seconds $pollInterval
            continue
        }

        # Get the most recent export
        $latestExport = $status.value | Sort-Object -Property createdOn -Descending | Select-Object -First 1

        $exportStatus = $latestExport.status

        Write-Log "Export status: $exportStatus (Elapsed: $([math]::Round($elapsed, 1)) min)"

        switch ($exportStatus) {
            "complete" {
                Write-Log "Export completed successfully"
                return $latestExport
            }
            "failed" {
                Write-Log "Export failed" "ERROR"
                if ($latestExport.failureReason) {
                    Write-Log "Failure reason: $($latestExport.failureReason)" "ERROR"
                }
                return $null
            }
            "inProgress" {
                Write-Log "Export in progress..."
            }
            "queued" {
                Write-Log "Export queued..."
            }
            default {
                Write-Log "Unknown status: $exportStatus" "WARN"
            }
        }

        Start-Sleep -Seconds $pollInterval
    }
}

function Download-Export {
    param(
        [object]$Export,
        [string]$OutputPath
    )

    if (-not $Export.blobUri) {
        Write-Log "No download URI available in export object" "ERROR"
        return $false
    }

    $downloadUri = $Export.blobUri

    Write-Log "Downloading BACPAC from: $downloadUri"
    Write-Log "Saving to: $OutputPath"

    try {
        # Use Invoke-WebRequest for better progress tracking
        $ProgressPreference = 'SilentlyContinue' # Disable progress bar for better performance
        Invoke-WebRequest -Uri $downloadUri -OutFile $OutputPath -UseBasicParsing

        if (Test-Path $OutputPath) {
            $fileSize = (Get-Item $OutputPath).Length
            $fileSizeMB = [math]::Round($fileSize / 1MB, 2)
            Write-Log "Download completed successfully ($fileSizeMB MB)"
            return $true
        }
        else {
            Write-Log "Download failed - file not found" "ERROR"
            return $false
        }
    }
    catch {
        Write-Log "Download failed: $_" "ERROR"
        return $false
    }
}

# Main execution
try {
    Write-Log "========================================="
    Write-Log "BC Database Export Script"
    Write-Log "========================================="
    Write-Log "Environment: $environmentName"
    Write-Log "API Version: $apiVersion"
    Write-Log "Output Path: $OutputPath"

    # Step 1: Get Azure AD token
    $token = Get-AzureADToken -TenantId $tenantId -ClientId $clientId -ClientSecret $clientSecret

    # Step 2: Check for existing in-progress exports
    Write-Log "Checking for existing exports..."
    $existingStatus = Get-ExportStatus -Token $token -EnvironmentName $environmentName

    $activeExport = $null
    if ($existingStatus -and $existingStatus.value.Count -gt 0) {
        $latestExport = $existingStatus.value | Sort-Object -Property createdOn -Descending | Select-Object -First 1

        if ($latestExport.status -eq "inProgress" -or $latestExport.status -eq "queued") {
            Write-Log "Found existing export in progress (created: $($latestExport.createdOn))"
            $activeExport = $latestExport

            # Decide whether to wait for it or start a new one
            $created = [DateTime]::Parse($latestExport.createdOn)
            $age = (Get-Date) - $created

            if ($age.TotalHours -lt 2) {
                Write-Log "Existing export is recent (age: $([math]::Round($age.TotalMinutes, 1)) min), will wait for it"
            }
            else {
                Write-Log "Existing export is old (age: $([math]::Round($age.TotalHours, 1)) hours), starting new export" "WARN"
                $activeExport = Start-DatabaseExport -Token $token -EnvironmentName $environmentName
            }
        }
        elseif ($latestExport.status -eq "complete") {
            $created = [DateTime]::Parse($latestExport.createdOn)
            $age = (Get-Date) - $created

            if ($age.TotalHours -lt 1) {
                Write-Log "Found recent completed export (age: $([math]::Round($age.TotalMinutes, 1)) min)"
                Write-Log "Using existing export to avoid API limits"
                $activeExport = $latestExport
            }
            else {
                Write-Log "Latest export is too old, starting new export"
                $activeExport = Start-DatabaseExport -Token $token -EnvironmentName $environmentName
            }
        }
    }

    # Step 3: Start new export if needed
    if (-not $activeExport) {
        $activeExport = Start-DatabaseExport -Token $token -EnvironmentName $environmentName
    }

    # Step 4: Wait for export to complete (if not already complete)
    if ($activeExport.status -ne "complete") {
        $completedExport = Wait-ForExport -Token $token -EnvironmentName $environmentName -MaxWaitMinutes 120

        if (-not $completedExport) {
            Write-Log "Export did not complete successfully" "ERROR"
            exit 1
        }

        $activeExport = $completedExport
    }

    # Step 5: Download the BACPAC file
    $downloadSuccess = Download-Export -Export $activeExport -OutputPath $OutputPath

    if (-not $downloadSuccess) {
        Write-Log "Failed to download export" "ERROR"
        exit 1
    }

    Write-Log "========================================="
    Write-Log "Export completed successfully"
    Write-Log "========================================="
    exit 0
}
catch {
    Write-Log "Unexpected error: $_" "ERROR"
    Write-Log "Stack trace: $($_.ScriptStackTrace)" "ERROR"
    exit 1
}
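The export-reuse policy in bc-export.ps1 (wait for an in-progress export under 2 hours old, reuse a completed one under 1 hour old, otherwise start fresh) can be sketched as a plain shell function; `export_action` is an illustrative name and the thresholds mirror the script:

```bash
#!/usr/bin/env bash
# Decide what to do with the latest export, mirroring bc-export.ps1:
#   inProgress/queued and < 120 min old -> wait for it
#   complete and < 60 min old           -> reuse it (avoids API limits)
#   anything else                       -> start a new export
# Arguments: status string, export age in minutes.
export_action() {
    local status=$1 age_minutes=$2
    case $status in
        inProgress|queued)
            if (( age_minutes < 120 )); then echo "wait"; else echo "start-new"; fi
            ;;
        complete)
            if (( age_minutes < 60 )); then echo "reuse"; else echo "start-new"; fi
            ;;
        *)
            echo "start-new"
            ;;
    esac
}
```

For example, `export_action complete 30` prints `reuse`, while `export_action failed 5` prints `start-new`.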
93
cron-examples.txt
Normal file
@@ -0,0 +1,93 @@
# Cron Job Examples for BC Backup Automation
# Add these to your crontab with: crontab -e

# ===================================
# Hourly Backup (Every hour at minute 0)
# ===================================
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Every 2 hours
# ===================================
0 */2 * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Every 4 hours
# ===================================
0 */4 * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Every 6 hours (at 00:00, 06:00, 12:00, 18:00)
# ===================================
0 0,6,12,18 * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Daily at 2:00 AM
# ===================================
0 2 * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Multiple times per day (8 AM, 12 PM, 4 PM, 8 PM)
# ===================================
0 8,12,16,20 * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Business hours only (Mon-Fri, 9 AM - 5 PM, hourly)
# ===================================
0 9-17 * * 1-5 /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# With email notifications (requires mail/sendmail)
# ===================================
MAILTO=your-email@example.com
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# With environment variables
# ===================================
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 * * * * /home/malin/c0ding/bcbak/bc-backup.sh >> /home/malin/c0ding/bcbak/logs/cron.log 2>&1

# ===================================
# Systemd Timer Alternative (More Reliable)
# ===================================
# Instead of cron, you can use systemd timers.
# Create files in /etc/systemd/system/:
#
# bc-backup.service:
# [Unit]
# Description=Business Central Database Backup
#
# [Service]
# Type=oneshot
# User=malin
# WorkingDirectory=/home/malin/c0ding/bcbak
# ExecStart=/home/malin/c0ding/bcbak/bc-backup.sh
# StandardOutput=append:/home/malin/c0ding/bcbak/logs/backup.log
# StandardError=append:/home/malin/c0ding/bcbak/logs/backup.log
#
# bc-backup.timer:
# [Unit]
# Description=Run BC Backup Every Hour
#
# [Timer]
# OnCalendar=hourly
# Persistent=true
#
# [Install]
# WantedBy=timers.target
#
# Enable with:
# sudo systemctl daemon-reload
# sudo systemctl enable bc-backup.timer
# sudo systemctl start bc-backup.timer
# sudo systemctl status bc-backup.timer

# ===================================
# Useful Cron Management Commands
# ===================================
# Edit crontab: crontab -e
# List crontab: crontab -l
# Remove all cron jobs: crontab -r
# View cron logs: grep CRON /var/log/syslog
# Test cron environment: * * * * * env > /tmp/cron-env.txt
150
decrypt-backup.sh
Executable file
@@ -0,0 +1,150 @@
#!/bin/bash
#
# Business Central Backup Decryption Utility
# Decrypts a GPG-encrypted BACPAC backup file
#

set -euo pipefail

# Colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m'

echo_info() {
    echo -e "${GREEN}[INFO]${NC} $*"
}

echo_warn() {
    echo -e "${YELLOW}[WARN]${NC} $*"
}

echo_error() {
    echo -e "${RED}[ERROR]${NC} $*"
}

# Check if GPG is installed
if ! command -v gpg &> /dev/null; then
    echo_error "GPG is not installed. Install it first:"
    echo "  Ubuntu/Debian: sudo apt-get install gnupg"
    echo "  CentOS/RHEL:   sudo yum install gnupg2"
    exit 1
fi

# Usage information
show_usage() {
    cat << EOF
Business Central Backup Decryption Utility

Usage: $0 <encrypted-file> [output-file]

Arguments:
  <encrypted-file>   Path to the encrypted .gpg backup file
  [output-file]      Optional: Path for decrypted output (default: removes .gpg extension)

Examples:
  # Decrypt to default name (backup.bacpac)
  $0 backup.bacpac.gpg

  # Decrypt to specific name
  $0 backup.bacpac.gpg restored_database.bacpac

  # Download from S3 and decrypt
  aws s3 cp s3://bucket/backups/bc_backup_Production_20260107_100000.bacpac.gpg ./backup.gpg
  $0 backup.gpg

Note: You will be prompted for the encryption passphrase.
      This is the ENCRYPTION_PASSPHRASE from bc-backup.conf
EOF
}

# Check arguments
if [[ $# -lt 1 ]]; then
    show_usage
    exit 1
fi

ENCRYPTED_FILE="$1"
OUTPUT_FILE="${2:-}"

# Validate encrypted file exists
if [[ ! -f "$ENCRYPTED_FILE" ]]; then
    echo_error "Encrypted file not found: $ENCRYPTED_FILE"
    exit 1
fi

# Determine output filename
if [[ -z "$OUTPUT_FILE" ]]; then
    # Remove .gpg extension
    OUTPUT_FILE="${ENCRYPTED_FILE%.gpg}"

    # If still the same (no .gpg extension), append .decrypted
    if [[ "$OUTPUT_FILE" == "$ENCRYPTED_FILE" ]]; then
        OUTPUT_FILE="${ENCRYPTED_FILE}.decrypted"
    fi
fi

# Check if output file already exists
if [[ -f "$OUTPUT_FILE" ]]; then
    echo_warn "Output file already exists: $OUTPUT_FILE"
    read -p "Overwrite? (y/n) " -n 1 -r
    echo
    if [[ ! $REPLY =~ ^[Yy]$ ]]; then
        echo_info "Aborted."
        exit 0
    fi
fi

echo_info "========================================="
echo_info "BC Backup Decryption"
echo_info "========================================="
echo_info "Encrypted file: $ENCRYPTED_FILE"
echo_info "Output file: $OUTPUT_FILE"
echo_info "File size: $(du -h "$ENCRYPTED_FILE" | cut -f1)"
echo ""
echo_warn "You will be prompted for the encryption passphrase"
echo_warn "This is the ENCRYPTION_PASSPHRASE from bc-backup.conf"
echo ""

# Decrypt the file
if gpg \
    --decrypt \
    --output "$OUTPUT_FILE" \
    "$ENCRYPTED_FILE"; then

    echo ""
    echo_info "========================================="
    echo_info "Decryption completed successfully!"
    echo_info "========================================="
    echo_info "Decrypted file: $OUTPUT_FILE"
    echo_info "File size: $(du -h "$OUTPUT_FILE" | cut -f1)"
    echo ""
    echo_info "Next steps for restoration:"
    echo ""
    echo "1. Install SqlPackage (if not already installed):"
    echo "   Download from: https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-download"
    echo ""
    echo "2. Create or identify target Azure SQL Database"
    echo ""
    echo "3. Import the BACPAC:"
    echo "   sqlpackage /a:Import \\"
    echo "     /sf:$OUTPUT_FILE \\"
    echo "     /tsn:your-server.database.windows.net \\"
    echo "     /tdn:RestoredBCDatabase \\"
    echo "     /tu:admin \\"
    echo "     /tp:YourPassword"
    echo ""
    echo "4. Contact Microsoft Support to connect BC to the restored database"
    echo ""

    exit 0
else
    echo ""
    echo_error "Decryption failed!"
    echo_error "Possible causes:"
    echo "  - Incorrect passphrase"
    echo "  - Corrupted encrypted file"
    echo "  - File is not GPG-encrypted"
    exit 1
fi
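The default-output naming rule in decrypt-backup.sh (strip a trailing `.gpg`, otherwise append `.decrypted`) is pure parameter expansion and can be sketched in isolation; `derive_output_name` is an illustrative name, not part of the script:

```bash
#!/usr/bin/env bash
# Default output name for a decrypted file, as in decrypt-backup.sh:
# strip a trailing .gpg; if there was none, append .decrypted instead.
derive_output_name() {
    local encrypted=$1
    local output=${encrypted%.gpg}       # removes .gpg only if it is a suffix
    if [[ "$output" == "$encrypted" ]]; then
        output="${encrypted}.decrypted"  # no .gpg suffix was present
    fi
    echo "$output"
}

derive_output_name backup.bacpac.gpg   # -> backup.bacpac
derive_output_name archive.bin         # -> archive.bin.decrypted
```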
316
setup.sh
Executable file
@@ -0,0 +1,316 @@
#!/bin/bash
#
# Business Central Backup System - Setup Script
# Installs dependencies and configures the backup environment
#

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
CONFIG_FILE="${SCRIPT_DIR}/bc-backup.conf"
TEMPLATE_FILE="${SCRIPT_DIR}/bc-backup.conf.template"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

echo_info() {
    echo -e "${BLUE}[INFO]${NC} $*"
}

echo_success() {
    echo -e "${GREEN}[SUCCESS]${NC} $*"
}

echo_warn() {
    echo -e "${YELLOW}[WARN]${NC} $*"
}

echo_error() {
    echo -e "${RED}[ERROR]${NC} $*"
}

check_command() {
    if command -v "$1" &> /dev/null; then
        echo_success "$1 is installed"
        return 0
    else
        echo_warn "$1 is NOT installed"
        return 1
    fi
}

echo_info "========================================="
echo_info "Business Central Backup System Setup"
echo_info "========================================="
echo ""

# Detect OS
if [[ -f /etc/os-release ]]; then
    . /etc/os-release
    OS=$ID
    VER=$VERSION_ID
    echo_info "Detected OS: $PRETTY_NAME"
else
    echo_error "Cannot detect OS"
    exit 1
fi

# Check if running as root
if [[ $EUID -eq 0 ]]; then
    echo_warn "Running as root. Dependencies will be installed system-wide."
    SUDO=""
else
    echo_info "Running as regular user. May prompt for sudo password."
    SUDO="sudo"
fi

echo ""
echo_info "=== Checking Dependencies ==="
echo ""

# Track what needs to be installed
MISSING_DEPS=()

# Check PowerShell
echo_info "Checking PowerShell..."
if ! check_command pwsh; then
    MISSING_DEPS+=("pwsh")
fi

# Check GPG
echo_info "Checking GPG..."
if ! check_command gpg; then
    MISSING_DEPS+=("gpg")
fi

# Check AWS CLI
echo_info "Checking AWS CLI..."
if ! check_command aws; then
    MISSING_DEPS+=("awscli")
fi

# Check curl and wget
echo_info "Checking curl..."
check_command curl || MISSING_DEPS+=("curl")

echo_info "Checking wget..."
check_command wget || MISSING_DEPS+=("wget")

# Check jq (useful for debugging)
echo_info "Checking jq (optional)..."
check_command jq || echo_warn "jq not installed (optional, useful for JSON parsing)"

# Install missing dependencies
if [[ ${#MISSING_DEPS[@]} -gt 0 ]]; then
    echo ""
    echo_warn "Missing dependencies: ${MISSING_DEPS[*]}"
    echo ""
    read -p "Install missing dependencies? (y/n) " -n 1 -r
    echo
    if [[ $REPLY =~ ^[Yy]$ ]]; then
        for dep in "${MISSING_DEPS[@]}"; do
            echo_info "Installing $dep..."

            case $dep in
                pwsh)
                    # Install PowerShell
                    case $OS in
                        ubuntu|debian)
                            # Download Microsoft repository GPG keys
                            wget -q "https://packages.microsoft.com/config/$OS/$VER/packages-microsoft-prod.deb" -O /tmp/packages-microsoft-prod.deb
                            $SUDO dpkg -i /tmp/packages-microsoft-prod.deb
                            rm /tmp/packages-microsoft-prod.deb
                            $SUDO apt-get update
                            $SUDO apt-get install -y powershell
                            ;;
                        centos|rhel|fedora)
                            $SUDO rpm --import https://packages.microsoft.com/keys/microsoft.asc
                            curl -o /tmp/packages-microsoft-prod.rpm "https://packages.microsoft.com/config/$OS/$VER/packages-microsoft-prod.rpm"
                            $SUDO rpm -i /tmp/packages-microsoft-prod.rpm
                            rm /tmp/packages-microsoft-prod.rpm
                            $SUDO yum install -y powershell
                            ;;
                        *)
                            echo_error "Unsupported OS for automatic PowerShell installation"
                            echo_info "Please install PowerShell manually: https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell-on-linux"
                            ;;
                    esac
                    ;;
                gpg)
                    # Install GPG
                    case $OS in
                        ubuntu|debian)
                            $SUDO apt-get update
                            $SUDO apt-get install -y gnupg
                            ;;
                        centos|rhel|fedora)
                            $SUDO yum install -y gnupg2
                            ;;
                        *)
                            echo_error "Unsupported OS for automatic GPG installation"
                            ;;
                    esac
                    ;;
                awscli)
                    # Install AWS CLI v2
                    echo_info "Installing AWS CLI v2..."
                    case $(uname -m) in
                        x86_64)
                            curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "/tmp/awscliv2.zip"
                            ;;
                        aarch64)
                            curl "https://awscli.amazonaws.com/awscli-exe-linux-aarch64.zip" -o "/tmp/awscliv2.zip"
                            ;;
                        *)
                            echo_error "Unsupported architecture for AWS CLI"
                            continue
                            ;;
                    esac
                    unzip -q /tmp/awscliv2.zip -d /tmp
                    $SUDO /tmp/aws/install
                    rm -rf /tmp/aws /tmp/awscliv2.zip
                    ;;
                curl)
                    case $OS in
                        ubuntu|debian)
                            $SUDO apt-get update
                            $SUDO apt-get install -y curl
                            ;;
                        centos|rhel|fedora)
                            $SUDO yum install -y curl
                            ;;
                    esac
                    ;;
                wget)
                    case $OS in
                        ubuntu|debian)
                            $SUDO apt-get update
                            $SUDO apt-get install -y wget
                            ;;
                        centos|rhel|fedora)
                            $SUDO yum install -y wget
                            ;;
                    esac
                    ;;
            esac
        done
    else
        echo_error "Cannot proceed without required dependencies"
        exit 1
    fi
fi

echo ""
echo_success "All required dependencies are installed"
echo ""

# Create directory structure
echo_info "=== Setting up directory structure ==="
mkdir -p "${SCRIPT_DIR}/logs"
mkdir -p "${SCRIPT_DIR}/temp"
echo_success "Created logs/ and temp/ directories"

# Set up configuration file
echo ""
echo_info "=== Configuration Setup ==="

if [[ -f "${CONFIG_FILE}" ]]; then
    echo_warn "Configuration file already exists: ${CONFIG_FILE}"
    read -p "Overwrite with template? (y/n) " -n 1 -r
    echo
    if [[ $REPLY =~ ^[Yy]$ ]]; then
        cp "${TEMPLATE_FILE}" "${CONFIG_FILE}"
        echo_success "Configuration template copied to bc-backup.conf"
    fi
else
    cp "${TEMPLATE_FILE}" "${CONFIG_FILE}"
    echo_success "Configuration template copied to bc-backup.conf"
fi

# Make scripts executable
echo ""
echo_info "=== Setting permissions ==="
chmod +x "${SCRIPT_DIR}/bc-backup.sh"
chmod +x "${SCRIPT_DIR}/bc-export.ps1"
chmod 600 "${CONFIG_FILE}" # Restrict config file permissions
echo_success "Scripts are now executable"
echo_success "Config file permissions set to 600 (owner read/write only)"

# Test AWS CLI configuration
echo ""
echo_info "=== Testing AWS CLI ==="
if [[ -f "${CONFIG_FILE}" ]]; then
    # Source config to test
    if grep -q 'AWS_ACCESS_KEY_ID=""' "${CONFIG_FILE}"; then
        echo_warn "AWS credentials not yet configured in bc-backup.conf"
    else
        echo_info "AWS CLI appears to be configured in bc-backup.conf"
    fi
fi

# S3 bucket object lock check
echo ""
echo_info "=== Important: S3 Object Lock Configuration ==="
echo_warn "Your S3 bucket MUST have Object Lock enabled for immutability"
echo_info "Object Lock can only be enabled when creating a bucket"
echo ""
echo_info "To create an S3 bucket with Object Lock (AWS example):"
echo "  aws s3api create-bucket --bucket YOUR-BUCKET-NAME \\"
echo "    --region YOUR-REGION \\"
echo "    --create-bucket-configuration LocationConstraint=YOUR-REGION \\"
echo "    --object-lock-enabled-for-bucket"
echo ""
echo_info "Then configure default retention:"
echo "  aws s3api put-object-lock-configuration --bucket YOUR-BUCKET-NAME \\"
echo "    --object-lock-configuration '{\"ObjectLockEnabled\":\"Enabled\",\"Rule\":{\"DefaultRetention\":{\"Mode\":\"COMPLIANCE\",\"Days\":30}}}'"
echo ""

# Setup cron job
echo ""
echo_info "=== Cron Job Setup ==="
echo_info "To run backups hourly, add this to your crontab:"
echo ""
echo "  0 * * * * ${SCRIPT_DIR}/bc-backup.sh >> ${SCRIPT_DIR}/logs/cron.log 2>&1"
echo ""
read -p "Add this cron job now? (y/n) " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
    CRON_CMD="0 * * * * ${SCRIPT_DIR}/bc-backup.sh >> ${SCRIPT_DIR}/logs/cron.log 2>&1"
    (crontab -l 2>/dev/null | grep -v "${SCRIPT_DIR}/bc-backup.sh"; echo "$CRON_CMD") | crontab -
    echo_success "Cron job added successfully"
    echo_info "View your crontab with: crontab -l"
else
    echo_info "Skipped cron job setup. You can add it manually later."
fi

# Final instructions
echo ""
echo_info "========================================="
echo_success "Setup completed successfully!"
echo_info "========================================="
echo ""
echo_info "Next steps:"
echo ""
echo "1. Edit configuration file:"
echo "   nano ${CONFIG_FILE}"
echo ""
echo "2. Fill in the following required values:"
echo "   - AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET"
echo "   - BC_ENVIRONMENT_NAME"
echo "   - ENCRYPTION_PASSPHRASE (generate with: openssl rand -base64 32)"
echo "   - S3_BUCKET, S3_ENDPOINT"
echo "   - AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY"
echo ""
echo "3. Test the backup manually:"
echo "   ${SCRIPT_DIR}/bc-backup.sh"
echo ""
echo "4. Check logs for any issues:"
echo "   tail -f ${SCRIPT_DIR}/logs/backup.log"
echo ""
echo_warn "IMPORTANT: Store your ENCRYPTION_PASSPHRASE securely!"
echo_warn "Without it, you cannot decrypt your backups."
echo ""
287
test-config.sh
Executable file
@@ -0,0 +1,287 @@
#!/bin/bash
#
# Business Central Backup Configuration Tester
# Validates setup and connectivity before running actual backups
#

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
CONFIG_FILE="${SCRIPT_DIR}/bc-backup.conf"

# Colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
BLUE='\033[0;34m'
NC='\033[0m'

pass_count=0
warn_count=0
fail_count=0

echo_info() {
    echo -e "${BLUE}[INFO]${NC} $*"
}

echo_pass() {
    echo -e "${GREEN}[PASS]${NC} $*"
    # Avoid ((var++)): post-increment returns status 1 when the counter
    # is 0, which would abort the script under "set -e"
    pass_count=$((pass_count + 1))
}

echo_warn() {
    echo -e "${YELLOW}[WARN]${NC} $*"
    warn_count=$((warn_count + 1))
}

echo_fail() {
    echo -e "${RED}[FAIL]${NC} $*"
    fail_count=$((fail_count + 1))
}

check_command() {
    if command -v "$1" &> /dev/null; then
        echo_pass "$1 is installed ($(command -v "$1"))"
        return 0
    else
        echo_fail "$1 is NOT installed"
        return 1
    fi
}

check_config_var() {
    local var_name="$1"
    local var_value="${!var_name:-}"

    if [[ -z "$var_value" ]]; then
        echo_fail "$var_name is not set"
        return 1
    elif [[ "$var_value" == *"your-"* ]] || [[ "$var_value" == *"example"* ]]; then
        echo_warn "$var_name looks like a placeholder value"
        return 1
    else
        echo_pass "$var_name is set"
        return 0
    fi
}

echo_info "=========================================="
echo_info "BC Backup Configuration Tester"
echo_info "=========================================="
echo ""

# Test 1: Configuration file exists
echo_info "=== Test 1: Configuration File ==="
if [[ -f "$CONFIG_FILE" ]]; then
    echo_pass "Configuration file exists: $CONFIG_FILE"

    # Check permissions (GNU stat first, BSD/macOS stat as fallback)
    perms=$(stat -c "%a" "$CONFIG_FILE" 2>/dev/null || stat -f "%A" "$CONFIG_FILE" 2>/dev/null)
    if [[ "$perms" == "600" ]] || [[ "$perms" == "400" ]]; then
        echo_pass "Configuration file has secure permissions ($perms)"
    else
        echo_warn "Configuration file permissions are $perms (recommend 600)"
    fi

    # Load configuration
    source "$CONFIG_FILE"
else
    echo_fail "Configuration file not found: $CONFIG_FILE"
    echo_info "Run: cp bc-backup.conf.template bc-backup.conf"
    exit 1
fi
echo ""

# Test 2: Required commands
echo_info "=== Test 2: Required Commands ==="
# "|| true" keeps "set -e" from aborting the run when a check fails;
# failures are still counted via echo_fail
check_command pwsh || true
check_command gpg || true
check_command aws || check_command s3cmd || true
check_command curl || true
echo ""

# Test 3: Configuration variables
echo_info "=== Test 3: Configuration Variables ==="
check_config_var "AZURE_TENANT_ID" || true
check_config_var "AZURE_CLIENT_ID" || true
check_config_var "AZURE_CLIENT_SECRET" || true
check_config_var "BC_ENVIRONMENT_NAME" || true
check_config_var "ENCRYPTION_PASSPHRASE" || true
check_config_var "S3_BUCKET" || true
check_config_var "S3_ENDPOINT" || true
check_config_var "AWS_ACCESS_KEY_ID" || true
check_config_var "AWS_SECRET_ACCESS_KEY" || true
echo ""

# Test 4: Azure AD Authentication
echo_info "=== Test 4: Azure AD Authentication ==="
if command -v pwsh &> /dev/null; then
    echo_info "Testing Azure AD authentication..."

    auth_test=$(pwsh -Command "
        \$tokenUrl = 'https://login.microsoftonline.com/$AZURE_TENANT_ID/oauth2/v2.0/token'
        \$body = @{
            client_id     = '$AZURE_CLIENT_ID'
            client_secret = '$AZURE_CLIENT_SECRET'
            scope         = 'https://api.businesscentral.dynamics.com/.default'
            grant_type    = 'client_credentials'
        }
        try {
            \$response = Invoke-RestMethod -Uri \$tokenUrl -Method Post -Body \$body -ContentType 'application/x-www-form-urlencoded' -ErrorAction Stop
            Write-Output 'SUCCESS'
        } catch {
            Write-Output \"FAILED: \$_\"
        }
    " 2>&1)

    if [[ "$auth_test" == "SUCCESS" ]]; then
        echo_pass "Azure AD authentication successful"
    else
        echo_fail "Azure AD authentication failed"
        echo "       Error: $auth_test"
    fi
else
    echo_warn "Skipping Azure AD test (pwsh not installed)"
fi
echo ""
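
# For manual troubleshooting without PowerShell, the same client-credentials
# token request can be made with curl (a sketch; it mirrors the request body
# used in the pwsh test above):
#   curl -s -X POST "https://login.microsoftonline.com/$AZURE_TENANT_ID/oauth2/v2.0/token" \
#     -d "client_id=$AZURE_CLIENT_ID" \
#     -d "client_secret=$AZURE_CLIENT_SECRET" \
#     -d "scope=https://api.businesscentral.dynamics.com/.default" \
#     -d "grant_type=client_credentials"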

# Test 5: S3 Connectivity
echo_info "=== Test 5: S3 Connectivity ==="

export AWS_ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY
export AWS_DEFAULT_REGION

if [[ "${S3_TOOL:-awscli}" == "awscli" ]] && command -v aws &> /dev/null; then
    echo_info "Testing S3 connectivity with AWS CLI..."

    if aws s3 ls "s3://${S3_BUCKET}/" --endpoint-url "${S3_ENDPOINT}" > /dev/null 2>&1; then
        echo_pass "S3 bucket is accessible: ${S3_BUCKET}"

        # Check Object Lock status
        echo_info "Checking Object Lock configuration..."
        lock_config=$(aws s3api get-object-lock-configuration \
            --bucket "${S3_BUCKET}" \
            --endpoint-url "${S3_ENDPOINT}" 2>&1 || echo "NOT_CONFIGURED")

        if [[ "$lock_config" != "NOT_CONFIGURED" ]] && [[ "$lock_config" != *"ObjectLockConfigurationNotFoundError"* ]]; then
            echo_pass "Object Lock is enabled on bucket"
        else
            echo_fail "Object Lock is NOT enabled on bucket"
            echo "       Object Lock must be enabled at bucket creation"
            echo "       See README.md for setup instructions"
        fi
    else
        echo_fail "Cannot access S3 bucket: ${S3_BUCKET}"
        echo "       Check: S3_BUCKET, S3_ENDPOINT, AWS credentials"
    fi
# "${S3_TOOL:-}" instead of "${S3_TOOL}" so "set -u" does not abort
# when S3_TOOL is left unset in the config
elif [[ "${S3_TOOL:-}" == "s3cmd" ]] && command -v s3cmd &> /dev/null; then
    echo_info "Testing S3 connectivity with s3cmd..."

    if s3cmd ls "s3://${S3_BUCKET}/" --host="${S3_ENDPOINT#*://}" > /dev/null 2>&1; then
        echo_pass "S3 bucket is accessible: ${S3_BUCKET}"
        echo_warn "s3cmd doesn't support Object Lock verification"
        echo "       Install aws-cli for full Object Lock support"
    else
        echo_fail "Cannot access S3 bucket: ${S3_BUCKET}"
    fi
else
    echo_warn "Skipping S3 test (no S3 tool available)"
fi
echo ""
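
# The bucket check above only confirms the default retention rule. To spot
# check retention on an individual uploaded object, something like this can
# be run by hand (the object key shown is illustrative):
#   aws s3api get-object-retention --bucket "$S3_BUCKET" \
#     --key backups/example.bacpac.gpg --endpoint-url "$S3_ENDPOINT"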

# Test 6: Directory permissions
echo_info "=== Test 6: Directory Permissions ==="

for dir in "logs" "temp"; do
    dir_path="${SCRIPT_DIR}/${dir}"
    if [[ -d "$dir_path" ]] && [[ -w "$dir_path" ]]; then
        echo_pass "Directory is writable: $dir"
    else
        echo_fail "Directory is not writable: $dir"
        echo "       Run: mkdir -p $dir_path && chmod 755 $dir_path"
    fi
done
echo ""

# Test 7: Script permissions
echo_info "=== Test 7: Script Permissions ==="

for script in "bc-backup.sh" "bc-export.ps1"; do
    script_path="${SCRIPT_DIR}/${script}"
    if [[ -f "$script_path" ]] && [[ -x "$script_path" ]]; then
        echo_pass "Script is executable: $script"
    else
        echo_warn "Script is not executable: $script"
        echo "       Run: chmod +x $script_path"
    fi
done
echo ""

# Test 8: GPG encryption test
echo_info "=== Test 8: GPG Encryption Test ==="

test_file=$(mktemp)
test_encrypted=$(mktemp)
test_decrypted=$(mktemp)

echo "test data" > "$test_file"

if echo "$ENCRYPTION_PASSPHRASE" | gpg \
    --batch \
    --yes \
    --passphrase-fd 0 \
    --symmetric \
    --cipher-algo AES256 \
    --output "$test_encrypted" \
    "$test_file" 2>/dev/null; then

    echo_pass "GPG encryption successful"

    # Test decryption
    if echo "$ENCRYPTION_PASSPHRASE" | gpg \
        --batch \
        --yes \
        --passphrase-fd 0 \
        --decrypt \
        --output "$test_decrypted" \
        "$test_encrypted" 2>/dev/null; then

        if diff -q "$test_file" "$test_decrypted" &>/dev/null; then
            echo_pass "GPG decryption successful"
        else
            echo_fail "GPG decryption produced different output"
        fi
    else
        echo_fail "GPG decryption failed"
    fi
else
    echo_fail "GPG encryption failed"
fi

rm -f "$test_file" "$test_encrypted" "$test_decrypted"
echo ""
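
# For reference: a production backup encrypted with the same settings can be
# decrypted the same way (the filename below is illustrative, not a real
# artifact of this repo):
#   echo "$ENCRYPTION_PASSPHRASE" | gpg --batch --yes --passphrase-fd 0 \
#     --decrypt --output environment.bacpac environment.bacpac.gpg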

# Summary
echo_info "=========================================="
echo_info "Test Summary"
echo_info "=========================================="
# Print the counts with plain echo: echo_pass/echo_warn/echo_fail would bump
# the counters again, making $fail_count nonzero even on a clean run
echo -e "${GREEN}[PASS]${NC} Passed: $pass_count"
echo -e "${YELLOW}[WARN]${NC} Warnings: $warn_count"
echo -e "${RED}[FAIL]${NC} Failed: $fail_count"
echo ""

if [[ $fail_count -eq 0 ]]; then
    echo_info "Configuration looks good!"
    echo_info "You can now run a test backup:"
    echo "  ${SCRIPT_DIR}/bc-backup.sh"
    exit 0
elif [[ $fail_count -le 2 ]] && [[ $pass_count -ge 10 ]]; then
    echo_warn "Configuration has minor issues but may work"
    echo_warn "Review failed tests above"
    exit 0
else
    echo_fail "Configuration has significant issues"
    echo_fail "Please fix the failed tests before running backups"
    exit 1
fi