#!/usr/bin/env pwsh
#
# feat: switch from Admin Center database export to BC API v2.0 data extraction
#
# The Admin Center export API requires an Azure Storage SAS URI, which
# requires an Azure subscription - defeating the purpose of an independent
# backup. Instead, use BC API v2.0 to extract critical business data
# (customers, vendors, items, GL entries, invoices, etc.) as JSON files.
# - bc-export.ps1: rewritten to use BC API v2.0 endpoints, extracts 23
#   entity types per company with OData pagination support
# - bc-backup.sh: handles JSON export directory, creates tar.gz archive
#   before encrypting and uploading to S3
# - bc-backup.conf.template: removed Azure Storage SAS config, added
#   optional BC_COMPANY_NAME filter
# - decrypt-backup.sh: updated for tar.gz.gpg format, shows extracted
#   entity files and metadata after decryption
# Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
# Business Central Data Export via BC API v2.0
# Authenticates to Azure AD and extracts critical business data as JSON
#
param (
    [Parameter(Mandatory = $true)]
    [string]$OutputPath,
    [string]$SinceDateTime = ""  # ISO 8601, e.g. "2026-02-15T00:00:00Z" for incremental
)
# Get configuration from environment variables
$tenantId = $env:AZURE_TENANT_ID
$clientId = $env:AZURE_CLIENT_ID
$clientSecret = $env:AZURE_CLIENT_SECRET
$environmentName = $env:BC_ENVIRONMENT_NAME
$bcCompanyName = $env:BC_COMPANY_NAME # optional: filter to specific company
$baseUrl = "https://api.businesscentral.dynamics.com/v2.0/$tenantId/$environmentName/api/v2.0"
# feat: export all available BC API v2.0 entities
#
# Added 31 missing entities across three categories:
# - Standalone (16 new): companyInformation, itemCategories,
#   shipmentMethods, taxAreas, taxGroups, unitsOfMeasure,
#   timeRegistrationEntries, contacts, generalProductPostingGroups,
#   inventoryPostingGroups, itemLedgerEntries, opportunities,
#   locations, projects, journalLines, irs1099
# - Financial reports (10 new, always full export): agedAccountsPayable,
#   agedAccountsReceivable, balanceSheet, cashFlowStatement,
#   incomeStatement, retainedEarningsStatement, trialBalance,
#   customerFinancialDetails, customerSales, vendorPurchases
# - Document+lines (5 new): salesQuotes, salesShipments,
#   purchaseReceipts, customerPaymentJournals, vendorPaymentJournals
# Total entities: 19 → 50
# Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
# Standalone entities to extract (support lastModifiedDateTime filter for incremental)
$entities = @(
    "accounts",
    "customers",
    "vendors",
    "items",
    "generalLedgerEntries",
    "bankAccounts",
    "employees",
    "dimensions",
    "dimensionValues",
    "currencies",
    "paymentTerms",
    "paymentMethods",
    "journals",
" journalLines " ,
" countriesRegions " ,
" companyInformation " ,
" itemCategories " ,
" shipmentMethods " ,
" taxAreas " ,
" taxGroups " ,
" unitsOfMeasure " ,
" timeRegistrationEntries " ,
" contacts " ,
" generalProductPostingGroups " ,
" inventoryPostingGroups " ,
" itemLedgerEntries " ,
" opportunities " ,
" locations " ,
" projects " ,
" irs1099 "
)
# Financial report entities (always full export, no incremental filter support)
$reportEntities = @(
    "agedAccountsPayable",
    "agedAccountsReceivable",
    "balanceSheet",
    "cashFlowStatement",
    "incomeStatement",
    "retainedEarningsStatement",
    "trialBalance",
    "customerFinancialDetails",
    "customerSales",
    "vendorPurchases"
)
# Document entities with line items
# Lines cannot be queried standalone at the top level.
# We fetch document headers first, then fetch lines per document.
$documentEntities = @{
" salesInvoices " = " salesInvoiceLines "
" salesOrders " = " salesOrderLines "
" salesCreditMemos " = " salesCreditMemoLines "
" purchaseInvoices " = " purchaseInvoiceLines "
" purchaseOrders " = " purchaseOrderLines "
" salesQuotes " = " salesQuoteLines "
" salesShipments " = " salesShipmentLines "
" purchaseReceipts " = " purchaseReceiptLines "
" customerPaymentJournals " = " customerPayments "
" vendorPaymentJournals " = " vendorPayments "
}
# Token management
$script:currentToken = $null
$script:tokenExpiry = [datetime]::MinValue
function Write-Log {
    param ([string]$Message, [string]$Level = "INFO")
    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    Write-Host "[$timestamp] [$Level] $Message"
}
function Get-AzureADToken {
    Write-Log "Authenticating to Azure AD..."
    $tokenUrl = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
    $body = @{
        client_id     = $clientId
        client_secret = $clientSecret
        scope         = "https://api.businesscentral.dynamics.com/.default"
        grant_type    = "client_credentials"
    }
    try {
        $response = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"
        $script:currentToken = $response.access_token
        # Refresh 5 minutes before actual expiry (tokens typically last 60-90 min)
        $script:tokenExpiry = (Get-Date).AddSeconds($response.expires_in - 300)
        Write-Log "Successfully authenticated (token valid for $($response.expires_in)s)"
        return $script:currentToken
    }
    catch {
        Write-Log "Failed to authenticate: $_" "ERROR"
        throw
    }
}
function Get-ValidToken {
    if ($null -eq $script:currentToken -or (Get-Date) -ge $script:tokenExpiry) {
        Write-Log "Token expired or missing, refreshing..."
        Get-AzureADToken | Out-Null
    }
    return $script:currentToken
}
function Invoke-BCApi {
    param (
        [string]$Url,
        [int]$TimeoutSec = 120,
        [int]$MaxRetries = 10
    )
    for ($attempt = 1; $attempt -le $MaxRetries; $attempt++) {
        $token = Get-ValidToken
        $headers = @{
            "Authorization" = "Bearer $token"
            "Accept"        = "application/json"
        }
        try {
            $response = Invoke-RestMethod -Uri $Url -Method Get -Headers $headers -TimeoutSec $TimeoutSec
            return $response
        }
        catch {
            $statusCode = $null
            $errorBody = ""
            if ($_.Exception.Response) {
                $statusCode = [int]$_.Exception.Response.StatusCode
            }
            if ($_.ErrorDetails.Message) {
                $errorBody = $_.ErrorDetails.Message
            }
            # Log the actual error so we can diagnose issues
            $shortError = if ($errorBody.Length -gt 200) { $errorBody.Substring(0, 200) + "..." } else { $errorBody }
            if (-not $shortError) { $shortError = "$_" }
            # Table lock: BC returns 500 with very specific wording
            $isTableLock = $errorBody -match "transaction done by another session|being updated in a transaction|deadlock victim"
            # Rate limit
            $isThrottled = ($statusCode -eq 429)
            # Timeout
            $isTimeout = ($_ -match "Timeout")
            # Other server errors (500+)
            $isServerError = ($statusCode -ge 500 -and -not $isTableLock)
            $isRetryable = $isTableLock -or $isThrottled -or $isServerError -or $isTimeout
            if ($isRetryable -and $attempt -lt $MaxRetries) {
                if ($isTableLock) {
                    $wait = [math]::Min(30 + ($attempt * 15), 120)
                    Write-Log "Table lock (attempt $attempt/$MaxRetries), waiting ${wait}s... Error: $shortError" "WARN"
                }
                elseif ($isThrottled) {
                    $wait = [math]::Min(30 * $attempt, 300)
                    Write-Log "Rate limited (attempt $attempt/$MaxRetries), waiting ${wait}s..." "WARN"
                }
                elseif ($isTimeout) {
                    $wait = [math]::Min(15 * $attempt, 120)
                    Write-Log "Timeout (attempt $attempt/$MaxRetries), retrying in ${wait}s..." "WARN"
                }
                else {
                    $wait = [math]::Min(15 * $attempt, 120)
                    Write-Log "HTTP $statusCode (attempt $attempt/$MaxRetries), retrying in ${wait}s... Error: $shortError" "WARN"
                }
                Start-Sleep -Seconds $wait
                continue
            }
            throw
        }
    }
}
function Get-BCData {
    param (
        [string]$Url
    )
    $allRecords = @()
    $currentUrl = $Url
    while ($currentUrl) {
        $response = Invoke-BCApi -Url $currentUrl
        if ($response.value) {
            $allRecords += $response.value
        }
        $currentUrl = $response.'@odata.nextLink'
    }
    return $allRecords
}
function Get-Companies {
    Write-Log "Fetching companies..."
    $companiesUrl = "$baseUrl/companies"
    $companies = Get-BCData -Url $companiesUrl
    Write-Log "Found $($companies.Count) company/companies"
    return $companies
}
function Export-EntityData {
    param (
        [string]$CompanyId,
        [string]$CompanyName,
        [string]$EntityName,
        [string]$OutputDir,
        [switch]$NoFilter
    )
    $entityUrl = "$baseUrl/companies($CompanyId)/$EntityName"
    if ($SinceDateTime -and -not $NoFilter) {
        $entityUrl += "?`$filter=lastModifiedDateTime gt $SinceDateTime"
    }
    $maxEntityRetries = 5
    for ($entityAttempt = 1; $entityAttempt -le $maxEntityRetries; $entityAttempt++) {
        Write-Log "Exporting $EntityName..."
        try {
            $data = Get-BCData -Url $entityUrl
            $count = 0
            if ($data) { $count = $data.Count }
            $outputFile = Join-Path $OutputDir "$EntityName.json"
            $data | ConvertTo-Json -Depth 10 | Out-File -FilePath $outputFile -Encoding utf8
            Write-Log "${EntityName}: $count records"
            return $count
        }
        catch {
            $errorMsg = "$_"
            $isTableLock = $errorMsg -match "transaction done by another session|being updated in a transaction|deadlock victim"
            if ($isTableLock -and $entityAttempt -lt $maxEntityRetries) {
                $wait = 60 * $entityAttempt
                Write-Log "Table lock on $EntityName (attempt $entityAttempt/$maxEntityRetries), restarting in ${wait}s..." "WARN"
                Start-Sleep -Seconds $wait
                continue
            }
            Write-Log "Failed to export ${EntityName}: $errorMsg" "WARN"
            $outputFile = Join-Path $OutputDir "$EntityName.json"
            "[]" | Out-File -FilePath $outputFile -Encoding utf8
            return 0
        }
    }
    return 0
}
function Export-DocumentWithLines {
    param (
        [string]$CompanyId,
        [string]$CompanyName,
        [string]$DocumentEntity,
        [string]$LineEntity,
        [string]$OutputDir
    )
    # Retry the entire entity export if it fails (e.g. table lock on first page)
    $maxEntityRetries = 5
    for ($entityAttempt = 1; $entityAttempt -le $maxEntityRetries; $entityAttempt++) {
        Write-Log "Exporting $DocumentEntity (headers + lines)..."
        $docFile = Join-Path $OutputDir "$DocumentEntity.jsonl"
        $lineFile = Join-Path $OutputDir "$LineEntity.jsonl"
        [System.IO.File]::WriteAllText($docFile, "")
        [System.IO.File]::WriteAllText($lineFile, "")
        $docCount = 0
        $lineCount = 0
        try {
            # Step 1: Fetch document headers page by page (no $expand)
            # BC API default page size is ~100, with @odata.nextLink for more
            $currentUrl = "$baseUrl/companies($CompanyId)/$DocumentEntity"
            if ($SinceDateTime) {
                $currentUrl += "?`$filter=lastModifiedDateTime gt $SinceDateTime"
            }
            while ($currentUrl) {
                $response = Invoke-BCApi -Url $currentUrl
                if (-not $response.value -or $response.value.Count -eq 0) {
                    break
                }
                # Step 2: For each document in this page, fetch its lines
                foreach ($doc in $response.value) {
                    $docCount++
                    $docId = $doc.id
                    # Write document header to disk
                    $jsonLine = $doc | ConvertTo-Json -Depth 10 -Compress
                    [System.IO.File]::AppendAllText($docFile, $jsonLine + "`n")
                    # Fetch lines for this document
                    $linesUrl = "$baseUrl/companies($CompanyId)/$DocumentEntity($docId)/$LineEntity"
                    try {
                        $linesResponse = Invoke-BCApi -Url $linesUrl -TimeoutSec 60
                        if ($linesResponse.value -and $linesResponse.value.Count -gt 0) {
                            foreach ($line in $linesResponse.value) {
                                $lineCount++
                                $lineJson = $line | ConvertTo-Json -Depth 10 -Compress
                                [System.IO.File]::AppendAllText($lineFile, $lineJson + "`n")
                            }
                            # Handle pagination within lines (unlikely but possible)
                            $nextLinesUrl = $linesResponse.'@odata.nextLink'
                            while ($nextLinesUrl) {
                                $moreLinesResponse = Invoke-BCApi -Url $nextLinesUrl -TimeoutSec 60
                                if ($moreLinesResponse.value) {
                                    foreach ($line in $moreLinesResponse.value) {
                                        $lineCount++
                                        $lineJson = $line | ConvertTo-Json -Depth 10 -Compress
                                        [System.IO.File]::AppendAllText($lineFile, $lineJson + "`n")
                                    }
                                }
                                $nextLinesUrl = $moreLinesResponse.'@odata.nextLink'
                            }
                        }
                    }
                    catch {
                        Write-Log "Warning: failed to fetch lines for $DocumentEntity ${docId}: $_" "WARN"
                    }
                    # Progress every 100 documents
                    if ($docCount % 100 -eq 0) {
                        Write-Log "Progress: $docCount documents, $lineCount lines"
                    }
                }
                # Next page of documents
                $currentUrl = $response.'@odata.nextLink'
            }
            Write-Log "${DocumentEntity}: $docCount documents, $lineCount lines (complete)"
            return ($docCount + $lineCount)
        }
        catch {
            $errorMsg = "$_"
            $isTableLock = $errorMsg -match "transaction done by another session|being updated in a transaction|deadlock victim"
            if ($isTableLock -and $entityAttempt -lt $maxEntityRetries) {
                $wait = 60 * $entityAttempt
                Write-Log "Table lock on $DocumentEntity (attempt $entityAttempt/$maxEntityRetries), restarting in ${wait}s..." "WARN"
                Start-Sleep -Seconds $wait
                continue
            }
            Write-Log "Failed to export ${DocumentEntity} at doc #${docCount}: $errorMsg" "WARN"
            Write-Log "Partial data saved ($docCount docs, $lineCount lines)" "WARN"
            return ($docCount + $lineCount)
        }
    }
    return 0
}
# Main execution
try {
    $exportMode = if ($SinceDateTime) { "incremental" } else { "full" }
    Write-Log "========================================="
Write-Log " BC Data Export Script (API v2.0) "
2026-02-09 18:57:39 +01:00
Write-Log " ========================================= "
Write-Log " Environment: $environmentName "
2026-02-16 10:22:08 +01:00
Write-Log " Mode: $exportMode "
if ( $SinceDateTime ) {
Write-Log " Changes since: $SinceDateTime "
}
2026-02-09 18:57:39 +01:00
Write-Log " Output Path: $OutputPath "
    $totalEntityCount = $entities.Count + $reportEntities.Count + $documentEntities.Count
    Write-Log "Entities to extract: $totalEntityCount ($($entities.Count) standalone, $($reportEntities.Count) reports, $($documentEntities.Count) with line items)"
    # Create output directory
    $exportDir = $OutputPath
    if (-not (Test-Path $exportDir)) {
        New-Item -ItemType Directory -Path $exportDir -Force | Out-Null
    }

    # Step 1: Get Azure AD token
    Get-AzureADToken | Out-Null
    # Step 2: Get companies
    $companies = Get-Companies

    if ($companies.Count -eq 0) {
        Write-Log "No companies found in environment $environmentName" "ERROR"
        exit 1
    }
    # Save companies list
    $companies | ConvertTo-Json -Depth 10 | Out-File -FilePath (Join-Path $exportDir "companies.json") -Encoding utf8

    # Filter to specific company if configured
    $targetCompanies = $companies
    if ($bcCompanyName) {
        # Wrap in @() so .Count is reliable even when exactly one company matches
        $targetCompanies = @($companies | Where-Object { $_.name -eq $bcCompanyName -or $_.displayName -eq $bcCompanyName })
        if ($targetCompanies.Count -eq 0) {
            Write-Log "Company '$bcCompanyName' not found. Available: $($companies.name -join ', ')" "ERROR"
            exit 1
        }
        Write-Log "Filtering to company: $bcCompanyName"
    }

    $totalRecords = 0
    $totalEntities = 0
    $failedEntities = @()
feat: switch from Admin Center database export to BC API v2.0 data extraction
The Admin Center export API requires an Azure Storage SAS URI which
requires an Azure Subscription - defeating the purpose of an independent
backup. Instead, use BC API v2.0 to extract critical business data
(customers, vendors, items, GL entries, invoices, etc.) as JSON files.
- bc-export.ps1: rewritten to use BC API v2.0 endpoints, extracts 23
entity types per company with OData pagination support
- bc-backup.sh: handles JSON export directory, creates tar.gz archive
before encrypting and uploading to S3
- bc-backup.conf.template: removed Azure Storage SAS config, added
optional BC_COMPANY_NAME filter
- decrypt-backup.sh: updated for tar.gz.gpg format, shows extracted
entity files and metadata after decryption
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 07:33:32 +01:00
# Step 3: Export data for each company
foreach ( $company in $targetCompanies ) {
$companyName = $company . name
$companyId = $company . id
Write-Log " ----------------------------------------- "
Write-Log " Exporting company: $companyName ( $companyId ) "
# Create company directory (sanitize name for filesystem)
$safeName = $companyName -replace '[\\/:*?"<>|]' , '_'
$companyDir = Join-Path $exportDir $safeName
if ( -not ( Test-Path $companyDir ) ) {
New-Item -ItemType Directory -Path $companyDir -Force | Out-Null
}
        # Export standalone entities
        foreach ($entity in $entities) {
            $count = Export-EntityData `
                -CompanyId $companyId `
                -CompanyName $companyName `
                -EntityName $entity `
                -OutputDir $companyDir
            $totalRecords += $count
            $totalEntities++
            if ($count -eq 0) {
                $failedEntities += "$companyName/$entity"
            }
        }
        # Export financial report entities (always full, no incremental filter)
        foreach ($entity in $reportEntities) {
            $count = Export-EntityData `
                -CompanyId $companyId `
                -CompanyName $companyName `
                -EntityName $entity `
                -OutputDir $companyDir `
                -NoFilter
            $totalRecords += $count
            $totalEntities++
            if ($count -eq 0) {
                $failedEntities += "$companyName/$entity"
            }
        }
        # Export document entities with their line items
        foreach ($docEntity in $documentEntities.Keys) {
            $lineEntity = $documentEntities[$docEntity]
            $count = Export-DocumentWithLines `
                -CompanyId $companyId `
                -CompanyName $companyName `
                -DocumentEntity $docEntity `
                -LineEntity $lineEntity `
                -OutputDir $companyDir
            $totalRecords += $count
            $totalEntities++
            if ($count -eq 0) {
                $failedEntities += "$companyName/$docEntity"
            }
        }
    }
    # Save export metadata
    $metadata = @{
        exportDate = (Get-Date -Format "yyyy-MM-dd HH:mm:ss UTC" -AsUTC)
        environment = $environmentName
        mode = $exportMode
        # 'if' is a statement, not an expression; wrap in $() inside a hash literal
        sinceDateTime = $(if ($SinceDateTime) { $SinceDateTime } else { $null })
        companies = @($targetCompanies | ForEach-Object { $_.name })
        entitiesExported = $totalEntities
        totalRecords = $totalRecords
        failedEntities = $failedEntities
    }
    $metadata | ConvertTo-Json -Depth 5 | Out-File -FilePath (Join-Path $exportDir "export-metadata.json") -Encoding utf8
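    # For reference, a typical export-metadata.json looks like the following
    # (illustrative values only, not real output):
    #   {
    #     "exportDate": "2026-02-17 02:00:04 UTC",
    #     "environment": "production",
    #     "mode": "incremental",
    #     "sinceDateTime": "2026-02-16T00:00:00Z",
    #     "companies": ["CRONUS USA, Inc."],
    #     "entitiesExported": 50,
    #     "totalRecords": 1234,
    #     "failedEntities": []
    #   }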
    Write-Log "========================================="
    Write-Log "Export completed"
    Write-Log "Mode: $exportMode"
    Write-Log "Companies: $($targetCompanies.Count)"
    Write-Log "Entities: $totalEntities"
    Write-Log "Total records: $totalRecords"
    if ($failedEntities.Count -gt 0) {
        Write-Log "Failed/empty: $($failedEntities.Count) entities" "WARN"
    }
    Write-Log "========================================="

    # Exit code 2 = success but no records (used by bc-backup.sh to skip empty incrementals)
    if ($totalRecords -eq 0 -and $exportMode -eq "incremental") {
        Write-Log "No changes detected since $SinceDateTime"
        exit 2
    }

    exit 0
}
catch {
    Write-Log "Unexpected error: $_" "ERROR"
    Write-Log "Stack trace: $($_.ScriptStackTrace)" "ERROR"
    exit 1
}
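# Caller side: a hypothetical sketch of how bc-backup.sh can branch on the exit
# codes above (variable names are illustrative, not taken from the real script):
#   pwsh ./bc-export.ps1 -OutputPath "$EXPORT_DIR" -SinceDateTime "$SINCE"
#   rc=$?
#   if [ "$rc" -eq 2 ]; then
#       echo "No changes since $SINCE; skipping archive/upload"
#       exit 0
#   fi
#   [ "$rc" -eq 0 ] || exit 1      # any other non-zero code aborts the backup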