mirror of
https://github.com/SuperClaude-Org/SuperClaude_Framework.git
synced 2025-12-29 16:16:08 +00:00
refactor: PEP8 compliance - directory rename and code formatting (#425)
* fix(orchestration): add WebFetch auto-trigger for infrastructure configuration

Problem: Infrastructure configuration changes (e.g., Traefik port settings) were being made based on assumptions without consulting official documentation, violating the 'Evidence > assumptions' principle in PRINCIPLES.md.

Solution:
- Added Infrastructure Configuration Validation section to MODE_Orchestration.md
- Auto-triggers WebFetch for infrastructure tools (Traefik, nginx, Docker, etc.)
- Enforces MODE_DeepResearch activation for investigation
- BLOCKS assumption-based configuration changes

Testing: Verified WebFetch successfully retrieves Traefik official docs (port 80 default)

This prevents production outages from infrastructure misconfiguration by ensuring all technical recommendations are backed by official documentation.

* feat: Add PM Agent (Project Manager Agent) for seamless orchestration

Introduces PM Agent as the default orchestration layer that coordinates all sub-agents and manages workflows automatically.
Key Features:
- Default orchestration: All user interactions handled by PM Agent
- Auto-delegation: Intelligent sub-agent selection based on task analysis
- Docker Gateway integration: Zero-token baseline with dynamic MCP loading
- Self-improvement loop: Automatic documentation of patterns and mistakes
- Optional override: Users can specify sub-agents explicitly if desired

Architecture:
- Agent spec: SuperClaude/Agents/pm-agent.md
- Command: SuperClaude/Commands/pm.md
- Updated docs: README.md (15→16 agents), agents.md (new Orchestration category)

User Experience:
- Default: PM Agent handles everything (seamless, no manual routing)
- Optional: Explicit --agent flag for direct sub-agent access
- Both modes available simultaneously (no user downside)

Implementation Status:
- ✅ Specification complete
- ✅ Documentation complete
- ⏳ Prototype implementation needed
- ⏳ Docker Gateway integration needed
- ⏳ Testing and validation needed

Refs: kazukinakai/docker-mcp-gateway (IRIS MCP Gateway integration)

* feat: Add Agent Orchestration rules for PM Agent default activation

Implements PM Agent as the default orchestration layer in RULES.md.

Key Changes:
- New 'Agent Orchestration' section (CRITICAL priority)
- PM Agent receives ALL user requests by default
- Manual override with @agent-[name] bypasses PM Agent
- Agent Selection Priority clearly defined:
  1. Manual override → Direct routing
  2. Default → PM Agent → Auto-delegation
  3. Delegation based on keywords, file types, complexity, context

User Experience:
- Default: PM Agent handles everything (seamless)
- Override: @agent-[name] for direct specialist access
- Transparent: PM Agent reports delegation decisions

This establishes PM Agent as the orchestration layer while respecting existing auto-activation patterns and manual overrides.
Next Steps:
- Local testing in agiletec project
- Iteration based on actual behavior
- Documentation updates as needed

* refactor(pm-agent): redesign as self-improvement meta-layer

Problem Resolution: PM Agent's initial design competed with existing auto-activation for task routing, creating confusion about orchestration responsibilities and adding unnecessary complexity.

Design Change: Redefined PM Agent as a meta-layer agent that operates AFTER specialist agents complete tasks, focusing on:
- Post-implementation documentation and pattern recording
- Immediate mistake analysis with prevention checklists
- Monthly documentation maintenance and noise reduction
- Pattern extraction and knowledge synthesis

Two-Layer Orchestration System:
1. Task Execution Layer: Existing auto-activation handles task routing (unchanged)
2. Self-Improvement Layer: PM Agent meta-layer handles documentation (new)

Files Modified:
- SuperClaude/Agents/pm-agent.md: Complete rewrite with meta-layer design
  - Category: orchestration → meta
  - Triggers: All user interactions → Post-implementation, mistakes, monthly
  - Behavioral Mindset: Continuous learning system
  - Self-Improvement Workflow: BEFORE/DURING/AFTER/MISTAKE RECOVERY/MAINTENANCE
- SuperClaude/Core/RULES.md: Agent Orchestration section updated
  - Split into Task Execution Layer + Self-Improvement Layer
  - Added orchestration flow diagram
  - Clarified PM Agent activates AFTER task completion
- README.md: Updated PM Agent description
  - "orchestrates all interactions" → "ensures continuous learning"
- Docs/User-Guide/agents.md: PM Agent section rewritten
  - Section: Orchestration Agent → Meta-Layer Agent
  - Expertise: Project orchestration → Self-improvement workflow executor
  - Examples: Task coordination → Post-implementation documentation
- PR_DOCUMENTATION.md: Comprehensive PR documentation added
  - Summary, motivation, changes, testing, breaking changes
  - Two-layer orchestration system diagram
  - Verification checklist

Integration Validated:
Tested with agiletec project's self-improvement-workflow.md:
✅ PM Agent aligns with existing BEFORE/DURING/AFTER/MISTAKE RECOVERY phases
✅ Complements (not competes with) existing workflow
✅ agiletec workflow defines WHAT, PM Agent defines WHO executes it

Breaking Changes: None
- Existing auto-activation continues unchanged
- Specialist agents unaffected
- User workflows remain the same
- New capability: Automatic documentation and knowledge maintenance

Value Proposition: Transforms SuperClaude into a continuously learning system that accumulates knowledge, prevents recurring mistakes, and maintains fresh documentation without manual intervention.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* docs: add Claude Code conversation history management research

Research covering .jsonl file structure, performance impact, and retention policies.

Content:
- Claude Code .jsonl file format and message types
- Performance issues from GitHub (memory leaks, conversation compaction)
- Retention policies (consumer vs enterprise)
- Rotation recommendations based on actual data
- File history snapshot tracking mechanics

Source: Moved from agiletec project (research applicable to all Claude Code projects)

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* feat: add Development documentation structure

Phase 1: Documentation Structure complete
- Add Docs/Development/ directory for development documentation
- Add ARCHITECTURE.md - System architecture with PM Agent meta-layer
- Add ROADMAP.md - 5-phase development plan with checkboxes
- Add TASKS.md - Daily task tracking with progress indicators
- Add PROJECT_STATUS.md - Current status dashboard and metrics
- Add pm-agent-integration.md - Implementation guide for PM Agent mode

This establishes comprehensive documentation foundation for:
- System architecture understanding
- Development planning and tracking
- Implementation guidance
- Progress visibility

Related: #pm-agent-mode #documentation #phase-1

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* feat: PM Agent session lifecycle and PDCA implementation

Phase 2: PM Agent Mode Integration (Design Phase)

Commands/pm.md updates:
- Add "Always-Active Foundation Layer" concept
- Add Session Lifecycle (Session Start/During Work/Session End)
- Add PDCA Cycle (Plan/Do/Check/Act) automation
- Add Serena MCP Memory Integration (list/read/write_memory)
- Document auto-activation triggers

Agents/pm-agent.md updates:
- Add Session Start Protocol (MANDATORY auto-activation)
- Add During Work PDCA Cycle with example workflows
- Add Session End Protocol with state preservation
- Add PDCA Self-Evaluation Pattern
- Add Documentation Strategy (temp → patterns/mistakes)
- Add Memory Operations Reference

Key Features:
- Session start auto-activation for context restoration
- 30-minute checkpoint saves during work
- Self-evaluation with think_about_* operations
- Systematic documentation lifecycle
- Knowledge evolution to CLAUDE.md

Implementation Status:
- ✅ Design complete (Commands/pm.md, Agents/pm-agent.md)
- ⏳ Implementation pending (Core components)
- ⏳ Serena MCP integration pending

Salvaged from mistaken development in ~/.claude directory

Related: #pm-agent-mode #session-lifecycle #pdca-cycle #phase-2

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* fix: disable Serena MCP auto-browser launch

Disable web dashboard and GUI log window auto-launch in Serena MCP server to prevent intrusive browser popups on startup. Users can still manually access the dashboard at http://localhost:24282/dashboard/ if needed.
Changes:
- Add CLI flags to Serena run command:
  - --enable-web-dashboard false
  - --enable-gui-log-window false
- Ensures Git-tracked configuration (no reliance on ~/.serena/serena_config.yml)
- Aligns with AIRIS MCP Gateway integration approach

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: rename directories to lowercase for PEP8 compliance

- Rename superclaude/Agents -> superclaude/agents
- Rename superclaude/Commands -> superclaude/commands
- Rename superclaude/Core -> superclaude/core
- Rename superclaude/Examples -> superclaude/examples
- Rename superclaude/MCP -> superclaude/mcp
- Rename superclaude/Modes -> superclaude/modes

This change follows Python PEP8 naming conventions for package directories.

* style: fix PEP8 violations and update package name to lowercase

Changes:
- Format all Python files with black (43 files reformatted)
- Update package name from 'SuperClaude' to 'superclaude' in pyproject.toml
- Fix import statements to use lowercase package name
- Add missing imports (timedelta, __version__)
- Remove old SuperClaude.egg-info directory

PEP8 violations reduced from 2672 to 701 (mostly E501 line length due to black's 88 char vs flake8's 79 char limit).
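The Serena flag change above can be sketched as the installer assembling a Git-tracked run command. This is an illustrative sketch, not the actual installer code: the `base_command` invocation is an assumption, while the two flags are the ones named in the commit.

```python
# Hypothetical sketch: disable Serena's web dashboard and GUI log window
# via CLI flags instead of relying on ~/.serena/serena_config.yml.
base_command = [
    "uvx", "--from", "git+https://github.com/oraios/serena",  # assumed invocation
    "serena", "start-mcp-server",
]

# Flags added by this commit, kept in the Git-tracked run command.
no_popup_flags = [
    "--enable-web-dashboard", "false",
    "--enable-gui-log-window", "false",
]

run_command = base_command + no_popup_flags
```

Because the flags live in the command itself, the behavior survives a fresh checkout with no per-user config file.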
* docs: add PM Agent development documentation

Add comprehensive PM Agent development documentation:
- PM Agent ideal workflow (7-phase autonomous cycle)
- Project structure understanding (Git vs installed environment)
- Installation flow understanding (CommandsComponent behavior)
- Task management system (current-tasks.md)

Purpose: Eliminate repeated explanations and enable autonomous PDCA cycles

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* feat(pm-agent): add self-correcting execution and warning investigation culture

## Changes

### superclaude/commands/pm.md
- Add "Self-Correcting Execution" section with root cause analysis protocol
- Add "Warning/Error Investigation Culture" section enforcing zero-tolerance for dismissal
- Define error detection protocol: STOP → Investigate → Hypothesis → Different Solution → Execute
- Document anti-patterns (retry without understanding) and correct patterns (research-first)

### docs/Development/hypothesis-pm-autonomous-enhancement-2025-10-14.md
- Add PDCA workflow hypothesis document for PM Agent autonomous enhancement

## Rationale

PM Agent must never retry failed operations without understanding root causes. All warnings and errors require investigation via context7/WebFetch/documentation to ensure production-quality code and prevent technical debt accumulation.
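The STOP → Investigate → Hypothesis → Different Solution → Execute protocol above can be sketched as control flow. This is an illustrative sketch only; the function names and signatures are assumptions, not the agent's actual API.

```python
# Hypothetical sketch of self-correcting execution: never retry a failed
# operation without a NEW hypothesis about the root cause.
def self_correcting_execute(operation, investigate, max_attempts=3):
    tried_hypotheses = []
    for _ in range(max_attempts):
        try:
            # Execute with whatever fixes the current hypotheses imply.
            return operation(tried_hypotheses)
        except Exception as error:
            # STOP: investigate the root cause before doing anything else.
            hypothesis = investigate(error, tried_hypotheses)
            if hypothesis in tried_hypotheses:
                # No new hypothesis -> escalate instead of looping blindly.
                raise
            tried_hypotheses.append(hypothesis)
    raise RuntimeError("Exhausted attempts without resolving root cause")
```

The anti-pattern the commit forbids is the degenerate case where `investigate` returns nothing new and the same operation is retried unchanged.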
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

* feat(installer): add airis-mcp-gateway MCP server option

## Changes
- Add airis-mcp-gateway to MCP server options in installer
- Configuration: GitHub-based installation via uvx
- Repository: https://github.com/oraios/airis-mcp-gateway
- Purpose: Dynamic MCP Gateway for zero-token baseline and on-demand tool loading

## Implementation
Added to setup/components/mcp.py self.mcp_servers dictionary with:
- install_method: github
- install_command: uvx test installation
- run_command: uvx runtime execution
- required: False (optional server)

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: kazuki <kazuki@kazukinoMacBook-Air.local>
Co-authored-by: Claude <noreply@anthropic.com>
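The installer commit lists the fields of the new `self.mcp_servers` entry; a sketch of what that dictionary entry might look like follows. Only the four field names and values quoted in the commit message are from the source; the exact uvx command strings are assumptions.

```python
# Hypothetical sketch of the airis-mcp-gateway entry added to the
# installer's self.mcp_servers dictionary in setup/components/mcp.py.
airis_mcp_gateway = {
    "install_method": "github",  # GitHub-based installation via uvx
    # Illustrative command strings; the real uvx arguments are assumptions:
    "install_command": (
        "uvx --from git+https://github.com/oraios/airis-mcp-gateway "
        "airis-mcp-gateway --help"  # used as an install/smoke test
    ),
    "run_command": (
        "uvx --from git+https://github.com/oraios/airis-mcp-gateway "
        "airis-mcp-gateway"  # runtime execution
    ),
    "required": False,  # optional server; not installed by default
}
```

Keeping `required: False` means the installer offers the gateway without forcing it on users who do not want dynamic MCP loading.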
@@ -8,9 +8,4 @@ from .config import ConfigService
 from .files import FileService
 from .settings import SettingsService
 
-__all__ = [
-    'CLAUDEMdService',
-    'ConfigService',
-    'FileService',
-    'SettingsService'
-]
+__all__ = ["CLAUDEMdService", "ConfigService", "FileService", "SettingsService"]
@@ -10,105 +10,107 @@ from ..utils.logger import get_logger
 
 
 class CLAUDEMdService:
     """Manages CLAUDE.md file updates while preserving user customizations"""
 
     def __init__(self, install_dir: Path):
         """
         Initialize CLAUDEMdService
 
         Args:
             install_dir: Installation directory (typically ~/.claude)
         """
         self.install_dir = install_dir
         self.claude_md_path = install_dir / "CLAUDE.md"
         self.logger = get_logger()
 
     def read_existing_imports(self) -> Set[str]:
         """
         Parse CLAUDE.md for existing @import statements
 
         Returns:
             Set of already imported filenames (without @)
         """
         existing_imports = set()
 
         if not self.claude_md_path.exists():
             return existing_imports
 
         try:
-            with open(self.claude_md_path, 'r', encoding='utf-8') as f:
+            with open(self.claude_md_path, "r", encoding="utf-8") as f:
                 content = f.read()
 
             # Find all @import statements using regex
-            import_pattern = r'^@([^\s\n]+\.md)\s*$'
+            import_pattern = r"^@([^\s\n]+\.md)\s*$"
             matches = re.findall(import_pattern, content, re.MULTILINE)
             existing_imports.update(matches)
 
             self.logger.debug(f"Found existing imports: {existing_imports}")
 
         except Exception as e:
             self.logger.warning(f"Could not read existing CLAUDE.md imports: {e}")
 
         return existing_imports
 
     def read_existing_content(self) -> str:
         """
         Read existing CLAUDE.md content
 
         Returns:
             Existing content or empty string if file doesn't exist
         """
         if not self.claude_md_path.exists():
             return ""
 
         try:
-            with open(self.claude_md_path, 'r', encoding='utf-8') as f:
+            with open(self.claude_md_path, "r", encoding="utf-8") as f:
                 return f.read()
         except Exception as e:
             self.logger.warning(f"Could not read existing CLAUDE.md: {e}")
             return ""
 
     def extract_user_content(self, content: str) -> str:
         """
         Extract user content (everything before framework imports section)
 
         Args:
             content: Full CLAUDE.md content
 
         Returns:
             User content without framework imports
         """
         # Look for framework imports section marker
         framework_marker = "# ===================================================\n# SuperClaude Framework Components"
 
         if framework_marker in content:
             user_content = content.split(framework_marker)[0].rstrip()
         else:
             # If no framework section exists, preserve all content
             user_content = content.rstrip()
 
         return user_content
 
-    def organize_imports_by_category(self, files_by_category: Dict[str, List[str]]) -> str:
+    def organize_imports_by_category(
+        self, files_by_category: Dict[str, List[str]]
+    ) -> str:
         """
         Organize imports into categorized sections
 
         Args:
             files_by_category: Dict mapping category names to lists of files
 
         Returns:
             Formatted import sections
         """
         if not files_by_category:
             return ""
 
         sections = []
 
         # Framework imports section header
         sections.append("# ===================================================")
         sections.append("# SuperClaude Framework Components")
         sections.append("# ===================================================")
         sections.append("")
 
         # Add each category
         for category, files in files_by_category.items():
             if files:
@@ -116,131 +118,139 @@ class CLAUDEMdService:
                 for file in sorted(files):
                     sections.append(f"@{file}")
                 sections.append("")
 
         return "\n".join(sections)
 
     def add_imports(self, files: List[str], category: str = "Framework") -> bool:
         """
         Add new imports with duplicate checking and user content preservation
 
         Args:
             files: List of filenames to import
             category: Category name for organizing imports
 
         Returns:
             True if successful, False otherwise
         """
         try:
             # Ensure CLAUDE.md exists
             self.ensure_claude_md_exists()
 
             # Read existing content and imports
             existing_content = self.read_existing_content()
             existing_imports = self.read_existing_imports()
 
             # Filter out files already imported
             new_files = [f for f in files if f not in existing_imports]
 
             if not new_files:
                 self.logger.info("All files already imported, no changes needed")
                 return True
 
-            self.logger.info(f"Adding {len(new_files)} new imports to category '{category}': {new_files}")
+            self.logger.info(
+                f"Adding {len(new_files)} new imports to category '{category}': {new_files}"
+            )
 
             # Extract user content (preserve everything before framework section)
             user_content = self.extract_user_content(existing_content)
 
             # Parse existing framework imports by category
-            existing_framework_imports = self._parse_existing_framework_imports(existing_content)
+            existing_framework_imports = self._parse_existing_framework_imports(
+                existing_content
+            )
 
             # Add new files to the specified category
             if category not in existing_framework_imports:
                 existing_framework_imports[category] = []
             existing_framework_imports[category].extend(new_files)
 
             # Build new content
             new_content_parts = []
 
             # Add user content
             if user_content.strip():
                 new_content_parts.append(user_content)
                 new_content_parts.append("")  # Add blank line before framework section
 
             # Add organized framework imports
-            framework_section = self.organize_imports_by_category(existing_framework_imports)
+            framework_section = self.organize_imports_by_category(
+                existing_framework_imports
+            )
             if framework_section:
                 new_content_parts.append(framework_section)
 
             # Write updated content
             new_content = "\n".join(new_content_parts)
-            with open(self.claude_md_path, 'w', encoding='utf-8') as f:
+            with open(self.claude_md_path, "w", encoding="utf-8") as f:
                 f.write(new_content)
 
             self.logger.success(f"Updated CLAUDE.md with {len(new_files)} new imports")
             return True
 
         except Exception as e:
             self.logger.error(f"Failed to update CLAUDE.md: {e}")
             return False
 
     def _parse_existing_framework_imports(self, content: str) -> Dict[str, List[str]]:
         """
         Parse existing framework imports organized by category
 
         Args:
             content: Full CLAUDE.md content
 
         Returns:
             Dict mapping category names to lists of imported files
         """
         imports_by_category = {}
 
         # Look for framework imports section
         framework_marker = "# ===================================================\n# SuperClaude Framework Components"
 
         if framework_marker not in content:
             return imports_by_category
 
         # Extract framework section
-        framework_section = content.split(framework_marker)[1] if framework_marker in content else ""
+        framework_section = (
+            content.split(framework_marker)[1] if framework_marker in content else ""
+        )
 
         # Parse categories and imports
-        lines = framework_section.split('\n')
+        lines = framework_section.split("\n")
         current_category = None
 
         for line in lines:
             line = line.strip()
 
             # Skip section header lines and empty lines
-            if line.startswith('# ===') or not line:
+            if line.startswith("# ===") or not line:
                 continue
 
             # Category header (starts with # but not the section divider)
-            if line.startswith('# ') and not line.startswith('# ==='):
+            if line.startswith("# ") and not line.startswith("# ==="):
                 current_category = line[2:].strip()  # Remove "# "
                 if current_category not in imports_by_category:
                     imports_by_category[current_category] = []
 
             # Import line (starts with @)
-            elif line.startswith('@') and current_category:
+            elif line.startswith("@") and current_category:
                 import_file = line[1:].strip()  # Remove "@"
                 if import_file not in imports_by_category[current_category]:
                     imports_by_category[current_category].append(import_file)
 
         return imports_by_category
 
     def ensure_claude_md_exists(self) -> None:
         """
         Create CLAUDE.md with default content if it doesn't exist
         """
         if self.claude_md_path.exists():
             return
 
         try:
             # Create directory if it doesn't exist
             self.claude_md_path.parent.mkdir(parents=True, exist_ok=True)
 
             # Default CLAUDE.md content
             default_content = """# SuperClaude Entry Point
@@ -249,34 +259,36 @@ You can add your own custom instructions and configurations here.
 
 The SuperClaude framework components will be automatically imported below.
 """
 
-            with open(self.claude_md_path, 'w', encoding='utf-8') as f:
+            with open(self.claude_md_path, "w", encoding="utf-8") as f:
                 f.write(default_content)
 
             self.logger.info("Created CLAUDE.md with default content")
 
         except Exception as e:
             self.logger.error(f"Failed to create CLAUDE.md: {e}")
             raise
 
     def remove_imports(self, files: List[str]) -> bool:
         """
         Remove specific imports from CLAUDE.md
 
         Args:
             files: List of filenames to remove from imports
 
         Returns:
             True if successful, False otherwise
         """
         try:
             if not self.claude_md_path.exists():
                 return True  # Nothing to remove
 
             existing_content = self.read_existing_content()
             user_content = self.extract_user_content(existing_content)
-            existing_framework_imports = self._parse_existing_framework_imports(existing_content)
+            existing_framework_imports = self._parse_existing_framework_imports(
+                existing_content
+            )
 
             # Remove files from all categories
             removed_any = False
             for category, category_files in existing_framework_imports.items():
@@ -284,33 +296,37 @@ The SuperClaude framework components will be automatically imported below.
                 if file in category_files:
                     category_files.remove(file)
                     removed_any = True
 
             # Remove empty categories
-            existing_framework_imports = {k: v for k, v in existing_framework_imports.items() if v}
+            existing_framework_imports = {
+                k: v for k, v in existing_framework_imports.items() if v
+            }
 
             if not removed_any:
                 return True  # Nothing was removed
 
             # Rebuild content
             new_content_parts = []
 
             if user_content.strip():
                 new_content_parts.append(user_content)
                 new_content_parts.append("")
 
-            framework_section = self.organize_imports_by_category(existing_framework_imports)
+            framework_section = self.organize_imports_by_category(
+                existing_framework_imports
+            )
             if framework_section:
                 new_content_parts.append(framework_section)
 
             # Write updated content
             new_content = "\n".join(new_content_parts)
-            with open(self.claude_md_path, 'w', encoding='utf-8') as f:
+            with open(self.claude_md_path, "w", encoding="utf-8") as f:
                 f.write(new_content)
 
             self.logger.info(f"Removed {len(files)} imports from CLAUDE.md")
             return True
 
         except Exception as e:
             self.logger.error(f"Failed to remove imports from CLAUDE.md: {e}")
             return False
@@ -10,16 +10,18 @@ from pathlib import Path
 try:
     import jsonschema
     from jsonschema import validate, ValidationError
+
     JSONSCHEMA_AVAILABLE = True
 except ImportError:
     JSONSCHEMA_AVAILABLE = False
 
     class ValidationError(Exception):
         """Simple validation error for when jsonschema is not available"""
+
         def __init__(self, message):
             self.message = message
             super().__init__(message)
 
     def validate(instance, schema):
         """Dummy validation function"""
         # Basic type checking only
@@ -32,17 +34,19 @@ except ImportError:
         elif expected_type == "string" and not isinstance(instance, str):
             raise ValidationError(f"Expected string, got {type(instance).__name__}")
         elif expected_type == "integer" and not isinstance(instance, int):
-            raise ValidationError(f"Expected integer, got {type(instance).__name__}")
+            raise ValidationError(
+                f"Expected integer, got {type(instance).__name__}"
+            )
         # Skip detailed validation if jsonschema not available
 
 
 class ConfigService:
     """Manages configuration files and validation"""
 
     def __init__(self, config_dir: Path):
         """
         Initialize config manager
 
         Args:
             config_dir: Directory containing configuration files
         """
@@ -51,7 +55,7 @@ class ConfigService:
         self.requirements_file = config_dir / "requirements.json"
         self._features_cache = None
         self._requirements_cache = None
 
         # Schema for features.json
         self.features_schema = {
             "type": "object",
@@ -68,24 +72,24 @@ class ConfigService:
                             "category": {"type": "string"},
                             "dependencies": {
                                 "type": "array",
-                                "items": {"type": "string"}
+                                "items": {"type": "string"},
                             },
                             "enabled": {"type": "boolean"},
                             "required_tools": {
                                 "type": "array",
-                                "items": {"type": "string"}
-                            }
+                                "items": {"type": "string"},
+                            },
                         },
                         "required": ["name", "version", "description", "category"],
-                        "additionalProperties": False
+                        "additionalProperties": False,
                     }
-                }
-            }
+                },
+            },
             "required": ["components"],
-            "additionalProperties": False
+            "additionalProperties": False,
         }
 
         # Schema for requirements.json
         self.requirements_schema = {
             "type": "object",
@@ -94,21 +98,18 @@ class ConfigService:
                     "type": "object",
                     "properties": {
                         "min_version": {"type": "string"},
-                        "max_version": {"type": "string"}
+                        "max_version": {"type": "string"},
                     },
-                    "required": ["min_version"]
+                    "required": ["min_version"],
                 },
                 "node": {
                     "type": "object",
                     "properties": {
                         "min_version": {"type": "string"},
                         "max_version": {"type": "string"},
-                        "required_for": {
-                            "type": "array",
-                            "items": {"type": "string"}
-                        }
+                        "required_for": {"type": "array", "items": {"type": "string"}},
                     },
-                    "required": ["min_version"]
+                    "required": ["min_version"],
                 },
                 "disk_space_mb": {"type": "integer"},
                 "external_tools": {
@@ -121,14 +122,14 @@ class ConfigService:
                             "min_version": {"type": "string"},
                             "required_for": {
                                 "type": "array",
-                                "items": {"type": "string"}
+                                "items": {"type": "string"},
                             },
-                            "optional": {"type": "boolean"}
+                            "optional": {"type": "boolean"},
                         },
                         "required": ["command"],
-                        "additionalProperties": False
+                        "additionalProperties": False,
-                    }
-                }
+                    },
+                },
                 "installation_commands": {
                     "type": "object",
@@ -140,136 +141,138 @@ class ConfigService:
|
||||
"darwin": {"type": "string"},
|
||||
"win32": {"type": "string"},
|
||||
"all": {"type": "string"},
|
||||
"description": {"type": "string"}
|
||||
"description": {"type": "string"},
|
||||
},
|
||||
"additionalProperties": False
|
||||
"additionalProperties": False,
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
},
|
||||
},
|
||||
"required": ["python", "disk_space_mb"],
|
||||
"additionalProperties": False
|
||||
"additionalProperties": False,
|
||||
}
|
||||
|
||||
|
||||
def load_features(self) -> Dict[str, Any]:
|
||||
"""
|
||||
Load and validate features configuration
|
||||
|
||||
|
||||
Returns:
|
||||
Features configuration dict
|
||||
|
||||
|
||||
Raises:
|
||||
FileNotFoundError: If features.json not found
|
||||
ValidationError: If features.json is invalid
|
||||
"""
|
||||
if self._features_cache is not None:
|
||||
return self._features_cache
|
||||
|
||||
|
||||
if not self.features_file.exists():
|
||||
raise FileNotFoundError(f"Features config not found: {self.features_file}")
|
||||
|
||||
|
||||
try:
|
||||
with open(self.features_file, 'r') as f:
|
||||
with open(self.features_file, "r") as f:
|
||||
features = json.load(f)
|
||||
|
||||
|
||||
# Validate schema
|
||||
validate(instance=features, schema=self.features_schema)
|
||||
|
||||
|
||||
self._features_cache = features
|
||||
return features
|
||||
|
||||
|
||||
except json.JSONDecodeError as e:
|
||||
raise ValidationError(f"Invalid JSON in {self.features_file}: {e}")
|
||||
except ValidationError as e:
|
||||
raise ValidationError(f"Invalid features schema: {str(e)}")
|
||||
|
||||
|
||||
def load_requirements(self) -> Dict[str, Any]:
|
||||
"""
|
||||
Load and validate requirements configuration
|
||||
|
||||
|
||||
Returns:
|
||||
Requirements configuration dict
|
||||
|
||||
|
||||
Raises:
|
||||
FileNotFoundError: If requirements.json not found
|
||||
ValidationError: If requirements.json is invalid
|
||||
"""
|
||||
if self._requirements_cache is not None:
|
||||
return self._requirements_cache
|
||||
|
||||
|
||||
if not self.requirements_file.exists():
|
||||
raise FileNotFoundError(f"Requirements config not found: {self.requirements_file}")
|
||||
|
||||
raise FileNotFoundError(
|
||||
f"Requirements config not found: {self.requirements_file}"
|
||||
)
|
||||
|
||||
try:
|
||||
with open(self.requirements_file, 'r') as f:
|
||||
with open(self.requirements_file, "r") as f:
|
||||
requirements = json.load(f)
|
||||
|
||||
|
||||
# Validate schema
|
||||
validate(instance=requirements, schema=self.requirements_schema)
|
||||
|
||||
|
||||
self._requirements_cache = requirements
|
||||
return requirements
|
||||
|
||||
|
||||
except json.JSONDecodeError as e:
|
||||
raise ValidationError(f"Invalid JSON in {self.requirements_file}: {e}")
|
||||
except ValidationError as e:
|
||||
raise ValidationError(f"Invalid requirements schema: {str(e)}")

    def get_component_info(self, component_name: str) -> Optional[Dict[str, Any]]:
        """
        Get information about a specific component

        Args:
            component_name: Name of component

        Returns:
            Component info dict or None if not found
        """
        features = self.load_features()
        return features.get("components", {}).get(component_name)

    def get_enabled_components(self) -> List[str]:
        """
        Get list of enabled component names

        Returns:
            List of enabled component names
        """
        features = self.load_features()
        enabled = []

        for name, info in features.get("components", {}).items():
            if info.get("enabled", True):  # Default to enabled
                enabled.append(name)

        return enabled

    def get_components_by_category(self, category: str) -> List[str]:
        """
        Get component names by category

        Args:
            category: Component category

        Returns:
            List of component names in category
        """
        features = self.load_features()
        components = []

        for name, info in features.get("components", {}).items():
            if info.get("category") == category:
                components.append(name)

        return components

    def get_component_dependencies(self, component_name: str) -> List[str]:
        """
        Get dependencies for a component

        Args:
            component_name: Name of component

        Returns:
            List of dependency component names
        """
        component_info = self.get_component_info(component_name)
        if component_info:
            return component_info.get("dependencies", [])
        return []

    def get_system_requirements(self) -> Dict[str, Any]:
        """
        Get system requirements

        Returns:
            System requirements dict
        """
        return self.load_requirements()
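The two accessors above are plain filters over the `components` mapping; the standalone sketch below shows the same enabled-by-default and category-match logic against a hand-rolled `features` dict (the component names are made up for the demo).

```python
features = {
    "components": {
        "core": {"category": "essential", "enabled": True},
        "mcp": {"category": "integration", "enabled": False},
        "agents": {"category": "essential"},  # no flag -> treated as enabled
    }
}

def enabled_components(features):
    # info.get("enabled", True) makes "enabled" opt-out, not opt-in
    return [
        name
        for name, info in features.get("components", {}).items()
        if info.get("enabled", True)
    ]

def components_by_category(features, category):
    return [
        name
        for name, info in features.get("components", {}).items()
        if info.get("category") == category
    ]

print(enabled_components(features))                    # ['core', 'agents']
print(components_by_category(features, "essential"))   # ['core', 'agents']
```

Dict insertion order is preserved in Python 3.7+, so the returned lists follow the order of the config file.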

    def get_requirements_for_components(
        self, component_names: List[str]
    ) -> Dict[str, Any]:
        """
        Get consolidated requirements for specific components

        Args:
            component_names: List of component names

        Returns:
            Consolidated requirements dict
        """
        requirements = self.load_requirements()
        features = self.load_features()

        # Start with base requirements
        result = {
            "python": requirements["python"],
            "disk_space_mb": requirements["disk_space_mb"],
            "external_tools": {},
        }

        # Add Node.js requirements if needed
        node_required = False
        for component_name in component_names:
            component_info = features.get("components", {}).get(component_name, {})
            required_tools = component_info.get("required_tools", [])

            if "node" in required_tools:
                node_required = True
                break

        if node_required and "node" in requirements:
            result["node"] = requirements["node"]

        # Add external tool requirements
        for component_name in component_names:
            component_info = features.get("components", {}).get(component_name, {})
            required_tools = component_info.get("required_tools", [])

            for tool in required_tools:
                if tool in requirements.get("external_tools", {}):
                    result["external_tools"][tool] = requirements["external_tools"][
                        tool
                    ]

        return result
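Consolidation boils down to a union of each selected component's `required_tools`, projected against the global requirements table. A compact sketch of that logic, with invented tool names and version strings for illustration:

```python
requirements = {
    "python": ">=3.8",
    "disk_space_mb": 50,
    "node": ">=18",
    "external_tools": {"git": ">=2.30", "ripgrep": ">=13"},
}
features = {
    "components": {
        "core": {"required_tools": ["git"]},
        "mcp": {"required_tools": ["node", "ripgrep"]},
    }
}

def requirements_for(component_names):
    result = {
        "python": requirements["python"],
        "disk_space_mb": requirements["disk_space_mb"],
        "external_tools": {},
    }
    # Union of every selected component's tool list
    tools = set()
    for name in component_names:
        tools.update(features["components"].get(name, {}).get("required_tools", []))
    if "node" in tools and "node" in requirements:
        result["node"] = requirements["node"]
    for tool in tools:
        if tool in requirements["external_tools"]:
            result["external_tools"][tool] = requirements["external_tools"][tool]
    return result

consolidated = requirements_for(["core", "mcp"])
```

The set-union variant avoids the double pass over `component_names` in the method above, at the cost of losing the original tool ordering.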

    def validate_config_files(self) -> List[str]:
        """
        Validate all configuration files

        Returns:
            List of validation errors (empty if all valid)
        """
        errors = []

        try:
            self.load_features()
        except Exception as e:
            errors.append(f"Features config error: {e}")

        try:
            self.load_requirements()
        except Exception as e:
            errors.append(f"Requirements config error: {e}")

        return errors

    def clear_cache(self) -> None:
        """Clear cached configuration data"""
        self._features_cache = None
        self._requirements_cache = None


class FileService:
    """Cross-platform file operations manager"""

    def __init__(self, dry_run: bool = False):
        """
        Initialize file manager

        Args:
            dry_run: If True, only simulate file operations
        """
        self.dry_run = dry_run
        self.copied_files: List[Path] = []
        self.created_dirs: List[Path] = []

    def copy_file(
        self, source: Path, target: Path, preserve_permissions: bool = True
    ) -> bool:
        """
        Copy single file with permission preservation

        Args:
            source: Source file path
            target: Target file path
            preserve_permissions: Whether to preserve file permissions

        Returns:
            True if successful, False otherwise
        """
        if not source.exists():
            raise FileNotFoundError(f"Source file not found: {source}")

        if not source.is_file():
            raise ValueError(f"Source is not a file: {source}")

        if self.dry_run:
            print(f"[DRY RUN] Would copy {source} -> {target}")
            return True

        try:
            # Ensure target directory exists
            target.parent.mkdir(parents=True, exist_ok=True)

            # Copy file
            if preserve_permissions:
                shutil.copy2(source, target)
            else:
                shutil.copy(source, target)

            self.copied_files.append(target)
            return True

        except Exception as e:
            print(f"Error copying {source} to {target}: {e}")
            return False
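The dry-run guard used throughout `FileService` is simply an early return before any filesystem mutation. A minimal standalone sketch (the free function below is an illustration, not the class method):

```python
import shutil
import tempfile
from pathlib import Path

def copy_file(source: Path, target: Path, dry_run: bool = False) -> bool:
    """Copy a file, or only report what would happen when dry_run is set."""
    if dry_run:
        print(f"[DRY RUN] Would copy {source} -> {target}")
        return True  # report success without touching the filesystem
    target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, target)  # copy2 preserves mtime and permission bits
    return True

tmp = Path(tempfile.mkdtemp())
src = tmp / "a.txt"
src.write_text("hello")
dst = tmp / "out" / "a.txt"

dry_ok = copy_file(src, dst, dry_run=True)  # prints only; dst is not created
existed_after_dry = dst.exists()
copy_file(src, dst)                          # actually copies
```

Returning `True` on the dry-run path keeps callers' success-checking logic identical in both modes, which is what makes `--dry-run` installs cheap to support.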

    def copy_directory(
        self, source: Path, target: Path, ignore_patterns: Optional[List[str]] = None
    ) -> bool:
        """
        Recursively copy directory with gitignore-style patterns

        Args:
            source: Source directory path
            target: Target directory path
            ignore_patterns: List of patterns to ignore (gitignore style)

        Returns:
            True if successful, False otherwise
        """
        if not source.exists():
            raise FileNotFoundError(f"Source directory not found: {source}")

        if not source.is_dir():
            raise ValueError(f"Source is not a directory: {source}")

        ignore_patterns = ignore_patterns or []
        default_ignores = [".git", ".gitignore", "__pycache__", "*.pyc", ".DS_Store"]
        all_ignores = ignore_patterns + default_ignores

        if self.dry_run:
            print(f"[DRY RUN] Would copy directory {source} -> {target}")
            return True

        try:
            # Create ignore function
            def ignore_func(directory: str, contents: List[str]) -> List[str]:
                ignored = []
                for item in contents:
                    item_path = Path(directory) / item
                    rel_path = item_path.relative_to(source)

                    # Check against ignore patterns
                    for pattern in all_ignores:
                        if fnmatch.fnmatch(item, pattern) or fnmatch.fnmatch(
                            str(rel_path), pattern
                        ):
                            ignored.append(item)
                            break

                return ignored

            # Copy tree
            shutil.copytree(source, target, ignore=ignore_func, dirs_exist_ok=True)

            # Track created directories and files
            for item in target.rglob("*"):
                if item.is_dir():
                    self.created_dirs.append(item)
                else:
                    self.copied_files.append(item)

            return True

        except Exception as e:
            print(f"Error copying directory {source} to {target}: {e}")
            return False
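`shutil.copytree` calls its `ignore` callable once per directory with that directory's entries and skips whatever names it returns; the sketch below exercises that contract with `fnmatch` patterns (the file names are fabricated for the demo).

```python
import fnmatch
import shutil
import tempfile
from pathlib import Path

src = Path(tempfile.mkdtemp())
(src / "__pycache__").mkdir()
(src / "__pycache__" / "mod.cpython-311.pyc").write_text("")
(src / "main.py").write_text("print('hi')")
(src / "notes.pyc").write_text("")

IGNORES = [".git", "__pycache__", "*.pyc"]

def ignore_func(directory, contents):
    # Called by copytree for each directory; return the names to skip
    return [
        item
        for item in contents
        if any(fnmatch.fnmatch(item, pat) for pat in IGNORES)
    ]

dst = Path(tempfile.mkdtemp()) / "copy"
shutil.copytree(src, dst, ignore=ignore_func, dirs_exist_ok=True)
copied = sorted(p.name for p in dst.rglob("*"))
```

Because `__pycache__` is skipped as a whole directory, `copytree` never recurses into it, so its `.pyc` contents do not need their own match.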

    def ensure_directory(self, directory: Path, mode: int = 0o755) -> bool:
        """
        Create directory and parents if they don't exist

        Args:
            directory: Directory path to create
            mode: Directory permissions (Unix only)

        Returns:
            True if successful, False otherwise
        """
        if self.dry_run:
            print(f"[DRY RUN] Would create directory {directory}")
            return True

        try:
            directory.mkdir(parents=True, exist_ok=True, mode=mode)

            if directory not in self.created_dirs:
                self.created_dirs.append(directory)

            return True

        except Exception as e:
            print(f"Error creating directory {directory}: {e}")
            return False

    def remove_file(self, file_path: Path) -> bool:
        """
        Remove single file

        Args:
            file_path: Path to file to remove

        Returns:
            True if successful, False otherwise
        """
        if not file_path.exists():
            return True  # Already gone

        if self.dry_run:
            print(f"[DRY RUN] Would remove file {file_path}")
            return True

        try:
            if file_path.is_file():
                file_path.unlink()
            else:
                print(f"Warning: {file_path} is not a file, skipping")
                return False

            # Remove from tracking
            if file_path in self.copied_files:
                self.copied_files.remove(file_path)

            return True

        except Exception as e:
            print(f"Error removing file {file_path}: {e}")
            return False

    def remove_directory(self, directory: Path, recursive: bool = False) -> bool:
        """
        Remove directory

        Args:
            directory: Directory path to remove
            recursive: Whether to remove recursively

        Returns:
            True if successful, False otherwise
        """
        if not directory.exists():
            return True  # Already gone

        if self.dry_run:
            action = "recursively remove" if recursive else "remove"
            print(f"[DRY RUN] Would {action} directory {directory}")
            return True

        try:
            if recursive:
                shutil.rmtree(directory)
            else:
                directory.rmdir()  # Only works if empty

            # Remove from tracking
            if directory in self.created_dirs:
                self.created_dirs.remove(directory)

            return True

        except Exception as e:
            print(f"Error removing directory {directory}: {e}")
            return False

    def resolve_home_path(self, path: str) -> Path:
        """
        Convert path with ~ to actual home path on any OS

        Args:
            path: Path string potentially containing ~

        Returns:
            Resolved Path object
        """
        return Path(path).expanduser().resolve()

    def make_executable(self, file_path: Path) -> bool:
        """
        Make file executable (Unix/Linux/macOS)

        Args:
            file_path: Path to file to make executable

        Returns:
            True if successful, False otherwise
        """
        if not file_path.exists():
            return False

        if self.dry_run:
            print(f"[DRY RUN] Would make {file_path} executable")
            return True

        try:
            # Get current permissions
            current_mode = file_path.stat().st_mode

            # Add execute permissions for owner, group, and others
            new_mode = current_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH

            file_path.chmod(new_mode)
            return True

        except Exception as e:
            print(f"Error making {file_path} executable: {e}")
            return False
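OR-ing the execute bits onto the existing mode, as `make_executable` does, adds `x` without disturbing read/write permissions; a minimal sketch (POSIX-only — the bits are ignored on Windows, and the script path is invented for the demo):

```python
import stat
import tempfile
from pathlib import Path

script = Path(tempfile.mkdtemp()) / "run.sh"
script.write_text("#!/bin/sh\necho ok\n")

# OR-ing the three execute bits leaves existing read/write bits untouched
mode = script.stat().st_mode
script.chmod(mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

is_executable = bool(script.stat().st_mode & stat.S_IXUSR)
```

This is the chmod equivalent of `chmod a+x` rather than `chmod 755`, which is why an unusual umask on the original file survives the operation.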

    def get_file_hash(
        self, file_path: Path, algorithm: str = "sha256"
    ) -> Optional[str]:
        """
        Calculate file hash

        Args:
            file_path: Path to file
            algorithm: Hash algorithm (md5, sha1, sha256, etc.)

        Returns:
            Hex hash string or None if error
        """
        if not file_path.exists() or not file_path.is_file():
            return None

        try:
            hasher = hashlib.new(algorithm)
            with open(file_path, "rb") as f:
                # Read in chunks for large files
                for chunk in iter(lambda: f.read(8192), b""):
                    hasher.update(chunk)

            return hasher.hexdigest()

        except Exception:
            return None

    def verify_file_integrity(
        self, file_path: Path, expected_hash: str, algorithm: str = "sha256"
    ) -> bool:
        """
        Verify file integrity using hash

        Args:
            file_path: Path to file to verify
            expected_hash: Expected hash value
            algorithm: Hash algorithm used

        Returns:
            True if file matches expected hash, False otherwise
        """
        actual_hash = self.get_file_hash(file_path, algorithm)
        return actual_hash is not None and actual_hash.lower() == expected_hash.lower()
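The `iter(lambda: f.read(8192), b"")` idiom above streams a file through the hasher 8 KiB at a time, so memory stays constant regardless of file size. A self-contained version of the same loop:

```python
import hashlib
import tempfile
from pathlib import Path

def file_hash(path: Path, algorithm: str = "sha256") -> str:
    hasher = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # iter(callable, sentinel) keeps calling read() until it returns b""
        for chunk in iter(lambda: f.read(8192), b""):
            hasher.update(chunk)
    return hasher.hexdigest()

path = Path(tempfile.mkdtemp()) / "data.bin"
path.write_bytes(b"abc")
digest = file_hash(path)
```

Feeding the bytes incrementally produces exactly the same digest as hashing them in one call, which is what makes chunked reading safe for integrity checks.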

    def get_directory_size(self, directory: Path) -> int:
        """
        Calculate total size of directory in bytes

        Args:
            directory: Directory path

        Returns:
            Total size in bytes
        """
        if not directory.exists() or not directory.is_dir():
            return 0

        total_size = 0
        try:
            for file_path in directory.rglob("*"):
                if file_path.is_file():
                    total_size += file_path.stat().st_size
        except Exception:
            pass  # Skip files we can't access

        return total_size

    def find_files(
        self, directory: Path, pattern: str = "*", recursive: bool = True
    ) -> List[Path]:
        """
        Find files matching pattern

        Args:
            directory: Directory to search
            pattern: Glob pattern to match
            recursive: Whether to search recursively

        Returns:
            List of matching file paths
        """
        if not directory.exists() or not directory.is_dir():
            return []

        try:
            if recursive:
                return list(directory.rglob(pattern))
            else:
                return list(directory.glob(pattern))
        except Exception:
            return []

    def backup_file(
        self, file_path: Path, backup_suffix: str = ".backup"
    ) -> Optional[Path]:
        """
        Create backup copy of file

        Args:
            file_path: Path to file to backup
            backup_suffix: Suffix to add to backup file

        Returns:
            Path to backup file or None if failed
        """
        if not file_path.exists() or not file_path.is_file():
            return None

        backup_path = file_path.with_suffix(file_path.suffix + backup_suffix)

        if self.copy_file(file_path, backup_path):
            return backup_path
        return None

    def get_free_space(self, path: Path) -> int:
        """
        Get free disk space at path in bytes

        Args:
            path: Path to check (can be file or directory)

        Returns:
            Free space in bytes
        """
        try:
            if path.is_file():
                path = path.parent

            stat_result = shutil.disk_usage(path)
            return stat_result.free
        except Exception:
            return 0

    def cleanup_tracked_files(self) -> None:
        """Remove all files and directories created during this session"""
        if self.dry_run:
            print("[DRY RUN] Would cleanup tracked files")
            return

        # Remove files first
        for file_path in reversed(self.copied_files):
            try:
                file_path.unlink()
            except Exception:
                pass

        # Remove directories (in reverse order of creation)
        for directory in reversed(self.created_dirs):
            try:
                directory.rmdir()
            except Exception:
                pass

        self.copied_files.clear()
        self.created_dirs.clear()

    def get_operation_summary(self) -> Dict[str, Any]:
        """
        Get summary of file operations performed

        Returns:
            Dict with operation statistics
        """
        return {
            "files_copied": len(self.copied_files),
            "directories_created": len(self.created_dirs),
            "dry_run": self.dry_run,
            "copied_files": [str(f) for f in self.copied_files],
            "created_directories": [str(d) for d in self.created_dirs],
        }
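`get_directory_size` is a recursive walk that sums `st_size` over regular files only; directories and symlink targets are skipped. The same loop, runnable on its own (file names and sizes are invented for the demo):

```python
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
(root / "sub").mkdir()
(root / "a.txt").write_bytes(b"12345")        # 5 bytes
(root / "sub" / "b.txt").write_bytes(b"123")  # 3 bytes

def directory_size(directory: Path) -> int:
    total = 0
    for file_path in directory.rglob("*"):  # recursive walk over all entries
        if file_path.is_file():             # skip the directories themselves
            total += file_path.stat().st_size
    return total

size = directory_size(root)
```

Pairing this with `shutil.disk_usage(path).free` is how an installer can decide up front whether a component fits on disk.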


class SettingsService:
    """Manages settings.json file operations"""

    def __init__(self, install_dir: Path):
        """
        Initialize settings manager

        Args:
            install_dir: Installation directory containing settings.json
        """
        self.install_dir = install_dir
        self.settings_file = install_dir / "settings.json"
        self.metadata_file = install_dir / ".superclaude-metadata.json"
        self.backup_dir = install_dir / "backups" / "settings"

    def load_settings(self) -> Dict[str, Any]:
        """
        Load settings from settings.json

        Returns:
            Settings dict (empty if file doesn't exist)
        """
        if not self.settings_file.exists():
            return {}

        try:
            with open(self.settings_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (json.JSONDecodeError, IOError) as e:
            raise ValueError(f"Could not load settings from {self.settings_file}: {e}")

    def save_settings(
        self, settings: Dict[str, Any], create_backup: bool = True
    ) -> None:
        """
        Save settings to settings.json with optional backup

        Args:
            settings: Settings dict to save
            create_backup: Whether to create backup before saving
        """
        # Create backup if requested and file exists
        if create_backup and self.settings_file.exists():
            self._create_settings_backup()

        # Ensure directory exists
        self.settings_file.parent.mkdir(parents=True, exist_ok=True)

        # Save with pretty formatting
        try:
            with open(self.settings_file, "w", encoding="utf-8") as f:
                json.dump(settings, f, indent=2, ensure_ascii=False, sort_keys=True)
        except IOError as e:
            raise ValueError(f"Could not save settings to {self.settings_file}: {e}")

    def load_metadata(self) -> Dict[str, Any]:
        """
        Load SuperClaude metadata from .superclaude-metadata.json

        Returns:
            Metadata dict (empty if file doesn't exist)
        """
        if not self.metadata_file.exists():
            return {}

        try:
            with open(self.metadata_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (json.JSONDecodeError, IOError) as e:
            raise ValueError(f"Could not load metadata from {self.metadata_file}: {e}")

    def save_metadata(self, metadata: Dict[str, Any]) -> None:
        """
        Save SuperClaude metadata to .superclaude-metadata.json

        Args:
            metadata: Metadata dict to save
        """
        # Ensure directory exists
        self.metadata_file.parent.mkdir(parents=True, exist_ok=True)

        # Save with pretty formatting
        try:
            with open(self.metadata_file, "w", encoding="utf-8") as f:
                json.dump(metadata, f, indent=2, ensure_ascii=False, sort_keys=True)
        except IOError as e:
            raise ValueError(f"Could not save metadata to {self.metadata_file}: {e}")

    def migrate_superclaude_data(self) -> bool:
        """
        Migrate SuperClaude-specific data from settings.json to metadata file

        Returns:
            True if migration occurred, False if no data to migrate
        """
        settings = self.load_settings()

        # SuperClaude-specific fields to migrate
        superclaude_fields = ["components", "framework", "superclaude", "mcp"]
        data_to_migrate = {}
        fields_found = False

        # Extract SuperClaude data
        for field in superclaude_fields:
            if field in settings:
                data_to_migrate[field] = settings[field]
                fields_found = True

        if not fields_found:
            return False

        # Load existing metadata (if any) and merge
        existing_metadata = self.load_metadata()
        merged_metadata = self._deep_merge(existing_metadata, data_to_migrate)

        # Save to metadata file
        self.save_metadata(merged_metadata)

        # Remove SuperClaude fields from settings
        clean_settings = {
            k: v for k, v in settings.items() if k not in superclaude_fields
        }

        # Save cleaned settings
        self.save_settings(clean_settings, create_backup=True)

        return True

    def merge_settings(self, modifications: Dict[str, Any]) -> Dict[str, Any]:
        """
        Deep merge modifications into existing settings

        Args:
            modifications: Settings modifications to merge

        Returns:
            Merged settings dict
        """
        existing = self.load_settings()
        return self._deep_merge(existing, modifications)

    def update_settings(
        self, modifications: Dict[str, Any], create_backup: bool = True
    ) -> None:
        """
        Update settings with modifications

        Args:
            modifications: Settings modifications to apply
            create_backup: Whether to create backup before updating
        """
        merged = self.merge_settings(modifications)
        self.save_settings(merged, create_backup)

    def get_setting(self, key_path: str, default: Any = None) -> Any:
        """
        Get setting value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "hooks.enabled")
            default: Default value if key not found

        Returns:
            Setting value or default
        """
        settings = self.load_settings()

        try:
            value = settings
            for key in key_path.split("."):
                value = value[key]
            return value
        except (KeyError, TypeError):
            return default
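Dot-notation access is just a fold over `key_path.split(".")`. The sketch below pairs the getter above with a simpler setter that uses `setdefault` to create intermediate dicts in place, rather than the service's build-a-modification-then-deep-merge route; the key names are illustrative.

```python
def get_setting(settings, key_path, default=None):
    value = settings
    try:
        for key in key_path.split("."):
            value = value[key]  # TypeError if a leaf is hit early
        return value
    except (KeyError, TypeError):
        return default

def set_setting(settings, key_path, new_value):
    keys = key_path.split(".")
    current = settings
    for key in keys[:-1]:
        current = current.setdefault(key, {})  # create missing levels
    current[keys[-1]] = new_value

settings = {"hooks": {"enabled": True}}
set_setting(settings, "hooks.timeout.seconds", 30)
```

Catching `TypeError` as well as `KeyError` matters: it covers paths that descend "through" a scalar, like `"hooks.enabled.extra"` when `enabled` is a bool.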

    def set_setting(
        self, key_path: str, value: Any, create_backup: bool = True
    ) -> None:
        """
        Set setting value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "hooks.enabled")
            value: Value to set
            create_backup: Whether to create backup before updating
        """
        # Build nested dict structure
        keys = key_path.split(".")
        modification = {}
        current = modification

        for key in keys[:-1]:
            current[key] = {}
            current = current[key]

        current[keys[-1]] = value

        self.update_settings(modification, create_backup)

    def remove_setting(self, key_path: str, create_backup: bool = True) -> bool:
        """
        Remove setting using dot-notation path

        Args:
            key_path: Dot-separated path to remove
            create_backup: Whether to create backup before updating

        Returns:
            True if setting was removed, False if not found
        """
        settings = self.load_settings()
        keys = key_path.split(".")

        # Navigate to parent of target key
        current = settings
        try:
            for key in keys[:-1]:
                current = current[key]

            # Remove the target key
            if keys[-1] in current:
                del current[keys[-1]]
                self.save_settings(settings, create_backup)
                return True
            else:
                return False

        except (KeyError, TypeError):
            return False

    def add_component_registration(
        self, component_name: str, component_info: Dict[str, Any]
    ) -> None:
        """
        Add component to registry in metadata

        Args:
            component_name: Name of component
            component_info: Component metadata dict
        """
        metadata = self.load_metadata()
        if "components" not in metadata:
            metadata["components"] = {}

        metadata["components"][component_name] = {
            **component_info,
            "installed_at": datetime.now().isoformat(),
        }

        self.save_metadata(metadata)

    def remove_component_registration(self, component_name: str) -> bool:
        """
        Remove component from registry in metadata

        Args:
            component_name: Name of component to remove

        Returns:
            True if component was removed, False if not found
        """
        metadata = self.load_metadata()
        if "components" in metadata and component_name in metadata["components"]:
            del metadata["components"][component_name]
            self.save_metadata(metadata)
            return True
        return False

    def get_installed_components(self) -> Dict[str, Dict[str, Any]]:
        """
        Get all installed components from registry

        Returns:
            Dict of component_name -> component_info
        """
        metadata = self.load_metadata()
        return metadata.get("components", {})

    def is_component_installed(self, component_name: str) -> bool:
        """
        Check if component is registered as installed

        Args:
            component_name: Name of component to check

        Returns:
            True if component is installed, False otherwise
        """
        components = self.get_installed_components()
        return component_name in components

    def get_component_version(self, component_name: str) -> Optional[str]:
        """
        Get installed version of component

        Args:
            component_name: Name of component

        Returns:
            Version string or None if not installed
        """
        components = self.get_installed_components()
        component_info = components.get(component_name, {})
        return component_info.get("version")

    def update_framework_version(self, version: str) -> None:
        """
        Update SuperClaude framework version in metadata

        Args:
            version: Framework version string
        """
        metadata = self.load_metadata()
        if "framework" not in metadata:
            metadata["framework"] = {}

        metadata["framework"]["version"] = version
        metadata["framework"]["updated_at"] = datetime.now().isoformat()

        self.save_metadata(metadata)

    def check_installation_exists(self) -> bool:
        """
        Check whether a SuperClaude installation exists

        Returns:
            True if settings.json exists, False otherwise
        """
        return self.settings_file.exists()

    def get_metadata_setting(self, key_path: str, default: Any = None) -> Any:
        """
        Get metadata value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "framework.version")
            default: Default value if key not found

        Returns:
            Metadata value or default
        """
        metadata = self.load_metadata()

        try:
            value = metadata
            for key in key_path.split("."):
                value = value[key]
            return value
        except (KeyError, TypeError):
            return default

    def _deep_merge(
        self, base: Dict[str, Any], overlay: Dict[str, Any]
    ) -> Dict[str, Any]:
        """
        Deep merge two dictionaries

        Args:
            base: Base dictionary
            overlay: Dictionary to merge on top

        Returns:
            Merged dictionary
        """
        result = copy.deepcopy(base)

        for key, value in overlay.items():
            if (
                key in result
                and isinstance(result[key], dict)
                and isinstance(value, dict)
            ):
                result[key] = self._deep_merge(result[key], value)
            else:
                result[key] = copy.deepcopy(value)

        return result
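`_deep_merge` recurses only when both sides hold a dict at the same key; anything else is replaced wholesale by the overlay. A free-function version with example data (the `framework`/`channel` keys are invented for the demo):

```python
import copy

def deep_merge(base, overlay):
    """Recursively merge overlay into a deep copy of base."""
    result = copy.deepcopy(base)
    for key, value in overlay.items():
        if key in result and isinstance(result[key], dict) and isinstance(value, dict):
            result[key] = deep_merge(result[key], value)  # recurse into sub-dicts
        else:
            result[key] = copy.deepcopy(value)  # overlay wins for scalars/lists
    return result

base = {"framework": {"version": "3.0", "channel": "stable"}, "hooks": True}
overlay = {"framework": {"version": "4.0"}}
merged = deep_merge(base, overlay)
```

The deep copies mean neither input is mutated, which is why `update_settings` can safely merge user modifications into freshly loaded settings and write the result back.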
|
||||
|
||||
|
||||
     def _create_settings_backup(self) -> Path:
         """
         Create timestamped backup of settings.json

         Returns:
             Path to backup file
         """
         if not self.settings_file.exists():
             raise ValueError("Cannot backup non-existent settings file")

         # Create backup directory
         self.backup_dir.mkdir(parents=True, exist_ok=True)

         # Create timestamped backup
         timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
         backup_file = self.backup_dir / f"settings_{timestamp}.json"

         shutil.copy2(self.settings_file, backup_file)

         # Keep only last 10 backups
         self._cleanup_old_backups()

         return backup_file
     def _cleanup_old_backups(self, keep_count: int = 10) -> None:
         """
         Remove old backup files, keeping only the most recent

         Args:
             keep_count: Number of backups to keep
         """
         if not self.backup_dir.exists():
             return

         # Get all backup files sorted by modification time
         backup_files = []
         for file in self.backup_dir.glob("settings_*.json"):
             backup_files.append((file.stat().st_mtime, file))

         backup_files.sort(reverse=True)  # Most recent first

         # Remove old backups
         for _, file in backup_files[keep_count:]:
             try:
                 file.unlink()
             except OSError:
                 pass  # Ignore errors when cleaning up
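The backup-rotation policy above (sort `settings_*.json` by mtime, keep the newest N, best-effort delete the rest) can be tried in a throwaway directory. This standalone sketch mirrors that logic; the file names and `keep_count=3` are illustrative:

```python
import tempfile
from pathlib import Path


def cleanup_old_backups(backup_dir: Path, keep_count: int = 10) -> None:
    """Delete all but the keep_count most recently modified settings_*.json files."""
    if not backup_dir.exists():
        return
    backup_files = sorted(
        ((f.stat().st_mtime, f) for f in backup_dir.glob("settings_*.json")),
        reverse=True,  # most recent first
    )
    for _, f in backup_files[keep_count:]:
        try:
            f.unlink()
        except OSError:
            pass  # best effort, as in the service


with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    for i in range(5):
        (d / f"settings_2025010{i}_000000.json").write_text("{}")
    cleanup_old_backups(d, keep_count=3)
    remaining = len(list(d.glob("settings_*.json")))
    print(remaining)  # → 3
```

Swallowing `OSError` on `unlink` matches the service's intent: a failed cleanup should never break the backup that was just created.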
     def list_backups(self) -> List[Dict[str, Any]]:
         """
         List available settings backups

         Returns:
             List of backup info dicts with name, path, and timestamp
         """
         if not self.backup_dir.exists():
             return []

         backups = []
         for file in self.backup_dir.glob("settings_*.json"):
             try:
                 stat = file.stat()
-                backups.append({
-                    "name": file.name,
-                    "path": str(file),
-                    "size": stat.st_size,
-                    "created": datetime.fromtimestamp(stat.st_ctime).isoformat(),
-                    "modified": datetime.fromtimestamp(stat.st_mtime).isoformat()
-                })
+                backups.append(
+                    {
+                        "name": file.name,
+                        "path": str(file),
+                        "size": stat.st_size,
+                        "created": datetime.fromtimestamp(stat.st_ctime).isoformat(),
+                        "modified": datetime.fromtimestamp(stat.st_mtime).isoformat(),
+                    }
+                )
             except OSError:
                 continue

         # Sort by creation time, most recent first
         backups.sort(key=lambda x: x["created"], reverse=True)
         return backups
     def restore_backup(self, backup_name: str) -> bool:
         """
         Restore settings from backup

         Args:
             backup_name: Name of backup file to restore

         Returns:
             True if successful, False otherwise
         """
         backup_file = self.backup_dir / backup_name

         if not backup_file.exists():
             return False

         try:
             # Validate backup file first
-            with open(backup_file, 'r', encoding='utf-8') as f:
+            with open(backup_file, "r", encoding="utf-8") as f:
                 json.load(f)  # Will raise exception if invalid

             # Create backup of current settings
             if self.settings_file.exists():
                 self._create_settings_backup()

             # Restore backup
             shutil.copy2(backup_file, self.settings_file)
             return True

         except (json.JSONDecodeError, IOError):
             return False
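The validate-then-copy pattern in `restore_backup` can be sketched in isolation. This simplified version omits the "back up current settings first" step the service performs; the function signature and file names are illustrative:

```python
import json
import shutil
import tempfile
from pathlib import Path


def restore_backup(backup_file: Path, settings_file: Path) -> bool:
    """Validate a backup's JSON, then copy it over the live settings file."""
    if not backup_file.exists():
        return False
    try:
        with open(backup_file, "r", encoding="utf-8") as f:
            json.load(f)  # raises if the backup is corrupt
        shutil.copy2(backup_file, settings_file)
        return True
    except (json.JSONDecodeError, IOError):
        return False


with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    good = d / "settings_good.json"
    good.write_text('{"ok": true}')
    bad = d / "settings_bad.json"
    bad.write_text("{not json")
    live = d / "settings.json"
    ok = restore_backup(good, live)       # valid JSON → restored
    corrupt = restore_backup(bad, live)   # invalid JSON → rejected, live untouched
    print(ok, corrupt)  # → True False
```

Parsing the backup before copying is what makes the restore safe: a corrupt file is rejected up front instead of clobbering a working settings.json.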