SuperClaude/setup/services/settings.py
kazuki nakai 050d5ea2ab
refactor: PEP8 compliance - directory rename and code formatting (#425)
* fix(orchestration): add WebFetch auto-trigger for infrastructure configuration

Problem: Infrastructure configuration changes (e.g., Traefik port settings)
were being made based on assumptions without consulting official documentation,
violating the 'Evidence > assumptions' principle in PRINCIPLES.md.

Solution:
- Added Infrastructure Configuration Validation section to MODE_Orchestration.md
- Auto-triggers WebFetch for infrastructure tools (Traefik, nginx, Docker, etc.)
- Enforces MODE_DeepResearch activation for investigation
- BLOCKS assumption-based configuration changes

Testing: Verified WebFetch successfully retrieves Traefik official docs (port 80 default)

This prevents production outages from infrastructure misconfiguration by ensuring
all technical recommendations are backed by official documentation.

* feat: Add PM Agent (Project Manager Agent) for seamless orchestration

Introduces PM Agent as the default orchestration layer that coordinates
all sub-agents and manages workflows automatically.

Key Features:
- Default orchestration: All user interactions handled by PM Agent
- Auto-delegation: Intelligent sub-agent selection based on task analysis
- Docker Gateway integration: Zero-token baseline with dynamic MCP loading
- Self-improvement loop: Automatic documentation of patterns and mistakes
- Optional override: Users can specify sub-agents explicitly if desired

Architecture:
- Agent spec: SuperClaude/Agents/pm-agent.md
- Command: SuperClaude/Commands/pm.md
- Updated docs: README.md (15→16 agents), agents.md (new Orchestration category)

User Experience:
- Default: PM Agent handles everything (seamless, no manual routing)
- Optional: Explicit --agent flag for direct sub-agent access
- Both modes available simultaneously (no user downside)

Implementation Status:
- Specification complete
- Documentation complete
- Prototype implementation needed
- Docker Gateway integration needed
- Testing and validation needed

Refs: kazukinakai/docker-mcp-gateway (IRIS MCP Gateway integration)

* feat: Add Agent Orchestration rules for PM Agent default activation

Implements PM Agent as the default orchestration layer in RULES.md.

Key Changes:
- New 'Agent Orchestration' section (CRITICAL priority)
- PM Agent receives ALL user requests by default
- Manual override with @agent-[name] bypasses PM Agent
- Agent Selection Priority clearly defined:
  1. Manual override → Direct routing
  2. Default → PM Agent → Auto-delegation
  3. Delegation based on keywords, file types, complexity, context

User Experience:
- Default: PM Agent handles everything (seamless)
- Override: @agent-[name] for direct specialist access
- Transparent: PM Agent reports delegation decisions

This establishes PM Agent as the orchestration layer while
respecting existing auto-activation patterns and manual overrides.

Next Steps:
- Local testing in agiletec project
- Iteration based on actual behavior
- Documentation updates as needed

* refactor(pm-agent): redesign as self-improvement meta-layer

Problem Resolution:
PM Agent's initial design competed with existing auto-activation for task routing,
creating confusion about orchestration responsibilities and adding unnecessary complexity.

Design Change:
Redefined PM Agent as a meta-layer agent that operates AFTER specialist agents
complete tasks, focusing on:
- Post-implementation documentation and pattern recording
- Immediate mistake analysis with prevention checklists
- Monthly documentation maintenance and noise reduction
- Pattern extraction and knowledge synthesis

Two-Layer Orchestration System:
1. Task Execution Layer: Existing auto-activation handles task routing (unchanged)
2. Self-Improvement Layer: PM Agent meta-layer handles documentation (new)

Files Modified:
- SuperClaude/Agents/pm-agent.md: Complete rewrite with meta-layer design
  - Category: orchestration → meta
  - Triggers: All user interactions → Post-implementation, mistakes, monthly
  - Behavioral Mindset: Continuous learning system
  - Self-Improvement Workflow: BEFORE/DURING/AFTER/MISTAKE RECOVERY/MAINTENANCE

- SuperClaude/Core/RULES.md: Agent Orchestration section updated
  - Split into Task Execution Layer + Self-Improvement Layer
  - Added orchestration flow diagram
  - Clarified PM Agent activates AFTER task completion

- README.md: Updated PM Agent description
  - "orchestrates all interactions" → "ensures continuous learning"

- Docs/User-Guide/agents.md: PM Agent section rewritten
  - Section: Orchestration Agent → Meta-Layer Agent
  - Expertise: Project orchestration → Self-improvement workflow executor
  - Examples: Task coordination → Post-implementation documentation

- PR_DOCUMENTATION.md: Comprehensive PR documentation added
  - Summary, motivation, changes, testing, breaking changes
  - Two-layer orchestration system diagram
  - Verification checklist

Integration Validated:
Tested with agiletec project's self-improvement-workflow.md:
- PM Agent aligns with existing BEFORE/DURING/AFTER/MISTAKE RECOVERY phases
- Complements (not competes with) existing workflow
- agiletec workflow defines WHAT, PM Agent defines WHO executes it

Breaking Changes: None
- Existing auto-activation continues unchanged
- Specialist agents unaffected
- User workflows remain the same
- New capability: Automatic documentation and knowledge maintenance

Value Proposition:
Transforms SuperClaude into a continuously learning system that accumulates
knowledge, prevents recurring mistakes, and maintains fresh documentation
without manual intervention.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* docs: add Claude Code conversation history management research

Research covering .jsonl file structure, performance impact, and retention policies.

Content:
- Claude Code .jsonl file format and message types
- Performance issues from GitHub (memory leaks, conversation compaction)
- Retention policies (consumer vs enterprise)
- Rotation recommendations based on actual data
- File history snapshot tracking mechanics

Source: Moved from agiletec project (research applicable to all Claude Code projects)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat: add Development documentation structure

Phase 1: Documentation Structure complete

- Add Docs/Development/ directory for development documentation
- Add ARCHITECTURE.md - System architecture with PM Agent meta-layer
- Add ROADMAP.md - 5-phase development plan with checkboxes
- Add TASKS.md - Daily task tracking with progress indicators
- Add PROJECT_STATUS.md - Current status dashboard and metrics
- Add pm-agent-integration.md - Implementation guide for PM Agent mode

This establishes comprehensive documentation foundation for:
- System architecture understanding
- Development planning and tracking
- Implementation guidance
- Progress visibility

Related: #pm-agent-mode #documentation #phase-1

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat: PM Agent session lifecycle and PDCA implementation

Phase 2: PM Agent Mode Integration (Design Phase)

Commands/pm.md updates:
- Add "Always-Active Foundation Layer" concept
- Add Session Lifecycle (Session Start/During Work/Session End)
- Add PDCA Cycle (Plan/Do/Check/Act) automation
- Add Serena MCP Memory Integration (list/read/write_memory)
- Document auto-activation triggers

Agents/pm-agent.md updates:
- Add Session Start Protocol (MANDATORY auto-activation)
- Add During Work PDCA Cycle with example workflows
- Add Session End Protocol with state preservation
- Add PDCA Self-Evaluation Pattern
- Add Documentation Strategy (temp → patterns/mistakes)
- Add Memory Operations Reference

Key Features:
- Session start auto-activation for context restoration
- 30-minute checkpoint saves during work
- Self-evaluation with think_about_* operations
- Systematic documentation lifecycle
- Knowledge evolution to CLAUDE.md

Implementation Status:
- Design complete (Commands/pm.md, Agents/pm-agent.md)
- Implementation pending (Core components)
- Serena MCP integration pending

Salvaged from development mistakenly done in the ~/.claude directory

Related: #pm-agent-mode #session-lifecycle #pdca-cycle #phase-2

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: disable Serena MCP auto-browser launch

Disable web dashboard and GUI log window auto-launch in Serena MCP server
to prevent intrusive browser popups on startup. Users can still manually
access the dashboard at http://localhost:24282/dashboard/ if needed.

Changes:
- Add CLI flags to Serena run command (see the sketch below):
  - --enable-web-dashboard false
  - --enable-gui-log-window false
- Ensures Git-tracked configuration (no reliance on ~/.serena/serena_config.yml)
- Aligns with AIRIS MCP Gateway integration approach
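
A hedged sketch of how these flags might appear in the installer's Serena server entry; the surrounding structure, key names, and exact uvx invocation are assumptions, not verified against setup/components/mcp.py:

```python
# Illustrative only; keys and the uvx invocation are assumed, not confirmed.
"serena": {
    "run_command": [
        "uvx", "--from", "git+https://github.com/oraios/serena",
        "serena", "start-mcp-server",
        "--enable-web-dashboard", "false",   # no auto-opened browser dashboard
        "--enable-gui-log-window", "false",  # no GUI log window on startup
    ],
},
```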

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* refactor: rename directories to lowercase for PEP8 compliance

- Rename superclaude/Agents -> superclaude/agents
- Rename superclaude/Commands -> superclaude/commands
- Rename superclaude/Core -> superclaude/core
- Rename superclaude/Examples -> superclaude/examples
- Rename superclaude/MCP -> superclaude/mcp
- Rename superclaude/Modes -> superclaude/modes

This change follows Python PEP8 naming conventions for package directories.

* style: fix PEP8 violations and update package name to lowercase

Changes:
- Format all Python files with black (43 files reformatted)
- Update package name from 'SuperClaude' to 'superclaude' in pyproject.toml
- Fix import statements to use lowercase package name
- Add missing imports (timedelta, __version__)
- Remove old SuperClaude.egg-info directory

PEP8 violations reduced from 2672 to 701 (mostly E501 line length due to black's 88 char vs flake8's 79 char limit).

* docs: add PM Agent development documentation

Add comprehensive PM Agent development documentation:
- PM Agent ideal workflow (7-phase autonomous cycle)
- Project structure understanding (Git vs installed environment)
- Installation flow understanding (CommandsComponent behavior)
- Task management system (current-tasks.md)

Purpose: Eliminate repeated explanations and enable autonomous PDCA cycles

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat(pm-agent): add self-correcting execution and warning investigation culture

## Changes

### superclaude/commands/pm.md
- Add "Self-Correcting Execution" section with root cause analysis protocol
- Add "Warning/Error Investigation Culture" section enforcing zero-tolerance for dismissal
- Define error detection protocol: STOP → Investigate → Hypothesis → Different Solution → Execute
- Document anti-patterns (retry without understanding) and correct patterns (research-first)

### docs/Development/hypothesis-pm-autonomous-enhancement-2025-10-14.md
- Add PDCA workflow hypothesis document for PM Agent autonomous enhancement

## Rationale

PM Agent must never retry failed operations without understanding root causes.
All warnings and errors require investigation via context7/WebFetch/documentation
to ensure production-quality code and prevent technical debt accumulation.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* feat(installer): add airis-mcp-gateway MCP server option

## Changes

- Add airis-mcp-gateway to MCP server options in installer
- Configuration: GitHub-based installation via uvx
- Repository: https://github.com/oraios/airis-mcp-gateway
- Purpose: Dynamic MCP Gateway for zero-token baseline and on-demand tool loading

## Implementation

Added to setup/components/mcp.py self.mcp_servers dictionary (see the sketch below) with:
- install_method: github
- install_command: uvx test installation
- run_command: uvx runtime execution
- required: False (optional server)
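
A hedged sketch of what the new entry might look like; only the four fields listed above come from this change, and the exact commands and any other keys are assumptions:

```python
# Illustrative only; the real entry in setup/components/mcp.py may differ.
"airis-mcp-gateway": {
    "install_method": "github",
    "install_command": [
        "uvx", "--from", "git+https://github.com/oraios/airis-mcp-gateway",
        "airis-mcp-gateway", "--version",   # test installation via uvx
    ],
    "run_command": [
        "uvx", "--from", "git+https://github.com/oraios/airis-mcp-gateway",
        "airis-mcp-gateway",                # runtime execution via uvx
    ],
    "required": False,  # optional server
},
```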

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: kazuki <kazuki@kazukinoMacBook-Air.local>
Co-authored-by: Claude <noreply@anthropic.com>
2025-10-14 08:47:09 +05:30


"""
Settings management for SuperClaude installation system
Handles settings.json migration to the new SuperClaude metadata json file
Allows for manipulation of these json files with deep merge and backup
"""
import json
import shutil
from typing import Dict, Any, Optional, List
from pathlib import Path
from datetime import datetime
import copy
class SettingsService:
"""Manages settings.json file operations"""
def __init__(self, install_dir: Path):
"""
Initialize settings manager
Args:
install_dir: Installation directory containing settings.json
"""
self.install_dir = install_dir
self.settings_file = install_dir / "settings.json"
self.metadata_file = install_dir / ".superclaude-metadata.json"
self.backup_dir = install_dir / "backups" / "settings"

    def load_settings(self) -> Dict[str, Any]:
        """
        Load settings from settings.json

        Returns:
            Settings dict (empty if file doesn't exist)
        """
        if not self.settings_file.exists():
            return {}

        try:
            with open(self.settings_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (json.JSONDecodeError, IOError) as e:
            raise ValueError(f"Could not load settings from {self.settings_file}: {e}")

    def save_settings(
        self, settings: Dict[str, Any], create_backup: bool = True
    ) -> None:
        """
        Save settings to settings.json with optional backup

        Args:
            settings: Settings dict to save
            create_backup: Whether to create backup before saving
        """
        # Create backup if requested and file exists
        if create_backup and self.settings_file.exists():
            self._create_settings_backup()

        # Ensure directory exists
        self.settings_file.parent.mkdir(parents=True, exist_ok=True)

        # Save with pretty formatting
        try:
            with open(self.settings_file, "w", encoding="utf-8") as f:
                json.dump(settings, f, indent=2, ensure_ascii=False, sort_keys=True)
        except IOError as e:
            raise ValueError(f"Could not save settings to {self.settings_file}: {e}")

    def load_metadata(self) -> Dict[str, Any]:
        """
        Load SuperClaude metadata from .superclaude-metadata.json

        Returns:
            Metadata dict (empty if file doesn't exist)
        """
        if not self.metadata_file.exists():
            return {}

        try:
            with open(self.metadata_file, "r", encoding="utf-8") as f:
                return json.load(f)
        except (json.JSONDecodeError, IOError) as e:
            raise ValueError(f"Could not load metadata from {self.metadata_file}: {e}")

    def save_metadata(self, metadata: Dict[str, Any]) -> None:
        """
        Save SuperClaude metadata to .superclaude-metadata.json

        Args:
            metadata: Metadata dict to save
        """
        # Ensure directory exists
        self.metadata_file.parent.mkdir(parents=True, exist_ok=True)

        # Save with pretty formatting
        try:
            with open(self.metadata_file, "w", encoding="utf-8") as f:
                json.dump(metadata, f, indent=2, ensure_ascii=False, sort_keys=True)
        except IOError as e:
            raise ValueError(f"Could not save metadata to {self.metadata_file}: {e}")

    def merge_metadata(self, modifications: Dict[str, Any]) -> Dict[str, Any]:
        """
        Deep merge modifications into existing metadata

        Args:
            modifications: Metadata modifications to merge

        Returns:
            Merged metadata dict
        """
        existing = self.load_metadata()
        return self._deep_merge(existing, modifications)

    def update_metadata(self, modifications: Dict[str, Any]) -> None:
        """
        Update metadata with modifications and save the result

        Args:
            modifications: Metadata modifications to apply
        """
        merged = self.merge_metadata(modifications)
        self.save_metadata(merged)

    def migrate_superclaude_data(self) -> bool:
        """
        Migrate SuperClaude-specific data from settings.json to metadata file

        Returns:
            True if migration occurred, False if no data to migrate
        """
        settings = self.load_settings()

        # SuperClaude-specific fields to migrate
        superclaude_fields = ["components", "framework", "superclaude", "mcp"]

        data_to_migrate = {}
        fields_found = False

        # Extract SuperClaude data
        for field in superclaude_fields:
            if field in settings:
                data_to_migrate[field] = settings[field]
                fields_found = True

        if not fields_found:
            return False

        # Load existing metadata (if any) and merge
        existing_metadata = self.load_metadata()
        merged_metadata = self._deep_merge(existing_metadata, data_to_migrate)

        # Save to metadata file
        self.save_metadata(merged_metadata)

        # Remove SuperClaude fields from settings
        clean_settings = {
            k: v for k, v in settings.items() if k not in superclaude_fields
        }

        # Save cleaned settings
        self.save_settings(clean_settings, create_backup=True)

        return True
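
    # Illustrative migration example (values are hypothetical): a v2-era
    # settings.json of
    #   {"components": {"core": {...}}, "hooks": {"enabled": true}}
    # becomes, after migrate_superclaude_data():
    #   settings.json              -> {"hooks": {"enabled": true}}
    #   .superclaude-metadata.json -> {"components": {"core": {...}}}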

    def merge_settings(self, modifications: Dict[str, Any]) -> Dict[str, Any]:
        """
        Deep merge modifications into existing settings

        Args:
            modifications: Settings modifications to merge

        Returns:
            Merged settings dict
        """
        existing = self.load_settings()
        return self._deep_merge(existing, modifications)

    def update_settings(
        self, modifications: Dict[str, Any], create_backup: bool = True
    ) -> None:
        """
        Update settings with modifications

        Args:
            modifications: Settings modifications to apply
            create_backup: Whether to create backup before updating
        """
        merged = self.merge_settings(modifications)
        self.save_settings(merged, create_backup)

    def get_setting(self, key_path: str, default: Any = None) -> Any:
        """
        Get setting value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "hooks.enabled")
            default: Default value if key not found

        Returns:
            Setting value or default
        """
        settings = self.load_settings()

        try:
            value = settings
            for key in key_path.split("."):
                value = value[key]
            return value
        except (KeyError, TypeError):
            return default

    def set_setting(
        self, key_path: str, value: Any, create_backup: bool = True
    ) -> None:
        """
        Set setting value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "hooks.enabled")
            value: Value to set
            create_backup: Whether to create backup before updating
        """
        # Build nested dict structure
        keys = key_path.split(".")
        modification = {}
        current = modification

        for key in keys[:-1]:
            current[key] = {}
            current = current[key]

        current[keys[-1]] = value

        self.update_settings(modification, create_backup)
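
    # Usage example (hypothetical key and value): set_setting("hooks.enabled", True)
    # deep-merges {"hooks": {"enabled": True}} into settings.json, after which
    # get_setting("hooks.enabled") returns True and
    # get_setting("hooks.missing", default=[]) returns [].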

    def remove_setting(self, key_path: str, create_backup: bool = True) -> bool:
        """
        Remove setting using dot-notation path

        Args:
            key_path: Dot-separated path to remove
            create_backup: Whether to create backup before updating

        Returns:
            True if setting was removed, False if not found
        """
        settings = self.load_settings()
        keys = key_path.split(".")

        # Navigate to parent of target key
        current = settings
        try:
            for key in keys[:-1]:
                current = current[key]

            # Remove the target key
            if keys[-1] in current:
                del current[keys[-1]]
                self.save_settings(settings, create_backup)
                return True
            else:
                return False
        except (KeyError, TypeError):
            return False

    def add_component_registration(
        self, component_name: str, component_info: Dict[str, Any]
    ) -> None:
        """
        Add component to registry in metadata

        Args:
            component_name: Name of component
            component_info: Component metadata dict
        """
        metadata = self.load_metadata()

        if "components" not in metadata:
            metadata["components"] = {}

        metadata["components"][component_name] = {
            **component_info,
            "installed_at": datetime.now().isoformat(),
        }

        self.save_metadata(metadata)

    def remove_component_registration(self, component_name: str) -> bool:
        """
        Remove component from registry in metadata

        Args:
            component_name: Name of component to remove

        Returns:
            True if component was removed, False if not found
        """
        metadata = self.load_metadata()

        if "components" in metadata and component_name in metadata["components"]:
            del metadata["components"][component_name]
            self.save_metadata(metadata)
            return True

        return False

    def get_installed_components(self) -> Dict[str, Dict[str, Any]]:
        """
        Get all installed components from registry

        Returns:
            Dict of component_name -> component_info
        """
        metadata = self.load_metadata()
        return metadata.get("components", {})

    def is_component_installed(self, component_name: str) -> bool:
        """
        Check if component is registered as installed

        Args:
            component_name: Name of component to check

        Returns:
            True if component is installed, False otherwise
        """
        components = self.get_installed_components()
        return component_name in components

    def get_component_version(self, component_name: str) -> Optional[str]:
        """
        Get installed version of component

        Args:
            component_name: Name of component

        Returns:
            Version string or None if not installed
        """
        components = self.get_installed_components()
        component_info = components.get(component_name, {})
        return component_info.get("version")

    def update_framework_version(self, version: str) -> None:
        """
        Update SuperClaude framework version in metadata

        Args:
            version: Framework version string
        """
        metadata = self.load_metadata()

        if "framework" not in metadata:
            metadata["framework"] = {}

        metadata["framework"]["version"] = version
        metadata["framework"]["updated_at"] = datetime.now().isoformat()

        self.save_metadata(metadata)

    def check_installation_exists(self) -> bool:
        """
        Check whether a SuperClaude installation is present

        Returns:
            True if .superclaude-metadata.json exists, False otherwise
        """
        return self.metadata_file.exists()

    def check_v2_installation_exists(self) -> bool:
        """
        Check whether a v2-style installation (settings.json) is present

        Returns:
            True if settings.json exists, False otherwise
        """
        return self.settings_file.exists()

    def get_metadata_setting(self, key_path: str, default: Any = None) -> Any:
        """
        Get metadata value using dot-notation path

        Args:
            key_path: Dot-separated path (e.g., "framework.version")
            default: Default value if key not found

        Returns:
            Metadata value or default
        """
        metadata = self.load_metadata()

        try:
            value = metadata
            for key in key_path.split("."):
                value = value[key]
            return value
        except (KeyError, TypeError):
            return default

    def _deep_merge(
        self, base: Dict[str, Any], overlay: Dict[str, Any]
    ) -> Dict[str, Any]:
        """
        Deep merge two dictionaries

        Args:
            base: Base dictionary
            overlay: Dictionary to merge on top

        Returns:
            Merged dictionary
        """
        result = copy.deepcopy(base)

        for key, value in overlay.items():
            if (
                key in result
                and isinstance(result[key], dict)
                and isinstance(value, dict)
            ):
                result[key] = self._deep_merge(result[key], value)
            else:
                result[key] = copy.deepcopy(value)

        return result
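
    # Example (hypothetical dicts): _deep_merge({"a": {"x": 1}, "b": 2},
    # {"a": {"y": 3}}) returns {"a": {"x": 1, "y": 3}, "b": 2}; non-dict
    # overlay values replace the corresponding base value wholesale.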

    def _create_settings_backup(self) -> Path:
        """
        Create timestamped backup of settings.json

        Returns:
            Path to backup file
        """
        if not self.settings_file.exists():
            raise ValueError("Cannot backup non-existent settings file")

        # Create backup directory
        self.backup_dir.mkdir(parents=True, exist_ok=True)

        # Create timestamped backup
        timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        backup_file = self.backup_dir / f"settings_{timestamp}.json"

        shutil.copy2(self.settings_file, backup_file)

        # Keep only last 10 backups
        self._cleanup_old_backups()

        return backup_file

    def _cleanup_old_backups(self, keep_count: int = 10) -> None:
        """
        Remove old backup files, keeping only the most recent

        Args:
            keep_count: Number of backups to keep
        """
        if not self.backup_dir.exists():
            return

        # Get all backup files sorted by modification time
        backup_files = []
        for file in self.backup_dir.glob("settings_*.json"):
            backup_files.append((file.stat().st_mtime, file))

        backup_files.sort(reverse=True)  # Most recent first

        # Remove old backups
        for _, file in backup_files[keep_count:]:
            try:
                file.unlink()
            except OSError:
                pass  # Ignore errors when cleaning up

    def list_backups(self) -> List[Dict[str, Any]]:
        """
        List available settings backups

        Returns:
            List of backup info dicts with name, path, and timestamp
        """
        if not self.backup_dir.exists():
            return []

        backups = []
        for file in self.backup_dir.glob("settings_*.json"):
            try:
                stat = file.stat()
                backups.append(
                    {
                        "name": file.name,
                        "path": str(file),
                        "size": stat.st_size,
                        "created": datetime.fromtimestamp(stat.st_ctime).isoformat(),
                        "modified": datetime.fromtimestamp(stat.st_mtime).isoformat(),
                    }
                )
            except OSError:
                continue

        # Sort by creation time, most recent first
        backups.sort(key=lambda x: x["created"], reverse=True)

        return backups

    def restore_backup(self, backup_name: str) -> bool:
        """
        Restore settings from backup

        Args:
            backup_name: Name of backup file to restore

        Returns:
            True if successful, False otherwise
        """
        backup_file = self.backup_dir / backup_name

        if not backup_file.exists():
            return False

        try:
            # Validate backup file first
            with open(backup_file, "r", encoding="utf-8") as f:
                json.load(f)  # Will raise exception if invalid

            # Create backup of current settings
            if self.settings_file.exists():
                self._create_settings_backup()

            # Restore backup
            shutil.copy2(backup_file, self.settings_file)
            return True

        except (json.JSONDecodeError, IOError):
            return False
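

# ---------------------------------------------------------------------------
# Illustrative usage sketch (not part of the installer flow; paths and values
# are hypothetical). Run this module directly to exercise the service against
# a temporary directory.
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    import tempfile

    with tempfile.TemporaryDirectory() as tmp:
        svc = SettingsService(Path(tmp))

        # Write a setting via dot-notation and read it back.
        svc.set_setting("hooks.enabled", True)
        print(svc.get_setting("hooks.enabled"))  # True

        # Register a component in the metadata file and query it.
        svc.add_component_registration("core", {"version": "1.0.0"})
        print(svc.is_component_installed("core"))  # True
        print(svc.get_component_version("core"))   # 1.0.0

        # Simulate a v2-style settings.json and migrate SuperClaude keys
        # into .superclaude-metadata.json.
        svc.update_settings({"framework": {"version": "2.0.0"}})
        print(svc.migrate_superclaude_data())                  # True
        print(svc.get_metadata_setting("framework.version"))   # 2.0.0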