- Replace per-service network_mode: bridge with a shared lingvai-net network
  so lingvai-app can reach lingvai-translator by hostname
- Set LOCAL_MODEL_ENDPOINT=http://lingvai-translator:5000/predictions
  and REPLICATE_MODE=local via environment variables so the admin panel
  defaults are pre-configured for the local container
- Add depends_on: translator so the app container starts after the
  translator container (start order only; it does not wait for the
  model to be ready)
- Keep the host port mapping 5030->5000 for direct debugging access
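
The changes above can be sketched as a docker-compose fragment. This is an illustrative sketch only: the service keys, the container_name setting, and the overall file layout are assumptions; only the network, environment, depends_on, and port values come from this change.

```yaml
# Sketch of the relevant docker-compose.yml pieces (not the actual file).
services:
  lingvai-app:
    # ... image/build config omitted ...
    networks:
      - lingvai-net
    environment:
      - REPLICATE_MODE=local
      - LOCAL_MODEL_ENDPOINT=http://lingvai-translator:5000/predictions
    depends_on:
      - translator            # start order only, not model readiness

  translator:
    # container_name doubles as a DNS alias on the user-defined network,
    # so the app can resolve http://lingvai-translator:5000
    container_name: lingvai-translator
    networks:
      - lingvai-net
    ports:
      - "5030:5000"           # host 5030 -> container 5000 for debugging

networks:
  lingvai-net:
    driver: bridge
```

If readiness matters, depends_on can be tightened with a healthcheck on the translator and condition: service_healthy on the app side.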
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>