When building AI/LLM-related tooling (tool calls, background execution, inference/image generation), enforce compatibility at the boundaries:

1) Use team-standard LLM-friendly IDs

2) Make AI hardware initialization conservative and always provide a fallback path
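Guideline 2 can be sketched as a probe-then-fall-back pattern. This is a minimal sketch, not the team's actual init code: `probe_accelerator` is a hypothetical stand-in for whatever hardware check your stack uses (e.g. a CUDA availability call).

```python
# Sketch of conservative hardware init with a guaranteed fallback.
# probe_accelerator() is a hypothetical stand-in for a real hardware
# check (e.g. torch.cuda.is_available()); everything else is stdlib.

def probe_accelerator() -> str:
    # Hypothetical: raises if the accelerator is missing or misconfigured.
    raise RuntimeError("no accelerator present")


def select_device(preferred: str = "cuda") -> str:
    """Try the preferred device, but never fail: fall back to CPU."""
    try:
        probe_accelerator()
        return preferred
    except Exception:
        # Conservative default: CPU always works.
        return "cpu"


device = select_device()
print(device)  # "cpu" here, because the probe above always raises
```

The key property is that `select_device` cannot raise: any probe failure, whatever the cause, resolves to a device that is known to work.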

Example (ID generation pattern):

# pseudo-example: use the team helper rather than hand-rolled UUID strings
from python.helpers.guids import llm_guid  # assumed name from your guids.py

call_id = llm_guid(length=8)  # keep the length consistent with the team default
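For context, a minimal sketch of what an `llm_guid`-style helper might do: short, lowercase alphanumeric IDs generated with the stdlib `secrets` module. The character set and behavior here are assumptions for illustration, not the real helper's contract.

```python
import secrets
import string

# Hypothetical sketch of an llm_guid-style helper; the real team helper
# in guids.py may differ. Lowercase letters + digits keep IDs short and
# easy for a model to copy verbatim into tool-call arguments.
_ALPHABET = string.ascii_lowercase + string.digits


def llm_guid(length: int = 8) -> str:
    """Return a short random ID such as 'k3f9x2qa'."""
    return "".join(secrets.choice(_ALPHABET) for _ in range(length))


call_id = llm_guid(length=8)
print(call_id)
```

Centralizing this in one helper keeps every ID the same shape, so downstream parsing and logging never have to special-case formats.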

Checklist to apply during review: