Ensure consistent parameter naming, format, and usage across all API interfaces, documentation, and code examples. This includes maintaining uniform parameter names between SDK examples and proxy configurations, using correct environment variable names as specified in official documentation, and applying proper naming conventions for model identifiers and API routes.
Key areas to verify:
- Model identifiers should include a provider prefix (e.g., `heroku/claude-3-5-haiku`) to enable proper provider identification
- Parameter names should match across contexts (if an SDK example uses `api_base`, the proxy config should use the same)
- Environment variable names should follow official documentation (`ANTHROPIC_AUTH_TOKEN` and `ANTHROPIC_BASE_URL` for Anthropic)
- API routes should follow the documented pattern (`openai/responses/<model>` instead of `chat/<model>` for the responses API)

Example of consistent parameter usage:
```python
# SDK example
from litellm import completion

response = completion(
    model="heroku/claude-3-5-haiku",
    api_base="https://us.inference.heroku.com",
    api_key="fake-heroku-key"
)
```
```yaml
# Corresponding proxy config should use the same parameter names
model_list:
  - model_name: claude-haiku
    litellm_params:
      model: heroku/claude-3-5-haiku
      api_base: https://us.inference.heroku.com
      api_key: fake-heroku-key
```
This consistency prevents confusion, reduces integration errors, and ensures predictable behavior across different usage contexts.
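One way to enforce the provider-prefix convention mechanically is a small validation helper run against model identifiers before they land in a proxy config. The function below is a hypothetical sketch, not part of LiteLLM's API:

```python
def has_provider_prefix(model: str) -> bool:
    """Return True if the model identifier carries a provider prefix,
    e.g. 'heroku/claude-3-5-haiku' (provider 'heroku', model name after '/')."""
    provider, sep, name = model.partition("/")
    # Require a non-empty provider, the '/' separator, and a non-empty model name.
    return bool(provider) and bool(sep) and bool(name)

# A prefixed identifier passes; a bare model name fails provider identification.
assert has_provider_prefix("heroku/claude-3-5-haiku")
assert not has_provider_prefix("claude-3-5-haiku")
```

A check like this could run in CI over every `model:` entry in `model_list`, catching prefix omissions before they cause routing errors at runtime.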