
Document AI infrastructure requirements

menloresearch/jan
Based on 4 comments

Reviewer Prompt

When documenting AI applications that run local models or perform inference, always provide comprehensive infrastructure requirements including hardware specifications, platform-specific drivers, and setup instructions. This ensures users can successfully run AI models without encountering compatibility issues.

Include the following details:

  • Hardware requirements: Specify RAM/VRAM needs for different model sizes (e.g., “8GB RAM/VRAM for 3B models, 16GB for 7B models”)
  • CPU/GPU architecture support: List supported architectures (ARM, x86 for CPU; NVIDIA, AMD, Intel for GPU)
  • Platform-specific drivers: Include GPU driver requirements (NVIDIA drivers for Windows, CUDA Toolkit for Linux)
  • Model size relationships: Explain how hardware capacity relates to usable model sizes

Example documentation structure:

```markdown
### Hardware Requirements
- **RAM/VRAM**: 8GB minimum (3B models), 16GB recommended (7B models)
- **CPU**: ARM, x86 architectures supported
- **GPU**: NVIDIA (via llama.cpp), AMD and Intel support coming soon

### Platform Setup
- **Windows**: Install NVIDIA drivers if GPU available
- **Linux**: Install CUDA Toolkit if GPU available
```
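
The RAM/VRAM figures in this example follow from a common rule of thumb: model weights occupy roughly two bytes per parameter at FP16 precision (proportionally less when quantized), before counting runtime overhead such as the KV cache. The sketch below only illustrates that arithmetic; it is not part of Jan or llama.cpp, and the function name and precision values are illustrative assumptions.

```python
# Illustrative only -- not Jan or llama.cpp code. Estimates the memory needed
# just to hold the model weights; runtime overhead (KV cache, activations, OS)
# is extra, which is why documented requirements sit above these figures.

def estimate_weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Weights-only memory estimate: parameter count x bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

if __name__ == "__main__":
    for size in (3, 7, 13):
        fp16 = estimate_weight_memory_gb(size, 2.0)  # FP16: ~2 bytes per parameter
        q4 = estimate_weight_memory_gb(size, 0.5)    # 4-bit quantized: ~0.5 bytes per parameter
        print(f"{size}B model: ~{fp16:.1f} GB weights at FP16, ~{q4:.1f} GB at 4-bit")
```

For a 7B model this works out to roughly 13 GB of weights at FP16, which is why 16GB appears as the recommended tier above.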

This prevents user frustration and ensures AI applications can run as intended across different hardware configurations.
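
If the documentation includes a setup checklist, a small script can also help readers verify their machine against it. The following is a minimal sketch under stated assumptions, not Jan's actual setup code: it uses the third-party psutil package to read total RAM, reads the CPU architecture from the standard library, and treats the presence of nvidia-smi on PATH as a rough proxy for an installed NVIDIA driver.

```python
# Hypothetical hardware check mirroring the documented requirements above.
# Requires the third-party "psutil" package (pip install psutil).
import platform
import shutil

import psutil

MIN_RAM_GB = 8           # minimum for ~3B models, per the table above
RECOMMENDED_RAM_GB = 16  # recommended for ~7B models

total_ram_gb = psutil.virtual_memory().total / (1024 ** 3)
arch = platform.machine()  # e.g. "x86_64" or "arm64"
has_nvidia_driver = shutil.which("nvidia-smi") is not None

print(f"CPU architecture: {arch}")
print(f"Total RAM: {total_ram_gb:.1f} GB")
if total_ram_gb < MIN_RAM_GB:
    print("Below the 8GB minimum; even 3B models may not run reliably.")
elif total_ram_gb < RECOMMENDED_RAM_GB:
    print("Meets the minimum; 3B models should run, 7B models may be tight.")
else:
    print("Meets the recommended spec for 7B models.")
print("NVIDIA driver detected" if has_nvidia_driver else
      "No NVIDIA driver found; install drivers (Windows) or the CUDA Toolkit (Linux) to enable GPU offload.")
```

Note that `shutil.which` only checks for the tool on PATH, so this is a heuristic rather than a definitive driver check.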

Comments analyzed: 4
Primary language: Markdown
Category: AI
