About Awesome Reviewers

Turning expert review habits into reusable playbooks

Awesome Reviewers is Baz's open library of AI reviewers built from thousands of production-grade pull request discussions. We codify how experienced maintainers protect quality so any team can adopt the same standards in minutes.

Practitioner-sourced

Every reviewer starts from real code review threads supplied by trusted maintainers across dozens of ecosystems.

High-signal prompts

We capture the heuristics, tone, and guardrails that help reviewers surface the right issues without generating noise.

Ready for production

Each reviewer is validated on fresh pull requests before it lands in the library, so the guidance feels human and actionable.

How we evolve the reviewer library

The library grows through a continuous research loop that keeps reviewers aligned with how real teams ship software. Every iteration sharpens accuracy, empathy, and phrasing that lands with developers working under deadline.

1. Capture real conversations

We partner with maintainers who share anonymized review threads that explain why changes shipped—or were held back. Each example includes the diff, the critique, and the eventual resolution so we understand cause and effect.
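
For illustration, one captured example could be stored as a record like the one below. The field names and repository are hypothetical assumptions for this sketch, not Baz's actual schema.

thread:
  repo: acme/payments-service          # anonymized before ingestion; name is illustrative
  diff: |
    - def charge(amount):
    + def charge(amount: Decimal) -> ChargeResult:
  critique: >
    Passing a float here loses precision on currency math;
    take a Decimal and return a typed result instead.
  resolution: accepted                 # the change shipped after the fix
  rationale: Precision bug would have reached the billing API.

Keeping the diff, the critique, and the resolution together in one record is what makes the cause-and-effect analysis in the next step possible.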

2. Distill the reviewer mindset

Baz researchers label the moves inside a comment, mapping patterns like performance budgeting or interface safety to the signals that triggered them. The result is a shared vocabulary for what “good” review feedback looks like.
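
A minimal sketch of that labeling, assuming hypothetical pattern and signal names:

labeled_move:
  comment_id: 4182                      # hypothetical identifier
  pattern: interface_safety             # the reviewer "move" this comment makes
  signals:                              # what in the diff triggered the comment
    - optional_return_at_api_boundary
    - missing_error_propagation
  tone: collaborative
  outcome: change_requested_then_merged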

3. Ship with feedback loops

Before a reviewer goes live, we replay it on new pull requests and collect maintainer feedback until the tone and recommendations match what they would deliver themselves.
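
As a sketch, that replay gate could be expressed as acceptance criteria like the following; the keys and thresholds are illustrative assumptions, not Baz's published process.

replay:
  reviewer: strict-type-guardian
  corpus: fresh_pull_requests           # PRs the reviewer has never seen
  sample_size: 50
  accept_when:
    maintainer_agreement: ">= 0.90"     # findings match what maintainers would say
    false_positive_rate: "<= 0.05"      # noise budget before tone and heuristics are retuned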

From maintainer insight to developer assistant

Our curation pipeline keeps human judgment in the loop while automating the repetitive parts of code review.

sequenceDiagram
    participant Maintainer
    participant BazOps as Baz Research
    participant Library
    participant Developer
    Maintainer->>BazOps: Share review threads & intent
    BazOps->>BazOps: Normalize heuristics & tone
    BazOps->>Library: Publish reviewer prompt
    Developer->>Library: Request reviewer guidance
    Library-->>Developer: Surface targeted findings
    Developer->>Maintainer: Ship aligned fixes

What a reviewer prompt looks like

The structure balances mission, heuristics, and delivery style so AI feedback mirrors the maintainers who inspired it.

reviewer:
  # Identity and mission keep findings scoped to one clear concern.
  name: Strict Type Guardian
  mission: Protect runtime safety by enforcing explicit type guarantees in critical paths.
  heuristics:
    # Concrete signals the reviewer scans for in a diff.
    - Flag optional return values that reach API boundaries without null handling.
    - Require conversion utilities to emit descriptive errors when type assertions fail.
    - Highlight concurrency code where shared state lacks typed synchronization primitives.
  response_style:
    voice: Calm, collaborative
    include_context: true
workflows:
  # When the reviewer runs and how it delivers feedback.
  - trigger: pull_request.opened
    actions:
      - analyze_diff: true
      - highlight_findings: inline
      - summarize_risks: narrative

Reviewers combine precise heuristics with expectations about tone and collaboration so engineering teams can plug them into CI without rewriting their culture.
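
For teams wiring a reviewer into CI, a hypothetical GitHub Actions job might look like the sketch below. The baz/reviewer-action name and its inputs are illustrative assumptions, not a published action.

name: awesome-reviewers
on:
  pull_request:
    types: [opened, synchronize]
jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: baz/reviewer-action@v1      # illustrative name, not a published action
        with:
          reviewer: strict-type-guardian  # prompt from the library example above
          findings: inline                # comment on the changed lines directly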

Help shape the next generation of reviewers

Maintainers and teams can submit repositories for analysis or contribute their own review standards. We are especially interested in mobile, data engineering, and infrastructure playbooks that deserve wider adoption.

Submit a repository