LLM carbon footprint API
Send model id and token counts — get auditable gCO₂e per call, per tenant, and in monthly reports. Prompts never leave your infrastructure.
carbon-llm maps each supported model to a documented coefficient (grams CO₂e per 1,000 tokens), drawing on vendor life-cycle assessments (LCAs), cloud inference disclosures, and peer-reviewed benchmarks where available — with explicit confidence tiers. Your integration sends metadata only: the model identifier and the prompt/completion token counts from the provider's API response.
That design aligns with GHG Protocol Scope 3 activity data (tokens × factor) and supports CSRD / ESRS E1 narrative disclosures: a traceable methodology, not black-box scores.
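The tokens × factor methodology above can be sketched in a few lines. The model ids, coefficient values, and tier labels below are illustrative placeholders, not carbon-llm's published factors:

```python
# Illustrative sketch of the activity-data calculation: tokens x factor.
# FACTORS values and tier names are made-up examples for this sketch.
FACTORS = {
    # model id: (grams CO2e per 1,000 tokens, confidence tier)
    "example-large-v1": (1.8, "vendor-LCA"),
    "example-small-v1": (0.3, "benchmark-derived"),
}

def estimate_gco2e(model: str, prompt_tokens: int, completion_tokens: int):
    """Return (grams CO2e, confidence tier) for a single call."""
    grams_per_1k, tier = FACTORS[model]
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1000 * grams_per_1k, tier

grams, tier = estimate_gco2e("example-large-v1",
                             prompt_tokens=900, completion_tokens=100)
# 1,000 tokens at 1.8 g/1k tokens -> 1.8 gCO2e, tagged "vendor-LCA"
```

Because the factor and the token counts are both auditable inputs, the same arithmetic can be rerun by an assurance provider to reproduce any reported figure.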
POST /estimate — instant gCO₂e for a single request (ideal for UX or pre-flight checks).
POST /track — append-only event ingestion with tenant_id for multi-tenant SaaS; powers dashboards and PDF exports.
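A minimal sketch of what crosses the wire for each endpoint. The field names (`model`, `prompt_tokens`, `completion_tokens`, `tenant_id`, `timestamp`) and values are assumptions for illustration; the documentation is authoritative for the real schema and auth:

```python
import json

# Hypothetical body for POST /estimate: model id plus token counts,
# typically copied from the provider's usage object.
estimate_body = {
    "model": "example-large-v1",
    "prompt_tokens": 900,
    "completion_tokens": 100,
}

# Hypothetical body for POST /track: the same metadata plus a tenant_id
# so append-only events can be attributed in a multi-tenant SaaS.
track_event = {
    **estimate_body,
    "tenant_id": "acme-co",
    "timestamp": "2024-05-01T12:00:00Z",  # illustrative event time
}

# Note what is absent: no prompt text, no completion text.
# Only metadata leaves your infrastructure.
print(json.dumps(track_event, indent=2))
```

Keeping the payload to metadata is what makes the "prompts never leave your infrastructure" claim hold by construction rather than by policy.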
Full request/response shapes and auth are in the documentation.