“We can estimate LLM emissions” is not enough when you’re under ESRS E1 / assurance pressure. What matters is the evidence trail: activity data, mapping logic, coefficient provenance, and clear boundaries — so an auditor can reproduce the computation and understand uncertainty.
The evidence checklist (what to provide)
Use this checklist as a practical annex to your reporting package:
- Activity data extract: token totals for the reporting period, per model (and per tenant if you allocate).
- Model mapping table: how provider model identifiers map to your internal coefficient rows (versioned and documented).
- Coefficient provenance: links/references used for each coefficient and a confidence label (measured / benchmarked / estimated).
- Computation methodology: the exact formula and aggregation steps, including rounding rules and exclusions.
- Assumptions & limitations: what you do not model line-by-line (PUE/network granularity, location variability, model evolution).
- Boundary statement: which scopes/categories are included and what reporting perimeter your company uses (single entity vs group consolidation).
- Change log: when coefficients or mappings change and how prior periods are handled.
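The checklist items above fit together as one reproducible computation: activity data × mapped coefficient, with provenance carried alongside each coefficient row. Here is a minimal sketch of that shape; the model identifiers, coefficient values, and the per-1k-tokens unit are all hypothetical, not real measurements:

```python
from dataclasses import dataclass

# Hypothetical coefficient row: grams CO2e per 1k tokens, plus the
# provenance fields the checklist asks for. Values are illustrative only.
@dataclass(frozen=True)
class CoefficientRow:
    g_co2e_per_1k_tokens: float
    confidence: str   # "measured" | "benchmarked" | "estimated"
    source: str       # reference used for the coefficient
    version: str      # coefficient table version

# Model mapping table: provider model id -> internal coefficient row.
MAPPING = {
    "provider/model-a": CoefficientRow(2.5, "benchmarked", "vendor benchmark 2024", "v3"),
    "provider/model-b": CoefficientRow(0.4, "estimated", "internal estimate", "v3"),
}

# Activity data extract: token totals per model for the reporting period.
activity = {"provider/model-a": 12_000_000, "provider/model-b": 80_000_000}

def emissions_kg(activity: dict[str, int], mapping: dict[str, CoefficientRow]) -> float:
    """Sum kgCO2e across models; an unmapped model fails loudly, not silently."""
    total_g = 0.0
    for model, tokens in activity.items():
        row = mapping[model]  # KeyError here = a gap in the mapping table
        total_g += tokens / 1000 * row.g_co2e_per_1k_tokens
    # Rounding rule (3 decimals) is itself part of the documented methodology.
    return round(total_g / 1000, 3)

print(emissions_kg(activity, MAPPING))  # 62.0 kgCO2e for the sample inputs
```

The point for assurance is not the numbers but that every input in the function signature corresponds to a checklist artefact an auditor can pull and re-run.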
Where auditors often probe
- whether activity data corresponds to real production usage (and how test usage is separated);
- whether missing usage fields are treated consistently (and whether the impact is material);
- whether coefficient versions used for a report can be reconstructed later;
- whether the “classification story” (e.g. Scope 3 cat. 1 vs cat. 11) is aligned with your company’s boundary policy.
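The third probe — reconstructing which coefficient version applied to a past report — is easy to satisfy if the change log is append-only with effective dates. A minimal sketch, with hypothetical dates and version labels:

```python
from bisect import bisect_right
from datetime import date

# Hypothetical append-only change log: (effective_date, coefficient table version).
CHANGELOG = [
    (date(2024, 1, 1), "v1"),
    (date(2024, 7, 1), "v2"),
    (date(2025, 1, 1), "v3"),
]

def version_for(report_date: date) -> str:
    """Return the coefficient table version in force on a given report date."""
    dates = [d for d, _ in CHANGELOG]
    i = bisect_right(dates, report_date)
    if i == 0:
        raise ValueError("report date predates the first coefficient table")
    return CHANGELOG[i - 1][1]

print(version_for(date(2024, 9, 30)))  # a Q3-2024 report resolves to "v2"
```

With this in place, "which coefficients produced the FY2024 number?" is a lookup, not an archaeology exercise.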
Practical framing: even if your team iterates on the final Scope 3 category, you can usually keep the same activity data and coefficient computations — only the narrative annex changes. That makes the overall process more stable for assurance readiness.
How to make your file “auditor friendly”
Auditors want a document they can skim first and audit in depth later. Concretely:
- Keep a one-page methodology summary (formula + inputs + coefficient table).
- Add an annex with assumptions and limitations.
- Include a short classification context note to explain how you decided boundaries.
- Ensure the report period is explicit and consistent with your data pipeline.
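The last bullet — period consistency — is cheap to enforce mechanically: assert that no activity record in the extract falls outside the stated reporting period. A sketch, with hypothetical records:

```python
from datetime import date

# Hypothetical usage records from the data pipeline: (day, model id, tokens).
records = [
    (date(2024, 1, 15), "provider/model-a", 1_000_000),
    (date(2024, 12, 20), "provider/model-b", 2_000_000),
]

def check_period(records, start: date, end: date) -> None:
    """Fail if any activity record falls outside the stated reporting period."""
    outside = [r for r in records if not (start <= r[0] <= end)]
    if outside:
        raise ValueError(f"{len(outside)} record(s) outside {start}..{end}")

check_period(records, date(2024, 1, 1), date(2024, 12, 31))  # passes silently
```

Running this as part of report generation turns "is the period consistent?" from a review question into a pipeline guarantee.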
Further reading
If you need the measurement “how”, start with: CO₂ methodology →. If you need the mapping story, see: category 1 vs 11 →.