
Chrome extension for web chat UIs

See CO₂ per chat reply in minutes. The extension calls the same POST /api/v1/estimate endpoint as production, sends only the model name and token counts, and prompts never leave your browser. No account is required. If you embed LLMs for customers, sign up free for /track, API keys, and dashboards; that path is for integration, not for trying the estimate.

Build from source (steps below); no account needed. Create a free account when you ship /track or need API keys and the dashboard.

Install (from source)

A Chrome Web Store build is planned. Today you load the unpacked folder from the repository:

  1. Clone or download the carbon-llm repository.
  2. At the repo root, run npm ci then npm run build:extension.
  3. Chrome → Extensions → Developer mode → Load unpacked → select extensions/chrome-llm-carbon/ (folder containing manifest.json).

Technical README: in your clone, see extensions/chrome-llm-carbon/README.md, which documents platform quirks and known limits.

Screenshot: the floating panel in a chat, showing grams CO₂e for this reply, a privacy note (the estimate is local), and quick tips to reduce impact (shorter prompts, model choice, reusing threads).
Screenshot: the My footprint dashboard, showing a weekly CO₂e ring gauge, concrete equivalents (showers, EV km), and trends built from extension aggregates; no chat text reaches our servers.

The extension intercepts provider responses to read token usage, then calls POST /api/v1/estimate on carbon-llm. It complements the REST API for teams who want transparency before wiring server-side /track.
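For teams reading along, here is a minimal sketch of what that call could look like. The field names (model, inputTokens, outputTokens), the host, and the response handling are assumptions for illustration, not the documented schema; the only things taken from this page are the POST /api/v1/estimate path and the model-plus-token-counts-only payload:

```typescript
// Hedged sketch of the extension's estimate call. Field names and the
// base URL are assumptions; only the POST /api/v1/estimate path and the
// "model + tokens only, no prompt text" rule come from this page.
interface EstimateRequest {
  model: string;
  inputTokens: number;
  outputTokens: number;
}

interface PreparedRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string; // JSON: model + token counts only, never prompt text
}

// Build the request from token usage read off the provider response.
function buildEstimateRequest(
  model: string,
  inputTokens: number,
  outputTokens: number,
): PreparedRequest {
  const payload: EstimateRequest = { model, inputTokens, outputTokens };
  return {
    url: "https://carbon-llm.example/api/v1/estimate", // placeholder host
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  };
}

// Usage, e.g. in the extension's background script:
//   const req = buildEstimateRequest("gpt-4o", 1200, 350);
//   const estimate = await fetch(req.url, req).then((r) => r.json());
```

Because the prompt text is never part of the payload, the estimate works without any chat content leaving the browser.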

The My footprint view (for extension users) is separate from the API dashboard used for integrations.

Methodology and confidence labels match PDF exports and the API.