Monitor your cloud's carbon footprint in real-time
Cloud Carbon Exporter automatically discovers cloud resources and estimates their energy consumption and carbon emissions in real time. It provides valuable insights for operational and tech teams interested in following the Carbon-Driven Development principles.
Carbon-Driven Development
Carbon-Driven Development (CDD) is a philosophy to build digital services and manage cloud infrastructure with environmental sustainability at its core.
It revolves around three pillars:
Estimate energy consumption for each cloud resource (servers, load balancers, storage, etc.)
Collect data in production environments
Aggregate data in real-time
By applying these principles, production teams can:
Measure the overall energy efficiency of a system in relation to a business use (active user, transaction, etc.).
Detect infrastructure anomalies faster
Engage the company's operational teams and executives more widely in continuous improvement
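As a sketch of the first benefit above, the aggregated power metrics can be divided by a business metric such as active users. The function name and the figures below are hypothetical, for illustration only:

```go
package main

import "fmt"

// wattsPerActiveUser relates a fleet's total estimated power draw to a
// business metric, as suggested by the CDD pillars. In practice both inputs
// would come from your time series database; values here are made up.
func wattsPerActiveUser(totalWatts float64, activeUsers int) float64 {
	if activeUsers == 0 {
		return 0
	}
	return totalWatts / float64(activeUsers)
}

func main() {
	// e.g. 1200 W across the fleet serving 4000 active users
	fmt.Printf("%.2f W/user\n", wattsPerActiveUser(1200, 4000))
}
```

Tracking this ratio over time shows whether efficiency improves faster than usage grows.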
Multi Cloud · We want to support as many cloud platforms as possible, from hyperscalers to edge datacenters to regional providers. List of supported services
Dangofish Model · This tool prioritizes the number of supported resources over the precision of the exported metrics. Precisely estimating the energy consumption of a resource is hard: the complexity and opacity of a cloud service increase the margin of error, but trends should hold. Model calculations are based on public data, mixed with our own hypotheses documented in the primitives model and cloud model
Once the resource energy draw is estimated, the exporter evaluates the carbon intensity of the resource at its location based on publicly available datasets.
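The conversion described above can be sketched as a small calculation: given an estimated power draw and a grid carbon intensity (in gCO2eq/kWh, as published in public datasets), derive daily emissions. The function name and example values are ours, not the exporter's:

```go
package main

import "fmt"

// dailyEmissionsKgCO2 converts an estimated power draw (watts) and a grid
// carbon intensity (gCO2eq per kWh) into kgCO2eq per day, the kind of
// emissions-over-time figure the exporter reports.
func dailyEmissionsKgCO2(watts, gCO2PerKWh float64) float64 {
	kWhPerDay := watts * 24 / 1000          // energy drawn over one day
	return kWhPerDay * gCO2PerKWh / 1000    // grams -> kilograms
}

func main() {
	// a 50 W instance in a region whose grid emits 300 gCO2eq/kWh
	fmt.Printf("%.3f kgCO2eq/day\n", dailyEmissionsKgCO2(50, 300))
}
```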
OpenMetrics · The exporter is compatible with the OpenMetrics format, so you can ingest metrics into Prometheus, Datadog, and any time series database that supports this standard.
Performance · We pay close attention to the exporter's performance. Most API requests are made concurrently and cached. Most scrapes finish in under 1000 ms, even with thousands of monitored resources.
The Cloud Carbon Exporter can work on Google Cloud Platform, Amazon Web Services and Scaleway (more to come).
Google Cloud Platform
sequenceDiagram
Prometheus->>cloud-carbon-exporter: scrape metrics
cloud-carbon-exporter->>Asset Inventory: Query all used resources
cloud-carbon-exporter->>GCP Resources API: Describe Resource
cloud-carbon-exporter->>Monitoring: Get Resource statistics
cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics
The exporter uses GCP Application Default Credentials:
The attached service account, returned by the metadata server (inside a GCP environment)
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
-cloud.provider=gcp \
-cloud.gcp.projectid=myproject
Amazon Web Services
sequenceDiagram
Prometheus->>cloud-carbon-exporter: scrape metrics
cloud-carbon-exporter->>AWS Cost Explorer API: Query used services and regions
cloud-carbon-exporter->>AWS Resources API: Describe Resource
cloud-carbon-exporter->>Cloudwatch API: Get Resource statistics
cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
-cloud.provider=aws
Scaleway
sequenceDiagram
Prometheus->>cloud-carbon-exporter: scrape metrics
cloud-carbon-exporter->>Regional APIs: Query all used resources
cloud-carbon-exporter->>Cockpit: Get Resource statistics
cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
-cloud.provider=scw
Deployment
Cloud Carbon Exporter can easily run on serverless platforms such as GCP Cloud Run or AWS Lambda for testing purposes. However, we recommend running the exporter as a long-lived process to keep its cache in memory (lowering the cost).
Usage
Usage of ./cloud-carbon-exporter:
-cloud.aws.defaultregion string
aws default region (default "us-east-1")
-cloud.aws.rolearn string
aws role arn to assume
-cloud.gcp.projectid string
gcp project to explore resources from
-cloud.provider string
cloud provider type (gcp, aws, scw)
-demo.enabled string
return fictive demo data (default "false")
-listen string
addr to listen to (default "0.0.0.0:2922")
-log.format string
log format (text, json) (default "text")
-log.level string
log severity (debug, info, warn, error) (default "info")
Environment Variables:
SCW_ACCESS_KEY
scaleway access key
SCW_SECRET_KEY
scaleway secret key
Additional Cloud Cost
Calls to cloud monitoring APIs can incur additional costs. The exporter does its best to cache API
responses and thus lower the impact on your bill. API costs are directly correlated with the number of
resources the exporter generates data from. Here are the average costs you may observe per resource
(instance, bucket, load balancer) on your cloud account or project, for a 15-minute cache TTL:
AWS: $0.06 / month per resource
GCP: $0.03 / month per resource (will be 10 times less in October 2025)
SCW: free
The prices shown above are dated March 2025 and are subject to change by the cloud providers.
You can use the Cost Calculator file to do finer estimations with your own inputs.
In this file, you can also anticipate the storage cost of carbon metrics if you choose to use the cloud provider monitoring service.
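For a rough sense of how the cache TTL bounds API volume, a back-of-the-envelope calculation helps. It assumes about one monitoring query per resource each time the cache expires, which is our simplification rather than the exporter's exact behavior:

```go
package main

import "fmt"

// apiCallsPerMonth gives a rough upper bound on monitoring API calls,
// assuming one query per resource per cache expiry.
func apiCallsPerMonth(resources int, cacheTTLMinutes float64) float64 {
	const minutesPerMonth = 30 * 24 * 60 // ~43,200 minutes in a 30-day month
	return float64(resources) * minutesPerMonth / cacheTTLMinutes
}

func main() {
	// 100 resources with the 15-minute TTL used in the figures above
	fmt.Printf("%.0f calls/month\n", apiCallsPerMonth(100, 15))
}
```

Multiplying this call count by your provider's per-call monitoring price gives an estimate comparable to the figures above; the Cost Calculator file supports finer inputs.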
Permissions & Security
The exporter requires permissions to automatically discover resources in your cloud environment. For a quick and easy setup, you can grant it read-only access to your entire cloud platform, such as the Project Viewer role or ViewOnlyAccess policy.
If you'd prefer a more precise approach, you can authorize only the specific API calls needed for the services you use. A detailed list of required permissions for each cloud provider service will be available soon.
If the exporter encounters a missing permission, it will log a warning with details about the issue and increment the error_count{action="collect"} value. We recommend periodically monitoring this metric and adjusting permissions as needed to ensure smooth operation.
We're grateful for every contribution that helps shape Cloud Carbon Exporter. Whether it's through testing, feedback, or documentation, each effort strengthens our software and enhances the user experience.
We'd like to extend our heartfelt appreciation to the individuals who have invested significant time and energy into making this project better.
We appreciate your input and contributions to Cloud Carbon Exporter. Here's how you can help:
Share Feedback and Ideas · Found a bug or have a feature idea? Start a discussion in our GitHub Discussions.
Your testing and feedback are crucial to improving the project.
Code Contributions · We're actively refactoring to improve the codebase. For now, we're focusing on smaller, targeted contributions to ensure a smooth integration.
type Impact struct {
	// Labels for impact
	Labels map[string]string
	// Energy in watts
	Energy Energy
	// EnergyEmissions are emissions related to energy in kgCO2eq/day
	EnergyEmissions EmissionsOverTime
	// EmbodiedEmissions are emissions related to the manufacturing
	EmbodiedEmissions EmissionsOverTime
}
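The Impact struct above could be consumed as follows. Energy and EmissionsOverTime are the exporter's own types; the float64 stand-ins and the totalEmissions helper below are ours, added only to make the sketch self-contained:

```go
package main

import "fmt"

// Stand-in aliases for the exporter's Energy and EmissionsOverTime types,
// used here only for illustration.
type Energy float64            // watts
type EmissionsOverTime float64 // kgCO2eq/day

type Impact struct {
	Labels            map[string]string
	Energy            Energy
	EnergyEmissions   EmissionsOverTime
	EmbodiedEmissions EmissionsOverTime
}

// totalEmissions sums usage and embodied emissions over a set of impacts.
func totalEmissions(impacts []Impact) EmissionsOverTime {
	var total EmissionsOverTime
	for _, i := range impacts {
		total += i.EnergyEmissions + i.EmbodiedEmissions
	}
	return total
}

func main() {
	impacts := []Impact{
		{Labels: map[string]string{"service": "compute"}, Energy: 50, EnergyEmissions: 0.36, EmbodiedEmissions: 0.12},
		{Labels: map[string]string{"service": "storage"}, Energy: 5, EnergyEmissions: 0.04, EmbodiedEmissions: 0.01},
	}
	fmt.Printf("%.2f kgCO2eq/day\n", totalEmissions(impacts))
}
```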
ServeHTTP implements the http.Handler interface. It collects all metrics from the configured
collector and returns them, formatted, in the HTTP response.