cloudcarbonexporter

package module
v0.0.0-...-aeb897c
Published: Jul 8, 2025 License: AGPL-3.0 Imports: 13 Imported by: 0

README

Cloud Carbon Exporter

Monitor your cloud's carbon footprint in real-time

Cloud Carbon Exporter automatically discovers cloud resources and estimates their energy consumption and carbon emissions in real time. This tool provides valuable insights for operational and tech teams interested in following the Carbon-Driven Development principles.

Carbon-Driven Development

Carbon-Driven Development (CDD) is a philosophy to build digital services and manage cloud infrastructure with environmental sustainability at its core.

It revolves around three pillars:

  1. Estimate energy consumption for each cloud resource (servers, load balancers, storage, etc.)
  2. Collect data in production environments
  3. Aggregate data in real-time

By applying these few rules, production teams will be able to:

  1. Measure the overall energy efficiency of a system in relation to a business use (active user, transaction, etc.).
  2. Detect infrastructure anomalies faster
  3. Engage the company's operational teams and Execs more widely in continuous improvement
  4. Reduce the carbon footprint of applications

Check out the original article which explains in detail the concepts of CDD.

Demo

[Screenshot: Grafana demo dashboard for Carbon-Driven Development]

On the screenshot above, you can easily visualize and understand:

  • the estimated energy consumed per user connected to the online service,
  • the current CO2 emissions,
  • the equivalent number of lit lightbulbs.

You can easily customize the content of this dashboard using the data returned by the exporter.

Try our live demo with our Grafana dashboard: https://snapshots.raintank.io

Technical Overview

The Cloud Carbon Exporter queries cloud provider APIs and exports energy and carbon data to monitoring systems.

Multi Cloud · We want to support as many cloud platforms as possible, from hyperscalers to edge datacenters to regional providers. See the list of supported services.

Dangofish Model · This tool prioritizes the number of supported resources over the precision of the exported metrics. Estimating the energy consumption of a resource precisely is a hard task: the complexity and opacity of a cloud service increase the margin of error, but trends should hold. Model calculations are based on public data, mixed with our own hypotheses documented in the primitives model and the cloud model.

Once the resource energy draw is estimated, the exporter evaluates the carbon intensity of the resource at its location based on publicly available datasets.

OpenMetrics · The exporter is compatible with the OpenMetrics format. Therefore, you can ingest metrics into Prometheus, Datadog, and any time series database that supports this standard.
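To make the format concrete, here is a minimal sketch of rendering one sample as an OpenMetrics/Prometheus text line. The metric name and labels are illustrative, not the exporter's actual metric names.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// formatOpenMetrics renders one sample in the OpenMetrics text format:
// name{label="value",...} value. Labels are sorted for deterministic output.
func formatOpenMetrics(name string, labels map[string]string, value float64) string {
	keys := make([]string, 0, len(labels))
	for k := range labels {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	pairs := make([]string, 0, len(keys))
	for _, k := range keys {
		pairs = append(pairs, fmt.Sprintf("%s=%q", k, labels[k]))
	}
	return fmt.Sprintf("%s{%s} %g", name, strings.Join(pairs, ","), value)
}

func main() {
	// Hypothetical sample: 12.5 W estimated for a GCP instance.
	fmt.Println(formatOpenMetrics("estimated_watts",
		map[string]string{"provider": "gcp", "resource": "instance"}, 12.5))
}
```

Any scraper that speaks this text format can consume the exporter's `/metrics` endpoint directly.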

Performance · We pay close attention to the exporter's performance. Most API requests are made concurrently and cached. Most scrapes finish in under 1000ms, even with thousands of monitored resources.

Install

You can download the official Docker image from the GitHub Container Registry:

$ docker pull ghcr.io/superdango/cloud-carbon-exporter:latest

Configuration

The Cloud Carbon Exporter can work on Google Cloud Platform, Amazon Web Service and Scaleway (more to come).

Google Cloud Platform
      sequenceDiagram
            Prometheus->>cloud-carbon-exporter: scrape metrics
            cloud-carbon-exporter->>Asset Inventory: Query all used resources
            cloud-carbon-exporter->>GCP Resources API: Describe Resource
            cloud-carbon-exporter->>Monitoring: Get Resource statistics
            cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics

The exporter uses GCP Application Default Credentials:

  • GOOGLE_APPLICATION_CREDENTIALS environment variable
  • gcloud auth application-default login command
  • The attached service account, returned by the metadata server (inside GCP environment)
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
        -cloud.provider=gcp \
        -cloud.gcp.projectid=myproject
Amazon Web Services
      sequenceDiagram
            Prometheus->>cloud-carbon-exporter: scrape metrics
            cloud-carbon-exporter->>AWS Cost Explorer API: Query used services and regions
            cloud-carbon-exporter->>AWS Resources API: Describe Resource
            cloud-carbon-exporter->>Cloudwatch API: Get Resource statistics
            cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics

The exporter uses the AWS default credential chain:

  • Environment Variables (AWS_SECRET_ACCESS_KEY, AWS_ACCESS_KEY_ID, AWS_SESSION_TOKEN)
  • Shared Configuration
  • Shared Credentials files.
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
        -cloud.provider=aws
Scaleway
      sequenceDiagram
            Prometheus->>cloud-carbon-exporter: scrape metrics
            cloud-carbon-exporter->>Regional APIs: Query all used resources
            cloud-carbon-exporter->>Cockpit: Get Resource statistics
            cloud-carbon-exporter-->>Prometheus: Returns Watts and CO2 metrics

Configure the exporter via:

  • Environment Variables (SCW_ACCESS_KEY, SCW_SECRET_KEY)
$ docker run -p 2922 ghcr.io/superdango/cloud-carbon-exporter:latest \
        -cloud.provider=scw
Deployment

Cloud Carbon Exporter can easily run on serverless platforms like GCP Cloud Run or AWS Lambda for testing purposes. However, we recommend running the exporter as a long-lived process so it keeps its cache in memory (lowering API costs).

Usage
Usage of ./cloud-carbon-exporter:
  -cloud.aws.defaultregion string
        aws default region (default "us-east-1")
  -cloud.aws.rolearn string
        aws role arn to assume
  -cloud.gcp.projectid string
        gcp project to explore resources from
  -cloud.provider string
        cloud provider type (gcp, aws, scw)
  -demo.enabled string
        return fictive demo data (default "false")
  -listen string
        addr to listen to (default "0.0.0.0:2922")
  -log.format string
        log format (text, json) (default "text")
  -log.level string
        log severity (debug, info, warn, error) (default "info")

Environment Variables:
  SCW_ACCESS_KEY
        scaleway access key
  SCW_SECRET_KEY
        scaleway secret key

Additional Cloud Cost

Calls to cloud monitoring APIs can incur additional costs. The exporter does its best to cache API responses and thereby lower the impact on your bill. API costs are directly correlated with the number of resources (instances, buckets, load balancers, etc.) the exporter generates data from. Here are the average costs you may observe per resource on your cloud account or project, with a 15-minute cache TTL:

  • AWS: $0.06 / month per resource
  • GCP: $0.03 / month per resource (expected to drop tenfold in October 2025)
  • SCW: free

The prices shown above are from March 2025 and are subject to change by the cloud providers.

You can use the Cost Calculator file to make finer estimates with your own inputs. In this file, you can also anticipate the storage cost of carbon metrics if you choose to use the cloud provider's monitoring service.

Permissions & Security

The exporter requires permissions to automatically discover resources in your cloud environment. For a quick and easy setup, you can grant it read-only access to your entire cloud platform, such as the Project Viewer role or ViewOnlyAccess policy.

If you'd prefer a more precise approach, you can authorize only the specific API calls needed for the services you use. A detailed list of required permissions for each cloud provider service will be available soon.

If the exporter encounters a missing permission, it will log a warning with details about the issue and increment the error_count{action="collect"} value. We recommend periodically monitoring this metric and adjusting permissions as needed to ensure smooth operation.

Development

go build \
    -o exporter \
    github.com/superdango/cloud-carbon-exporter/cmd && \
    ./exporter -cloud.provider=aws -log.level=debug

Acknowledgements

We're grateful for every contribution that helps shape Cloud Carbon Exporter. Whether it's through testing, feedback, or documentation, each effort strengthens our software and enhances the user experience.

We'd like to extend our heartfelt appreciation to the individuals who have invested significant time and energy into making this project better.

Contributing

We appreciate your input and contributions to Cloud Carbon Exporter. Here's how you can help:

Share Feedback and Ideas · Found a bug or have a feature idea? Start a discussion in our GitHub Discussions. Your testing and feedback are crucial to improving the project.

Code Contributions · We're actively refactoring to improve the codebase. For now, we're focusing on smaller, targeted contributions to ensure a smooth integration.

Model Contributions · Contribute directly to upstream models such as Boavizta or Cloud Carbon Footprint.

Thank you for your support!

Sponsor

dangofish.com - Tools and Services for Carbon-Driven Developers.

Licence

This software is provided as is, without warranty, under the AGPL-3.0 licence.

Documentation

Index

Constants

This section is empty.

Variables

var ZeroEmissions = EmissionsOverTime{During: time.Hour, Emissions: 0}

Functions

func MergeLabels

func MergeLabels(labels ...map[string]string) map[string]string
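A sketch of what `MergeLabels` plausibly does, combining several label maps into one; the precedence (later maps overriding earlier keys) is our assumption, not documented behavior:

```go
package main

import "fmt"

// mergeLabels combines several label maps into a single map.
// Assumption: when the same key appears in multiple maps, the value
// from the later map wins.
func mergeLabels(labels ...map[string]string) map[string]string {
	merged := make(map[string]string)
	for _, m := range labels {
		for k, v := range m {
			merged[k] = v
		}
	}
	return merged
}

func main() {
	base := map[string]string{"provider": "aws", "region": "us-east-1"}
	override := map[string]string{"region": "eu-west-1"}
	fmt.Println(mergeLabels(base, override))
}
```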

Types

type Context

type Context interface {
	context.Context
	IncrCalls()
	Calls() int
}

func WrapCtx

func WrapCtx(ctx context.Context) Context

type Ctx

type Ctx struct {
	context.Context
	// contains filtered or unexported fields
}

func (*Ctx) Calls

func (c *Ctx) Calls() int

func (*Ctx) IncrCalls

func (c *Ctx) IncrCalls()

type Emissions

type Emissions float64

Emissions in gCO2eq

func (Emissions) KgCO2eq

func (e Emissions) KgCO2eq() float64

func (Emissions) TCO2eq

func (e Emissions) TCO2eq() float64

type EmissionsOverTime

type EmissionsOverTime struct {
	During    time.Duration
	Emissions Emissions
}

func CombineEmissionsOverTime

func CombineEmissionsOverTime(eots ...EmissionsOverTime) EmissionsOverTime

func (EmissionsOverTime) KgCO2eq_day

func (k EmissionsOverTime) KgCO2eq_day() float64

func (EmissionsOverTime) KgCO2eq_second

func (k EmissionsOverTime) KgCO2eq_second() float64

func (EmissionsOverTime) KgCO2eq_year

func (k EmissionsOverTime) KgCO2eq_year() float64

type Energy

type Energy float64

Energy in watts

type Explorer

type Explorer interface {
	CollectImpacts(ctx Context, impacts chan *Impact, errors chan error)
	Init(ctx context.Context) error
	IsReady() bool
	SupportedServices() []string
	io.Closer
}
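A minimal sketch of the `CollectImpacts` contract as we read it: the explorer streams one Impact per discovered resource on the impacts channel and reports failures on the errors channel. The trimmed `impact` type and the convention of closing both channels when discovery finishes are our assumptions.

```go
package main

import "fmt"

// impact is a trimmed stand-in for the package's Impact type.
type impact struct {
	Labels map[string]string
	Watts  float64
}

// collectImpacts streams one impact per discovered resource, then
// closes both channels to signal that discovery is done (assumed
// convention, not documented behavior).
func collectImpacts(impacts chan<- *impact, errs chan<- error) {
	defer close(impacts)
	defer close(errs)
	for _, name := range []string{"i-123", "bucket-a"} {
		impacts <- &impact{Labels: map[string]string{"name": name}, Watts: 10}
	}
}

func main() {
	impacts := make(chan *impact)
	errs := make(chan error)
	go collectImpacts(impacts, errs)

	total := 0.0
	for i := range impacts {
		total += i.Watts // aggregate energy across all resources
	}
	for err := range errs {
		fmt.Println("collect error:", err)
	}
	fmt.Println("total watts:", total)
}
```

Streaming over channels lets the HTTP handler start formatting metrics while discovery is still in flight, instead of buffering every resource first.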

type ExplorerErr

type ExplorerErr struct {
	Err       error
	Operation string
}

func (*ExplorerErr) Error

func (explorerErr *ExplorerErr) Error() string

func (*ExplorerErr) Unwrap

func (explorerErr *ExplorerErr) Unwrap() error

type Impact

type Impact struct {
	// Labels for impact
	Labels map[string]string
	// Energy in watts
	Energy Energy
	// EnergyEmissions are emissions related to energy in kgCO2eq/day
	EnergyEmissions EmissionsOverTime
	// EmbodiedEmissions are emissions related to the manufacturing
	EmbodiedEmissions EmissionsOverTime
}

type Metric

type Metric struct {
	Name   string
	Labels map[string]string
	Value  float64
}

Metric holds the name and value of a measurement in addition to its labels.

func NewEmbodiedEmissionsMetric

func NewEmbodiedEmissionsMetric(value EmissionsOverTime) *Metric

func NewEmissionsMetric

func NewEmissionsMetric(value EmissionsOverTime) *Metric

func NewEnergyMetric

func NewEnergyMetric(value Energy) *Metric

func (*Metric) AddLabel

func (m *Metric) AddLabel(key, value string) *Metric

func (Metric) Clone

func (m Metric) Clone() Metric

Clone returns a deep copy of the metric.

func (*Metric) SanitizeLabels

func (m *Metric) SanitizeLabels() *Metric

func (*Metric) SetLabels

func (m *Metric) SetLabels(l map[string]string) *Metric

func (*Metric) SetValue

func (m *Metric) SetValue(v float64) *Metric
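Since the setters return the metric, labels and value can be chained fluently. A sketch with a local mirror of the type; the method bodies are our assumption of the obvious implementation:

```go
package main

import "fmt"

// metric mirrors Metric and its chaining setters.
type metric struct {
	Name   string
	Labels map[string]string
	Value  float64
}

// AddLabel sets one label and returns the metric for chaining.
func (m *metric) AddLabel(key, value string) *metric {
	if m.Labels == nil {
		m.Labels = make(map[string]string)
	}
	m.Labels[key] = value
	return m
}

// SetValue sets the measurement value and returns the metric for chaining.
func (m *metric) SetValue(v float64) *metric {
	m.Value = v
	return m
}

func main() {
	m := (&metric{Name: "estimated_watts"}).
		AddLabel("provider", "gcp").
		AddLabel("region", "europe-west1").
		SetValue(42)
	fmt.Println(m.Name, m.Labels["region"], m.Value)
}
```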

type OpenMetricsHandler

type OpenMetricsHandler struct {
	// contains filtered or unexported fields
}

OpenMetricsHandler implements the http.Handler interface

func NewOpenMetricsHandler

func NewOpenMetricsHandler(explorerName string, explorer Explorer) *OpenMetricsHandler

NewOpenMetricsHandler creates a new OpenMetricsHandler.

func (*OpenMetricsHandler) ServeHTTP

func (handler *OpenMetricsHandler) ServeHTTP(w http.ResponseWriter, r *http.Request)

ServeHTTP implements the http.Handler interface. It collects all metrics from the configured collector and returns them, formatted, in the HTTP response.

Directories

Path Synopsis
internal
aws
gcp
scw
model
