spdx/spdx-3-model

[3.1] AI: Add inferenceEnergyConsumption and totalEnergyConsumption

Closed this issue · 4 comments

Per discussion in #671 (comment) with @bennetkl, also related to #677 and #682, the next version of energyConsumption may also include the energy consumption at inference time.

It is also possible to include the estimation of the energy consumption for the entire life cycle of an AI system.

This issue will be used to collect:

  • Definitions
  • Estimation methods
  • Links to relevant technical/legal requirements or documents

Proposed properties and descriptions

Each property is a free-form text field that captures the amount of energy needed to train and operate the AI model.

  • trainingEnergyConsumption for energy consumption during the training of an AI model

    • Training energy consumption is the amount of energy consumed during the training phase of an AI model. This includes the energy used to power the hardware and software systems used for training, such as GPUs, CPUs, and data centers.
    • Renamed from energyConsumption #677
    • EU AI Act refs: see Points (2)(d) and (2)(e) in Annex IXa Section 1 and Point (c) in Annex IXc
  • inferenceEnergyConsumption for energy consumption per inference

    • Inference energy consumption is the amount of energy consumed per inference of an AI model. This includes the energy used to power the hardware and software systems used for making predictions or decisions based on the model, such as edge devices, servers, or cloud services.
  • totalEnergyConsumption for energy consumption during the entire life cycle of the system

    • Total energy consumption is the total amount of energy consumed by an AI model over its entire lifecycle, including training, inference, and any other related activities.

Another way to record the energy consumption is to record the compute, which can be expressed in either:

  • FLOPs or
  • multiply-accumulate (MAC) or multiply-add (MAD)

The approach is not to record/estimate the energy consumption directly, but instead to record the amount of compute.

"The unit of energy is normalized in terms of the energy for a multiply-and-accumulate (MAC) operation (i.e., 10^2 = energy of 100 MACs)." [1]

The actual energy consumption can be calculated from that normalized unit and varies with the actual hardware used for the computation, which can become more energy-efficient over time.

Because of this, the record-the-compute approach may be less relevant for trainingEnergyConsumption, as that property is intended to record what already happened in the past. (It can be more relevant if the AI package provider/maintainer also provides the types of hardware they used for the training.)

On the other hand, this approach will be very useful for inferenceEnergyConsumption, as inference can be done on hardware different from what the AI package was originally designed for or tested with. The AI deployer will have more freedom to estimate the energy consumption for their own hardware.
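As an illustration of that deployer-side estimation, here is a minimal sketch. All hardware classes and energy-per-MAC figures below are hypothetical placeholders, not measured values; real figures depend on the specific chip, numeric precision, and memory access pattern.

```python
# Hypothetical energy cost per multiply-accumulate (MAC), in picojoules.
# These numbers are illustrative only, not benchmarks.
ENERGY_PER_MAC_PJ = {
    "edge-npu": 1.0,
    "server-gpu": 4.5,
    "cpu": 20.0,
}

def inference_energy_joules(mac_count: int, hardware: str) -> float:
    """Estimate the energy of one inference from a recorded MAC count."""
    picojoules = mac_count * ENERGY_PER_MAC_PJ[hardware]
    return picojoules * 1e-12  # picojoules -> joules

# A model whose card records ~5.6e8 MACs per inference:
print(inference_energy_joules(560_000_000, "edge-npu"))   # 0.00056 J
print(inference_energy_joules(560_000_000, "server-gpu"))  # 0.00252 J
```

The point is that the MAC count is a fixed property of the model, while the joules-per-MAC factor is supplied by the deployer for their own hardware, so the same recorded compute figure stays useful as hardware efficiency improves.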

See:

Update: will be in 3.0, per AI team meeting 2024-04-03

See these PRs:

Was included in #648 to avoid a breaking change in the future.

bact commented

From 2024-04-10 meeting:

Hugging Face model cards can include carbon footprint information:

This could complement the energy consumption properties. For 3.1?