Documentation writing #72

Merged
merged 11 commits into from
Aug 20, 2024
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.yml
@@ -1,4 +1,4 @@
name: Bug
name: 🐛 Bug
description: Report a bug or an unexpected behavior
labels: [bug]

2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature_request.yml
@@ -1,4 +1,4 @@
name: Feature request
name: 🚀 Feature request
description: Suggest a new feature for EcoLogits
labels: [feature request]

27 changes: 0 additions & 27 deletions .github/ISSUE_TEMPLATE/issue_template.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/contributing.md
@@ -115,4 +115,4 @@ Code examples are highly encouraged, but should be kept short, simple and self-c

## Acknowledgment

We'd like to acknowledge that this contribution guide is heavily inspired by the excellent [guide from Pydantic](https://docs.pydantic.dev/latest/contributing/). Thanks the inspiration! :heart:
We'd like to acknowledge that this contribution guide is heavily inspired by the excellent [guide from Pydantic](https://docs.pydantic.dev/latest/contributing/). Thanks for the inspiration! :heart:
28 changes: 27 additions & 1 deletion docs/css/extra.css
@@ -31,4 +31,30 @@ Light green: 00bf63
--md-typeset-a-color: #00bf63 !important;
--md-accent-fg-color: #008343;
--md-primary-fg-color--dark: #062522;
}
}


.provider-item {
display: inline-block;
margin: 4px 4px;
border-radius: 5px;
}

.provider-item:hover {
background: #0000000c;
}

[data-md-color-scheme="slate"] {
.provider-item:hover {
background: #0000005f;
}
}

.provider-item label {
padding: 1px 4px 1px 0;
/*padding-right: 4px;*/
}

.provider-item input {
accent-color: var(--md-accent-fg-color);
}
46 changes: 46 additions & 0 deletions docs/faq.md
@@ -0,0 +1,46 @@
# Frequently Asked Questions


## Why are training impacts not included?

Even though the training impacts of generative AI models are substantial, we currently do not include them in our methodologies and tools. EcoLogits aims at estimating the impacts of an API request made to a GenAI service. A complete impact assessment should indeed account for training. However, given that we focus on services used by millions of people, serving billions of requests annually, the training impacts become negligible in comparison.

For example, looking at Llama 3 70B, the estimated training greenhouse gas emissions are $1,900\ tCO2eq$. This is significant for an AI model, but when amortized over, say, 100 billion requests annually, the share of impacts attributable to training becomes very small: $\frac{1,900\ \text{tCO2eq}}{100\ \text{billion requests}} = 1.9 \times 10^{-8}\ \text{tCO2eq per request}$, or $0.019\ \text{gCO2eq per request}$. Compare this to running a single request to Llama 3 70B, which yields $1\ \text{to}\ 5\ \text{gCO2eq}$ (calculated with our methodology).

This does not mean that we do not plan to integrate training impacts; it is just not a priority right now given the difference in order of magnitude. It is also worth mentioning that estimating the total number of requests that will ever be made over the lifespan of a model is very difficult, both for open-source and proprietary models. You can join the discussion on [GitHub #70 :octicons-link-external-16:](https://github.com/genai-impact/ecologits/discussions/70).
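This back-of-the-envelope amortization can be reproduced in a few lines (the training emissions figure comes from the example above; the annual request volume is an illustrative assumption):

```python
# Amortize estimated training emissions over a hypothetical request volume.
TRAINING_EMISSIONS_TCO2EQ = 1_900  # estimated training emissions of Llama 3 70B
ANNUAL_REQUESTS = 100 * 10**9      # illustrative assumption: 100 billion requests/year

# Per-request share of training emissions, converted from tonnes to grams.
per_request_tco2eq = TRAINING_EMISSIONS_TCO2EQ / ANNUAL_REQUESTS
per_request_gco2eq = per_request_tco2eq * 10**6  # 1 tonne = 10^6 grams

print(f"{per_request_gco2eq:.3f} gCO2eq per request")  # → 0.019 gCO2eq per request
```

Varying the assumed request volume shows how sensitive the per-request share is to a model's lifetime usage, which is precisely the quantity that is hard to estimate.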


## What's the difference with CodeCarbon?

EcoLogits and [CodeCarbon :octicons-link-external-16:](https://github.com/mlco2/codecarbon) are two different tools that do not address the same use case. CodeCarbon should be used when you control the execution environment of your model. This means that if you deploy models on your laptop, your server, or in the cloud, it is preferable to use CodeCarbon to measure energy consumption and estimate the carbon emissions associated with running your model (including training, fine-tuning or inference).

On the other hand, EcoLogits is designed for scenarios where you do not have access to the execution environment of your GenAI model because it is managed by a third-party provider. In such cases, you can rely on EcoLogits to estimate the energy consumption and environmental impacts of inference workloads. Both tools are complementary and can be used together to provide a comprehensive view of environmental impacts across deployment scenarios.


## How can I estimate impacts of general use of GenAI models?

If you want to estimate the environmental impacts of using generative AI models without coding or making requests, we recommend using our online webapp [EcoLogits Calculator :octicons-link-external-16:](https://huggingface.co/spaces/genai-impact/ecologits-calculator).


## How do we assess impacts for proprietary models?

Environmental impacts are calculated based on model architecture and parameter count. For proprietary models, we lack transparency from providers, so we estimate parameter counts using available information. For GPT models, we base our estimates on the leaked GPT-4 architecture and scale the parameter counts for GPT-4-Turbo and GPT-4o based on pricing differences. For other proprietary models such as Anthropic's Claude, we assume similar impacts for models released around the same time with similar performance on public benchmarks. Please note that these estimates are based on assumptions and may not be exact. Our methods are open-source and transparent, so you can always see the hypotheses we use.


## How to reduce my environmental impact?

First, you may want to assess [**indirect impacts** :octicons-link-external-16:](https://www.tsw.co.uk/blog/environmental/indirect-environment-impacts/) and [**rebound effects** :octicons-link-external-16:](https://en.wikipedia.org/wiki/Rebound_effect_(conservation)) of the project you are building. Does the end purpose of your product or service negatively impact the environment? Does its usage drive up consumption and the environmental impacts of previously existing technologies?

Try to **be frugal** and question your usages or needs of AI:

- Do you really need AI to solve your problem?
- Do you really need GenAI to solve your problem? (you can read this [paper :octicons-link-external-16:](https://arxiv.org/pdf/2305.05862))
- Prefer fine-tuning small, existing models over using generalist models.
- Evaluate the environmental impacts of your project before, during and after development with tools like EcoLogits or [CodeCarbon :octicons-link-external-16:](https://github.com/mlco2/codecarbon) ([see more tools :octicons-link-external-16:](https://github.com/samuelrince/awesome-green-ai)).
- Restrict the use case and limit the usage of your tool or feature to the desired purpose.

**Do not buy new GPUs or hardware**. Hardware production for data centers accounts for around 50% of the impacts compared to usage impacts. The share is even bigger for consumer devices, at around 80%.

Use cloud instances located in low-emission, energy-efficient data centers (see [electricitymaps.com :octicons-link-external-16:](https://app.electricitymaps.com/map)).

**Optimize your models for production use cases.** You can look at model compression techniques such as quantization, pruning or distillation. Some inference software also offers optimization tricks.
68 changes: 36 additions & 32 deletions docs/index.md
@@ -5,7 +5,7 @@
![EcoLogits](assets/logo_dark.png#only-dark)
</figure>

**EcoLogits** tracks the energy consumption and environmental impacts of using generative AI models through APIs. It supports major LLM providers such as OpenAI, Anthropic, Mistral AI and more (see [supported providers](providers.md)).
**EcoLogits** tracks the energy consumption and environmental impacts of using generative AI models through APIs. It supports major LLM providers such as OpenAI, Anthropic, Mistral AI and more (see [supported providers](tutorial/providers.md)).


## Requirements
@@ -20,38 +20,42 @@ EcoLogits relies on key libraries to provide essential functionalities:

## Installation

To install EcoLogits, use the following command:
<p><strong>Select providers</strong></p>
<span class="provider-item">
<input type="checkbox" id="anthropic" value="anthropic" class="provider-option">
<label for="anthropic">Anthropic</label>
</span>
<span class="provider-item">
<input type="checkbox" id="cohere" value="cohere" class="provider-option">
<label for="cohere">Cohere</label>
</span>
<span class="provider-item">
<input type="checkbox" id="google-generativeai" value="google-generativeai" class="provider-option">
<label for="google-generativeai">Google Gemini</label>
</span>
<span class="provider-item">
<input type="checkbox" id="huggingface-hub" value="huggingface-hub" class="provider-option">
<label for="huggingface-hub">Hugging Face Inference Endpoints</label>
</span>
<span class="provider-item">
<input type="checkbox" id="litellm" value="litellm" class="provider-option">
<label for="litellm">LiteLLM</label>
</span>
<span class="provider-item">
<input type="checkbox" id="mistralai" value="mistralai" class="provider-option">
<label for="mistralai">Mistral AI</label>
</span>
<span class="provider-item">
<input type="checkbox" id="openai" value="openai" checked="checked" class="provider-option">
<label for="openai">OpenAI</label>
</span>

<p><strong>Run this command</strong></p>
<pre><code id="install-command"></code></pre>
<script src="js/installer.js"></script>

For detailed instructions on each provider, refer to the complete list of [supported providers and features](tutorial/providers.md). It is also possible to install EcoLogits without any provider.

```shell
pip install ecologits
```

For integration with specific providers like OpenAI, additional dependencies can be installed using:

```shell
pip install ecologits[openai]
```

EcoLogits currently supports the following providers/clients:

- `anthropic`
- `cohere`
- `google-generativeai`
- `huggingface-hub` (Hugging Face Inference Endpoints)
- `litellm`
- `mistralai`
- `openai`

For detailed instructions on each provider, refer to the complete list of [supported providers and features](providers.md).


??? info "Installation with multiple providers"

If you need to install extra dependencies for **multiple providers** at once, you can do so by separating them with a comma, like this:

```shell
pip install ecologits[openai,anthropic]
```

## Usage Example

24 changes: 24 additions & 0 deletions docs/js/installer.js
@@ -0,0 +1,24 @@
document.addEventListener('DOMContentLoaded', function() {
const checkboxes = document.querySelectorAll('input.provider-option[type="checkbox"]');
const commandOutput = document.getElementById('install-command');

function updateCommand() {
let selectedProviders = [];
checkboxes.forEach(checkbox => {
if (checkbox.checked) {
selectedProviders.push(checkbox.value);
}
});

let command = 'pip install ecologits';
if (selectedProviders.length > 0) {
command += `[${selectedProviders.join(',')}]`;
}
commandOutput.textContent = command;
}

checkboxes.forEach(checkbox => checkbox.addEventListener('change', updateCommand));

// Initialize with default command
updateCommand();
});
17 changes: 12 additions & 5 deletions docs/methodology/index.md
@@ -14,12 +14,19 @@ Upcoming methodologies ([join us](https://genai-impact.org/contact) to help spee

## Methodological background

!!! warning "Under construction..."
EcoLogits employs the **Life Cycle Assessment (LCA) methodology**, as defined by ISO 14044, to estimate the environmental impacts of requests made to generative AI inference services. This approach covers **multiple phases** of the lifecycle: raw material extraction, manufacturing and transportation (together denoted as embodied impacts), usage, and end-of-life. Note that we currently do not cover the end-of-life phase due to data limitations on e-waste recycling.

[//]: # (- What's an LCA?)
[//]: # (- What are usage and embodied phases?)
[//]: # (- What about end-of-life phase?)
[//]: # (- What are multicriteria impacts? &#40;GWP, ADPe, MJ and others&#41;)
Our assessment considers **three key environmental criteria**:

- **Global Warming Potential (GWP)**: Evaluates the impact on global warming in terms of CO2 equivalents.
- **Abiotic Resource Depletion for Elements (ADPe)**: Assesses the consumption of raw minerals and metals, expressed in antimony equivalents.
- **Primary Energy (PE)**: Calculates energy consumed from natural sources, expressed in megajoules.

Using a **bottom-up modeling approach**, we assess and aggregate the environmental impacts of all individual service components. This method differs from top-down approaches by allowing precise allocation of each resource's impact to the overall environmental footprint.

Our current focus is on high-performance GPU-accelerated cloud instances, crucial for GenAI inference tasks. While we exclude impacts from training, networking, and end-user devices, we thoroughly evaluate the impacts associated with hosting and running the model inferences.

The methodology is grounded in **transparency** and **reproducibility**, utilizing open market and technical data to ensure our results are reliable and verifiable.
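The bottom-up aggregation can be illustrated with a small conceptual sketch (this is not EcoLogits' actual data model — the component split and the per-request figures below are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Impacts:
    """Multicriteria impacts: GWP (kgCO2eq), ADPe (kgSbeq), PE (MJ)."""
    gwp: float
    adpe: float
    pe: float

    def __add__(self, other: "Impacts") -> "Impacts":
        # Criteria are aggregated independently, criterion by criterion.
        return Impacts(self.gwp + other.gwp,
                       self.adpe + other.adpe,
                       self.pe + other.pe)

# Invented per-request figures for two service components (usage phase of the
# GPU instance vs. embodied impacts of the server hardware), purely to show
# how a bottom-up sum allocates each component's share to the total footprint.
usage = Impacts(gwp=2.3e-3, adpe=1.0e-9, pe=4.0e-2)
embodied = Impacts(gwp=4.0e-4, adpe=6.0e-8, pe=3.0e-3)

total = usage + embodied  # Impacts(gwp=2.7e-3, adpe=6.1e-8, pe=4.3e-2)
```

In a top-down approach, by contrast, a global footprint would be divided among services, losing the per-component attribution that the sum above makes explicit.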

## Licenses and citations

4 changes: 0 additions & 4 deletions docs/roadmap.md

This file was deleted.

4 changes: 0 additions & 4 deletions docs/tutorial/impacts.md
@@ -1,9 +1,5 @@
# Environmental Impacts

!!! warning

This page is under construction.

Environmental impacts are reported for each request in the [`Impacts`][impacts.modeling.Impacts] pydantic model and features multiple [criteria](#criteria) such as the [energy](#energy) and [global warming potential](#global-warming-potential-gwp) per phase ([usage](#usage) or [embodied](#embodied)) as well as the total impacts.

To learn more on how we estimate the environmental impacts and what are our hypotheses go to the [methodology](../methodology/index.md) section.
6 changes: 1 addition & 5 deletions docs/tutorial/index.md
@@ -1,9 +1,5 @@
# Tutorial

!!! warning

This page is under construction.

The :seedling: **EcoLogits** library tracks the energy consumption and environmental impacts of generative AI models accessed through APIs and their official client libraries.

It achieves this by **patching the Python client libraries**, ensuring that each API request is wrapped with an impact calculation function. This function computes the **environmental impact based on several request features**, such as the **chosen model**, the **number of tokens generated**, and the **request's latency**. The resulting data is then encapsulated in an `Impacts` object, which is added to the response, containing the environmental impacts for a specific request.
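The patching mechanism can be illustrated with a simplified, self-contained sketch. The client class, the `Impacts` fields and the linear impact formula below are all invented to show the idea; the real library patches the providers' official client libraries and uses its own impact models:

```python
import functools
from dataclasses import dataclass

@dataclass
class Impacts:
    energy_kwh: float
    gwp_kgco2eq: float

class FakeChatResponse:
    def __init__(self, text: str, completion_tokens: int):
        self.text = text
        self.completion_tokens = completion_tokens

class FakeClient:
    """Stand-in for a provider's API client."""
    def chat(self, prompt: str) -> FakeChatResponse:
        return FakeChatResponse(text="Hello!", completion_tokens=5)

def patch_client(client: FakeClient) -> None:
    """Wrap the client's chat method so every response carries impacts."""
    original = client.chat

    @functools.wraps(original)
    def wrapper(prompt: str) -> FakeChatResponse:
        response = original(prompt)
        # Invented linear model: impacts proportional to generated tokens.
        energy = 1e-4 * response.completion_tokens
        response.impacts = Impacts(energy_kwh=energy,
                                   gwp_kgco2eq=energy * 0.05)
        return response

    client.chat = wrapper

client = FakeClient()
patch_client(client)
response = client.chat("Hi")
print(response.impacts.energy_kwh)
```

The request itself is untouched; the wrapper only post-processes the response, which is why patched clients remain drop-in compatible with existing code.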
@@ -34,7 +30,7 @@ It achieves this by **patching the Python client libraries**, ensuring that each

List of providers and tutorials on how to make requests.

[:octicons-arrow-right-24: Providers](../providers.md)
[:octicons-arrow-right-24: Providers](providers.md)

- :material-test-tube:{ .lg .middle } __Methodology__

20 changes: 10 additions & 10 deletions docs/providers.md → docs/tutorial/providers.md
@@ -2,15 +2,15 @@

## List of all providers

| Provider name | Extra for installation | Guide |
|------------------|------------------------|----------------------------------------------------------------------------------------|
| Anthropic | `anthropic` | [Guide for Anthropic :octicons-link-16:](tutorial/providers/anthropic.md) |
| Cohere | `cohere` | [Guide for Cohere :octicons-link-16:](tutorial/providers/cohere.md) |
| Google Gemini | `google-generativeai` | [Guide for Google Gemini :octicons-link-16:](tutorial/providers/google.md) |
| Hugging Face Hub | `huggingface-hub` | [Guide for Hugging Face Hub :octicons-link-16:](tutorial/providers/huggingface_hub.md) |
| LiteLLM | `litellm` | [Guide for LiteLLM :octicons-link-16:](tutorial/providers/litellm.md) |
| Mistral AI | `mistralai` | [Guide for Mistral AI :octicons-link-16:](tutorial/providers/mistralai.md) |
| OpenAI | `openai` | [Guide for OpenAI :octicons-link-16:](tutorial/providers/openai.md) |
| Provider name | Extra for installation | Guide |
|------------------|------------------------|-------------------------------------------------------------------------------|
| Anthropic | `anthropic` | [Guide for Anthropic :octicons-link-16:](providers/anthropic.md) |
| Cohere | `cohere` | [Guide for Cohere :octicons-link-16:](providers/cohere.md) |
| Google Gemini | `google-generativeai` | [Guide for Google Gemini :octicons-link-16:](providers/google.md) |
| Hugging Face Hub | `huggingface-hub` | [Guide for Hugging Face Hub :octicons-link-16:](providers/huggingface_hub.md) |
| LiteLLM | `litellm` | [Guide for LiteLLM :octicons-link-16:](providers/litellm.md) |
| Mistral AI | `mistralai` | [Guide for Mistral AI :octicons-link-16:](providers/mistralai.md) |
| OpenAI | `openai` | [Guide for OpenAI :octicons-link-16:](providers/openai.md) |


## Chat Completions
@@ -25,4 +25,4 @@
| Mistral AI | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
| OpenAI | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |

Partial support for Anthropic streams, see full documentation: [Anthropic provider](tutorial/providers/anthropic.md).
Partial support for Anthropic streams, see full documentation: [Anthropic provider](providers/anthropic.md).