Commit bdb7b4e

return initial embeds
1 parent 7afa63e commit bdb7b4e

File tree

2 files changed: +6, -1 lines

pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [project]
 name = "x-transformers"
-version = "2.7.1"
+version = "2.7.2"
 description = "X-Transformers"
 authors = [
     { name = "Phil Wang", email = "[email protected]" }
```

x_transformers/x_transformers.py

Lines changed: 5 additions & 0 deletions

```diff
@@ -48,6 +48,7 @@ class LayerIntermediates:
     attn_z_loss: Tensor | None = None
     mems: Tensor | None = None
     last_layer_hiddens: Tensor | None = None
+    initial_embeds: Tensor | None = None
     attn_pooled_tokens: Tensor | None = None
     memory_tokens: Tensor | None = None
     logit_entropies: Tensor | None = None
```
```diff
@@ -3378,6 +3379,10 @@ def forward(
 
         intermediates.last_layer_hiddens = x
 
+        # store initial embed
+
+        intermediates.initial_embed = init_embed
+
         # global average pool
 
         if self.average_pool_embed:
```
