Hello, I finished the first-run implementation of tzrec-based recommendation training. Thanks for the help along the way. One thing I don't understand: I use EmbeddingGroup to manage the tables, and I build the features as IdFeature and LookupFeature, but no matter how I configure embedding_dim in the config, the table dimension is always 4 when I check the sharding plan. Is this correct? Please see my embedding config below.
```python
feature_configs = []
# Build a LookupFeature config for each feature in each embedding bag
for emb_bag_name, emb_bag_config in embedding_bag_configs.items():
    for feature_name in emb_bag_config.feature_names:
        feature_configs.append(
            feature_pb2.FeatureConfig(
                lookup_feature=feature_pb2.LookupFeature(
                    feature_name=feature_name,
                    embedding_name=emb_bag_config.name,
                    embedding_dim=128,
                    pooling='mean',
                    num_buckets=emb_bag_config.num_embeddings,
                )
            )
        )

# Create features
features = create_features(feature_configs)

# Define feature groups
feature_groups = [
    model_pb2.FeatureGroupConfig(
        group_name="wide",
        feature_names=embedding_features,
        group_type=model_pb2.FeatureGroupType.WIDE,
    ),
]

# Initialize EmbeddingGroup
embedding_group = EmbeddingGroup(features, feature_groups, device=device)
return embedding_group, embedding_features
```
If you want the embedding_dim in the feature config to take effect, the group_type should be specified as model_pb2.FeatureGroupType.DEEP. If group_type is set to model_pb2.FeatureGroupType.WIDE, the table dimension will be 4; we would prefer to set it to 1, but a current limitation in torchrec (documented in this GitHub issue) prevents that.
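To make this concrete, here is a minimal sketch of the fix based on the answer above. The only change from the question's config is switching the feature group's group_type from WIDE to DEEP; the names embedding_features, features, model_pb2, EmbeddingGroup, and device are carried over from the question's snippet and assumed to be defined as there.

```python
# Sketch: same setup as the question, but with a DEEP feature group so the
# embedding_dim=128 from the feature config is respected in the sharding
# plan, instead of the WIDE group forcing the table dimension to 4.
feature_groups = [
    model_pb2.FeatureGroupConfig(
        group_name="deep",
        feature_names=embedding_features,
        group_type=model_pb2.FeatureGroupType.DEEP,  # was WIDE in the question
    ),
]
embedding_group = EmbeddingGroup(features, feature_groups, device=device)
```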