bug in Leaky_relu instantiation #1076
Hi, can you please be more specific about your setup and model? Which version of hls4ml are you using? Which io_type and backend are you trying to use?
Thank you for your reply.
There have been some changes in hls4ml to the data types of activations. Can you please confirm that you have the latest version of hls4ml? You can either install it via pip or clone it from GitHub and install it from the cloned repository.
Version 0.6.0 is certainly a bit old. As I said, it seems to be working fine in the current master, so I would indeed encourage you to either test with the master or install version 0.8 via pip.
I have the same problem. I am already on the updated version 0.8.1 but still get the same error.
Can you give an example that we can run that shows the problem?
### Model:
num_resnet_blocks = 6
rf_in = Input(shape=(32, 32, 2), name='rf_input')
x = Conv2D(num_filters, (kernel_size), activation=None, padding='same', kernel_initializer='lecun_uniform', kernel_regularizer=l1(0.0001))(rf_in)
for i in range(num_resnet_blocks):
    x = Conv2D(num_filters, (kernel_size), activation=None, padding='same')(x)
x = GlobalAveragePooling2D()(x)
dense_1 = Dense(128, activation='leaky_relu', kernel_initializer='lecun_uniform', kernel_regularizer=l1(0.0001))(x)
model = keras.Model(rf_in, softmax)
### Hls4ml setup:
hls_config = hls4ml.utils.config_from_keras_model(model, granularity='name', default_reuse_factor=32)
hls_config['LayerName']['dense_2']['exp_table_t'] = 'ap_fixed<16,6>'
I confirmed the bug in the main branch and will try to fix it. In the meantime, the following works (though you should check whether the default alpha of the LeakyReLU layer matches the one used by your original activation).
#!/usr/bin/env python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import (
Activation,
Conv2D,
Dense,
BatchNormalization,
Add,
Input,
GlobalAveragePooling2D,
Dropout,
Flatten,
LeakyReLU,
)
import hls4ml
### Model:
def resnet_block(input_data, filters, conv_size):
shortcut = input_data # Store the original input for the shortcut connection
# Pre-activation (Batch Normalization and Activation before Convolution)
x = BatchNormalization()(input_data)
x = LeakyReLU()(x)
# Bottleneck 1x1 Convolution (reduce channels)
x = Conv2D(filters // 4, 1, activation=None, padding='same', kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
# 3x3 Convolution
x = Conv2D(filters // 4, conv_size, activation=None, padding='same',kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
# Bottleneck 1x1 Convolution (increase channels back to original)
x = Conv2D(filters, 1, activation=None, padding='same',kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(x)
# Shortcut connection (with 1x1 Convolution if needed to match dimensions)
if input_data.shape[-1] != filters: # Check if channel dimensions match
shortcut = Conv2D(filters, 1, activation=None, padding='same',kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(shortcut)
# Add the shortcut connection
x = Add()([x, shortcut])
return x
num_resnet_blocks = 6
num_filters = 16
kernel_size = (3, 3)
rf_in = Input(shape=(32, 32, 2), name = 'rf_input')
x = Conv2D(num_filters, (kernel_size), activation=None, padding='same', kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(rf_in)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
for i in range(num_resnet_blocks):
x = resnet_block(x, num_filters, (kernel_size))
x = Conv2D(num_filters, (kernel_size), activation=None, padding = 'same')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = GlobalAveragePooling2D()(x)
x = Flatten()(x)
dense_1 = Dense(128, kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(x)
dense_act_1 = LeakyReLU()(dense_1)
dropout_1 = Dropout(0.7)(dense_act_1)
dense_2 = Dense(128, kernel_initializer='lecun_uniform', kernel_regularizer=tf.keras.regularizers.L1(0.0001))(dropout_1)
dense_act_2 = LeakyReLU()(dense_2)
dropout_2 = Dropout(0.7)(dense_act_2)
softmax = Dense(7, activation='softmax')(dropout_2)
model = tf.keras.Model(rf_in, softmax)
model.summary()
# hls4ml configuration
hls_config = hls4ml.utils.config_from_keras_model(model, granularity='name', default_reuse_factor=32, backend='Vivado')
hls_config['Model']['Strategy'] = 'Resource'
# Softmax table types and strategy for the output layer
hls_config['LayerName']['dense_2']['exp_table_t'] = 'ap_fixed<16,6>'
hls_config['LayerName']['dense_2']['inv_table_t'] = 'ap_fixed<16,6>'
hls_config['LayerName']['dense_2']['Strategy'] = 'Stable'
hls_model = hls4ml.converters.convert_from_keras_model(
model,
hls_config=hls_config,
output_dir='Res32_v2',
io_type='io_stream',
backend='Vivado',
part='xczu7ev-ffvc1156-2-e')
hls_model.compile()
Hi @AnouarITI, PR #1085 by @jmitrevs should fix this issue. Can you check out the branch and confirm it works?
I have a model that includes a leaky_relu layer. The leaky_relu function requires three input parameters (input, alpha, output); however, during instantiation only a function call with two parameters (input, output) is generated.
This is the Leaky Relu function in activation.h:
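Roughly, it has the following three-argument form (sketched here for reference; the exact type of the alpha argument may differ between hls4ml versions):
// Sketch of the leaky_relu template from the Vivado backend headers.
// The key point is the three arguments: input data, alpha, and output.
template<class data_T, class res_T, typename CONFIG_T>
void leaky_relu(data_T data[CONFIG_T::n_in], data_T alpha, res_T res[CONFIG_T::n_in]) {
    #pragma HLS PIPELINE
    for (int ii = 0; ii < CONFIG_T::n_in; ii++) {
        data_T datareg = data[ii];
        if (datareg > 0)
            res[ii] = datareg;
        else
            res[ii] = alpha * datareg;
    }
}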
In the myproject() function, leaky_relu is instantiated as follows:
nnet::leaky_relu<layer12_t, layer15_t, leaky_relu_config15>(layer12_out, layer15_out)
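A call matching the three-argument signature would presumably need to pass the layer's alpha as well, along these lines (the alpha variable name here is only illustrative):
// Hypothetical corrected instantiation: the generated code would also need to
// supply the layer's alpha value; "layer15_alpha" is an illustrative name only.
nnet::leaky_relu<layer12_t, layer15_t, leaky_relu_config15>(layer12_out, layer15_alpha, layer15_out);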
Could this be fixed please?