fix quantization of EmbeddingLayerNorm (#5321)
yufenglee authored Sep 29, 2020
1 parent c00e13a commit 5de47af
Showing 1 changed file with 2 additions and 3 deletions.
```diff
@@ -17,6 +17,5 @@ def quantize(self):
         (quantized_input_names, zero_point_names, scale_names, nodes) = \
             self.quantizer.quantize_inputs(node, [2, 3, 4])
 
-        nodes.append(node)
-
-        self.quantizer.new_nodes += nodes
+        super().quantize()
+        self.quantizer.new_nodes += nodes
```
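The shape of the fix can be illustrated with a minimal sketch. The classes below are simplified stand-ins, not the real onnxruntime quantizer types: before the change, the operator handler appended the original node to the local `nodes` list itself; after it, the handler emits only the helper nodes and defers to the base class's `quantize()` for the operator itself.

```python
class FakeQuantizer:
    """Stand-in for the quantizer: collects nodes emitted during quantization."""
    def __init__(self):
        self.new_nodes = []

    def quantize_inputs(self, node, indices):
        # Pretend each requested input index produces one helper node.
        helpers = [f"DequantizeLinear_{i}" for i in indices]
        quantized_names = [f"{node}_in{i}_q" for i in indices]
        return quantized_names, [], [], helpers


class OpBase:
    """Stand-in for the base operator handler."""
    def __init__(self, quantizer, node):
        self.quantizer = quantizer
        self.node = node

    def quantize(self):
        # Base behaviour: keep the operator node itself.
        self.quantizer.new_nodes.append(self.node)


class EmbedLayerNormOp(OpBase):
    def quantize(self):
        node = self.node
        (_, _, _, nodes) = self.quantizer.quantize_inputs(node, [2, 3, 4])
        # The buggy version did `nodes.append(node)` here, duplicating work
        # the base class is responsible for. The fixed version defers to the
        # base implementation and then records only the helper nodes:
        super().quantize()
        self.quantizer.new_nodes += nodes


q = FakeQuantizer()
EmbedLayerNormOp(q, "EmbedLayerNormalization_0").quantize()
print(q.new_nodes)
# → ['EmbedLayerNormalization_0', 'DequantizeLinear_2', 'DequantizeLinear_3', 'DequantizeLinear_4']
```

The design point is separation of concerns: the per-operator subclass adds only what is specific to its inputs, while node bookkeeping common to all operators lives in one place in the base class.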
