Why has the weight_norm been removed?

#4
by Xiaoyu94 - opened

Hi, I'm wondering why the weight_norm applied to each convolutional layer has been removed in the HuggingFace implementation (https://github.com/huggingface/transformers/blob/main/src/transformers/models/mimi/modeling_mimi.py#L213-L225),
while the official implementation keeps it (https://github.com/kyutai-labs/moshi/blob/main/moshi/moshi/modules/conv.py#L126). If I understand correctly, HF does include code for weight_norm (https://github.com/huggingface/transformers/blob/main/src/transformers/models/mimi/modeling_mimi.py#L170-L178), but it is never used during forward().
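For context, weight normalization reparameterizes a layer's weight as w = g · v / ‖v‖, separating its magnitude from its direction. Below is a minimal sketch (not the Mimi code itself) of what wrapping a conv layer with PyTorch's `torch.nn.utils.weight_norm` does; the layer sizes are arbitrary examples:

```python
import torch
from torch import nn
from torch.nn.utils import weight_norm

# Wrap a Conv1d so its weight is recomputed as g * v / ||v|| on every forward.
# (Channel/kernel sizes here are arbitrary, purely for illustration.)
conv = weight_norm(nn.Conv1d(in_channels=4, out_channels=8, kernel_size=3))

# The original `weight` parameter is replaced by `weight_g` (magnitude)
# and `weight_v` (direction); the forward pass behaves like a normal conv.
param_names = sorted(name for name, _ in conv.named_parameters())

x = torch.randn(1, 4, 16)
y = conv(x)
```

Since the reparameterization only changes how `weight` is computed from trainable parameters (not the forward math itself), an implementation that loads already-trained weights can fold it away, which may be related to why it is unused in the HF forward pass.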
