Error report when running the example

#3
by cloudyu - opened

```
  File "test_llada21.py", line 7, in <module>
    model = AutoModelForCausalLM.from_pretrained(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 367, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.12/site-packages/transformers/modeling_utils.py", line 4021, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/.cache/huggingface/modules/transformers_modules/LLaDA2_dot_1_hyphen_mini/modeling_llada2_moe.py", line 962, in __init__
    self.model = LLaDA2MoeModel(config)
                 ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/.cache/huggingface/modules/transformers_modules/LLaDA2_dot_1_hyphen_mini/modeling_llada2_moe.py", line 783, in __init__
    self.rotary_emb = LLaDA2MoeRotaryEmbedding(config=config)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/.cache/huggingface/modules/transformers_modules/LLaDA2_dot_1_hyphen_mini/modeling_llada2_moe.py", line 108, in __init__
    self.rope_init_fn = ROPE_INIT_FUNCTIONS[self.rope_type]
                        ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^
KeyError: 'default'
```

inclusionAI org

We recommend using transformers==4.57.1 for inference.
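If you hit the KeyError above, pinning the library to the recommended version is the quickest workaround:

```shell
# Pin transformers to the version the maintainers recommend for inference
pip install "transformers==4.57.1"
```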

utdawn changed discussion status to closed

How can we run it with transformers 5.1.0?
It's not reasonable to support only transformers==4.57.1.

inclusionAI org

Thank you for bringing this to our attention.

The error occurs because the 'default' key was removed from the ROPE_INIT_FUNCTIONS mapping in transformers 5.1.0, so the lookup ROPE_INIT_FUNCTIONS[self.rope_type] raises a KeyError. We are aware of this breaking change and will update our modeling files shortly to ensure compatibility with the latest version.
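A minimal sketch of the failure mode and a defensive lookup the updated modeling file could use. The dictionaries and init functions below are simplified stand-ins (not the real transformers internals), assuming the only incompatibility is the missing "default" entry:

```python
# Simplified stand-ins for the RoPE init functions (hypothetical).
def compute_default_rope_parameters(config):
    return "default-params"

def compute_linear_rope_parameters(config):
    return "linear-params"

# transformers 4.57.1 (simplified): the "default" key is present.
ROPE_INIT_FUNCTIONS_4X = {
    "default": compute_default_rope_parameters,
    "linear": compute_linear_rope_parameters,
}

# transformers 5.1.0 (simplified): the "default" key was removed.
ROPE_INIT_FUNCTIONS_5X = {
    "linear": compute_linear_rope_parameters,
}

rope_type = "default"

# The modeling file's direct indexing fails on 5.x:
try:
    fn = ROPE_INIT_FUNCTIONS_5X[rope_type]
except KeyError as e:
    print("KeyError:", e)  # prints: KeyError: 'default'

# A defensive lookup with an explicit fallback keeps both versions working:
fn = ROPE_INIT_FUNCTIONS_5X.get(rope_type, compute_default_rope_parameters)
print(fn(None))  # prints: default-params
```

The real fix belongs in modeling_llada2_moe.py, where the lookup would fall back to the default RoPE initialization when the key is absent.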

utdawn changed discussion status to open
