
New smolVLA pipeline: Could not find 'policy_preprocessor.json'; Error while deserializing header: header too large

#12
by juliobellano - opened

Hi, I was trying to finetune SmolVLA using the tutorial (https://huggingface.co/blog/smolvla) and I encountered these issues:

  1. FileNotFoundError: Could not find 'policy_preprocessor.json' on the HuggingFace Hub at 'lerobot/smolvla_base'
  2. safetensors_rust.SafetensorError: Error while deserializing header: header too large

To recreate this error:

python src/lerobot/scripts/train.py \
    --policy.path=smolvla_base_draft \
    --policy.push_to_hub=false \
    --dataset.repo_id=lerobot/svla_so101_pickplace \
    --batch_size=32 \
    --steps=20000 \
    --output_dir=outputs/train/my_smolvla02 \
    --job_name=my_smolvla_training \
    --policy.device=mps \
    --wandb.enable=True

The first problem can be fixed by downloading the updated files from https://huggingface.co/lerobot/smolvla_base/discussions/11/files and pointing --policy.path at the local directory containing them.
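As a rough illustration of that fix, you could check a local --policy.path directory for the new pipeline files before launching training. This is only a sketch: policy_preprocessor.json comes from the traceback above, while policy_postprocessor.json is an assumed companion file; see the linked PR for the authoritative file list.

```python
from pathlib import Path

# Files introduced by the new pipeline. 'policy_preprocessor.json' is the one
# named in the traceback; 'policy_postprocessor.json' is assumed here.
EXPECTED_PIPELINE_FILES = ["policy_preprocessor.json", "policy_postprocessor.json"]

def missing_pipeline_files(file_names):
    """Return the expected pipeline files absent from a repo/directory listing."""
    present = set(file_names)
    return [name for name in EXPECTED_PIPELINE_FILES if name not in present]

def check_local_model(model_dir):
    """Check a local --policy.path directory for the new pipeline files."""
    path = Path(model_dir)
    names = [p.name for p in path.iterdir()] if path.is_dir() else []
    return missing_pipeline_files(names)

if __name__ == "__main__":
    # Simulated listing of the base repo before the PR was merged.
    old_repo = ["config.json", "model.safetensors", "train_config.json"]
    print(missing_pipeline_files(old_repo))
```

If the returned list is non-empty, training will fail with the FileNotFoundError above, so fetch the missing files from the PR first.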


You can solve the error by running:

cd lerobot_new && python src/lerobot/processor/migrate_policy_normalization.py \
    --pretrained-path hf_username/modelname \
    --push-to-hub
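For anyone diagnosing the second error: "header too large" usually means the first 8 bytes of the .safetensors file do not decode to a plausible header length, for example because the file is actually a git-LFS pointer stub rather than the real weights. A small self-contained sketch of that check, based on the documented safetensors layout (a little-endian u64 header length, then that many bytes of JSON, then tensor data):

```python
import json
import struct

# Generous sanity bound on the JSON header size (100 MB).
MAX_REASONABLE_HEADER = 100 * 1024 * 1024

def check_safetensors_header(data: bytes):
    """Return (ok, message) for the leading safetensors header in `data`."""
    if len(data) < 8:
        return False, "file too small to contain a safetensors header"
    (header_len,) = struct.unpack("<Q", data[:8])
    if header_len > MAX_REASONABLE_HEADER or header_len > len(data) - 8:
        # This is what surfaces as "Error while deserializing header: header
        # too large" -- e.g. the file is really a git-LFS pointer text stub,
        # so its leading bytes decode to a nonsense length.
        return False, f"implausible header length: {header_len}"
    try:
        header = json.loads(data[8 : 8 + header_len])
    except ValueError:
        return False, "header bytes are not valid JSON"
    return True, f"ok: {len(header)} header entries"

if __name__ == "__main__":
    # A minimal valid file: length prefix + JSON header, no tensor data.
    header = json.dumps({"__metadata__": {"format": "pt"}}).encode()
    good = struct.pack("<Q", len(header)) + header
    bad = b"version https://git-lfs.github.com/spec/v1\n"  # LFS pointer text
    print(check_safetensors_header(good))
    print(check_safetensors_header(bad))
```

If the check fails on your checkpoint, re-download the weights (e.g. make sure git-lfs pulled the real file) before re-running the migration script.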
