UniVLA
The model was presented in the paper UniVLA: Learning to Act Anywhere with Task-centric Latent Actions.
Code can be found at https://github.com/OpenDriveLab/UniVLA.
🚀 Run the following script to start an evaluation on the SimplerEnv-Bridge task "Put Spoon on Table Cloth". Please visit our official repo for detailed instructions.
```shell
ckpt_path="/path/to/your/univla-7b-224-sft-simpler-bridge"
action_decoder_path="/path/to/your/univla-7b-224-sft-simpler-bridge/action_decoder.pt"

python experiments/robot/r2r/real2sim_eval_maniskill3.py \
    --model="univla" -e "PutSpoonOnTableClothInScene-v1" -s 0 --num-episodes 24 --num-envs 1 \
    --action_decoder_path ${action_decoder_path} \
    --ckpt_path ${ckpt_path}
```
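The placeholder paths must be edited to point at a downloaded checkpoint. A minimal bash wrapper (our own sketch, not part of the official repo) can validate both paths and fail fast before the long evaluation run starts:

```shell
#!/usr/bin/env bash
# Hedged sketch: the guard logic below is ours, not from the official UniVLA repo.
set -uo pipefail

# Checkpoint directory may be passed as the first argument; the default is a placeholder.
ckpt_path="${1:-/path/to/your/univla-7b-224-sft-simpler-bridge}"
action_decoder_path="${ckpt_path}/action_decoder.pt"

# Verify the checkpoint directory and decoder weights exist before launching.
check_paths() {
  [ -d "$ckpt_path" ] || { echo "missing checkpoint dir: $ckpt_path" >&2; return 1; }
  [ -f "$action_decoder_path" ] || { echo "missing action decoder: $action_decoder_path" >&2; return 1; }
}

if check_paths; then
  python experiments/robot/r2r/real2sim_eval_maniskill3.py \
    --model="univla" -e "PutSpoonOnTableClothInScene-v1" -s 0 --num-episodes 24 --num-envs 1 \
    --action_decoder_path "${action_decoder_path}" \
    --ckpt_path "${ckpt_path}"
else
  echo "Edit the checkpoint paths before running the evaluation." >&2
fi
```

Quoting the variables when they are expanded also keeps the command intact if the checkpoint path contains spaces.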
If you find our models useful in your work, please cite our paper:
```bibtex
@article{bu2025univla,
  title={UniVLA: Learning to Act Anywhere with Task-centric Latent Actions},
  author={Qingwen Bu and Yanting Yang and Jisong Cai and Shenyuan Gao and Guanghui Ren and Maoqing Yao and Ping Luo and Hongyang Li},
  journal={arXiv preprint arXiv:2505.06111},
  year={2025}
}
```