MohamedRashad committed
Commit fd7e68f · 1 Parent(s): 9459cdf

Update requirements to remove specific versions for torch, torchvision, and transformers

Files changed (1):
  1. requirements.txt +3 -3
requirements.txt CHANGED
@@ -3,8 +3,8 @@
 accelerate
 sentencepiece
 diffusers
-torch==2.4.0
-torchvision==0.19.0
+torch
+torchvision
 pillow==10.4.0
 imageio==2.36.1
 imageio-ffmpeg==0.5.1
@@ -22,7 +22,7 @@ igraph==0.11.8
 git+https://github.com/EasternJournalist/utils3d.git@9a4eb15e4021b67b12c460c7057d642626897ec8
 xformers==0.0.27.post2
 spconv-cu120==2.3.6
-transformers==4.46.3
+transformers
 https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.0.post2/flash_attn-2.7.0.post2+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 https://huggingface.co/spaces/JeffreyXiang/TRELLIS/resolve/main/wheels/diff_gaussian_rasterization-0.0.0-cp310-cp310-linux_x86_64.whl?download=true
 https://huggingface.co/spaces/JeffreyXiang/TRELLIS/resolve/main/wheels/nvdiffrast-0.3.3-cp310-cp310-linux_x86_64.whl?download=true
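The net effect of this diff is that `torch`, `torchvision`, and `transformers` lose their exact `==` pins, so pip is free to resolve the latest compatible releases, while entries like `pillow==10.4.0` stay pinned. As a minimal sketch of that distinction (the `is_pinned` helper is hypothetical, not part of this repo):

```python
def is_pinned(requirement_line: str) -> bool:
    """Return True if a requirements.txt line pins an exact version with '=='."""
    line = requirement_line.strip()
    # Comments, blank lines, and direct URL/VCS references are not '==' pins
    if not line or line.startswith(("#", "http", "git+")):
        return False
    return "==" in line

# Before this commit:
print(is_pinned("torch==2.4.0"))    # True  (exact pin)
# After this commit:
print(is_pinned("torch"))           # False (pip picks the latest release)
print(is_pinned("pillow==10.4.0"))  # True  (still pinned)
```

Note that the flash-attention wheel listed below the change is still built against `torch2.4` (see `cu12torch2.4` in its filename), so the unpinned `torch` line relies on pip resolving a compatible version at install time.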