alpayariyak committed (verified) · Commit 4887205 · 1 Parent(s): e1712e1

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -168,7 +168,7 @@ We suggest using `vllm>=0.8.5` and enabling long context in VLLM to serve DeepSWE-Preview
  ```bash
  export MAX_CONTEXT_LEN=65536
  export TENSOR_PARALLEL_SIZE=8
- VLLM_ALLOW_LONG_MAX_MODEL_LEN=1 vllm serve agentica-org/DeepSWE-Preview --tensor-parallel-size $TENSOR_PARALLEL_SIZE --max-model-len $MAX_CONTEXT_LEN --hf-overrides '{"max_position_embeddings": $MAX_CONTEXT_LEN}' --enable_prefix_caching
+ VLLM_ALLOW_LONG_MAX_MODEL_LEN=1 vllm serve agentica-org/DeepSWE-Preview --tensor-parallel-size $TENSOR_PARALLEL_SIZE --max-model-len $MAX_CONTEXT_LEN --hf-overrides '{\"max_position_embeddings\": $MAX_CONTEXT_LEN}' --enable_prefix_caching
  ```

  ## License
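
For reference, once the serve command above is running, vLLM exposes an OpenAI-compatible HTTP API. Below is a minimal smoke-test sketch, assuming the server listens on the default port 8000 on the local host and no `--api-key` is configured (adjust both if your deployment differs):

```bash
# Hypothetical smoke test: send one short chat request to the served model.
# Assumes `vllm serve` is listening on the default port 8000 on this machine.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "agentica-org/DeepSWE-Preview",
        "messages": [{"role": "user", "content": "Say hello."}],
        "max_tokens": 32
      }'
```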