# CodeFormer
This version of CodeFormer has been converted to run on the Axera NPU using w8a16 quantization.
Compatible with Pulsar2 version: 5.0-patch1
Convert tools links:
If you are interested in model conversion, you can try exporting the axmodel yourself through the AXera Platform repo, which provides a detailed guide.
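As a rough illustration only, a Pulsar2 build for an AX650 target generally takes an ONNX model plus a quantization config. The file names, flags, and config fields below are assumptions based on the general Pulsar2 workflow, not the actual conversion recipe used for this model; consult the guide in the AXera Platform repo for the real options.

```shell
# Hypothetical sketch of a Pulsar2 conversion run (names and flags are assumptions).
# config.json would describe the input shape, calibration data, and w8a16 quantization.
pulsar2 build \
  --input codeformer.onnx \
  --config config.json \
  --target_hardware AX650 \
  --output_dir output \
  --output_name coderformer.axmodel
```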
## Support Platform
| Chip | Model | Latency |
|---|---|---|
| AX650 | coderformer | 408 ms |
## How to use
Download all files from this repository to the device
```
root@ax650:~/coderformer# tree
.
|-- model
|   `-- coderformer.axmodel
`-- python
    |-- run_axmodel.py
    `-- requirements.txt
```
### Inference
Input data:
```
|-- images
|   |-- 00_00.png
|   |-- 00_01.png
|   `-- 00_02.png
```
Inference on an AX650 host, such as the M4N-Dock (AXera-Pi Pro):
```
root@ax650:~/coderformer# python3 run_axmodel.py --inputs_path ./images --model_path ./coderformer.axmodel
[INFO] Available providers: ['AxEngineExecutionProvider']
[INFO] Using provider: AxEngineExecutionProvider
[INFO] Chip type: ChipType.MC50
[INFO] VNPU type: VNPUType.DISABLED
[INFO] Engine version: 2.12.0s
[INFO] Model type: 2 (triple core)
[INFO] Compiler version: 5.0-patch1 681a0b38
SR image save to `00_00.png`
SR image save to `01_00.png`
SR image save to `02_00.png`
```
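The actual inference logic lives in `run_axmodel.py` from this repo. As an illustrative sketch only, the pre/post-processing around an on-device `axengine` session often looks like the code below. The `axengine.InferenceSession` API (assumed here to mirror onnxruntime), the 512×512 input size, and the [-1, 1] normalization are all assumptions, not details taken from the actual script.

```python
import numpy as np

def preprocess(img_bgr: np.ndarray) -> np.ndarray:
    """BGR uint8 HWC image -> float32 NCHW tensor in [-1, 1] (assumed convention)."""
    img = img_bgr[:, :, ::-1].astype(np.float32) / 255.0   # BGR -> RGB, scale to [0, 1]
    img = (img - 0.5) / 0.5                                # [0, 1] -> [-1, 1]
    return img.transpose(2, 0, 1)[None, ...]               # HWC -> NCHW, add batch dim

def postprocess(out: np.ndarray) -> np.ndarray:
    """float32 NCHW tensor in [-1, 1] -> BGR uint8 HWC image."""
    img = out[0].transpose(1, 2, 0)                        # NCHW -> HWC
    img = np.clip((img + 1.0) * 127.5, 0, 255).astype(np.uint8)
    return img[:, :, ::-1]                                 # RGB -> BGR

if __name__ == "__main__":
    import axengine  # only available on the Axera device
    sess = axengine.InferenceSession("./coderformer.axmodel")
    inp = preprocess(np.zeros((512, 512, 3), dtype=np.uint8))  # dummy 512x512 frame
    out = sess.run(None, {sess.get_inputs()[0].name: inp})[0]
    print(postprocess(out).shape)
```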
