
MindSpore/vllm-mindspore

!499 · fix multi-api-server need to import vllm-ms again (mergeable)
Labels: mindspore-cla/no
Junhong · 3 · 2025-06-12 20:49

!498 · test hccl port (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-failed, good-refactor-case
moran · 10 · 2025-06-12 17:07

!497 · [Refactor] Adapt v0.8.3 for MindONE models (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-failed
alien_0119 · 9 · 2025-06-12 15:36

!496 · [Refactor] Adapt v0.8.3 for MultiModal (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-passed
alien_0119 · 8 · 2025-06-12 11:36

!492 · [feature] Add an adaptation layer for MF MCore model (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-passed, sig/llm-inference
Jingwei Huang · 8 · 2025-06-11 10:09

!490 · [feature] multilora support mindformers mcore model (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-failed, sig/llm-inference
nashturing · 8 · 2025-06-10 23:38

!489 · [feature] multilora support mindformers mcore model (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-failed, sig/llm-inference
nashturing · 12 · 2025-06-10 14:28

!488 · [CI] Optimize ST cases (has conflicts)
Labels: mindspore-cla/yes, ci-pipeline-failed, good-refactor-case, sig/llm-inference
moran · 28 · 2025-06-10 12:04

!487 · recover codecheck (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-passed, good-refactor-case, sig/llm-inference
moran · 16 · 2025-06-10 09:53

!486 · [feature] Large EP support enable_micro_batch config in vllm_config and forwar... (mergeable)
Labels: mindspore-cla/yes, ci-pipeline-failed, can-review, sig/llm-inference
fengtingyan · 41 · 2025-06-09 22:42

Clone (HTTPS): https://gitee.com/mindspore/vllm-mindspore.git
Clone (SSH): git@gitee.com:mindspore/vllm-mindspore.git