User Guide
Last updated: March 24, 2026
Overview
ray-ascend is a community-maintained hardware plugin that enables advanced Ray features on Huawei Ascend NPU accelerators.
This guide provides step-by-step instructions for installing, configuring, and using ray-ascend's key features:
- HCCL Collective Communication: Distributed collective operations across Ray actors using the Huawei Collective Communication Library (HCCL)
- YuanRong Direct Transport: Efficient zero-copy transfer of CPU and NPU tensors between Ray actors
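To make the first feature concrete, the sketch below shows the *semantics* of an all-reduce (sum) collective in plain Python, with no Ray or NPU required: every rank contributes a vector and every rank receives the element-wise sum. With ray-ascend, the same operation would run over HCCL across Ray actors; the function name here is illustrative only.

```python
# Reference semantics of an all-reduce (sum) collective: every rank
# contributes one vector, and every rank receives the element-wise sum.
# This is a pure-Python illustration, not the ray-ascend API.
def allreduce_sum(rank_vectors):
    """Element-wise sum over per-rank vectors; each rank gets the same result."""
    total = [sum(vals) for vals in zip(*rank_vectors)]
    return [list(total) for _ in rank_vectors]

# Three "ranks", each holding a 2-element vector:
out = allreduce_sum([[1, 2], [3, 4], [5, 6]])
# every rank ends up with [9, 12]
```

In a real deployment the per-rank vectors would be NPU tensors held by Ray actors, and HCCL would perform the reduction on-device instead of this host-side loop.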
Prerequisites
- Architecture: aarch64, x86
- OS Kernel: Linux
- Python: >= 3.10, <= 3.11
- Ray: Same version as ray-ascend
Optional dependencies for specific features:
- CANN == 8.2.rc1: Required for NPU features (HCCL, NPU tensor transport)
- torch == 2.7.1, torch-npu == 2.7.1.post1: Required for PyTorch NPU support
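The Python constraint above (>= 3.10 and <= 3.11) can be checked before installing anything; a minimal standard-library sketch (the helper name `python_supported` is illustrative, not part of ray-ascend):

```python
import sys

# Python >= 3.10 and <= 3.11 means: major version 3, minor version 10 or 11.
def python_supported(info=sys.version_info) -> bool:
    """Return True if the interpreter falls in the supported Python range."""
    major, minor = info[0], info[1]
    return major == 3 and 10 <= minor <= 11

if not python_supported():
    print(f"Unsupported Python {sys.version_info[0]}.{sys.version_info[1]}; "
          "this guide requires 3.10 or 3.11")
```

Running this under Python 3.12, for example, would print the warning, since only the 3.10 and 3.11 minor versions fall in range.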
Quick Start
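Since the prerequisites require Ray at the same version as ray-ascend, a quick sanity check of the installed versions is a useful first step. The sketch below uses only the standard library; it assumes the plugin is distributed under the name `ray-ascend` (as this guide calls it), which is an assumption about the published package name.

```python
from importlib import metadata

def installed_version(dist: str):
    """Installed version of a distribution, or None if it is not installed."""
    try:
        return metadata.version(dist)
    except metadata.PackageNotFoundError:
        return None

# The prerequisites require Ray and ray-ascend at the same version.
# "ray-ascend" as the distribution name is an assumption.
ray_ver = installed_version("ray")
plugin_ver = installed_version("ray-ascend")
if ray_ver and plugin_ver and ray_ver != plugin_ver:
    print(f"Version mismatch: ray {ray_ver} != ray-ascend {plugin_ver}")
```

If either package is missing, `installed_version` returns `None` and the check is skipped rather than raising, which keeps this safe to run in any environment.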
Contents
- Installation: Detailed installation and setup instructions
- HCCL Collective Communication: Collective operations guide
- YuanRong Direct Transport: Tensor transport guide
- API Reference: Complete API documentation
- Best Practices: Best practices, troubleshooting, and FAQ