Does MoE in the InternEvo framework support expert parallelism? If so, how do I enable it? Training an MoE model directly without it seems to give very low TFLOPS.
@Cerberous Expert parallelism is used by default, i.e. the experts are distributed across multiple devices for computation. Letting users set the expert parallel size themselves is not supported yet; #240 adds the ability to set ep_size in the config file.
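For readers looking for what that might look like, here is a minimal sketch of an InternEvo-style Python config with an expert-parallel size knob. The `expert=dict(size=...)` field name is an assumption based on the description above; check #240 for the actual field introduced there.

```python
# Hypothetical InternEvo-style config sketch (the exact field name/location
# for the expert parallel size comes from PR #240 and may differ).
parallel = dict(
    zero1=dict(size=8),      # ZeRO-1 data-parallel group size
    tensor=dict(size=1),     # tensor parallel size
    pipeline=dict(size=1),   # pipeline parallel size
    # assumption: ep_size-style setting added by #240; before that,
    # expert placement is chosen automatically by the framework
    expert=dict(size=4),
)
```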