
[QA] Does MoE in the InternEvo framework support expert parallelism? #253

Open
Cerberous opened this issue Jun 18, 2024 · 1 comment
Assignees
Labels
question Further information is requested

Comments

@Cerberous
Describe the question.

Does MoE in the InternEvo framework support expert parallelism? If so, how do I enable it? Training MoE without it gives very low TFLOPS.

@Cerberous Cerberous added the question Further information is requested label Jun 18, 2024
@gaoyang07 gaoyang07 assigned sunpengsdu and unassigned yhcc Jun 18, 2024
@blankde
Collaborator

blankde commented Jun 28, 2024

@Cerberous The expert parallel strategy is used by default, i.e., the experts are distributed across multiple devices for computation. For now, users cannot set the expert parallel size themselves; #240 adds support for setting ep_size in the configuration file.
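For orientation, InternEvo training configs are Python files with a `parallel` dict. A minimal sketch of where an `ep_size`-style setting might sit once #240 is available is shown below; the `expert` key and its shape here are assumptions based on this comment, not the confirmed schema, so check the merged #240 for the actual field names:

```python
# Hypothetical InternEvo-style parallelism config sketch.
# The `expert` entry is an ASSUMED key name inferred from the discussion
# of #240 above; the other entries mirror common InternEvo config fields.
parallel = dict(
    zero1=dict(size=8),    # ZeRO-1 data-parallel partition group size
    tensor=dict(size=1),   # tensor parallel size
    pipeline=dict(size=1), # pipeline parallel size
    expert=dict(size=4),   # assumed: shard MoE experts across 4 devices
)
```

With a setting like this, each device would hold 1/4 of the experts and tokens would be routed across devices during the MoE layer, which is what recovers the TFLOPS lost when every device replicates all experts.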


4 participants