
Commit 750eee5

ch-wan authored and ssssnow committed

[hotfix] fix merge conflicts in FlashInferEPMoE (#8405)

1 parent d2a6c52 · commit 750eee5

File tree

  • python/sglang/srt/layers/moe/ep_moe

1 file changed: +1 −0 lines changed

python/sglang/srt/layers/moe/ep_moe/layer.py

Lines changed: 1 addition & 0 deletions
@@ -1236,6 +1236,7 @@ def __init__(self, *args, **kwargs):
         self.num_expert_group = num_expert_group
         self.topk_group = topk_group
         self.correction_bias = correction_bias
+        self.use_flashinfer_trtllm_moe = use_flashinfer_trtllm_moe
 
     def forward(self, hidden_states: torch.Tensor, router_logits: torch.Tensor):
         assert use_flashinfer_trtllm_moe
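For context, the one-line hotfix restores the common pattern of persisting a constructor flag on the instance so it survives past `__init__`. The sketch below is a simplified illustration of that pattern, not the real FlashInferEPMoE class: the constructor signature, class name, and forward body are assumptions, and in the actual layer.py the bare name checked by the assert may be resolved at module scope rather than on `self`.

```python
import torch
from torch import nn


class EPMoESketch(nn.Module):
    """Hypothetical stand-in for FlashInferEPMoE, for illustration only."""

    def __init__(self, use_flashinfer_trtllm_moe: bool = False):
        super().__init__()
        # The hotfix re-adds an assignment like this one: without it, the
        # flag passed to __init__ is dropped and unavailable after construction.
        self.use_flashinfer_trtllm_moe = use_flashinfer_trtllm_moe

    def forward(
        self, hidden_states: torch.Tensor, router_logits: torch.Tensor
    ) -> torch.Tensor:
        # Gate on the stored flag; this sketch only models the assertion,
        # not the actual MoE computation.
        assert self.use_flashinfer_trtllm_moe
        return hidden_states


# Usage: the flag survives construction and gates forward().
layer = EPMoESketch(use_flashinfer_trtllm_moe=True)
out = layer(torch.randn(2, 8), torch.randn(2, 4))
```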
