Is the class SwitchMixtureOfExperts unused in the main model?
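One way to check is to scan the package source for references to the class outside its own definition. A minimal sketch, assuming the code lives in a local moe_mamba/ directory (the directory name is an assumption about the repo layout):

```python
from pathlib import Path

# Print every line in the package that mentions the class.
# "moe_mamba" is an assumed directory name; adjust to the actual layout.
target = "SwitchMixtureOfExperts"
for path in Path("moe_mamba").rglob("*.py"):
    for lineno, line in enumerate(path.read_text().splitlines(), start=1):
        if target in line:
            print(f"{path}:{lineno}: {line.strip()}")
```

If the only hits are the class definition itself and a re-export in __init__.py, then the class is dead code as far as the main model is concerned.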

kyegomez / MoE-Mamba
Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts", in PyTorch and Zeta.
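For context, "Switch"-style MoE layers (after the Switch Transformer) route each token to a single top-1 expert. Below is a minimal sketch of what such a layer typically computes; the class name, shapes, and expert structure are illustrative assumptions, not the repo's actual implementation:

```python
import torch
import torch.nn as nn

class SwitchMoE(nn.Module):
    """Top-1 routed mixture of experts, in the style of the Switch Transformer."""

    def __init__(self, dim: int, num_experts: int = 8, hidden_dim: int = 2048):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)          # router
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(dim, hidden_dim),
                nn.GELU(),
                nn.Linear(hidden_dim, dim),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, s, d = x.shape
        tokens = x.reshape(-1, d)                        # (b*s, d)
        probs = self.gate(tokens).softmax(dim=-1)        # (b*s, num_experts)
        top_prob, top_idx = probs.max(dim=-1)            # top-1 routing
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i                          # tokens routed to expert i
            if mask.any():
                # scale each expert output by its gate probability
                out[mask] = expert(tokens[mask]) * top_prob[mask].unsqueeze(-1)
        return out.reshape(b, s, d)

moe = SwitchMoE(dim=512)
y = moe(torch.randn(2, 16, 512))                         # -> (2, 16, 512)
```

If a class like this is defined but never instantiated by the main model, the model is presumably using a different MoE block (or none), which is exactly what the source scan above would reveal.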
