Mistral AI announces 7B v0.2 base model release
35 points by nefitty 1 year ago | 2 comments

- jerpint 1 year ago
Any news on how this model will compare to Mixtral? Interesting that they aren’t releasing an MoE model this time, given the success Mixtral had.
- Reubend 1 year ago
Not yet, but I'm sure they will release some benchmarks soon. As for it not being an MoE model, there's still a ton of value in having a small non-MoE model for many use cases, and improvements discovered while training the small model can potentially improve the next version of the MoE model down the line.