@mx8435 Our early experiments have already verified that MLA outperforms MHA on the 7B dense model. We align the overall parameter counts of the two models: MLA's smaller attention parameter count is compensated for by increasing the number of layers.
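To illustrate the alignment described above, here is a rough sketch of how attention parameter counts could be compared and the layer count scaled to match budgets. The dimensions and the simplified MLA decomposition below are illustrative assumptions, not the actual 7B configuration:

```python
# Sketch: compare per-layer attention parameter counts for MHA vs. a
# simplified MLA, then compensate MLA's smaller count with more layers.
# All dimensions are hypothetical, not DeepSeek's actual configs.

def mha_attn_params(d_model: int) -> int:
    # Q, K, V, and output projections, each d_model x d_model.
    return 4 * d_model * d_model

def mla_attn_params(d_model: int, d_c: int) -> int:
    # Simplified MLA: K and V are produced from a shared low-rank
    # latent of width d_c (one down-projection, two up-projections);
    # Q and the output projection stay full-rank here for simplicity.
    down = d_model * d_c             # shared KV down-projection
    up = 2 * d_c * d_model           # K and V up-projections
    q_and_o = 2 * d_model * d_model  # Q and output projections
    return down + up + q_and_o

d_model, d_c = 4096, 512
mha = mha_attn_params(d_model)
mla = mla_attn_params(d_model, d_c)
print(f"MHA per layer: {mha:,}  MLA per layer: {mla:,}")

# Align the attention budget by scaling the layer count: L layers of
# MHA correspond to roughly L * mha / mla layers of MLA.
L = 32
print(f"{L} MHA layers ~ {round(L * mha / mla)} MLA layers")
```

With `d_c` well below `d_model`, MLA spends noticeably fewer parameters per attention layer, so matching total size means adding layers, as the reply describes.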
Hi, great job. Did you run an ablation study comparing the performance of MLA and MHA in a dense model? Thanks.