What happened?
When using the switch-endpoints-mid-conversation feature after generating a response with a model, the parameters do not dynamically change to the new endpoint's default values.
For example, max output tokens remains 8192 when changing from gemini-1.5-pro to claude-3-opus (as do other parameters such as top P, top K, etc.), and this causes an error.
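A likely cause, sketched below in TypeScript: the client carries the previous endpoint's parameter values forward instead of resetting them to the new endpoint's defaults. This is only an illustration, not LibreChat's actual code; the Google defaults come from this report, while the Anthropic defaults are assumptions.

```typescript
// Hypothetical per-endpoint defaults. The google values come from this
// report; the anthropic values are assumed for illustration.
type Params = {
  temperature: number;
  topP: number;
  topK: number;
  maxOutputTokens: number;
};

const ENDPOINT_DEFAULTS: Record<string, Params> = {
  google: { temperature: 0.2, topP: 0.8, topK: 40, maxOutputTokens: 8192 },
  anthropic: { temperature: 1, topP: 0.7, topK: 5, maxOutputTokens: 4096 },
};

// Expected behavior on a mid-conversation switch: start from the new
// endpoint's defaults rather than carrying the old endpoint's values over.
function paramsForEndpointSwitch(newEndpoint: string): Params {
  return { ...ENDPOINT_DEFAULTS[newEndpoint] };
}
```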
Steps to Reproduce
1. Use gemini-1.5-flash-latest with its default parameter values (temperature 0.2, top P 0.8, top K 40, max output tokens 8192) to generate a response.
2. Change the conversation model to Anthropic claude-3-opus without setting any parameter values.
3. The model returns an error message saying the value is unsupported.
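For reference, step 3 fails because the carried-over value exceeds the Claude model's output limit (Claude 3 Opus accepts at most 4096 output tokens). Here is a minimal sketch of the failing request using the official @anthropic-ai/sdk, with the carried-over values from step 1:

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

// Gemini's carried-over max output tokens (8192) exceeds claude-3-opus's
// 4096-token output cap, so the API rejects this request with a 400 error.
const response = await client.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 8192, // carried over from the Gemini defaults
  temperature: 0.2,
  top_p: 0.8,
  top_k: 40,
  messages: [{ role: 'user', content: 'Hello' }],
});
```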
What browsers are you seeing the problem on?
Chrome
Relevant log output
No response
Screenshots
As you can see, these are the default values for Gemini, but they are wrongly applied to Claude.
Code of Conduct
I agree to follow this project's Code of Conduct
danny-avila changed the title from "[Bug]: parameters cannot automatically change when switching Endpoints mid-conversation" to "[Bug]: Google <> Anthropic Params Conflict with Mid-conversation Switch" on May 19, 2024
Thanks. When the params are "shared," the previous values are allowed to carry over mid-conversation, but in this case they are only shared in name, so this discrepancy needs to be handled for a more seamless experience.
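One way to handle that discrepancy, as a sketch rather than LibreChat's actual implementation: when a shared-in-name parameter carries over, clamp it to the new endpoint's valid range instead of sending it as-is. The anthropic cap below (4096 for claude-3-opus) matches the provider's documented output limit; the google cap is taken from this report.

```typescript
// Assumed output-token caps per endpoint; the parameter name is shared,
// but the valid ranges differ.
const MAX_OUTPUT_TOKENS: Record<string, number> = {
  google: 8192,
  anthropic: 4096,
};

// Keep a shared param across a mid-convo switch only up to the new
// endpoint's limit, so the request stays valid.
function reconcileMaxTokens(carriedOver: number, newEndpoint: string): number {
  return Math.min(carriedOver, MAX_OUTPUT_TOKENS[newEndpoint]);
}

// e.g. reconcileMaxTokens(8192, 'anthropic') => 4096
```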
danny-avila changed the title from "[Bug]: Google <> Anthropic Params Conflict with Mid-conversation Switch" to "[Bug]: Google <> Anthropic Params Conflict with Mid-convo Switch" on May 19, 2024