In my setup, the function return value is longer than what Azure OpenAI accepts as a message and I am therefore seeing this exception:
2025-02-05T17:15:43.150+01:00 ERROR 8036 --- [atcher-worker-3] org.eclipse.lmos.arc.agents.ChatAgent : Agent test-agent failed!
org.eclipse.lmos.arc.agents.ArcException: Status code 400, "{
"error": {
"message": "Invalid 'messages[3].content': string too long. Expected a string with maximum length 1048576, but got a string with length 1852321 instead.",
"type": "invalid_request_error",
"param": "messages[3].content",
"code": "string_above_max_length"
}
}"
While this is all fair and understandable, I see a few issues with it:
- It isn't very user friendly to have to figure things out this way.
- It does not abstract from the LLM in use - replacing the LLM may produce completely different behavior.
- It isn't robust: it lets the whole conversation break instead of handling the situation gracefully.
What I would expect:
- The framework should make sure not to send invalid requests to LLMs, but rather deal with the situation upfront.
- It could cut the message and issue a warning, or make the behavior configurable (e.g. a) reject such calls, b) cut off messages, or c) use an LLM to shorten the call content).
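As an illustration of options a) and b) above, a pre-flight guard could look roughly like the following sketch. This is not part of the framework; the class name `MessageLengthGuard`, the `Strategy` enum, and the character-based limit are all hypothetical (a real implementation would likely count tokens rather than characters, and the limit would come from the model configuration):

```java
/**
 * Hypothetical sketch: validate message content length before calling the LLM,
 * with a configurable strategy instead of letting the provider reject the request.
 */
public class MessageLengthGuard {

    public enum Strategy { REJECT, TRUNCATE }

    private final int maxLength;      // e.g. 1_048_576 for the Azure OpenAI limit in the error above
    private final Strategy strategy;

    public MessageLengthGuard(int maxLength, Strategy strategy) {
        this.maxLength = maxLength;
        this.strategy = strategy;
    }

    /** Returns content unchanged if it fits; otherwise rejects or truncates per strategy. */
    public String apply(String content) {
        if (content.length() <= maxLength) {
            return content;
        }
        switch (strategy) {
            case REJECT:
                // Fail fast with a framework-level error instead of a provider 400.
                throw new IllegalArgumentException(
                        "Message content length " + content.length()
                        + " exceeds the configured limit of " + maxLength);
            case TRUNCATE:
                // Cut the content and append a marker so the model sees it was elided.
                String marker = " [truncated]";
                return content.substring(0, maxLength - marker.length()) + marker;
            default:
                throw new IllegalStateException("Unknown strategy: " + strategy);
        }
    }
}
```

Option c) (summarizing oversized content with another LLM call) would plug into the same place, as a third strategy that replaces the truncation branch with a summarization call.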