Runtime exception being thrown when function returns too much data #120

@kaikreuzer

Description

In my setup, the function return value is longer than what Azure OpenAI accepts as a message and I am therefore seeing this exception:

2025-02-05T17:15:43.150+01:00 ERROR 8036 --- [atcher-worker-3] org.eclipse.lmos.arc.agents.ChatAgent    : Agent test-agent failed!

org.eclipse.lmos.arc.agents.ArcException: Status code 400, "{
  "error": {
    "message": "Invalid 'messages[3].content': string too long. Expected a string with maximum length 1048576, but got a string with length 1852321 instead.",
    "type": "invalid_request_error",
    "param": "messages[3].content",
    "code": "string_above_max_length"
  }
}"

While this is all fair and understandable, I see a few issues with it:

  1. It isn't very user-friendly to have to figure things out this way.
  2. It does not abstract over the LLM in use - replacing the LLM can result in completely different behavior.
  3. It isn't robust: it lets the whole conversation break instead of handling the situation gracefully.

What I would expect:

  1. The framework should make sure not to send invalid requests to LLMs, but rather deal with the situation upfront.
  2. It could possibly truncate the message and issue a warning, or make the behavior configurable (e.g. a) reject such calls, b) truncate messages, or c) use an LLM to shorten the call content).
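To illustrate option a) and b), here is a minimal sketch of what a configurable limit check could look like. All names here (`OverflowStrategy`, `enforceContentLimit`, `MessageTooLongException`) are hypothetical and do not exist in the Arc framework; the 1,048,576-character limit is taken from the Azure OpenAI error above.

```kotlin
// Hypothetical sketch - these names are not part of the Arc framework.
enum class OverflowStrategy { REJECT, TRUNCATE }

class MessageTooLongException(message: String) : RuntimeException(message)

fun enforceContentLimit(
    content: String,
    maxLength: Int = 1_048_576, // limit reported by Azure OpenAI in the error above
    strategy: OverflowStrategy = OverflowStrategy.TRUNCATE,
): String {
    if (content.length <= maxLength) return content
    return when (strategy) {
        // a) fail fast with a clear, provider-independent error
        OverflowStrategy.REJECT -> throw MessageTooLongException(
            "Message content length ${content.length} exceeds limit $maxLength"
        )
        // b) cut the content and mark the cut so the model (and logs) can see it
        OverflowStrategy.TRUNCATE -> {
            val marker = "\n…[truncated]"
            content.take(maxLength - marker.length) + marker
        }
    }
}
```

A check like this would have to run before the request is serialized for the provider, with the limit supplied per model/provider configuration rather than hard-coded. Option c) (LLM-based summarization) is omitted here since it needs an extra model call.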
