Auth/confirmation resume ignores skip_summarization on re-executed tool response #4472

@halfpasttense

Description


Problem

When an invocation is resumed after an auth (or tool confirmation) pause, the _AuthLlmRequestProcessor (or _RequestConfirmationLlmRequestProcessor) correctly re-executes the original tool during _preprocess_async and yields the resulting function_response_event. However, _run_one_step_async does not check whether preprocessing already produced a final response, so it unconditionally proceeds to make an LLM call. This means actions.skip_summarization = True on the tool's response event is effectively ignored — the LLM is called to summarize (or re-process) a response that was meant to be final.
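For reference, a tool opts out of summarization by setting the flag on its tool context. A minimal stand-in sketch (the dataclasses below are hypothetical stubs, not the real ADK types; only the `actions.skip_summarization` attribute path matches the source):

```python
from dataclasses import dataclass, field

@dataclass
class EventActions:  # stub standing in for ADK's EventActions
    skip_summarization: bool = False

@dataclass
class ToolContext:  # stub standing in for ADK's ToolContext
    actions: EventActions = field(default_factory=EventActions)

def fetch_report(tool_context: ToolContext) -> dict:
    # The tool wants its raw response returned verbatim, with no LLM summary.
    tool_context.actions.skip_summarization = True
    return {"status": "ok", "report": "..."}

ctx = ToolContext()
result = fetch_report(ctx)
assert ctx.actions.skip_summarization is True
```

On a normal (non-resume) run this flag is honored; the bug below is that the resume path never reaches the code that checks it.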

Steps to reproduce

  1. Configure an agent with a tool that sets tool_context.actions.skip_summarization = True on its response
  2. The tool requires OAuth or other credential-based auth via ADK's auth flow
  3. On first invocation, the tool triggers an auth request → the invocation pauses with an adk_request_credential event
  4. The client responds with credentials → run_async is called with the original invocation_id to resume
  5. The auth preprocessor re-executes the tool successfully and yields the function response with skip_summarization = True
  6. Expected: The invocation ends with the tool's response as the final event, no LLM call
  7. Actual: An LLM call is made after the tool response, generating an unwanted summary/follow-up

Root cause

In base_llm_flow.py, _run_one_step_async has this structure:

```python
# 1. Preprocess (auth preprocessor yields function_response_event here)
async with Aclosing(self._preprocess_async(invocation_context, llm_request)) as agen:
    async for event in agen:
        yield event
if invocation_context.end_invocation:
    return

# 2. Resume checks (neither matches after auth flow)
events = invocation_context._get_events(current_invocation=True, current_branch=True)

if invocation_context.is_resumable and events and events[-1].get_function_calls():
    # Not taken — events[-1] is the function_response from the auth preprocessor,
    # which has function RESPONSES, not function CALLS
    ...

# 3. Falls through to LLM call
model_response_event = Event(...)
async with Aclosing(self._call_llm_async(...)):
    ...
```

After the auth preprocessor yields the tool's response during step 1, execution continues. The resumed-function-call detection at step 2 checks events[-1].get_function_calls(), but events[-1] is now the just-appended function response event (which contains function responses, not calls), so the condition is False. The code falls through to the LLM call at step 3.
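The failing condition can be reproduced in isolation with stub events (hypothetical minimal classes mirroring the `get_function_calls` / `get_function_responses` accessors, not the real ADK `Event`):

```python
from dataclasses import dataclass, field

@dataclass
class Part:  # stub: a part carries either a call or a response
    function_call: object = None
    function_response: object = None

@dataclass
class Event:  # stub mirroring the accessors used by the resume check
    parts: list = field(default_factory=list)

    def get_function_calls(self):
        return [p.function_call for p in self.parts if p.function_call]

    def get_function_responses(self):
        return [p.function_response for p in self.parts if p.function_response]

# Event appended by the auth preprocessor after re-executing the tool:
resume_event = Event(parts=[Part(function_response={"result": "ok"})])

events = [resume_event]
# The step-2 check in _run_one_step_async:
takes_resume_branch = bool(events and events[-1].get_function_calls())
assert takes_resume_branch is False  # so execution falls through to the LLM call
```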

The skip_summarization flag is only checked by is_final_response() in the outer run_async while loop, but by that point the LLM has already been called and its text response has become the last_event, overwriting the tool's response.

The same issue applies to _RequestConfirmationLlmRequestProcessor, which follows the same pattern: yield a function response during preprocessing → _run_one_step_async ignores it and calls the LLM.

Proposed fix

Track whether _preprocess_async yielded a final response event, and return early if so:

```python
async def _run_one_step_async(self, invocation_context):
    llm_request = LlmRequest()

    preprocess_yielded_final = False
    async with Aclosing(
        self._preprocess_async(invocation_context, llm_request)
    ) as agen:
        async for event in agen:
            yield event
            if event.is_final_response():
                preprocess_yielded_final = True
    if invocation_context.end_invocation or preprocess_yielded_final:
        return

    # ... rest of method unchanged
```

This is safe for normal (non-resume) flow because no standard preprocessor yields events with is_final_response() == True. The only preprocessors that yield events at all are _AuthLlmRequestProcessor and _RequestConfirmationLlmRequestProcessor, and only during their respective resume flows.
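The effect of the guard can be simulated end to end with stubs (the `Flow` and `Event` classes below are hypothetical stand-ins for the ADK flow machinery; the `llm_called` flag stands in for the step-3 LLM call):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:  # stub event; is_final models skip_summarization taking effect
    is_final: bool = False

    def is_final_response(self):
        return self.is_final

class Flow:  # stub flow: preprocess yields events, then (maybe) calls the LLM
    def __init__(self, preprocess_events):
        self.preprocess_events = preprocess_events
        self.llm_called = False

    async def _preprocess_async(self):
        for e in self.preprocess_events:
            yield e

    async def run_one_step(self, guard):
        preprocess_yielded_final = False
        async for event in self._preprocess_async():
            yield event
            if event.is_final_response():
                preprocess_yielded_final = True
        if guard and preprocess_yielded_final:
            return  # proposed early return
        self.llm_called = True  # stands in for the LLM call at step 3
        yield Event()

async def run(guard):
    # Auth resume: the preprocessor yields a final tool response.
    flow = Flow([Event(is_final=True)])
    async for _ in flow.run_one_step(guard=guard):
        pass
    return flow.llm_called

assert asyncio.run(run(guard=False)) is True   # current behavior: LLM still called
assert asyncio.run(run(guard=True)) is False   # with the guard: early return
```

This also shows why the non-resume path is unaffected: when no preprocessed event reports `is_final_response() == True`, the flag stays `False` and the method proceeds to the LLM call exactly as before.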

Affected file

src/google/adk/flows/llm_flows/base_llm_flow.py (_run_one_step_async method)


Labels

core: [Component] This issue is related to the core interface and implementation
