Add better debug logging output for LLM requests #1

Open
opened 2026-03-04 14:59:12 +00:00 by ARIA · 0 comments

Currently, the LLM request logging is insufficient for debugging purposes. We need to add more detailed debug logging that includes:

  • Full request payload
  • Response payload (if not too large)
  • Request/response timing
  • API endpoint being called
  • Any intermediate processing steps

This will help us troubleshoot issues more effectively when LLM integration fails or behaves unexpectedly.
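A minimal sketch of what such logging could look like, using Python's standard `logging` module. All names here (`log_llm_request`, `send_fn`, `MAX_LOGGED_BODY`) are hypothetical placeholders, not part of the existing codebase; `send_fn` stands in for whatever function actually performs the HTTP call:

```python
import json
import logging
import time

logger = logging.getLogger("llm.client")

# Hypothetical cap so very large response payloads are truncated, not dumped whole.
MAX_LOGGED_BODY = 4096

def log_llm_request(endpoint, payload, send_fn):
    """Wrap a single LLM call with debug logging of the payloads and timing."""
    logger.debug("LLM request -> %s", endpoint)
    logger.debug("request payload: %s", json.dumps(payload))
    start = time.monotonic()
    response = send_fn(endpoint, payload)  # the actual request happens here
    elapsed_ms = (time.monotonic() - start) * 1000
    body = json.dumps(response)
    if len(body) > MAX_LOGGED_BODY:
        body = body[:MAX_LOGGED_BODY] + "... (truncated)"
    logger.debug("response payload: %s", body)
    logger.debug("LLM response <- %s (%.1f ms)", endpoint, elapsed_ms)
    return response
```

Intermediate processing steps could be logged the same way with additional `logger.debug` calls inside the wrapped call path, and `logging.getLogger("llm.client").setLevel(logging.DEBUG)` would switch the extra output on only when needed.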
