autogen_core.logging
- class AgentConstructionExceptionEvent(*, agent_id: AgentId, exception: BaseException, **kwargs: Any)
Bases: object
- class DeliveryStage(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)
Bases: Enum
- DELIVER = 2
- SEND = 1
- class LLMCallEvent(*, messages: List[Dict[str, Any]], response: Dict[str, Any], prompt_tokens: int, completion_tokens: int, **kwargs: Any)
Bases: object
- class LLMStreamEndEvent(*, response: Dict[str, Any], prompt_tokens: int, completion_tokens: int, **kwargs: Any)
Bases: object
- class LLMStreamStartEvent(*, messages: List[Dict[str, Any]], **kwargs: Any)
Bases: object
To be used by model clients to log the start of a stream.
- Parameters:
messages (List[Dict[str, Any]]) – The messages used in the call. Must be JSON-serializable.
Example

```python
import logging

from autogen_core import EVENT_LOGGER_NAME
from autogen_core.logging import LLMStreamStartEvent

messages = [{"role": "user", "content": "Hello, world!"}]
logger = logging.getLogger(EVENT_LOGGER_NAME)
logger.info(LLMStreamStartEvent(messages=messages))
```
- class MessageDroppedEvent(*, payload: str, sender: AgentId | None, receiver: AgentId | TopicId | None, kind: MessageKind, **kwargs: Any)
Bases: object
- class MessageEvent(*, payload: str, sender: AgentId | None, receiver: AgentId | TopicId | None, kind: MessageKind, delivery_stage: DeliveryStage, **kwargs: Any)
Bases: object
- class MessageHandlerExceptionEvent(*, payload: str, handling_agent: AgentId, exception: BaseException, **kwargs: Any)
Bases: object