agentchat.agent

Agent

@runtime_checkable
class Agent(Protocol)

(In preview) A protocol for an agent.

An agent can communicate with other agents and perform actions. Different agents can differ in what actions they perform in the receive method.
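
Because the protocol is decorated with @runtime_checkable, any object that exposes these members passes an isinstance check against Agent. The sketch below is illustrative only (EchoAgent and its echo behavior are not part of the library); it shows the minimal surface a custom agent needs.

class EchoAgent:
    """Illustrative agent that structurally satisfies the Agent protocol."""

    def __init__(self, name: str, description: str = "") -> None:
        self._name = name
        self._description = description

    @property
    def name(self) -> str:
        return self._name

    @property
    def description(self) -> str:
        return self._description

    def send(self, message, recipient, request_reply=None) -> None:
        recipient.receive(message, self, request_reply)

    async def a_send(self, message, recipient, request_reply=None) -> None:
        await recipient.a_receive(message, self, request_reply)

    def receive(self, message, sender, request_reply=None) -> None:
        content = message["content"] if isinstance(message, dict) else message
        if request_reply:
            reply = self.generate_reply([{"role": "user", "content": content}], sender)
            if reply is not None:
                self.send(reply, sender)

    async def a_receive(self, message, sender, request_reply=None) -> None:
        self.receive(message, sender, request_reply)

    def generate_reply(self, messages=None, sender=None, **kwargs):
        # Echo the last message back; None means "no reply".
        return messages[-1]["content"] if messages else None

    async def a_generate_reply(self, messages=None, sender=None, **kwargs):
        return self.generate_reply(messages=messages, sender=sender, **kwargs)


# Because Agent is a @runtime_checkable Protocol, a structural isinstance check passes:
# from autogen.agentchat import Agent  # adjust the import path to your installation
# assert isinstance(EchoAgent("echo"), Agent)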

name

@property
def name() -> str

The name of the agent.

description

@property
def description() -> str

The description of the agent. Used for the agent's introduction in a group chat setting.

send

def send(message: Union[Dict[str, Any], str],
         recipient: "Agent",
         request_reply: Optional[bool] = None) -> None

Send a message to another agent.

Arguments:

  • message dict or str - the message to send. If a dict, it should be JSON-serializable and follow OpenAI's ChatCompletion schema.
  • recipient Agent - the recipient of the message.
  • request_reply bool - whether to request a reply from the recipient.
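
A hedged usage sketch, reusing the hypothetical EchoAgent defined above; the agent names and message content are made up, and the dict form mirrors the ChatCompletion message schema.

alice = EchoAgent("alice")
bob = EchoAgent("bob")

# Plain string message, no reply requested.
alice.send("Hello, bob!", recipient=bob)

# Dict message following the OpenAI ChatCompletion message schema; ask for a reply.
alice.send(
    {"role": "user", "content": "Summarize the meeting notes."},
    recipient=bob,
    request_reply=True,
)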

a_send

async def a_send(message: Union[Dict[str, Any], str],
                 recipient: "Agent",
                 request_reply: Optional[bool] = None) -> None

(Async) Send a message to another agent.

Arguments:

  • message dict or str - the message to send. If a dict, it should be JSON-serializable and follow OpenAI's ChatCompletion schema.
  • recipient Agent - the recipient of the message.
  • request_reply bool - whether to request a reply from the recipient.

receive

def receive(message: Union[Dict[str, Any], str],
            sender: "Agent",
            request_reply: Optional[bool] = None) -> None

Receive a message from another agent.

Arguments:

  • message dict or str - the message received. If a dict, it should be JSON-serializable and follow OpenAI's ChatCompletion schema.
  • sender Agent - the sender of the message.
  • request_reply bool - whether the sender requests a reply.
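
Agent-specific behavior typically lives in receive. A sketch building on the hypothetical EchoAgent above; the logging behavior is illustrative, not part of the library.

class LoggingAgent(EchoAgent):
    """Illustrative: log every incoming message before optionally replying."""

    def receive(self, message, sender, request_reply=None) -> None:
        content = message["content"] if isinstance(message, dict) else message
        print(f"[{self.name}] received from {sender.name}: {content}")
        super().receive(message, sender, request_reply)


# alice = EchoAgent("alice")
# LoggingAgent("bob").receive("ping", alice)  # -> [bob] received from alice: ping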

a_receive

async def a_receive(message: Union[Dict[str, Any], str],
                    sender: "Agent",
                    request_reply: Optional[bool] = None) -> None

(Async) Receive a message from another agent.

Arguments:

  • message dict or str - the message received. If a dict, it should be JSON-serializable and follow OpenAI's ChatCompletion schema.
  • sender Agent - the sender of the message.
  • request_reply bool - whether the sender requests a reply.

generate_reply

def generate_reply(messages: Optional[List[Dict[str, Any]]] = None,
                   sender: Optional["Agent"] = None,
                   **kwargs: Any) -> Union[str, Dict[str, Any], None]

Generate a reply based on the received messages.

Arguments:

  • messages list[dict] - a list of messages received from other agents. The messages are dictionaries that are JSON-serializable and follow OpenAI's ChatCompletion schema.
  • sender Agent - the sender of the messages.

Returns:

str or dict or None: the generated reply. If None, no reply is generated.
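
A reply can be a plain string, a ChatCompletion-style dict, or None (meaning no reply). An illustrative override, again on top of the hypothetical EchoAgent sketch above.

class KeywordAgent(EchoAgent):
    """Illustrative: demonstrates the three possible return types."""

    def generate_reply(self, messages=None, sender=None, **kwargs):
        if not messages:
            return None  # no reply
        last = messages[-1]["content"]
        if "status" in last:
            # A dict reply can itself follow the ChatCompletion message schema.
            return {"role": "assistant", "content": "All systems nominal."}
        return f"You said: {last}"  # plain string reply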

a_generate_reply

async def a_generate_reply(messages: Optional[List[Dict[str, Any]]] = None,
                           sender: Optional["Agent"] = None,
                           **kwargs: Any) -> Union[str, Dict[str, Any], None]

(Async) Generate a reply based on the received messages.

Arguments:

  • messages list[dict] - a list of messages received from other agents. The messages are dictionaries that are JSON-serializable and follow OpenAI's ChatCompletion schema.
  • sender Agent - the sender of the messages.

Returns:

str or dict or None: the generated reply. If None, no reply is generated.
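
The async variants are awaited inside an event loop. A minimal sketch with asyncio, using the hypothetical EchoAgent from above.

import asyncio


async def main() -> None:
    alice = EchoAgent("alice")
    bob = EchoAgent("bob")
    reply = await bob.a_generate_reply(
        messages=[{"role": "user", "content": "ping"}], sender=alice
    )
    print(reply)  # -> "ping" with the echoing sketch above


asyncio.run(main())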

LLMAgent

@runtime_checkable
class LLMAgent(Agent, Protocol)

(In preview) A protocol for an LLM agent.

system_message

@property
def system_message() -> str

The system message of this agent.

update_system_message

def update_system_message(system_message: str) -> None

Update this agent's system message.

Arguments:

  • system_message str - system message for inference.
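
A sketch of an object satisfying LLMAgent by adding these two members to the hypothetical EchoAgent above; the class name and default system message are illustrative, and concrete library agents such as ConversableAgent provide their own implementations.

class SimpleLLMAgent(EchoAgent):
    """Illustrative: adds the LLMAgent members on top of the EchoAgent sketch."""

    def __init__(self, name: str, system_message: str = "You are a helpful assistant.") -> None:
        super().__init__(name)
        self._system_message = system_message

    @property
    def system_message(self) -> str:
        return self._system_message

    def update_system_message(self, system_message: str) -> None:
        self._system_message = system_message


agent = SimpleLLMAgent("assistant")
agent.update_system_message("Reply only with valid JSON.")
print(agent.system_message)  # -> "Reply only with valid JSON."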