promptflow.integrations.langchain module#
- class promptflow.integrations.langchain.LangChainEventType(value)#
Bases: Enum
An enumeration of LangChain event types (LLM, chain, tool, and agent).
- AGENT = ('AGENT', 3)#
- CHAIN = ('CHAIN', 1)#
- LLM = ('LLM', 0)#
- TOOL = ('TOOL', 2)#
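Each member's value is a (name, order) tuple, as listed above, so the ordering of an event type can be read back from its value. A minimal sketch, assuming only that the promptflow package is installed:

```python
from promptflow.integrations.langchain import LangChainEventType

# Members iterate in definition order; each value is a (name, order) tuple.
for event_type in LangChainEventType:
    name, order = event_type.value
    print(f"{event_type.name}: order={order}")
```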
- class promptflow.integrations.langchain.PromptFlowCallbackHandler#
Bases: BaseCallbackHandler
PromptFlowCallbackHandler implements the langchain.callbacks.base.BaseCallbackHandler interface, which has a method for each event that can be subscribed to. The appropriate method will be called on the handler when the event is triggered.
- property always_verbose: bool#
Whether to always be verbose.
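In practice the handler is not called directly; an instance is passed to LangChain through the callbacks argument so that the events below are routed into the promptflow trace (this is most useful when the code runs inside a promptflow flow or tool, where a tracer is active). A minimal sketch, assuming langchain-openai is installed and OPENAI_API_KEY is set; the model name is illustrative:

```python
from langchain_openai import ChatOpenAI

from promptflow.integrations.langchain import PromptFlowCallbackHandler

# Attach the handler so LangChain reports LLM/chain/tool/agent events to promptflow.
llm = ChatOpenAI(model="gpt-4o-mini", callbacks=[PromptFlowCallbackHandler()])

# LangChain invokes on_llm_start / on_llm_end on the handler automatically.
reply = llm.invoke("Say hello in one short sentence.")
print(reply.content)
```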
- on_agent_action(action: AgentAction, **kwargs: Any) → None#
Run on agent action.
- Parameters:
action (AgentAction) – The action taken by the agent.
- on_agent_finish(finish: AgentFinish, **kwargs: Any) → None#
Run on agent end.
- Parameters:
finish (AgentFinish) – The finish result from the agent.
- on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None#
Run when the chain ends running.
- Parameters:
outputs (Dict[str, Any]) – The outputs from the chain.
- on_chain_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None#
Run when the chain errors.
- Parameters:
error (Union[Exception, KeyboardInterrupt]) – The error raised by the chain.
- on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None#
Run when the chain starts running.
- Parameters:
serialized (Dict[str, Any]) – The serialized chain object.
inputs (Dict[str, Any]) – The inputs used to run the chain.
- on_llm_end(response: LLMResult, **kwargs: Any) → None#
Run when the LLM ends running.
- Parameters:
response (LLMResult) – The response from the LLM.
- on_llm_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None#
Run when the LLM errors.
- Parameters:
error (Union[Exception, KeyboardInterrupt]) – The error raised by the LLM.
- on_llm_new_token(token: str, **kwargs: Any) → None#
Run on a new LLM token. Only available when streaming is enabled.
- Parameters:
token (str) – The new token.
- on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None#
Run when the LLM starts running.
- Parameters:
serialized (Dict[str, Any]) – The serialized LLM object.
prompts (List[str]) – The prompts used to run the LLM.
- on_text(text: str, **kwargs: Any) → None#
Run on arbitrary text.
- Parameters:
text (str) – The text.
- on_tool_end(output: str, **kwargs: Any) → None#
Run when the tool ends running.
- Parameters:
output (str) – The output from the tool.
- on_tool_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None#
Run when the tool errors.
- Parameters:
error (Union[Exception, KeyboardInterrupt]) – The error raised by the tool.
- on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None#
Run when the tool starts running.
- Parameters:
serialized (Dict[str, Any]) – The serialized tool object.
input_str (str) – The input string used to run the tool.
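The same handler also traces composed chains; supplying it per invocation through the config dictionary scopes the callbacks to that single run, which triggers the on_chain_* methods around the run and the on_llm_* methods inside it. A sketch under the same assumptions as the earlier example (langchain-openai installed, API key set, illustrative model name):

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

from promptflow.integrations.langchain import PromptFlowCallbackHandler

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")

# Callbacks passed in `config` apply only to this invocation; LangChain fires
# on_chain_start/on_chain_end around the run and on_llm_start/on_llm_end within it.
result = chain.invoke(
    {"text": "LangChain emits callback events while a chain executes."},
    config={"callbacks": [PromptFlowCallbackHandler()]},
)
print(result.content)
```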