Execute flow as a function#

Requirements - In order to benefit from this tutorial, you will need:

  • A Python environment

  • Installed prompt flow SDK (a quick environment check is shown below)
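
To confirm these requirements are met, you can run a quick check like the one below; it assumes the SDK is installed under the package name promptflow (for example via pip install promptflow):

# verify the prompt flow SDK is installed and report its version
from importlib import metadata

print(metadata.version("promptflow"))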

Learning Objectives - By the end of this tutorial, you should be able to:

  • Execute a flow as a function

  • Execute a flow function with in-memory connection object override

  • Execute a flow function with fields override

  • Execute a flow function with streaming output

Motivations - This guide walks you through the main scenarios of executing a flow as a function. You will learn how to consume a flow as a function in different scenarios for more Pythonic usage.

Note: the flow context configs may affect each other in some cases, for example when both connections and overrides target the same node. The behavior is undefined in those scenarios, so please avoid such usage.
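
For illustration only, a context like the sketch below would hit that undefined behavior, because the connection override and the field override both target the same node. The node name "classify_with_llm" and the input name "deployment_name" are assumptions taken from the web-classification sample, and the connection object mirrors the one built in Example 2; treat this as an anti-pattern to avoid, not as guidance:

from promptflow.client import load_flow
from promptflow.entities import AzureOpenAIConnection, FlowContext

f = load_flow(source="../../flows/standard/web-classification")
connection = AzureOpenAIConnection(
    name="new_ai_connection", api_key="<user-input>", api_base="<user-input>"
)
# AVOID: the connection override and the field override below both target the node
# "classify_with_llm"; prompt flow does not define which one wins in this case
f.context = FlowContext(
    connections={"classify_with_llm": {"connection": connection}},
    overrides={"nodes.classify_with_llm.inputs.deployment_name": "<another-deployment>"},
)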

Example 1: Load flow as a function with inputs#

from promptflow.client import load_flow


flow_path = "../../flows/standard/web-classification"
sample_url = "https://www.youtube.com/watch?v=o5ZQyXaAv1g"

# load the flow as a callable and invoke it with its flow inputs as keyword arguments
f = load_flow(source=flow_path)
result = f(url=sample_url)

print(result)
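
The call returns a dict keyed by the flow's outputs, so individual fields can be read directly; the key names below are assumed from the web-classification sample flow:

# read individual flow outputs (key names assumed from the web-classification flow)
print(result["category"])
print(result["evidence"])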

Example 2: Load flow as a function with in-memory connection override#

You will need a connection named “new_ai_connection” to run the flow with the new connection (an optional sketch for persisting it locally follows this example).

# provide parameters to create the connection
conn_name = "new_ai_connection"
api_key = "<user-input>"
api_base = "<user-input>"
api_version = "<user-input>"

# create the needed connection object
from promptflow.entities import AzureOpenAIConnection, OpenAIConnection


# Follow https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal to create an Azure OpenAI resource.
connection = AzureOpenAIConnection(
    name=conn_name,
    api_key=api_key,
    api_base=api_base,
    api_type="azure",
    api_version=api_version,
)

# use this if you have an existing OpenAI account
# connection = OpenAIConnection(
#     name=conn_name,
#     api_key=api_key,
# )
f = load_flow(
    source=flow_path,
)
# override the node's connection with the in-memory connection object created above
f.context.connections = {"classify_with_llm": {"connection": connection}}

result = f(url=sample_url)

print(result)
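
If you also want this connection to exist as a named local connection (so other flows or CLI runs can reference "new_ai_connection" by name), an optional sketch like the following, using the SDK's PFClient, can persist it; the in-memory override above does not require this step:

from promptflow.client import PFClient

# persist the in-memory connection object as a local named connection (optional)
pf = PFClient()
pf.connections.create_or_update(connection)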

Example 3: Load flow as a function with flow inputs override#

from promptflow.entities import FlowContext

f = load_flow(source=flow_path)
f.context = FlowContext(
    # the node "fetch_text_content_from_url" will take its url input from this override instead of from the flow input
    overrides={"nodes.fetch_text_content_from_url.inputs.url": sample_url},
)
# the flow input url="unknown" will not take effect because of the override above
result = f(url="unknown")
print(result)

Example 4: Load flow as a function with streaming output#

f = load_flow(source="../../flows/chat/chat-basic")
# enable streaming so streaming flow outputs are returned as generators
f.context.streaming = True
result = f(
    chat_history=[
        {
            "inputs": {"chat_input": "Hi"},
            "outputs": {"chat_output": "Hello! How can I assist you today?"},
        }
    ],
    question="How are you?",
)


answer = ""
# result["answer"] is a generator; iterate over it to assemble the full answer
for r in result["answer"]:
    answer += r

print(answer)
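
Because a generator can only be iterated once, re-run the flow call if you would rather stream the chunks to the console as they arrive; this is a small variation on the loop above:

# re-run the flow to get a fresh generator, then print each chunk as it arrives
result = f(
    chat_history=[],
    question="Tell me something interesting about prompt flow.",
)
for chunk in result["answer"]:
    print(chunk, end="", flush=True)
print()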