Error Handling

We show how the optimizer in Trace can be used to resolve execution errors during optimization. When an exception is thrown during the execution of a bundled method, a special MessageNode, called an ExceptionNode, is created, and a new Python exception trace.ExecutionError is raised. trace.ExecutionError wraps the original exception and exposes the created ExceptionNode through its exception_node attribute. The ExceptionNode's parents are the inputs of the bundled method that triggered the exception. Therefore, to resolve the error, we can simply use the created ExceptionNode as the optimization target and its data as feedback.
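The wrapping behavior described above can be sketched in plain Python. This is a hypothetical mini version for illustration only; the `ExceptionNode`, `ExecutionError`, and `bundle` below are stand-ins for the pattern, not Trace's actual implementation:

```python
# Sketch of the exception-wrapping pattern (not Trace's real classes):
# a bundle-like decorator that, on failure, wraps the original exception
# in an ExecutionError carrying an "exception node" whose parents are
# the inputs of the failed call.

class ExceptionNode:
    """Stand-in: records the failed call's inputs and the error message."""
    def __init__(self, parents, data):
        self.parents = parents   # inputs to the bundled method
        self.data = data         # error message, usable as feedback

class ExecutionError(Exception):
    """Wrapper around the original exception, exposing the exception node."""
    def __init__(self, exception_node, original):
        super().__init__(str(original))
        self.exception_node = exception_node
        self.original = original

def bundle(fn):
    def wrapped(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as e:
            node = ExceptionNode(parents=list(args),
                                 data=f"({type(e).__name__}) {e}")
            raise ExecutionError(node, e) from e
    return wrapped

@bundle
def check_positive(a):
    if a < 0.1:
        raise ValueError("Input must be greater than 0.1")
    return True

try:
    check_positive(-1.0)
except ExecutionError as e:
    print(e.exception_node.data)     # (ValueError) Input must be greater than 0.1
    print(e.exception_node.parents)  # [-1.0]
```

The caller never sees the raw ValueError; it sees an ExecutionError whose exception_node ties the failure back to the offending inputs, which is exactly what lets an optimizer target them.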

Below is a basic example of how Trace deals with exceptions.

!pip install trace-opt
from opto import trace
from opto.optimizers import OptoPrime


def check_input(a):
    if a < 0.1:
        raise ValueError("Input must be greater than 0.1")

@trace.bundle()
def func_with_input_checking(a):
    check_input(a)
    return True


param = trace.node(-1., trainable=True)  # Note: setting the initial value to -1. makes it a float
optimizer = OptoPrime([param], memory_size=5)

for _ in range(5):
    try:
        success = func_with_input_checking(param)
        print(f'\nSuccess, Parameter: {param.data}')
        break
    except trace.ExecutionError as e:
        print(f'\nIter {_}, Failed, Parameter {param.data}\n')
        target = e.exception_node
        optimizer.zero_feedback()
        optimizer.backward(target, target.create_feedback())
        optimizer.step(verbose=True)
Iter 0, Failed, Parameter -1.0

Prompt
 
You're tasked to solve a coding/algorithm problem. You will see the instruction, the code, the documentation of each function used in the code, and the feedback about the execution result.

Specifically, a problem will be composed of the following parts:
- #Instruction: the instruction which describes the things you need to do or the question you should answer.
- #Code: the code defined in the problem.
- #Documentation: the documentation of each function used in #Code. The explanation might be incomplete and just contain high-level description. You can use the values in #Others to help infer how those functions work.
- #Variables: the input variables that you can change.
- #Constraints: the constraints or descriptions of the variables in #Variables.
- #Inputs: the values of other inputs to the code, which are not changeable.
- #Others: the intermediate values created through the code execution.
- #Outputs: the result of the code output.
- #Feedback: the feedback about the code's execution result.

In #Variables, #Inputs, #Outputs, and #Others, the format is:

<data_type> <variable_name> = <value>

If <type> is (code), it means <value> is the source code of a python code, which may include docstring and definitions.

Output_format: Your output should be in the following json format, satisfying the json syntax:

{{
"reasoning": <Your reasoning>,
"answer": <Your answer>,
"suggestion": {{
    <variable_1>: <suggested_value_1>,
    <variable_2>: <suggested_value_2>,
}}
}}

In "reasoning", explain the problem: 1. what the #Instruction means 2. what the #Feedback on #Output means to #Variables considering how #Variables are used in #Code and other values in #Documentation, #Inputs, #Others. 3. Reasoning about the suggested changes in #Variables (if needed) and the expected result.

If #Instruction asks for an answer, write it down in "answer".

If you need to suggest a change in the values of #Variables, write down the suggested values in "suggestion". Remember you can change only the values in #Variables, not others. When <type> of a variable is (code), you should write the new definition in the format of python code without syntax errors, and you should not change the function name or the function signature.

If no changes or answer are needed, just output TERMINATE.

Now you see problem instance:

================================

#Instruction
You need to change the <value> of the variables in #Variables to improve the output in accordance to #Feedback.

#Code
exception_func_with_input_checking0 = func_with_input_checking(a=float0)

#Documentation
[exception] The operator func_with_input_checking raises an exception.

#Variables
(float) float0=-1.0

#Constraints


#Inputs


#Others


#Outputs
(str) exception_func_with_input_checking0=(ValueError) Input must be greater than 0.1

#Feedback
(ValueError) Input must be greater than 0.1

================================


Your response:

LLM response:
 {
"reasoning": "The Instruction indicates that we need to modify the value of variables to improve the output regarding the Feedback received, which is a ValueError stating that the input must be greater than 0.1. The code is trying to use a function `func_with_input_checking` which expects its input (`a`) to be greater than 0.1, but the current value of float0 is -1.0, which does not meet the function's requirement, leading to the exception. To solve the issue, we need to change the value of float0 to be greater than 0.1 to prevent the ValueError from being raised.",
"answer": "",
"suggestion": {
    "float0": 0.2
}
}

Success, Parameter: 0.2

Next we extend this basic example into an example of constrained optimization, showing how optimization and constraint satisfaction can be approached in the same way.
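The control flow of the example below can be mirrored with a toy numeric optimizer (plain Python, no LLM): feasible iterations receive objective feedback and take a descent step, while infeasible iterations receive the constraint violation as feedback and move back to the feasible region. The step size and clamping rule here are illustrative choices, not part of Trace:

```python
# Toy analogue of the loop below: one update path handles both kinds of
# feedback -- objective values when the constraint holds, and the
# constraint violation itself when it does not.

def objective(a):
    if a < 0.1:
        raise ValueError("Input must be greater than 0.1")
    return (a + 1) ** 2

param = -1.0
for _ in range(100):
    try:
        value = objective(param)
        # Objective feedback: gradient of (a+1)**2 is 2*(a+1);
        # take a small descent step, clamped to the feasible set.
        param = max(param - 0.1 * 2 * (param + 1), 0.1)
    except ValueError:
        # Constraint feedback: move to the boundary of the feasible set.
        param = 0.1

# Settles at the constrained optimum a = 0.1, where (a+1)**2 = 1.21.
print(param, objective(param))
```

The unconstrained minimizer of (a+1)**2 is a = -1, which violates the constraint, so the solution sits on the boundary a = 0.1 with objective 1.21; the Trace run below approaches the same point.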

from opto import trace
from opto.optimizers import OptoPrime

trace.GRAPH.clear()

def check_input(a):
    if a < 0.1:
        raise ValueError("Input must be greater than 0.1")

@trace.bundle()
def objective(a):
    """ Computes (a+1)**2. """
    check_input(a)
    return (a+1)**2


param = trace.node(-1., trainable=True)  # Note: setting the initial value to -1. makes it a float
optimizer = OptoPrime([param], memory_size=5)

for _ in range(10):
    try:
        target = objective(param)
        feedback = 'Minimize the objective.'
        print(f'\nIter {_}, Objective {target.data}, Parameter {param.data}\n')

    except trace.ExecutionError as e:
        print(f'\nIter {_}, Not satisfying constraint, Parameter {param.data}\n')
        target = e.exception_node
        feedback = e.exception_node.create_feedback()

    optimizer.zero_feedback()
    optimizer.backward(target, feedback)
    optimizer.step()
Iter 0, Not satisfying constraint, Parameter -1.0


Iter 1, Objective 1.44, Parameter 0.2


Iter 2, Objective 1.2122009999999999, Parameter 0.101


Iter 3, Objective 1.2102200100000002, Parameter 0.1001


Iter 4, Objective 1.2104400400000002, Parameter 0.1002


Iter 5, Objective 1.2104620440999998, Parameter 0.10021


Iter 6, Objective 1.2104840484, Parameter 0.10022


Iter 7, Not satisfying constraint, Parameter 0.0


Iter 8, Objective 1.2105060529, Parameter 0.10023


Iter 9, Objective 1.2105082533609999, Parameter 0.100231
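The run above can be checked against the analytic answer. Since (a+1)**2 is increasing for a >= 0.1, the constrained minimum sits on the boundary a = 0.1 with objective value 1.21, which is what the logged iterates converge toward:

```python
# Analytic check: minimize (a+1)**2 subject to a >= 0.1.
# The objective is increasing on the feasible set, so the optimum is a = 0.1.

def f(a):
    return (a + 1) ** 2

print(f(0.1))       # 1.2100000000000002 (floating point)
print(f(0.100231))  # objective at the final parameter from the log above
```

The final logged objective, 1.2105082533609999 at parameter 0.100231, differs from the optimum 1.21 by about 5e-4, i.e. the optimizer stops just inside the feasible region.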