Metric

pydantic settings olive.evaluator.metric.Metric
field goal: MetricGoal = None
field higher_is_better: bool = True
field metric_config: ConfigBase = None
field name: str [Required]
field sub_type: AccuracySubType | LatencySubType = None
field type: MetricType [Required]
field user_config: ConfigBase [Required]
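The fields above map directly onto a metric entry in an Olive JSON configuration. A hedged sketch of such an entry follows; the field names come from the schema above, while the goal type string `"max-degradation"` and the `user_config` keys (`data_dir`, `batch_size`) are illustrative assumptions, not part of the schema shown here.

```python
import json

# A metric entry as it might appear in an Olive JSON config.
# "type" and "sub_type" are the string values of MetricType and
# AccuracySubType; "goal" follows the MetricGoal schema (type + value).
metric = {
    "name": "accuracy",
    "type": "accuracy",                # MetricType.ACCURACY
    "sub_type": "accuracy_score",      # AccuracySubType.ACCURACY_SCORE
    "higher_is_better": True,
    "goal": {"type": "max-degradation", "value": 0.01},  # assumed goal type
    "user_config": {
        "data_dir": "data",            # hypothetical user settings
        "batch_size": 16,
    },
}

print(json.dumps(metric, indent=2))
```

Because the enums are string-backed (see below), plain strings such as `"accuracy"` are accepted in place of enum members when the config is parsed.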

MetricType

enum olive.evaluator.metric.MetricType(value)

An enumeration.

Member Type:

str

Valid values are as follows:

ACCURACY = 'accuracy'
LATENCY = 'latency'
CUSTOM = 'custom'
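Since the member type is `str`, members behave like their raw string values. A minimal stdlib sketch of this shape (reconstructed from the member listing above, not the actual Olive class) shows why JSON configs can use plain strings:

```python
from enum import Enum

# Sketch of a str-backed enum like MetricType (assumed shape).
class MetricType(str, Enum):
    ACCURACY = "accuracy"
    LATENCY = "latency"
    CUSTOM = "custom"

# str-backed members compare equal to their raw values...
print(MetricType.ACCURACY == "accuracy")      # True
# ...and raw values round-trip back to members.
print(MetricType("latency") is MetricType.LATENCY)  # True
```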

AccuracySubType

enum olive.evaluator.metric.AccuracySubType(value)

An enumeration.

Member Type:

str

Valid values are as follows:

ACCURACY_SCORE = 'accuracy_score'
F1_SCORE = 'f1_score'
PRECISION = 'precision'
RECALL = 'recall'
AUC = 'auc'

LatencySubType

enum olive.evaluator.metric.LatencySubType(value)

An enumeration.

Member Type:

str

Valid values are as follows:

AVG = 'avg'
MAX = 'max'
MIN = 'min'
P50 = 'p50'
P75 = 'p75'
P90 = 'p90'
P95 = 'p95'
P99 = 'p99'
P999 = 'p999'

MetricGoal

pydantic settings olive.evaluator.metric.MetricGoal
field type: str [Required]
field value: float [Required]
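A goal pairs a comparison type with a numeric threshold that a metric must satisfy. The reference above only specifies the two fields, so the following is a hedged sketch of plausible goal semantics: the goal type strings (`"max-degradation"`, `"min-improvement"`) and the check logic are assumptions for illustration, not Olive's actual evaluation code.

```python
def meets_goal(candidate: float, baseline: float, goal_type: str,
               value: float, higher_is_better: bool = True) -> bool:
    """Check a candidate metric against a baseline under an assumed goal type."""
    if goal_type == "max-degradation":
        # Candidate may be at most `value` worse than the baseline.
        delta = baseline - candidate if higher_is_better else candidate - baseline
        return delta <= value
    if goal_type == "min-improvement":
        # Candidate must be at least `value` better than the baseline.
        delta = candidate - baseline if higher_is_better else baseline - candidate
        return delta >= value
    raise ValueError(f"unknown goal type: {goal_type}")

# Accuracy dropped from 0.95 to 0.93; a 0.05 degradation budget allows it.
print(meets_goal(0.93, 0.95, "max-degradation", 0.05))  # True
```

The `higher_is_better` flag from `Metric` determines which direction counts as degradation, which is why the sketch threads it through the comparison.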