kolena.workflow.Model
Legacy Warning
Content in this section reflects outdated practices or deprecated features. Avoid using these in new development.
While existing implementations using these features will continue to receive support, we strongly advise adopting the latest standards and tools for new projects to ensure optimal performance and compatibility. For more information and up-to-date practices, please refer to our newest documentation at docs.kolena.io.
Model(name, infer=None, metadata=None, tags=None)
Bases: Frozen, WithTelemetry
The descriptor of a model tested on Kolena. A model is a deterministic transformation from TestSample inputs to Inference outputs.
Rather than importing this class directly, use the Model type definition returned from define_workflow.
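For orientation, here is a minimal sketch of obtaining the workflow-bound Model type from define_workflow and constructing a model descriptor. The Image test sample base, the example GroundTruth and Inference dataclasses, the workflow name, and the metadata values are illustrative assumptions, not prescribed by this API reference.

```python
from dataclasses import dataclass

from kolena.workflow import GroundTruth as BaseGroundTruth
from kolena.workflow import Image
from kolena.workflow import Inference as BaseInference
from kolena.workflow import define_workflow


# example ground truth and inference types for an image classification workflow (assumed for illustration)
@dataclass(frozen=True)
class GroundTruth(BaseGroundTruth):
    label: str


@dataclass(frozen=True)
class Inference(BaseInference):
    label: str
    confidence: float


# define_workflow returns TestCase, TestSuite, and Model types bound to this workflow
TestCase, TestSuite, Model = define_workflow("Example Classification", Image, GroundTruth, Inference)

# constructing via the workflow-bound type populates model.workflow automatically
model = Model("example-model-v1", metadata={"framework": "torch"}, tags={"baseline"})
```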
workflow: Workflow
instance-attribute
The workflow of this model. Automatically populated when constructing via the model type returned from define_workflow.
name: str
instance-attribute
Unique name of the model.
metadata: Dict[str, Any]
instance-attribute
Unstructured metadata associated with the model.
tags: Set[str]
instance-attribute
Tags associated with this model.
infer: Optional[Callable[[TestSample], Inference]]
instance-attribute
Function transforming a TestSample for a workflow into an Inference object. Required when using test or TestRun.run.
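As a hedged sketch of what such a function might look like, continuing the classification example above (run_classifier is a hypothetical placeholder for real model code, and the locator field comes from the assumed Image test sample):

```python
def infer(test_sample: Image) -> Inference:
    # run_classifier is a hypothetical stand-in for the actual model call
    label, confidence = run_classifier(test_sample.locator)
    return Inference(label=label, confidence=confidence)


# the inference function is attached when constructing the model
model = Model("example-model-v1", infer=infer)
```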
create(name, infer=None, metadata=None, tags=None)
classmethod
Create a new model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
name | str | The unique name of the new model to create. | required |
infer | Optional[Callable[[TestSample], Inference]] | Optional inference function for this model. | None |
metadata | Optional[Dict[str, Any]] | Optional unstructured metadata to store with this model. | None |
tags | Optional[Set[str]] | Optional set of tags to associate with this model. | None |
Returns:
Type | Description |
---|---|
Model | The newly created model. |
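A short usage sketch under the same assumptions as the examples above; the model name, metadata, and tags are illustrative:

```python
model = Model.create(
    "example-model-v2",
    infer=infer,  # optional; hypothetical inference function from the sketch above
    metadata={"framework": "torch", "training_set": "train-2023"},
    tags={"baseline", "resnet50"},
)
```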
load(name, infer=None)
classmethod
Load an existing model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
name | str | The name of the model to load. | required |
infer | Optional[Callable[[TestSample], Inference]] | Optional inference function for this model. | None |
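For example, an existing model might be reloaded with its inference function reattached before running tests (names continue the assumptions above):

```python
# reattach the (hypothetical) inference function when loading an existing model
model = Model.load("example-model-v1", infer=infer)
```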
load_all(*, tags=None)
classmethod
Load all models with this workflow.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
tags | Optional[Set[str]] | Optionally specify a set of tags to apply as a filter. The loaded models will include only models with tags matching each of these specified tags. | None |
Returns:
Type | Description |
---|---|
List[Model] | The models within this workflow, filtered by tags when specified. |
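A brief sketch of filtering by tags, assuming the tag values used in the earlier examples:

```python
# loads only models carrying both of these tags
baseline_models = Model.load_all(tags={"baseline", "resnet50"})
for m in baseline_models:
    print(m.name, sorted(m.tags))
```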
load_inferences(test_case)
Load all inferences stored for this model on the provided test case.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
test_case | TestCase | The test case for which to load inferences. | required |
Returns:
Type | Description |
---|---|
List[Tuple[TestSample, GroundTruth, Inference]] | The ground truths and inferences for all test samples in the test case. |
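A usage sketch, assuming test_case is a TestCase of this workflow loaded or created elsewhere and that the ground truth and inference fields follow the example dataclasses above:

```python
# test_case: a workflow-bound TestCase loaded or created elsewhere (assumption)
results = model.load_inferences(test_case)
for test_sample, ground_truth, inference in results:
    print(test_sample.locator, ground_truth.label, inference.label, inference.confidence)
```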
iter_inferences(test_case)
Iterate over all inferences stored for this model on the provided test case.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
test_case | TestCase | The test case over which to iterate inferences. | required |
Returns:
Type | Description |
---|---|
Iterator[Tuple[TestSample, GroundTruth, Inference]] | Iterator exposing the ground truths and inferences for all test samples in the test case. |
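Under the same assumptions as above, the iterator form can stream results rather than materializing the full list, which may be preferable for large test cases:

```python
# stream results one tuple at a time instead of loading everything into memory
for test_sample, ground_truth, inference in model.iter_inferences(test_case):
    if inference.label != ground_truth.label:  # label fields are assumptions from the sketch above
        print(f"mismatch on {test_sample.locator}")
```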