metric
Bases: function
The base class of the evaluation metrics in the tinyBIG toolkit.
Attributes:

| Name | Type | Description |
| --- | --- | --- |
| name | str, default = 'base_metric' | Name of the evaluation metric. |
Methods:

| Name | Description |
| --- | --- |
| __init__ | Initializes the evaluation metric. |
| evaluate | Performs the evaluation based on the inputs. |
| __call__ | Reimplements the built-in callable method. |
__call__(*args, **kwargs)
abstractmethod
Reimplements the built-in callable method.
It is declared as an abstract method and must be implemented by the inheriting evaluation metric classes. This callable method accepts prediction results and ground-truth results as inputs and returns the evaluation metric scores as the output.
Returns:

| Type | Description |
| --- | --- |
| float \| dict | The evaluation metric scores. |
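As a hedged sketch, a subclass typically implements this callable interface by delegating to its evaluate method. The subclass and argument names below are hypothetical, and the import path is inferred from the module file tinybig/metric/base_metric.py:

```python
from tinybig.metric.base_metric import metric  # import path assumed, not confirmed by this page

class delegating_metric(metric):
    # hypothetical partial subclass: only __call__ is shown here; a concrete
    # subclass must also implement evaluate before it can be instantiated
    def __call__(self, y_true, y_pred, *args, **kwargs):
        # the callable interface conventionally forwards the predictions and
        # ground truth straight to evaluate and returns its score
        return self.evaluate(y_true, y_pred, *args, **kwargs)
```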
__init__(name='base_metric', device='cpu', *args, **kwargs)
The initialization method of the base metric class.
It initializes a metric object based on the provided method parameters.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| name | str | The name of the metric. | 'base_metric' |
Returns:

| Type | Description |
| --- | --- |
| object | The metric object. |
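A subclass constructor usually just forwards its own defaults to this base initializer. A minimal sketch, assuming the same import path as above (the subclass name is hypothetical):

```python
from tinybig.metric.base_metric import metric  # import path assumed, not confirmed by this page

class named_metric(metric):
    # hypothetical subclass illustrating reuse of the base initializer;
    # a concrete subclass must still implement evaluate and __call__
    def __init__(self, name='named_metric', device='cpu', *args, **kwargs):
        # forward the custom name and device to the base metric initializer
        super().__init__(name=name, device=device, *args, **kwargs)
```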
evaluate(*args, **kwargs)
abstractmethod
The evaluate method of the base metric class.
It is declared as an abstract method and must be implemented by the inheriting evaluation metric classes. The evaluate method accepts prediction results and ground-truth results as inputs and returns the evaluation metric scores as the output.
Returns:

| Type | Description |
| --- | --- |
| float \| dict | The evaluation metric scores. |
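Putting the pieces together, a minimal end-to-end sketch of a concrete metric: the accuracy class, its argument names, and its scoring logic below are illustrative assumptions, not part of the toolkit.

```python
from tinybig.metric.base_metric import metric  # import path assumed, not confirmed by this page

class accuracy(metric):
    # hypothetical concrete metric implementing both abstract methods
    def __init__(self, name='accuracy', *args, **kwargs):
        super().__init__(name=name, *args, **kwargs)

    def evaluate(self, y_true, y_pred, *args, **kwargs):
        # score = fraction of predictions matching the ground truth
        return sum(int(t == p) for t, p in zip(y_true, y_pred)) / len(y_true)

    def __call__(self, y_true, y_pred, *args, **kwargs):
        # delegate the callable interface to evaluate
        return self.evaluate(y_true, y_pred, *args, **kwargs)

acc = accuracy()
print(acc.evaluate([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
print(acc([1, 0, 1, 1], [1, 0, 0, 1]))           # same score via the callable interface
```

Returning a plain float keeps the sketch simple; a metric that reports several scores at once would instead return a dict, matching the float \| dict return type documented above.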