mmagic.apis.inferencers.base_mmagic_inferencer¶
Module Contents¶
Classes¶
BaseMMagicInferencer – Base inferencer.
- class mmagic.apis.inferencers.base_mmagic_inferencer.BaseMMagicInferencer(config: Union[mmagic.utils.ConfigType, str], ckpt: Optional[str], device: Optional[str] = None, extra_parameters: Optional[Dict] = None, seed: int = 2022, **kwargs)[source]¶
Bases:
mmengine.infer.BaseInferencer
Base inferencer.
- Parameters
config (str or ConfigType) – Model config or the path to it.
ckpt (str, optional) – Path to the checkpoint.
device (str, optional) – Device to run inference. If None, the best device will be automatically used.
extra_parameters (Dict, optional) – Extra parameters for different models in inference stage.
seed (int, optional) – Seed for inference. Defaults to 2022.
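The initialization flow above can be sketched in pure Python. The class below is a hypothetical stand-in (not the real mmengine base class) that only illustrates how the documented arguments are resolved: a missing device falls back to a default, the seed keeps its 2022 default, and extra parameters are stored as a dict.

```python
# Hypothetical sketch of the constructor contract; SketchInferencer is an
# invented name and does not load any real config or checkpoint.
from typing import Dict, Optional

class SketchInferencer:
    def __init__(self, config: str, ckpt: Optional[str] = None,
                 device: Optional[str] = None,
                 extra_parameters: Optional[Dict] = None,
                 seed: int = 2022):
        self.config = config
        self.ckpt = ckpt
        # The real class auto-selects the best device when none is given;
        # this sketch simply falls back to 'cpu'.
        self.device = device or 'cpu'
        self.seed = seed
        self.extra_parameters = dict(extra_parameters or {})

inf = SketchInferencer('configs/model.py')
print(inf.device, inf.seed)  # cpu 2022
```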
- _init_model(cfg: Union[mmagic.utils.ConfigType, str], ckpt: Optional[str], device: str) None [source]¶
Initialize the model with the given config and checkpoint on the specific device.
- _init_pipeline(cfg: mmagic.utils.ConfigType) mmengine.dataset.Compose [source]¶
Initialize the test pipeline.
- _init_extra_parameters(extra_parameters: Dict) None [source]¶
Initialize extra_parameters of each kind of inferencer.
- _dispatch_kwargs(**kwargs) Tuple[Dict, Dict, Dict, Dict] [source]¶
Dispatch kwargs to preprocess(), forward(), visualize() and postprocess() according to the actual demands.
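One way such a dispatcher can work is sketched below. The key sets are invented for illustration; the real method derives the accepted keywords from each stage's signature.

```python
# Hypothetical sketch of routing **kwargs to the four pipeline stages.
# The key sets below are assumptions, not the real accepted keywords.
PREPROCESS_KEYS = {'img'}
FORWARD_KEYS = {'mode'}
VISUALIZE_KEYS = {'show', 'result_out_dir'}
POSTPROCESS_KEYS = {'get_datasample'}

def dispatch_kwargs(**kwargs):
    # One dict per stage: preprocess, forward, visualize, postprocess.
    groups = ({}, {}, {}, {})
    key_sets = (PREPROCESS_KEYS, FORWARD_KEYS,
                VISUALIZE_KEYS, POSTPROCESS_KEYS)
    for name, value in kwargs.items():
        for group, keys in zip(groups, key_sets):
            if name in keys:
                group[name] = value
                break
        else:
            raise ValueError(f'unknown argument: {name}')
    return groups

pre, fwd, vis, post = dispatch_kwargs(img='cat.png', show=True)
print(pre, vis)  # {'img': 'cat.png'} {'show': True}
```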
- __call__(**kwargs) Union[Dict, List[Dict]] [source]¶
Call the inferencer.
- Parameters
kwargs – Keyword arguments for the inferencer.
- Returns
Results of inference pipeline.
- Return type
Union[Dict, List[Dict]]
- base_call(**kwargs) Union[Dict, List[Dict]] [source]¶
Call the inferencer.
- Parameters
kwargs – Keyword arguments for the inferencer.
- Returns
Results of inference pipeline.
- Return type
Union[Dict, List[Dict]]
- get_extra_parameters() List[str] [source]¶
Each inferencer may have its own parameters. Call this function to get them.
- Returns
List of unique parameters.
- Return type
List[str]
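A concrete inferencer might expose its parameters as sketched below; the class name and the `sample_nums` parameter are hypothetical, chosen only to show the shape of the return value.

```python
# Hypothetical subclass illustrating the get_extra_parameters() contract:
# return the unique parameter names this inferencer understands.
from typing import Dict, List, Optional

class SketchTranslationInferencer:
    def __init__(self, extra_parameters: Optional[Dict] = None):
        # Merge user-supplied extras over an assumed default.
        self.extra_parameters = {'sample_nums': 1, **(extra_parameters or {})}

    def get_extra_parameters(self) -> List[str]:
        return list(self.extra_parameters.keys())

print(SketchTranslationInferencer().get_extra_parameters())  # ['sample_nums']
```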
- postprocess(preds: PredType, imgs: Optional[List[numpy.ndarray]] = None, is_batch: bool = False, get_datasample: bool = False) Union[ResType, Tuple[ResType, numpy.ndarray]] [source]¶
Postprocess predictions.
- Parameters
preds (List[Dict]) – Predictions of the model.
imgs (Optional[List[np.ndarray]]) – Visualized predictions.
is_batch (bool) – Whether the inputs are in a batch. Defaults to False.
get_datasample (bool) – Whether to use DataSample to store inference results. If False, a dict is used. Defaults to False.
- Returns
result (Dict) – Inference results as a dict. imgs (torch.Tensor) – Image result of inference as a tensor or a list of tensors.
- Return type
Dict
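The branching this method documents can be sketched in plain Python. This is a hypothetical stand-in, not the real implementation: it unwraps single-sample results when `is_batch` is False, wraps predictions in plain dicts unless DataSample-style results are requested, and passes visualized images through alongside the results.

```python
# Hypothetical sketch of the postprocess() contract described above.
from typing import Any, List, Optional

def sketch_postprocess(preds: List[Any],
                       imgs: Optional[List[Any]] = None,
                       is_batch: bool = False,
                       get_datasample: bool = False):
    # Without get_datasample, convert each prediction to a plain dict.
    results = preds if get_datasample else [{'pred': p} for p in preds]
    if not is_batch:
        # A non-batched call returns a single result, not a list.
        results = results[0]
    return results if imgs is None else (results, imgs)

print(sketch_postprocess([[0.2, 0.8]]))  # {'pred': [0.2, 0.8]}
```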
- _pred2dict(pred_tensor: torch.Tensor) Dict [source]¶
Extract the elements needed to represent a prediction into a dictionary. The dictionary should contain only basic data elements such as strings and numbers, so that it is guaranteed to be JSON-serializable.
- Parameters
pred_tensor (torch.Tensor) – The tensor to be converted.
- Returns
The output dictionary.
- Return type
dict
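The JSON-serializability requirement can be illustrated without torch. In the sketch below, a `torch.Tensor` is stood in for by a nested Python list; the function name is hypothetical.

```python
# Illustrative sketch of the _pred2dict() contract: store only basic,
# JSON-serializable values so json.dumps() succeeds on the result.
import json

def pred_to_dict(pred):
    # Convert the prediction to plain lists/numbers before storing it;
    # with a real tensor this would be pred_tensor.tolist().
    return {'pred': pred if isinstance(pred, list) else list(pred)}

result = pred_to_dict([[0.1, 0.9], [0.7, 0.3]])
print(json.dumps(result))  # round-trips cleanly
```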
- visualize(inputs: list, preds: Any, show: bool = False, result_out_dir: str = '', **kwargs) List[numpy.ndarray] [source]¶
Visualize predictions.
Customize your visualization by overriding this method. visualize should return visualization results, which could be np.ndarray or any other objects.
- Parameters
inputs (list) – Inputs preprocessed by _inputs_to_list().
preds (Any) – Predictions of the model.
show (bool) – Whether to display the image in a popup window. Defaults to False.
result_out_dir (str) – Output directory of images. Defaults to ‘’.
- Returns
Visualization results.
- Return type
List[np.ndarray]