mmagic.models.archs.wrapper
Module Contents¶
Classes¶
DiffusersWrapper — Wrapper for models from HuggingFace Diffusers.
Attributes¶
- class mmagic.models.archs.wrapper.DiffusersWrapper(from_pretrained: Optional[Union[str, os.PathLike]] = None, from_config: Optional[Union[str, os.PathLike]] = None, dtype: Optional[Union[str, torch.dtype]] = None, init_cfg: Union[dict, List[dict], None] = None, *args, **kwargs)[source]¶
Bases:
mmengine.model.BaseModule
Wrapper for models from HuggingFace Diffusers. The wrapping function sets an attribute called _module_cls on this wrapper, which is used to initialize the model structure.
Example:

>>> # 1. Load pretrained model from HuggingFace Space.
>>> config = dict(
>>>     type='ControlNetModel',  # has been registered in `MODELS`
>>>     from_pretrained='lllyasviel/sd-controlnet-canny',
>>>     torch_dtype=torch.float16)
>>> controlnet = MODELS.build(config)

>>> # 2. Initialize model with pre-defined configs.
>>> config = dict(
>>>     type='ControlNetModel',  # has been registered in `MODELS`
>>>     from_config='lllyasviel/sd-controlnet-canny',
>>>     cache_dir='~/.cache/OpenMMLab')
>>> controlnet = MODELS.build(config)

>>> # 3. Initialize model with user-defined arguments.
>>> config = dict(
>>>     type='ControlNetModel',  # has been registered in `MODELS`
>>>     in_channels=3,
>>>     down_block_types=['DownBlock2D'],
>>>     block_out_channels=(32, ),
>>>     conditioning_embedding_out_channels=(16, ))
>>> controlnet = MODELS.build(config)
- Parameters
from_pretrained (Union[str, os.PathLike], optional) – The model id of a pretrained model or a path to a directory containing model weights and config. Please refer to diffusers.model.modeling_utils.ModelMixin.from_pretrained for more detail. Defaults to None.
from_config (Union[str, os.PathLike], optional) – The model id of a pretrained model or a path to a directory containing a model config. Please refer to diffusers.configuration_utils.ConfigMixin.load_config for more detail. Defaults to None.
init_cfg (dict or List[dict], optional) – Initialization config dict. Note that in DiffusersWrapper, if you want to load pretrained weights from HuggingFace Space, please use the from_pretrained argument instead of init_cfg. Defaults to None.
*args – If from_pretrained is passed, *args and **kwargs will be passed to the from_pretrained function. If from_config is passed, they will be passed to the load_config function. Otherwise, they will be used to initialize the model by self._module_cls(*args, **kwargs).
**kwargs – Same as *args.
- init_weights()[source]¶
Initialize the weights.
If the type is 'Pretrained' but the model has already been loaded from repo_id, a warning will be raised.
- __getattr__(name: str) Any [source]¶
This function provides a way to access the attributes of the wrapped model.
- Parameters
name (str) – The name of the attribute.
- Returns
The requested attribute.
- Return type
Any
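The delegation behaviour described above can be illustrated without mmagic or diffusers installed. The sketch below is a minimal, framework-free illustration of the __getattr__ pattern such wrappers use; the class names (InnerModel, Wrapper) are hypothetical, not mmagic API.

```python
# Minimal sketch of the attribute-delegation pattern: attribute lookups
# that fail on the wrapper fall through to the wrapped model.
# InnerModel and Wrapper are illustrative names, not mmagic classes.

class InnerModel:
    def __init__(self):
        self.in_channels = 3

    def describe(self):
        return f"in_channels={self.in_channels}"


class Wrapper:
    def __init__(self, model):
        # Assignments in __init__ use normal attribute lookup, so
        # 'self.model' resolves directly without recursion.
        self.model = model

    def __getattr__(self, name):
        # Python calls __getattr__ only when normal lookup fails,
        # so everything not defined on the wrapper is delegated.
        return getattr(self.model, name)


wrapped = Wrapper(InnerModel())
wrapped.in_channels   # -> 3, found on the inner model
wrapped.describe()    # -> 'in_channels=3', delegated method call
```

Because __getattr__ is only invoked on failed lookups, attributes defined on the wrapper itself (such as model here, or forward on the real wrapper) shadow those of the wrapped module.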
- forward(*args, **kwargs) Any [source]¶
Forward function of wrapped module.
- Parameters
*args – The arguments of the wrapped module.
**kwargs –
The arguments of the wrapped module.
- Returns
The output of wrapped module’s forward function.
- Return type
Any
- to(torch_device: Optional[Union[str, torch.device]] = None, torch_dtype: Optional[torch.dtype] = None)[source]¶
Put the wrapped module on a device or convert it to torch_dtype. There are two to() functions: nn.Module.to() and diffusers.pipeline.to(). If both arguments are passed, diffusers.pipeline.to() is called.
- Parameters
torch_device – The device to move the module to.
torch_dtype – The type to convert to.
- Returns
The wrapped module itself.
- Return type
self
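The dispatch described above (dtype present selects the diffusers-style path, device-only selects the nn.Module-style path) can be sketched without torch or diffusers installed. FakeModule below merely records which call it received; the names and recording mechanism are illustrative, not mmagic API.

```python
# Sketch of the to() dispatch: when a dtype is given, the wrapped
# model's to() receives device and dtype together (diffusers-style);
# otherwise it receives the device alone (nn.Module-style).
# FakeModule and Wrapper are illustrative stand-ins, not mmagic classes.

class FakeModule:
    """Stands in for the wrapped diffusers model; records the
    arguments of each to() call instead of moving real tensors."""

    def __init__(self):
        self.calls = []

    def to(self, *args):
        self.calls.append(args)
        return self


class Wrapper:
    def __init__(self, model):
        self.model = model

    def to(self, torch_device=None, torch_dtype=None):
        if torch_dtype is not None:
            # diffusers.pipeline-style call: device and dtype together.
            self.model.to(torch_device, torch_dtype)
        else:
            # nn.Module-style call: device only.
            self.model.to(torch_device)
        return self  # the wrapper itself, matching the Returns section


m = FakeModule()
w = Wrapper(m).to("cuda:0", "float16")   # both args -> diffusers path
m.calls                                   # -> [("cuda:0", "float16")]
```

Returning self lets calls chain, e.g. MODELS.build(config).to('cuda:0'), just as with a plain nn.Module.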