mmagic.engine.optimizers

Package Contents

Classes

MultiOptimWrapperConstructor

OptimizerConstructor for GAN models. This class constructs optimizers for the submodules of the model separately.

PGGANOptimWrapperConstructor

OptimizerConstructor for PGGAN models. Sets optimizers for each stage of PGGAN.

SinGANOptimWrapperConstructor

OptimizerConstructor for SinGAN models. Sets optimizers for each submodule of SinGAN.

class mmagic.engine.optimizers.MultiOptimWrapperConstructor(optim_wrapper_cfg: dict, paramwise_cfg=None)[source]

OptimizerConstructor for GAN models. This class constructs optimizers for the submodules of the model separately and returns a mmengine.optim.OptimWrapperDict or mmengine.optim.OptimWrapper.

Example 1: Build multi optimizers (e.g., GANs):
>>> # build GAN model
>>> model = dict(
>>>     type='GANModel',
>>>     num_classes=10,
>>>     generator=dict(type='Generator'),
>>>     discriminator=dict(type='Discriminator'))
>>> gan_model = MODELS.build(model)
>>> # build constructor
>>> optim_wrapper = dict(
>>>     generator=dict(
>>>         type='OptimWrapper',
>>>         accumulative_counts=1,
>>>         optimizer=dict(type='Adam', lr=0.0002,
>>>                        betas=(0.5, 0.999))),
>>>     discriminator=dict(
>>>         type='OptimWrapper',
>>>         accumulative_counts=1,
>>>         optimizer=dict(type='Adam', lr=0.0002,
>>>                        betas=(0.5, 0.999))))
>>> optim_dict_builder = MultiOptimWrapperConstructor(optim_wrapper)
>>> # build optim wrapper dict
>>> optim_wrapper_dict = optim_dict_builder(gan_model)
Example 2: Build multi optimizers for specific submodules:
>>> # build model
>>> class GAN(nn.Module):
>>>     def __init__(self) -> None:
>>>         super().__init__()
>>>         self.generator = nn.Conv2d(3, 3, 1)
>>>         self.discriminator = nn.Conv2d(3, 3, 1)
>>> class TextEncoder(nn.Module):
>>>     def __init__(self):
>>>         super().__init__()
>>>         self.embedding = nn.Embedding(100, 100)
>>> class ToyModel(nn.Module):
>>>     def __init__(self) -> None:
>>>         super().__init__()
>>>         self.m1 = GAN()
>>>         self.m2 = nn.Conv2d(3, 3, 1)
>>>         self.m3 = nn.Linear(2, 2)
>>>         self.text_encoder = TextEncoder()
>>> model = ToyModel()
>>> # build constructor
>>> optim_wrapper = {
>>>     '.*embedding': {
>>>         'type': 'OptimWrapper',
>>>         'optimizer': {
>>>             'type': 'Adam',
>>>             'lr': 1e-4,
>>>             'betas': (0.9, 0.99)
>>>         }
>>>     },
>>>     'm1.generator': {
>>>         'type': 'OptimWrapper',
>>>         'optimizer': {
>>>             'type': 'Adam',
>>>             'lr': 1e-5,
>>>             'betas': (0.9, 0.99)
>>>         }
>>>     },
>>>     'm2': {
>>>         'type': 'OptimWrapper',
>>>         'optimizer': {
>>>             'type': 'Adam',
>>>             'lr': 1e-5,
>>>         }
>>>     }
>>> }
>>> optim_dict_builder = MultiOptimWrapperConstructor(optim_wrapper)
>>> # build optim wrapper dict
>>> optim_wrapper_dict = optim_dict_builder(model)
Example 3: Build a single optimizer for multi modules (e.g., DreamBooth):
>>> # build StableDiffusion model
>>> model = dict(
>>>     type='StableDiffusion',
>>>     unet=dict(type='unet'),
>>>     vae=dict(type='vae'),
>>>     text_encoder=dict(type='text_encoder'))
>>> diffusion_model = MODELS.build(model)
>>> # build constructor
>>> optim_wrapper = dict(
>>>     modules=['unet', 'text_encoder'],
>>>     optimizer=dict(type='Adam', lr=0.0002),
>>>     accumulative_counts=1)
>>> optim_dict_builder = MultiOptimWrapperConstructor(optim_wrapper)
>>> # build optim wrapper dict
>>> optim_wrapper_dict = optim_dict_builder(diffusion_model)
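With the `modules` key in Example 3, parameters from the listed submodules are gathered into a single optimizer. A hedged sketch of that gathering step, using plain placeholder names instead of real torch parameters:

```python
# Placeholder "model": maps submodule names to lists of parameter labels.
# In the real case these would be torch.nn.Parameter objects.
model_params = {
    'unet': ['unet.w1', 'unet.w2'],
    'vae': ['vae.w1'],
    'text_encoder': ['text_encoder.emb'],
}


def gather_params(model_params, modules):
    """Collect parameters of the requested submodules into one flat list,
    as a single shared optimizer would receive them."""
    params = []
    for name in modules:
        params.extend(model_params[name])
    return params


# Only 'unet' and 'text_encoder' are trained; 'vae' is left untouched.
trainable = gather_params(model_params, ['unet', 'text_encoder'])
```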
Parameters
  • optim_wrapper_cfg (dict) – Config of the optimizer wrapper.

  • paramwise_cfg (dict) – Config of parameter-wise settings. Default: None.

__call__(module: torch.nn.Module) → Union[mmengine.optim.OptimWrapperDict, mmengine.optim.OptimWrapper][source]

Build optimizers and return an OptimWrapperDict or OptimWrapper.

class mmagic.engine.optimizers.PGGANOptimWrapperConstructor(optim_wrapper_cfg: dict, paramwise_cfg: Optional[dict] = None)[source]

OptimizerConstructor for PGGAN models. Sets optimizers for each stage of PGGAN. All submodules must be contained in a torch.nn.ModuleList named ‘blocks’, and each submodule is accessed as MODEL.blocks[SCALE], where MODEL is the generator or discriminator and SCALE is the index of the resolution scale.

For more details about the resolution scales and naming rules, please refer to PGGANGenerator and PGGANDiscriminator.

Example

>>> # build PGGAN model
>>> model = dict(
>>>     type='ProgressiveGrowingGAN',
>>>     data_preprocessor=dict(type='GANDataPreprocessor'),
>>>     noise_size=512,
>>>     generator=dict(type='PGGANGenerator', out_scale=1024,
>>>                    noise_size=512),
>>>     discriminator=dict(type='PGGANDiscriminator', in_scale=1024),
>>>     nkimgs_per_scale={
>>>         '4': 600,
>>>         '8': 1200,
>>>         '16': 1200,
>>>         '32': 1200,
>>>         '64': 1200,
>>>         '128': 1200,
>>>         '256': 1200,
>>>         '512': 1200,
>>>         '1024': 12000,
>>>     },
>>>     transition_kimgs=600,
>>>     ema_config=dict(interval=1))
>>> pggan = MODELS.build(model)
>>> # build constructor
>>> optim_wrapper = dict(
>>>     generator=dict(optimizer=dict(type='Adam', lr=0.001,
>>>                                   betas=(0., 0.99))),
>>>     discriminator=dict(
>>>         optimizer=dict(type='Adam', lr=0.001, betas=(0., 0.99))),
>>>     lr_schedule=dict(
>>>         generator={
>>>             '128': 0.0015,
>>>             '256': 0.002,
>>>             '512': 0.003,
>>>             '1024': 0.003
>>>         },
>>>         discriminator={
>>>             '128': 0.0015,
>>>             '256': 0.002,
>>>             '512': 0.003,
>>>             '1024': 0.003
>>>         }))
>>> optim_wrapper_dict_builder = PGGANOptimWrapperConstructor(
>>>     optim_wrapper)
>>> # build optim wrapper dict
>>> optim_wrapper_dict = optim_wrapper_dict_builder(pggan)
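The `lr_schedule` entry above overrides the base learning rate for blocks at specific resolution scales. A minimal sketch of that per-scale lookup, assuming the fall-back-to-base semantics implied by the config (the exact override rules live in the real constructor):

```python
def lr_for_scale(base_lr, schedule, scale):
    """Return the learning rate for a resolution scale: use the scheduled
    value if one is configured for that scale, else the base lr."""
    return schedule.get(str(scale), base_lr)


# Schedule mirroring the generator lr_schedule in the example above.
schedule = {'128': 0.0015, '256': 0.002, '512': 0.003, '1024': 0.003}

# Scales below 128 keep the base lr of 0.001; larger scales are overridden.
lrs = {scale: lr_for_scale(0.001, schedule, scale)
       for scale in (4, 64, 128, 512)}
```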
Parameters
  • optim_wrapper_cfg (dict) – Config of the optimizer wrapper.

  • paramwise_cfg (Optional[dict]) – Parameter-wise options.

__call__(module: torch.nn.Module) → mmengine.optim.OptimWrapperDict[source]

Build optimizers and return an OptimWrapperDict.

class mmagic.engine.optimizers.SinGANOptimWrapperConstructor(optim_wrapper_cfg: dict, paramwise_cfg: Optional[dict] = None)[source]

OptimizerConstructor for SinGAN models. Sets optimizers for each submodule of SinGAN. All submodules must be contained in a torch.nn.ModuleList named ‘blocks’, and each submodule is accessed as MODEL.blocks[SCALE], where MODEL is the generator or discriminator and SCALE is the index of the resolution scale.

For more details about the resolution scales and naming rules, please refer to SinGANMultiScaleGenerator and SinGANMultiScaleDiscriminator.

Example

>>> # build SinGAN model
>>> model = dict(
>>>     type='SinGAN',
>>>     data_preprocessor=dict(
>>>         type='GANDataPreprocessor',
>>>         non_image_keys=['input_sample']),
>>>     generator=dict(
>>>         type='SinGANMultiScaleGenerator',
>>>         in_channels=3,
>>>         out_channels=3,
>>>         num_scales=2),
>>>     discriminator=dict(
>>>         type='SinGANMultiScaleDiscriminator',
>>>         in_channels=3,
>>>         num_scales=3))
>>> singan = MODELS.build(model)
>>> # build constructor
>>> optim_wrapper = dict(
>>>     generator=dict(optimizer=dict(type='Adam', lr=0.0005,
>>>                                   betas=(0.5, 0.999))),
>>>     discriminator=dict(
>>>         optimizer=dict(type='Adam', lr=0.0005,
>>>                        betas=(0.5, 0.999))))
>>> optim_wrapper_dict_builder = SinGANOptimWrapperConstructor(
>>>     optim_wrapper)
>>> # build optim wrapper dict
>>> optim_wrapper_dict = optim_wrapper_dict_builder(singan)
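Because every submodule lives in MODEL.blocks[SCALE], the constructor effectively builds one optimizer wrapper per scale. A stdlib-only sketch of that per-scale indexing, with placeholder block objects and hypothetical `generator_{i}` keys standing in for the real OptimWrapperDict entries:

```python
num_scales = 3  # matches num_scales in the SinGAN config above

# Placeholder blocks standing in for generator.blocks (a torch ModuleList).
blocks = [f'block_{i}' for i in range(num_scales)]


def build_per_scale(blocks, base_cfg):
    """One optimizer config per resolution scale, keyed by scale index.
    The key naming here is illustrative, not the library's exact scheme."""
    return {f'generator_{i}': dict(base_cfg, params=block)
            for i, block in enumerate(blocks)}


wrappers = build_per_scale(blocks, dict(type='Adam', lr=0.0005))
# Each scale gets its own config pointing at its own block.
```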
Parameters
  • optim_wrapper_cfg (dict) – Config of the optimizer wrapper.

  • paramwise_cfg (Optional[dict]) – Parameter-wise options.

__call__(module: torch.nn.Module) → mmengine.optim.OptimWrapperDict[source]

Build optimizers and return an OptimWrapperDict.
