Example Models#
PyTorch models used in example scripts.
Submodules#
syllabus.examples.models.minigrid_model module#
- class syllabus.examples.models.minigrid_model.Categorical(num_inputs, num_outputs)#
Bases: Module
Categorical distribution (NN module).
- forward(x)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- class syllabus.examples.models.minigrid_model.FixedCategorical(probs=None, logits=None, validate_args=None)#
Bases: Categorical
Categorical distribution object.
- log_probs(actions)#
- mode()#
Returns the mode of the distribution.
- sample()#
Generates a sample_shape-shaped sample, or a sample_shape-shaped batch of samples if the distribution parameters are batched.
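The value FixedCategorical adds over a plain categorical distribution can be sketched in pure Python (a hypothetical stand-in, not the torch-based implementation): mode() picks the most likely action, and log-prob evaluation scores a chosen action.

```python
import math

def mode(probs):
    """Most likely action: argmax over the probability vector."""
    return max(range(len(probs)), key=lambda i: probs[i])

def log_prob(probs, action):
    """Log-probability of one chosen discrete action."""
    return math.log(probs[action])

probs = [0.1, 0.6, 0.3]
print(mode(probs))         # 1
print(log_prob(probs, 1))  # log(0.6)
```

The real class operates on batched torch tensors, so its log_probs additionally reshapes and sums per-step log-probabilities; the per-action logic is the same.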
- class syllabus.examples.models.minigrid_model.MinigridAgent(obs_shape, num_actions, arch='small', base_kwargs=None)#
Bases: MinigridPolicy
- get_action_and_value(x, action=None, full_log_probs=False)#
- get_value(x)#
- class syllabus.examples.models.minigrid_model.MinigridPolicy(obs_shape, num_actions, arch='small', base_kwargs=None)#
Bases: Module
Actor-critic module.
- act(inputs, deterministic=False)#
- evaluate_actions(inputs, action)#
- forward()#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- get_value(inputs)#
- property is_recurrent#
Whether the network is recurrent.
- syllabus.examples.models.minigrid_model.apply_init_(modules)#
Initialize NN modules
- syllabus.examples.models.minigrid_model.init(module, weight_init, bias_init, gain=1)#
- syllabus.examples.models.minigrid_model.init_(m)#
- syllabus.examples.models.minigrid_model.init_relu_(m)#
- syllabus.examples.models.minigrid_model.init_tanh_(m)#
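The init helpers above follow a common pattern: a generic init(module, weight_init, bias_init, gain) wrapper, which init_, init_relu_, and init_tanh_ specialize with different initializers and gains. A minimal pure-Python sketch of that pattern, using a hypothetical stand-in layer rather than an nn.Module:

```python
class FakeLayer:
    """Hypothetical stand-in for an nn.Module with weight and bias data."""
    def __init__(self):
        self.weight = [0.5, -0.5]
        self.bias = [0.1, 0.2]

def init(module, weight_init, bias_init, gain=1):
    """Apply the weight initializer (scaled by gain) and the bias
    initializer to a module, then return the module."""
    module.weight = weight_init(module.weight, gain)
    module.bias = bias_init(module.bias)
    return module

def scaled(weights, gain):
    # Stand-in for e.g. orthogonal initialization with a gain factor.
    return [w * gain for w in weights]

def zeros(bias):
    # Stand-in for constant-zero bias initialization.
    return [0.0] * len(bias)

layer = init(FakeLayer(), scaled, zeros, gain=2)
print(layer.weight, layer.bias)  # [1.0, -1.0] [0.0, 0.0]
```

In the real code the initializers would be torch.nn.init functions, and the gain typically comes from the activation that follows the layer (e.g. different gains for ReLU and tanh).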
syllabus.examples.models.procgen_model module#
- class syllabus.examples.models.procgen_model.BasicBlock(n_channels, stride=1)#
Bases: Module
Residual network block.
- forward(x)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
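A residual block's defining property is that its output is the input plus a learned transform of it, so the identity path is always available. A toy pure-Python sketch, where the transform is a hypothetical stand-in for the block's convolutions:

```python
def relu(xs):
    """Element-wise ReLU on a flat list of activations."""
    return [max(0.0, x) for x in xs]

def residual_block(x, transform):
    """Residual connection: output = input + transform(relu(input))."""
    return [xi + ti for xi, ti in zip(x, transform(relu(x)))]

# A toy "transform" standing in for the block's convolutional layers.
double = lambda xs: [2.0 * v for v in xs]
print(residual_block([1.0, -1.0], double))  # [3.0, -1.0]
```

The skip connection means a block that learns a near-zero transform behaves like the identity, which is what makes deep residual stacks trainable.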
- class syllabus.examples.models.procgen_model.Categorical(num_inputs, num_outputs)#
Bases: Module
Categorical distribution (NN module).
- forward(x)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- class syllabus.examples.models.procgen_model.Conv2d_tf(*args, **kwargs)#
Bases: Conv2d
Conv2d with TensorFlow-style padding behavior.
- forward(input)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
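TensorFlow's "SAME" padding chooses padding so the output size along each dimension is ceil(input / stride), placing any odd leftover pixel on the right/bottom, whereas PyTorch's Conv2d pads symmetrically. Assuming Conv2d_tf reproduces that rule, the per-dimension arithmetic looks like:

```python
import math

def same_padding(in_size, kernel, stride=1, dilation=1):
    """Total padding TF's 'SAME' mode adds along one dimension, split
    (left, right) with the extra pixel on the right (TF convention)."""
    out_size = math.ceil(in_size / stride)
    effective_kernel = (kernel - 1) * dilation + 1
    total = max((out_size - 1) * stride + effective_kernel - in_size, 0)
    return total // 2, total - total // 2

print(same_padding(5, kernel=3, stride=2))  # (1, 1) -> output size 3
print(same_padding(4, kernel=3, stride=2))  # (0, 1) -> output size 2
```

The asymmetric case (second call) is exactly where a plain PyTorch Conv2d with fixed symmetric padding diverges from TF, which is why a wrapper class like this is needed when porting TF-trained architectures.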
- class syllabus.examples.models.procgen_model.FixedCategorical(probs=None, logits=None, validate_args=None)#
Bases: Categorical
Categorical distribution object.
- log_probs(actions)#
- mode()#
Returns the mode of the distribution.
- sample()#
Generates a sample_shape-shaped sample, or a sample_shape-shaped batch of samples if the distribution parameters are batched.
- class syllabus.examples.models.procgen_model.Flatten(*args, **kwargs)#
Bases: Module
Flattens a tensor.
- forward(x)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
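Flatten layers in actor-critic code conventionally collapse every dimension except the batch dimension (the torch idiom is x.view(x.size(0), -1)). Assuming that convention here, the effect in pure Python:

```python
def flatten(batch):
    """Collapse all trailing dimensions of each sample in a batch,
    keeping the batch dimension intact."""
    def flat(x):
        if isinstance(x, list):
            return [v for sub in x for v in flat(sub)]
        return [x]
    return [flat(sample) for sample in batch]

# A batch of two 2x2 "images" becomes a batch of two length-4 vectors.
print(flatten([[[1, 2], [3, 4]], [[5, 6], [7, 8]]]))
# [[1, 2, 3, 4], [5, 6, 7, 8]]
```

This is typically placed between the convolutional trunk and the fully connected head, which expects a flat feature vector per sample.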
- class syllabus.examples.models.procgen_model.MLPBase(num_inputs, recurrent=False, hidden_size=64)#
Bases: NNBase
Multi-layer perceptron.
- forward(inputs)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- class syllabus.examples.models.procgen_model.NNBase(recurrent, recurrent_input_size, hidden_size)#
Bases: Module
Actor-critic network (base class).
- property is_recurrent#
- property output_size#
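Recurrent actor-critic bases conventionally zero the hidden state at episode boundaries by multiplying it with a done-mask before each recurrent step, so episodes do not leak state into one another. Assuming NNBase follows that convention (the function below is a hypothetical sketch, not its actual code), the core step is:

```python
def recurrent_step(hxs, masks, cell):
    """One recurrent step: zero the hidden state wherever masks == 0
    (i.e. at an episode boundary) before feeding it to the cell."""
    reset = [h * m for h, m in zip(hxs, masks)]
    return cell(reset)

# Toy "cell" standing in for a GRU: it just increments each hidden unit.
cell = lambda hs: [h + 1.0 for h in hs]
print(recurrent_step([5.0, 5.0], [1.0, 0.0], cell))  # [6.0, 1.0]
```

The second environment's mask is 0, so its hidden state is reset before the step while the first environment's state carries over.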
- class syllabus.examples.models.procgen_model.Policy(obs_shape, num_actions, arch='small', base_kwargs=None)#
Bases: Module
Actor-critic module.
- act(inputs, deterministic=False)#
- evaluate_actions(inputs, rnn_hxs, masks, action)#
- forward(inputs)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- get_value(inputs)#
- property is_recurrent#
Whether the network is recurrent.
- class syllabus.examples.models.procgen_model.ProcgenAgent(obs_shape, num_actions, arch='small', base_kwargs=None)#
Bases: Policy
- get_action_and_value(x, action=None, full_log_probs=False, deterministic=False)#
- get_value(x)#
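The deterministic flag on get_action_and_value conventionally switches between the distribution's mode and a random sample: greedy actions for evaluation, sampled actions for exploration during training. A pure-Python sketch of that selection logic (a hypothetical helper, not the agent's actual code):

```python
import random

def select_action(probs, deterministic=False, rng=None):
    """Greedy action (the mode) when deterministic; otherwise a
    categorical sample weighted by the action probabilities."""
    if deterministic:
        return max(range(len(probs)), key=lambda i: probs[i])
    rng = rng or random.Random(0)
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

print(select_action([0.1, 0.7, 0.2], deterministic=True))  # 1
```

In the real agent, the same distribution also supplies the log-probability and entropy needed for the policy-gradient loss.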
- class syllabus.examples.models.procgen_model.ResNetBase(num_inputs, recurrent=False, hidden_size=256, channels=[16, 32, 32])#
Bases: NNBase
Residual network.
- forward(inputs)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- class syllabus.examples.models.procgen_model.SmallNetBase(num_inputs, recurrent=False, hidden_size=256)#
Bases: NNBase
Small convolutional network.
- forward(inputs)#
Defines the computation performed at every call. Should be overridden by all subclasses.
Note: Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- syllabus.examples.models.procgen_model.apply_init_(modules)#
Initialize NN modules
- syllabus.examples.models.procgen_model.init(module, weight_init, bias_init, gain=1)#
- syllabus.examples.models.procgen_model.init_(m)#
- syllabus.examples.models.procgen_model.init_relu_(m)#
- syllabus.examples.models.procgen_model.init_tanh_(m)#