ConvolutionalNeuralNetwork#

class deeplay.components.cnn.cnn.ConvolutionalNeuralNetwork(*args, **kwargs)#

Bases: DeeplayModule

Convolutional Neural Network (CNN) module.

Parameters#

in_channels: int or None

Number of input channels. If None, the input shape is inferred from the first forward pass.

hidden_channels: list[int]

Number of channels in each hidden layer.

out_channels: int

Number of output channels.

out_activation: template-like

Specification for the output activation of the CNN. (Default: nn.Identity)

pool: template-like

Specification for the pooling of the block. Is not applied to the first block. (Default: nn.Identity)

Configurables#

  • in_channels (int): Number of input channels. If None, the input shape is inferred from the first forward pass.

  • hidden_channels (list[int]): Number of channels in each hidden layer.

  • out_channels (int): Number of output channels.

  • blocks (template-like): Specification for the blocks of the CNN. (Default: “layer” >> “activation” >> “normalization” >> “dropout”)

    • pool (template-like): Specification for the pooling of the block. (Default: nn.Identity)

    • layer (template-like): Specification for the layer of the block. (Default: nn.Conv2d)

    • activation (template-like): Specification for the activation of the block. (Default: nn.ReLU)

    • normalization (template-like): Specification for the normalization of the block. (Default: nn.Identity)

    • dropout (template-like): Specification for the dropout of the block. (Default: nn.Identity)

  • out_activation (template-like): Specification for the output activation of the CNN. (Default: nn.Identity)

Constraints#

  • input shape: (batch_size, ch_in, height, width)

  • output shape: (batch_size, ch_out, out_height, out_width)
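
A minimal shape sketch (illustrative sizes; the .build() call is assumed to be required before the first forward pass):

>>> import torch
>>> cnn = ConvolutionalNeuralNetwork(3, [32, 64], 1).build()
>>> x = torch.randn(8, 3, 32, 32)  # (batch_size, ch_in, height, width)
>>> y = cnn(x)                     # (batch_size, ch_out, out_height, out_width)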

Evaluation#

>>> for block in cnn.blocks:
>>>     x = block(x)
>>> return x

Examples#

>>> # Using default values
>>> cnn = ConvolutionalNeuralNetwork(3, [32, 64, 128], 1)
>>> # Customizing output activation
>>> cnn.output_block.activation(nn.Sigmoid)
>>> # Changing the kernel size of the first layer
>>> cnn.input_block.layer.kernel_size(5)

Return Values#

The forward method returns the processed tensor.

Additional Notes#

The Config and Layer classes are used for configuring the blocks of the CNN. For more details refer to [Config Documentation](#) and [Layer Documentation](#).

Attributes Summary

activation

Return the activations of the network.

hidden

Return the hidden layers of the network.

input

Return the input layer of the network.

layer

Return the layers of the network.

normalization

Return the normalizations of the network.

output

Return the last layer of the network.

Methods Summary

configure(*args, **kwargs)

Configures the module with specified arguments.

forward(x)

Define the computation performed at every call.

normalized([normalization, ...])

pooled([layer, before_first])

strided(stride[, apply_to_first])

Attributes Documentation

activation#

Return the activations of the network. Equivalent to .blocks.activation.

hidden#

Return the hidden layers of the network. Equivalent to .blocks[:-1].

input#

Return the input layer of the network. Equivalent to .blocks[0].

layer#

Return the layers of the network. Equivalent to .blocks.layer.

normalization#

Return the normalizations of the network. Equivalent to .blocks.normalization.

output#

Return the last layer of the network. Equivalent to .blocks[-1].
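
These shortcuts can also be used as selection targets for configuration. A sketch, assuming the standard DeeplayModule .configure selection API and torch.nn imported as nn:

>>> cnn = ConvolutionalNeuralNetwork(3, [32, 64, 128], 1)
>>> cnn.input      # first block, same as cnn.blocks[0]
>>> cnn.hidden     # all but the last block, same as cnn.blocks[:-1]
>>> cnn.output     # last block, same as cnn.blocks[-1]
>>> cnn.activation.configure(nn.LeakyReLU)  # configure every activation at once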

Methods Documentation

configure(*args: Any, **kwargs: Any)#

Configures the module with specified arguments.

This method allows dynamic configuration of the module’s properties and behaviors. It can be used to set or modify the attributes and parameters of the module and, if applicable, its child modules. The method intelligently handles both direct attribute configuration and delegation to child modules’ configure methods.

Parameters#

*args: Any

Positional arguments specifying the configuration settings. When the first argument is a string matching a configurable attribute, the method expects either one or two arguments: the attribute name and, optionally, its value. If the attribute is itself a DeeplayModule, subsequent arguments are passed to its configure method.

**kwargs: Any

Keyword arguments for configuration settings. If provided, these are used to update the module’s configuration directly.

Raises#

ValueError

Raised if a configuration key is not recognized as a valid configurable for the module or if the provided arguments do not match the expected pattern for configuration.

Example Usage#

To configure a single attribute: ` module.configure('attribute_name', attribute_value) # or module.configure(attribute_name=attribute_value) `

To configure multiple attributes using keyword arguments: ` module.configure(attribute1=value1, attribute2=value2) `

To configure a child module’s attribute: ` module.configure('child_module_attribute', child_attribute=child_attribute_value) # or module.child_module.configure(child_attribute=child_attribute_value) `
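
Applied to this class, the same pattern might look as follows (a sketch; the attribute names come from the Configurables list above):

>>> cnn = ConvolutionalNeuralNetwork(3, [32, 64, 128], 1)
>>> cnn.configure(out_channels=2)                            # direct attribute configuration
>>> cnn.blocks[0].layer.configure(kernel_size=5, padding=2)  # delegate to a child module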

forward(x)#

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
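
In practice this means calling the module instance rather than forward directly:

>>> y = cnn(x)          # preferred: runs registered hooks
>>> y = cnn.forward(x)  # bypasses hooks; avoid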

normalized(normalization: Layer = Layer[BatchNorm2d](), after_last_layer: bool = True, mode: Literal['append', 'prepend', 'insert'] = 'append', after=None)#
pooled(layer: Layer = Layer[MaxPool2d](kernel_size=2), before_first: bool = False)#
strided(stride: int | tuple[int, ...], apply_to_first: bool = False)#
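
A hedged usage sketch of these helpers, following the default arguments shown in the signatures above:

>>> cnn = ConvolutionalNeuralNetwork(3, [32, 64, 128], 1)
>>> cnn.normalized()  # append a BatchNorm2d to each block
>>> cnn.pooled()      # add MaxPool2d(kernel_size=2); skips the first block unless before_first=True
>>> cnn.strided(2)    # use stride 2 in the conv layers; skips the first unless apply_to_first=True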