pylissom.nn.modules

Extends torch.nn with LISSOM layers, split into the simpler linear module and the higher-level lissom module

pylissom.nn.modules.register_recursive_forward_hook(module, hook)[source]

Recursively adds a forward hook to module and all of its submodules

pylissom.nn.modules.named_apply(mod, fn, prefix)[source]

Like torch.nn.Module.apply() but with named children

pylissom.nn.modules.input_output_hook(module, input, output)[source]
pylissom.nn.modules.register_recursive_input_output_hook(module)[source]

Adds hooks to module so that the input and output of each forward pass are saved in memory
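The recursive-hook pattern the two functions above implement can be sketched in plain Python. This is an illustration of the idea only, using a toy `Module` class instead of `torch.nn.Module`; the real pylissom functions operate on torch modules.

```python
class Module:
    """Toy stand-in for torch.nn.Module with children and forward hooks."""
    def __init__(self, name, children=()):
        self.name = name
        self._children = list(children)
        self._forward_hooks = []

    def children(self):
        return iter(self._children)

    def register_forward_hook(self, hook):
        self._forward_hooks.append(hook)

    def __call__(self, x):
        out = x + 1  # dummy computation standing in for forward()
        for hook in self._forward_hooks:
            hook(self, x, out)
        return out


def register_recursive_forward_hook(module, hook):
    """Register `hook` on `module` and, recursively, on all its children."""
    module.register_forward_hook(hook)
    for child in module.children():
        register_recursive_forward_hook(child, hook)


def input_output_hook(module, input, output):
    """Store the last input/output on the module, as the pylissom hook does."""
    module.input = input
    module.output = output


leaf = Module("leaf")
root = Module("root", children=[leaf])
register_recursive_forward_hook(root, input_output_hook)
root(1)  # triggers the hook on root
leaf(2)  # triggers the hook on leaf
```

After the two calls, `root.input`/`root.output` and `leaf.input`/`leaf.output` hold the tensors (here, numbers) of the last forward pass, which is what makes later inspection of intermediate activations possible.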

Submodules

pylissom.nn.modules.linear module

class pylissom.nn.modules.linear.GaussianLinear(in_features, out_features, sigma=1.0)[source]

Bases: torch.nn.Linear

Applies a linear transformation to the incoming data: \(y = Ax + b\)

where A is a Gaussian matrix

Parameters:
  • sigma – standard deviation of the Gaussian weights
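A minimal sketch of what a Gaussian weight matrix looks like, assuming each output unit's weights decay as a Gaussian of the distance between an input index and the unit's (rescaled) centre. This is a 1D illustration of the idea, not pylissom's exact initialisation, which works on 2D maps.

```python
import math

def gaussian_weights(in_features, out_features, sigma=1.0):
    """Build a weight matrix whose rows are Gaussian profiles:
    row j peaks at the input position mapped to output unit j."""
    weights = []
    for j in range(out_features):
        centre = j * (in_features - 1) / max(out_features - 1, 1)
        row = [math.exp(-((i - centre) ** 2) / (2 * sigma ** 2))
               for i in range(in_features)]
        weights.append(row)
    return weights

W = gaussian_weights(in_features=5, out_features=3, sigma=1.0)
# Each row peaks (value 1.0) at its unit's centre and decays with distance.
```

The `sigma` parameter controls how quickly the weights fall off around each unit's centre.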
class pylissom.nn.modules.linear.GaussianCloudLinear(in_features, out_features, sigma=1.0)[source]

Bases: pylissom.nn.modules.linear.GaussianLinear

Applies a linear transformation to the incoming data: \(y = Ax + b\)

where A is a Gaussian matrix multiplied with Gaussian Noise

Parameters:
  • sigma – standard deviation of the Gaussian weights
class pylissom.nn.modules.linear.PiecewiseSigmoid(min_theta=0.0, max_theta=1.0)[source]

Bases: torch.nn.Module

Applies a piecewise approximation of the sigmoid function \(f(x) = 1 / ( 1 + exp(-x))\)

The approximation is linear between the two thresholds and saturates outside them:

\[\begin{equation*} f(x) = \begin{cases} 0 & x \le \theta_{min} \\ (x - \theta_{min}) / (\theta_{max} - \theta_{min}) & \theta_{min} < x < \theta_{max} \\ 1 & x \ge \theta_{max} \end{cases} \end{equation*}\]

Parameters:
  • min_theta – lower threshold \(\theta_{min}\), below which the output is 0
  • max_theta – upper threshold \(\theta_{max}\), above which the output is 1

forward(input)[source]
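The piecewise approximation can be sketched as a scalar function. This is an illustration of the standard LISSOM activation; the actual `PiecewiseSigmoid` module operates elementwise on tensors.

```python
def piecewise_sigmoid(x, min_theta=0.0, max_theta=1.0):
    """Piecewise-linear approximation of the sigmoid:
    0 below min_theta, 1 above max_theta, linear in between."""
    if x <= min_theta:
        return 0.0
    if x >= max_theta:
        return 1.0
    return (x - min_theta) / (max_theta - min_theta)

print(piecewise_sigmoid(-0.5), piecewise_sigmoid(0.5), piecewise_sigmoid(2.0))
# -> 0.0 0.5 1.0
```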
class pylissom.nn.modules.linear.UnnormalizedDifferenceOfGaussiansLinear(in_features, out_features, on, sigma_surround, sigma_center=1.0)[source]

Bases: torch.nn.Linear

NOT USED; kept only as an example for the notebooks

pylissom.nn.modules.lissom module

class pylissom.nn.modules.lissom.Cortex(in_features, out_features, radius, sigma=1.0)[source]

Bases: pylissom.nn.modules.linear.GaussianCloudLinear

Applies a linear transformation to the incoming data: \(y = Ax + b\)

where A is a Gaussian Cloud with a connective radius

This module is primarily used to build a :py:class:`ReducedLissom`

Parameters:
  • radius – connective radius of each unit
  • sigma – standard deviation of the Gaussian weights
class pylissom.nn.modules.lissom.DifferenceOfGaussiansLinear(in_features, out_features, on, radius, sigma_surround, sigma_center=1.0)[source]

Bases: torch.nn.Linear

Applies a linear transformation to the incoming data: \(y = Ax + b\), where A is a Difference of Gaussians with a connective radius:

\[\begin{equation*} \text{out}_{ab} = \sum_{xy} \text{input}_{xy}\, L_{xy,ab} \end{equation*}\]

where \(L_{xy,ab}\) is the difference-of-Gaussians kernel (the strength factor and sigmoid are applied by the separate modules composed in LGN).
Parameters:
  • on – defines whether the subtraction goes surround Gaussian minus center Gaussian or the other way around
  • radius – connective radius of the kernel
  • sigma_surround – standard deviation of the surround Gaussian
  • sigma_center – standard deviation of the center Gaussian
class pylissom.nn.modules.lissom.Mul(number)[source]

Bases: torch.nn.Module

Represents a layer that only multiplies the input by a constant; used in pylissom.nn.modules.lissom.LGN

forward(input)[source]
class pylissom.nn.modules.lissom.LGN(in_features, out_features, on, radius, sigma_surround, sigma_center=1.0, min_theta=0.0, max_theta=1.0, strength=1.0, diff_of_gauss_cls=<class 'pylissom.nn.modules.lissom.DifferenceOfGaussiansLinear'>, pw_sigmoid_cls=<class 'pylissom.nn.modules.linear.PiecewiseSigmoid'>)[source]

Bases: torch.nn.Sequential

Represents an LGN channel, which can be either ON or OFF

The transformation applied can be described as:

\[\begin{equation*} \text{out}_{ab} = \sigma\left(\phi_L \sum_{xy} \text{input}_{xy}\, L_{xy,ab}\right) \end{equation*}\]

where \(\sigma\) is the piecewise sigmoid, \(\phi_L\) is the strength of the channel and \(L_{xy,ab}\) is the difference-of-Gaussians kernel

It inherits from Sequential because an LGN is in essence a composition of several transformations

Variables:
  • afferent_module
Parameters:
  • on – whether the channel is ON or OFF
  • radius – connective radius of the difference-of-Gaussians kernel
  • sigma_surround – standard deviation of the surround Gaussian
  • sigma_center – standard deviation of the center Gaussian
  • strength – multiplicative strength \(\phi_L\) applied before the sigmoid
  • min_theta – lower threshold of the piecewise sigmoid
  • max_theta – upper threshold of the piecewise sigmoid
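Because an LGN channel is a composition, it can be sketched as a pipeline applied to the output of the difference-of-Gaussians stage: multiply by the strength, then squash with the piecewise sigmoid. This scalar version is an illustration of why LGN inherits from `torch.nn.Sequential`, not the real tensor implementation.

```python
def piecewise_sigmoid(x, min_theta=0.0, max_theta=1.0):
    """Piecewise-linear sigmoid: 0 below, 1 above, linear in between."""
    if x <= min_theta:
        return 0.0
    if x >= max_theta:
        return 1.0
    return (x - min_theta) / (max_theta - min_theta)

def lgn_channel(weighted_sum, strength=1.0, min_theta=0.0, max_theta=1.0):
    """Apply the Mul(strength) and PiecewiseSigmoid stages to the output of
    the difference-of-Gaussians stage (here a precomputed scalar)."""
    return piecewise_sigmoid(strength * weighted_sum, min_theta, max_theta)

print(lgn_channel(0.25, strength=2.0))  # -> 0.5
```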
class pylissom.nn.modules.lissom.ReducedLissom(afferent_module, excitatory_module, inhibitory_module, min_theta=1.0, max_theta=1.0, settling_steps=10, afferent_strength=1.0, excitatory_strength=1.0, inhibitory_strength=1.0, pw_sigmoid_cls=<class 'pylissom.nn.modules.linear.PiecewiseSigmoid'>)[source]

Bases: torch.nn.Module

Represents a Reduced Lissom consisting of afferent, excitatory and inhibitory modules

The transformation applied can be described as:

\[\begin{equation*} n_{ij} = \sigma\left(s_{ij} + \phi_E \sum_{kl} n_{kl}(t-1)\, E_{kl,ij} - \phi_I \sum_{kl} n_{kl}(t-1)\, I_{kl,ij}\right) \end{equation*}\]

where \(\sigma\) is the piecewise sigmoid, \(s_{ij}\) is the afferent activation, \(n_{kl}(t-1)\) is the activity at the previous settling step, and \(E\) and \(I\) are the excitatory and inhibitory lateral weights

Variables:
  • afferent_module
Parameters:
  • afferent_module – module producing the afferent activation \(s_{ij}\)
  • excitatory_module – module producing the lateral excitatory contribution
  • inhibitory_module – module producing the lateral inhibitory contribution
  • afferent_strength – multiplicative strength of the afferent activation
  • excitatory_strength – excitatory strength \(\phi_E\)
  • inhibitory_strength – inhibitory strength \(\phi_I\)
  • min_theta – lower threshold of the piecewise sigmoid
  • max_theta – upper threshold of the piecewise sigmoid
  • settling_steps – number of recurrent settling iterations
forward(cortex_input)[source]
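The settling dynamics can be sketched for a single scalar activity: the afferent response \(s\) stays fixed while the lateral excitation and inhibition are re-applied for `settling_steps` iterations. This toy version (with hypothetical `excite`/`inhibit` callables) only illustrates the loop; pylissom iterates whole activation maps.

```python
def piecewise_sigmoid(x, min_theta=0.0, max_theta=1.0):
    """Piecewise-linear sigmoid: 0 below, 1 above, linear in between."""
    if x <= min_theta:
        return 0.0
    if x >= max_theta:
        return 1.0
    return (x - min_theta) / (max_theta - min_theta)

def settle(s, excite, inhibit, phi_e=1.0, phi_i=1.0, settling_steps=10):
    """Iterate n = sigma(s + phi_E * E(n) - phi_I * I(n)) for a fixed
    afferent response s. `excite`/`inhibit` map the previous activity to
    its lateral contribution."""
    n = piecewise_sigmoid(s)
    for _ in range(settling_steps):
        n = piecewise_sigmoid(s + phi_e * excite(n) - phi_i * inhibit(n))
    return n

# With inhibition stronger than excitation, the activity settles below s,
# at the fixed point of n = s - 0.2 * n, i.e. s / 1.2.
out = settle(0.8, excite=lambda n: 0.1 * n, inhibit=lambda n: 0.3 * n)
```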
class pylissom.nn.modules.lissom.Lissom(on, off, v1)[source]

Bases: torch.nn.Module

Represents a full LISSOM network, with ON/OFF LGN channels and a V1 (a ReducedLissom)

The transformation applied can be described as:

\[\begin{equation*} \text{out} = \text{v1}(\text{on}(\text{input}) + \text{off}(\text{input})) \end{equation*}\]

Parameters:
  • on – an ON LGN map
  • off – an OFF LGN map
  • v1 – a ReducedLissom map

Shape:
  • TODO
forward(input)[source]
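The forward pass above reduces to feeding the sum of the two LGN channel activations into V1. A minimal sketch with hypothetical stand-in callables for the three modules:

```python
def lissom_forward(on, off, v1, x):
    """Sketch of Lissom.forward: out = v1(on(x) + off(x))."""
    return v1(on(x) + off(x))

# Hypothetical stand-ins for the three modules, scalar for illustration:
on = lambda x: max(x, 0.0)    # ON channel responds to bright input
off = lambda x: max(-x, 0.0)  # OFF channel responds to dark input
v1 = lambda a: min(a, 1.0)    # V1 stand-in just saturates at 1

print(lissom_forward(on, off, v1, 0.4))   # -> 0.4
print(lissom_forward(on, off, v1, -2.0))  # -> 1.0
```

In the real module, `on` and `off` are LGN instances and `v1` is a ReducedLissom, so the sum is an elementwise sum of activation maps.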