diktya.layers.core

class Subtensor(start, stop, step=1, axis=0, **kwargs)[source]

Bases: keras.engine.topology.Layer

Selects a slice of the input along the given axis.

Parameters:
  • start (int) – Start index of the slice
  • stop (int) – Stop index of the slice
  • step (int) – Step between selected indices (default 1)
  • axis (int) – Axis along which to slice
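
Assuming the slice follows ordinary Python indexing semantics along the chosen axis (an inference from the parameter names, not stated in the docstring), the computation can be sketched in plain NumPy:

```python
import numpy as np

def subtensor(x, start, stop, step=1, axis=0):
    # Sketch of the assumed semantics: start:stop:step slicing
    # along a single axis, identity on all other axes.
    index = [slice(None)] * x.ndim
    index[axis] = slice(start, stop, step)
    return x[tuple(index)]

x = np.arange(12).reshape(3, 4)
front_rows = subtensor(x, 0, 2, axis=0)  # first two rows
mid_cols = subtensor(x, 1, 3, axis=1)    # columns 1 and 2
```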
get_output_shape_for(input_shape)[source]

Computes the output shape of the layer given an input shape (assumes that the layer will be built to match that input shape).

# Arguments
input_shape: shape tuple (tuple of integers), or list of shape tuples (one per output tensor of the layer). Shape tuples can include None for free dimensions, instead of an integer.
call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
get_config()[source]

Returns a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).

class SplitAt(axis=0, **kwargs)[source]

Bases: keras.engine.topology.Layer
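
The class carries no docstring here, so its exact call contract is undocumented. Assuming it splits a tensor into the parts before and after an index along `axis` (function name and argument order below are hypothetical), a NumPy sketch:

```python
import numpy as np

def split_at(x, idx, axis=0):
    # Hypothetical semantics: return the slices before and after `idx`.
    front, back = np.split(x, [idx], axis=axis)
    return front, back

x = np.arange(10).reshape(5, 2)
front, back = split_at(x, 3, axis=0)
```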

get_output_shape_for(input_shapes)[source]

Computes the output shape of the layer given an input shape (assumes that the layer will be built to match that input shape).

# Arguments
input_shape: shape tuple (tuple of integers), or list of shape tuples (one per output tensor of the layer). Shape tuples can include None for free dimensions, instead of an integer.
compute_mask(x, masks=None)[source]

Computes an output masking tensor, given an input tensor (or list thereof) and an input mask (or list thereof).

# Arguments
input: tensor or list of tensors.
input_mask: tensor or list of tensors.
# Returns
None or a tensor (or list of tensors, one per output tensor of the layer).
call(xs, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
get_config()[source]

Returns a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).

class Swap(a, b, **kwargs)[source]

Bases: keras.engine.topology.Layer
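
Swap also lacks a docstring. A plausible reading of Swap(a, b) is that it exchanges the entries at indices a and b along the first axis; that mapping is an assumption, illustrated in NumPy:

```python
import numpy as np

def swap(x, a, b):
    # Assumed semantics: exchange the slices at positions a and b
    # along the first axis, leaving everything else unchanged.
    out = x.copy()
    out[[a, b]] = out[[b, a]]
    return out

x = np.arange(6).reshape(3, 2)
y = swap(x, 0, 2)
```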

call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
get_config()[source]

Returns a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).

class Switch(**kwargs)[source]

Bases: keras.engine.topology.Layer
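
No docstring is given for Switch either. By analogy with theano.tensor.switch, a reasonable guess is an element-wise select between two tensors driven by a condition tensor; that analogy is an assumption, sketched here with NumPy:

```python
import numpy as np

def switch(condition, then_branch, else_branch):
    # Element-wise select, as theano.tensor.switch does:
    # pick from `then_branch` where condition is true, else from
    # `else_branch`.
    return np.where(condition, then_branch, else_branch)

cond = np.array([1, 0, 1])
out = switch(cond, np.array([10, 20, 30]), np.array([-1, -2, -3]))
```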

get_output_shape_for(input_shape)[source]

Computes the output shape of the layer given an input shape (assumes that the layer will be built to match that input shape).

# Arguments
input_shape: shape tuple (tuple of integers), or list of shape tuples (one per output tensor of the layer). Shape tuples can include None for free dimensions, instead of an integer.
call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
class ZeroGradient(**kwargs)[source]

Bases: keras.engine.topology.Layer

Considers the gradient to be always zero. Wraps the theano.gradient.zero_grad function.
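
The effect can be pictured with a toy forward/backward pair: the forward pass is the identity, while the backward pass reports a zero gradient so nothing upstream of the layer receives a training signal. This is a conceptual sketch, not the Theano implementation:

```python
def zero_gradient_forward(x):
    # Forward pass: identity, the value passes through unchanged.
    return x

def zero_gradient_backward(upstream_grad):
    # Backward pass: whatever gradient flows in, zero flows out.
    return 0.0 * upstream_grad
```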

call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
class InBounds(low=-1, high=1, clip=True, weight=15, **kwargs)[source]

Bases: keras.engine.topology.Layer

Between low and high this layer acts as the identity. If a value falls outside the bounds, a regularization loss is added to the model.

Parameters:
  • low – Lower bound
  • high – Upper bound
  • clip – If True, clip the output to [low, high]
  • weight – Factor by which the regularization loss is multiplied
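
A NumPy sketch of the assumed behaviour: identity (optionally clipped) in the forward direction, plus a penalty proportional to how far values stray outside [low, high]. The linear penalty shape is an assumption; the layer's actual loss formula may differ:

```python
import numpy as np

def in_bounds(x, low=-1.0, high=1.0, clip=True, weight=15.0):
    # Penalty grows with the distance outside [low, high];
    # values inside the interval contribute nothing.
    overflow = np.maximum(x - high, 0.0) + np.maximum(low - x, 0.0)
    loss = weight * overflow.mean()
    out = np.clip(x, low, high) if clip else x
    return out, loss

out, loss = in_bounds(np.array([-2.0, 0.0, 0.5, 3.0]))
```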
build(input_shape)[source]

Creates the layer weights. Must be implemented on all layers that have weights.

# Arguments
input_shape: Keras tensor (future input to layer)
or list/tuple of Keras tensors to reference for weight shape computations.
compute_loss(input, output, input_mask=None, output_mask=None)[source]
call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
get_config()[source]

Returns a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).

class BatchLoss(axis=1, normalize=True, l1=0.0, l2=0.0, **kwargs)[source]

Bases: keras.engine.topology.Layer

Regularizes the activations to have std = 1 and mean = 0.

Parameters:
  • axis (int) – Axis along which to compute the std and mean.
  • normalize (bool) – Normalize the output by the std and mean of the batch during training.
  • l1 (float) – Weight of the L1 regularization loss.
  • l2 (float) – Weight of the L2 regularization loss.
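
The normalization half of the layer is straightforward to sketch in NumPy: shift and scale the batch so that, along the chosen axis, the activations have mean 0 and std 1. The regularization term itself is omitted, and the `eps` guard against division by zero is an added assumption:

```python
import numpy as np

def batch_normalize(x, axis=1, eps=1e-8):
    # Shift/scale so activations have mean 0 and std 1 along `axis`;
    # `eps` keeps the division finite for constant activations.
    mean = x.mean(axis=axis, keepdims=True)
    std = x.std(axis=axis, keepdims=True)
    return (x - mean) / (std + eps)

x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
y = batch_normalize(x, axis=1)
```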
compute_loss(input, output, input_mask=None, output_mask=None)[source]
call(x, mask=None)[source]

This is where the layer’s logic lives.

# Arguments
x: input tensor, or list/tuple of input tensors.
mask: a masking tensor (or list of tensors). Used mainly in RNNs.
# Returns
A tensor or list/tuple of tensors.
get_config()[source]

Returns a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).