emmi.modules.layers.continuous_sincos_embed¶
Classes¶
ContinuousSincosEmbed: Embedding layer for continuous coordinates using sine and cosine functions.
Module Contents¶
- class emmi.modules.layers.continuous_sincos_embed.ContinuousSincosEmbed(config)¶
Bases: torch.nn.Module

Embedding layer for continuous coordinates using sine and cosine functions. The original implementation from the Attention Is All You Need paper deals with discrete 1D coordinates (i.e., a sequence). However, this implementation can also handle 2D and 3D coordinate systems.
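The idea can be illustrated with a minimal, hypothetical sketch (not the library's actual implementation): each coordinate value is multiplied by a bank of geometrically spaced frequencies, and the sine and cosine of the resulting angles are concatenated. For multi-dimensional inputs, the same scheme would be applied per coordinate axis and the per-axis embeddings concatenated along the feature dimension.

```python
import torch

def sincos_embed_1d(coords, dim, max_wavelength=10000.0):
    # Sketch only: standard sinusoidal embedding, but evaluated at arbitrary
    # (continuous) coordinate values rather than integer sequence positions.
    # `coords` has shape [..., 1]; the result has shape [..., dim] (dim even).
    half_dim = dim // 2
    freqs = torch.arange(half_dim, dtype=torch.float32)
    inv_wavelength = 1.0 / (max_wavelength ** (freqs / half_dim))  # frequency bands
    angles = coords.float() * inv_wavelength                       # [..., half_dim]
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
```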
Initialize the ContinuousSincosEmbed layer.
- Parameters:
hidden_dim – Dimensionality of the embedded input coordinates.
input_dim – Number of dimensions of the input domain.
max_wavelength – Maximum wavelength of the sinusoidal frequencies. Defaults to 10000.
assert_positive – If true, assert that all input coordinates are positive. Defaults to True.
config (emmi.schemas.modules.layers.ContinuousSincosEmbeddingConfig)
- input_dim¶
- ndim_padding¶
- sincos_padding¶
- max_wavelength¶
- padding¶
- assert_positive¶
- forward(coords)¶
Forward method of the ContinuousSincosEmbed layer.
- Parameters:
coords (torch.Tensor) – Tensor of coordinates. The shape of the tensor should be [batch size, number of points, coordinate dimension] or [number of points, coordinate dimension].
- Raises:
NotImplementedError – Only sparse (i.e., [number of points, coordinate dimension]) or dense (i.e., [batch size, number of points, coordinate dimension]) coordinate tensors are supported.
- Returns:
Tensor with embedded coordinates.
- Return type:
torch.Tensor
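A usage sketch, assuming the config exposes the fields documented above (hidden_dim, input_dim, max_wavelength, assert_positive); the exact field names on the config object are an assumption here:

```python
import torch
from emmi.schemas.modules.layers import ContinuousSincosEmbeddingConfig
from emmi.modules.layers.continuous_sincos_embed import ContinuousSincosEmbed

# Assumed config fields, mirroring the documented parameters above.
config = ContinuousSincosEmbeddingConfig(
    hidden_dim=256,
    input_dim=3,
    max_wavelength=10000,
    assert_positive=True,
)
embed = ContinuousSincosEmbed(config)

# Dense coordinates: [batch size, number of points, coordinate dimension]
dense_coords = torch.rand(4, 1024, 3)
dense_emb = embed(dense_coords)    # expected shape: [4, 1024, 256]

# Sparse coordinates: [number of points, coordinate dimension]
sparse_coords = torch.rand(1024, 3)
sparse_emb = embed(sparse_coords)  # expected shape: [1024, 256]
```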