attention.config¶
Classes¶
- AttentionConfig: Configuration for an attention module. Since we can have many different attention implementations, we allow extra fields, so that the same schema can be used for all attention modules.
- DotProductAttentionConfig: Configuration for the Dot Product attention module.
- TransolverAttentionConfig: Configuration for the Transolver attention module.
- TransolverPlusPlusAttentionConfig: Configuration for the Transolver++ attention module.
- IrregularNatAttentionConfig: Configuration for the Irregular Neighbourhood Attention Transformer (NAT) attention module.
- PerceiverAttentionConfig: Configuration for the Perceiver attention module.
Module Contents¶
- class attention.config.AttentionConfig(/, **data)¶
Bases: pydantic.BaseModel

Configuration for an attention module. Since we can have many different attention implementations, we allow extra fields, so that the same schema can be used for all attention modules.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
Dimensionality of the hidden features.
- init_weights: emmi.types.InitWeightsMode = None¶
Weight initialization strategy.
- class attention.config.DotProductAttentionConfig(/, **data)¶
Bases: AttentionConfig

Configuration for the Dot Product attention module.
- Parameters:
data (Any)
- class attention.config.TransolverAttentionConfig(/, **data)¶
Bases: AttentionConfig

Configuration for the Transolver attention module.
- Parameters:
data (Any)
- class attention.config.TransolverPlusPlusAttentionConfig(/, **data)¶
Bases: TransolverAttentionConfig

Configuration for the Transolver++ attention module.
- Parameters:
data (Any)
- use_overparameterization: bool = None¶
Whether to use overparameterization for the slice projection.
- use_adaptive_temperature: bool = None¶
Whether to use an adaptive temperature for the slice selection.
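Since TransolverPlusPlusAttentionConfig subclasses TransolverAttentionConfig, any code written against the parent schema also accepts the Transolver++ config. A minimal sketch of that inheritance with stand-in pydantic models; the class names and the `False` defaults are assumptions for illustration (this page renders the real defaults as `None`):

```python
from pydantic import BaseModel, ConfigDict


# Stand-in for TransolverAttentionConfig; extra fields allowed as in
# the base AttentionConfig schema.
class TransolverAttentionSketch(BaseModel):
    model_config = ConfigDict(extra="allow")


# Stand-in for TransolverPlusPlusAttentionConfig: adds the two
# boolean switches documented above. Defaults here are assumed.
class TransolverPlusPlusSketch(TransolverAttentionSketch):
    use_overparameterization: bool = False  # overparameterized slice projection
    use_adaptive_temperature: bool = False  # adaptive temperature for slice selection


# A Transolver++ config is also a valid Transolver config (subclass),
# so parent-typed code paths accept it unchanged.
cfg = TransolverPlusPlusSketch(use_adaptive_temperature=True)
print(isinstance(cfg, TransolverAttentionSketch))
print(cfg.use_overparameterization, cfg.use_adaptive_temperature)
```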
- class attention.config.IrregularNatAttentionConfig(/, **data)¶
Bases: AttentionConfig

Configuration for the Irregular Neighbourhood Attention Transformer (NAT) attention module.
- Parameters:
data (Any)
Hidden dimensionality of the relative position bias MLP.
- class attention.config.PerceiverAttentionConfig(/, **data)¶
Bases: AttentionConfig

Configuration for the Perceiver attention module.
- Parameters:
data (Any)
- set_kv_dim()¶