attention.config
================

.. py:module:: attention.config


Classes
-------

.. autoapisummary::

   attention.config.AttentionConfig
   attention.config.DotProductAttentionConfig
   attention.config.TransolverAttentionConfig
   attention.config.TransolverPlusPlusAttentionConfig
   attention.config.IrregularNatAttentionConfig
   attention.config.PerceiverAttentionConfig


Module Contents
---------------

.. py:class:: AttentionConfig(/, **data)

   Bases: :py:obj:`pydantic.BaseModel`

   Configuration for an attention module.

   Since we can have many different attention implementations, we allow
   extra fields so that we can use the same schema for all attention
   modules.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: 'allow'

   .. py:attribute:: hidden_dim
      :type: int
      :value: None

      Dimensionality of the hidden features.

   .. py:attribute:: num_heads
      :type: int
      :value: None

      Number of attention heads.

   .. py:attribute:: use_rope
      :type: bool
      :value: None

      Whether to use Rotary Positional Embeddings (RoPE).

   .. py:attribute:: dropout
      :type: float
      :value: None

      Dropout rate for the attention weights and output projection.

   .. py:attribute:: init_weights
      :type: emmi.types.InitWeightsMode
      :value: None

      Weight initialization strategy.

   .. py:attribute:: bias
      :type: bool
      :value: None

      Whether to use bias terms in linear layers.

   .. py:attribute:: head_dim
      :type: int | None
      :value: None

      Dimensionality of each attention head.

   .. py:method:: validate_hidden_dim_and_num_heads()


.. py:class:: DotProductAttentionConfig(/, **data)

   Bases: :py:obj:`AttentionConfig`

   Configuration for the Dot Product attention module.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: None


.. py:class:: TransolverAttentionConfig(/, **data)

   Bases: :py:obj:`AttentionConfig`

   Configuration for the Transolver attention module.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:attribute:: num_slices
      :type: int
      :value: None

      Number of slices to project the input tokens to.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: None


.. py:class:: TransolverPlusPlusAttentionConfig(/, **data)

   Bases: :py:obj:`TransolverAttentionConfig`

   Configuration for the Transolver++ attention module.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: None

   .. py:attribute:: use_overparameterization
      :type: bool
      :value: None

      Whether to use overparameterization for the slice projection.

   .. py:attribute:: use_adaptive_temperature
      :type: bool
      :value: None

      Whether to use an adaptive temperature for the slice selection.

   .. py:attribute:: temperature_activation
      :type: Literal['sigmoid', 'softplus', 'exp'] | None
      :value: None

      Activation function for the adaptive temperature.

   .. py:attribute:: use_gumbel_softmax
      :type: bool
      :value: None

      Whether to use Gumbel-Softmax for the slice selection.


.. py:class:: IrregularNatAttentionConfig(/, **data)

   Bases: :py:obj:`AttentionConfig`

   Configuration for the Irregular Neighbourhood Attention Transformer
   (NAT) attention module.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: None

   .. py:attribute:: input_dim
      :type: int
      :value: None

      Dimensionality of the input features.

   .. py:attribute:: radius
      :type: float
      :value: None

      Radius for the radius graph.

   .. py:attribute:: max_degree
      :type: int
      :value: None

      Maximum number of neighbors per point.

   .. py:attribute:: relpos_mlp_hidden_dim
      :type: int
      :value: None

      Hidden dimensionality of the relative position bias MLP.

   .. py:attribute:: relpos_mlp_dropout
      :type: float
      :value: None

      Dropout rate for the relative position bias MLP.


.. py:class:: PerceiverAttentionConfig(/, **data)

   Bases: :py:obj:`AttentionConfig`

   Configuration for the Perceiver attention module.

   Create a new model by parsing and validating input data from keyword
   arguments.

   Raises [`ValidationError`][pydantic_core.ValidationError] if the input
   data cannot be validated to form a valid model.

   `self` is explicitly positional-only to allow `self` as a field name.

   .. py:class:: Config

      .. py:attribute:: extra
         :value: None

   .. py:attribute:: kv_dim
      :type: int | None
      :value: None

      Dimensionality of the key/value features. If None, use hidden_dim.

   .. py:method:: set_kv_dim()