attention.anchor_attention.config

Classes

MultiBranchAnchorAttentionConfig

Configuration for Multi-Branch Anchor Attention module.

CrossAchorAttentionConfig

Configuration for Cross Anchor Attention module.

JointAnchorAttentionConfig

Configuration for Joint Anchor Attention module.

TokenSpec

Specification for a token type in the attention mechanism.

AttentionPattern

Defines which tokens attend to which other tokens.

MixedAttentionConfig

Configuration for Mixed Attention module.

Module Contents

class attention.anchor_attention.config.MultiBranchAnchorAttentionConfig(/, **data)

Bases: emmi.schemas.modules.attention.AttentionConfig

Configuration for Multi-Branch Anchor Attention module.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

branches: list[str] = None
anchor_suffix: str = None
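
A minimal construction sketch; the branch names and suffix below are illustrative assumptions, and any required fields inherited from AttentionConfig are omitted:

    from attention.anchor_attention.config import MultiBranchAnchorAttentionConfig

    # Illustrative values; real branch names and the anchor suffix depend on the model setup.
    config = MultiBranchAnchorAttentionConfig(
        branches=["surface", "volume"],
        anchor_suffix="_anchors",
    )
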
class attention.anchor_attention.config.CrossAchorAttentionConfig(/, **data)

Bases: MultiBranchAnchorAttentionConfig

Configuration for Cross Anchor Attention module.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

class attention.anchor_attention.config.JointAnchorAttentionConfig(/, **data)

Bases: MultiBranchAnchorAttentionConfig

Configuration for Joint Anchor Attention module.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)
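
Both subclasses reuse the fields inherited from MultiBranchAnchorAttentionConfig; a sketch with illustrative values (required base-class fields, if any, are omitted):

    from attention.anchor_attention.config import (
        CrossAchorAttentionConfig,
        JointAnchorAttentionConfig,
    )

    # Same illustrative field values as the base-class example above.
    cross = CrossAchorAttentionConfig(branches=["surface", "volume"], anchor_suffix="_anchors")
    joint = JointAnchorAttentionConfig(branches=["surface", "volume"], anchor_suffix="_anchors")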

class attention.anchor_attention.config.TokenSpec(/, **data)

Bases: pydantic.BaseModel

Specification for a token type in the attention mechanism.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

name: Literal['surface_anchors', 'volume_anchors', 'surface_queries', 'volume_queries']
size: int = None

classmethod from_dict(token_dict)

Create TokenSpec from dictionary with single key-value pair.

Parameters:

token_dict (dict[str, int])

Return type:

TokenSpec

to_dict()

Convert TokenSpec to dictionary.

Return type:

dict[str, int]
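
A sketch of the dict round trip, assuming the single key maps to name and the value to size:

    from attention.anchor_attention.config import TokenSpec

    # Key must be one of the allowed Literal token names; 128 is an illustrative size.
    spec = TokenSpec.from_dict({"surface_anchors": 128})
    assert spec.name == "surface_anchors"
    assert spec.size == 128
    assert spec.to_dict() == {"surface_anchors": 128}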

class attention.anchor_attention.config.AttentionPattern(/, **data)

Bases: pydantic.BaseModel

Defines which tokens attend to which other tokens.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

query_tokens: collections.abc.Sequence[str]
key_value_tokens: collections.abc.Sequence[str]
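
An illustrative pattern in which query tokens attend to anchor tokens; the token names reuse the Literal values from TokenSpec and are chosen here only for the example:

    from attention.anchor_attention.config import AttentionPattern

    # Queries read from anchors only in this sketch.
    pattern = AttentionPattern(
        query_tokens=["surface_queries", "volume_queries"],
        key_value_tokens=["surface_anchors", "volume_anchors"],
    )
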
class attention.anchor_attention.config.MixedAttentionConfig(/, **data)

Bases: emmi.schemas.modules.attention.DotProductAttentionConfig

Configuration for Mixed Attention module.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)
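
MixedAttentionConfig declares no fields of its own in this listing; a construction sketch, assuming any required DotProductAttentionConfig fields carry defaults:

    from attention.anchor_attention.config import MixedAttentionConfig

    # Inherited DotProductAttentionConfig fields are left at their defaults here.
    config = MixedAttentionConfig()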