eAdditivePosMultiheadAttention
class eAdditivePosMultiheadAttention(in_rep, num_heads, *, max_len, dropout=0.0, bias=True, device=None, dtype=None, init_scheme='xavier_normal')
Bases: eModule, PositionalAttentionBase

Equivariant additive positional attention with invariant query/key updates.
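A minimal instantiation sketch based only on the constructor signature documented above. The import path, the representation object `rep`, and the forward-call shape are assumptions for illustration, not part of this page:

```
# Hypothetical usage sketch; import path, `rep`, and the forward
# signature are assumptions, not confirmed by this reference page.
# from <your_library> import eAdditivePosMultiheadAttention

attn = eAdditivePosMultiheadAttention(
    in_rep=rep,        # input group representation (library-specific object)
    num_heads=4,       # number of attention heads
    max_len=128,       # maximum sequence length for the positional terms
    dropout=0.1,       # attention dropout probability
)
# out = attn(x)  # presumably maps a sequence of features equivariantly
```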