linalg

Linear algebra utilities for symmetric vector spaces with known group representations.

symm_learning.linalg.invariant_orthogonal_projector(rep_x: Representation) → Tensor

Computes the orthogonal projection to the invariant subspace.

The input representation \(\rho_{\mathcal{X}}: \mathbb{G} \to \mathbb{GL}(\mathcal{X})\) is transformed to the spectral basis given by:

\[\rho_\mathcal{X} = \mathbf{Q} \left( \bigoplus_{i\in[1,n]} \hat{\rho}_i \right) \mathbf{Q}^T\]

where \(\hat{\rho}_i\) denotes an instance of one of the irreducible representations of the group, and \(\mathbf{Q}: \mathcal{X} \mapsto \mathcal{X}\) is the orthogonal change of basis from the spectral basis to the original basis.

The projection is performed by:
  1. Changing the basis to the representation spectral basis (exposing signals per irrep).

  2. Zeroing out all signals on irreps that are not trivial.

  3. Mapping back to the original basis set.

Parameters:
  • rep_x (escnn.group.Representation) – The representation for which the orthogonal projection to the invariant subspace is computed.

Returns:

The orthogonal projection matrix to the invariant subspace, \(\mathbf{Q} \mathbf{S} \mathbf{Q}^T\).

Return type:

torch.Tensor
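For orthogonal representations, the projector described above coincides with the group average \(\frac{1}{|\mathbb{G}|}\sum_{g} \rho_{\mathcal{X}}(g)\). The following is a minimal self-contained sketch of that equivalence in plain torch, building the regular representation of the cyclic group \(C_3\) by hand rather than through escnn:

```python
import torch

# Sketch: for an orthogonal representation, the orthogonal projector onto the
# invariant subspace equals the group average P = (1/|G|) * sum_g rho(g).
# Here the regular representation of C_3 is built by hand as the three cyclic
# permutation matrices (a stand-in for an escnn Representation).
n = 3  # order of C_3
reps = [torch.roll(torch.eye(n), shifts=k, dims=0) for k in range(n)]

P = sum(reps) / n  # orthogonal projector onto the invariant subspace

# P is idempotent and symmetric ...
assert torch.allclose(P @ P, P, atol=1e-6)
assert torch.allclose(P, P.T, atol=1e-6)
# ... and absorbs every group element, so projected signals are G-invariant.
for rho_g in reps:
    assert torch.allclose(rho_g @ P, P, atol=1e-6)
```

For the regular representation the invariant subspace is spanned by the all-ones vector, so \(P\) is the constant matrix with entries \(1/3\).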

symm_learning.linalg.isotypic_signal2irreducible_subspaces(x: Tensor, rep_x: Representation)

Given a random variable in an isotypic subspace, flatten the r.v. into G-irreducible subspaces.

The input is a signal of shape \((n, m_x \cdot d)\), where \(n\) is the number of samples, \(m_x\) is the multiplicity of the irrep in \(X\), and \(d\) is the dimension of the irrep:

\(X = [x_1, \ldots, x_n]\) and \(x_i = [x_{i_{11}}, \ldots, x_{i_{1d}}, x_{i_{21}}, \ldots, x_{i_{2d}}, \ldots, x_{i_{m_x1}}, \ldots, x_{i_{m_xd}}]\)

This function returns the signal \(Z\) of shape \((n \cdot d, m_x)\) where each column represents the flattened signal of a G-irreducible subspace.

\(Z[:, k] = [x_{1_{k1}}, \ldots, x_{1_{kd}}, x_{2_{k1}}, \ldots, x_{2_{kd}}, \ldots, x_{n_{k1}}, \ldots, x_{n_{kd}}]\)

Parameters:
  • x (Tensor) – Shape \((..., n, m_x \cdot d)\) where \(n\) is the number of samples and \(m_x\) the multiplicity of the irrep in \(X\).

  • rep_x (escnn.group.Representation) – Representation in the isotypic basis of a single type of irrep.

Return type:

Tensor

Shape:

\((n \cdot d, m_x)\), where each column represents the flattened signal of an irreducible subspace.
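The flattening above reduces to a reshape and transpose. A hypothetical stand-in for the library function (taking the irrep dimension \(d\) directly instead of reading it off `rep_x`, and assuming the input is already in the isotypic basis of a single irrep type) can be sketched as:

```python
import torch

def signal_to_irreducible_subspaces(x: torch.Tensor, irrep_dim: int) -> torch.Tensor:
    """Sketch of the flattening (n, m_x * d) -> (n * d, m_x).

    `irrep_dim` (d) is a stand-in for the value the library function reads
    off `rep_x`; x is assumed to be in the isotypic basis of a single irrep.
    """
    n, md = x.shape
    m_x = md // irrep_dim
    # (n, m_x, d): split each sample into its m_x irreducible subspaces ...
    z = x.reshape(n, m_x, irrep_dim)
    # ... then stack the d-dimensional chunk of each subspace along the rows.
    return z.permute(0, 2, 1).reshape(n * irrep_dim, m_x)

# Example: n=2 samples, m_x=2 copies of a d=2 irrep.
x = torch.arange(8.0).reshape(2, 4)  # rows: [0, 1, 2, 3] and [4, 5, 6, 7]
Z = signal_to_irreducible_subspaces(x, irrep_dim=2)
# Column k collects [x_{1,k1}, x_{1,k2}, x_{2,k1}, x_{2,k2}]:
# Z = [[0, 2], [1, 3], [4, 6], [5, 7]]
```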

symm_learning.linalg.lstsq(x: Tensor, y: Tensor, rep_x: Representation, rep_y: Representation)

Computes a solution to the least squares problem of a system of linear equations with equivariance constraints.

The \(\mathbb{G}\)-equivariant least squares problem to the linear system of equations \(\mathbf{Y} = \mathbf{A}\,\mathbf{X}\), is defined as:

\[\begin{split}\begin{align} \min_{\mathbf{A}} \;&\| \mathbf{Y} - \mathbf{A}\,\mathbf{X} \|_F \\ & \text{s.t.} \quad \rho_{\mathcal{Y}}(g)\,\mathbf{A} = \mathbf{A}\,\rho_{\mathcal{X}}(g) \quad \forall\, g \in \mathbb{G}, \end{align}\end{split}\]

where \(\rho_{\mathcal{X}}\) and \(\rho_{\mathcal{Y}}\) denote the group representations under which \(\mathbf{X}\) and \(\mathbf{Y}\) transform.

Parameters:
  • x (Tensor) – Realizations of the random variable \(\mathbf{X}\) with shape \((N, D_x)\), where \(N\) is the number of samples.

  • y (Tensor) – Realizations of the random variable \(\mathbf{Y}\) with shape \((N, D_y)\).

  • rep_x (Representation) – The finite-group representation under which \(\mathbf{X}\) transforms.

  • rep_y (Representation) – The finite-group representation under which \(\mathbf{Y}\) transforms.

Returns:

A \((D_y \times D_x)\) matrix \(\mathbf{A}\) satisfying the G-equivariance constraint and minimizing \(\|\mathbf{Y} - \mathbf{A}\,\mathbf{X}\|^2\).

Return type:

Tensor

Shape:
  • X: \((N, D_x)\)

  • Y: \((N, D_y)\)

  • Output: \((D_y, D_x)\)
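One simple way to obtain an equivariant solution (a sketch, not necessarily the library's algorithm) is to solve the unconstrained least-squares problem and then project the result onto the subspace of equivariant maps by group averaging, \(\mathbf{A} \mapsto \frac{1}{|\mathbb{G}|}\sum_{g} \rho_{\mathcal{Y}}(g)\,\mathbf{A}\,\rho_{\mathcal{X}}(g)^{-1}\). A toy example for \(C_2\) acting by coordinate swap on both spaces:

```python
import torch

torch.manual_seed(0)

# Toy group: C_2 acting by coordinate swap on both X and Y (D_x = D_y = 2).
swap = torch.tensor([[0.0, 1.0], [1.0, 0.0]])
group = [torch.eye(2), swap]  # rho_x(g) = rho_y(g) in this example

# Ground-truth equivariant map: commutes with the swap.
A_true = torch.tensor([[2.0, 1.0], [1.0, 2.0]])

N = 100
X = torch.randn(N, 2)
Y = X @ A_true.T + 0.01 * torch.randn(N, 2)

# Unconstrained least squares: Y ~ X @ A^T, so A is the transposed solution.
A_ls = torch.linalg.lstsq(X, Y).solution.T  # shape (D_y, D_x)

# Project onto equivariant maps by group averaging; the representations here
# are orthogonal, so rho(g)^{-1} = rho(g)^T.
A_eq = sum(g @ A_ls @ g.T for g in group) / len(group)

# The averaged matrix satisfies the equivariance constraint exactly.
for g in group:
    assert torch.allclose(g @ A_eq, A_eq @ g, atol=1e-5)
```

Note that this averaging is the orthogonal projection of the unconstrained solution onto the equivariant subspace; it coincides with the constrained minimizer when the data distribution is itself G-symmetric (e.g., augmented over the group orbit).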