Functions

`clifford_batch_norm(x, n_blades, running_mean=None, running_cov=None, weight=None, bias=None, training=True, momentum=0.1, eps=1e-05)`
Clifford batch normalization for each channel across a batch of data.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | Input tensor of shape `(B, C, *D, I)`. | *required* |
| `n_blades` | `int` | Number of blades of the Clifford algebra. | *required* |
| `running_mean` | `Tensor` | The tensor with running mean statistics having shape … | `None` |
| `running_cov` | `Tensor` | The tensor with running covariance statistics having shape … | `None` |
| `weight` | `Union[Tensor, Parameter]` | Additional weight tensor which is applied post normalization, and has the shape … | `None` |
| `bias` | `Union[Tensor, Parameter]` | Additional bias tensor which is applied post normalization, and has the shape … | `None` |
| `training` | `bool` | Whether to use the running mean and variance. Defaults to `True`. | `True` |
| `momentum` | `float` | Momentum for the running mean and variance. Defaults to `0.1`. | `0.1` |
| `eps` | `float` | Epsilon for the running mean and variance. Defaults to `1e-05`. | `1e-05` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Normalized input of shape `(B, C, *D, I)`. |
Source code in cliffordlayers/nn/functional/batchnorm.py
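The shape columns above were truncated during extraction; assuming the `(B, C, *D, I)` layout that `whiten_data` below documents, the training-mode behavior (without the optional `weight`/`bias`) can be sketched in NumPy. This is an illustrative reimplementation, not the library's code:

```python
import numpy as np

def clifford_batch_norm_sketch(x, eps=1e-5):
    """Illustrative Clifford batch norm, training mode.

    x: array of shape (B, C, *D, I), where I is the number of blades.
    Jointly whitens the I blade components per channel over (B, *D).
    """
    B, C, *D, I = x.shape
    # Fold spatial dims into the batch axis: (B * prod(D), C, I)
    flat = np.moveaxis(x, 1, -2).reshape(-1, C, I)
    out = np.empty_like(flat, dtype=float)
    N = flat.shape[0]
    for c in range(C):
        xc = flat[:, c, :]                       # (N, I)
        centered = xc - xc.mean(axis=0)          # blade-wise mean removal
        cov = centered.T @ centered / N + eps * np.eye(I)
        L = np.linalg.cholesky(cov)              # cov = L @ L.T
        # Whiten: multiply by L^{-1} so the blade covariance becomes ~ I
        out[:, c, :] = np.linalg.solve(L, centered.T).T
    return np.moveaxis(out.reshape(B, *D, C, I), -2, 1)
```

After the transform, each channel's blade components have zero mean and (approximately) identity covariance, computed jointly rather than per blade.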
`complex_batch_norm(x, running_mean=None, running_cov=None, weight=None, bias=None, training=True, momentum=0.1, eps=1e-05)`

Applies complex-valued Batch Normalization as described in (Trabelsi et al., 2018) for each channel across a batch of data.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | The input complex-valued data is expected to be at least 2d, with shape … | *required* |
| `running_mean` | `Union[Tensor, Parameter]` | The tensor with running mean statistics having shape … | `None` |
| `running_cov` | `Union[Tensor, Parameter]` | The tensor with running real-imaginary covariance statistics having shape … | `None` |
| `weight` | `Tensor` | Additional weight tensor which is applied post normalization, and has the shape … | `None` |
| `bias` | `Tensor` | Additional bias tensor which is applied post normalization, and has the shape … | `None` |
| `training` | `bool` | Whether to use the running mean and variance. Defaults to `True`. | `True` |
| `momentum` | `float` | Momentum for the running mean and variance. Defaults to `0.1`. | `0.1` |
| `eps` | `float` | Epsilon for the running mean and variance. Defaults to `1e-05`. | `1e-05` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Normalized input as complex tensor of shape … |
Source code in cliffordlayers/nn/functional/batchnorm.py
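The key idea from Trabelsi et al. (2018) is that a complex value is normalized as a 2-vector of its real and imaginary parts, whitening their joint 2×2 covariance rather than scaling magnitude alone. A minimal NumPy sketch of that whitening step (again illustrative, not the library's implementation):

```python
import numpy as np

def complex_whiten_sketch(z, eps=1e-5):
    """Whiten a batch of complex values per channel.

    z: complex array of shape (B, C). Real and imaginary parts are
    treated as the two components of a 2-vector and jointly whitened,
    so the output has unit variance and decorrelated real/imag parts.
    """
    v = np.stack([z.real, z.imag], axis=-1)      # (B, C, 2)
    B, C, _ = v.shape
    out = np.empty_like(v)
    for c in range(C):
        xc = v[:, c, :]
        centered = xc - xc.mean(axis=0)
        cov = centered.T @ centered / B + eps * np.eye(2)
        L = np.linalg.cholesky(cov)              # cov = L @ L.T
        out[:, c, :] = np.linalg.solve(L, centered.T).T
    return out[..., 0] + 1j * out[..., 1]
```

Even strongly correlated real/imaginary components come out decorrelated with unit variance, which is what distinguishes this from two independent real batch norms.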
`whiten_data(x, training=True, running_mean=None, running_cov=None, momentum=0.1, eps=1e-05)`

Jointly whiten features in tensors of shape `(B, C, *D, I)`: take `n_blades` (`I`)-dimensional vectors and whiten them individually for each channel dimension `C` over `(B, *D)`. `I` is the number of blades in the respective Clifford algebra, e.g. `I = 2` for complex numbers.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | The tensor to whiten. | *required* |
| `training` | `bool` | Whether to update the running mean and covariance. Defaults to `True`. | `True` |
| `running_mean` | `Tensor` | The running mean of shape … | `None` |
| `running_cov` | `Tensor` | The running covariance of shape … | `None` |
| `momentum` | `float` | The momentum to use for the running mean and covariance. Defaults to `0.1`. | `0.1` |
| `eps` | `float` | A small number to add to the covariance. Defaults to `1e-05`. | `1e-05` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Whitened data of shape `(B, C, *D, I)`. |
Source code in cliffordlayers/nn/functional/batchnorm.py
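The `training` and `momentum` parameters govern the usual batch-norm bookkeeping: in training mode the batch statistics drive the whitening and are folded into the running buffers by an exponential moving average; in eval mode the running buffers are used directly. A sketch for a single channel, assuming the PyTorch-style momentum convention `new = (1 - momentum) * old + momentum * batch` (an assumption, not confirmed by this page):

```python
import numpy as np

def whiten_with_running_stats(x, running_mean, running_cov,
                              training=True, momentum=0.1, eps=1e-5):
    """Whiten (B, I) data for a single channel, maintaining running stats.

    Training mode: whiten with batch statistics and EMA-update the buffers
    (assumed convention: new = (1 - momentum) * old + momentum * batch).
    Eval mode: whiten with the running buffers; buffers stay untouched.
    """
    B, I = x.shape
    if training:
        mean = x.mean(axis=0)
        centered = x - mean
        cov = centered.T @ centered / B
        running_mean = (1 - momentum) * running_mean + momentum * mean
        running_cov = (1 - momentum) * running_cov + momentum * cov
    else:
        mean, cov = running_mean, running_cov
    # eps stabilises the Cholesky factorisation of the covariance
    L = np.linalg.cholesky(cov + eps * np.eye(I))
    out = np.linalg.solve(L, (x - mean).T).T
    return out, running_mean, running_cov
```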
`clifford_group_norm(x, n_blades, num_groups=1, running_mean=None, running_cov=None, weight=None, bias=None, training=True, momentum=0.1, eps=1e-05)`

Clifford group normalization.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | Input tensor of shape `(B, C, *D, I)`. | *required* |
| `n_blades` | `int` | Number of blades of the Clifford algebra. | *required* |
| `num_groups` | `int` | Number of groups for which normalization is calculated. Defaults to `1`. For … | `1` |
| `running_mean` | `Tensor` | The tensor with running mean statistics having shape … | `None` |
| `running_cov` | `Tensor` | The tensor with running real-imaginary covariance statistics having shape … | `None` |
| `weight` | `Union[Tensor, Parameter]` | Additional weight tensor which is applied post normalization, and has the shape … | `None` |
| `bias` | `Union[Tensor, Parameter]` | Additional bias tensor which is applied post normalization, and has the shape … | `None` |
| `training` | `bool` | Whether to use the running mean and variance. Defaults to `True`. | `True` |
| `momentum` | `float` | Momentum for the running mean and variance. Defaults to `0.1`. | `0.1` |
| `eps` | `float` | Epsilon for the running mean and variance. Defaults to `1e-05`. | `1e-05` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Group normalized input of shape `(B, C, *D, I)`. |
Source code in cliffordlayers/nn/functional/groupnorm.py
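Group normalization differs from batch normalization only in which axes the statistics are taken over: channels are split into `num_groups` groups, and statistics are computed per sample over each group's channels, so nothing is shared across the batch. A hypothetical sketch for `(B, C, I)` inputs (illustrative only; the library additionally handles spatial dims and running statistics):

```python
import numpy as np

def clifford_group_norm_sketch(x, n_blades, num_groups=1, eps=1e-5):
    """Illustrative group norm for (B, C, I) Clifford data.

    Channels are split into num_groups groups; for each sample and each
    group, the I blade components are jointly whitened over the group's
    C // num_groups channels.
    """
    B, C, I = x.shape
    assert I == n_blades and C % num_groups == 0
    g = x.reshape(B, num_groups, C // num_groups, I)
    out = np.empty_like(g, dtype=float)
    for b in range(B):
        for k in range(num_groups):
            xg = g[b, k]                         # (C // num_groups, I)
            centered = xg - xg.mean(axis=0)
            cov = centered.T @ centered / xg.shape[0] + eps * np.eye(I)
            L = np.linalg.cholesky(cov)          # cov = L @ L.T
            out[b, k] = np.linalg.solve(L, centered.T).T
    return out.reshape(B, C, I)
```

Because statistics are per sample, the result is independent of batch size, which is the usual motivation for group norm over batch norm at small batch sizes.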
`complex_group_norm(x, num_groups=1, running_mean=None, running_cov=None, weight=None, bias=None, training=True, momentum=0.1, eps=1e-05)`

Group normalization for complex-valued tensors.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | The input complex-valued data is expected to be at least 2d, with shape … | *required* |
| `num_groups` | `int` | Number of groups for which normalization is calculated. Defaults to `1`. For … | `1` |
| `running_mean` | `Tensor` | The tensor with running mean statistics having shape … | `None` |
| `running_cov` | `Tensor` | The tensor with running real-imaginary covariance statistics having shape … | `None` |
| `weight` | `Union[Tensor, Parameter]` | Additional weight tensor which is applied post normalization, and has the shape … | `None` |
| `bias` | `Union[Tensor, Parameter]` | Additional bias tensor which is applied post normalization, and has the shape … | `None` |
| `training` | `bool` | Whether to use the running mean and variance. Defaults to `True`. | `True` |
| `momentum` | `float` | Momentum for the running mean and variance. Defaults to `0.1`. | `0.1` |
| `eps` | `float` | Epsilon for the running mean and variance. Defaults to `1e-05`. | `1e-05` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Normalized input as complex tensor of shape … |
Source code in cliffordlayers/nn/functional/groupnorm.py
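All of the functions above accept optional `weight` and `bias` tensors applied after normalization. Their exact shapes were lost in extraction; following Trabelsi et al. (2018), a natural choice for the complex case is a real 2×2 matrix per channel that re-mixes the whitened real and imaginary parts, plus a complex bias. The helper below is hypothetical, named here only for illustration:

```python
import numpy as np

def apply_complex_affine(z, weight, bias):
    """Hypothetical post-normalization affine for complex data.

    z:      complex array of shape (B, C), already normalized.
    weight: real array of shape (C, 2, 2) mixing real/imag per channel.
    bias:   complex array of shape (C,).
    """
    v = np.stack([z.real, z.imag], axis=-1)            # (B, C, 2)
    # For each channel c, multiply its (real, imag) 2-vector by weight[c]
    w = np.einsum("cij,bcj->bci", weight, v)
    return w[..., 0] + 1j * w[..., 1] + bias
```

A scalar complex multiplication cannot express all such 2×2 maps, which is why the whitening formulation uses a matrix-valued `weight` rather than a complex scalar.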