Barlow Twins: a new super-simple self-supervised method to train joint-embedding architectures (aka Siamese nets).
1/N

Basic idea: maximize the normalized correlation between a variable in the left branch and the same var in the right branch, while making the normalized cross-correlation between one var in the left branch and all other vars in the right branch as close to zero as possible.
2/N
In short: the loss tries to make the normalized cross-correlation matrix between the embedding vectors coming out of the left branch and the right branch as close to the identity matrix as possible.
3/N
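To make that concrete, here's a minimal PyTorch sketch of such a loss. The function name and the weight lambd are mine (assumptions, not the paper's exact code); it just follows the description above: normalize each variable over the batch, compute the cross-correlation matrix between the branches, pull its diagonal toward 1 and its off-diagonal toward 0.

```python
import torch

def barlow_twins_loss(z1, z2, lambd=5e-3):
    # z1, z2: (batch, dim) embeddings from the two branches
    # lambd: trade-off weight between the two terms (assumed value)
    z1n = (z1 - z1.mean(dim=0)) / z1.std(dim=0)  # normalize each variable over the batch
    z2n = (z2 - z2.mean(dim=0)) / z2.std(dim=0)
    n = z1.shape[0]
    c = (z1n.T @ z2n) / n                        # (dim, dim) cross-correlation matrix
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # diagonal -> 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # off-diagonal -> 0
    return on_diag + lambd * off_diag
```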
The 2 branches are always fed with differently-distorted versions of the same image, and there is no need for dissimilar (negative) training pairs.

The objective makes the embedding vectors of the two branches as similar as possible, while maximizing their information content.
4/N
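As a usage sketch (the distortion pipeline below is my assumption, standard SSL-style augmentations, not the paper's exact recipe), a training step feeds two independently distorted views of each image through the same network and applies the loss above:

```python
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.4, 0.4, 0.2, 0.1),
    transforms.RandomGrayscale(p=0.2),
    transforms.GaussianBlur(kernel_size=23),
    transforms.ToTensor(),
])

# two distorted views of the same image, same (assumed) model in both branches:
# y1, y2 = augment(img), augment(img)            # batched in practice
# loss = barlow_twins_loss(model(y1), model(y2))
```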
No contrastive samples, no huge batch size (optimal is 1024), no predictor, no moving-average weights, no vector quantization, and no stop-gradient in one of the branches.
5/N
Competitive results on ImageNet with a linear classifier head.
Great results on semi-supervised ImageNet in the low labeled-data regime and on transfer tasks.
6/N
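For context, "linear classifier head" is the usual linear-evaluation protocol: freeze the pretrained backbone and train only a linear layer on its features. A minimal sketch; the checkpoint path is hypothetical:

```python
import torch
import torch.nn as nn
import torchvision

backbone = torchvision.models.resnet50(weights=None)
backbone.fc = nn.Identity()                      # expose the 2048-D features
# backbone.load_state_dict(torch.load("barlow_twins.pt"))  # hypothetical checkpoint
for p in backbone.parameters():
    p.requires_grad = False                      # backbone stays frozen
head = nn.Linear(2048, 1000)                     # only this layer is trained
```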
Results on ImageNet with linear classifier head
7/N
Results with 1% and 10% of ImageNet labeled images
8/N
Results on transfer tasks.
9/N
The architecture is a standard ResNet-50 with a 2048-D feature vector.
But contrary to other methods, the embedding size (projector output) is larger. Performance keeps going up as the embedding dimension grows (we stopped at 16384).
Probably because the feature variables are made independent, not just decorrelated.
10/N
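A hedged sketch of that architecture; the three-layer projector and its hidden width are my assumptions beyond the stated 2048-D backbone output and the large embedding size:

```python
import torch.nn as nn
import torchvision

class BarlowTwinsNet(nn.Module):
    def __init__(self, proj_dim=8192):            # embedding dim: bigger kept helping
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        resnet.fc = nn.Identity()                 # keep the 2048-D feature vector
        self.backbone = resnet
        self.projector = nn.Sequential(           # projector output = the embedding
            nn.Linear(2048, proj_dim, bias=False),
            nn.BatchNorm1d(proj_dim),
            nn.ReLU(inplace=True),
            nn.Linear(proj_dim, proj_dim, bias=False),
            nn.BatchNorm1d(proj_dim),
            nn.ReLU(inplace=True),
            nn.Linear(proj_dim, proj_dim),
        )

    def forward(self, x):
        return self.projector(self.backbone(x))
```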
Why Barlow? Horace Barlow was a pioneer of visual neuroscience who proposed the idea that the brain tries to minimize redundancy in representations.

By Jure Zbontar, Li Jing, Ishan Misra, yours truly, and Stéphane Deny.
All from FAIR.
To appear at ICML 2021
11/N
Don't you just hate slicing what would be a decent-size post into threaded thin tweets?
12/N
No, really. Don't you hate reading those long thread slices?
If you do, you could just read my Facebook post:
https://t.co/dQii7BEPQ5
13/N
N=13
