1. Let's talk about Albanians. No, not THOSE Albanians in the Balkans. The Caucasian Albanians, the pre-Turkish inhabitants of the land now known as Azerbaijan. 🇦🇿
How can we use language supervision to learn better visual representations for robotics?
Introducing Voltron: Language-Driven Representation Learning for Robotics!
Paper: https://t.co/gIsRPtSjKz
Models: https://t.co/NOB3cpATYG
Evaluation: https://t.co/aOzQu95J8z
🧵👇(1 / 12)
Videos of humans performing everyday tasks (Something-Something-v2, Ego4D) offer a rich and diverse resource for learning representations for robotic manipulation.
Yet, an underused part of these datasets is the rich natural-language annotation accompanying each video. (2/12)
The Voltron framework offers a simple way to use language supervision to shape representation learning, building off of prior work in representations for robotics like MVP (https://t.co/Pb0mk9hb4i) and R3M (https://t.co/o2Fkc3fP0e).
The secret is *balance* (3/12)
Starting with a masked autoencoder over frames from these video clips, make a choice:
1) Condition on language and improve our ability to reconstruct the scene.
2) Generate language given the visual representation and improve our ability to describe what's happening. (4/12)
By trading off *conditioning* and *generation*, we show that we can learn 1) better representations than prior methods, and 2) explicitly shape the balance of low- and high-level features captured.
Why is the ability to shape this balance important? (5/12)
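The conditioning-vs-generation tradeoff above starts from a standard masked autoencoder over video frames. As a rough illustration of that first step (a toy sketch, not Voltron's actual implementation — the patch size and 75% mask ratio here are assumptions taken from common MAE practice):

```python
import numpy as np

def mask_patches(frame, patch_size=16, mask_ratio=0.75, seed=0):
    """Split a square frame into patches and randomly mask a fraction of them.

    Returns the visible patches plus the kept/masked indices -- the input
    preparation step of a masked autoencoder (MAE). The encoder sees only
    the visible patches; the decoder must reconstruct the masked ones
    (optionally conditioned on language, in Voltron's framing).
    """
    h, w, c = frame.shape
    assert h % patch_size == 0 and w % patch_size == 0
    # Rearrange (H, W, C) into a sequence of flattened patch vectors.
    patches = (
        frame.reshape(h // patch_size, patch_size, w // patch_size, patch_size, c)
             .transpose(0, 2, 1, 3, 4)
             .reshape(-1, patch_size * patch_size * c)
    )
    n = patches.shape[0]
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n)
    n_keep = int(n * (1 - mask_ratio))
    keep_idx = np.sort(perm[:n_keep])
    mask_idx = np.sort(perm[n_keep:])
    return patches[keep_idx], keep_idx, mask_idx

frame = np.zeros((224, 224, 3), dtype=np.float32)
kept, keep_idx, mask_idx = mask_patches(frame)
print(kept.shape)  # 49 of 196 patches remain visible at a 75% mask ratio
```

At a 75% mask ratio on a 224×224 frame with 16×16 patches, only 49 of 196 patches reach the encoder, which is what makes reconstruction a meaningful pretraining signal.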
A brief analysis and comparison of the CSS for Twitter's PWA vs Twitter's legacy desktop website. The difference is dramatic and I'll touch on some reasons why.
Legacy site *downloads* ~630 KB CSS per theme and writing direction.
6,769 rules
9,252 selectors
16.7k declarations
3,370 unique declarations
44 media queries
36 unique colors
50 unique background colors
46 unique font sizes
39 unique z-indices
https://t.co/qyl4Bt1i5x
PWA *incrementally generates* ~30 KB CSS that handles all themes and writing directions.
735 rules
740 selectors
757 declarations
730 unique declarations
0 media queries
11 unique colors
32 unique background colors
15 unique font sizes
7 unique z-indices
https://t.co/w7oNG5KUkJ
The legacy site's CSS is what happens when hundreds of people directly write CSS over many years. Specificity wars, redundancy, a house of cards that can't be fixed. The result is extremely inefficient and error-prone styling that punishes users and developers.
The PWA's CSS is generated on-demand by a JS framework that manages styles and outputs "atomic CSS". The framework can enforce strict constraints and perform optimisations, which is why the CSS is so much smaller and safer. Style conflicts and unbounded CSS growth are avoided.
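The core idea behind "atomic CSS" can be sketched in a few lines: every unique declaration becomes one single-purpose rule that components share, so the stylesheet grows with *unique declarations* rather than with components. A toy illustration (my own sketch — the class-naming scheme and API here are hypothetical, not Twitter's actual framework):

```python
def atomic_css(components):
    """Map each unique CSS declaration to one short atomic class.

    components: dict of component name -> list of CSS declarations.
    Returns (per-component class lists, generated stylesheet).
    Declarations repeated across components share a single rule,
    which is why the output stays small as components are added.
    """
    atoms = {}        # declaration -> atomic class name
    assignments = {}  # component -> list of class names
    for name, decls in components.items():
        classes = []
        for decl in decls:
            if decl not in atoms:
                atoms[decl] = f"a{len(atoms)}"
            classes.append(atoms[decl])
        assignments[name] = classes
    sheet = "\n".join(f".{cls} {{ {decl} }}" for decl, cls in atoms.items())
    return assignments, sheet

components = {
    "button": ["color: white", "background: blue", "padding: 8px"],
    "banner": ["color: white", "background: blue", "font-size: 20px"],
}
assignments, sheet = atomic_css(components)
print(sheet)  # 4 unique rules cover all 6 declarations
```

Two components with six declarations between them need only four rules; with hand-written CSS, each component would typically carry its own copy of every declaration, which is roughly the 630 KB-vs-30 KB gap described above.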
So it's now October 10, 2018 and....Rod Rosenstein is STILL not fired.
He's STILL in charge of the Mueller investigation.
He's STILL refusing to hand over the McCabe memos.
He's STILL holding up the declassification of the #SpyGate documents & their release to the public.
I love a good cover story.......
The guy had a face-to-face with El Grande Trumpo himself on Air Force One just 2 days ago. Inside just about the most secure SCIF in the world.
And Trump came out of AF1 and gave ol' Rod a big thumbs up!
And so we're right back to 'that dirty rat Rosenstein!' 2 days later.
At this point it's clear some members of Congress are either in on this and helping the cover story or they haven't got a clue and are out in the cold.
Note the conflicting stories about 'Rosenstein cancelled meeting with Congress on Oct 11!'
First, rumors surfaced of a scheduled meeting on Oct. 11 between Rosenstein & members of Congress, and Rosenstein just cancelled it.
Rep. Andy Biggs and Rep. Matt Gaetz say DAG Rod Rosenstein cancelled an Oct. 11 appearance before the judiciary and oversight committees. They are now calling for a subpoena. pic.twitter.com/TknVHKjXtd
— Ivan Pentchoukov 🇺🇸 (@IvanPentchoukov) October 10, 2018