@nntaleb 1/10
Science is assumed to be “evidence-based,” but that term alone doesn’t mean much. What constitutes good evidence? How is the evidence being used? Is it supporting or refuting a hypothesis? Were the hypothesis and experimental design predetermined, or found ex post facto?

@engexplain @nntaleb 2/10
The reality is you can find “evidence” for almost any narrative: limit the sample size, cherry-pick studies, etc. Systematic reviews, meta-analyses, and randomized controlled trials are all susceptible to selective interpretation and the narrative fallacy.
@engexplain @nntaleb 3/10
At the heart of the problem is the over-reliance on simplistic statistical techniques that do little more than quantify 2 things moving together.
@engexplain @nntaleb 4/10
Take Pearson’s correlation, which is based on covariance. Two variables can vary together for countless reasons, most of them spurious. Yet this simple notion of “causality” undergirds much of the scientific literature.
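To make the limitation concrete, here is a minimal sketch (assuming NumPy) in which one variable is completely determined by the other, yet Pearson’s correlation registers essentially zero:

```python
import numpy as np

# x is symmetric around zero; y is fully determined by x.
x = np.linspace(-1.0, 1.0, 1001)
y = x ** 2

# Pearson's r only captures *linear* co-movement, so a perfect
# nonlinear dependence shows up as a correlation near zero.
r = np.corrcoef(x, y)[0, 1]
print(abs(r) < 1e-6)
```

The dependence is total (knowing x pins down y exactly), but because the relationship is symmetric rather than linear, covariance cancels out.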
@engexplain @nntaleb 5/10
Information-theoretic (entropy-based) approaches, on the other hand, can assess *general* measures of dependence. Rather than a specialized (linear) view based on concurrent variation, entropy captures the amount of information contained in, and shared between, variables.
@engexplain @nntaleb 6/10
If we were genuinely interested in giving the term “evidence” an authentic and reliable meaning, the methods used to underpin an assertion would have to be rigorous.
@engexplain @nntaleb 7/10
We wouldn’t reach for conveniently simplistic methods to label something evidential; rather, we would look for a measure capable of assessing the expected amount of information held in a random variable. There is nothing more fundamental than information.
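That “expected amount of information” is Shannon entropy, H(X) = −Σ p(x) log₂ p(x). A minimal sketch in pure Python (the helper name is mine, not from the thread):

```python
from math import log2

def shannon_entropy(probs):
    """Expected information (in bits) of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence carries less information.
print(shannon_entropy([0.9, 0.1]) < 1.0)
```

Entropy makes no assumption about linearity or even about what the outcomes mean; it depends only on the probabilities, which is what makes it a candidate for a *general* foundation for evidence.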
@engexplain @nntaleb 8/10
Consider Mutual Information (MI), which quantifies the amount of information obtained about one random variable by observing another. Observing the relationship between variables is what measurement and evidence are all about.
@engexplain @nntaleb 9/10
MI measures how much the joint entropy differs from the marginal entropies. If there is genuine dependence between variables, we expect the information gathered from all variables at once (joint) to be less than the sum of the information from the variables taken independently (marginals).
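That identity, I(X;Y) = H(X) + H(Y) − H(X,Y), can be checked directly by binning data. A rough sketch assuming NumPy (binned entropy estimates are biased, so treat the numbers as illustrative, not as a definitive estimator):

```python
import numpy as np

def entropy_from_counts(counts):
    """Shannon entropy (bits) of the empirical distribution in a histogram."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50_000)
y = x ** 2  # perfectly dependent, yet linearly uncorrelated

joint, _, _ = np.histogram2d(x, y, bins=20)
h_x = entropy_from_counts(joint.sum(axis=1))   # marginal H(X)
h_y = entropy_from_counts(joint.sum(axis=0))   # marginal H(Y)
h_xy = entropy_from_counts(joint.ravel())      # joint H(X,Y)

mi = h_x + h_y - h_xy
# Dependence makes the joint entropy fall short of the sum of the
# marginals, so MI is well above zero even though Pearson's r is ~0.
print(mi > 0.5)
```

The same data that Pearson’s correlation scores as “unrelated” shows a large mutual information, because MI responds to any form of dependence, not just linear co-movement.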
@engexplain @nntaleb 10/10
If “evidence-based” science were genuinely invested in authentic measurement, it would leverage *general* measures of dependence, and that demands an approach rooted in information theory. Without entropy you’re just picking data, choosing a narrative, and calling it “evidence.”


So it turns out that an organization I thought was doing good work, the False Memory Syndrome Foundation (associated with Center for Inquiry, James Randi, and Martin Gardner) was actually caping for pedophiles. Uhhhh oops?


Since this, bizarrely, turned out to be one of my longest videos ever (??) here's a quick thread to sum it up for those of you like myself with short attention spans. 1/10

In the '90s the False Memory Syndrome Foundation was founded to call attention to the problem of adults suddenly "remembering" child abuse that never actually happened, often under hypnosis. Skeptics like James Randi & Martin Gardner joined their board. 2/10

A new article reveals that the FMSF was founded by parents who had been credibly and PRIVATELY accused of molestation by their now-adult daughter. They publicized the accusation, destroyed the daughter's reputation, and started the foundation. 3/10

The FMSF assumed any accused pedo who joined was innocent, saying "We are a good-looking bunch of people, graying hair, well dressed, healthy, smiling; just about every person who has attended is someone you would surely find interesting and want to count as a friend" 😬 4/10


Recently, the @CNIL issued a decision regarding the GDPR compliance of an unknown French adtech company named "Vectaury". It may seem like small fry, but the decision has potential wide-ranging impacts for Google, the IAB framework, and today's adtech. It's thread time! 👇

It's all in French, but if you're up for it you can read:
• Their blog post (lacks the most interesting details):
https://t.co/PHkDcOT1hy
• Their high-level legal decision: https://t.co/hwpiEvjodt
• The full notification: https://t.co/QQB7rfynha

I've read it so you needn't!

Vectaury was collecting geolocation data in order to create profiles (e.g. people who often go to this or that type of shop) to power ad targeting. They operate through embedded SDKs and ad bidding, making them invisible to users.

The @CNIL notes that profiling based on geolocation presents particular risks, since it reveals people's movements and habits. Because the processing is risky, it requires consent; this will be the heart of their assessment.

Interesting point: they justify the decision in part because of how many people COULD be targeted in this way (rather than how many have — though they note that too). Because it's on a phone, and many have phones, it is considered large-scale processing no matter what.
A THREAD ON @SarangSood

Decoding his method of analysis and logic so everyone can easily understand it.

Have covered:
1. Analysis of volatility and how to foresee its signs
2. Workbook
3. When to sell options
4. Different categories of days
5. How the movement of option prices tells us what will happen

1. He follows volatility super closely.

He builds 7-8 different strategies to get a sense of what's going on.

Whichever gives the highest profit is the one he trades.


2. Theta falls when the market moves.
It falls where the market is headed, not on our original position.


3. If you're an options seller, sell only when volatility is dropping; there is a high probability of making the right trade and turning a profit as a result.

He believes in a market operator: if the market mover sells volatility, Sarang Sir joins him.


4. Theta decay vs. a fall in vega

Sell when vega is falling rather than for theta decay. You won't be trapped, and you'll have a higher probability of making a profit.
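The thread states heuristics, not formulas. To see why theta and vega pull in different directions, here is a minimal Black-Scholes sketch of the two greeks for a European call (standard textbook formulas in pure Python; an illustration of the greeks themselves, not Sarang's actual method):

```python
from math import log, sqrt, exp, pi, erf

def norm_pdf(x):
    return exp(-x * x / 2) / sqrt(2 * pi)

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def call_greeks(S, K, T, r, sigma):
    """Black-Scholes theta (per year) and vega for a European call."""
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    vega = S * norm_pdf(d1) * sqrt(T)  # sensitivity to implied volatility
    theta = (-S * norm_pdf(d1) * sigma / (2 * sqrt(T))
             - r * K * exp(-r * T) * norm_cdf(d2))  # time decay
    return theta, vega

# Hypothetical at-the-money call, 30 days out, 20% implied vol.
theta, vega = call_greeks(S=100, K=100, T=30 / 365, r=0.05, sigma=0.20)
# A long call has negative theta (it bleeds value as time passes) and
# positive vega (falling implied vol hurts the holder -- which is why
# a seller benefits when volatility is dropping, not just from decay).
print(theta < 0, vega > 0)
```

The point of the comparison: theta decay accrues slowly and predictably, while a drop in implied volatility hits the option price through vega immediately, so a seller positioned for falling vol captures the move faster than one waiting on decay alone.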