📊 It's Thursday, allegedly: time for our wækly quiz. So, completely unrelated to anything, this one's going to be about influence, majority, tribes, noise, and dictators.

— Clément Canonne (@ccanonne_) September 24, 2020

(I didn't get "chaos" in, but I'm keeping that option open for the future. #gaussianchaos )

1/11 pic.twitter.com/eeRtkIpNmr

📊 Answers and discussion for yesterday's quiz on voting rules (a.k.a. Boolean functions for social choice).↴

— Clément Canonne (@ccanonne_) September 26, 2020

Disclaimer: for a much better and comprehensive treatment, @BooleanAnalysis 's book is the place to look ( https://t.co/f9OGjFKAVu ).

1/ https://t.co/jy0JzLWwiq

📊 It's (weakly) Thursday: time for a weekly quiz. For this one let's be merry, and play games—repeatedly!

— Clément Canonne (@ccanonne_) September 10, 2020

But, what's a game? You have 2 players, Alice 👩🏻💻 and Bob 👨🏾💻, and a game master, say Guillaume🧔🏼. Guillaume draws (x,y), gives (separately!) x to Alice and y to Bob...

1/9 pic.twitter.com/NMYxoRelJ5

📊 Answers and discussion for yesterday's quiz on games. Technically, "repeated games and direct product (parallel repetition) theorems."

— Clément Canonne (@ccanonne_) September 11, 2020

Or, as one may put it: if I keep playing, how likely is it that I'll keep winning?🤷

1/15 https://t.co/TvjcHG52Gf

📊 It may not be Thursday everywhere yet, but let's have our weakly(weekly(quiz))!

— Clément Canonne (@ccanonne_) September 3, 2020

Today: no general theme. Just things that may be true, or false, and surprisingly so―or not. Beware of the traps.

1/7 pic.twitter.com/VL50ZjMgS8

📊Answers and discussion for yesterday's quiz on "Things I Probably Would Have Gotten Wrong (Did You?)"

— Clément Canonne (@ccanonne_) September 4, 2020

Ft. Complexity theory (Q1), Probability (Q2), Measure theory (Q4), Planar graphs and pandas 🐼 (Q5), and Linear algebra (Q3)... in that order.

1/13 https://t.co/AG9nyiCw5F

📊 It's Thursday for many, time for this week's quiz! As alluded earlier, we're going to flip coins. Like, a lot.

— Clément Canonne (@ccanonne_) August 27, 2020

Today: you have a biased coin which lands heads with some probability p∊(0,1), and you can toss it as many times as you want. You don't know p, though...

1/10 pic.twitter.com/YCYEzSC5zJ

📊 Answers and discussion for yesterday's quiz on coin flipping (a.k.a. the power of small change?)

— Clément Canonne (@ccanonne_) August 28, 2020

Reminder: given a fixed function f, we want a procedure which, given independent flips of a coin with unknown probability of heads p (its "bias")...

1/14 https://t.co/GThwTq3VPB
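The flavor of problem in this quiz (turning flips of a p-coin into flips of an f(p)-coin) is often called a "Bernoulli factory." The classic special case the setup brings to mind is von Neumann's trick for simulating a fair coin (f(p) = 1/2) from a coin of unknown bias; here is a minimal Python sketch (the helper names and parameters are mine):

```python
import random

def von_neumann_fair(biased_flip):
    """von Neumann's trick: flip the biased coin twice; HT -> heads,
    TH -> tails, HH/TT -> try again. The output is fair for any p in (0,1),
    since P(HT) = P(TH) = p(1-p)."""
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a

random.seed(0)
p = 0.8  # true bias -- unknown to the procedure itself
biased = lambda: random.random() < p

n = 100_000
fair_mean = sum(von_neumann_fair(biased) for _ in range(n)) / n
# fair_mean should be close to 0.5 even though p = 0.8
```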

📊 It's Thursday, time for a we•kly quiz! As promised last week, today will be about Gaussians, CLTs, Berry–Esseen, and a little bit of Invariance Principle.

— Clément Canonne (@ccanonne_) August 20, 2020

Let's start. You have n i.i.d. real-valued r.v.s X₁,...,Xₙ, with mean zero. You sum them, and hope for the best.

1/10 pic.twitter.com/vWuIrcxO62
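To make "sum them and hope for the best" concrete, here is a small simulation sketch (the uniform increments are my arbitrary choice): with n i.i.d. mean-zero summands, the normalized sum should look Gaussian, so roughly 68% of trials should land within one standard deviation.

```python
import random, math

random.seed(1)

def standardized_sum(n):
    # X_i uniform on [-1, 1]: mean 0, variance 1/3
    s = sum(random.uniform(-1.0, 1.0) for _ in range(n))
    return s / math.sqrt(n / 3.0)  # normalize the sum to unit variance

trials = 20_000
within_one_sigma = sum(abs(standardized_sum(100)) <= 1.0
                       for _ in range(trials)) / trials
# The CLT predicts this fraction approaches P(|N(0,1)| <= 1) ≈ 0.683
```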

📊 Answers and discussion for yesterday's quiz on CLTs, Gaussians, and a bit of Invariance Principle.

— Clément Canonne (@ccanonne_) August 21, 2020

Let's start: Q1 on the Central Limit Theorem was a trap 🧐!

1/16 https://t.co/AqzMX6YdgI

📊 It's Thursday [reference needed], time for a weᵄekly quiz! Given some recent events about randomized polynomial time (RP), it feels like a short discussion about 🎲 is topical.

— Clément Canonne (@ccanonne_) August 6, 2020

So, let's go: Las Vegas, Monte Carlo, and... Bellagio (?) algorithms!

1/8 pic.twitter.com/DYFqOeMG8q

📊Thread: answers and discussions for yesterday's quiz on randomized complexity classes (RP, ZPP, BPP).

— Clément Canonne (@ccanonne_) August 7, 2020

Of course, I am barely scratching the surface. I find randomness in computation fascinating, and hope you will too. https://t.co/DQNWorXxa0

1/15 pic.twitter.com/CIFcqaqaF8

📊 Today, (weakly|weekly) quiz will be about graphs. The ones with nodes and edges, not those plotting against you.

— Clément Canonne (@ccanonne_) July 30, 2020

Specifically, *planar* graphs. The nice ones.

1/7

📊 Answers and discussion for yesterday's quiz on planar graphs. Recall: a graph G=(V,E) on n=|V| vertices and m=|E| edges is planar if you can "embed it in the plane" (draw it without edge crossings). 🕸️

— Clément Canonne (@ccanonne_) July 31, 2020

Why should we care, you ask?

1/10 https://t.co/yGrMgZ9ndB
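One concrete consequence of planarity, via Euler's formula V − E + F = 2: a simple planar graph on n ≥ 3 vertices has at most 3n − 6 edges. A tiny sketch of this necessary (not sufficient!) condition:

```python
def passes_planarity_edge_bound(n, m):
    """Necessary (but not sufficient!) condition for planarity, from Euler's
    formula V - E + F = 2: a simple planar graph with n >= 3 vertices has
    at most 3n - 6 edges."""
    if n < 3:
        return True
    return m <= 3 * n - 6

# K5 has 5 vertices and 10 edges: 10 > 3*5 - 6 = 9, so K5 is not planar.
k5_ok = passes_planarity_edge_bound(5, 10)
# K4 (6 edges <= 6) passes the bound -- and is indeed planar.
k4_ok = passes_planarity_edge_bound(4, 6)
```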

📊 It's Thursday: time for our wÆkly quiz! I'll build a little bit on last week's statistics/algorithms thread (see below for a recap), and ask you two 2️⃣ questions.

— Clément Canonne (@ccanonne_) July 23, 2020

General topic: testing distributions, "leaky" measurements, and adaptivity. 🔍

1/6 https://t.co/YhEpCn4yov pic.twitter.com/aj5mYWKqkp

📊 Answers and discussions for yesterday's we(a)ekly quiz on "testing whether data is uniformly distributed, given some leaky query access to the samples." #statistics #quiz

— Clément Canonne (@ccanonne_) July 24, 2020

To paraphrase Tolstoy when starting "War and Peace": I'll be brief. https://t.co/NMK2GJ2DqF

1/10

📊 Two days ago, I asked a question. Way more people answered than expected, and... well, this week's weₐekly #quiz will be slightly ≠: a long thread on uniformity testing, trickling down all day long.

— Clément Canonne (@ccanonne_) July 16, 2020

Be careful what you wish for :) #statistics

1/n https://t.co/3ECldvbekE

📊 A short discussion about last week's thread on uniformity testing: I wrote things down!

— Clément Canonne (@ccanonne_) July 20, 2020

📝 https://t.co/jLEIKSxMIi (LaTeX)

📄 https://t.co/Uqq0yjWrwp (PDF)

Comments welcome.

1/2 https://t.co/6jqSvwoabC pic.twitter.com/1baBnVWWVl

📊 It's Thursday (somewhere), time for a weₐekly quiz! Today: #machinelearning ! Ish. Let's say... computational learning. Some of it. In theory.

— Clément Canonne (@ccanonne_) July 9, 2020

OK. VC dimension. You got me. Anyways, #COLT2020 is starting tomorrow, so... what does it mean to "learn," exactly?

1/10

📊 Answers and comments about yesterday's online quiz on learning.

— Clément Canonne (@ccanonne_) July 10, 2020

Not a quiz on online learning. Though there was an (online learning) component on that online (learning quiz), which maybe led to some learning online, so... I am confused now. https://t.co/2VO9AqXudR

1/16 pic.twitter.com/rNmo5tmwWM

📊 Wæekly quiz: "The answer may surprise you (or not)."

— Clément Canonne (@ccanonne_) July 2, 2020

Four results I find surprising (among many). Three of the statements below are true, one is false. Choose... wisely.

1/11 pic.twitter.com/lAV5UMTwRh

⁉️ Answers and discussion for yesterday's quiz: and yes, you are a Ravenclaw. (BuzzFeed has nothing on me.) https://t.co/csDmh8T4bk

— Clément Canonne (@ccanonne_) July 4, 2020

First I'd like to apologize for the original erroneous phrasing of Q1. If your answer was wrong, it was my fault.

[As usual, refs at the end]

1/19 pic.twitter.com/aODklwZQKB

📊 Weᵃₑkly quiz: linear threshold functions (a.k.a. perceptrons, a.k.a. halfspaces), and what makes them so unique.

— Clément Canonne (@ccanonne_) June 25, 2020

In other words, "the Chow Parameters."

Let's start with Boolean functions: i.e., functions of the form f: {-1,1}ⁿ→{-1,1}. There are 2^2^n of those.

1/9 pic.twitter.com/jJVRaJnyQh

🧑🏫 Answers and discussions for yesterday's quiz on Boolean functions, halfspaces (LTFs), and Chow parameters.

— Clément Canonne (@ccanonne_) June 26, 2020

Our focus here will be functions of the form f: {-1,1}ⁿ→{-1,1} (i.e., bits=±1)

[References for all results at the end of the thread.]

1/18 https://t.co/RYVrBItOrn pic.twitter.com/NZjzVdFJn1
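For small n, the Chow parameters (the degree-0 and degree-1 Fourier coefficients, E[f(x)] and E[f(x)xᵢ]) can simply be brute-forced; by Chow's theorem, they uniquely determine an LTF among all Boolean functions. A quick sketch on Maj₃ (the function and helper names are mine):

```python
from itertools import product

def chow_parameters(f, n):
    """Chow parameters of f: {-1,1}^n -> {-1,1}: the degree-0 and degree-1
    Fourier coefficients E[f(x)] and E[f(x) x_i], by brute-force enumeration
    over all 2^n inputs."""
    pts = list(product([-1, 1], repeat=n))
    N = len(pts)
    c0 = sum(f(x) for x in pts) / N
    cs = [sum(f(x) * x[i] for x in pts) / N for i in range(n)]
    return [c0] + cs

maj3 = lambda x: 1 if sum(x) > 0 else -1
chow = chow_parameters(maj3, 3)  # [0.0, 0.5, 0.5, 0.5]
```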

📊 Weakly weekly quiz: a short one—next week's will be longer. "How do you prove you won't always fall short of your expectations?"

— Clément Canonne (@ccanonne_) June 18, 2020

(No, it's not about eating Cheerios on the sofa, or the last time I wore actual socks. It's about constant-probability statements.)

1/4 pic.twitter.com/fJN7fBYx8B

📊 Answers and discussions for yesterday's quiz: "minimal assumptions for X to be, with constant probability, no less than a constant fraction of 𝔼[X]?"

— Clément Canonne (@ccanonne_) June 19, 2020

E.g., if an algo finds something w/ large value in expectation, can it do so w/ probability 1/10?

1/7 https://t.co/wH3dh5qtLd

📊 Weakly-weekly quiz: this week, "how hard is it to learn when you keep forgetting things?"

— Clément Canonne (@ccanonne_) June 11, 2020

(This quiz is about learning functions. The question applies to history as well...)

You observe a stream of labelled samples (xᵢ,yᵢ) where yᵢ=f(xᵢ), and want to learn f...

1/9 pic.twitter.com/OsryHLJyH4

📊 Answers and discussion for yesterday's quiz on sample/memory tradeoffs for learning: a thread. ↴

— Clément Canonne (@ccanonne_) June 12, 2020

Overall question: "how hard does it become to learn a (linear) function when the algorithm can't keep too much information in memory?"

1/11 https://t.co/VXHILDT8Ww

📊 For this edition of the weekly quiz, let's look at random sums of random things, and their tails.

— Clément Canonne (@ccanonne_) June 4, 2020

Specifically, we have some i.i.d. r.v.'s X₁,X₂,..., and some integer-valued (possibly infinite) r.v. T. We look at S = X₁+X₂+...+X_T.

1/7

📊 Answers and discussion about yesterday's w(ee|ea)kly quiz on random sums: S = X₁+X₂+...+X_T, w/ Xₙ's independent and T itself an integer-valued random variable.

— Clément Canonne (@ccanonne_) June 5, 2020

Under mild assumptions on all those, what can we say about the tails of the sum S? https://t.co/uNnZUyfuTc

1/11 pic.twitter.com/ACoAo8m5pe
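The first thing one can say, under the mild assumption that T is independent of the Xᵢ's (or, more generally, a stopping time with finite mean), is Wald's identity: 𝔼[S] = 𝔼[T]·𝔼[X₁]. A quick simulation sketch, with my own arbitrary choice of distributions (T geometric, Xᵢ exponential):

```python
import random

random.seed(2)

def random_sum():
    # T ~ Geometric(q): number of trials up to and including the first
    # success, drawn independently of the X_i ~ Exponential(1).
    q = 0.25
    T = 1
    while random.random() >= q:
        T += 1
    # S = X_1 + ... + X_T
    return sum(random.expovariate(1.0) for _ in range(T))

trials = 200_000
avg_S = sum(random_sum() for _ in range(trials)) / trials
# Wald's identity: E[S] = E[T] * E[X_1] = (1/0.25) * 1 = 4
```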

📊 Today's weₐekly quiz: a 3-series ∑ edition! Nothing too fancy, maybe a cute probability proof in disguise somewhere.

— Clément Canonne (@ccanonne_) May 28, 2020

Let's start: we all know—maybe love—the exponential function. For all k in ℝ, ∑ₙkⁿ/n! = eᵏ. Fair enough... but what if we perturb things a bit?

1/5

🧑🏫 Answers and discussions for yesterday's quiz on "series that kind of look like the exponential one."

— Clément Canonne (@ccanonne_) May 29, 2020

If you couldn't see the questions properly (non-utf8ers gonna non-utf8), I had screenshots here: https://t.co/65TqGvrZh1

1/8 https://t.co/e7Y1ChYOAI

📊 Here we go again: we{e,a}kly quiz. Let's have a look at two of our basic* friends, the Poisson 🐟 and Binomial 🐪 distributions. And what happens when they meet.

— Clément Canonne (@ccanonne_) May 21, 2020

Just so we're all on the same page: X~Binom(n,p) if X=∑ₖ Xₖ (n terms)...

*"I'm not basic. THEY're basic."

1/6

Answers and discussions for yesterday's quiz on Poisson 🐟, Binomial 🐪, and I guess 🐘 distributions.

— Clément Canonne (@ccanonne_) May 22, 2020

Here's the plan: answers, discussion of what a "Poisson Binomial distribution" 🐡 is and what it's for, related references.

1/9 https://t.co/di7wk0I48I
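For concreteness: a Poisson Binomial r.v. is a sum of independent Bernoulli(pᵢ)'s with possibly different pᵢ's (taking all pᵢ equal recovers the Binomial). Its exact PMF can be computed by convolving in one coin at a time; a minimal sketch:

```python
def poisson_binomial_pmf(ps):
    """Exact PMF of X = sum of independent Bernoulli(p_i)'s -- the
    'Poisson Binomial' distribution -- by convolving one coin at a time."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)   # this coin lands tails
            new[k + 1] += mass * p     # this coin lands heads
        pmf = new
    return pmf

pmf = poisson_binomial_pmf([0.1, 0.2, 0.3])
total = sum(pmf)                               # should be 1
mean = sum(k * m for k, m in enumerate(pmf))   # should be 0.1+0.2+0.3 = 0.6
```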

📊 It's time for a (short) installment of our weakly quiz. Setup: you are a *deterministic* algorithm 🤖 (sorry). You're given i.i.d. draws from some discrete distribution p over [n]={1,2,...,n}, *close-ish* to uniform. You want to output i.i.d. draws *closer* to uniform.

— Clément Canonne (@ccanonne_) May 14, 2020

1/4

📊 Answers for yesterday's quiz: "if you're given i.i.d. samples from something close* to uniform on n elements, can you deterministically combine them to get something more uniform**? ↴

— Clément Canonne (@ccanonne_) May 15, 2020

* within TV distance ε

** within TV distance ε'

1/9 https://t.co/LYNwhDFNzS

📊 Here is a *semi non-technical* weekly quiz: a two-parter, first with an opinion poll, then a one-question quiz.

— Clément Canonne (@ccanonne_) April 30, 2020

Q1: What would you like these quizzes (and/or this account) to focus on more in the future? 🙋

1/3

Answers and discussions for yesterday's poll+quiz: thread below. ↴

— Clément Canonne (@ccanonne_) May 1, 2020

First, the poll: random stuff, and probability distribution-related things win the day. I won't promise puns will disappear, though. As a TCS person, I have promise problems.

1/6 https://t.co/iAMBxcjHbo

📊It's Thursday, time for a we[ae]kly quiz! Let's talk about juntas. The non-military kind.

— Clément Canonne (@ccanonne_) April 23, 2020

(As usual, questions below, answers tomorrow)

What is a junta? Basically, a very misleading function. For simplicity, think of f:{0,1}ⁿ→{0,1} (the def generalizes to f:Σⁿ→ℝ)...

1/7
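For small n, one can find the variables a function actually depends on by brute force: coordinate i is relevant iff flipping it changes the output on some input. A toy sketch (the example function is my own):

```python
from itertools import product

def relevant_variables(f, n):
    """Variables a Boolean function f: {0,1}^n -> {0,1} depends on:
    i is relevant iff flipping coordinate i changes f(x) for some input x.
    Brute force over all 2^n inputs."""
    relevant = set()
    for x in product([0, 1], repeat=n):
        for i in range(n):
            y = list(x)
            y[i] ^= 1  # flip coordinate i
            if f(tuple(y)) != f(x):
                relevant.add(i)
    return relevant

# A 2-junta on 5 variables: only coordinates 0 and 3 matter.
f = lambda x: x[0] ^ x[3]
rel = relevant_variables(f, 5)  # {0, 3}
```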

⌛Time for the answers to yesterday's quiz on junta functions*! See thread below. ↴

— Clément Canonne (@ccanonne_) April 24, 2020

* "Technically n variables, morally k ≪ n"

1/12 https://t.co/x4QISb3Kyn

Weakly weekly quiz, new installment! I'm assuming everyone is very busy with either the #FOCS2020 deadline, the #ICML2020 reviews, or the current global health crisis and juggling 5 toddlers & 7 Zoom online classes, so I'll keep it short.

— Clément Canonne (@ccanonne_) April 9, 2020

Adaptivity 🗘 and testing 🔎.

1/7

📚 Here are (a day late) the answers to this week's quiz on adaptivity 🗘 in property testing 🥚. A thread ↴

— Clément Canonne (@ccanonne_) April 11, 2020

1/10 https://t.co/1pcUXJ05Dt

🗒️ Today's quiz will be short, and focus on a very useful primitive called "hypothesis selection."

— Clément Canonne (@ccanonne_) March 26, 2020

Say you've got, as usual, data points coming from some unknown probability distribution p over some domain 𝓧. And you have a list of possible 'models' q₁, q₂,..., qₙ.

1/6

Answers and discussions for yesterday's quiz on hypothesis selection: a thread.

— Clément Canonne (@ccanonne_) March 27, 2020

The problem: you have data from some unknown probability distribution p, a list of n hypotheses q₁,..., qₙ. Assuming one of the qₜ's is good, find one 'not too bad.'

1/12 https://t.co/09lOKAy0Cl
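The standard building block here is the pairwise Scheffé test: to compare two hypotheses q₁, q₂ against samples from p, look at the set A where q₁ puts more mass than q₂, and see whose predicted mass of A better matches the empirical one. A minimal sketch for discrete distributions (variable names are mine):

```python
import random

def scheffe_test(samples, q1, q2):
    """Pairwise Scheffé test between two discrete hypotheses q1, q2
    (dicts mapping outcome -> probability). Let A = {x : q1(x) > q2(x)};
    return 1 or 2 according to which hypothesis better matches the
    empirical mass of A."""
    A = {x for x in q1 if q1.get(x, 0) > q2.get(x, 0)}
    emp = sum(1 for s in samples if s in A) / len(samples)
    m1 = sum(q1[x] for x in A)
    m2 = sum(q2.get(x, 0) for x in A)
    return 1 if abs(m1 - emp) <= abs(m2 - emp) else 2

random.seed(3)
q1 = {0: 0.5, 1: 0.3, 2: 0.2}
q2 = {0: 0.2, 1: 0.3, 2: 0.5}
# Truth is q1; with enough samples the test should say so.
samples = [random.choices([0, 1, 2], weights=[0.5, 0.3, 0.2])[0]
           for _ in range(5000)]
winner = scheffe_test(samples, q1, q2)  # 1
```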

📊 The weakly weekly quiz is back: in today's edition, "learning and testing, how are they related?" Specifically, we're going to look at functions f: {0,1}ⁿ→{0,1} and probability distributions p on a domain [k] of size k.

— Clément Canonne (@ccanonne_) March 19, 2020

Hope you find that enjoyable. 1/11

Answers and discussions for yesterday's quiz on learning v. testing Boolean functions and probability distributions: a thread. 🧵 https://t.co/dG0N5anxMW

— Clément Canonne (@ccanonne_) March 20, 2020

1/16

Quiz of the week: I'm slightly jetlagged, so it'll be a bit short. Let's talk about 📉*monotone* 📈 probability distributions (univariate, discrete).

— Clément Canonne (@ccanonne_) March 5, 2020

That is, you have a distribution p over [k] = {1,2,...,k}, and you know its probability mass function is (wlog) non-increasing.

1/6

Answers and discussion for yesterday's polls on inference for (discrete) monotone distributions 📈📉 below. I hope you won't find it too... monotonous↴

— Clément Canonne (@ccanonne_) March 7, 2020

1/16 https://t.co/Tv1AdqRh7e

📊The past two weeks, we talked about learning (density estimation) of discrete distributions. Now, let's look at the flip side, and see a bit of what happens w/ *testing* (goodness-of-fit, hypothesis testing,...).

— Clément Canonne (@ccanonne_) February 27, 2020

Throughout, our distance measure will be total variation (TV).

1/8

Answers and discussion for yesterday's polls on testing 📊🐘 below. Some things, I'd say, are *far* from obvious.↴ https://t.co/gueiqlGBYA

— Clément Canonne (@ccanonne_) February 28, 2020

1/14

So last week was density estimation (distribution learning) of distributions 📊 over a domain of size k in total variation, Hellinger, and KL distances. Now you may wonder: "What about the rest?"

— Clément Canonne (@ccanonne_) February 20, 2020

Let's see some of the rest: ℓ₂, ℓ_∞, and Kolmogorov.

First: "Kolmogorov"?

1/9

Answers and discussion for yesterday's poll on "density estimation in other distances, such as Kolmogorov" below. The answer MAY surprise you (with probability ≥ 0.428). ↴

— Clément Canonne (@ccanonne_) February 21, 2020

1/9 https://t.co/6jo6kAhKtT

📊 Learning discrete distributions over a finite domain, a short thread.↴

— Clément Canonne (@ccanonne_) February 15, 2020

Given n i.i.d. samples from some unknown p over a domain of size k and parameters ε,δ, you want to output some hypothesis q s.t.

d(p,q) < ε

with probability at least 1-δ. How large must n be?

1/8
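For total variation distance, the well-known answer is n = Θ((k + log(1/δ))/ε²), achieved by the empirical distribution itself. A small sanity-check simulation (uniform p is my arbitrary example):

```python
import random

random.seed(4)
k = 10
p = [1.0 / k] * k           # true distribution (uniform, as an example)
eps = 0.1
n = int(10 * k / eps**2)    # ~ k/eps^2 samples, the classical rate for TV

counts = [0] * k
for _ in range(n):
    counts[random.randrange(k)] += 1   # draw a sample from p (uniform here)
q = [c / n for c in counts]            # the empirical distribution

# TV distance between truth and the empirical estimate
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))
```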

📊 So, the results... Learning discrete distributions over a finite domain of size k to distance ε, with probability 1-δ: how hard can it be?

— Clément Canonne (@ccanonne_) February 16, 2020

1/9 https://t.co/AANZoxkm95

Test your intuition (or, in my case, lack thereof): hypothesis testing in high dimensions. You are given n i.i.d. samples from an unknown multivariate Gaussian p = N(μ,Σ) in ℝ^d. You want to know if p is *the* standard Gaussian N(0,I). 1/8

— Clément Canonne (@ccanonne_) October 27, 2019

So, the answers:

— Clément Canonne (@ccanonne_) October 28, 2019

- Q1 and Q2 both have the same answer: √d/ε².

The fact that it's the same comes from the discussion on the equivalence between TV distance to N(0,I) and the ℓ₂ norm of the mean.

Learning the mean to ε in ℓ₂ would be d/ε². But estimating ||μ||₂ is quadratically cheaper! 1/8
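One way to see why estimating ||μ||₂² can be cheap: split the samples into two halves and take the inner product of the two (independent) sample means, which is an unbiased estimator of ||μ||₂². A simulation sketch (the dimensions and sample sizes below are my arbitrary choices):

```python
import random

random.seed(5)
d, n = 50, 2000

def sample_mean(mu, m):
    """Coordinate-wise mean of m draws from N(mu, I_d)."""
    tot = [0.0] * d
    for _ in range(m):
        for j in range(d):
            tot[j] += random.gauss(mu[j], 1.0)
    return [t / m for t in tot]

def norm_sq_estimate(mu):
    """Unbiased estimator of ||mu||^2: split the n samples into two halves
    and take the inner product of the two independent sample means."""
    a = sample_mean(mu, n // 2)
    b = sample_mean(mu, n // 2)
    return sum(x * y for x, y in zip(a, b))

est_null = norm_sq_estimate([0.0] * d)   # truth: ||mu||^2 = 0
mu = [0.5 / d**0.5] * d                  # truth: ||mu||^2 = 0.25
est_alt = norm_sq_estimate(mu)
```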