📊 Guess what: it's Thursday! Time for a (weakly) weekly quiz. This one may remind some of you of your randomized algorithms class🎲
— Clément Canonne (@ccanonne_) January 14, 2021
"What to expect when we just do random stuff?"
1/ pic.twitter.com/p3ZWJOuLmL
📊 Answers and discussions for this week's quiz on "doing random stuff and expecting things"
— Clément Canonne (@ccanonne_) January 17, 2021
The upshot: random choices 🎲 can lead to surprising insights, even if you don't care for randomized algorithms! (Now, combinatorists may argue that...
1/ https://t.co/7W46AcOuge
📊 It's Thursday, and a lot has been happening in (some parts of) the world... a lot to (un)pack or cover, so let's do none of that: here is a weekly (albeit weakly) quiz instead.
— Clément Canonne (@ccanonne_) January 7, 2021
Covering, packing, and ε-nets. Let's have a ball!
1/9 pic.twitter.com/cSahgJMsGN
📊 Answers and discussion for this week's quiz: on covering, packing, and the importance of norms.
— Clément Canonne (@ccanonne_) January 9, 2021
Below, unless specified otherwise (X, ‖.‖) is a normed space (metric space suffices up to Q4), Θ⊆X, and ε>0. Think of X=ℝⁿ for n≫1 for "intuition."
1/ https://t.co/qjW45MC6rD
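To make the covering/packing duality concrete, here is a minimal Python sketch (my addition, not from the thread) of the greedy construction on a finite point set in ℝ² under the Euclidean norm: its centers are simultaneously an ε-cover and an ε-packing.

```python
import math, random

def greedy_eps_net(points, eps):
    """Greedy ε-net of a finite point set under the Euclidean norm:
    scan the points, keeping any point not within ε of a kept center.
    The centers form an ε-cover (every point is within ε of one) AND an
    ε-packing (they are pairwise more than ε apart) — the two sides of
    the covering/packing duality."""
    centers = []
    for p in points:
        if all(math.dist(p, c) > eps for c in centers):
            centers.append(p)
    return centers

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(500)]
net = greedy_eps_net(pts, 0.2)
covered = all(any(math.dist(p, c) <= 0.2 for c in net) for p in pts)
packed = all(math.dist(a, b) > 0.2
             for i, a in enumerate(net) for b in net[i + 1:])
```

Each kept center was uncovered when picked, which is why packing comes for free.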
📊 It's Thursday, time for a weₐᵉkly quiz! A two-parter, ft. Quantum Santa: merry Christmas, wherever you are!
— Clément Canonne (@ccanonne_) December 24, 2020
Obviously, there are many children on Earth, namely n>>1. Some have been naughty, some have been nice: Christmas Alice 🤶 and Christmas Bob 🎅 are keeping track.
1/ pic.twitter.com/pPHKSBMCZX
📊 Answers and discussions for this week's quiz on the communication complexity of |naughty∩nice|! 🤶
— Clément Canonne (@ccanonne_) December 26, 2020
I hope you had, whether you celebrate Christmas or not, a relaxing break with family or friends, and wish you more of those times to come.
1/8 https://t.co/CJw5GCPuBr
📊 It is, if I am not mistaken, Thursday (in several parts of the world at least): time for a (weakly) weekly quiz!
— Clément Canonne (@ccanonne_) December 17, 2020
It's been a hard year, month, and week. Let's figure out what *else* is (computationally) really hard.
1/9 pic.twitter.com/sihNkomUWR
📊 Answers and discussion for yesterday's quiz on NP-Hardness of various things (or lack thereof).
— Clément Canonne (@ccanonne_) December 18, 2020
Since I forgot to mention it yesterday, all graphs considered were undirected. Hopefully that didn't mislead anyone...
1/ https://t.co/J9n6tcw5DF
📊 It's Thursday, time for our weekly (but weakly so) quiz! This week: integers, and remarkable sequences.
— Clément Canonne (@ccanonne_) December 3, 2020
See this as an extended ad for the OEIS, if you will. Also, if you don't know the OEIS: go check it out (and come back)! https://t.co/CFzSxQu4Wy
1/ pic.twitter.com/QB8tpCA8pC
📊 Answers and discussion for yesterday's quiz on sequences (and their names).
— Clément Canonne (@ccanonne_) December 4, 2020
Before we go to the first question, again: if you haven't, bookmark https://t.co/DVdLWdI3UT . It's useful: the solution to some of your problems may be recorded there!
1/ https://t.co/gN1xUHjmZb
📊 It's Thursday, it's Thanksgiving, and it's time for a (weakly) weekly quiz!
— Clément Canonne (@ccanonne_) November 26, 2020
Let's play a game. I'll write sentences: some of them refer to actual results (from math or computer science). Some of them are lyrics, random quotes (or nothing). Can you find which is which?
1/8
📊 Answers and discussion for yesterday's Thanksquizzing 🦃: "theorem or random string of words?"
— Clément Canonne (@ccanonne_) November 27, 2020
tl;dr: theorem, not theorem, theorem, theorem, not a theorem, theorem.
1/ https://t.co/mU516LZIt1
📊 Thursday morning:* time for a weekly quiz! Let's talk about... POT!
— Clément Canonne (@ccanonne_) November 12, 2020
Testing of functions, graphs, etc. (in all generality) says you have an input thing f: X → Y, want to test whether f has some property 𝒫 (i.e., f∊𝒫) or is "far" from it (d(f,g)>ε for all g∊𝒫). How?
1/7 pic.twitter.com/MW04TnR027
📊 Answers and discussions for Thursday's quiz! Let's look at the answers for those distinctions between (property) testing algorithms.
— Clément Canonne (@ccanonne_) November 14, 2020
Quick recap: given a 'property' 𝒫 (subset of functions) and some access to some function f, accept if f∊𝒫.
1/12 https://t.co/8UmRqPGYAe
📊 It's Thursday, and with it another weekly quiz! Since there is a lot going on these days, I'll keep it very short.
— Clément Canonne (@ccanonne_) November 5, 2020
Today: Fibonacci numbers, and adding a little random kick to them. (Also, here's a cute animal if you need a break from the news.)
1/5 pic.twitter.com/DOuCtmeAAZ
📊 Answers and discussion for yesterday's short quiz on the Fibonacci sequence and a random variant 🎲.
— Clément Canonne (@ccanonne_) November 6, 2020
Let's first recall what the Fibonacci numbers are: we won't go through the whole "rabbits are breeding" thing, but it goes as F(n)=F(n-1)+F(n-2)...
1/ https://t.co/Ofybr88jO7
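As a companion to the recap, a small sketch of both recurrences; the "random kick" shown (a fair ± sign at each step, i.e., Viswanath's random Fibonacci sequence) is my assumption about the variant meant:

```python
import random

def fib(n):
    """Deterministic Fibonacci: F(0)=0, F(1)=1, F(n)=F(n-1)+F(n-2)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def random_fib(n, rng):
    """One natural random kick (an assumption on my part about the
    variant meant): F(n) = F(n-1) ± F(n-2), the sign a fair coin flip.
    Viswanath: |F(n)| still grows like c^n almost surely, c ≈ 1.13198824."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, b + rng.choice((-1, 1)) * a
    return b

rng = random.Random(42)
sample = [abs(random_fib(200, rng)) for _ in range(20)]
```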
📊 Thursday! Time for an instance of our weekly (but weakly so) quiz.
— Clément Canonne (@ccanonne_) October 29, 2020
Let's talk about prime numbers! You know, those things really useful in number theory, Amazon orders, and cryptography.
So you want to find a prime, and it needs to be n-bit long, and it can't be 57.
1/ pic.twitter.com/cish0qZRBS
📊Answers and discussion for yesterday's quiz on prime numbers! The gist of the quiz was:
— Clément Canonne (@ccanonne_) October 30, 2020
"Given n, can we efficiently find an n-bit prime number? Can we do that deterministically? In a randomized way? What about 'pseudo-deterministically'?"
1/ https://t.co/ie8cmGuDyl
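For concreteness, a hedged sketch of the standard randomized recipe — rejection sampling plus Miller–Rabin — not necessarily the exact algorithms discussed in the thread:

```python
import random

def is_probable_prime(m, rng, rounds=40):
    """Miller–Rabin: randomized primality test, error prob. ≤ 4^(-rounds)
    on composites, never wrong on actual primes."""
    if m < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = rng.randrange(2, m - 1)
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, m)
            if x == m - 1:
                break
        else:
            return False
    return True

def random_nbit_prime(n, rng):
    """Rejection sampling: a random odd n-bit number is prime with
    probability ≈ 2/(n ln 2) (prime number theorem), so O(n) attempts
    suffice in expectation."""
    while True:
        m = rng.randrange(2 ** (n - 1), 2 ** n) | 1  # force n bits, odd
        if is_probable_prime(m, rng):
            return m

rng = random.Random(57)   # 57: famously not prime
p = random_nbit_prime(128, rng)
```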
📊 Sorry for the delay—here is today's (weakly) weekly quiz. It'll be ≈ brief: our goal is to have a superficial look at some of the (mathematical) inequalities we face on a daily basis.
— Clément Canonne (@ccanonne_) October 22, 2020
Some will be true, some false, and some will look like they could go both ways.
1/8 pic.twitter.com/qMNcBIZ8h8
📊 Answers and discussion for this week's quiz on inequalities between norms.
— Clément Canonne (@ccanonne_) October 24, 2020
First, let's recap notation. As noted by some, one could unify the notation by seeing both as Lp norms under a suitable (e.g., counting) gen'l measure, (∫|x|ᵖω(dx))^{1/p}.
1/ https://t.co/ci5Cuxp80n pic.twitter.com/bWtrEll4Y8
📊 It's that day of the week: time for a we(a|e)kly quiz! Today's quiz will be loosely based on a problem from a learning theory course I took back in the days, when dinosaurs roamed in Manhattan; and involves balloons🎈.
— Clément Canonne (@ccanonne_) October 8, 2020
Specifically: when do they pop?
1/9 pic.twitter.com/Xd9TxZYUgW
📊 Answers and discussion for yesterday's "pop quiz" on balloons🎈, and #learning .
— Clément Canonne (@ccanonne_) October 9, 2020
Recall the setup: you have a very small number of state-of-the-art 🎈. They pop at some unknown height N* (in "footmeters"), an integer between 1 and N. Your goal is to learn (exactly) this N*.
1/ https://t.co/p9PrALexWl
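Here is a sketch of the classic two-balloon strategy, under the assumed model that a balloon pops iff dropped from height h ≥ N*: jump in steps of √N with the first balloon, then scan linearly with the second — O(√N) drops, vs. log₂N with unlimited balloons and N−1 with a single one.

```python
import math

def find_pop_height(N, n_star):
    """Learn the popping height N* ∈ {1,...,N} with only TWO balloons,
    assuming a balloon pops iff dropped from height h ≥ N*.
    Balloon #1 jumps in steps of ~√N to locate the block containing N*;
    balloon #2 scans that block linearly. O(√N) drops total."""
    step = max(1, math.isqrt(N))
    drops = 0
    h = step
    while True:                          # balloon #1: jump search
        drops += 1
        if h >= n_star:                  # pop! N* lies in (h - step, h]
            break
        h = min(h + step, N)
    for g in range(h - step + 1, h + 1):  # balloon #2: linear scan
        drops += 1
        if g >= n_star:                  # pop: first popping height is N*
            return g, drops

results = [find_pop_height(400, t) for t in range(1, 401)]
```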
📊 It's that time of the week: Thursday, unless I've been fooled again. Time for a quiz! Today, we'll deal with something quite "basic:" sorted numbers, and how to check that.
— Clément Canonne (@ccanonne_) October 1, 2020
As people said to Louis XVI: "let's keep it short."
1/11 pic.twitter.com/rmCvaAxIWT
📊 Answers and discussions for yesterday's quiz on sorting. Clearly what's been on everyone's minds since yesterday evening!
— Clément Canonne (@ccanonne_) October 2, 2020
To recap: you have an array of n numbers (integers), and want to end up with a sorted array, and, frankly, don't we all.
1/19 https://t.co/RgXVRzW6PP
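Since the quiz was also about *checking* sortedness, here's a sketch of the classic sublinear spot-checker (in the spirit of Ergün et al.); the distinct-values assumption is mine, to keep it short:

```python
import random

def looks_sorted(a, eps=0.25, rng=None):
    """Sublinear monotonicity spot-checker: pick a random index i and
    binary-search the array for the value a[i]; in a sorted array with
    DISTINCT entries (assumption; ties can be broken by index) the search
    must end exactly at position i. If a is ε-far from sorted, each trial
    rejects with prob. ≥ ε, so O(1/ε) trials of O(log n) probes suffice.
    One-sided error: sorted inputs are always accepted."""
    rng = rng or random.Random()
    n = len(a)
    for _ in range(int(4 / eps)):
        i = rng.randrange(n)
        lo, hi = 0, n - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == a[i]:
                break
            elif a[mid] < a[i]:
                lo = mid + 1
            else:
                hi = mid - 1
        else:
            return False   # a[i] not on its own search path: inconsistency
        if mid != i:
            return False
    return True
```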
📊 It's Thursday, allegedly: time for our wækly quiz. So, completely unrelated to anything, this one's going to be about influence, majority, tribes, noise, and dictators.
— Clément Canonne (@ccanonne_) September 24, 2020
(I didn't get "chaos" in, but I'm keeping that option open for the future. #gaussianchaos )
1/11 pic.twitter.com/eeRtkIpNmr
📊 Answers and discussion for yesterday's quiz on voting rules (a.k.a. Boolean functions for social choice).↴
— Clément Canonne (@ccanonne_) September 26, 2020
Disclaimer: for a much better and comprehensive treatment, @BooleanAnalysis 's book is the place to look ( https://t.co/f9OGjFKAVu ).
1/ https://t.co/jy0JzLWwiq
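To ground the vocabulary, a tiny brute-force computation of influences — how often a voter is pivotal — for majority and a dictatorship (my illustration, not from the thread):

```python
from itertools import product
from fractions import Fraction

def influence(f, n, i):
    """Inf_i(f) = Pr_x[f(x) ≠ f(x^{⊕i})], with x uniform on {-1,1}ⁿ and
    x^{⊕i} equal to x with coordinate i flipped: how often voter i is
    pivotal. Computed exactly by enumeration (fine for small n)."""
    flips = 0
    for x in product((-1, 1), repeat=n):
        y = list(x)
        y[i] = -y[i]
        flips += f(x) != f(tuple(y))
    return Fraction(flips, 2 ** n)

maj = lambda x: 1 if sum(x) > 0 else -1    # Majority (odd n)
dictator = lambda x: x[0]                  # Dictatorship of voter 0

inf_maj5 = [influence(maj, 5, i) for i in range(5)]
inf_dict = [influence(dictator, 5, i) for i in range(5)]
```

For Maj₅ each influence is C(4,2)/2⁴ = 3/8; the dictator has influence 1 and everyone else 0.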
📊 It's (weakly) Thursday: time for a weekly quiz. For this one let's be merry, and play games—repeatedly!
— Clément Canonne (@ccanonne_) September 10, 2020
But, what's a game? You have 2 players, Alice 👩🏻💻 and Bob 👨🏾💻, and a game master, say Guillaume🧔🏼. Guillaume draws (x,y), gives (separately!) x to Alice and y to Bob...
1/9 pic.twitter.com/NMYxoRelJ5
📊 Answers and discussion for yesterday's quiz on games. Technically, "repeated games and direct product (parallel repetition) theorems."
— Clément Canonne (@ccanonne_) September 11, 2020
Or, as one may put it: if I keep playing, how likely is it that I'll keep winning?🤷
1/15 https://t.co/TvjcHG52Gf
📊 It may not be Thursday everywhere yet, but let's have our weakly(weekly(quiz))!
— Clément Canonne (@ccanonne_) September 3, 2020
Today: no general theme. Just things that may be true, or false, and surprisingly so―or not. Beware of the traps.
1/7 pic.twitter.com/VL50ZjMgS8
📊Answers and discussion for yesterday's quiz on "Things I Probably Would Have Gotten Wrong (Did You?)"
— Clément Canonne (@ccanonne_) September 4, 2020
Ft. Complexity theory (Q1), Probability (Q2), Measure theory (Q4), Planar graphs and pandas 🐼 (Q5), and Linear algebra (Q3)... in that order.
1/13 https://t.co/AG9nyiCw5F
📊 It's Thursday for many, time for this week's quiz! As alluded earlier, we're going to flip coins. Like, a lot.
— Clément Canonne (@ccanonne_) August 27, 2020
Today: you have a biased coin which lands heads with some probability p∊(0,1), and you can toss it as many times as you want. You don't know p, though...
1/10 pic.twitter.com/YCYEzSC5zJ
📊 Answers and discussion for yesterday's quiz on coin flipping (a.k.a. the power of small change?)
— Clément Canonne (@ccanonne_) August 28, 2020
Reminder: given a fixed function f, we want a procedure which, given independent coin flips from a coin with unknown probability of heads p ("bias")...
1/14 https://t.co/GThwTq3VPB
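As a baseline for this setting, a sketch of the simplest estimator of the bias p with the Hoeffding-based sample size (the quiz's actual questions go beyond plain estimation):

```python
import math, random

def estimate_bias(coin, eps, delta, rng):
    """Estimate the unknown heads-probability p of a coin.
    Hoeffding's inequality: n = ceil(ln(2/δ) / (2ε²)) tosses guarantee
    |p̂ - p| ≤ ε with probability ≥ 1 - δ, for ANY p ∈ (0,1)."""
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    heads = sum(coin(rng) for _ in range(n))
    return heads / n, n

rng = random.Random(2021)
p = 0.3                                    # unknown to the algorithm
coin = lambda r: 1 if r.random() < p else 0
p_hat, n = estimate_bias(coin, eps=0.05, delta=0.01, rng=rng)
```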
📊 It's Thursday, time for a we•kly quiz! As promised last week, today will be about Gaussians, CLTs, Berry—Esseen, and a little bit of Invariance Principle.
— Clément Canonne (@ccanonne_) August 20, 2020
Let's start. You have n i.i.d. real-valued r.v.s X₁,...,Xₙ, with mean zero. You sum them, and hope for the best.
1/10 pic.twitter.com/vWuIrcxO62
📊 Answers and discussion for yesterday's quiz on CLTs, Gaussians, and a bit of Invariance Principle.
— Clément Canonne (@ccanonne_) August 21, 2020
Let's start: Q1 on the Central Limit Theorem was a trap 🧐!
1/16 https://t.co/AqzMX6YdgI
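A quick Monte Carlo illustration of the CLT for Rademacher sums (my illustration, parameters arbitrary); Berry–Esseen says the CDF error here is O(1/√n):

```python
import math, random

rng = random.Random(7)
n, trials = 400, 5000
# Rademacher summands: mean 0, variance 1, third moment ρ = E|X|³ = 1.
# CLT: S_n/√n ⇒ N(0,1); Berry–Esseen quantifies the convergence: the
# uniform CDF error is at most C·ρ/(σ³√n), i.e. O(1/√n) here.
def z():
    return sum(rng.choice((-1, 1)) for _ in range(n)) / math.sqrt(n)

zs = [z() for _ in range(trials)]
mean = sum(t for t in zs) / trials
var = sum(t * t for t in zs) / trials
within1 = sum(abs(t) <= 1 for t in zs) / trials   # Gaussian: ≈ 0.6827
```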
📊 It's Thursday [reference needed], time for a weᵄekly quiz! Given some recent events about randomized polynomial time (RP), it feels like a short discussion about 🎲 is topical.
— Clément Canonne (@ccanonne_) August 6, 2020
So, let's go: Las Vegas, Monte Carlo, and... Bellagio (?) algorithms!
1/8 pic.twitter.com/DYFqOeMG8q
📊Thread: answers and discussions for yesterday's quiz on randomized complexity classes (RP, ZPP, BPP).
— Clément Canonne (@ccanonne_) August 7, 2020
Of course, I am barely scratching the surface. I find randomness in computation fascinating, and hope you will too. https://t.co/DQNWorXxa0
1/15 pic.twitter.com/CIFcqaqaF8
📊 Today, (weakly|weekly) quiz will be about graphs. The ones with nodes and edges, not those plotting against you.
— Clément Canonne (@ccanonne_) July 30, 2020
Specifically, *planar* graphs. The nice ones.
1/7
📊 Answers and discussion for yesterday's quiz on planar graphs. Recall: a graph G=(V,E) on n=|V| vertices and m=|E| edges is planar if you can "embed it in the plane" (draw it without edge crossings). 🕸️
— Clément Canonne (@ccanonne_) July 31, 2020
Why should we care, you ask?
1/10 https://t.co/yGrMgZ9ndB
📊 It's Thursday: time for our wÆkly quiz! I'll build a little bit on last week's statistics/algorithms thread (see below for a recap), and ask you two 2️⃣ questions.
— Clément Canonne (@ccanonne_) July 23, 2020
General topic: testing distributions, "leaky" measurements, and adaptivity. 🔍
1/6 https://t.co/YhEpCn4yov pic.twitter.com/aj5mYWKqkp
📊 Answers and discussions for yesterday's we(a)ekly quiz on "testing whether data is uniformly distributed, given some leaky query access to the samples." #statistics #quiz
— Clément Canonne (@ccanonne_) July 24, 2020
To paraphrase Tolstoy when starting "War and Peace": I'll be brief. https://t.co/NMK2GJ2DqF
1/10
📊 Two days ago, I asked a question. Way more people answered than expected, and... well, this week's weₐekly #quiz will be slightly ≠: a long thread on uniformity testing, trickling down all day long.
— Clément Canonne (@ccanonne_) July 16, 2020
Be careful what you wish for :) #statistics
1/n https://t.co/3ECldvbekE
📊 A short discussion about last week's thread on uniformity testing: I wrote things down!
— Clément Canonne (@ccanonne_) July 20, 2020
📝 https://t.co/jLEIKSxMIi (LaTeX)
📄 https://t.co/Uqq0yjWrwp (PDF)
Comments welcome.
1/2 https://t.co/6jqSvwoabC pic.twitter.com/1baBnVWWVl
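As a companion to the write-up, a sketch of one classical approach, the collision-based uniformity tester (parameters arbitrary, and this may differ from the tester analyzed in the notes):

```python
import random
from collections import Counter

def collision_stat(samples):
    """Fraction of colliding pairs among the samples: an unbiased
    estimator of ‖p‖₂² = ∑ₓ p(x)². The uniform distribution over [n]
    gives exactly 1/n, while being ε-far from uniform (in total
    variation) forces ‖p‖₂² ≥ (1 + 4ε²)/n — the gap the collision-based
    tester exploits."""
    m = len(samples)
    c = Counter(samples)
    pairs = sum(v * (v - 1) // 2 for v in c.values())
    return pairs / (m * (m - 1) / 2)

rng = random.Random(3)
n, m = 100, 2000
s_unif = collision_stat([rng.randrange(n) for _ in range(m)])
# Far from uniform: all mass on half the domain, so ‖p‖₂² = 2/n.
s_half = collision_stat([rng.randrange(n // 2) for _ in range(m)])
```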
📊 It's Thursday (somewhere), time for a weₐekly quiz! Today: #machinelearning ! Ish. Let's say... computational learning. Some of it. In theory.
— Clément Canonne (@ccanonne_) July 9, 2020
OK. VC dimension. You got me. Anyways, #COLT2020 is starting tomorrow, so... what does it mean to "learn," exactly?
1/10
📊 Answers and comments about yesterday's online quiz on learning.
— Clément Canonne (@ccanonne_) July 10, 2020
Not a quiz on online learning. Though there was an (online learning) component on that online (learning quiz), which maybe led to some learning online, so... I am confused now. https://t.co/2VO9AqXudR
1/16 pic.twitter.com/rNmo5tmwWM
📊 Wæekly quiz: "The answer may surprise you (or not)."
— Clément Canonne (@ccanonne_) July 2, 2020
Four results I find surprising (among many). Three of the statements below are true, one is false. Choose... wisely.
1/11 pic.twitter.com/lAV5UMTwRh
⁉️ Answers and discussion for yesterday's quiz: and yes, you are a Ravenclaw. (BuzzFeed has nothing on me.) https://t.co/csDmh8T4bk
— Clément Canonne (@ccanonne_) July 4, 2020
First I'd like to apologize for the original erroneous phrasing of Q1. If your answer was wrong, it was my fault.
[As usual, refs at the end]
1/19 pic.twitter.com/aODklwZQKB
📊 Weᵃₑkly quiz: linear threshold functions (a.k.a. perceptrons, a.k.a. halfspaces), and what makes them so unique.
— Clément Canonne (@ccanonne_) June 25, 2020
In other words, "the Chow Parameters."
Let's start with Boolean functions: i.e., functions of the form f: {-1,1}ⁿ→{-1,1}. There are 2^2^n of those.
1/9 pic.twitter.com/jJVRaJnyQh
🧑🏫 Answers and discussions for yesterday's quiz on Boolean functions, halfspaces (LTFs), and Chow parameters.
— Clément Canonne (@ccanonne_) June 26, 2020
Our focus here will be functions of the form f: {-1,1}ⁿ→{-1,1} (i.e., bits=±1)
[References for all results at the end of the thread.]
1/18 https://t.co/RYVrBItOrn pic.twitter.com/NZjzVdFJn1
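To make "Chow parameters" concrete, a tiny brute-force computation for majority and a dictatorship (my illustration):

```python
from itertools import product
from fractions import Fraction

def chow_parameters(f, n):
    """The n+1 Chow parameters of a Boolean f: {-1,1}ⁿ → {-1,1}: the
    degree-≤1 Fourier coefficients f̂(∅) = E[f(x)] and f̂(i) = E[f(x)·xᵢ]
    under uniform x. Chow's theorem: among ALL 2^2^n Boolean functions,
    these n+1 numbers uniquely determine a linear threshold function."""
    pts = list(product((-1, 1), repeat=n))
    N = len(pts)
    c0 = Fraction(sum(f(x) for x in pts), N)
    return [c0] + [Fraction(sum(f(x) * x[i] for x in pts), N)
                   for i in range(n)]

maj5 = lambda x: 1 if sum(x) > 0 else -1      # Majority on 5 inputs
chow_maj = chow_parameters(maj5, 5)
chow_dict = chow_parameters(lambda x: x[0], 5)   # a dictatorship
```

For Maj₅, each degree-1 coefficient equals E|∑xᵢ|/5 = 3/8 (and f̂(∅)=0 by symmetry).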
📊 Weakly weekly quiz: a short one—next week 's will be longer. "How do you prove you won't always fall short of your expectations?"
— Clément Canonne (@ccanonne_) June 18, 2020
(No, it's not about eating Cheerios on the sofa, or the last time I wore actual socks. It's about constant-probability statements.)
1/4 pic.twitter.com/fJN7fBYx8B
📊 Answers and discussions for yesterday's quiz: "minimal assumptions for X to be, with constant probability, no less than a constant fraction of 𝔼[X]?"
— Clément Canonne (@ccanonne_) June 19, 2020
E.g., if an algo finds something w/ large value in expectation, can it do so w/ probability 1/10?
1/7 https://t.co/wH3dh5qtLd
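The inequality in question is presumably Paley–Zygmund (my reading of the thread: it is the canonical "second-moment" answer here); a small exact check on a toy distribution:

```python
from fractions import Fraction

def paley_zygmund_gap(dist, theta):
    """Check the Paley–Zygmund inequality
        Pr[X ≥ θ·E[X]]  ≥  (1-θ)² · E[X]² / E[X²]
    for X ≥ 0 given as {value: probability}, in exact rational arithmetic.
    Returns (left-hand side, right-hand side)."""
    EX = sum(Fraction(v) * p for v, p in dist.items())
    EX2 = sum(Fraction(v) ** 2 * p for v, p in dist.items())
    lhs = sum(p for v, p in dist.items() if v >= theta * EX)
    rhs = (1 - theta) ** 2 * EX ** 2 / EX2
    return lhs, rhs

# X = 10 w.p. 1/10, else 0: E[X] = 1, E[X²] = 10. The second-moment
# method guarantees Pr[X ≥ E[X]/2] ≥ (1/2)²·(1/10) = 1/40; truth: 1/10.
half = Fraction(1, 2)
lhs, rhs = paley_zygmund_gap({0: Fraction(9, 10), 10: Fraction(1, 10)}, half)
```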
📊 Weakly-weekly quiz: this week, "how hard is it to learn when you keep forgetting things?"
— Clément Canonne (@ccanonne_) June 11, 2020
(This quiz is about learning functions. The question applies to history as well...)
You observe a stream of labelled samples (xᵢ,yᵢ) where yᵢ=f(xᵢ), and want to learn f...
1/9 pic.twitter.com/OsryHLJyH4
📊 Answers and discussion for yesterday's quiz on sample/memory tradeoffs for learning: a thread. ↴
— Clément Canonne (@ccanonne_) June 12, 2020
Overall question: "how hard does it become to learn a (linear) function when the algorithm can't keep too much information in memory?"
1/11 https://t.co/VXHILDT8Ww
📊 For this edition of the weekly quiz, let's look at random sums of random things, and their tails.
— Clément Canonne (@ccanonne_) June 4, 2020
Specifically, we have some i.i.d. r.v.'s X₁,X₂,..., and some integer-valued (possibly infinite) r.v. T. We look at S = X₁+X₂+...+X_T.
1/7
📊 Answers and discussion about yesterday's w(ee|ea)kly quiz on random sums: S = X₁+X₂+...+X_T, w/ Xₙ's independent and T itself an integer-valued random variable.
— Clément Canonne (@ccanonne_) June 5, 2020
Under mild assumptions on all those, what can we say about the tails of the sum S? https://t.co/uNnZUyfuTc
1/11 pic.twitter.com/ACoAo8m5pe
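A Monte Carlo sanity check of the cleanest statement in this setting, Wald's identity E[S] = E[T]·E[X₁] (my illustration; the quiz's tail bounds are finer statements):

```python
import random

def geometric(p, rng):
    """Number of i.i.d. trials until the first success; mean 1/p."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

# S = X₁ + ... + X_T, with T ~ Geom(1/4) independent of the Xᵢ ~ Unif{1..6}.
# Wald's identity: E[S] = E[T]·E[X₁] = 4 · 3.5 = 14 (finite means suffice
# when T is independent of the Xᵢ, or more generally a stopping time).
rng = random.Random(11)
trials = 100_000
total = 0
for _ in range(trials):
    T = geometric(0.25, rng)
    total += sum(rng.randint(1, 6) for _ in range(T))
avg = total / trials
```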
📊 Today's weₐekly quiz: a 3-series ∑ edition! Nothing too fancy, maybe a cute probability proof in disguise somewhere.
— Clément Canonne (@ccanonne_) May 28, 2020
Let's start: we all know—maybe love—the exponential function. For all k in ℝ, ∑ₙkⁿ/n! = eᵏ. Fair enough... but what if we perturb things a bit?
1/5
🧑🏫 Answers and discussions for yesterday's quiz on "series that kind of look like the exponential one."
— Clément Canonne (@ccanonne_) May 29, 2020
If you couldn't see properly the questions (non-utf8er's gonna non-utf8), I had screenshots here: https://t.co/65TqGvrZh1
1/8 https://t.co/e7Y1ChYOAI
📊 Here we go again: we{e,a}kly quiz. Let's have a look at two of our basic* friends, the Poisson 🐟 and Binomial 🐪 distributions. And what happens when they meet.
— Clément Canonne (@ccanonne_) May 21, 2020
Just so we're all on the same page: X~Binom(n,p) if X=∑ₖ Xₖ (n terms)...
*"I'm not basic. THEY're basic."
1/6
Answers and discussions for yesterday's quiz on Poisson 🐟, Binomial 🐪, and I guess 🐘 distributions.
— Clément Canonne (@ccanonne_) May 22, 2020
Here's the plan: answers, discussion of what a "Poisson Binomial distribution" 🐡 is and what it's for, related references.
1/9 https://t.co/di7wk0I48I
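And here is what the meeting produces, concretely: a sketch computing the exact pmf of a Poisson Binomial distribution by dynamic programming (my illustration):

```python
from fractions import Fraction

def poisson_binomial_pmf(ps):
    """Exact pmf of X = ∑ₖ Xₖ with the Xₖ ~ Bern(pₖ) independent but NOT
    identically distributed — the "Poisson Binomial" distribution.
    Dynamic programming, convolving in one coin at a time: O(n²) time."""
    pmf = [Fraction(1)]
    for p in map(Fraction, ps):
        new = [Fraction(0)] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # coin k+1 comes up tails
            new[k + 1] += q * p        # coin k+1 comes up heads
        pmf = new
    return pmf

ps = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)]
pmf = poisson_binomial_pmf(ps)
mean = sum(k * q for k, q in enumerate(pmf))   # should equal ∑ₖ pₖ
```

Equal pₖ's recover the Binomial as a special case.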
📊 It's time for a (short) installment of our weakly quiz. Setup: you are a *deterministic* algorithm 🤖 (sorry). You're given i.i.d. draws from some discrete distribution p over [n]={1,2..,n}, *close-ish* to uniform. You want to output i.i.d. draws *closer* to uniform.
— Clément Canonne (@ccanonne_) May 14, 2020
1/4
📊 Answers for yesterday's quiz: "if you're given i.i.d. samples from something close* to uniform on n elements, can you deterministically combine them to get something more uniform**?" ↴
— Clément Canonne (@ccanonne_) May 15, 2020
* within TV distance ε
** within TV distance ε'
1/9 https://t.co/LYNwhDFNzS
📊 Here is a *semi non-technical* weekly quiz: a two-parter, first with an opinion poll, then a one-question quiz.
— Clément Canonne (@ccanonne_) April 30, 2020
Q1: What would you like these quizzes (and/or this account) to focus on more in the future? 🙋
1/3
Answers and discussions for yesterday's poll+quiz: thread below. ↴
— Clément Canonne (@ccanonne_) May 1, 2020
First, the poll: random stuff, and probability distribution-related things win the day. I won't promise puns will disappear, though. As a TCS person, I have promise problems.
1/6 https://t.co/iAMBxcjHbo
📊It's Thursday, time for a we[ae]kly quiz! Let's talk about juntas. The non-military kind.
— Clément Canonne (@ccanonne_) April 23, 2020
(As usual, questions below, answers tomorrow)
What is a junta? Basically, a very misleading function. For simplicity, think of f:{0,1}ⁿ→{0,1} (the def generalizes to f:Σⁿ→ℝ)...
1/7
⌛Time for the answers to yesterday's quiz on junta functions*! See thread below. ↴
— Clément Canonne (@ccanonne_) April 24, 2020
* "Technically n variables, morally k ≪ n"
1/12 https://t.co/x4QISb3Kyn
Weakly weekly quiz, new installment! I'm assuming everyone is very busy with either the #FOCS2020 deadline, the #ICML2020 reviews, or the current global health crisis and juggling with 5 toddlers & 7 Zoom online classes, so I'll keep it short.
— Clément Canonne (@ccanonne_) April 9, 2020
Adaptivity 🗘 and testing 🔎.
1/7
📚 Here are (a day late) the answers to this week's quiz on adaptivity 🗘 in property testing 🥚. A thread ↴
— Clément Canonne (@ccanonne_) April 11, 2020
1/10 https://t.co/1pcUXJ05Dt
🗒️ Today's quiz will be short, and focus on a very useful primitive called "hypothesis selection."
— Clément Canonne (@ccanonne_) March 26, 2020
Say you've got, as usual, data points coming from some unknown probability distribution p over some domain 𝓧. And you have a list of possible 'models' q₁, q₂,..., qₙ.
1/6
Answers and discussions for yesterday's quiz on hypothesis selection: a thread.
— Clément Canonne (@ccanonne_) March 27, 2020
The problem: you have data from some unknown probability distribution p, a list of n hypotheses q₁,..., qₙ. Assuming one of the qₜ's is good, find one 'not too bad.'
1/12 https://t.co/09lOKAy0Cl
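A sketch of the classic Scheffé-tournament approach to hypothesis selection (one standard variant; constants and tie-breaking details vary across analyses):

```python
import random

def scheffe_winner(qi, qj, samples):
    """Scheffé test between two candidate pmfs (dicts x → prob): on the
    witness set S = {x : qi(x) > qj(x)}, compare each candidate's mass of
    S to its empirical mass under the data; the closer one wins."""
    S = {x for x in set(qi) | set(qj) if qi.get(x, 0) > qj.get(x, 0)}
    emp = sum(x in S for x in samples) / len(samples)
    wi = sum(qi.get(x, 0) for x in S)
    wj = sum(qj.get(x, 0) for x in S)
    return 0 if abs(wi - emp) <= abs(wj - emp) else 1

def select_hypothesis(qs, samples):
    """Round-robin Scheffé tournament: output the index with most wins.
    Classic guarantee: the winner is C·opt + O(√(log n / m))-close to p
    in TV, for a small constant C (3 in one standard analysis)."""
    wins = [0] * len(qs)
    for i in range(len(qs)):
        for j in range(i + 1, len(qs)):
            winner = (i, j)[scheffe_winner(qs[i], qs[j], samples)]
            wins[winner] += 1
    return max(range(len(qs)), key=wins.__getitem__)

rng = random.Random(5)
qs = [
    {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25},   # the true p (uniform on 4)
    {0: 0.70, 1: 0.10, 2: 0.10, 3: 0.10},
    {0: 0.10, 1: 0.10, 2: 0.10, 3: 0.70},
]
samples = [rng.randrange(4) for _ in range(5000)]
best = select_hypothesis(qs, samples)
```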
📊 The weakly weekly quiz is back: in today's edition, "learning and testing, how are they related?" Specifically, we're going to look at functions f: {0,1}ⁿ→{0,1} and probability distributions p over a domain [k] of size k.
— Clément Canonne (@ccanonne_) March 19, 2020
Hope you find that enjoyable. 1/11
Answers and discussions for yesterday's quiz on learning v. testing Boolean functions and probability distributions: a thread. 🧵 https://t.co/dG0N5anxMW
— Clément Canonne (@ccanonne_) March 20, 2020
1/16
Quiz of the week: I'm slightly jetlagged, so it'll be a bit short. Let's talk about 📉*monotone* 📈 probability distributions (univariate, discrete).
— Clément Canonne (@ccanonne_) March 5, 2020
That is, you have a distribution p over [k] = {1,2,...,k}, and you know its probability mass function is (wlog) non-increasing.
1/6
Answers and discussion for yesterday's polls on inference for (discrete) monotone distributions 📈📉 below. I hope you won't find it too... monotonous↴
— Clément Canonne (@ccanonne_) March 7, 2020
1/16 https://t.co/Tv1AdqRh7e
📊The past two weeks, we talked about learning (density estimation) of discrete distributions. Now, let's look at the flip side, and see a bit of what happens w/ *testing* (goodness-of-fit, hypothesis testing,...).
— Clément Canonne (@ccanonne_) February 27, 2020
Throughout, our distance measure will be total variation (TV).
1/8
Answers and discussion for yesterday's polls on testing 📊🐘 below. Some things, I'd say, are *far* from obvious.↴ https://t.co/gueiqlGBYA
— Clément Canonne (@ccanonne_) February 28, 2020
1/14
So last week was density estimation (distribution learning) of distributions 📊 over a domain of size k in total variation, Hellinger, and KL distances. Now you may wonder: "What about the rest?"
— Clément Canonne (@ccanonne_) February 20, 2020
Let's see some of the rest: ℓ₂, ℓ_∞, and Kolmogorov.
First: "Kolmogorov"?
1/9
Answers and discussion for yesterday's poll on "density estimation in other distances, such as Kolmogorov" below. The answer MAY surprise you (with probability ≥ 0.428). ↴
— Clément Canonne (@ccanonne_) February 21, 2020
1/9 https://t.co/6jo6kAhKtT
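The domain-independence here comes from the DKW inequality; a small sketch (my illustration) of empirical-CDF estimation in Kolmogorov distance:

```python
import math, random

def kolmogorov_distance(samples, cdf, support):
    """Kolmogorov distance sup_x |F̂(x) − F(x)| between the empirical CDF
    of the samples and a reference CDF F, over a sorted finite support."""
    n = len(samples)
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    best, cum = 0.0, 0
    for x in support:
        cum += counts.get(x, 0)
        best = max(best, abs(cum / n - cdf(x)))
    return best

# DKW inequality: n = ceil(ln(2/δ)/(2ε²)) samples make the empirical CDF
# ε-close in Kolmogorov distance w.p. ≥ 1-δ — independent of the domain
# size k, unlike the Θ(k/ε²) needed in total variation.
rng = random.Random(8)
k, eps, delta = 1000, 0.05, 0.01
n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
samples = [rng.randrange(k) for _ in range(n)]
d = kolmogorov_distance(samples, lambda x: (x + 1) / k, range(k))
```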
📊 Learning discrete distributions over a finite domain, a short thread.↴
— Clément Canonne (@ccanonne_) February 15, 2020
Given n i.i.d. samples from some unknown p over a domain of size k and parameters ε,δ, you want to output some hypothesis q s.t.
d(p,q) < ε
with probability at least 1-δ. How large must n be?
1/8
📊 So, the results... Learning discrete distributions over a finite domain of size k to distance ε, with probability 1-δ: how hard can it be?
— Clément Canonne (@ccanonne_) February 16, 2020
1/9 https://t.co/AANZoxkm95
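For reference, the (minimax-optimal) learner is just the empirical histogram; a sketch with a Monte Carlo check of its TV error (parameters arbitrary):

```python
import random

def empirical_pmf(samples, k):
    """The empirical distribution (normalized histogram): it is minimax-
    optimal for learning a distribution over [k] in total variation, with
    n = Θ((k + log(1/δ))/ε²) samples for (ε,δ)-accuracy."""
    counts = [0] * k
    for s in samples:
        counts[s] += 1
    return [c / len(samples) for c in counts]

def tv_distance(p, q):
    """Total variation distance = half the ℓ₁ distance."""
    return sum(abs(a - b) for a, b in zip(p, q)) / 2

rng = random.Random(13)
k = 50
weights = [i + 1 for i in range(k)]              # a non-uniform true p
true_p = [w / sum(weights) for w in weights]
samples = rng.choices(range(k), weights=weights, k=10_000)
q = empirical_pmf(samples, k)
err = tv_distance(true_p, q)
```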
Test your intuition (or, in my case, lack thereof): hypothesis testing in high dimensions. You are given n i.i.d. samples from an unknown multivariate Gaussian p = N(μ,Σ) in ℝ^d. You want to know if p is *the* standard Gaussian N(0,I). 1/8
— Clément Canonne (@ccanonne_) October 27, 2019
So, the answers:
— Clément Canonne (@ccanonne_) October 28, 2019
- Q1 and Q2 both have the same answer, √d/ε².
The fact that they coincide comes from the discussion of the equivalence between the TV distance to N(0,I) and the ℓ₂ norm of the mean μ.
Learning the mean to ε in ℓ₂ would be d/ε². But estimating ||μ||₂ is quadratically cheaper! 1/8
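The "quadratically cheaper" claim has a one-line explanation: for independent X, X′ ~ N(μ, I), E⟨X, X′⟩ = ‖μ‖₂². A Monte Carlo sketch (my illustration, arbitrary parameters):

```python
import random

rng = random.Random(17)
d = 200
mu = [0.1] * d                  # so ‖μ‖₂² = d · 0.01 = 2.0

def sample():
    """One draw from N(μ, I)."""
    return [rng.gauss(m, 1.0) for m in mu]

# Unbiased estimator of ‖μ‖₂²: for INDEPENDENT draws X, X′ from N(μ, I),
# E⟨X, X′⟩ = ⟨μ, μ⟩ (the noise terms average out). Estimating the single
# number ‖μ‖₂ this way is quadratically cheaper than learning the whole
# vector μ — the intuition behind testing in √d/ε² samples vs. d/ε².
pairs = 2000
est = sum(sum(a * b for a, b in zip(sample(), sample()))
          for _ in range(pairs)) / pairs
```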