
Questions tagged [sample-complexity]

1 vote
0 answers
16 views

Suppose we have data $x_i \sim \mathcal{N}(0, I_d)$ (i.e. zero mean, unit variance, independent features) and labels $y_i \in \{0,1\}$, and we are interested in training a linear classifier $f(x) = w^\top x +...
user9998990
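The setup in this question can be sketched concretely. This is a minimal illustration, not the asker's actual problem: the ground-truth weights, the sample sizes, and the choice of least squares on $\pm 1$ targets as the estimator are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 5

# x_i ~ N(0, I_d): zero mean, identity covariance, independent features
X = rng.standard_normal((n, d))

# Hypothetical ground-truth direction; labels y_i in {0, 1}
w_true = rng.standard_normal(d)
y = (X @ w_true > 0).astype(int)

# One illustrative estimator: least squares on +/-1 targets for f(x) = w^T x
w_hat, *_ = np.linalg.lstsq(X, 2 * y - 1, rcond=None)

train_acc = float(np.mean((X @ w_hat > 0).astype(int) == y))
```

With noiseless linearly separable labels and $n \gg d$, the least-squares direction aligns closely with $w_{\text{true}}$, so training accuracy is near 1.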
0 votes
0 answers
23 views

In standard hardness proofs showing that proper PAC learning of certain concept classes is not possible unless RP=NP, one reduces an NP-complete decision problem (e.g., Graph 3-Coloring) to the ...
user1234567890
1 vote
1 answer
100 views

Suppose I have a hypothesis class $\mathcal{H}$ with finite and small VC-dimension. Also, assume I have a learning algorithm $A$ that, given any labeled sample, produces in polynomial time a ...
user1234567890
0 votes
0 answers
42 views

I'm seeking an explicit bound for the number of samples required to estimate the covariance matrix of a Gaussian distribution. In https://arxiv.org/pdf/1011.3027v7 (end of page 31), the following ...
Dante Perez
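The flavor of the bound being asked about (for sub-Gaussian data, the empirical covariance has operator-norm error on the order of $\sqrt{d/n}$) is easy to check empirically. A minimal sketch, with $\Sigma = I_d$ and a known zero mean as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10

errs = {}
for n in (50, 5000):
    X = rng.standard_normal((n, d))          # n samples from N(0, I_d)
    Sigma_hat = (X.T @ X) / n                # empirical covariance (mean known to be 0)
    # Operator-norm (spectral-norm) estimation error, the quantity Vershynin bounds
    errs[n] = float(np.linalg.norm(Sigma_hat - np.eye(d), ord=2))
```

Increasing $n$ by a factor of 100 should shrink the error by roughly a factor of 10, matching the $\sqrt{d/n}$ rate.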
0 votes
0 answers
68 views

Let $\mathcal{X} = \{-1,1\}^n$ and $h: \mathcal{X} \to \{-1,1\}$; $h$ can be expanded in the basis of monomials for the uniform distribution, or can also have a distribution-free expansion (Gram-Schmidt ...
rivana
2 votes
0 answers
55 views

Arora and Barak (in chapter 18, on average-case complexity) define a polynomial-time samplable (or P-samplable) distribution $D$ (actually a family $\{D_n\}$, for each output length $n$) as having an ...
Shivaram Lingamneni
1 vote
0 answers
62 views

Are there known lower bounds for estimating large Fourier coefficients of Boolean functions? And has their tightness been compared with the upper bound of the Goldreich-Levin algorithm?
rivana
2 votes
0 answers
71 views

I'm currently working on finding better bounds for the Goldreich-Levin algorithm for estimating large Fourier coefficients of a Boolean function. I was surprised to see that the upper bounds for time ...
rivana
-1 votes
1 answer
199 views

I have been studying from the book "Understanding Machine Learning: From Theory to Algorithms" by Shai Shalev-Shwartz and Shai Ben-David. I am stuck at Corollary 3.2, which states that every ...
Sathishkumar Thirumalai
1 vote
0 answers
88 views

The Goldreich-Levin algorithm, under an assumption (boundedness of the Fourier coefficients), outputs the indices of the most significant Fourier coefficients of a Boolean function; however ...
rivana
0 votes
0 answers
114 views

The goal is to come up with a simple data structure for sampling a uniform point from a collection of sets, i.e., given a sub-collection $\mathcal{B}$, sample a point in $\cup \mathcal{B}$ uniformly at ...
Shi
1 vote
0 answers
113 views

Coming from a more statistical background, it is not clear to me if or how lower bounds in the statistical query (SQ) model imply anything useful about traditional learning problems with iid samples (...
student3365
0 votes
0 answers
48 views

I am looking for generalisation bounds under the squared loss, specifically for the class $\mathcal{F}_{\text{lin}} = \{f(x) = \langle w, x \rangle : \|w\| \leq C\}$ of bounded linear predictors. I am ...
Nick Bishop
3 votes
1 answer
262 views

Say we have a black-box access to a distribution $\mathcal{D}$ with finite support $\{1,2,...,n\}$ with probability mass function $i \mapsto p_i$. How many samples of $\mathcal{D}$ are needed to learn ...
actcon
6 votes
1 answer
267 views

Let us have Booleans $x_1, \dots, x_n$. Any algorithm that determines $\bigvee_{i=1}^{n} x_i$ with probability at least $2/3$ requires $\Omega(n)$ time. It is not too difficult to prove this, but the proof ...
user2316602
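The intuition behind the $\Omega(n)$ bound shows up on the hard instance with a single hidden 1: an algorithm that reads only $k$ random positions finds it with probability $1-(1-1/n)^k \approx k/n$ for small $k$, so reaching success probability $2/3$ forces $k = \Omega(n)$. A sketch of that calculation (an illustration of the lower-bound instance, not a proof):

```python
import random

def sampled_or(x, k, rng=random):
    """Query k uniformly random positions (with replacement); report OR of them."""
    n = len(x)
    return any(x[rng.randrange(n)] for _ in range(k))

random.seed(0)
n = 100
x = [0] * n
x[37] = 1                                    # hard instance: a single hidden 1

trials = 2000
hit_few  = sum(sampled_or(x, 10)    for _ in range(trials)) / trials  # ~ 1-(1-1/n)^10
hit_many = sum(sampled_or(x, 5 * n) for _ in range(trials)) / trials  # ~ 1-(1-1/n)^500
```

With 10 queries the success rate is near $10/100 = 0.1$; with $5n$ queries it is near 1, matching the $k/n$ scaling.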
