Learning Sums of Independent Integer Random Variables
... straightforward to show that S must have almost all its probability mass on values in a small interval, and (1) follows easily from this. The more challenging case is when Var(S) is “large.” Intuitively, in order for Var(S) to be large it must be the case that at least one of the k − 1 values 1, 2, ...
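A concentration sketch for the small-variance case (the excerpt does not name the tool; Chebyshev's inequality is a natural assumption for this sketch): by Chebyshev,

\Pr\bigl[\,|S - \mathbb{E}[S]| \ge t\,\bigr] \le \frac{\mathrm{Var}(S)}{t^{2}},

so taking t = \sqrt{\mathrm{Var}(S)/\delta} places all but a \delta fraction of the probability mass of S in an interval of length 2\sqrt{\mathrm{Var}(S)/\delta} around \mathbb{E}[S]; when \mathrm{Var}(S) is small, this interval is small, matching the claim in the excerpt.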
for Sublinear Time Maximum Inner Product Search (MIPS)
... which is an instance of the standard MIPS problem. It should be noted that we do not have control over the norm of the learned vector, i.e., ∥vj∥2, which often has a wide range in practice [13]. If there are N items to recommend, solving (3) requires computing N inner products. Recommendation syst ...
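To make the stated cost concrete, here is a minimal brute-force sketch of solving (3) by an exhaustive scan over all N inner products (the function name exact_mips, the variable names q and V, and the use of NumPy are illustrative assumptions, not taken from the paper):

import numpy as np

def exact_mips(q, V):
    """Return the index j maximizing the inner product <q, v_j>.

    q : (d,) query vector (e.g., a user embedding)
    V : (N, d) matrix whose rows are the learned item vectors v_j
    The scan computes all N inner products, i.e., O(N d) work per query;
    this is the cost that sublinear-time MIPS methods aim to avoid.
    """
    scores = V @ q                 # all N inner products in one pass
    return int(np.argmax(scores))  # index of the best-scoring item

# Toy usage with random data (shapes are illustrative only).
rng = np.random.default_rng(0)
V = rng.normal(size=(1000, 64))    # N = 1000 items, d = 64
q = rng.normal(size=64)
best_item = exact_mips(q, V)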
Counting Stars and Other Small Subgraphs in Sublinear Time
... ADH+08, HA08, GS09]), as well as by the basic quest to understand simple structural properties of graphs. Our work differs from previous works on counting subgraphs (with the exception of counting the number of edges [Fei06, GR08]) in that we design sublinear algorithms. That is, our algorithms do ...