Finer Metagenomic Reconstruction via Biodiversity Optimization

In previous work [12, 13], a method was introduced that leverages compressive sensing techniques to find the fewest taxa that fit the frequency of short sequences of nucleotides (i.e., k-mers) in a given sample. Consider, for instance, an environment/sample made of $s$ bacterial species where two of them are almost identical: one would wish to say that the concentration vector is almost $(s-1)$-sparse rather than $s$-sparse!
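A minimal sketch of the sparse-recovery idea on synthetic data. The cited method's actual optimization differs; here orthogonal matching pursuit stands in as a generic compressive-sensing solver, and everything in the setup (the taxa signature matrix `A`, the Gaussian signatures, the taxon indices 3/17/42) is a hypothetical toy, not data from the paper:

```python
import numpy as np

# Hypothetical setup: each column of A is the k-mer signature of one candidate
# taxon, y is the k-mer frequency vector observed in the sample, and the true
# concentration vector x is sparse (only a few taxa are present). A Gaussian A
# keeps this sketch well-conditioned; real k-mer matrices are not Gaussian.
rng = np.random.default_rng(0)
n_kmers, n_taxa, sparsity = 100, 200, 3
A = rng.standard_normal((n_kmers, n_taxa))
x_true = np.zeros(n_taxa)
x_true[[3, 17, 42]] = [0.5, 0.3, 0.2]   # concentrations of the 3 present taxa
y = A @ x_true

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the k best-matching columns."""
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))      # most correlated taxon
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef             # re-fit, update residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

x_hat = omp(A, y, sparsity)
print(sorted(np.flatnonzero(np.abs(x_hat) > 1e-8)))
```

With many more candidate taxa than k-mer measurements, the greedy search still recovers the few taxa that explain the observed frequencies, which is the "fewest taxa that fit" intuition of the text.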
Appendices for "Pruning Randomly Initialized Neural Networks with Iterative Randomization"
We consider a target neural network $f: \mathbb{R}^{d_0} \to \mathbb{R}^{d_l}$ of depth $l$, which is described as follows. Similar to the previous works [6, 7], we assume that $g(x)$ is twice as deep as the target network $f(x)$. Thus, $g(x)$ can be described as
$$g(x) = G_{2l}\,\sigma(G_{2l-1}\,\sigma(\cdots G_1(x))), \qquad (2)$$
where $G_j$ is a $\tilde{d}_j \times \tilde{d}_{j-1}$ matrix ($\tilde{d}_j \in \mathbb{N}_{\geq 1}$ for $j = 1, \ldots, 2l$) with $\tilde{d}_{2i} = d_i$. Under this re-sampling assumption, we describe our main theorem as follows.

Theorem A.1 (Main Theorem). Fix $\epsilon, \delta > 0$, and assume that $\|F_i\|_{\mathrm{Frob}} \leq 1$. Let $R \in \mathbb{N}$, and assume that each element of $G_i$ can be re-sampled with replacement from the uniform distribution $U[-1, 1]$ up to $R - 1$ times. If $n \geq 2\epsilon^{-1}\log(1/\delta)$ holds, then with probability at least $1 - \delta$, we have
$$|\alpha - X_i| \leq \epsilon, \qquad (5)$$
for some $i \in \{1, \ldots, n\}$.
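A quick numerical check of the sampling argument behind this bound. Assuming the condition reads $n \geq 2\epsilon^{-1}\log(1/\delta)$ (the $\epsilon$ appears dropped by extraction): for $X \sim U[-1, 1]$ and $\alpha \in [-1, 1]$, each draw lands within $\epsilon$ of $\alpha$ with probability at least $\epsilon/2$, so $n$ such draws miss only with probability at most $(1 - \epsilon/2)^n \leq e^{-n\epsilon/2} \leq \delta$. The parameter values below are illustrative, not from the paper:

```python
import math
import random

def min_samples(eps, delta):
    # Assumed form of the theorem's condition: n >= (2 / eps) * log(1 / delta).
    return math.ceil(2 / eps * math.log(1 / delta))

def hit_probability(alpha, eps, n, trials=20000, seed=0):
    # Empirical probability that at least one of n i.i.d. draws X_i ~ U[-1, 1]
    # lands within eps of alpha, i.e. the event |alpha - X_i| <= eps of (5).
    rng = random.Random(seed)
    hits = sum(
        any(abs(alpha - rng.uniform(-1.0, 1.0)) <= eps for _ in range(n))
        for _ in range(trials)
    )
    return hits / trials

eps, delta = 0.1, 0.05
n = min_samples(eps, delta)          # 60 draws for these parameters
p = hit_probability(alpha=0.7, eps=eps, n=n)
print(n, p)                          # p should be at least 1 - delta = 0.95
```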