We will incorporate the suggested references and add a "Broader Impact" section. (We apologize for not realizing that we are required to include one.)

Neural Information Processing Systems 

We thank all the reviewers for their time and their thoughtful comments.

Is grammar compression useful for the vectors and matrices encountered in ML? (a concern raised by reviewer #3) We prove that grammar-compressed representations are harder to analyze (without decompression) than simpler ones such as RLE. In the submission we commented on this question only in passing; fortunately, such a test was already performed in the "Compressed Linear Algebra" paper [ ]. To quote reviewer #4: "In this sense, proving a limitation of those models will greatly influence future research."

We wish to sincerely thank the reviewers and the PC again for their time and help in improving the quality of this work. The ethical consequences depend on the specific application.
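To make the contrast concrete for readers unfamiliar with it: an RLE-compressed vector admits direct computation without decompression, which is exactly the kind of access our lower bounds rule out for grammar compression. A minimal sketch of this (our own illustration, not code from the paper; the function name and encoding are assumptions):

```python
def rle_dot(runs, dense):
    """Dot product of an RLE vector with a dense vector, without decompressing.

    runs:  list of (value, run_length) pairs encoding the compressed vector.
    dense: plain list of the same decompressed length.
    """
    total = 0.0
    i = 0
    for value, length in runs:
        # Each run contributes value * (sum of the aligned dense slice).
        # With precomputed prefix sums of `dense`, this is O(1) per run,
        # i.e., time proportional to the compressed size.
        total += value * sum(dense[i:i + length])
        i += length
    return total

runs = [(2.0, 3), (0.0, 4), (5.0, 1)]   # encodes [2,2,2,0,0,0,0,5]
dense = [1, 1, 1, 1, 1, 1, 1, 1]
print(rle_dot(runs, dense))  # 2*3 + 0 + 5 = 11.0
```

Zero runs are skipped at full-run granularity, so sparse or highly repetitive vectors are processed in time proportional to the number of runs rather than the decompressed length.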
