
Collaborating Authors

Weigend


Connectionism for Music and Audition

Neural Information Processing Systems

In recent years, NIPS has heard neural networks generate tunes and harmonize chorales. With a large amount of music becoming available in computer-readable form, real data can be used to train connectionist models. At the beginning of this workshop, Andreas Weigend focused on architectures to capture structure on multiple time scales. The prediction approach to continuation and completion, as well as to modeling expectations, can be characterized by the question "What's next?". Moving to time as the primary medium of musical communication, the inquiry in music perception and cognition shifted to the question "When next?".


Learning Local Error Bars for Nonlinear Regression

Nix, David A., Weigend, Andreas S.

Neural Information Processing Systems

We present a new method for obtaining local error bars for nonlinear regression, i.e., estimates of the confidence in predicted values that depend on the input. We approach this problem by applying a maximum-likelihood framework to an assumed distribution of errors. We demonstrate our method first on computer-generated data with locally varying, normally distributed target noise. We then apply it to laser data from the Santa Fe Time Series Competition where the underlying system noise is known quantization error and the error bars give local estimates of model misspecification. In both cases, the method also provides a weighted-regression effect that improves generalization performance.
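The abstract describes fitting input-dependent error bars by maximizing the likelihood of the data under an assumed noise distribution. A minimal sketch of the per-point loss under a Gaussian noise model, assuming the model predicts a mean `mu` and a log-variance `log_var` (these names and the grid search below are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Negative log-likelihood of y under N(mu, exp(log_var)),
    up to the constant 0.5 * log(2*pi)."""
    var = np.exp(log_var)
    return 0.5 * (log_var + (y - mu) ** 2 / var)

# The gradient of this loss with respect to mu is -(y - mu) / var:
# residuals in high-predicted-variance regions are downweighted,
# which is the weighted-regression effect mentioned in the abstract.

# For a fixed residual r = y - mu, the loss is minimized when the
# predicted variance matches the squared residual (var = r**2),
# illustrated here with a coarse grid search over log_var.
r = 2.0
grid = np.linspace(-2.0, 4.0, 601)
losses = [gaussian_nll(r, 0.0, lv) for lv in grid]
best_lv = grid[int(np.argmin(losses))]   # exp(best_lv) is close to r**2 = 4
```

Minimizing this loss jointly over the mean and variance outputs is what couples the two: the variance estimate tracks local noise, and the mean fit is reweighted accordingly.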


Amazon: How Bezos built his data machine

#artificialintelligence

The next challenge was to decide what to sell beyond books. They picked CDs and DVDs. Over the years, electronics, toys and clothing followed, as did overseas expansion. And all this time, Amazon was building a battalion of data-mining experts. Artificial intelligence expert Andreas Weigend was one of the first. Before joining, he had published more than 100 scientific articles, co-founded one of the first music recommendation systems, and worked on an application to analyse online trades in real-time.


Ex-Google Guy Builds English Teaching App That Adapts to Student

#artificialintelligence

Yi Wang was hearing the same refrain over and over: Why are English classes in China so expensive? The former Google product manager decided to do something about it and started an app called LiuLiShuo, which basically means "speaking fluently" in Mandarin. The app, which claims more than 30 million users, is one of scores of English-learning startups looking to disrupt China's hidebound language schools. To differentiate itself from products started by Internet giants like Baidu and Tencent, LiuLiShuo brings gaming and social media features to the genre. Users win points when they move to the next level and text each other encouragement and tips.

