Real-time Artwork Generation using Deep Learning
In this post we will be looking at the paper "Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization" (AdaIN) by Huang et al. We chose this paper because it had some key advantages over the other state-of-the-art methods at the time of release. Most important of all, this method, once trained, can transfer style between any arbitrary content-style image pair, even pairs not seen during training. While the optimisation-based method proposed by Gatys et al. must be re-run from scratch for every content-style pair, AdaIN stylises an image in a single feed-forward pass. The AdaIN method is also flexible: it allows control over the strength of the transferred style in the stylised image, and it supports extensions such as style interpolation and spatial control.
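The core AdaIN operation from the paper is simple enough to sketch directly: it shifts and scales the content features so that their per-channel mean and standard deviation match those of the style features, i.e. AdaIN(x, y) = σ(y)·((x − μ(x)) / σ(x)) + μ(y). Below is a minimal NumPy illustration; the function names, the `(channels, height, width)` layout, and the `alpha` blending helper for style-strength control are my own choices for this sketch, not taken from the authors' code:

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization: align the per-channel mean and
    std of the content feature map to those of the style feature map.
    Both inputs are assumed to be shaped (channels, height, width)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    normalized = (content_feat - c_mean) / (c_std + eps)  # zero mean, unit std
    return normalized * s_std + s_mean                    # style statistics

def stylize(content_feat, style_feat, alpha=1.0):
    """Style-strength control: blend the AdaIN output with the original
    content features. alpha=1 gives full stylisation, alpha=0 none."""
    t = adain(content_feat, style_feat)
    return alpha * t + (1.0 - alpha) * content_feat
```

In the full method these features would come from a pretrained VGG encoder, and a learned decoder would map the blended features back to an image; the snippet above only shows the statistics-matching step that gives the method its name.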
How to debug neural networks. Manual. – Hacker Noon
Debugging neural networks can be a tough job even for field experts. Millions of parameters are stuck together, where even one small change can break all your hard work. Without debugging and visualization, all your actions amount to flipping a coin, and what's worse, it eats your time. Here I gather practices that will help you find problems earlier. Try to overfit your model with a small dataset. In general, your neural net should be able to overfit a small dataset within a few hundred iterations.
Feed-forward neural doodle
It takes time to master those skills, and you have more important things to do :) What if you could just sketch the picture like a three-year-old, and everything else is done by a computer so that your sketch looks like a real painting? It will certainly happen in the near future. In fact, several algorithms that do this very well were proposed recently, yet they take at least several minutes to render your masterpiece even on high-end hardware. We make a step towards making such tools available to everybody and present an online demo of our fast algorithm. The following text describes how it is done.