AI Resource Library: Issue 39 (20170223)

Original post: https://my.oschina.net/u/3579120/blog/1533560

1.【Blog & Video】Why Medicine Needs Deep Learning

Summary:


Deep learning will transform medicine, but not in the way that many advocates think. The amount of data times the mutation frequency divided by the biological complexity and the number of hidden variables is small, so downloading a hundred thousand genomes and training a neural network won’t cut it.
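
The quoted argument is essentially a back-of-the-envelope ratio; one way to write it down (my paraphrase, not notation used in the post) is:

```latex
\frac{N_{\text{genomes}} \times f_{\text{mutation}}}
     {C_{\text{biological complexity}} \times N_{\text{hidden variables}}} \;\approx\; \text{small}
```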

Original link: http://artificialbrain.xyz/why-medicine-needs-deep-learning/


2.【Blog】Read-through: Wasserstein GAN

Summary:

What drew me to Wasserstein GAN was mostly compelling word of mouth:

  • The paper proposes a new GAN training algorithm that works well on the common GAN datasets.

  • Said training algorithm is backed up by theory. In deep learning, not all theory-justified papers have good empirical results, but theory-justified papers with good empirical results have really good empirical results. For those papers, it’s very important to understand their theory, because the theory usually explains why they perform so much better.

  • I heard that in Wasserstein GAN, you can (and should) train the discriminator to convergence. If true, it would remove the need to balance generator updates with discriminator updates, which feels like one of the big sources of black magic in getting GANs to train (see the training-loop sketch after this list).

  • The paper shows a correlation between discriminator loss and perceptual quality. This is actually huge if it holds up well. In my limited GAN experience, one of the big problems is that the loss doesn't really mean anything, thanks to adversarial training, which makes it hard to judge whether models are training or not. Reinforcement learning has a similar problem with its loss functions, but there we at least get mean episode reward. Even a rough quantitative measure of training progress would be good enough to apply automated hyperparam optimization tricks, like Bayesian optimization. (The original post links to a couple of nice introductions to automatic hyperparam tuning.)
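
Below is a minimal sketch of the training loop those bullets describe, written in PyTorch: several critic updates per generator update, weight clipping to keep the critic Lipschitz, and a loss that is just a difference of critic means. The toy 2-D data, network sizes, and hyperparameters are my own illustrative assumptions, not anything from the post or the paper's experiments.

```python
import torch
from torch import nn, optim

# Generator maps 8-D noise to 2-D samples; critic scores 2-D samples (no sigmoid).
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
opt_g = optim.RMSprop(G.parameters(), lr=5e-5)
opt_d = optim.RMSprop(D.parameters(), lr=5e-5)

def real_batch(n=64):
    return torch.randn(n, 2) + 3.0  # stand-in for real data

for step in range(1000):
    # Train the critic several steps per generator step ("to convergence" in the limit).
    for _ in range(5):
        x = real_batch()
        z = torch.randn(x.size(0), 8)
        loss_d = -(D(x).mean() - D(G(z).detach()).mean())  # negative Wasserstein estimate
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        for p in D.parameters():
            p.data.clamp_(-0.01, 0.01)                     # weight clipping keeps D Lipschitz
    # Generator step: raise the critic's score on generated samples.
    z = torch.randn(64, 8)
    loss_g = -D(G(z)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```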

Original link: http://www.alexirpan.com/2017/02/22/wasserstein-gan.html?utm_content=buffer7f258&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer


3.【Demo & Code】Image-to-Image Demo

Summary:


Recently, I made a Tensorflow port of pix2pix by Isola et al., covered in the article Image-to-Image Translation in Tensorflow. I've taken a few pre-trained models and made an interactive web thing for trying them out. Chrome is recommended.

The pix2pix model works by training on pairs of images, such as building-facade label maps paired with photos of the facades, and then attempts to generate the corresponding output image from any input image you give it. The idea is straight from the pix2pix paper, which is a good read.
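
For reference, the objective from the pix2pix paper combines a conditional-GAN loss with an L1 reconstruction term between the generated and target images (paraphrased from Isola et al.; lambda weights the L1 term):

```latex
G^{*} = \arg\min_{G}\max_{D}\;
        \mathbb{E}_{x,y}\!\left[\log D(x, y)\right]
      + \mathbb{E}_{x,z}\!\left[\log\bigl(1 - D(x, G(x, z))\bigr)\right]
      + \lambda\,\mathbb{E}_{x,y,z}\!\left[\lVert y - G(x, z)\rVert_{1}\right]
```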

Original link: http://affinelayer.com/pixsrv/index.html


4.【Blog】Sorting through the tags: how does Tumblr’s graph-based topic modeling work?

Summary:


What makes Tumblr stand apart from other social media platforms is the unique way its users communicate with each other. Each user has their own highly customizable blog where they can post and share content (articles, images, GIFs, or videos) or re-post content published by another user. Sharing and re-posting content is key not only to how social connections are formed, but also to how trending and popular topics are established, since users must tag each post they publish.
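
As a toy illustration of what a "graph of tags" can look like, here is a generic tag co-occurrence sketch with made-up posts; it is not Tumblr's actual pipeline, just one simple way to turn tagged posts into a graph and group tags into rough topics.

```python
import itertools
import networkx as nx
from networkx.algorithms import community

# Toy posts: each post is just the set of tags attached to it (made-up data).
posts = [
    {"art", "illustration", "watercolor"},
    {"art", "photography"},
    {"gif", "tv", "reaction"},
    {"tv", "recap", "photography"},
]

# Nodes are tags; edge weights count how often two tags appear on the same post.
G = nx.Graph()
for tags in posts:
    for a, b in itertools.combinations(sorted(tags), 2):
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# Community detection groups densely connected tags into rough "topics".
topics = community.greedy_modularity_communities(G, weight="weight")
for i, topic in enumerate(topics):
    print(f"topic {i}: {sorted(topic)}")
```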

Original link: https://medium.com/@NYUDataScience/sorting-through-the-tags-how-does-tumblrs-graph-based-topic-modeling-work-1d396fb48f54#.fay5h8act


5.【Blog & Code】How to implement Sentiment Analysis using word embedding and Convolutional Neural Networks on Keras

Summary:

IMDB has released a database of 50,000 movie reviews classified into two categories: negative and positive. This is a typical binary sequence-classification problem.

In this article, I will show how to implement a Deep Learning system for such sentiment analysis with ~87% accuracy. (State of the art is at 88.89% accuracy).
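
A minimal sketch of the kind of model the article describes, a learned word embedding feeding a 1-D convolution for binary sentiment classification on the Keras IMDB dataset; the layer sizes and hyperparameters below are illustrative assumptions, not the article's exact architecture.

```python
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Dropout, Conv1D, GlobalMaxPooling1D, Dense

vocab_size, max_len = 20000, 400  # assumed values, not from the article

# Load the built-in IMDB reviews (already tokenized as integer word indices).
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=vocab_size)
x_train = pad_sequences(x_train, maxlen=max_len)
x_test = pad_sequences(x_test, maxlen=max_len)

model = Sequential([
    Embedding(vocab_size, 128),          # learn word embeddings from scratch
    Dropout(0.2),
    Conv1D(250, 3, activation="relu"),   # 1-D convolution over 3-word windows
    GlobalMaxPooling1D(),
    Dense(250, activation="relu"),
    Dense(1, activation="sigmoid"),      # positive / negative output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=32, epochs=2,
          validation_data=(x_test, y_test))
```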

Original link: https://medium.com/@thoszymkowiak/how-to-implement-sentiment-analysis-using-word-embedding-and-convolutional-neural-networks-on-keras-163197aef623#.sq6ax02mg


