This article is about 3,164 characters long; estimated reading time: 10 minutes.
Real-time simulation of fluid and smoke is a long-standing problem in computer graphics, where state-of-the-art approaches require large compute resources, making real-time applications often impractical. In this work, we propose a data-driven approach that leverages the approximation power of deep-learning methods with the precision of standard fluid solvers to obtain both fast and highly realistic simulations. The proposed method solves the incompressible Euler equations following the standard operator-splitting method, in which a large, often ill-conditioned linear system must be solved. We propose replacing this system with a Convolutional Network (ConvNet) learned from a training set of simulations, using a semi-supervised learning method to minimize long-term velocity divergence.
ConvNets are amenable to efficient GPU implementations and, unlike exact iterative solvers, have fixed computational complexity and latency. The proposed hybrid approach restricts the learning task to a linear projection, without modeling the well-understood advection and body forces. We present real-time 2D and 3D simulations of fluids and smoke; the obtained results are realistic and show good generalization properties to unseen geometry.
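The operator-splitting structure described in the abstract can be sketched in a few lines: advect, apply body forces, then project the velocity field back to (approximately) divergence-free. A minimal NumPy sketch is below; `advect` is a stand-in stub, and `learned_projection`-style models are passed in as a plain callable, since the paper's actual ConvNet architecture and training pipeline are not reproduced here. All function names are illustrative, not from the paper's code.

```python
import numpy as np

def divergence(u, v):
    """Central-difference divergence of a 2D velocity field (periodic grid)."""
    du = (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / 2.0
    dv = (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)) / 2.0
    return du + dv

def advect(u, v, dt):
    # Stand-in for semi-Lagrangian advection -- the "well-understood"
    # part the paper keeps in the classical solver.
    return u, v

def step(u, v, project_model, dt=0.1, gravity=9.8):
    """One solver step: advection -> body forces -> learned projection.

    `project_model` replaces the expensive Poisson solve: it maps a
    velocity field to a (near) divergence-free corrected field.
    """
    u, v = advect(u, v, dt)
    v = v - dt * gravity                 # body forces
    u, v = project_model(u, v)           # learned pressure projection
    return u, v
```

The key design point from the paper survives even in this toy form: only the projection step is learned, so the model's job is narrow and the physics it must capture is limited to removing divergence.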
Original link:
Code link:
2.【Blog】Highway Networks with TensorFlow
Summary:
This week I implemented highway networks to get an intuition for how they work. Highway networks are a method of constructing networks with hundreds, even thousands, of layers. Let's see how we construct them using TensorFlow.
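The core of a highway layer is a gated mix of a transform of the input and the input itself: y = H(x)·T(x) + x·(1 − T(x)). A minimal NumPy sketch (not the blog's TensorFlow code; names and the ReLU choice for H are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, w_h, b_h, w_t, b_t):
    """One highway layer: y = H(x)*T(x) + x*(1 - T(x)).

    H is an affine+ReLU transform; T is the sigmoid "transform gate".
    (1 - T) acts as the "carry gate" that lets the input pass through
    unchanged, which is what makes very deep stacks trainable.
    """
    h = np.maximum(0.0, x @ w_h + b_h)   # candidate transform H(x)
    t = sigmoid(x @ w_t + b_t)           # transform gate T(x)
    return h * t + x * (1.0 - t)

def init_layer(dim, rng, carry_bias=-2.0):
    # A negative transform-gate bias biases the layer toward carrying
    # the input through at the start of training.
    return (rng.standard_normal((dim, dim)) * 0.1, np.zeros(dim),
            rng.standard_normal((dim, dim)) * 0.1, np.full(dim, carry_bias))
```

Note that the gating requires input and output dimensions to match, since the raw input is added into the output.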
Original link:
3.【Blog】An LSTM Odyssey
Summary:
This week I read an excellent paper that systematically evaluates the different internal mechanisms of an LSTM (long short-term memory) block by disabling each mechanism in turn and comparing their performance. We're going to implement each of the variants and evaluate their performance on the Penn Treebank (PTB) dataset. This will obviously not be as thorough as the original paper, but it allows us to see, and try out, the impact of each variant for ourselves.
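The ablation methodology described above — disable one mechanism at a time — can be expressed with a single cell function and a `variant` flag. A minimal NumPy sketch (names and the specific pair of variants shown are my choices, not the blog's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, variant="vanilla"):
    """One LSTM step; `variant` disables one mechanism, mirroring the
    ablation style of the paper.

    W maps the concatenated [x, h] to the four gate pre-activations.
    Variants shown: "no_forget_gate" (forget gate fixed to 1) and
    "coupled" (forget gate tied to 1 - input gate).
    """
    z = np.concatenate([x, h]) @ W       # all gate pre-activations at once
    i, f, g, o = np.split(z, 4)
    i = sigmoid(i)                        # input gate
    o = sigmoid(o)                        # output gate
    g = np.tanh(g)                        # candidate cell update
    if variant == "no_forget_gate":
        f = np.ones_like(f)               # never forget
    elif variant == "coupled":
        f = 1.0 - i                       # coupled input/forget gates
    else:
        f = sigmoid(f)
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Running the same training loop with each flag value is then enough to compare the variants head-to-head.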
Original link:
4.【Blog & Code】Recurrent generative auto-encoders and novelty search
Summary:
This post summarizes a set of connected tricks and methods I explored with the help of my co-authors. Following on from the previous post, the overall aim was to improve our ability to train generative models stably and accurately, but we went through many variations and experiments with different methods along the way. I'll try to explain why I think these things worked, though we are still exploring that ourselves.
Original link:
Code link:
5.【Blog & Code】Recursive (not recurrent!) Neural Nets in TensorFlow
Summary:
For the past few days I've been working out how to implement recursive neural networks in TensorFlow. Recursive neural networks (which I'll call TreeNets from now on, to avoid confusion with recurrent neural nets) can be used for learning tree-like structures. They are highly useful for parsing natural scenes and language; see the work of Richard Socher (2011) for examples. More recently, in 2014, Ozan İrsoy used a deep variant of TreeNets to obtain some results.
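The defining feature of a TreeNet is that the computation graph follows the input tree: leaves are embedded, and each internal node combines its children with a shared layer. A minimal NumPy sketch of Socher-style composition (plain Python recursion rather than the blog's TensorFlow graph construction; names are illustrative):

```python
import numpy as np

def tree_net(node, embed, W, b):
    """Recursively evaluate a TreeNet over a binary tree.

    `node` is either a leaf token (str), looked up in `embed`, or a
    (left, right) tuple. Internal nodes concatenate their children's
    vectors and apply a shared affine layer followed by tanh -- the
    same weights W, b are reused at every node.
    """
    if isinstance(node, str):                         # leaf: embedding lookup
        return embed[node]
    left, right = node                                # internal node
    children = np.concatenate([tree_net(left, embed, W, b),
                               tree_net(right, embed, W, b)])
    return np.tanh(children @ W + b)
```

Because every input tree has a different shape, each example induces a different computation graph — which is exactly the implementation challenge the blog post tackles in TensorFlow.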
Original link:
Code link:
Reprinted from: http://opdqb.baihongyu.com/