Convolutional Neural Networks have brought several breakthroughs for supervised tasks in Computer Vision and other visual problems in Artificial Intelligence. Moreover, semi-supervised and unsupervised learning have reached remarkable milestones in attempts to understand how models work. These insights have unfolded impressive studies which utilize pre-trained models to extract deep-learning-based features or solve various tasks. This post introduces Artistic… Continue reading Neural Algorithm of Artistic Style Transfer: Understanding with PyTorch examples
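As a taste of what the post covers, here is a minimal sketch of the Gram-matrix style loss at the heart of the Neural Algorithm of Artistic Style. The feature maps below are hand-made stand-ins for the activations a pretrained CNN (e.g. VGG) would produce; no network is loaded here.

```python
def gram_matrix(features):
    """features: list of C channels, each a flat list of H*W activations."""
    c = len(features)
    n = len(features[0])
    # G[i][j] = sum_k F[i][k] * F[j][k], normalized by the spatial size
    return [[sum(features[i][k] * features[j][k] for k in range(n)) / n
             for j in range(c)] for i in range(c)]

def style_loss(feat_generated, feat_style):
    """Mean squared difference between the two Gram matrices."""
    g1, g2 = gram_matrix(feat_generated), gram_matrix(feat_style)
    c = len(g1)
    return sum((g1[i][j] - g2[i][j]) ** 2
               for i in range(c) for j in range(c)) / c ** 2

# Two fake 2-channel feature maps with 4 spatial positions each
feat_a = [[1.0, 2.0, 3.0, 4.0], [0.5, 1.0, 1.5, 2.0]]
feat_b = [[1.0, 1.0, 1.0, 1.0], [2.0, 0.0, 2.0, 0.0]]
print(style_loss(feat_a, feat_a))      # 0.0 for identical features
print(style_loss(feat_a, feat_b) > 0)  # True: different feature statistics
```

In the full algorithm, this loss (summed over several layers) is minimized with respect to the pixels of the generated image.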
Category: Learning
Variational Inference algorithm
Sampling data or approximating probability densities is one of the core problems in modern statistics, especially in Bayesian statistics. Besides MCMC, Variational Inference is one of the two typical Bayesian approaches to solving this problem. Many recent powerful methods and models leverage the Variational Inference algorithm: Stochastic Variational Inference, the Variational Autoencoder... This… Continue reading Variational Inference algorithm
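A tiny illustration of the idea, under assumed toy numbers: for the conjugate model mu ~ N(0, 1), x | mu ~ N(mu, 1) with one observation x = 2, the exact posterior is N(1, 0.5). A Monte Carlo estimate of the ELBO equals the log-evidence when q matches the posterior, and drops by KL(q || posterior) otherwise.

```python
import math, random

random.seed(0)

def log_normal(x, mean, var):
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def elbo(x, q_mean, q_var, n_samples=20000):
    """Monte Carlo estimate of E_q[log p(x, mu) - log q(mu)]."""
    total = 0.0
    for _ in range(n_samples):
        mu = random.gauss(q_mean, math.sqrt(q_var))
        log_joint = log_normal(x, mu, 1.0) + log_normal(mu, 0.0, 1.0)
        total += log_joint - log_normal(mu, q_mean, q_var)
    return total / n_samples

x = 2.0
good = elbo(x, 1.0, 0.5)   # q = exact posterior N(1, 0.5): ELBO = log p(x)
bad = elbo(x, -1.0, 2.0)   # a poor q: ELBO falls short by KL(q || posterior)
print(good > bad)           # True
```

With the exact posterior the integrand is constant, so `good` hits log p(x) ≈ -2.266 with no Monte Carlo noise; maximizing the ELBO over (q_mean, q_var) is exactly what variational inference does.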
Getting started with Bayesian Inference
Bayesian inference is an important technique in statistics, particularly in the analysis of a sequence of data. It has a wide range of applications, including science, engineering, medicine, finance, etc. It can be used to explain mechanisms inside modern techniques, such as regularization in Deep Learning, building the structure of Probabilistic Graphical Models, and also a… Continue reading Getting started with Bayesian Inference
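The classic first example of Bayesian updating on a sequence of data is the Beta-Binomial coin: a Beta(a, b) prior over the coin's bias stays a Beta after observing flips. A minimal sketch (the specific counts are made up for illustration):

```python
# Conjugate update: prior Beta(a, b), observe heads and tails,
# posterior is Beta(a + heads, b + tails).

def update_beta(a, b, heads, tails):
    return a + heads, b + tails

def beta_mean(a, b):
    return a / (a + b)

a, b = 1.0, 1.0                       # uniform prior over the bias
a, b = update_beta(a, b, heads=7, tails=3)
print(beta_mean(a, b))                # 8/12: pulled from 0.7 toward 0.5
```

Because the update just adds counts, the data can arrive one flip at a time and the posterior after each flip becomes the prior for the next, which is what makes Bayesian inference natural for sequential data.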
Jensen’s inequality and applications: EM Algorithm and ELBO in SVI
By chance, I learned about the ELBO of SVI and realised some interesting things. SVI transforms its optimization problem from finding the parameters that minimize an error to iteratively maximizing a lower bound. Moreover, this is a big step that allows SVI to use gradient descent and then to integrate with neural network models. Iteratively finding distributions which have… Continue reading Jensen’s inequality and applications: EM Algorithm and ELBO in SVI
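The lower bound in question comes from a single application of Jensen's inequality to the concave log: log(E[X]) >= E[log X]. That one step turns the intractable log-evidence log E_q[p(x, z)/q(z)] into the ELBO E_q[log p(x, z)/q(z)]. A quick numerical check with made-up values:

```python
import math

# Jensen's inequality for the concave log: log(E[X]) >= E[log X].
xs = [0.5, 1.0, 2.0, 4.0]          # positive values, equal weights
log_of_mean = math.log(sum(xs) / len(xs))
mean_of_log = sum(math.log(x) for x in xs) / len(xs)
print(log_of_mean >= mean_of_log)  # True; equality only if all x are equal
```

The gap between the two sides is exactly the KL divergence in the ELBO derivation, which is why maximizing the bound tightens the approximation.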
Some Deep Learning Regularization techniques
Machine learning models contain a number of parameters, like a system of equations. If the number of parameters is too low, models cannot approximate the data representation, which is called under-fitting. In contrast, over-fitting occurs when models with a high number of parameters fit the training data so closely that they lose their generalization. The optimum value is… Continue reading Some Deep Learning Regularization techniques
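One common regularizer the post presumably covers is L2 weight decay: penalizing large weights to curb over-fitting. A minimal sketch on a one-parameter linear fit with invented data, once plain and once with the penalty lam * w**2 added to the squared loss:

```python
def fit(xs, ys, lam, lr=0.01, steps=2000):
    """Gradient descent on mean squared error + lam * w**2 for y = w*x."""
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        grad += 2 * lam * w          # gradient of the L2 penalty
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]            # roughly y = 2x, with noise
w_plain = fit(xs, ys, lam=0.0)
w_l2 = fit(xs, ys, lam=5.0)
print(w_plain, w_l2)                 # the penalized weight is shrunk toward 0
```

The penalty biases the solution toward small weights, trading a little training error for smoother, better-generalizing fits; dropout and early stopping play a similar role by other means.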