• In this blog, we will summarize the LaTeX code for the most popular machine learning equations, including distance measures, generative models, and more. The first section covers common measures of distance between data distributions, including KL-Divergence, JS-Divergence, Wasserstein Distance (Optimal Transport), and Maximum Mean Discrepancy (MMD). The second section provides the LaTeX code for generative models, including Generative Adversarial Networks (GAN), Variational AutoEncoder (VAE), and Diffusion Models (DDPM).
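
As a small sample of the equations covered there, the LaTeX below renders the KL-Divergence and JS-Divergence between two distributions P and Q; the remaining distance measures and the generative-model losses follow the same pattern in the full post.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Kullback-Leibler divergence between discrete distributions P and Q
\begin{equation}
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}
\end{equation}
% Jensen-Shannon divergence, defined via the mixture M = (P + Q) / 2
\begin{equation}
D_{\mathrm{JS}}(P \,\|\, Q) = \frac{1}{2} D_{\mathrm{KL}}(P \,\|\, M)
  + \frac{1}{2} D_{\mathrm{KL}}(Q \,\|\, M),
\qquad M = \frac{1}{2}(P + Q)
\end{equation}
\end{document}
```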

  • In this blog, we will give you a brief introduction to the Huggingface Text to Video Pipeline and our wrapper API. Since installing these pipelines requires many Python package dependencies, such as transformers, torch, and diffusers, we provide an API wrapper around common text-to-video interfaces for users without an AI or machine learning background and publish it as the PyPI package text2video (https://pypi.org/project/text2video/). The package is still in the development stage, and we will keep updating this blog. The API wrapper is also open to contributions.
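
For readers curious about what the wrapper simplifies, below is a minimal sketch of generating a clip directly with the Hugging Face diffusers text-to-video pipeline. The checkpoint name, the prompt, and the availability of a CUDA GPU are assumptions, the layout of `.frames` varies across diffusers versions, and the text2video package exposes its own simplified interface on top of calls like these.

```python
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import export_to_video

# Load a public text-to-video checkpoint (assumed model id) in half precision.
pipe = DiffusionPipeline.from_pretrained(
    "damo-vilab/text-to-video-ms-1.7b", torch_dtype=torch.float16, variant="fp16"
)
pipe = pipe.to("cuda")

# Generate a short clip from a text prompt and write the frames to a video file.
video_frames = pipe("an astronaut riding a horse on the moon", num_inference_steps=25).frames
video_path = export_to_video(video_frames)
print(video_path)
```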

  • Dialogue Agent Multimodal Visualization Tools for AI Systems: A Review. With the rapid development of generative AI technology, building chatbot assistants, dialogues, and agents has become a frequent task for designers, developers, product managers, and machine learning and AI practitioners. A lot of data (usually in JSON format) is exchanged during the prototyping, design, and development phases of modern AI systems, such as chatbot assistants, agents, dialogues, AI image generators, text-to-image, and AI video generators, so there is a clear need for tools that visualize this data. In this blog, we introduce DeepNLP Dialogue Visualization, an online dialogue data visualization tool that renders agents and dialogue history from a simple JSON string, and we use an OpenAI GPT model to develop a multi-turn dialogue as an example.
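
To make the data format concrete, here is a small sketch that collects a multi-turn conversation from the OpenAI chat completions API and serializes it to a JSON string. The model name and the `agent`/`dialogue` keys are illustrative assumptions, as is the schema expected by DeepNLP Dialogue Visualization, so adjust the keys to the tool's documented format.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Accumulate a multi-turn dialogue as a list of role/content messages.
messages = [{"role": "system", "content": "You are a helpful travel assistant."}]
for user_turn in ["Plan a 3-day trip to Tokyo.", "Make day 2 focused on food."]:
    messages.append({"role": "user", "content": user_turn})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})

# Serialize the dialogue history to a JSON string for a visualization tool.
dialogue_json = json.dumps({"agent": "travel_assistant", "dialogue": messages}, indent=2)
print(dialogue_json)
```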

  • In this blog, we will summarize the LaTeX code for the most fundamental equations of transfer learning (TL). Unlike multi-task learning, transfer learning models aim to achieve the best performance on the target domain (minimizing target-domain test error), not on the source domain. Typical transfer learning methods include domain adaptation (DA), feature sub-space alignment, and others. We will discuss the TL equations in more detail, covering sub-areas such as domain adaptation, H-divergence, and Domain-Adversarial Neural Networks (DANN), which are useful as a quick reference for your research.
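
As a preview of those equations, the LaTeX below renders the H-divergence between a source and a target distribution and a compact form of the DANN saddle-point objective, written here from the standard definitions (Ben-David et al. for H-divergence, Ganin et al. for DANN); the post's own notation may differ slightly.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% H-divergence between source distribution D_S and target distribution D_T
% over a hypothesis class H.
\begin{equation}
d_{\mathcal{H}}(\mathcal{D}_S, \mathcal{D}_T) =
  2 \sup_{\eta \in \mathcal{H}}
  \left| \Pr_{x \sim \mathcal{D}_S}\left[\eta(x) = 1\right]
       - \Pr_{x \sim \mathcal{D}_T}\left[\eta(x) = 1\right] \right|
\end{equation}
% DANN saddle-point objective: label loss L_y minus a lambda-weighted domain
% loss L_d, minimized over the feature extractor and label predictor
% parameters and maximized over the domain classifier parameters.
\begin{equation}
E(\theta_f, \theta_y, \theta_d) =
  \mathcal{L}_y(\theta_f, \theta_y) - \lambda \, \mathcal{L}_d(\theta_f, \theta_d),
\qquad
(\hat{\theta}_f, \hat{\theta}_y) = \arg\min_{\theta_f, \theta_y} E,
\quad
\hat{\theta}_d = \arg\max_{\theta_d} E
\end{equation}
\end{document}
```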

  • In this blog, we will summarize the formula sheet equations and LaTeX code for the CFA Level II exam. Topics include QUANTITATIVE METHODS, such as linear regression, multiple regression, time series analysis, and machine learning; ECONOMICS, covering exchange rates, covered interest rate parity, uncovered interest rate parity, relative purchasing power parity, the Fisher and international Fisher effects, the FX carry trade, and the Mundell-Fleming model; ECONOMIC GROWTH, covering the growth accounting equation, the labor productivity growth accounting equation, the classical growth model (Malthusian model), the neoclassical growth model (Solow's model), the endogenous growth model, and convergence; and ECONOMICS OF REGULATION. The data in this blog is summarized from WILEY'S CFA PROGRAM LEVEL II quicksheet.
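
As a sample of the formula sheet, the LaTeX below renders covered interest rate parity and the growth accounting equation; the quote convention (forward and spot rates expressed as price currency per base currency) and the annualized form are assumptions on my part, so verify them against the quicksheet.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Covered interest rate parity: forward rate F and spot rate S quoted as
% price currency per base currency, with i_p and i_b the respective
% interest rates (annualized form assumed).
\begin{equation}
F = S \times \frac{1 + i_p}{1 + i_b}
\end{equation}
% Growth accounting equation: output growth decomposed into total factor
% productivity growth plus share-weighted capital and labor growth.
\begin{equation}
\frac{\Delta Y}{Y} = \frac{\Delta A}{A} + \alpha \frac{\Delta K}{K}
  + (1 - \alpha)\frac{\Delta L}{L}
\end{equation}
\end{document}
```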