## List of Statistics Equations Latex Code

rockingdingo 2022-11-06 #Binomial #Poisson #Normal Gaussian #Covariance #Chi-Square


In this blog, we summarize the LaTeX code for statistics equations, ranging from elementary equations (e.g. mean and variance) to more advanced graduate-level topics, e.g. the Binomial, Poisson, and Normal distributions and the Chi-Square test.

### 1. Elementary Statistics

• #### Mean and Variance

##### Latex Code
        \text{Mean Discrete}\ \mu=E(X)=\sum P_{i}x_{i} \\\text{Mean Continuous}\ \mu=\int xf(x) dx \\\text{Variance Discrete}\ \sigma^{2}=V(X)=E[(X-\mu)^2]=\sum P_{i}(x_{i} -\mu)^2 \\\text{Variance Continuous}\ \sigma^{2}=V(X)=E[(X-\mu)^2]=\int (x-\mu)^{2}f(x) dx

##### Explanation

X denotes a random variable with a distribution f(x) over some subset of the real numbers. If the distribution is discrete, the probability that X takes the value x_{i} is P_{i}, and the mean \mu equals the sum over all values of the probability P_{i} multiplied by the value x_{i}. When the distribution is continuous, f(x) denotes the density function, and the mean is the integral of xf(x) over the support. The variance (the squared value of the standard deviation \sigma) measures how far X spreads around the mean value \mu, i.e. whether the distribution is flat or peaked. The variance is defined as the expectation of the squared distance between each data point X and the mean \mu.
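The discrete definitions above can be checked with a short sketch; the values and probabilities below are made-up examples:

```python
# Sketch: mean and variance of a discrete distribution, following the
# definitions above. Values x_i and probabilities P_i are illustrative.
values = [1, 2, 3, 4]          # x_i
probs = [0.1, 0.2, 0.3, 0.4]   # P_i, must sum to 1

# Mean: mu = sum(P_i * x_i)
mu = sum(p * x for p, x in zip(probs, values))

# Variance: sigma^2 = sum(P_i * (x_i - mu)^2)
var = sum(p * (x - mu) ** 2 for p, x in zip(probs, values))

print(mu, var)  # mu ≈ 3.0, var ≈ 1.0
```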

• #### Standard Deviation

##### Latex Code
        \sigma=\sqrt{V(X)}=\sqrt{\sum_{i} P_{i}(x_{i} - \mu)^2}=\sqrt{\frac{\sum_{i} (x_{i} - \mu)^2}{n}} \\ \sigma=\sqrt{V(X)}=\sqrt{\int (x-\mu)^{2}f(x) dx}

##### Explanation

The standard deviation \sigma equals the square root of the variance \sigma^{2} of a dataset X.
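A minimal sketch of the equally weighted (P_{i} = 1/n) form of the formula; the dataset is illustrative:

```python
import math

# Sketch: standard deviation as sqrt(sum((x_i - mu)^2) / n),
# matching the discrete formula above. Data is a made-up example.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mu = sum(data) / len(data)
sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / len(data))
print(sigma)  # 2.0 for this dataset
```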

### 2. Probability Distributions

• #### Binomial Distribution

##### Latex Code
        X \sim B(n,p) \\f(x)=\begin{pmatrix}n\\ x\end{pmatrix}p^{x}q^{n-x}=C^{x}_{n}p^{x}q^{n-x},q=1-p\\\text{Binomial Mean}\ \mu=np\\\text{Binomial Variance}\ \sigma^2=npq

##### Explanation

The binomial distribution measures, over n independent trials, the probability that x of the n trials are successes (like getting heads when flipping a coin). In this formulation, f(x) denotes the probability of observing x successes in n independent trials, p denotes the probability of success in each single trial, and q denotes the probability of failure, which equals 1-p.
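The pmf, mean, and variance can be sketched directly from the formula; n and p below are example values:

```python
from math import comb

# Sketch of the binomial pmf f(x) = C(n, x) * p^x * q^(n-x).
def binom_pmf(x, n, p):
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

n, p = 10, 0.5              # example parameters
mean = n * p                # mu = np
var = n * p * (1 - p)       # sigma^2 = npq
# The pmf sums to 1 over x = 0..n.
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
```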

• #### Poisson Distribution

##### Latex Code
        X \sim \pi(\mu) \\f(x)=\frac{\mu^{x}}{x!}e^{-\mu}\\ \text{Poisson Mean}\ \mu \\ \text{Poisson Variance}\ \sigma^2=\mu

##### Explanation

\mu equals the expected number of times an event occurs in a unit time period. For the Poisson distribution, both the mean and the variance equal \mu.
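A sketch of the pmf above; \mu is an example rate, and the mean and variance are recovered numerically by truncating the infinite sum at a large x:

```python
from math import exp, factorial

# Sketch of the Poisson pmf f(x) = mu^x / x! * e^{-mu}; mu is illustrative.
def poisson_pmf(x, mu):
    return mu**x / factorial(x) * exp(-mu)

mu = 3.0
# Both the mean and the variance of the distribution equal mu;
# truncating at x = 100 makes the remaining tail negligible.
mean = sum(x * poisson_pmf(x, mu) for x in range(100))
var = sum((x - mean) ** 2 * poisson_pmf(x, mu) for x in range(100))
```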

• #### Normal Gaussian Distribution

##### Latex Code
        X \sim \mathcal{N}(\mu,\sigma^2) \\ f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp{[-\frac{(x-\mu)^{2}}{2\sigma^{2}}]}

##### Explanation

X denotes the random variable which follows the normal distribution. \mu denotes the mean value and \sigma denotes the standard deviation.
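The density above translates directly into code; the parameters are example values:

```python
from math import exp, pi, sqrt

# Sketch of the normal density f(x); mu and sigma are example parameters.
def normal_pdf(x, mu, sigma):
    return 1.0 / (sigma * sqrt(2 * pi)) * exp(-((x - mu) ** 2) / (2 * sigma**2))

# At x = mu the density peaks at 1 / (sigma * sqrt(2 * pi)).
peak = normal_pdf(0.0, 0.0, 1.0)
```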

• #### Chi-Square Test

##### Latex Code
        \chi ^{2}=\sum \frac{(A-E)^2}{E}\\=\sum^{K}_{i=1}\frac{(A_{i}-E_{i})^2}{E_{i}}=\sum^{K}_{i=1}\frac{(A_{i}-np_{i})^2}{np_{i}}

##### Explanation

The Chi-Square test measures how close an actual observed distribution A is to an assumed theoretical distribution E. The dataset X is split into K different buckets, and the Chi-Square statistic is calculated as above, where A_{i} is the observed count and E_{i}=np_{i} is the expected count in bucket i.
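Computing the statistic is a one-liner once the buckets are set up; the observed counts and probabilities below are made-up examples:

```python
# Sketch: Chi-Square statistic over K buckets, following the formula above.
# Observed counts A_i and theoretical probabilities p_i are illustrative.
observed = [18, 22, 30, 30]    # A_i, K = 4 buckets
p = [0.25, 0.25, 0.25, 0.25]   # assumed theoretical p_i
n = sum(observed)

# chi^2 = sum((A_i - n*p_i)^2 / (n*p_i))
chi2 = sum((a - n * pi) ** 2 / (n * pi) for a, pi in zip(observed, p))
```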

• #### Sum of Random Variables

##### Latex Code
        W=aX+bY \\ E(W)=aE(X)+bE(Y) \\ \text{Var}(W)=a^{2}\text{Var}(X) + b^{2}\text{Var}(Y), \text{X and Y independent}

##### Explanation

E(X) and E(Y) denote the means of the random variables X and Y, and Var(X) and Var(Y) denote their variances. W is the weighted sum of the two random variables; the variance formula holds only when X and Y are independent.
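The identities above can be sketched with example moments (all numbers below are illustrative):

```python
# Sketch: mean and variance of W = aX + bY for independent X and Y,
# using the identities above. All moments below are example values.
a, b = 2.0, 3.0
ex, ey = 1.0, 4.0   # E(X), E(Y)
vx, vy = 0.5, 2.0   # Var(X), Var(Y)

ew = a * ex + b * ey          # E(W) = aE(X) + bE(Y)
vw = a**2 * vx + b**2 * vy    # Var(W), valid when X and Y are independent
```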

• #### Covariance

##### Latex Code
        \text{Cov}(X,Y)=E[(X-E(X))(Y-E(Y))]\\=E(XY)-2E(X)E(Y)+E(X)E(Y)\\=E(XY)-E(X)E(Y)

##### Explanation

Covariance measures the joint variation of two random variables X and Y around their expected values E(X) and E(Y). The definition of covariance is E[(X-E(X))(Y-E(Y))], which expands to E(XY)-E(X)E(Y).
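The simplified form E(XY)-E(X)E(Y) is easy to verify on paired samples; the data below is illustrative:

```python
# Sketch: covariance via E(XY) - E(X)E(Y), matching the identity above.
# The paired samples are made-up; expectations are equally weighted averages.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
n = len(xs)

ex = sum(xs) / n                              # E(X)
ey = sum(ys) / n                              # E(Y)
exy = sum(x * y for x, y in zip(xs, ys)) / n  # E(XY)
cov = exy - ex * ey                           # Cov(X, Y)
```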