Low-Rank Adaptation (LoRA)

Tags: #nlp #llm #RLHF

Equation

$$W_{0} + \Delta W = W_{0} + BA, \quad h = W_{0}x + \Delta W x = W_{0}x + BAx, \quad \text{Initialization: } A \sim N(0, \sigma^{2}),\ B = 0$$
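
As a sanity check on the reparameterization, the NumPy sketch below evaluates $$h = W_{0}x + BAx$$ with the stated initialization ($$A \sim N(0, \sigma^{2})$$, $$B = 0$$). The sizes d, k, the rank r, and $$\sigma$$ are illustrative choices, not values from the text.

```python
import numpy as np

# Illustrative sizes (not from the text): d x k base weight, rank-r update.
d, k, r, sigma = 64, 32, 4, 0.02
rng = np.random.default_rng(0)

W0 = rng.normal(size=(d, k))             # pretrained weight, kept frozen
A = rng.normal(0.0, sigma, size=(r, k))  # A ~ N(0, sigma^2), trainable
B = np.zeros((d, r))                     # B = 0 at initialization, trainable
x = rng.normal(size=(k,))                # layer input

h = W0 @ x + B @ (A @ x)                 # h = W0 x + (BA) x

# Because B = 0, the update BA is zero, so the adapted layer reproduces
# the pretrained output exactly at the start of training.
assert np.allclose(h, W0 @ x)
```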

LaTeX Code

    W_{0} + \Delta W = W_{0} + BA, \quad h = W_{0}x + \Delta W x = W_{0}x + BAx, \quad \text{Initialization: } A \sim N(0, \sigma^{2}),\ B = 0
                            

Introduction

$$W_{0} \in \mathbb{R}^{d \times k}$$: denotes the frozen pretrained weight matrix.

$$B \in \mathbb{R}^{d \times r}$$, $$A \in \mathbb{R}^{r \times k}$$: denote the trainable low-rank factors, with rank $$r \ll \min(d, k)$$, so the update $$\Delta W = BA$$ has the same shape as $$W_{0}$$ while adding far fewer trainable parameters.

$$x$$, $$h$$: denote the layer input and output.
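
The identity $$W_{0} + \Delta W = W_{0} + BA$$ also means the low-rank update can be folded into the base weight after training. The sketch below, again with illustrative sizes and random stand-ins for trained $$A$$ and $$B$$, checks that applying the factors on the fly and merging them into $$W_{0}$$ give the same output.

```python
import numpy as np

# Illustrative sizes and random stand-ins for trained A, B (not from the text).
d, k, r = 64, 32, 4
rng = np.random.default_rng(1)

W0 = rng.normal(size=(d, k))            # frozen pretrained weight
A = rng.normal(0.0, 0.02, size=(r, k))  # low-rank factor (as if trained)
B = rng.normal(size=(d, r))             # nonzero here, as it would be after training
x = rng.normal(size=(k,))

h_unmerged = W0 @ x + B @ (A @ x)       # h = W0 x + B A x
W_merged = W0 + B @ A                   # W0 + Delta W, same d x k shape as W0
h_merged = W_merged @ x

# Keeping the factors separate and merging the update give identical outputs.
assert np.allclose(h_unmerged, h_merged)
```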
