SARA-RT: Scaling up Robotics Transformers with Self-Adaptive Robust Attention
Isabel Leal, Krzysztof Choromanski, Deepali Jain, Avinava Dubey, Jake Varley, Michael Ryoo, Yao Lu, Frederick Liu, Vikas Sindhwani, Quan Vuong, Tamas Sarlos, Ken Oslund, Karol Hausman, Kanishka Rao
We present Self-Adaptive Robust Attention for Robotics Transformers (SARA-RT): a new paradigm for addressing the emerging challenge of scaling up Robotics Transformers (RT) for on-robot deployment. SARA-RT relies on a new fine-tuning method that we propose, called up-training. It converts pre-trained or already fine-tuned Transformer-based robotic policies of quadratic time complexity (includi...
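The abstract's core idea, converting quadratic-time softmax attention into a linear-time variant, can be illustrated with a minimal sketch. This is the generic kernelized (linear) attention trick, not SARA-RT's actual robust attention mechanism: the feature map `phi` below is a placeholder assumption, and the key point is that associativity lets us compute `phi(Q) @ (phi(K).T @ V)` in O(L·d²) rather than forming the L×L attention matrix.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes an L x L matrix, O(L^2) in sequence length.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized attention: replace exp(q . k) with phi(q) . phi(k).
    # Associativity lets us compute phi(K).T @ V first (a d x d_v matrix),
    # so the cost is O(L * d * d_v) instead of O(L^2 * d).
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                       # (d, d_v) summary of keys and values
    normalizer = Qp @ Kp.sum(axis=0)    # (L,) per-query normalization
    return (Qp @ kv) / normalizer[:, None]

L, d = 8, 4
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, L, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Note that `linear_attention` approximates, rather than reproduces, softmax attention; the quality of the approximation depends entirely on the choice of feature map, which is precisely the design space methods like SARA-RT's up-training operate in.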