31 July 2023 to 4 August 2023
America/Chicago timezone

Equivariant transformer is all you need

2 Aug 2023, 10:20
20m
Ramsey Auditorium

Speaker

Prof. Akio Tomiya (IPUT Osaka)

Description

Machine learning, and deep learning in particular, has been accelerating computational physics, including the simulation of systems on a lattice. Equivariance is essential when simulating a physical system because it imposes a strong inductive bias on the probability distribution described by a machine learning model. However, imposing symmetry on the model can sometimes lead to a poor acceptance rate in self-learning Monte Carlo (SLMC). On the other hand, the attention mechanism used in Transformers such as GPT provides large model capacity. We introduce symmetry-equivariant attention to SLMC. To evaluate this architecture, we apply it to a spin-fermion model on a two-dimensional lattice. We find that it overcomes the poor acceptance rates of linear models, and we observe a scaling law for the acceptance rate, analogous to the scaling laws seen in machine learning. This talk is based on arXiv:2306.11527.
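The abstract rests on two ingredients: an attention layer that is equivariant under the lattice symmetry, and the SLMC acceptance test whose rate such a layer is meant to improve. The sketch below is not the authors' architecture (the paper treats a spin-fermion model with its full symmetry group); it is a minimal NumPy illustration of why parameter-sharing self-attention commutes with lattice translations, so that T(A(phi)) = A(T(phi)) for a cyclic site shift T. All names here (translation_equivariant_attention, phi, W_q, and so on) are illustrative, not taken from the paper.

```python
import numpy as np

def translation_equivariant_attention(phi, W_q, W_k, W_v):
    """Toy self-attention over N lattice sites with d features each.

    Because the projections W_q, W_k, W_v are shared across sites and
    the attention weights depend only on pairwise inner products,
    permuting (translating) the sites permutes the output identically.
    """
    q = phi @ W_q                                  # (N, d) queries
    k = phi @ W_k                                  # (N, d) keys
    v = phi @ W_v                                  # (N, d) values
    scores = q @ k.T / np.sqrt(q.shape[1])         # (N, N) pairwise scores
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ v                             # (N, d) updated field

# Check equivariance under a lattice translation (cyclic shift of sites).
rng = np.random.default_rng(0)
N, d = 16, 4
phi = rng.normal(size=(N, d))
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
shift = lambda x: np.roll(x, 1, axis=0)
out = translation_equivariant_attention(phi, W_q, W_k, W_v)
out_shifted = translation_equivariant_attention(shift(phi), W_q, W_k, W_v)
assert np.allclose(shift(out), out_shifted)        # equivariance holds
```

For context on the acceptance rate: SLMC proposes configurations with a learned effective action S_eff and corrects for the mismatch with the true action S in a Metropolis step, with acceptance probability A = min(1, exp(-(S_new - S_old) + (S_eff_new - S_eff_old))). A better-matched effective model therefore gives a higher acceptance rate. A hedged sketch of that test (again with illustrative names):

```python
import numpy as np

def slmc_accept(S_old, S_new, Seff_old, Seff_new, rng):
    """Metropolis accept/reject step for self-learning Monte Carlo.

    The proposal is drawn from the effective (learned) model, so the
    acceptance probability corrects for the difference between the
    true action S and the effective action S_eff. A poorly matched
    S_eff yields a low acceptance rate, which is the failure mode the
    equivariant model is designed to overcome.
    """
    log_A = -(S_new - S_old) + (Seff_new - Seff_old)
    return np.log(rng.random()) < log_A
```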

Topical area: Algorithms and Artificial Intelligence

Primary author

Prof. Akio Tomiya (IPUT Osaka)

Co-author

Dr Yuki Nagai (JAEA)
