OK, now that I've moved over to sigmoid.social, time for my first...uh...tootprint? Mastoscript? Manudon? Screw it - we wrote a paper and I want to share it with you.
Very pleased to be able to share this one: is attention all you need to solve the Schrödinger equation? https://arxiv.org/abs/2211.13672
For the last several years, numerous groups have shown that neural networks can make calculations in quantum chemistry much more accurate - FermiNet, PauliNet, etc. We wrote a review article about it here: https://arxiv.org/abs/2208.12590
Most work since then has only made small tweaks to these basic neural network ansatzes. Instead, we tried to reinvent neural network ansatzes from the ground up. The result is a model we call the Psiformer: basically, a Transformer encoder designed for quantum chemistry.
One problem with the FermiNet was that its accuracy seemed to degrade as system size increased. We find that the Psiformer is uniformly more accurate than the FermiNet on every system we investigated.
Most impressively, the bigger the system, the bigger the improvement of the Psiformer over the FermiNet. On the largest system we looked at, the benzene dimer (84 electrons!), the Psiformer with VMC is more accurate than the FermiNet with *diffusion* Monte Carlo!
I really never thought I’d be an “attention is all you need” guy, but at least in this case, it seems like neural network ansatzes using self-attention are a clear improvement over prior models, and present a path to unprecedented accuracy in quantum chemical calculations.
This work was led by Ingrid von Glehn, in collaboration with James Spencer. For those at #NeurIPS2022, I'll be speaking about this and other topics in deep learning and quantum chemistry at the #ML4PhysicalSciences workshop on Saturday! https://ml4physicalsciences.github.io/
@pfau #paperthread is the usual tag!
@nsaphra How bland.
@pfau Look we can’t all be #lichensubscribe