I often see Python plus some fast libraries used for computationally “heavy” calculations. For example, a recent one (climate modeling, by Google Research, published in Nature) is NeuralGCM:
https://research.google/blog/fast-accurate-climate-modeling-with-neuralgcm/
- Neural general circulation models for weather and climate | Nature
- Google AI predicts long-term climate trends and weather — in minutes
- [2311.07222] Neural General Circulation Models for Weather and Climate
- Code (Python/JAX): GitHub - google-research/neuralgcm: Hybrid ML + physics model of the Earth's atmosphere
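Roughly, the idea is a differentiable physics core with a learned neural correction, trained end-to-end under JIT. Here is a toy sketch of that pattern in JAX (my own illustration, not NeuralGCM's actual API; `physics_step` and `learned_correction` are made-up names, and the dynamics is a stand-in):

```python
import jax
import jax.numpy as jnp

def physics_step(u, dt=0.05):
    # hand-written "dynamics core": toy advection + diffusion on a periodic
    # 1-D grid, standing in for a real GCM solver
    adv = -0.5 * u * (jnp.roll(u, -1) - jnp.roll(u, 1))
    dif = 0.01 * (jnp.roll(u, -1) - 2.0 * u + jnp.roll(u, 1))
    return u + dt * (adv + dif)

def learned_correction(params, u):
    # tiny MLP, standing in for the neural-network component
    h = jnp.tanh(u @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

@jax.jit
def hybrid_step(params, u):
    # physics first, then a learned correction; everything is differentiable,
    # so the combined step can be trained end-to-end against data
    return physics_step(u) + learned_correction(params, u)

k1, k2 = jax.random.split(jax.random.PRNGKey(0))
n, width = 64, 32
params = {
    "w1": 0.1 * jax.random.normal(k1, (n, width)), "b1": jnp.zeros(width),
    "w2": 0.1 * jax.random.normal(k2, (width, n)), "b2": jnp.zeros(n),
}
u = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, n))
u = hybrid_step(params, u)  # one hybrid model step
```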
I remember that Boltzmann generators (a deep-learning method for sampling the canonical/equilibrium distribution of molecular and protein configurations) also used Python together with ML libraries:
- https://www.science.org/doi/full/10.1126/science.aaw1147
- [1812.01729] Boltzmann Generators -- Sampling Equilibrium States of Many-Body Systems with Deep Learning
- GitHub - noegroup/bgflow: Boltzmann Generators and Normalizing Flows in PyTorch
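If I remember correctly, the training trick there is to fit a normalizing flow f by minimizing the reverse KL to the Boltzmann distribution, which reduces to averaging u(f(z)) - log|det J_f(z)| over prior samples z ("training by energy"). A toy sketch with a single affine coupling layer (my own, in JAX rather than the PyTorch of bgflow; real models stack many RealNVP-style layers):

```python
import jax
import jax.numpy as jnp

def energy(x):
    # toy target u(x): double well in x[0], harmonic in x[1]
    return (x[0] ** 2 - 1.0) ** 2 + 0.5 * x[1] ** 2

def flow(params, z):
    # one affine coupling layer: x0 = z0, x1 = z1 * exp(s(z0)) + t(z0)
    s = jnp.tanh(params["ws"] * z[0] + params["bs"])
    t = params["wt"] * z[0] + params["bt"]
    x = jnp.stack([z[0], z[1] * jnp.exp(s) + t])
    return x, s  # s is log|det J| of this layer

def kl_loss(params, zs):
    # reverse-KL / "training by energy" objective:
    # E_z[ u(f(z)) - log|det J_f(z)| ], up to an additive constant
    def per_sample(z):
        x, logdet = flow(params, z)
        return energy(x) - logdet
    return jnp.mean(jax.vmap(per_sample)(zs))

params = {"ws": 0.1, "bs": 0.0, "wt": 0.1, "bt": 0.0}
grad_fn = jax.jit(jax.grad(kl_loss))
key = jax.random.PRNGKey(0)
for _ in range(500):
    key, sub = jax.random.split(key)
    zs = jax.random.normal(sub, (256, 2))  # Gaussian prior samples
    grads = grad_fn(params, zs)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
```

After training, samples f(z) can be reweighted by exp(-u(x) + log q(x)) to get unbiased equilibrium estimates, which is the other half of the paper's method.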
Another example may be FermiNet, a neural-network wavefunction ansatz for ab initio quantum Monte Carlo, also implemented in Python/JAX.
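The machinery around FermiNet is variational Monte Carlo: sample |psi(x)|^2 with MCMC and minimize the averaged local energy E_loc = -0.5 * psi''/psi + V. A minimal 1-D harmonic-oscillator version of that loop (my own toy in JAX; `log_psi` stands in for the actual antisymmetric network):

```python
import jax
import jax.numpy as jnp

def log_psi(params, x):
    # toy trial wavefunction log psi(x) = -a * x^2; FermiNet replaces this
    # with a deep antisymmetric network over all electron coordinates
    return -params["a"] * x ** 2

def local_energy(params, x):
    # E_loc = -0.5 * (logpsi'' + logpsi'^2) + V(x), with V(x) = 0.5 x^2
    d1 = jax.grad(log_psi, argnums=1)
    d2 = jax.grad(d1, argnums=1)
    g = d1(params, x)
    return -0.5 * (d2(params, x) + g ** 2) + 0.5 * x ** 2

def metropolis(params, key, n=2000, step=0.5):
    # random-walk Metropolis chain targeting |psi|^2
    def body(carry, _):
        x, key = carry
        key, k1, k2 = jax.random.split(key, 3)
        xp = x + step * jax.random.normal(k1)
        accept = jnp.exp(2.0 * (log_psi(params, xp) - log_psi(params, x)))
        x = jnp.where(jax.random.uniform(k2) < accept, xp, x)
        return (x, key), x
    (_, _), xs = jax.lax.scan(body, (jnp.array(0.0), key), None, length=n)
    return xs

params = {"a": 0.3}
xs = metropolis(params, jax.random.PRNGKey(0))
print(jnp.mean(jax.vmap(lambda x: local_energy(params, x))(xs)))
# ~0.57 for a = 0.3; minimizing over a gives a = 0.5, the exact E = 0.5
```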