Can Fortran be used for training GANs in ML?

Can Fortran be used to train generative adversarial networks like StyleGAN on GPUs? Is it easier to train on GPUs with Fortran rather than Python? I have a 20 GB dataset of unlabelled facial data and an NVIDIA Tesla V100.

Thanks and Regards,
Henil Shalin Panchal


I don’t think so. The official NVIDIA Labs code uses Python/TensorFlow with GPUs for training: GitHub - NVlabs/stylegan: StyleGAN - Official TensorFlow Implementation

Fortran machine learning frameworks are scarce, and GPU programming in Fortran receives far less support than in C/C++/Julia/Python. You could probably access cuDNN or the Intel® oneDNN library directly from Fortran through C bindings (see the sketch below), but it would likely involve a lot of effort. AFAIK, the Python frameworks call the same optimized vendor libraries under the hood; you just need to make sure you’ve got them installed correctly with GPU support enabled.
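For illustration, here is a minimal sketch of what such a binding could look like, using `iso_c_binding` to wrap two real cuDNN entry points (`cudnnCreate` and `cudnnDestroy`). This is an untested assumption of how the interop would be set up, not production code, and a full training loop would need many more bindings:

```fortran
module cudnn_interface
   use, intrinsic :: iso_c_binding, only: c_ptr, c_int
   implicit none
   interface
      ! cudnnStatus_t cudnnCreate(cudnnHandle_t *handle);
      function cudnnCreate(handle) bind(c, name="cudnnCreate") result(status)
         import :: c_ptr, c_int
         type(c_ptr), intent(out) :: handle   ! receives the opaque cuDNN context
         integer(c_int) :: status             ! 0 means CUDNN_STATUS_SUCCESS
      end function cudnnCreate

      ! cudnnStatus_t cudnnDestroy(cudnnHandle_t handle);
      function cudnnDestroy(handle) bind(c, name="cudnnDestroy") result(status)
         import :: c_ptr, c_int
         type(c_ptr), value :: handle         ! the handle itself is passed by value
         integer(c_int) :: status
      end function cudnnDestroy
   end interface
end module cudnn_interface

program cudnn_smoke_test
   use cudnn_interface
   use, intrinsic :: iso_c_binding, only: c_ptr, c_int
   implicit none
   type(c_ptr)    :: handle
   integer(c_int) :: status

   status = cudnnCreate(handle)
   if (status /= 0) error stop "cudnnCreate failed"
   print *, "cuDNN context created successfully"
   status = cudnnDestroy(handle)
end program cudnn_smoke_test
```

Compiled with something like `gfortran cudnn_smoke_test.f90 -lcudnn` (assuming the cuDNN library is on your link path), this only verifies you can reach the library from Fortran; every tensor descriptor, convolution, and optimizer step needed for GAN training would have to be wrapped in the same way, which is where the effort goes.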
