Dear all,
A new version of athena has been released: version 2.0.0!
Since my last post regarding ATHENA back in March 2024, I have done a significant rewrite of the code to support a plethora of new capabilities. For a brief recap, ATHENA is a Fortran-based neural network library, with a focus on spatial- and now graph-based data.
Here is a brief summary of the new features since my last post (1.2.4):
- added message passing layers (graph neural network support)
- added support for physics-informed neural network architectures
- added support for merge/skip connection layers and, hence, residual network architectures
- added capabilities to extend athena with custom loss, initialiser, and activation types
- added recurrent layers
- increased support for data padding
- added support for reading and writing ONNX data files (the human-readable JSON format)
- added ReadtheDocs
- all features of athena should now be extendable without editing the source code, while still operating with existing library functionality
- backpropagation now handled by diffstruc
- and lots more
NOTE: The code coverage badge is not working properly (so the reported 72% coverage is not to be trusted).
The library can be installed using fpm (recommended) or CMake. Given the significant rewrite, athena 2.0.0 is unlikely to be backwards compatible; this, together with the many added features, warranted a major version number bump.
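For fpm users, pulling in the library is typically a one-line entry in your project's fpm.toml. The repository path below is my best guess from the project name and author; please check it against the README before use:

```toml
[dependencies]
athena = { git = "https://github.com/nedtaylor/athena" }
```

After that, `fpm build` should fetch and compile the dependency alongside your project.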
The current layers built directly into athena are:
- Input layers
- Fully-connected (dense) layers
- Activation layers
- 1D, 2D, and 3D convolutional layers
- Message passing layers (currently Duvenaud and Kipf)
- Recurrent layers
- 1D, 2D, and 3D padding layers
- 2D and 3D dropblock layers
- Dropout layers
- Reshape layers
- Flatten layers (automated)
- 1D, 2D, and 3D average pooling layers
- 1D, 2D, and 3D maxpooling layers
- 1D, 2D, and 3D batch normalisation layers
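To give a flavour of how these layers fit together, here is a rough sketch of assembling a small fully-connected classifier. The constructor and method names (network_type, full_layer_type, %add, %compile, %train) are based on my reading of earlier athena examples and may differ in 2.0.0; treat this as illustrative rather than definitive:

```fortran
program simple_net
  use athena
  implicit none
  type(network_type) :: network

  ! assemble a small dense network (layer/type names assumed, see note above)
  call network%add(full_layer_type(num_inputs=784, num_outputs=32, &
       activation_function='relu'))
  call network%add(full_layer_type(num_outputs=10, &
       activation_function='softmax'))

  ! choose optimiser and loss, then train on your own x_train/y_train arrays
  call network%compile(optimiser=base_optimiser_type(learning_rate=0.01), &
       loss_method='cce')
  ! call network%train(x_train, y_train, num_epochs=10, batch_size=32)
end program simple_net
```

The same %add pattern should apply to the convolutional, pooling, and message passing layers listed above.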
As the library was developed with the SOLID principles in mind, all layers are extendable without having to modify the source code, and extended layers can function with all athena procedures.
The ATHENA project has drawn some inspiration from the neural-fortran project. I recommend checking out that project as well for neural-network functionality in modern Fortran.
I hope that the new features are useful to others as well as myself. Feel free to do whatever you want with it. I might be unavailable for a month or two, but will get back to working on athena more in the future.
Feel free to ask any questions or make recommendations. If anyone has any interest in contributing to this project, please get in touch.
I want to thank Artan Qerushi for discussions during athena's development over the past year, particularly involving graph convolutional networks and physics-informed architectures.
Kind regards,
Ned