Geomagnetically Induced Currents (GICs) can be produced by the interaction between the solar wind and the magnetosphere. Powerful GICs are capable of causing significant damage to critical infrastructure, damage that could be mitigated if these events could be predicted. Machine-learning-based models offer great promise for predicting GIC events, but current models fail to provide consistently reliable predictions.
In this work, we describe a novel machine learning model developed for predicting GIC events. The proposed model is based on the recently introduced transformer architecture, which was originally designed for sequence-to-sequence modeling and is adapted here to multivariate time series data. The model incorporates a multi-headed attention mechanism and a positional encoding scheme. Together, these two components allow the model to attend more closely to significant predictors when they arise, to capture complex, variable-length time dependencies in the data, and to process the data in parallel, eliminating the time-consuming sequential processing that many other machine learning models require when trained on sequential data. The result is a high-performance model that can be trained over longer periods of time and at higher resolution more efficiently and effectively.
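To make the two components concrete, the following is a minimal NumPy sketch of sinusoidal positional encoding and multi-headed scaled dot-product attention, in the standard form introduced with the transformer. It is an illustrative toy, not the paper's actual model: the dimensions, weight matrices, and input are placeholders, and a real implementation would add layer normalization, feed-forward blocks, and learned weights.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: injects each time step's position
    so that parallel (non-sequential) processing retains order information."""
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1)
    i = np.arange(d_model)[None, :]                # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])           # even dims: sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])           # odd dims: cosine
    return pe

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, n_heads):
    """Scaled dot-product attention split across n_heads heads; every
    time step attends to every other, capturing variable-length dependencies."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project to queries/keys/values, then split into heads.
    q = (x @ Wq).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    weights = softmax(scores, axis=-1)
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Toy input: 32 time steps of a 16-dimensional series, plus positions.
rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 32, 16, 4
x = rng.normal(size=(seq_len, d_model)) + positional_encoding(seq_len, d_model)
W = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
y = multi_head_attention(x, *W, n_heads)
print(y.shape)  # (32, 16)
```

Because the attention scores for all time-step pairs are computed as one matrix product, the whole sequence is processed in a single parallel pass rather than step by step as in a recurrent model.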
The model is trained using fifteen years of OMNI data and SuperMAG data from the Ottawa station over the years 1995--2010 at one-minute resolution. We present results on its performance in predicting the August 5, 2011 and March 17, 2015 geomagnetic storms, and compare the proposed model's performance with that of other recently developed machine learning models trained and tested on the same datasets.
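Training a sequence model on a long one-minute-resolution record typically involves slicing the multivariate series into fixed-length input windows with a prediction target some minutes ahead. The sketch below illustrates that generic windowing step on synthetic data; the function and parameter names (`make_windows`, `window`, `horizon`) are illustrative assumptions, not the paper's actual preprocessing pipeline, and the random array merely stands in for real OMNI/SuperMAG channels.

```python
import numpy as np

def make_windows(series, window, horizon):
    """Slice a (T, n_features) multivariate series into overlapping input
    windows and the value of the first channel `horizon` steps ahead.
    Illustrative only; target choice and scaling are placeholders."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])                  # (window, n_features)
        y.append(series[t + window + horizon - 1, 0])   # target: channel 0
    return np.stack(X), np.array(y)

# Toy stand-in for one-minute solar-wind / ground-magnetometer channels.
T, n_features = 200, 5
data = np.random.default_rng(1).normal(size=(T, n_features))
X, y = make_windows(data, window=60, horizon=10)  # 60-min input, 10-min lead
print(X.shape, y.shape)  # (131, 60, 5) (131,)
```

Each row of `X` is one training example the transformer can process in parallel; the lead time `horizon` controls how far ahead of a storm onset a prediction would be issued.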