TY - GEN
T1 - A Novel Approach to Time Series Complexity via Reservoir Computing
AU - Thorne, Braden
AU - Jüngling, Thomas
AU - Small, Michael
AU - Corrêa, Débora
AU - Zaitouny, Ayham
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - When working with time series, it is often beneficial to have an idea of how complex the signal is. Periodic, chaotic and random signals (from least to most complex) may each be approached in different ways, and knowing that a signal belongs to one of these categories can reveal a lot about the underlying system. In the field of time series analysis, permutation entropy has emerged as one of the premier measures of time series complexity because it can be calculated from data alone. We propose an alternative method for quantifying complexity based on the machine learning paradigm of reservoir computing, and show how the outputs of these neural networks capture similar information regarding signal complexity. For well-known dynamical systems, our proposed measure behaves similarly to both the Lyapunov exponent and permutation entropy. Additionally, we assess the dependence of our measure on key hyperparameters of the model, drawing conclusions about the invariance of the measure and possible implications for informing network structure.
AB - When working with time series, it is often beneficial to have an idea of how complex the signal is. Periodic, chaotic and random signals (from least to most complex) may each be approached in different ways, and knowing that a signal belongs to one of these categories can reveal a lot about the underlying system. In the field of time series analysis, permutation entropy has emerged as one of the premier measures of time series complexity because it can be calculated from data alone. We propose an alternative method for quantifying complexity based on the machine learning paradigm of reservoir computing, and show how the outputs of these neural networks capture similar information regarding signal complexity. For well-known dynamical systems, our proposed measure behaves similarly to both the Lyapunov exponent and permutation entropy. Additionally, we assess the dependence of our measure on key hyperparameters of the model, drawing conclusions about the invariance of the measure and possible implications for informing network structure.
KW - Information entropy
KW - Recurrent neural networks
KW - Reservoir computing
KW - Time series analysis
UR - http://www.scopus.com/inward/record.url?scp=85144824731&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85144824731&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-22695-3_31
DO - 10.1007/978-3-031-22695-3_31
M3 - Conference contribution
AN - SCOPUS:85144824731
SN - 9783031226946
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 442
EP - 455
BT - AI 2022
A2 - Aziz, Haris
A2 - Corrêa, Débora
A2 - French, Tim
PB - Springer Science and Business Media Deutschland GmbH
T2 - 35th Australasian Joint Conference on Artificial Intelligence, AI 2022
Y2 - 5 December 2022 through 9 December 2022
ER -