TY  - JOUR
AB  - Neural architecture search (NAS) emerged as a way to automatically optimize neural networks for a specific task and dataset. Despite an abundance of research on NAS for images and natural language applications, similar studies for time series data are lacking. Among NAS search spaces, chain-structured search spaces are the simplest and the most applicable to small datasets like time series. We compare three popular NAS strategies on chain-structured search spaces: Bayesian optimization (specifically the Tree-structured Parzen Estimator), the hyperband method, and reinforcement learning, in the context of financial time series forecasting. These strategies were employed to optimize simple, well-understood neural architectures such as the MLP, 1D CNN, and RNN, with more complex temporal fusion transformers (TFT) and their own optimizers included for comparison. We find that Bayesian optimization and the hyperband method perform best among the strategies, and that the RNN and 1D CNN perform best among the architectures, but all methods were very close to each other, with high variance due to the difficulty of working with financial datasets. We discuss our approach to overcoming the variance and provide implementation recommendations for future users and researchers.
AD  - School of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
AD  - School of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
AD  - School of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
AD  - Predictive Layer SA, Rolle, Switzerland
AD  - School of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
AU  - Levchenko, Denis
AU  - Rappos, Efstratios
AU  - Ataee, Shabnam
AU  - Nigro, Biagio
AU  - Robert-Nicoud, Stephan
CY  - Berlin, Germany
DA  - 2024-12
DO  - 10.1007/s41060-024-00690-y
ID  - 15225
JF  - International Journal of Data Science and Analytics
KW  - neural architecture search
KW  - time series forecasting
KW  - hyperparameter optimization
KW  - deep learning
KW  - neural networks
KW  - reinforcement learning
L1  - https://arodes.hes-so.ch/record/15225/files/Levchenko_2024_Chain-structured_neural_architecture.pdf
LA  - eng
PB  - Springer Nature
PY  - 2024
SN  - 2364-415X
T1  - Chain-structured neural architecture search for financial time series forecasting
UR  - https://arodes.hes-so.ch/record/15225/files/Levchenko_2024_Chain-structured_neural_architecture.pdf
VL  - 2024
ER  -