000015225 001__ 15225
000015225 005__ 20250107133955.0
000015225 022__ $$a2364-415X
000015225 0247_ $$2DOI$$a10.1007/s41060-024-00690-y
000015225 037__ $$aARTICLE
000015225 039_9 $$a2025-01-07 13:39:55$$b0$$c2025-01-06 13:33:19$$d1000305$$c2024-12-20 15:28:35$$d0$$y2024-12-20 15:28:27$$z1000099
000015225 041__ $$aeng
000015225 245__ $$aChain-structured neural architecture search for financial time series forecasting
000015225 260__ $$aBerlin, Germany$$bSpringer Nature
000015225 269__ $$a2024-12
000015225 300__ $$a10 p.
000015225 506__ $$avisible
000015225 520__ $$9eng$$aNeural architecture search (NAS) has emerged as a way to automatically optimize neural networks for a specific task and dataset. Despite an abundance of research on NAS for images and natural language applications, similar studies for time series data are lacking. Among NAS search spaces, chain-structured ones are the simplest and most applicable to small datasets such as time series. We compare three popular NAS strategies on chain-structured search spaces: Bayesian optimization (specifically the Tree-structured Parzen Estimator), the hyperband method, and reinforcement learning, in the context of financial time series forecasting. These strategies were employed to optimize simple, well-understood neural architectures such as the MLP, 1D CNN, and RNN, with more complex temporal fusion transformers (TFT) and their own optimizers included for comparison. We find that Bayesian optimization and the hyperband method perform best among the strategies, and that the RNN and 1D CNN perform best among the architectures, but all methods were very close to each other, with high variance due to the difficulty of working with financial datasets. We discuss our approach to overcoming the variance and provide implementation recommendations for future users and researchers.
000015225 540__ $$acorrect
000015225 592__ $$aHEIG-VD
000015225 592__ $$bIICT - Institut des Technologies de l'Information et de la Communication
000015225 592__ $$cIngénierie et Architecture
000015225 6531_ $$9eng$$aneural architecture search
000015225 6531_ $$9eng$$atime series forecasting
000015225 6531_ $$9eng$$ahyperparameter optimization
000015225 6531_ $$9eng$$adeep learning
000015225 6531_ $$9eng$$aneural networks
000015225 6531_ $$9eng$$areinforcement learning
000015225 655__ $$ascientifique
000015225 700__ $$aLevchenko, Denis$$uSchool of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
000015225 700__ $$aRappos, Efstratios$$uSchool of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
000015225 700__ $$aAtaee, Shabnam$$uSchool of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
000015225 700__ $$aNigro, Biagio$$uPredictive Layer SA, Rolle, Switzerland
000015225 700__ $$aRobert-Nicoud, Stephan$$uSchool of Engineering and Management Vaud, HES-SO University of Applied Sciences and Arts Western Switzerland
000015225 773__ $$tInternational Journal of Data Science and Analytics$$j2024
000015225 8564_ $$uhttps://arodes.hes-so.ch/record/15225/files/Levchenko_2024_Chain-structured_neural_architecture.pdf$$yPublished version$$9668f1b76-4d95-4da2-afe2-824e28846312$$s702063
000015225 906__ $$aGOLD
000015225 909CO $$ooai:hesso.tind.io:15225$$pGLOBAL_SET
000015225 950__ $$aaucun
000015225 980__ $$ascientifique
000015225 981__ $$ascientifique