Open access

Introductory Chapter: Time Series Analysis

Written By

Cláudia M. Viana, Sandra Oliveira and Jorge Rocha

Submitted: 12 February 2024 Published: 22 May 2024

DOI: 10.5772/intechopen.1004609

From the Edited Volume

Time Series Analysis - Recent Advances, New Perspectives and Applications

Jorge Rocha, Cláudia M. Viana and Sandra Oliveira


1. Introduction

Time series, defined as sequentially observed data points over time [1], find applications across diverse domains such as economics and engineering. The statistical analysis of time series is crucial, and Chatfield’s taxonomy identifies six main categories: Economic and Financial Time Series, Physical Time Series, Marketing Time Series, Process Control Data, Binary Processes, and Point Processes.

To effectively categorize time series, consideration of features like seasonality, trend, and outliers is essential [1]. Seasonality reflects recurring patterns over time intervals, while trend represents a systematic linear or nonlinear component. Outliers are observations distant from others, often indicating anomalies. The categorization and analysis of time series are pivotal for drawing meaningful inferences from the diverse structures encountered in engineering, science, sociology, and economics [2].

The objectives of time series analysis encompass description, explanation, prediction, and control. Description involves plotting observations over time to reveal patterns, while explanation explores relationships between variables. Prediction focuses on forecasting future values, and control utilizes time series to enhance control over physical or economic systems.

Possible applications range from land use/cover [3, 4] and agricultural change [5, 6] to tourism [7, 8], socioeconomic vulnerability [9], epidemiology [10], and health [11]. This chapter delves into advanced approaches for time series analysis.


2. Time series analysis

Time Series Analysis (TSA) involves constructing predictive models that generate a target variable or label based on sequential observations across a defined period, that is, data that is time-dependent. The analysis of time series involves studying the relationships between variables that change over time [12].

There are two types of time series: deterministic and stochastic. Deterministic time series can be predicted with certainty based on previous experience, while stochastic time series have random fluctuations [13]. Time series analysis is widely used in various fields such as economics, finance, and health research [12, 14]. It helps in identifying patterns, forecasting future values, and understanding the underlying dynamics of the data [14, 15, 16]. As such, it has applications in various domains and is important for making inferences about the future based on past observations [12, 13, 15, 17, 18].

To perform time series analysis, it is important to convert nonstationary data into stationary data using techniques like differencing [13]. Time series data, common in various fields, present unique challenges due to random noise and interdependencies between measurements at different time points. Autocorrelations and partial autocorrelations of the series can indicate the degree of correlation between each point and earlier values in the series [18]. Autocorrelation measures the correlation between a data point and its lagged values, while partial autocorrelation measures the correlation between a data point and its lagged values after removing the effects of intermediate lags. Recent developments have led to prominent approaches (Figure 1).

Figure 1.

Different approaches in time series analysis.
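
The differencing and autocorrelation checks described above can be sketched in a few lines. Below is a minimal illustration on a synthetic series (variable names and parameters are ours, not from the chapter): a linear trend makes the lag-1 autocorrelation near 1, and first differencing removes it.

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
t = np.arange(200)
series = 0.5 * t + rng.normal(0, 1, 200)  # linear trend + noise: nonstationary
diff = np.diff(series)                    # first difference removes the trend

print(acf(series, 1))  # near 1: persistence induced by the trend
print(acf(diff, 1))    # moderately negative: trend gone, differenced noise remains
```

In practice one would use library routines (e.g., ACF/PACF plots) rather than hand-rolled estimators, but the mechanics are the same.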

2.1 Forecasting

Forecasting entails predicting unseen values within an observed time series and is crucial in domains like economics and production planning. Despite a plethora of forecasting methods, challenges persist in achieving satisfactory generalization capabilities [19], and the most suitable methodology must still be chosen under a given set of preconditions. As a form of extrapolation, forecasting carries all of extrapolation's risks: as the forecasting horizon grows, errors can escalate, requiring careful model adaptation based on incoming information.
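
As a concrete, minimal example of a classical forecaster, the sketch below implements Holt's linear-trend method (smoothing parameters and data are illustrative choices, not recommendations) and extrapolates a short, roughly linear series:

```python
def holt_forecast(y, alpha=0.5, beta=0.3, horizon=5):
    """Holt's linear-trend method: update level and trend, then extrapolate."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)   # smooth the level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # smooth the trend
    return [level + (h + 1) * trend for h in range(horizon)]

y = [10.0, 12.1, 13.9, 16.2, 18.0, 20.1]  # growth of roughly 2 per step
print(holt_forecast(y, horizon=3))        # continues the ~2-per-step trend
```

Note how each forecast step compounds the estimated trend, which is exactly where the horizon-dependent error escalation mentioned above comes from.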

2.2 Anomaly detection

Anomaly Detection, synonymous with outlier or novelty detection, identifies abnormal data within a dataset [20]. Anomalies signify rare events, prompting critical actions in diverse domains such as network security and healthcare [21]. Anomalies can be categorized as point anomalies (deviations from normal patterns), contextual anomalies (anomalies in specific environments), and collective anomalies (erratic behavior in a group of similar data instances) [22].
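
A point anomaly, the simplest of the three categories, can be flagged with something as basic as a rolling z-score. The sketch below (window size and threshold are illustrative choices) injects one anomaly into Gaussian noise and recovers it:

```python
import numpy as np

def point_anomalies(x, window=10, threshold=3.0):
    """Flag indices lying more than `threshold` rolling std devs from the rolling mean."""
    x = np.asarray(x, dtype=float)
    flags = []
    for i in range(window, len(x)):
        ref = x[i - window:i]          # trailing window of "normal" behavior
        mu, sigma = ref.mean(), ref.std()
        if sigma > 0 and abs(x[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 100)
x[60] = 8.0                            # inject a point anomaly
print(point_anomalies(x))              # index 60 is flagged
```

Contextual and collective anomalies require richer models (e.g., conditioning on season, or scoring whole subsequences), but the same "deviation from an expected pattern" logic applies.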

2.3 Case-based reasoning

Case-based Reasoning (CBR) replicates human problem-solving by drawing comparisons between previously solved cases and applying similar solutions to new cases. It relies on specific knowledge and maintains a case base for reference, contributing to problem-solving and learning [23].
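
The retrieval step at the heart of CBR can be sketched as a nearest-neighbor search over a case base. The toy example below (Euclidean distance is one common similarity choice; the cases are hypothetical) retrieves the stored case most similar to a new query series:

```python
import numpy as np

def retrieve_case(query, case_base):
    """Return the index of the stored case closest to the query (Euclidean distance)."""
    dists = [np.linalg.norm(np.asarray(query, dtype=float) - np.asarray(c, dtype=float))
             for c in case_base]
    return int(np.argmin(dists))

case_base = [
    [1, 2, 3, 4],   # rising pattern
    [4, 3, 2, 1],   # falling pattern
    [2, 2, 2, 2],   # flat pattern
]
print(retrieve_case([1.1, 2.0, 2.9, 4.2], case_base))  # 0: the rising case
```

A full CBR cycle would then reuse the retrieved case's solution, revise it for the new problem, and retain the new case in the base.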

2.4 Bayesian optimization

Bayesian Optimization addresses global optimization problems by iteratively developing a statistical model of the unknown objective function [24]. It balances exploitation and exploration using an acquisition function, with Expected Improvement (EI) commonly employed [25]. This approach proves beneficial when dealing with black-box functions and limited samples [26].
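
The Expected Improvement acquisition function has a closed form when the surrogate's posterior at a candidate point is Gaussian. A minimal sketch (for minimization; the candidate statistics are hypothetical) shows how EI trades off a good posterior mean against high posterior uncertainty:

```python
import math

def expected_improvement(mu, sigma, best):
    """Closed-form EI for minimization, given posterior mean/std and the current best value."""
    if sigma == 0:
        return max(best - mu, 0.0)     # no uncertainty: improvement is deterministic
    z = (best - mu) / sigma
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))       # standard normal CDF
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)  # standard normal PDF
    return (best - mu) * cdf + sigma * pdf

# Candidate A: mean slightly better than the incumbent, low uncertainty (exploitation).
# Candidate B: mean worse than the incumbent, but high uncertainty (exploration).
print(expected_improvement(mu=0.9, sigma=0.1, best=1.0))
print(expected_improvement(mu=1.2, sigma=0.8, best=1.0))
```

Both candidates have positive EI; a full Bayesian optimization loop would fit a surrogate (typically a Gaussian process) and evaluate the objective wherever EI is maximized.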

2.5 Competitor analysis

Competitor Analysis in time series analysis considers frameworks that aid forecasting and anomaly detection, applied to data ranging from meteorological readings to Internet of Things (IoT) data [27]. Understanding competitors in these areas involves examining tools, methodologies, and their applicability to specific domains. Naturally, a deeper understanding of trends, anomaly occurrences, and forecasts requires analyzing these data structures, a process that frameworks can smooth by providing the researcher with tools for forecasting and anomaly detection. In addition, comparative summaries provide insights into the strengths and weaknesses of each competitor.


3. Advancements in time series analysis

The latest advancements in TSA address the challenges posed by the increasing amount of data and the need for more efficient algorithms for accurate forecasting, offering more efficient ways of analyzing and predicting data than traditional approaches [28]. They have been driven by the demand for accurate forecasting and decision-making in domains such as finance, healthcare, and environmental protection [29, 30].

Notably, new technologies and smarter algorithms are being developed to analyze large time series collections [28]. Likewise, the development of Python packages like tsfresh has accelerated the process of feature extraction from time series data [31]. There are also emerging applications in various scenarios, such as subject theme evolution, academic influence evaluation, network sentiment analysis, and technology trend analysis [32].

New methods in time series analysis offer more efficient and accurate ways of analyzing and predicting data compared to traditional approaches [27]. The use of advanced techniques, such as nonlinear time series analysis, has improved the ability to model and predict complex systems [33].

However, implementing new methods can be challenging, and the impact on forecasting accuracy and model performance depends on the specific techniques used. For example, the systematic framework proposed in TsP-SA enables qualitative comparison and assessment of different time series prediction techniques, leading to improved forecasting accuracy [34].


4. Persisting issues in TSA

There are some common challenges and limitations in time series analysis. These include dealing with nonstationarity, missing data, outliers, and the curse of dimensionality [13, 14]. Nonstationarity refers to the situation where the statistical properties of a time series change over time, making it difficult to model and forecast accurately [13]. The selection and evaluation of the most suitable method for a specific task can be challenging due to the lack of comprehensive categorization and comparison of techniques [34].

In terms of advancements and trends in time series analysis, there is ongoing research in developing more efficient algorithms and techniques to handle large-scale time series data [28]. Additionally, there is a focus on integrating time series analysis with other data mining and machine learning (ML) techniques to improve prediction accuracy and discover hidden patterns [35].

Time series analysis is a valuable tool for exploring, analyzing, and forecasting data indexed over time. It involves concepts such as autocorrelation, ARIMA (Autoregressive Integrated Moving Average) models, and stochastic volatility models. ARIMA models are commonly used in time series analysis. They combine autoregressive (AR), moving average (MA), and differencing components to model the trend, seasonality, and noise in a time series [14]. On the other hand, stochastic volatility models aim to capture changes over time in the variability, or volatility, of a time series. These models are particularly useful in finance and economics to capture the volatility clustering phenomenon observed in financial markets [14].
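
As a toy illustration of the autoregressive component of such models (not a full ARIMA fit, and with a simulated series of our own), the sketch below estimates the coefficient of an AR(1) process by least squares:

```python
import numpy as np

def fit_ar1(x):
    """Estimate phi in x_t = phi * x_{t-1} + e_t by least squares on the demeaned series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

rng = np.random.default_rng(2)
phi_true = 0.7
x = np.zeros(1000)
for t in range(1, 1000):                 # simulate an AR(1) process
    x[t] = phi_true * x[t - 1] + rng.normal()

print(fit_ar1(x))                        # close to the true value 0.7
```

A full ARIMA(p, d, q) fit adds differencing (the I component) and moving-average terms, which is what library implementations estimate jointly.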

TSA finds applications in finance and economics, helping in predicting future values and analyzing the effects of economic policies in market forecasting, risk management, and customer requirement analysis [36]. However, it also faces challenges such as nonstationarity and missing data. Ongoing advancements include the development of more efficient algorithms and the integration of time series analysis with other techniques for improved prediction and pattern discovery.


5. Conclusion

The state-of-the-art in time series analysis shows the prevalence of ML approaches, demonstrating excellent results in forecasting and anomaly detection. While classical models maintain relevance, Bayesian Optimization enhances the quality of results, as evidenced in specific scenarios. The discussion underscores the continued importance of classical models and the evolving role of ML in time series analysis.

The demand for time series forecasting spans various challenging domains and data analytics issues. This process involves utilizing models to interpret past sequences of values at evenly spaced intervals to predict subsequent values along the same time axis. Initially, methods like Holt-Winters [37] and later Box and Jenkins [38] laid the foundation. Subsequently, statistical models such as ARIMA, X11ARIMA, X12ARIMA [39, 40], Seasonal Autoregressive Integrated Moving Average (SARIMA) [38], Seasonal Autoregressive Integrated Moving Average with Exogenous factors (SARIMAX), and Support Vector Machine (SVM) [41] were introduced to this domain.

However, a limitation arises with these models in discerning exogenous input features [42]. Additionally, Autoregressive Moving Average (ARMA) models’ linear basis poses challenges in learning and predicting the nonlinear dynamics of time series [43]. Nonlinear machine learning approaches such as kernel methods [44], Gaussian processes [45], and ensemble methods [46] have shown significant improvements but still lack the ability to assimilate true nonlinear relationships.

Neural networks, on the other hand, offer promising solutions. Multi-Layer Perceptron (MLP) was among the first neural networks used for time series forecasting, followed by recurrent networks [47]. Long Short Term Memory (LSTM) networks have significantly improved accuracy by incorporating information from long-term dependencies [48]. Fuzzy approaches and fuzzy neural networks have also been explored [49].

In recent years, deep learning approaches, particularly in domains like natural language processing, have inspired algorithms applicable to time series forecasting [50]. Attention-based architectures [51], which excel in sequence prediction tasks [52, 53], hold promise in this domain. The most recent deep learning architectures include Vanilla Long Short Term Memory (V-LSTM) [54], Gated Recurrent Unit (GRU) [55], Bidirectional LSTM (BD-LSTM) [56], Autoencoder LSTM (AELSTM) [57], Convolutional Neural Networks combined with LSTM (CNN-LSTM) [58], Attention Mechanism Network [59], and the Transformer network [60].



Acknowledgments

We acknowledge GEOMODLAB (Laboratory for Remote Sensing, Geographical Analysis, and Modeling) of the Center of Geographical Studies/IGOT for providing the required equipment and software. This research was supported by the Portuguese Foundation for Science and Technology (FCT) under grant numbers 2022.09372.PTDC and 2022.05015.PTDC, and by the Centre for Geographical Studies—University of Lisbon and FCT under grant numbers UIDB/00295/2020 and UIDP/00295/2020.


References

1. Chatfield C, Xing H. The Analysis of Time Series: An Introduction with R. Boca Raton: CRC Press; 2019
2. Brockwell PJ, Davis RA. Introduction to Time Series and Forecasting. New York: Springer; 2002
3. Viana CM, Girão I, Rocha J. Long-term satellite image time-series for land use/land cover change detection using refined open source data in a rural region. Remote Sensing. 2019;11(9):1104
4. Viana CM, Pontius RG, Rocha J. Four fundamental questions to evaluate land change models with an illustration of a cellular automata–Markov model. Annals of the American Association of Geographers. 2023;113(10):2497-2511. DOI: 10.1080/24694452.2023.2232435
5. Viana CM, Freire D, Abrantes P, Rocha J. Evolution of agricultural production in Portugal during 1850-2018: A geographical and historical perspective. Land. 2021;10(8):776
6. Ribeiro C, Viana CM, Girão I, Figueiredo E, Rocha J. The spatiotemporal links between urban and rural regions through the sale and consumption of agri-food products. Sustainability. 2023;15(15):12038
7. Encalada-Abarca L, Ferreira CC, Rocha J. Measuring tourism intensification in urban destinations: An approach based on fractal analysis. Journal of Travel Research. 2022;61(2):394-413
8. Encalada-Abarca L, Ferreira CC, Rocha J. Revisiting city tourism in the longer run: An exploratory analysis based on LBSN data. Current Issues in Tourism. 2024;27(4):584-599
9. Santos PP, Zêzere JL, Pereira S, Rocha J, Tavares AO. A novel approach to measuring spatiotemporal changes in social vulnerability at the local level in Portugal. International Journal of Disaster Risk Science. 2022;13(6):842-861
10. Oliveira S, Capinha C, Rocha J. Predicting the time of arrival of the Tiger mosquito (Aedes albopictus) to new countries based on trade patterns of tyres and plants. Journal of Applied Ecology. 2023;60(11):2362-2374
11. Silva M, Betco I, Capinha C, Roquette R, Viana CM, Rocha J. Spatiotemporal dynamics of COVID-19 infections in mainland Portugal. Sustainability. 2022;14
12. León-Álvarez AL, Betancur-Gómez JI, Jaimes-Barragán F, Grisales-Romero H. Clinical and epidemiological rounds. Time series. Iatreia. 2016;29(3):373-381
13. Mathelinea D, Chandrashekar R, Mawengkang H. Stationarity test for medicine time series data. AIP Conference Proceedings. 2023;2714(1):30049. DOI: 10.1063/5.0128444
14. De Jong P. Time series analysis. In: Frees EW, Derrig RA, Meyers G, editors. Predictive Modeling Applications in Actuarial Science. Cambridge: Cambridge University Press; 2014. pp. 427-448
15. Žáček M. Introduction to time series. In: Volna E, Kotyrba M, Janosek M, editors. Pattern Recognition and Classification in Time Series Data. Hershey, PA: IGI Global; 2017. pp. 32-52. DOI: 10.4018/978-1-5225-0565-5.ch002
16. Kulp CW, Niskala BJ. Characterization of time series data. In: Skiadas CH, Skiadas C, editors. Handbook of Applications of Chaos Theory. New York: Chapman and Hall/CRC; 2017. pp. 211-230
17. Ivanović M, Kurbalija V. Time series analysis and possible applications. In: Biljanovic P, Butkovic Z, Skala K, Grbac TC, Cicin-Sain M, Sruk V, et al., editors. 39th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). Opatija, Croatia: IEEE; May 30 - June 3, 2016. pp. 473-479. DOI: 10.1109/MIPRO.2016.7522190
18. Chattopadhyay AK, Chattopadhyay T. Time series analysis. In: Statistical Methods for Astronomical Data Analysis. Springer Series in Astrostatistics. Vol. 3. New York, NY: Springer; 2014. DOI: 10.1007/978-1-4939-1507-1_9
19. Makridakis S, Spiliotis E, Assimakopoulos V. The M4 competition: Results, findings, conclusion and way forward. International Journal of Forecasting. 2018;34(4):802-808
20. Ahmed M, Naser Mahmood A, Hu J. A survey of network anomaly detection techniques. Journal of Network and Computer Applications. 2016;60:19-31
21. Mason AC. Artificial Intelligence Cybersecurity Threats: Determining Strategy and Decision-Making Effects. Ann Arbor, Michigan: ProQuest, Northcentral University; 2020
22. Ahmed M, Mahmood AN, Islam MR. A survey of anomaly detection techniques in financial domain. Future Generation Computer Systems. 2016;55:278-288
23. Kolodner JL. An introduction to case-based reasoning. Artificial Intelligence Review. 1992;6(1):3-34. DOI: 10.1007/BF00155578
24. Mockus J. The Bayesian approach to local optimization. In: Mockus J, editor. Bayesian Approach to Global Optimization. Mathematics and Its Applications. Vol. 37. Dordrecht: Springer Netherlands; 1989. pp. 125-156. DOI: 10.1007/978-94-009-0909-0_7
25. Mockus J, Tiesis V, Zilinskas A. The application of Bayesian methods for seeking the extremum. Towards Global Optimization. 1978;2:117-129
26. Brochu E, Cora VM, de Freitas N. A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning. arXiv. 2010;abs/1012.2599
27. Song YX. Time series analysis process of dynamic data in internet of things system. Journal of Physics: Conference Series. 2021;1856(1):12010. DOI: 10.1088/1742-6596/1856/1/012010
28. Palpanas T, Beckmann V. Report on the first and second interdisciplinary time series analysis workshop (ITISA). SIGMOD Record. 2019;48(3):36-40
29. Kapila Tharanga Rathnayaka RM, Seneviratne DMKN, Jianguo W, Arumawadu HI. A hybrid statistical approach for stock market forecasting based on artificial neural network and ARIMA time series models. In: 2015 International Conference on Behavioral, Economic and Socio-Cultural Computing (BESC). Nanjing, China; 2015. pp. 54-60. DOI: 10.1109/BESC.2015.7365958
30. Struckov A, Yufa S, Visheratin AA, Nasonov D. Evaluation of modern tools and techniques for storing time-series data. Procedia Computer Science. 2019:19-28. DOI: 10.1016/j.procs.2019.08.125
31. Christ M, Braun N, Neuffer J, Kempa-Liehr AW. Time series FeatuRe extraction on basis of scalable hypothesis tests (tsfresh – A Python package). Neurocomputing. 2018;307:72-77
32. Chen G, Wang K. Current advances of time series analysis in information science: Tasks, processes and problems. Documentation, Information & Knowledge. 2023;40(6):89-97
33. Xiong O, Li S. Methods of nonlinear time series cycle analysis in big data environment and IoT application. Wireless Communications and Mobile Computing. 2022:8
34. Mehrmolaei S, Keyvanpour MR. TsP-SA: Usage of time series techniques on healthcare data. International Journal of Electronic Healthcare. 2018;10(3):190-230
35. Klepac G, Kopal R, Mršić L. REFII model as a base for data mining techniques hybridization with purpose of time series pattern recognition. Studies in Computational Intelligence. 2016;611:237-270
36. Wang S. Research on data mining and investment recommendation of individual users based on financial time series analysis. International Journal of Data Warehousing and Mining. 2020;16(2):64-80
37. Winters PR. Forecasting sales by exponentially weighted moving averages. Management Science. 1960;6(3):324-342
38. Box GEP, Jenkins GM, Reinsel GC, Ljung GM. Time Series Analysis: Forecasting and Control. Hoboken, New Jersey: John Wiley & Sons; 2015
39. Shiskin J. The X-11 Variant of the Census Method II Seasonal Adjustment Program. US Department of Commerce, Bureau of the Census; 1967
40. Dagum EB. A new method to reduce unwanted ripples and revisions in trend-cycle estimates from X-11-ARIMA. Survey Methodology. 1996;22:77-84
41. Yang H, Huang K, King I, Lyu MR. Localized support vector regression for time series prediction. Neurocomputing. 2009;72(10-12):2659-2669
42. Chen T, Yin H, Chen H, Wu L, Wang H, Zhou X, et al. TADA: Trend alignment with dual-attention multi-task recurrent neural networks for sales prediction. In: 2018 IEEE International Conference on Data Mining (ICDM). Singapore: IEEE; 2018. pp. 49-58. DOI: 10.1109/ICDM.2018.00020
43. Haggan V, Ozaki T. Modelling nonlinear random vibrations using an amplitude-dependent autoregressive time series model. Biometrika. 1981;68(1):189-196
44. Chen S, Wang XX, Harris CJ. NARX-based nonlinear system identification using orthogonal least squares basis hunting. IEEE Transactions on Control Systems Technology. 2007;16(1):78-84
45. Frigola R, Rasmussen CE. Integrated pre-processing for Bayesian nonlinear system identification with Gaussian processes. In: 52nd IEEE Conference on Decision and Control. Firenze, Italy: IEEE; 2013. pp. 5371-5376. DOI: 10.1109/CDC.2013.6760734
46. Bertsimas D, Boussioux L. Ensemble modeling for time series forecasting: An adaptive robust optimization approach. arXiv; 2023. cs.LG/2304.04308
47. Khalil RA. Comparison of four neural network learning methods based on genetic algorithm for non-linear dynamic systems identification. AL Rafdain Engineering Journal. 2012;20(1):122-132
48. Taylor JG. Univariate and multivariate time series predictions. In: Shadbolt J, Taylor JG, editors. Neural Networks and the Financial Markets. Perspectives in Neural Computing. London: Springer; 2002. pp. 11-22. DOI: 10.1007/978-1-4471-0151-2_2
49. Coyle D, Prasad G, McGinnity M. Faster self-organizing fuzzy neural network training and improved autonomy with time-delayed synapses for locally recurrent learning. In: Turgay T, editor. System and Circuit Design for Biologically-Inspired Intelligent Learning. Hershey, PA: IGI Global; 2011. pp. 156-183. DOI: 10.4018/978-1-60960-018-1.ch008
50. Bengio Y. Learning deep architectures for AI. Foundations and Trends® in Machine Learning. 2009;2(1):1-127
51. Malekmohamadi Faradonbe S, Safi-Esfahani F, Karimian-kelishadrokhi M. A review on neural Turing machine (NTM). SN Computer Science. 2020;1(6):333. DOI: 10.1007/s42979-020-00341-6
52. Zhou H, Zhang S, Peng J, Zhang S, Li J, Xiong H, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting. Proceedings of the AAAI Conference on Artificial Intelligence. 2021;35(12):11106-11115. DOI: 10.1609/aaai.v35i12.17325
53. Zeng A, Chen M, Zhang L, Xu Q. Are transformers effective for time series forecasting? Proceedings of the AAAI Conference on Artificial Intelligence. 2023;37(9):11121-11128. DOI: 10.1609/aaai.v37i9.26317
54. Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation. 1997;9(8):1735-1780
55. Cho K, van Merriënboer B, Bahdanau D, Bengio Y. On the properties of neural machine translation: Encoder-decoder approaches. In: Wu D, Carpuat M, Carreras X, Vecchi EM, editors. Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation. Doha, Qatar: Association for Computational Linguistics; 2014. pp. 103-111
56. Graves A, Schmidhuber J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Networks. 2005;18(5-6):602-610
57. Sutskever I, Vinyals O, Le QV. Sequence to sequence learning with neural networks. In: Ghahramani Z, Welling M, Cortes C, Lawrence N, Weinberger KQ, editors. Advances in Neural Information Processing Systems. 2014;27. ISBN: 9781510800410
58. Gehring J, Auli M, Grangier D, Yarats D, Dauphin YN. Convolutional sequence to sequence learning. In: International Conference on Machine Learning. Vol. 70. PMLR; 2017. pp. 1243-1252
59. Firat O, Cho K, Sankaran B, Yarman Vural FT, Bengio Y. Multi-way, multilingual neural machine translation. Computer Speech & Language. 2017;45:236-252
60. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, et al. Attention is all you need. Advances in Neural Information Processing Systems. 2017;30
