In this thesis, we extend principal component analysis (PCA) to account for both stationary and non-stationary time series data. The dimension reduction methods we propose employ a moving cross-covariance matrix of the data, which can be updated as we move through time. We show that the moving cross-covariance matrix can extract the dynamic dependence among the variables of both stationary and non-stationary series.

The first two methods we propose can be regarded as generalisations of the dynamic principal component analysis (DPCA) of Ku et al. (1995) to the non-stationary case. The first method applies eigenanalysis to the moving cross-covariance matrix of an extended data vector, formed by appending lagged series to the original data vector. The second method differs in that it applies eigenanalysis to a quadratic form of the moving cross-covariance matrix of the extended data vector. To optimise the results of our methods, we propose a new criterion for determining the number of principal components to retain.

Additionally, we introduce the moving cross-correlation function, which can be used to evaluate the correlations between non-stationary variables.

The third dimension reduction method we introduce generalises the principal component analysis for time series (TS-PCA) of Chang et al. (2018) to non-stationary series. This method seeks a linear transformation under which the transformed series segments into uncorrelated subseries of lower dimension, which can then be analysed separately. This method also accommodates high-dimensional time series.

Theoretical properties of the proposed methods establish the consistency of the estimators used. All methods demonstrate their ability to reduce the dimension of both stationary and non-stationary series on simulated and real data sets.
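The moving cross-covariance idea described above can be sketched in a few lines of NumPy; this is a minimal illustration only, in which the window length, the centring of the window at the current time point, the symmetrisation step, and the function name `moving_cross_cov` are assumptions for the sake of example, not the thesis's exact formulation:

```python
import numpy as np

def moving_cross_cov(X, t, window, lag=0):
    """Sample lag-`lag` cross-covariance matrix of the p-variate series X
    (shape T x p), computed over a window of length `window` ending at time t.
    Local windowing lets the estimate track non-stationary dependence."""
    seg = X[t - window + 1:t + 1]                   # local segment of the series
    a = seg[lag:] - seg[lag:].mean(axis=0)          # de-meaned leading part
    b = seg[:len(seg) - lag] - seg[:len(seg) - lag].mean(axis=0)  # lagged part
    return a.T @ b / (len(seg) - lag)               # p x p cross-covariance

# Toy example: eigenanalysis of a moving cross-covariance at one time point
rng = np.random.default_rng(0)
X = np.cumsum(rng.standard_normal((500, 3)), axis=0)  # non-stationary random walk
S = moving_cross_cov(X, t=400, window=100, lag=1)
# Symmetrise first, since the lagged cross-covariance need not be symmetric
eigvals, eigvecs = np.linalg.eigh((S + S.T) / 2)
```

In this sketch the eigenvectors associated with the largest eigenvalues would play the role of principal directions at time t; re-running the estimate at successive t updates the matrix as the window moves, which is the "updated as we move through time" property of the abstract.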
|Date of Award||7 Sep 2020|
- University of Strathclyde
|Supervisor||Jiazhu Pan (Supervisor) & Xuerong Mao (Supervisor)|