Recognizing Temporal Patterns using an Echo State Network
University Projects · Machine Learning · Python · Data Science · Featured

Overview#

A custom implementation of an Echo State Network (ESN), a type of recurrent neural network designed for temporal pattern recognition and time series forecasting. Built from scratch using Python, NumPy, and Matplotlib.

Echo State Network#

An Echo State Network is a type of recurrent neural network used to recognize temporal patterns and perform tasks such as k-step ahead forecasting. Unlike networks trained with backpropagation, an ESN keeps its input and recurrent (reservoir) weights fixed after random initialization; only the output weights are trained, here via ridge regression, while hyperparameters such as reservoir size and regularization strength are tuned separately.
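The reservoir update can be sketched in a few lines. This is a minimal illustration, not the project's exact code: the dimensions, weight ranges, and the `update_state` helper are assumptions, while the sigmoid activation matches the Implementation section below.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical dimensions: 1 input, 10 reservoir (hidden) neurons
rng = np.random.default_rng(0)
n_in, n_res = 1, 10

# Input and recurrent weights are drawn once at random and never
# trained -- this is the defining property of an ESN
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))

def update_state(x, u):
    """One reservoir step: new state from previous state x and input u."""
    return sigmoid(W_res @ x + W_in @ u)

x = np.zeros(n_res)          # start from a zero state
x = update_state(x, np.array([0.3]))
```

Only the readout that maps reservoir states to outputs is fitted (with ridge regression, as described below), which is what makes ESN training cheap compared with backpropagation through time.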

K-Step Ahead Forecasting#

K-step ahead forecasting predicts the value of the time series k timesteps ahead of the current time t. In this implementation, Mean Squared Error (MSE) measures prediction accuracy. K-step ahead forecasting is used in both the validation and testing stages of model evaluation.
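The idea above can be sketched with a toy one-step model fed back into itself. The `predict` callable and the example series are hypothetical stand-ins for the trained ESN readout, not the project's code; the MSE formula is the standard one.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error between targets and predictions."""
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

def k_step_forecast(predict, last_value, k):
    """Forecast k steps ahead by feeding each prediction back as the
    next input -- the source of compounding error in multi-step mode."""
    value = last_value
    for _ in range(k):
        value = predict(value)
    return value

# Toy one-step model for the series y_t = 2t: next value = current + 2
preds = [k_step_forecast(lambda v: v + 2.0, 0.0, k) for k in (1, 2, 3)]
print(mse([2.0, 4.0, 6.0], preds))  # 0.0 for this exact toy model
```

With a real model the one-step predictions carry small errors, and each feedback step compounds them, which is why the 2-step MSE figures below are larger than the 1-step ones.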

View the full project on GitHub.

Key Achievements#

  • Implemented a recurrent neural network for time series prediction, achieving 0.142 MSE on the 2Sine 1-step model and 7.626 MSE on the Lorenz 1-step model, reflecting strong generalization on the smoother 2Sine data and the higher variance of the Lorenz data
  • Conducted systematic hyperparameter tuning using cross-validation and ridge regression to minimize prediction error
  • Analyzed how error compounds over time in multi-step predictions
  • Demonstrated impact of model complexity on overfitting using Lorenz data

Implementation#

  • Used batch training with input weights, recurrent weights, and sigmoid activation
  • Applied ridge regression to determine optimized output weights
  • Performed cross-validation to optimize hyperparameters
  • Evaluated results using Mean-Square Error (MSE)
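The ridge-regression readout in the steps above has a closed-form solution. The sketch below is a hedged illustration of that fit, not the project's code: the matrix shapes, the `ridge_output_weights` helper, and the random data are assumptions; the regularization value 5.0 echoes the tuned 2Sine value reported below.

```python
import numpy as np

def ridge_output_weights(X, Y, lam):
    """Closed-form ridge regression for ESN output weights:
    W_out = Y X^T (X X^T + lam * I)^{-1},
    where X stacks reservoir states column-wise over time and Y holds
    the matching targets. lam is the regularization strength, a
    hyperparameter tuned here by cross-validation."""
    n_res = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n_res))

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 200))   # 10 reservoir neurons, 200 timesteps
Y = rng.standard_normal((1, 200))    # one target value per timestep
W_out = ridge_output_weights(X, Y, lam=5.0)
print(W_out.shape)  # (1, 10)
```

Larger lam shrinks the weights and guards against overfitting; the Lorenz results below show what happens when the tuned lam is small and outliers pull the weights around.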

Conclusions#

  • The Lorenz model demonstrates that increased model complexity leads to overfitting and a failure to generalize: its data points had higher variance and its underlying function was more complex than the 2Sine dataset's. The complexity of the Lorenz function also favors a smaller regularization parameter, allowing outliers to significantly affect the optimized weights.
  • K-step ahead prediction grows less accurate as the number of steps increases, because predictions are fed back into the model and their errors compound over time.

2Sine Timeseries#

| Metric | Value |
| --- | --- |
| Optimized Hidden Neurons | 10 |
| Optimized Regularization | 5.0 |
| Data Split (train/val/test) | 40/30/30 |
| 1-Step MSE | 0.142 |
| 2-Step MSE | 1.24 |

*Figures: 1-step ahead and 2-step ahead prediction plots for the 2Sine time series.*

Lorenz Timeseries#

| Metric | Value |
| --- | --- |
| Optimized Hidden Neurons | 20 |
| Optimized Regularization | 0.1 |
| Data Split (train/val/test) | 80/10/10 |
| 1-Step MSE | 7.626 |
| 2-Step MSE | 111.611 |

*Figures: 1-step ahead and 2-step ahead prediction plots for the Lorenz time series.*

Technologies#

Python, NumPy, Matplotlib, Ridge Regression, Neural Networks
