Extending Hyperdimensional Machine Learning Pipelines with Quantization, Memory Consolidation, Markov Chains and Graph Neural Networks

Robert McMenemy
11 min read · Oct 10, 2024

Introduction

In the rapidly evolving field of machine learning, building models that are both efficient and adaptable remains a significant challenge. Our previous work introduced a pipeline that integrates Hyperdimensional Computing (HDC) with Reservoir Computing (RC) methods, specifically Echo State Networks (ESNs) and Sparse Distributed Memory (SDM), to process high-dimensional, noisy, and sequential data effectively.

Building on this foundation, we have incorporated additional techniques into the pipeline to further enhance its capabilities. This article explores these advancements, which include:

  • Quantization Techniques: To reduce computational load and memory usage.
  • Memory Consolidation Mechanisms: Inspired by cognitive processes to enhance long-term learning and adaptability.
  • Markov Chains: For modeling and predicting sequential…
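To make the quantization idea concrete, here is a minimal sketch of affine 8-bit quantization applied to a hypervector. The function names and the choice of uint8 are illustrative assumptions for this article, not the pipeline's actual implementation:

```python
import numpy as np

def quantize_uint8(vec: np.ndarray) -> tuple[np.ndarray, float, float]:
    # Illustrative helper: affine-quantize a float vector to uint8,
    # returning the scale and offset needed to dequantize later.
    lo, hi = float(vec.min()), float(vec.max())
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    q = np.round((vec - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_uint8(q: np.ndarray, scale: float, lo: float) -> np.ndarray:
    # Reconstruct an approximate float vector from the quantized form.
    return q.astype(np.float32) * scale + lo

# A 10,000-dimensional float32 hypervector shrinks from 40 KB to 10 KB,
# at the cost of bounded reconstruction error (at most half a scale step).
hv = np.random.randn(10_000).astype(np.float32)
q, scale, lo = quantize_uint8(hv)
approx = dequantize_uint8(q, scale, lo)
print(hv.nbytes, q.nbytes)  # 40000 10000
```

Because HDC representations tolerate noise by design, this kind of lossy compression tends to degrade similarity queries far less than it would in a conventional dense model.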
