## Brain-Inspired Hyperdimensional Computing

The mathematical properties of high-dimensional spaces show remarkable agreement with behaviors that the brain exhibits. Brain-inspired hyperdimensional (HD) computing explores the emulation of cognition by computing with hypervectors as an alternative to computing with numbers. HD computing captures the idea of pattern recognition by modeling each neural activity pattern with a hypervector, a vector with dimensionality in the thousands. Hypervectors are high-dimensional (e.g., D = 10,000), holographic, and (pseudo)random with independent and identically distributed (i.i.d.) components. HD computing has several notable properties:


- A general and scalable model of computing with a well-defined set of arithmetic operations
- Fast and one-shot learning
- A memory-centric architecture with significantly parallelizable operations
- Extremely robust against most failure mechanisms and noise
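These properties follow from the geometry of high-dimensional spaces: two random hypervectors with i.i.d. components are quasi-orthogonal, i.e., their normalized Hamming distance concentrates near 0.5. A minimal sketch (not from the cited papers; dimensionality and seed are illustrative):

```python
import numpy as np

# Random binary hypervectors with i.i.d. components are quasi-orthogonal:
# the normalized Hamming distance between any two concentrates near 0.5.
D = 10_000  # dimensionality commonly used in the HD literature
rng = np.random.default_rng(seed=1)

a = rng.integers(0, 2, size=D, dtype=np.uint8)
b = rng.integers(0, 2, size=D, dtype=np.uint8)

hamming = np.count_nonzero(a != b) / D  # normalized Hamming distance
print(f"distance(a, b) = {hamming:.3f}")  # close to 0.5 for a random pair
```

Because every pair of random hypervectors sits roughly D/2 bit flips apart, a large fraction of component failures can occur before one stored pattern is confused with another, which is the source of the robustness claimed above.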

From a hardware perspective, we exploit this architectural insight in three widely used methodological design approaches for developing scalable and efficient associative memories. At its core, HD computing is about manipulating and comparing large patterns stored in memory as hypervectors: input symbols are mapped to hypervectors, and an associative search is performed for reasoning and classification. For every classification event, an associative memory is in charge of finding the closest match between a set of *learned* hypervectors and a *query* hypervector using a distance metric. Hypervectors with i.i.d. components allow a memory-centric architecture to tolerate a massive number of errors, which eases the combination of various methodological design approaches for boosting energy efficiency and scalability. We design architectures for hyperdimensional associative memory (HAM) to facilitate energy-efficient, fast, and scalable search operations using three widely used design approaches. These HAM designs search for the nearest Hamming distance and scale linearly with the number of dimensions in the hypervectors, while exploring a large design space with orders of magnitude higher efficiency.

**Publications:**

- **[HPCA'17]** M. Imani, A. Rahimi, D. Kong, T. Rosing, J. M. Rabaey, "Exploring Hyperdimensional Associative Memory", in International Symposium on High-Performance Computer Architecture (HPCA), 2017 [PDF].
- **[D&T'17]** M. Imani, A. Rahimi, J. Hwang, T. Rosing, J. M. Rabaey, "Low-Power Sparse Hyperdimensional Encoder for Language Recognition", IEEE Design & Test, 2017.
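The nearest-Hamming-distance associative search described above can be sketched in a few lines. This is a hypothetical software model, not the HAM hardware designs of the papers; the class count, noise level, and seed are assumptions for illustration:

```python
import numpy as np

# Software model of an HD associative memory: store one "learned"
# hypervector per class, classify a query by nearest Hamming distance.
D = 10_000
rng = np.random.default_rng(seed=7)

# Four learned class hypervectors (random stand-ins for trained ones).
classes = rng.integers(0, 2, size=(4, D), dtype=np.uint8)

def nearest(query, memory):
    # Associative search: Hamming distance to every stored hypervector,
    # return the index of the closest match.
    dists = np.count_nonzero(memory != query, axis=1)
    return int(np.argmin(dists))

# Robustness check: flip 30% of the bits of class 2's hypervector.
# The query still retrieves class 2, since unrelated random
# hypervectors sit about D/2 bit flips away.
query = classes[2].copy()
flip = rng.choice(D, size=3000, replace=False)
query[flip] ^= 1
print(nearest(query, classes))  # -> 2
```

Each query compares against every stored row, so the work per search grows linearly with D, matching the linear scaling in dimensionality noted for the HAM designs.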