
In-memory analog computing for AI attention mechanisms


Thesis topic details

General information

Organisation

The French Alternative Energies and Atomic Energy Commission (CEA) is a key player in research, development and innovation in four main areas:
• defence and security,
• nuclear energy (fission and fusion),
• technological research for industry,
• fundamental research in the physical sciences and life sciences.

Drawing on its widely acknowledged expertise, and thanks to its 16,000 technicians, engineers, researchers and staff, the CEA actively participates in collaborative projects with a large number of academic and industrial partners.

The CEA is established in ten centers spread throughout France.
  

Reference

SL-DRT-24-0684  

Direction

DRT

Thesis topic details

Category

Technological challenges

Thesis topics

In-memory analog computing for AI attention mechanisms

Contract

Thesis

Job description

The aim of this thesis is to explore the execution of attention mechanisms for Artificial Intelligence directly within a cutting-edge Non-Volatile Memory (NVM) technology.

Attention mechanisms are a breakthrough in Artificial Intelligence (AI) algorithms and the performance booster behind "Transformer" neural networks.
Initially designed for natural language processing applications such as ChatGPT, these mechanisms are now widely employed in embedded application domains such as demand prediction in energy/heat networks, predictive maintenance, and monitoring of transport infrastructures or industrial sites.
Despite their widespread use, attention-based workloads demand extensive data access and computing power, resulting in high power consumption that can make them impractical for embedded hardware systems.
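
To give a concrete picture of the computation involved, below is a minimal NumPy sketch of scaled dot-product attention, the core operation of a Transformer layer; the sequence length and embedding size are illustrative only.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq_len, seq_len) score matrix
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted sum of the values

# For 256 time steps and 64-dimensional embeddings, the score matrix alone
# holds 256 x 256 entries: memory traffic and compute grow quadratically
# with sequence length, which is what strains embedded hardware.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((256, 64)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)           # shape (256, 64)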

Non-volatile memristor technology offers a promising solution: it enables analog computing functions with minimal power consumption while also serving as non-volatile storage for AI model parameters. Massive linear algebra workloads can be executed faster and at an ultra-low energy cost compared with a fully digital implementation.
However, the technology comes with limitations, e.g., device variability, the limited number of bits available to encode model parameters (i.e., quantization), and the maximum size of the vectors that can be processed in parallel.
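
As a rough behavioural illustration of these constraints, the following sketch models an analog matrix-vector multiply with uniformly quantized weights and multiplicative device variability; the bit-width, noise level and array size are assumptions for illustration, not measured properties of the SPINTEC devices.

import numpy as np

def quantize(W, n_bits=4):
    # Map weights onto a limited number of conductance levels (uniform quantization).
    levels = 2 ** n_bits - 1
    w_max = np.abs(W).max()
    return np.round(W / w_max * levels) / levels * w_max

def analog_matvec(W, x, n_bits=4, variability=0.05, rng=None):
    # y = W @ x as a crossbar would compute it: quantized weights perturbed
    # by device-to-device variation, then an ideal analog dot product.
    rng = rng or np.random.default_rng()
    W_noisy = quantize(W, n_bits) * (1.0 + variability * rng.standard_normal(W.shape))
    return W_noisy @ x

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))       # 64 x 64: an assumed maximum crossbar size
x = rng.standard_normal(64)
error = np.abs(W @ x - analog_matvec(W, x, rng=rng)).mean()   # hardware-induced error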

This thesis focuses on overcoming these challenges in the context of embedded time-series analysis and prediction.
The key task is to explore the mapping of attention mechanisms onto a spin-based memristor technology developed by the SPINTEC Laboratory.
This involves quantizing and partitioning AI models to align with the hardware architecture without compromising prediction performance, and exploring the implementation of specific AI blocks in the memristor analog fabric.
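
One way to picture the partitioning step: a projection matrix of an attention block that exceeds the crossbar dimensions is split into tiles, each tile is mapped to one analog array, and the partial products are accumulated digitally. The 64 x 64 tile size below is a placeholder, not the dimension of the SPINTEC arrays.

import numpy as np

TILE = 64  # assumed maximum crossbar dimension (placeholder value)

def tiled_matvec(W, x, matvec):
    # Compute W @ x by splitting W into TILE x TILE blocks, running each block
    # through `matvec` (e.g. one analog crossbar), and summing the partial
    # results digitally.
    rows, cols = W.shape
    y = np.zeros(rows)
    for r in range(0, rows, TILE):
        for c in range(0, cols, TILE):
            y[r:r + TILE] += matvec(W[r:r + TILE, c:c + TILE], x[c:c + TILE])
    return y

rng = np.random.default_rng(0)
W_q = rng.standard_normal((128, 256))   # toy query-projection weights
x = rng.standard_normal(256)            # one input embedding
y = tiled_matvec(W_q, x, matvec=lambda B, v: B @ v)   # ideal per-tile multiply
assert np.allclose(y, W_q @ x)          # tiling alone preserves the result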

This thesis is part of a collaboration between CEA List (Laboratoire d'Intelligence Intégrée Multi-Capteur), the Grenoble Institute of Engineering and Management, and the SPINTEC Laboratory.
Joining this research presents a unique opportunity to work within an interdisciplinary and dynamic team at the forefront of the AI ecosystem in France, with strong connections to influential industrial players in the field.

University / doctoral school

Electronique, Electrotechnique, Automatique, Traitement du Signal (EEATS)
Université Grenoble Alpes

Thesis topic location

Site

Grenoble

Position start date

01/10/2024

Person to be contacted by the applicant

MOLNOS Anca anca.molnos@cea.fr
CEA
DRT/DSCIN/DSCIN/LIIM
17 Rue des Martyrs, 38054, Grenoble, France
0438780460

Tutor / Responsible thesis director

ANGHEL Lorena lorena.anghel@grenoble-inp.fr
Grenoble INPG
Laboratoire SPINTEC

06 82 31 26 47

Learn more


https://list.cea.fr/fr/intelligence-artificielle/