Paper 2025/1917
Embedding belief propagation within a multi-task learning model : An example on Kyber's NTT
Abstract
Domain-informed deep learning integrates specialized knowledge into deep neural networks through modifications to inputs, architectures, loss functions, or training processes. Such an approach is typically employed to improve model performance in low-data or noisy-data regimes via domain-specific priors. Side-channel analysis is such a domain: even though plain deep learning has become a powerful tool for profiled side-channel attacks, it still struggles on more complex datasets because of weak signals distributed across many data points. Initial attempts at integrating specific knowledge into models have been extremely successful in pushing the boundaries of what can be achieved with deep learning. For example, integrating knowledge about the masking scheme, as in Masure et al., yields significant reductions in profiling complexity. Marquet and Oswald went further by leveraging redundant features across multiple tasks, such as shared randomness or a common masking flow. In this work, we continue this line of research by using belief propagation on a larger graph to guide the learning. We introduce a multi-task learning model that explicitly integrates a factor graph reflecting the algebraic dependencies among intermediates in the computation of Kyber's inverse Number Theoretic Transform (iNTT). Such a framework allows the model to learn a joint representation of the related tasks that is mutually beneficial. For the first time, we show that one can perform belief propagation on the masked shares during training, even without access to the internal randomness, potentially greatly improving the performance of the attack.
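To make the factor-graph idea concrete, here is a minimal, hypothetical sketch of a single sum-product (belief propagation) step through an arithmetic-masking factor, as used in schemes like Kyber's: a secret coefficient `x` is split into shares with `x = (s1 + s2) mod q`, and soft beliefs over the shares (e.g., from a profiling model) are combined into a belief over the unmasked value. The modulus, the share values, and the `peaked_prior` shape are illustrative choices, not taken from the paper (Kyber's actual modulus is 3329).

```python
import numpy as np

q = 7  # tiny modulus for illustration; Kyber's actual modulus is 3329


def peaked_prior(value, peak=0.4):
    """Hypothetical leakage-derived marginal: mass `peak` on one value, uniform elsewhere."""
    p = np.full(q, (1.0 - peak) / (q - 1))
    p[value] = peak
    return p


# Suppose profiling yields these soft beliefs over two arithmetic shares.
p_s1 = peaked_prior(3)
p_s2 = peaked_prior(5)

# Sum-product message through the factor enforcing x = (s1 + s2) mod q:
# the belief over the unmasked value is the cyclic convolution of the share beliefs.
p_x = np.zeros(q)
for a in range(q):
    for b in range(q):
        p_x[(a + b) % q] += p_s1[a] * p_s2[b]

print(int(p_x.argmax()))  # recovers (3 + 5) % 7 = 1
```

In the paper's setting this kind of message passing runs over a much larger graph covering the iNTT's intermediates, and the resulting beliefs guide a multi-task model during training rather than being computed once at attack time.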
Metadata
- Available format(s)
- PDF
- Category
- Attacks and cryptanalysis
- Publication info
- Preprint.
- Keywords
- Side-channel analysis, Kyber, NTT, Masking, Multi-task learning, Deep learning, Belief propagation
- Contact author(s)
- thomas marquet666 @ icloud com
- elisabeth oswald @ aau at
- History
- 2026-02-25: last of 3 revisions
- 2025-10-14: received
- Short URL
- https://ia.cr/2025/1917
- License
- CC BY
BibTeX
@misc{cryptoeprint:2025/1917,
author = {Thomas Marquet and Elisabeth Oswald},
title = {Embedding belief propagation within a multi-task learning model : An example on Kyber's {NTT}},
howpublished = {Cryptology {ePrint} Archive, Paper 2025/1917},
year = {2025},
url = {https://eprint.iacr.org/2025/1917}
}