778 results sorted by ID
Fully Homomorphic Encryption (FHE) is a modern cryptographic technique that allows performing computations directly over encrypted data. This makes FHE an indispensable method for privacy-preserving applications, where users' data are encrypted and processed by a potentially untrusted third party. Nevertheless, FHE computations are computationally expensive, often rendering them less practical for realistic scenarios. Notably, a major performance bottleneck for FHE is an operation called...
Theoretical treatments of leakage-resilient cryptography typically work under the assumption that the leakage learned by the adversary (e.g., about an n-bit secret key) is arbitrary but bounded, in the sense that the leakage is an l-bit string for some threshold l significantly smaller than n. On the other hand, real-world side-channel attacks on physical implementations of cryptographic protocols produce leakage transcripts that are much longer than n. However, unlike the bounded leakage...
Federated Learning (FL) is a collaborative Machine Learning (ML) process where clients locally train an ML model on their private inputs, and send it to a server that aggregates the local model updates to obtain a global model update. FL is widely used in applications where the training data is distributed among several clients, e.g., for next word prediction in Google keyboard (Gboard). Nevertheless, FL faces several challenges concerning privacy and security. 1) Client privacy needs to be...
We study the design of a privatization mechanism and privacy accounting in the Pufferfish Privacy (PP) family. Specifically, motivated by the curse of dimensionality and lack of practical composition tools for iterative learning in the recent Rényi Pufferfish Privacy (RPP) framework, we propose Sliced Rényi Pufferfish Privacy (SRPP). SRPP preserves PP/RPP semantics (customizable secrets with probability-aware secret–dataset relationships) while replacing high-dimensional Rényi divergence...
We build a leveled fully homomorphic encryption (FHE) scheme that achieves IND-CCA1 security under the learning with errors (LWE) assumption in the standard model. It is the first scheme of this kind that does not rely on succinct non-interactive arguments of knowledge (SNARK) to obtain security against active adversaries. Instead, we use the gadget lattice trapdoors introduced by Micciancio and Peikert [Eurocrypt 2012] in combination with a dual version of the GSW FHE scheme [Gentry,...
Deep Learning-based Non-profiled Side-Channel Analysis (DL-NSCA) enables automated feature extraction and obviates the need for a profiling device. However, existing methods mostly rely on leakage from non-linear operations and require prior knowledge of the target algorithm and its plaintexts/ciphertexts, which restricts their applicability to black-box scenarios and proprietary implementations. Motivated by the ubiquity of plaintext-key XOR operations in cryptographic algorithms, this...
Neural network model extraction has recently emerged as an important security concern, as adversaries attempt to recover a network’s parameters via black-box queries. Carlini et al. proposed in CRYPTO’20 a model extraction approach inspired by differential cryptanalysis, consisting of two steps: signature extraction, which extracts the absolute values of network weights layer by layer, and sign extraction, which determines the signs of these signatures. However, in practice this...
Masking, i.e., computing over secret-shared data, is one of the main counter-measures against side-channel analysis, provably secure in the standard noisy-leakage model. However, all the state-of-the-art security proofs rely on a reduction to a more abstract model called random probing (Eurocrypt’14). As a result, the noise requirements of such proofs must scale with the field size of the circuit which, beyond not reflecting real-world physics of target devices, is often prohibitive,...
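The masking countermeasure discussed in this abstract can be sketched in plaintext Python. This is a minimal illustration of computing over XOR-shared data, restricted to linear operations (nonlinear gates need dedicated secure gadgets); it does not model the noisy-leakage or random-probing proofs the abstract concerns.

```python
# Boolean masking sketch: a secret byte is split into XOR-shares so that
# any proper subset of shares is uniformly random, while the XOR of all
# shares recovers the secret. Linear operations can be computed share-wise.
import random

def mask(x: int, d: int, rng: random.Random):
    """Split byte x into d+1 shares with x = XOR of all shares."""
    shares = [rng.randrange(256) for _ in range(d)]
    last = x
    for s in shares:
        last ^= s
    return shares + [last]

def unmask(shares):
    out = 0
    for s in shares:
        out ^= s
    return out

rng = random.Random(5)
a, b = 0xA7, 0x3C
sa, sb = mask(a, 2, rng), mask(b, 2, rng)
sc = [x ^ y for x, y in zip(sa, sb)]   # XOR computed entirely on shares
assert unmask(sc) == a ^ b
```

Each individual share never reveals the secret; only the combination does, which is what forces a side-channel adversary to exploit joint leakage of all shares.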
Non-profiled side-channel attacks aim to exploit leakage traces from a targeted embedded system to extract secret information, without a priori knowledge of the true leakage model of the device. To automate and simplify attacks, deep learning techniques have thus been introduced in the side-channel community. Most published works have explored the use of discriminative models in the profiled or non-profiled context. However, the lack of interpretability and explainability of...
We give the first $\mathsf{NC}^1$-computable Pseudorandom Function (PRF) constructions from well-founded (non-ad-hoc) code-based assumptions. Specifically, we give two constructions based on two different classical variants of the learning parity with noise (LPN) assumption: (1) An $\mathsf{NC}^1$-computable PRF from hardness of Sparse-LPN [Alekhnovich, FOCS '03] (with respect to a sublinear-depth explicit expander graph family). This PRF is also key-homomorphic. (2) An...
Differential privacy (DP) is one of the most effective tools for protecting the privacy of individual data holders under computation. This property guarantees that the computation outputs for every pair of adjacent input sets are statistically indistinguishable with respect to a given parameter ε, regardless of how likely specific inputs are to occur. While the distribution of input sets is generally unknown, in some use cases (approximate) information about it might be...
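The ε-indistinguishability guarantee described above can be checked concretely for the textbook Laplace mechanism (a standard example, not part of this abstract's contribution; all parameters are illustrative): with noise scale Δ/ε, the output densities for adjacent inputs differ by a factor of at most e^ε.

```python
# Check the epsilon-DP bound for the Laplace mechanism: for adjacent inputs
# D, D' with |f(D) - f(D')| <= sensitivity and noise scale sensitivity/eps,
# the ratio of output densities never exceeds e^eps.
import math

def laplace_density(x: float, mean: float, scale: float) -> float:
    return math.exp(-abs(x - mean) / scale) / (2 * scale)

def worst_case_ratio(fD: float, fDp: float, epsilon: float, sensitivity: float) -> float:
    """Max over a grid of outputs x of density(x | D) / density(x | D')."""
    scale = sensitivity / epsilon
    xs = [fD + d / 10 for d in range(-100, 101)]
    return max(laplace_density(x, fD, scale) / laplace_density(x, fDp, scale) for x in xs)

eps, sens = 0.5, 1.0
ratio = worst_case_ratio(10.0, 11.0, eps, sens)   # adjacent datasets, |Δf| = 1
assert ratio <= math.exp(eps) + 1e-9   # indistinguishable up to e^eps
```

The bound holds for every output, which is exactly the "every pair of adjacent input sets" guarantee the abstract refers to.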
Threshold signatures allow multiple parties to sign a common message by collaborating. More specifically, in a $(t,n)$-threshold signature scheme, at least $t$ out of $n$ parties must collaborate to sign a message. Although pre-quantum threshold signature algorithms have been extensively studied, the state of the art in the creation of post-quantum threshold algorithms remains sparse. Most studies focus on signature algorithms based on structured lattice problems. In particular, few...
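The $(t,n)$ reconstruction underlying such schemes can be sketched with textbook Shamir secret sharing over a prime field (illustrative field and parameters; actual threshold signature schemes add much more machinery on top of this):

```python
# Toy Shamir (t, n) secret sharing: any t of the n shares reconstruct the
# secret via Lagrange interpolation at x = 0 over GF(P); fewer reveal nothing.
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus (illustrative)

def share(secret: int, t: int, n: int, rng: random.Random):
    """Evaluate a random degree-(t-1) polynomial with f(0) = secret at x = 1..n."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; modular inverse via Fermat's little theorem."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

rng = random.Random(0)
shares = share(1234567, t=3, n=5, rng=rng)
assert reconstruct(shares[:3]) == 1234567   # any 3 of the 5 shares suffice
assert reconstruct(shares[2:]) == 1234567
```

In a threshold signature, the same interpolation coefficients are applied to partial signatures rather than to raw shares, so the secret key is never reassembled.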
The Module Learning With Errors (MLWE) problem is the fundamental hardness assumption underlying the key encapsulation and signature schemes ML-KEM and ML-DSA, which have been selected by NIST for post-quantum cryptography standardization. Understanding its quantum hardness is crucial for assessing the security of these standardized schemes. Inspired by the equivalence between LWE and Extrapolated Dihedral Cosets Problem (EDCP) in [Brakerski, Kirshanova, Stehlé and Wen, PKC 2018], we show...
Differential privacy (DP) has emerged as a rigorous framework for privacy-preserving data analysis, with widespread deployment in industry and government. Yet existing implementations typically assume that the party applying the mechanism can be trusted to sample noise correctly. This trust assumption is overly optimistic: a malicious party may deviate from the protocol to gain accuracy or avoid scrutiny, thereby undermining users’ privacy guarantees. In this paper, we introduce Noisette,...
Side-channel attacks exploiting Plaintext-Checking (PC) and Decryption Failure (DF) oracles are a pressing threat to deployed post-quantum cryptography. These oracles can be instantiated from tangible leakage sources like timing, power, and microarchitectural behaviors, making them a practical concern for leading schemes based on lattices, codes, and isogenies. In this paper, we revisit chosen-ciphertext side-channel attacks that leverage the DF oracle on ML-KEM. While DF oracles are often...
Side-channel analysis (SCA) can recover secret keys by exploiting physical leakages emitted during cryptographic computations. Most SCA techniques, however, require knowledge of the plaintext or ciphertext corresponding to each measured trace, which may be unavailable in realistic adversarial settings. Blind side-channel analysis (Blind SCA), first introduced in 2014, relaxes this requirement, but existing work has mainly targeted S-box nonlinearities. We present a systematic study of blind...
We study t-out-of-n threshold fully homomorphic encryption (ThFHE) in the synchronous setting, i.e., when the set of t decryptors is known at the outset of the decryption protocol. It has been observed in various works that the synchronous setting assumption makes it possible to efficiently circumvent one of the major difficulties of ThFHE, namely hiding noise terms whose gigantic magnitude is incurred by the reconstruction coefficients of Shamir secret sharing. Yet, prior to this work, ThFHE in the...
In this paper, we define the notion of pseudorandom correlation generators (PCGs) and functions (PCFs) for garbled circuit correlations. With a Garbling PCG or PCF, two parties can non-interactively generate a virtually unbounded number of secret-shared garbled circuits and corresponding secret-shared garbled inputs. With the shares of the garbled circuit and garbled input, anyone can recover the garbled circuit and evaluate it to obtain the result of the computation in the clear. In...
The transition of cryptographic primitives to the post-quantum era necessitates the rigorous translation of asymptotic security proofs into concrete parameter instantiations. This paper evaluates the practical realizability of the Decentralized Multi-Authority Attribute-Based Encryption (MA-ABE) scheme by Datta, Komargodski, and Waters (Eurocrypt 2021), a seminal construction relying exclusively on the Learning With Errors (LWE) assumption. While DKW21 eliminates the reliance on bilinear...
We present the first construction of attribute-based laconic function evaluation (AB-LFE) from the decisional composite residuosity (DCR) assumption. This yields the first example of computationally succinct secure computation from a group-based assumption, avoiding reliance on noisy lattice assumptions such as LWE. Our construction builds on recent work in fully homomorphic MACs and homomorphic secret sharing by Ishai, Li and Lin (FOCS 2025) and Meyer, Orlandi, Roy, and Scholl (Crypto...
Secure multi-party computation (MPC) enables $N$ parties to jointly evaluate any function over their private inputs while preserving confidentiality. While decades of research have produced concretely efficient protocols for small to moderate numbers of participants, scaling MPC to thousands of parties remains a central challenge. Most of the existing approaches either incur per-party costs linear in $N$, due to pairwise computations, or rely on heavy cryptographic tools such as homomorphic...
The standardization of CRYSTALS-Kyber (ML-KEM) by NIST represents a milestone in post-quantum security, yet its substantial communication overhead remains a critical bottleneck for resource-constrained environments. This paper introduces LAKE (Lattice-Code Accelerated Kyber Encapsulation), a novel cryptographic framework that symbiotically integrates coding theory into the Module-LWE structure. Unlike previous concatenation approaches, LAKE embeds density-optimized Construction-A...
Fully Homomorphic Encryption (FHE) aims at ensuring privacy of sensitive data while taking advantage of external computations and services. However, using FHE in real-world scenarios reveals new kinds of security issues. In particular, following the seminal Eurocrypt'21 paper of Li & Micciancio, CPAD security has emerged as a fundamental notion for FHE, unveiling a subtle interplay between security and correctness. For correct (F)HE schemes, CPA security already implies CPAD. However, all known...
Currently, most FHE schemes realize bootstrapping through the linear-decrypt-then-round paradigm. For the programmable bootstrapping (PBS) of TFHE, this means the lookup table (LUT) needs a redundancy of $O(\sqrt{N})$ to be able to remove the modulus switching noise, which limits the plaintext modulus of PBS to $O(\sqrt{N})$. We remove this requirement for redundancy by proposing the Meta-PBS framework, which allows us to start with under-redundant or non-redundant LUTs. Meta-PBS iteratively...
Fresh re-keying is a countermeasure against side-channel analysis where an ephemeral key is derived from a long-term key using a public random value. Popular instances of such schemes rely on key-homomorphic primitives, so that the re-keying process is easy to mask and the rest of the (e.g., block cipher) computations can run with cheaper countermeasures. The main requirement for these schemes to be secure is that the leakages of the ephemeral keys do not allow recovering the long-term key....
Learning problems have become a foundational element for constructing quantum-resistant cryptographic schemes, finding broad application even beyond, such as in Fully Homomorphic Encryption. The increasing complexity of this field, marked by the rise of physical learning problems due to research into side-channel leakage and secure hardware implementations, underscores the urgent need for a more comprehensive analytical framework capable of encompassing these diverse variants. In...
We study the Batch Learning Parity with Noise (LPN) variant, where the oracle returns $k$ samples in a batch, and draws the noise vector from a joint noise distribution $\mathcal{D}$ on $\mathbb{F}_2^k$ (instead of i.i.d.). This model captures a broad range of correlated or structured noise patterns studied in cryptography and learning theory, and was formally defined in recent work by Golowich, Moitra, and Rohatgi (FOCS 2024). Consequently, understanding which distributions preserve the...
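The contrast between i.i.d. and joint batch noise can be illustrated with a toy LPN sample generator. The weight-1-per-batch distribution below is one simple example of structured noise, chosen purely for illustration; it is not claimed to be the distribution studied in the cited work.

```python
# Toy Batch LPN oracle over F_2: returns k samples (a_i, <a_i, s> + e_i),
# where the noise vector e is drawn either i.i.d. Bernoulli or from a joint
# "batch" distribution (here: exactly one error per batch of k).
import random

def lpn_batch(secret, k, rng, structured=False):
    n = len(secret)
    if structured:
        e = [0] * k
        e[rng.randrange(k)] = 1          # joint distribution: weight-1 noise
    else:
        e = [1 if rng.random() < 0.1 else 0 for _ in range(k)]  # i.i.d. Ber(0.1)
    samples = []
    for i in range(k):
        a = [rng.randrange(2) for _ in range(n)]
        b = (sum(x * y for x, y in zip(a, secret)) + e[i]) % 2
        samples.append((a, b))
    return samples

rng = random.Random(1)
s = [1, 0, 1, 1]
batch = lpn_batch(s, k=8, rng=rng, structured=True)
# with this structured noise, exactly one of the k equations is corrupted
errors = sum((sum(x * y for x, y in zip(a, s)) + b) % 2 for a, b in batch)
assert errors == 1
```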
Fully Homomorphic Encryption over the Torus (TFHE) is a fully homomorphic encryption scheme that efficiently supports Boolean logic gates by performing gate operations and refreshing ciphertext noise with single gate bootstrapping. However, its operation is limited to simple two-input gates such as AND, OR, XOR and NAND, requiring deep circuits and multiple bootstrapping steps to support more complex arithmetic. In this paper, we propose Primitive Gate Bootstrapping, a new algebraic...
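The gate-composition cost referred to above can be seen already in plaintext: building XOR from two-input NAND gates takes four gates, hence four gate bootstrappings in gate-by-gate TFHE. The sketch below only counts gates on cleartext bits; the encryption scheme itself is not modeled.

```python
# Count the two-input NAND gates (i.e., gate bootstrappings in gate-by-gate
# TFHE) needed to realize XOR, illustrating why richer gates reduce depth.
def nand(x, y, counter):
    counter[0] += 1          # one gate evaluation = one gate bootstrapping
    return 1 - (x & y)

def xor_from_nand(x, y, counter):
    t = nand(x, y, counter)
    return nand(nand(x, t, counter), nand(y, t, counter), counter)

counter = [0]
table = [(a, b, xor_from_nand(a, b, counter)) for a in (0, 1) for b in (0, 1)]
assert [r for _, _, r in table] == [0, 1, 1, 0]   # correct XOR truth table
assert counter[0] == 16                           # 4 NANDs per XOR x 4 input pairs
```

A native multi-input or algebraic gate, as proposed in the abstract, would collapse such cascades into fewer bootstrapping steps.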
Succinct zero-knowledge arguments (zk-SNARKs) enable a prover to convince a verifier of the truth of a statement via a succinct and efficiently verifiable proof without revealing any additional information about the secret witness. A barrier to practical deployment of zk-SNARKs is their high proving cost. With this motivation, we study server-aided zk-SNARKs, where a client/prover outsources most of its work to a single, untrusted server while the server learns nothing about the witness or...
The Brakerski-Gentry-Vaikuntanathan (BGV) scheme is one of the most significant fully homomorphic encryption (FHE) schemes. It belongs to a class of FHE schemes whose security is based on the presumed intractability of the Learning with Errors (LWE) problem and its ring variant (RLWE). Such schemes deal with a quantity, called noise, which increases each time a homomorphic operation is performed. Specifically, in order for the scheme to work properly, it is essential that the noise...
TFHE bootstrapping is typically limited to a small plaintext space, with an exponential increase in cost for larger plaintext spaces. To bootstrap larger integers, one can use digit decomposition, a procedure that iteratively extracts and bootstraps a part of the larger plaintext space. Conventional state-of-the-art methods typically extract bits starting from the least significant bits (LSBs) and progress to the most significant bits (MSBs). However, we introduce a DirtyMSB extraction...
Recently, Hövelmanns, Hülsing, and Majenz introduced a security notion called Find Failing Plaintext – Non Generic (FFP-NG), which captures the ability of an adversary to find decryption failures by making non-trivial use of the public key. A first analysis of this property for lattice-based schemes was presented by Majenz and Sisinni, who showed that the Learning With Errors (LWE) problem reduces to breaking the FFP-NG security of the PVW scheme with discrete Gaussian noise. In this work,...
HQC (Hamming Quasi-Cyclic) was selected as the fifth algorithm in the NIST suite of post-quantum cryptographic (PQC) standards. As the only code-based algorithm currently standardized by NIST, HQC offers a good balance between security assurance, performance, and implementation simplicity. Most existing power analyses against HQC are of the SPA style: they can recover secrets with a small number of traces, but can only tolerate limited noise. In this paper, we develop a chosen-ciphertext...
Function Secret Sharing (FSS) schemes make it possible to share secret functions among multiple parties, with notable applications in anonymous communication and privacy-preserving machine learning. While two-party schemes offer logarithmic key sizes, multi-party schemes remain less practical due to significantly larger keys. Although several approaches have been proposed to improve multi-party schemes, a significant efficiency gap remains between the two-party and multi-party settings. Our work...
Multi-key fully homomorphic encryption (MKFHE) extends the capability of fully homomorphic encryption by enabling homomorphic computations on ciphertexts encrypted under different keys. Multi-key blind rotation is the core and most computationally intensive component of MKFHE. The NTRU-based multi-key blind rotation proposed by Xiang et al. (ASIACRYPT 2024) has the potential to achieve smaller key sizes, faster blind rotation, and lower noise compared to its RLWE-based counterpart. However,...
Prange's information set algorithm is a decoding algorithm for arbitrary linear codes. It decodes corrupted codewords of any $\mathbb{F}_2$-linear code $C$ of message length $n$ up to relative error rate $O(\log n / n)$ in $\mathsf{poly}(n)$ time. We show that the error rate can be improved to $O((\log n)^2 / n)$, provided: (1) the decoder has access to a polynomial-length advice string that depends on $C$ only, and (2) $C$ is $n^{-\Omega(1)}$-balanced. As a consequence we improve the...
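Prange's algorithm admits a compact toy implementation over $\mathbb{F}_2$: repeatedly guess an information set, hope it avoids all error positions, solve the resulting linear system, and accept if the residual weight is small. Parameters below are tiny and illustrative, far from the asymptotic regime discussed in the abstract.

```python
# Toy Prange information-set decoding over F_2 (illustrative parameters).
import random

def solve_gf2(A, b):
    """Solve A x = b over F_2 by Gauss-Jordan; return x, or None if A is singular."""
    k = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(k):
        piv = next((r for r in range(col, k) if M[r][col]), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col and M[r][col]:
                M[r] = [x ^ y for x, y in zip(M[r], M[col])]
    return [M[r][k] for r in range(k)]

def encode(m, G):
    k, n = len(G), len(G[0])
    return [sum(m[r] & G[r][c] for r in range(k)) % 2 for c in range(n)]

def prange(G, y, t, rng, tries=1000):
    """Guess an error-free information set, solve for m, accept if wt(y - mG) <= t."""
    k, n = len(G), len(G[0])
    for _ in range(tries):
        info = rng.sample(range(n), k)
        m = solve_gf2([[G[r][c] for r in range(k)] for c in info], [y[c] for c in info])
        if m is not None and sum(a ^ b for a, b in zip(encode(m, G), y)) <= t:
            return m
    return None

rng = random.Random(11)
k, n, t = 6, 20, 2
G = [[rng.randrange(2) for _ in range(n)] for _ in range(k)]
m = [rng.randrange(2) for _ in range(k)]
e = [0] * n
for i in rng.sample(range(n), t):
    e[i] = 1
y = [c ^ ei for c, ei in zip(encode(m, G), e)]   # corrupted codeword
decoded = prange(G, y, t, rng)
assert decoded is not None
assert sum(a ^ b for a, b in zip(encode(decoded, G), y)) <= t
```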
We present GRAFHEN, a new cryptographic scheme which offers Fully Homomorphic Encryption without the need for bootstrapping (or in other words, without noise). Building on the work of Nuida and others, we achieve this using encodings in groups. The groups are represented on a machine using rewriting systems. In this way the subgroup membership problem, which an attacker would have to solve in order to break the scheme, becomes maximally hard, while performance is preserved. In fact we...
Feature snooping has been shown to be very effective for stealing cost-sensitive models executing on neural processing units. Existing model obfuscation defenses protect the weights directly, but do not protect the features that hold information on the weights in indirect form. This paper proposes CoupledNets, the first model obfuscation defense that protects the intermediate features during inference. The obfuscation is performed during the training phase, by injecting noise, customized on...
FHEW-like homomorphic encryption (HE) schemes, introduced by Ducas and Micciancio (Eurocrypt 2015), represent the most efficient family of HE schemes in terms of both latency and key size. However, their bootstrapping noise is highly sensitive to parameter selection, leaving only a sparse set of feasible parameters. Because bootstrapping noise directly affects security and performance, existing approaches tend to choose parameters that drive noise excessively low—resulting in large key...
Lattice-based key-homomorphic encodings introduced by Boneh et al. (Eurocrypt'14), known as BGG+ encodings, underpin many primitives, including key-policy attribute-based encryption (KP-ABE). Many applications beyond KP-ABE require simulating the homomorphic evaluation of FHE ciphertexts over BGG+ encodings, which involves nonlinear operations on integers of magnitude up to the ciphertext modulus $q$. However, due to noise growth incurred by multiplication, the encodable integers must be...
In the post-quantum migration of traditional key establishment protocols, hybrid key encapsulation mechanisms (KEMs) are recommended by standards bodies, including NIST and ETSI, and by national security agencies such as NCSC-UK and BSI-Germany. Recently, several hybrid KEMs with CCA security, such as XOR-then-MAC, Dual-PRF and X-Wing (being standardized by the IETF), have been proposed based on CCA KEMs obtained by applying the complicated Fujisaki-Okamoto transform to public-key encryption (PKE)...
Threshold fully homomorphic encryption (ThFHE) enables multiple parties to perform arbitrary computation over encrypted data, while the secret key is distributed across the parties. The main task of designing ThFHE is to construct threshold key-generation and decryption protocols for FHE schemes. Among existing FHE schemes, FHEW-like cryptosystems enjoy the advantage of fast bootstrapping and small parameters. However, known ThFHE solutions use the "noise-flooding" technique to realize...
Multi-key fully homomorphic encryption (MK-FHE) enables secure computation over ciphertexts under different keys, but its practicality is hindered by inefficient bootstrapping. In this work, we propose HELIOS, a new MK-FHE scheme with highly efficient bootstrapping. Our bootstrapping framework improves upon the best-known complexity, reducing it from $O(dkn)$ to $O(kn)$, and further to $O(\sqrt{kn})$ under parallelization, where $d$ is the gadget length (typically scaling with the...
The CKKS fully homomorphic encryption (FHE) scheme enables computations on vectors of approximate complex numbers. A moderate precision of $\approx 20$ bits often suffices but, in many applications, a higher precision is required for functionality and/or security. Indeed, to obtain IND-CPA-D security [Li-Micciancio; Eurocrypt'21], secure threshold-FHE [Asharov et al; Eurocrypt'12] and circuit privacy [Gentry; STOC'09], all known approaches require a precision that supports noise flooding....
Threshold Fully Homomorphic Encryption (FHE) enables arbitrary computation on encrypted data, while distributing the decryption capability across multiple parties. A primary application of interest is low-communication multi-party computation (MPC), which benefits from a fast and secure threshold FHE decryption protocol. Several works have addressed this problem, but all existing solutions rely on "noise flooding" for security. This incurs significant overhead and necessitates large...
Cloud computing has matured into the default substrate for data processing, yet confidentiality demands often force a hard trade-off between the utility of outsourced computation and the privacy of sensitive inputs. Homomorphic encryption (HE) [1] promises to dissolve that trade-off by enabling computation directly on ciphertexts, returning encrypted results that only the data owner can decrypt. Despite remarkable progress in fully homomorphic encryption (FHE) and leveled variants suitable...
Random classical linear codes are widely believed to be hard to decode. While slightly sub-exponential time algorithms exist when the coding rate vanishes sufficiently rapidly, all known algorithms at constant rate require exponential time. By contrast, the complexity of decoding a random quantum stabilizer code has remained an open question for quite some time. This work closes the gap in our understanding of the algorithmic hardness of decoding random quantum versus random classical...
WireGuard is a VPN based on the Noise protocol, known for its high performance, small code base, and unique security features. Recently, Hülsing et al. (IEEE S&P'21) presented post-quantum (PQ) WireGuard, replacing the Diffie-Hellman (DH) key exchange underlying the Noise protocol with key-encapsulation mechanisms (KEMs). Since WireGuard requires the handshake message to fit in one UDP packet of size roughly 1200 B, they combined Classic McEliece and a modified variant of Saber. However, as...
The study of efficient multi-party computation (MPC) has been a central focus in the cryptographic literature, producing a wide range of innovative techniques that have substantially improved the practicality of MPC in real-world applications. However, the vast majority of this work assumes reliable communication channels and neglects the impact of network-level noise—a fundamental characteristic of modern communication systems. Although classical error-correcting codes can be used to...
Threshold public-key encryption (tPKE) enables any subset of $t$ out of $K$ parties to decrypt non-interactively, while any ciphertext remains secure if less than $t$ decryption shares are known. Despite recent progress, existing lattice-based tPKEs face at least one of the following drawbacks: (1) having large decryption share size -- polynomial in $K$ and some even exponential in $t$, (2) proven secure only against relaxed security models where the adversary is not allowed to see...
Pseudorandom correlation generators (PCGs) have been popular in generating a huge amount of correlated randomness, a critical resource in secure computation. However, existing PCGs are memory-consuming and not friendly to resource-constrained devices. Even for moderate devices, the need for large memory can also be a disadvantage in applications like zero-knowledge proofs or large-scale secure computation. In this paper, we propose a malicious streaming PCG (sPCG), which generates a bounded...
State-of-the-art actively secure multiparty computation protocols, like SPDZ (Damgård et al., CRYPTO 2012), use correlated randomness, like Beaver triples, to achieve a highly efficient online phase. For a long time, the generation of the correlated randomness in the offline phase relied on classical cryptographic primitives, like somewhat homomorphic encryption or oblivious transfer, that required significant communication. More recently, Boyle et al. (FOCS 2020) defined a new primitive...
Fully Homomorphic Encryption (FHE) enables computation over encrypted data, but deployment is hindered by the gap between plaintext and ciphertext programming models. FHE compilers aim to automate this translation, with a promising approach being FHE Instruction Set Architecture (FHE-ISA) based on homomorphic look-up-tables (LUT). However, existing FHE LUT techniques are limited to 16-bit precision and face critical performance bottlenecks. We introduce Tetris, a versatile TFHE LUT...
Approximate Homomorphic Encryption (AHE), introduced by Cheon et al. [CKKS17] offers a powerful solution for encrypting real-valued data by relaxing the correctness requirement and allowing small decryption errors. Existing constructions from (Ring) Learning with Errors achieve standard IND-CPA security, but this does not fully capture scenarios where an adversary observes decrypted outputs. Li and Micciancio [LiMic21] showed that when decryptions are passively leaked, these schemes become...
This work presents a joint design of encoding and encryption procedures for public-key encryption (PKE) and key encapsulation mechanisms (KEMs) such as Kyber, without relying on the assumption of independent decoding noise components, achieving reductions in both communication overhead (CER) and decryption failure rate (DFR). Our design features two techniques: ciphertext packing and lattice packing. First, we extend the Peikert-Vaikuntanathan-Waters (PVW) method to Kyber: $\ell$ plaintexts...
Mix networks (mix-nets) offer strong anonymity by routing client packets through intermediary hops, where they are shuffled with other packets to obscure their origins from a global adversary monitoring all communication exchanges. However, this anonymity is achieved at the expense of increased end-to-end latency, as packets traverse multiple hops (incurring routing delays) and experience additional delays at each hop for shuffling purposes. Consequently, the overall latency for delivering a...
Plaintext-checking (PC) oracle attacks are among the most prominent types of attacks against NIST's recently standardized ML-KEM. Previous works have drastically reduced the number of queries needed to recover the secret key. Although this number is now close to the information-theoretic bound, current attacks have not yet been adapted to settings with increased noise and highly imperfect oracles. In attacks targeting real-world protected implementations, noisy leakage may lead to oracles...
The Regular Syndrome Decoding (RSD) problem, introduced nearly two decades ago, is a regular version of the Syndrome Decoding (SD) problem, where the noise vector is divided into consecutive, equally sized blocks, each containing exactly one noisy coordinate. Recently, RSD has gained renewed attention for its applications in Pseudorandom Correlation Generators (PCGs) and more. Very recently, several works have presented the improved algebraic approach (AGB for short) and ISD approach (including...
The $k$LIN problem concerns solving noisy systems of random sparse linear equations mod 2. It gives rise to natural candidate hard CSP distributions and is a cornerstone of local cryptography. Recently, it was used in advanced cryptographic constructions, under the name 'sparse LPN'. For constant sparsity $k$ and inverse polynomial noise rate, both search and decision versions of $k$LIN are statistically possible and conjectured to be computationally hard for $n\ll m\ll n^{k/2}$, where...
We present a fully homomorphic encryption scheme which natively supports arithmetic and logical operations over large "machine words", namely plaintexts of the form $\mathbb{Z}_{2^n}$ (e.g., $n=64$). Our scheme builds on the well-known BGV framework, but deviates in the selection of number field and in the encoding of messages. This allows us to support large message spaces with only modest effect on the noise growth. Arithmetic operations (modulo $2^n$) are supported natively similarly...
This paper addresses the challenge of best arm identification in stochastic multi-armed bandit (MAB) models under privacy-preserving constraints, such as in dynamic spectrum access networks where secondary users must privately detect underutilized channels. While previous network security research has explored securing MAB algorithms through techniques such as homomorphic encryption or differential privacy, these methods often suffer from high computational overhead or introduce noise that...
The hardness of the learning with errors (LWE) problem increases as its noise rate grows. However, all existing LWE-based public-key encryption schemes require the noise rate to be no greater than $o(1/(\sqrt{n}\log n))$. Breaking through this limitation presents an intriguing challenge. In this paper, we construct public-key encryption (PKE) schemes based on the sub-exponential hardness of decisional LWE with polynomial modulus and noise rate ranging from $O(1/\sqrt{n})$ to $o(1/\log...
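Why the noise rate bounds decryption correctness can be seen in a toy Regev-style scheme: the accumulated error in a ciphertext must stay below $q/4$ for the encrypted bit to survive rounding. Parameters below are tiny and purely illustrative.

```python
# Toy Regev-style LWE encryption of a single bit. Correctness requires the
# accumulated error (here at most 4 samples x noise_bound 2 = 8) to stay
# below q/4 = 64, which is why larger noise rates are hard to accommodate.
import random

q, n = 257, 8
rng = random.Random(3)
s = [rng.randrange(q) for _ in range(n)]          # secret key

def lwe_sample(noise_bound):
    a = [rng.randrange(q) for _ in range(n)]
    e = rng.randrange(-noise_bound, noise_bound + 1)
    return a, (sum(x * y for x, y in zip(a, s)) + e) % q

def encrypt(bit, noise_bound=2, m=4):
    """Sum m fresh samples and add bit * floor(q/2) to the b-component."""
    samples = [lwe_sample(noise_bound) for _ in range(m)]
    a = [sum(col) % q for col in zip(*(a for a, _ in samples))]
    b = (sum(b for _, b in samples) + bit * (q // 2)) % q
    return a, b

def decrypt(ct):
    a, b = ct
    d = (b - sum(x * y for x, y in zip(a, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0   # round to nearest multiple of q/2

for bit in (0, 1):
    assert decrypt(encrypt(bit)) == bit
```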
Optimizing Boolean circuits presents a considerable challenge, especially when aiming to construct circuits amenable to Fully Homomorphic Encryption (FHE) schemes. FHE enables arbitrary computations on encrypted data but incorporates a computationally intensive operation called bootstrapping, necessary for reducing noise in ciphertexts to facilitate computations on circuits of arbitrary depth. This operation can consume a substantial amount of time, depending on the size of the circuits. To...
In the last few years, the old idea of internal perturbation for multivariate schemes has been resurrected. A form of this method was proposed with application to HFE and UOV and independently by another team for application to Rainbow. Most recently, a newer and more efficient version of internal perturbation was proposed as an enhanced measure for securing HFE for encryption. This efficient method, known as the LL' construction, is designed to add little complexity to HFE decryption...
We present InsPIRe, the first private information retrieval (PIR) construction to simultaneously obtain both high throughput and low query communication while using silent preprocessing (meaning no offline communication). Prior PIR schemes with both high throughput and low query communication required substantial offline communication of either downloading a database hint that is 10-100x larger than the communication cost of a single query (such as SimplePIR and DoublePIR [Henzinger...
Evaluating the security of a device against side-channel attacks is a difficult task. One prominent strategy for this purpose is to characterize the distribution of the rank of the correct key among the different key hypotheses produced by a maximum likelihood attack, depending on the number of measured traces. In practice, evaluators can estimate some statistics of the rank that are used as security indicators---e.g., the arithmetic and geometric mean rank, the median rank, the...
Many post-quantum cryptosystems require generating an $n$-bit binary vector with a prescribed Hamming weight $\omega$, a process known as \emph{fixed-weight sampling}. When $\omega = O(n)$, we call this \emph{dense} fixed-weight sampling, which commonly appears in lattice-based cryptosystems, like those in the NTRU family. In contrast, code-based cryptosystems typically use \emph{sparse} fixed-weight sampling with $\omega = o(n)$ (e.g., $O(\sqrt{n})$). Sparse fixed-weight sampling generally...
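As a hedged illustration of the dense fixed-weight sampling task described in this abstract (not the paper's own algorithm), the sketch below draws a uniformly random $n$-bit vector of Hamming weight $\omega$ via a partial Fisher-Yates shuffle of the index set:

```python
import secrets

def fixed_weight_vector(n, w):
    """Sample a uniformly random n-bit binary vector of Hamming
    weight w by partially shuffling the index set and setting the
    first w chosen positions to 1."""
    idx = list(range(n))
    for i in range(w):
        # Pick a uniformly random index from the unshuffled suffix.
        j = i + secrets.randbelow(n - i)
        idx[i], idx[j] = idx[j], idx[i]
    v = [0] * n
    for i in idx[:w]:
        v[i] = 1
    return v
```

Note this sketch is not constant-time; hardened implementations (the topic of such papers) must avoid secret-dependent memory access.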
The ongoing transition to post-quantum cryptography has led to a surge of research in side-channel countermeasures tailored to these schemes. A prominent method to prove security in the context of side-channel analysis is the utilization of the well-established t-probing model. However, recent studies by Hermelink et al. at CCS 2024 demonstrate a simple and practical attack on a provably secure implementation of the Fujisaki-Okamoto transform. In this paper, we present an unsupervised...
We construct an efficient pseudorandom correlation generator (PCG) (Boyle et al., Crypto'19) for two-party programmable oblivious linear evaluation (OLE) functionality over $\mathbb{F}_2$. Our construction (i) has an efficient seed expansion phase, and (ii) comes with a concretely efficient protocol for distributing the seeds that makes black-box use of cryptography and runs in a constant number of rounds. PCGs for programmable OLE are known to imply PCGs for generating $n$-party Beaver...
Differential privacy (DP) has emerged as a preferred solution for privacy-preserving data analysis, having been adopted by several leading Internet companies. DP is a privacy-preserving mechanism that protects against re-identification of individuals within aggregated datasets. It is known that the privacy budget $\varepsilon$ determines the trade-off between privacy and utility. In this paper, we propose the use of a novel set of metrics and an easy-to-implement, step-by-step framework to...
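To make the privacy-utility trade-off governed by $\varepsilon$ concrete, here is a minimal Laplace-mechanism sketch (illustrative only, not taken from the paper); the noise scale $\mathrm{sensitivity}/\varepsilon$ grows as $\varepsilon$ shrinks:

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value plus Laplace(0, sensitivity/epsilon) noise.
    Smaller epsilon -> larger noise scale -> stronger privacy but
    lower utility."""
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise
```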
This paper presents OnionPIRv2, an efficient implementation of OnionPIR incorporating standard orthogonal techniques and engineering improvements. OnionPIR is a single-server PIR scheme that improves response size and computation cost by utilizing recent advances in somewhat homomorphic encryption (SHE) and carefully composing two lattice-based SHE schemes to control the noise growth of SHE. OnionPIRv2 achieves $2.5$x-$3.6$x response overhead for databases with moderately large entries...
The Learning Parity with Noise (LPN) problem has become a cornerstone for building lightweight, post-quantum secure encryption schemes. Despite its widespread adoption, LPN-based constructions suffer from a fundamental efficiency limitation: the noise term that is essential for security simultaneously necessitates error-correction coding, leading to bandwidth overhead. We introduce a variant of LPN termed Learning Parity with Quantization (LPQ). While maintaining the ``learning from noisy...
We propose a new iterative method to convert a ciphertext from the Generalized BFV (GBFV) to the regular BFV scheme. In particular, our conversion starts from an encrypted plaintext that lives in a large cyclotomic ring modulo a small-norm polynomial $t(x)$, and gradually changes the encoding to a smaller cyclotomic ring modulo a much larger integer $p$. Previously, only a trivial conversion method was known, which did not change the underlying cyclotomic ring. Using our improved...
Privacy-preserving neural network inference using Fully Homomorphic Encryption (FHE) faces significant challenges in efficiently evaluating non-polynomial functions, such as activation functions, which are critical for introducing non-linearity in neural networks. Full-Domain Functional Bootstrap (FDFB) algorithms provide a promising solution by enabling the evaluation of arbitrary functions while simultaneously refreshing ciphertexts to manage noise accumulation. Despite their theoretical...
We present a novel homomorphic trace evaluation algorithm $\mathsf{RevHomTrace}$, which mitigates the phase amplification problem that comes with the definition of the field trace. Our $\mathsf{RevHomTrace}$ overcomes the phase amplification with only a negligible computational overhead, thereby improving the usability of the homomorphic field trace algorithm. Moreover, our tweak also improves the noise propagation of the $\mathsf{HomTrace}$ and breaks the traditional $O(N^{3})$ variance...
Many distributed analytics applications that are offloaded to the cloud operate on sensitive data. Even when the computations for such analytics workloads are confined to trusted hardware enclaves and all stored data and network communications are encrypted, several studies have shown that they are still vulnerable to access pattern attacks. Prior efforts towards preventing access pattern leakage often incur network and compute overheads that are logarithmic in dataset size, while also...
We introduce the notion of committed vector oblivious linear evaluation (C-VOLE), which allows a party holding a pre-committed vector to generate VOLE correlations with multiple parties on the committed value. It is a unifying tool that can be found useful in zero-knowledge proofs (ZKPs) of committed values, actively secure multi-party computation, private set intersection (PSI), etc. To achieve the best efficiency, we design a tailored commitment scheme and matching C-VOLE protocols,...
Homomorphic encryption schemes based on the Ring-Learning-with-Errors problem require accurate ciphertext noise analysis to ensure correctness and security. However, ring multiplications during homomorphic computations make the noise in the result ciphertexts difficult to characterize. Existing average-case noise analyses derive a bound on the noise by either assuming it follows a Gaussian distribution, or giving empirical formulae, under a strong independence assumption and the Central Limit...
Differentially private stochastic gradient descent (DP-SGD) trains machine learning (ML) models with formal privacy guarantees for the training set by adding random noise to gradient updates. In collaborative learning (CL), where multiple parties jointly train a model, noise addition occurs either (i) before or (ii) during secure gradient aggregation. The first option is deployed in distributed DP methods, which require greater amounts of total noise to achieve security, resulting in...
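The noise-addition step of DP-SGD described in this abstract can be sketched as follows (a simplified single-machine illustration, not the paper's collaborative protocol; `clip_norm` and `noise_multiplier` are the standard DP-SGD hyperparameters):

```python
import math
import random

def dp_sgd_update(per_example_grads, clip_norm, noise_multiplier, lr, params):
    """One DP-SGD step: clip each per-example gradient to L2 norm
    clip_norm, sum the clipped gradients, add Gaussian noise with
    std = noise_multiplier * clip_norm, then apply the averaged
    noisy gradient to the parameters."""
    n = len(per_example_grads)
    dim = len(params)
    summed = [0.0] * dim
    for g in per_example_grads:
        norm = math.sqrt(sum(x * x for x in g))
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        for j in range(dim):
            summed[j] += g[j] * scale
    sigma = noise_multiplier * clip_norm
    noisy = [(s + random.gauss(0, sigma)) / n for s in summed]
    return [p - lr * g for p, g in zip(params, noisy)]
```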
Pseudorandom codes (PRCs) are error-correcting codes with the distinguishing feature that their codewords are computationally indistinguishable from random strings. Introduced by Christ and Gunn (CRYPTO 2024), PRCs have found applications in areas such as AI watermarking, where both robustness and pseudorandomness are essential. All known constructions of PRCs rely on coding-theoretic hardness assumptions. In this work, we study how inherent the use of coding-theoretic hardness is in the...
Secret sharing is a foundational cryptographic primitive for sharing secret keys in distributed systems. In a classical threshold setting, it involves a dealer who has a secret, a set of $n$ users to whom shares of the secret are sent, and a threshold $t$ which is the minimum number of shares required to recover the secret. These schemes offer an all-or-nothing security approach where less than $t$ shares reveal no information about the secret. But these guarantees are threatened by...
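The classical threshold setting described here is realized by Shamir's scheme; a minimal sketch over a toy prime field (illustrative only, not a production implementation) looks like:

```python
import secrets

PRIME = 2**61 - 1  # a Mersenne prime defining a toy field

def share(secret, n, t):
    """Shamir (t, n) sharing: sample a random degree-(t-1)
    polynomial with constant term = secret; share i is (i, f(i))."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    return [(i, sum(c * pow(i, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; any t shares recover the
    secret, fewer reveal nothing."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```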
Some of the most valuable applications of homomorphic encryption, such as encrypted machine learning inference, require efficient large-scale plaintext-ciphertext and ciphertext-ciphertext matrix multiplications. Current state-of-the-art techniques for matrix multiplications all build on the ability to pack many plaintext values into a single ciphertext and compute on them in a Single Instruction, Multiple Data (SIMD) manner. However, to fit the operation of matrix multiplication into this computational...
We give the first constructions of multiparty pseudorandom correlation generators, distributed point functions, and (negligible-error) homomorphic secret sharing for constant-degree polynomials for any number of parties without using LWE or iO. Our constructions are proven secure under the combination of LPN with dimension $n$, $2n$ samples, and noise rate $n^{\varepsilon-1}$ for a small constant $\varepsilon$, and MQ with $n$ variables and $n^{1+\delta}$ equations. As applications of our...
Multiparty fully homomorphic encryption (MPFHE) is a generalization of (multi-key) fully homomorphic encryption ((MK)FHE) that lives on the cusp between multiparty computation (MPC) and FHE, enabling a computation over encrypted data using multiple keys. However, contrary to MKFHE, it seeks to reduce the noise inflation based on the number of parties by allowing the parties to first compute shared data in MPC before executing the computation in FHE. Yet many works use specific...
Fully Homomorphic Encryption (FHE) enables computations on encrypted data without requiring decryption. However, each computation increases the noise level, which can eventually cause decryption failures once a certain threshold is reached. In particular, homomorphic multiplication significantly amplifies noise in the ciphertext. In this work, we revisit the Ring-Learning-With-Errors (RLWE) based encryption proposed by Fan et al. and present an optimized noise growth approach by swapping the...
We give new constructions of pseudorandom functions (PRFs) computable in $\mathsf{NC}^1$ from (variants of the) Learning Parity with Noise (LPN) assumption. Prior to our work, the only $\mathsf{NC}^1$-computable PRF from LPN-style assumptions was due to Boyle et al. (FOCS 2020) who constructed a weak PRF from a new heuristic variant of LPN called variable-density LPN. We give the following three results: (1) A weak PRF computable in $\mathsf{NC}^1$ from standard LPN. (2) A...
We initiate the study of multi-authority attribute-based functional encryption for noisy inner-product functionality, and propose two new primitives: (1) multi-authority attribute-based (noisy) inner-product functional encryption (MA-AB(N)IPFE), and (2) multi-authority attribute-based evasive inner-product functional encryption (MA-ABevIPFE). The MA-AB(N)IPFE primitive generalizes the existing multi-authority attribute-based inner-product functional encryption schemes by Agrawal et al....
As a theoretical cornerstone of post-quantum cryptography, the Learning With Errors (LWE) problem serves as the security foundation for standardized algorithms such as Kyber and Dilithium. Recently, a framework for provable dual attacks on LWE has been proposed by Pouly et al. in Eurocrypt 2024, addressing the limitations in effectiveness caused by existing methods' reliance on heuristic assumptions in LWE dual attacks. Their paper also poses an open problem on how to formally integrate...
Motivated by applications to efficient secure computation, we consider the following problem of encrypted matrix–vector product (EMVP). Let $\mathbb F$ be a finite field. In an offline phase, a client uploads an encryption of a matrix $M \in \mathbb F^{m\times \ell}$ to a server, keeping only a short secret key. The server stores the encrypted matrix \(\hat{M}\). In the online phase, the client may repeatedly send encryptions \(\hat{q}_i\) of query vectors \(q_i \in \mathbb F^\ell\), ...
Localized side-channel analysis makes it possible to evaluate only the relevant chip area by measuring near-field electromagnetic (EM) emanations. Compared to global power measurements, this can lead to more powerful attacks as the signal-to-noise ratio is higher and irrelevant circuit components are not included in the recorded measurements. Especially for profiled attacks and their reproduction, the probe position in a localized scenario is of utmost importance. Ideally a probe should be...
Noisy linear algebraic assumptions with respect to random matrices, in particular Learning with Errors ($\mathsf{LWE}$) and Alekhnovich Learning Parity with Noise (Alekhnovich $\mathsf{LPN}$), are among the most investigated assumptions that imply post-quantum public-key encryption (PKE). They enjoy elegant mathematical structure. Indeed, efforts to build post-quantum PKE and advanced primitives such as homomorphic encryption and indistinguishability obfuscation have increasingly focused...
The Fiat-Shamir transform is one of the most widely applied methods for secure signature construction. Fiat-Shamir starts with an interactive zero-knowledge identification protocol and transforms this via a hash function into a non-interactive signature. The protocol's zero-knowledge property ensures that a signature does not leak information on its secret key $\mathbf s$, which is achieved by blinding $\mathbf s$ via proper randomness $\mathbf y$. Most prominent Fiat-Shamir examples are...
Fully Homomorphic Encryption (FHE) allows computations on encrypted data without revealing any information about the data itself. However, FHE ciphertexts include noise for security reasons, which increases during operations and can lead to decryption errors. This paper addresses the noise introduced during bootstrapping in Torus Fully Homomorphic Encryption (TFHE), particularly focusing on approximation errors during modulus switching and gadget decomposition. We propose a mean compensation...
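For context on where the modulus-switching error discussed in this abstract comes from, a toy coefficient-level sketch (illustrative only; real TFHE switches LWE coefficients to a modulus of the form $2N$) shows that rounding each rescaled coefficient introduces an error of at most $1/2$:

```python
def mod_switch(a, q, q_new):
    """Rescale a coefficient from Z_q to Z_{q_new}; the rounding
    step introduces an approximation error of at most 1/2."""
    return round(a * q_new / q) % q_new

def switch_error(a, q, q_new):
    """Gap between exact rescaling and its rounded value."""
    exact = a * q_new / q
    return exact - round(exact)
```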
There is rising interest in combining Differential Privacy (DP) and Secure Multiparty Computation (MPC) techniques to protect distributed database query evaluations from both adversaries taking part in the computation and those observing the outputs. This requires implementing both the query evaluation and noise generation parts of a DP mechanism directly in MPC. While query evaluation can be done using existing highly optimized MPC techniques for secure function evaluation, efficiently...
Fully Homomorphic Encryption (FHE) enables arbitrary computation directly on encrypted data. The TFHE scheme supports encryption of bits or small integers, evaluating any univariate function via programmable bootstrapping (PBS), which also refreshes ciphertext noise. Since both linear and nonlinear functions can be expressed with PBS, arbitrary circuits of unlimited depth can be computed without accuracy loss, aside from a negligible failure probability. However, a key limitation of TFHE is...
Secure rate-distortion-perception (RDP) trade-offs arise in critical applications, such as semantic compression and privacy-preserving generative coding, where preserving perceptual quality while minimizing distortion is vital. This paper studies a framework for secure RDP over noiseless and noisy broadcast channels under strong secrecy constraints. We first characterize the exact secure RDP region for noiseless transmission channels. We then develop an inner bound on the secure RDP region...
Threshold encryption schemes provide a common tool to secure a public-key encryption scheme against single point of failure attacks. Despite the success of lattices in building fully-homomorphic and presumably quantum-resistant encryption schemes, the task of thresholdizing those schemes remains challenging. The major bottleneck in the standard approach is the use of statistical noise flooding, leading to a significant efficiency loss and the need of stronger hardness assumptions. Recent...
Delegation of quantum computation in a trustful way is one of the most fundamental challenges toward the realization of future quantum cloud computing. While considerable progress has been made, no known protocol provides a purely classical client with universal delegated quantum computation while simultaneously ensuring blindness (input privacy), verifiability (soundness), and robustness against quantum noise—a feat that must be achieved under stringent cryptographic assumptions and with...
Side-channel analysis recovers a secret by exploiting the key-dependent characteristics of the leakages. Practical techniques, such as Distance-of-Means analysis (DoM), Kolmogorov-Smirnov analysis (KSA), and Cramér-von Mises analysis (CvMA), provide valuable insights about the secret from the indirect perspectives of statistical moments and the cumulative distribution function (CDF), respectively, circumventing the direct and costly estimation of leakage probability densities and therefore enabling...
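As a hedged sketch of the simplest of these distinguishers, Distance-of-Means partitions traces by a key-dependent selection bit and compares the two group means (illustrative only, not the paper's method):

```python
def distance_of_means(traces, selection_bits):
    """DoM distinguisher: split leakage samples by a predicted
    key-dependent bit and return the difference of group means;
    a large absolute value supports the key hypothesis."""
    g0 = [t for t, b in zip(traces, selection_bits) if b == 0]
    g1 = [t for t, b in zip(traces, selection_bits) if b == 1]
    return sum(g1) / len(g1) - sum(g0) / len(g0)
```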
We show that concrete hardness assumptions about learning or cloning the output state of a random quantum circuit can be used as the foundation for secure quantum cryptography. In particular, under these assumptions we construct secure one-way state generators (OWSGs), digital signature schemes, quantum bit commitments, and private key encryption schemes. We also discuss evidence for these hardness assumptions by analyzing the best-known quantum learning algorithms, as well as proving...