We thank the reviewers for their careful reading of the manuscript and their constructive suggestions

Neural Information Processing Systems

We thank the reviewers for their careful reading of the manuscript and their constructive suggestions. Chimera supports switching between BFV and TFHE, while Glyph enables switching between BGV and TFHE. Some users may not have such a large network bandwidth. In contrast, Glyph first trains a CNN model on a plaintext public dataset. Apart from sending the encrypted input data, the client is not involved in Glyph's training.


Reviews: SHE: A Fast and Accurate Deep Neural Network for Encrypted Data

Neural Information Processing Systems

Main contribution: The paper shows how to implement an accurate homomorphic ReLU and a homomorphic max-pooling operation. This is achieved by combining logarithmic quantization, followed by shift-and-add operations, with the basic approach of TFHE (Fast Fully Homomorphic Encryption over the Torus). They further note that 5-bit representations suffice for weights, but the intermediate results of accumulation need a 16-bit representation to avoid degrading accuracy; they therefore propose mixed-bitwidth accumulators to avoid unnecessary computational cost. With these few key ideas, the authors show how TFHE can support fast matrix multiplications and convolutions, which were previously extremely slow.
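The core trick above can be shown in plain (unencrypted) Python: once a weight is quantized to a signed power of two, multiplying by it reduces to a bit shift, and a dot product reduces to shifts and adds — exactly the operations that are cheap in TFHE's Boolean circuits. This is a minimal illustrative sketch, not the paper's implementation; all names are made up.

```python
import math

def log_quantize(w):
    """Quantize a real weight to the nearest power of two, as (sign, exponent)."""
    sign = -1 if w < 0 else 1
    exp = round(math.log2(abs(w))) if w != 0 else None
    return sign, exp

def shift_mul(x, qw):
    """Multiply an integer activation x by a log-quantized weight via a bit shift."""
    sign, exp = qw
    if exp is None:  # weight quantized to zero
        return 0
    return sign * (x << exp) if exp >= 0 else sign * (x >> -exp)

def shift_dot(xs, qws):
    """Dot product using only shifts and adds -- no true multiplications."""
    return sum(shift_mul(x, qw) for x, qw in zip(xs, qws))
```

For example, `shift_dot([3, 8], [(1, 2), (1, -1)])` computes 3·4 + 8·0.5 = 16 without a single multiplication.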


Glyph: Fast and Accurately Training Deep Neural Networks on Encrypted Data

Neural Information Processing Systems

Because they lack the expertise, average users who want to benefit from their data have to upload their private data to cloud servers they may not trust. Due to legal or privacy constraints, most users are willing to contribute only their encrypted data, and lack the interest or resources to join deep neural network (DNN) training in the cloud. However, the inefficient lookup-table-based activations used by prior FHE training schemes significantly prolong the private training latency of DNNs. In this paper, we propose Glyph, an FHE-based technique to train DNNs on encrypted data fast and accurately by switching between the TFHE (Fast Fully Homomorphic Encryption over the Torus) and BGV cryptosystems. Glyph uses logic-operation-friendly TFHE to implement nonlinear activations, while adopting vectorial-arithmetic-friendly BGV to perform multiply-accumulate operations (MACs). Glyph further applies transfer learning to DNN training to improve test accuracy and reduce the number of MACs between ciphertext and ciphertext in convolutional layers.
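The transfer-learning point can be made concrete with a toy bookkeeping sketch: layers pretrained on a public dataset stay frozen, so their weights remain plaintext and each of their MACs is the cheap plaintext-times-ciphertext kind; only the retrained head incurs the far more expensive ciphertext-times-ciphertext MACs. The layer sizes below are invented for illustration and are not from the paper.

```python
def mac_kinds(layers):
    """Count MACs by kind for a list of (n_macs, frozen) layer descriptions.

    Frozen layers keep plaintext weights (pt * ct MACs); trainable layers
    have encrypted weights (ct * ct MACs, the expensive case under FHE).
    """
    counts = {"pt_ct": 0, "ct_ct": 0}
    for n_macs, frozen in layers:
        counts["pt_ct" if frozen else "ct_ct"] += n_macs
    return counts

# Hypothetical network: two frozen conv layers, one small trainable head.
layers = [(90_000, True), (160_000, True), (4_000, False)]
counts = mac_kinds(layers)  # the vast majority of MACs stay pt * ct
```

The design point is simply that freezing transferred layers moves almost all MACs out of the expensive ciphertext-ciphertext category.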


Speed-up of Data Analysis with Kernel Trick in Encrypted Domain

Yoo, Joon Soo, Song, Baek Kyung, Ahn, Tae Min, Heo, Ji Won, Yoon, Ji Won

arXiv.org Artificial Intelligence

Homomorphic encryption (HE) is pivotal for secure computation on encrypted data and crucial for privacy-preserving data analysis. However, efficiently processing high-dimensional data under HE, especially for machine learning and statistical (ML/STAT) algorithms, poses a challenge. In this paper, we present an effective acceleration method that uses the kernel method for HE schemes, improving the time performance of ML/STAT algorithms in the encrypted domain. The technique, independent of the underlying HE mechanism and complementary to existing optimizations, notably reduces costly HE multiplications, offering near-constant time complexity relative to the data dimension. Aimed at accessibility, the method is tailored to data scientists and developers with a limited cryptography background, facilitating advanced data analysis in secure environments.
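Why the kernel trick cuts multiplications can be seen with the polynomial kernel in plain Python: the explicit degree-p feature map has d**p coordinates, so an inner product there costs d**p multiplications, while the kernel form (x·y)**p needs only d multiplications plus one small power — the savings that matter when each multiplication is an expensive HE operation. This is a generic illustration of the kernel trick, not the paper's specific construction.

```python
from itertools import product
from math import prod

def poly_features(x, p):
    """Explicit degree-p monomial features: one per ordered index tuple (d**p of them)."""
    return [prod(c) for c in product(x, repeat=p)]

def poly_kernel(x, y, p):
    """Polynomial kernel (x . y)**p: only d multiplications plus one small power."""
    return sum(a * b for a, b in zip(x, y)) ** p
```

For x = [1, 2], y = [3, 4], p = 2, both routes give 121, but the explicit map already needs d² = 4 features, and the gap grows exponentially with p.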


Homomorphic WiSARDs: Efficient Weightless Neural Network training over encrypted data

Neumann, Leonardo, Guimarães, Antonio, Aranha, Diego F., Borin, Edson

arXiv.org Artificial Intelligence

The widespread application of machine learning algorithms is a matter of increasing concern for the data privacy research community, and many have sought to develop privacy-preserving techniques for it. Among existing approaches, the homomorphic evaluation of ML algorithms stands out by performing operations directly over encrypted data, enabling strong guarantees of confidentiality. The homomorphic evaluation of inference algorithms is practical even for relatively deep Convolutional Neural Networks (CNNs). However, training is still a major challenge, with current solutions often resorting to lightweight algorithms that can be unfit for solving more complex problems, such as image recognition. This work introduces the homomorphic evaluation of Wilkie, Stonham, and Aleksander's Recognition Device (WiSARD) and subsequent Weightless Neural Networks (WNNs) for training and inference on encrypted data. Compared to CNNs, WNNs offer better performance with a relatively small accuracy drop. We develop a complete framework for it, including several building blocks that may be of independent interest. Our framework achieves 91.7% accuracy on the MNIST dataset after only 3.5 minutes of encrypted training (multi-threaded), going up to 93.8% in 3.5 hours. For the HAM10000 dataset, we achieve 67.9% accuracy in just 1.5 minutes, going up to 69.9% after 1 hour. Compared to the state of the art on the HE evaluation of CNN training, Glyph (Lou et al., NeurIPS 2020), these results represent a speedup of up to 1200 times with an accuracy loss of at most 5.4%. For HAM10000, we even achieved a 0.65% accuracy improvement while being 60 times faster than Glyph. We also provide solutions for small-scale encrypted training. In a single thread on a desktop machine using less than 200MB of memory, we train over 1000 MNIST images in 12 minutes, or over the entire Wisconsin Breast Cancer dataset in just 11 seconds.
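Why WiSARD training is so fast under HE is easier to see from its structure: each class owns a set of RAM nodes, each addressed by a small slice of the binarized input; training just writes 1s into RAMs, and inference counts hits — no gradients, no multiplications. Below is a minimal plaintext sketch of a WiSARD classifier (not the paper's encrypted framework; class and method names are illustrative).

```python
class Wisard:
    """Tiny plaintext WiSARD: one discriminator (a row of RAM nodes) per class."""

    def __init__(self, input_bits, tuple_bits, classes):
        self.tb = tuple_bits
        self.n_rams = input_bits // tuple_bits
        # Each RAM node is modeled as the set of addresses written during training.
        self.rams = {c: [set() for _ in range(self.n_rams)] for c in classes}

    def _addresses(self, bits):
        """Split the binary input into tuple_bits-wide RAM addresses."""
        for i in range(self.n_rams):
            yield i, tuple(bits[i * self.tb:(i + 1) * self.tb])

    def train(self, bits, label):
        """Training is just memory writes: mark each observed address."""
        for i, addr in self._addresses(bits):
            self.rams[label][i].add(addr)

    def score(self, bits, label):
        """Inference counts how many RAM nodes recognize their address."""
        return sum(addr in self.rams[label][i] for i, addr in self._addresses(bits))

    def predict(self, bits):
        return max(self.rams, key=lambda c: self.score(bits, c))
```

A usage example: after `train([1,1,0,0], "a")` and `train([0,0,1,1], "b")`, `predict([1,1,0,0])` returns `"a"` because both of its 2-bit addresses hit class a's RAMs.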


Neural Network Training on Encrypted Data with TFHE

Montero, Luis, Frery, Jordan, Kherfallah, Celia, Bredehoft, Roman, Stoian, Andrei

arXiv.org Artificial Intelligence

We present an approach to outsourcing the training of neural networks while preserving data confidentiality from malicious parties. We use fully homomorphic encryption to build a unified training approach that works on encrypted data and learns quantized neural network models. The data can be horizontally or vertically split between multiple parties, enabling collaboration on confidential data. We train logistic regression models and multi-layer perceptrons on several datasets.
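The "quantized models" constraint can be sketched in plain Python: keeping weights on a small signed-integer grid means every value an FHE circuit must handle fits in a few bits. Below is a toy quantization-aware logistic-regression step; the 4-bit width, the scale, and the function names are illustrative choices, not taken from the paper.

```python
import math

def quantize(v, bits=4, scale=8):
    """Map a real value onto a signed integer grid of the given bit-width."""
    q = round(v * scale)
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return max(lo, min(hi, q))

def sgd_step(w_q, x, y, lr=0.5, scale=8):
    """One logistic-regression SGD step: dequantize, update, requantize."""
    w = [wi / scale for wi in w_q]
    z = sum(wi * xi for wi, xi in zip(w, x))
    p = 1 / (1 + math.exp(-z))                       # sigmoid prediction
    return [quantize(wi - lr * (p - y) * xi, scale=scale)
            for wi, xi in zip(w, x)]
```

Starting from zero weights, one step on the positive example x = [1, 1], y = 1 moves both quantized weights to 2 (i.e. 0.25 on the real grid).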


Deep Neural Networks for Encrypted Inference with TFHE

Stoian, Andrei, Frery, Jordan, Bredehoft, Roman, Montero, Luis, Kherfallah, Celia, Chevallier-Mames, Benoit

arXiv.org Artificial Intelligence

Fully homomorphic encryption (FHE) is an encryption method that allows computation to be performed on encrypted data without decryption. FHE preserves the privacy of users of online services that handle sensitive data, such as health data, biometrics, credit scores, and other personal information. A common way to provide a valuable service on such data is through machine learning, and, at this time, neural networks are the dominant machine learning model for unstructured data. In this work we show how to construct Deep Neural Networks (DNNs) that are compatible with the constraints of TFHE, an FHE scheme that allows computation circuits of arbitrary depth. We discuss the constraints and show the architecture of DNNs for two computer vision tasks. We benchmark the architectures using the Concrete stack, an open-source implementation of TFHE.
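The central TFHE compatibility constraint is that nonlinearities must be expressible as table lookups over a small quantized range, since TFHE evaluates them via lookup tables during bootstrapping. A plaintext sketch of that shape, with an illustrative 4-bit accumulator range (not a parameter taken from the paper):

```python
BITS = 4
LO, HI = -(2 ** (BITS - 1)), 2 ** (BITS - 1) - 1  # representable range: -8..7

# Precompute ReLU as a table over every representable accumulator value,
# mirroring how a TFHE circuit would encode the activation as a lookup table.
RELU_TABLE = {v: max(0, v) for v in range(LO, HI + 1)}

def tlu_relu(acc):
    """Apply ReLU through the lookup table, clipping the accumulator into range."""
    acc = max(LO, min(HI, acc))
    return RELU_TABLE[acc]
```

Any activation over the same range (sigmoid, step, clipping) fits the same pattern: only the table contents change, which is why quantized DNNs map naturally onto TFHE circuits.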