Penguin: Parallel-Packed Homomorphic Encryption for Fast Graph Convolutional Network Inference

Neural Information Processing Systems

HE operations (e.g., ciphertext (ct) rotations, multiplications, and additions) can be orders of magnitude more expensive than their plaintext counterparts. For example, a GCN layer's computation is dominated by the special consecutive HE operations defined in Sec. 2. For generality, we assume both the feature matrix and the adjacency matrix are encrypted. With Parallel-Packing (see Sec. 3.2), the ciphertext slot capacity is fully exploited and the total HE operation count is reduced. We adopt a threat model consistent with prior works [9, 14, 3, 7, 18, 22, 27]: the cloud server is semi-honest, i.e., it follows the protocol but may attempt to infer private information.
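To illustrate the amortization idea behind packing, here is a toy plaintext simulation of SIMD-style slot packing. It is not real HE; it only counts the rotations, slot-wise multiplications, and additions that a diagonal-method matrix-vector product would need. The function names (`rotate_blocks`, `mat_vec_packed`) and the per-block rotation are illustrative assumptions, not the paper's construction (real HE rotations act cyclically on the whole slot vector).

```python
def rotate_blocks(slots, k, block):
    """Rotate each length-`block` segment left by k.
    (A simplification of a real HE ciphertext rotation.)"""
    out = []
    for b in range(0, len(slots), block):
        seg = slots[b:b + block]
        k2 = k % block
        out.extend(seg[k2:] + seg[:k2])
    return out

def mat_vec_packed(matrix, slots, ops):
    """Diagonal-method matrix-vector product applied to k vectors
    packed side by side in `slots`; the d x d `matrix` is shared."""
    d = len(matrix)
    k = len(slots) // d
    acc = [0] * len(slots)
    for i in range(d):
        rot = rotate_blocks(slots, i, d)
        ops["rot"] += 1
        # i-th diagonal mask, replicated once per packed vector
        diag = [matrix[j][(j + i) % d] for j in range(d)] * k
        acc = [a + m * r for a, m, r in zip(acc, diag, rot)]
        ops["mul"] += 1
        ops["add"] += 1
    return acc

if __name__ == "__main__":
    M = [[1, 2], [3, 4]]
    packed = [1, 0, 0, 1]   # v1 = [1, 0] and v2 = [0, 1] in one "ciphertext"
    ops = {"rot": 0, "mul": 0, "add": 0}
    out = mat_vec_packed(M, packed, ops)
    print(out, ops)
```

Processing the two packed vectors costs d rotations in total, whereas two unpacked ciphertexts would cost 2d; this is the amortization that packing strategies exploit.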





Neural Information Processing Systems

A.1 More Details on Preliminaries

A.1.1 Fixed-Point Encoding

As with other neural networks, Transformer-based models use floating-point arithmetic, whereas cryptographic protocols operate on integers. We therefore require a float-to-integer conversion [46, 30, 17] to represent a floating-point number x ∈ Q in the ring Z_{2^ℓ}. Specifically, we first encode it as a fixed-point number, parametrized by a scale parameter s that determines the fractional precision. We then embed the fixed-point representation into the ring using two's complement representation.

A protocol Π_PI between the server, having as input a model M with weights w, and the client, having as input a sample x, is a private inference protocol against honest-but-curious adversaries if it satisfies the following guarantees: 1) Correctness: for every model weights w and every input sample x, the output of the client at the end of the protocol is the correct inference M(w, x).
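A minimal sketch of this encoding, assuming illustrative parameters ℓ = 32 and s = 16 fractional bits (example values, not taken from the paper):

```python
# Fixed-point encoding into the ring Z_{2^ELL} via two's complement.
# ELL and SCALE are assumed example parameters.
ELL = 32              # ring bit-width ell
SCALE = 16            # fractional bits (the scale parameter s)
MOD = 1 << ELL

def encode(x):
    """Float -> fixed-point integer (scale 2^SCALE) -> element of Z_{2^ELL}."""
    return round(x * (1 << SCALE)) % MOD

def decode(e):
    """Ring element -> signed integer (two's complement) -> float."""
    if e >= MOD // 2:     # upper half of the ring encodes negatives
        e -= MOD
    return e / (1 << SCALE)
```

Ring addition matches fixed-point addition directly; note that after a ring multiplication the product carries scale 2s and must be rescaled back to scale s, which is why protocols pair multiplication with a truncation step.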