Neural Information Processing Systems
A.1 More Details on Preliminaries

A.1.1 Fixed-Point Encoding

Like other neural networks, Transformer-based models use floating-point arithmetic, whereas cryptographic protocols operate on integers. We therefore require a float-to-integer conversion [46, 30, 17] to represent a floating-point number x ∈ Q in the ring Z_{2^ℓ}. Specifically, we first encode x as a fixed-point number, parametrized by a scale s that determines the fractional precision, and then embed the fixed-point representation into the ring using 2's complement representation.

A protocol Π_PI between a server holding a model M with weights w and a client holding a sample x is a private inference protocol against honest-but-curious adversaries if it satisfies the following guarantees: 1) Correctness: for every model weights w and every input sample x, the output of the client at the end of the protocol is the correct inference M(w,x).
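To make the encoding concrete, the following is a minimal sketch of the float-to-ring conversion described above; the bit width ℓ and scale s chosen here are illustrative, not the paper's actual parameters.

```python
ELL = 64          # illustrative ring bit width: elements live in Z_{2^64}
S = 12            # illustrative scale s: x is represented as round(x * 2^s)
MOD = 1 << ELL

def encode(x: float) -> int:
    """Encode a real number as a ring element via fixed-point + 2's complement."""
    fixed = round(x * (1 << S))   # fixed-point integer with s fractional bits
    return fixed % MOD            # embed into Z_{2^ell}; negatives wrap around

def decode(v: int) -> float:
    """Recover the signed fixed-point value from a ring element."""
    if v >= MOD // 2:             # 2's complement: the top half of the ring is negative
        v -= MOD
    return v / (1 << S)
```

Values whose fractional part fits in s bits round-trip exactly; others incur a quantization error of at most 2^(-s-1), which is the precision/range trade-off the scale parameter s controls.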