Goto

Collaborating Authors

 Dixon, Michael


Polynomial Bounds for Learning Noisy Optical Physical Unclonable Functions and Connections to Learning With Errors

arXiv.org Artificial Intelligence

It is shown that a class of optical physical unclonable functions (PUFs) can be learned to arbitrary precision with arbitrarily high probability, even in the presence of noise, given access to polynomially many challenge-response pairs and polynomially bounded computational power, under mild assumptions about the distributions of the noise and challenge vectors. This extends the results of Rührmair et al. (2013), who showed a subset of this class of PUFs to be learnable in polynomial time in the absence of noise, under the assumption that the optics of the PUF were either linear or had negligible nonlinear effects. We derive polynomial bounds for the required number of samples and the computational complexity of a linear regression algorithm, in terms of the size parameters of the PUF, the distributions of the challenge and noise vectors, and the desired accuracy and success probability of the regression, using an analysis similar to that of Bootle et al. (2018), who demonstrated a learning attack on a poorly implemented version of the Learning With Errors problem.
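As a rough illustration of the kind of attack the abstract describes, the sketch below fits a model of an idealized linear-optics PUF from noisy challenge-response pairs using ordinary least squares. The sizes, distributions, and noise level are assumptions chosen for illustration, not parameters from the paper.

```python
# Illustrative sketch (not the paper's code): learning an idealized
# linear-optics PUF model from noisy challenge-response pairs with
# ordinary least squares. All sizes and noise levels are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 64            # challenge length (a size parameter of the PUF)
m = 20 * n        # polynomially many challenge-response pairs
noise_std = 0.05  # assumed independent measurement noise

# Hidden linear transfer vector of the (idealized, linear-optics) PUF.
w_true = rng.normal(size=n)

# Challenges drawn from an assumed benign distribution; responses are
# linear in the challenge plus noise.
X = rng.normal(size=(m, n))
y = X @ w_true + rng.normal(scale=noise_std, size=m)

# Linear regression (least squares) recovers an approximation of w_true.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print("relative model error:",
      np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))

# With the learned model, responses to fresh challenges can be predicted,
# which is what "learnable to arbitrary precision" means in this setting.
x_new = rng.normal(size=n)
print("predicted response:", x_new @ w_hat)
```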


Say Cheese! Experiences with a Robot Photographer

AI Magazine

We have developed an autonomous robot system that takes well-composed photographs of people at social events, such as weddings and conference receptions. In this article, we outline the overall architecture of the system and describe how the various components interrelate. We also describe our experiences deploying the robot photographer at a number of real-world events.


Say Cheese! Experiences with a Robot Photographer

AI Magazine

We introduced a sensor abstraction layer to separate the task layer from concerns about physical sensing devices. We process the sensor information (from the laser rangefinder in this application) into distance measurements from the center of the robot, thus allowing consideration of sensor error models and performance. This model makes system debugging significantly easier, because we know exactly what each sensor reading is at every point in the computation; something that would not be the case if we were reading from the sensors every time a reading was used in a calculation. This model also allows us to inject modified sensor readings into the system, as described in the next section.
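A minimal sketch of this sensor-abstraction idea follows, assuming a Python-style interface; the class and method names are illustrative and are not taken from the authors' system. It snapshots the raw rangefinder readings once per cycle, converts them to distances from the robot's center, serves every calculation in that cycle from the stored values, and lets modified readings be injected in place of live data.

```python
# Illustrative sketch of the sensor-abstraction idea described above;
# class and method names are assumptions, not the authors' actual code.
from typing import Callable, List, Optional


class SensorAbstractionLayer:
    """Snapshots raw laser readings once per cycle and serves all queries
    from that snapshot, converted to distances from the robot's center."""

    def __init__(self, read_laser: Callable[[], List[float]], sensor_offset: float):
        self._read_laser = read_laser        # raw driver: ranges from the laser itself
        self._sensor_offset = sensor_offset  # laser's offset from the robot center (m)
        self._snapshot: Optional[List[float]] = None
        self._injected: Optional[List[float]] = None

    def start_cycle(self) -> None:
        """Grab fresh readings (or injected ones) at the start of a task-layer cycle."""
        raw = self._injected if self._injected is not None else self._read_laser()
        # Convert raw ranges to distances from the robot center; a real
        # implementation would also apply sensor error models here.
        self._snapshot = [r + self._sensor_offset for r in raw]

    def distances(self) -> List[float]:
        """Every calculation in the current cycle sees the same stored values."""
        if self._snapshot is None:
            raise RuntimeError("start_cycle() must be called first")
        return list(self._snapshot)

    def inject_readings(self, fake_ranges: Optional[List[float]]) -> None:
        """Replace live sensor data with modified readings (e.g., for debugging)."""
        self._injected = fake_ranges


# Example use with a fake driver standing in for the real laser rangefinder.
if __name__ == "__main__":
    fake_driver = lambda: [2.0, 2.1, 1.9]
    sal = SensorAbstractionLayer(fake_driver, sensor_offset=0.2)
    sal.start_cycle()
    print(sal.distances())            # consistent values for the whole cycle
    sal.inject_readings([0.5, 0.5, 0.5])
    sal.start_cycle()
    print(sal.distances())            # injected readings, as described in the text
```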