Kernel functions based on triplet comparisons

Neural Information Processing Systems

Given only information in the form of similarity triplets "Object A is more similar to object B than to object C" about a data set, we propose two ways of defining a kernel function on the data set. While previous approaches construct a low-dimensional Euclidean embedding of the data set that reflects the given similarity triplets, we aim at defining kernel functions that correspond to high-dimensional embeddings. These kernel functions can subsequently be used to apply any kernel method to the data set.
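One simple way to turn triplet answers into a kernel (a toy illustration of the idea, not necessarily the paper's exact construction) is to represent each object by its vector of ±1 answers over a fixed sample of (B, C) pairs and take inner products of these feature vectors. The hidden coordinates below are only used to simulate the triplet oracle; in practice only the answers would be observed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n objects with hidden coordinates. Only the
# triplet answers, not the coordinates, are used to build the kernel.
n, d = 20, 2
X = rng.normal(size=(n, d))

def triplet_answer(a, b, c):
    """+1 if object a is more similar to b than to c, else -1."""
    return 1.0 if np.linalg.norm(X[a] - X[b]) < np.linalg.norm(X[a] - X[c]) else -1.0

# Feature map: each object's answers to a fixed sample of (b, c) pairs.
pairs = [tuple(rng.choice(n, size=2, replace=False)) for _ in range(200)]
Phi = np.array([[triplet_answer(a, b, c) for (b, c) in pairs] for a in range(n)])

# Kernel matrix: inner products of the triplet-answer feature vectors,
# normalised by the number of sampled pairs. It is symmetric PSD by
# construction, so any kernel method can consume it directly.
K = Phi @ Phi.T / len(pairs)
print(K.shape)
```

Because every feature is ±1, the diagonal of K is exactly 1, and K is positive semidefinite as a Gram matrix of explicit feature vectors.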


Inverse Filtering for Hidden Markov Models

Neural Information Processing Systems

This paper considers a number of related inverse filtering problems for hidden Markov models (HMMs). In particular, given a sequence of state posteriors and the system dynamics, we: (i) estimate the corresponding sequence of observations; (ii) estimate the observation likelihoods; and (iii) jointly estimate the observation likelihoods and the observation sequence. We show how to avoid a computationally expensive mixed integer linear program (MILP) by exploiting the algebraic structure of the HMM filter using simple linear algebra operations, and provide conditions for when the quantities can be uniquely reconstructed. We also propose a solution to the more general case where the posteriors are noisily observed. Finally, the proposed inverse filtering algorithms are evaluated on real-world polysomnographic data used for automatic sleep segmentation.
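The algebraic structure being exploited can be seen in the HMM filter recursion itself: since the posterior update is pi_t ∝ b_t ⊙ (Pᵀ pi_{t-1}), consecutive noise-free posteriors determine each likelihood vector b_t up to a positive scale by elementwise division. The sketch below demonstrates only this noise-free identity, not the paper's full algorithms for the noisy or joint cases.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-state HMM: row-stochastic transition matrix P and a
# per-time observation likelihood vector b_t (one entry per state).
P = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

def forward_step(pi_prev, b):
    """One step of the HMM filter: pi_t ∝ b ⊙ (P^T pi_{t-1})."""
    unnorm = b * (P.T @ pi_prev)
    return unnorm / unnorm.sum()

# Simulate a posterior sequence from known likelihood vectors.
T = 5
true_b = rng.uniform(0.1, 1.0, size=(T, 3))
pi = [np.full(3, 1.0 / 3.0)]
for t in range(T):
    pi.append(forward_step(pi[-1], true_b[t]))

# Inverse filtering: elementwise division recovers each likelihood
# vector up to a positive scale factor.
for t in range(T):
    est = pi[t + 1] / (P.T @ pi[t])
    est /= est.sum()
    ref = true_b[t] / true_b[t].sum()
    assert np.allclose(est, ref)
print("recovered all likelihood vectors up to scale")
```

The scale ambiguity is inherent: multiplying b_t by any positive constant leaves the normalised posterior unchanged, which is why the reconstruction conditions in the paper are stated up to such factors.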


If Ted Talks are getting shorter, what does that say about our attention spans?

The Guardian

Age: Ted started in 1984.
And has Ted been talking ever since? I know, and they do the inspirational online talks.
Correct, under the slogan "Ideas change everything". She was talking at the Hay festival, in Wales.


Drone war, ground offensive continue despite new Russia-Ukraine peace push

Al Jazeera

Russia and Ukraine have launched a wave of drone attacks against each other overnight, even as Moscow claimed it was finalising a peace proposal to end the war. Ukrainian air force officials said on Tuesday that Russia deployed 60 drones across multiple regions through the night, injuring 10 people. Kyiv's air defences intercepted 43 of them – 35 were shot down while eight were diverted using electronic warfare systems. In Dnipropetrovsk, central Ukraine, Governor Serhiy Lysak reported damage to residential properties and an agricultural site after Russian drones led to fires during the night. In Kherson, a southern city frequently hit by Russian strikes, a drone attack on Tuesday morning wounded a 59-year-old man and six municipal workers, officials said.



Do Less, Get More: Streaming Submodular Maximization with Subsampling

Neural Information Processing Systems

In this paper, we develop the first one-pass streaming algorithm for submodular maximization that does not evaluate the entire stream even once. By carefully subsampling each element of the data stream, our algorithm enjoys the tightest approximation guarantees in various settings while having the smallest memory footprint and requiring the lowest number of function evaluations. More specifically, for a monotone submodular function and a p-matchoid constraint, our randomized algorithm achieves a 4p approximation ratio (in expectation) with O(k) memory and O(km/p) queries per element (k is the size of the largest feasible solution and m is the number of matroids used to define the constraint).
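The core trick, that subsampling lets most stream elements pass without any function evaluation, can be sketched on a toy coverage function under a plain cardinality constraint. This is an illustration of the subsampling idea only; the paper's algorithm additionally uses exchange rules to handle general p-matchoid constraints and to obtain its guarantees.

```python
import random

random.seed(0)

# Toy monotone submodular function: coverage over ground elements.
universe_sets = {
    'a': {1, 2, 3}, 'b': {3, 4}, 'c': {4, 5, 6},
    'd': {1, 6}, 'e': {7}, 'f': {2, 7, 8},
}

def coverage(S):
    covered = set()
    for s in S:
        covered |= universe_sets[s]
    return len(covered)

def streaming_subsampled(stream, k, q):
    """One-pass streaming maximization sketch under a cardinality
    constraint: each arriving element is examined only with probability
    q, so most elements trigger zero function evaluations."""
    S = []
    for e in stream:
        if random.random() > q:
            continue  # subsampled away: no queries for this element
        gain = coverage(S + [e]) - coverage(S)
        if gain > 0 and len(S) < k:
            S.append(e)
    return S

S = streaming_subsampled(list(universe_sets), k=3, q=0.7)
print(S, coverage(S))
```

Memory stays at O(k) because only the current solution is retained, and the expected number of evaluated elements shrinks linearly in q.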


Optimization for Approximate Submodularity

Neural Information Processing Systems

We consider the problem of maximizing a submodular function when given access to its approximate version. Submodular functions are heavily studied in a wide variety of disciplines since they are used to model many real-world phenomena and are amenable to optimization. There are many cases, however, in which the phenomena we observe are only approximately submodular and the optimization guarantees cease to hold. In this paper we describe a technique that yields strong guarantees for maximization of monotone submodular functions from approximate surrogates under cardinality and intersection of matroid constraints. In particular, we show tight guarantees for maximization under a cardinality constraint and 1/(1 + P) approximation under intersection of P matroids.
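The failure mode motivating the paper can be set up in a few lines: run plain greedy on a multiplicatively noisy surrogate of a coverage function. This is a sketch of the problem setting only; the paper's contribution is the technique that restores guarantees, not the naive greedy shown here.

```python
import random

random.seed(42)

# Toy monotone submodular function: coverage.
sets = {'a': {1, 2}, 'b': {2, 3, 4}, 'c': {5}, 'd': {4, 5, 6}, 'e': {1, 6}}

def f(S):
    return len(set().union(*(sets[s] for s in S))) if S else 0

# Approximate surrogate: a consistent multiplicative perturbation per
# subset, in [1 - eps, 1 + eps]. Such an f_tilde is generally NOT
# submodular even though f is.
eps = 0.1
noise = {frozenset(): 1.0}

def f_tilde(S):
    key = frozenset(S)
    if key not in noise:
        noise[key] = random.uniform(1 - eps, 1 + eps)
    return noise[key] * f(S)

def greedy(k):
    """Plain greedy on the surrogate: noisy marginals can misrank
    elements, so the clean (1 - 1/e) guarantee no longer applies."""
    S = []
    for _ in range(k):
        best = max((e for e in sets if e not in S),
                   key=lambda e: f_tilde(S + [e]) - f_tilde(S))
        S.append(best)
    return S

S = greedy(2)
print(S, f(S))
```

Even with small eps, adversarially correlated noise can make greedy's marginal-gain comparisons arbitrarily misleading, which is why approximate submodularity needs a dedicated technique rather than greedy as-is.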


Delivery robot autonomously lifts, transports heavy cargo

FOX News

Tech expert Kurt Knutsson discusses LEVA, the autonomous robot that walks, rolls and lifts 187 pounds of cargo for all-terrain deliveries. Autonomous delivery robots are already starting to change the way goods move around cities and warehouses, but most still need humans to load and unload their cargo. That's where LEVA comes in. Developed by engineers and designers from ETH Zurich and other Swiss universities, LEVA is a robot that can not only navigate tricky environments but also lift and carry heavy boxes all on its own, making deliveries smoother and more efficient.


Adversarial Scene Editing: Automatic Object Removal from Weak Supervision

Neural Information Processing Systems

While great progress has been made recently in automatic image manipulation, it has been limited to object centric images like faces or structured scene datasets. In this work, we take a step towards general scene-level image editing by developing an automatic interaction-free object removal model. Our model learns to find and remove objects from general scene images using image-level labels and unpaired data in a generative adversarial network (GAN) framework. We achieve this with two key contributions: a two-stage editor architecture consisting of a mask generator and image in-painter that co-operate to remove objects, and a novel GAN based prior for the mask generator that allows us to flexibly incorporate knowledge about object shapes. We experimentally show on two datasets that our method effectively removes a wide variety of objects using weak supervision only.
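The compositing step shared by this style of two-stage editor can be shown with toy stand-ins: a mask marking the object region and an in-painter proposing replacement pixels, combined so that pixels outside the mask are untouched. Both functions below are fixed toy placeholders, not the paper's trained networks.

```python
import numpy as np

H, W = 8, 8
image = np.arange(H * W, dtype=float).reshape(H, W)

def mask_generator(img):
    """Toy stand-in for the learned mask generator."""
    m = np.zeros_like(img)
    m[2:5, 2:5] = 1.0          # pretend the "object" occupies this patch
    return m

def inpainter(img, m):
    """Toy stand-in for the learned in-painter: fill the hole with the
    mean of the unmasked background."""
    fill = img[m == 0].mean()
    return np.full_like(img, fill)

# Compositing: keep original pixels outside the mask, in-painted
# pixels inside it.
m = mask_generator(image)
edited = (1 - m) * image + m * inpainter(image, m)
print(edited.shape)
```

The point of the composite is that gradients from the adversarial loss only need to improve the masked region, while the identity path guarantees the rest of the scene is preserved exactly.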


FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network

Neural Information Processing Systems

This paper develops the FastRNN and FastGRNN algorithms to address the twin RNN limitations of inaccurate training and inefficient prediction. Previous approaches have improved accuracy at the expense of prediction costs making them infeasible for resource-constrained and real-time applications. Unitary RNNs have increased accuracy somewhat by restricting the range of the state transition matrix's singular values but have also increased the model size as they require a larger number of hidden units to make up for the loss in expressive power. Gated RNNs have obtained state-of-the-art accuracies by adding extra parameters thereby resulting in even larger models. FastRNN addresses these limitations by adding a residual connection that does not constrain the range of the singular values explicitly and has only two extra scalar parameters. FastGRNN then extends the residual connection to a gate by reusing the RNN matrices to match state-of-the-art gated RNN accuracies but with a 2-4x smaller model. Enforcing FastGRNN's matrices to be low-rank, sparse and quantized resulted in accurate models that could be up to 35x smaller than leading gated and unitary RNNs. This allowed FastGRNN to accurately recognize the "Hey Cortana" wakeword with a 1 KB model and to be deployed on severely resource-constrained IoT microcontrollers too tiny to store other RNN models.
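The two updates described above can be sketched directly: FastRNN adds a residual connection with just two extra scalars, and FastGRNN turns that connection into a gate that reuses the same W and U matrices. This is an untrained NumPy sketch of the recurrences as the abstract describes them, with illustrative parameter values, not the paper's trained models or its low-rank/sparse/quantized compression.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d_in, d_h = 4, 8
W = rng.normal(scale=0.1, size=(d_h, d_in))
U = rng.normal(scale=0.1, size=(d_h, d_h))
b_z, b_h = np.zeros(d_h), np.zeros(d_h)

# FastRNN: residual connection with only two extra scalar parameters.
alpha, beta = 0.1, 0.9  # illustrative values; trained in practice
def fastrnn_step(h, x):
    return alpha * np.tanh(W @ x + U @ h + b_h) + beta * h

# FastGRNN: the residual connection extended to a gate. The gate z and
# the candidate h_tilde reuse the SAME W and U, which is what keeps the
# model 2-4x smaller than standard gated RNNs.
zeta, nu = 1.0, 0.0  # illustrative values; trained scalars in the paper
def fastgrnn_step(h, x):
    pre = W @ x + U @ h
    z = sigmoid(pre + b_z)
    h_tilde = np.tanh(pre + b_h)
    return (zeta * (1.0 - z) + nu) * h_tilde + z * h

h_r = np.zeros(d_h)
h_g = np.zeros(d_h)
for t in range(5):
    x = rng.normal(size=d_in)
    h_r = fastrnn_step(h_r, x)
    h_g = fastgrnn_step(h_g, x)
print(h_r.shape, h_g.shape)
```

With zeta = 1 and nu = 0 the FastGRNN state is a convex combination of a tanh candidate and the previous state, so every coordinate stays in [-1, 1], which hints at the stability the paper emphasises.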