It all started with an odd pile of shells: a pile that, upon closer inspection, fell apart like a flower losing its petals, introducing a burned-out nature documentarian named Craig Foster--and, in time, the world--to the octopus hiding cleverly inside. Known simply as "her," she would become the star of My Octopus Teacher, the Oscar-nominated Netflix documentary and surprise pandemic hit that told the story of Foster's unlikely relationship with that eight-armed mollusk.

Released in September 2020, it arrived at the perfect moment. Audiences exhausted by lockdowns and unrelenting 2020-ness were primed for escape into the undersea fantasia of South Africa's kelp forests, where Foster met her. Best-selling books like The Soul of an Octopus and Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness had whetted public curiosity about these uncannily intelligent creatures, with whom humans last shared a common ancestor 600 million years ago.

Yet while most writing about octopuses emphasizes their ostensibly alien, unknowable nature, and serious, science-minded nature documentaries elevate concern about biodiversity over sentiment for a single animal, My Octopus Teacher defied convention. It embraced Foster's feelings for the octopus, which over the course of a year evolved from curiosity to care--even to love. And though her own feelings were left for viewers to interpret, the film's indelible impression was of nature populated by species who are not only beautiful and exquisitely evolved and ecologically important, but highly sentient, too.

Nautilus talked to Foster about his octopus teacher and how getting to know her changed the way he thinks about nature.

I write a lot about nature and biology and ecology, but in the last few years I've focused on the minds of animals and how we think about them.
This thesis is a proof of concept for the potential of the Variational Auto-Encoder (VAE) for representation learning on real-world Knowledge Graphs (KGs). Inspired by successful approaches to the generation of molecular graphs, we evaluate the capabilities of our model, the Relational Graph Variational Auto-Encoder (RGVAE). The impact of the modular hyperparameter choices (encoding through graph convolutions, graph matching, and the latent space prior) is compared. The RGVAE is first evaluated on link prediction. The mean reciprocal rank (MRR) scores on the two datasets FB15K-237 and WN18RR are compared to those of the embedding-based model DistMult. A variational DistMult and an RGVAE without a latent space prior constraint are implemented as control models. The results show that, across the different settings, the RGVAE with relaxed latent space scores highest on both datasets, yet does not outperform DistMult. Further, we investigate the latent space in a twofold experiment: first, linear interpolation between the latent representations of two triples, then the exploration of each latent dimension over a $95\%$ confidence interval. Both interpolations show that the RGVAE learns to reconstruct the adjacency matrix but fails to disentangle. For the last experiment we introduce a new validation method for the FB15K-237 dataset. The relation type-constraints of generated triples are filtered and matched with entity types. The observed rate of valid generated triples is insignificantly higher than the random threshold. All generated valid triples are unseen. A comparison between different latent space priors, using the $\delta$-VAE method, reveals a decoder collapse. Finally, we analyze the limiting factors of our approach compared to molecule generation and propose solutions for the decoder collapse and for successful representation learning of multi-relational KGs.
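The mean reciprocal rank used for link-prediction evaluation above is a simple average over the inverse ranks of the true entity among all candidates. A minimal sketch (the ranks here are hypothetical, not from the thesis):

```python
def mean_reciprocal_rank(ranks):
    """MRR over a list of 1-based ranks of the correct entity per test triple."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Hypothetical example: ranks of the correct tail entity for four test triples.
ranks = [1, 3, 2, 10]
mrr = mean_reciprocal_rank(ranks)  # (1 + 1/3 + 1/2 + 1/10) / 4
```

A rank of 1 contributes fully, while a deeply buried correct entity contributes almost nothing, which is why MRR is dominated by a model's top-ranked predictions.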
We propose a Distributional Approach to address Controlled Text Generation from pre-trained Language Models (LMs). This view permits us to define, in a single formal framework, "pointwise" and "distributional" constraints over the target LM -- to our knowledge, this is the first approach with such generality -- while minimizing KL divergence from the initial LM distribution. The optimal target distribution is then uniquely determined as an explicit EBM (Energy-Based Model) representation. From that optimal representation we then train the target controlled autoregressive LM through an adaptive distributional variant of Policy Gradient. We conduct a first set of experiments over pointwise constraints, showing the advantages of our approach over a set of baselines in terms of obtaining a controlled LM balancing constraint satisfaction with divergence from the initial LM (GPT-2). We then perform experiments over distributional constraints, a unique feature of our approach, demonstrating its potential as a remedy to the problem of Bias in Language Models. Through an ablation study we show the effectiveness of our adaptive technique for obtaining faster convergence.
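The EBM form mentioned above can be illustrated on a toy discrete distribution: the KL-minimizing target under a moment constraint E_p[phi] = mu reweights the initial distribution as p(x) ∝ a(x)·exp(λ·phi(x)). This sketch (our illustration, not the paper's implementation; the four-sequence "LM" and the feature are invented) solves for λ by bisection:

```python
import math

# Toy initial "LM" over four sequences and a binary feature phi.
a = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}
phi = {"x1": 1, "x2": 0, "x3": 1, "x4": 0}
mu = 0.8  # desired distributional constraint: E_p[phi] = 0.8

def target(lmbda):
    """EBM p(x) ∝ a(x) * exp(lambda * phi(x)), normalized."""
    w = {x: a[x] * math.exp(lmbda * phi[x]) for x in a}
    z = sum(w.values())
    return {x: w[x] / z for x in w}

def moment(lmbda):
    p = target(lmbda)
    return sum(p[x] * phi[x] for x in p)

# moment(lambda) is monotonically increasing, so bisection finds lambda.
lo, hi = -20.0, 20.0
for _ in range(100):
    mid = (lo + hi) / 2
    if moment(mid) < mu:
        lo = mid
    else:
        hi = mid
p_star = target((lo + hi) / 2)
```

A pointwise constraint is the limiting case where λ → ∞, i.e. sequences violating the feature receive zero mass.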
Mr. Smith Goes to Washington (1939) TCM Tue. 7 p.m.
Mean Streets (1973) Cinemax Sun. 6 a.m.
Batman Begins (2005) AMC Sun.
Throw Momma From the Train (1987) EPIX Sun.
Die Hard (1988) IFC Sun.
I Know What You Did Last Summer (1997) Starz Tue.
Gone in 60 Seconds (2000) CMT Wed. 8 p.m., Thur.
Total Recall (1990) Encore Thur. 2 a.m.
A Fish Called Wanda (1988) Encore Thur. 2 p.m., 9 p.m.
The World Is Not Enough (1999) EPIX Sat. 4 p.m.
Look Who's Talking (1989) OVA Sun.
Die Hard With a Vengeance (1995) IFC Thur.

Oil-platform workers, including an estranged couple, and a Navy SEAL make a startling deep-sea discovery. A clueless politician falls in love with a waitress whose erratic behavior is caused by a nail stuck in her head. After glimpsing his future, an ambitious politician battles the agents of Fate itself to be with the woman he loves. To help a friend, a suburban baby sitter drives into downtown Chicago with her two charges and a neighbor. Two teenage baby sitters and a group of children spend a wild night ...
We present a probabilistic framework for studying adversarial attacks on discrete data. Based on this framework, we derive a perturbation-based method, Greedy Attack, and a scalable learning-based method, Gumbel Attack, that illustrate various tradeoffs in the design of attacks. We demonstrate the effectiveness of these methods using both quantitative metrics and human evaluation on various state-of-the-art models for text classification, including a word-based CNN, a character-based CNN, and an LSTM. As an example of our results, we show that the accuracy of character-based convolutional networks drops to the level of random selection when only five characters are modified through Greedy Attack.
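The greedy strategy described above can be sketched as follows. This is our own minimal illustration, not the paper's code: the "classifier" is a hypothetical scoring function, and the attack simply keeps, at each step, the single-character substitution that lowers the score the most.

```python
def toy_score(text):
    # Hypothetical classifier: confidence that the text is "positive",
    # driven here by occurrences of the substring "good".
    return text.count("good") / max(len(text) // 4, 1)

def greedy_attack(text, budget, alphabet="abcdefghijklmnopqrstuvwxyz "):
    """Modify up to `budget` characters, greedily minimizing the score."""
    for _ in range(budget):
        best = (toy_score(text), text)
        for i in range(len(text)):
            for c in alphabet:
                if c == text[i]:
                    continue
                cand = text[:i] + c + text[i + 1:]
                s = toy_score(cand)
                if s < best[0]:
                    best = (s, cand)
        text = best[1]
    return text

adv = greedy_attack("good good good", budget=2)
```

Each of the two greedy steps breaks one occurrence of "good", dropping the toy score from 1.0 toward 1/3 while leaving the text length unchanged, which mirrors how a handful of character edits can collapse a character-level model's confidence.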
The Luftwaffe would use the doodlebugs (V-1 flying bombs) to attack London, where actress Helen Mirren's parents lived during the war. "They were by far the worst because [as Mirren's mother told her] you would hear them coming over and if you heard the drone -- the buzz up there in the sky -- if you heard that noise stop, that's when it was dropping its load," said Mirren during a phone interview with The Huffington Post. "So you would just pray it went over your head." Mirren recalled this memory from her family's history as she was promoting her movie "Eye in the Sky," in which she plays a British colonel tasked with deciding whether to use a drone strike in Nairobi, Kenya. Her character has tracked the location of an extremist meeting, but must choose whether to take out the terrorists at the cost of killing a young girl who is selling bread right outside their headquarters.
The Lost World: Jurassic Park (1997) AMC Sun.
Tomorrow Never Dies (1997) EPIX Wed. 10 p.m., Thur.
The X-Files: Fight the Future (1998) IFC Thur.
Hard to Kill (1990) Sundance Mon. 8 p.m., Tue.

A scientist gives his bodyguard superhuman powers in order to fight racists. A lawyer unwittingly becomes friends with an unstable woman who has a criminal history. A successful businesswoman puts her family, career and life on the line to satisfy her addiction to sex. With his father trapped in the wreckage of their spacecraft, a youth treks across Earth's now-hostile terrain to recover their rescue beacon and signal for help. In the future a cutting-edge android in the form of a boy embarks on a journey to discover his true nature. An 11-year-old boy experiences the worst day of his young life but soon learns that he's not alone when other members of his family encounter their own calamities. A struggling writer falls in love with a stenographer while trying to finish his new novel in 30 days.