Artificial Intelligence In Badi App Spells Trouble For Consumers

#artificialintelligence

The newest roommate app, Badi, has made its official debut in the New York market. The artificial intelligence app took Europe by storm after launching in September 2015, in England, Germany, and Spain to be exact. The company has changed the lives of 2 million users by providing about 300,000 room listings. Badi uses an algorithm based on information such as age, gender, interests, and lifestyle preferences to help users find rooms on the platform.


AI in Action E59: Marco Braun, Transformation Lead at Akelius

#artificialintelligence

Welcome to episode 59, the penultimate AI in Action podcast of 2019, the show where we break down the hype and explore the impact that Data Science, Machine Learning and Artificial Intelligence are making on our everyday lives. Powered by Alldus International, our goal is to share with you the insights of technologists and data science enthusiasts and to showcase the excellent work being done within AI in the United States and Europe. Today's guest is Marco Braun, Transformation Lead at Akelius. Akelius provides Better Living by acquiring, upgrading and managing residential properties, bringing residential units up to a quality level comparable with that of newly constructed apartments.


Prediction of Construction Cost for Field Canals Improvement Projects in Egypt

arXiv.org Artificial Intelligence

Field canal improvement projects (FCIPs) are among the ambitious projects constructed to save fresh water. To finance such projects, conceptual cost models are important for accurately predicting preliminary costs at the early stages of a project. The first step is to develop a conceptual cost model that identifies the key cost drivers affecting the project. Input variable selection therefore remains an important part of model development, as poor variable selection can decrease model precision. The study identified the most important cost drivers of FCIPs using both a qualitative and a quantitative approach. Subsequently, it developed a parametric cost model based on machine learning methods such as regression, artificial neural networks, a fuzzy model and case-based reasoning.
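As a rough illustration of what a parametric conceptual cost model of this kind looks like, the sketch below fits a plain regression on a few hypothetical FCIP cost drivers; the feature names and the synthetic cost values are placeholders, not the drivers or data identified in the paper.

```python
# Minimal sketch of a parametric conceptual cost model.
# Features and target values are synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n = 200
# Hypothetical drivers: canal length (km), served area, number of hydraulic structures.
X = np.column_stack([
    rng.uniform(0.5, 5.0, n),      # canal length
    rng.uniform(20, 300, n),       # served area
    rng.integers(1, 15, n),        # hydraulic structures
])
# Synthetic "construction cost" so the example runs end to end.
y = 40_000 * X[:, 0] + 500 * X[:, 1] + 8_000 * X[:, 2] + rng.normal(0, 10_000, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("MAPE:", mean_absolute_percentage_error(y_test, model.predict(X_test)))
```

A neural network, fuzzy model or case-based reasoner could be swapped in for the regressor without changing the surrounding workflow.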


SONYC

Communications of the ACM

Over an 11-month period (May 2016 to April 2017), 51% of all noise complaints in the focus area were related to after-hours construction activity (6 P.M.–7 A.M.), three times the amount in the next category. Note that combining all construction-related complaints accounts for 70% of this sample, highlighting how disruptive this particular category of noise can be to the lives of ordinary citizens. Figure 4c includes SPL values (blue line) at a five-minute resolution for the after-hours period during or immediately preceding a subset of the complaints. Dotted green lines correspond to background levels, computed as the moving average of SPL measurements within a two-hour window. Dotted black lines correspond to SPL values 10 dB above the background, the threshold defined by the city's noise code to indicate potential violations.
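A minimal sketch of that background-plus-threshold computation, assuming SPL readings at five-minute resolution in a NumPy array: the 24-sample window corresponds to the two-hour background described above, and the data here is synthetic.

```python
# Sketch: moving-average background and 10 dB exceedance flag for SPL readings.
import numpy as np

def flag_exceedances(spl_db, window=24, margin_db=10.0):
    """spl_db: SPL readings at 5-minute resolution; window=24 samples ~= 2 hours."""
    kernel = np.ones(window) / window
    background = np.convolve(spl_db, kernel, mode="same")  # moving-average background
    return spl_db > background + margin_db                 # potential violations

spl = 60 + 5 * np.random.randn(288)   # one synthetic day of 5-minute readings
spl[150:156] += 20                    # inject a loud after-hours event
print(np.where(flag_exceedances(spl))[0])
```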


Location-Centered House Price Prediction: A Multi-Task Learning Approach

arXiv.org Machine Learning

Accurate house price prediction is of great significance to various real estate stakeholders such as house owners, buyers, investors, and agents. We propose a location-centered prediction framework that differs from existing work in terms of data profiling and prediction model. Regarding data profiling, we define and capture a fine-grained location profile powered by a diverse range of location data sources, such as a transportation profile (e.g., distance to the nearest train station), an education profile (e.g., school zones and rankings), a suburb profile based on census data, and a facility profile (e.g., nearby hospitals and supermarkets). Regarding the choice of prediction model, we observe that existing approaches either consider the entire house data for modeling, or split the data and model each partition independently. However, modeling each partition independently ignores the relatedness between partitions, and there may not be sufficient training samples per partition for all prediction scenarios. We address this problem by conducting a careful study of the Multi-Task Learning (MTL) model. Specifically, we map the strategies for splitting the house data to the way tasks are defined in MTL, aligning each partition with a task. Furthermore, we select MTL-based methods with different regularization terms to capture and exploit the relatedness between tasks. Based on real-world house transaction data collected in Melbourne, Australia, we design extensive experimental evaluations, and the results indicate a significant superiority of MTL-based methods over state-of-the-art approaches. Meanwhile, we conduct an in-depth analysis of the impact of task definitions and method selections in MTL on prediction performance, and demonstrate that the impact of task definitions far exceeds that of method selections.
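A minimal sketch of the partition-as-task idea, assuming a simple joint ridge formulation in which each task (partition) gets its own weight vector and an extra penalty pulls the task weights toward their mean. This particular regularizer is only one illustrative way to couple tasks, not necessarily the one used in the paper, and the data is synthetic.

```python
# Sketch: multi-task ridge regression where each house-data partition is a task
# and a penalty couples the task weights to their mean (illustrative regularizer).
import numpy as np

def mtl_ridge(tasks, lam=1.0, mu=1.0, lr=1e-2, epochs=500):
    """tasks: list of (X, y) pairs, one per partition; returns one weight vector per task."""
    d = tasks[0][0].shape[1]
    W = np.zeros((len(tasks), d))
    for _ in range(epochs):
        w_bar = W.mean(axis=0)
        for t, (X, y) in enumerate(tasks):
            grad = X.T @ (X @ W[t] - y) / len(y)      # squared-loss gradient
            grad += lam * W[t]                         # standard ridge term
            grad += mu * (W[t] - w_bar)                # pull toward the mean task weight
            W[t] -= lr * grad
    return W

rng = np.random.default_rng(0)
true_w = rng.normal(size=5)
tasks = []
for _ in range(3):                                     # three hypothetical partitions
    X = rng.normal(size=(100, 5))
    y = X @ (true_w + 0.1 * rng.normal(size=5)) + 0.1 * rng.normal(size=100)
    tasks.append((X, y))
print(np.round(mtl_ridge(tasks), 2))
```

Different choices of the coupling term (e.g., group-sparse or graph-based penalties) correspond to the different MTL regularizers the abstract refers to.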


Data Engineer Machine Learning (m/f) at Akelius GmbH

#artificialintelligence

Akelius buys, upgrades and manages residential properties. The company owns 47,000 apartments in Sweden, Denmark, Germany, France, Canada, England and the United States. We are a rapidly growing international company with more than eight hundred employees around the world. An integral part of our company is the Technology department. The Development team consists of more than one hundred employees, mostly based in Berlin.


Modeling Evolution of Topics in Large-Scale Temporal Text Corpora

AAAI Conferences

Large temporal text collections provide insights into social and cultural change over time. To quantify changes in topics in these corpora, embedding methods have been used as a diachronic tool. However, they have limited utility for modeling changes in topics due to the stochastic nature of training. We propose a new computational approach for tracking and detecting the temporal evolution of topics in a large collection of texts. This approach for identifying dynamic topics and modeling their evolution combines the advantages of two methods: (1) word embeddings to learn contextual semantic representations of words from temporal snapshots of the data and (2) dynamic network analysis to identify dynamic topics using dynamic semantic similarity networks built from the embedding models. Experimenting with two large temporal data sets from the legal and real estate domains, we show that this approach performs faster (due to parallelizing different snapshots), uncovers more coherent topics (compared to available dynamic topic modeling approaches), and effectively enables modeling topic evolution by leveraging the network structure.
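A rough sketch of the two ingredients described above, assuming gensim and networkx: an embedding model is trained per temporal snapshot, words are connected in a semantic similarity network, and network communities serve as per-snapshot topics. The toy corpora, similarity threshold and community detection step are illustrative choices rather than the paper's exact pipeline.

```python
# Sketch: per-snapshot word embeddings + a semantic similarity network whose
# communities act as per-snapshot topics (illustrative, not the paper's pipeline).
import itertools
import networkx as nx
from gensim.models import Word2Vec
from networkx.algorithms.community import greedy_modularity_communities

snapshots = {
    2016: [["lease", "tenant", "rent"], ["contract", "clause", "lease"]],
    2017: [["tenant", "eviction", "court"], ["rent", "increase", "market"]],
}  # toy tokenized corpora, one per time snapshot

for year, sentences in snapshots.items():
    model = Word2Vec(sentences, vector_size=50, min_count=1, epochs=50, seed=1)
    vocab = list(model.wv.index_to_key)
    G = nx.Graph()
    G.add_nodes_from(vocab)
    for w1, w2 in itertools.combinations(vocab, 2):
        sim = float(model.wv.similarity(w1, w2))
        if sim > 0.0:                       # illustrative similarity threshold
            G.add_edge(w1, w2, weight=sim)
    topics = list(greedy_modularity_communities(G, weight="weight"))
    print(year, [sorted(c) for c in topics])
```

Tracking how these communities merge, split or drift across consecutive snapshots is what the dynamic network analysis step adds on top of this per-snapshot view.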


Interpreting Neural Network Judgments via Minimal, Stable, and Symbolic Corrections

arXiv.org Machine Learning

The paper describes a new algorithm to generate minimal, stable, and symbolic corrections to an input that will cause a neural network with ReLU neurons to change its output. We argue that such a correction is a useful way to provide feedback to a user when the neural network produces an output that is different from a desired output. Our algorithm generates such a correction by solving a series of linear constraint satisfaction problems. The technique is evaluated on a neural network that has been trained to predict whether an applicant will pay a mortgage.
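One way to read "a series of linear constraint satisfaction problems": once the ReLU activation pattern is fixed, the network is affine in its input, so a candidate correction can be found with a linear program. The sketch below does this for a tiny one-hidden-layer network; the weights, the L1 objective and the fixed activation pattern are illustrative assumptions, not the paper's actual algorithm.

```python
# Sketch: with a fixed ReLU activation pattern the network is affine in its input,
# so a minimal (L1) correction that flips the decision is a linear program.
import numpy as np
from scipy.optimize import linprog

W1 = np.array([[1.0, -0.5], [0.3, 0.8]])   # hidden weights (2 neurons, 2 features)
b1 = np.array([0.1, -0.2])
w2 = np.array([1.0, -1.0])                 # output weights
b2 = -0.3

x0 = np.array([0.2, 0.5])                  # "rejected" applicant (toy features)
pre = W1 @ x0 + b1
active = pre > 0                            # fixed activation pattern at x0

# Affine form of the network inside this activation region: f(x) = c_lin @ x + k
c_lin = W1.T @ (w2 * active)
k = (w2 * active) @ b1 + b2
margin = 0.01                               # require the output to become positive

d = x0.size
# Variables z = [delta, t]; minimise sum(t) with |delta_i| <= t_i.
obj = np.concatenate([np.zeros(d), np.ones(d)])
A_ub, b_ub = [], []
for i in range(d):                          # |delta_i| <= t_i as two inequalities
    row = np.zeros(2 * d); row[i], row[d + i] = 1, -1
    A_ub.append(row); b_ub.append(0.0)
    row = np.zeros(2 * d); row[i], row[d + i] = -1, -1
    A_ub.append(row); b_ub.append(0.0)
A_ub.append(np.concatenate([-c_lin, np.zeros(d)]))          # f(x0 + delta) >= margin
b_ub.append(c_lin @ x0 + k - margin)
for i in range(len(b1)):                    # keep the activation pattern fixed
    sign = -1.0 if active[i] else 1.0
    A_ub.append(np.concatenate([sign * W1[i], np.zeros(d)]))
    b_ub.append(-sign * pre[i])

res = linprog(obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * d + [(0, None)] * d)
print("correction:", res.x[:d] if res.success else None)
```

Enumerating neighbouring activation patterns and solving one such program per pattern would give a series of linear problems in the spirit the abstract describes.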


Google's plan to revolutionise cities is a takeover in all but name

The Guardian

Last June, Volume, a leading magazine on architecture and design, published an article on the GoogleUrbanism project. Conceived at a renowned design institute in Moscow, the project charts a plausible urban future based on cities acting as important sites for "data extractivism" – the conversion of data harvested from individuals into artificial intelligence technologies, allowing companies such as Alphabet, Google's parent company, to act as providers of sophisticated and comprehensive services. The cities themselves, the project insisted, would get a share of revenue from the data.


Your next home could be printed and completed in 24hrs

Daily Mail - Science & tech

It could be the end of seemingly never-ending delays and arguments with builders. A San Francisco startup has revealed a future where homes are built in just 24 hours. Apis Cor unveiled a 400-square-foot house in a town outside Moscow, Russia, that was constructed using a mobile 3D printer. The technology printed the walls, partitions and building envelope from a concrete mix; the entire project was completed in a single day and cost $10,134.