Weaviate is an open-source search engine powered by ML, vectors, graphs, and GraphQL

ZDNet

Bob van Luijt's career in technology started at age 15, building websites to help people sell toothbrushes online. Not many 15-year-olds do that. Apparently, this gave van Luijt enough of a head start to arrive at the confluence of today's technology trends. Van Luijt went on to study the arts but ended up working full time in technology anyway. In 2015, when Google introduced its RankBrain algorithm, the quality of search results jumped.


Database technology evolves to combine machine learning and data storage

#artificialintelligence

When Bob van Luijt, the CEO of SeMI Technologies, looks at the history of databases, he highlights a few distinct waves. First, there was the world of SQL, where all the data fit neatly into rectangular tables. Then came the NoSQL revolution, which brought the flexibility of the document model, where each entry didn't need to have the same fields. Now, his company is bringing Weaviate to market as part of a wave of AI-centric databases that merge machine learning with data storage. The new model offers not just the potential to tap the power of AI algorithms, but also a more flexible search engine that isn't locked into exact matches.
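
The shift from exact-match queries to similarity search is easiest to see in code. The sketch below is a minimal illustration of the underlying idea: stored items are ranked by the cosine similarity of their embedding vectors to a query vector. It assumes a tiny in-memory index with hand-written toy vectors and does not reflect Weaviate's actual API.

```python
# Minimal sketch of vector search: rank stored items by semantic similarity
# of their embeddings instead of by exact keyword matches.
# In practice the vectors would come from an embedding model; the toy values
# below are purely illustrative.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query_vec, index, k=3):
    """Return the k stored documents whose vectors are closest to the query."""
    scored = [(doc_id, cosine_similarity(query_vec, vec)) for doc_id, vec in index.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:k]

# Usage with toy vectors standing in for real embeddings:
index = {
    "doc-1": np.array([0.9, 0.1, 0.0]),
    "doc-2": np.array([0.1, 0.8, 0.3]),
    "doc-3": np.array([0.7, 0.2, 0.1]),
}
print(search(np.array([0.8, 0.15, 0.05]), index, k=2))
```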


Extending Neural Generative Conversational Model using External Knowledge Sources

arXiv.org Artificial Intelligence

The use of connectionist approaches in conversational agents has been progressing rapidly due to the availability of large corpora. However, current generative dialogue models often lack coherence and are content-poor. This work proposes an architecture that incorporates unstructured knowledge sources to enhance next-utterance prediction in chit-chat-style generative dialogue models. We focus on Sequence-to-Sequence (Seq2Seq) conversational agents trained on the Reddit News dataset, and consider incorporating external knowledge from Wikipedia summaries as well as from the NELL knowledge base. Our experiments show faster training time and improved perplexity when leveraging external knowledge.
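
As a rough illustration of how an unstructured knowledge source might be folded into a Seq2Seq next-utterance model, the PyTorch sketch below concatenates a knowledge vector (for example, an encoding of a retrieved Wikipedia summary or of NELL facts) with the dialogue encoder's final state to initialize the decoder. The fusion method, dimensions, and class names are assumptions made for the sketch, not the paper's architecture.

```python
# Illustrative sketch: a Seq2Seq model whose decoder is conditioned on an
# external-knowledge vector as well as the dialogue context.
import torch
import torch.nn as nn

class KnowledgeSeq2Seq(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, know_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Project [dialogue state ; knowledge vector] into the decoder's initial state.
        self.fuse = nn.Linear(hid_dim + know_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, context_ids, knowledge_vec, reply_ids):
        _, h = self.encoder(self.embed(context_ids))            # h: (1, B, hid)
        h = torch.tanh(self.fuse(torch.cat([h[-1], knowledge_vec], dim=-1)))
        dec_out, _ = self.decoder(self.embed(reply_ids), h.unsqueeze(0))
        return self.out(dec_out)                                # logits over vocab

# knowledge_vec would be produced by encoding a retrieved Wikipedia summary
# or relevant NELL triples; that retrieval step is not shown here.
```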


Automatic Generation of Text Descriptive Comments for Code Blocks

arXiv.org Artificial Intelligence

We propose a framework to automatically generate descriptive comments for source code blocks. While this problem has been studied by many researchers previously, their methods are mostly based on fixed templates and achieve poor results. Our framework does not rely on any template, but instead uses a new recursive neural network called Code-RNN to extract features from the source code and embed them into a single vector. When this vector representation is fed into a new recurrent neural network (Code-GRU), the overall framework generates text descriptions of the code with accuracy (ROUGE-2 score) significantly higher than other learning-based approaches such as the sequence-to-sequence model. The Code-RNN model can also be used in other scenarios where a representation of code is required.
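
The pipeline the abstract describes, a recursive pass that folds a code parse tree into a single vector followed by a GRU decoder that emits the comment, can be sketched roughly as below. The `Node` class, dimensions, and combination rule are illustrative assumptions, not the paper's exact Code-RNN/Code-GRU formulation.

```python
# Illustrative sketch (PyTorch): encode a code parse tree into one vector,
# then decode a natural-language comment from that vector with a GRU.
import torch
import torch.nn as nn

class Node:
    """A parse-tree node: a token id plus child subtrees (assumed structure)."""
    def __init__(self, token_id, children=None):
        self.token_id = token_id
        self.children = children or []

class TreeEncoder(nn.Module):
    """Recursively combines child vectors with the parent token embedding."""
    def __init__(self, vocab_size=5000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.combine = nn.Linear(2 * dim, dim)

    def forward(self, node):
        vec = self.embed(torch.tensor([node.token_id])).squeeze(0)
        for child in node.children:
            child_vec = self.forward(child)
            vec = torch.tanh(self.combine(torch.cat([vec, child_vec])))
        return vec                     # one fixed-size vector for the whole block

class CommentDecoder(nn.Module):
    """GRU decoder conditioned on the code vector via its initial hidden state."""
    def __init__(self, vocab_size=5000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, code_vec, comment_ids):
        h0 = code_vec.view(1, 1, -1)                  # (layers, batch=1, dim)
        out, _ = self.gru(self.embed(comment_ids), h0)
        return self.out(out)                          # logits over comment vocab
```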


Attention Mechanism in Neural Networks: Where it Comes and Where it Goes

#artificialintelligence

It can be said that 2015 was the golden year of attention mechanisms, because the number of attention studies grew like an avalanche after three main studies presented that year. However, compressing all the necessary information of a source sentence into a fixed-length vector is an important disadvantage of the conventional encoder-decoder approach to neural machine translation (NMT). The idea that Bahdanau et al. (2015) introduced is an extension to conventional NMT models. This extension is composed of an encoder and a decoder, as shown in Fig 1.
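
The additive attention that Bahdanau et al. (2015) bolted onto the encoder-decoder can be written out in a few lines: each encoder hidden state is scored against the previous decoder state, the scores are softmax-normalized, and the weighted sum of encoder states becomes a context vector for the next decoding step. The numpy sketch below uses illustrative dimensions and parameter names.

```python
# Small sketch of additive (Bahdanau-style) attention over encoder states.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bahdanau_attention(s_prev, H, W_a, U_a, v_a):
    """
    s_prev : (d,)    previous decoder hidden state
    H      : (T, d)  encoder hidden states, one per source token
    Returns the context vector and the attention weights.
    """
    scores = np.array([v_a @ np.tanh(W_a @ s_prev + U_a @ h) for h in H])
    alpha = softmax(scores)              # attention weights over source positions
    context = alpha @ H                  # weighted sum of encoder states
    return context, alpha

# Usage with random toy values (d = hidden size, T = source length):
d, T = 4, 5
rng = np.random.default_rng(0)
ctx, alpha = bahdanau_attention(rng.normal(size=d), rng.normal(size=(T, d)),
                                rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                                rng.normal(size=d))
print(alpha.round(3), ctx.round(3))
```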