How to Set Up Kafka - DZone Big Data

#artificialintelligence

Kafka is one of the most popular publish-subscribe messaging systems, written in Java and Scala. It was originally developed by LinkedIn and later open-sourced. Kafka is known for handling heavy loads. You can find out more about Kafka here. In this article, I am going to explain how to install Kafka on Ubuntu.


An introduction to Apache Kafka, including what Kafka is, Kafka architecture, topics, partitions, a Kafka tutorial, a Kafka producer example, and a Kafka consumer example

@machinelearnbot

Kafka is a messaging system used for big data streaming and processing. In this tutorial, we discuss the basics of getting started with Kafka: the architecture behind it and how to start publishing and consuming basic messages. At its core, Kafka safely moves data from system A to system B. It runs on a shared cluster of servers, making it a highly available and fault-tolerant platform for data streaming.
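
To give a concrete feel for that publish/consume flow, here is a minimal sketch using the standard Kafka Java client. The broker address (localhost:9092), topic name (test-topic), and consumer group id are illustrative assumptions, not values taken from the article.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class BasicPubSub {
        public static void main(String[] args) {
            // Producer: publish one message to the (hypothetical) "test-topic".
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092");
            producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("test-topic", "key", "hello from system A"));
            }

            // Consumer: subscribe to the same topic, poll once, and print whatever arrived.
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092");
            consumerProps.put("group.id", "demo-group");
            consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(Collections.singletonList("test-topic"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.key() + " -> " + record.value());
                }
            }
        }
    }

A real consumer would poll in a loop; the single poll here is only to keep the sketch short.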


Cassandra & Kafka based tweet analysis app on Application Container Cloud

#artificialintelligence

Application Container Cloud provides out-of-the-box Service Binding for Data Hub Cloud. The Kafka cluster topology used in this case is relatively simple, i.e. a single broker co-located with Zookeeper. You can opt for a topology specific to your needs. Once you're done, please check the Key and Access Tokens section for the required info -- you will use it during application deployment.
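
The single-broker topology has a practical consequence: topics cannot be replicated beyond that one broker. Below is a minimal sketch using the Kafka AdminClient to create a hypothetical tweets topic with replication factor 1; the broker endpoint shown is a placeholder, and in practice it would come from the Service Binding.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTweetsTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Placeholder endpoint; replace with the address exposed by the service binding.
            props.put("bootstrap.servers", "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // One partition, replication factor 1 -- the most a single-broker cluster allows.
                NewTopic tweets = new NewTopic("tweets", 1, (short) 1);
                admin.createTopics(Collections.singletonList(tweets)).all().get();
            }
        }
    }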


Real-World Data Stream Processing

#artificialintelligence

Using streaming technologies such as Kafka, Spark, and Cassandra to effectively gain insights from data. A tremendous stream of data is consumed and created by applications these days. This data includes application logs, event and transaction logs (errors, warnings), batch job data, IoT sensor data, social media, data from other external systems, and much more. All of this data can be piped through pipelines, or stages, that yield insights and provide tremendous benefits to the organization. As a recent article in the Economist put it, "The world's most valuable resource is no longer oil, but data".
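
As a sketch of the first two stages of such a pipeline, the snippet below reads a hypothetical app-logs topic from Kafka with Spark Structured Streaming and prints the records to the console. It assumes the spark-sql-kafka integration is on the classpath; a real deployment might replace the console sink with a write to Cassandra via the Spark Cassandra Connector.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;

    public class KafkaSparkPipeline {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("kafka-spark-pipeline")
                    .getOrCreate();

            // Read the (hypothetical) "app-logs" topic as a streaming DataFrame.
            Dataset<Row> logs = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092")
                    .option("subscribe", "app-logs")
                    .load()
                    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

            // Console sink for illustration; swap in a Cassandra sink for a real pipeline.
            StreamingQuery query = logs.writeStream()
                    .format("console")
                    .start();

            query.awaitTermination();
        }
    }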


kaiwaehner/kafka-streams-machine-learning-examples

#artificialintelligence

This project contains examples that demonstrate how to deploy analytic models to mission-critical, scalable production environments leveraging Apache Kafka and its Streams API. Examples will include analytic models built with TensorFlow, Keras, H2O, Python, DeepLearning4J, and other technologies. More sophisticated use cases around Kafka Streams and other technologies will be added over time. The code is developed and tested on Mac and Linux operating systems. As Kafka does not run well on Windows, the examples are not tested there at all.
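
The snippet below is not taken from the repository; it is only a sketch of the general pattern those examples follow with the Kafka Streams API: read an input topic, apply a pre-trained model inside mapValues, and write predictions to an output topic. The topic names and the score helper are placeholders.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class ModelScoringStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-scoring-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Score each incoming record with a pre-trained model and forward the prediction.
            KStream<String, String> input = builder.stream("input-events");
            KStream<String, String> predictions = input.mapValues(value -> score(value));
            predictions.to("predictions");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }

        // Placeholder: the real examples would call into a loaded TensorFlow, H2O,
        // or DeepLearning4J model here rather than returning a dummy label.
        private static String score(String features) {
            return "prediction-for:" + features;
        }
    }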