Architecture


Gartner: top 10 data and analytics technology trends for 2019

#artificialintelligence

The story of data and analytics is one that keeps evolving; from appointing chief data officers to procuring the latest analytics software, business leaders are desperately trying to utilise it, but it's not easy. "The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralised architectures and tools break down," says Donald Feinberg, vice president and distinguished research analyst at Gartner. "The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change."

But while business leaders have to tackle digital disruption by looking for the right services and technology to help streamline their data processes, unprecedented opportunities have also arisen. The sheer amount of data, combined with the increase in strong processing capabilities enabled by cloud technologies, means it's now possible to train and execute algorithms at the large scale necessary to finally realise the full potential of AI. According to Gartner, it's critical to gain a deeper understanding of the following top 10 technology trends fuelling that evolving story and prioritise them based on business value to stay ahead.

Gartner says that by 2020, augmented analytics will be the main selling point for analytics and BI solutions. Using machine learning and AI, augmented analytics is considered by Gartner to be a disrupter in the data and analytics market because it will transform how analytics content is developed, consumed and shared. Augmented data management uses machine learning capabilities and AI technology to make data management categories, including data quality, master data management, metadata management, data integration and database management systems (DBMSs), self-configuring and self-tuning. According to Gartner, this is a big deal because it automates many manual tasks, opening up opportunities for less technically skilled users to work with data, and freeing highly skilled technical resources to focus on more value-adding tasks. Through to the end of 2022, manual tasks in data management will be cut by 45% thanks to ML and automated service-level management.

Continuous intelligence is more than a new way to say real-time analytics. Rather, it is a design pattern in which real-time analytics are combined with business operations, processing current and historical data to prescribe actions in response to events. "Continuous intelligence represents a significant change in the job of the data and analytics team," says Rita Sallam, research vice president at Gartner. "It's a grand challenge -- and a grand opportunity -- for analytics and BI (business intelligence) teams to help businesses make smarter real-time decisions in 2019."
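As a rough sketch of the continuous-intelligence pattern described above, combining a real-time event stream with historical data to prescribe an action, the Python below is illustrative only; the event fields, baselines, thresholds and the prescribe_action helper are hypothetical and not part of Gartner's material.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Event:
        """A single real-time observation, e.g. a sensor or transaction reading."""
        source: str
        value: float

    # Hypothetical historical baselines per source (in practice, loaded from a data store).
    HISTORICAL = {
        "payment-gateway": [102.0, 98.5, 101.3, 99.8],
        "checkout-latency": [220.0, 240.0, 210.0, 235.0],
    }

    def prescribe_action(event: Event, threshold: float = 1.5) -> str:
        """Compare a live event against its historical baseline and prescribe an action."""
        history = HISTORICAL.get(event.source)
        if not history:
            return "collect-more-data"
        baseline = mean(history)
        # Prescriptive step: act when the live value deviates strongly from history.
        if event.value > baseline * threshold:
            return "alert-and-throttle"
        return "no-action"

    live = Event(source="checkout-latency", value=410.0)
    print(prescribe_action(live))  # -> "alert-and-throttle"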


RFID tags help robots locate and grab moving objects in milliseconds

ZDNet

A group of researchers at the Massachusetts Institute of Technology (MIT) has developed a system that uses radio-frequency identification (RFID) tags to locate moving, tagged objects within milliseconds. The system, called TurboTrack, could improve the efficiency of robots working in manufacturing, as well as of drones carrying out search-and-rescue missions, with the system able to locate objects within 7.5 milliseconds on average and with errors of less than 1 centimetre. TurboTrack uses a reader to send wireless signals to RFID tags that can be applied to any object; the tags reflect the signals back to the reader. The system uses a "space-time super-resolution" algorithm, MIT says, which sifts through the reflected signals to locate the RFID tag's response. "As the tag moves, its signal angle slightly alters -- a change that also corresponds to a certain location ... by constantly comparing that changing distance measurement to all other distance measurements from other signals, it can find the tag in a three-dimensional space. This all happens in a fraction of a second," MIT said.
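MIT's actual method is a space-time super-resolution algorithm applied to the raw RF reflections, but the core idea the summary describes, comparing a changing distance measurement against distance measurements from other signals to pin down a position in 3D, can be illustrated with a hedged Python sketch of least-squares multilateration; the antenna positions and distances below are invented for the example and are not TurboTrack's.

    import numpy as np

    # Hypothetical antenna positions (metres) and measured tag-to-antenna distances.
    antennas = np.array([
        [0.0, 0.0, 0.0],
        [2.0, 0.0, 0.0],
        [0.0, 2.0, 0.0],
        [0.0, 0.0, 2.0],
    ])
    distances = np.array([1.732, 1.732, 1.732, 1.732])  # e.g. a tag near (1, 1, 1)

    def locate_tag(antennas: np.ndarray, distances: np.ndarray) -> np.ndarray:
        """Linearised least-squares multilateration: subtract the first range
        equation from the others to remove the quadratic term, then solve Ax = b."""
        p0, d0 = antennas[0], distances[0]
        A = 2.0 * (antennas[1:] - p0)
        b = (d0**2 - distances[1:]**2
             + np.sum(antennas[1:]**2, axis=1) - np.sum(p0**2))
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position

    print(locate_tag(antennas, distances))  # approximately [1.0, 1.0, 1.0]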


Exposed Chinese database shows depth of surveillance state

The Japan Times

BEIJING - The Chinese database Victor Gevers found online was not just a collection of old personal details. It was a compilation of real-time data on more than 2.5 million people in western China, updated constantly with GPS coordinates of their precise whereabouts. Alongside their names, birth dates and places of employment, there were notes on the places that they had most recently visited -- mosque, hotel, restaurant. The discovery by Gevers, a Dutch cybersecurity researcher who revealed it on Twitter last week, has given a rare glimpse into China's extensive surveillance of Xinjiang, a remote region home to an ethnic minority population that is largely Muslim. The area has been blanketed with police checkpoints and security cameras that apparently are doing more than just recording what happens.


Security Architecture for Smart Factories

#artificialintelligence

Building smart factories is a substantial endeavor for organizations. The initial steps involve understanding what makes them unique and what new advantages they offer. However, a realistic view of smart factories also involves acknowledging the risks and threats that may arise in their converged virtual and physical environments. As with many systems that integrate with the industrial internet of things (IIoT), the convergence of information technology (IT) and operational technology (OT) in smart factories allows for capabilities such as real-time monitoring, interoperability, and virtualization. But this also means an expanded attack surface.


Here's what it takes to make IoT data ready for AI and machine learning

#artificialintelligence

The integration of artificial intelligence and the Internet of Things introduces a wide array of connected health tools that produce a vast amount of data that must be synthesized, analyzed, stored and communicated by a robust information infrastructure. But if hospitals don't structure and store IoT patient data properly, that information could be rendered inaccessible to AI tools. For starters, significant infrastructure is needed to streamline IoT-generated data and make sure it is simple to access and manage with AI. "AI adoption and scale will be accelerated by the relatively low cost of deployment," said Rick Krohn, president of HealthSense, a connected health consulting firm. "A terabyte of storage costs less than $100, and wearable sensors and cloud infrastructure are becoming increasingly affordable. But AI requires sophisticated applications that deliver contextually aware right-place-right-time clinical decision support."
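As a rough, hypothetical sketch of what "structuring IoT patient data properly" can mean in practice, the Python below maps a made-up wearable payload onto a single consistent schema that downstream analytics or AI tooling could consume; the field names, the normalise helper and the example reading are all assumptions, not part of the article.

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class Observation:
        """A normalised record an analytics or ML pipeline can consume directly."""
        patient_id: str
        device_type: str
        metric: str
        value: float
        unit: str
        recorded_at: str  # ISO 8601, UTC

    def normalise(raw: dict) -> Observation:
        """Map a hypothetical wearable payload onto the common schema.
        Real deployments would validate units, handle missing fields, and
        use a standard such as HL7 FHIR rather than this ad hoc mapping."""
        return Observation(
            patient_id=str(raw["pid"]),
            device_type=raw.get("dev", "unknown"),
            metric=raw["type"],
            value=float(raw["val"]),
            unit=raw.get("unit", ""),
            recorded_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        )

    raw_reading = {"pid": "p-102", "dev": "wrist-sensor", "type": "heart_rate",
                   "val": 72, "unit": "bpm", "ts": 1550000000}
    print(json.dumps(asdict(normalise(raw_reading)), indent=2))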


How AI is personalising marketing KDR Recruitment

#artificialintelligence

There is no question that artificial intelligence, machine learning and automation are changing the marketing function in many businesses. For many consumer brands these technologies are now the most effective way of communicating with customers. As AI continues to develop, the ability to personalise every piece of marketing material is becoming a reality. How can artificial intelligence and machine learning allow for better personalisation, and ultimately improve the customer experience? For many brands, apps are now the way consumers shop and spend, and for many push notifications are a great way of communicating and connecting with the customer directly.


Next-generation Armv8.1-M architecture: Delivering enhanced machine learning and signal processing for the smallest embedded devices

#artificialintelligence

The drive towards a world of a trillion connected devices is accelerating and will continue to do so, but only if we can find ways to efficiently expand the compute capabilities on a greater number of constrained devices at the far edge of the network. Increasing the compute capabilities in these devices will immediately open the door for developers to write machine learning (ML) applications directly for the device for decision-making at the source, thus enhancing data security while cutting down on network energy consumption, latency and bandwidth usage. To achieve this, we're introducing Arm Helium technology, the M-Profile Vector Extension (MVE) for Arm Cortex-M series processors, which will enhance the compute performance of the Armv8.1-M architecture. Helium will deliver up to 15x more ML performance and up to a 5x uplift in signal processing for future Arm Cortex-M processors, unlocking new market opportunities for our partners where performance challenges have limited the use of low-cost and highly energy-efficient devices. Advanced digital signal processing (DSP) is available today through Arm Neon technology in richer Cortex-A based devices.
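Purely as an illustration (not Arm code), here is a minimal Python sketch of the kind of low-precision multiply-accumulate kernel that ML inference and DSP workloads are built from, and that vector extensions such as Helium/MVE are designed to execute across multiple lanes per instruction; the array values are made up.

    import numpy as np

    def int8_dot(weights: np.ndarray, activations: np.ndarray) -> int:
        """Fixed-point dot product with a wide accumulator: the basic
        multiply-accumulate loop at the heart of ML inference and DSP kernels.
        A vector extension processes several of these lanes per instruction;
        here they are shown one at a time for clarity."""
        acc = 0  # a 32-bit accumulator on real hardware
        for w, a in zip(weights.tolist(), activations.tolist()):
            acc += w * a  # one multiply-accumulate per element
        return acc

    # Hypothetical int8 weights and activations for a single output neuron.
    weights = np.array([12, -45, 7, 100, -3, 8, 55, -20], dtype=np.int8)
    activations = np.array([3, 1, -2, 4, 9, -7, 0, 5], dtype=np.int8)
    print(int8_dot(weights, activations))  # matches np.dot on widened values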


APNewsBreak: Howard Dean to Head New Dem Voter Data Exchange

U.S. News

The arrangement would allow the national party, state parties and independent political action groups on the left to share voter data in real time during campaigns. That means, for example, that a field worker for a congressional campaign in Iowa and another for an independent political action committee knocking on doors in Florida could update a master voter file essentially as they work. When a presidential campaign spends big money on consumer data to update voter profiles, the new information would go into the file as well. And all participating organizations would have access to the latest information.


Digital twin - Wikipedia

#artificialintelligence

A digital twin is a digital replica of a living or non-living physical entity.[1] By bridging the physical and the virtual world, data is transmitted seamlessly allowing the virtual entity to exist simultaneously with the physical entity. Digital twin refers to a digital replica of physical assets (physical twin), processes, people, places, systems and devices that can be used for various purposes.[2] The digital representation provides both the elements and the dynamics of how an Internet of things device operates and lives throughout its life cycle.[3] Definitions of digital twin technology used in prior research emphasize two important characteristics. Firstly, each definition emphasizes the connection between the physical model and the corresponding virtual model or virtual counterpart[4]. Secondly, this connection is established by generating real time data using sensors. Digital twins integrate internet of things, artificial intelligence, machine learning and software analytics with spatial network graphs[5] to create living digital simulation models that update and change as their physical counterparts change.
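A minimal Python sketch of the connection these definitions emphasize, a virtual model kept in sync with its physical counterpart through real-time sensor data, is given below; the PumpTwin class, its fields and the maintenance rule are hypothetical examples, not a standard digital twin API.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class PumpTwin:
        """A minimal digital twin of a hypothetical physical pump: a virtual model
        whose state is kept in sync with its physical counterpart via sensor data."""
        asset_id: str
        rpm: float = 0.0
        bearing_temp_c: float = 0.0
        history: list = field(default_factory=list)

        def ingest(self, reading: dict) -> None:
            """Apply one real-time sensor reading from the physical asset."""
            self.rpm = reading.get("rpm", self.rpm)
            self.bearing_temp_c = reading.get("bearing_temp_c", self.bearing_temp_c)
            self.history.append((datetime.now(timezone.utc), dict(reading)))

        def needs_maintenance(self) -> bool:
            """A toy analytic run against the virtual model instead of the asset."""
            return self.bearing_temp_c > 80.0

    twin = PumpTwin(asset_id="pump-17")
    twin.ingest({"rpm": 1450.0, "bearing_temp_c": 83.5})
    print(twin.needs_maintenance())  # True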


Apply Now: $1.8 Million in Seed Funding for Artificial Intelligence Startups

#artificialintelligence

The UNICEF Innovation Fund's Data Science Call for Proposals seeks to invest in innovative companies that are using data science, machine learning, artificial intelligence, or similar technologies to improve development outcomes. Apply now; the deadline is February 28. The UNICEF Innovation Fund will make investments in for-profit companies registered in a UNICEF programme country that have an artificial intelligence solution with the potential to positively impact the lives of the most vulnerable children. The existing prototype should have promising results from initial pilots and be generating publicly exposed real-time data that is measurable and has the potential to benefit humanity. Companies interested in The UNICEF Innovation Fund's early-stage financing are required to complete and submit an Expression of Interest Response form showcasing how their project will score highly on the fund's criteria.