Telstra's independent venture capital arm has signalled its intention to expand into the artificial intelligence data market following a $US100m ($A145m) capital raising for San Francisco company Trifacta. Trifacta uses machine-learning technology to draw deeper insights from the growing volume of data migrating to cloud-based storage. Australia's largest venture capital fund, Telstra Ventures Fund No 2, led the investment, joined in the round by the likes of Energy Impact Partners, NTT Docomo, BMW Ventures and ABN AMRO. Telstra Ventures joins a long and credible list of existing investors, including Accel Partners, Greylock Partners, Ignition Partners and Google. "The share register for Trifacta is very impressive. It is great to have so many experienced and impressive co-investors in this deal. That is a really massive plus for us," Mr Koertge said.
All industries face similar challenges as they seek to extract information from forms, documents and visual artifacts - and most agree that manual data entry is costly, time-consuming and prone to errors. In this session, you will learn how to use machine learning on a scalable cloud-based platform to efficiently analyze documents - and use the knowledge hiding within them - to improve decision-making at your company. Iron Mountain will show how it has been able to ingest nearly every type of imaged data from a wide variety of origins, both on-premises and in the cloud, then capture, process, analyze and store that data, integrated into a complete visual search interface that lets its customers unlock insights from their documents.
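The capture-process-analyze-store flow described above can be sketched in a few lines. This is a hypothetical, minimal illustration: in a real deployment the `analyze` step would call a cloud OCR or entity-extraction service, but here a simple regex extractor stands in so the pipeline is runnable end to end. All names (`DocumentStore`, `analyze`, the sample invoice text) are illustrative, not Iron Mountain's actual system.

```python
import re
from dataclasses import dataclass, field

@dataclass
class DocumentStore:
    """Stand-in for the storage/search layer at the end of the pipeline."""
    records: list = field(default_factory=list)

    def save(self, doc_id, fields):
        self.records.append({"id": doc_id, **fields})

def analyze(text):
    # Stand-in for an ML extraction model: pull an invoice number
    # and a date out of OCR'd document text.
    return {
        "invoice": re.search(r"Invoice\s+#(\w+)", text).group(1),
        "date": re.search(r"Date:\s*([\d-]+)", text).group(1),
    }

# Capture (imagine this came from OCR on a scanned page), then
# process/analyze, then store in a searchable form.
store = DocumentStore()
ocr_text = "Invoice #A1023  Date: 2019-06-04  Total: $540.00"
store.save("doc-001", analyze(ocr_text))
print(store.records[0])  # structured record ready for search/analytics
```

Swapping the regex for a call to a managed document-analysis API is the only change needed to move this sketch toward the cloud-based setup the session describes.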
It seems like artificial intelligence is taking over the world, leaving many of us non-techies feeling terrified. Yet when you stop to think about it, we all use artificial intelligence (AI) every day. When we Google something, use Siri on our smartphones or ask Alexa a question, we are using AI. Hollywood has certainly featured AI in many movies from "The Terminator" series to "Robocop" and "I, Robot." In "Minority Report," algorithms predict who is going to commit a crime, and the person is arrested before the crime can be committed.
For the Vision AI Developer Kit, Microsoft and Qualcomm have partnered to simplify training and deploying computer vision-based AI models. Developers can use Microsoft's cloud-based AI and IoT services on Azure to train models while deploying them on the smart camera edge device powered by a Qualcomm AI accelerator. Let's take a closer look at the Vision AI Developer Kit. The kit not only looks stylish and sophisticated, but also boasts an impressive configuration: it is powered by a Qualcomm QCS603 processor, 4GB of LPDDR4X memory and 16GB of eMMC storage.
It supports mainstream deep learning frameworks such as TensorFlow, PyTorch and PaddlePaddle. Tensor Engine and its operators are Huawei's equivalent of NVIDIA's cuDNN, a library that makes CUDA accessible to AI developers. MindSpore is Huawei's own unified training/inference framework, architected to be design-friendly and operations-friendly, and adaptable to multiple scenarios. It includes core subsystems such as a model library, graph compute and a tuning toolkit; a unified, distributed architecture for machine learning, deep learning and reinforcement learning; and a flexible programming interface with support for multiple languages. MindSpore is highly optimized for Ascend chips, taking advantage of the hardware innovations that went into the design of the AI chips.
Every time we binge on Netflix or install a new internet-connected doorbell in our home, we're adding to a tidal wave of data. In just 10 years, bandwidth consumption has increased 100-fold, and it will only grow as we layer on the demands of artificial intelligence, virtual reality, robotics and self-driving cars. According to Intel, a single robo car will generate 4 terabytes of data in 90 minutes of driving - roughly as much data as almost 3,000 average internet users generate in a day chatting, watching videos and engaging in other online pastimes. Tech companies have responded by building massive data centers full of servers.
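A quick back-of-envelope check puts those numbers in perspective. The sketch below assumes decimal terabytes, a 90-minute drive, and Intel's oft-cited estimate of roughly 1.5 GB of internet data per person per day (that last figure is the assumption behind the ~3,000-people comparison).

```python
# Sanity-check Intel's robo-car data figures.
TB, GB = 10**12, 10**9  # decimal units, as storage vendors count them

bytes_per_drive = 4 * TB       # data generated in one drive
drive_seconds = 90 * 60        # 90 minutes

# Sustained data rate the car's sensors and compute must handle.
rate_mb_per_s = bytes_per_drive / drive_seconds / 10**6

# How many person-days of typical internet use that equals,
# assuming ~1.5 GB per person per day.
people_equiv = bytes_per_drive / (1.5 * GB)

print(f"sustained rate: {rate_mb_per_s:.0f} MB/s")
print(f"person-days of internet use: {people_equiv:.0f}")
```

The car works out to roughly 741 MB/s sustained, and one drive's worth of data matches about 2,700 person-days of ordinary internet use, which is where the "almost 3,000 people" comparison comes from.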
These Docker images use popular frameworks and are performance-optimized, compatibility-tested and ready to deploy. Deep Learning Containers provide a consistent environment across Google Cloud services, making it easy to scale in the cloud or shift from on-premises environments. You have the flexibility to deploy on Google Kubernetes Engine (GKE), AI Platform, Cloud Run, Compute Engine, Kubernetes and Docker Swarm.
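Getting started locally can look roughly like the following. This is a sketch, not official documentation: image names and tags in the `gcr.io/deeplearning-platform-release` registry change over time, so list the repository first to find a current tag before pulling.

```shell
# List available Deep Learning Container images (requires the gcloud CLI).
gcloud container images list --repository="gcr.io/deeplearning-platform-release"

# Pull a TensorFlow CPU image (tag shown is illustrative; pick one
# from the listing above) and run it locally with Docker.
docker pull gcr.io/deeplearning-platform-release/tf2-cpu
docker run -d -p 8080:8080 gcr.io/deeplearning-platform-release/tf2-cpu
# The container exposes a JupyterLab environment on port 8080.
```

Because the same image runs unchanged on GKE, AI Platform, or a local workstation, a notebook developed locally can be moved to the cloud without environment drift.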
What can fly like a bird and hover like an insect? If drones had this combo, they would be able to maneuver better through collapsed buildings and other cluttered spaces to find trapped victims. Purdue University researchers have engineered flying robots that behave like hummingbirds, trained by machine learning algorithms based on various techniques the bird uses naturally every day. This means that after learning from a simulation, the robot "knows" how to move around on its own like a hummingbird would, such as discerning when to perform an escape maneuver. Artificial intelligence, combined with flexible flapping wings, also allows the robot to teach itself new tricks.
What this will mean, in the short term, is that AI will become significantly more capable in less time, thanks to dramatically faster prototyping and larger-scale training. In addition, practical applications of AI will grow, because the new paradigm of training at the edge avoids the huge upfront costs of centralized training in the cloud. Millions more developers can now participate in advancing AI solutions. Because training can be coordinated between devices using the IoT (Internet of Things), the cloud infrastructure will play a diminished role. One of the early applications of AI in the construction industry is for training workers and improving their skills.