Development kit


MILUV: A Multi-UAV Indoor Localization dataset with UWB and Vision

Shalaby, Mohammed Ayman, Ahmed, Syed Shabbir, Dahdah, Nicholas, Cossette, Charles Champagne, Ny, Jerome Le, Forbes, James Richard

arXiv.org Artificial Intelligence

This paper introduces MILUV, a Multi-UAV Indoor Localization dataset with UWB and Vision measurements. This dataset comprises 217 minutes of flight time over 36 experiments using three quadcopters, collecting ultra-wideband (UWB) ranging data such as the raw timestamps and channel-impulse response (CIR) data, vision data from a stereo camera and a bottom-facing monocular camera, inertial measurement unit data, height measurements from a laser rangefinder, magnetometer data, and ground-truth poses from a motion-capture system. The UWB data is collected from up to 12 transceivers affixed to mobile robots and static tripods in both line-of-sight and non-line-of-sight conditions. The UAVs fly at a maximum speed of 4.418 m/s in an indoor environment with visual fiducial markers as features. MILUV is versatile and can be used for a wide range of applications beyond localization, but its primary purpose is testing and validating multi-robot UWB- and vision-based localization algorithms. The dataset can be downloaded at https://doi.org/10.25452/figshare.plus.28386041.v1. A development kit is presented alongside the MILUV dataset, which includes benchmarking algorithms such as visual-inertial odometry, UWB-based localization using an extended Kalman filter, and classification of CIR data using machine learning approaches. The development kit can be found at https://github.com/decargroup/miluv, and is supplemented with a website available at https://decargroup.github.io/miluv/.
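The UWB-based EKF baseline mentioned in the abstract can be illustrated with a minimal range-update sketch. This is not the MILUV devkit's actual API; the anchor layout, noise parameters, and static-target setup below are illustrative assumptions, and a real pipeline would interleave these updates with an IMU-driven prediction step.

```python
import numpy as np

def ekf_range_update(x, P, z, anchor, R):
    """One EKF update of a 2D position x with a UWB range z to a known anchor."""
    d = max(np.linalg.norm(x - anchor), 1e-9)  # predicted range (guard d > 0)
    H = ((x - anchor) / d).reshape(1, -1)      # Jacobian of the range w.r.t. x
    y = z - d                                  # innovation
    S = H @ P @ H.T + R                        # innovation covariance (1x1)
    K = P @ H.T / S                            # Kalman gain
    x = x + (K * y).ravel()                    # state update
    P = (np.eye(len(x)) - K @ H) @ P           # covariance update
    return x, P

# Hypothetical anchor positions and target, purely for illustration.
anchors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
true_pos = np.array([2.0, 3.0])
x = np.array([1.0, 1.0])     # initial position guess
P = 10.0 * np.eye(2)         # initial covariance
R = np.array([[0.01]])       # range measurement noise variance
rng = np.random.default_rng(0)
for _ in range(20):
    for a in anchors:
        z = np.linalg.norm(true_pos - a) + rng.normal(0.0, 0.1)
        x, P = ekf_range_update(x, P, z, a, R)
```

With three anchors and repeated noisy ranges, the estimate converges toward the true position; in the dataset itself, ranging to up to 12 transceivers in line-of-sight and non-line-of-sight conditions makes the measurement model and outlier handling considerably richer than this sketch.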


Experience with Abrupt Transition to Remote Teaching of Embedded Systems

Koniarik, Jan, Dlhopolcek, Daniel, Ukrop, Martin

arXiv.org Artificial Intelligence

Due to the COVID-19 pandemic, many university courses had to transform abruptly to enable remote teaching. Adjusting courses on embedded systems and microcontrollers was especially challenging, since interaction with real hardware is an integral part of them. We start by comparing our experience with four basic alternatives for teaching embedded systems: 1) interacting with hardware at school, 2) having remote access to hardware, 3) lending hardware to students for at-home work, and 4) virtualizing hardware. Afterward, we evaluate in detail our experience of the fast transition from a traditional, offline, at-school hardware programming course to remote access to real hardware present in the lab. The somewhat unusual remote-hardware-access approach turned out to be a fully viable alternative for teaching embedded systems, enabling a relatively low-effort transition. Our setup is based on existing solutions and stable open technologies, without the need for custom-developed applications that require high maintenance. We evaluate the experience of both the students and the teachers and condense takeaways for future courses. The specific environment setup is available online as an inspiration for others.


SmartCow's new dev kit promises conversational AI, video apps

#artificialintelligence

Enterprises needing to beef up development of conversational AI and video-based applications may want to know about SmartCow.ai. The artificial intelligence of things (AIoT) is a newly named IT category that combines artificial intelligence (AI) with internet of things (IoT) infrastructure to ostensibly achieve more efficient IoT operations, improve human-machine interactions, and enhance data management and analytics. The six-year-old company this week introduced its new audiovisual development kit, Apollo. Built around Nvidia Jetson Xavier NX processors, the Apollo device enables developers to create applications with conversational AI capabilities, CEO and founder Ravi Kiran told VentureBeat from the company's headquarters in St. Julian's, Malta.


BrainChip Begins Taking Orders of Akida AI Processor Development Kits

#artificialintelligence

BrainChip Holdings Ltd (ASX: BRN; OTCQX: BRCHF), a leading provider of ultra-low-power, high-performance artificial intelligence technology, today announced that it will be taking orders for two development kits for its Akida advanced neural networking processor, enabling partners, large enterprises, and OEMs to begin internal testing and validation of Akida's high-performance, small, ultra-low-power AI chip. The Akida NSoC and intellectual property enable a wide array of edge AI capabilities, including continuous learning and inference. BrainChip is offering two development kits, both including the AKD1000 chip on a mini-PCI board: an x86 Shuttle PC development kit and an ARM-based Raspberry Pi development kit. "Offering development kits is not only a major step towards full commercialization, it's also an exciting opportunity to see how our partners and future customers will put Akida to work in environments and scenarios like consumer electronics, industrial applications, aerospace and defense systems, healthcare and medical devices, automotive technology, and more," said Anil Mankar, BrainChip co-founder and chief development officer. "We believe the AKD1000 silicon, or the licensing of Akida in a configurable IP format, will lead to major changes in industries using AI at the edge because of its performance, security, low power requirements, and mainly Akida's ability to perform AI training and learning on the device itself, without dependency on the cloud."


Developers are making games for a Nintendo 4K console that doesn't exist

The Japan Times

Many people were surprised to learn that Nintendo Co.'s new video game console is missing a common feature of rival systems: support for high-fidelity, 4K graphics. Perhaps most perplexed were the numerous developers who were working on 4K games using a software toolkit provided by Nintendo. Employees at 11 game companies said their teams were in possession of Nintendo's 4K development kit for the Switch. The companies span the globe, ranging from large publishers to small studios and include at least one that has never made a console game before, Zynga Inc., according to the employees, who asked not to be identified because they weren't authorized to discuss their projects publicly. The latest model of the best-selling Nintendo Switch is set to go on sale Oct. 8.


How Azure Percept Simplifies Building And Deploying AI Models At Edge

#artificialintelligence

Azure Percept is the latest edge computing platform from Microsoft. Announced at the recent Ignite event, the platform brings the best hardware, software, and cloud services to the edge. Azure Percept is an exciting device for makers and builders to build and prototype intelligent IoT applications powered by Azure Cognitive Services and Azure Machine Learning Services. The Azure Percept platform has three elements: the hardware, the development kit, and cloud-based development and management tools. Microsoft is working with the ecosystem of hardware developers to publish patterns and best practices for developing edge AI hardware that can be integrated easily with Azure AI and IoT services.


'Pushed to the limit': could 2021 be the worst year ever for video games?

The Guardian

Since the pandemic began, the video games industry has been booming. Last year was a bumper year, with most of the world's population forced inside by lockdowns and looking for safe ways to have fun and socialise, and new games consoles such as PlayStation 5 and Xbox Series X/S launching in November. UK consumers spent more on games last year than ever before; Roblox, a gaming platform popular with children and teens, saw an 85% uptick in players and shares in the company recently rose 60%, increasing its value to $47bn. Last year's games were great, too, from lockdown saviour Animal Crossing: New Horizons to the provocative horror game The Last of Us II and the knockabout multiplayer caper Fall Guys. But 2021, so far, is depressingly devoid of exciting gaming experiences.


Enabling Edge AI - Connected World

#artificialintelligence

This week we saw a big announcement that gets us further on our hybrid cloud journey--one where cloud strategies will also include edge and hybrid investments, and companies can extend compute and AI (artificial intelligence) to the edge of the network. Microsoft introduced Azure Percept at its Ignite digital conference this week, a platform with added security for creating Azure AI technologies and solutions on the edge. The end-to-end edge AI platform includes hardware accelerators integrated with Azure AI and IoT (Internet of Things) services, pre-built AI models--for vision capabilities including object detection, shelf analytics, and vehicle analytics, and audio capabilities like voice control and anomaly detection--and solution management to help go from prototype to production in minutes. This is big news, especially as companies and partners look to digital transformation with great fervor. The goal of the Azure Percept platform is to simplify the process of developing, training, and deploying edge AI solutions, making it easier for more customers to take advantage of these kinds of offerings, according to Moe Tanabian, Microsoft vice president and GM of the Azure edge and devices group.


With Azure Percept, Microsoft adds new ways for customers to bring AI to the edge - The AI Blog

#artificialintelligence

Elevators that respond to voice commands, cameras that notify store managers when to restock shelves and video streams that keep tabs on everything from cash register lines to parking space availability. These are a few of the millions of scenarios becoming possible thanks to a combination of artificial intelligence and computing on the edge. Standalone edge devices can take advantage of AI tools for things like translating text or recognizing images without having to constantly access cloud computing capabilities. At its Ignite digital conference, Microsoft unveiled the public preview of Azure Percept, a platform of hardware and services that aims to simplify the ways in which customers can use Azure AI technologies on the edge – including taking advantage of Azure cloud offerings such as device management, AI model development and analytics. Roanne Sones, corporate vice president of Microsoft's edge and platform group, said the goal of the new offering is to give customers a single, end-to-end system, from the hardware to the AI capabilities, that "just works" without requiring a lot of technical know-how.