Facebook, Twitter and YouTube have your data. Why not China-owned ByteDance's TikTok?

USATODAY - Tech Top Stories

Facebook, Twitter and YouTube have your data. The Trump administration said Friday that it would bar two popular Chinese-owned mobile apps, WeChat and TikTok, from U.S. app stores as of midnight Sunday, escalating the U.S. standoff with China. "Today's actions prove once again that President Trump will do everything in his power to guarantee our national security and protect Americans from the threats of the Chinese Communist Party," Commerce Secretary Wilbur Ross said in a statement. The Trump administration contends the data collected from American users by TikTok and WeChat could be accessed by the Chinese government. "The Trump administration is looking to make sure U.S. TikTok consumer data stays out of Beijing," said Wedbush Securities analyst Daniel Ives.


Soldier targeting goggles 'augment' human 3-D vision tracking

FOX News

Fox News Flash top headlines are here. Check out what's clicking on Foxnews.com. Imagine this land-war scenario: one enemy fighter is several hundred yards away, another is attacking from a mile out, and a third fires from a nearby room in close-quarters urban combat. U.S. Army soldiers apprehend, integrate, and quickly map the locations of all of these targets at once in 3D, all while knowing the range and distance of the enemy forces. How could something like this be possible, one might wonder, given the nuances of perspective, range, and navigation, and the limitations of the human eye? These complexities form the conceptual basis upon which the Army is fast-tracking its Integrated Visual Augmentation System, or IVAS, a soldier-worn combat goggle engineered with advanced sensors able to overcome some of the limitations of human vision and quickly organize target data.


Q&A: Participatory Machine Learning

#artificialintelligence

In May 2020, Fernanda Viégas, Jess Holbrook, and Martin Wattenberg -- who cofounded Google Research's People + AI Research (PAIR) initiative in 2017 -- sat down to talk about participatory machine learning, an idea central to the direction PAIR's research and projects have taken. They took the conversation as an opportunity to further articulate and explore the concept in theory and especially in practice. David Weinberger, PAIR's writer-in-residence, prompted them with questions. Fernanda: From the beginning, PAIR has had a broad research agenda focused on putting humans at the center of building AI technology. For instance, we were building tools to help developers understand their data and model behaviors, but we were also working on how doctors do or don't trust AI-assisted diagnoses. We were bringing TensorFlow to the web and publishing human-centered AI guidance for UXers.


Air Force, Navy and Army merge attack tactics into Joint All Domain Command and Control

FOX News

Fox News Flash top headlines are here. Check out what's clicking on Foxnews.com. A forward-operating, satellite-networked Air Force drone comes across a small, moving group of enemy surface ships heading toward vulnerable areas. Instant data is sent to Navy ship commanders and land-based Army weapons operators in real time, enabling a coordinated, multi-pronged attack using deck-fired Tomahawk missiles, land-based attack rockets and fighter jets armed with air-to-surface weapons. This possible scenario, in which land, sea and air warriors and weapons systems share information in real time across vast, otherwise dispersed areas to optimize an attack, is precisely what the Pentagon intends with its new doctrinal and technical approach to future war. The Army, Navy and Air Force each have secure information-sharing combat network technology programs.


Specialized polymers bring us one step closer to 'cyborgs'

#artificialintelligence

Our squishy, salty brains are capable of doing incredible things -- from commanding us to walk to solving complex questions about our world. Scientists and science-fiction authors alike have yearned to understand (and even control) our brains, but the brain has thus far been an incredibly complex nut to crack. Intriguingly, the development of a new, biocompatible polymer coating for electronic implants by a team of researchers at the University of Delaware could be the key to better understanding this biological black box. These polymers would not only leave less scarring on biological tissue than inorganic-coated electronics but would also allow scientists to fine-tune the sensitivities of the polymers -- which could allow for the creation of early warning systems for the presence of harmful diseases. Furthermore, as these devices continue to mature, scientists say they could be the answer to creating an effective human brain-A.I. interface in the future.


New "Cyborg" Technology Could Enable Merger of Humans and AI

#artificialintelligence

Such devices could monitor for tumor development or stand in for damaged tissues. But connecting electronics directly to human tissues in the body is a huge challenge. Now, a team is reporting new coatings for components that could help them more easily fit into this environment. The researchers will present their results today (August 17, 2020) at the American Chemical Society (ACS) Fall 2020 Virtual Meeting & Expo. ACS is holding the meeting through Thursday.


Arena and the disappearing art of bootstrapping startups

ZDNet

Silicon Valley headlines often report on the size of venture capital raised by a startup -- the bigger the funding, the bigger the story. But this is a poor way to understand the startup community. Startup success isn't determined by how much you raise; it's about how much you keep. Arena.im is a great example. It recently raised a seed round of $2.3 million -- a tiny amount by local standards.


A.I. Artificial Intelligence shows us a future where we neglect to dream

#artificialintelligence

The Verge is a place where you can consider the future. In Yesterday's Future, we revisit a movie about the future and consider the things it tells us about today, tomorrow, and yesterday. The future: A.I. begins with a brief summary of the sorry state of the world: climate change has melted the polar ice caps, wiping out coastal cities and severely reducing the human population. With regulations in place for reproduction on a resource-starved planet, corporations developed Mecha -- androids that appear human but lack emotions. They're seen as objects -- useful for labor or sex work, just human enough to not be strange but machine enough to not mistake them for people.


Debate flares over using AI to detect Covid-19 in lung scans - STAT

#artificialintelligence

A series of studies, starting as a steady drip and quickening to a deluge, has reported the same core finding amid the global spread of Covid-19: Artificial intelligence could analyze chest images to accurately detect the disease in legions of untested patients. The results promised a ready solution to the shortage of diagnostic testing in the U.S. and some other countries and triggered splashy press releases and a cascade of hopeful headlines. But in recent days, the initial burst of optimism has given way to an intensifying debate over the plausibility of building AI systems during an unprecedented public health emergency. On one side are AI developers and researchers who argue that training and testing methods can, and should, be modified to fit the contours of the crisis; on the other are skeptics who point to flaws in study designs and the limited number of lung scans available from coronavirus patients to train AI algorithms. They also argue that imaging should be used sparingly during the pandemic because of the risk of spreading the infection through contaminated equipment.
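As a rough illustration of the data problem the skeptics raise, here is a minimal sketch of fine-tuning a pretrained image classifier on a small chest-scan set. The folder layout, class names, and hyperparameters are hypothetical and do not reproduce the methodology of any study discussed above.

```python
# Hypothetical sketch: fine-tuning a pretrained CNN on a small chest X-ray set.
# The data layout (covid/ vs. non_covid/ folders) and all hyperparameters are
# illustrative assumptions, not the approach of any study cited in the article.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),   # X-rays are single-channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Small dataset: this is exactly the limited-data regime critics warn about.
train_set = datasets.ImageFolder("chest_xrays/train", transform=transform)
val_set = datasets.ImageFolder("chest_xrays/val", transform=transform)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)
val_loader = DataLoader(val_set, batch_size=16)

# Reuse ImageNet features; retrain only the final layer to limit overfitting.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)      # covid vs. non-covid

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

    model.eval()
    correct = total = 0
    with torch.no_grad():
        for images, labels in val_loader:
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    print(f"epoch {epoch}: val accuracy {correct / total:.2f}")
```

Even a respectable validation accuracy from a sketch like this can overstate real-world performance when the validation images come from the same few hospitals as the training set; that generalization gap is the crux of the debate described above.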


Data-analysis solutions: New artificial intelligence algorithm better predicts corn yield

#artificialintelligence

"We're trying to change how people run agronomic research. Instead of establishing a small field plot, running statistics and publishing the means, what we're trying to do involves the farmer far more directly. We are running experiments with farmers' machinery in their own fields. We can detect site-specific responses to different inputs. And we can see whether there's a response in different parts of the field," said Nicolas Martin, assistant professor in the U of I Department of Crop Sciences and co-author of the study.