An artificial intelligence commission led by former Google CEO Eric Schmidt is urging the U.S. to boost its AI skills to counter China, including by pursuing "AI-enabled" weapons – something that Google itself has shied away from on ethical grounds. Schmidt and current executives from Google, Microsoft, Oracle and Amazon are among the 15 members of the National Security Commission on Artificial Intelligence, which released its final report to Congress on Monday. "To win in AI we need more money, more talent, stronger leadership," Schmidt said Monday. The report says that machines that can "perceive, decide, and act more quickly" than humans and with more accuracy are going to be deployed for military purposes – with or without the involvement of the U.S. and other democracies. It warns against unchecked use of autonomous weapons but expresses opposition to a global ban.
It was 9:00 P.M. on a Monday in South Carolina, and the Charleston County Republican Party was ninety minutes into its February meeting, when the open-comments portion of the session began. Maurice Washington, the Party chairman and a former city-council member, invited a newcomer to the microphone at the front of the room filled with seventy members and guests. She identified herself as Elizabeth Rodi, announced that she had attended Donald Trump's rally on January 6th, and declared media reports about the Capitol insurrection false. "The people that were there were Antifa and Black Lives Matter. They were identified through facial recognition," she claimed.
In 2018, William Frederick Keck III pleaded guilty in a court in Manassas, Virginia, to possession with intent to distribute cannabis. He served three months in prison, then began a three-year probation. He was required to wear a GPS ankle monitor before his trial and then to report for random drug tests after his release. Eventually, the state reduced his level of monitoring to scheduled meetings with his parole officer. Finally, after continued good behaviour, Keck's parole officer moved him to Virginia's lowest level of monitoring: an app on his smartphone.
Three years ago, Customs and Border Protection placed an order for self-flying aircraft that could launch on their own, rendezvous, locate and monitor multiple targets on the ground without any human intervention. In its reasoning for the order, CBP said the level of monitoring required to secure America's long land borders from the sky was too cumbersome for people alone. To research and build the drones, CBP handed $500,000 to Mitre Corp., a trusted nonprofit Skunk Works that was already furnishing border police with prototype rapid DNA testing and smartwatch hacking technology. The drones were "tested but not fielded operationally" because "the gap from simulation to reality turned out to be much larger than the research team originally envisioned," a CBP spokesperson says. This year, America's border police will test automated drones from Skydio, the Redwood City, Calif.-based startup that on Monday announced it had raised an additional $170 million in venture funding at a valuation of $1 billion. That brings Skydio's total raised to $340 million.
In June 2020, a new and powerful artificial intelligence (AI) began dazzling technologists in Silicon Valley. Called GPT-3 and created by the research firm OpenAI in San Francisco, California, it was the latest and most powerful in a series of 'large language models': AIs that generate fluent streams of text after imbibing billions of words from books, articles and websites. GPT-3 had been trained on around 200 billion words, at an estimated cost of tens of millions of dollars. The developers who were invited to try out GPT-3 were astonished. "I have to say I'm blown away," wrote Arram Sabeti, founder of a technology start-up who is based in Silicon Valley. "It's far more coherent than any AI language system I've ever tried. All you have to do is write a prompt and it'll add text it thinks would plausibly follow. I've gotten it to write songs, stories, press releases, guitar tabs, interviews, essays, technical manuals. I feel like I've seen the future."
Clinton Township, Michigan--(Newsfile Corp. - March 1, 2021) - Resgreen Group (OTC PINK: RGGI) ("RGGI"), a leading mobile robot company, today announced the development of Atlas, its new Autonomous Mobile Robot (AMR) for demanding industrial and mission critical 24/7 applications. The vehicle can use either natural feature or magnetic tape guidance to navigate through manufacturing facilities and warehouses. The natural feature or free guidance requires no wires, tape or navigation marks. Instead, the vehicle uses advanced lasers to scan its surroundings, and then determines its position based on the mapped features along its path. "Atlas mobile robot was designed to meet a wide variety of customers' needs, whether it's free navigation requiring no modification to your facility or more cost-effective magnetic tape guidance," said Parsh Patel, CEO of RGGI. "We also understand industrial customers require a rugged vehicle that is built to last and moves heavy loads easily." It features 5G communications and operates using an Android or iOS application in manual mode and WiFi in automatic mode.
The reputation and bottom line of a company can be adversely affected if defective products are released. If a defect is not detected, and the flawed product is not removed early in the production process, the damage can be costly – and the higher the unit value, the higher those costs will be. Worst of all, dissatisfied customers can demand returns. To mitigate these costs, many manufacturers install cameras to monitor their products as they move along their production lines. However, the data obtained may not always be useful – or, more precisely, the data is useful, but existing machine vision systems may not be able to assess it accurately at full production speeds.
The New York police department has acquired a robotic police dog, known as Digidog, and has deployed it on the streets of Brooklyn, Queens and, most recently, the Bronx. At a time that activists in New York, and beyond, are calling for the defunding of police departments – for the sake of funding more vital services that address the root causes of crime and poverty – the NYPD's decision to pour money into a robot dog seems tone-deaf if not an outright provocation. As Congresswoman Alexandria Ocasio-Cortez, who represents parts of Queens and the Bronx, put it on Twitter: "Shout out to everyone who fought against community advocates who demanded these resources go to investments like school counseling instead. Now robotic surveillance ground drones are being deployed for testing on low-income communities of color with underresourced schools." There is more than enough evidence that law enforcement is lethally racially biased, and adding an intimidating non-human layer to it seems cruel.
The ACM Conference for Fairness, Accountability, and Transparency (FAccT) has decided to suspend its sponsorship relationship with Google, conference sponsorship co-chair and Boise State University assistant professor Michael Ekstrand confirmed today. The organizers of the AI ethics research conference came to this decision a little over a week after Google fired Ethical AI lead Margaret Mitchell and three months after the firing of Ethical AI co-lead Timnit Gebru. Google has subsequently reorganized about 100 engineers across 10 teams, including placing Ethical AI under the leadership of Google VP Marian Croak. "FAccT is guided by a Strategic Plan, and the conference by-laws charge the Sponsorship Chairs, in collaboration with the Executive Committee, with developing a sponsorship portfolio that aligns with that plan," Ekstrand told VentureBeat in an email. "The Executive Committee made the decision that having Google as a sponsor for the 2021 conference would not be in the best interests of the community and impede the Strategic Plan. We will be revising the sponsorship policy for next year's conference."