The Falcon 9 used for the mission launched 10 Iridium satellites from Vandenberg Air Force Base in California. It was the third launch of a batch of 10 Iridium satellites from the base; SpaceX is contracted to launch 75 of these communications satellites for Iridium, and with Monday's launch it has successfully put 30 into orbit. After the launch and deployment of the satellites, the first stage of the rocket re-entered Earth's atmosphere and landed upright on the drone ship "Just Read The Instructions."
Sally Jones, a former punk rocker from Kent, United Kingdom, who gained notoriety as "Mrs Terror" after joining the Islamic State group (also called ISIS), was reportedly killed in a United States drone strike along with her 12-year-old son Jojo in Syria as she tried to escape Raqqa, the Sun reported. Though Whitehall sources confirmed reports that Jones was killed, according to the Guardian, the Pentagon was unable to confirm the news. Jones collected another nickname -- White Widow -- after Hussain was killed by a U.S. army drone in the IS group's capital of Raqqa in 2015. Metro reported that in a Twitter post after Hussain's death, Jones claimed she was "proud my husband was killed by the biggest enemy of Allah, may Allah be pleased with him."
North Korea had plans to direct a cyber attack against power grids in the United States and successfully launched an attack directed at South Korea's Ministry of Defense, NBC News reported. While the campaign against U.S. utilities may have failed, the attempts of North Korean hackers to target utility companies present a growing risk for American companies that are responsible for keeping the lights on for millions of homes across the country. Many power grids operate on a network separate from the public internet, insulating the systems that control the grid from attackers. North Korean hackers successfully infiltrated South Korea's defense ministry and stole a large collection of military documents that purport to detail wartime contingency plans developed by South Korea and the U.S. A total of 235 gigabytes of military documents were reported to have been stolen from South Korea's Defense Integrated Data Centre in a breach that took place in September 2016, and 80 percent of those stolen files have yet to be identified.
Elon Musk's company SpaceX is planning two rocket launches and recoveries within a span of just 48 hours. The two launches will occur on opposite coasts of the country: the first at Kennedy Space Center in Florida and the second from Vandenberg Air Force Base in California. The second launch of the 48-hour period is scheduled for 8:37 a.m. EDT Monday from Vandenberg. This is the third launch SpaceX will conduct for Iridium, which has contracted SpaceX to launch 75 communications satellites for it.
Artificial intelligence and machine learning are making their way into more security products, helping organizations and individuals automate certain tasks required to keep their services and information safe. Kashyap, the senior vice president and chief product officer at Cylance -- a cybersecurity firm known for its use of AI -- doesn't view AI and machine learning as a replacement for human workers but rather as a supplemental service that can enable those workers to do their jobs more efficiently. He said there were now "billions of pieces of malware" in the wild, and "well thought-out cyber campaigns" being carried out regularly, with targeted threats directed at individuals and organizations that require a more efficient way to check the validity of code and defend against attacks. With a widening gap between the number of security professionals needed and the number available -- a shortage of more than 1.5 million is expected by 2020 -- Kashyap determined the issue no longer just required a human-scale solution; it needed a computing solution.
However, going by the official video, it doesn't seem to be equipped for flying long distances. Unlike diesel or petrol vehicles, battery-powered flying vehicles can typically fly for less than an hour; the thin frame also leaves little room to add much payload to the vehicle and limits the number of rotors it can carry. The concept could, however, be scaled up and made capable of flying long distances and carrying bigger payloads. And it is not just the Russian military: the U.S. military is also working on a similar concept with Malloy Aeronautics, though that concept currently has a robot riding the hoverbike.
That said, the government has recently begun to act on the issue, making a start with security guidelines for smart homes. While AI does make life easy, the fact remains that it is based on algorithms, and if a base algorithm is tampered with, AI can be reprogrammed. Unless and until these risks are properly assessed and preventive measures to plug vulnerabilities are put in place, AI adoption needs to be closely monitored. Strict security guidelines need to be put in place by governments, while tech companies need to address the issue more seriously and start issuing regular updates to plug vulnerabilities the way they currently do for smartphones.
Advantages of such weapons were discussed in a New York Times article published last year, which stated that the speed and precision of the novel weapons could not be matched by humans. The official stance of the United States on such weapons was discussed at the Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems held in 2016 in Geneva, where the U.S. said that "appropriate levels" of human approval were necessary for any engagement of autonomous weapons that involved lethal force. In 2015, numerous scientists and experts signed an open letter warning that developing such intelligent weapons could set off a global arms race. A similar letter, urging the United Nations to ban killer robots, or lethal autonomous weapons, was signed by the world's top artificial intelligence (AI) and robotics companies at the International Joint Conference on Artificial Intelligence (IJCAI) held in Melbourne in August.
A coordinated international coalition of non-governmental organizations dedicated to bringing about a preemptive ban of fully autonomous weaponry -- The Campaign to Stop Killer Robots -- was started in April 2013. A breakthrough was reached in 2016 when the fifth review conference of the United Nations Convention on Conventional Weapons (CCW) saw countries hold formal talks to expand their deliberations on fully autonomous weapons. The conference also saw the establishment of a Group of Governmental Experts (GGE) chaired by India's ambassador to the U.N., Amandeep Gill. According to Human Rights Watch, over a dozen countries are developing autonomous weapon systems.
Autonomous weapons refer to military devices that utilize artificial intelligence in applications like determining targets to attack or avoid. As Gariepy put it: "We should not lose sight of the fact that, unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability." For observers like the letter's signees, much of the concern over artificial intelligence isn't about the science fiction hypotheticals Gariepy alludes to. For his part, Musk, the Tesla CEO, has been a longtime supporter of increased regulation of artificial intelligence research and has regularly argued that, if left unchecked, it could pose a risk to the future of mankind.