Imagine your body is like a loaf of sliced bread. During an MRI scan, a powerful magnet and radio waves create detailed images of each "slice" of your body, then a computer puts the slices together to show a full picture of your anatomy. But before the slicing comes the choosing. Before an MRI technologist can scan a patient, they have to manually specify the slices they want the MRI to acquire. This process can take several minutes of tweaking and adjusting, leaving a patient waiting anxiously in the MRI scanner and adding unnecessary steps to set up each scan.
As police embrace new facial recognition technology, many fear false matches could lead to wrongful arrests. The fight over the use of our faces is far from done. A raging battle over controversial facial recognition software used by law enforcement, and over the civil rights of Americans, might be heading to a courtroom. The latest salvo: the American Civil Liberties Union is suing the FBI, the Department of Justice and the Drug Enforcement Administration for those federal agencies' records, to see whether any secret surveillance is in use nationwide. The lawsuit, filed Oct. 31, comes as organizations and law enforcement go toe-to-toe over what is private and what isn't.
No matter the generation, we all know some of the storied battles that have withstood the test of time. With AI projected to become a $190 billion industry by 2025 (according to Markets and Markets), it is more integrated into our everyday lives than we may even notice at this stage – and it continues to gain popularity. AI has found its way into home appliances, medical imaging, natural language processing and even musical composition. One area in which AI has remained a constant is cybersecurity, where its continuous learning helps detect and combat cyberthreats. But what if this technology were to fall into the wrong hands?
For the second year in a row, Canada has refused visas to dozens of researchers - most of them from Africa - who were hoping to attend an artificial intelligence (AI) conference in Vancouver. The hassles have caused at least one other AI conference to choose a different country for its next event. The Neural Information Processing Systems conference (NeurIPS), which brings together thousands of experts and researchers from all over the world, will be held in Vancouver next month. Last week, NeurIPS began hearing that several attendees had had their visas denied - the second year in a row the conference has faced visa troubles.
This August, the Department of Housing and Urban Development put forth a proposed rule that could turn back the clock on the Fair Housing Act (FHA). Under the rule, landlords, lenders, and property sellers who use third-party machine learning algorithms to decide who gets approved for a loan or who can purchase or rent a property would not be held responsible for any discrimination resulting from those algorithms. The FHA, part of the Civil Rights Act of 1968, prohibits discrimination in the purchase of a home, the rental of a property, or the qualification for a lease on the basis of race, national origin or religion. In 1974 it was expanded to include gender, and in 1988, disability.
Just take a moment to think about how many times you blindly trust someone or something every single day. There are the very simple things you might not realise, like trusting that your alarm will wake you up, or that the electricity provider has honoured its obligation to maintain a steady electricity supply to your house, meaning your breakfast hasn't spoiled overnight. Perhaps you're thinking about the trust you share with the members of your household, or trusting that your neighbour will bring your bins in and feed your cats whilst you are on holiday. Did you think about how you trust that your government will support you in times of need, or that when you go to the shops your card will work and the money in your pocket will still be valid? Perhaps you thought about fake news, as this has been a hot topic recently?
We live in the greatest time in human history. Only 200 years ago, for most Europeans, life was a struggle rather than a pleasure. Without antibiotics and hospitals, even a minor infection could be fatal. Only a small elite of citizens lived in the cities in relative prosperity. Freedom of opinion and human and civil rights were a distant prospect. Voting rights and decision-making were reserved for a class consisting of the nobility, the clergy, the military and rich citizens. The interests of the general population were virtually ignored.
After speaking at an MIT conference on emerging AI technology earlier this year, I entered a lobby full of industry vendors and noticed an open doorway leading to tall grass and shrubbery recreating a slice of the African plains. I had stumbled onto TrailGuard AI, Intel's flagship AI for Good project, which the chip company describes as an artificial intelligence solution to the crime of wildlife poaching. Walking through the faux flora and sounds of the savannah, I emerged in front of a digital screen displaying a choppy video of my trek. The AI system had detected my movements and captured digital photos of my face, framed by a rectangle with the label "poacher" highlighted in red. Mark Latonero (@latonero) is a fellow at the Harvard Kennedy School's Carr Center for Human Rights Policy and a research lead at Data & Society.
To produce quality data, there is a need to look at how data is collected, he noted. Addressing the global conference on agricultural statistics here, Srivastava said: "Even though agriculture contributes about 17 per cent to the country's GDP, almost 50 per cent of the workforce is dependent on agriculture." The country's decentralised economy and statistical set-up pose a challenge in developing and collecting data, and the machinery in the state and central governments is facing these challenges on a day-to-day basis, he said. As the demand for traditional data continues to increase, there is a need to look at the respondent burden in terms of how data is collected, Srivastava said, adding that quality data is required because decisions are taken based on that information. Srivastava, secretary in the Ministry of Statistics and Programme Implementation (MoSPI), further said, "If you want to look at climate change and its impact on agriculture and farmers, we really need to look at getting data from non-inclusive methods using big data, artificial intelligence, and whatever new contemporary techniques are coming."
Recently, I discussed how Artificial Intelligence (AI) and a new breed of Creative Machines were being used to help design everything from cities to NASA planetary rovers. Now architecture studio Wallgren Arkitekter and Swedish construction company BOX Bygg have created an AI design tool called Finch that can generate new building floor plans and adapt them to the space available. While this might sound like quirky work, as we begin to 3D print everything from military barracks to family homes and 80-storey skyscrapers, an AI that can help design buildings will no doubt come in very handy indeed. Furthermore, as AI and drone technology help us develop the world's first fully autonomous construction sites, this additional development could one day mean machines control the entire construction process – from initial building concept and design through to final construction and fit-outs. Finch, which you can see working below, will be launched in 2020 as a plug-in to the visual programming tool Grasshopper within the 3D computer graphics software Rhino. "The idea of Finch is to create a more user-friendly tool for architects to be able to enjoy the benefits of parametric design without any knowledge of Grasshopper or coding," said Pamela Wallgren, co-founder of Wallgren Arkitekter.