It was December 2020, and she was being invited into a pilot program providing guaranteed income--a direct cash transfer with no strings attached. For Softky, it was a lifeline. "For the first time in a long time, I felt like I could … take a deep breath, start saving, and see myself in the future," she says. The idea of "just giving people money" has been in and out of the news since becoming a favored cause for many high-profile Silicon Valley entrepreneurs, including Twitter's Jack Dorsey, Facebook cofounders Mark Zuckerberg and (separately) Chris Hughes, and Singularity University's Peter Diamandis. They proposed a universal basic income as a solution to the job losses and social conflict that would be wrought by automation and artificial intelligence--the very technologies their own companies create.
It's amazing that the Star Wars juggernaut (enabled by the Disney industrial complex) has managed to ringfence one entire day each year to peddle new shows, movies, toys and the rest. If you survived yesterday without seeing Gandalf doing the Spock salute with some white text saying May The Fourth Be With You, you're living a better life than me. Sneering aside, I got something out of May 4th -- the briefest glimpse of a 'working' lightsaber that extends and retracts a blade of what looks like light. The device definitely looks far more expensive than my double-edged Darth Maul saber from 1999, and there doesn't appear to be a plastic tube in sight. Patents unearthed after Disney showed off the saber suggest the blade is composed of LED-illuminated plastic, bright enough to obscure the fact it isn't actually a laser that could cut a robot in half.
Last week, the United States Senate played host to a number of social media company VPs during hearings on the potential dangers presented by algorithmic bias and amplification. While that meeting almost immediately broke down into a partisan circus of grandstanding and grievance-airing, Democratic senators did manage to focus a bit on how these recommendation algorithms might contribute to the spread of online misinformation and extremist ideologies. The issues and pitfalls presented by social algorithms are well-known and well-documented. So, really, what are we going to do about it? "So I think in order to answer that question, there's something critical that needs to happen: we need more independent researchers being able to analyze platforms and their behavior," Dr. Brandie Nonnecke, Director of the CITRIS Policy Lab at UC Berkeley, told Engadget. Social media companies "know that they need to be more transparent in what's happening on their platforms, but I'm of the firm belief that, in order for that transparency to be genuine, there needs to be collaboration between the platforms and independent peer-reviewed, empirical research."
Apple's AirTag item trackers are reasonably priced, petite and already rather useful. They do, however, demand some sort of extra peripheral to keep them attached to things. Yes, there are official (and cheaper, unofficial) keychains and the like, but the lack of a built-in keyring hole is a frustration. So can you just do it yourself and drill a hole? Apparently so: you can drill a hole in an AirTag -- just don't forget to remove the battery first.
Each year, researchers from around the world gather at Neural Information Processing Systems, an artificial-intelligence conference, to discuss automated translation software, self-driving cars, and abstract mathematical questions. It was odd, therefore, when Michael Levin, a developmental biologist at Tufts University, gave a presentation at the 2018 conference, which was held in Montreal. Fifty-one, with light-green eyes and a dark beard that lend him a mischievous air, Levin studies how bodies grow, heal, and, in some cases, regenerate. He waited onstage while one of Facebook's A.I. researchers introduced him, to a packed exhibition hall, as a specialist in "computation in the medium of living systems." Levin began his talk, and a drawing of a worm appeared on the screen behind him.
Robots are not going to cut hair or perform other salon services any time soon. Those tasks require human judgment and intuition, and there's a bond of trust that develops between a stylist and their customer. However, technology can still transform a salon by streamlining processes, adding automation, and generating more business. By using technology solutions, including various cloud-based platforms, salon owners, managers, and stylists can all benefit by staying busier and developing long-term relationships with loyal customers. As the country starts reopening and decimated salons dive back into business, they'll also need technology to develop competitive advantages as demand for their services grows.
In a new series of experiments, artificial intelligence (A.I.) algorithms were able to influence people's preferences for fictitious political candidates or potential romantic partners, depending on whether recommendations were explicit or covert. Ujué Agudo and Helena Matute of Universidad de Deusto in Bilbao, Spain, present these findings in the open-access journal PLOS ONE on April 21, 2021. From Facebook to Google search results, many people encounter A.I. algorithms every day. Private companies conduct extensive research on their users' data, generating insights into human behavior that are not publicly available. Academic social science research lags behind private research, and public knowledge of how A.I. algorithms might shape people's decisions is lacking. To shed light on this question, Agudo and Matute conducted a series of experiments that tested the influence of A.I. algorithms in different contexts.
A graph is a data structure consisting of a finite number of nodes (or vertices) and the edges that connect them. In a typical diagram, the numbered circles are the nodes, with the lines connecting them being the edges. A pair (0, 1) represents an edge that connects nodes 0 and 1. Graphs are used to represent and solve many real-life problems. For example, they can represent any network, such as a social network like Facebook or LinkedIn, or a road network like Google Maps. In graph search, we traverse or search the graph data structure from node to node. The easiest example to understand is that of navigation maps like Google Maps.
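To make this concrete, here is a minimal sketch in Python of a graph stored as an adjacency list, traversed with breadth-first search (one common graph-search strategy). The node numbers and edges are invented for illustration:

```python
from collections import deque

# Adjacency-list representation of a small undirected graph.
# The pair (0, 1) from the text appears here as 1 in graph[0] and 0 in graph[1].
graph = {
    0: [1, 2],
    1: [0, 3],
    2: [0, 3],
    3: [1, 2, 4],
    4: [3],
}

def bfs(graph, start):
    """Breadth-first search: visit nodes in order of distance from start."""
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)
    return visited

print(bfs(graph, 0))  # → [0, 1, 2, 3, 4]
```

A navigation app works on the same principle at a vastly larger scale: intersections are nodes, roads are edges, and the search explores neighbors outward from your starting point (usually with a weighted variant like Dijkstra's algorithm rather than plain BFS).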
Machine learning is capable of doing all sorts of things as long as you have the data to teach it how. That's not always easy, and researchers are always looking for a way to add a bit of "common sense" to AI so you don't have to show it 500 pictures of a cat before it gets it. Facebook's newest research takes a big step towards reducing the data bottleneck. The company's formidable AI research division has been working on how to advance and scale things like advanced computer vision algorithms for years now, and has made steady progress, generally shared with the rest of the research community. One interesting development Facebook has pursued in particular is what's called "semi-supervised learning."
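The article names semi-supervised learning without showing it. The sketch below is not Facebook's actual method; it is a generic toy illustration of one semi-supervised technique, pseudo-labeling, in which a model trained on a small labeled set assigns labels to unlabeled data and folds them back into training. The data and the 1-nearest-neighbour "model" are invented for the example:

```python
# Toy pseudo-labeling: a simple 1-D, 1-nearest-neighbour "model"
# labels unlabeled points, which then join the labeled set.

def nearest_label(point, labeled):
    """Return the label of the closest labeled point (a stand-in model)."""
    return min(labeled, key=lambda item: abs(item[0] - point))[1]

# Tiny dataset: a few labeled points, several unlabeled ones.
labeled = [(0.0, "cat"), (10.0, "dog")]
unlabeled = [1.0, 2.0, 8.5, 9.0]

# Pseudo-labeling step: predict a label for each unlabeled point
# and add the (point, predicted_label) pair to the training set.
for point in unlabeled:
    labeled.append((point, nearest_label(point, labeled)))

print([label for _, label in labeled[-4:]])  # → ['cat', 'cat', 'dog', 'dog']
```

The appeal, as the article notes, is reducing the data bottleneck: a handful of genuine labels can be stretched across a much larger pool of unlabeled examples, at the risk of the model reinforcing its own early mistakes.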
Australia's eSafety Commissioner is set to receive sweeping new powers, such as the ability to order the removal of material that seriously harms adults, with the looming passage of the Online Safety Act. Tech firms, as well as experts and civil liberties groups, have taken issue with the Act, citing among other things its rushed passage, the harm it could cause to the adult industry, and the overbearing powers it affords eSafety. Current eSafety Commissioner Julie Inman Grant has even previously admitted that details of how the measures legislated in the Online Safety Bill 2021 would be overseen are still being worked out. The Bill contains six priority areas, including an adult cyber abuse scheme to remove material that seriously harms adults; an image-based abuse scheme to remove intimate images that have been shared without consent; Basic Online Safety Expectations (BOSE) for the eSafety Commissioner to hold services accountable; and an online content scheme for the removal of "harmful" material through take-down powers. Appearing before the Parliamentary Joint Committee on Intelligence and Security as part of its inquiry into extremist movements and radicalism in Australia, Inman Grant said that while the threshold for the new take-down powers is quite high, they will give her agency a fair amount of leeway to look at intersectional factors, such as the intent behind a post.