Wedge and Hiretual Join Forces to Streamline Candidate Sourcing and Screening

Wedge, the video interviewing solution that helps recruiters make authentic connections with candidates, announced its new partnership with Hiretual, the industry-leading AI-powered talent data system. Matt Baxter, CEO of Wedge, shared, "There's a natural synergy between Wedge and Hiretual – we're both on a mission to remove inefficiencies and enhance the recruiting function overall. That's why we're so excited Hiretual is the first major player to join Wedge's newly established partner program. Our work together is going to benefit clients in a big way." Steven Jiang, CEO of Hiretual, shared, "Hiretual can now provide customers with richer screening capabilities enhanced with video interview recordings through our partnership with Wedge. Wedge is changing how companies think about video interviewing, right when they need it most. Likewise, talent acquisition is evolving rapidly, and the partnership between Hiretual and Wedge will further support that transformation."

Emerging Job Roles for Successful AI Teams - AI Trends

Many job descriptions across organizations will require at least some use of AI in the coming years, creating opportunities for the savvy to learn about AI and advance their careers regardless of discipline. New job titles have emerged, and more will follow, to help organizations execute on AI strategy. Machine learning engineers have cemented a leading role on the AI team, for example, taking first place among the best jobs listed on Indeed last year, according to a recent report in CIO. And AI specialist was the top job in LinkedIn's 2020 Emerging Jobs report, with 74% annual growth over the last four years, followed by robot engineer and data scientist.

That and There: Judging the Intent of Pointing Actions with Robotic Arms

Collaborative robotics requires effective communication between a robot and a human partner. This work proposes a set of interpretive principles for how a robotic arm can use pointing actions to communicate task information to people, extending existing models from the related literature. These principles are evaluated through studies in which English-speaking human subjects view animations of simulated robots instructing pick-and-place tasks. The evaluation distinguishes two classes of pointing actions that arise in pick-and-place tasks: referential pointing (identifying objects) and locating pointing (identifying locations). The study indicates that human subjects show greater flexibility in interpreting the intent of referential pointing compared to locating pointing, which needs to be more deliberate. The results also demonstrate the effects of variation in the environment and task context on the interpretation of pointing. Our corpus, experiments, and design principles advance models of context, common-sense reasoning, and communication in embodied settings.

Learning Model Bias

In this paper the problem of {\em learning} appropriate domain-specific bias is addressed. It is shown that this can be achieved by learning many related tasks from the same domain, and a theorem is given bounding the number of tasks that must be learnt. A corollary of the theorem is that if the tasks are known to possess a common {\em internal representation} or {\em preprocessing} then the number of examples required per task for good generalisation when learning $n$ tasks simultaneously scales like $O(a + \frac{b}{n})$, where $O(a)$ is a bound on the minimum number of examples required to learn a single task, and $O(a + b)$ is a bound on the number of examples required to learn each task independently. An experiment providing strong qualitative support for the theoretical results is reported.
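The $O(a + \frac{b}{n})$ scaling above can be made concrete with a quick numeric sketch. The constants `a` and `b` below are arbitrary illustrative values, not figures from the paper; the point is only how the per-task cost shrinks toward $a$ as more related tasks are learned together:

```python
# Illustrative sketch of the O(a + b/n) per-task sample bound from the abstract.
# a and b are made-up constants chosen purely for illustration.
a = 100   # examples tied to the shared internal representation / preprocessing
b = 900   # extra per-task cost when a task is learnt independently (a + b total)

for n in (1, 3, 10, 30):
    per_task = a + b / n   # bound on examples required per task
    total = n * per_task   # total examples across all n tasks
    print(f"n={n:2d}: ~{per_task:.0f} examples/task, ~{total:.0f} total")
```

With these numbers, a single task needs about 1000 examples, while each of 30 jointly learnt tasks needs only about 130, illustrating the benefit of learning related tasks simultaneously.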

Scientists create 'Baxter' the robot who can assist the elderly amid a shortage of nurses

Scientists have created a robot that may be able to help the elderly perform tasks amid a shortage of nurses in the UK. Named Baxter, it has two arms and 3D-printed 'fingers', allowing it to step in when a person is struggling with things such as getting dressed. Artificial intelligence allows the robot to detect when assistance is needed and learn about the owner's difficulties over time. When it's ready for use in healthcare settings, it could help free up staff time for other work. There are around 40,000 nurse vacancies in NHS England, a figure expected to double after Brexit.

How AI companies can avoid ethics washing

One of the essential phrases necessary to understand AI in 2019 has to be "ethics washing." Put simply, ethics washing -- also called "ethics theater" -- is the practice of fabricating or exaggerating a company's interest in equitable AI systems that work for everyone. A textbook example for tech giants is when a company promotes "AI for good" initiatives with one hand while selling surveillance capitalism tech to governments and corporate customers with the other. Accusations of ethics washing have been lobbed at the biggest AI companies in the world, as well as startups. The most high-profile example this year may have been Google's external AI ethics panel, which devolved into a PR nightmare and was disbanded after about a week.

There's still time to prevent biased AI from taking over the world

Mobile maps route us through traffic, algorithms can now pilot automobiles, virtual assistants help us smoothly toggle between work and life, and smart code is adept at surfacing our next favorite song. But AI could prove dangerous, too. Tesla CEO Elon Musk once warned that biased, unmonitored and unregulated AI could be the "greatest risk we face as a civilization." More immediately, AI experts are concerned that automated systems are likely to absorb bias from human programmers. And once bias is coded into the algorithms that power AI, it will be nearly impossible to remove.

Will Artificial Intelligence (AI) Steal Our Jobs? GetSmarter Blog

As artificial intelligence develops and disrupts more industries, working professionals are becoming increasingly concerned about its implications for the future of work. According to a Pew Research Center survey completed in 2017, 72% of Americans fear AI technology is capable of replacing jobs, with 25% feeling exceptionally worried. The jobs most at risk are predicted to be in science, healthcare, security, farming, construction, transport, and banking. While it's speculated AI will take over 1.8 million human jobs by the year 2020, the technology is also expected to create 2.3 million new kinds of jobs, many of which will involve collaboration between humans and AI. Research shows artificial intelligence is capable of performing several tasks better than humans in specific occupations, but it's not capable of performing all the tasks required for a job better than humans. In other words, most jobs will be affected by AI, but in such a way that a partnership is formed between humans and machines, a more powerful alliance than either working alone. What will this look like?

The Soundtrack to Space Exploration

After 15 years of diligently exploring the surface of Mars, the Opportunity rover finally succumbed to the elements and went offline Feb. 13. As obituaries and tributes to "Oppy" surfaced, fans caught a glimpse into the robot's final moments: the last picture it sent, its last words, the last-ditch attempts to revive it. Scientists wept as they said their final farewells. As employees swayed and embraced, mission control sent one final transmission to Oppy: Billie Holiday's 1944 recording of "I'll Be Seeing You." The muted, intimate timbre of Holiday's voice helped millions say goodbye to "the little robot who could": I'll find you in the morning sun, I'll be looking at the moon, But I'll be seeing you.

Video Friday: Package Delivery by Robot, and More

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. Using machine learning and sensory hardware, Alberto Rodriguez, assistant professor of mechanical engineering, and members of MIT's MCube lab have developed a robot that is learning how to play the game Jenga. The technology could be used in robots for manufacturing assembly lines.