The two companies have butted heads for years, and they'll likely continue to do so. Spotify's protest web page, which details accusations that Apple engages in anticompetitive behavior, is just one example of hurt feelings. But despite the mutual dislike, Apple and Spotify are reportedly in talks to integrate Spotify more tightly with Siri, Apple's digital assistant. The companies are "discussing a plan" that would let iPhone users ask Siri to play music through Spotify, rather than requiring them to navigate manually to whatever song, album, or playlist they want to hear in the third-party app. The Information's report on this handy potential change cites three anonymous sources who are "familiar with the discussions." Neither company confirmed the report when contacted by Fast Company.
On paper, it's a great time to be on a dating app. In the seven years since Tinder's entrance onto the dating scene in 2012, it has gone from fringe novelty to romantic ubiquity; within two years of launching, it was seeing 1bn swipes a day. Other apps have similarly impressive stats: in 2018, Bumble's global brand director revealed it had more than 26 million users and a confirmed 20,000 marriages. It's a far cry from the considerably less optimistic response Tinder received when it launched. Many hailed it as the end of romance itself.
When it comes to pure cord-cutting TVs, Amazon's Fire TV Edition broke new ground in 2018. It was low-priced and aimed at folks who were happy to ditch cable, plug in an antenna, and use the set to watch internet programming. Vizio's new V436-61, just out, goes even further. It does all of that, and more: instead of limiting voice commands to the Amazon Alexa assistant, Vizio lets you use Apple's Siri and the Google Assistant as well.
Now that it's upending the way you play music, cook, shop, hear the news and check the weather, the friendly voice emanating from your Amazon Alexa-enabled smart speaker is poised to wriggle its way into all things health care. Amazon has big ambitions for its devices. It thinks Alexa, the virtual assistant inside them, could help doctors diagnose mental illness, autism, concussions and Parkinson's disease. It even hopes Alexa will detect when you're having a heart attack. At present, Alexa can perform a handful of health care-related tasks: "She" can track blood glucose levels, describe symptoms, access post-surgical care instructions, monitor home prescription deliveries and make same-day appointments at the nearest urgent care center.
With a new feature, Tinder says it wants to make the swiping experience safer for its LGBTQ users traveling to or living in certain countries. On Wednesday, the dating app introduced a safety update dubbed "Traveler Alert" that will warn users who have identified themselves as lesbian, gay, bisexual, transgender and/or queer when they enter a country that could criminalize them for being out. The app plans to use device location to determine whether a user's safety may be at risk; affected users can opt to hide their profile during their stay or keep it public. The caveat: if a user keeps their profile public, their sexual orientation or gender identity will not be displayed on the app until they return to a location where it is deemed safe to disclose. In the statement, Tinder says it developed the feature so that users "can take extra caution and do not unknowingly place themselves in danger for simply being themselves."
Google has reportedly admitted that its employees listen to private recordings of customer conversations captured via Google Assistant. Moreover, employees are able to access conversations that were never meant to be recorded. A leak of 1,000 private Dutch-language conversations to a Belgian news site by some of Google's partners further showed that third-party contractors working for Google could also access sensitive user conversations, many of which were reportedly recorded unintentionally. Normally, users with Google Assistant on their phones and smart speakers must say "OK, Google" to start a conversation with the AI-powered virtual assistant. But various personal and sensitive conversations were recorded even when users hadn't called up the assistant.
What do Russian trolls, Facebook, and US elections have to do with machine learning? Recommendation engines are at the heart of the central feedback loop of social networks and the user-generated content (UGC) they create. Users join the network and are recommended users and content with which to engage. Recommendation engines can be gamed because they amplify the effects of filter bubbles. The 2016 US presidential election showed how important it is to understand how recommendation engines work and the limitations and strengths they offer.
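The feedback loop described above can be illustrated with a toy simulation (hypothetical code, not drawn from any real platform): a recommender that surfaces items in proportion to their past engagement lets small early leads compound, which is exactly the rich-get-richer property that coordinated actors can exploit by seeding early engagement.

```python
import random
from collections import Counter

def simulate_feedback_loop(n_items=5, n_rounds=1000, seed=0):
    """Toy popularity-weighted recommender.

    Each round, one item is shown with probability proportional to its
    accumulated engagement, and being shown earns it more engagement.
    Early random leads therefore compound over time.
    """
    rng = random.Random(seed)
    # Every item starts with one unit of engagement (a level playing field).
    engagement = Counter({i: 1 for i in range(n_items)})
    for _ in range(n_rounds):
        items, weights = zip(*engagement.items())
        # Recommend in proportion to past engagement: the feedback loop.
        shown = rng.choices(items, weights=weights, k=1)[0]
        engagement[shown] += 1  # each impression adds engagement
    return engagement
```

Running this repeatedly with different seeds shows that which item "wins" is largely an accident of early randomness, yet the winner typically ends up with far more than its fair share of impressions. That sensitivity to initial engagement is what makes such loops gameable.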
Summer romance is in the air, and the special someone you just met on an online dating site or on social media seems too good to be true. The sad truth is that they just might be. In fact, your would-be dreamboat could be a "catfisher." Some states carry a higher risk than others, it seems. HighSpeedInternet.com has issued a new report, "When Love Bites," in which the internet service provider comparison website identified the states where you are most likely to fall prey to these scammers.
"Do you believe in magic?" Google asked attendees of its annual developer conference this May, playing the seminal Lovin' Spoonful tune as an introduction. Throughout the three-day event, company executives repeatedly answered yes while touting new features of the Google Assistant, the company's version of Alexa or Siri, that can indeed feel magical. The tool can book you a rental car, tell you what the weather is like at your mother's house, and even interpret live conversations across 26 languages. But to some of the Google employees responsible for making the Assistant work, the tagline of the conference – "Keep making magic" – obscured a more mundane reality: the technical wizardry relies on massive data sets built by subcontracted human workers earning low wages.
Assigning female genders to digital assistants such as Apple's Siri and Amazon's Alexa is helping entrench harmful gender biases, according to a UN agency. Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," the report said. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."