All eyes are on Facebook as more and more information rolls out regarding Cambridge Analytica, its involvement in recent elections and referendums, and how it came to obtain 50 million Facebook users' profile information. Now, New York Attorney General Eric Schneiderman is joining those demanding more information from the social network giant. "Consumers have a right to know how their information is used -- and companies like Facebook have a fundamental responsibility to protect their users' personal information," Schneiderman said in a statement. "Today, along with Massachusetts Attorney General Healey, we sent a demand letter to Facebook -- the first step in our joint investigation to get to the bottom of what happened."
It's still unclear to what degree Uber's vehicle was responsible for the tragedy. Tempe's police chief has said she doesn't believe Uber is at fault, though the department isn't responsible for determining fault in a crash. The incident came not long after Toyota and Uber formed a partnership, and just days after a report claimed that Uber was hoping to sell its self-driving tech to Toyota. This isn't likely to put the deal in jeopardy, but it may lead both companies to act more cautiously to minimize the chances of a repeat incident.
The vehicle did have a human operator in the car, but it was in autonomous mode at the time. The driver, Rafaela Vasquez, said that "it was like a flash" when the person abruptly stepped out from a center median in front of the car. "His first alert to the collision was the sound of the collision," Moir told the San Francisco Chronicle. The vehicle was traveling 38 mph in a 35 mph zone. The pedestrian did not appear to be using a crosswalk, though the street's design apparently made that section look as if it invited people to cross.
"Our team is uncovering data breaches every day because of one simple fact: Organizations don't know what data they have, who has it and where it exists," Mike Baukes, co-founder and co-CEO of UpGuard, said in a statement. "BreachSight will allow customers to regain control of their private data by providing total visibility over their digital footprint." BreachSight will automatically perform searches based on relevant keywords provided by those using the service, scanning the places where UpGuard regularly finds exposed data -- such as Amazon S3 servers. It will then alert companies when data they control -- or data managed by their partners, suppliers and other connected vendors -- is found exposed online. BreachSight is meant to complement UpGuard's CyberRisk service, which provides companies with continuous risk assessments of both their own security practices and those of connected vendors.
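UpGuard hasn't published how BreachSight works internally, but the keyword-matching idea it describes -- scan known exposure points, flag records that mention a customer's keywords -- can be illustrated with a small sketch. All names, bucket paths and data below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    location: str   # e.g. a publicly readable S3 bucket (hypothetical)
    snippet: str    # the record that matched a customer keyword

def scan_for_exposure(keywords, exposed_records):
    """Flag records from known-exposed locations that mention a keyword.

    `exposed_records` maps a location (such as an open S3 bucket) to the
    text records found there; a real scanner would crawl these sources
    continuously rather than take them as input.
    """
    findings = []
    for location, records in exposed_records.items():
        for record in records:
            # Case-insensitive match against every registered keyword.
            if any(kw.lower() in record.lower() for kw in keywords):
                findings.append(Finding(location, record))
    return findings

# Example: a customer registers "Acme Corp" as a keyword of interest.
hits = scan_for_exposure(
    ["Acme Corp"],
    {"s3://open-bucket-1": ["invoice: ACME CORP, $1,200", "unrelated row"],
     "s3://open-bucket-2": ["vendor list: Initech"]},
)
# hits now contains the one record from open-bucket-1 that mentions Acme.
```

The interesting part of such a service is less the matching than the crawling and the vendor-graph bookkeeping -- knowing which exposed locations belong to which partner -- which the sketch above deliberately leaves out.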
One of IBM's first partners, Harman, will demonstrate Watson Assistant at the event through a digital cockpit aboard a Maserati GranCabrio, though the companies didn't elaborate on what it can do. This isn't IBM's first assistant, either: it already released a Watson-powered voice assistant for cybersecurity early last year. You'll be able to access Watson Assistant via text or voice, depending on the device and how IBM's partner decides to incorporate it. So, you'll definitely be using voice if it's a smart speaker, but you might be able to text commands to a home device. Speaking of commands, it wasn't just designed to follow them -- it was designed to learn from your actions and remember your preferences.
Sunday evening, one of Uber's autonomous SUVs struck a woman who later died at the hospital as a result of her injuries. It appears to be the first time a pedestrian has died after a collision with an autonomous vehicle, and as a result, Uber has temporarily suspended all its testing. While an investigation is ongoing, the Tempe police chief provided an update based on video from the car itself, and said "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Still, if there's a chance for widespread adoption of self-driving car technology, the amount of transparency Uber displays right now will be important.
As a result of this incident, Uber has stopped all self-driving vehicle tests in San Francisco, Pittsburgh, Toronto and the greater Phoenix area. "Our hearts go out to the victim's family. We are fully cooperating with local authorities in their investigation of this incident," Uber said in a statement. CEO Dara Khosrowshahi echoed the sentiment on Twitter: "We're thinking of the victim's family as we work with local law enforcement to understand what happened."
The company recently unveiled two new models: the ADA GH5 (not the camera) and the ADA Mini, pictured above. With its, er, friendly face, the Mini is designed to serve drinks, take orders, walk, speak, detect obstacles and move its head across two axes. The robots can work in shopping malls, hospitals, airports and other businesses to help tourists, patients, visitors and others. The ADA Mini, for instance, will be deployed at an Istanbul airport to provide visitors with check-in information, directions and more. The ADA GH5 can perform similar chores and even dance, but with a decidedly more Cylon vibe.
Uber is putting all of its self-driving vehicle tests on hold after one of its cars struck and killed a pedestrian in Tempe, Arizona Sunday evening. According to ABC affiliate KNXV, the car had a human operator behind the wheel but was in autonomous mode. A woman walking in a crosswalk was struck by the car, and she later died at the hospital from the injuries she sustained. Uber says that it is working with the local authorities.
It's a rendering technique based on how we see the world. In nature, light photons bounce off of various objects before they hit your eyes. Because simulating every photon in real time is a little ambitious, ray tracing reverses that process, sending millions of rays outward from the camera toward the scene in front of it. When a ray hits something, the renderer checks the object's properties as well as the light sources around it to calculate the exact color that pixel on your screen should be. There's more to it, of course, but no one wants to read about algorithms on a Monday morning.
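For the curious, the per-pixel idea above can be sketched in a few dozen lines. The scene below (one sphere, one point light, a fixed camera) is an invented example, and the shading is a single Lambertian term rather than the full recursive bounces a real ray tracer performs:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    assuming `direction` is a unit vector (so the quadratic's a == 1).
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(x, y, width=64, height=64):
    """Shade one pixel: 0.0 (background) up to 1.0 (fully lit sphere)."""
    camera = (0.0, 0.0, 0.0)                 # camera looks down -z
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light = (2.0, 2.0, 0.0)                  # single point light

    # Map the pixel to a unit direction through an image plane at z = -1.
    u = 2 * (x + 0.5) / width - 1
    v = 1 - 2 * (y + 0.5) / height
    norm = math.sqrt(u * u + v * v + 1)
    direction = (u / norm, v / norm, -1 / norm)

    t = ray_sphere_hit(camera, direction, sphere_center, sphere_radius)
    if t is None:
        return 0.0
    hit = tuple(camera[i] + t * direction[i] for i in range(3))
    normal = tuple((hit[i] - sphere_center[i]) / sphere_radius
                   for i in range(3))
    to_light = tuple(light[i] - hit[i] for i in range(3))
    ln = math.sqrt(sum(c * c for c in to_light))
    to_light = tuple(c / ln for c in to_light)
    # Lambertian shading: brightness ~ cosine of the angle to the light.
    return max(0.0, sum(normal[i] * to_light[i] for i in range(3)))
```

Calling `trace(x, y)` for every pixel produces a shaded disc on a black background; the "more to it" the article alludes to -- shadow rays, reflections, refraction -- comes from recursively firing new rays from each hit point.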