Robot re-creates the gait of 290-million-year-old creature based on fossil find in Germany

The Japan Times

WASHINGTON - How did the earliest land animals move? Scientists have used a nearly 300-million-year-old fossil skeleton and preserved ancient footprints to create a moving robot model of prehistoric life. Evolutionary biologist John Nyakatura at Humboldt University in Berlin has spent years studying a 290-million-year-old fossil dug up in central Germany's Bromacker quarry in 2000. The four-legged plant-eater lived before the dinosaurs and fascinates scientists "because of its position on the tree of life," said Nyakatura. Researchers believe the creature is a "stem amniote" -- an early land-dwelling animal that later evolved into modern mammals, birds and reptiles.

MIT lets AI "synthesize" computer programs to aid data scientists


How to make artificial intelligence more approachable for ordinary mortals -- that is, people who are neither programmers nor IT admins nor machine learning scientists -- is a topic very much in vogue these days. One approach is to abstract away the complexity by moving it into cloud computing operations, as proposed by Petuum, an AI startup recently profiled by ZDNet that aims to "industrialize" AI. Another approach, presented this week by MIT, is to make machine learning do more of the work itself, inventing its own programs to crunch data in specific applications such as time series analysis. This is a hot area of AI in its own right: having machines build the models that in turn perform the induction of answers from data. The researchers describe a way to automate the creation of programs that infer patterns in data, which means that a data scientist doesn't need to figure out the "model" that fits the data being studied.
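The core idea -- automatically searching over candidate models so the data scientist doesn't have to pick one by hand -- can be illustrated with a toy sketch. This is not MIT's actual system, which is far more sophisticated; every function name here (`fit_constant`, `fit_linear`, `synthesize_model`) is hypothetical.

```python
def fit_constant(ys):
    """Candidate model 1: y = c (the series mean)."""
    c = sum(ys) / len(ys)
    return lambda x: c

def fit_linear(ys):
    """Candidate model 2: y = a*x + b, via least squares on indices 0..n-1."""
    n = len(ys)
    xs = list(range(n))
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def synthesize_model(ys):
    """Fit every candidate and return the name and model with lowest squared error."""
    candidates = [("constant", fit_constant), ("linear", fit_linear)]
    best = None
    for name, fitter in candidates:
        model = fitter(ys)
        err = sum((model(x) - y) ** 2 for x, y in enumerate(ys))
        if best is None or err < best[1]:
            best = (name, err, model)
    return best[0], best[2]

name, model = synthesize_model([1.0, 3.0, 5.0, 7.0])  # a clean linear trend
print(name)  # "linear": the linear candidate fits this series exactly
```

A real system would search a much richer space of program structures (seasonality, noise models, compositions), but the selection loop is the same shape.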

Up close with Mars: NASA's InSight lander reveals its seismometer is 'crouched' to hear sounds

Daily Mail

NASA's InSight lander is leaning in for a better listen to Mars' underground tremors. The robotic explorer placed its seismometer on the surface at the end of last month, and is now getting even closer "for a better connection with Mars." This will help its instruments pick up fainter signals that might otherwise be missed. Before-and-after images show the instrument at its lowest position yet. Days prior, InSight leveled out its seismometer and adjusted the internal sensors ahead of lowering everything down toward the ground.

Toronto's SickKids announces first-of-its-kind artificial intelligence position


Dr. Anna Goldenberg, senior scientist in genetics and genome biology at SickKids, poses at the Peter Gilgan Centre for Research and Learning in Toronto. Inside the pediatric intensive care unit at Toronto's Hospital for Sick Children, an infant recovering from open-heart surgery is barely visible through the forest of whizzing and beeping machines that monitor his every vital sign. In the old days, those vital signs – a baby's heart rate, blood pressure, oxygen levels and other signals – would have flashed across a screen and then been lost to posterity. But in 2013, SickKids began collecting and storing the data that emanate from patients in their 42 intensive-care beds. The unit now has more than two trillion data points in its virtual vault, far more than a mere mortal could make sense of.

New app gives throat cancer patients their voices back

The Japan Times

PRAGUE - Vlastimil Gular's life took an unwelcome turn a year ago: minor surgery on his vocal cords revealed throat cancer, which led to the loss of his larynx -- and with it, his voice. But the 51-year-old father of four is still chatting away using his own voice rather than the tinny timbre of a robot, thanks to an innovative app developed by two Czech universities. "I find this very useful," Gular said, using the app to type in what he wanted to say, in his own voice, via a mobile phone. "I'm not very good at using the voice prosthesis," he added, pointing at the hole the size of a large coin in his throat. A small silicone device implanted in the throat allows people to speak: pressing a finger over the hole regulates airflow through the prosthesis and so creates sound.

I used facial recognition technology on birds


As a birder, I had heard that if you paid careful attention to the head feathers on the downy woodpeckers that visited your bird feeders, you could begin to recognize individual birds. I even went so far as to try sketching birds at my own feeders and had found this to be true, up to a point. In the meantime, in my day job as a computer scientist, I knew that other researchers had used machine learning techniques to recognize individual faces in digital images with a high degree of accuracy. These projects got me thinking about ways to combine my hobby with my day job. Would it be possible to apply those techniques to identify individual birds?
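The general recipe behind such projects is to reduce each photo to a feature vector and match new sightings against known individuals. The author's actual pipeline used deep-learning image recognition; the sketch below is only a toy illustration of the matching step, with hypothetical hand-crafted features (say, head-patch measurements) and a simple nearest-neighbor lookup.

```python
import math

# Hypothetical enrollment set: feature vectors for known individual woodpeckers.
known_birds = {
    "downy_A": [0.82, 0.31, 0.55],
    "downy_B": [0.40, 0.75, 0.20],
    "downy_C": [0.10, 0.22, 0.90],
}

def identify(features):
    """Return the known individual whose feature vector is nearest (Euclidean distance)."""
    def dist(v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, v)))
    return min(known_birds, key=lambda name: dist(known_birds[name]))

print(identify([0.80, 0.35, 0.50]))  # nearest to downy_A's enrolled vector
```

In a real system the feature vectors would come from a trained convolutional network rather than manual measurement, but the identify-by-nearest-match idea is the same.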

Scientists are building a quantum computer that "acts like a brain"


A new research project aims to harness the power of quantum computers to build a new type of neural network -- work the researchers say could usher in the next generation of artificial intelligence. "My colleagues and I instead hope to build the first dedicated neural network computer, using the latest 'quantum' technology rather than AI software," wrote Michael Hartmann, a professor at Heriot-Watt University who's leading the research, in a new essay for The Conversation. "By combining these two branches of computing, we hope to produce a breakthrough which leads to AI that operates at unprecedented speed, automatically making very complex decisions in a very short time." A neural network is a type of machine learning algorithm loosely modeled on a biological brain, which learns from examples in order to deal with new inputs. Quantum computers take advantage of subatomic particles that can exist in more than one state at a time to circumvent the limitations of old-fashioned binary computers.
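The "learns from examples" idea can be made concrete with the smallest possible case: a single artificial neuron trained by gradient descent to reproduce logical OR. This is a minimal classical sketch for illustration only; real networks stack many such units, and the quantum hardware the essay describes is still at the research stage.

```python
import math

def sigmoid(z):
    """Squash a weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Training examples: inputs and target outputs for logical OR.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w1 = w2 = b = 0.0  # the neuron's adjustable parameters
lr = 1.0           # learning rate
for _ in range(5000):
    for (x1, x2), t in data:
        y = sigmoid(w1 * x1 + w2 * x2 + b)
        err = y - t
        # Gradient of the squared error w.r.t. each parameter (chain rule).
        g = err * y * (1 - y)
        w1 -= lr * g * x1
        w2 -= lr * g * x2
        b -= lr * g

predictions = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(predictions)  # expect [0, 1, 1, 1] -- the neuron has learned OR
```

Repeated small corrections against known examples are the whole trick; deep networks simply apply it to millions of parameters at once.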

Artificial intelligence used to detect women with deadliest ovarian cancer


The deadliest forms of ovarian cancer have been detected using AI, offering hope for more precise life-saving treatments. Scientists have created a tool that seeks out clusters of tumour cells with unusually shaped nuclei -- the control centres at the heart of each cell. More than 7,000 new cases of ovarian cancer are diagnosed every year in the UK, and it has one of the worst survival rates of all cancers. The overall five-year survival rate is 53 per cent, but this falls to just 15 per cent for patients with these misshapen cells. The oddly shaped nuclei appear to be an indication that the cell's DNA has become unstable, but conventional tests can easily overlook them.
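The published tool uses much richer image analysis, but one simple, standard way to quantify how "unusually shaped" a nucleus outline is uses circularity, 4*pi*A/P^2, which equals 1.0 for a perfect circle and drops toward 0 as the outline distorts. The sketch below is illustrative only; the threshold value is a hypothetical cutoff, not one from the study.

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for irregular outlines."""
    return 4 * math.pi * area / perimeter ** 2

def is_misshapen(area, perimeter, threshold=0.8):
    """Flag nuclei whose outline deviates strongly from a circle (hypothetical cutoff)."""
    return circularity(area, perimeter) < threshold

# A circle of radius 5 vs. a long thin outline enclosing the same area.
print(is_misshapen(math.pi * 25, 2 * math.pi * 5))  # False: perfectly round
print(is_misshapen(math.pi * 25, 60.0))             # True: same area, much longer perimeter
```

In practice the areas and perimeters would be measured automatically from segmented microscope images, and a classifier would combine many such shape features rather than one threshold.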

Computing at Light Speed: The World's First Photonic Neural Network Has Arrived


As developments are made in neural computing, we can continue to push artificial intelligence further. A fairly recent technology, neural networks have been taking over the world of data processing, giving machines advanced capabilities such as object recognition, face recognition, natural language processing, and machine translation. These sound like simple things, but they were far out of reach for processors until scientists began to find ways to make machines behave more like human brains in how they learned and handled data. To do this, scientists have been focusing on building neuromorphic chips, circuits that operate in a similar fashion to neurons. Now, a team at Princeton University has found a way to build a neuromorphic chip that uses light to mimic neurons in the brain, and their study has been detailed in a paper posted to Cornell University Library's arXiv.

Scientists are using machine learning to unlock the mysteries of long-dead languages


Although cuneiform passed to other Mesopotamian cultures, which refined and altered it to suit their own languages and dialects, knowledge of how to read and write the various cuneiform scripts was gradually lost to time. In the 19th century, translators managed to decipher the writing system, and in 1872 the Assyriologist George Smith translated the most famous example of cuneiform, the Epic of Gilgamesh, a 4,000-year-old poem widely believed to be the earliest surviving great work of literature. Unfortunately, translation of cuneiform tablets is still a time-consuming process, and there are very few modern scholars able to decipher them. Sumerian is what is known as a "language isolate," one that has no genealogical relationship to any other language spoken today. But modern technology has given researchers new hope of unravelling the script imprinted on the roughly 300,000 cuneiform tablets discovered to date, of which only around 10% have been translated so far.