Water & Waste Management

New Jersey-Size 'Dead Zone' Is Largest Ever in Gulf of Mexico

National Geographic News

This year's large size is mainly due to heavy stream flows in May, Rabalais continued, which were about 34 percent above the long-term average and carried higher-than-average amounts of nutrients through Midwest waterways and into the Gulf. In its action plan for the Gulf of Mexico hypoxic zone, the Mississippi River/Gulf of Mexico Hypoxia Task Force recently extended the deadline until 2035 for achieving the goal of a 1,950-square-mile dead zone, which would be roughly the size of Delaware. Shrinking the annual Gulf of Mexico dead zone down to that size, however, will require a much higher 59 percent reduction in the amount of nitrogen runoff that flows down the Mississippi River, according to a study published in Proceedings of the National Academy of Sciences. "The bottom line is that we will never reach the action plan's goal of 1,950 square miles until more serious actions are taken to reduce the loss of Midwest fertilizers into the Mississippi River system," says University of Michigan aquatic ecologist Don Scavia, lead author of the paper.

Finding leaks while they're easy to fix

MIT News

Monterrey itself has a strong incentive to take part in this study, since it loses an estimated 40 percent of its water supply to leaks every year, costing the city about $80 million in lost revenue. Saudi Arabia faces a similar problem, which is why that desert nation's King Fahd University of Petroleum and Minerals has sponsored and collaborated on much of the MIT team's work, including successful field tests there earlier this year that resulted in some further design improvements to the system, Youcef-Toumi says. "Currently there is not an effective tool to locate leaks in those plastic pipes, and MIT PipeGuard's robot is the disruptive change we have been looking for." The MIT system was actually first developed to detect gas leaks, and was later adapted for water pipes.

Population increase and the smart city


One thing is certain when urbanization occurs at this rate: the strain on public services and resources increases rapidly. By 2100 the global population is expected to reach 11 billion people, but we should see this as an exciting opportunity to use the Internet of Things in building smart cities. Smart waste management applies the Internet of Things to rapidly improve efficiency. The growth of smart cities should only accelerate over the coming years. Their potential is vast, and although they are expensive to plan and implement initially, they will benefit residents by lowering the cost of living and improving health and quality of life.

Hazardous waste identified and sorted using simple barcodes

New Scientist

Because many processing facilities can't quickly identify the chemicals in this household waste, the items are often simply lumped together and incinerated – which is expensive. Their start-up, Smarter Sorting, has installed a barcode scanning system at four waste disposal sites in the US used by the public – in Austin, Texas; Salt Lake City, Utah; Portland, Oregon; and Mesa County, Colorado. "The machine goes 'beep' and at that point the screen simply tells the worker, 'this is where you should place this item'," says Chris Ripley, who co-founded Smarter Sorting together with Charlie Vallely. Also testing the technology is Hope Petrie, hazardous materials manager at Mesa County Hazardous Waste Collection Facility, although she isn't yet using it to alter the way large numbers of items are processed.
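At its core, the workflow described here is a lookup from a scanned product barcode to a disposal instruction. A minimal sketch of that idea, with invented barcodes and categories (not Smarter Sorting's actual data or system), might look like this:

```python
# Hypothetical barcode-to-disposal lookup. The codes and categories below are
# invented for illustration; a real system would query a product database.

DISPOSAL_RULES = {
    "0012345678905": "flammables",   # e.g. aerosol paint
    "0049000012347": "corrosives",   # e.g. drain cleaner
    "0078915550123": "latex paint",  # recyclable at some facilities
}

def sort_item(barcode: str) -> str:
    """Return the bin a scanned item belongs in, or flag it for manual review."""
    return DISPOSAL_RULES.get(barcode, "manual review")

print(sort_item("0012345678905"))  # flammables
print(sort_item("9999999999999"))  # manual review
```

The unknown-barcode fallback matters in practice: items the system can't identify still end up in front of a worker rather than being mis-binned.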

How Machine Learning Helps Identify Toxicity In Potential Drugs


The team believe that being able to determine the atomic structure of protein molecules will play a huge role in understanding how they work, and how they may respond to drug therapies. The drugs typically work by binding to a protein molecule, and then changing its shape and thus altering how it works.

Less Than 10% of Bovine E. coli Strains Affect Human Health


Using software to compare genetic information in bacterial isolates from animals and people, researchers have predicted that less than 10% of Escherichia coli O157:H7 strains are likely to have the potential to cause human disease. In this study, the researchers applied machine learning to predict the zoonotic potential of bacterial isolates from the United Kingdom and the United States. "[N]one of the cattle isolates (apart from outbreak trace-back isolates) achieved very high human association probabilities (>0.9), potentially indicating that those posing a serious zoonotic threat are very rare," the authors write. As a consequence, experts could use targeted control strategies, including vaccination or eradication, in cattle carrying strains of high zoonotic potential, in order to better protect human health.
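The approach described can be caricatured as: represent each isolate by the presence or absence of genetic markers, train on isolates of known origin, and score new isolates for human association. The toy sketch below uses invented data and a simple nearest-neighbour vote in place of the authors' actual model, purely to illustrate the shape of the computation:

```python
# Toy illustration of scoring isolates for zoonotic potential.
# Data, marker vectors, and the model are all invented stand-ins,
# not the study's pipeline.

from collections import Counter

# Training set: (marker presence/absence vector, source of isolate).
TRAIN = [
    ((1, 1, 0, 1), "human"),
    ((1, 0, 1, 1), "human"),
    ((0, 1, 1, 0), "cattle"),
    ((0, 0, 1, 0), "cattle"),
]

def human_association(isolate, train=TRAIN, k=3):
    """Score an isolate by the share of 'human' labels among its k nearest
    training isolates (Hamming distance) - a stand-in for a trained model."""
    dists = sorted(
        (sum(a != b for a, b in zip(isolate, vec)), label)
        for vec, label in train
    )
    labels = Counter(label for _, label in dists[:k])
    return labels["human"] / k

score = human_association((1, 1, 0, 0))
print(f"human-association probability: {score:.2f}")
# Isolates scoring above a high threshold (the paper cites 0.9) would be
# flagged as potential zoonotic threats.
```

The real study's value lies in the genomic features and the scale of the training data; the scoring-and-thresholding structure, though, is as simple as shown.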

Researchers Use Machine Learning to Detect Pathogenic Bacteria in Cattle


A team of researchers has found a new way to detect dangerous strains of bacteria, potentially preventing outbreaks of food poisoning. The team developed a method that utilizes machine learning – a form of artificial intelligence that allows computers to learn and analyze patterns – and tested it with isolates of Escherichia coli strains. If researchers can quickly identify cattle herds carrying dangerous strains of E. coli, those herds can be treated or isolated before an outbreak occurs.

The Roslin Institute (University of Edinburgh) - News


Machine learning can predict strains of bacteria likely to cause food poisoning outbreaks, research has found. The study – which focused on harmful strains of E. coli bacteria – could help public health officials to target interventions and reduce risk to human health. The team trained the software on DNA sequences from strains isolated from cattle herds and human infections in the UK and the US. The study highlights the potential of machine learning approaches for identifying these strains early and preventing outbreaks of this infectious disease.

DataOps, Monetization, and the Rise of the Data Broker: Questioning Authority with Tamr CEO Andy Palmer


In 2013, he and Stonebraker moved up the data value chain and founded Tamr, the Cambridge, MA-based software company aiming to provide a unified view of data in the modern enterprise. Those enterprise data sources should be managed and organized similarly to how search engines crawl and organize the modern world-wide web. From brokering that data internally to business units that want to run analytics, it's a logical extension to monetize data by selling it outside the company. TW: If I'm an enterprise customer, what criteria should I take into account as I look to adopt data integration, machine learning, and data self-service technologies?