
Collaborating Authors: tsang




How healthy am I? My immunome knows the score.

MIT Technology Review

How healthy am I? My immunome knows the score. Groundbreaking new tests reveal patterns in our immune systems that can signal underlying disease and tell us how well we might recover from our next cold. I got my results in a text message. It's not often you get a text about the robustness of your immune system, but that's what popped up on my phone last spring. Sent by John Tsang, an immunologist at Yale, the text came after his lab had put my blood through a mind-boggling array of newfangled tests. The result--think of it as a full-body, high-resolution CT scan of my immune system--would reveal more about the state of my health than any test I had ever taken. And it could potentially tell me far more than I wanted to know. "David," the text read, "you are the red dot." Tsang was referring to an image he had attached to the text that showed a graph with a scattering of black dots representing other people whose immune systems had been evaluated--and a lone red one.



A glimpse into the future of radiation therapy – Physics World

#artificialintelligence

Which innovations will have the greatest impact in radiotherapy by 2030? That was the question posed in the closing session of last week's ESTRO 2022 congress, and five experts stepped up to respond. As often seen in debate-style ESTRO sessions, competition was intense and gimmicks were plentiful, with all talk titles based on movies and a definite sci-fi twist. Before battle commenced, the audience voted for their preferred innovation based on the presentation titles. This opening vote ranked personalized inter-fraction adaptation as the winner.


The making of Ashe, Overwatch's new outlaw gunslinger

The Guardian

It's just a few days before BlizzCon, the annual celebration that sees thousands of fans of the company's games pack into the Anaheim Convention Centre for announcements, panels and entertainment. Overwatch's game director, Jeff Kaplan, bounces in his chair as he tells me it "feels like Christmas." But there are no elves at work here, just hundreds of developers collaborating to make a new character to introduce to Overwatch's players. At BlizzCon's opening ceremony, fans will see a meticulous animated short that focuses on beloved cowboy character McCree, and introduces a new face, with red eyes framed dramatically by white hair and a wide brimmed hat. Her name is Ashe, and she's a no-nonsense gunslinger who commands a gang of outlaws, including her own robot butler, Bob.


China steps up drone race with stealth aircraft, AK-47-toting chopper drones

The Japan Times

ZHUHAI, CHINA – China is unleashing stealth drones and pilotless aircraft fitted with AK-47 rifles onto world markets, racing to catch up to U.S. technology and adding to a fleet that has already seen combat action in the Middle East. Combat drones were among the jet fighters, missiles and other military hardware shown off this past week at Airshow China, the country's biggest aerospace industry exhibition. A delta-winged stealth drone received much attention, highlighting China's growing production of sophisticated unmanned aerial vehicles that seek to compete with the U.S. military's massive fleet. The CH-7 -- a charcoal-gray UAV unveiled at the air show -- is as long as a tennis court and has a 22-meter (72-foot) wingspan. It can fly at more than 800 kph (500 mph) and at an altitude of 13,000 meters (42,650 feet).


Doubly Approximate Nearest Neighbor Classification

Liu, Weiwei (The University of New South Wales) | Liu, Zhuanghua (University of Technology Sydney) | Tsang, Ivor W. (University of Technology Sydney) | Zhang, Wenjie (The University of New South Wales) | Lin, Xuemin (The University of New South Wales)

AAAI Conferences

Nonparametric classification models, such as K-Nearest Neighbor (KNN), have become particularly powerful tools in machine learning and data mining due to their simplicity and flexibility. However, the testing time of the KNN classifier becomes unacceptable, and its performance deteriorates significantly, when it is applied to data sets with millions of dimensions. We observe that state-of-the-art approximate nearest neighbor (ANN) methods aim either to reduce the number of distance comparisons using a tree structure or to decrease the cost of each distance computation through dimension reduction. In this paper, we propose a doubly approximate nearest neighbor classification strategy that marries the two branches: it compresses the dimensions to cut the cost of each distance computation and reduces the number of distance comparisons so that a full scan is avoided. Under this strategy, we build a compressed dimensional tree (CD-Tree) to avoid unnecessary distance calculations. In each decision node, we propose a novel feature selection paradigm that optimizes the feature selection vector together with the separator (indicator variables for splitting instances) under the maximum-margin criterion. An efficient algorithm is then developed to find the globally optimal solution with a convergence guarantee. Furthermore, we provide a data-dependent generalization error bound for our model, which offers new insight into the design of ANN classification algorithms. Our empirical studies show that our algorithm consistently obtains competitive or better classification results on all data sets, while running up to three orders of magnitude faster than state-of-the-art libraries on very high-dimensional data.
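The "doubly approximate" idea — compress dimensions at each node *and* prune distance comparisons with a tree — can be illustrated with a minimal sketch. Note the simplifications: a random subset of dimensions stands in for the paper's learned max-margin feature selection, and a median split on a crude 1-D projection stands in for the learned separator. The class name `CDTreeNode` and all parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class CDTreeNode:
    """One node of a (much simplified) compressed-dimensional tree."""

    def __init__(self, X, y, leaf_size=32, n_keep=8):
        self.leaf = len(X) <= leaf_size
        if self.leaf:
            self.X, self.y = X, y
            return
        # Compression step: keep only n_keep of the d dimensions.
        self.dims = rng.choice(X.shape[1], size=n_keep, replace=False)
        proj = X[:, self.dims].sum(axis=1)        # crude 1-D projection
        self.thresh = np.median(proj)
        mask = proj <= self.thresh
        if mask.all() or not mask.any():          # degenerate split -> leaf
            self.leaf, self.X, self.y = True, X, y
            return
        self.left = CDTreeNode(X[mask], y[mask], leaf_size, n_keep)
        self.right = CDTreeNode(X[~mask], y[~mask], leaf_size, n_keep)

    def classify(self, x, k=3):
        # Comparison-pruning step: descend to a single leaf instead of
        # scanning all points, then run exact k-NN inside that leaf.
        node = self
        while not node.leaf:
            go_left = x[node.dims].sum() <= node.thresh
            node = node.left if go_left else node.right
        d = np.linalg.norm(node.X - x, axis=1)
        votes = node.y[np.argsort(d)[:k]]
        return np.bincount(votes).argmax()
```

At query time the cost is one root-to-leaf descent (each step touching only `n_keep` dimensions) plus a k-NN vote within a single leaf, which is where the paper's two-fold savings come from.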


What Does Baidu Have In Its Artificial Intelligence Pipeline?

#artificialintelligence

Baidu (BIDU) has been focusing its efforts on artificial intelligence as growth slows for its core online search business and a number of its new investments have struggled to take off. So what can investors expect from Baidu's artificial intelligence (AI) ambitions? HSBC analyst Chi Tsang had a sneak peek at Baidu's AI Developer Conference. DuerOS allows people to speak to machines. Apollo is getting a warm reception from the autonomous vehicles industry.


Sparse Perceptron Decision Tree for Millions of Dimensions

Liu, Weiwei (University of Technology) | Tsang, Ivor W. (University of Technology)

AAAI Conferences

Thanks to their nonlinear yet highly interpretable representations, decision tree (DT) models have attracted considerable attention from researchers. However, DT models usually suffer from the curse of dimensionality and achieve degenerated performance when there are many noisy features. To address these issues, this paper first presents a novel data-dependent generalization error bound for the perceptron decision tree (PDT), which provides the theoretical justification for learning a sparse linear hyperplane in each decision node and for pruning the tree. Following our analysis, we introduce the notion of a sparse perceptron decision node (SPDN) with a budget constraint on the weight coefficients, and propose a sparse perceptron decision tree (SPDT) algorithm to achieve nonlinear prediction performance. To avoid generating an unstable and complicated decision tree and to improve the generalization of the SPDT, we present a pruning strategy that learns classifiers to minimize cross-validation error on each SPDN. Extensive empirical studies verify that our SPDT is more resilient to noisy features and effectively generates a small, yet accurate, decision tree. Compared with state-of-the-art DT methods and SVMs, our SPDT achieves better generalization performance on ultrahigh-dimensional problems with more than 1 million features.
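The core building block, a decision-node hyperplane whose weight vector is constrained to a small budget of nonzero coefficients, can be sketched as follows. This is a minimal illustration only: plain perceptron updates followed by hard thresholding stand in for the paper's max-margin constrained optimization, and the function name and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def sparse_perceptron_node(X, y, budget=5, epochs=20, lr=0.1):
    """Train one decision-node hyperplane, then enforce a budget on the
    number of nonzero weights by hard thresholding. Labels y must be
    in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w + b) <= 0:     # update only on mistakes
                w += lr * y[i] * X[i]
                b += lr * y[i]
    # Budget constraint: zero out all but the largest-magnitude weights.
    keep = np.argsort(np.abs(w))[-budget:]
    w_sparse = np.zeros(d)
    w_sparse[keep] = w[keep]
    return w_sparse, b
```

A full SPDT would stack such nodes into a tree, splitting instances by the sign of `X @ w + b` at each node, and then prune subtrees whose cross-validated error does not improve on a single node, per the pruning strategy described above.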