Project Manager Today


A ROBOT with an algorithm-based persona is being used to help companies make data-driven decisions in real time. South Australian company Complexica has developed Larry, the Digital Analyst, essentially a set of algorithms tuned to complex problems that quickly generates answers which would otherwise take people a very long time to work out. Big Data software algorithms are taking decision-making to a new level, delivering solutions and efficiencies like never before. The global Artificial Intelligence market is forecast to exceed USD 5 billion by 2020. Father-and-son team Matthew Michalewicz and Dr Zbigniew "Mike" Michalewicz, a former professor at the University of Adelaide's School of Computer Science and an Artificial Intelligence pioneer, started the company in 2014 with software architect Constantin Chiriac.

Asynchronous \epsilon-Greedy Bayesian Optimisation Artificial Intelligence

Bayesian Optimisation (BO) is a popular surrogate-model-based approach for optimising expensive black-box functions. To reduce optimisation wallclock time, parallel evaluation of the black-box function has been proposed. Asynchronous BO allows a new evaluation to be started as soon as another finishes, thus maximising utilisation of evaluation workers. We present AEGiS (Asynchronous $\epsilon$-Greedy Global Search), an asynchronous BO method that, with probability $2\epsilon$, performs either Thompson sampling or random selection from the approximate Pareto set that trades off exploitation (surrogate mean prediction) against exploration (surrogate posterior variance). The remaining $1-2\epsilon$ of moves exploit the surrogate's mean prediction. Results on fifteen synthetic benchmark problems, three meta-surrogate hyperparameter tuning problems and two robot pushing problems show that AEGiS generally outperforms existing methods for asynchronous BO. When a single worker is available, performance is no worse than BO using expected improvement. An ablation study verifies the importance of each of the three components, and a comparison of Pareto-set selection with selection from the entire feasible problem domain finds the former vastly superior.
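The $\epsilon$-greedy selection rule in the abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the AEGiS implementation: candidates are summarised by precomputed surrogate means and variances, minimisation is assumed, and the helper names are invented for the example.

```python
import random

def pareto_set(points):
    # points: list of (mean, variance) pairs. For minimisation, a point is
    # dominated if another has mean at least as low AND variance at least
    # as high, with one of the two strictly better.
    front = []
    for i, (m_i, v_i) in enumerate(points):
        dominated = any(
            m_j <= m_i and v_j >= v_i and (m_j < m_i or v_j > v_i)
            for j, (m_j, v_j) in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

def aegis_select(points, epsilon, rng):
    """Pick the index of the next candidate to evaluate (minimisation)."""
    u = rng.random()
    if u < epsilon:
        # Thompson sampling: draw one realisation from each marginal
        # posterior and take the minimiser of the draws.
        draws = [rng.gauss(m, v ** 0.5) for m, v in points]
        return min(range(len(points)), key=draws.__getitem__)
    if u < 2 * epsilon:
        # exploration: uniform choice from the approximate Pareto set
        return rng.choice(pareto_set(points))
    # remaining 1 - 2*epsilon of moves: exploit the best surrogate mean
    return min(range(len(points)), key=lambda i: points[i][0])
```

With probability $\epsilon$ each the rule takes one of the two exploratory moves, and otherwise exploits the surrogate mean, matching the split described in the abstract.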

Characterization of the convergence of stationary Fokker-Planck learning Artificial Intelligence

The convergence properties of the stationary Fokker-Planck algorithm for estimating the asymptotic density of stochastic search processes are studied. Theoretical and empirical arguments are given to characterise the convergence of the estimation for separable and non-separable nonlinear optimization problems. Some implications of the convergence of stationary Fokker-Planck learning for the inference of parameters in artificial neural network models are outlined.
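The abstract gives no algorithmic detail, but the object it refers to is standard: for an overdamped Langevin search process $dx = -V'(x)\,dt + \sigma\,dW$, the stationary solution of the Fokker-Planck equation is the Gibbs density $p(x) \propto \exp(-2V(x)/\sigma^2)$. The sketch below only illustrates that background fact; it is not the paper's estimation algorithm, and the potential and all names are invented for the example.

```python
import math
import random

def grad_v(x):
    # gradient of the illustrative quadratic potential V(x) = x^2 / 2
    return x

def euler_maruyama(x0, dt, n_steps, sigma, rng):
    # simulate one trajectory of the search process dx = -V'(x) dt + sigma dW
    x = x0
    for _ in range(n_steps):
        x += -grad_v(x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x

def stationary_density(v, sigma, xs):
    # stationary Fokker-Planck solution p(x) proportional to exp(-2 V(x) / sigma^2),
    # normalised numerically on the grid xs
    w = [math.exp(-2.0 * v(x) / sigma ** 2) for x in xs]
    z = sum(w) * (xs[1] - xs[0])
    return [wi / z for wi in w]
```

For the quadratic potential this density is a Gaussian centred at the minimiser, so long simulated trajectories concentrate where the stationary density peaks.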

Machine learning is the new face of enterprise data


While the complexity of the searching and result-ranking technology behind Apple's Siri would likely elude most of its users, the value of a context-sensitive personal assistant certainly has not. Yet while Siri spawned a new generation of anthropomorphic digital assistants, researchers in machine learning and artificial intelligence (AI) are taking the concept much further to help enterprises catch up to the growth of data. Industrial products distributor Coventry Group is among the latest companies to jump onto the trend. The company, whose fasteners, fluid systems, gasket and hardware divisions collectively employ around 650 people, is working with Adelaide-based data-analytics specialist Complexica to apply that company's AI technology – personified as Larry, the Digital Analyst – to guide decisions around sales and pricing strategies. Introducing Larry – a collection of algorithms delivered on a software-as-a-service (SaaS) basis via Amazon's cloud – to Coventry's business is a two- to four-month process that will see the technology fine-tuned to the company's operating parameters.

Reasoning and Facts Explanation in Valuation Based Systems Artificial Intelligence

Of significant interest in the literature is the optimization problem of identifying the set of composite hypotheses H that yield the $k$ largest $P(H|S_e)$, where a composite hypothesis is an instantiation of all the nodes in the network except the evidence nodes \cite{KSy:93}. This problem is called "finding the $k$ Most Plausible Explanations (MPE) of a given evidence $S_e$ in a Bayesian belief network". Finding the $k$ most probable hypotheses is generally NP-hard \cite{Cooper:90}. Various simplifications of the task have therefore been investigated in the past, such as restricting $k$ (to 1 or 2), restricting the structure (e.g. to singly connected networks), or shifting the complexity to the spatial domain. This paper proposes a genetic algorithm to overcome some of these restrictions, and also proposes stepping out from the probabilistic domain onto the general Valuation Based System (VBS) framework by generalising the genetic-algorithm approach to the realm of Dempster-Shafer belief calculus.
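The genetic-algorithm idea can be illustrated on a toy problem: evolve full instantiations of a small binary network towards high joint probability, using the joint probability as the fitness. The chain-structured network, its parameters, and the operator settings below are invented for the example and are not the paper's setup.

```python
import random

def joint_prob(assign, p_first=0.7, p_stay=0.8):
    # Joint probability of a full instantiation of a toy chain network
    # X1 -> X2 -> ... -> Xn with binary nodes: p_first = P(X1 = 1) and
    # p_stay = P(next node keeps the previous node's value).
    p = p_first if assign[0] else 1 - p_first
    for prev, cur in zip(assign, assign[1:]):
        p *= p_stay if cur == prev else 1 - p_stay
    return p

def ga_mpe(n, rng, pop_size=30, generations=60, mut=0.1):
    # Genetic search for a most plausible full instantiation: truncation
    # selection keeps the fittest half, then one-point crossover and
    # bit-flip mutation fill the rest of the population.
    pop = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=joint_prob, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]  # one-point crossover
            child = tuple(1 - g if rng.random() < mut else g
                          for g in child)  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=joint_prob)
```

Because the fittest half survives each generation, the best instantiation found never degrades; the same scheme carries over to other calculi by swapping the fitness function, which is the generalisation to VBS that the abstract proposes.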