About Me
- Research Fellow (11/20 – Present) — Queensland University of Technology (QUT), Centre for Data Science
- ACEMS Research Fellow (01/19 – 10/20) — UNSW Sydney
- ACEMS Research Fellow (08/18 – 01/19) [Short Contract] — The University of Queensland
- PhD Candidate in Statistics (2015 – 2018) — The University of Queensland

Thesis: Advances in Monte Carlo Methodology. Advisor: Professor Dirk Kroese

Research Interests
My research primarily concerns fundamental aspects of methodology in Probabilistic Machine Learning and related fields.
I am also interested in developing and applying extensions of such methods, with the goal of solving challenging scientific problems and building advanced artificial intelligence systems.
Some topics I have worked on, published on, or am currently working on: Monte Carlo Methods, Deep Generative Models, Bayesian Statistics, Variational Inference, Representation Learning, Federated Learning & Privacy-Enhancing Technologies, Time Series Analysis, Kernel Methods, Variance Reduction, Likelihood-Free Models, Particle Filters, and Rare-Event Simulation.
To find out a little more about some of my interests, feel free to have a look at the course content for my AMSI Winter School 2021 course on Deep Probabilistic Models.
Research Output
Publications
Villani, M., Quiroz, M., Kohn, R., and Salomone, R. (2022), Spectral Subsampling MCMC for Stationary Multivariate Time Series with an Application to Vector ARTFIMA Processes. Econometrics and Statistics. [Read Online]
Sutton, M., Salomone, R., Chevallier, A., and Fearnhead, P. (2022), Continuously-Tempered PDMP Samplers. Neural Information Processing Systems (NeurIPS), 2022. Accepted. [Preprint]
Hodgkinson, L., Salomone, R., and Roosta, F. (2021), Implicit Langevin Algorithms for Sampling From Log-concave Densities, Journal of Machine Learning Research (JMLR) 22: 1–30. [Read Online]
Salomone, R., Quiroz, M., Kohn, R., Villani, M., and Tran, M.N. (2020), Spectral Subsampling MCMC for Stationary Time Series, Proceedings of the International Conference on Machine Learning (ICML) 2020. [Read Online]
Botev, Z.I., Salomone, R., and Mackinlay, D. (2019), Fast and accurate computation of the distribution of sums of dependent log-normals, Annals of Operations Research 280 (1), 19–46. [Read Online]
Laub, P.J., Salomone, R., and Botev, Z.I. (2019), Monte Carlo estimation of the density of the sum of dependent random variables, Mathematics and Computers in Simulation 161, 23–31.
Salomone, R., Vaisman, R., and Kroese, D.P. (2016). Estimating the Number of Vertices in Convex Polytopes. Proceedings of the Annual International Conference on Operations Research and Statistics, ORS 2016. [Read Online]
Pipeline (Under Review or Revision)
Davies, L., Salomone, R., Sutton, M., and Drovandi, C. (2022), Transport Reversible Jump Proposals. arXiv: 2210.12572
Bon, J.J., Bretherton, A., Buchhorn, K., Cramb, S., Drovandi, C., Hassan, C., Jenner, A., Mayfield, H.J., McGree, J.M., Mengersen, K., Price, A., Salomone, R., Santos-Fernández, E., Vercelloni, E., and Wang, X. (2022), Being Bayesian in the 2020s: opportunities and challenges in the practice of modern applied Bayesian statistics. arXiv: 2211.10029
Wang, X., Jenner, A.L., Salomone, R., and Drovandi, C. (2022), Calibration of a Voronoi cell-based model for tumour growth using approximate Bayesian computation. [bioRxiv]
Hodgkinson, L., Salomone, R., and Roosta, F. (2021), The reproducing Stein kernel approach for posthoc corrected sampling. arXiv: 2001.09266
Salomone, R., South, L.F., Drovandi, C.C., and Kroese, D.P. (2018, Revision Forthcoming), Unbiased and Consistent Nested Sampling via Sequential Monte Carlo. arXiv:1805.03924
Selected Presentations
- Monte Carlo Secrets Revealed
- Spectral Subsampling for Stationary Time Series (ICML 2020)
- A Tutorial on Reproducing Stein Kernels
- Slides and Jupyter Notebook for my three-hour workshop on Automatic Differentiation

If you enjoyed the first talk, be sure to check out my course materials for my AMSI Winter School 2021 course on Deep Probabilistic Models (that talk is a simplified version of the first quarter!).