Zé Vinícius

Discovery Bay, Hong Kong, China

Hi there! I’m Zé Vinícius, a PhD candidate at HKUST, in sunny Hong Kong, working with Prof. Daniel Palomar on interesting problems involving graphs and financial time series. More precisely, I design optimization algorithms using elements of graph theory and statistical learning theory to extract knowledge from networks of financial assets. I have done a few internships along the way: (1) research scientist at Shell Street Labs, Hong Kong; (2) scientific software engineer at NASA in Silicon Valley, California; and (3) guest researcher at NIST in Gaithersburg, Maryland. I was also a Google Summer of Code developer for OpenAstronomy.

I spend most of my time doing research and coding. In my free time, there is nothing better than swimming and crab hunting in the waters of Clear Water Bay and video-chatting with my dog, Pluto.

Résumé

news

Dec 6, 2021 Our paper Fast Projected Newton-like Method for Precision Matrix Estimation with Nonnegative Partial Correlations has been posted to arXiv. MATLAB code lives at github.com/jxying/mtp2.
Dec 1, 2021 Our paper, Efficient Algorithms for General Isotone Optimization, has been accepted to AAAI 2022!
Nov 22, 2021 Our paper Graphical Models in Heavy-Tailed Markets has been accepted to NeurIPS 2021! R code for the algorithms lives at github.com/mirca/fingraph.
Sep 1, 2021 I’ve completed my internship at Shell Street Labs :) Now I’m back at HKUST, where I’m the TA for the course Portfolio Optimization with R!
Jul 12, 2021 Our paper A Fast Algorithm for Graph Learning under Attractive Gaussian Markov Random Fields has been accepted to Asilomar 2021!

selected publications

  1. AAAI
    Efficient Algorithms for General Isotone Optimization
    Wang, X., Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    In The Thirty-Sixth AAAI Conference on Artificial Intelligence 2022
  2. NeurIPS
    Graphical Models in Heavy-Tailed Markets
    Cardoso, J. V. M., Ying, J., and Palomar, D. P.
    In Advances in Neural Information Processing Systems 2021
  3. AISTATS
    Minimax Estimation of Laplacian Constrained Precision Matrices
    Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    In 24th International Conference on Artificial Intelligence and Statistics 2021
  4. arXiv
    Algorithms for Learning Graphs in Financial Markets
    Cardoso, J. V. M., Ying, J., and Palomar, D. P.
    In arXiv e-prints 2020
  5. arXiv
    Does the L1 norm Learn a Sparse Graph under Laplacian Constrained Graphical Models?
    Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    In arXiv e-prints 2020
  6. NeurIPS
    Nonconvex Sparse Graph Learning under Laplacian Constrained Graphical Model
    Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    In Advances in Neural Information Processing Systems 2020
  7. JMLR
    A Unified Framework for Structured Graph Learning via Spectral Constraints
    Kumar, S., Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    Journal of Machine Learning Research 2020
  8. NeurIPS
    Structured Graph Learning Via Laplacian Spectral Constraints
    Kumar, S., Ying, J., Cardoso, J. V. M., and Palomar, D. P.
    In Advances in Neural Information Processing Systems 2019