Ongoing investigations in statistical computing and machine learning
SGCX maintains several active research projects alongside our product work. These investigations inform our open-source libraries and commercial platforms.
Systematic evaluation of machine learning optimizers against known global minima, using toy problems whose exact solutions are known in closed form, with statistical rigor (50+ runs per configuration, confidence intervals).
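A minimal sketch of what such a benchmark loop can look like: plain gradient descent (standing in for the optimizers under study) is run 50 times from random initializations on a toy quadratic whose minimizer is known exactly, and the final errors are summarized with a normal-approximation 95% confidence interval. The objective, learning rate, and run count here are illustrative assumptions, not the project's actual configuration.

```python
import math
import random
import statistics

def sgd(grad, x0, lr=0.02, steps=500):
    """Plain gradient descent; stands in for the optimizers under test."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Toy problem with a known global minimum: f(x, y) = (x - 1)^2 + 10 (y + 2)^2,
# minimized exactly at (1, -2), so each run's error is measurable directly.
def grad(p):
    x, y = p
    return [2 * (x - 1), 20 * (y + 2)]

rng = random.Random(0)
errors = []
for _ in range(50):  # 50 independent runs from random initializations
    x0 = [rng.uniform(-5, 5), rng.uniform(-5, 5)]
    x = sgd(grad, x0)
    errors.append(math.dist(x, [1.0, -2.0]))  # distance to the true minimum

# Mean error with a normal-approximation 95% confidence interval.
mean = statistics.fmean(errors)
half = 1.96 * statistics.stdev(errors) / math.sqrt(len(errors))
print(f"mean error {mean:.2e} +/- {half:.2e} (95% CI, n={len(errors)})")
```

Because the minimizer is known, the reported quantity is a true optimality gap rather than a loss value whose floor is unknown.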
Discovery that random pruning outperforms more sophisticated methods at up to 70-80% sparsity, backed by rigorous statistical methodology (25+ runs per configuration).
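To make the comparison concrete, here is a hedged sketch of the two pruning strategies being contrasted: a uniformly random mask versus the standard magnitude-based baseline, each applied at the sparsity levels the finding covers. The helper names and the Gaussian weight vector are hypothetical; the actual study averages results over 25+ seeds per configuration.

```python
import random

def random_prune(weights, sparsity, rng):
    """Zero a uniformly random subset of weights."""
    n_prune = int(len(weights) * sparsity)
    idx = set(rng.sample(range(len(weights)), n_prune))
    return [0.0 if i in idx else w for i, w in enumerate(weights)]

def magnitude_prune(weights, sparsity):
    """Zero the smallest-magnitude weights (the 'sophisticated' baseline)."""
    n_prune = int(len(weights) * sparsity)
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    idx = set(order[:n_prune])
    return [0.0 if i in idx else w for i, w in enumerate(weights)]

rng = random.Random(0)
w = [rng.gauss(0, 1) for _ in range(1000)]  # hypothetical weight vector

for sparsity in (0.5, 0.7, 0.8):  # the regime where parity was observed
    wr = random_prune(w, sparsity, rng)
    wm = magnitude_prune(w, sparsity)
    kept = sum(1 for x in wr if x != 0.0)
    print(f"sparsity {sparsity:.0%}: {kept} weights kept")
```

Both strategies remove exactly the same number of weights, so any accuracy difference after fine-tuning is attributable to which weights were removed.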
GPU-accelerated WENO implementation using PyTorch convolution operations: a modern, hardware-accelerated take on a classical numerical method.
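The core idea is that WENO's finite-difference stencils are sliding dot products, i.e. convolutions, which is what makes them a natural fit for GPU conv kernels. Below is a pure-Python stand-in (not the project's code) that builds the first WENO5 smoothness indicator entirely from stencil convolutions; in the actual implementation the same kernels would presumably be applied with `torch.nn.functional.conv1d` on the GPU.

```python
def conv(signal, kernel):
    """Valid-mode 1-D convolution (sliding dot product) - the same operation
    a GPU conv1d kernel performs."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def weno5_beta0(f):
    """First WENO5 (Jiang-Shu) smoothness indicator, built from stencils:
    beta0 = 13/12 (f[i-2] - 2 f[i-1] + f[i])^2
          +  1/4 (f[i-2] - 4 f[i-1] + 3 f[i])^2."""
    d2 = conv(f, [1.0, -2.0, 1.0])   # second-difference stencil
    d1 = conv(f, [1.0, -4.0, 3.0])   # one-sided first-difference stencil
    return [13.0 / 12.0 * a * a + 0.25 * b * b for a, b in zip(d2, d1)]

smooth = weno5_beta0([2.0] * 8)          # constant data: indicator vanishes
rough = weno5_beta0([0, 0, 0, 1, 1, 1])  # a jump: indicator spikes near it
```

The indicator is near zero on smooth data and large across the discontinuity, which is exactly the signal WENO uses to downweight stencils that straddle a shock.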
Statistics-informed neural network pruning using FANIM with Wilcoxon testing. On hold pending AFL findings.
Target: ICLR or NeurIPS
Status: Manuscript in preparation
Demonstrates proper statistical methodology with 25+ runs per configuration, confidence intervals, and effect sizes.
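As a hedged illustration of the reporting style described above, the snippet below computes a mean with a 95% confidence interval and a Cohen's d effect size for two sets of per-run accuracies. The data, run count, and function names are illustrative assumptions, not results from the manuscript.

```python
import math
import random
import statistics

def cohens_d(a, b):
    """Cohen's d effect size with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * statistics.variance(a)
                        + (nb - 1) * statistics.variance(b)) / (na + nb - 2))
    return (statistics.fmean(a) - statistics.fmean(b)) / pooled

def mean_ci95(xs):
    """Mean with a normal-approximation 95% confidence interval."""
    m = statistics.fmean(xs)
    half = 1.96 * statistics.stdev(xs) / math.sqrt(len(xs))
    return m, m - half, m + half

# Hypothetical accuracies from 25 runs of two pruning configurations.
rng = random.Random(1)
cfg_a = [0.91 + rng.gauss(0, 0.01) for _ in range(25)]
cfg_b = [0.90 + rng.gauss(0, 0.01) for _ in range(25)]

m, lo, hi = mean_ci95(cfg_a)
print(f"config A: {m:.3f} [{lo:.3f}, {hi:.3f}]  d={cohens_d(cfg_a, cfg_b):.2f}")
```

Reporting an interval and an effect size, rather than a single best run, is what distinguishes this methodology from the common single-seed comparison.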
Target: ICML
Status: Research phase
Systematic evaluation of popular optimizers against known global minima, revealing true performance gaps.
Our research informs our open-source libraries: PyStatistics and PyStatsBio are freely available, and we welcome contributions.