Research

Ongoing investigations in statistical computing and machine learning

SGCX maintains several active research projects alongside our product work. These investigations inform our open-source libraries and commercial platforms.

Active Research

Research Phase

Project Blacklight

Systematic evaluation of machine learning optimizers against known global minima, using toy problems whose exact solutions can be computed, with statistical rigor (50+ runs, confidence intervals).

Learn More →
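The methodology described above can be sketched as follows. The optimizer, test function, and run count here are illustrative stand-ins, not Blacklight's actual benchmark suite:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent; stands in for the optimizers under test."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Toy problem with a known global minimum: f(x) = ||x - c||^2, minimized at c.
c = np.array([1.0, -2.0])
grad = lambda x: 2.0 * (x - c)

rng = np.random.default_rng(0)
errors = []
for _ in range(50):  # 50+ independent runs, as in the methodology above
    x0 = rng.normal(size=2)
    x_final = gradient_descent(grad, x0)
    errors.append(np.linalg.norm(x_final - c))  # distance to the true minimum

errors = np.array(errors)
mean = errors.mean()
# 95% confidence interval half-width via the normal approximation
half_width = 1.96 * errors.std(ddof=1) / np.sqrt(len(errors))
print(f"error to global minimum: {mean:.2e} ± {half_width:.2e}")
```

Because the global minimum is known exactly, the reported error measures true optimizer quality rather than progress relative to an unknown optimum.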
Significant Results

AFL Research

Discovery that random pruning outperforms sophisticated methods at up to 70-80% sparsity, backed by a rigorous statistical methodology with 25+ runs per configuration.

Learn More →
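Random unstructured pruning, the baseline at the center of the AFL result, can be sketched in a few lines (the function name and shapes are illustrative, not the project's code):

```python
import numpy as np

def random_prune(weights, sparsity, rng):
    """Zero out a random `sparsity` fraction of weights (unstructured pruning)."""
    mask = np.ones(weights.size, dtype=bool)
    n_prune = int(round(sparsity * weights.size))
    idx = rng.choice(weights.size, size=n_prune, replace=False)
    mask[idx] = False
    return weights * mask.reshape(weights.shape)

rng = np.random.default_rng(42)
w = rng.normal(size=(64, 64))          # stand-in for one layer's weight matrix
w_pruned = random_prune(w, sparsity=0.7, rng=rng)
kept = np.count_nonzero(w_pruned) / w.size
print(f"fraction of weights kept: {kept:.2f}")  # ~0.30 at 70% sparsity
```

The point of the AFL finding is that this unstructured baseline, which ignores weight magnitudes and gradients entirely, remains competitive up to the sparsity range quoted above.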
Early Development

GradFlow

GPU-accelerated WENO (weighted essentially non-oscillatory) implementation using PyTorch convolution operations. A modern approach to classical numerical methods.

Learn More →
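The core idea can be illustrated without the project's code: a finite-difference stencil is just a convolution, which is what makes it map naturally onto GPU kernels such as PyTorch's conv1d. A minimal NumPy sketch with a single fixed stencil (WENO itself blends several candidate stencils with nonlinear weights):

```python
import numpy as np

def central_diff(u, dx):
    """Apply a 3-point central-difference stencil for d/dx as a convolution."""
    stencil = np.array([1.0, 0.0, -1.0]) / (2.0 * dx)
    # mode="same" keeps the grid size; boundary values are not meaningful here
    return np.convolve(u, stencil, mode="same")

x = np.linspace(0.0, 2.0 * np.pi, 201)
dx = x[1] - x[0]
du = central_diff(np.sin(x), dx)
# interior points should approximate cos(x) to second order in dx
err = np.max(np.abs(du[2:-2] - np.cos(x[2:-2])))
print(f"max interior error: {err:.2e}")
```

A WENO scheme computes several such stencil convolutions per grid point, then combines them with data-dependent weights, so each stage remains a batched convolution suitable for GPU execution.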
On Hold

Project Bonsai

Statistics-informed neural network pruning using FANIM with Wilcoxon testing. On hold pending AFL findings.

Learn More →
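Bonsai's Wilcoxon testing presumably compares paired metrics across repeated runs. The sketch below is a from-scratch signed-rank test using the normal approximation (no tie or zero-difference corrections), with illustrative names and data rather than the project's actual code:

```python
import numpy as np
from math import erf, sqrt

def wilcoxon_signed_rank(a, b):
    """Minimal Wilcoxon signed-rank test for paired samples (normal
    approximation; no tie/zero-difference corrections)."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    d = d[d != 0.0]                                # discard zero differences
    n = d.size
    ranks = np.abs(d).argsort().argsort() + 1.0    # ranks of |d|; ties broken arbitrarily
    w_plus = ranks[d > 0].sum()                    # rank sum of positive differences
    mu = n * (n + 1) / 4.0
    sigma = sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_plus - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))  # two-sided p-value
    return w_plus, p

# Illustrative: baseline vs. pruned accuracy paired across 25 seeds
rng = np.random.default_rng(0)
baseline = rng.normal(loc=0.900, scale=0.010, size=25)
pruned = baseline - rng.normal(loc=0.005, scale=0.002, size=25)
w, p = wilcoxon_signed_rank(baseline, pruned)
print(f"W+ = {w:.0f}, p = {p:.4f}")
```

The appeal of a rank test in this setting is that it makes no normality assumption about per-seed accuracy differences.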

Upcoming Publications

"Approximate Forgiveness Level: Random Pruning Outperforms Sophisticated Methods"

Target: ICLR or NeurIPS

Status: Manuscript in preparation

Demonstrates proper statistical methodology with 25+ runs per configuration, confidence intervals, and effect sizes.
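Effect sizes of the kind mentioned above could be reported, for example, as Cohen's d. A minimal sketch; the helper and the synthetic accuracies are illustrative, not the paper's actual code or data:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: standardized mean difference with pooled sample std (ddof=1)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled_var = (((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1))
                  / (a.size + b.size - 2))
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Illustrative: accuracies from two configurations across 25 runs each,
# with a true gap of two standard deviations
rng = np.random.default_rng(7)
config_a = rng.normal(loc=0.92, scale=0.01, size=25)
config_b = rng.normal(loc=0.90, scale=0.01, size=25)
print(f"Cohen's d = {cohens_d(config_a, config_b):.2f}")
```

Reporting an effect size alongside confidence intervals distinguishes a statistically detectable difference from a practically meaningful one.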

"Quantifying Optimizer Performance: A Blacklight Analysis"

Target: ICML

Status: Research phase

Systematic evaluation of popular optimizers against known global minima, revealing true performance gaps.

Open Source

Our research informs our open-source libraries. PyStatistics and PyStatsBio are freely available, and we welcome contributions.

View on GitHub →