Insights & Tutorials
Deep dives into formula compilation, model training, and the future of mathematical computing.

We trained 6 progressively more complex formulas on the same dataset. The results surprised us.

Programming languages have grammars. Mathematical notation has conventions, context, and centuries of overloaded symbols. Building a formula compiler means solving all of that.

We ran 17 standard ML formulas through MathExec's compiler. 10 compiled cleanly. 7 didn't. The failures taught us more than the successes.

A definitive reference mapping LaTeX formula patterns to their PyTorch equivalents. Every entry backed by MathExec's compiler.
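To give a flavor of what such a mapping contains, here are a few illustrative entries (our own examples, not taken from the actual reference):

```python
# Illustrative LaTeX-pattern -> PyTorch-call pairs (hypothetical examples,
# not entries from MathExec's reference table).
LATEX_TO_TORCH = {
    r"\sum_{i} x_i":  "x.sum(dim=-1)",
    r"\sigma(x)":     "torch.sigmoid(x)",
    r"\max(0, x)":    "torch.relu(x)",
    r"e^{x}":         "torch.exp(x)",
    r"\|x\|_2":       "torch.linalg.vector_norm(x)",
    r"Wx + b":        "torch.nn.functional.linear(x, W, b)",
}
```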

The formula in the paper is never the whole story. Batch dimensions, numerical stability tricks, weight initialization, shape inference: the hidden work that textbooks don't mention.
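One concrete example of that hidden work: the textbook softmax $\sigma(z)_i = e^{z_i} / \sum_j e^{z_j}$ overflows for large inputs, and the standard fix, subtracting the row-wise maximum before exponentiating, appears in real implementations but almost never in the formula itself. A minimal sketch (generic PyTorch, not MathExec-specific):

```python
import torch

def naive_softmax(z: torch.Tensor) -> torch.Tensor:
    # Direct transcription of the textbook formula: exp() overflows for large z.
    e = torch.exp(z)
    return e / e.sum(dim=-1, keepdim=True)

def stable_softmax(z: torch.Tensor) -> torch.Tensor:
    # Subtracting the row-wise max leaves the result mathematically unchanged
    # but keeps exp() in a safe range: the hidden work the formula omits.
    z = z - z.max(dim=-1, keepdim=True).values
    e = torch.exp(z)
    return e / e.sum(dim=-1, keepdim=True)

z = torch.tensor([[1000.0, 1000.0]])
print(naive_softmax(z))   # tensor([[nan, nan]])
print(stable_softmax(z))  # tensor([[0.5000, 0.5000]])
```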

We took 10 formulas from ML papers and textbooks and wrote the complete PyTorch equivalent for each. The average implementation ran 94 lines; the formula was never more than 40 characters.
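To illustrate the size gap, here is a minimal version of the shortest case: the formula $\hat{y} = Wx + b$ is a dozen characters, but even a bare-bones runnable PyTorch equivalent (our sketch, not one of the article's actual implementations) needs a module, a loss, an optimizer, and a training loop:

```python
import torch
import torch.nn as nn

# The formula: y_hat = W x + b.
class LinearModel(nn.Module):
    def __init__(self, in_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, 1)  # W and b, initialized for us

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x)

torch.manual_seed(0)
X = torch.randn(64, 3)  # the batch dimension the formula omits
y = X @ torch.tensor([[2.0], [-1.0], [0.5]]) + 3.0

model = LinearModel(in_features=3)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.4f}")
```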

We timed 5 different ML workflows on the same task: raw PyTorch, Lightning, scikit-learn, AutoML, and MathExec. The gap was bigger than we expected.

Most ML platforms use drag-and-drop node editors. We chose mathematical notation instead.

How we built MathExec's Data Studio, a feature that lets users transform datasets using plain English instructions powered by LLMs.

From linear regression to neural networks: 5 models you can train in MathExec just by typing a formula.
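As a taste of what such formulas look like in generic mathematical notation (our illustrations; MathExec's exact input syntax may differ):

```latex
\hat{y} = w^{\top} x + b                                   % linear regression
\hat{y} = \sigma(w^{\top} x + b)                           % logistic regression
\hat{y} = W_2 \,\mathrm{ReLU}(W_1 x + b_1) + b_2           % one-hidden-layer network
```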

A deep dive into MathExec's formula compiler: how we parse LaTeX expressions and generate equivalent PyTorch modules with trainable parameters.
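To make the core idea concrete, here is a toy version of the pattern (our sketch using Python's `ast` module on Python-syntax formulas, not MathExec's actual LaTeX compiler): parse an expression, treat every free name except the input `x` as a trainable parameter, and evaluate the tree in `forward()`.

```python
import ast
import torch
import torch.nn as nn

class FormulaModule(nn.Module):
    """Toy illustration: compile a Python-syntax formula string into a
    module whose free variables (other than `x`) become nn.Parameters."""

    def __init__(self, formula: str):
        super().__init__()
        self.tree = ast.parse(formula, mode="eval").body
        names = {n.id for n in ast.walk(self.tree) if isinstance(n, ast.Name)}
        # Every name that is not the input becomes a trainable scalar.
        self.params = nn.ParameterDict(
            {name: nn.Parameter(torch.randn(())) for name in names - {"x"}}
        )

    def _eval(self, node, x):
        if isinstance(node, ast.BinOp):
            left, right = self._eval(node.left, x), self._eval(node.right, x)
            if isinstance(node.op, ast.Add):
                return left + right
            if isinstance(node.op, ast.Mult):
                return left * right
            raise ValueError("unsupported operator")
        if isinstance(node, ast.Name):
            return x if node.id == "x" else self.params[node.id]
        raise ValueError("unsupported node")

    def forward(self, x):
        return self._eval(self.tree, x)

m = FormulaModule("w * x + b")
print(sorted(m.params.keys()))  # ['b', 'w']
```

A real compiler has far more to handle (LaTeX grammar, tensor shapes, broadcasting, operator precedence conventions), but the core move is the same: unbound symbols become trainable parameters.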

MathExec lets you write a math formula, compile it to PyTorch, train on your CSV data, and export production-ready Python code. All in under a minute.