Changelog
What's new, what's fixed, and what's improved in each version.
A training loop, new matrix and ML components, and a major overhaul of the gradient system for mathematical correctness.
Feature
One-click training with Step button
Set a loss node, adjust the learning rate, and click Step to run gradient descent. Weights and biases update automatically. Works with scalar, vector, and matrix parameters.
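The changelog doesn't show the update rule, but a Step click amounts to one round of plain gradient descent. A minimal sketch, assuming vanilla SGD; `sgd_step` and the parameter names are illustrative, not the app's actual API. It also honors the Freeze Parameter feature described below by skipping frozen names:

```python
import numpy as np

def sgd_step(params, grads, lr=0.01, frozen=()):
    """One gradient-descent step: w <- w - lr * dL/dw.

    The same rule applies uniformly to scalar, vector, and
    matrix parameters; names listed in `frozen` are skipped.
    """
    for name, w in params.items():
        if name in frozen:
            continue  # frozen parameters are left untouched
        params[name] = w - lr * np.asarray(grads[name])
    return params

params = {"w": np.array([1.0, 2.0]), "b": 0.5}
grads = {"w": np.array([0.2, -0.4]), "b": 0.1}
sgd_step(params, grads, lr=0.1)
# w -> [0.98, 2.04], b -> 0.49
```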
Feature
5 new components for neural network building
Matrix-Vector Multiply, Matrix Scalar Multiply, Matrix Sum, Flatten, and Softmax. Build a full neural network layer from primitives.
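Wired together, these primitives form a standard dense layer: Matrix-Vector Multiply, a vector add for the bias, then Softmax. A minimal sketch of the equivalent math (the function names here are illustrative):

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def layer(W, x, b):
    """Matrix-Vector Multiply -> add bias -> Softmax."""
    return softmax(W @ x + b)

W = np.array([[1.0, 0.0], [0.0, 1.0]])
x = np.array([2.0, 0.0])
b = np.zeros(2)
y = layer(W, x, b)  # a probability vector summing to 1
```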
Feature
Freeze parameters
Right-click any input node and select Freeze Parameter to prevent it from changing during training. Frozen nodes show a snowflake icon and dimmed border when gradient mode is active.
Feature
Gradients flow through user-created composites
Package nodes into a composite component and gradients will automatically propagate through it. No manual gradient rules needed.
Feature
3 new lessons
Train a Neuron, Matrix Operations, and Build a Neural Network Layer. Hands-on lessons that walk you through gradient descent and matrix math.
Fix
Gradient type mismatch on scalar-to-vector connections
When a scalar output feeds a vector input (auto-cast), the backward pass now correctly reduces the vector gradient back to a scalar. Previously this could cause training to silently skip parameter updates.
Fix
Undo/redo now works reliably
Fixed undo/redo creating phantom entries whenever the graph recomputed. Each undo now corresponds to exactly one user action.
Improve
Gradient mode is per-tab
Loss node, learning rate, and gradient mode are now saved per tab. Switch between tabs without losing your gradient setup.
Improve
Smarter nabla button
The nabla button now appears only when the selected node can serve as a loss node (scalar output). Click it to set the loss in one step, replacing the old two-step right-click workflow.
Major gradient mode upgrade. Gradients now flow through all statistics and ML components, and several accuracy fixes make gradient values reliable across any graph.
Feature
Gradient mode works through 18 more components
Gradient mode previously showed zero gradients at many commonly used nodes. You can now see correct gradients flowing through:
- Mean, Dot Product, MSE, Max, Min
- Variance, Standard Deviation, Covariance, Correlation, R²
- Clamp, Lerp, Cumulative Sum, and more
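For a sense of what these statistics gradients look like, here are the two simplest cases, Mean and (population) Variance, as a hedged sketch; the function names are illustrative:

```python
import numpy as np

def mean_backward(x, grad_out=1.0):
    # d mean(x) / d x_i = 1/n for every component
    n = len(x)
    return np.full(n, grad_out / n)

def variance_backward(x, grad_out=1.0):
    # population variance: d var(x) / d x_i = 2 (x_i - mean) / n
    n = len(x)
    return grad_out * 2.0 * (x - x.mean()) / n

x = np.array([1.0, 2.0, 3.0, 4.0])
gm = mean_backward(x)      # [0.25, 0.25, 0.25, 0.25]
gv = variance_backward(x)  # [-0.75, -0.25, 0.25, 0.75]
```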
Feature
Gradient error notifications
When a gradient computation hits a problem (like dividing by zero), you now get a clear notification instead of silently seeing zero.
Fix
Gradient values could be incorrect on complex canvases
Fixed an issue where gradient values could be wrong depending on the order you built your graph. Gradients are now always computed correctly regardless of how you construct your canvas.
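The changelog doesn't say how this was fixed, but the standard way to make gradients independent of construction order is to traverse nodes in reverse topological order, so each node's gradient is fully accumulated before it propagates to its inputs. A minimal sketch using Kahn's algorithm (all names here are illustrative):

```python
from collections import defaultdict, deque

def topo_order(nodes, edges):
    """Kahn's algorithm: orders nodes so every edge (u -> v)
    has u before v, regardless of insertion order."""
    indeg = defaultdict(int)
    out = defaultdict(list)
    for u, v in edges:
        out[u].append(v)
        indeg[v] += 1
    q = deque(n for n in nodes if indeg[n] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return order

# the backward pass visits reversed(topo_order(...)), so a node's
# gradient is complete before flowing onward to its inputs
order = topo_order(["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")])
```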
Fix
Multi-output nodes as loss now handled correctly
Setting a node with multiple outputs as the loss node (like Linear Regression) now correctly computes gradients from the primary output only. Nodes with non-scalar outputs are clearly rejected instead of producing confusing results.
Fix
Power node now computes gradient for the exponent
The Power node (x^y) previously only showed the gradient for the base. If you wired something to the exponent input, its gradient was missing. Both inputs now have correct gradients.
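The two gradients in question follow from basic calculus: for z = x^y, dz/dx = y·x^(y-1) and dz/dy = x^y·ln(x), the second being the one that was missing. A minimal sketch (the function name is illustrative; the log term assumes x > 0):

```python
import math

def power_backward(x, y, grad_out=1.0):
    """Gradients of z = x**y with respect to BOTH inputs:
       dz/dx = y * x**(y-1)
       dz/dy = x**y * ln(x)   (defined for x > 0)
    """
    dx = grad_out * y * x ** (y - 1)
    dy = grad_out * x ** y * math.log(x)
    return dx, dy

dx, dy = power_backward(2.0, 3.0)
# dx = 3 * 2**2 = 12.0; dy = 8 * ln 2
```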
Improve
Faster gradient computation on large graphs
Gradient mode is now noticeably faster on canvases with many nodes.
Mobile and tablet support. Statslingo now works on phones and tablets with touch-friendly controls.
Feature
Phone and tablet support
The full interface adapts to smaller screens. Toolbar collapses into a compact menu, panels slide up as bottom sheets, and all controls are sized for touch.
Feature
On-screen math keyboard
Math expression fields show a virtual keyboard on touch devices so you can type formulas without a physical keyboard.
Improve
Touch-friendly interactions
Long-press to open context menus, larger tap targets throughout, floating "+" button to add nodes, and a collapsible lesson panel that gives you more canvas space on small screens.
Initial public release.
Feature
Visual node-based math platform
Drag, drop, and wire mathematical components on a canvas. Build anything from basic arithmetic to neural networks by connecting building blocks together. 100+ components across math, statistics, logic, visualization, and ML.
Feature
Interactive lessons
36 step-by-step lessons across Foundations, Statistics, Machine Learning, and Physics. Manipulate sliders and watch the math update in real time. Copy any lesson to your own canvas to experiment freely.
Feature
Create your own components
Select nodes, group them into reusable components, and edit them in dedicated tabs. Export and share components as files.
Feature
Full workspace
Multiple projects with tabs, dark mode, undo/redo, auto-layout, PNG/SVG export, URL sharing, keyboard shortcuts, WYSIWYG math editor, and a documentation panel with formulas for every component.