An interesting concept I came up with today for finding the point on a curve that minimizes its distance to some other point: take the dot product of the vector from the current guess to the target and the derivative of the curve at the guess, then normalize it by dividing by the length of the derivative to get how far to move along the curve. It is a lot more stable than Newton's method or gradient descent, but when the target is too far from the curve it blows up. Scaling down the shift fixes the issue; I still need a way to calculate a suitable scale factor dynamically.
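A minimal sketch of the idea above, on a hypothetical quadratic Bézier curve (the control points, the `scale` damping factor, and the iteration count are all my own choices, not from the note). One assumption: I divide the projected distance by |C'(t)|² rather than |C'(t)|, so that the step comes out as a shift in the parameter t instead of an arc-length distance; I believe that matches the intent, but it is my reading.

```python
import numpy as np

# Hypothetical control points for a quadratic Bezier curve C(t)
p0, p1, p2 = np.array([0.0, 0.0]), np.array([1.0, 2.0]), np.array([2.0, 0.0])

def curve(t):
    # Bernstein (polynomial) form of the quadratic Bezier
    return (1 - t)**2 * p0 + 2 * (1 - t) * t * p1 + t**2 * p2

def deriv(t):
    # Derivative of the quadratic Bezier with respect to t
    return 2 * (1 - t) * (p1 - p0) + 2 * t * (p2 - p1)

def nearest_t(target, t=0.5, scale=1.0, iters=100):
    # Repeatedly project (target - C(t)) onto the tangent C'(t) and
    # shift t by the result. Dividing by |C'(t)|^2 turns the projected
    # distance into a parameter shift; scale < 1 damps the step, which
    # is the fix for the blow-up when the target is far from the curve.
    for _ in range(iters):
        d = deriv(t)
        t += scale * np.dot(target - curve(t), d) / np.dot(d, d)
    return t

t = nearest_t(np.array([0.6, 1.1]), t=0.1)
```

At convergence the residual vector from the curve to the target is perpendicular to the tangent, so `np.dot(target - curve(t), deriv(t))` goes to zero, which is an easy way to check the result.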
Here is another method for solving this, which isn't as smooth but doesn't get stuck in local minima: https://scratch.mit.edu/projects/807172182/ A Desmos visualization: https://www.desmos.com/calculator/mjcxxt55zf Bézier polynomial form: https://en.wikipedia.org/wiki/Bernstein_polynomial I used Wolfram Alpha to calculate the derivatives of the polynomials. This may be a bit slower than De Casteljau's algorithm (I haven't benchmarked it), but it is a lot more concise.
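For reference, a sketch of the two evaluation styles mentioned above on a cubic Bézier: the expanded Bernstein-polynomial form (with its derivative written out, the kind of expression Wolfram Alpha produces) versus De Casteljau's repeated linear interpolation. The control points are hypothetical; both evaluations should agree at every t.

```python
import numpy as np

# Hypothetical control points for a cubic Bezier curve
P = np.array([[0.0, 0.0], [1.0, 3.0], [3.0, 3.0], [4.0, 0.0]])

def bezier_poly(t):
    # Bernstein-polynomial form: sum of B_{i,3}(t) * P_i
    return ((1 - t)**3 * P[0] + 3 * (1 - t)**2 * t * P[1]
            + 3 * (1 - t) * t**2 * P[2] + t**3 * P[3])

def bezier_deriv(t):
    # Closed-form derivative: 3 * sum of B_{i,2}(t) * (P_{i+1} - P_i)
    return 3 * ((1 - t)**2 * (P[1] - P[0])
                + 2 * (1 - t) * t * (P[2] - P[1])
                + t**2 * (P[3] - P[2]))

def de_casteljau(t):
    # Repeated linear interpolation between adjacent control points
    pts = P.copy()
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]
```

The polynomial form is one fixed expression per degree, which is why it is so concise; De Casteljau trades that for numerical stability and the ability to split the curve at t as a by-product.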