Using Sho for Optimization Tasks

Our colleague Nathan Brixius just wrote a great blog post about some pretty cool tricks you can do with the Optimization package that ships with Sho (including solving the classic N-Queens problem); the package is a thin wrapper around his team's Solver Foundation.  My personal favorite in the package is Optimizer.QuasiNewton, which implements L-BFGS for fast, unconstrained optimization.  I use it all the time for machine learning techniques that involve minimizing some (ideally convex) loss function, like Logistic Regression, which comes to just a few dozen lines of code when you have QuasiNewton at your disposal.
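
To give a feel for what that looks like, here is a minimal sketch of logistic regression trained by an L-BFGS-style quasi-Newton optimizer.  It is not Sho's actual API: SciPy's fmin_l_bfgs_b stands in for Optimizer.QuasiNewton so the example runs anywhere, but the objective/gradient pair you write is exactly what you would hand to the Sho optimizer.

    # Sketch: logistic regression via L-BFGS.  SciPy's fmin_l_bfgs_b is used
    # here as a stand-in for Sho's Optimizer.QuasiNewton (assumption: the Sho
    # call consumes the same loss-and-gradient function; its exact signature
    # is not shown here).
    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    def neg_log_likelihood(w, X, y, l2=1.0):
        """Regularized logistic loss and its gradient.

        X: (n, d) feature matrix, y: labels in {-1, +1}, w: (d,) weights.
        Returns (loss, gradient), the pair L-BFGS expects.
        """
        margins = y * (X @ w)
        # sum of log(1 + exp(-margin)), computed stably, plus an L2 penalty
        loss = np.sum(np.logaddexp(0.0, -margins)) + 0.5 * l2 * (w @ w)
        # sigma(-margin) computed stably as exp(-log(1 + exp(margin)))
        sig = np.exp(-np.logaddexp(0.0, margins))
        grad = -(X.T @ (y * sig)) + l2 * w
        return loss, grad

    def fit_logistic(X, y, l2=1.0):
        w0 = np.zeros(X.shape[1])
        w, _, _ = fmin_l_bfgs_b(neg_log_likelihood, w0, args=(X, y, l2))
        return w

    if __name__ == "__main__":
        # Tiny synthetic problem to show the fit recovering the true weights.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))
        true_w = np.array([1.5, -2.0, 0.5])
        y = np.where(X @ true_w + 0.3 * rng.normal(size=200) > 0, 1.0, -1.0)
        print("estimated weights:", fit_logistic(X, y))

All of the real work is in neg_log_likelihood; the optimizer itself is a one-line call, which is the point of having a quasi-Newton routine on hand.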