diff scripts/optimization/fminunc.m @ 20165:f1d0f506ee78 stable

doc: Update more docstrings to have one sentence summary as first line.

Reviewed optimization, polynomial, signal script directories.

* scripts/optimization/fminbnd.m, scripts/optimization/fminsearch.m,
scripts/optimization/fminunc.m, scripts/optimization/fsolve.m,
scripts/optimization/fzero.m, scripts/optimization/glpk.m,
scripts/optimization/lsqnonneg.m, scripts/optimization/pqpnonneg.m,
scripts/optimization/qp.m, scripts/optimization/sqp.m,
scripts/polynomial/compan.m, scripts/polynomial/mkpp.m,
scripts/polynomial/mpoles.m, scripts/polynomial/pchip.m,
scripts/polynomial/poly.m, scripts/polynomial/polyaffine.m,
scripts/polynomial/polyder.m, scripts/polynomial/polyeig.m,
scripts/polynomial/polyfit.m, scripts/polynomial/polygcd.m,
scripts/polynomial/polyint.m, scripts/polynomial/polyout.m,
scripts/polynomial/polyval.m, scripts/polynomial/ppder.m,
scripts/polynomial/ppint.m, scripts/polynomial/ppjumps.m,
scripts/polynomial/ppval.m, scripts/polynomial/residue.m,
scripts/polynomial/roots.m, scripts/polynomial/spline.m,
scripts/polynomial/splinefit.m, scripts/polynomial/unmkpp.m,
scripts/signal/arch_fit.m, scripts/signal/arch_rnd.m,
scripts/signal/arma_rnd.m, scripts/signal/autoreg_matrix.m,
scripts/signal/bartlett.m, scripts/signal/blackman.m,
scripts/signal/detrend.m, scripts/signal/diffpara.m,
scripts/signal/durbinlevinson.m, scripts/signal/fftconv.m,
scripts/signal/fftfilt.m, scripts/signal/fftshift.m,
scripts/signal/filter2.m, scripts/signal/freqz.m,
scripts/signal/hamming.m, scripts/signal/hanning.m,
scripts/signal/hurst.m, scripts/signal/ifftshift.m,
scripts/signal/periodogram.m, scripts/signal/sinc.m,
scripts/signal/sinetone.m, scripts/signal/sinewave.m,
scripts/signal/spectral_adf.m, scripts/signal/spectral_xdf.m,
scripts/signal/spencer.m, scripts/signal/stft.m,
scripts/signal/synthesis.m, scripts/signal/unwrap.m,
scripts/signal/yulewalker.m: Update more docstrings to have one sentence
summary as first line.
author Rik <rik@octave.org>
date Mon, 04 May 2015 21:50:57 -0700
parents 9fc020886ae9
children a7dbc4fc3732 2935d56203a4
line wrap: on
line diff
--- a/scripts/optimization/fminunc.m	Mon May 04 14:22:02 2015 -0700
+++ b/scripts/optimization/fminunc.m	Mon May 04 21:50:57 2015 -0700
@@ -25,31 +25,34 @@
 ## Solve an unconstrained optimization problem defined by the function
 ## @var{fcn}.
 ##
-## @var{fcn} should accept a vector (array) defining the unknown variables,
-## and return the objective function value, optionally with gradient.
+## @var{fcn} should accept a vector (array) defining the unknown variables, and
+## return the objective function value, optionally with gradient.
 ## @code{fminunc} attempts to determine a vector @var{x} such that
-## @code{@var{fcn} (@var{x})} is a local minimum.  @var{x0} determines a
-## starting guess.  The shape of @var{x0} is preserved in all calls to
-## @var{fcn}, but otherwise is treated as a column vector.
-## @var{options} is a structure specifying additional options.
-## Currently, @code{fminunc} recognizes these options:
+## @code{@var{fcn} (@var{x})} is a local minimum.
+##
+## @var{x0} determines a starting guess.  The shape of @var{x0} is preserved in
+## all calls to @var{fcn}, but otherwise is treated as a column vector.
+##
+## @var{options} is a structure specifying additional options.  Currently,
+## @code{fminunc} recognizes these options:
 ## @qcode{"FunValCheck"}, @qcode{"OutputFcn"}, @qcode{"TolX"},
 ## @qcode{"TolFun"}, @qcode{"MaxIter"}, @qcode{"MaxFunEvals"},
-## @qcode{"GradObj"}, @qcode{"FinDiffType"},
-## @qcode{"TypicalX"}, @qcode{"AutoScaling"}.
+## @qcode{"GradObj"}, @qcode{"FinDiffType"}, @qcode{"TypicalX"},
+## @qcode{"AutoScaling"}.
 ##
-## If @qcode{"GradObj"} is @qcode{"on"}, it specifies that @var{fcn},
-## when called with 2 output arguments, also returns the Jacobian matrix
-## of partial first derivatives at the requested point.
-## @code{TolX} specifies the termination tolerance for the unknown variables
-## @var{x}, while @code{TolFun} is a tolerance for the objective function
-## value @var{fval}.  The default is @code{1e-7} for both options.
+## If @qcode{"GradObj"} is @qcode{"on"}, it specifies that @var{fcn}, when
+## called with 2 output arguments, also returns the Jacobian matrix of partial
+## first derivatives at the requested point.  @code{TolX} specifies the
+## termination tolerance for the unknown variables @var{x}, while @code{TolFun}
+## is a tolerance for the objective function value @var{fval}.  The default is
+## @code{1e-7} for both options.
 ##
 ## For a description of the other options, see @code{optimset}.
 ##
 ## On return, @var{x} is the location of the minimum and @var{fval} contains
-## the value of the objective function at @var{x}.  @var{info} may be one of the
-## following values:
+## the value of the objective function at @var{x}.
+##
+## @var{info} may be one of the following values:
 ##
 ## @table @asis
 ## @item 1
@@ -77,11 +80,13 @@
 ## (@var{output}), the output gradient (@var{grad}) at the solution @var{x},
 ## and approximate Hessian (@var{hess}) at the solution @var{x}.
 ##
-## Notes: If have only a single nonlinear equation of one variable then using
-## @code{fminbnd} is usually a much better idea.  The algorithm used is a
-## gradient search which depends on the objective function being differentiable.
-## If the function has discontinuities it may be better to use a derivative-free
-## algorithm such as @code{fminsearch}.
+## Application Notes: If you have only a single nonlinear equation of one
+## variable then using @code{fminbnd} is usually a better choice.
+##
+## The algorithm used by @code{fminunc} is a gradient search which depends
+## on the objective function being differentiable.  If the function has
+## discontinuities it may be better to use a derivative-free algorithm such as
+## @code{fminsearch}.
 ## @seealso{fminbnd, fminsearch, optimset}
 ## @end deftypefn
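
For reference, here is a minimal usage sketch of the interface documented
above.  It is not part of this changeset; the Rosenbrock objective, the
starting point, and the option values are illustrative assumptions only.

  ## Objective returning the function value and, when a second output is
  ## requested, the gradient, so that "GradObj" can be set to "on".
  function [fval, grad] = rosenbrock (x)
    fval = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2;
    if (nargout > 1)
      grad = [-400 * x(1) * (x(2) - x(1)^2) - 2 * (1 - x(1));
              200 * (x(2) - x(1)^2)];
    endif
  endfunction

  ## Starting guess; its shape (a column vector here) is preserved in all
  ## calls to the objective function.
  x0 = [-1.2; 1];

  ## TolX and TolFun shown at their documented defaults (1e-7).
  opts = optimset ("GradObj", "on", "TolX", 1e-7, "TolFun", 1e-7);

  [x, fval, info] = fminunc (@rosenbrock, x0, opts);

On success, info is positive, as described in the table of return codes in
the docstring above.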