Dynare / dynare / Commits

Commit 0718a3eb
Authored Jan 22, 2021 by Johannes Pfeifer

dynare_minimize_objective: fix use of analytic derivatives by employing wrapper function

Parent: ff427807
Showing 1 changed file with 61 additions and 28 deletions.

matlab/optimization/dynare_minimize_objective.m (+61, -28)
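The wrapper named in the commit message, analytic_gradient_wrapper, is defined elsewhere in the Dynare tree and is not part of this diff; the hunks below only show how it is called. As a rough sketch of the pattern it implements (the sketch's name, signature, and output order are assumptions for illustration, not the repository's actual code):

    % Illustrative sketch only; the real analytic_gradient_wrapper may differ.
    % It adapts an objective that returns its analytic gradient among its later
    % outputs to the [fval, grad] form that MATLAB/Octave optimizers expect once
    % gradient-based options are switched on.
    function [fval, grad] = analytic_gradient_wrapper_sketch(x, fcn, varargin)
        % Assumed output order of the underlying objective (hypothetical):
        % value first, gradient as a trailing output.
        [fval, ~, ~, grad] = fcn(x, varargin{:});
    end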
@@ -77,24 +77,37 @@ switch minimizer_algorithm
         % is not able to even move away from the initial point.
         optim_options = optimoptions(optim_options,'Algorithm','active-set');
     end
+    if options_.analytic_derivation || options_.mom.analytic_jacobian==1
+        optim_options = optimoptions(optim_options,'GradObj','on','TolX',1e-7); %alter default TolX
+    end
     if ~isempty(options_.optim_opt)
         eval(['optim_options = optimoptions(optim_options,' options_.optim_opt ');']);
     end
     if options_.silent_optimizer
         optim_options = optimoptions(optim_options,'display','off');
     end
-    if options_.analytic_derivation
-        optim_options = optimoptions(optim_options,'GradObj','on','TolX',1e-7); %alter default TolX
-    end
-    if ~isoctave
-        [opt_par_values,fval,exitflag,output,lamdba,grad,hessian_mat] = ...
-            fmincon(objective_function,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options,varargin{:});
-    else
-        % Under Octave, use a wrapper, since fmincon() does not have an 11th
-        % arg. Also, only the first 4 output arguments are available.
-        func = @(x) objective_function(x,varargin{:});
-        [opt_par_values,fval,exitflag,output] = ...
-            fmincon(func,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options);
+    if options_.analytic_derivation || options_.mom.analytic_jacobian==1
+        %use wrapper
+        func = @(x) analytic_gradient_wrapper(x,objective_function,varargin{:});
+        if ~isoctave
+            [opt_par_values,fval,exitflag,output,lamdba,grad,hessian_mat] = ...
+                fmincon(func,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options);
+        else
+            % Under Octave, use a wrapper, since fmincon() does not have an 11th
+            % arg. Also, only the first 4 output arguments are available.
+            [opt_par_values,fval,exitflag,output] = ...
+                fmincon(func,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options);
+        end
+    else
+        if ~isoctave
+            [opt_par_values,fval,exitflag,output,lamdba,grad,hessian_mat] = ...
+                fmincon(objective_function,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options,varargin{:});
+        else
+            % Under Octave, use a wrapper, since fmincon() does not have an 11th
+            % arg. Also, only the first 4 output arguments are available.
+            func = @(x) objective_function(x,varargin{:});
+            [opt_par_values,fval,exitflag,output] = ...
+                fmincon(func,start_par_value,[],[],[],[],bounds(:,1),bounds(:,2),[],optim_options);
+        end
     end
   case 2
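The hunk above switches fmincon to the wrapper only when analytic derivatives are requested: with 'GradObj' set to 'on', fmincon queries the objective's second output for the gradient, so the extra arguments can no longer be forwarded through trailing inputs and are bound inside the function handle instead. A minimal standalone illustration of that calling convention, not taken from Dynare (the Rosenbrock objective and all names here are invented for the example; 'SpecifyObjectiveGradient' is the current name of the legacy 'GradObj' flag):

    % Objective returning both its value and analytic gradient; fmincon uses the
    % second output because SpecifyObjectiveGradient is enabled.
    function demo_fmincon_with_gradient
        opts = optimoptions('fmincon','SpecifyObjectiveGradient',true,'Display','off');
        lb = [-5; -5]; ub = [5; 5];
        x = fmincon(@rosen_with_grad, [-1; 2], [],[],[],[], lb, ub, [], opts);
        disp(x)   % should be close to [1; 1]
    end

    function [f, g] = rosen_with_grad(x)
        f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
              200*(x(2) - x(1)^2)];
    end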
@@ -159,20 +172,28 @@ switch minimizer_algorithm
     if ~isempty(options_.optim_opt)
         eval(['optim_options = optimoptions(optim_options,' options_.optim_opt ');']);
     end
-    if options_.analytic_derivation
-        optim_options = optimoptions(optim_options,'GradObj','on');
-    end
     if options_.silent_optimizer
         optim_options = optimoptions(optim_options,'display','off');
     end
-    if ~isoctave
-        [opt_par_values,fval,exitflag] = fminunc(objective_function,start_par_value,optim_options,varargin{:});
-    else
-        % Under Octave, use a wrapper, since fminunc() does not have a 4th arg
-        func = @(x) objective_function(x,varargin{:});
-        [opt_par_values,fval,exitflag] = fminunc(func,start_par_value,optim_options);
+    if options_.analytic_derivation || options_.mom.analytic_jacobian==1
+        optim_options = optimoptions(optim_options,'GradObj','on');
+        if ~isoctave
+            func = @(x) analytic_gradient_wrapper(x,objective_function,varargin{:});
+            [opt_par_values,fval,exitflag] = fminunc(func,start_par_value,optim_options);
+        else
+            % Under Octave, use a wrapper, since fminunc() does not have a 4th arg
+            func = @(x) analytic_gradient_wrapper(x,objective_function,varargin{:});
+            [opt_par_values,fval,exitflag] = fminunc(func,start_par_value,optim_options);
+        end
+    else
+        if ~isoctave
+            [opt_par_values,fval,exitflag] = fminunc(objective_function,start_par_value,optim_options,varargin{:});
+        else
+            % Under Octave, use a wrapper, since fminunc() does not have a 4th arg
+            func = @(x) objective_function(x,varargin{:});
+            [opt_par_values,fval,exitflag] = fminunc(func,start_par_value,optim_options);
+        end
     end
   case 4
     % Set default options.
     H0 = 1e-4*eye(n_params);
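Both branches of the fminunc hunk rely on the same idea: any extra data the objective needs is captured in an anonymous function, because, as the comment in the hunk notes, Octave's fminunc has no fourth argument for pass-through inputs. A small generic illustration of that binding pattern (data and objective invented for the example; optimset is used so the snippet runs under both MATLAB and Octave):

    % Bind extra data into a single-argument handle, as the diff does with varargin{:}.
    data = [1 2 3 4];                              % extra input the objective needs
    obj  = @(x, d) sum((d(:) - x(1)).^2) + x(2)^2; % objective taking (parameters, data)
    func = @(x) obj(x, data);                      % data is now captured in the handle
    opts = optimset('Display','off');
    [xhat, fval] = fminunc(func, [0; 0], opts);
    disp(xhat)   % xhat(1) close to mean(data) = 2.5, xhat(2) close to 0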
@@ -505,7 +526,12 @@ switch minimizer_algorithm
     if options_.silent_optimizer
         solveoptoptions.verbosity = 0;
     end
-    [opt_par_values,fval] = solvopt(start_par_value,objective_function,[],[],[],solveoptoptions,varargin{:});
+    if options_.analytic_derivation || options_.mom.analytic_jacobian==1
+        func = @(x) analytic_gradient_wrapper(x,objective_function,varargin{:});
+        [opt_par_values,fval] = solvopt(start_par_value,func,1,[],[],solveoptoptions);
+    else
+        [opt_par_values,fval] = solvopt(start_par_value,objective_function,[],[],[],solveoptoptions,varargin{:});
+    end
   case 102
     if isoctave
         error('Optimization algorithm 2 is not available under Octave')
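Since the commit routes analytic derivatives into several optimizers at once, a quick way to gain confidence in any analytic gradient is to compare it against central finite differences at a test point. A self-contained sketch with an invented toy objective (not Dynare code):

    % Compare an analytic gradient against central finite differences.
    f     = @(x) 0.5*x(1)^2 + 3*x(1)*x(2) + 2*x(2)^2;   % toy objective
    gradf = @(x) [x(1) + 3*x(2); 3*x(1) + 4*x(2)];      % its analytic gradient
    x0 = [0.7; -1.2];
    h  = 1e-6;
    g_fd = zeros(2,1);
    for i = 1:2
        e = zeros(2,1); e(i) = h;
        g_fd(i) = (f(x0 + e) - f(x0 - e)) / (2*h);      % central difference
    end
    max_err = max(abs(g_fd - gradf(x0)));
    fprintf('max |analytic - finite difference| = %g\n', max_err);  % should be tiny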
@@ -534,15 +560,22 @@ switch minimizer_algorithm
         eval(['optim_options = optimoptions(optim_options,' options_.optim_opt ');']);
     end
     if options_.silent_optimizer
-        optim_options = optimoptions(optim_options,'Display','off');
+        optim_options.Display = 'off';
     end
-    if ~isoctave
-        [opt_par_values,Resnorm,fval,exitflag,OUTPUT,LAMBDA,JACOB] = lsqnonlin(objective_function,start_par_value,bounds(:,1),bounds(:,2),optim_options,varargin{:});
-    else
-        % Under Octave, use a wrapper, since lsqnonlin() does not have a 6th arg
-        func = @(x) objective_function(x,varargin{:});
-        [opt_par_values,Resnorm,fval,exitflag,OUTPUT,LAMBDA,JACOB] = lsqnonlin(func,start_par_value,bounds(:,1),bounds(:,2),optim_options);
+    if options_.analytic_derivation || options_.mom.analytic_jacobian==1
+        optim_options.SpecifyObjectiveGradient = true;
+        func = @(x) analytic_gradient_wrapper(x,objective_function,varargin{:});
+        [opt_par_values,Resnorm,fval,exitflag,OUTPUT,LAMBDA,JACOB] = ...
+            lsqnonlin(func,start_par_value,bounds(:,1),bounds(:,2),optim_options);
+    else
+        if ~isoctave
+            [opt_par_values,Resnorm,fval,exitflag,OUTPUT,LAMBDA,JACOB] = lsqnonlin(objective_function,start_par_value,bounds(:,1),bounds(:,2),optim_options,varargin{:});
+        else
+            % Under Octave, use a wrapper, since lsqnonlin() does not have a 6th arg
+            func = @(x) objective_function(x,varargin{:});
+            [opt_par_values,Resnorm,fval,exitflag,OUTPUT,LAMBDA,JACOB] = lsqnonlin(func,start_par_value,bounds(:,1),bounds(:,2),optim_options);
+        end
     end
   otherwise
     if ischar(minimizer_algorithm)
         if exist(minimizer_algorithm)
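The lsqnonlin hunk differs from the scalar optimizers above: with SpecifyObjectiveGradient set to true, the objective's second output must be the Jacobian of the residual vector rather than the gradient of a scalar. A minimal standalone illustration with an invented exponential-decay fit (not Dynare code):

    % Fit y = exp(-b*t) by nonlinear least squares, supplying residuals and Jacobian.
    function demo_lsqnonlin_with_jacobian
        t = (0:0.5:3)';
        y = exp(-1.3*t);                % synthetic data, true parameter 1.3
        opts = optimoptions('lsqnonlin','SpecifyObjectiveGradient',true,'Display','off');
        bhat = lsqnonlin(@(b) resid_and_jac(b, t, y), 0.5, 0, 5, opts);
        disp(bhat)                      % should be close to 1.3
    end

    function [r, J] = resid_and_jac(b, t, y)
        r = exp(-b*t) - y;              % residual vector
        J = -t.*exp(-b*t);              % Jacobian with respect to b (n-by-1)
    end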