Python wrappers for Dakota analysis methods.
Abstract base classes for Dakota analysis methods.
class dakotathon.method.base.MethodBase(method='vector_parameter_study', max_iterations=None, convergence_tolerance=None, **kwargs)
    Bases: object
Describe common features of Dakota analysis methods.
The max_iterations and convergence_tolerance keywords are included in Dakota’s set of method independent controls.
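As a rough sketch (not the library's actual implementation), these method-independent controls can be rendered into Dakota keyword lines only when they are set; the helper name below is hypothetical:

```python
def independent_controls(max_iterations=None, convergence_tolerance=None):
    """Render Dakota's method-independent control keywords.

    Unset (None) controls are omitted from the method block.
    """
    lines = []
    if max_iterations is not None:
        lines.append("max_iterations = {}".format(max_iterations))
    if convergence_tolerance is not None:
        lines.append("convergence_tolerance = {}".format(convergence_tolerance))
    return lines
```

Leaving both keywords at their default of None thus produces no extra lines in the method block.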
__init__(method='vector_parameter_study', max_iterations=None, convergence_tolerance=None, **kwargs)
    Create default method parameters.
convergence_tolerance
    Convergence tolerance for the method.

max_iterations
    Maximum number of iterations for the method.

method
    The name of the analysis method used in the experiment.
class dakotathon.method.base.UncertaintyQuantificationBase(basis_polynomial_family='extended', probability_levels=(0.1, 0.5, 0.9), response_levels=(), samples=10, sample_type='random', seed=None, variance_based_decomp=False, **kwargs)
    Bases: dakotathon.method.base.MethodBase
Describe features of uncertainty quantification methods.
To supply probability_levels or response_levels for multiple responses, nest the inputs to these properties.
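For example, a nested input supplies one tuple of levels per response. The flattening helper below is a hypothetical sketch of how such an input could be collapsed into a single Dakota keyword line, not dakotathon's actual code:

```python
def flatten_levels(levels):
    """Flatten per-response level tuples for a Dakota keyword line.

    A nested input like ((0.1, 0.5, 0.9), (0.2, 0.8)) supplies one
    tuple of levels per response; a flat input applies to a single
    response.
    """
    if levels and hasattr(levels[0], "__iter__"):
        return [x for group in levels for x in group]
    return list(levels)

# Two responses, each with its own probability levels.
nested = ((0.1, 0.5, 0.9), (0.2, 0.8))
```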
__init__(basis_polynomial_family='extended', probability_levels=(0.1, 0.5, 0.9), response_levels=(), samples=10, sample_type='random', seed=None, variance_based_decomp=False, **kwargs)
    Create default method parameters.
__str__()
    Define the method block for a UQ experiment.

    Overrides: dakotathon.method.base.MethodBase.__str__
basis_polynomial_family
    The type of basis polynomials used by the method.

probability_levels
    Probabilities at which to estimate response values.

response_levels
    Values at which to estimate statistics for responses.

sample_type
    Sampling strategy.

samples
    Number of samples in the experiment.

seed
    Seed of the random number generator.

variance_based_decomp
    Use variance-based decomposition for global sensitivity analysis.
Implementation of a Dakota centered parameter study.
class dakotathon.method.centered_parameter_study.CenteredParameterStudy(steps_per_variable=(5, 4), step_vector=(0.4, 0.5), **kwargs)
    Bases: dakotathon.method.base.MethodBase
Define parameters for a Dakota centered parameter study.
__init__(steps_per_variable=(5, 4), step_vector=(0.4, 0.5), **kwargs)
    Create a new Dakota centered parameter study.

    Create a default centered parameter study experiment:

    >>> from dakotathon.method.centered_parameter_study import CenteredParameterStudy
    >>> c = CenteredParameterStudy()
__str__()
    Define a centered parameter study method block.

    Overrides: dakotathon.method.base.MethodBase.__str__
step_vector
    Step size in each direction.

steps_per_variable
    Number of steps to take in each direction.
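To illustrate what these two properties control, here is a hypothetical sketch of how a centered study enumerates its evaluation points (the actual evaluations are generated by Dakota itself, not by dakotathon):

```python
def centered_points(initial, steps_per_variable, step_vector):
    """Enumerate the points a centered parameter study evaluates.

    Starting from the initial point, each variable i is perturbed by
    +/- k * step_vector[i] for k = 1..steps_per_variable[i], one
    variable at a time.
    """
    points = [tuple(initial)]
    for i, (n, dx) in enumerate(zip(steps_per_variable, step_vector)):
        for k in range(1, n + 1):
            for sign in (-1, 1):
                p = list(initial)
                p[i] += sign * k * dx
                points.append(tuple(p))
    return points
```

A study with the defaults steps_per_variable=(5, 4) therefore runs 1 + 2*(5 + 4) = 19 evaluations.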
Implementation of a Dakota multidim parameter study.
class dakotathon.method.multidim_parameter_study.MultidimParameterStudy(partitions=(10, 8), **kwargs)
    Bases: dakotathon.method.base.MethodBase
Define parameters for a Dakota multidim parameter study.
__init__(partitions=(10, 8), **kwargs)
    Create a new Dakota multidim parameter study.

    Create a default multidim parameter study experiment:

    >>> from dakotathon.method.multidim_parameter_study import MultidimParameterStudy
    >>> m = MultidimParameterStudy()
__str__()
    Define a multidim parameter study method block.

    Overrides: dakotathon.method.base.MethodBase.__str__
partitions
    The number of evaluation intervals for each parameter.
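As a sketch of what partitions means geometrically (the grid itself is built by Dakota, not by this helper), each entry divides a variable's range into that many equal intervals, giving partitions[i] + 1 points per axis:

```python
from itertools import product

def grid_points(lower, upper, partitions):
    """Grid evaluated by a multidim parameter study.

    partitions[i] equal intervals along variable i yield
    partitions[i] + 1 points on that axis; the study evaluates
    the full tensor-product grid.
    """
    axes = [
        [lo + (hi - lo) * k / p for k in range(p + 1)]
        for lo, hi, p in zip(lower, upper, partitions)
    ]
    return list(product(*axes))
```

The default partitions=(10, 8) thus yields an 11 x 9 grid of 99 evaluations.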
Implementation of a Dakota vector parameter study.
class dakotathon.method.vector_parameter_study.VectorParameterStudy(final_point=(1.1, 1.3), n_steps=10, **kwargs)
    Bases: dakotathon.method.base.MethodBase
Define parameters for a Dakota vector parameter study.
__init__(final_point=(1.1, 1.3), n_steps=10, **kwargs)
    Create a new Dakota vector parameter study.

    Create a default vector parameter study experiment:

    >>> from dakotathon.method.vector_parameter_study import VectorParameterStudy
    >>> v = VectorParameterStudy()
__str__()
    Define a vector parameter study method block for a Dakota input file.

    Overrides: dakotathon.method.base.MethodBase.__str__
final_point
    End points used by study variables.

n_steps
    Number of steps along the vector.
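A hypothetical sketch of the points a vector study visits (Dakota performs the actual sampling): the study steps linearly from an initial point to final_point in n_steps increments, so it runs n_steps + 1 evaluations including both endpoints.

```python
def vector_points(initial_point, final_point, n_steps):
    """Points along the vector from initial_point to final_point.

    n_steps increments give n_steps + 1 evaluations, with both
    endpoints included.
    """
    return [
        tuple(a + (b - a) * k / n_steps for a, b in zip(initial_point, final_point))
        for k in range(n_steps + 1)
    ]
```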
Implementation of the Dakota sampling method.
class dakotathon.method.sampling.Sampling(**kwargs)
    Bases: dakotathon.method.base.UncertaintyQuantificationBase
The Dakota sampling method.
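To show what the inherited samples, sample_type, and seed properties govern, here is a minimal sketch of seeded uniform random sampling over per-variable bounds; dakotathon delegates the actual sampling to Dakota, so this helper is purely illustrative:

```python
import random

def draw_samples(bounds, samples=10, seed=None):
    """Draw Monte Carlo samples uniformly within per-variable bounds.

    Passing the same seed reproduces the same sample set, which is
    why a seed is useful for repeatable experiments.
    """
    rng = random.Random(seed)
    return [
        tuple(rng.uniform(lo, hi) for lo, hi in bounds)
        for _ in range(samples)
    ]
```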
Implementation of the Dakota polynomial chaos method.
class dakotathon.method.polynomial_chaos.PolynomialChaos(coefficient_estimation_approach='quadrature_order_sequence', quadrature_order=2, dimension_preference=(), nested=False, **kwargs)
    Bases: dakotathon.method.base.UncertaintyQuantificationBase
The Dakota polynomial chaos uncertainty quantification method.
Designation of a coefficient estimation approach is required, but the only approach currently implemented is quadrature_order_sequence. This approach obtains the coefficients of the expansion through multidimensional integration by a tensor product of Gaussian quadrature rules, specified with quadrature_order and, optionally, dimension_preference. If dimension_preference is defined, its highest value is set to the quadrature_order.
This implementation of the polynomial chaos method is based on the description provided in the Dakota 6.4 documentation.
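The relationship between quadrature_order and dimension_preference described above can be sketched as follows; this is a hypothetical helper illustrating the proportional scaling, not Dakota's internal code:

```python
def anisotropic_orders(quadrature_order, dimension_preference):
    """Scale per-dimension quadrature orders by preference weights.

    The dimension with the highest preference is assigned the full
    quadrature_order; the others are scaled down proportionally,
    with a floor of 1.
    """
    top = max(dimension_preference)
    return [
        max(1, round(quadrature_order * p / top))
        for p in dimension_preference
    ]
```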
__init__(coefficient_estimation_approach='quadrature_order_sequence', quadrature_order=2, dimension_preference=(), nested=False, **kwargs)
    Create a new Dakota polynomial chaos study.

    Create a default instance of PolynomialChaos with:

    >>> from dakotathon.method.polynomial_chaos import PolynomialChaos
    >>> m = PolynomialChaos()
__str__()
    Define the method block for a polynomial_chaos experiment.

    Display the method block created by a default instance of PolynomialChaos:

    >>> from dakotathon.method.polynomial_chaos import PolynomialChaos
    >>> m = PolynomialChaos()
    >>> print(m)
    method
      polynomial_chaos
        sample_type = random
        samples = 10
        probability_levels = 0.1 0.5 0.9
        quadrature_order = 2
        non_nested
    <BLANKLINE>
    <BLANKLINE>

    Overrides: dakotathon.method.base.UncertaintyQuantificationBase.__str__
dimension_preference
    Weights specifying the relative importance of each dimension.

nested
    Enforce use of nested quadrature rules.

quadrature_order
    The highest order polynomial used by the method.
Implementation of the Dakota stochastic collocation method.
class dakotathon.method.stoch_collocation.StochasticCollocation(coefficient_estimation_approach='quadrature_order_sequence', quadrature_order=2, dimension_preference=(), nested=False, **kwargs)
    Bases: dakotathon.method.base.UncertaintyQuantificationBase
The Dakota stochastic collocation uncertainty quantification method.
Stochastic collocation is a general framework for approximate representation of random response functions in terms of finite-dimensional interpolation bases. Stochastic collocation is very similar to polynomial chaos, with the key difference that the orthogonal polynomial basis functions are replaced with interpolation polynomial bases.
This implementation of the stochastic collocation method is based on the description provided in the Dakota 6.4 documentation.
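To make the contrast with polynomial chaos concrete: an interpolation basis satisfies L_j(x_k) = 1 when j equals k and 0 at every other collocation node, so the approximation passes exactly through the sampled response values. A minimal one-dimensional Lagrange basis sketch (illustrative only, not Dakota's implementation):

```python
def lagrange_basis(nodes, j, x):
    """Evaluate the j-th Lagrange interpolation polynomial at x.

    The polynomial equals 1 at nodes[j] and 0 at every other node,
    which is the defining property of an interpolation basis.
    """
    value = 1.0
    for k, xk in enumerate(nodes):
        if k != j:
            value *= (x - xk) / (nodes[j] - xk)
    return value
```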
__init__(coefficient_estimation_approach='quadrature_order_sequence', quadrature_order=2, dimension_preference=(), nested=False, **kwargs)
    Create a new Dakota stochastic collocation study.

    Create a default instance of StochasticCollocation with:

    >>> from dakotathon.method.stoch_collocation import StochasticCollocation
    >>> m = StochasticCollocation()
__str__()
    Define the method block for a stoch_collocation experiment.

    Display the method block created by a default instance of StochasticCollocation:

    >>> from dakotathon.method.stoch_collocation import StochasticCollocation
    >>> m = StochasticCollocation()
    >>> print(m)
    method
      stoch_collocation
        sample_type = random
        samples = 10
        probability_levels = 0.1 0.5 0.9
        quadrature_order = 2
        non_nested
    <BLANKLINE>
    <BLANKLINE>

    Overrides: dakotathon.method.base.UncertaintyQuantificationBase.__str__
basis_polynomial_family
    The type of basis polynomials used by the method.

dimension_preference
    Weights specifying the relative importance of each dimension.

nested
    Enforce use of nested quadrature rules.

quadrature_order
    The highest order polynomial used by the method.