API

MonotoneDecomposition._optim (Method)
_optim(y::AbstractVector, workspace::WorkSpaceCS, μs::AbstractVector)
_optim(y::AbstractVector, J::Int, B::AbstractMatrix, H::AbstractMatrix{Int}, μs::AbstractVector)

Optimization for monotone decomposition with cubic B-splines.

_optim(y::AbstractVector, J::Int, B::AbstractMatrix, H::AbstractMatrix{Int}, L::AbstractMatrix, λs::AbstractVector, μs::AbstractVector)

Optimization for monotone decomposition with smoothing splines.

_optim!(y::AbstractVector, J::Int, B::AbstractMatrix, s::Union{Nothing, Real}, γhat::AbstractVector, H::AbstractMatrix{Int}; L, t, λ, μ)

MonotoneDecomposition.benchmarking (Function)
benchmarking(f::String; n = 100, 
                        σs = 0.2:0.2:1,
                        competitor = "ss_single_lambda")

Run benchmarking experiments for monotone decomposition on the curve f. Possible choices of f include:

  • simple functions: x^2, x^3, exp(x), sigmoid
  • random functions generated from a Gaussian process: SE_1, SE_0.1, Mat12_1, Mat12_0.1, Mat32_1, Mat32_0.1, RQ_0.1_0.5, Periodic_0.1_4

Arguments

  • n::Integer = 100: sample size for the simulated curve
  • σs::AbstractVector: a vector of noise levels to be investigated
  • competitor::String: a string to indicate the strategy used in monotone decomposition. Possible choices:
    • ss_single_lambda: decomposition with smoothing splines ss with the single_lambda strategy
    • ss_fix_ratio: decomposition with smoothing splines ss with the fix_ratio strategy
    • ss_grid_search: decomposition with smoothing splines ss with the grid_search strategy
    • ss_iter_search: decomposition with smoothing splines ss with the iter_search strategy
    • bspl: decomposition with cubic splines cs
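
Example

A minimal call using the defaults shown above; "x^2" is one of the simple-function candidates listed earlier:

# benchmark monotone decomposition (smoothing splines, single_lambda strategy) on the quadratic curve
benchmarking("x^2"; n = 100, σs = 0.2:0.2:1, competitor = "ss_single_lambda")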

MonotoneDecomposition.benchmarking_cs (Function)
benchmarking_cs(n, σ, f; figname_cv = nothing, figname_fit = nothing)

Run benchmarking experiments for decomposition with cubic splines on n observations sampled from curve f with noise σ.

Optional Arguments

  • figname_cv: if not nothing, the cross-validation error will be plotted and saved to the given path.
  • figname_fit: if not nothing, the fitted curves will be plotted and saved to the given path.
  • Js: the candidate numbers of basis functions.
  • fixJ: whether to use the CV-tuned J from the corresponding cubic spline fitting.
  • nfold: the number of folds in the cross-validation procedure.
  • one_se_rule: whether to use the one-standard-error rule to select the parameter after the cross-validation procedure.
  • μs: the candidate values of the discrepancy tuning parameter μ.
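
Example

A minimal sketch; the curve name and figure paths are illustrative:

# 100 observations from the GP curve "SE_0.1" with noise level 0.5,
# saving the CV-error and fitted-curve plots
benchmarking_cs(100, 0.5, "SE_0.1"; figname_cv = "/tmp/cv.png", figname_fit = "/tmp/fit.png")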

MonotoneDecomposition.benchmarking_ss (Function)
benchmarking_ss(n::Int, σ::Float64, f::Union{Function, String}; 
                    method = "single_lambda")

Run benchmarking experiments for decomposition with smoothing splines on n observations sampled from curve f with noise σ.

Arguments

  • method::String = "single_lambda": strategy for decomposition with smoothing spline. Possible choices:
    • single_lambda
    • fix_ratio
    • grid_search
    • iter_search
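
Example

For instance, to benchmark the grid_search strategy on a GP curve:

benchmarking_ss(100, 0.5, "SE_0.1"; method = "grid_search")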

MonotoneDecomposition.build_model! (Method)
build_model!(workspace::WorkSpaceCS, x::AbstractVector{T})

Calculate the components that define the optimization problem for monotone decomposition with cubic splines.

MonotoneDecomposition.cv_mono_decomp_cs (Method)
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector, xnew::AbstractVector; )
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector; fixJ = true)

Cross-validation for Monotone Decomposition with Cubic B-splines. Parameters J and s (μ if s_is_μ) are tuned by cross-validation.

  • if fixJ == true, then J is CV-tuned by the corresponding cubic B-spline fitting
  • if fixJ == false, then both J and s would be tuned by cross-validation.

Arguments

  • figname: if not nothing, then the CV error figure will be saved under the given name (which can include the path)
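
Example

A minimal sketch (gen_data is documented below; the structure of the return value is not listed here, so it is kept in a single variable):

x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
# tune both J and s by cross-validation and save the CV error figure
res = cv_mono_decomp_cs(x, y; fixJ = false, figname = "/tmp/cv_cs.png")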

MonotoneDecomposition.cv_mono_decomp_cs (Method)
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector)

Cross-validation for monotone decomposition with cubic B-splines when the fixed J is CV-tuned by the corresponding cubic B-spline fitting method.

MonotoneDecomposition.cv_mono_decomp_ss (Method)
cv_mono_decomp_ss(x::AbstractVector, y::AbstractVector)

Cross-validation for monotone decomposition with smoothing splines. First, λ is tuned by the smoothing spline fit; then a golden-section search is performed for μ.

Returns

  • D: a MonoDecomp object.
  • workspace: a workspace containing some intermediate results
  • μmin: the parameter μ that achieves the smallest CV error
  • μs: the investigated values of the parameter μ

Example

x, y, x0, y0 = gen_data(100, 0.001, "SE_0.1")
res, workspace = cv_mono_decomp_ss(x, y, one_se_rule = true, figname = "/tmp/p.png", tol=1e-3)
yup = workspace.B * res.γup
ydown = workspace.B * res.γdown
scatter(x, y)
scatter!(x, yup)
scatter!(x, ydown)

MonotoneDecomposition.cv_one_se_rule (Method)
cv_one_se_rule(μs::AbstractVector{T}, σs::AbstractVector{T}; small_is_simple = true)
cv_one_se_rule(μs::AbstractMatrix{T}, σs::AbstractMatrix{T}; small_is_simple = [true, true])
cv_one_se_rule2(μs::AbstractMatrix{T}, σs::AbstractMatrix{T}; small_is_simple = [true, true])

Return the index of the parameter (1-dim) or parameters (2-dim) that minimizes the CV error under the one-standard-error rule.

For 2-dim parameters, cv_one_se_rule2 adopts a grid search over μ + σ, while cv_one_se_rule searches one parameter after fixing the other at its optimal value. The potential drawback of cv_one_se_rule2 is that it might fail to determine the simplest model when both parameters are away from the optimal ones, so we recommend cv_one_se_rule.
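
Example

A small numeric illustration, assuming μs holds the mean CV errors over a 1-dim parameter grid and σs their standard errors (the numbers are made up):

errs = [1.00, 0.80, 0.78, 0.79, 0.85]   # mean CV errors
ses  = [0.05, 0.05, 0.05, 0.05, 0.05]   # their standard errors
# index of the simplest parameter whose CV error is within one standard error of the minimum
i = cv_one_se_rule(errs, ses; small_is_simple = true)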

MonotoneDecomposition.cvfit (Method)
Given `μmax`, construct μs = (1:nμ) ./ nμ * μmax. If the optimal `μ` is near the boundary, double or halve `μmax`.

MonotoneDecomposition.cvfit_gss (Method)
cvfit_gss(x, y, μrange, λs)

For each λ in λs, perform cvfit(x, y, μrange, λ), and store the current best CV error. Finally, return the smallest one.

MonotoneDecomposition.cvfit_gss (Method)
cvfit_gss(x, y, μrange, λ; λ_is_μ)

Cross-validation by golden section search for `μ` in `μrange` given `λ`.

  • If λ_is_μ, the roles are swapped: search λ in μrange given the supplied λ (which plays the role of μ)
  • Note that one_se_rule is not suitable for the golden section search.

MonotoneDecomposition.cvplot (Function)
cvplot(sil::String)
cvplot(μerr::AbstractVector, σerr::Union{Nothing, AbstractVector{T}}, paras::AbstractVector)
cvplot(μerr::AbstractMatrix, σerr::AbstractMatrix, para1::AbstractVector, para2::AbstractVector)

Plot the cross-validation curves.

MonotoneDecomposition.div_into_folds (Method)
div_into_folds(N::Int; K = 10, seed = 1234)

Equally divide 1:N into K folds with random seed seed. In particular,

  • If seed is negative, it is a non-random division, where the i-th fold is the i-th equidistant range.
  • If seed = 0, it is a non-random division, where each fold consists of equidistant indices.
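
Example

A minimal sketch (the return value is assumed to be a collection of index sets, one per fold):

folds  = div_into_folds(20; K = 5, seed = 1234)  # random division into 5 folds
folds0 = div_into_folds(20; K = 5, seed = -1)    # non-random: the i-th fold is the i-th equidistant range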

MonotoneDecomposition.gen_data (Method)
gen_data(n::Int, σ::Union{Real, Nothing}, f::Union{Function, String}; xmin = -1, xmax = 1, k = 10)

Generate n data points (xi, yi) from curve f with noise level σ, i.e., yi = f(xi) + N(0, σ^2).

Arguments

  • for f

    • if f is a Function, just take y = f(x)
    • if f = "MLP", the curve is generated by a simple neural network with one layer.
    • otherwise, it accepts a string of the format KernelName_Para[_OtherPara] representing a random function drawn from a Gaussian process, where
      • SE, Mat12, Mat32, Mat52: Para is the length scale parameter
      • Poly: Para is the degree parameter p
      • RQ: Para is the length scale parameter and OtherPara is α
  • for σ: the noise level

    • if σ is nothing, then σ is calculated to achieve the given signal-to-noise ratio (snr)
  • for seed: if seed is not nothing, it ensures the same random function from the Gaussian process, but it does not influence the random noise.

Returns

It returns four vectors, x, y, x0, y0, where

  • x, y: the noisy observations, vectors of length n.
  • x0, y0: the true curve without noise, represented by k*n points.
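
Example

For instance:

# noisy observations from a random Gaussian-process curve with SE kernel (length scale 0.1)
x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
# a deterministic curve given as a Function
x2, y2, x02, y02 = gen_data(100, 0.1, x -> x^3)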

MonotoneDecomposition.mono_decomp_ss (Method)
mono_decomp_ss(workspace::WorkSpaceSS, x::AbstractVector{T}, y::AbstractVector{T}, λ::AbstractFloat, μ::AbstractFloat)

Monotone decomposition with smoothing splines.

MonotoneDecomposition.smooth_spline (Method)
smooth_spline(x::AbstractVector, y::AbstractVector, xnew::AbstractVector)

Fit a smoothing spline on (x, y), and make predictions at xnew.

Returns: yhat, ynewhat,....
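
Example

A minimal sketch, assuming the return values can be destructured in the order listed above (only the first two are captured here):

x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
yhat, ynewhat = smooth_spline(x, y, x0)  # fitted values at x and predictions at x0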

RecipesBase.plot (Method)
plot(obs, truth, D::MonoDecomp, other)

Plot the noisy observations, the true curve, the fit from the monotone decomposition D, and the fit from another technique.

  • obs: usually [x, y]
  • truth: usually [x0, y0]
  • D: a MonoDecomp object
  • other: the fitted curve [x0, other] from another method, where x0 is omitted.

A typical usage is plot([x, y], [x0, y0], D, yhatnew, prefix_title = "SE (ℓ = 1, σ = 0.5): ")

StatsAPI.predict (Method)
predict(W::WorkSpaceSS, xnew::AbstractVector, γhat::AbstractVecOrMat)
predict(W::WorkSpaceCS, xnew::AbstractVector, γhat::AbstractVecOrMat)

Make multiple predictions at xnew for each column of γhat.

StatsAPI.predict (Method)
predict(W::WorkSpaceSS, xnew::AbstractVector, γup::AbstractVector, γdown::AbstractVector)
predict(W::WorkSpaceCS, xnew::AbstractVector, γup::AbstractVector, γdown::AbstractVector)

Predict yup and ydown at xnew given workspace W and decomposition coefficients γup and γdown.
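
Example

A minimal sketch, following the cv_mono_decomp_ss example above (the field names γup and γdown and the paired return value are taken from that example and assumed here):

x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
res, workspace = cv_mono_decomp_ss(x, y)
# predict the increasing and decreasing components on the fine grid x0
yup0, ydown0 = predict(workspace, x0, res.γup, res.γdown)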
