API
MonotoneDecomposition._optim — Method

```julia
_optim(y::AbstractVector, workspace::WorkSpaceCS, μs::AbstractVector)
_optim(y::AbstractVector, J::Int, B::AbstractMatrix, H::AbstractMatrix{Int}, μs::AbstractVector)
```

Optimization for monotone decomposition with cubic B-splines.

```julia
_optim(y::AbstractVector, J::Int, B::AbstractMatrix, H::AbstractMatrix{Int}, L::AbstractMatrix, λs::AbstractVector, μs::AbstractVector)
```

Optimization for monotone decomposition with smoothing splines.

```julia
_optim!(y::AbstractVector, J::Int, B::AbstractMatrix, s::Union{Nothing, Real}, γhat::AbstractVector, H::AbstractMatrix{Int}; L, t, λ, μ)
```

MonotoneDecomposition.benchmarking — Function

```julia
benchmarking(f::String; n = 100,
             σs = 0.2:0.2:1,
             competitor = "ss_single_lambda")
```

Run benchmarking experiments for monotone decomposition on curve `f`. The candidates of `f` include:

- simple functions: `x^2`, `x^3`, `exp(x)`, `sigmoid`
- random functions generated from a Gaussian process: `SE_1`, `SE_0.1`, `Mat12_1`, `Mat12_0.1`, `Mat32_1`, `Mat32_0.1`, `RQ_0.1_0.5`, `Periodic_0.1_4`
Arguments
- `n::Integer = 100`: sample size for the simulated curve
- `σs::AbstractVector`: a vector of noise levels to be investigated
- `competitor::String`: a string to indicate the strategy used in monotone decomposition. Possible choices:
  - `ss_single_lambda`: decomposition with smoothing splines `ss` with the `single_lambda` strategy
  - `ss_fix_ratio`: decomposition with smoothing splines `ss` with the `fix_ratio` strategy
  - `ss_grid_search`: decomposition with smoothing splines `ss` with the `grid_search` strategy
  - `ss_iter_search`: decomposition with smoothing splines `ss` with the `iter_search` strategy
  - `bspl`: decomposition with cubic splines `cs`
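A minimal usage sketch; the curve name and settings below simply echo the defaults and candidates listed above:

```julia
using MonotoneDecomposition

# Benchmark monotone decomposition with smoothing splines (single-λ strategy)
# on the cubic curve, over noise levels 0.2, 0.4, ..., 1.0.
benchmarking("x^3"; n = 100, σs = 0.2:0.2:1, competitor = "ss_single_lambda")
```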
MonotoneDecomposition.benchmarking_cs — Function

```julia
benchmarking_cs(n, σ, f; figname_cv = nothing, figname_fit = nothing)
```

Run benchmarking experiments for decomposition with cubic splines on `n` observations sampled from curve `f` with noise `σ`.
Optional Arguments
- `figname_cv`: if not `nothing`, the cross-validation error will be plotted and saved to the given path.
- `figname_fit`: if not `nothing`, the fitted curves will be plotted and saved to the given path.
- `Js`: the candidates for the number of basis functions.
- `fixJ`: whether to use the CV-tuned `J` from the corresponding cubic spline fitting.
- `nfold`: the number of folds in the cross-validation procedure.
- `one_se_rule`: whether to use the one-standard-error rule to select the parameter after the cross-validation procedure.
- `μs`: the candidates of tuning parameters for the discrepancy parameter.
MonotoneDecomposition.benchmarking_ss — Function

```julia
benchmarking_ss(n::Int, σ::Float64, f::Union{Function, String};
                method = "single_lambda")
```

Run benchmarking experiments for decomposition with smoothing splines on `n` observations sampled from curve `f` with noise `σ`.
Arguments
method::String = "single_lambda": strategy for decomposition with smoothing spline. Possible choices:single_lambdafix_ratiogrid_searchiter_search
MonotoneDecomposition.build_model! — Method

```julia
build_model!(workspace::WorkSpaceCS, x::AbstractVector{T})
```

Calculate the components that construct the optimization problem for monotone decomposition with cubic splines.
MonotoneDecomposition.conf_band_width — Method

```julia
conf_band_width(CIs::AbstractMatrix)
```

Calculate the width of confidence bands.
MonotoneDecomposition.coverage_prob — Method

```julia
coverage_prob(CIs::AbstractMatrix, y0::AbstractVector)
```

Calculate the coverage probability given the `n x 2` CI matrix `CIs` and the true vector `y0` of size `n`.
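A small illustration, assuming the coverage probability is the fraction of entries of `y0` falling inside the corresponding row interval of `CIs` (the values below are made up):

```julia
CIs = [0.0 1.0;
       0.5 1.5;
       1.0 2.0]
y0 = [0.5, 2.0, 1.5]
# The second true value (2.0) lies outside its interval [0.5, 1.5],
# so the coverage probability should be 2/3.
coverage_prob(CIs, y0)
```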
MonotoneDecomposition.cv_cubic_spline — Method

```julia
cv_cubic_spline(x::AbstractVector, y::AbstractVector, xnew::AbstractVector)
```

B-spline fitting with `nfold` CV-tuned `J` from `Js`.
MonotoneDecomposition.cv_mono_decomp_cs — Method

```julia
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector, xnew::AbstractVector; )
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector; fixJ = true)
```

Cross-validation for monotone decomposition with cubic B-splines. Parameters `J` and `s` (`μ` if `s_is_μ`) are tuned by cross-validation.

- if `fixJ == true`, then `J` is CV-tuned by the corresponding cubic B-spline fitting
- if `fixJ == false`, then both `J` and `s` would be tuned by cross-validation.
Arguments
- `figname`: if not `nothing`, then the CV error figure will be saved to the given name (which can include the path)
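A minimal usage sketch under the defaults above; the return value is assumed to be analogous to that of the smoothing-spline variant documented below:

```julia
x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
# Tune J via the corresponding cubic B-spline fit, then tune the
# discrepancy parameter by cross-validation.
res = cv_mono_decomp_cs(x, y; fixJ = true)
```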
MonotoneDecomposition.cv_mono_decomp_cs — Method

```julia
cv_mono_decomp_cs(x::AbstractVector, y::AbstractVector)
```

Cross-validation for monotone decomposition with cubic B-splines when the fixed `J` is CV-tuned by the corresponding cubic B-spline fitting method.
MonotoneDecomposition.cv_mono_decomp_ss — Method

```julia
cv_mono_decomp_ss(x::AbstractVector, y::AbstractVector)
```

Cross-validation for monotone decomposition with smoothing splines: `λ` is tuned by smoothing-spline fitting, and then a golden-section search is performed for `μ`.
Returns
- `D`: a `MonoDecomp` object.
- `workspace`: a workspace containing some intermediate results
- `μmin`: the parameter `μ` that achieves the smallest CV error
- `μs`: the investigated parameters `μ`
Example
```julia
using MonotoneDecomposition, Plots

x, y, x0, y0 = gen_data(100, 0.001, "SE_0.1")
res, workspace = cv_mono_decomp_ss(x, y, one_se_rule = true, figname = "/tmp/p.png", tol = 1e-3)
yup = workspace.B * res.γup
ydown = workspace.B * res.γdown
scatter(x, y)
scatter!(x, yup)
scatter!(x, ydown)
```

MonotoneDecomposition.cv_one_se_rule — Method

```julia
cv_one_se_rule(μs::AbstractVector{T}, σs::AbstractVector{T}; small_is_simple = true)
cv_one_se_rule(μs::AbstractMatrix{T}, σs::AbstractMatrix{T}; small_is_simple = [true, true])
cv_one_se_rule2(μs::AbstractMatrix{T}, σs::AbstractMatrix{T}; small_is_simple = [true, true])
```

Return the index of the parameter(s) (1-dim or 2-dim) that minimizes the CV error under the one-standard-error rule.
For 2-dim parameters, `cv_one_se_rule2` adopts a grid search for `μ + σ`, while `cv_one_se_rule` searches after fixing one optimal parameter. The potential drawback of `cv_one_se_rule2` is that we might fail to determine the simplest model when both parameters are away from the optimal ones, so we recommend `cv_one_se_rule`.
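A hypothetical numeric illustration of the 1-dim rule; the error values are made up, and the candidate parameters are assumed to be sorted in increasing order so that smaller indexes correspond to simpler models under `small_is_simple = true`:

```julia
μerr = [0.30, 0.22, 0.20, 0.21, 0.25]   # mean CV error for each candidate parameter
σerr = [0.03, 0.03, 0.03, 0.03, 0.03]   # standard error of each CV error
# The minimum error is 0.20 (index 3), so the one-SE threshold is 0.23;
# the simplest candidate within the threshold is index 2 (error 0.22).
cv_one_se_rule(μerr, σerr; small_is_simple = true)  # expected to return 2
```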
MonotoneDecomposition.cvfit — Method

Given `μmax`, construct `μs = (1:nμ) ./ nμ * μmax`. If the optimal `μ` is near the boundary, double or halve `μmax`.

MonotoneDecomposition.cvfit_gss — Method

```julia
cvfit_gss(x, y, μrange, λs)
```

For each `λ` in `λs`, perform `cvfit(x, y, μrange, λ)` and store the current best CV error. Finally, return the smallest one.
MonotoneDecomposition.cvfit_gss — Method

```julia
cvfit_gss(x, y, μrange, λ; λ_is_μ)
```

Cross-validation by golden-section search for `μ` in `μrange` given `λ`.

- If `λ_is_μ`, search `λ` in `μrange` given `λ` (`μ`).
- Note that `one_se_rule` is not suitable for the golden-section search.
MonotoneDecomposition.cvplot — Function

```julia
cvplot(sil::String)
cvplot(μerr::AbstractVector, σerr::Union{Nothing, AbstractVector{T}}, paras::AbstractVector)
cvplot(μerr::AbstractMatrix, σerr::AbstractMatrix, para1::AbstractVector, para2::AbstractVector)
```

Plot the cross-validation curves.
MonotoneDecomposition.demo_data — Method

```julia
demo_data()
```

Generate demo data for illustration (Figure 6 in the paper).
MonotoneDecomposition.div_into_folds — Method

```julia
div_into_folds(N::Int; K = 10, seed = 1234)
```

Equally divide `1:N` into `K` folds with random seed `seed`. In particular,

- if `seed` is negative, the division is non-random and the `i`-th fold is the `i`-th equidistant range;
- if `seed = 0`, the division is non-random and each fold consists of equidistant indexes.
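The two non-random divisions can be sketched as follows; the returned fold structure is assumed to be a collection of index collections:

```julia
div_into_folds(6; K = 3, seed = -1)  # i-th fold is the i-th range, e.g. 1:2, 3:4, 5:6
div_into_folds(6; K = 3, seed = 0)   # each fold takes equidistant indexes, e.g. [1, 4], [2, 5], [3, 6]
```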
MonotoneDecomposition.gen_data — Method

```julia
gen_data(n::Int, σ::Union{Real, Nothing}, f::Union{Function, String}; xmin = -1, xmax = 1, k = 10)
```

Generate `n` data points `(xi, yi)` from curve `f` with noise level `σ`, i.e., `yi = f(xi) + N(0, σ^2)`.
Arguments
- for `f`:
  - if `f` is a `Function`, just take `y = f(x)`;
  - if `f = "MLP"`, it will be a simple neural network with one layer;
  - otherwise, it accepts a string with the format `KernelName_Para[_OtherPara]` representing some Gaussian processes, including `SE`, `Mat12`, `Mat32`, `Mat52`, where `Para` is the length-scale parameter `ℓ`; for `Poly`, `Para` is the degree parameter `p`; for `RQ`, `Para` is `ℓ` and `OtherPara` is `α`.
- for `σ`: the noise level; if `σ` is `nothing`, then `σ` is calculated to achieve the given signal-to-noise ratio (`snr`).
- if `seed` is not `nothing`, it ensures the same random function from the Gaussian process, but it does not influence the random noise.
Returns
It returns four vectors, `x, y, x0, y0`, where

- `x, y`: paired points of length `n`.
- `x0, y0`: the true curve without noise, represented by `k * n` points.
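For example, a single draw under the signature above (the kernel string follows the `KernelName_Para` format described in the arguments):

```julia
# 100 noisy observations from a squared-exponential GP curve with ℓ = 0.1
# and noise level σ = 0.5; x0, y0 hold the noise-free curve on a finer grid.
x, y, x0, y0 = gen_data(100, 0.5, "SE_0.1")
```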
MonotoneDecomposition.gen_data_bowman — Method

```julia
gen_data_bowman()
```

Generate curves for the monotonicity test used in Bowman et al. (1998).
MonotoneDecomposition.gen_data_ghosal — Method

```julia
gen_data_ghosal()
```

Generate curves used in Ghosal et al. (2000).
MonotoneDecomposition.gen_mono_data — Method

```julia
gen_mono_data()
```

Generate monotonic curves, used for checking the type I error under H0 (Table 4 and Figure 5 in the paper).
MonotoneDecomposition.gp — Method

```julia
gp(x; K)
```

Generate a random Gaussian vector with mean zero and covariance matrix `Σ_ij = K(x_i, x_j)`. The candidates for the kernel `K` include `SE`, `Mat12`, `Mat32`, `Mat52`.
MonotoneDecomposition.mono_decomp — Method

```julia
mono_decomp(y::AbstractVector)
```

Perform monotone decomposition on vector `y`, and return `yup, ydown`.
MonotoneDecomposition.mono_decomp_cs — Method

```julia
mono_decomp_cs(x::AbstractVector, y::AbstractVector)
```

Monotone decomposition with cubic B-splines by solving an optimization problem.
MonotoneDecomposition.mono_decomp_ss — Method

```julia
mono_decomp_ss(workspace::WorkSpaceSS, x::AbstractVector{T}, y::AbstractVector{T}, λ::AbstractFloat, μ::AbstractFloat)
```

Monotone decomposition with smoothing splines.
MonotoneDecomposition.mono_test_bootstrap_ss — Method

```julia
mono_test_bootstrap_ss(x, y)
```

Perform a monotonicity test after monotone decomposition with smoothing splines.
MonotoneDecomposition.recover — Method

```julia
recover(Σ)
```

Recover the matrix from the vector-stored `Σ`.
MonotoneDecomposition.smooth_spline — Method

```julia
smooth_spline(x::AbstractVector, y::AbstractVector, xnew::AbstractVector)
```

Perform smoothing-spline fitting on `(x, y)`, and make predictions at `xnew`.

Returns: `yhat, ynewhat, ...`.
RecipesBase.plot — Method

```julia
plot(obs, truth, D::MonoDecomp, other)
```

Plot the noisy observations, the true curve, and the fits from the monotone decomposition `D` and another fitting technique.

- `obs`: usually `[x, y]`
- `truth`: usually `[x0, y0]`
- `D`: a `MonoDecomp` object
- `other`: the fitted curve `[x0, other]` from the other method, where `x0` is omitted.

A typical usage is `plot([x, y], [x0, y0], D, yhatnew, prefix_title = "SE (ℓ = 1, σ = 0.5): ")`.
StatsAPI.predict — Method

```julia
predict(D::MonoDecomp, xnew)
```

Predict at `xnew` given the decomposition `D`.
StatsAPI.predict — Method

```julia
predict(W::WorkSpaceSS, xnew::AbstractVector, γhat::AbstractVecOrMat)
predict(W::WorkSpaceCS, xnew::AbstractVector, γhat::AbstractVecOrMat)
```

Make multiple predictions at `xnew` for each column of `γhat`.
StatsAPI.predict — Method

```julia
predict(W::WorkSpaceSS, xnew::AbstractVector, γup::AbstractVector, γdown::AbstractVector)
predict(W::WorkSpaceCS, xnew::AbstractVector, γup::AbstractVector, γdown::AbstractVector)
```

Predict `yup` and `ydown` at `xnew` given workspace `W` and decomposition coefficients `γup` and `γdown`.
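A short sketch of how the predict methods above might be used together, continuing the `cv_mono_decomp_ss` example (`res` is its returned `MonoDecomp` and `workspace` its workspace; the exact value returned by `predict(res, xnew)` follows the first method's docstring):

```julia
xnew = range(-1, 1, length = 200)
# Monotone components evaluated on the new grid.
yup, ydown = predict(workspace, xnew, res.γup, res.γdown)
# Prediction from the decomposition object itself.
yhat = predict(res, xnew)
```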