From 3e016e161f992409d8ec899848e4d14cfae23079 Mon Sep 17 00:00:00 2001 From: SzymonNowakowski Date: Fri, 7 Oct 2022 19:04:07 +0200 Subject: [PATCH] Increasing the package version to 0.3.2. Also: (1) changing a github version badge URL to get rid of the auxiliary file .version.json. Now the version information is retrieved from DESCRIPTION itself (2) changing URL in projecteuclid link to DOI in a few files to get rid of a persistent incorrect URL NOTE (3) reformatting the vignette URL to CRAN canonical (4) speeding up examples (5) rephrasing some help entries into the 3rd person (e.g. plots vs plot) (6) including missing package scope denotation (utils::) in a call to packageDescription() function in the .onAttach() handler --- .version.json | 6 ------ DESCRIPTION | 4 ++-- NEWS.md | 8 ++------ R/DMR.R | 2 +- R/DMRnet-package.R | 2 +- R/DMRnet.R | 26 +++++++++++++------------- R/cv.DMR.R | 2 +- R/cv.DMRnet.R | 2 +- R/onAttach.R | 2 +- R/plot.DMR.R | 2 +- R/plot.cv.DMR.R | 2 +- R/plot.gic.DMR.R | 2 +- R/predict.DMR.R | 2 +- R/predict.cv.DMR.R | 2 +- R/predict.gic.DMR.R | 2 +- R/print.DMR.R | 2 +- R/release_questions.R | 4 +--- README.md | 8 ++++---- cran-comments.md | 3 --- inst/CITATION | 4 ++-- man/DMR.Rd | 2 +- man/DMRnet-package.Rd | 2 +- man/DMRnet.Rd | 26 +++++++++++++------------- man/cv.DMR.Rd | 2 +- man/cv.DMRnet.Rd | 2 +- man/plot.DMR.Rd | 2 +- man/plot.cv.DMR.Rd | 2 +- man/plot.gic.DMR.Rd | 2 +- man/predict.DMR.Rd | 2 +- man/predict.cv.DMR.Rd | 2 +- man/predict.gic.DMR.Rd | 2 +- man/print.DMR.Rd | 2 +- vignettes/getting-started.Rmd | 2 +- 33 files changed, 61 insertions(+), 76 deletions(-) delete mode 100644 .version.json diff --git a/.version.json b/.version.json deleted file mode 100644 index 5310984..0000000 --- a/.version.json +++ /dev/null @@ -1,6 +0,0 @@ -{ - "schemaVersion": 1, - "label": "GitHub", - "message": "0.3.1.9002", - "color": "blue" -} diff --git a/DESCRIPTION b/DESCRIPTION index 0db4af9..1a3369c 100644 --- a/DESCRIPTION +++ b/DESCRIPTION @@ -2,11 +2,11 @@ Package: DMRnet Type: Package Title: Delete or Merge Regressors Algorithms for Linear and Logistic Model Selection and High-Dimensional Data -Version: 0.3.1.9002 +Version: 0.3.2 Authors@R: c(person("Agnieszka", "Prochenka-Sołtys", email = "ap220756@mimuw.edu.pl", role = c("aut"), comment = "previous maintainer for versions <= 0.2.0"), person("Piotr", "Pokarowski", role = c("aut")), person("Szymon", "Nowakowski", email = "s.nowakowski@mimuw.edu.pl", role = c("aut", "cre"), comment = c(ORCID = "0000-0002-1939-9512"))) -Description: Model selection algorithms for regression and classification, where the predictors can be continuous or categorical and the number of regressors may exceed the number of observations. The selected model consists of a subset of numerical regressors and partitions of levels of factors. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. . Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. . Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. . 
+Description: Model selection algorithms for regression and classification, where the predictors can be continuous or categorical and the number of regressors may exceed the number of observations. The selected model consists of a subset of numerical regressors and partitions of levels of factors. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. . Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. . Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. . License: GPL-2 Encoding: UTF-8 LazyData: true diff --git a/NEWS.md b/NEWS.md index 5d2029a..d2144c9 100644 --- a/NEWS.md +++ b/NEWS.md @@ -1,15 +1,11 @@ -# DMRnet 0.3.1.9002 +# DMRnet 0.3.2 - Improved readability of a getting-started vignette - Fixed a bug in model-indexed cross validation related to folds with different model sizes - Added df.1se to GIC-indexed cross validation - Improved CV plots with df.1se model - - -# DMRnet 0.3.1.9001 - -- Improved readability of Readme on CRAN ([issue #32](https://github.com/SzymonNowakowski/DMRnet/issues/32)) +- Improved readability of README on CRAN ([issue #32](https://github.com/SzymonNowakowski/DMRnet/issues/32)) - Welcome message on package load added # DMRnet 0.3.1 diff --git a/R/DMR.R b/R/DMR.R index 4136606..0dd320d 100644 --- a/R/DMR.R +++ b/R/DMR.R @@ -1,6 +1,6 @@ #' @title Delete or Merge Regressors #' -#' @description Fit a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where the number of parameters changes from 1 to p (p is the number of columns in the model matrix). Models are subsets of continuous predictors and partitions of levels of factors in \code{X}. +#' @description Fits a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where the number of parameters changes from 1 to p (p is the number of columns in the model matrix). Models are subsets of continuous predictors and partitions of levels of factors in \code{X}. #' #' @param X Input data frame; each row is an observation vector; each column can be numerical or integer for a continuous predictor or a factor for a categorical predictor; DMR works only if p=n see \code{\link{DMRnet}}. #' diff --git a/R/DMRnet-package.R b/R/DMRnet-package.R index eea18eb..676f5fa 100644 --- a/R/DMRnet-package.R +++ b/R/DMRnet-package.R @@ -31,7 +31,7 @@ #' #' @references #' -#' Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. \url{https://projecteuclid.org/euclid.ejs/1440507392} +#' Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. \doi{10.1214/15-EJS1050} #' #' Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. 
\url{https://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf} #' diff --git a/R/DMRnet.R b/R/DMRnet.R index d4e1c2d..c742764 100644 --- a/R/DMRnet.R +++ b/R/DMRnet.R @@ -1,6 +1,6 @@ #' @title Delete or Merge Regressors net #' -#' @description Fit a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where models are subsets of continuous predictors and partitions of levels of factors in \code{X}. Works even if p>=n (the number of observations is greater than the number of columns in the model matrix). +#' @description Fits a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where models are subsets of continuous predictors and partitions of levels of factors in \code{X}. Works even if p>=n (the number of observations is greater than the number of columns in the model matrix). #' #' @param X Input data frame; each row is an observation vector; each column can be numerical or integer for a continuous predictor or a factor for a categorical predictor. #' @@ -51,9 +51,9 @@ #' @examples #' ## DMRnet for linear regression #' data(miete) -#' ytr <- miete[1:500,1] -#' Xtr <- miete[1:500,-1] -#' Xte <- miete[501:1000,-1] +#' ytr <- miete[1:200,1] +#' Xtr <- miete[1:200,-1] +#' Xte <- miete[201:250,-1] #' m1 <- DMRnet(Xtr, ytr) #' print(m1) #' plot(m1) @@ -64,9 +64,9 @@ #' #' ## DMRnet for logistic regression #' data(promoter) -#' ytr <- factor(promoter[1:80,1]) -#' Xtr <- promoter[1:80,-1] -#' Xte <- promoter[81:106,-1] +#' ytr <- factor(promoter[1:70,1]) +#' Xtr <- promoter[1:70,-1] +#' Xte <- promoter[71:106,-1] #' m2 <- DMRnet(Xtr, ytr, family = "binomial") #' print(m2) #' plot(m2) @@ -77,9 +77,9 @@ #' #' ## GLAMER for linear regression #' data(miete) -#' ytr <- miete[1:500,1] -#' Xtr <- miete[1:500,-1] -#' Xte <- miete[501:1000,-1] +#' ytr <- miete[1:200,1] +#' Xtr <- miete[1:200,-1] +#' Xte <- miete[201:250,-1] #' m1 <- DMRnet(Xtr, ytr, algorithm="glamer") #' print(m1) #' plot(m1) @@ -90,9 +90,9 @@ #' #' ## GLAMER for logistic regression #' data(promoter) -#' ytr <- factor(promoter[1:80,1]) -#' Xtr <- promoter[1:80,-1] -#' Xte <- promoter[81:106,-1] +#' ytr <- factor(promoter[1:70,1]) +#' Xtr <- promoter[1:70,-1] +#' Xte <- promoter[71:106,-1] #' m2 <- DMRnet(Xtr, ytr, family = "binomial", algorithm="glamer") #' print(m2) #' plot(m2) diff --git a/R/cv.DMR.R b/R/cv.DMR.R index 2bb2e97..2ee3df6 100644 --- a/R/cv.DMR.R +++ b/R/cv.DMR.R @@ -1,6 +1,6 @@ #' @title cross-validation for DMR #' -#' @description Does k-fold cross-validation for \code{DMR} and returns a value for df. +#' @description Executes k-fold cross-validation for \code{DMR} and returns a value for df. #' #' @param X Input data frame, of dimension n x p; \code{DMR} works only if p=n see \code{\link{DMRnet}}; each row is an observation vector. Columns can be numerical or integer for continuous predictors or factors for categorical predictors. #' diff --git a/R/cv.DMRnet.R b/R/cv.DMRnet.R index cc9ccb6..6b9bff6 100644 --- a/R/cv.DMRnet.R +++ b/R/cv.DMRnet.R @@ -1,6 +1,6 @@ #' @title cross-validation for DMRnet #' -#' @description Does k-fold cross-validation for DMR and returns a value for df. +#' @description Executes k-fold cross-validation for DMR and returns a value for df. #' #' @param X Input data frame, of dimension n x p; each row is an observation vector. Columns can be numerical or integer for continuous predictors or factors for categorical predictors. 
#' diff --git a/R/onAttach.R b/R/onAttach.R index f14a184..c80b157 100644 --- a/R/onAttach.R +++ b/R/onAttach.R @@ -1,3 +1,3 @@ .onAttach=function(libname,pkgname){ - packageStartupMessage("Loaded DMRnet version ", as.character(packageDescription("DMRnet")[["Version"]])) + packageStartupMessage("Loaded DMRnet version ", as.character(utils::packageDescription("DMRnet")[["Version"]])) } diff --git a/R/plot.DMR.R b/R/plot.DMR.R index f8cfc3f..67be48c 100644 --- a/R/plot.DMR.R +++ b/R/plot.DMR.R @@ -1,6 +1,6 @@ #' @title plot.DMR #' -#' @description Plot coefficients from a \code{DMR} object. +#' @description Plots coefficients from a \code{DMR} object. #' #' @param x Fitted \code{DMR} object. #' diff --git a/R/plot.cv.DMR.R b/R/plot.cv.DMR.R index 55ab475..2a70e29 100644 --- a/R/plot.cv.DMR.R +++ b/R/plot.cv.DMR.R @@ -1,6 +1,6 @@ #' @title plot.cv.DMR #' -#' @description Plot cross-validated error values from a \code{cv.DMR} object. +#' @description Plots cross-validated error values from a \code{cv.DMR} object. #' #' @param x Fitted \code{cv.DMR} object. #' diff --git a/R/plot.gic.DMR.R b/R/plot.gic.DMR.R index 436e93c..e0916f1 100644 --- a/R/plot.gic.DMR.R +++ b/R/plot.gic.DMR.R @@ -1,6 +1,6 @@ #' @title plot.gic.DMR #' -#' @description Plot gic values from a \code{gic.DMR} object. +#' @description Plots gic values from a \code{gic.DMR} object. #' #' @param x Fitted \code{gic.DMR} object. #' diff --git a/R/predict.DMR.R b/R/predict.DMR.R index 6be6aca..effb9ba 100644 --- a/R/predict.DMR.R +++ b/R/predict.DMR.R @@ -1,6 +1,6 @@ #' @title predict.DMR #' -#' @description Make predictions from a \code{DMR} object. +#' @description Makes predictions from a \code{DMR} object. #' #' @param object Fitted \code{DMR} object. #' diff --git a/R/predict.cv.DMR.R b/R/predict.cv.DMR.R index 9756801..967625d 100644 --- a/R/predict.cv.DMR.R +++ b/R/predict.cv.DMR.R @@ -1,6 +1,6 @@ #' @title predict.cv.DMR #' -#' @description Make predictions from a cv.DMR object (for the model with minimal cross-validated error /the default/ or the smallest model falling under the upper curve of a prediction error plus one standard deviation). +#' @description Makes predictions from a cv.DMR object (for the model with minimal cross-validated error /the default/ or the smallest model falling under the upper curve of a prediction error plus one standard deviation). #' #' @param object Fitted cv.DMR object. #' diff --git a/R/predict.gic.DMR.R b/R/predict.gic.DMR.R index dbbe839..18194d9 100644 --- a/R/predict.gic.DMR.R +++ b/R/predict.gic.DMR.R @@ -1,6 +1,6 @@ #' @title predict.gic.DMR #' -#' @description Make predictions from a \code{gic.DMR} object (for the model with minimal GIC). +#' @description Makes predictions from a \code{gic.DMR} object (for the model with minimal GIC). #' #' @param object Fitted \code{gic.DMR} object. #' diff --git a/R/print.DMR.R b/R/print.DMR.R index 6f1001a..597da1c 100644 --- a/R/print.DMR.R +++ b/R/print.DMR.R @@ -1,6 +1,6 @@ #' @title print.DMR #' -#' @description Print a \code{DMR} object. +#' @description Prints a \code{DMR} object. #' #' @param x Fitted \code{DMR} object. 
#' diff --git a/R/release_questions.R b/R/release_questions.R index b73a69e..def50df 100644 --- a/R/release_questions.R +++ b/R/release_questions.R @@ -2,8 +2,6 @@ release_questions <- function() { c( "Have you run the testing_branch tests?", "Have you updated DESCRIPTION with new version number?", - "Have you updated README.md with new version number in master branch only (not testing_branch)?", - "Have you updated NEWS.md with new version number?", - "Have you updated .version.json with new version number?" + "Have you updated NEWS.md with new version number?" ) } diff --git a/README.md b/README.md index 31013c1..8dcf6e5 100644 --- a/README.md +++ b/README.md @@ -1,5 +1,5 @@ -[![GitHub version](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/SzymonNowakowski/DMRnet/master/.version.json&style=flat&logo=github)](https://github.com/SzymonNowakowski/DMRnet) +[![GitHub version](https://img.shields.io/github/r-package/v/SzymonNowakowski/DMRnet?color=yellowgreen&label=GitHub&logo=github)](https://github.com/SzymonNowakowski/DMRnet) [![CRAN version](https://img.shields.io/cran/v/DMRnet?logo=R)](https://cran.r-project.org/package=DMRnet) [![downloads](https://cranlogs.r-pkg.org/badges/DMRnet)](https://cran.r-project.org/package=DMRnet) @@ -9,17 +9,17 @@ DMRnet (Delete or Merge Regressors) is a suit of algorithms for linear and logistic model selection with high-dimensional data (i.e. the number of regressors may exceed the number of observations). The predictors can be continuous or categorical. The selected model consists of a subset of numerical regressors and partitions of levels of factors. -For information on how to get started using DMRnet, see our [getting started vignette](https://cran.r-project.org/web/packages/DMRnet/vignettes/getting-started.html). +For information on how to get started using DMRnet, see our [getting started vignette](https://cran.r-project.org/package=DMRnet/vignettes/getting-started.html). ## Installing `DMRnet` package -To install the development package version (currently: 0.3.1.9002) please execute +To install the development package version please execute ``` library(devtools) devtools::install_github("SzymonNowakowski/DMRnet") ``` -Alternatively, to install the current stable CRAN version (currently: 0.3.1) please execute +Alternatively, to install the current stable CRAN version please execute ``` install.packages("DMRnet") diff --git a/cran-comments.md b/cran-comments.md index ba9ce84..1ca6fdb 100644 --- a/cran-comments.md +++ b/cran-comments.md @@ -1,9 +1,6 @@ ## local R CMD check results There were no ERRORs, WARNINGs, or NOTEs. -## R-hub check results -There was a NOTE related to the maintainer change. As I read over the Internet, the previous maintainer will get an email asking for her consent - ## Downstream dependencies There seem to be no downstream dependencies: diff --git a/inst/CITATION b/inst/CITATION index 2f7ce9a..4691cab 100644 --- a/inst/CITATION +++ b/inst/CITATION @@ -10,7 +10,7 @@ citEntry(entry="Article", volume = "9", number = "2", pages = "1749-1778", - url="https://projecteuclid.org/euclid.ejs/1440507392", - textVersion = "Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. https://projecteuclid.org/euclid.ejs/1440507392") + doi = "10.1214/15-EJS1050", + textVersion = "Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection.
Electronic Journal of Statistics 9(2): 1749-1778. doi:10.1214/15-EJS1050") diff --git a/man/DMR.Rd b/man/DMR.Rd index d82b477..c371195 100644 --- a/man/DMR.Rd +++ b/man/DMR.Rd @@ -39,7 +39,7 @@ An object with S3 class \code{"DMR"}, which is a list with the ingredients \item{interc}{If the intercept was fitted: for \code{DMR} always equal to \code{TRUE}.} } \description{ -Fit a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where the number of parameters changes from 1 to p (p is the number of columns in the model matrix). Models are subsets of continuous predictors and partitions of levels of factors in \code{X}. +Fits a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where the number of parameters changes from 1 to p (p is the number of columns in the model matrix). Models are subsets of continuous predictors and partitions of levels of factors in \code{X}. } \details{ \code{DMR} algorithm is based on a traditional stepwise method. diff --git a/man/DMRnet-package.Rd b/man/DMRnet-package.Rd index b98e6d0..c44e0cf 100644 --- a/man/DMRnet-package.Rd +++ b/man/DMRnet-package.Rd @@ -34,7 +34,7 @@ vignette("getting-started", package="DMRnet") } \references{ -Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. \url{https://projecteuclid.org/euclid.ejs/1440507392} +Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. \doi{10.1214/15-EJS1050} Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. \url{https://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf} diff --git a/man/DMRnet.Rd b/man/DMRnet.Rd index 5089a63..2e70a0f 100644 --- a/man/DMRnet.Rd +++ b/man/DMRnet.Rd @@ -54,7 +54,7 @@ An object with S3 class \code{"DMR"}, which is a list with the ingredients \item{interc}{If the intercept was fitted: value of parameter \code{interc} is returned.} } \description{ -Fit a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where models are subsets of continuous predictors and partitions of levels of factors in \code{X}. Works even if p>=n (the number of observations is greater than the number of columns in the model matrix). +Fits a path of linear (\code{family="gaussian"}) or logistic (\code{family="binomial"}) regression models, where models are subsets of continuous predictors and partitions of levels of factors in \code{X}. Works even if p>=n (the number of observations is greater than the number of columns in the model matrix). } \details{ \code{DMRnet} algorithm is a generalization of \code{\link{DMR}} to high-dimensional data. 
@@ -71,9 +71,9 @@ The final path of models is chosen by minimizing the likelihood of the models fo \examples{ ## DMRnet for linear regression data(miete) -ytr <- miete[1:500,1] -Xtr <- miete[1:500,-1] -Xte <- miete[501:1000,-1] +ytr <- miete[1:200,1] +Xtr <- miete[1:200,-1] +Xte <- miete[201:250,-1] m1 <- DMRnet(Xtr, ytr) print(m1) plot(m1) @@ -84,9 +84,9 @@ ypr <- predict(m1, newx = Xte, df = g$df.min) ## DMRnet for logistic regression data(promoter) -ytr <- factor(promoter[1:80,1]) -Xtr <- promoter[1:80,-1] -Xte <- promoter[81:106,-1] +ytr <- factor(promoter[1:70,1]) +Xtr <- promoter[1:70,-1] +Xte <- promoter[71:106,-1] m2 <- DMRnet(Xtr, ytr, family = "binomial") print(m2) plot(m2) @@ -97,9 +97,9 @@ ypr <- predict(m2, newx = Xte, df = g$df.min) ## GLAMER for linear regression data(miete) -ytr <- miete[1:500,1] -Xtr <- miete[1:500,-1] -Xte <- miete[501:1000,-1] +ytr <- miete[1:200,1] +Xtr <- miete[1:200,-1] +Xte <- miete[201:250,-1] m1 <- DMRnet(Xtr, ytr, algorithm="glamer") print(m1) plot(m1) @@ -110,9 +110,9 @@ ypr <- predict(m1, newx = Xte, df = g$df.min) ## GLAMER for logistic regression data(promoter) -ytr <- factor(promoter[1:80,1]) -Xtr <- promoter[1:80,-1] -Xte <- promoter[81:106,-1] +ytr <- factor(promoter[1:70,1]) +Xtr <- promoter[1:70,-1] +Xte <- promoter[71:106,-1] m2 <- DMRnet(Xtr, ytr, family = "binomial", algorithm="glamer") print(m2) plot(m2) diff --git a/man/cv.DMR.Rd b/man/cv.DMR.Rd index 1720395..efec714 100644 --- a/man/cv.DMR.Rd +++ b/man/cv.DMR.Rd @@ -40,7 +40,7 @@ An object with S3 class \code{"cv.DMR"} is returned, which is a list with } } \description{ -Does k-fold cross-validation for \code{DMR} and returns a value for df. +Executes k-fold cross-validation for \code{DMR} and returns a value for df. } \details{ \code{cv.DMR} algorithm does cross-validation for \code{DMR} with \code{nfolds} folds. The df for the minimal estimated prediction error is returned. diff --git a/man/cv.DMRnet.Rd b/man/cv.DMRnet.Rd index 0112f16..703c7f7 100644 --- a/man/cv.DMRnet.Rd +++ b/man/cv.DMRnet.Rd @@ -55,7 +55,7 @@ An object with S3 class "cv.DMR" is returned, which is a list with the i } } \description{ -Does k-fold cross-validation for DMR and returns a value for df. +Executes k-fold cross-validation for DMR and returns a value for df. } \details{ cv.DMRnet algorithm does \code{nfold}-fold cross-validation for DMRnet. The df for the minimal estimated prediction error is returned. diff --git a/man/plot.DMR.Rd b/man/plot.DMR.Rd index c04507e..a8b25a0 100644 --- a/man/plot.DMR.Rd +++ b/man/plot.DMR.Rd @@ -12,7 +12,7 @@ \item{...}{Further arguments passed to or from other methods.} } \description{ -Plot coefficients from a \code{DMR} object. +Plots coefficients from a \code{DMR} object. } \details{ Produces a coefficient profile plot of the coefficient paths for a fitted \code{DMR} object. diff --git a/man/plot.cv.DMR.Rd b/man/plot.cv.DMR.Rd index 0a4f96c..23e0af5 100644 --- a/man/plot.cv.DMR.Rd +++ b/man/plot.cv.DMR.Rd @@ -12,7 +12,7 @@ \item{...}{Further arguments passed to or from other methods.} } \description{ -Plot cross-validated error values from a \code{cv.DMR} object. +Plots cross-validated error values from a \code{cv.DMR} object. } \details{ Produces a plot of cross-validated error values for the entire sequence of models from the fitted \code{cv.DMR} object. The horizontal level indicating separation of one standard deviation from the minimum error is indicated with a blue dashed line. 
The df.min (the smallest model minimizing the cross-validated error) and df.1se (the smallest model falling under the blue dashed line) are marked with red and blue points, respectively. diff --git a/man/plot.gic.DMR.Rd b/man/plot.gic.DMR.Rd index 26b3e94..4ecaa75 100644 --- a/man/plot.gic.DMR.Rd +++ b/man/plot.gic.DMR.Rd @@ -12,7 +12,7 @@ \item{...}{Further arguments passed to or from other methods.} } \description{ -Plot gic values from a \code{gic.DMR} object. +Plots gic values from a \code{gic.DMR} object. } \details{ Produces a plot of Generalized Information Criterion for the entire sequence of models from the fitted \code{gic.DMR} object. diff --git a/man/predict.DMR.Rd b/man/predict.DMR.Rd index e6bf828..aaf12b8 100644 --- a/man/predict.DMR.Rd +++ b/man/predict.DMR.Rd @@ -30,7 +30,7 @@ Vector or matrix of predictions. } \description{ -Make predictions from a \code{DMR} object. +Makes predictions from a \code{DMR} object. } \details{ Similar to other \code{predict} methods, this function predicts fitted values from a fitted \code{DMR} object. diff --git a/man/predict.cv.DMR.Rd b/man/predict.cv.DMR.Rd index 677167a..5ccc42a 100644 --- a/man/predict.cv.DMR.Rd +++ b/man/predict.cv.DMR.Rd @@ -30,7 +30,7 @@ Vector of predictions. } \description{ -Make predictions from a cv.DMR object (for the model with minimal cross-validated error /the default/ or the smallest model falling under the upper curve of a prediction error plus one standard deviation). +Makes predictions from a cv.DMR object (for the model with minimal cross-validated error /the default/ or the smallest model falling under the upper curve of a prediction error plus one standard deviation). } \details{ Similar to other \code{predict} methods, this function predicts fitted values from a fitted \code{cv.DMR} object. diff --git a/man/predict.gic.DMR.Rd b/man/predict.gic.DMR.Rd index 68d6989..0451600 100644 --- a/man/predict.gic.DMR.Rd +++ b/man/predict.gic.DMR.Rd @@ -21,7 +21,7 @@ Vector of predictions. } \description{ -Make predictions from a \code{gic.DMR} object (for the model with minimal GIC). +Makes predictions from a \code{gic.DMR} object (for the model with minimal GIC). } \details{ Similar to other \code{predict} methods, this function predicts fitted values from a fitted \code{gic.DMR} object for the model with minimal GIC. diff --git a/man/print.DMR.Rd b/man/print.DMR.Rd index 4def290..147a9a0 100644 --- a/man/print.DMR.Rd +++ b/man/print.DMR.Rd @@ -15,7 +15,7 @@ The summary is silently returned. } \description{ -Print a \code{DMR} object. +Prints a \code{DMR} object. } \details{ Print a summary of the \code{DMR} path at each step along the path. diff --git a/vignettes/getting-started.Rmd b/vignettes/getting-started.Rmd index f40f171..54caa3a 100644 --- a/vignettes/getting-started.Rmd +++ b/vignettes/getting-started.Rmd @@ -190,7 +190,7 @@ Please note that 1 is the value of a target class in the `predict` output. # References 1. Szymon Nowakowski, Piotr Pokarowski and Wojciech Rejchel. 2021. *Group Lasso Merger for Sparse Prediction with High-Dimensional Categorical Data.* arXiv [stat.ME]. -1. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. *Delete or merge regressors for linear model selection.* Electronic Journal of Statistics 9(2): 1749-1778. +1. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. *Delete or merge regressors for linear model selection.* Electronic Journal of Statistics 9(2): 1749-1778. 2. Piotr Pokarowski and Jan Mielniczuk, 2015. 
*Combined l1 and greedy l0 penalized least squares for linear model selection.* Journal of Machine Learning Research 16(29): 961-992. 4. Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. *Improving Lasso for model selection and prediction.* Scandinavian Journal of Statistics, 49(2): 831–863. 5. Ludwig Fahrmeir, Rita Künstler, Iris Pigeot, Gerhard Tutz, 2004. Statistik: der Weg zur Datenanalyse. 5. Auflage, Berlin: Springer-Verlag.
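As a quick illustration of change (6), the namespace-qualified version lookup now used by the `.onAttach()` handler can be run interactively as in the minimal sketch below. It is illustrative only and not part of the patch; it assumes an installed copy of `DMRnet`, and the version reported simply comes from the installed DESCRIPTION (0.3.2 after this release).

```
# Minimal sketch (not part of the patch): the namespace-qualified version
# lookup used by the revised .onAttach() handler, run here interactively.
# Assumes DMRnet is installed; the printed value comes from its DESCRIPTION.
ver <- utils::packageDescription("DMRnet")[["Version"]]
message("Loaded DMRnet version ", as.character(ver))  # e.g. "0.3.2" after this release
```

Qualifying the call with `utils::` makes the dependency on the `utils` namespace explicit instead of relying on `packageDescription()` being resolved from the search path.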