Fixing NOTEs received from the R-hub automatic checks
SzymonNowakowski committed Jul 19, 2022
1 parent 987aa22 commit 1ad062e
Showing 17 changed files with 28 additions and 24 deletions.
3 changes: 2 additions & 1 deletion .Rbuildignore
@@ -1,4 +1,5 @@
 ^.*\.Rproj$
 ^\.Rproj\.user$
 ^doc$
-cran-comments.md
+^cran-comments.md$
+^CRAN-SUBMISSION$
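For context, `.Rbuildignore` entries are Perl-compatible regular expressions matched against relative file paths, which is why the commit anchors the pattern with `^` and `$`. A minimal sketch of the difference, using `grepl()` and hypothetical file paths chosen only for illustration:

```r
# .Rbuildignore patterns are regexes matched against relative file paths.
# An unanchored pattern can exclude more files than intended.
paths <- c("cran-comments.md", "inst/old-cran-comments.md.bak")  # hypothetical paths

# Unanchored: substring match anywhere, so it hits both paths
unanchored <- grepl("cran-comments.md", paths, perl = TRUE)

# Anchored, as in the commit: matches only the top-level file
anchored <- grepl("^cran-comments.md$", paths, perl = TRUE)

print(unanchored)  # TRUE TRUE
print(anchored)    # TRUE FALSE
```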
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -6,7 +6,7 @@ Version: 0.3.1
 Authors@R: c(person("Agnieszka", "Prochenka-Sołtys", email = "[email protected]", role = c("aut"), comment = "previous maintainer for versions <= 0.2.0"),
 person("Piotr", "Pokarowski", role = c("aut")),
 person("Szymon", "Nowakowski", email = "[email protected]", role = c("aut", "cre"), comment = c(ORCID = "0000-0002-1939-9512")))
-Description: Model selection algorithms for regression and classification, where the predictors can be continuous or categorical and the number of regressors may exceed the number of observations. The selected model consists of a subset of numerical regressors and partitions of levels of factors. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. <https://projecteuclid.org/euclid.ejs/1440507392>. Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. <http://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf>. Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. <https://doi.org/10.1111/sjos.12546>.
+Description: Model selection algorithms for regression and classification, where the predictors can be continuous or categorical and the number of regressors may exceed the number of observations. The selected model consists of a subset of numerical regressors and partitions of levels of factors. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. <https://projecteuclid.org/euclid.ejs/1440507392>. Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. <https://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf>. Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. <doi:10.1111/sjos.12546>.
 License: GPL-2
 Encoding: UTF-8
 LazyData: true
4 changes: 2 additions & 2 deletions R/DMR4glm.R
@@ -1,5 +1,5 @@
 DMR4glm <- function(X, y, clust.method, lam){
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(factor(y))
@@ -13,7 +13,7 @@ DMR4glm <- function(X, y, clust.method, lam){
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x)))
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){
 X <- X[,-1, drop = FALSE]
 ssd <- ssd[-1]
 }
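The repeated `class(x) == "..."` to `inherits(x, "...")` substitution in this and the following files addresses the check NOTE about class comparisons: since R 4.0, `class()` can return a vector of length greater than one, so an `==` comparison inside `if()` produces a warning or error depending on the R version. A minimal sketch of the failure mode (not code from the package itself):

```r
# Since R 4.0, class() on a matrix returns a length-2 vector,
# so comparing it with == yields a length-2 logical -- unsafe in if().
m <- matrix(1:4, nrow = 2)
print(class(m))              # c("matrix", "array") in R >= 4.0
print(class(m) == "matrix")  # TRUE FALSE

# inherits() always returns a single logical, which is what the commit uses;
# it also respects implicit classes of plain vectors.
print(inherits(m, "matrix"))            # TRUE
print(inherits(factor("a"), "factor"))  # TRUE
print(inherits(1.5, "numeric"))         # TRUE
```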
2 changes: 1 addition & 1 deletion R/DMR4lm.R
@@ -6,7 +6,7 @@ DMR4lm <- function(X, y, clust.method, lam){
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x))) #number of unique values in each column of X
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){ # removing the first column in case
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){ # removing the first column in case
 # in case is a numeric constant in X
 # i.e. in case it is an Intercept. Other than that, constant columns are NOT allowed
 X <- X[,-1, drop = FALSE] #drop=FALSE keeps the dimensions of X
4 changes: 2 additions & 2 deletions R/DMRnet-package.R
@@ -33,8 +33,8 @@
 #'
 #' Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. Delete or merge regressors for linear model selection. Electronic Journal of Statistics 9(2): 1749-1778. \url{https://projecteuclid.org/euclid.ejs/1440507392}
 #'
-#' Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. \url{http://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf}
+#' Piotr Pokarowski and Jan Mielniczuk, 2015. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research 16(29): 961-992. \url{https://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf}
 #'
-#' Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. \url{https://doi.org/10.1111/sjos.12546}
+#' Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. Improving Lasso for model selection and prediction. Scandinavian Journal of Statistics, 49(2): 831–863. \doi{10.1111/sjos.12546}
 #'
 NULL
4 changes: 2 additions & 2 deletions R/DMRnet4glm.R
@@ -1,5 +1,5 @@
 DMRnet4glm <- function(X, y, clust.method, o, nlambda, lam, maxp, lambda) {
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(factor(y))
@@ -12,7 +12,7 @@ DMRnet4glm <- function(X, y, clust.method, o, nlambda, lam, maxp, lambda) {
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x)))
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){
 X <- X[,-1, drop = FALSE]
 ssd <- ssd[-1]
 }
2 changes: 1 addition & 1 deletion R/DMRnet4lm.R
@@ -4,7 +4,7 @@ DMRnet4lm <- function(X, y, clust.method, o, nlambda, lam, maxp, lambda){
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x)))
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){
 X <- X[,-1, drop = FALSE]
 ssd <- ssd[-1]
 }
2 changes: 1 addition & 1 deletion R/SOSnet4glm.R
@@ -1,5 +1,5 @@
 SOSnet4glm <- function(X, y, o, nlambda, lam, interc, maxp, lambda){
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(y)
2 changes: 1 addition & 1 deletion R/coef.DMR.R
@@ -25,7 +25,7 @@ coef.DMR <- function(object, df = NULL, ...){
 colnames(out) <- paste("df", object$df, sep = "")
 return(out)
 }
-if(class(df) != "numeric" & class(df) != "integer"){
+if(!inherits(df, "numeric") & !inherits(df, "integer")){
 stop("Error: wrong input type, df should have type numeric")
 }
 out <- object$beta[,ncol(object$beta) - df + 1]
2 changes: 1 addition & 1 deletion R/cv_GIC_indexed.R
@@ -73,7 +73,7 @@ cv_GIC_indexed <- function(X, y, nfolds, model_function, ...) {
 
 } else{
 if (family == "binomial"){
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(factor(y))
2 changes: 1 addition & 1 deletion R/cv_MD_indexed.R
@@ -31,7 +31,7 @@ cv_MD_indexed <- function(X, y, nfolds, model_function, ...) {
 
 } else{
 if (family == "binomial"){
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(factor(y))
4 changes: 2 additions & 2 deletions R/glamer_4glm.R
@@ -1,5 +1,5 @@
 glamer_4glm <- function(X, y, clust.method, nlambda, lam, maxp, lambda){
-if (class(y) != "factor"){
+if (!inherits(y, "factor")){
 stop("Error: y should be a factor")
 }
 lev <- levels(factor(y))
@@ -12,7 +12,7 @@ glamer_4glm <- function(X, y, clust.method, nlambda, lam, maxp, lambda){
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x)))
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){ #removing the intercept from X, as it will be added later to X.full
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){ #removing the intercept from X, as it will be added later to X.full
 X <- X[,-1, drop = FALSE]
 ssd <- ssd[-1]
 }
2 changes: 1 addition & 1 deletion R/glamer_4lm.R
@@ -5,7 +5,7 @@ glamer_4lm <- function(X, y, clust.method, nlambda, lam, maxp, lambda){
 stop("Error: non-conforming data: nrow(X) not equal to length(y)")
 }
 ssd <- apply(X, 2, function(x) length(unique(x)))
-if (ssd[1] == 1 & (class(X[,1]) == "numeric" | class(X[,1]) == "integer")){
+if (ssd[1] == 1 & (inherits(X[,1], "numeric") | inherits(X[,1], "integer"))){
 X <- X[,-1, drop = FALSE]
 ssd <- ssd[-1]
 }
2 changes: 1 addition & 1 deletion R/predict.DMR.R
@@ -68,7 +68,7 @@ predict.DMR <- function(object, newx, df = NULL, type = "link", unknown.factor.l
 if(ncol(Z) != nrow(object$beta) | is.null(ncol(newx))){
 stop(paste("Error: non-conforming arrays, newx should be a data frame with ncol equal to", nrow(object$beta)))
 }
-if(class(df) != "numeric" & class(df) != "integer" & is.null(class(df))){
+if(!inherits(df, "numeric") & !inherits(df, "integer") & is.null(class(df))){
 stop("Error: wrong input type, df should have type numeric, integer or NULL")
 }
 if (names(object)[3] == "rss") type = "link"
5 changes: 4 additions & 1 deletion cran-comments.md
@@ -1,6 +1,9 @@
-## R CMD check results
+## local R CMD check results
 There were no ERRORs, WARNINGs, or NOTEs.
 
+## R-hub check results
+There was a NOTE related to the maintainer change. As I read over the Internet, the previous maintainer will get an email asking for her consent
+
 ## Downstream dependencies
 There seem to be no downstream dependencies:
 
4 changes: 2 additions & 2 deletions man/DMRnet-package.Rd

Some generated files are not rendered by default.

6 changes: 3 additions & 3 deletions vignettes/getting-started.Rmd
@@ -189,10 +189,10 @@ Please note that 1 is the value of a target class in the `predict` output.
 
 # References
 
-1. Szymon Nowakowski, Piotr Pokarowski and Wojciech Rejchel. 2021. *Group Lasso Merger for Sparse Prediction with High-Dimensional Categorical Data.* arXiv [stat.ME]. <http://arxiv.org/abs/2112.11114>
+1. Szymon Nowakowski, Piotr Pokarowski and Wojciech Rejchel. 2021. *Group Lasso Merger for Sparse Prediction with High-Dimensional Categorical Data.* arXiv [stat.ME]. <https://arxiv.org/abs/2112.11114>
 1. Aleksandra Maj-Kańska, Piotr Pokarowski and Agnieszka Prochenka, 2015. *Delete or merge regressors for linear model selection.* Electronic Journal of Statistics 9(2): 1749-1778. <https://projecteuclid.org/euclid.ejs/1440507392>
-2. Piotr Pokarowski and Jan Mielniczuk, 2015. *Combined l1 and greedy l0 penalized least squares for linear model selection.* Journal of Machine Learning Research 16(29): 961-992. <http://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf>
-4. Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. *Improving Lasso for model selection and prediction.* Scandinavian Journal of Statistics, 49(2): 831–863. <https://doi.org/10.1111/sjos.12546>
+2. Piotr Pokarowski and Jan Mielniczuk, 2015. *Combined l1 and greedy l0 penalized least squares for linear model selection.* Journal of Machine Learning Research 16(29): 961-992. <https://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf>
+4. Piotr Pokarowski, Wojciech Rejchel, Agnieszka Sołtys, Michał Frej and Jan Mielniczuk, 2022. *Improving Lasso for model selection and prediction.* Scandinavian Journal of Statistics, 49(2): 831–863. <doi:10.1111/sjos.12546>
 5. Ludwig Fahrmeir, Rita Künstler, Iris Pigeot, Gerhard Tutz, 2004. Statistik: der Weg zur Datenanalyse. 5. Auflage, Berlin: Springer-Verlag.
 6. Dean P. Foster and Edward I. George, 1994. *The Risk Inflation Criterion for Multiple Regression.* The Annals of Statistics 22 (4): 1947–75.
 
