Merged
Changes from 10 commits
64 changes: 18 additions & 46 deletions .github/workflows/R-CMD-check.yaml
@@ -1,5 +1,5 @@
# For help debugging build failures open an issue on the RStudio community with the 'github-actions' tag.
# https://community.rstudio.com/new-topic?category=Package%20development&tags=github-actions
# Workflow derived from https://github.com/r-lib/actions/tree/v2/examples
# Need help debugging build failures? Start at https://github.com/r-lib/actions#where-to-find-help
on:
push:
branches:
@@ -12,7 +12,9 @@ on:
- master
- dev

name: R-CMD-check
name: R-CMD-check.yaml

permissions: read-all

jobs:
R-CMD-check:
@@ -32,56 +34,26 @@ jobs:
- {os: ubuntu-latest, r: 'release'}

env:
R_REMOTES_NO_ERRORS_FROM_WARNINGS: true
RSPM: ${{ matrix.config.rspm }}
GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
R_KEEP_PKG_SOURCE: yes

steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4

- uses: r-lib/actions/setup-pandoc@v2

- uses: r-lib/actions/setup-r@v2
with:
r-version: ${{ matrix.config.r }}
http-user-agent: ${{ matrix.config.http-user-agent }}
use-public-rspm: true

- uses: r-lib/actions/setup-pandoc@v2

- name: Query dependencies
run: |
install.packages('remotes')
saveRDS(remotes::dev_package_deps(dependencies = TRUE), ".github/depends.Rds", version = 2)
writeLines(sprintf("R-%i.%i", getRversion()$major, getRversion()$minor), ".github/R-version")
shell: Rscript {0}

- name: Cache R packages
if: runner.os != 'Windows'
uses: actions/cache@v2
- uses: r-lib/actions/setup-r-dependencies@v2
with:
path: ${{ env.R_LIBS_USER }}
key: ${{ runner.os }}-${{ hashFiles('.github/R-version') }}-1-${{ hashFiles('.github/depends.Rds') }}
restore-keys: ${{ runner.os }}-${{ hashFiles('.github/R-version') }}-1-

- name: Install system dependencies
if: runner.os == 'Linux'
run: |
while read -r cmd
do
eval sudo $cmd
done < <(Rscript -e 'writeLines(remotes::system_requirements("ubuntu", "20.04"))')

- name: Install dependencies
run: |
remotes::install_deps(dependencies = TRUE)
remotes::install_cran("rcmdcheck")
shell: Rscript {0}

- name: Check
env:
_R_CHECK_CRAN_INCOMING_REMOTE_: false
run: rcmdcheck::rcmdcheck(args = c("--no-manual", "--as-cran"), error_on = "warning", check_dir = "check")
shell: Rscript {0}
extra-packages: any::rcmdcheck
needs: check

- name: Upload check results
if: failure()
uses: actions/upload-artifact@main
- uses: r-lib/actions/check-r-package@v2
with:
name: ${{ runner.os }}-r${{ matrix.config.r }}-results
path: check
upload-snapshots: true
build_args: 'c("--no-manual","--compact-vignettes=gs+qpdf")'
4 changes: 2 additions & 2 deletions DESCRIPTION
@@ -26,7 +26,7 @@ Depends: R (>= 4.0)
Imports:
yyjsonr (>= 0.1.18),
jsonvalidate (>= 1.3.1),
lubridate
hms
Suggests:
testthat (>= 2.1.0),
jsonlite (>= 1.8.0),
@@ -37,7 +37,7 @@ Suggests:
purrr,
tibble,
dplyr,
hms,
lubridate,
data.table
VignetteBuilder: knitr
Config/testthat/edition: 3
2 changes: 1 addition & 1 deletion NAMESPACE
@@ -16,8 +16,8 @@ export(set_study_oid)
export(set_variable_attributes)
export(validate_dataset_json)
export(write_dataset_json)
importFrom(hms,as_hms)
importFrom(jsonvalidate,json_validate)
importFrom(lubridate,hms)
importFrom(tools,file_path_sans_ext)
importFrom(utils,tail)
importFrom(yyjsonr,opts_read_json)
2 changes: 1 addition & 1 deletion R/utils.R
@@ -127,7 +127,7 @@ date_time_conversions <- function(d, dt, tdt){
as.POSIXct,
tz = "UTC",
tryFormats = "%Y-%m-%dT%H:%M:%S")
d[time_cols] <- lapply(d[time_cols], hms)
d[time_cols] <- lapply(d[time_cols], as_hms)
d
}
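
The practical difference between the two parsers is worth spelling out: `lubridate::hms()` returns a `Period`, while `hms::as_hms()` returns an `hms`/`difftime` stored as seconds since midnight. A minimal before/after sketch (standalone, assuming only the two packages are installed; not code from this PR):

```r
# Old behavior: lubridate::hms() parses "HH:MM:SS" into a Period.
old <- lubridate::hms("12:34:56")
class(old)       # "Period"

# New behavior: hms::as_hms() parses the same string into an
# hms/difftime backed by seconds since midnight.
new <- hms::as_hms("12:34:56")
class(new)       # "hms" "difftime"

# Both reduce to the same number of seconds:
as.numeric(old)  # 45296
as.numeric(new)  # 45296
```

Downstream code that did arithmetic via `as.numeric()` keeps working; only class checks against `"Period"` need updating.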

2 changes: 1 addition & 1 deletion R/zzz.R
@@ -2,7 +2,7 @@
#' @importFrom yyjsonr opts_write_json opts_read_json write_json_file write_json_str read_json_str read_json_file
#' @importFrom tools file_path_sans_ext
#' @importFrom utils tail
#' @importFrom lubridate hms
#' @importFrom hms as_hms
NULL

#' @keywords internal
2 changes: 1 addition & 1 deletion data-raw/data.R
@@ -89,7 +89,7 @@ time_options <- c("12:34:56", "15:34:34", "11:12:52", "21:16:11")

adsl$VIST1TMC <- sample(time_options, 254, replace=TRUE)
adsl$VIST1DTC <-paste(format(adsl$VISIT1DT, "%Y-%m-%d"), sample(time_options, 254, replace=TRUE), sep="T")
adsl$VISIT1TM <- lubridate::hms(adsl$VIST1TMC)
adsl$VISIT1TM <- hms::as_hms(adsl$VIST1TMC)
adsl$VIST1DTM <- as.POSIXct(strptime(adsl$VIST1DTC, "%Y-%m-%dT%H:%M:%S", tz="UTC"))

new_meta <- tibble::tribble(
4 changes: 2 additions & 2 deletions tests/testthat/test-read_dataset_json.R
@@ -84,11 +84,11 @@ test_that("datetime conversions work properly",{
iris_timetest <- read_dataset_json(test_path("testdata", "iris_timetest.json"))

expect_s3_class(iris_timetest$Datetime, "POSIXct")
expect_equal(class(iris_timetest$Time), "Period", ignore_attr=TRUE)
expect_equal(class(iris_timetest$Time), c("hms", "difftime"), ignore_attr=TRUE)

expect_equal(sort(unique(iris_timetest$Datetime)),
as.POSIXct(strptime(c("2024-01-01T12:34:56", "2024-01-17T18:45:56"),
"%Y-%m-%dT%H:%M:%S", tz="UTC")))
expect_equal(sort(unique(as.numeric(iris_timetest$Time))),
as.numeric(hms(c("12:34:56", "18:45:56"))))
as.numeric(as_hms(c("12:34:56", "18:45:56"))))
})
2 changes: 1 addition & 1 deletion tests/testthat/test-utils.R
@@ -42,5 +42,5 @@ test_that("Date, datetime and time conversions work as expected", {
expect_equal(df_converted$datetime, as.POSIXct(c("2020-01-01 12:00:00",
"2020-01-01 12:00:01",
NA), tz = "UTC"))
expect_equal(df_converted$time, hms(c("12:00:00", "12:00:01", NA)))
expect_equal(df_converted$time, as_hms(c("12:00:00", "12:00:01", NA)))
})
Binary file modified tests/testthat/testdata/adsl_time_test.Rds
Binary file not shown.
3 changes: 3 additions & 0 deletions vignettes/converting_files.Rmd
@@ -49,6 +49,9 @@ extract_xpt_meta <- function(n, .data) {
} else if (inherits(.data[[n]],"numeric")) {
if (any(is.double(.data[[n]]))) out$dataType <- "float"
else out$dataType <- "integer"
} else if (inherits(.data[[n]],"hms")) {
out$dataType <- "time"
out$targetDataType <- "integer"
} else {
out$dataType <- "string"
out$length <- max(purrr::map_int(.data[[n]], nchar))
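
Because `hms` columns carry the class vector `c("hms", "difftime")`, a plain `inherits(x, "hms")` test is all the new branch needs. A stripped-down sketch of that dispatch (the `meta_for()` helper below is illustrative only, not the full `extract_xpt_meta()` from the vignette):

```r
library(hms)

# Toy data frame with a time-of-day column:
df <- data.frame(tm = as_hms(c("08:30:00", "17:45:10")))

# Minimal stand-in for the metadata branch added above:
meta_for <- function(n, .data) {
  out <- list(name = n)
  if (inherits(.data[[n]], "hms")) {
    # Time columns: CDISC dataType "time", stored numerically.
    out$dataType <- "time"
    out$targetDataType <- "integer"
  } else {
    out$dataType <- "string"
  }
  out
}

meta_for("tm", df)$dataType        # "time"
meta_for("tm", df)$targetDataType  # "integer"
```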
4 changes: 2 additions & 2 deletions vignettes/date_time_datetime.Rmd
@@ -43,8 +43,8 @@ tibble::tribble(
In the table above, we have the metadata for both character and numeric dates, times, and datetimes. Both sets of variables have the same values within `dataType`. The difference is the optional field `targetDataType`, where the value for the numeric variables is set to `integer`. Both `read_dataset_json()` and `write_dataset_json()` rely on these fields, so they must be set properly. This comes with a few assumptions and requirements.

- Numeric dates will be converted into the type of `Date` (see `help("Date", package="base")`)
- Numeric times will be converted to the **{lubridate}** type of `Period`
- R doesn't have a specific built in type of time. We decided to take on **{lubridate}** as a dependency given package stability and **{lubridate}**'s inclusion in the tidyverse. The `Period` objects on read are produced using the `lubridate::hms()`.
- Numeric times will be converted to the **{hms}** type of `hms`
- R doesn't have a specific built-in type for times of day. We decided to take on **{hms}** as a dependency given that this is the type used by the **{haven}** package when reading SAS Version 5 Transport files. As such, similar behavior can be expected when importing an XPT or a Dataset JSON file.
- Numeric datetimes will be converted to the base R type of `POSIXct` and anchored to the UTC timezone.
- CDISC dates are generally not timezone qualified, though for character dates this is optional. Unless a timezone is explicitly specified, systems may default to the user's current timezone. To decrease ambiguity, we've introduced a hard requirement that datetimes are anchored to UTC. If a datetime variable is found to be using a different timezone, an error will be thrown.
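
Taken together, these rules mean a numeric round trip behaves roughly as follows (a sketch under the assumptions above; the raw values are made up for illustration, not taken from a real Dataset JSON file):

```r
library(hms)

# Numeric values as they might arrive from a Dataset JSON file:
dt  <- 19723L       # days since 1970-01-01
tm  <- 45296L       # seconds since midnight
dtm <- 1704112496   # seconds since 1970-01-01 00:00:00 UTC

as.Date(dt, origin = "1970-01-01")                  # "2024-01-01"
as_hms(tm)                                          # 12:34:56
as.POSIXct(dtm, origin = "1970-01-01", tz = "UTC")  # "2024-01-01 12:34:56 UTC"
```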
