insightsengineering/teal.modules.general

Implement shinytest2 for tmg

Closed this issue · 6 comments

This is a continuation of https://github.com/insightsengineering/coredev-tasks/issues/503

Using the shinytest2 helper class that we built for teal, let's extend the feature to tmg.

We have 15 teal module functions in tmg

Note

  • We stick to one app per test and will open an issue to optimize this by exploring one app for all tests in a module.
  • Create a function: app_driver_<name of module>
    • Create it in tests/testthat/helper-TealAppDriver.R so we can call it as a function in the test cases.
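A minimal sketch of such a helper, assuming the `TealAppDriver` class exposed by {teal} for shinytest2 testing; the module name `tm_example`, its `label` argument, and the dataset are illustrative assumptions, not the actual API of any specific {tmg} module:

```r
# tests/testthat/helper-TealAppDriver.R
# Sketch only: `tm_example` is a placeholder module name.
app_driver_tm_example <- function() {
  data <- teal.data::teal_data(iris = iris)  # general dataset, per the discussion below
  teal::TealAppDriver$new(
    data = data,
    modules = teal.modules.general::tm_example(label = "Example")
  )
}
```

Each test case can then construct its own app via `app_driver_tm_example()`, keeping to the one-app-per-test convention noted above.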

Test specs

  • Test if the example apps for the modules tm_* can be run without shiny errors
    • We won't do snapshot tests or confirm the exact output based on the encoding. We will only check that the visualization is generated when the app initializes.
    • Run the app with arguments other than those provided in the example (when applicable)
  • Test if visualization is updated when encoding is updated
  • Prefix all external functions with their namespace (i.e. within the module and everywhere else!)
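As a sketch, a test following these specs might look like the following; the helper name, output selector, and module are hypothetical, while `wait_for_idle()`, `get_html()`, and `stop()` are standard shinytest2 `AppDriver` methods:

```r
testthat::test_that("tm_example example app starts without shiny errors", {
  app <- app_driver_tm_example()  # hypothetical helper from helper-TealAppDriver.R
  app$wait_for_idle()
  # Check only that a visualization is rendered on initialization,
  # not its exact content (no snapshot). The "#myplot" ID is an assumption.
  testthat::expect_false(is.null(app$get_html("#myplot")))
  app$stop()
})
```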

@insightsengineering/nest-core-dev
As discussed, please self-assign by adding your name to the modules.
Please begin with one or two modules and review them before applying the test cases to the remaining modules.

Created an initial PR that will support {tmg} testing (#714).

I'm wondering which datasets should be used as the base for automated testing: (1) iris/mtcars/... or (2) CDISC.

CDISC is closer to the overall goal of teal; on the other hand, all the example modules should work with general data, as was recently added to all the examples (a mix of iris/mtcars/CO2/USArrests/...).

Any thoughts folks?

I believe general datasets can cover all the necessary functionality, since {tmg} is designed as a set of standard modules for data operations.

I prefer using general datasets to maintain simplicity, except in cases where specific functionalities are only relevant for CDISC datasets.

  • Test if the example apps for the modules tm_* can be run without shiny errors

I think we can focus part of our effort on merging insightsengineering/teal.modules.clinical#983 and then copy/paste it into {tmg}.

This will allow our automated tests to focus on each individual module's functionality (and leave basic error detection to that effort, which is 2-in-1).

The automated tests being developed for {tmc} work in this package as well, with 2 exceptions: [ FAIL 2 | WARN 0 | SKIP 0 | PASS 68 ]

These require custom input at startup (selections are not available via the module's parameters as of now):

  • tm_file_viewer (file being chosen)
  • tm_g_distribution: choosing a test in a collapsed section of the encoding panel

It lives on a tentative branch as of now (477_shinytest2_examples@main)

We could address this in different ways:

  • Exclude the example app and create a custom test
  • Set missing options via the module call (adding a new argument)
  • Allow some shiny validation errors based on a regular expression

Thanks for running this here. Out of the options you listed, I like the allow-list (for warnings / validation / etc.) the most and dislike adding a new argument the most :) It would be great to have such a list be very specific and accurate, e.g. for file XYZ accept validation with ID ABC and warning DEF.
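One way to sketch such a per-module allow-list; the module names, messages, and the helper itself are illustrative assumptions, not an agreed API:

```r
# Illustrative allow-list: acceptable startup validation messages per module.
allowed_validation <- list(
  tm_file_viewer     = "Please select a file",
  tm_g_distribution  = "Please select a test"
)

# Hypothetical check: fail if any visible validation message is not
# matched by the module's allowed pattern(s).
expect_only_allowed_validation <- function(messages, module) {
  pattern <- allowed_validation[[module]]
  unexpected <- if (is.null(pattern)) {
    messages
  } else {
    messages[!grepl(pattern, messages)]
  }
  testthat::expect_length(unexpected, 0)
}
```

Keeping the patterns specific (exact message or validation ID per module) avoids the allow-list silently masking genuine errors.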

All PRs have been merged. Great job team!! 🎉 💯