[BUG] `olink_ttest` does not have `estimate` column by default in conda package
abiddanda54gene opened this issue · 4 comments
Describe the bug
When looking for the fold change between cases and controls, the documentation of `olink_ttest` suggests that there should be an `estimate` column, but it is no longer present.
To Reproduce
Steps to reproduce the behavior:
library(OlinkAnalyze)
library(dplyr)
npx_data <- read_NPX("Explore1536_random_NPX_11MAR2022.csv")
metadata <- read.table("Explore1536_random_NPX_11MAR2022.fake_meta.csv", sep=",", header=TRUE)
npx_data$SampleID <- as.character(npx_data$SampleID)
metadata$SampleID <- as.character(metadata$SampleID)
# merge with metadata that describes case-control status
merged_npx <- npx_data %>% inner_join(metadata, by="SampleID")
ttest.results <- olink_ttest(df=merged_npx, variable="Treatment")
# NOTE: the column names do not include `estimate`, and none of the returned values are negative here ...
colnames(ttest.results)
Expected behavior
There should be an `estimate` column describing the difference in mean NPX between the two conditions, but it is currently missing from the package output. Even `estimate1` and `estimate2` are not present in the data frame.
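For reference, a quick way to confirm the missing column, run against the `ttest.results` object from the code above:
# Check whether the documented `estimate` column made it into the output;
# on my install this returns FALSE
"estimate" %in% colnames(ttest.results)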
System Information:
- OS: Linux
- R Version: 4.1.3
- Package Information:
conda install -c r -c bioconda -c conda-forge r-olinkanalyze
Additional context
See test_data attached for reproduction: test_data.tar.gz
Thank you for reporting this unexpected behavior. However, I could not reproduce this bug on my end.
I used this code to load your example data and calculate t-test results:
library(OlinkAnalyze)
library(dplyr)
npx_data <- read.delim("Explore1536_random_NPX_11MAR2022.csv", sep = ";", dec = ".")
metadata <- read.table("Explore1536_random_NPX_11MAR2022.fake_meta.csv", sep=",", header=TRUE)
npx_data$SampleID <- as.character(npx_data$SampleID)
metadata$SampleID <- as.character(metadata$SampleID)
merged_npx <- npx_data %>% inner_join(metadata, by="SampleID")
ttest.results <- olink_ttest(df=merged_npx, variable="Treatment")
And got the following output:
> ttest.results
# A tibble: 1,472 × 16
Assay OlinkID UniProt Panel estimate Treated Untreated statistic p.value
<chr> <chr> <chr> <chr> <dbl> <dbl> <dbl> <dbl> <dbl>
1 DTX3 OID21349 Q8N9I9 Oncolo… 0.937 3.61 2.67 3.98 9.51e-5
2 IL22RA1 OID20449 Q8N6P7 Inflam… 0.939 3.60 2.66 3.75 2.50e-4
3 PCDH1 OID20614 Q08174 Inflam… -0.761 2.52 3.28 -3.48 6.15e-4
4 ICAM1 OID20411 P05362 Cardio… 0.758 3.56 2.81 3.40 8.02e-4
5 GSTA1 OID20166 P08263 Cardio… -0.644 2.63 3.28 -3.11 2.12e-3
6 ABHD14B OID20921 Q96IU4 Neurol… -0.720 2.50 3.22 -2.88 4.43e-3
7 CCN4 OID21444 O95388 Oncolo… 0.751 3.52 2.77 2.83 5.20e-3
8 IL6 OID20563 P05231 Inflam… 0.708 3.36 2.65 2.80 5.74e-3
9 PLXNB3 OID20160 Q9ULL4 Cardio… 0.697 3.50 2.81 2.80 5.77e-3
10 SOST OID20204 Q9BQB4 Cardio… -0.646 2.55 3.20 -2.75 6.52e-3
# … with 1,462 more rows, and 7 more variables: parameter <dbl>,
# conf.low <dbl>, conf.high <dbl>, method <chr>, alternative <chr>,
# Adjusted_pval <dbl>, Threshold <chr>
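As a side note, the `estimate` column is (as far as I know) also what `olink_volcano_plot` reads for the effect size on the x-axis, so a quick sanity check on a working install is that this runs without error:
# olink_volcano_plot plots the estimate against the adjusted p-value,
# so it depends on the same `estimate` column being present
olink_volcano_plot(p.val_tbl = ttest.results)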
System environment:
R version 4.1.3 (2022-03-10)
Platform: x86_64-conda-linux-gnu (64-bit)
Running under: Ubuntu 20.04.2 LTS
OlinkAnalyze_3.1.0
Can you provide any more details that could help us reproduce the bug?
Would it be possible for you to run the example dataset in a non-conda environment, to help isolate the problem?
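If you do see it again, the exact dependency versions would also help. The output columns (estimate, statistic, conf.low, ...) come from tidied t.test results, so a stale tidying dependency such as broom is a plausible culprit (an assumption on my part). For example:
# Capture the full session details plus the versions of the packages
# most likely involved in building the t-test output
sessionInfo()
packageVersion("OlinkAnalyze")
packageVersion("broom")
packageVersion("dplyr")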
This also worked completely fine for me - I think it was caused by some incorrect dependency pins in a previous conda environment that overrode some things (I've shared both configs below). Creating a very simple conda environment with just the OlinkAnalyze package did the trick:
Good config:
name: olink-analyze
channels:
- conda-forge
- bioconda
- r
dependencies:
- r-olinkanalyze
Buggy config:
name: olink-analyze
channels:
- r
- conda-forge
- bioconda
dependencies:
- r-olinkanalyze
- r-base=4.1.3
- r-broom
- r-car
- r-tidyverse
- r-ggpubr
- r-lme4
- r-lmertest
- r-magrittr
- r-readxl
- r-data.table
- r-r.utils
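For anyone who runs into the same thing: save the good config above to a file (the name olink-analyze.yml here is arbitrary) and build a fresh environment from it:
conda env create -f olink-analyze.yml
conda activate olink-analyze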
I think this can be safely closed, as it is most likely a conda issue.
Closing this for now then. @abiddanda54gene please let us know if you have further details on why this is happening.