ultimate_data_process_in_one_shot produces the training features in one shot; however, the input file name needs to be updated inside the file.
concatenate_dataset concatenates the feature files generated from different datasets into one combined feature set for training.
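The concatenation step can be sketched as a simple row-wise merge of per-dataset feature CSVs. This is a minimal illustration, not the repository's actual implementation; the function name and CSV layout are assumptions:

```python
import csv

def concatenate_feature_files(paths, out_path):
    """Merge feature CSVs (same header) from several datasets into one file.

    Hypothetical sketch: assumes each input is a CSV whose first row is a
    shared feature header, as concatenate_dataset presumably expects.
    """
    header = None
    rows = []
    for p in paths:
        with open(p, newline="") as fh:
            reader = csv.reader(fh)
            h = next(reader)
            if header is None:
                header = h
            elif h != header:
                raise ValueError(f"{p}: feature header mismatch")
            rows.extend(reader)
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        writer.writerows(rows)
```

A header check like the one above is useful here because features built from different datasets must line up column-for-column before training.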
model.py: the neural network definition
de_analysis.py: save model and Deep Explain arrays
python de_analysis.py --inpath <data_path> --tab <prefix of data file: new_/sample_/none> --modelpath <model_path> --depath <Deep_Explain array path> --(no-)save_de --(no-)save_model --feature_file <selected feature file, if any>
- --(no-)save_model: whether to save the model
- --(no-)save_de: whether to save the Deep Explain arrays
- --modelpath: path to save model
- --depath: path to save Deep Explain arrays
- --feature_file: selected features generated by deep_explain.ipynb; default is none. If provided, only the selected features are preserved.
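The flags above, including the `--(no-)` toggles, can be modeled with a standard argparse setup. This is a hypothetical sketch of the de_analysis.py interface, not its actual code; defaults are assumptions:

```python
import argparse

def build_parser():
    """Sketch of the de_analysis.py command line described above (assumed defaults)."""
    p = argparse.ArgumentParser(description="Save model and Deep Explain arrays")
    p.add_argument("--inpath", required=True, help="data path")
    p.add_argument("--tab", default="", help="prefix of data file: new_/sample_/none")
    p.add_argument("--modelpath", help="path to save model")
    p.add_argument("--depath", help="path to save Deep Explain arrays")
    # BooleanOptionalAction auto-generates the paired --save_x / --no-save_x flags
    p.add_argument("--save_model", action=argparse.BooleanOptionalAction,
                   default=True, help="whether to save the model")
    p.add_argument("--save_de", action=argparse.BooleanOptionalAction,
                   default=True, help="whether to save Deep Explain arrays")
    p.add_argument("--feature_file", default=None,
                   help="selected feature file from deep_explain.ipynb")
    return p
```

With this setup, `--no-save_model` sets `save_model` to False while leaving the other toggles at their defaults.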
nn_feature_Selection.py: generates results by removing features one by one, following the saliency order given by the DE array
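The one-by-one removal loop can be sketched as: rank features by mean absolute saliency, then drop the least salient remaining feature at each step. This is an illustrative assumption about the procedure, not the script's actual code:

```python
def removal_order(saliency_rows):
    """Rank feature indices by mean |saliency| across samples, least important first.

    saliency_rows: samples x features nested lists (a stand-in for the DE array).
    """
    n = len(saliency_rows[0])
    mean_abs = [sum(abs(row[j]) for row in saliency_rows) / len(saliency_rows)
                for j in range(n)]
    return sorted(range(n), key=lambda j: mean_abs[j])

def feature_subsets(n_features, order):
    """Yield the surviving feature indices after each removal step."""
    keep = list(range(n_features))
    for j in order[:-1]:  # stop before removing the last feature
        keep.remove(j)
        yield list(keep)
```

Each yielded subset would then be used to retrain/evaluate the network, producing the per-step results the script records.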
Meaning of the --tab prefixes:
- new: generated using the singalun.network from Leon (in network.py)
- resample: generated using resample_SEQC.py. Since the majority:minority ratio in SEQC is 4:1, resampling deletes half of the majority patients and repeats the minority HC samples twice. Deprecated.
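The (deprecated) rebalancing rule above can be sketched directly: keep half the majority class and duplicate the minority class, turning a 4:1 split into 1:1. This is a minimal illustration under those assumptions, not resample_SEQC.py itself:

```python
import random

def resample_4_to_1(majority, minority, seed=0):
    """Rebalance a 4:1 split: keep half the majority, repeat the minority twice.

    Hypothetical sketch: 4 parts majority -> 2, 1 part minority -> 2, i.e. 1:1.
    """
    rng = random.Random(seed)
    kept_majority = rng.sample(majority, len(majority) // 2)
    return kept_majority, minority * 2
```

Note that random undersampling discards data, which is one likely reason this scheme was deprecated in favor of the other preprocessing paths.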