
integration-test

Big integration test of all tools

Text stolen from Martin Schäf's email:

Right now, this repo only contains stubs and has no test oracles, but we hope to get everyone involved very soon. Each part of the system has a subfolder which contains a script to run one of our tools. Some subfolders still have to be added. The integration story we have in mind is as follows (and please yell if that doesn't work for you):

We start from the corpus. The first step is a dynamic analysis (which we only have to do once per corpus project). The dynamic analysis lives in ./dynamic_analysis, which contains files to download and run Randoop on all projects in the corpus. Next, we will add a Daikon run as well. The generated test cases are added in subfolders of the corresponding corpus projects. The dynamic analysis currently has no output (other than generating tests) because we have not yet agreed with Mayur on the kind of input he wants.
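A minimal sketch of what the per-project Randoop loop could look like, assuming a ./corpus directory with a compiled classes folder and a classlist.txt per project (those names, and the exact Randoop flags, are assumptions, not part of this repo):

```python
#!/usr/bin/env python
"""Hypothetical per-project Randoop runner; adjust paths and flags to
the actual corpus layout and Randoop version."""
import os
import subprocess

CORPUS = "./corpus"                             # assumed corpus location
RANDOOP_JAR = "./dynamic_analysis/randoop.jar"  # assumed download target

for project in sorted(os.listdir(CORPUS)):
    classes = os.path.join(CORPUS, project, "classes")
    classlist = os.path.join(CORPUS, project, "classlist.txt")
    test_dir = os.path.join(CORPUS, project, "randoop_tests")
    if not os.path.isdir(classes):
        continue
    if not os.path.isdir(test_dir):
        os.makedirs(test_dir)
    # Generated JUnit tests land in a subfolder of the corpus project.
    subprocess.check_call([
        "java", "-classpath", classes + os.pathsep + RANDOOP_JAR,
        "randoop.main.Main", "gentests",
        "--classlist=" + classlist,
        "--junit-output-dir=" + test_dir,
    ])
```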

After the dynamic analysis we perform type inference using the files in ./type_inference. These files currently only download the Checker Framework etc. and run the regression tests. In the long run, this step should take a mapping file from Java types to annotations (provided by Howie), propagate the types around, and produce a .jaif file that is passed to Petablox to update the LB representation of the corpus.
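A sketch of where that mapping step would sit, assuming a simple text mapping file from Howie (the file name, its format, and the simplified .jaif-style output are all assumptions; the real .jaif syntax is richer):

```python
#!/usr/bin/env python
"""Hypothetical propagation step: read a Java-type-to-annotation mapping
and emit one record per type for the downstream Petablox import."""

def load_mapping(path):
    # Assumed format, one entry per line: <fully.qualified.JavaType> <Annotation>
    mapping = {}
    with open(path) as fh:
        for line in fh:
            if line.strip() and not line.startswith("#"):
                java_type, annotation = line.split()
                mapping[java_type] = annotation
    return mapping

def write_annotations(mapping, out_path):
    # NOTE: placeholder output only; a real .jaif file has package/class/
    # member structure rather than flat "type: @Annotation" lines.
    with open(out_path, "w") as out:
        for java_type, annotation in sorted(mapping.items()):
            out.write("%s: @%s\n" % (java_type, annotation))

if __name__ == "__main__":
    write_annotations(load_mapping("type_to_annotation.map"), "project.jaif")
```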

Next, we call Petablox on all corpus projects. Those scripts are in ./into_logicblox. Currently, this only does the regular full-program analysis with bddbddb. The big step here will be to figure out how to put all projects into one LB instance and how to deal with programs that have multiple or no entry points. I'd like to start a discussion on this asap because this will eventually be the center of all other activity.
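Until the single-LB-instance question is settled, a per-project loop is the obvious interim shape; run_petablox.sh below is a placeholder name for whatever entry script ends up in ./into_logicblox, and the ./corpus path is an assumption:

```python
#!/usr/bin/env python
"""Hypothetical driver that runs the Petablox analysis once per corpus
project; merging everything into one LB instance is still an open question."""
import os
import subprocess

CORPUS = "./corpus"  # assumed corpus location

for project in sorted(os.listdir(CORPUS)):
    project_dir = os.path.join(CORPUS, project)
    if not os.path.isdir(project_dir):
        continue
    # One analysis run per project for now; entry-point handling for
    # programs with multiple or no mains still needs to be decided.
    subprocess.check_call(["bash", "./into_logicblox/run_petablox.sh", project_dir])
```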

Finally, we have the graph generation in ./graph_generation. This just runs the scripts that we used in the demo workshop to generate dot files for each corpus project and to precompute the graph kernels (which are stored in ./graph_generation/kernels.txt). In the long run, the dot generation should be moved into the LB part. The graph generation uses the type annotations generated by Werner, where available, to label the nodes.
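As an illustration of the kernels.txt precomputation, here is a sketch that reads each project's dot file with networkx and writes one feature line per project; the node-label histogram is only a placeholder for whichever graph kernel the demo scripts actually use, and the dot/ directory layout is an assumption:

```python
#!/usr/bin/env python
"""Hypothetical kernel precomputation over the generated dot files."""
import glob
import os
from collections import Counter

import networkx as nx  # reading dot files also requires pydot

with open("./graph_generation/kernels.txt", "w") as out:
    for dot_file in sorted(glob.glob("./graph_generation/dot/*.dot")):
        graph = nx.drawing.nx_pydot.read_dot(dot_file)
        # Placeholder feature map: histogram of node labels (type
        # annotations show up as labels where they were inferred).
        counts = Counter(data.get("label", "?") for _, data in graph.nodes(data=True))
        features = " ".join("%s:%d" % item for item in sorted(counts.items()))
        out.write("%s %s\n" % (os.path.basename(dot_file), features))
```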

The results of running all tools can be found on Travis: click here to see the results of the experiments on TravisCI.

Obtaining LogicBlox

LogicBlox is released every month. It is available for download at https://download.logicblox.com. To obtain a username and password for this website, please fill out the form at http://www.logicblox.com/learn/academic-license-request-form/ and indicate that you are a MUSE team member.

LogicBlox offers Linux 64-bit as well as OSX distributions. If you have problems installing LogicBlox, please contact muse-users@logicblox.com. You might want to be signed up for this group as well!

Documentation is available at https://developer.logicblox.com/documentation/. Depending on what you are most interested in, you may want to check the reference manual or the admin manual.

Just email martin.bravenboer@logicblox.com if you have any further questions.