
Create test cases to run examples in tutorials/pytorch tutorials/sklearn


  • Create a separate script (one that can be executed independently from the regular end2end tests) to execute and test the notebooks in the documentation.
  • This issue only covers writing test cases for tutorials/pytorch and tutorials/sklearn.

IMPORTANT:

Some notebooks require large datasets such as CelebA. Those notebooks can be skipped if the dataset is not present locally.
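
For example, a test for such a notebook could be guarded with a skip marker that checks whether the dataset exists locally. This is only a sketch; the dataset path and the test name below are assumptions, not part of the existing code base:

import os
import pytest

# Assumed local CelebA location; adjust to wherever the dataset lives on the test machine.
CELEBA_DIR = os.path.expanduser("~/data/celeba")

@pytest.mark.skipif(not os.path.isdir(CELEBA_DIR),
                    reason="CelebA dataset is not available locally")
def test_execute_celeba_notebook():
    ...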

Please find the details of this issue below:

  • Create end2end test scripts like the ones in test/end2end/e2e_*.py
  • In the e2e test script, create pytest fixtures to add components and datasets. This is already implemented in other e2e files.
  • Write test functions that execute the tutorial notebooks. The tests should only verify that each notebook is executable (a possible notebook-execution helper is sketched after this list).
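
If the existing e2e utilities do not already provide such a helper, one minimal sketch would be to run each notebook with nbformat and nbconvert's ExecutePreprocessor. The helper name, timeout, and kernel name below are assumptions:

import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

def execute_notebook(notebook_path, working_dir="."):
    # Load the notebook and run every cell in order; a CellExecutionError is
    # raised as soon as any cell fails, which is all the test needs in order
    # to decide whether the notebook is executable.
    with open(notebook_path) as f:
        nb = nbformat.read(f, as_version=4)
    ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
    ep.preprocess(nb, {"metadata": {"path": working_dir}})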

Suggestion:
You can use the same fixture (nodes) to add other datasets without recreating the nodes:

import pytest

@pytest.fixture(scope="module")
def setup(request):
    # Create the nodes (components) and start them
    ...
    return node_1, node_2, node_3

def test_execute_pytorch_mnist_notebook(setup):
    node_1, node_2, node_3 = setup
    dataset = {...}  # the dict that defines the dataset (dataset json)

    add_dataset(node_1, dataset)
    add_dataset(node_2, dataset)
    add_dataset(node_3, dataset)

    try:
        execute_script('...notebooks...')  # path to the tutorial notebook
    except Exception:
        pytest.fail('Notebook execution failed')

Note: All notebooks can be executed in a single test file.
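
For instance, building on the hypothetical execute_script helper from the suggestion above, a single test file could parametrize one test over all notebook paths (the paths below are placeholders, not the actual notebook locations):

import pytest

NOTEBOOKS = [
    "docs/tutorials/pytorch/pytorch-mnist.ipynb",        # placeholder path
    "docs/tutorials/scikit-learn/sklearn-mnist.ipynb",   # placeholder path
]

@pytest.mark.parametrize("notebook", NOTEBOOKS)
def test_execute_tutorial_notebook(setup, notebook):
    try:
        execute_script(notebook)
    except Exception:
        pytest.fail(f"Notebook {notebook} is not executable")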

Notebooks: the PyTorch used-cars example and the sklearn perceptron example contain dataset preparation at the beginning of the notebook. Those notebooks may require modification to make them compatible with the e2e test machinery. Therefore, we can skip them in the scope of this issue.

This will be handled in a separate issue.