
Frontend Tests


Ideation:

Infrastructure:

  • Split tests into different files per feature
  • Utility for setting pivot to a specific state (a sketch follows this list)
  • Utility for setting merge to a specific state
  • Test that the generated code is runnable in Streamlit after each test (via the run button in Streamlit)
  • Snapshot tests, so we can test graphs by comparing images of them
  • Organize the utilities so they are accessible
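
For the state-setup utilities, here is a minimal sketch of what one could look like, assuming Playwright is the test runner. The `setPivotState` name, the `PivotState` shape, and every selector below are assumptions for illustration, not the actual test API.

```ts
import { Page } from '@playwright/test';

// Hypothetical shape of a pivot configuration used by tests.
interface PivotState {
  rows: string[];
  columns: string[];
  values: { column: string; aggregation: string }[];
}

// Sketch of a shared utility that drives the UI into a known pivot state,
// so individual feature tests don't repeat the same setup clicks.
// All button labels and selectors below are assumptions, not the real UI.
export async function setPivotState(page: Page, state: PivotState): Promise<void> {
  // Open the pivot taskpane from the toolbar.
  await page.getByRole('button', { name: 'Pivot' }).click();

  // Add each row key via the (assumed) "+ Add" button and column dropdown.
  for (const row of state.rows) {
    await page.getByRole('button', { name: '+ Add' }).first().click();
    await page.getByText(row, { exact: true }).click();
  }

  // ...repeat the same pattern for state.columns and state.values...
}
```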

Feature Specific

Merge

  • Merge: cycling through the different merge types and making sure they work (a parameterised test is sketched after this list)
  • Merge: testing all of the different dropdowns
  • Merge: testing the error message shown when merge keys have incompatible dtypes (e.g. string and int)
  • Merge: changing the different dataframes
  • Merge: toggling all the different columns
  • Merge: adding in and deleting merge keys
  • Merge: editing a merge when a merge key is missing shows an error
  • Merge: editing a merge when a selected column is missing shows a warning
  • Merge: editing a merge and changing every single field
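
For the merge-type cycling bullet above, a parameterised test could look like the sketch below, again assuming Playwright. The merge-type names, the app URL, and the selectors are assumptions.

```ts
import { test, expect } from '@playwright/test';

// Assumed list of merge types available in the merge taskpane dropdown.
const MERGE_TYPES = ['lookup', 'left', 'right', 'inner', 'outer', 'unique in left', 'unique in right'];

for (const mergeType of MERGE_TYPES) {
  test(`merge type "${mergeType}" produces a merged dataframe`, async ({ page }) => {
    await page.goto('http://localhost:8555'); // local test app URL is an assumption

    // Open the merge taskpane (toolbar button label is an assumption).
    await page.getByRole('button', { name: 'Merge' }).click();

    // Pick the merge type from its dropdown (selectors are assumptions).
    await page.getByText('Merge Type').click();
    await page.getByText(mergeType, { exact: true }).click();

    // A sheet tab for the merged dataframe should appear (tab name is an assumption).
    await expect(page.getByText('df_merge')).toBeVisible();
  });
}
```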

Pivot

  • Pivot: adding multiple entries to each of the column dropdowns
  • Pivot: adding filters
  • Pivot: setting the filters to invalid values
  • Pivot: reopening a pivot with column headers that were deleted
  • Pivot: replaying edits automatically

Import

  • Import: configure CSV import
  • Import: configure XLSX import (which has a lot more configuration options)
  • Import: different range imports (maybe cycle through them)

Graph

  • Graph: set up the graph tab and set the axes
  • Graph: set title by clicking, delete title by clicking (x3, one for each title)
  • Graph: changing the chart type changes the chart-specific configuration options you expect to see
  • Graph: copy code
  • Graph: closing and opening a graph
  • Graph: changing the browser size resizes the graph
  • Graph: format tab. Checking that changing a color actually updates the chart (a screenshot comparison is sketched after this list)
  • Graph: deleting a column from the source data
  • Graph: putting in valid numbers for the format tabs
  • Graph: deleting the source dataframe deletes the graph
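
Several of the graph checks above (the format-tab colour change in particular) are easiest to assert with the screenshot idea from the infrastructure list. Here is a sketch assuming Playwright's built-in `toHaveScreenshot`; the chart selector and the format-tab control are assumptions.

```ts
import { test, expect } from '@playwright/test';

test('changing the fill color in the format tab updates the chart', async ({ page }) => {
  await page.goto('http://localhost:8555'); // local test app URL is an assumption

  // ...open a graph and switch to the format tab (setup helpers omitted)...

  // Compare the rendered chart against a stored baseline image.
  // '.js-plotly-plot' is an assumption about how the chart is rendered.
  await expect(page.locator('.js-plotly-plot')).toHaveScreenshot('graph-default-color.png');

  // Change the fill color (control label is an assumption), then compare
  // against a second baseline to prove the chart actually changed.
  await page.getByLabel('Fill Color').fill('#ff0000');
  await expect(page.locator('.js-plotly-plot')).toHaveScreenshot('graph-red-fill.png');
});
```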

Search and Replace

  • Try out a bunch of different searches
  • Try a bunch of replaces

Formatting

  • Formatting: set dataframe colors, testing each of the available color options
  • Formatting: test

Add Column

  • Add column to start of dataframe using context menu (a sketch follows this list)
  • Add column to end of dataframe using context menu
  • Add column to middle of dataframe using context menu
  • Add multiple columns to middle of dataframe using context menu
  • Add column to end of dataframe using toolbar button
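
A sketch of the first context-menu case, assuming Playwright; the fixture column name, the menu item label, and the expected new header are all assumptions.

```ts
import { test, expect } from '@playwright/test';

test('add a column to the start of the dataframe via the context menu', async ({ page }) => {
  await page.goto('http://localhost:8555'); // local test app URL is an assumption

  // Right-click the first column header to open the context menu
  // ('Column1' is an assumption about the test fixture).
  await page.getByText('Column1', { exact: true }).click({ button: 'right' });

  // Choose the insert option (menu item label is an assumption).
  await page.getByText('Insert Column Left').click();

  // A new, empty column should now exist (header name is an assumption).
  await expect(page.getByText('new-column-1')).toBeVisible();
});
```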

Formulas

  • Formulas: basic constants
  • Formulas: switch from cell editor to formula bar and back
  • Formulas: cross sheet with VLOOKUP
  • Formulas: rolling ranges, etc
  • Formulas: edit entire column turned off
  • Formulas: clicking a cell inserts a reference into the formula being edited
  • Formulas: clicking on a header
  • Formulas toolbar: test that custom formulas show up
  • Formulas toolbar: test that the built-in formulas show up
  • Formulas toolbar: test that selecting a formula inserts it

Column Header

  • Column header: right click opens the context menu
  • Column header: context menu items work
  • Column header: clicking on the dtype opens the toolbar

Column Control Panel

  • Column control panel: changing dtype
  • Column control panel: adding filter
  • Column control panel: sorting
  • Column control panel: changing the number type
  • Column control panel: loading values
  • Column control panel: loading values of mixed type
  • Column control panel: filtering values by clicking them
  • Column control panel: loads summary stats
  • Column control panel: loads graph

Code Config

  • Code config: test generating a function
  • Code config: adding parameters (and checking the generated function has the right parameters)
  • Code config: changing the function name

Schedule Automation

  • Schedule automation: ignore for now

Toolbar Top

  • Toolbar top: undo
  • Toolbar top: redo
  • Toolbar top: clear
  • Toolbar top: steps
  • Toolbar top: fullscreen
  • Toolbar top: help button
  • Toolbar top: Mito Open Source / Mito Enterprise button

In JupyterLab

  • If you don't have a code cell, does Mito create a code cell and add the generated code to it?

Note for updating snapshots automatically: jupyter/notebook#6723

Oh yeah that's great. So they appear to be using the run number as the unique id for downloading the artifacts -- which seems legit, and we'll have to do something like this.

I don't totally understand the branching / merging model of the snapshots, I guess. They are stored as an artifact... but how?

I think before getting screenshot tests working on GitHub Actions, we 100% need some local utilities for working with them!
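
If the screenshot tests end up built on Playwright, much of the local tooling already exists: `npx playwright test --update-snapshots` regenerates the baseline images locally, and comparison tolerances can be centralised in the config. A sketch of that config section is below; the tolerance value is an assumption to be tuned.

```ts
// playwright.config.ts (sketch)
import { defineConfig } from '@playwright/test';

export default defineConfig({
  expect: {
    toHaveScreenshot: {
      // Allow a little anti-aliasing noise between machines/OSes;
      // the exact tolerance is an assumption to be tuned.
      maxDiffPixelRatio: 0.01,
    },
  },
});
```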

Some more test ideas:

Signup Flow

  • When on open source, the signup flow appears if you don't have an email in user.json
  • When on open source, the signup flow does not appear if you have an email in user.json
  • When on Pro, the signup flow does not appear
  • When on Enterprise, the signup flow does not appear

Another testing issue: #1153