konveyor/tackle

[RFE] - [RFE for upcoming 2.0] Upload analysis results/report to associated git repo

Opened this issue · 1 comment

What is your persona?

I'm an architect tasked with large-scale application modernisation engagements.

What is your story / use case?

As an architect, I want to share the analysis report with the development teams, so that the developers have access to the reports from their project repository.

What is your specific intent, not necessarily the feature you want, but what are
you trying to achieve?

If an analysis is done "remotely" (the application is configured to download the code from a git repo for analysis), it would be nice to upload the report into that same git repo (into a configurable directory, either on main or by automatically creating a tackle-<timestamp> branch).
This would allow developers to access the report without having to access Tackle.
Furthermore, it would also allow for a delta comparison of reports between application versions (since the report is versioned along with the application source).
How the number of story points changed between 0.0.1 and 0.0.5, for example, would be an easy comparison in git, since developers would now have a versioned report alongside the code that has been analysed.
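Once such metrics are versioned in a machine-readable form, the delta comparison becomes trivial. A minimal sketch in Python, assuming a hypothetical tackle-report.json with a couple of top-level metrics (the field names are illustrative, not an actual Tackle schema):

```python
import json

# Hypothetical example: two versioned report snapshots, as they might
# appear on the 0.0.1 and 0.0.5 tags of an application repository.
# The field names are illustrative, not an actual Tackle report schema.
report_v1 = json.loads('{"story_points": 120, "issues": 45}')
report_v5 = json.loads('{"story_points": 80, "issues": 30}')

def metric_delta(old: dict, new: dict) -> dict:
    """Compute per-metric differences between two report snapshots."""
    return {key: new[key] - old[key] for key in old.keys() & new.keys()}

delta = metric_delta(report_v1, report_v5)
print(delta)  # e.g. {'story_points': -40, 'issues': -15} (key order may vary)
```

The same comparison could of course be done directly with `git diff` on the text files, but a structured format lets tooling chart the trend over many versions.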

How does your immediate need fit into your bigger picture? What's the overall
benefit?

  • Ease of use (access to the report directly from your repo)
  • Version controlled "snapshots" of reports associated with their code versions
  • Ability to compare / visualise progress over time (comparison of reports across branches/versions/tags)

Do you have a suggestion for implementation?

Since Tackle 2.0 will incorporate the git/svn download/clone of code, "uploading" aka adding/committing the reports shouldn't be too much of an effort, I hope.
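As a rough sketch of that commit-back step (the function name, branch naming scheme, and directory layout here are assumptions for illustration, not Tackle's actual implementation):

```python
import subprocess
import time

def publish_report(workdir: str, report_dir: str = "report") -> str:
    """Commit a generated report directory to a new tackle-<timestamp> branch.

    Hypothetical sketch: assumes `workdir` is the already-cloned application
    repository and the report files have been written under
    `workdir/<report_dir>` by the analysis run.
    """
    branch = f"tackle-{time.strftime('%Y%m%d%H%M%S')}"

    def git(*args: str) -> None:
        # Run a git subcommand inside the cloned repository.
        subprocess.run(["git", "-C", workdir, *args],
                       check=True, capture_output=True)

    git("checkout", "-b", branch)       # create the tackle-<timestamp> branch
    git("add", report_dir)              # stage the generated report files
    git("commit", "-m", "Add Tackle analysis report")
    # A real implementation would also `git push origin <branch>` here,
    # using the credentials already configured for the clone.
    return branch
```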
Additionally, to make diffs/comparisons easier, a simple text representation of the report dashboard would be nice, so a git compare would show changes in the key metrics (number of issues, story points, percentage in different areas, etc.) rather than a diff of the generated HTML.
Finally, an analysis config text file containing the analysis parameters would need to accompany the uploaded report, so developers could see which analysis was actually performed and would not compare results from two different analysis targets.
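For illustration, a colocated, diff-friendly summary file combining both suggestions (parameters plus key metrics) might look something like this; every field name below is hypothetical, not an existing Tackle or Windup format:

```yaml
# tackle-report-summary.yaml (hypothetical format)
analysis:
  tool: windup
  target: eap7          # migration target the analysis was run against
  rulesets: default
  date: 2021-11-02
metrics:
  issues: 45
  story_points: 120
  incidents_by_category:
    mandatory: 12
    optional: 28
    potential: 5
```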


@PhilipCattanach we should start looking into enabling the Windup CLI to provide an overview of the execution results aside from the reports themselves. A set of high level metrics should be fine, maybe collocated with the HTML reports in a JSON or YAML file. This would enable both what Markus is requesting and the analysis history/burndown charts feature we were discussing some time ago. @jortel what do you think?