Code coverage report creation fails for big schema
We have an enterprise application here with more than 1,000 packages, functions and procedures, and over 1 million LoC in one of the schemas.
We have created one fresh test for now and will have to convert our v2 tests later.
We are just starting with utPLSQL v3 and want to measure code coverage too.
I have tried this CLI in Jenkins, but after 10 minutes the build fails. By that point, about 180 MB of the HTML report file has been written.
If I run the CLI without coverage, the build succeeds in 4 seconds.
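For reference, a minimal sketch of what such a build step might look like (connection string and file names are placeholders, not taken from this issue; ut_documentation_reporter is just an example test-output reporter):

```sh
# without coverage - completes in a few seconds
utplsql run app_user/app_pass@db_host:1521/orcl \
  -f=ut_documentation_reporter

# with the HTML coverage reporter added - the slow, large-output variant
utplsql run app_user/app_pass@db_host:1521/orcl \
  -f=ut_documentation_reporter \
  -f=ut_coverage_html_reporter -o=coverage.html
```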
Can you share the error message with which the build fails? And which version of the CLI are you using?
We are currently working on a first stable release of utPLSQL-cli.
What you are describing might be an issue with connection timeout (#38), which should be solved in the current pull request.
I tried the latest development version, utPLSQL-cli-develop-201711161046.zip.
There is no error shown in the output or log.
But I get an exit code 1 from the call.
There are two things here:
- scope of coverage
- size of project
Do your project files cover the whole DB schema?
If the project is smaller than the schema:
Do you want to report coverage on the whole DB schema, or do you want to limit the coverage to your project files?
The HTML reporter is in general good for small projects. For a big project, the file will become really big.
If you're running against a large project, integration with Sonar or Coveralls is more lightweight - it reports only line numbers, not the content of the lines.
This is also a better option from a source-code security point of view - the code is not taken out of the DB into the report.
It is best to use the feature of mapping project files to DB objects. We use it for publishing to Sonar and Coveralls in utPLSQL self-testing.
see: https://github.com/utPLSQL/utPLSQL/blob/develop/test/install_and_run_tests.sh
and https://github.com/utPLSQL/utPLSQL/blob/develop/sonar-project.properties
as well as coverage documentation and our Travis build results and sonar results.
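As a rough sketch of what that kind of invocation could look like, based on the self-testing scripts linked above (connection string, paths and output file names are placeholders):

```sh
# write coverage as line-number-only reports for Sonar and Coveralls
# instead of the full HTML report
utplsql run app_user/app_pass@db_host:1521/orcl \
  -source_path=source \
  -f=ut_coverage_sonar_reporter -o=coverage_sonar.xml \
  -f=ut_coveralls_reporter -o=coverage_coveralls.json
```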
Also:
I have a plan to improve the performance of reporters (output buffer). That could help a bit with timing.
The 10 minutes could be a timeout limit on your database for long-running queries, or it could be related to the client JDBC settings.
The timeout is something I already raised with @viniciusam and I think it is fixed in develop.
We plan to have official release of utPLSQL-cli soon.
Huge thanks to both @pesse and @viniciusam who did a lot of work to get us to that point.
Our project files = this schema and another, smaller, whole DB schema!
BTW, I found no call to map both of these together.
We don't have Sonar or Coveralls at the moment here; I think this will need some time.
But I will look into mapping after DOAG2017.
To map your project files to DB objects, use the parameter -source_path.
If your project files are distributed across multiple DB schemas, we expect by default the following naming convention:
- owner.name.pkb - package body
- owner.name.tbp - type body
- owner.name.trg - trigger
- owner.name.fnc - function
- owner.name.prc - procedure
Default extension mapping is defined here: https://github.com/utPLSQL/utPLSQL/blob/develop/source/core/ut_file_mapper.pkb#L39
The regex pattern defines which part of the file path contains the information about object owner, object name and object type.
If your project structure/naming matches the default definitions, you're all set. If not, you need to override the default configuration by providing a new regex and identifying the subexpression positions in the regex for owner, object name and object type.
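A hedged sketch of both cases (connection string and regex are purely illustrative; the override flag names are my recollection of the CLI documentation, so please verify them against your CLI version):

```sh
# default convention (owner.name.extension) - pointing at the sources is enough
utplsql run app_user/app_pass@db_host:1521/orcl \
  -source_path=source \
  -f=ut_coverage_html_reporter -o=coverage.html

# custom naming - override the regex and the subexpression positions
utplsql run app_user/app_pass@db_host:1521/orcl \
  -source_path=source \
  -regex_expression="(\w+)\.(\w+)\.(\w+)" \
  -owner_subexpression=1 -name_subexpression=2 -type_subexpression=3 \
  -f=ut_coverage_html_reporter -o=coverage.html
```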
@tkleiber
If you want to give Coveralls or Sonar coverage a try, you can share an example project structure / file names and we can help you figure out how to configure the invocation.
The generation works now; it takes 12 minutes.
The resulting HTML file is roughly 200 MB and kills the IE 11 browser most of the time.
In Chrome it opens, but again takes a lot of time.
Maybe it would be possible to modularize this into a main HTML file and lots of small pieces for the code popups?
Hm, that would mean outsourcing parts of the report-generation logic to the CLI, I guess. You probably won't be able to create such a report from PL/SQL alone.
Any thoughts on this, @jgebal?
Yes. One reporter = one output stream.
If we went that way, we would make the PL/SQL call to the HTML reporter useless.
For a large codebase it is best to use Sonar/Coveralls reporting. Those reports do not contain the source of the covered code, only line numbers.
@tkleiber do you have sonar or any similar tools?
We could think of other common coverage formats to support. I didn't investigate any other options for now, but there are multiple standard coverage formats. It all depends on where and how you want to display them.
It is best to get the maximum outcome with minimal effort. That would mean that for Jenkins etc. we should be able to find a coverage format that can be displayed in the CI without the need to pull sources into the report.
At the moment we don't have Sonar. This needs some time to set up internally; as a bank we cannot use a cloud environment here.
What CI are you using? How do you report coverage on code other than PL/SQL?
Answers to those questions can give some options.
Jenkins and its plugins.
The JaCoCo plugin.
OK.
I've heard about JaCoCo a bit here and there. It seems to be quite a common format.
utPLSQL/utPLSQL#557 created for this
It could be possible to reduce the generated size and support 'multiple files' if the coverage output were pushed into a zip file... using something like https://technology.amis.nl/2010/03/13/utl_compress-gzip-and-zlib/
That produces a BLOB, so it may only be useful for utl_file output or via the custom CLI.
Interesting idea. The reporter could create a multi-file zip as a BLOB, and we could then unzip it on the utPLSQL-cli side.
That would require:
- new (binary) output buffer
- rebuild of the reporter
- rebuild of CSS/HTML template
- some very specific code on the client-side to react to the specific reporter format.
Quite a bit of work to be done (sounds like an Epic). It would be an interesting challenge.
I've already managed to generate a 200 MB HTML report at work.
It opens in Chrome, but takes quite a while to load.
I have now updated to utPLSQL v3.1.1 and utPLSQL-cli v3.1.0. The file is still created successfully, but it can no longer be opened in IE or in Chrome. Only the broken loading icon / text is shown on the page.
Is the "_assets" folder being created?
That's one major change in cli 3.1.0.
How big is the resulting report-file?
The new folder raises a problem when directories are used in the -o switch, e.g.:
-o=results/ut_coverage_html_reporter.html
The resulting ut_coverage_html_reporter.html then also contains the reference to this directory, which is wrong:
results/ut_coverage_html_reporter.html_assets/application.js
It should instead be:
ut_coverage_html_reporter.html_assets/application.js
as it must be relative to the HTML file location.
As a workaround, I now change into the directory before calling the CLI and use -o without directories.
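A small sketch of that workaround as a shell step (connection string is a placeholder):

```sh
# run the CLI from inside the target directory so -o contains no path;
# the generated "..._assets" references then stay relative to the HTML file
mkdir -p results
cd results
utplsql run app_user/app_pass@db_host:1521/orcl \
  -f=ut_coverage_html_reporter -o=ut_coverage_html_reporter.html
```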
Great finding, thanks a lot!
This will be fixed in the upcoming 3.1.1 utPLSQL-cli release.