canimus/cuallee

[JOSS REVIEW] Automatic testing is failing

devarops opened this issue Β· 4 comments

Hi @canimus,

The test snowpark_dataframe/test_are_complete.py::test_positive ends with an error.

ERROR test/unit/snowpark_dataframe/test_are_complete.py::test_positive - ValueError: snowpark did not yield a value

Please pass the tests in this pipeline.

Hi @devarops, in order to test the integration scenarios, it is required to provide a Snowflake account through a series of environment variables (a sketch of how a test session is built from them follows the list):

  • SF_ACCOUNT
  • SF_USER
  • SF_PASSWORD
  • SF_ROLE
  • SF_WAREHOUSE
  • SF_DATABASE
  • SF_SCHEMA
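For reference, a Snowpark test session is typically created from these variables along the following lines. This is a minimal sketch assuming the `snowflake-snowpark-python` package, not necessarily the exact fixture used in cuallee's conftest:

```python
# Illustrative sketch: build a Snowpark session from the SF_* environment variables above.
import os

import pytest
from snowflake.snowpark import Session  # requires snowflake-snowpark-python


@pytest.fixture(scope="session")
def snowpark():
    settings = {
        "account": os.environ["SF_ACCOUNT"],
        "user": os.environ["SF_USER"],
        "password": os.environ["SF_PASSWORD"],
        "role": os.environ["SF_ROLE"],
        "warehouse": os.environ["SF_WAREHOUSE"],
        "database": os.environ["SF_DATABASE"],
        "schema": os.environ["SF_SCHEMA"],
    }
    # Missing or invalid credentials make session creation (and the tests) fail.
    return Session.builder.configs(settings).create()
```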

The same will happen with BigQuery: a GOOGLE_CREDENTIALS file with the JSON key is required. Perhaps we should rather move these tests into an integration folder, and leave the unit folder for those that can be executed in full isolation (one way to bridge the gap in the meantime is sketched below).
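Until such a reorganization happens, one option (purely illustrative, not what the repository currently does) is to skip the credential-dependent tests whenever the variables are absent, so the pipeline stays green for contributors without a Snowflake or BigQuery account:

```python
# Illustrative only: skip Snowflake-backed tests when credentials are not configured.
import os

import pytest

MISSING_SNOWFLAKE = not all(
    os.getenv(var) for var in ("SF_ACCOUNT", "SF_USER", "SF_PASSWORD")
)


@pytest.mark.skipif(MISSING_SNOWFLAKE, reason="Snowflake credentials not available")
def test_positive(snowpark):
    # The real assertions against the Snowpark dataframe would go here.
    ...
```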
As the submission made initial reference to the pyspark solution, and in fact the rest of the tests are simply organized the same way as the ones for Spark, running pytest test/unit/pyspark_dataframe should be enough to exercise the suite in isolation.

Solved in #211

Excellent job, @canimus! πŸ™ŒπŸΎπŸ₯³πŸŽ‰

How did you manage to solve it?

@devarops there is a free tier for a Snowflake account that expires every month. Because of this limitation, the credentials in the GitHub Actions secrets have to be renewed every month. In principle, unit tests should mock the interfaces, but we found it more representative to actually run the tests against the real systems, especially when there is no additional cost to do so.
So the fix on the pipeline was literally just changing the credential references to a new "active" trial account in Snowflake.