Provide conversion methods from DataTable to scala types
gaeljw opened this issue · 1 comments
Related to #50
I would like to have convenience methods for converting a DataTable to Scala types:
```scala
Given("My expression") { (table: DataTable) =>
  table.asScalaMaps
  // Instead of something like:
  table.asMaps().asScala.map(_.asScala)
}
```
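For reference, a minimal sketch of what such a conversion could do internally, shown on plain Java collections so it runs without a Cucumber dependency (DataTable.asMaps() returns a java.util.List of java.util.Map[String, String]); the standalone helper name is hypothetical:

```scala
import scala.jdk.CollectionConverters._
import java.util.{List => JList, Map => JMap}

// Hypothetical helper mirroring what table.asScalaMaps could do:
// convert the Java List[Map] shape returned by DataTable.asMaps()
// into immutable Scala collections.
def asScalaMaps(javaMaps: JList[JMap[String, String]]): Seq[Map[String, String]] =
  javaMaps.asScala.toSeq.map(_.asScala.toMap)
```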
Also considering that DataTable can contain null values but null is not expected in Scala, we should probably filter out the null values.
In the example of a table read as List[Map[K,V]] this would lead to inconsistent maps for each row: not all the maps would have the same keys.
In the example of a table read as List[List[V]] the issue is even more visible: each row could be represented with a list of a different size.
If people read row cells by index, like row(2), this would break things: the index would no longer necessarily match the column number of the table.
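To make the inconsistency concrete, here is a small sketch (plain Scala, with hypothetical row data) of what filtering out null cells does to a table read as List[Map[K,V]]:

```scala
// Two rows of a table where the "surname" cell is empty (null) in the second row
val rows: Seq[Map[String, String]] = Seq(
  Map("name" -> "Ada", "surname" -> "Lovelace"),
  Map("name" -> "Alan", "surname" -> null)
)

// Filtering out null values yields maps with different key sets per row
val filtered = rows.map(_.filter { case (_, v) => v != null })
// filtered(0).keySet == Set("name", "surname"), but filtered(1).keySet == Set("name")
```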
The correct Scala way would be to wrap cell values in Option.
For instance, table.asScalaMaps[String,String] would actually return a Seq[Map[String, Option[String]]] rather than a Seq[Map[String, String]], converting null values to None.
Users would then use the table like this:

```scala
table.asScalaMaps[String, String] // Seq[Map[String, Option[String]]]
  .map { row =>
    val name = row("name").getOrElse("")
    // Or, a bit uglier, if you are not sure the key is always defined:
    val surname = row.get("surname").flatten.getOrElse("")
  }
```

I'll take a look at existing codebases using Cucumber Scala, but I believe this solution has few drawbacks.
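A minimal sketch of the Option-wrapping conversion described above, again on plain Java collections and with a hypothetical helper name (Option(v) maps null to None):

```scala
import scala.jdk.CollectionConverters._
import java.util.{List => JList, Map => JMap}

// Hypothetical helper: convert DataTable-style Java rows to Scala,
// wrapping each cell in Option so null cells become None instead of
// being silently dropped.
def asScalaOptionMaps(javaMaps: JList[JMap[String, String]]): Seq[Map[String, Option[String]]] =
  javaMaps.asScala.toSeq.map(_.asScala.toMap.map { case (k, v) => (k, Option(v)) })
```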