dwhjames/datomisca

NullPointerException where mapped class references same class

Closed this issue · 8 comments

Hi!

I'm pretty sure I found a bug, but I would love to be wrong about this :). The bug seems to stem from having a case class that references another instance of the same class. Take the following example:

  case class Price(uuid: UUID, amount: Double, previousPrice: Option[Price])

Notice how previousPrice is an Option of type Price. Now, if I map everything the normal way and add a newPrice method to create new Price entities...

import java.util.UUID

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

import datomisca._

object Price {

  object Schema {

    object ns { val price = Namespace("price") }

    val uuid = Attribute(ns.price / "uuid", SchemaType.uuid, Cardinality.one).withUnique(Unique.identity)
    val amount = Attribute(ns.price / "amount", SchemaType.double, Cardinality.one)
    val previousPrice = Attribute(ns.price / "previousPrice", SchemaType.ref, Cardinality.one)

    val all = List(uuid, amount, previousPrice)
  }

  implicit val reader: EntityReader[Price] = (
      Schema.uuid.read[UUID] and
      Schema.amount.read[Double] and
      Schema.previousPrice.readOpt[Price]
    )(Price.apply _)


  def newPrice(amount: Double, previousPrice: Option[Price])(implicit conn: Connection): Future[Price] = {

    val tempId = DId(Partition.USER)
    val addTxn: AddEntity = (
      SchemaEntity.newBuilder
        += (Schema.uuid -> Datomic.squuid())
        += (Schema.amount -> amount)
        +?= (Schema.previousPrice -> previousPrice.map(pp => LookupRef(Schema.uuid, pp.uuid)))
      ) withId tempId

    Datomic.transact(addTxn) map { r =>

      val entityId = r.resolve(tempId)
      val entity = r.dbAfter.entity(entityId)
      DatomicMapping.fromEntity[Price](entity)
    }
  }
}

So far, so good. But if I run the following unit tests...

  "A price" should {

    "be created once" in new WithDB {

      (for{
        _ <- Datomic.transact(Price.Schema.all)
        price <- Price.newPrice(123.00, None)

      } yield {

        price.amount must beEqualTo(123.00)
      }).await
    }

    "be created twice" in new WithDB {

      (for{
        _ <- Datomic.transact(Price.Schema.all)
        price1 <- Price.newPrice(123.00, None)
        price2 <- Price.newPrice(456.00, Some(price1))
      } yield {

        (price1.amount must beEqualTo(123.00)) and (price2.amount must beEqualTo(456.00))
      }).await
    }
  }

The first test (which does not tie in a previous price) passes, but the second test fails with the dreaded NullPointerException. This is the stack trace I get:

[error]    NullPointerException:   (attribute2EntityReader.scala:253)
[error] datomisca.Attribute2EntityReaderCast$$anon$25$$anon$11.read(attribute2EntityReader.scala:253)
[error] datomisca.package$RichAttribute$$anonfun$readOpt$extension$1.apply(package.scala:148)
[error] datomisca.package$RichAttribute$$anonfun$readOpt$extension$1.apply(package.scala:146)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderFunctor$$anonfun$fmap$1.apply(entityMapper.scala:81)
[error] datomisca.EntityReader$EntityReaderFunctor$$anonfun$fmap$1.apply(entityMapper.scala:81)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.DatomicMapping$.fromEntity(DatomicMapping.scala:26)

Printing out the entity produced by the line val entity = r.dbAfter.entity(entityId) seems to give me exactly the entity I would expect, but then it crashes on the next line.

BTW, for the record, I'm aware of how silly this example is in a database that keeps history. But trust me when I say that my real example needs to be like this :).

I'm pretty sure this is a Scala issue, not a Datomisca issue.

Try:

implicit lazy val reader: EntityReader[Price] =

You are trying to construct a recursive object, but you're constructing it as a val, not a lazy val. When Scala initializes the reader, the reader references itself before it is fully constructed, hence the NullPointerException. Similar things happen when the definition of a val references a name that is defined lexically later.
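A minimal sketch of the same trap, with no datomisca involved: vals in a template body are initialized in order, and one that is read before its own initializer has run still holds its default value, which is null for reference types.

  object Demo {
    case class Node(next: Option[Node], label: String)

    // `n` is read while it is still being initialized,
    // so Some(n) captures null
    val n: Node = Node(Some(n), "self")
  }

  Demo.n.next            // Some(null)
  Demo.n.next.get.label  // NullPointerException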

Good point! Unfortunately, changing it to a lazy val causes a StackOverflowError instead, but at least I now understand the problem.

Hmm… I don't immediately see why this would result in a StackOverflowError.

Btw, TxReport has a resolveEntity method, so you could do:

val entity = r.resolveEntity(tempId)

instead.
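In context, the transact callback in newPrice would then shrink to:

  Datomic.transact(addTxn) map { r =>
    DatomicMapping.fromEntity[Price](r.resolveEntity(tempId))
  }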

Yeah, that was kind of surprising, but it's happening.

java.lang.StackOverflowError
    at models.PriceSpec$Price$.reader(PriceSpec.scala:34)
    at models.PriceSpec$Price$.reader$lzycompute(PriceSpec.scala:38)
    at models.PriceSpec$Price$.reader(PriceSpec.scala:34)
    at models.PriceSpec$Price$.reader$lzycompute(PriceSpec.scala:38)
       ...
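The trace shows the cycle directly: reader forces reader$lzycompute, which forces reader again. A Scala lazy val that re-enters its own initializer on the same thread recurses until the stack blows, and the combinators here force the reader while it is still being constructed. A two-line sketch reproduces it without datomisca:

  object Demo {
    // forcing `xs` evaluates `1 :: xs`, which forces `xs` again
    // before the initializer has finished
    lazy val xs: List[Int] = 1 :: xs
  }

  Demo.xs  // java.lang.StackOverflowError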

I wonder if there is something I can do with the Builders, perhaps making the arguments to the and and ~ combinators by-name?

Here's another idea (I haven't tried to type it up and compile it):

  implicit lazy val reader: EntityReader[Price] = (
      Schema.uuid.read[UUID] and
      Schema.amount.read[Double] and
      Schema.previousPrice.readOpt[Entity].map(_.map(e => reader.read(e)))
    )(Price.apply _)

which should delay when the reader is actually needed, compared to previousPrice.readOpt[Price] (cf. readOpt).
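The same effect can be seen in plain Scala, independent of datomisca: a lazy val may refer to itself from inside a function body, because that reference is only evaluated when the function is applied, after initialization has finished.

  object Demo {
    // fine: the self-reference sits inside a lambda,
    // which is only evaluated when the function is called
    lazy val countdown: Int => Int =
      n => if (n <= 0) 0 else countdown(n - 1) + 1

    // would overflow: forcing this re-enters its own initializer
    // lazy val loop: Int => Int = loop
  }

  Demo.countdown(3)  // 3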

You da man @dwhjames! That worked like a charm. What do you recommend as far as this ticket is concerned? Close it? Want me to add some documentation somewhere and close? Perhaps I can try implementing a lazyReadOpt that does the same thing internally?

I'm not sure if a lazyReadOpt is possible… you could try making the implicit argument call-by-name, but I have no idea if that would work, or if Scala even allows that.

The other option would be to make the and combinator more lazy, but that would be a very invasive change.
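That said, a lazyReadOpt could be sketched with an explicit by-name parameter instead of a by-name implicit, which sidesteps the question entirely (untested, and the signature is an assumption, not actual datomisca API):

  // take the recursive reader by-name, so it is only forced at read time
  def lazyReadOpt[A](refReader: EntityReader[Option[Entity]])
                    (recursive: => EntityReader[A]): EntityReader[Option[A]] =
    refReader.map(_.map(e => recursive.read(e)))

  // usage, mirroring the workaround above:
  //   lazyReadOpt(Schema.previousPrice.readOpt[Entity])(reader)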

Previously, when I’ve read a recursive structure, I’ve done it in a recursive method, not in a recursive entity reader. I guess the latter starts to hit the limits of the reader.
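In sketch form, that looks like plain structural recursion in a def, which is safe because methods, unlike vals, are not initialized eagerly (the Raw shape here is an assumption standing in for raw entity access, not datomisca API):

  case class Raw(uuid: UUID, amount: Double, previous: Option[Raw])

  // recursion in a method body is fine: nothing is forced at definition time
  def toPrice(raw: Raw): Price =
    Price(raw.uuid, raw.amount, raw.previous.map(toPrice))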

Yeah, in my case, what I was planning to do before your suggested workaround was to simply store the UUID of the parent entity in the case class, instead of the entity itself, and then manually "inflate" the parent entities as I walked up the tree. That would have been a perfectly fine and reasonable approach, though now with your workaround I'm happier, because I get to do less work.
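In sketch form (assumed shapes, since I never ended up writing it), that plan would have looked something like:

  // store only the parent's UUID, not the parent itself
  case class FlatPrice(uuid: UUID, amount: Double, previousUuid: Option[UUID])

  // walk up the chain, resolving each parent on demand; `lookup` stands in
  // for a query against the unique :price/uuid attribute
  def ancestors(p: FlatPrice, lookup: UUID => Option[FlatPrice]): List[FlatPrice] =
    p.previousUuid.flatMap(lookup) match {
      case Some(parent) => parent :: ancestors(parent, lookup)
      case None         => Nil
    }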

Thanks again for the help!