rockneurotiko/ex_gram

Parsing JSON to proper types

a3kov opened this issue · 7 comments

a3kov commented

I am working on adding proper types for Bot API data using Ecto embedded schemas. Then it would be possible to parse JSON straight to model structs, and also to validate data before making requests.
Before I start working on a pull request, I want to ask: are you open to adding this feature to ex_gram?
I am going to use polymorphic_embed for generics.
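
For illustration, a minimal sketch of the idea, assuming Ecto is available (the module name, fields, and `parse/1` helper are hypothetical, loosely modeled on the Bot API `User` object):

```elixir
# Hypothetical sketch: parsing decoded Bot API JSON into a typed struct
# with an Ecto embedded schema, validating it along the way.
defmodule MyBot.Model.User do
  use Ecto.Schema
  import Ecto.Changeset

  @primary_key false
  embedded_schema do
    field(:id, :integer)
    field(:is_bot, :boolean)
    field(:first_name, :string)
    field(:username, :string)
  end

  @fields [:id, :is_bot, :first_name, :username]

  # Turn a decoded JSON map into {:ok, %MyBot.Model.User{}} or {:error, changeset}.
  def parse(params) do
    %__MODULE__{}
    |> cast(params, @fields)
    |> validate_required([:id, :is_bot, :first_name])
    |> apply_action(:parse)
  end
end
```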

a3kov commented

Just to let you know, I got this mostly working in my project. However, I am using a heavily modified fork, and to merge the changes back into ex_gram I would need to make some changes to ex_gram so I can run it without forking.
The original idea behind this issue was to contribute back and to reduce the amount of code I need to maintain. I still need to evaluate the amount of work involved, and I need your approval for this feature.

I would love to see the result before deciding. Can you publish your forked repo so I can take a look?

a3kov commented

I am afraid that's not possible: I integrated it into my project's code, so I don't run it as a library. But basically it involves modifying the model macro and adding macro helpers to generate the embedded_schema and changeset parts. I left the typespecs as is. It also uses the polymorphic_embed project, plus custom helper code for generics (unfortunately, each time Telegram updates or adds generics, this code has to be updated manually).

That's what I was afraid of. After maintaining many Telegram libraries that I had to update manually whenever the API changed, one of the main points of this library is the ability to update to new API versions without manual changes.

Besides that problem, I wasn't really sure about using Ecto: it can tie developers to Ecto, and it can lead them into bad habits, like saving the library's structs directly instead of having their own entities.

a3kov commented

But if there are changes in the Bot API spec, as an app developer you still need to update your code; you can't just expect it to magically work with the new spec. The only difference is whether the breakage happens while parsing the data or deeper inside the application, and parsing is better than randomly pattern matching on maps across your project. There's a great article on this topic.
Anyway, I will leave that for you to decide. I am fine running the fork as well.
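
A sketch of the contrast being drawn here (the `Message.parse/1` helper and the struct shapes are hypothetical):

```elixir
# Ad hoc: every call site destructures the raw map, so a Bot API change
# fails (or silently misses a clause) wherever the map is used.
def greet(%{"message" => %{"chat" => %{"id" => chat_id}}}) do
  send_text(chat_id, "hello")
end

# Parsed: the raw JSON is converted to a typed struct once, at the
# boundary; a spec change fails fast, in one place.
def handle(raw) do
  with {:ok, %Message{chat: %Chat{id: chat_id}}} <- Message.parse(raw) do
    send_text(chat_id, "hello")
  end
end
```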

Yeah, but that's why versioning exists, and with the current macros it is really easy to see the differences between versions and know whether you need to update or not.

If this were Haskell it would be much easier to provide the type information for compile-time checks. For now, the typespecs for the methods and models are properly generated, so if the developer uses Dialyzer, that's as close as Elixir gets to checking that your code fits the specification; with the Ecto schemas you still have the same problem. And if developers save the embedded schemas in a database, they will have a problem retrieving old records after a schema change, because deserialization won't work.
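
For reference, the generated models follow roughly this shape: a plain struct plus a `@type t` spec that Dialyzer can check call sites against (an illustrative reduction, not the exact generated code):

```elixir
defmodule ExGram.Model.Chat do
  # Illustrative: a struct with a typespec, which Dialyzer can use to
  # flag calls that pass or match the wrong shape.
  defstruct [:id, :type, :title]

  @type t :: %__MODULE__{
          id: integer(),
          type: String.t(),
          title: String.t() | nil
        }
end
```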

For now, I'm not at all convinced about adding Ecto schemas to the models.

a3kov commented

It will also be very easy to update the generics code in my fork. The generics code is simply hints for the polymorphic_embed library to detect child schemas based on the presence of "type" fields. But I understand it would add a maintenance burden.
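
For context, a sketch of the kind of hint meant here, using polymorphic_embed's field-based detection (module names are hypothetical, and option names may differ between polymorphic_embed versions):

```elixir
# Hypothetical sketch: telling polymorphic_embed which child schema a
# generic field should decode to, based on discriminator fields present
# in the incoming JSON.
defmodule MyBot.Model.Menu do
  use Ecto.Schema
  import PolymorphicEmbed

  @primary_key false
  embedded_schema do
    polymorphic_embeds_one(:button,
      types: [
        commands: [module: MyBot.Model.MenuButtonCommands, identify_by_fields: []],
        web_app: [module: MyBot.Model.MenuButtonWebApp, identify_by_fields: [:web_app]]
      ],
      on_type_not_found: :raise
    )
  end
end
```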
I don't buy the "let's not use Ecto embedded schemas because people will save them to the DB" argument, sorry. Data validation is exactly the reason that library exists. And people often misuse things; you can't protect them from it. The solution to that has always been to educate. If you buy a kitchen knife, you can cut yourself with it as well. Should we stop selling kitchen knives? :)

But anyway, I respect your decision. This is your pet project, so you are in a position to decide what's proper.