
Transformers with self-attention and 2D positional embedding

This is a simplified PyTorch implementation of a self-attention neural scorer with 2D positional encoding.
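
As a rough illustration of the 2D positional encoding, the sketch below applies the standard sinusoidal scheme separately to each token's x and y page coordinates and concatenates the two halves. The class name, channel split, and frequency base are assumptions made for the sketch, not necessarily what the notebook implements.

```python
import math

import torch
import torch.nn as nn


class PositionalEncoding2D(nn.Module):
    """Sinusoidal positional encoding over 2D page coordinates (hypothetical sketch).

    The first half of the channels encodes the x coordinate and the second
    half encodes the y coordinate, each with the usual sine/cosine pairs.
    """

    def __init__(self, d_model: int):
        super().__init__()
        assert d_model % 4 == 0, "d_model must be divisible by 4"
        self.d_model = d_model

    def forward(self, xy: torch.Tensor) -> torch.Tensor:
        # xy: (batch, seq_len, 2) holding normalized (x, y) positions on the page
        d_half = self.d_model // 2
        # Geometric frequency ladder, as in the original Transformer encoding
        freqs = torch.exp(
            torch.arange(0, d_half, 2, device=xy.device, dtype=xy.dtype)
            * (-math.log(10000.0) / d_half)
        )
        x_ang = xy[..., 0:1] * freqs  # (batch, seq_len, d_half // 2)
        y_ang = xy[..., 1:2] * freqs
        return torch.cat(
            [x_ang.sin(), x_ang.cos(), y_ang.sin(), y_ang.cos()], dim=-1
        )  # (batch, seq_len, d_model)
```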

Representation learning for information extraction

This model is described in an ACL 2020 paper from Google Research ("Representation Learning for Information Extraction from Form-like Documents", Majumder et al.), and the training samples are generated from the ICDAR 2019 SROIE challenge, whose data can be found here: https://github.com/Michael-Xiu/ICDAR-SROIE.


Task

The task is to extract structured information from form-like documents using a learned representation of each extraction candidate. Form-like documents such as invoices, purchase orders, tax forms, and insurance forms are difficult to read automatically because the details they carry are conveyed by spatial layout rather than by running text.
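
To make the setup concrete, here is a minimal, hypothetical PyTorch sketch of such a candidate scorer: each candidate is represented by its neighboring tokens together with their positions relative to the candidate, a self-attention encoder pools that neighborhood into one vector, and a linear head produces the score. To keep the snippet self-contained, a learned linear projection stands in for the 2D positional encoding sketched above; all names and sizes are assumptions.

```python
import torch
import torch.nn as nn


class CandidateScorer(nn.Module):
    """Scores an extraction candidate from its neighborhood (hypothetical sketch).

    A candidate (e.g. a date that might be the invoice date) is described by
    the embeddings of nearby tokens and their (x, y) offsets relative to the
    candidate. Self-attention pools the neighborhood into a single vector,
    and a linear head maps it to a match score.
    """

    def __init__(self, d_model: int = 128, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Stand-in for the 2D positional encoding: a learned projection of
        # the relative (x, y) offsets into the model dimension.
        self.pos_proj = nn.Linear(2, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, neighbor_emb: torch.Tensor, neighbor_xy: torch.Tensor):
        # neighbor_emb: (batch, n_neighbors, d_model) neighbor token embeddings
        # neighbor_xy:  (batch, n_neighbors, 2) offsets relative to the candidate
        h = self.encoder(neighbor_emb + self.pos_proj(neighbor_xy))
        return self.head(h.mean(dim=1)).squeeze(-1)  # (batch,) candidate scores


# Usage: score a batch of 8 candidates, each with 16 neighboring tokens.
scorer = CandidateScorer()
scores = scorer(torch.randn(8, 16, 128), torch.rand(8, 16, 2))
print(scores.shape)  # torch.Size([8])
```

In the referenced paper, a scorer of this kind is trained as a binary classifier over candidate-field pairs: candidates matching the ground-truth field value are positives, all others negatives.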