Poor performance when JSON data is large
Opened this issue · 2 comments
rajeevmajumdarindexnine commented
Currently I'm creating a table dynamically, and multiple rows get added dynamically (similar to Excel). The table can have millions of rows. Undo/redo works perfectly when the row count is up to 100. How can I improve undo/redo performance when the data is very large?
duncanmcdowell commented
Angular in general will struggle with datasets in the millions of rows. Your best option is to paginate and change the watched variables on page change.
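A minimal sketch of that pagination idea, assuming the rows live in a plain array: only the current page's slice is bound to the view (and watched), so the watched data stays small no matter how large the full dataset gets. The `pageOf` helper here is hypothetical, not part of Angular or Chronicle.

```javascript
// Hypothetical helper: return only the rows for one page, so the view
// (and any $watch / Chronicle snapshot) only ever sees a small slice
// of the full dataset.
function pageOf(rows, page, pageSize) {
  var start = page * pageSize;
  return rows.slice(start, start + pageSize);
}

// Example: a million-row dataset, but the watched slice stays at 100 rows.
var rows = [];
for (var i = 0; i < 1000000; i++) rows.push({ id: i });

var pageSize = 100;
var currentPage = pageOf(rows, 0, pageSize); // rows 0..99
var nextPage = pageOf(rows, 1, pageSize);    // rows 100..199
```

On page change you would reassign the watched scope variable to the new slice, so Angular only ever dirty-checks `pageSize` rows.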
rajeevmajumdarindexnine commented
@duncanmcdowell Thanks for the reply, but pagination doesn't suit my case. It seems Angular-Chronicle uses `$watch` to detect changes, and that is what costs the most performance. Is there any other way to achieve the same thing using events instead of a watch?
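For what it's worth, one event-driven alternative can be sketched in plain JavaScript (this is a hedged illustration, not Chronicle's API; `HistoryStack` and `record()` are made-up names): instead of letting a deep `$watch` compare the whole dataset on every digest, take a snapshot explicitly only when your own edit event fires, e.g. when a cell edit is committed.

```javascript
// Hypothetical event-driven history: snapshots are recorded only when
// record() is called (e.g. from an edit-commit handler), so no deep
// watch over the full dataset is needed.
function HistoryStack(initial) {
  this.past = [];      // states we can undo back to
  this.future = [];    // states we can redo forward to
  this.present = initial;
}

HistoryStack.prototype.record = function (state) {
  this.past.push(this.present);
  this.present = state;
  this.future = [];    // a new edit invalidates the redo stack
};

HistoryStack.prototype.undo = function () {
  if (this.past.length === 0) return this.present;
  this.future.push(this.present);
  this.present = this.past.pop();
  return this.present;
};

HistoryStack.prototype.redo = function () {
  if (this.future.length === 0) return this.present;
  this.past.push(this.present);
  this.present = this.future.pop();
  return this.present;
};

// Usage: call record() from the edit handler instead of watching.
var history = new HistoryStack([{ id: 0, value: 'a' }]);
history.record([{ id: 0, value: 'b' }]);
history.undo(); // present is the 'a' state again
```

For millions of rows, storing full snapshots is still expensive, so in practice you would likely record per-edit diffs (row id plus old/new value) rather than whole copies of the table; the stack mechanics stay the same.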