enricoros/big-AGI

[BUG] slow UI / loading with a conversation that has large context

Opened this issue · 8 comments

Description

I loaded up a bunch of data into context (only ~80k tokens), and reloading the page with that conversation takes 10-15 seconds. After that it's pretty normal.

The context is basically spread over 2 large messages, plus another 3-4 messages of a couple of sentences each.

Device and browser

Linux / Firefox 127

Screenshots and more

No response

Willingness to Contribute

  • 🙋‍♂️ Yes, I would like to contribute a fix.

@dogmatic69 are you running in prod or dev mode? If in dev, there are many time penalties (e.g. React Strict Mode = 2x the time). If you're on big-AGI.com, then it's a prod build.
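
As a concrete illustration of the dev-only penalty (this is standard React 18 behavior, not big-AGI code): components under StrictMode are deliberately rendered twice in development to surface side effects, so dev timings are roughly double a production build. A minimal sketch:

import { StrictMode } from 'react';
import { createRoot } from 'react-dom/client';

// Trivial stand-in component, just for illustration.
function App() {
  return <p>hello</p>;
}

// In development, StrictMode double-invokes component renders (and, in
// React 18, mounts effects twice) to surface impure renders; production
// builds skip this extra work, so the same page renders roughly 2x faster.
createRoot(document.getElementById('root')!).render(
  <StrictMode>
    <App />
  </StrictMode>,
);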

I have conversations of around 20k tokens which can take 1s to load, and I'd expect ~4s for 80k (and mostly it's text layout by the browser).

Could you try to download the message and upload it to big-AGI.com (even in an incognito window), to see whether it's faster there?

Finally, I profile with Chrome but not Firefox; I'm not sure if anyone has done profiling there.

I am running the included Docker image as-is; I'd assume that is production mode, as it follows what is in the docs for production.

The total load time is ~40 seconds. I haven't found a way to look at the exact code yet, but there is one function in the profile taking 35 seconds.

[screenshot: profiler output showing the 35-second function]

I found the function at this point; something to do with markdown formatting?

function eP(e) {
    let t = e.length,
        n = !1;
    for (; t--;) {
        let r = e[t][1];
        if (("labelLink" === r.type || "labelImage" === r.type) && !r._balanced) {
            n = !0;
            break
        }
        if (r._gfmAutolinkLiteralWalkedInto) {
            n = !1;
            break
        }
    }
    return e.length > 0 && !n && (e[e.length - 1][1]._gfmAutolinkLiteralWalkedInto = !0), n
}
...

    ex = {
        tokenize: function(e, t, n) {
            let r = this;
            return function(t) {
                return 87 !== t && 119 !== t || !eO.call(r, r.previous) || eP(r.events) ? n(t) : (e.enter("literalAutolink"), e.enter("literalAutolinkWww"), e.check(eg, e.attempt(ey, e.attempt(eb, i), n), n)(t))
            };

            function i(n) {
                return e.exit("literalAutolinkWww"), e.exit("literalAutolink"), t(n)
            }
        },
        previous: eO
    },
    ew = {
        tokenize: function(e, t, n) {
            let r = this,
                i = "",
                l = !1;
            return function(t) {
                return (72 === t || 104 === t) && eT.call(r, r.previous) && !eP(r.events) ? (e.enter("literalAutolink"), e.enter("literalAutolinkHttp"), i += String.fromCodePoint(t), e.consume(t), a) : n(t)
            };

            function a(t) {
                if ((0, o.jv)(t) && i.length < 5) return i += String.fromCodePoint(t), e.consume(t), a;
                if (58 === t) {
                    let n = i.toLowerCase();
                    if ("http" === n || "https" === n) return e.consume(t), u
                }
                return n(t)
            }

            function u(t) {
                return 47 === t ? (e.consume(t), l) ? s : (l = !0, u) : n(t)
            }

            function s(t) {
                return null === t || (0, o.Av)(t) || (0, o.z3)(t) || (0, o.B8)(t) || (0, o.Xh)(t) ? n(t) : e.attempt(ey, e.attempt(eb, c), n)(t)
            }

            function c(n) {
                return e.exit("literalAutolinkHttp"), e.exit("literalAutolink"), t(n)
            }
        },
        previous: eT
    },
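
For readability, this looks like the minified form of the GFM autolink-literal check inside micromark (pulled in by remark-gfm, which turns bare www./http(s) text into links). A de-minified sketch of the eP function above, with reconstructed names (they are guesses, not taken from the bundle):

type Token = {
  type: string;
  _balanced?: boolean;
  _gfmAutolinkLiteralWalkedInto?: boolean;
};
type Event = [kind: string, token: Token];

// Called for every autolink candidate ('w'/'W', 'h'/'H') while parsing; it
// scans the tokenizer's event list backwards looking for an unbalanced
// link/image label, which would suppress the autolink.
function previousUnbalanced(events: Event[]): boolean {
  let index = events.length;
  let result = false;
  while (index--) {
    const token = events[index][1];
    if ((token.type === 'labelLink' || token.type === 'labelImage') && !token._balanced) {
      result = true;
      break;
    }
    // A flag left by an earlier walk lets later calls stop early.
    if (token._gfmAutolinkLiteralWalkedInto) {
      result = false;
      break;
    }
  }
  // Mark the last event so the next walk can short-circuit at this point.
  if (events.length > 0 && !result) {
    events[events.length - 1][1]._gfmAutolinkLiteralWalkedInto = true;
  }
  return result;
}

Whatever the exact complexity, this backwards walk over the parser's event list is what shows up hot in the profile above for a single very large message.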

I tried to wrap the content of the large message in triple backticks to avoid the above, and now the app is completely broken.

[screenshot: the app in its broken state]

@dogmatic69 it's likely the markdown renderer (which is react-markdown + remark-gfm, the standard combo). There must be something exponential in their layout algorithms.
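
If it helps to reproduce this outside the app, here is a minimal sketch using the same combo (the component name and the synthetic text are made up for the test, and the repeat count is only a rough stand-in for an ~80k-token message):

import ReactMarkdown from 'react-markdown';
import remarkGfm from 'remark-gfm';

// One very large, mostly plain-text message with occasional autolink
// candidates, similar in spirit to the conversation reported above.
const bigMessage = 'some plain text mentioning www.example.com now and then. '.repeat(20_000);

// Rendering this single component should show whether the slowdown sits in
// the markdown parse/render path or elsewhere.
export function BigMessageRepro() {
  return <ReactMarkdown remarkPlugins={[remarkGfm]}>{bigMessage}</ReactMarkdown>;
}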

Are you able to edit the localStorage of the application as-is (broken), and set localStorage > app-ui > renderMarkdown: false?
[screenshot: devtools localStorage view of the app-ui entry]

This should disable the markdown renderer, and the app may come back to life.
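
If editing the value by hand in devtools is fiddly, the same change can be made from the browser console. A sketch, assuming the persisted 'app-ui' entry is a JSON blob shaped like { state: { renderMarkdown: boolean, ... } } as in the screenshot:

// Run in the devtools console on the big-AGI origin.
const raw = localStorage.getItem('app-ui');
if (raw) {
  const store = JSON.parse(raw);
  store.state.renderMarkdown = false;  // turn the markdown renderer off
  localStorage.setItem('app-ui', JSON.stringify(store));
  location.reload();                   // reload so the app picks it up
}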

I did get it working by removing something in local storage. Lost my folders though :/

The lack of deep URL links made things worse, as I had no way to navigate away from that broken chat.

I just turned off markdown and revisited that chat, and it's the same problem.

Good idea about the deep links; I'll move to storing the current chat ID in the URL. It would be a good workaround in extreme cases. Let me know if you can share a problematic file (Ctrl+S in the app to save it, and then you can upload it here) so I can profile it. Note that I use the Chrome profiler for dev.
The recursion bug is also terrible - I would love to repro that.
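
A rough sketch of what the URL side of that workaround could look like (the helper names and the chatId query parameter are hypothetical, not the actual big-AGI routing):

// Hypothetical helpers: keep the active chat ID in the URL so a broken
// conversation can be escaped by editing the address bar.
export function readChatIdFromUrl(): string | null {
  return new URLSearchParams(window.location.search).get('chatId');
}

export function writeChatIdToUrl(chatId: string): void {
  const url = new URL(window.location.href);
  url.searchParams.set('chatId', chatId);
  // replaceState avoids pushing a history entry on every chat switch
  window.history.replaceState(null, '', url.toString());
}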