gravelBridge/AutoGPT-Web-Interaction

Bug: get_dom output exceeds size limit


On large sites, get_dom exceeds its size limit. Maybe there's a way to use a batched version of get_dom?

Interesting, that shouldn't be possible, as I cut off the DOM at a certain size. Could you share more information like screenshots, logs, etc.? Also, which websites experience this issue?

Sure thing! Sorry, I was lazy and didn't get a screenshot, and quite possibly read the error wrong >.<

I'll play around with it more tomorrow and see if I can get a proper stack trace.

No problem! Ok, thank you!

I might have been confused by this message at the bottom, and I think that's why I filed this bug:
[screenshot: Screen Shot 2023-05-03 at 9 19 00 PM]

I'll see if I can reproduce it reliably. Or maybe that error message is a red herring and it's not really a problem?

I see, yes, that is intentional. I am still trying to create a workaround; if you have any suggestions, please let me know!

Ohhhh okay, gotcha. I thought you were using Playwright to get the DOM. Hmm, would it make sense to use Playwright to execute JS to get the DOM?

Or would that run into the same limitation?
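Something roughly like this is what I had in mind, using Playwright's Python sync API (the URL is just a placeholder, and I don't know how get_dom actually does it internally):

```python
# Rough sketch only: run JS in the page via Playwright and pull back the DOM.
# The URL is a placeholder; this is not necessarily how get_dom works.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://example.com")

    # Serialize the live DOM from inside the browser.
    dom_html = page.evaluate("() => document.documentElement.outerHTML")

    # The string is still as large as the page, so the size problem remains.
    print(len(dom_html))

    browser.close()
```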

It would run into the same limitation, as the DOM size stays the same no matter how you get it.

Hmm, that's tough. I'd have to look into ways of batching or streaming the DOM, if that's even possible lol
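The naive version of batching I'm picturing is just splitting the serialized DOM into chunks (the chunk size here is made up, and this punts on how the per-chunk answers get stitched back together):

```python
# Rough sketch of "batching": split the serialized DOM into pieces that fit
# the model's context. CHUNK_SIZE is made up; a real value would depend on
# the actual token limit.
CHUNK_SIZE = 8_000

def chunk_dom(dom_html: str, size: int = CHUNK_SIZE) -> list[str]:
    """Split the DOM string into fixed-size chunks."""
    return [dom_html[i:i + size] for i in range(0, len(dom_html), size)]

# Each chunk would need its own model call, and the partial answers would
# have to be merged afterwards, which is where this idea gets messy.
```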

I wonder if there's a way to have Playwright do the element lookup/find instead of having to download the DOM and search the flat DOM structure to click on what you're looking for. Something like the sketch below.

Just hip-firing ideas here, so take it all with a grain of salt lol
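For example, something along these lines, where only a selector or accessible name has to be produced instead of reading the whole DOM (the URL and button name are made up for illustration):

```python
# Rough sketch only: let Playwright resolve and click the element in-browser,
# so the full DOM never has to be downloaded and searched as text.
# The URL and the button name are made up.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://example.com")

    # Only the selector/accessible name crosses over, not the whole DOM.
    page.get_by_role("button", name="Sign in").click()

    browser.close()
```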

I don't think there's a way to bypass this; it's an OpenAI limitation (the model's context size), not something on our end.