plantain-00/type-coverage

Maximum call stack size exceeded

Closed this issue · 15 comments

Hello,

I have a few users getting a RangeError: Maximum call stack size exceeded when running the typescript-coverage-report command, but (as you'll see from alexcanessa/typescript-coverage-report#51 (comment)) it seems the bug also happens when running type-coverage directly.

Anyone else?

@plantain-00 this seems to still be a bug (see the linked issue ^)

Yonom commented

Also getting this error

macOS
Node: v16.3.0
TS: v4.2.2

@Yonom Can you provide a minimal repo to produce this?

Yonom commented

Sure, I'll be busy tomorrow so it might take a day or two before I get around to it

Also getting this.

Yonom commented

I fixed the problem by deleting node_modules and installing everything again. Sadly I have been unable to reproduce the bug again

I have this when running directly too.

Windows 10
Node v14.15.5
"type-coverage": "2.18.2", - installed locally not globally
"typescript": "4.4.3",

The project is huge, and a mix of TS and JS. With allowJs: false it needs --max-old-space-size=8192 and completes successfully; however, with allowJs turned on it needs --max-old-space-size=24576 to avoid running out of heap - but then it exceeds the max call stack size.
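For anyone else hitting the heap limit first: since type-coverage is a Node CLI, the heap flag can be passed through the NODE_OPTIONS environment variable (a sketch assuming a local install run via npx; the 8192 value is just the figure that worked above):

```shell
# Raise V8's old-space heap limit for the node process that runs the CLI.
# Tune the number (in MB) to your project size.
NODE_OPTIONS=--max-old-space-size=8192 npx type-coverage
```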

I found that program.getSourceFiles() is returning files from my dist directory, despite it not being in my tsconfig include. I expect that's a TypeScript issue.

Added --ignore-files 'dist/**' as a workaround.
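For reference, the workaround as a full invocation might look like this (a sketch; --ignore-files is the documented type-coverage flag, and 'dist/**' assumes your build output lives in dist):

```shell
# Exclude compiled output that TypeScript pulled into the program
# despite it not being listed in tsconfig's include.
npx type-coverage --ignore-files 'dist/**'
```

Quoting the glob matters so the shell doesn't expand it before type-coverage sees it.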

When I enable the checker's strict mode, I'm back to the max call stack size exceeded error.

Yonom commented

My issue was fixed by deleting node_modules and reinstalling everything. I haven't been able to reproduce it since

I'm also having this issue with a large codebase. I've tried to give 27 of my 32GB of RAM to the process but that is not enough 😞
I wonder if the issue is with sourceFileInfos, which is used to loop over and over across all files (we are talking about 50K+ files) while maintaining a massive map with source information.

Maybe changing the loop to be file by file instead of step by step would help node release the memory of processed files?


sourceFileInfos just contains the file path, a sourceFile reference, and a hash string:

sourceFileInfos.push({
  file,
  sourceFile,
  hash,
  cache: cache && cache.hash === hash ? cache : undefined
})

It's file by file:

for (const sourceFile of program.getSourceFiles()) {

My guess about the problem is checking every identifier's type:

const type = context.checker.getTypeAtLocation(node)
if (type) {
  types.push(type)
}
const contextualType = context.checker.getContextualType(node as ts.Expression)
if (contextualType) {
  types.push(contextualType)
}

Maybe so, but it is a memory issue. It's not a problem of it being slow (yet); it feels like there is a memory leak or something, as memory usage keeps growing. So if the issue is there, it means that either getTypeAtLocation or getContextualType leaks; otherwise I'd expect those types to be garbage collected.
But I'm still just speculating, as I didn't have time to attach a debugger and look at what is happening.

Alright, I had time to investigate a bit and found out that the issue is with this fallback, as my tsconfig has neither include nor files. That generic wildcard '**/*' was matching things like .md, .json, .png, .sh, ..., and I guess TS would choke trying to read some of them, never finishing the type check and eating all the memory.
By setting an explicit include: ['**/*.{ts,tsx}'] in my tsconfig, TS handled the 50K files just fine, and the program performed decently.

I wonder if that is what is causing issues for other people here too, and if we should make the fallback a bit more specific.
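To illustrate what the explicit include accomplishes, here is a minimal sketch (the fileList below is hypothetical and this is not type-coverage's actual code): with the '**/*' fallback every path survives, so the program is asked to parse non-source files; filtering by extension keeps only real TypeScript inputs.

```typescript
// Hypothetical result of walking a project directory with '**/*'.
const fileList = [
  "src/index.ts",
  "src/App.tsx",
  "README.md",
  "logo.png",
  "scripts/build.sh",
];

// Mimic an explicit include of '**/*.{ts,tsx}' with an extension check.
const sourceExtensions = [".ts", ".tsx"];
const onlySources = fileList.filter((f) =>
  sourceExtensions.some((ext) => f.endsWith(ext))
);

console.log(onlySources); // [ 'src/index.ts', 'src/App.tsx' ]
```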


According to the TypeScript documentation (https://www.typescriptlang.org/tsconfig#include), if files is not specified, the default value of include is **.
And TypeScript will support .mts and .cts, so an include of **/*.{ts,tsx} would break for projects using those extensions.
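A tsconfig include covering the newer module extensions could look like the following sketch. Brace sets like {ts,tsx} aren't documented for tsconfig globs, so one pattern per extension is assumed here to be the safer form:

```json
{
  "include": ["**/*.ts", "**/*.tsx", "**/*.mts", "**/*.cts"]
}
```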