nwaples/rardecode

rardecode: corrupt ppm data

Closed this issue · 6 comments

I get this error with some of the files, roughly a dozen or so out of 130+ rar files.

This is the Go version I am currently using:

go version go1.6.2 darwin/amd64

and below is the code responsible for decoding the rar files using rardecode:

    rarfile, err := os.Open(filepath.Join("downloads", f))
    if err != nil {
        return err
    }

    rdr, err := rardecode.NewReader(rarfile, "")
    if err != nil {
        return err
    }

    nf, err := rdr.Next()
    if err != nil {
        return err
    }

    ps := make([]byte, nf.UnPackedSize)
    _, err = rdr.Read(ps)
    if err != nil {
        return err
    }

    newbok, err := os.Create(filepath.Join("bok", bokFile))
    defer newbok.Close()
    if err != nil {
        return err
    }

    _, err = io.CopyBuffer(newbok, rdr, ps)
    if err != nil {
        return err
    }

This works for the vast majority of the files, but it fails for a dozen or so. Most of those fail with the above-mentioned error, and a couple get a different one, but I'll leave that for another issue.

Am I doing it wrong, or is there a better way?
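
For instance, would something like the sketch below be better? It's only a rough variant of the code above (extractFirst is just a made-up name, and it assumes the usual io, os, path/filepath and rardecode imports); it skips the pre-sized Read and lets io.Copy drain the current entry straight into the output file.

    // Sketch only: same flow as the snippet above, but without the
    // pre-sized Read, so io.Copy drains the current archive entry
    // directly into the output file.
    func extractFirst(f, bokFile string) error {
        rarfile, err := os.Open(filepath.Join("downloads", f))
        if err != nil {
            return err
        }
        defer rarfile.Close()

        rdr, err := rardecode.NewReader(rarfile, "")
        if err != nil {
            return err
        }

        if _, err = rdr.Next(); err != nil {
            return err
        }

        newbok, err := os.Create(filepath.Join("bok", bokFile))
        if err != nil {
            return err
        }
        defer newbok.Close()

        _, err = io.Copy(newbok, rdr)
        return err
    }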

Thank you.

No, it looks to be an error in my code somewhere.
I was able to reproduce the same error by compressing some large, randomly generated text files. It's still possible yours is caused by a different error in the code, but I'll have a look and see if I can fix this one first.
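
For reference, a generator along these lines (just a sketch; the size and character set are arbitrary) produces the kind of file I mean, once it's compressed with rar at a compression setting where the ppm coder gets used for text:

    // Sketch only: writes a large pseudo-random text file that can then be
    // compressed into a test rar archive.
    package main

    import (
        "bufio"
        "math/rand"
        "os"
    )

    func main() {
        const size = 64 << 20 // ~64 MiB of text; exact size doesn't matter much
        const alphabet = "abcdefghijklmnopqrstuvwxyz \n"

        f, err := os.Create("random.txt")
        if err != nil {
            panic(err)
        }
        defer f.Close()

        w := bufio.NewWriter(f)
        defer w.Flush()

        for i := 0; i < size; i++ {
            w.WriteByte(alphabet[rand.Intn(len(alphabet))])
        }
    }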

Thanks a lot.

I'll do some more testing and see if I can come up with anything, but first I need to wrap my head around the rar specs.

There's not a lot of documentation. I've mostly had to rely on the free unrar C++ source code as the standard, and it's not pretty.

When I implemented the ppm text decompression I skipped the memory allocator for it. It was hard enough understanding it all (I still don't really understand the algorithm). It worked OK for the small files I was testing with, since I assumed unlimited memory, but it breaks for larger files once the decoder reaches the memory limits.
I think I have more of an idea of how it works now, so I'm having a go at hacking something together to give the correct behaviour. It's just not as easy as I'd hoped.
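
To illustrate the idea (this is only a toy sketch, not rardecode's actual code): the model builds its data structures inside a fixed-size pool, and when that pool runs out the decoder has to restart or trim its model at the same point the encoder did, otherwise the two drift apart and you end up with errors like corrupt ppm data.

    // Toy sketch only, not rardecode's implementation: a fixed-capacity
    // arena whose exhaustion forces a model restart, which is roughly the
    // behaviour the decoder has to mirror from the encoder.
    type arena struct {
        buf  []byte
        next int
    }

    func newArena(size int) *arena { return &arena{buf: make([]byte, size)} }

    // alloc hands out n bytes, or reports failure when the pool is exhausted.
    func (a *arena) alloc(n int) ([]byte, bool) {
        if a.next+n > len(a.buf) {
            return nil, false
        }
        p := a.buf[a.next : a.next+n]
        a.next += n
        return p, true
    }

    // reset drops everything at once, mimicking a model restart.
    func (a *arena) reset() { a.next = 0 }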

The latest patch may fix the corrupt ppm data problems.

I'll be testing it this week and will let you know how it goes.

The latest patch resolved this issue. Thanks mate.