bchelli/node-smb2

Unable to read a large file

Opened this issue · 13 comments

I'm working on something that's going to be copying a large file from one Windows machine to another. The code I have looks like this:

var smb2 = require('smb2'); // SMB2 client constructor

function copyFileFromSourceToDestination(source, destination, on_success, on_error) {
  try {
    // One SMB2 connection per machine, both against the administrative c$ share
    var source_smb = new smb2({
      share: '\\\\' + source.ip + '\\c$',
      domain: 'WORKGROUP',
      username: source.username,
      password: source.password
    });
    var destination_smb = new smb2({
      share: '\\\\' + destination.ip + '\\c$',
      domain: 'nonayoarbiziniz',
      username: destination.username,
      password: destination.password
    });

    source_smb.exists('largefiles\\' + kSourceFileName, function(err, exists) {
      if(err) {
        return on_error(err);
      }
      if(!exists) {
        return on_error('File not found on source');
      }
      source_smb.readFile('largefiles\\' + kSourceFileName, {'encoding': null}, function(err, data) {
        if(err) {
          return on_error(err);
        }
        console.log(data);
        destination_smb.writeFile('Users\\' + destination.username + '\\AppData\\Local\\Temp\\' + kSourceFileName, data, {'encoding': null}, function(err) {
          if(err) {
            return on_error(err);
          }
          return on_success();
        });
      });
    });
  } catch(e) {
    return on_error(e);
  }
}

The result of the above function is on_success being called inside writeFile, but when I look at the file created on the destination machine, it is zero bytes. Once I saw that, I checked whether any data was actually being handed to the readFile callback, and there wasn't any, only an empty buffer. However, there was no error either, which I thought was a bit odd. The file is definitely there on the source machine: I checked it myself, and the exists check at the beginning of the function passes.

I also took a second to throw a console.log(fileLength) into smb2/lib/api/readfile.js to make sure it was sensible; the result seemed a bit odd: -443374080. I did see some bit masking in the code above it, so I don't know if that's just how the SMB protocol works. The file size I see in Windows Explorer is 3,916,604,928 bytes. Not sure if that makes a difference, but the negative length jumped out at me.

The next thing I tried was to dump the contents of file.EndofFile on line 39 to see what that was. The result was:

<Buffer 00 a6 e2 e5 00 00 00 00>
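For reference, decoding that buffer by hand in Node looks something like this (an illustrative standalone snippet, not code from smb2 itself; the variable names are made up):

// EndofFile is an 8-byte little-endian length; this stands in for the buffer dumped above.
var endOfFile = new Buffer([0x00, 0xa6, 0xe2, 0xe5, 0x00, 0x00, 0x00, 0x00]);

// Unsigned interpretation: combine the low and high 32-bit halves.
var low  = endOfFile.readUInt32LE(0);
var high = endOfFile.readUInt32LE(4);
console.log(high * 4294967296 + low);   // 0xE5E2A600 = 3856836096

// Reading the low half as a signed 32-bit integer overflows for files >= 2 GB,
// which is one way a negative fileLength like the one above can appear.
console.log(endOfFile.readInt32LE(0));  // negative for this buffer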

Anyway, I'm hung up on this and it's the last thing standing between me and finishing this project so I thought I'd throw a bug up here to see if you'd be able to help with it.

Thanks a lot and nice project 👍

Christopher Dale

I just verified that this seems to have something to do with file size by adjusting the code to read a different (smaller) file and attempt to write it. The code works with the smaller file.

The file that I attempted to read/write here is 18KB (or 17,734 bytes according to the file property dialog) and the file.EndofFile property contained the following:

<Buffer 46 45 00 00 00 00 00 00>

The fileLength property appears to make more sense if we are talking about bytes:

File length: 17734

The only thing I changed in the above code was the file name to read and write.

s/kSourceFileName/'eula.1028.txt'/g

Hrm, so I can definitely see the file size in the little-endian bytes of the EndofFile property for the small file, but I get something different when I do the math on the first (big) file:

Small file
46 45 00 00 00 00 00 00 -> 0x4546 = 17734 == 17734

Big file
00 a6 e2 e5 00 00 00 00 -> 0xE5E2A600 = 3856836096 != 3916604928
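A quick sanity check of that arithmetic in Node (again an illustrative snippet, not library code; the upper 32 bits are zero in both dumps, so reading the low half is enough):

var small = new Buffer([0x46, 0x45, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]);
var big   = new Buffer([0x00, 0xa6, 0xe2, 0xe5, 0x00, 0x00, 0x00, 0x00]);
console.log(small.readUInt32LE(0)); // 17734
console.log(big.readUInt32LE(0));   // 3856836096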

Here's more debugging info if it will help:

Small file

{ StructureSize: <Buffer 59 00>,
  OplockLevel: <Buffer 00>,
  Flags: <Buffer 00>,
  CreateAction: <Buffer 01 00 00 00>,
  CreationTime: <Buffer 00 4c 8f 32 3e 21 c8 01>,
  LastAccessTime: <Buffer fc 3e b8 3b 41 d4 cf 01>,
  LastWriteTime: <Buffer 00 4c 8f 32 3e 21 c8 01>,
  ChangeTime: <Buffer 2e 5a 30 c4 73 af d0 01>,
  AllocationSize: <Buffer 00 50 00 00 00 00 00 00>,
  EndofFile: <Buffer 46 45 00 00 00 00 00 00>,
  FileAttributes: <Buffer 20 00 00 00>,
  Reserved2: <Buffer 00 00 00 00>,
  FileId: <Buffer 2d 01 00 00 40 01 00 00 05 00 00 00 ff ff ff ff>,
  CreateContextsOffset: <Buffer 00 00 00 00>,
  CreateContextsLength: <Buffer 00 00 00 00>,
  Buffer: <Buffer > }
File length: 17734
<Buffer ff fe 2a 00 2a 00 59 00 6f 00 75 00 20 00 68 00 61 00 76 00 65 00 20 00 72 00 65 00 63 00 65 00 69 00 76 00 65 00 64 00 20 00 74 00 68 00 65 00 20 00 73 ...>

Big file

{ StructureSize: <Buffer 59 00>,
  OplockLevel: <Buffer 00>,
  Flags: <Buffer 00>,
  CreateAction: <Buffer 01 00 00 00>,
  CreationTime: <Buffer 4d b0 24 a7 58 af d0 01>,
  LastAccessTime: <Buffer 4d b0 24 a7 58 af d0 01>,
  LastWriteTime: <Buffer d9 0e a1 95 78 af d0 01>,
  ChangeTime: <Buffer d9 0e a1 95 78 af d0 01>,
  AllocationSize: <Buffer 00 b0 b2 e9 00 00 00 00>,
  EndofFile: <Buffer 00 a6 b2 e9 00 00 00 00>,
  FileAttributes: <Buffer 20 00 00 00>,
  Reserved2: <Buffer 00 00 00 00>,
  FileId: <Buffer ed 00 00 00 50 01 00 00 05 00 00 00 ff ff ff ff>,
  CreateContextsOffset: <Buffer 00 00 00 00>,
  CreateContextsLength: <Buffer 00 00 00 00>,
  Buffer: <Buffer > }
File length: -374168064
<Buffer >

Well, if there's anything I can do to help, feel free to let me know. I'm out of ideas and can't really spend more time digging into this myself, but I can get you any information you need.

For now, I'm going to use smbclient instead; but it'd be nice to use smb2 when this issue is resolved 🎉

The smbclient commands I'll be using to read and write (and yes, I'm afraid it will require copying the file locally before pushing it to the destination):

smbclient '//source/c$' -W WORKGROUP -c 'lcd /tmp; cd bigfiles; get big.file' -U 'username%password';

And then for the write I'm going to use:

smbclient '//destination/c$' -W nonayoarbiziniz -c 'lcd /tmp; cd Users\username\AppData\Local\Temp; put big.file' -U 'username%password'
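In case it helps anyone else doing the same workaround from Node, here is a rough sketch that shells out to smbclient with child_process (the share names, paths and credentials are the placeholders from the commands above, not real values):

var execFile = require('child_process').execFile;

// Pull the file from the source share into /tmp, then push it to the destination.
function copyViaSmbclient(done) {
  execFile('smbclient', [
    '//source/c$', '-W', 'WORKGROUP',
    '-c', 'lcd /tmp; cd bigfiles; get big.file',
    '-U', 'username%password'
  ], function(err) {
    if (err) return done(err);
    execFile('smbclient', [
      '//destination/c$', '-W', 'nonayoarbiziniz',
      '-c', 'lcd /tmp; cd Users\\username\\AppData\\Local\\Temp; put big.file',
      '-U', 'username%password'
    ], done);
  });
}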

I'll be watching this issue closely to see when I can switch back over to smb2! Thanks again for the work you're doing on this project; I look forward to deploying it once this issue is resolved 👍

Christopher Dale

Hi Christopher,

I tried to read a large text file and that seems to work well for me. However, I noticed that you do not pass an autoCloseTimeout parameter in the options; it might have to do with its default value of 10 seconds. Could it be that reading the file takes longer than 10 seconds, so by the time you try to write the data to the destination samba share, that connection has already been closed?

You can set autoCloseTimeout to 0 to disable the auto-close, but then you must close the connection yourself.
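For example, a minimal sketch of that suggestion (assuming the client exposes a close() method for shutting the connection down manually, as implied above; the share, path and credentials are placeholders):

var smb2 = require('smb2');

var client = new smb2({
  share: '\\\\server\\c$',
  domain: 'WORKGROUP',
  username: 'user',
  password: 'pass',
  autoCloseTimeout: 0   // 0 = do not auto-close the connection
});

client.readFile('largefiles\\big.file', { encoding: null }, function(err, data) {
  if (err) {
    client.close();
    return console.error(err);
  }
  // ... write data to the destination share here, then close explicitly
  client.close();
});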

Hi @bvankeir,

Thanks for the reply! I didn't even think to set the autoCloseTimeout. I'll probably handle closing the connection manually, since I'm going to be dealing with some rather large files, on the order of 2 GB.

I'll probably be able to try this in the next couple of weeks. We've got some other high priority stuff going on right now, so this has slipped to the bottom of my queue for the moment. I'll report back on what I find.

Thanks again 😄

Hi @guywithcrookedface and @bvankeir,

Thanks for the debugging, I'm sorry I haven't been spending much time on SMB2.
Keep me posted on whether setting the timeout to 0 solves the issue. I'll try to look at it ASAP and make a fix for that.

Thanks again.
Ben

This may not be a problem of size but a problem of time: the value 0 for autoCloseTimeout is not handled correctly and falls back to 10 seconds.

Things work well for me with the code submitted in #12.
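For context, the kind of defaulting bug being described usually looks like the sketch below (illustrative only, not the actual smb2 source): a || fallback treats an explicit 0 as "not set", so the 10-second default comes back anyway.

// Illustrative only: an explicit 0 is falsy, so `||` silently restores the default.
var options = { autoCloseTimeout: 0 };

var buggyTimeout = options.autoCloseTimeout || 10000;   // -> 10000, the 0 is lost

// One common fix: only fall back when the option really is undefined.
var fixedTimeout = options.autoCloseTimeout !== undefined
  ? options.autoCloseTimeout
  : 10000;                                              // -> 0 is kept

console.log(buggyTimeout, fixedTimeout);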

Coool, I'll give that a go; thanks for looking into the issue @marsaud!

@guywithcrookedface did 0.2.7 solve your issue?

Hey @marsaud. Sorry man, I haven't quite had a chance to look into this yet. I'll take a moment and check it out quick and report back.

Finally freed up enough space on a VM to test this, running the test now. I'm testing with a 4GB file. I'll report back with what I find.

Hrm... I'm running into some other issues, but I think it's related to my VM + network setup. I'm going to look into that and then try again later this week.