Mounted GlusterFS volume: files larger than 64MB show as only 64MB
Closed this issue · 8 comments
Description of problem:
Screenshot: https://user-images.githubusercontent.com/10395602/88264300-1e47c180-ccfe-11ea-82ef-73e9c969d29b.png

Please help!
Could you let us know the steps to recreate this issue?
I was able to replicate the behavior by setting performance.parallel-readdir to on for one of my volumes. Turning it back off causes the file sizes to be displayed correctly again. I originally set this option following the "Accessing Gluster volume via SMB Protocol" document here.
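For reference, the toggle that reproduces the symptom can be sketched as follows (the volume name `gv0` is hypothetical; substitute your own):

```shell
# Enable the option (hypothetical volume name gv0).
# After this, `ls -l` on the FUSE mount reports only the shard size
# (e.g. 64MB) for files larger than the shard block size.
gluster volume set gv0 performance.parallel-readdir on
```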
@TechPorter I see that you have the shard feature enabled on the volume. In this case, what @NetSean says is correct. The issue you are seeing is possible if readdir-ahead/parallel-readdir is enabled. Please disable it.
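Concretely, disabling both background-readdir options looks like this (again assuming a hypothetical volume name `gv0`):

```shell
# Disable both options on the affected volume (gv0 is hypothetical).
# File sizes on the mount should display correctly afterwards.
gluster volume set gv0 performance.parallel-readdir off
gluster volume set gv0 performance.readdir-ahead off
```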
RCA:
Shard stores the first shard-size block of a file at its actual path and the remaining shards in a hidden directory on the brick called '.shard'; this directory is not visible on the mount. When the user runs 'ls', shard fetches the actual file-size extended attribute from the file and displays the correct size. When the parallel-readdir/readdir-ahead features are enabled, readdir of the brick happens in the background, which does not fetch the actual file-size xattr from the brick, and the results are cached. A subsequent 'ls' is then served from this cache, which lacks the actual file size, so the shard translator assumes these files were created before sharding was enabled and shows the user the shard size instead of the actual file size.
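To illustrate the RCA, you can inspect the brick directly; a sketch, assuming a hypothetical brick path /data/brick1 and file name bigfile.img:

```shell
# Read the size xattr that shard maintains on the base file.
# (Brick path and file name are hypothetical examples.)
getfattr -n trusted.glusterfs.shard.file-size -e hex /data/brick1/bigfile.img

# The remaining shard blocks live under the hidden .shard directory,
# which is only visible on the brick, not on the mount:
ls /data/brick1/.shard
```

When the cached readdir entries lack this xattr, the mount only sees the size of the base block at the actual path, which is why large files appear as exactly one shard-size (e.g. 64MB).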
@VHariharmath-rh Could you look into this issue?
One possible fix is to make readdir-ahead fetch the shard xattrs during its background readdir. A more generic way is to make readdir-ahead fetch all xattrs by passing 'list-xattr' in the xdata part of readdir(p).
Thank you for your contributions.
This issue has not had any activity in the last ~6 months, so we are marking it as stale.
It will be closed in 2 weeks if no one responds with a comment here.
Closing this issue as there has been no update since my last comment. If this issue is still valid, feel free to reopen it.
unstale