splitbrain/dokuwiki-plugin-sync

Sync of large wikis or with large files fails

Alfredo-HMS opened this issue · 5 comments

We have a fairly large wiki (25,000 files) with some large media files. To make the plugin work we had to:

  • Change parameters max_execution_time and memory_limit in php.ini
  • In the sync plugin's admin.php, in function sync, change the call to @set_time_limit(30) to @set_time_limit(300), and in function _getSyncList add a call to @set_time_limit(300) just before the "get remote file list" comment

We think it'd be useful not to hardcode that time limit to 30 but to expose it as a plugin configuration parameter, and to include the second call in function _getSyncList by default.
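For reference, the edits described above look roughly like this. This is a sketch against the plugin version we used; line placement and the exact surrounding code may differ in yours, so check before patching:

```php
<?php
// php.ini (global defaults; restart the web server / PHP-FPM after changing):
//   max_execution_time = 300
//   memory_limit       = 256M

// lib/plugins/sync/admin.php, in function sync():
@set_time_limit(300);   // was @set_time_limit(30)

// lib/plugins/sync/admin.php, in function _getSyncList():
@set_time_limit(300);   // added just before the "get remote file list" comment
```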

@Alfredo-HMS For someone not that good with PHP, how do I add a call to @set_time_limit(300) in function _getSyncList?

Would love to get this working.

I have a similar problem with a large list of files and have edited the admin.php file as suggested by @Alfredo-HMS (I don't know whether I need to change the php.ini file as well, or to what values; I am new to PHP). I get the following error message:
"Failed to fetch remote file list. transport error - Timeout while reading headers (15.015s) ><"

Any suggestions?

Changing @set_time_limit changes the maximum execution time of the script. max_execution_time in php.ini sets the same limit, but globally. It's somewhat redundant to change both, but to play it safe we changed both. The time limit set in admin.php should override the one set globally, but check what limit you have set globally in php.ini anyway.

The memory_limit in php.ini needs to be raised if you want to transfer big files, e.g. when you have attached big PDFs to your pages. Set it at least as large as your largest file. We set a limit of 256M and it works fine.

The message you get says the script ran into a maximum execution limit of 15 seconds.
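To see which limits are actually in effect on your server, a minimal check like the following can help (ini_get and set_time_limit are standard PHP functions; the values printed depend on your php.ini):

```php
<?php
// Inspect the two limits that matter for the sync plugin.
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

// set_time_limit() restarts the execution timer for this script only;
// it overrides the global php.ini value for the current request.
set_time_limit(300);
echo 'after set_time_limit(300): ' . ini_get('max_execution_time') . "\n";
```

Running this as a small script in your web root shows the values the web server actually applies, which can differ from the CLI ones.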

@Alfredo-HMS thank you, that works!

  • php.ini: setting memory_limit to 256M;

  • admin.php: setting @set_time_limit to 300, adding a call to @set_time_limit(300) in function _getSyncList;

  • and when calling the plugin on the "Wiki Synchronization" page, setting the Timeout to 300

@Alfredo-HMS, thanks to your suggestions I was able to sync a secondary wiki to a primary one.
I believe this plugin should be updated to use a mechanism similar to the "SearchIndex Manager", "Move page and namespace" and "BatchEdit" plugins, which use AJAX to query the server, avoiding timeout issues.
When pushing large media files from the remote wiki, it could push them in chunks to avoid timeouts.
These approaches would reduce the need to change PHP parameters and would have another nice side effect: the browser could report the synchronization progress in real time, letting the user follow it and cancel it at any time.
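The chunked-transfer idea could be sketched as follows. This is purely hypothetical; neither sync_chunk nor CHUNK_SIZE exists in the real plugin, and a real implementation would go over HTTP rather than copying local files:

```php
<?php
// Hypothetical chunked transfer, as a local-file sketch.
const CHUNK_SIZE = 1024 * 1024; // 1 MiB per request keeps each call short

/**
 * Copy one chunk of $src to $dst starting at $offset.
 * Returns the next offset, or -1 when the file is complete.
 */
function sync_chunk(string $src, string $dst, int $offset): int {
    $data = @file_get_contents($src, false, null, $offset, CHUNK_SIZE);
    if ($data === false || $data === '') {
        return -1; // nothing left to read
    }
    // First chunk truncates the target; later chunks append.
    file_put_contents($dst, $data, $offset === 0 ? 0 : FILE_APPEND);
    return $offset + strlen($data);
}

// An AJAX driver would call an endpoint wrapping sync_chunk() repeatedly,
// passing the returned offset back each time, so every HTTP request stays
// well under the PHP time limit and the browser can update a progress bar
// (and offer a cancel button) between calls.
```

Because each request handles at most one chunk, neither max_execution_time nor memory_limit needs to cover the whole file, only a single chunk.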