Importing Laravel system translations
kasperhartwich opened this issue · 22 comments
When importing, it scans the application perfectly, but it does not import the already existing translations in /resources/lang/en. Maybe it should?
Thx for your comment.
The issue is that there is currently no option to separate JSON translations from system translations in the POEditor interface.
I have some options to try out, but if you have a suggestion you are more than welcome to discuss it here.
Maybe a separate command to import the PHP files into the JSON files could be an idea.
It should be possible to import the files by using Laravel's FileLoader.
Could easily be better structured, but I used this to get the translations:
use Illuminate\Filesystem\Filesystem;
use Illuminate\Support\Arr;
use Illuminate\Support\Facades\File;
use Illuminate\Translation\FileLoader;

// Load every PHP translation group in resources/lang/en and flatten the
// result into dot notation, e.g. 'auth.throttle' => 'Too many login attempts. ...'
$translations = Arr::dot(
    collect(File::files(base_path('resources/lang/en')))->mapWithKeys(function ($file) {
        $group = str_replace('.php', '', $file->getFilename());

        return [
            $group => (new FileLoader(new Filesystem(), base_path('resources/lang')))
                ->load('en', $group),
        ];
    })->toArray()
);
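For what it's worth, with the stock Laravel lang files this yields a flat, dot-notated array along these lines (the exact keys and values depend on your own files):

// [
//     'auth.failed'   => 'These credentials do not match our records.',
//     'auth.throttle' => 'Too many login attempts. Please try again in :seconds seconds.',
//     ...
// ]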
Yes, but the problem is not the scan; it already scans those files. The problem is POEditor and how to write the translations back to different folders and files.
The solution is to change your system translations in /resources/lang/en/auth.php
from:
'throttle' => 'Too many login attempts. Please try again in :seconds seconds.',
to:
'throttle' => __('too many login attempts. Please try again in :seconds seconds'),
Then on the next scan it will add it to the translation sync and you are all good.
I'll update the README soon.
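For reference, a minimal sketch of what resources/lang/en/auth.php could look like after that change (the strings shown are just the stock Laravel defaults, used for illustration):

<?php

// resources/lang/en/auth.php with the system strings wrapped in __()
// so the scan command picks them up like any other translation call.
return [
    'failed'   => __('These credentials do not match our records.'),
    'password' => __('The provided password is incorrect.'),
    'throttle' => __('Too many login attempts. Please try again in :seconds seconds.'),
];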
Ohh.. Is that why it removes the translations I imported when doing a new scan?
Good question, do you mean in POEditor or in the Laravel project?
A new scan creates a new base_language file and that is what is getting uploaded.
Ohh. That's why. As it is primarily uploading terms, wouldn't it be smarter to merge with the current one and then upload? That way you can create terms without using them in the application.
Good idea, will look into that.
Cool. I also often experience that code gets commented out, removed or added from different branches, so terms are sometimes there and then not. The result would be terms getting deleted and added all the time.
You always have to do a scan for translations, and then the file will always be updated.
If I do a merge, old keys will be kept and never cleaned up. They will also pile up in POEditor.
We only use translation scan on master branch before deploying.
But you can delete old keys manually in POEditor? And if you wait to scan until deployment, you could get translations in production that have not been translated.
Try release 1.0.4, it has a merge option on the scan command.
There is one slight problem. If the key exists, it overrides the translation. So if the translation has been changed in POEditor, it will be changed back to the default value in the JSON file.
It's expected behavior, POEditor always has priority if you choose to do an upload/download.
Otherwise I'm misunderstanding ;)
- Scan and upload to POEditor.
- Change string at POEditor.
- Download strings from POEditor.
- Scan and merge.
Then the translated string from POEditor is overwritten by the string from the code.
But why do you need the base_language strings translated in POEditor?
They are correct from the code by default? Or are you using the format group.text instead of full strings like Your password is incorrect?
First, they are only correct if the developer speaks correct English.
Secondly, for this project, yes, I am using keys and not sentences to pass to POEditor, so here they are called e.g. 'auth.password.incorrect', like Laravel's default behaviour.
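To make the difference concrete, a small sketch of the two styles (illustrative keys and strings only):

// Key-style translation: the code references a dotted key, and the readable
// text only lives in resources/lang/en/auth.php (and in POEditor).
echo __('auth.password.incorrect');

// Sentence-style (JSON) translation: the English sentence itself is the key,
// looked up in resources/lang/en.json and returned as-is when missing.
echo __('Your password is incorrect.');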
Ok, I see, then it makes sense.
Will look into a solution to support this.
In general I don't see why new translations should ever overwrite the ones already in the translation files.
Try version 1.0.5, it only adds new keys when using the --merge option.
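In plain PHP array terms, the difference between the two behaviours is roughly this (a conceptual sketch with made-up variable names, not the package's actual code):

// $scanned:  strings freshly extracted from the code (the in-code defaults)
// $existing: the current base language file, possibly already adjusted in POEditor
$scanned  = ['auth.password.incorrect' => 'auth.password.incorrect'];
$existing = ['auth.password.incorrect' => 'Your password is incorrect.'];

// Old behaviour: scanned values win, so strings changed in POEditor are
// reset to the in-code defaults on the next scan.
$overridden = array_merge($existing, $scanned);

// 1.0.5 --merge behaviour: existing values win, scanned strings are only
// added for keys that do not exist yet.
$merged = $existing + $scanned;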
Hi @kasperhartwich, have you looked at the new version and does it fix your merge issues?
Feel free to reopen if there is anything else.