Would it make sense to replace "coding rules" with automated verification (i.e. prettify & lint) as part of the build?
mi-hol opened this issue · 14 comments
Describe the improvement you'd like
Developers are used to having automated tooling that supports coding tasks.
Instead of explaining the coding rules to follow, an automated process should be used that ensures the rules are enforced.
Additional context
I've seen a great example of the above in https://github.com/vercel/next.js/blob/canary/contributing/repository/linting.md
@votdev, in case you see value in going this route, I'd be happy to port the tooling outlined above to OMV.
If no value is perceived, just close this issue.
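For illustration, a lint step in the build would essentially run the formatters and checkers in verification mode, roughly like this (a sketch only; the tool selection and paths are assumptions, not existing OMV targets):

```sh
# Hypothetical CI lint step: verify style and formatting without rewriting files.
# Tool choice and the source path are illustrative assumptions.
pip install pycodestyle
pycodestyle path/to/python/sources           # report PEP 8 violations, non-zero exit on findings
npx prettier --check "**/*.{ts,scss,html}"   # check formatting only, do not rewrite
```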
All this exists already:
The problem with the Python code processing is that the rules have changed since the time when 0 warnings and errors were reported, and someone would need to put in the effort to adapt the code or insert ignore statements for thousands of warnings. In the current state it does not make sense to lint the Python code with Git workflows.
Prettifying the code automatically with workflows is not an option; the code must be committed in good shape.
the rules have changed since the time when 0 warnings and errors were reported, and someone would need to put in the effort to adapt the code or insert ignore statements for thousands of warnings
Sorry, I don't get the meaning of what you are trying to say. "0 warnings" and "thousands of warnings" is contradictory information.
Prettifying the code automatically with workflows is not an option; the code must be committed in good shape.
Is this not the purpose of pre-commit actions?
Details are explained in https://pre-commit.com
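For context, the typical local setup would look roughly like this (assuming a .pre-commit-config.yaml in the repository root; the commands are the standard pre-commit CLI):

```sh
# Standard pre-commit workflow in a local clone (sketch).
pip install pre-commit
pre-commit install          # installs the hook into .git/hooks/pre-commit
pre-commit run --all-files  # runs all configured hooks against the whole tree
```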
Prettifying the code automatically with workflows is not an option; the code must be committed in good shape.
Is this not the purpose of pre-commit actions? Details are explained in pre-commit.com
That might be the case, but I do not want that behaviour in OMV because you do not know what happens to the code in the end. If there is a problem with the tooling, the code might end up corrupted.
pre-commit is a client-side action and could be altered. I agree that people should be doing that before committing and should verify their changes.
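For example, local hooks can simply be skipped, which is why they cannot replace server-side or CI checks:

```sh
# Local hooks are advisory: this bypasses every pre-commit and commit-msg hook.
git commit --no-verify -m "commit without running local hooks"
```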
the rules have changed since the time when 0 warnings and errors were reported, and someone would need to put in the effort to adapt the code or insert ignore statements for thousands of warnings
Sorry, I don't get the meaning of what you are trying to say. "0 warnings" and "thousands of warnings" is contradictory information.
Quite some time ago I prettified and linted the whole Python code base with the result of 0 warnings/errors. But since then autopep8 has evolved and more and more rules have been added. The outcome is that now several hundred or thousand warnings occur.
I do not want that behaviour in OMV
pre-commit is a client-side action and could be altered.
FYI, https://verdantfox.com/blog/view/how-to-use-git-pre-commit-hooks-the-hard-way-and-the-easy-way says (a minimal local-hook sketch follows the quoted list):
"Git hook events can trigger on the server-side (where the code is stored remotely -- GitHub, GitLab, etc.) or locally (on your computer). Local git events that can trigger a hook script to run include:
pre-commit (occurs before a git commit is accepted)
post-commit (occurs immediately after a git commit is accepted)
post-checkout (occurs after a git checkout)
pre-rebase (occurs before a git rebase)
There are several other local git hook triggers, and also several server-side git hook triggers."
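To make the local variant concrete, a hand-written hook is just an executable script at .git/hooks/pre-commit, roughly like this (a sketch; the pycodestyle call is an assumption):

```sh
#!/bin/sh
# Minimal local pre-commit hook sketch: lint only the staged Python files and
# abort the commit on findings. The lint tool is illustrative.
files=$(git diff --cached --name-only --diff-filter=ACM -- '*.py')
[ -z "$files" ] && exit 0
if ! pycodestyle $files; then
    echo "pre-commit: PEP 8 violations found, commit aborted." >&2
    exit 1
fi
```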
autopep8 has evolved and more and more rules have been added. The outcome is that now several hundred or thousand warnings occur.
Just out of curiosity, are these warnings stored somewhere to be viewed?
While it is possible to have server-side pre-commit hooks, it doesn't make any sense. You want the commit to fail when committing. Thank you for googling to educate me. Please continue to waste people's time on this. I will see myself out.
autopep8 has evolved and more and more rules have been added. The outcome is that now several hundred or thousand warnings occur.
Just out of curiosity, are these warnings stored somewhere to be viewed?
No, you have to run `fakeroot debian/rules omv_lint_py` in the openmediavault directory.
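For reference, a usage sketch (this assumes a local checkout of the openmediavault repository with the Debian build tooling installed; the pycodestyle line is an assumption, not part of the packaging rules):

```sh
# Run the project's own Python lint target from the source tree.
cd openmediavault
fakeroot debian/rules omv_lint_py

# Hypothetical alternative: summarise current findings per rule with pycodestyle.
pycodestyle --statistics -qq .
```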
it doesn't make any sense.
It's always the same pattern in these conversations:
1.) First, a highly opinionated, partially correct or even wrong answer is provided.
2.) If contradicting information is provided,
3.) the reply is "it makes no sense" without any reasons, and "don't waste my time".
=> It's really fun to communicate and reach a conclusion.
It is really fun dealing with your tiny changes that have little effect, which you just go on and on about when you are told no. You take everything verbatim and have no experience with most of the things you are arguing about. You say answers are wrong because something is technically possible, even though it is not realistically possible. I honestly don't know what or who you think you are helping. I have yet to see many tangible benefits of your "changes".
I have yet to see many tangible benefits of your "changes".
openmediavault/openmediavault@423592b
which you just go on and on about when you are told no.
Interesting to see your comment, when in post 1 I wrote: "@votdev, in case you see value in going this route, I'd be happy to port the tooling outlined above to OMV. If no value is perceived, just close this issue."