ctfs/write-ups-2014

git lfs

Opened this issue · 13 comments

Hi, @mathiasbynens

The size of the write-ups repo is growing day by day, which isn't great for hosting on GitHub (they suggest keeping a repo under 1 GB of source) or for people trying to clone it, since the clone will be painfully slow. So why don't we create a new repo called write-ups-2015 and add all the CTFs held in 2015 there? That would keep the write-ups better organised, and we wouldn't need to append 2015 to the end of every CTF name.

We could also organise the repo the way shell-storm does, i.e. sort the write-ups by category, even though the category is already mentioned in the README. That would make things easier for people who focus on a particular topic: nobody solves every challenge in a CTF, and this way they wouldn't have to look through all of a CTF's challenges to find something they like (the challenge name says very little about the type of challenge).
Also, within each category, shell-storm ends the challenge name with its score, which is a good convention to follow too 👍
One more nice thing about shell-storm is that it has a scoreboard 👍, which raises the spirit of the CTF.

So what do you say? I hope these things get implemented; they would make this repo more organised and friendlier to use. 😄

I didn't think the repo had grown that large, so I checked:

$ du -d1 -h
66M ./ghost-in-the-shellcode-2014
11M ./seccon-ctf-2014
43M ./olympic-ctf-2014
32K ./hack-in-the-box-amsterdam-2014
17M ./secuinside-ctf-prequal-2014
1.2M    ./ncn-ctf-quals-2014
4.8M    ./qiwi-ctf-2014
2.2M    ./pwnium-ctf-2014
6.2M    ./hack-lu-ctf-2014
91M ./nuit-du-hack-ctf-qualifications
64K ./ructfe-2014
40M ./gpn-ctf-2014
11M ./defkthon-ctf
6.3M    ./def-con-ctf-qualifier-2014
23M ./asis-ctf-finals-2014
50M ./asis-ctf-quals-2014
666M    ./.git
464K    ./ructf-2014-quals
24K ./notsosecure-ctf-2014
180M    ./plaid-ctf-2014
740K    ./d-ctf-2014
11M ./hitcon-ctf-2014
28M ./ghost-in-the-shellcode-2015-teaser
468K    ./confidence-ds-ctf-teaser
644K    ./tinyctf-2014
5.1M    ./stripe-ctf3
17M ./ncn-ctf-2014
103M    ./csaw-ctf-2014
20M ./ectf-2014
8.1M    ./9447-ctf-2014
1.4G    .

The .git directory alone is 666M, and the checked-out files take up roughly another 767M, for 1.4G in total.

@captn3m0 But think of the scenario two years from now. The write-ups for a single year already take up half of the 1 GB, and we haven't even included all the CTFs, so we'd better create the new repo now instead of waiting for this one to hit 1 GB 😄
@captn3m0 I think the other suggestions are worth implementing too, like the scoreboard. What do you say? 👍

Everyone, feel free to comment with suggestions to make this repo better.

Oh, I'm with you on this repo size being too large. It definitely can't be sustained for very long. Another way of keeping it down would be to put binaries and dependent files elsewhere (such as the GitHub Releases section). However, that doesn't sound as good a solution as what @dhanvi suggested (splitting off a repo for each year).

I'm not sure about the scoreboard, though. ctftime already has scoreboards from various CTFs. Any reason to duplicate that?

Putting the binaries somewhere else is a good idea, but organising those binaries would become a very difficult task 👎. I suggested splitting by year because the new year is starting 😄, so this is a good time to start something new. It would also let us go back and add older CTFs (ones held before 2014) without any problem, and we wouldn't need to explicitly append the year to each CTF name.

The scoreboard is just there to show who excelled at the CTF, nothing in particular.

Then what about organising the challenges by category, like shell-storm does? Isn't that worth implementing? And what about including the challenge score in the challenge name, as shell-storm does? That's also a good way to do things, right?

It seems like everyone is busy with their holidays, since no one is discussing this 😄

https://github.com/blog/1986-announcing-git-large-file-storage-lfs would elegantly solve this problem without having to rely on a third-party host.

The bad news:

Every user and organization on GitHub.com with Git LFS enabled will begin with 1 GB of free file storage and a monthly bandwidth quota of 1 GB. If your workflow requires higher quotas, you can easily purchase more storage and bandwidth for your account.

This is not enough for our purposes, and I’m not sure if paying is an option as the cost is still unknown. (We could try a donation drive kind of thing?)
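If we did go the LFS route, the migration itself would be simple. Roughly (just a sketch; the tracked patterns below are examples, not an inventory of what the repo actually contains):

$ git lfs install                      # one-time setup per machine
$ git lfs track "*.tar.gz" "*.zip"     # example patterns for challenge archives
$ git add .gitattributes               # the tracked patterns live in this file
$ git commit -m "Track challenge archives with Git LFS"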

I hope it is not enabled by default, because if it is, we'll have to think about (donation-based) support, especially given the monthly bandwidth quota of 1 GB.

We could use git hooks (I haven't worked with them yet) to automatically add a .gitignore entry for files larger than 10 MB, e.g. as mentioned here. I think the full-year repos will still be close to 1 GB each year, though, even if we ignore all files larger than 10 MB.
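A rough, untested sketch of such a pre-commit hook (the 10 MB limit matches the number above; it un-stages oversized files and ignores them instead):

#!/bin/sh
# .git/hooks/pre-commit (sketch): un-stage files larger than 10 MB
# and add them to .gitignore instead of committing them.
limit=$((10 * 1024 * 1024))
git diff --cached --name-only --diff-filter=AM | while read -r f; do
  [ -f "$f" ] || continue
  if [ "$(wc -c < "$f")" -gt "$limit" ]; then
    echo "$f" >> .gitignore
    git rm --cached --quiet -- "$f"
    git add .gitignore
    echo "pre-commit: ignoring $f (larger than 10 MB)" >&2
  fi
done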

That would be troublesome to implement. Most of our contributions come from web-based edits. Further, git hooks are completely opt-in, i.e. they have to be set up on a system manually after cloning.

We could do it by fetching each PR, pruning it of large files, and then merging it, but that would be far harder than the green button we have now.
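For the record, that manual flow would look roughly like this (the PR number 123 and the branch names are just placeholders):

$ git fetch origin pull/123/head:pr-123          # pull the PR into a local branch
$ git checkout pr-123
$ find . -path ./.git -prune -o -type f -size +10M -exec git rm --cached -- {} +
$ git commit -m "Drop files larger than 10 MB"
$ git checkout master
$ git merge pr-123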

Most of our contributions come from web-based edits.

@captn3m0 Yes, but _no_ contributions that actually add challenge files (i.e. tarballs etc.) are web-based edits. (At least, I haven’t seen any.)

Further, git-hooks are completely opt-in, ie they have to be setup on a system manually after cloning.

This is true. We could add a simple Bash script to the repo that sets up the hooks.
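Something along these lines would do (a sketch; it assumes we'd keep the versioned hooks in a hooks/ directory at the repo root):

#!/bin/sh
# setup-hooks.sh (sketch): link the repo's versioned hooks into .git/hooks
# so they actually run after cloning. Assumes hooks live in ./hooks.
set -e
repo_root=$(git rev-parse --show-toplevel)
for hook in "$repo_root"/hooks/*; do
  ln -sf "$hook" "$repo_root/.git/hooks/$(basename "$hook")"
done
echo "Git hooks installed."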

For each pack of 50 GB of storage + 50 GB of bandwidth per month, it’s $5/month.

https://github.com/pricing#add-ons

Well, we may not be able to use it yet: git clone can't be paused and resumed, so even if we use LFS, cloning the repo will still be a problem on a slow internet connection. Think of the case where your clone stops at 95%.

The raw (non-git-lfs) Git repo size would be significantly smaller, though, and git-lfs would let us host the larger files as close to the repo as possible (instead of manually uploading them to a third-party host and linking to them).