ritchieng/dlami

8G instead of 100G?

hunkim opened this issue · 8 comments

I really appreciate these AMIs. Wonderful!

One issue with them is the disk size: 100G. We usually run spot instances with an external EBS volume attached (to persist our data, checkpoints, etc.), so the root disk can be small. Would it be possible to make smaller AMIs, such as 8G or 10G?
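For context, the workflow described here (small root disk, persistent external EBS volume on a spot instance) might look roughly like the sketch below. The volume ID, device name, and mount point are placeholders, not values from this project:

```shell
# Attach a persistent EBS volume to the current instance.
# vol-0123456789abcdef0 and /dev/xvdf are placeholder values.
aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id "$(curl -s http://169.254.169.254/latest/meta-data/instance-id)" \
    --device /dev/xvdf

# Format only on first use, then mount for data and checkpoints.
sudo mkfs -t ext4 /dev/xvdf   # skip this if the volume already holds data
sudo mkdir -p /data
sudo mount /dev/xvdf /data
```

Because the volume outlives the spot instance, data and checkpoints under the mount point survive instance termination.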

I really appreciate it.

I'll look into this :) Seems like there are some people who want it too.

Sounds good. Look forward to small AMIs!

I added your project to the top deep learning project list. https://github.com/hunkim/DeepLearningStars. Check it out. If you know other good deep learning projects, let me know.

@hunkim Thanks for adding! The only other deep learning project I know that is gaining traction is one I briefly worked on previously: https://github.com/zsdonghao/tensorlayer (1000+ stars).

It is a good alternative to Keras: Keras has limitations when you want the flexibility of TensorFlow with an easier API. I used it for reinforcement learning in particular.

I thought about this again; 8G seems too small, but 20~30G (mid-size) AMIs would be appreciated.

Ok, I'll keep that in mind for the next revision.

This would be released in the upcoming TFAMI.v4: #10

@hunkim
How nice to see you here, Professor Kim!!
I'm a university student in Korea and also a really big fan of yours.
I've been studying deep learning for about 2 months now,
and I've been able to learn and understand a lot thanks to your great effort!
쒋은 κ°•μ˜μ™€ 자료 λ§Œλ“€μ–΄ μ£Όμ…”μ„œ 정말 κ°μ‚¬λ“œλ¦½λ‹ˆλ‹€!
λ”₯λŸ¬λ‹μ„ κ³΅λΆ€ν•˜λŠ”λ° 정말 λ§Žμ€ 도움이 λ˜μ—ˆμŠ΅λ‹ˆλ‹€.
이번 가을에 홍콩에 갈 κΈ°νšŒκ°€ μƒκ²ΌλŠ”λ°
μš°μ—°νžˆ μ§€λ‚˜κ°€λ‹€κ°€ ν˜Ήμ‹œ λ΅™κ²Œ 되면 κΌ­ μΈμ‚¬λ“œλ¦¬κ³  μ‹ΆμŠ΅λ‹ˆλ‹€!
λ‹€μ‹œ ν•œλ²ˆ κ°μ‚¬λ“œλ¦½λ‹ˆλ‹€ λŠ¦μ—ˆμ§€λ§Œ μƒˆν•΄ 볡 많이 λ°›μœΌμ„Έμš”!!