ai-tech-interview

👩‍💻👨‍💻 A technical interview study for AI engineers

MIT License

📌 The answers will be revised and published as a website!


👉 Discussions

The study format, study logs, and Q&A are all written in Discussions!


Table of Contents


Part 1. Data Science

📈 Statistics/Math

  • Explain eigenvalues and eigenvectors. And why are they important?
  • Explain sampling and resampling. What advantages does resampling have?
  • What are a probability model and a random variable?
  • What are the cumulative distribution function and the probability density function? Express them with formulas.
  • What is conditional probability?
  • What are covariance and the correlation coefficient? Express them with formulas.
  • What is the definition of a confidence interval?
  • How would you explain a p-value to someone who does not know what it is?
  • What does R-squared mean?
  • In which cases should you use the mean, and in which the median?
  • Why is the central limit theorem useful?
  • Explain entropy. If possible, explain Information Gain as well.
  • When can you use parametric methods, and when can you use non-parametric methods?
  • What is the difference between "likelihood" and "probability"?
  • What does bootstrap mean in statistics?
  • How could you build a predictive model when you have very little data (a few dozen points or fewer)?
  • Can you explain the difference in viewpoint between Bayesians and frequentists?
  • What is statistical power?
  • If there are missing values, should you fill them in? Why or why not?
  • What criteria do you use to identify outliers?
  • How do you calculate the required sample size?
  • How can you control bias?
  • In which cases is the log function useful? Explain with an example.
  • Explain the Bernoulli, binomial, categorical, multinomial, Gaussian (normal), t, chi-squared, F, beta, and gamma distributions. Also explain how these distributions are related to one another.
  • You are about to fly for a business trip and want to know whether to bring an umbrella, so you randomly call three friends who live at your destination and independently ask each one whether it is raining. Each friend tells the truth with probability 2/3 and lies with probability 1/3. All three friends said, "Yes, it is raining." What is the probability that it is actually raining? (A worked sketch follows.)
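
For the last question above, a worked sketch under an explicit assumption: the question does not state the prior probability of rain, so call it p. Each friend answers independently, saying "yes" with probability 2/3 if it is raining and with probability 1/3 (a lie) if it is not, so by Bayes' theorem:

$$
P(\text{rain} \mid \text{three "yes"})
  = \frac{(2/3)^3\, p}{(2/3)^3\, p + (1/3)^3\, (1-p)}
  = \frac{8p}{8p + (1-p)}
$$

With an uninformative prior of p = 1/2 this evaluates to 8/9; with a different prior the answer shifts accordingly.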

Back to Table of Contents


🤖 Machine Learning

  • Explain the metrics you know. (ex. RMSE, MAE, recall, precision ...)
  • Why do we need normalization? What normalization methods are there?
  • Explain local minima and global minima.
  • Explain the curse of dimensionality.
  • What dimension reduction techniques are commonly used?
  • PCA is a dimensionality reduction technique, a data compression technique, and also a noise removal technique. Can you explain why?
  • Can you explain what abbreviations such as LSA, LDA, and SVD stand for and how they relate to one another?
  • What would be the best way to explain a Markov Chain to a high school student?
  • You need to extract topics from a pile of text. How would you approach it?
  • Why does SVM work the opposite way, by expanding the dimensionality? Why is SVM good?
  • Make a case for the advantages of naive Bayes, an old technique, compared to other good machine learning methods.
  • What are appropriate metrics for regression and for classification?
  • Explain Support, Confidence, and Lift in Association Rules.
  • Among optimization techniques, do you know Newton's Method and Gradient Descent?
  • Do you have a view on the difference between the machine learning approach and the statistical approach?
  • What general problems do (traditional, pre-deep-learning) artificial neural networks have?
  • What do you think is the foundation of the current wave of deep learning innovation?
  • Can you explain the ROC curve?
  • You have 100 servers. Why should you use Random Forest rather than a neural network in this case?
  • What is the main semantic drawback of K-means? (other than the heavy computation)
  • Explain L1 and L2 regularization.
  • What is Cross Validation and how should it be done?
  • Do you know XGBoost? Why is this model so famous on Kaggle?
  • What ensemble methods are there?
  • What is a feature vector?
  • What is the definition of a good model?
  • Are 50 small decision trees better than one large decision tree? Why do you think so?
  • Why is logistic regression often used for spam filters?
  • What is the formula for OLS (ordinary least squares) regression? (A sketch follows.)
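
For the OLS question above, a sketch of the usual matrix form (one of several equivalent ways to write it): with design matrix X and target vector y, ordinary least squares picks the coefficients that minimize the squared error and, assuming XᵀX is invertible, has the closed-form solution

$$
\hat{\beta} = \arg\min_{\beta}\,\lVert y - X\beta \rVert^2 = (X^{\top}X)^{-1}X^{\top}y
$$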

Back to Table of Contents


🧠 Deep Learning

  • What is deep learning? What is the difference between deep learning and machine learning?
  • What are a Cost Function and an Activation Function?
  • What are the characteristics of Tensorflow and PyTorch, and how do they differ?
  • What is Data Normalization and why is it needed?
  • Tell us about the Activation Functions you know. (Sigmoid, ReLU, LeakyReLU, Tanh, etc.)
  • How should you deal with overfitting?
  • What are hyperparameters?
  • Describe Weight Initialization methods. Which ones are used most often?
  • What is a Boltzmann machine?
  • What is your debugging know-how when using TF, PyTorch, etc.?
  • What is the biggest drawback of neural nets? What is One-Shot Learning, which was proposed to address it?
  • Why is ReLU used more often than Sigmoid these days?
    • What does Non-Linearity mean, and why is it necessary?
    • How does ReLU approximate a curved function?
    • What are the problems with ReLU?
    • Why is there a Bias term?
  • How would you explain Gradient Descent in simple terms?
    • Why do we have to use the Gradient? In that graph, what do the horizontal and vertical axes represent? How would the graph look in a real situation?
    • Why does the Loss sometimes increase during GD?
    • How would you explain Back Propagation in simple terms?
  • Why does deep learning work well despite the Local Minima problem?
    • How does GD avoid the Local Minima problem?
    • How can you tell whether the solution you found is the Global Minimum?
  • Why do we separate the Training set and the Test set?
    • Why is there a separate Validation set?
    • What does it mean that the Test set has been contaminated?
    • What is Regularization?
  • What are the effects of Batch Normalization?
    • What are the effects of Dropout?
    • After training with BN, what should you watch out for when the model is actually used? How does that look in code?
    • In a GAN, is it okay to apply BN to the Generator as well?
  • Explain SGD, RMSprop, and Adam as best you can.
    • What does Stochastic mean in SGD?
    • What are the pros and cons of using a small mini-batch?
    • Can you write down the momentum formula? (A sketch follows this list.)
  • If you wrote a simple MNIST classifier as an MLP+CPU version in numpy, how many lines would it be?
    • About how many hours would it take to write something that more or less works?
    • How many lines is Back Propagation?
    • If you switched to a CNN, how much code would be added?
  • How many hours would you need to write a simple MNIST classifier in TF, PyTorch, etc.?
    • Would it still work well with an MLP instead of a CNN?
    • Can you explain the last layer?
    • What if you train with BCE loss but also want to monitor the situation with MSE loss?
  • Why is it good to use a GPU for deep learning?
    • You want to use both of your two GPUs. How?
    • How do you calculate the GPU memory needed for training?
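
For the momentum question above, one common way to write the update (a sketch; sign conventions and where the learning rate enters vary between textbooks and frameworks): with learning rate η, momentum coefficient γ, and loss L,

$$
v_t = \gamma\, v_{t-1} + \eta\, \nabla_{\theta} L(\theta_{t-1}), \qquad \theta_t = \theta_{t-1} - v_t
$$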

Back to Table of Contents


Part 2. Language

🐍 Python

  • What is the difference between list and tuples in Python?
  • What are the key features of Python?
  • What type of language is python? Programming or scripting?
  • Python is an interpreted language. Explain.
  • What is PEP 8?
  • How is memory managed in Python?
  • What is namespace in Python?
  • What is PYTHONPATH?
  • What are python modules? Name some commonly used built-in modules in Python?
  • What are local variables and global variables in Python?
  • Is python case sensitive?
  • What is type conversion in Python?
  • How to install Python on Windows and set path variable?
  • Is indentation required in python?
  • What is the difference between Python Arrays and lists?
  • What are functions in Python?
  • What is __init__?
  • What is a lambda function?
  • What is self in Python?
  • How does break, continue and pass work?
  • What does [::-1] do?
  • How can you randomize the items of a list in place in Python?
  • What's the difference between iterator and iterable?
  • How can you generate random numbers in Python?
  • What is the difference between range & xrange?
  • How do you write comments in python?
  • What is pickling and unpickling?
  • What are the generators in python?
  • How will you capitalize the first letter of string?
  • How will you convert a string to all lowercase?
  • How to comment multiple lines in python?
  • What are docstrings in Python?
  • What is the purpose of is, not and in operators?
  • What is the usage of help() and dir() function in Python?
  • Whenever Python exits, why isn't all the memory de-allocated?
  • What is a dictionary in Python?
  • How can the ternary operators be used in python?
  • What does this mean: *args, **kwargs? And why would we use it?
  • What does len() do?
  • Explain split(), sub(), subn() methods of the "re" module in Python.
  • What are negative indexes and why are they used?
  • What are Python packages?
  • How can files be deleted in Python?
  • What are the built-in types of python?
  • What advantages do NumPy arrays offer over (nested) Python lists?
  • How to add values to a python array?
  • How to remove values from a python array?
  • Does Python have OOP concepts?
  • What is the difference between deep and shallow copy?
  • How is Multithreading achieved in Python?
  • What is the process of compilation and linking in python?
  • What are Python libraries? Name a few of them.
  • What is split used for?
  • How to import modules in python?
  • Explain Inheritance in Python with an example.
  • How are classes created in Python?
  • What is monkey patching in Python?
  • Does python support multiple inheritance?
  • What is Polymorphism in Python?
  • Define encapsulation in Python?
  • How do you do data abstraction in Python?
  • Does python make use of access specifiers?
  • How to create an empty class in Python?
  • What does object() do?
  • What is map function in Python?
  • Is python numpy better than lists?
  • What is GIL in Python language?
  • What makes the CPython different from Python?
  • What are Decorators in Python?
  • What is object interning?
  • What is @classmethod, @staticmethod, @property?
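
A small runnable sketch touching a few of the questions above (a decorator, *args/**kwargs, [::-1], and a lambda); the names log_call and add are illustrative only:

```python
from functools import wraps

def log_call(func):
    """A minimal decorator: wraps a function and prints each call."""
    @wraps(func)
    def wrapper(*args, **kwargs):        # *args/**kwargs forward any positional/keyword arguments
        print(f"calling {func.__name__} with {args} {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(a, b=0):
    return a + b

print(add(1, b=2))        # -> calling add with (1,) {'b': 2}, then 3
print([1, 2, 3][::-1])    # [::-1] returns a reversed copy -> [3, 2, 1]
square = lambda x: x * x  # a lambda is a small anonymous function
print(square(4))          # -> 16
```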

Back to Table of Contents


Part 3. CS

🌐 Network

  • Explain each layer of TCP/IP.
  • Explain the difference between the OSI 7-layer model and the TCP/IP model.
  • Compare Frame, Packet, Segment, and Datagram.
  • Explain the difference between TCP and UDP.
  • Compare the TCP and UDP headers.
  • Compare and explain TCP's 3-way handshake and 4-way handshake.
  • Why do TCP's connection setup (3 steps) and connection termination (4 steps) differ in the number of steps?
  • What happens if a packet the server sent before the FIN flag arrives later than the FIN packet, for example because of routing delays or retransmission after packet loss?
  • Why is the initial Sequence Number (ISN) set to a randomly generated value instead of starting from 0?
  • Explain HTTP and HTTPS and describe their differences.
  • Explain the structure of HTTP request/response headers.
  • Compare how HTTP and HTTPS operate.
  • What is CORS?
  • Compare and explain the HTTP GET and POST methods.
  • Explain cookies (Cookie) and sessions (Session).
  • What is DNS?
  • Explain the concepts of REST and RESTful and state the difference.
  • What is a socket? Show a simple example of creating a socket in a language you are comfortable with. (A sketch in Python follows this list.)
  • Explain the difference between Socket.io and WebSocket.
  • Explain the difference between IPv4 and IPv6.
  • What is a MAC Address?
  • Explain the differences between a router, a switch, and a hub.
  • What is SMTP?
  • You connect to www.google.com from your laptop. Explain in detail the process from sending the request to receiving the response.
  • Briefly introduce several network topologies.
  • Explain the subnet mask.
  • What is data encapsulation?
  • Explain DHCP.
  • Explain a few routing protocols. (ex. link state, distance vector)
  • What is ethernet?
  • Explain the difference between a client and a server.
  • Explain the differences between delay, timing (jitter), and throughput.
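
For the socket question above, a minimal sketch in Python using the standard-library socket module (the host and port are illustrative only):

```python
import socket

# Create a TCP/IPv4 socket, connect, send a plain HTTP request, and read part of the reply.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.connect(("example.com", 80))  # illustrative host/port
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = sock.recv(4096)         # read up to 4 KiB of the response
    print(response.decode(errors="replace"))
```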

Back to Table of Contents


🖥️ Operating System

  • Explain the difference between a process and a thread (Process vs Thread).
  • Explain why multithreading is used instead of multiprocessing.
  • Explain cache locality.
  • Explain thread safety. (hint: critical section; a sketch follows this list)
  • Explain the difference between a mutex and a semaphore.
  • What is a scheduler, and on what basis is it divided into short-term/mid-term/long-term schedulers?
  • Briefly explain the CPU scheduling algorithms FCFS, SJF, SRTF, Priority Scheduling, and RR.
  • Explain the difference between synchronous and asynchronous.
  • Briefly explain the memory management strategies that exist.
  • Explain virtual memory.
  • Explain the concept of deadlock and the conditions for it.
  • Explain the difference between user-level threads and kernel-level threads.
  • Explain external fragmentation and internal fragmentation.
  • Explain what Context Switching is and list its steps.
  • Explain Swapping.
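
For the thread-safety question above, a minimal Python sketch of a critical section protected by a mutex (threading.Lock); the counter and the number of threads are illustrative:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:  # critical section: only one thread updates counter at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # with the lock this is reliably 400000
```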

Back to Table of Contents


🗂 Data Structure

  • linked list
    • single linked list
    • double linked list
    • circular linked list
  • hash table
  • stack
  • queue
    • circular queue
  • graph
  • tree
    • binary tree
    • full binary tree
    • complete binary tree
    • bst(binary search tree)
  • heap(binary heap)
    • min heap
    • max heap
  • red-black tree
  • b+ tree
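
As a small illustration of the heap entries above, Python's standard-library heapq implements a binary min heap; a max heap is commonly simulated by negating the keys (a sketch):

```python
import heapq

data = [5, 1, 9, 3]

min_heap = data[:]
heapq.heapify(min_heap)            # smallest element sits at index 0
print(heapq.heappop(min_heap))     # -> 1

max_heap = [-x for x in data]      # negate keys to get max-heap behaviour
heapq.heapify(max_heap)
print(-heapq.heappop(max_heap))    # -> 9
```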

Back to Table of Contents


🔻 Algorithm

  • Time and space complexity
  • Sort Algorithm
    • Bubble Sort
    • Selection Sort
    • Insertion Sort
    • Merge Sort
    • Heap Sort
    • Quick Sort
    • Counting Sort
    • Radix Sort
  • Divide and Conquer
  • Dynamic Programming
  • Greedy Algorithm
  • Graph
    • Graph Traversal: BFS, DFS
    • Shortest Path
      • Dijkstra
      • Floyd-Warshall
      • Bellman-Ford
    • Minimum Spanning Tree
      • Prim
      • Kruskal
    • Union-find
    • Topological sort
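
As a small illustration of the graph traversal entries above, a breadth-first search over an adjacency list (the example graph is made up):

```python
from collections import deque

def bfs(graph: dict, start):
    """Breadth-first search: visit vertices in order of distance from start."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}  # illustrative graph
print(bfs(graph, "A"))  # -> ['A', 'B', 'C', 'D']
```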

Back to Table of Contents


References