
🌱 CNN, RNN, AE, GAN, UNET


keras-in-DeepLearning

ํŒŒ์ด์ฌ ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ KERAS

  • Available backend engines include TensorFlow, Theano, and CNTK
  • Keras runs deep learning algorithms by calling one of these backend engines
  • Keras also provides common backend functions that are not tied to any specific engine

CODE

  • models connects the layers of a neural network into a single model and handles compiling, training, and prediction
  • layers provides the classes for building each layer of the network
# ์ผ€๋ผ์Šค๋กœ ์ธ๊ณต ์‹ ๊ฒฝ๋ง ๋ชจ๋ธ ๋งŒ๋“ฆ. models.Sequential()์„ ์‚ฌ์šฉํ•˜์—ฌ ํŒŒ์ด์ฌ ํ”„๋กœ์„ธ์Šค์—๊ฒŒ ์•Œ๋ฆผ
## models.Sequential์€ ํŒŒ์ด์ฌ์˜ ํด๋ž˜์Šค
## model์ด๋ผ๋Š” ์ธ์Šคํ„ด์Šค ๋งŒ๋“ฆ

model = keras.models.Sequential()

# ๋ชจ๋ธ ์ธ์Šคํ„ด์Šค๊ฐ€ ์ƒ์„ฑ๋˜๋ฉด ๋ฉค๋ฒ„ ํ•จ์ˆ˜ add()๋ฅผ ์ด์šฉํ•˜์—ฌ ์ธ๊ณต์ง€๋Šฅ ๊ณ„์ธต ์ถ”๊ฐ€
model.add(keras.layers.Dense(1, input_shape =(1,))

# Optimizer used for training -> stochastic gradient descent; loss function -> mean squared error
model.compile('SGD', 'mse')

# Train the model on the given data
## verbose controls whether training progress is displayed
model.fit(x[:2], y[:2], epochs=1000, verbose=0)

# Evaluate performance
print("Targets:", y[2:])
print("Predictions:", model.predict(x[2:]).flatten())
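Put together, the snippet above can be run end to end. The training data below is an assumption for illustration (the original does not define `x` and `y`), as is the `tensorflow.keras` import path:

```python
import numpy as np
from tensorflow import keras

# Synthetic data following y = 2x + 1 (illustrative; not from the original text)
x = np.array([0, 1, 2, 3, 4], dtype=float)
y = 2 * x + 1

# One Dense unit on a one-dimensional input: a simple linear model
model = keras.models.Sequential()
model.add(keras.layers.Dense(1, input_shape=(1,)))
model.compile('SGD', 'mse')  # stochastic gradient descent, mean squared error

# Fit on the first two points, then predict the remaining three
model.fit(x[:2], y[:2], epochs=1000, verbose=0)
print("Targets:", y[2:])
print("Predictions:", model.predict(x[2:], verbose=0).flatten())
```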

๊ฐ์ฒด์ง€ํ–ฅํ˜• ๊ตฌํ˜„

  • Dispersed modeling (layers declared separately, then connected)
class ANN(models.Model):
    def __init__(self, Nin, Nh, Nout):
        # Prepare network layers and activation functions
        hidden = layers.Dense(Nh)
        output = layers.Dense(Nout)
        relu = layers.Activation('relu')
        softmax = layers.Activation('softmax')

        # Connect network elements
        x = layers.Input(shape=(Nin,))
        h = relu(hidden(x))
        y = softmax(output(h))

        super().__init__(x, y)
        self.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
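A usage sketch for this class-based model, restated here self-contained. The dimensions, the random one-hot data, and the `tensorflow.keras` import path are all assumptions for illustration:

```python
import numpy as np
from tensorflow.keras import layers, models

class ANN(models.Model):
    def __init__(self, Nin, Nh, Nout):
        # Connect input -> hidden (relu) -> output (softmax)
        x = layers.Input(shape=(Nin,))
        h = layers.Activation('relu')(layers.Dense(Nh)(x))
        y = layers.Activation('softmax')(layers.Dense(Nout)(x if False else h))
        super().__init__(x, y)
        self.compile(loss='categorical_crossentropy',
                     optimizer='adam', metrics=['accuracy'])

# Dummy classification problem: 4 features, 3 classes (illustrative values)
Nin, Nh, Nout = 4, 8, 3
model = ANN(Nin, Nh, Nout)
X = np.random.rand(32, Nin)
Y = np.eye(Nout)[np.random.randint(0, Nout, 32)]  # one-hot labels
model.fit(X, Y, epochs=5, verbose=0)
loss, acc = model.evaluate(X, Y, verbose=0)
print('loss: %.3f, accuracy: %.3f' % (loss, acc))
```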
  • Chained modeling (layers added one after another)
class ANN(models.Sequential):
    def __init__(self, Nin, Nh, Nout):
        super().__init__()
        self.add(layers.Dense(Nh, activation='relu', input_shape=(Nin,)))
        self.add(layers.Dense(Nout, activation='softmax'))
        self.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

Functional implementation

  • Dispersed modeling
def ANN(Nin, Nh, Nout):
    x = layers.Input(shape=(Nin,))
    h = layers.Activation('relu')(layers.Dense(Nh)(x))
    y = layers.Activation('softmax')(layers.Dense(Nout)(h))
    model = models.Model(x, y)
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model
  • Chained modeling
def ANN(Nin, Nh, Nout):
    model = models.Sequential()
    model.add(layers.Dense(Nh, activation='relu', input_shape=(Nin,)))
    model.add(layers.Dense(Nout, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model
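Either functional variant returns an already-compiled model. A usage sketch with `to_categorical` for the one-hot labels (the dummy data, dimensions, and the `tensorflow.keras` import path are assumptions for illustration):

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.utils import to_categorical

def ANN(Nin, Nh, Nout):
    # Chained modeling: stack layers in order on a Sequential model
    model = models.Sequential()
    model.add(layers.Dense(Nh, activation='relu', input_shape=(Nin,)))
    model.add(layers.Dense(Nout, activation='softmax'))
    model.compile(loss='categorical_crossentropy',
                  optimizer='adam', metrics=['accuracy'])
    return model

# Dummy data: 100 samples, 4 features, 3 classes (illustrative values)
Nin, Nh, Nout = 4, 8, 3
model = ANN(Nin, Nh, Nout)
X = np.random.rand(100, Nin)
Y = to_categorical(np.random.randint(0, Nout, 100), Nout)
model.fit(X, Y, epochs=5, batch_size=16, verbose=0)
print(model.predict(X[:2], verbose=0).shape)  # (2, 3): softmax scores per class
```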