sherpa-ai/sherpa

Example using tensorflow API

vinodrajendran001 opened this issue · 5 comments

I have been trying to use Sherpa's Bayesian optimization with the low-level TensorFlow API, but without success so far.

It would be helpful if you could update your README with an example that uses the low-level TensorFlow API.

Hi @vinodrajendran001,
Just to be sure we are on the same page, would you mind adding the code you have been using?
Thanks,
Lars

I am afraid I cannot share my actual code, but I can provide a simplified version of it. If Sherpa can be applied to that, then I can easily adapt it to my own code.

import tensorflow as tf
import numpy as np

# fake data
x = np.linspace(-1, 1, 100)[:, np.newaxis]          # shape (100, 1)
noise = np.random.normal(0, 0.1, size=x.shape)
y = np.power(x, 2) + noise                          # shape (100, 1) + some noise

tf_x = tf.placeholder(tf.float32, x.shape)     # input x
tf_y = tf.placeholder(tf.float32, y.shape)     # input y

# neural network layers
l1 = tf.layers.dense(tf_x, 10, tf.nn.relu)          # hidden layer
output = tf.layers.dense(l1, 1)                     # output layer

loss = tf.losses.mean_squared_error(tf_y, output)   # compute cost
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.5)
train_op = optimizer.minimize(loss)

sess = tf.Session()                                 # control training and others
sess.run(tf.global_variables_initializer())         # initialize var in graph
for step in range(100):
    # train and net output
    _, l, pred = sess.run([train_op, loss, output], {tf_x: x, tf_y: y})

The technical challenge in applying it to the low-level TF API is finding the equivalent of the Keras callback. It would be helpful if you could replicate the Keras hyperparameter-tuning example using the code snippet above.
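For context, the Keras pattern from the README that I am trying to mirror looks roughly like this (reconstructed from memory, so the model-building helper and the data variables are placeholders):

import sherpa
from sherpa.algorithms import bayesian_optimization

parameters = [sherpa.Continuous('learning_rate', [1e-4, 1e-2], 'log')]
algorithm = bayesian_optimization.GPyOpt(max_num_trials=50)
study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=True)

for trial in study:
    model = build_model(trial.parameters)          # placeholder: returns a compiled Keras model
    # the Keras callback reports val_loss back to Sherpa after every epoch
    callback = study.keras_callback(trial, objective_name='val_loss')
    model.fit(x_train, y_train, epochs=10,
              validation_data=(x_val, y_val),
              callbacks=[callback])
    study.finalize(trial)

In a plain TF training loop there is no fit()/callback hook, which is why I am not sure where the reporting should happen.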

@LarsHH Any updates?

Hi @vinodrajendran001,

How about something like this:

import sherpa
from sherpa.algorithms import bayesian_optimization
import tensorflow as tf
import numpy as np

# fake data
x = np.linspace(-1, 1, 100)[:, np.newaxis]          # shape (100, 1)
noise = np.random.normal(0, 0.1, size=x.shape)
y = np.power(x, 2) + noise                          # shape (100, 1) + some noise

parameters = [sherpa.Discrete('num_layers', [1, 5]),
              sherpa.Continuous('learning_rate', [5e-2, 5e-1], 'log')]
algorithm = bayesian_optimization.GPyOpt(max_num_trials=50)
study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=True)       # we are minimizing the loss

for trial in study:
    tf.reset_default_graph()                     # start each trial with a fresh graph
    lr = trial.parameters['learning_rate']
    num_layers = trial.parameters['num_layers']

    tf_x = tf.placeholder(tf.float32, x.shape)     # input x
    tf_y = tf.placeholder(tf.float32, y.shape)     # input y

    h = tf_x

    # neural network layers
    for _ in range(num_layers):
        h = tf.layers.dense(h, 10, tf.nn.relu)          # hidden layer
    output = tf.layers.dense(h, 1)                     # output layer

    loss = tf.losses.mean_squared_error(tf_y, output)   # compute cost
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=lr)   # use the sampled learning rate
    train_op = optimizer.minimize(loss)

    sess = tf.Session()                                 # control training and others
    sess.run(tf.global_variables_initializer())         # initialize var in graph
    for step in range(100):
        # train and net output
        _, l, pred = sess.run([train_op, loss, output], {tf_x: x, tf_y: y})
        study.add_observation(trial, objective=l, iteration=step)
    sess.close()
    study.finalize(trial)

Would that work? I just made up the tuning parameters for now.
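As an aside, once the loop has run through all trials you can inspect what it found. A small sketch, assuming the usual Sherpa accessors (`study.get_best_result()` returning the winning parameters and objective, and `study.results` being a pandas DataFrame of all observations):

# after the `for trial in study:` loop finishes
best = study.get_best_result()   # dict with the best trial's parameters and objective
print(best)
print(study.results.head())      # full log of all trials/iterations as a DataFrame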

Yes, it works. Please feel free to close the issue.

As a side note, it would be helpful if you could enable the dashboard on Windows.