This is the repository for CS520 Project 1. It contains several algorithms for solving Thomson-type problems: projected gradient descent (PGD), Nesterov-accelerated PGD, the penalty method, the augmented Lagrangian method, and stochastic gradient descent (SGD).
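As a concrete illustration of the penalty approach, here is a minimal sketch (not the repo's implementation). It assumes the constraint is that every point lies on the unit sphere, folds that constraint into a quadratic penalty with weight `mu`, and minimizes the result with SciPy's BFGS; the penalty form and all parameters are illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize

def energy(x, n, k):
    """Thomson-type energy: sum of inverse pairwise distances."""
    X = x.reshape(n, k)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(n, 1)
    return np.sum(1.0 / d[iu])

def penalized(x, n, k, mu):
    """Energy plus quadratic penalty mu * sum_i (||x_i||^2 - 1)^2
    pulling each point toward the (assumed) unit-sphere constraint."""
    X = x.reshape(n, k)
    r = np.sum(X**2, axis=1) - 1.0
    return energy(x, n, k) + mu * np.sum(r**2)

n, k, mu = 6, 3, 100.0
rng = np.random.default_rng(0)
x0 = rng.standard_normal((n, k))
x0 /= np.linalg.norm(x0, axis=1, keepdims=True)  # feasible start on the sphere
res = minimize(penalized, x0.ravel(), args=(n, k, mu), method="BFGS")
X = res.x.reshape(n, k)
```

With a large `mu`, the unconstrained minimizer violates the sphere constraint only slightly; driving `mu` upward across outer iterations recovers the classical penalty scheme.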
The Nelder-Mead algorithm and projected gradient descent have been added.
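The repo's own Nelder-Mead code is not reproduced here; the following sketch instead uses SciPy's `Nelder-Mead` on the energy of radially normalized points, a common derivative-free way to handle the (assumed) sphere constraint. Problem size and solver options are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def sphere_energy(x, n, k):
    """Energy of the points obtained by radially projecting each row
    onto the unit sphere, so the unconstrained optimizer stays feasible."""
    X = x.reshape(n, k)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(n, 1)
    return np.sum(1.0 / d[iu])

n, k = 4, 3  # optimum for n = 4 on the sphere is the regular tetrahedron
rng = np.random.default_rng(1)
x0 = rng.standard_normal(n * k)
res = minimize(sphere_energy, x0, args=(n, k), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-10})
```

Nelder-Mead needs no gradients, which makes it a convenient baseline, but it scales poorly with dimension, so it is best suited to small instances like this one.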
We used the solver in the cvxpy package to solve eq. (5) and (*) in the report.
We explored the projected gradient descent algorithm and found that it attained the theoretical infimum on a wide range of problem instances, while also returning the coordinates of the minimizing configuration. The energy function is implemented as follows:
```python
import torch

def f(X):
    """Thomson-type energy: sum of inverse pairwise distances."""
    n = X.shape[0]
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dist = torch.norm(X[i, :] - X[j, :])
            energy = energy + 1 / dist
    return energy
```
```python
# Place n = 30 points in R^3 and run 50 iterations of PGD.
k, n = 3, 30
X0 = torch.randn(n, k)
learner_PGD = PGD(f, X0)

T = 50
for t in range(1, T + 1):
    learner_PGD.train(t)
learner_PGD.result()
```
In each iteration, the learner returns the current value of the energy function f(X), and the final result is printed at the end.
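Assuming the feasible set is the product of unit spheres (so the projection renormalizes each row), one PGD iteration can be sketched in NumPy as follows. `energy_grad`, the step size, and the iteration count are illustrative stand-ins, not the repo's `PGD` class:

```python
import numpy as np

def energy(X):
    """sum_{i<j} 1/||x_i - x_j||"""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(X.shape[0], 1)
    return np.sum(1.0 / d[iu])

def energy_grad(X):
    """Gradient of the energy w.r.t. each row:
    grad_i = -sum_{j != i} (x_i - x_j) / ||x_i - x_j||^3."""
    diff = X[:, None, :] - X[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)  # zero out the i == j terms
    return -np.sum(diff / dist[:, :, None] ** 3, axis=1)

def pgd_step(X, lr):
    """Gradient step, then Euclidean projection of each (nonzero) row
    back onto the unit sphere."""
    X = X - lr * energy_grad(X)
    return X / np.linalg.norm(X, axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
e0 = energy(X)
for _ in range(200):
    X = pgd_step(X, lr=1e-3)
```

The projection here is cheap because the sphere constraint decouples per point; that is what makes PGD attractive for this problem class.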
[Table of result images for each problem instance]