YoojLee/Uniformaly

Weird anomaly scores


Hello,
I trained the model on the ImageNet-30 dataset and ran inference on a few images. However, the anomaly scores come out as strange numbers, something like 84.31, 84.32, etc., for both anomalous and non-anomalous images. The inference script I am using is attached below. Am I doing something wrong?

import os

from PIL import Image
from tqdm import tqdm

import uniformaly.common
from uniformaly.uniformaly import Uniformaly
from datasets.transforms import ImageTransform

device = "cuda"
backbone_name = "dino_vit_base_8"
layers_to_extract_from = ["2", "3", "4", "5", "6", "7", "8", "9"]
params = {"backbone": backbone_name, "layers_to_extract_from": layers_to_extract_from}
model = Uniformaly(device, params)

# Load the checkpoint trained on the ImageNet-30 "acorn" class.
model_path = "dino_vit_base_8/models/imagenet_acorn"
model.load_from_path(
    load_path=model_path,
    device="cuda",
    local_nn_method=uniformaly.common.FaissNN(False, 4),
    topk=0.05,
    thres=0.1,
)

# Collect test images from the one-class test split.
test_dir = "imagenet30_subset/one_class_test/acorn/n12267677"
test_images = [os.path.join(test_dir, i) for i in os.listdir(test_dir)]

# Preprocess with the resize / crop sizes used elsewhere in the repo.
resize = 256
imagesize = 224
transform_img = ImageTransform(resize, imagesize)

prediction_scores = []
for img in tqdm(test_images, total=len(test_images)):
    preprocessed_image = transform_img(Image.open(img))
    predictions = model.predict(preprocessed_image.unsqueeze(0))
    anomaly_score = predictions[0][0]  # image-level anomaly score
    prediction_scores.append(anomaly_score)

Thanks

Hello, thank you for your interest in our paper.

How many images were used for evaluation during the test? Also, are the values 84.31 and 84.32 anomaly scores, or AUROC values?

@sy00n These are the anomaly scores I get when running inference on a few images.

I cannot give a definitive answer since I cannot see the specific images, but the anomalous images might be confusing (hard) examples. Also, since the anomaly score is a distance measure, its absolute magnitude carries little meaning on its own; small differences like these can still be informative, because what matters is whether anomalous images consistently receive higher scores than normal ones. One quick way to check this is sketched below.
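As a sanity check, you can look at the ranking of the scores rather than their raw magnitudes, e.g., via min-max normalization or AUROC. Here is a minimal sketch; the placeholder scores and labels are purely illustrative and not part of the Uniformaly API, so substitute your own prediction_scores and ground-truth labels.

import numpy as np
from sklearn.metrics import roc_auc_score

# Placeholder values standing in for the `prediction_scores` collected
# in the script above; replace with your actual scores and labels.
scores = np.array([84.31, 84.32, 84.35, 84.30])
labels = np.array([0, 1, 1, 0])  # 0 = normal, 1 = anomalous

# Min-max normalization removes the shared offset (~84) and keeps only
# the relative ordering of the scores.
normalized = (scores - scores.min()) / (scores.max() - scores.min() + 1e-12)
print("normalized scores:", normalized)

# AUROC depends only on the ranking, so nearly identical raw distances
# can still separate the classes if anomalies consistently rank higher.
print("AUROC:", roc_auc_score(labels, scores))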

If you have any other issues or questions, please let us know. Thank you.