huggingface/evaluate

Add Precision@k and Recall@k metrics

Andron00e opened this issue · 0 comments

The Precision and Recall metrics currently supported by evaluate are just sklearn clones. It would be great to add top-k versions of these metrics.

For example, a simple implementation of the P@k metric is:

import pandas as pd
from sklearn import metrics

def precision_at_k(y_true, y_score, k):
    # Rank items by score (descending) and predict positive for the top k
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    y_pred = [1] * k + [0] * (len(df) - k)
    # Precision among the top k ranked items: (# relevant in top k) / k
    return metrics.precision_score(df['true'], y_pred)
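A Recall@k counterpart could follow the same pattern. This is only a sketch, assuming the usual top-k convention (the function name `recall_at_k` and the count-based `k` are my choices, not part of any existing evaluate API):

```python
import pandas as pd
from sklearn import metrics

def recall_at_k(y_true, y_score, k):
    # Rank items by score (descending) and predict positive for the top k
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    y_pred = [1] * k + [0] * (len(df) - k)
    # Recall at k: (# relevant in top k) / (total # relevant)
    return metrics.recall_score(df['true'], y_pred)
```

For instance, with `y_true = [1, 0, 1, 1, 0]` and scores `[0.9, 0.8, 0.7, 0.2, 0.1]`, `recall_at_k(..., k=3)` retrieves 2 of the 3 relevant items, giving 2/3.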