I tried using DOMINANT for anomaly detection on a custom dataset. The dataset consists of undirected graphs, each with one node feature and one edge feature, ranging from 3 to approximately 1,000 nodes per graph.
When training DOMINANT on this dataset, I observed that standardizing the node and edge features during preprocessing decreases the AUC score for anomalous graphs compared to not standardizing them. I had expected standardization to stabilize training by unifying the feature scales, so I'm unsure why the AUC score decreased.
Additionally, I found that if I add 2 to all node and edge feature values after standardization (so that the minimum value is above zero), the AUC score for anomalous graphs improves, ending up similar to the score obtained without standardization.
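For concreteness, this is roughly how I standardize and shift the features; a minimal sketch, where train_dataset is a placeholder for my list of PyG Data objects and the statistics are computed over the whole dataset:

```python
import torch
from torch_geometric.data import Data

def standardize(dataset, shift=0.0):
    # Compute per-feature mean/std over all graphs in the dataset.
    x_all = torch.cat([d.x for d in dataset], dim=0)
    e_all = torch.cat([d.edge_attr for d in dataset], dim=0)
    x_mean, x_std = x_all.mean(dim=0), x_all.std(dim=0)
    e_mean, e_std = e_all.mean(dim=0), e_all.std(dim=0)
    # Standardize, then optionally shift every value by a constant.
    return [
        Data(
            x=(d.x - x_mean) / x_std + shift,
            edge_index=d.edge_index,
            edge_attr=(d.edge_attr - e_mean) / e_std + shift,
        )
        for d in dataset
    ]

standardized = standardize(train_dataset)        # AUC drops
shifted = standardize(train_dataset, shift=2.0)  # AUC recovers
```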
If you have any insights into the cause or how to further investigate this issue, I would appreciate your advice.
Below are the parameter settings for DOMINANT and the DataLoader; everything not listed is left at its default value (a code sketch of this setup follows the list).
DOMINANT:
epoch = 10
DataLoader:
batch_size = 4
shuffle = False
drop_last = True
pin_memory = True
num_workers = 2
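In code, the setup looks roughly like this; a minimal sketch assuming the PyGOD 1.x module layout (where DOMINANT lives in pygod.detector), with train_dataset again a placeholder:

```python
from pygod.detector import DOMINANT    # assuming the PyGOD 1.x module layout
from torch_geometric.loader import DataLoader

detector = DOMINANT(epoch=10)          # all other DOMINANT parameters at default

loader = DataLoader(
    train_dataset,                     # placeholder for the list of Data objects
    batch_size=4,
    shuffle=False,
    drop_last=True,
    pin_memory=True,
    num_workers=2,
)
```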
Note: I'm using the DataLoader from torch_geometric.loader and training for 50 epochs. Since the anomaly detection is done at the graph level, I use pygod.utils.to_graph_score to compute the graph-level anomaly scores, and I evaluate them with the AUC score (roughly as in the sketch below).
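For reference, the evaluation looks roughly like this; a minimal sketch assuming decision_function returns node-level outlier scores, to_graph_score aggregates them into one score per graph, and each Data object carries a graph-level anomaly label in data.y (test_dataset is a placeholder):

```python
from sklearn.metrics import roc_auc_score
from pygod.utils import to_graph_score

graph_scores, labels = [], []
for data in test_dataset:               # placeholder: labelled held-out graphs
    node_scores = detector.decision_function(data)  # node-level outlier scores
    graph_scores.append(float(to_graph_score(node_scores)))
    labels.append(int(data.y))          # assuming one anomaly label per graph

print("graph-level AUC:", roc_auc_score(labels, graph_scores))
```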