fixed documentation issues (verified from generated docs) (HelenGuohx#35)

* fixed documentation issues (verified from generated docs)

* small fix in documentation

* formatted docstring, capitalized, added period and removed param type

* formatted docstring, capitalized, added period and removed param type

* formatted docstring, capitalized, added period and removed param type

* Revise the sphinx docs

---------

Co-authored-by: yangwenzhuo08 <[email protected]>
amritasaha87 and yangwenzhuo08 authored Feb 1, 2023
1 parent da56853 commit dc059af
Showing 62 changed files with 813 additions and 879 deletions.
53 changes: 53 additions & 0 deletions docs/source/logai.algorithms.nn_model.forecast_nn.rst
@@ -0,0 +1,53 @@
logai.algorithms.nn\_model.forecast\_nn package
===============================================

Submodules
----------

logai.algorithms.nn\_model.forecast\_nn.base\_nn module
-------------------------------------------------------

.. automodule:: logai.algorithms.nn_model.forecast_nn.base_nn
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.forecast\_nn.cnn module
--------------------------------------------------

.. automodule:: logai.algorithms.nn_model.forecast_nn.cnn
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.forecast\_nn.lstm module
---------------------------------------------------

.. automodule:: logai.algorithms.nn_model.forecast_nn.lstm
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.forecast\_nn.transformer module
----------------------------------------------------------

.. automodule:: logai.algorithms.nn_model.forecast_nn.transformer
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.forecast\_nn.utils module
----------------------------------------------------

.. automodule:: logai.algorithms.nn_model.forecast_nn.utils
:members:
:undoc-members:
:show-inheritance:

Module contents
---------------

.. automodule:: logai.algorithms.nn_model.forecast_nn
:members:
:undoc-members:
:show-inheritance:
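
These stubs follow the standard sphinx-apidoc layout: one `automodule` block per submodule plus a `Module contents` block for the package itself. For orientation only, a minimal sketch of the `docs/source/conf.py` settings such stubs rely on is shown below; `conf.py` is not part of this diff, so the extension list and path handling are assumptions based on a typical autodoc setup.

```python
# Minimal sketch of docs/source/conf.py settings that these ``automodule``
# stubs rely on; conf.py is not part of this diff, so the extension list and
# path handling below are assumptions, not the project's actual configuration.
import os
import sys

# Make the logai package importable so autodoc can import the documented modules.
sys.path.insert(0, os.path.abspath("../.."))

project = "LogAI"
extensions = [
    "sphinx.ext.autodoc",    # renders the .. automodule:: directives above
    "sphinx.ext.viewcode",   # optional: links rendered pages back to source
]
# Mirrors the :members:/:undoc-members:/:show-inheritance: options used in the stubs.
autodoc_default_options = {
    "members": True,
    "undoc-members": True,
    "show-inheritance": True,
}
```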
61 changes: 61 additions & 0 deletions docs/source/logai.algorithms.nn_model.logbert.rst
@@ -0,0 +1,61 @@
logai.algorithms.nn\_model.logbert package
==========================================

Submodules
----------

logai.algorithms.nn\_model.logbert.configs module
-------------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.configs
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.logbert.eval\_metric\_utils module
-------------------------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.eval_metric_utils
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.logbert.predict module
-------------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.predict
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.logbert.predict\_utils module
--------------------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.predict_utils
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.logbert.tokenizer\_utils module
----------------------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.tokenizer_utils
:members:
:undoc-members:
:show-inheritance:

logai.algorithms.nn\_model.logbert.train module
-----------------------------------------------

.. automodule:: logai.algorithms.nn_model.logbert.train
:members:
:undoc-members:
:show-inheritance:

Module contents
---------------

.. automodule:: logai.algorithms.nn_model.logbert
:members:
:undoc-members:
:show-inheritance:
9 changes: 9 additions & 0 deletions docs/source/logai.algorithms.nn_model.rst
@@ -1,6 +1,15 @@
logai.algorithms.nn\_model package
==================================

Subpackages
-----------

.. toctree::
:maxdepth: 4

logai.algorithms.nn_model.forecast_nn
logai.algorithms.nn_model.logbert

Submodules
----------

@@ -0,0 +1,21 @@
logai.applications.openset.anomaly\_detection.configs package
=============================================================

Submodules
----------

logai.applications.openset.anomaly\_detection.configs.schema module
-------------------------------------------------------------------

.. automodule:: logai.applications.openset.anomaly_detection.configs.schema
:members:
:undoc-members:
:show-inheritance:

Module contents
---------------

.. automodule:: logai.applications.openset.anomaly_detection.configs
:members:
:undoc-members:
:show-inheritance:
29 changes: 29 additions & 0 deletions docs/source/logai.applications.openset.anomaly_detection.rst
@@ -0,0 +1,29 @@
logai.applications.openset.anomaly\_detection package
=====================================================

Subpackages
-----------

.. toctree::
:maxdepth: 4

logai.applications.openset.anomaly_detection.configs

Submodules
----------

logai.applications.openset.anomaly\_detection.openset\_anomaly\_detection\_workflow module
------------------------------------------------------------------------------------------

.. automodule:: logai.applications.openset.anomaly_detection.openset_anomaly_detection_workflow
:members:
:undoc-members:
:show-inheritance:

Module contents
---------------

.. automodule:: logai.applications.openset.anomaly_detection
:members:
:undoc-members:
:show-inheritance:
18 changes: 18 additions & 0 deletions docs/source/logai.applications.openset.rst
@@ -0,0 +1,18 @@
logai.applications.openset package
==================================

Subpackages
-----------

.. toctree::
:maxdepth: 4

logai.applications.openset.anomaly_detection

Module contents
---------------

.. automodule:: logai.applications.openset
:members:
:undoc-members:
:show-inheritance:
8 changes: 8 additions & 0 deletions docs/source/logai.applications.rst
@@ -1,6 +1,14 @@
logai.applications package
==========================

Subpackages
-----------

.. toctree::
:maxdepth: 4

logai.applications.openset

Submodules
----------

2 changes: 1 addition & 1 deletion examples/jupyter_notebook/tutorial_deep_ad.md
@@ -236,7 +236,6 @@ workflow_config:
Then to run the end to end log anomaly detection on the HDFS dataset using LSTM Anomaly Detector (a sequence-based deep-learning model), you can use a python script like below:

```python
import os
from logai.applications.openset.anomaly_detection.openset_anomaly_detection_workflow import OpenSetADWorkflowConfig
from logai.utils.file_utils import read_file
from logai.utils.dataset_utils import split_train_dev_test_for_anomaly_detection
@@ -299,6 +298,7 @@ print (predict_results)
This kind of Anomaly Detection workflow for various Deep-Learning models and various experimental settings have also been automated in `logai.applications.openset.anomaly_detection.openset_anomaly_detection_workflow.OpenSetADWorkflow` class which can be easily invoked like the below example

```python
import os
from logai.applications.openset.anomaly_detection.openset_anomaly_detection_workflow import OpenSetADWorkflow, get_openset_ad_config

TEST_DATA_PATH = "tests/logai/test_data/HDFS_AD/HDFS_5k.log"
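
The remainder of this invocation is not shown in the diff excerpt above. Purely as an illustration of how it might continue, here is a hedged sketch; the `get_openset_ad_config` keyword arguments, the config attribute used to point at the test data, and the `execute()` call are assumptions, not taken from this diff.

```python
# Hedged sketch only: apart from the import and TEST_DATA_PATH, which appear
# in the diff above, every call below is an assumption about the LogAI API.
from logai.applications.openset.anomaly_detection.openset_anomaly_detection_workflow import (
    OpenSetADWorkflow,
    get_openset_ad_config,
)

TEST_DATA_PATH = "tests/logai/test_data/HDFS_AD/HDFS_5k.log"

# Assumed: get_openset_ad_config() resolves a named workflow configuration for
# the HDFS dataset and a chosen deep-learning anomaly detector.
config = get_openset_ad_config(config_filename="hdfs", anomaly_detection_type="lstm")

# Assumed: the loaded config exposes the data-loader file path as an attribute.
config.data_loader_config.filepath = TEST_DATA_PATH

# Assumed: the workflow object is built from the config and run end to end.
workflow = OpenSetADWorkflow(config)
workflow.execute()
```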
19 changes: 10 additions & 9 deletions logai/algorithms/anomaly_detection_algo/anomaly_detector_het.py
@@ -14,7 +14,7 @@

class HetAnomalyDetectionConfig(AnomalyDetectionConfig):
"""
Heterogeneous Anomaly Detector Parameters
Heterogeneous Anomaly Detector Parameters.
:param train_test_ratio: The ratio between test and training splits.
"""
@@ -26,7 +26,8 @@ class HetAnomalyDetector(AnomalyDetector):
"""
Anomaly Detector Wrapper to handle heterogeneous log feature dataframe which include various attributes of log. For
each attribute, we build its specific anomaly detector if the data satisfies the requirement.
This current version only supports anomaly detection on the constants.LOGLINE_COUNTS field (i.e. frequency count of the log events)
This current version only supports anomaly detection on the constants.LOGLINE_COUNTS field (i.e. frequency count of
the log events).
"""

def __init__(self, config: HetAnomalyDetectionConfig):
@@ -40,11 +41,11 @@ def __init__(self, config: HetAnomalyDetectionConfig):

def preprocess(self, counter_df: pd.DataFrame):
"""
Split raw log feature dataframe by unique attribute ID
Splits raw log feature dataframe by unique attribute ID.
:param counter_df: log feature dataframe must contain at least two columns
:param counter_df: A log feature dataframe that must contain at least two columns
['timestamp': datetime, constants.LOGLINE_COUNTS: int].
The rest of columns combinations are treated as log attribute ID
The rest of columns combinations are treated as log attribute ID.
:return: The processed log feature dataframe.
"""
ts_df = counter_df[[constants.LOG_COUNTS]]
@@ -55,13 +56,13 @@ def __init__(self, config: HetAnomalyDetectionConfig):
attr_list = counter_df["attribute"].unique()
return attr_list

def fit_predict(self, log_feature: pd.DataFrame):
def fit_predict(self, log_feature: pd.DataFrame) -> pd.DataFrame:
"""
Train and predict anomaly scores
Trains a model and predicts anomaly scores.
:param log_features: log feature dataframe must contain at least two columns
:param log_features: A log feature dataframe that must contain at least two columns
['timestamp': datetime, constants.LOGLINE_COUNTS: int].
The rest of columns combinations are treated as log attribute ID
The rest of columns combinations are treated as log attribute ID.
:return: The predicted anomaly scores.
"""
res = pd.DataFrame()
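
The docstrings above describe the expected input frame; for concreteness, a usage sketch based only on the signatures visible in this diff follows. The `constants` import path, the literal column names, and the usability of the config defaults are assumptions.

```python
# Usage sketch based only on the signatures visible in this diff; the constants
# import path, the literal column names and the default config values are assumptions.
import pandas as pd

from logai.utils import constants  # assumed location of the constants module
from logai.algorithms.anomaly_detection_algo.anomaly_detector_het import (
    HetAnomalyDetectionConfig,
    HetAnomalyDetector,
)

# Per-attribute frequency counts, as the docstrings above describe: a
# timestamp column, a counts column and one extra attribute column.
counts_a = [10, 12, 11, 13, 9, 10, 11, 12, 10, 250, 11, 10]  # app-A, one spike
counts_b = [5, 6, 5, 7, 6, 5, 6, 90, 5, 6, 7, 5]             # app-B, one spike
counter_df = pd.DataFrame(
    {
        "timestamp": list(pd.date_range("2023-02-01", periods=12, freq="min")) * 2,
        constants.LOG_COUNTS: counts_a + counts_b,
        "attribute": ["app-A"] * 12 + ["app-B"] * 12,
    }
)

detector = HetAnomalyDetector(HetAnomalyDetectionConfig())  # defaults assumed to be usable
anomaly_scores = detector.fit_predict(counter_df)           # one detector per attribute
print(anomaly_scores.head())
```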
15 changes: 8 additions & 7 deletions logai/algorithms/anomaly_detection_algo/dbl.py
@@ -24,7 +24,7 @@
class DBLDetectorParams(Config):
"""
Dynamic Baseline Parameters. For more details on the paramaters see
https://opensource.salesforce.com/Merlion/v1.3.1/merlion.models.anomaly.html#module-merlion.models.anomaly.dbl
https://opensource.salesforce.com/Merlion/v1.3.1/merlion.models.anomaly.html#module-merlion.models.anomaly.dbl.
:param threshold: The rule to use for thresholding anomaly scores.
:param fixed_period: ``(t0, tf)``; Train the model on all datapoints occurring between t0 and tf (inclusive).
@@ -46,10 +46,10 @@ class DBLDetectorParams(Config):
@factory.register("detection", "dbl", DBLDetectorParams)
class DBLDetector(AnomalyDetectionAlgo):
"""Dynamic baseline based time series anomaly detection. This is a wrapper class for the Dynamic Baseline
anomaly detection model from Merlion library
anomaly detection model from Merlion library .
https://opensource.salesforce.com/Merlion/v1.3.1/merlion.models.anomaly.html#module-merlion.models.anomaly.dbl
Current implementation only supports anomaly detection on the constants.LOGLINE_COUNTS class (which maintains
frequency counts of the log events )
frequency counts of the log events).
"""
def __init__(self, params: DBLDetectorParams):
dbl_config = DynamicBaselineConfig(
@@ -65,9 +65,9 @@ def __init__(self, params: DBLDetectorParams):

def fit(self, log_features: pd.DataFrame):
"""
Train method of the Dynamic Baseline model
Training method of the Dynamic Baseline model.
:param log_features: log feature dataframe must only contain two columns
:param log_features: A log feature dataframe that must only contain two columns
['timestamp': datetime, constants.LOGLINE_COUNTS: int].
"""
self._is_valid_ts_df(log_features)
@@ -77,9 +77,10 @@ def __init__(self, params: DBLDetectorParams):

def predict(self, log_features: pd.DataFrame):
"""
Predict anomaly scores for log_feature["timestamp", constants.LOGLINE_COUNTS]
Predicts anomaly scores for log_feature["timestamp", constants.LOGLINE_COUNTS].
:param log_features: log feature dataframe must contain two columns ['timestamp': datetime, 'counts': int].
:param log_features: A log feature dataframe that must contain two columns
['timestamp': datetime, 'counts': int].
:return: A dataframe of the predicted anomaly scores, e.g., index:log_features.index.
value: anomaly score to indicate if anomaly or not.
"""
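
For the train-then-score pattern that `fit` and `predict` describe, a similarly hedged sketch follows; default `DBLDetectorParams` values, the `constants` import path, and the counts column name are assumptions.

```python
# Hedged sketch of the fit/predict pattern described above; default
# DBLDetectorParams values and the counts column name are assumptions.
import pandas as pd

from logai.utils import constants  # assumed location of the constants module
from logai.algorithms.anomaly_detection_algo.dbl import DBLDetector, DBLDetectorParams

counts = pd.DataFrame(
    {
        "timestamp": pd.date_range("2023-02-01", periods=48, freq="H"),
        constants.LOG_COUNTS: [20] * 24 + [21] * 23 + [400],  # spike in the last hour
    }
)

detector = DBLDetector(DBLDetectorParams())  # defaults assumed to be usable
detector.fit(counts.iloc[:24])               # learn the baseline from day one
scores = detector.predict(counts.iloc[24:])  # score day two against the baseline
print(scores)
```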
@@ -35,8 +35,8 @@ def _js_divergence(p, q):
class DistributionDivergenceParams(Config):
"""Parameters for distribution divergence based anomaly detector.
:param n_bins: number of bins to use to discretize the continuous distribution into a discrete distribution
:param type: list of types of distribution divergences. The allowed types are Kullback–Leibler ("KL"), Jensen–Shannon
:param n_bins: The number of bins to use to discretize the continuous distribution into a discrete distribution
:param type: A list of types of distribution divergences. The allowed types are Kullback–Leibler ("KL"), Jensen–Shannon
("JS"). It also allows a comma separated list of metrics like ("KL,JS" or "JS,KL").
"""
n_bins: int = 100
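
Since `DistributionDivergenceParams` is parameterized here by `n_bins` and the divergence type ("KL"/"JS"), a self-contained numerical illustration of that binning plus divergence computation may help; it uses numpy/scipy directly and is not the class's own implementation.

```python
# Self-contained illustration of the binning + divergence computation that
# DistributionDivergenceParams describes (n_bins, "KL"/"JS"); it uses
# numpy/scipy directly and is not the class's own implementation.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=5000)
test = rng.normal(loc=0.5, scale=1.0, size=5000)  # shifted distribution

n_bins = 100
edges = np.histogram_bin_edges(np.concatenate([train, test]), bins=n_bins)
p, _ = np.histogram(train, bins=edges)
q, _ = np.histogram(test, bins=edges)
p = (p + 1e-12) / (p + 1e-12).sum()  # smooth empty bins before taking logs
q = (q + 1e-12) / (q + 1e-12).sum()

kl = entropy(p, q)                              # Kullback-Leibler divergence
m = 0.5 * (p + q)
js = 0.5 * entropy(p, m) + 0.5 * entropy(q, m)  # Jensen-Shannon divergence
print(f"KL={kl:.4f}  JS={js:.4f}")
```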
18 changes: 8 additions & 10 deletions logai/algorithms/anomaly_detection_algo/ets.py
@@ -54,10 +54,9 @@ class ETSDetectorParams(Config):

@factory.register("detection", "ets", ETSDetectorParams)
class ETSDetector(AnomalyDetectionAlgo):
"""
ETS Anomaly Detector. This is a wrapper for the ETS based Anomaly Detector from Merlion library
https://opensource.salesforce.com/Merlion/v1.0.2/merlion.models.forecast.html#module-merlion.models.forecast.ets.
This current version only supports anomaly detection of the constants.LOGLINE_COUNTS (i.e. frequency count of log events).
"""ETS Anomaly Detector. This is a wrapper for the ETS based Anomaly Detector from Merlion library
https://opensource.salesforce.com/Merlion/v1.0.2/merlion.models.forecast.html#module-merlion.models.forecast.ets
This current version only supports anomaly detection of the constants.LOGLINE_COUNTS (i.e. frequency count of log events)
"""

def __init__(self, params: ETSDetectorParams):
@@ -77,12 +76,11 @@ def __init__(self, params: ETSDetectorParams):

def fit(self, log_features: pd.DataFrame):
"""
Fit method to train ETS Anomaly Detector.
:param log_features: log feature dataframe must only contain two columns
:param log_features: A log feature dataframe that must only contain two columns
['timestamp': datetime, constants.LOGLINE_COUNTS: int].
:return: train_scores: anomaly scores dataframe
:return: train_scores: The anomaly scores dataframe
['index':log_features.index, 'timestamps': datetime, 'anom_score': scores, 'trainval':
whether it is training set.
"""
@@ -100,11 +98,11 @@

def predict(self, log_features: pd.DataFrame):
"""
Predict anomaly scores for log_feature["timestamp", constants.LOGLINE_COUNTS].
Predicts anomaly scores for log_feature["timestamp", constants.LOGLINE_COUNTS].
:param log_features: log feature dataframe must only contain two columns
:param log_features: A log feature dataframe that must only contain two columns
['timestamp': datetime, constants.LOGLINE_COUNTS: int].
:return: test_scores: anomaly scores dataframe
:return: test_scores: The anomaly scores dataframe
['index':log_features.index, 'timestamps': datetime, 'anom_score': scores, 'trainval':
whether it is training set.
"""
