Authors: Ian J. Davis Michael W. Godfrey Douglas Neuse Serge Mankovskii
Venue: N/A
Year: 2015
Abstract: Computer algorithms are written with the intent that when run they perform a useful function. Typically any information obtained is unknown until the algorithm is run. However, if the behavior of an algorithm can be fully described by precomputing just once how it will respond when executed on any input, this precomputed result provides a complete specification for all solutions in the problem domain. We apply this idea to a previous anomaly detection algorithm, and in doing so transform it from a tool that merely detects individual anomalies when asked to characterize potentially anomalous values into a meta-solution that provides a complete specification of what constitutes such an anomaly. This specification is derived by examining no more than a small training data set, can be obtained in very small constant time, and is inherently far more powerful than results obtained by repeated execution of the tool. For example, armed with such a specification one can ask how close an anomaly is to being deemed normal, and can validate this answer not by exhaustively testing the algorithm but by examining whether the generated specification is indeed correct. This powerful idea can be applied to any algorithm whose entire behavior can be computed by such a meta-algorithm, and so has wide applicability.
Keywords: Anomaly detection; prediction; auditing; filtering; unsupervised learning; ordered trees; random forest
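The following minimal sketch illustrates the general idea described in the abstract, not the paper's actual algorithm: a point-wise anomaly check is replaced by a specification of the normal region that is computed once from a small training sample, after which membership tests and "distance to normal" queries run in constant time. The 1-D interval detector, the `k`-sigma threshold, and all function names are illustrative assumptions only.

```python
# Hypothetical sketch: turning a point-wise anomaly check into a
# precomputed specification of the anomalous region (1-D example).
# Not the paper's algorithm; detector, threshold, and names are assumed.

import statistics


def fit_specification(training_values, k=3.0):
    """Examine a small training sample once and return a complete
    specification of "normal" values: here, an interval [lo, hi]
    built from the sample mean and standard deviation."""
    mu = statistics.fmean(training_values)
    sigma = statistics.stdev(training_values)
    return (mu - k * sigma, mu + k * sigma)


def is_anomalous(spec, x):
    """Constant-time test against the precomputed specification,
    with no need to re-run the detector over the training data."""
    lo, hi = spec
    return x < lo or x > hi


def distance_to_normal(spec, x):
    """Because the whole anomalous space is specified, we can also
    ask how close an anomalous value is to being deemed normal."""
    lo, hi = spec
    if lo <= x <= hi:
        return 0.0
    return lo - x if x < lo else x - hi


if __name__ == "__main__":
    sample = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]
    spec = fit_specification(sample)
    print("normal interval:", spec)
    print("12.5 anomalous?", is_anomalous(spec, 12.5))
    print("distance to normal:", distance_to_normal(spec, 12.5))
```

Validating such a specification means checking the interval itself (e.g., that it brackets the training data as intended) rather than exhaustively probing the original detector with individual inputs.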
BibTeX:
@inproceedings{ianj.davis2015sads,
author = "Ian J. Davis and Michael W. Godfrey and Douglas Neuse and Serge Mankovskii",
title = "Specifying Anomalous Data Spaces",
year = "2015",
booktitle = "N/A"
}
Plain Text:
Ian J. Davis, Michael W. Godfrey, Douglas Neuse, and Serge Mankovskii, "Specifying Anomalous Data Spaces," N/A, 2015.