Author: Chen, Jiazhen
Date: 2025-04-29
URI: https://hdl.handle.net/10012/21676
Title: Deep Learning Frameworks for Anomaly Detection in Time Series and Graphs with Limited Labels
Type: Doctoral Thesis
Language: en
Keywords: Time Series Anomaly Detection; Graph Anomaly Detection; Graph Neural Network; Contrastive Learning

Abstract

Anomaly detection involves identifying patterns or behaviors that substantially differ from normal instances in a dataset. It has a wide range of applications in diverse fields such as cybersecurity, manufacturing, finance, and e-commerce. However, real-world anomaly detection often grapples with two main challenges: label scarcity, as anomalies are rare and hard to label, and the complexity of data structures, which can involve intricate dependencies that require careful analysis. In this thesis, we develop deep learning frameworks designed to work effectively in label-free or extremely limited labeling scenarios, with a focus on time series anomaly detection (TSAD) and graph anomaly detection (GAD).

To overcome the issue of label scarcity, our initial work investigates unsupervised TSAD methods that extract meaningful patterns from abundant unlabeled data. Building on recent advances in contrastive learning from NLP and computer vision, we introduce the Contrastive Neural Transformation (CNT) framework. This approach integrates temporal contrastive learning with neural transformations to capture context-aware and discriminative patterns effectively. Moreover, the dual-loss formulation prevents representation collapse without relying on negative samples, a common challenge in anomaly detection, where the majority of instances represent normal behavior.

While capturing temporal context is essential, understanding inter-series relationships is equally important in multivariate TSAD: anomalies may seem normal in isolation yet reveal abnormal patterns when compared with other series. To address this, we introduce DyGraphAD, a dynamic graph-driven dual forecasting framework that models both intra- and inter-series dependencies through a combination of graph and time series forecasting tasks. This allows anomalies to be detected via significant forecasting errors along both the channel-wise and time-wise dimensions.

To further enhance computational efficiency, we propose an alternative framework, termed Prospective Multi-Graph Cohesion (PMGC). PMGC leverages graph structure learning to model inter-series relationships in a task-specific manner, reducing the computational load of the manual sequential graph construction used in DyGraphAD. Furthermore, it introduces a multi-graph cohesion mechanism to adaptively learn both long-term dependencies and diverse short-term relationships. A prospective graphing strategy is also introduced to encourage the model to capture concurrent inter-series relationships, reducing reliance on historical data alone.
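To make the contrastive idea above concrete, the following is a minimal PyTorch sketch of a dual-loss temporal contrastive objective with learnable neural transformations. It is an illustration only, not the CNT implementation: the class and function names (NeuralTransformations, dual_contrastive_loss), the simple linear encoder, the number of transformations, and the use of a neighboring window as the temporal context are all assumptions made for this example.

```python
# Hypothetical sketch (not the thesis code): transformed views of a window are
# pulled toward the embedding of a neighboring context window, while views of
# the same window are pushed apart, so no cross-instance negatives are needed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeuralTransformations(nn.Module):
    """K learnable transformations applied to an embedded time-series window."""

    def __init__(self, dim: int, num_transforms: int = 4):
        super().__init__()
        self.transforms = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
             for _ in range(num_transforms)]
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, dim) -> (batch, K, dim)
        return torch.stack([t(z) for t in self.transforms], dim=1)


def dual_contrastive_loss(views: torch.Tensor, context: torch.Tensor,
                          temperature: float = 0.1) -> torch.Tensor:
    """Alignment term: each view should agree with its temporal context.
    Uniformity term: distinct views of the same window should differ, which
    prevents all representations from collapsing to a single point."""
    views = F.normalize(views, dim=-1)        # (B, K, D)
    context = F.normalize(context, dim=-1)    # (B, D)
    align = (views * context.unsqueeze(1)).sum(-1) / temperature     # (B, K)
    sim = torch.einsum("bkd,bjd->bkj", views, views) / temperature   # (B, K, K)
    k = views.size(1)
    # Ignore each view's similarity with itself.
    sim = sim.masked_fill(torch.eye(k, dtype=torch.bool, device=views.device),
                          float("-inf"))
    denom = torch.logsumexp(torch.cat([align.unsqueeze(-1), sim], dim=-1), dim=-1)
    return (denom - align).mean()


if __name__ == "__main__":
    B, F_IN, D = 32, 16, 64
    encoder = nn.Linear(F_IN, D)        # stand-in for a temporal encoder
    window = torch.randn(B, F_IN)       # current window (flattened features)
    neighbor = torch.randn(B, F_IN)     # neighboring window giving temporal context
    views = NeuralTransformations(D)(encoder(window))
    loss = dual_contrastive_loss(views, encoder(neighbor))
    loss.backward()
    # At test time, a higher per-window loss can serve as the anomaly score.
```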
Beyond TSAD, GAD is also critical due to its prevalence in numerous applications. Graphs provide structural information alongside node and edge attributes, and understanding the interplay between graph structure and attributes is essential for uncovering subtle anomalies that are not apparent when examining nodes or edges alone. Given that labeled data is more feasible to obtain for graphs than for time series in experimental settings, we focus on GAD with limited labeling, which better reflects practical real-world scenarios.

Specifically, we make the first attempt to address GAD in cross-domain few-shot settings, aiming to detect anomalies in a sparsely labeled target graph by leveraging a related but distinct source graph. To handle domain shifts, our CDFS-GAD framework incorporates domain-adaptive graph contrastive learning and domain-specific prompt tuning, aiming to align features across the two domains while preserving the unique characteristics of each. A domain-adaptive hypersphere classification loss and a self-training phase further refine predictions in the target domain by exploiting the limited label information.

In addition to static graphs, many real-world applications involve dynamic graph data, where both the structure and attributes evolve over time. This adds complexity to anomaly detection, as both temporal and structural variations must be accounted for. Moreover, obtaining sufficient labeled data remains challenging, and related-domain labeled data may not be available in certain scenarios. To tackle these two practical issues, we propose the EL$^2$-DGAD framework, specifically designed for detecting anomalies in dynamic graphs under extremely limited labeling conditions. This framework enhances model robustness through a transformer-based temporal graph encoder that captures evolving patterns from local and global perspectives. An ego-context hypersphere classification loss is further introduced to contextually adjust the anomaly detection boundary under limited supervision, supplemented by an ego-context contrasting module that improves generalization with unlabeled data.

Overall, this thesis tackles anomaly detection for two commonly used data types, addressing unsupervised, semi-supervised, and cross-domain few-shot scenarios to meet the demands of real-world applications. Our extensive experiments show that the proposed frameworks perform well on various benchmark datasets against competitive anomaly detection baselines.
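Both CDFS-GAD and EL$^2$-DGAD build on hypersphere-style classification losses to shape the anomaly decision boundary under scarce labels. The sketch below shows a generic form of such a loss in PyTorch, together with one plausible way to derive a per-node context center from neighbor embeddings. The function names, the mean-of-neighbors center, and the tensor shapes are assumptions for illustration; they do not reproduce the domain-adaptive or ego-context variants described in the thesis.

```python
# Generic hypersphere classification (HSC) sketch: embeddings of normal nodes
# (label 0) are pulled toward a center, labeled anomalies (label 1) are pushed
# away. A simplified stand-in for the variants described in the abstract.
import torch
import torch.nn.functional as F


def hypersphere_classification_loss(z: torch.Tensor, labels: torch.Tensor,
                                    center: torch.Tensor,
                                    eps: float = 1e-6) -> torch.Tensor:
    dist_sq = ((z - center) ** 2).sum(-1)      # squared distance to the center
    p_anomaly = 1.0 - torch.exp(-dist_sq)      # far from the center -> prob ~ 1
    p_anomaly = p_anomaly.clamp(eps, 1.0 - eps)
    # For label 0 this reduces to dist_sq (pull inward); for label 1 it
    # becomes -log(1 - exp(-dist_sq)) (push outward).
    return F.binary_cross_entropy(p_anomaly, labels.float())


def ego_context_centers(z: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """One plausible per-node context center: the mean embedding of each
    node's neighbors (an assumption, not the thesis formulation)."""
    deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
    return adj @ z / deg


if __name__ == "__main__":
    n_nodes, dim = 200, 32
    z = torch.randn(n_nodes, dim, requires_grad=True)    # node embeddings from a graph encoder
    adj = (torch.rand(n_nodes, n_nodes) < 0.05).float()  # toy dense adjacency
    labels = torch.zeros(n_nodes)
    labels[:3] = 1.0                                     # extremely limited anomaly labels
    loss = hypersphere_classification_loss(z, labels, ego_context_centers(z, adj))
    loss.backward()
    # At inference, the per-node distance (or p_anomaly) serves as the anomaly score.
```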