zecoo's Stars
wuyifan18/DeepLog
PyTorch implementation of DeepLog.
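For context, DeepLog treats a parsed log stream as a sequence of log keys, trains an LSTM to predict the next key from a window of previous ones, and flags an observed key as anomalous when it falls outside the top-g predictions. A minimal sketch of that idea (my illustrative code, not this repo's):

```python
# Minimal sketch of DeepLog's core idea (simplified, illustrative):
# an LSTM predicts the next log key from a window of previous keys;
# a key outside the top-g candidates is flagged as anomalous.
import torch
import torch.nn as nn

class DeepLogModel(nn.Module):
    def __init__(self, num_keys, hidden_size=64, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(num_keys, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_keys)

    def forward(self, window):              # window: (batch, h) of log-key ids
        out, _ = self.lstm(self.embed(window))
        return self.fc(out[:, -1])          # logits over the next log key

def is_anomalous(model, window, next_key, g=9):
    # Anomalous if the observed key is not among the top-g predictions.
    logits = model(window.unsqueeze(0))
    topg = logits.topk(g, dim=-1).indices.squeeze(0)
    return next_key not in topg
```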
ADLILog/ADLILog
tartaruszen/MADAN
Python implementation of the MADAN algorithm for Multi-scale Anomaly Detection on Attributed Networks
leoguti85/MADAN
Multi-scale Anomaly Detection on Attributed Networks
jackaduma/ThreatReportExtractor
Extracting Attack Behavior from Threat Reports
crimson-unicorn/parsers
microsoft/anomalydetector
SR-CNN: Spectral Residual + CNN for time-series anomaly detection
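The SR in SR-CNN is the Spectral Residual saliency transform; the CNN is then trained on the resulting saliency maps. A rough sketch of the SR step (my simplification of the published algorithm, not this repo's code):

```python
# Spectral Residual saliency map, roughly: subtract a locally averaged
# log amplitude spectrum from the log spectrum, then invert the FFT.
import numpy as np

def spectral_residual(series, window=3):
    freq = np.fft.fft(series)
    amp, phase = np.abs(freq), np.angle(freq)
    log_amp = np.log(amp + 1e-8)
    # Spectral residual = log spectrum minus its local average.
    kernel = np.ones(window) / window
    residual = log_amp - np.convolve(log_amp, kernel, mode='same')
    # Back to the time domain: peaks in the saliency map mark anomalies.
    return np.abs(np.fft.ifft(np.exp(residual + 1j * phase)))
```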
logpai/Drain3
A robust streaming log template miner based on the Drain algorithm
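A minimal usage sketch (API as I recall it from the Drain3 README; verify against the repo before relying on it):

```python
# Feed raw log lines to Drain3's TemplateMiner; similar lines are
# clustered into templates with variable parts replaced by <*>.
from drain3 import TemplateMiner

miner = TemplateMiner()
for line in ["connected to 10.0.0.1", "connected to 10.0.0.2"]:
    result = miner.add_log_message(line)
    print(result["change_type"], result["template_mined"])
# Both lines should end up under a template like "connected to <*>".
```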
ashish-gehani/SPADE
SPADE: Support for Provenance Auditing in Distributed Environments
linux-audit/audit-kernel
GitHub mirror of the Linux Kernel's audit repository
jun-zeng/Audit-log-analysis
Have fun with audit log analysis :)
openfaas/faas
OpenFaaS - Serverless Functions Made Simple
dyweb/papers-notebook
:page_facing_up: :cn: :page_with_curl: Paper-reading notes (distributed systems, virtualization, machine learning)
yunjey/pytorch-tutorial
PyTorch Tutorial for Deep Learning Researchers
mariaGarofalakis/pyro_Classification_graph_classification
The objective of this project is to infer a graph's latent space using variational inference, representing an undirected graph in a latent space and predicting the class of each node. Graphs model real-world objects and their relationships; social networks, biological networks, and road networks are popular examples. The irregularity of such data structures has driven advances in Graph Neural Networks for tasks such as classification and prediction. Kipf and Welling [T. N. Kipf and Welling 2017] proposed the Graph Convolutional Network (GCN), considered one of the basic GNN variants, in which the model learns features by inspecting neighboring nodes. Inspired by the paper "Neural Relational Inference for Interacting Systems" [T. Kipf et al. 2018], we implemented a variational encoder-decoder structured classifier that receives a graph as input (its adjacency matrix and feature dictionary) and outputs the classification result. The discrete latent state is relaxed using the Concrete (CONtinuous disCRETE) distribution proposed in "Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings" [Elinas, Bonilla, and Tiao 2020]. Our goal is a model that learns a good latent representation of graphs while accurately predicting each node's subject.
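Since the description leans on the GCN propagation rule, here is a minimal, illustrative sketch of one GCN layer in PyTorch (my own code under the usual Kipf and Welling formulation, not this repo's implementation):

```python
# One GCN layer: H' = relu(D^{-1/2} (A + I) D^{-1/2} H W)
# (illustrative; dense matrices only, no sparse ops)
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, adj, feats):                # adj: (N, N), feats: (N, in_dim)
        a_hat = adj + torch.eye(adj.size(0))      # add self-loops
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        # Symmetric normalization via row and column scaling.
        norm_adj = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
        return torch.relu(norm_adj @ self.linear(feats))
```

For the Concrete relaxation itself, PyTorch ships `torch.distributions.RelaxedOneHotCategorical`, which implements the Maddison et al. distribution.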
bhattbhavesh91/imbalance_class_sklearn
Address imbalanced classes in machine learning projects (a SMOTE sketch follows the fraud-detection entry below).
DheerajKumar97/Income-Classification-using-SKlearn-Pipelines-and-Using-SMOTE-
LaurentVeyssier/Credit-Card-fraud-detection-using-Machine-Learning
Detect fraudulent credit card transactions using different machine learning models and compare their performance
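The imbalance-related repos above revolve around the same move: rebalance the training split before fitting. A hedged sketch with imbalanced-learn's SMOTE on synthetic data:

```python
# Oversample the minority class with SMOTE before fitting a classifier.
# Synthetic data; in practice resample the training split only, never the test split.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
clf = LogisticRegression(max_iter=1000).fit(X_res, y_res)
print(clf.score(X_te, y_te))              # evaluate on the untouched test split
```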
zhulei227/GNN_Notes
Graph Neural Network (GNN) study notes
KurochkinAlexey/DA-RNN
PyTorch implementation of Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction https://arxiv.org/pdf/1704.02971.pdf
Zhenye-Na/DA-RNN
📃 Unofficial PyTorch implementation of DA-RNN (arXiv:1704.02971)
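DA-RNN's second stage scores each encoder hidden state against the current decoder state and sums them by softmax weight. A rough sketch of that step (dot-product scoring for brevity, where the paper uses an additive score; my code, not either repo's):

```python
# Temporal attention: weight encoder states by relevance to the
# decoder state and return the weighted sum as a context vector.
import torch

def temporal_attention(dec_state, enc_states):
    # dec_state: (batch, d); enc_states: (batch, T, d)
    scores = torch.bmm(enc_states, dec_state.unsqueeze(2)).squeeze(2)  # (batch, T)
    weights = torch.softmax(scores, dim=1)
    context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)   # (batch, d)
    return context, weights
```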
xdg988/Dissertation-ARIMA_SVR-prediction
This notebook covers the methods and a summary of the entire process of writing my graduation thesis, including some reflections and experiences from writing it.
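For the ARIMA half of an ARIMA+SVR pipeline, a minimal statsmodels forecast looks roughly like this (toy data; the (1, 1, 1) order is an arbitrary assumption):

```python
# Fit an ARIMA model to a toy random-walk series and forecast ahead.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

series = np.cumsum(np.random.randn(200))   # toy random-walk series
fit = ARIMA(series, order=(1, 1, 1)).fit()
print(fit.forecast(steps=5))               # next 5 predicted values
```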
msalehsaudi/LSTM-Seq2seq-Timeseries-forecast
manohar029/TimeSeries-Seq2Seq-deepLSTMs-Keras
This project introduces how Seq2Seq encoder-decoder neural network architectures can be applied to time series data to make forecasts. The code is implemented in Python with Keras (TensorFlow backend).
JEddy92/TimeSeries_Seq2Seq
This repo aims to be a useful collection of notebooks/code for understanding and implementing seq2seq neural networks for time series forecasting. Networks are constructed with Keras/TensorFlow.
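The common Keras shape of such a model is an LSTM encoder feeding a RepeatVector-unrolled LSTM decoder. A hedged sketch (illustrative dimensions, not code from these repos):

```python
# LSTM encoder-decoder for time-series forecasting in Keras.
from tensorflow import keras
from tensorflow.keras import layers

n_in, n_out, n_feat = 30, 10, 1          # history length, horizon, features

inputs = keras.Input(shape=(n_in, n_feat))
# Encoder compresses the input window into a fixed-size state.
encoded = layers.LSTM(64)(inputs)
# Decoder unrolls that state across the forecast horizon.
repeated = layers.RepeatVector(n_out)(encoded)
decoded = layers.LSTM(64, return_sequences=True)(repeated)
outputs = layers.TimeDistributed(layers.Dense(n_feat))(decoded)

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
```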
wdc11795/CNN_Seq2Seq_StockPrediction
epishova/Neural-Networks-for-time-series-analysis
Compare how ANNs, RNNs, LSTMs, and LSTMs with attention perform on time-series analysis
guillaume-chevalier/Linear-Attention-Recurrent-Neural-Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell. (LARNN)
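A loose illustration of the windowed multi-head query over past states, using PyTorch's built-in attention module (single step, illustrative shapes; the actual LARNN cell differs):

```python
# Current hidden state queries a window of past cell states via
# multi-head attention (illustrative only; not the LARNN repo's code).
import torch
import torch.nn as nn

mha = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
h_t = torch.randn(8, 1, 64)          # query: current hidden state
window = torch.randn(8, 16, 64)      # keys/values: last 16 cell states
attended, _ = mha(h_t, window, window)
# 'attended' would be mixed back into the cell state inside the RNN loop.
```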
kail0n/fp_study_notes_hello_docker
Demo Repo for Intro to Docker session
pointful/docker-intro
Presentation: Intro to Docker