Automated service classification plays a crucial role in service management tasks such as service discovery, selection, and composition. In recent years, machine learning techniques have been applied to service classification; however, they can typically predict only around 10 to 20 service categories because of limited feature engineering and the class imbalance of service datasets. In this project, we present ServeNet, a deep neural network, together with a novel dataset-splitting algorithm, to address these issues. ServeNet automatically abstracts low-level representations into high-level features and then predicts service categories on the datasets produced by the proposed splitting algorithm. To demonstrate the effectiveness of our approach, we conducted a comprehensive experimental study on 10,000 real-world services in 50 categories. The results show that ServeNet achieves higher accuracy than other machine learning methods.
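The dataset-splitting algorithm itself is described in the papers below. Purely as an illustration of the idea (not the repository's actual implementation; the function name and parameters here are placeholders), the sketch below shows a label-stratified split with scikit-learn, which keeps every one of the 50 categories represented in both the training and the test set:

```python
# Hedged sketch of a label-stratified train/test split; the paper's actual
# splitting algorithm may differ. Stratifying on the category label keeps the
# per-category proportions similar in both splits, which mitigates imbalance.
from sklearn.model_selection import train_test_split

def split_services(descriptions, categories, test_size=0.2, seed=42):
    """Split service descriptions and their category labels while preserving
    the category distribution in both the training and the test set."""
    return train_test_split(
        descriptions,
        categories,
        test_size=test_size,
        random_state=seed,
        stratify=categories,  # keep all categories represented in both splits
    )
```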
- Yilong Yang, Nafees Qamar, Peng Liu, Katarina Grolinger, Weiru Wang, Zhi Li, Zhifang Liao. "ServeNet: A Deep Neural Network for Web Services Classification." To be presented at the IEEE 12th International Conference on Web Services (ICWS'20), Beijing, China, Oct. 2020.
- Yilong Yang, Wei Ke, Weiru Wang, Yongxin Zhao. "Deep Learning for Web Services Classification." Presented at the IEEE 11th International Conference on Web Services (ICWS'19), Milan, Italy, July 2019.
- Raw data
- CSV
- HDF5 (.hdf5 or .h5)
- Tensors in pickle
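For orientation only, here is a hedged sketch of how data in these formats is commonly loaded from Python; the concrete file names and dataset keys used below are assumptions and may differ from what the notebooks expect:

```python
# Hypothetical loading snippets for the data formats listed above.
import pickle
import pandas as pd
import h5py

# CSV: service descriptions and category labels stored as a table
df = pd.read_csv("data/services.csv")            # file name is an assumption

# HDF5 (.hdf5 / .h5): array-like datasets stored under named keys
with h5py.File("data/services.h5", "r") as f:    # file name and keys are assumptions
    print(list(f.keys()))

# Pickle: pre-computed tensors serialized with Python's pickle module
with open("data/tensors.pkl", "rb") as f:        # file name is an assumption
    tensors = pickle.load(f)
```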
- BILSTM.hdf5
- CLSTM.hdf5
- CNN.hdf5
- LSTM.hdf5
- RCNN.hdf5
- ServeNet.hdf5
- ServeNet-ICWS20.hdf5
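The .hdf5 files above are pre-trained models. Assuming they were saved as complete Keras models with model.save() (an assumption; if they contain only weights or use custom layers, loading will additionally need the matching model definition or a custom_objects mapping), a minimal loading sketch looks like this:

```python
# Hedged sketch: load one of the pre-trained models listed above with Keras.
from tensorflow.keras.models import load_model

model = load_model("ServeNet.hdf5")  # swap in BILSTM.hdf5, CNN.hdf5, etc.
model.summary()

# To classify services, feed tensors shaped like the model's input
# (see the notebooks for the exact preprocessing pipeline):
# predictions = model.predict(input_tensors)
```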
Directly open the .ipynb notebooks in Google Colab and prepare the data
- git clone https://github.com/yylonly/ServeNet.git
- docker build . -t servenet:cpu -f Dockerfile-CPU
- docker run -itd --rm --name servenet-cpu -p 8888:8888 -v /yourpath:/data servenet:cpu
- docker logs servenet-cpu
- docker build . -t servenet:gpu -f Dockerfile-GPU
- docker run -itd --rm --runtime=nvidia --name servenet-gpu -p 8888:8888 -v /yourpath:/data servenet:gpu
- docker logs servenet-gpu
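Once either container is running, the service mapped to port 8888 should be reachable at http://localhost:8888; the docker logs commands above are where the startup output (including a Jupyter login token, assuming the images launch a Jupyter server as suggested by the port mapping) can be found.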