Assets/Accelerators for Watson NLP (this repo) contains self-serve notebooks and documentation showing how to create NLP models with the Watson NLP library, how to serve Watson NLP models, and how to make inference requests from custom applications. With an IBM Cloud account, a full production sample can be deployed in roughly one hour.
Key Technologies:
- IBM Watson NLP (Natural Language Processing) provides a wide variety of text processing capabilities, such as emotion analysis and topic modeling. Watson NLP is built on top of leading open source AI software. It provides stable, supported interfaces, handles a wide range of languages, and its quality is enterprise proven. The Watson NLP containers can be deployed with Docker, on various Kubernetes-based platforms, or using cloud-based container services.
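As a minimal sketch of the Docker deployment path mentioned above, a runtime container can be started along the following lines. The image name, registry path, environment variable, and port here are illustrative assumptions, not confirmed values; substitute the image and credentials from your IBM Entitled Registry entitlement and consult the serving guides listed below.

```shell
# Hedged sketch -- image name, license flag, and port are assumptions;
# replace them with the values from your entitlement and the docs.
docker run -d \
  -e ACCEPT_LICENSE=true \
  -p 8080:8080 \
  cp.icr.io/cp/ai/watson-nlp-runtime:latest
```

Once the container is up, client applications send inference requests to the published port.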
Machine learning notebooks, tutorials, and datasets aimed at data science engineers are under the ML folder; assets focused on deployment are under the MLOps folder. Go to the respective folders to learn more about these assets.
- ML Assets
- MLOps Assets
  - Serve Pretrained Models using Docker
  - Serve Custom Models using Docker
  - Serve Models with Standalone Containers on Kubernetes or OpenShift
  - Serve Models with AWS Fargate
  - Serve Models with Azure Container Instances
  - Serve Models with IBM Code Engine
  - Serve Pretrained Models on Kubernetes or OpenShift
  - Serve Custom Models on Kubernetes or OpenShift
  - Serve Models with KServe ModelMesh
  - Create an NLP Python Client
- IBM Watson NLP Library for Embed
- IBM Technology Zone assets
  - Embeddable AI
  - Watson NLP - Text Classification
  - Watson NLP - Entities & Keywords extraction
  - Watson NLP - Topic Modeling
  - Watson NLP - Sentiment and Emotion Analysis
  - Watson NLP - Creating Client Applications
  - Watson NLP - Serving Models with Standalone Containers
  - Watson NLP - Serving Models with Kubernetes and OpenShift
- IBM Developer Tutorials
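To give a feel for the client side covered by the assets above, the sketch below assembles a REST inference request against a locally running runtime container. The endpoint path, port, header name, and model ID are illustrative assumptions, not confirmed API details; see the "Create an NLP Python Client" asset for the supported approach.

```python
import json
from urllib import request

# Assumed values -- verify against the runtime's API documentation.
RUNTIME_URL = (
    "http://localhost:8080/v1/watson.runtime.nlp.v1/NlpService/SentimentPredict"
)
MODEL_ID = "sentiment_document-cnn-workflow_en_stock"  # assumed model name

def build_request(text: str) -> request.Request:
    """Assemble an HTTP POST request carrying the document to analyze."""
    payload = json.dumps({"rawDocument": {"text": text}}).encode("utf-8")
    return request.Request(
        RUNTIME_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Assumption: the runtime selects the model via a metadata header.
            "Grpc-Metadata-mm-model-id": MODEL_ID,
        },
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("Watson NLP is straightforward to embed.")
    # Requires a running runtime container on localhost:8080.
    with request.urlopen(req) as resp:
        print(json.load(resp))
```

The request body and headers are built separately from the network call, so the same helper works for both REST experimentation and unit testing without a live container.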