---
title: NLPToolkit
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.11.0
app_file: app.py
pinned: false
---

# NLP Model Deployment with FastAPI

*(Badges: Build Status · Security Status · Dependabot Status · Python Version · License · Code Coverage · Downloads · Issues)*

## Overview

This project demonstrates how to deploy Natural Language Processing (NLP) models using FastAPI, a modern web framework for building APIs with Python. The application integrates two pre-trained models from the Hugging Face Transformers library:

- **Sequence Classification Model**: used for tasks such as sentiment analysis.
- **Question Answering Model**: provides answers based on a given context.

## Features

**RESTful API Endpoints:**

- `/predict`: accepts user input and returns model predictions.
- `/health`: reports the health status of the API.

**Model Integration:**

- Incorporates Hugging Face's `AutoModelForSequenceClassification` and `AutoModelForQuestionAnswering` for NLP tasks, as sketched below.
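
To make the model integration concrete, the sketch below shows how `/predict` and `/health` can be wired to a Transformers sequence classification model. The checkpoint name and response format are illustrative assumptions, not the project's actual configuration; the real implementation lives in `app/main.py`, and the question answering model follows the same pattern with `AutoModelForQuestionAnswering`.

```python
# Minimal sketch of the API, assuming an illustrative sentiment-analysis checkpoint.
import torch
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

app = FastAPI()

# Hypothetical checkpoint; the project may serve a different model.
CHECKPOINT = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT)


class PredictRequest(BaseModel):
    text: str


@app.get("/health")
def health():
    # Simple liveness check.
    return {"status": "ok"}


@app.post("/predict")
def predict(request: PredictRequest):
    # Tokenize the input and run it through the classification model.
    inputs = tokenizer(request.text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = logits.softmax(dim=-1)[0]
    label_id = int(probs.argmax())
    return {"label": model.config.id2label[label_id], "score": float(probs[label_id])}
```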

## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/nlp-fastapi-deployment.git
   cd nlp-fastapi-deployment
   ```

2. Set up a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate   # On Windows: venv\Scripts\activate
   ```

3. Install the dependencies:

   ```bash
   pip install -r requirements.txt
   ```

## Usage

1. Start the FastAPI server:

   ```bash
   uvicorn main:app --reload
   ```

   The API will be accessible at http://127.0.0.1:8000.

2. Interact with the API:

   Navigate to http://127.0.0.1:8000/docs to access the interactive API documentation provided by Swagger UI.

**Example request:**

```bash
curl -X POST "http://127.0.0.1:8000/predict" \
     -H "Content-Type: application/json" \
     -d '{"text": "Your input text here"}'
```
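
The same request can be made from Python. A minimal sketch using the `requests` package (the exact response fields depend on the Pydantic models defined in `app/models.py`):

```python
import requests

# Send a text sample to the /predict endpoint and print the JSON response.
# The response fields depend on the Pydantic models in app/models.py.
response = requests.post(
    "http://127.0.0.1:8000/predict",
    json={"text": "Your input text here"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```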

## Project Structure

```text
nlp-fastapi-deployment/
├── app/
│   ├── __init__.py
│   ├── main.py          # Main application file
│   ├── models.py        # Pydantic models for request and response
│   ├── nlp_models.py    # Functions for loading and utilizing NLP models
│   └── utils.py         # Utility functions
├── requirements.txt     # Project dependencies
├── README.md            # Project documentation
└── .gitignore           # Git ignore file
```
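
For orientation, `app/models.py` might define request and response schemas along these lines; the field names here are assumptions for illustration, not the project's actual schema:

```python
# Illustrative request/response schemas; the real definitions live in app/models.py.
from typing import Optional

from pydantic import BaseModel


class PredictRequest(BaseModel):
    text: str                       # Raw input text to classify or query against
    question: Optional[str] = None  # Optional question for the QA model


class PredictResponse(BaseModel):
    label: str    # Predicted class label (e.g. POSITIVE / NEGATIVE)
    score: float  # Model confidence for the predicted label
```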

## Dependencies

  • FastAPI: Web framework for building APIs with Python.
  • Transformers: Library for state-of-the-art NLP models.
  • Torch: Deep learning framework used by Transformers.
  • Uvicorn: ASGI server for running FastAPI applications.

Ensure all dependencies are listed in requirements.txt for easy installation.
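
A minimal `requirements.txt` along these lines would cover the stack described above (left unpinned here for illustration; pin whichever versions the project has actually been tested with):

```text
fastapi
uvicorn[standard]
transformers
torch
pydantic
```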

## Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes.

## License

This project is licensed under the MIT License. See the LICENSE file for details.

## Acknowledgements

  • Hugging Face for providing accessible NLP models.
  • FastAPI for the high-performance API framework. For a visual guide on creating a deep learning API with FastAPI, you might find the following resource helpful: https://youtu.be/NrarIs9n24I

An example chatbot using Gradio, huggingface_hub, and the Hugging Face Inference API.