Pixel Prompt is a versatile application built with React Native that can be deployed as a standalone application or paired with a backend powered by FastAPI and Docker. Although it is currently configured for diffusion models, Pixel Prompt is designed to handle a wide range of ML workloads, offering flexibility and scalability.
To ensure a comprehensive understanding of the application's architecture, here are the key components and deployment strategies:
- Frontend: The frontend is developed with React Native, providing a user-friendly interface for interacting with the underlying ML models and backend services.
- Backend: The backend is built with FastAPI and Docker, offering a robust, scalable foundation for hosting and managing ML models and their APIs. FastAPI provides a fast, efficient framework for building APIs, while Docker ensures consistent, reproducible deployments across environments.
- Containerization: Both the frontend and backend components can be packaged into lightweight, portable Docker containers, allowing easy deployment and scaling with consistent behavior across environments. Each container encapsulates its dependencies and configuration, making it simple to deploy Pixel Prompt on different platforms and infrastructures.
- JavaScript (JS): By leveraging React Native, Pixel Prompt can be built as a self-contained JavaScript application, giving a unified codebase that compiles and deploys to multiple platforms, including mobile devices and web browsers. The JavaScript version of Pixel Prompt provides a seamless, platform-agnostic user experience.
For a more in-depth discussion of the architectures and deployment strategies, refer to the article *Cloud Bound: React Native and FastAPI for ML*.
To preview the application, visit the hosted version on AWS here.
Before running this application locally, ensure that the following dependencies are installed on your machine. Each version has separate build instructions:
- Node
- npm (Node Package Manager)
- Python
To initialize all submodules:

```bash
git submodule update --init --recursive
```

To update all submodules:

```bash
git submodule update --remote --merge
```
All the models are open source and available on HuggingFace.
- Random
- stabilityai/stable-diffusion-3-medium
- stabilityai/stable-diffusion-xl-base-1.0
- nerijs/pixel-art-xl
- Fictiverse/Voxel_XL_Lora
- dallinmackay/Van-Gogh-diffusion
- gsdf/Counterfeit-V2.5
- Gustavosta/MagicPrompt-Stable-Diffusion
- meta-llama/Meta-Llama-3-70B-Instruct
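As a sketch of how the diffusion models above can be called through the HuggingFace Inference API, the snippet below uses `huggingface_hub`'s `InferenceClient.text_to_image` helper. The `pick_model` helper is a hypothetical name that mirrors the app's Random option; a valid HuggingFace token and network access are assumed for the actual call:

```python
import random

# Diffusion models from the list above.
MODELS = [
    "stabilityai/stable-diffusion-3-medium",
    "stabilityai/stable-diffusion-xl-base-1.0",
    "nerijs/pixel-art-xl",
    "Fictiverse/Voxel_XL_Lora",
    "dallinmackay/Van-Gogh-diffusion",
    "gsdf/Counterfeit-V2.5",
]

def pick_model(choice: str) -> str:
    """Resolve the 'Random' option to a concrete model id."""
    return random.choice(MODELS) if choice == "Random" else choice

def generate_image(prompt: str, choice: str = "Random"):
    # Requires huggingface_hub, an HF token, and network access.
    from huggingface_hub import InferenceClient
    client = InferenceClient()
    return client.text_to_image(prompt, model=pick_model(choice))
```

`text_to_image` returns a PIL image, which the backend can then encode (e.g., as base64) for the React Native frontend.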
This app was created using the HuggingFace Inference API. Although free to use, some functionality isn't available yet: the Style and Layout switches rely on the IP-Adapter, which the Inference API doesn't support. If you decide to use custom endpoints, this functionality is available now.
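For the custom-endpoint route, `InferenceClient` also accepts a deployed Inference Endpoint URL in place of a model id. The URL below is a placeholder, not a real endpoint:

```python
# Placeholder URL -- substitute your own Inference Endpoint.
ENDPOINT_URL = "https://my-endpoint.endpoints.huggingface.cloud"

def generate_with_endpoint(prompt: str):
    # Requires huggingface_hub, a valid endpoint, and network access.
    from huggingface_hub import InferenceClient
    client = InferenceClient(model=ENDPOINT_URL)
    return client.text_to_image(prompt)
```

Pointing the client at a dedicated endpoint is what unlocks features the shared Inference API doesn't support, such as IP-Adapter-based style and layout control.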
This project is licensed under the MIT License.
This application is built with Expo, a powerful framework for building cross-platform mobile applications. Learn more about Expo: https://expo.io
This application uses the HuggingFace Inference API and the Diffusers library, both provided by HuggingFace.