Ladyleet

This workspace has been generated by Nx (Smart Monorepos · Fast CI).

Integrate with editors

Enhance your Nx experience by installing Nx Console for your favorite editor. Nx Console provides an interactive UI to view your projects, run tasks, generate code, and more! It is available for VSCode and IntelliJ, and ships with an LSP for Vim users.

Start the application and server

In one terminal, run npx nx serve ai-chat-server to start the local chat server. Before starting it, add your OPENAI_API_KEY to the .env file (a sketch follows below). In a second terminal, run npx nx serve ladyleet-ai-chat to start the Angular application.
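
A minimal sketch of the .env file; only OPENAI_API_KEY is named in this README, and where the server reads it from depends on how the server is configured:

  # .env (placed where the chat server loads environment variables from)
  OPENAI_API_KEY=sk-your-key-here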

You should now have a basic app running at http://localhost:4200 where you can use the chat.

Build for production

Deploy the server backend first, then deploy the frontend, configured to point at the backend.

The server app:

  1. Build the server application with npx nx build ai-chat-server.

The client app:

  1. Update environment.prod.ts in the Angular application so it points at the URL of your deployed chat server (see the sketch after this list).
  2. Run npx nx build ladyleet-ai-chat to build the application. The build artifacts are stored in the output directory (e.g. dist/ or build/), ready to be deployed.
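
A minimal sketch of what environment.prod.ts could contain. The property name apiUrl, the example URL, and the file path in the comment are assumptions; use whatever property your chat service actually reads:

  // e.g. apps/ladyleet-ai-chat/src/environments/environment.prod.ts (path may differ)
  // NOTE: "apiUrl" and the URL below are placeholders for illustration.
  export const environment = {
    production: true,
    apiUrl: 'https://your-chat-server.example.com',
  };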

Running tasks

To execute tasks with Nx use the following syntax:

npx nx <target> <project> <...options>

You can also run multiple targets:

npx nx run-many -t <target1> <target2>

...or add -p to filter specific projects:

npx nx run-many -t <target1> <target2> -p <proj1> <proj2>
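
For example, with the projects in this workspace (assuming they define the usual build and test targets):

  npx nx build ai-chat-server
  npx nx run-many -t build test -p ladyleet-ai-chat ai-chat-server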

Targets can be defined in package.json (as scripts) or in project.json. Learn more in the Nx docs.
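
As a sketch, a custom target in a project's project.json might look like the following; the target name print-info and its command are made up for illustration, and nx:run-commands is Nx's stock executor for running shell commands:

  {
    "name": "ai-chat-server",
    "targets": {
      "print-info": {
        "executor": "nx:run-commands",
        "options": {
          "command": "echo ai-chat-server targets are defined here"
        }
      }
    }
  }

You would then run it with npx nx print-info ai-chat-server.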

Set up CI!

Nx comes with local computation caching built in (check your nx.json). On CI you might want to go a step further, for example by only running tasks for projects affected by a change.
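
A sketch of the kind of command a CI pipeline could run; the lint, test, and build targets are assumed to exist for your projects:

  # run tasks only for projects affected relative to the main branch
  npx nx affected -t lint test build --base=origin/main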

Explore the project graph

Run npx nx graph to visualize the workspace. It shows your projects, how they depend on each other, and the tasks you can run with Nx.
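
For example, to focus the graph on the chat app and the projects it depends on:

  npx nx graph --focus=ladyleet-ai-chat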

Connect with us!