✨ This workspace has been generated by Nx, Smart Monorepos · Fast CI. ✨
Enhance your Nx experience by installing Nx Console for your favorite editor. Nx Console provides an interactive UI to view your projects, run tasks, generate code, and more. It is available for VSCode and IntelliJ, and ships with an LSP for Vim users.
In one terminal, run `npx nx serve ai-chat-server` to start the local chat server. You will need to add your `OPENAI_API_KEY` to the `.env` file.
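The `.env` file can be as small as a single line; a minimal sketch (the value shown is a placeholder, not a real key):

```shell
# .env — read by the chat server at startup.
# Replace the placeholder with your real OpenAI API key.
OPENAI_API_KEY=sk-your-key-here
```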
In another terminal, run `npx nx serve ladyleet-ai-chat` to start the Angular application.
You should now have a basic app running at http://localhost:4200 that lets you use the chat.
You'll need to deploy the server backend first, then deploy the frontend configured to point at it.
- Build the server application with `npx nx build ai-chat-server`.
The client app:

- Update `environment.prod.ts` for the Angular application to point to the URL of your chat server.
- Run `npx nx build ladyleet-ai-chat` to build the application. The build artifacts are stored in the output directory (e.g. `dist/` or `build/`), ready to be deployed.
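As a sketch of the `environment.prod.ts` change — assuming the app reads the server URL from a property named `apiUrl` (your generated file may use a different property name, so check what your Angular services actually reference):

```typescript
// environment.prod.ts — a minimal sketch. `apiUrl` is an assumed
// property name; the URL below is a placeholder for your deployed server.
export const environment = {
  production: true,
  apiUrl: 'https://your-chat-server.example.com',
};
```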
To execute tasks with Nx, use the following syntax:

```shell
npx nx <target> <project> <...options>
```
You can also run multiple targets:

```shell
npx nx run-many -t <target1> <target2>
```

...or add `-p` to filter specific projects:

```shell
npx nx run-many -t <target1> <target2> -p <proj1> <proj2>
```
Targets can be defined in the `package.json` or `project.json`. Learn more in the docs.
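For example, a `project.json` target might look like this — a hedged sketch using Nx's built-in `nx:run-commands` executor; the generated targets in this workspace will differ:

```json
{
  "name": "ai-chat-server",
  "targets": {
    "hello": {
      "executor": "nx:run-commands",
      "options": { "command": "echo hello" }
    }
  }
}
```

You could then run it with `npx nx hello ai-chat-server`.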
Nx comes with local caching already built-in (check your `nx.json`). On CI you might want to go a step further.
Run `npx nx graph` to show the graph of the workspace. It will show tasks that you can run with Nx.