Template-Api-Rest-Node-Docker-Prisma

REST API template in Node.js with Fastify



About · Vitrine Dev · Technologies · Installation · Features · Author · License

 

💻 About the project

🚀 REST API template in Node.js with Fastify, TypeScript, Zod, Prisma, Docker, Vitest, ESLint and JSON Web Token.

 


Deploy: Render · Heroku

 


 

📺 Vitrine Dev

🪧 Vitrine.Dev
✨ Name: Template REST API in Node.js with Fastify
🏷️ Technologies: Node.js, TypeScript, JavaScript, .env, Fastify, Prisma, Zod, databases (MySQL, PostgreSQL, SQLite), Docker, Vitest, ESLint, JSON Web Token and Insomnia

 

🛠 Technologies

The following tools were used to build this project:

 

Node.js · TypeScript · JavaScript · Fastify · Dotenv · Prisma · Zod · Docker · SQLite · PostgreSQL · MySQL · Vitest · Insomnia · Swagger · JSON Web Tokens · VS Code · Prettier · ESLint


 

⚙️ Installation

 

Configuring VS Code

The JsTs.code-profile file is a VS Code profile named JS(Javascript/NodeJs) and TS(Typescript) that contains the editor settings for the project, such as extensions, formatting settings and lint settings.

To import the profile, press Ctrl+Shift+P, type import profile, then select the JsTs.code-profile file.

This profile is set up for Node.js and TypeScript projects; feel free to adjust the settings for your own project.


 

Creating the Node.js project

# Create the Node.js project; the -y flag accepts all default options
npm init -y
// Add these scripts to package.json
"scripts": {
  "dev": "tsx watch src/server.ts", // Run the server in development mode
  "build": "tsup src --out-dir build", // Build the server for production
  "start": "node build/server.js" // Run the built server in production
},

Create a .gitignore file

Create a .npmrc file with save-exact=true so dependencies are saved with exact versions
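
A minimal sketch of what these two files might contain in this template (entries are suggestions only, adjust to your project):

# .npmrc
save-exact=true

# .gitignore (typical entries for a Node.js/TypeScript project like this one)
node_modules
build
coverage
.env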

 

.env architecture

npm install dotenv # Install dotenv to use environment variables in NodeJs

Create a .env file with all environment variables and add it to .gitignore

Create a .env.example file with the same variables and do not add it to .gitignore
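
Later sections import env from '@/env' (for example env.JWT_SECRET and env.NODE_ENV), so the template presumably centralizes and validates environment variables with zod. A minimal sketch of such a src/env/index.ts, assuming these variable names and defaults:

// src/env/index.ts - sketch only; variable names and defaults are assumptions
import 'dotenv/config'
import { z } from 'zod'

const envSchema = z.object({
  NODE_ENV: z.enum(['dev', 'test', 'production']).default('dev'),
  JWT_SECRET: z.string(),
  DATABASE_URL: z.string(),
  PORT: z.coerce.number().default(3333),
})

const _env = envSchema.safeParse(process.env)

if (_env.success === false) {
  console.error('❌ Invalid environment variables', _env.error.format())
  throw new Error('Invalid environment variables.')
}

export const env = _env.data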

 

Configuring ESLint

npm install -D @rocketseat/eslint-config # Install Rocketseat ESLint config

Create a .eslintrc.json file with the ESLint configuration

Create a .eslintignore file listing the files ESLint should ignore
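
A possible .eslintrc.json, assuming the Rocketseat config exposes a Node preset (adjust the path if your version differs), followed by a typical .eslintignore:

// .eslintrc.json - sketch only
{
  "extends": ["@rocketseat/eslint-config/node"]
}

# .eslintignore - suggested entries
node_modules
build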

 

TypeScript architecture

npm install -D typescript # Install TypeScript
npm install -D @types/node # Install @types/node to get Node.js type definitions
npm install -D tsx # Install tsx to run TypeScript directly in Node.js during development
npm install -D tsup # Install tsup to bundle the TypeScript code for production
npm install zod # Install zod for data validation and typing
npx tsc --init # Create tsconfig.json
"target": "es2020", // Change the TypeScript target to ES2020 in tsconfig.json
"baseUrl": "./", // Specify the base directory to resolve non-relative module names
"paths": {
  "@/*": [
    "./src/*"
  ],
} // Specify path aliases for imports in tsconfig.json (usage example below)
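
With the alias above (mirrored later in vite.config.ts via vite-tsconfig-paths), files under src can be imported without long relative paths, for example:

// anywhere in the project:
import { prisma } from '@/lib/prisma' // resolves to src/lib/prisma.ts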

 

Fastify architecture

npm install fastify # Install Fastify
npm install @fastify/jwt # Install @fastify/jwt to use JWT in Fastify
npm install @fastify/cookie # Install @fastify/cookie to use cookie in Fastify

Create a fastify-jwt.d.ts file in the @types folder to type the JWT payload used by Fastify

import '@fastify/jwt'

declare module '@fastify/jwt' {
  export interface FastifyJWT {
    user: {
      sub: string
    }
  }
}

Create JWT_SECRET in .env and .env.example

# Auth token in development mode
JWT_SECRET="secret"

Register fastifyJwt in app.ts to enable JWT in Fastify (a fuller app.ts sketch follows the snippet below)

app.register(fastifyJwt, {
  secret: env.JWT_SECRET,
})
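
A minimal sketch of how app.ts might wire this together (file layout, plugin options and route registration are assumptions based on the scripts above):

// src/app.ts - sketch only
import fastify from 'fastify'
import fastifyJwt from '@fastify/jwt'
import fastifyCookie from '@fastify/cookie'
import { env } from '@/env'

export const app = fastify()

// JWT signing/verification, using the secret validated in src/env
app.register(fastifyJwt, {
  secret: env.JWT_SECRET,
})

// Cookie support (e.g. for a refresh token)
app.register(fastifyCookie)

// app.register(yourRoutes) // register your route plugins here

src/server.ts (the entry point referenced by the dev and build scripts) would then import app and call something like app.listen({ host: '0.0.0.0', port: 3333 }), with 3333 matching the development instructions further below.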

 

Docker architecture

Install Docker: https://docs.docker.com/get-docker/

// Add these scripts to package.json
"scripts": {
  "start-docker": "docker-compose up -d", // Start the containers in the background
  "stop-docker": "docker-compose stop" // Stop the containers
},

Create a docker-compose.yml file with the Docker Compose configuration (a sketch follows below)

Create a .dockerignore file listing the files Docker should ignore when building the image
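
A possible docker-compose.yml for the PostgreSQL database used in development (image tag, credentials and volume name below are placeholders):

version: '3.8'

services:
  db:
    image: postgres:15
    ports:
      - 5432:5432
    environment:
      POSTGRES_USER: docker
      POSTGRES_PASSWORD: docker
      POSTGRES_DB: api-rest-db
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data: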

docker ps # List running containers
docker ps -a # List all containers
docker images # List all images
docker pull mysql # Pull the mysql image (if you want to use MySQL)
docker pull postgres # Pull the postgres image (if you want to use PostgreSQL)
docker pull mariadb # Pull the mariadb image (if you want to use MariaDB)
docker start <container_id or container_name> # Start a container by id or name
docker stop <container_id or container_name> # Stop a container by id or name
docker pause <container_id or container_name> # Pause a container by id or name
docker unpause <container_id or container_name> # Unpause a container by id or name
docker rm <container_id or container_name> # Remove a container by id or name
docker logs <container_id or container_name> # Show the logs of a container by id or name
docker inspect <container_id or container_name> # Show detailed information about a container by id or name
docker-compose --version # Show the docker-compose version
docker-compose up # Start the services and show their logs in the terminal
docker-compose up -d # Start the services in the background (no logs in the terminal)
docker-compose start # Start existing service containers
docker-compose stop # Stop the containers without removing them (database data is kept)
docker-compose down # Stop and remove the containers and networks (database data may be lost unless it is stored in a named volume)

 

Prisma database architecture

npm install -D prisma # Install Prisma
npm i -D prisma-erd-generator @mermaid-js/mermaid-cli # Install Prisma ERD generator
npm i @prisma/client # Install Prisma client

npx prisma init # Create the prisma folder with schema.prisma and add DATABASE_URL to .env
npx prisma migrate dev # Create and apply a migration; prompts for a name, e.g. 'create table habits'
npx prisma studio # Open Prisma Studio in the browser
npx prisma studio -b firefox -p 5173 # Open Prisma Studio in a specific browser and port
npx prisma generate # Generate the Prisma Client (and the ERD diagram once the generator below is added)
npx prisma db seed # Seed the database with the data in prisma/seed.ts (see the seed setup sketched after this block)
// Add the generator to schema.prisma
generator erd {
  provider = "prisma-erd-generator"
}
// Add these scripts to package.json
"scripts": {
  "studio": "npx prisma studio -b firefox -p 5173", // Open Prisma Studio in a specific browser and port
  "generate": "npx prisma generate", // Generate the Prisma Client and the ERD diagram
  "migrate": "npx prisma migrate dev", // Run database migrations
  "seed": "npx prisma db seed" // Seed the database
},
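
For npx prisma db seed to find the seed script, Prisma also reads a prisma.seed command from package.json. A possible setup, using tsx as the runner (an assumption) and a hypothetical User model in prisma/seed.ts:

// package.json
"prisma": {
  "seed": "tsx prisma/seed.ts"
},

// prisma/seed.ts - sketch only; the User model here is hypothetical
import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()

async function seed() {
  // Insert development data here
  await prisma.user.create({
    data: { name: 'Dev User', email: 'dev@example.com' },
  })
}

seed()
  .catch((error) => {
    console.error(error)
    process.exit(1)
  })
  .finally(async () => {
    await prisma.$disconnect()
  })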

Create a src/lib/prisma.ts file with the Prisma client setup

import { env } from '@/env'
import { PrismaClient } from '@prisma/client'

export const prisma = new PrismaClient({
  log: env.NODE_ENV === 'dev' ? ['query'] : [],
}) // Create prisma client with log in development mode

Environment variables for the databases (PostgreSQL, MySQL, SQLite)

 

Create DATABASE_URL in .env and .env.example for PostgreSQL

DATABASE_URL="postgresql://USER:PASSWORD@HOST:PORT/DATABASE_NAME?schema=public"
// Add the datasource to schema.prisma for PostgreSQL
datasource db {
  provider          = "postgresql"
  url               = env("DATABASE_URL")
}

 

Create DATABASE_URL in .env and .env.example for MySQL

DATABASE_URL="mysql://USER:PASSWORD@HOST:PORT/DATABASE_NAME"

SHADOW_DATABASE_URL="mysql://OTHER_USER:PASSWORD@HOST:PORT/OTHER_DATABASE_NAME"
// Add the datasource to schema.prisma for MySQL
datasource db {
  provider          = "mysql"
  url               = env("DATABASE_URL")
  shadowDatabaseUrl = env("SHADOW_DATABASE_URL")
}
// https://www.prisma.io/docs/concepts/components/prisma-migrate/shadow-database
// Some providers do not allow the database user to create databases; in that case, configure a shadow database manually (see the link above)

 

Create DATABASE_URL in .env and .env.example for SQLite

DATABASE_URL="file:./dev.db"
// Add the datasource to schema.prisma for SQLite
datasource db {
  provider = "sqlite"
  url      = env("DATABASE_URL")
}

 

Vitest architecture

npm install -D vitest # Install Vitest
npm install -D vite-tsconfig-paths # So Vitest can resolve the tsconfig path aliases
npm install -D @vitest/coverage-c8 # Install the Vitest coverage provider
npm install -D @vitest/ui # Install the Vitest UI
npm install -D supertest # Install supertest to test http requests
npm install -D @types/supertest # Install types supertest

Create a vite.config.ts file with the Vitest configuration

import tsconfigPaths from 'vite-tsconfig-paths'
import { defineConfig } from 'vitest/config'

export default defineConfig({
  plugins: [tsconfigPaths()],
})
// Now Vitest can resolve the tsconfig path aliases
// Add these scripts to package.json
"scripts": {
  "test:create-prisma-environment": "npm link ./prisma/vitest-environment-prisma", // Register the local vitest-environment-prisma package as a global npm link
  "test:install-prisma-environment": "npm link vitest-environment-prisma", // Link vitest-environment-prisma into node_modules
  "test": "vitest run --dir src/use-cases", // Run the unit tests once (no watch)
  "test:watch": "vitest --dir src/use-cases", // Run the unit tests in watch mode
  "pretest:e2e": "run-s test:create-prisma-environment test:install-prisma-environment", // Runs before test:e2e; run-s runs scripts in sequence (npm install -D npm-run-all)
  "test:e2e": "vitest run --dir src/http", // Run the e2e tests once (no watch); an example test is sketched after this block
  "test:e2e:watch": "vitest --dir src/http", // Run the e2e tests in watch mode
  "test:coverage": "vitest run --coverage", // Run all tests with coverage
  "test:ui": "vitest --ui", // Run all tests with the Vitest UI
},
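
An e2e test using supertest could look roughly like this (the app export from src/app.ts and the /users route are assumptions):

// src/http/controllers/create-user.spec.ts - sketch only
import request from 'supertest'
import { afterAll, beforeAll, describe, expect, it } from 'vitest'
import { app } from '@/app'

describe('Create user (e2e)', () => {
  beforeAll(async () => {
    await app.ready() // wait until all Fastify plugins and routes are loaded
  })

  afterAll(async () => {
    await app.close()
  })

  it('should create a user', async () => {
    const response = await request(app.server).post('/users').send({
      name: 'John Doe',
      email: 'johndoe@example.com',
      password: '123456',
    })

    expect(response.statusCode).toEqual(201)
  })
})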

Create vitest-environment-prisma, a custom test environment for Prisma

cd prisma/vitest-environment-prisma # Enter the vitest-environment-prisma folder
npm init -y # Create its package.json
npm link # Register this folder as a global npm link
cd ../../ # Return to the project root
npm link vitest-environment-prisma # Link vitest-environment-prisma into node_modules

Edit the package.json of vitest-environment-prisma like this

{
  "name": "vitest-environment-prisma",
  "main": "prisma-test-environment.ts"
}

Create prisma-test-environment.ts in the vitest-environment-prisma folder, for example as sketched below
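
A common implementation creates an isolated PostgreSQL schema per test file; a sketch, assuming a PostgreSQL DATABASE_URL and that the Environment type is exported from 'vitest' in the installed version:

// prisma-test-environment.ts - sketch only
import 'dotenv/config'
import { execSync } from 'node:child_process'
import { randomUUID } from 'node:crypto'
import { PrismaClient } from '@prisma/client'
import type { Environment } from 'vitest'

const prisma = new PrismaClient()

function generateDatabaseURL(schema: string) {
  if (!process.env.DATABASE_URL) {
    throw new Error('Please provide a DATABASE_URL environment variable.')
  }
  const url = new URL(process.env.DATABASE_URL)
  url.searchParams.set('schema', schema)
  return url.toString()
}

export default <Environment>{
  name: 'prisma',
  transformMode: 'ssr',
  async setup() {
    // Point Prisma at a unique schema for this test file and apply migrations
    const schema = randomUUID()
    process.env.DATABASE_URL = generateDatabaseURL(schema)
    execSync('npx prisma migrate deploy')

    return {
      async teardown() {
        // Drop the schema created for this test file
        await prisma.$executeRawUnsafe(`DROP SCHEMA IF EXISTS "${schema}" CASCADE`)
        await prisma.$disconnect()
      },
    }
  },
}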

Edit vite.config.ts to map these tests to the Prisma environment

// Any test inside src/http/controllers will run in the prisma environment (vitest-environment-prisma)
test: {
  environmentMatchGlobs: [['src/http/controllers/**', 'prisma']],
},

 

Other libraries

npm install -D npm-run-all # Install npm-run-all to run multiple scripts in parallel or in sequence
npm install ... # Install libraries

 

⚙️ Features

RF - Functional Requirements

  • The user must be able to create new transactions;
  • The user must be able to list all transactions that have already been created;
  • And so on...;

RN - Business Rules

  • A transaction must be of type income (credit) or expense (debit);
  • It must be possible to identify the user who created the transactions (note: authentication is not required);
  • And so on...;

RNF - Non-Functional Requirements

  • E2e tests for every route, written with Vitest;
  • SQLite in the Dev environment and PostgreSQL in the Prod environment;
  • Prisma for migrations, queries and the database diagram;
  • Fastify for routes and middlewares;
  • Zod for input data validation;
  • Supertest for integration tests;
  • Tsup to build the TypeScript code for production;
  • Tsx to run TypeScript in development mode;
  • ESLint for code standardization;
  • Prettier for code formatting;
  • Docker for deploying the application and standardizing the development environment;
  • Insomnia for testing requests;
  • GitHub for code versioning;
  • Swagger for API documentation;

 

🧭 Running the application (development mode)

git clone https://github.com/livioalvarenga/Template-Api-Rest-Node-Docker-Prisma.git # Clone this repository
cd Template-Api-Rest-Node-Docker-Prisma # Enter the project folder in your terminal/cmd
npm install # Install the dependencies
npm run start-docker # Start the development database on port 5432
npm run dev # Run the application in development mode; it listens on port 3333 - open http://localhost:3333

npm run stop-docker # Stop the development database

# Or

npm run lets-code # Start the development database and run the application in development mode

🧭 Running the application (production mode)

npm run build # Build the TypeScript code for production
npm run start # Start the server in production mode

🧭 Prisma

npm run studio # Open Prisma Studio to browse the database
npm run migrate # Run the database migrations
npm run seed # Seed the database with development data
npm run generate # Generate the database diagram

🧭 Tests

npm run test # Run the tests
npm run test:watch # Run the tests in watch mode
npm run test:coverage # Run the tests with coverage
npm run test:ui # Run the tests with the Vitest UI

Testing requests with Insomnia

npm run dev # Start the server
# Select the dev environment (red) in Insomnia

# To test the API deployed in production, use the prod environment (green) in Insomnia

Import the Insomnia.json file into Insomnia to test the requests


 


 

🦸 Author

Hi, I'm Livio Alvarenga, Production Engineer | Back-end and Front-end Dev. I am passionate about technology, programming, processes and planning, and I have brought all of these passions together in a single profession. Questions, suggestions and criticism are very welcome. Here are my contacts.

 

Portfolio · LinkedIn · Twitter · Instagram · Facebook · YouTube

VitrineDev profile


 

📝 License

This project is MIT licensed.

#CompartilheConhecimento