Experimental blockchain API exploring Ethereum integration patterns, with caching and async processing workflows. Built with Express, PostgreSQL, Redis, and BullMQ.
Blockchain API
Project Overview
A blockchain API service built with Node.js, TypeScript, and Express.js that provides Ethereum blockchain
integration alongside user authentication and management capabilities. This project demonstrates
architecture patterns, security best practices, and scalable design principles.
Key Features
Authentication & Authorization: JWT-based authentication with secure password hashing
User Management: Complete user profile management with authentication
Blockchain Integration: Ethereum transaction querying, balance checking, and address validation
Smart Caching: Multi-tier caching system (Redis + Database + Provider) for optimal performance
Docker Support: Complete containerisation with multi-service orchestration
API Documentation: Interactive Swagger/OpenAPI documentation
Testing Suite: Unit and integration tests (still in progress)
Security: Production-ready security middlewares and best practices
Scalability: Queue-based background processing for heavy operations
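The JWT-based authentication above is typically handled by a library such as jsonwebtoken, but the underlying HS256 mechanism is just base64url encoding plus an HMAC-SHA256 signature. This is a minimal sketch using Node's built-in crypto (payload fields and the secret are illustrative, not the project's actual code):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Minimal HS256 JWT sketch -- illustrative only; a real service should use
// a maintained library such as jsonwebtoken or jose.
const b64url = (data: Buffer | string): string =>
  Buffer.from(data).toString("base64url");

function signToken(payload: Record<string, unknown>, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const body = b64url(JSON.stringify(payload));
  const signature = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${signature}`;
}

function verifyToken(token: string, secret: string): boolean {
  const [header, body, signature] = token.split(".");
  const expected = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  // Constant-time comparison to avoid timing side channels
  return (
    signature.length === expected.length &&
    timingSafeEqual(Buffer.from(signature), Buffer.from(expected))
  );
}

// Issue and verify a token for a hypothetical user id
const token = signToken({ sub: "user-123", iat: Date.now() }, "dev-secret");
console.log(verifyToken(token, "dev-secret"));   // true
console.log(verifyToken(token, "wrong-secret")); // false
```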
Architecture Highlights
Modular Design: Feature-based folder structure with versioned APIs
Type Safety: Full TypeScript implementation with strict type checking
Database: PostgreSQL with Prisma ORM for type-safe database operations
Caching: Redis for high-performance caching and session management
Background Jobs: Queue system with BullMQ for processing blockchain data asynchronously
Monitoring: Comprehensive logging with Winston and health check endpoints
Security: Helmet, CORS, rate limiting, and input validation
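The rate limiting listed above is usually provided by Express middleware such as express-rate-limit; internally, a fixed-window limiter is little more than a per-client counter. This standalone sketch shows the idea (class name, window size, and limit are illustrative, not the project's implementation):

```typescript
// Fixed-window rate limiter sketch -- what rate-limiting middleware does
// internally, stripped of the Express wiring.
class FixedWindowLimiter {
  private hits = new Map<string, { windowStart: number; count: number }>();

  constructor(private windowMs: number, private max: number) {}

  // Returns true if the request is allowed, false if the client is over the limit.
  allow(clientId: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(clientId);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request in a fresh window: reset the counter
      this.hits.set(clientId, { windowStart: now, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}

// Example: allow at most 3 requests per second per client
const limiter = new FixedWindowLimiter(1000, 3);
console.log([1, 2, 3, 4].map(() => limiter.allow("10.0.0.1", 0)));
// → [ true, true, true, false ]
```

In production the counters live in Redis rather than process memory, so the limit holds across multiple server instances.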
Tech Stack
Core Technologies:
Node.js 22+ with TypeScript
Express.js for REST API framework
PostgreSQL for primary database
Redis for caching and sessions
Prisma ORM for database operations
BullMQ for background processing
JWT for authentication
Blockchain Integration:
Ethers.js for Ethereum interaction
Etherscan API for transaction data
Smart contract detection and validation
Development & Operations:
Docker & Docker Compose for containerisation
Jest and Supertest for testing
ESLint & Prettier for code quality
Winston for structured logging
Swagger/OpenAPI for documentation
Installation & Setup
Prerequisites
Node.js 22+ and npm
Docker and Docker Compose (for containerised setup)
Git for version control
🚀 Quick Start with Docker (Recommended)
The fastest way to get the application running is using Docker Compose:
```bash
# 1. Clone the repository
git clone <repository-url>
cd ai-be-task-lqmwgk

# 2. Configure environment variables
cp .env.docker .env
# Edit .env and add your API keys:
# - SEPOLIA_RPC_URL (Alchemy, Infura, or other Ethereum provider)
# - ETHERSCAN_API_KEY (from etherscan.io)

# 3. Start all services
docker-compose up --build
```
Local Development Setup
```bash
# 1. Clone and install dependencies
git clone <repository-url>
cd ai-be-task-lqmwgk
npm install

# 2. Start only the databases with Docker
npm run dev:db:up

# 3. Configure environment
cp .env.example .env
# Edit .env with your API keys and configuration

# 4. Set up the database
npm run prisma:push
npm run prisma:generate

# 5. Start the development server
npm run dev:server

# 6. Start the background worker
npm run dev:worker

# Alternatively, you can run both in development mode.
# I recommend starting them in two separate terminals, otherwise there's too much clutter:
npm run dev:all
```
Environment Configuration
Required Environment Variables
```bash
# Blockchain API Keys (Required)
SEPOLIA_RPC_URL=https://eth-sepolia.g.alchemy.com/v2/YOUR_API_KEY
ETHERSCAN_API_KEY=YOUR_ETHERSCAN_API_KEY

# Security (Required)
JWT_SECRET=your-super-secret-jwt-key-minimum-32-characters

# Database & Redis (auto-configured for Docker) - use these for local development
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/postgres
REDIS_URL=redis://devuser:devpassword@localhost:6379
```
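Since the service cannot run without these variables, a small fail-fast check at startup catches misconfiguration early. A sketch of that idea, using the variable names listed above (the helper itself is hypothetical, not the project's actual config loader):

```typescript
// Fail-fast environment validation sketch (hypothetical helper).
const REQUIRED_VARS = [
  "SEPOLIA_RPC_URL",
  "ETHERSCAN_API_KEY",
  "JWT_SECRET",
  "DATABASE_URL",
  "REDIS_URL",
] as const;

function validateEnv(env: NodeJS.ProcessEnv): string[] {
  // Collect every missing variable so the error message is complete
  const problems: string[] = REQUIRED_VARS.filter((name) => !env[name]);
  // Enforce the minimum JWT secret length the docs require
  if (env.JWT_SECRET && env.JWT_SECRET.length < 32) {
    problems.push("JWT_SECRET (must be at least 32 characters)");
  }
  return problems;
}

const problems = validateEnv(process.env);
if (problems.length > 0) {
  console.error(`Missing/invalid environment variables: ${problems.join(", ")}`);
  // A real server would refuse to start here: process.exit(1);
}
```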
Database Setup
```bash
# Push schema to database
npm run prisma:push

# Generate Prisma client
npm run prisma:generate

# Open Prisma Studio (database GUI)
npm run prisma:studio

# Reset database (development only)
npx prisma db push --force-reset
```
Available Scripts
Build & Production
| Script | Description |
| --- | --- |
| `npm run build` | Compile TypeScript to JavaScript in `dist/` folder |
| `npm run start:server` | Start the production API server (requires build) |
| `npm run start:worker` | Start the production background worker (requires build) |
| `npm run start:all` | Start both API server and worker in production mode |
Development
| Script | Description |
| --- | --- |
| `npm run dev:server` | Start API server with hot reload using nodemon |
| `npm run dev:worker` | Start background worker with hot reload using nodemon |
| `npm run dev:all` | Start both API server and worker in development mode |
| `npm run worker` | Run worker directly with ts-node (for debugging) |
Testing
| Script | Description |
| --- | --- |
| `npm test` | Run all tests with open handles detection |
| `npm run test:unit` | Run only unit tests |
| `npm run test:integration` | Run only integration tests |
| `npm run test:unit:watch` | Run unit tests in watch mode |
| `npm run test:integration:watch` | Run integration tests in watch mode |
| `npm run test:watch` | Run all tests in watch mode |
| `npm run test:coverage` | Generate test coverage report |
Code Quality
| Script | Description |
| --- | --- |
| `npm run lint` | Check code for linting errors |
| `npm run lint:fix` | Automatically fix linting errors where possible |
Database Management
| Script | Description |
| --- | --- |
| `npm run prisma:push` | Push schema changes to database |
| `npm run prisma:generate` | Generate Prisma client from schema |
| `npm run prisma:studio` | Open Prisma Studio (database GUI) |
| `npm run prisma:seed` | Seed database with initial data |
Docker Development
| Script | Description |
| --- | --- |
| `npm run dev:db:up` | Start only PostgreSQL and Redis containers for local development |
| `npm run dev:db:down` | Stop development database containers |
| `npm run dev:db:logs` | View logs from development database containers |
| `npm run redis:cli` | Connect to Redis CLI in the container |
Common Development Workflows
```bash
# Full development setup
npm run dev:db:up        # Start databases
npm run prisma:push      # Set up database schema
npm run dev:server       # Start API server
npm run dev:worker       # Start background worker

# Testing workflow
npm run test:unit:watch  # Run unit tests in watch mode
npm run test:coverage    # Check test coverage

# Production build
npm run build            # Compile TypeScript
npm run start:all        # Start production services
```
Example: Get Current User
```bash
curl -X GET http://localhost:8000/api/v1/users/me \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```
Blockchain Endpoints
| Method | Endpoint | Description | Auth Required |
| --- | --- | --- | --- |
| GET | `/api/v1/eth/address/:address/transactions` | Get transactions for an Ethereum address | ❌ |
| GET | `/api/v1/eth/address/:address/balance` | Get ETH balance for an address | ❌ |
| GET | `/api/v1/eth/address/:address/count` | Get stored transaction count for an address | ❌ |
| GET | `/api/v1/eth/address/queue-info` | Get queue processing statistics (dev only) | ❌ |
Query Parameters for Transactions
| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| `fromBlock` | number | Starting block number | 0 (or contract creation block) |
| `toBlock` | number | Ending block number | Latest block |
| `page` | number | Page number for pagination | 1 |
| `limit` | number | Items per page (max 1000) | 1000 |
Example: Get Transactions
```bash
# Get all transactions for an address
curl "http://localhost:8000/api/v1/eth/address/0x742d35Cc6634C0532925a3b8D5c9C4B9e4C4c4c4/transactions"

# Get transactions with pagination and block range
curl "http://localhost:8000/api/v1/eth/address/0x742d35Cc6634C0532925a3b8D5c9C4B9e4C4c4c4/transactions?fromBlock=18000000&toBlock=18100000&page=1&limit=100"
```
Address Validation: Automatic checksum format validation and conversion
Contract Detection: Smart detection of contract vs EOA addresses
Multi-Network: Designed for Sepolia testnet with mainnet extensibility
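The validation and contract detection above rely on ethers.js (full EIP-55 checksum validation needs keccak256, which ethers exposes as `ethers.getAddress`, and contract detection uses on-chain bytecode via `provider.getCode`). As a dependency-free sketch, the first layer — checking the raw address shape — looks like this:

```typescript
// First-layer Ethereum address shape check -- a sketch only.
// Full EIP-55 checksum validation requires keccak256; with ethers.js use
// ethers.getAddress(address), which throws on a bad checksum.
// Contract vs EOA detection is on-chain: provider.getCode(address)
// returns "0x" for an EOA and the deployed bytecode for a contract.
const ETH_ADDRESS_RE = /^0x[0-9a-fA-F]{40}$/;

function isPlausibleAddress(address: string): boolean {
  return ETH_ADDRESS_RE.test(address);
}

console.log(isPlausibleAddress("0x742d35Cc6634C0532925a3b8D5c9C4B9e4C4c4c4")); // true
console.log(isPlausibleAddress("0x123"));                                      // false
```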
Transaction Querying
```typescript
// Actual transaction processing flow
const getTransactions = async (address: string, fromBlock: number = 0) => {
  // 1. Check cache first
  const cached = await getCachedPaginatedTransactionQuery(address, fromBlock, toBlock, page, limit, order);
  if (cached) return cached;

  // 2. Resolve starting block with 3-tier caching
  const addressInfo = await resolveStartingBlock(address);
  const actualFrom = fromBlock === 0
    ? getStartingBlockFromInfo(addressInfo)
    : Math.max(fromBlock, getStartingBlockFromInfo(addressInfo));

  // 3. Check for gaps in database coverage
  const gaps = findGapsInCoverage(coverageRanges, actualFrom, actualTo, address);
  if (gaps.length === 0) {
    // No gaps - serve from database
    return await fetchTransactionsFromDatabase(address, actualFrom, actualTo, page, limit, order);
  } else {
    // Gaps found - fetch from Etherscan and queue background processing
    const result = await fetchTransactionsFromEtherscan(address, actualFrom, actualTo, page, limit, order);
    processGapsInBackground(address, gaps); // Queue for BullMQ processing
    return result || await fetchTransactionsFromDatabase(address, actualFrom, actualTo, page, limit, order, { incomplete: true });
  }
};
```
Smart Gap Processing
Gap Detection: Identifies missing block ranges in database coverage
Immediate Response: Always serve data immediately (from Etherscan or database)
Background Filling: All gaps are queued for background processing using BullMQ
Coverage Tracking: Maintains block range coverage to avoid redundant queries
No Size Threshold: All gaps are processed in background regardless of size
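The gap detection described above amounts to interval subtraction: take the block ranges already covered in the database and subtract them from the requested range. A self-contained sketch of that logic (type and function names are illustrative, not the project's actual `findGapsInCoverage`):

```typescript
interface Range { from: number; to: number } // inclusive block range

// Gap detection sketch: subtract covered ranges from the requested [from, to].
function findGaps(covered: Range[], from: number, to: number): Range[] {
  // Sort a copy so the sweep below sees ranges in ascending order
  const sorted = [...covered].sort((a, b) => a.from - b.from);
  const gaps: Range[] = [];
  let cursor = from; // next block we still need coverage for
  for (const r of sorted) {
    if (r.to < cursor) continue;   // range ends before the cursor: irrelevant
    if (r.from > to) break;        // range starts after the request: done
    if (r.from > cursor) gaps.push({ from: cursor, to: r.from - 1 });
    cursor = Math.max(cursor, r.to + 1);
  }
  if (cursor <= to) gaps.push({ from: cursor, to }); // trailing uncovered tail
  return gaps;
}

// Example: coverage has holes at 151-199 and 301-400
console.log(findGaps([{ from: 0, to: 150 }, { from: 200, to: 300 }], 100, 400));
// → [ { from: 151, to: 199 }, { from: 301, to: 400 } ]
```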
Performance Optimizations
Contract Creation Detection: Binary search to find contract deployment block
Batch Processing: Efficient batch queries to external APIs
Intelligent Caching: Multi-tier caching reduces provider calls by 99%+
Transaction Caching: 5-minute TTL for transaction queries
Address Info Caching: 7-day TTL for contract/EOA detection
Automatic Invalidation: Smart cache invalidation on data updates
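In the service these TTLs are enforced by Redis (`SET key value EX seconds`), but the cache-aside pattern itself is simple. An in-memory sketch of the shape (class and helper names are illustrative):

```typescript
// Cache-aside with TTL -- in-memory sketch of the pattern applied with Redis.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  get(key: string, now: number = Date.now()): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (now >= entry.expiresAt) {
      this.store.delete(key); // lazy expiry on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V, ttlMs: number, now: number = Date.now()): void {
    this.store.set(key, { value, expiresAt: now + ttlMs });
  }
}

// Cache-aside lookup: a hit returns the cached value, a miss loads and stores.
async function cachedLookup<V>(
  cache: TtlCache<V>,
  key: string,
  ttlMs: number,
  load: () => Promise<V>
): Promise<V> {
  const hit = cache.get(key);
  if (hit !== undefined) return hit;
  const value = await load();
  cache.set(key, value, ttlMs);
  return value;
}
```

With this shape, the 5-minute transaction TTL and 7-day address-info TTL are just different `ttlMs` values per key prefix.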
🔄 Background Processing
Event Loop Processing
Non-Blocking: Uses Node.js event loop for background tasks
Progress Tracking: Real-time progress updates for long-running operations
Error Recovery: Graceful error handling and retry mechanisms
Queue System Architecture
```typescript
// Background processing using BullMQ
export function processGapsInBackground(address: string, gaps: Gap[]): void {
  if (gaps.length === 0) return;

  // Schedule background processing using setImmediate for non-blocking behaviour
  setImmediate(() => {
    queueGapsForProcessing(address, gaps)
      .then(() => {
        logger.info('Successfully queued gaps for processing', {
          address,
          gapsCount: gaps.length,
        });
      })
      .catch((error) => {
        logger.error('Failed to queue gaps for processing', {
          address,
          error: error.message,
        });
      });
  });
}
```
🛡️ Error Handling
Comprehensive Error Management
Custom Error Classes: Structured error hierarchy for different error types
Global Error Handler: Centralized error processing and logging
Graceful Degradation: System continues operating even when components fail
Error Types
```typescript
// Custom error classes for different scenarios, with convenient methods for common errors
export class ApiError extends Error {
  public statusCode: number;
  public errorName: string;
  public details?: Record<string, unknown | null>;

  constructor(statusCode: number, errorName: string, message: string, details?: Record<string, unknown | null>) {
    super(message);
    this.statusCode = statusCode;
    this.errorName = errorName;
    this.details = details;
  }

  static badRequest(message: string, details?: Record<string, unknown | null>) {
    return new ApiError(400, 'Bad Request', message, details);
  }

  static unauthorized(message: string, details?: Record<string, unknown | null>) {
    return new ApiError(401, 'Unauthorized', message, details);
  }

  // ...
}

export class JwtTokenError extends Error {
  constructor(
    public originalError: jwt.TokenExpiredError | jwt.NotBeforeError | jwt.JsonWebTokenError,
    public tokenType: 'access' | 'refresh'
  ) {
    super(originalError.message);
  }
}
```
📊 Monitoring & Observability
Structured Logging
Winston Logger: Production-ready logging with multiple transports
Log Levels: Debug, Info, Warn, Error with environment-specific configuration
Performance Metrics: Response times and operation durations
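Winston's JSON format emits one structured object per log line, which is what makes the response-time metrics above machine-queryable. The essence of that shape, independent of Winston (field names are illustrative):

```typescript
// Structured-log sketch: one JSON object per line, as produced by
// winston.format.json(). Illustrative field names, not the project's schema.
type Level = "debug" | "info" | "warn" | "error";

function formatEntry(
  level: Level,
  message: string,
  meta: Record<string, unknown> = {}
): string {
  return JSON.stringify({
    level,
    message,
    timestamp: new Date().toISOString(),
    ...meta, // request id, duration, address, etc.
  });
}

// Timing wrapper: logs an operation's duration as a structured field
function timed<T>(label: string, fn: () => T): T {
  const start = Date.now();
  const result = fn();
  console.log(formatEntry("info", label, { durationMs: Date.now() - start }));
  return result;
}

timed("computed checksum batch", () => 2 + 2);
```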