A robust and scalable RESTful API for a blogging platform built with NestJS, TypeScript, and MySQL. This project demonstrates modern backend development practices with clean architecture, comprehensive authentication, and flexible service integrations.
- User Authentication & Authorization - JWT-based auth with email verification and secure refresh tokens
- Blog Management - Create, read, update, and delete blog posts (public/private)
- Comment System - Users can comment on public blogs
- Social Features - Follow/unfollow users and like blog posts
- File Upload - Profile picture uploads with multiple storage providers
- Email Integration - Verification emails and password reset functionality
- Database Seeding - Pre-populated data for development and testing
- API Documentation - Interactive Swagger/OpenAPI documentation
- Dockerized - Ready for containerized deployment
- Flexible Architecture - Generic email and upload modules using factory pattern
- Framework: NestJS (Node.js)
- Language: TypeScript
- Database: MySQL with TypeORM
- Authentication: JWT with refresh tokens
- File Upload: Cloudinary (configurable)
- Email: Multiple providers (Brevo, SendGrid, Mailtrap)
- Documentation: Swagger/OpenAPI
- Validation: class-validator & class-transformer
- Containerization: Docker & Docker Compose
This project showcases some interesting architectural patterns:
I've implemented a flexible email system using the Factory Pattern that allows switching between different email providers (Brevo, SendGrid, Mailtrap) based on the environment:
- Development: Mailtrap (for testing)
- Staging: SendGrid
- Production: Brevo
The beauty of this approach is that you can easily add new email providers without changing existing code - just implement the `EmailProvider` interface!
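As a rough sketch of that idea (the interface and class names below are illustrative assumptions, not the repo's actual code):

```typescript
// Illustrative sketch of the factory pattern described above; names are
// assumptions, not the repo's actual classes.
interface EmailProvider {
  sendEmail(to: string, subject: string, body: string): Promise<void>;
}

class MailtrapProvider implements EmailProvider {
  async sendEmail(to: string, subject: string, body: string) {
    // call Mailtrap's API here
  }
}

class SendGridProvider implements EmailProvider {
  async sendEmail(to: string, subject: string, body: string) {
    // call SendGrid's API here
  }
}

class BrevoProvider implements EmailProvider {
  async sendEmail(to: string, subject: string, body: string) {
    // call Brevo's API here
  }
}

// The factory picks a concrete provider from the environment; adding a new
// provider only means adding one case here, since callers depend solely on
// the EmailProvider interface.
function createEmailProvider(env = process.env.NODE_ENV): EmailProvider {
  switch (env) {
    case 'production':
      return new BrevoProvider();
    case 'staging':
      return new SendGridProvider();
    default:
      return new MailtrapProvider(); // development / testing
  }
}

const mailer: EmailProvider = createEmailProvider();
```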
Similarly, the upload module is built to be provider-agnostic. It is currently configured for Cloudinary, but designed to easily support other storage solutions such as AWS S3, Firebase, or local storage through the same factory pattern approach.
You have two options to run this project: Docker (recommended) or Local Development.
This is the easiest way to get up and running quickly!
- Docker
- Docker Compose
- Clone the repository

  ```bash
  git clone https://github.com/MohamedAboElnaser/Blog-apis.git
  cd Blog-apis
  ```
- Environment Setup

  ```bash
  # Copy the example environment file
  cp .env.example .env

  # Edit the .env file with your configurations
  nano .env  # or use your preferred editor
  ```
- Run with Docker Compose

  ```bash
  # For development (with hot reload)
  docker compose up --build

  # For production
  docker compose -f docker-compose.prod.yml up --build -d
  ```
- Seed the database (optional)

  ```bash
  # Access the running container
  docker compose exec app npm run seed

  # Or force seed (clears existing data)
  docker compose exec app npm run seed:force
  ```
That's it! Your API will be available at http://localhost:3000
- Node.js (>=18)
- MySQL database
- npm or yarn
- Clone the repository

  ```bash
  git clone https://github.com/MohamedAboElnaser/Blog-apis.git
  cd Blog-apis
  ```
- Install dependencies

  ```bash
  npm install
  ```
- Environment Setup

  ```bash
  # Copy the example environment file
  cp .env.example .env

  # Edit the .env file with your configurations
  nano .env  # or use your preferred editor
  ```
- Database Setup

  Make sure your MySQL database is running and create the database specified in your .env file.
- Run the application

  ```bash
  # Development mode
  npm run start:dev

  # Production mode
  npm run build
  npm run start:prod
  ```
- Seed the database (optional)

  ```bash
  # Regular seeding (skips if data exists)
  npm run seed

  # Force seeding (clears existing data)
  npm run seed:force
  ```
Create a `.env` file based on `.env.example` and fill in your specific configurations. This file contains sensitive information like database credentials, JWT secrets, and email provider settings.
You can access the API documentation in two ways:
Visit http://localhost:3000/api to access the interactive Swagger documentation when running the application locally. Here you'll find detailed information about all endpoints and request/response schemas, and you can even test the API directly from the browser!
View Static API Documentation - Complete API documentation that's always available, automatically updated with each release. No need to run the application locally!
Note: The GitHub Pages documentation is static and doesn't allow testing endpoints directly. For interactive testing, use the local documentation after running the application.
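For orientation, Swagger docs in a NestJS app are usually wired up in main.ts roughly like this (a sketch based on the standard @nestjs/swagger setup; titles and options are illustrative, not necessarily this repo's exact configuration):

```typescript
// main.ts - minimal sketch of a typical NestJS Swagger setup.
import { NestFactory } from '@nestjs/core';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  const config = new DocumentBuilder()
    .setTitle('Blog APIs')
    .setDescription('RESTful API for a blogging platform')
    .setVersion('1.0')
    .addBearerAuth() // document the Bearer access token scheme
    .build();

  const document = SwaggerModule.createDocument(app, config);
  SwaggerModule.setup('api', app, document); // serves the docs at /api

  await app.listen(3000);
}
bootstrap();
```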
- `POST /auth/register` - Register a new user
- `POST /auth/verify-email` - Verify email with OTP
- `POST /auth/login` - User login (returns access token + sets refresh token cookie)
- `POST /auth/refresh-token` - Refresh expired access token using cookie
- `POST /auth/request-password-reset` - Request password reset
- `POST /auth/reset-password` - Reset password with OTP
- `POST /auth/resend-verification-code` - Resend email verification code
- `GET /blogs` - Get user's blogs (authenticated)
- `POST /blogs` - Create a new blog
- `GET /blogs/` - Get all public blogs
- `GET /blogs/:id` - Get specific blog details
- `PATCH /blogs/:id` - Update blog
- `DELETE /blogs/:id` - Delete blog
- `GET /users/me` - Get current user profile
- `GET /users/:id/blogs` - Get user's public blogs
- `POST /users/:id/follow` - Follow a user
- `DELETE /users/:id/unfollow` - Unfollow a user
- `POST /blogs/:id/like` - Like a blog
- `DELETE /blogs/:id/unlike` - Unlike a blog
The API uses a dual-token authentication system for enhanced security:
- Access Token: short-lived JWT (15-30 minutes) sent in the response body
  - Used for authenticating API requests
  - Include it in the Authorization header: `Bearer <access_token>`
  - Expires quickly for security
- Refresh Token: long-lived token (7 days by default) stored in an HTTP-only cookie
  - Used to obtain new access tokens when they expire
  - Sent automatically with requests (HTTP-only cookie)
  - Cannot be accessed by JavaScript (XSS protection)
- HTTP-Only Cookies: Refresh tokens stored securely, inaccessible to client-side scripts
- Environment-Based Security: HTTPS-only cookies in production
- Configurable Expiration: Token lifetimes configurable via environment variables
- Automatic Token Refresh: Seamless token renewal without user intervention
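As an illustration only, issuing the refresh-token cookie from a NestJS controller might look roughly like this (the cookie name, lifetimes, and response shape are assumptions, not the repo's exact implementation):

```typescript
import { Controller, Post, Res } from '@nestjs/common';
import { Response } from 'express';

@Controller('auth')
export class AuthController {
  @Post('login')
  async login(@Res({ passthrough: true }) res: Response) {
    // ...credential validation and token signing omitted...
    const accessToken = 'signed-access-jwt';   // placeholder
    const refreshToken = 'signed-refresh-jwt'; // placeholder

    res.cookie('refresh_token', refreshToken, {
      httpOnly: true,                                // not readable from client-side JS
      secure: process.env.NODE_ENV === 'production', // HTTPS-only cookies in production
      sameSite: 'strict',
      maxAge: 7 * 24 * 60 * 60 * 1000,               // 7 days
    });

    return { accessToken }; // access token is returned in the response body
  }
}
```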
Usage Example:
- Login → receive access token + refresh token cookie
- Use the access token for API calls
- When the access token expires → call `/auth/refresh-token`
- Receive a new access token (refresh token cookie updated automatically)
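On the client side, that flow can be wrapped in a small helper. Here is a sketch using fetch; the `{ accessToken }` response shape and the retry-once strategy are assumptions:

```typescript
// Sketch of a client-side helper that retries once after refreshing the
// access token via the HTTP-only refresh token cookie.
const API = 'http://localhost:3000';
let accessToken = '';

async function refreshAccessToken(): Promise<void> {
  const res = await fetch(`${API}/auth/refresh-token`, {
    method: 'POST',
    credentials: 'include', // sends the HTTP-only refresh token cookie
  });
  if (!res.ok) throw new Error('Refresh failed, please log in again');
  ({ accessToken } = await res.json()); // assumed response shape
}

async function apiFetch(path: string, init: RequestInit = {}): Promise<Response> {
  const doFetch = () =>
    fetch(`${API}${path}`, {
      ...init,
      credentials: 'include',
      headers: { Authorization: `Bearer ${accessToken}` },
    });

  let res = await doFetch();
  if (res.status === 401) {
    // Access token probably expired: refresh and retry once.
    await refreshAccessToken();
    res = await doFetch();
  }
  return res;
}

// Example: fetch the current user's profile.
// apiFetch('/users/me').then((r) => r.json()).then(console.log);
```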
```
src/
├── auth/               # Authentication module (JWT, guards, strategies)
├── blog/               # Blog management (CRUD operations)
├── comment/            # Comment system for blogs
├── common/             # Shared utilities and DTOs
├── database/           # Database configuration and seeders
├── email/              # Generic email module with multiple providers
├── follow/             # User following system
├── like/               # Blog liking functionality
├── otp/                # OTP generation and verification
├── upload/             # Generic file upload module
├── user/               # User management and profiles
├── app.module.ts       # Main application module
└── main.ts             # Application entry point

scripts/
└── seed.ts             # Database seeding script

docker-compose.yml      # Development Docker setup
docker-compose.prod.yml # Production Docker setup
Dockerfile              # Multi-stage Docker build
.env.example            # Environment variables template
```
The project includes a comprehensive seeding system that creates:
- Test users with verified accounts
- Sample blog posts (public and private)
- Comments on public blogs
- Follow relationships between users
- Likes on blog posts
This makes it super easy to get started with development - just run the seed command and you'll have a fully populated database to work with!
Make sure your `.env` file has the following SALT value:

```
SALT='$2b$10$CfyyfGZKr/OjMf6wJ34Na.'
```
All seeded users use the password: pass1234
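Presumably the seeder hashes this shared password with the fixed salt above so the stored hashes are deterministic; a minimal sketch assuming bcrypt (the repo's actual seeding code may differ):

```typescript
import * as bcrypt from 'bcrypt';

// With a fixed salt, hashing the same password always produces the same hash,
// which is what lets every seeded account share the known test password.
const salt = process.env.SALT ?? '$2b$10$CfyyfGZKr/OjMf6wJ34Na.';
const passwordHash = bcrypt.hashSync('pass1234', salt);

console.log(passwordHash);
```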
Available Test Users:

- admin@blog.com (Admin User)
- john.doe@example.com (John Doe)
- jane.smith@example.com (Jane Smith)
- mike.wilson@example.com (Mike Wilson)
- sarah.jones@example.com (Sarah Jones)

You can log in with any of these emails using the password pass1234 to test the application functionality immediately!
I'd love your contributions! Here's how you can help:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes and add tests if applicable
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Follow the existing code style and patterns
- Update documentation if needed
- Use descriptive commit messages
- Test both Docker and local setups
The application automatically adapts based on the `NODE_ENV`:
- Development: Uses Mailtrap for emails, detailed logging
- Staging: Uses SendGrid for emails
- Production: Uses Brevo for emails, optimized logging, HTTPS-only cookies
The project is production-ready with:
- Multi-stage Docker builds for optimized images
- Production Docker Compose configuration
- Environment-based configurations
- Health checks and restart policies
- Secure token management with HTTP-only cookies
This project is licensed under the MIT License - see the LICENSE file for details.
If you have any questions or run into issues, feel free to:
- Open an issue on GitHub
- Check the Swagger documentation at `/api`
- Review the seeded data examples