Learning Platform

A full-stack educational platform leveraging AI to generate personalized learning paths with adaptive difficulty assessment and real-time content generation.

Role: Full-Stack Developer

Duration: 3 months

Credits: Renish Bhaskaran (Backend Mentor)

Tech-Stack: Python, Django, Celery, Redis, PostgreSQL, Docker, GCP, Groq API

Project Overview

Designed and deployed a full-stack AI-powered educational platform that delivers personalized learning experiences across 15 subject areas. The platform intelligently assesses user knowledge levels and generates customized curriculum paths with dynamically created educational content.

Key Features:

  • Adaptive difficulty assessment using AI-generated quizzes
  • Dynamic generation of 30 topics and 120 lessons per subject
  • Real-time progress tracking
  • Responsive UI with background task processing
  • Secure user authentication and profile management
AI Learning Platform Dashboard

Technical Implementation

Architecture & Backend

Built with Django for robust ORM capabilities, built-in authentication, and rapid development. Implemented a PostgreSQL database with an optimized schema featuring three core models (LearningProfile, Topic, and Lesson) plus a UserProgress model that tracks completion states.
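The relationships between these models can be sketched with plain dataclasses (illustrative only; the real implementation uses Django models, and every field name beyond the model names is an assumption):

```python
from dataclasses import dataclass

# Illustrative sketch of the core schema; the project defines these as
# Django models. Field names are assumptions, not the actual schema.

@dataclass
class LearningProfile:
    user_id: int
    subject: str        # one of the 15 supported subject areas
    difficulty: str     # set by the adaptive assessment

@dataclass
class Topic:
    profile: LearningProfile
    title: str
    order: int          # position within the generated curriculum

@dataclass
class Lesson:
    topic: Topic
    title: str
    content_html: str   # markdown converted to HTML for display

@dataclass
class UserProgress:
    profile: LearningProfile
    lesson: Lesson
    completed: bool = False
```

UserProgress links a profile to individual lessons, which is what lets the dashboard compute per-subject completion.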

AI Integration & Content Generation

Integrated Groq's LLM API with structured output validation using Pydantic schemas. Engineered prompts to generate consistent educational content, including 10 assessment questions per subject and 4 comprehensive lessons per topic. Implemented markdown-to-HTML conversion for rich content display.
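The validation step can be sketched with a stdlib-only check (the project enforces this with Pydantic schemas instead; the function and field names here are assumptions):

```python
import json

def validate_lesson_payload(raw: str) -> dict:
    """Parse and validate an LLM response expected to contain a lesson.

    Stdlib-only sketch of the structured-output validation idea; the
    project uses Pydantic models with strict JSON schema enforcement.
    Field names ("title", "body_markdown") are assumptions.
    """
    data = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    for key, expected_type in (("title", str), ("body_markdown", str)):
        if not isinstance(data.get(key), expected_type):
            raise ValueError(f"missing or invalid field: {key}")
    return data
```

Rejecting a payload here (rather than at render time) is what makes retrying a failed generation safe.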

Asynchronous Processing

Implemented Celery with Redis as a message broker to handle AI content generation asynchronously. This architecture avoids blocking operations, so users can keep navigating the platform while content generates in the background. Added polling mechanisms for real-time status updates.
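The fire-and-forget pattern that Celery and Redis provide can be sketched with a stdlib worker thread (illustrative only; all names here are assumptions, and the real broker/result store is Redis):

```python
import queue
import threading
import uuid

# Stdlib sketch of the pattern Celery + Redis provides: the web request
# enqueues a job and returns immediately; a background worker generates
# content and records a status the frontend can poll.

jobs: "queue.Queue[tuple[str, str]]" = queue.Queue()
status: dict = {}

def worker() -> None:
    while True:
        task_id, subject = jobs.get()
        status[task_id] = "running"
        # ... call the LLM and persist topics/lessons here ...
        status[task_id] = "done"
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def enqueue_generation(subject: str) -> str:
    """Return immediately with a task id the client can poll."""
    task_id = uuid.uuid4().hex
    status[task_id] = "pending"
    jobs.put((task_id, subject))
    return task_id
```

In production the same shape holds, but the queue lives in Redis and the worker is a separate Celery container, so the web process never blocks on generation.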

Deployment & DevOps

Containerized the application using Docker with multi-stage builds for optimized production images. Deployed on Google Cloud Platform (GCP) with separate containers for Django, Celery workers, and Redis. Configured Gunicorn with Uvicorn workers for ASGI support and implemented CI/CD pipelines using GitHub Actions.
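A multi-stage Dockerfile along these lines keeps the production image small (a sketch only; paths, the Python version, and the ASGI module name are assumptions):

```dockerfile
# Build stage: install dependencies into a virtualenv
FROM python:3.12-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN python -m venv /venv && /venv/bin/pip install -r requirements.txt

# Runtime stage: copy only the virtualenv and source, dropping build tooling
FROM python:3.12-slim
COPY --from=builder /venv /venv
WORKDIR /app
COPY . .
ENV PATH="/venv/bin:$PATH"
# Gunicorn managing Uvicorn workers for ASGI support
CMD ["gunicorn", "config.asgi:application", \
     "-k", "uvicorn.workers.UvicornWorker", "--bind", "0.0.0.0:8000"]
```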

Platform Feature 1
Platform Feature 2

Problem Solving & Challenges

1. LLM Output Consistency

Challenge: Initial LLM responses were inconsistent, with formatting issues and invalid JSON structures causing parsing failures.

Solution: Implemented a robust validation pipeline using Pydantic models with strict JSON schema enforcement. Refined system prompts with explicit formatting rules, including JSON escape handling and markdown syntax guidelines. Added retry logic with exponential backoff for failed generations.
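The retry-with-backoff piece can be sketched as follows (a minimal version; parameter names and defaults are assumptions, and `generate` stands in for the LLM call plus validation):

```python
import random
import time

def generate_with_retry(generate, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky generation call with exponential backoff and jitter.

    Sketch of the retry logic described above; `generate` stands in for
    the LLM call + Pydantic validation, and the defaults are assumptions.
    """
    for attempt in range(max_attempts):
        try:
            return generate()
        except ValueError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            # delay doubles each attempt (1s, 2s, 4s, ...) plus jitter
            sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.5))
```

The jitter spreads out retries so concurrent workers don't hammer the API in lockstep after a shared failure.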

2. User Experience During Long Operations

Challenge: Content generation for 30 topics took more than a minute, creating poor UX.

Solution: Architected an asynchronous task queue system using Celery and Redis. Implemented background workers that process content generation while users continue browsing. Created real-time status polling with visual feedback (spinners, progress indicators) and graceful error handling with retry mechanisms.
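The client-side polling loop can be sketched like this (the real UI does this in JavaScript with spinners and progress indicators; `fetch_status` stands in for the HTTP call, and every name here is an assumption):

```python
import json
import time

def poll_until_ready(fetch_status, task_id, interval=2.0, timeout=120.0,
                     sleep=time.sleep, clock=time.monotonic):
    """Poll a status endpoint until the task finishes or times out.

    Sketch of the polling idea described above; `fetch_status` stands in
    for the HTTP call to a task-status endpoint and returns its JSON body.
    """
    deadline = clock() + timeout
    while clock() < deadline:
        state = json.loads(fetch_status(task_id))["state"]
        if state in ("SUCCESS", "FAILURE"):
            return state
        sleep(interval)  # back off between polls to avoid hammering the server
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")
```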

3. Cloud Deployment Complexity

Challenge: Deploying a multi-container architecture (Django, Celery, Redis) on GCP with proper networking, service communication, and external database integration.

Solution: Used Neon for managed PostgreSQL hosting with built-in SSL connections. Created a Virtual Private Cloud (VPC) on GCP for secure inter-service communication. Deployed on a GCP Compute Engine VM, building Docker images for the Django, Celery, and Redis services, configuring docker-compose for container orchestration, and managing environment variables for production.
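The three-container layout can be sketched in a docker-compose file like this (illustrative only; service names, image tags, and environment variables are assumptions):

```yaml
services:
  web:
    build: .
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=${DATABASE_URL}        # Neon PostgreSQL over SSL
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
  worker:
    build: .
    command: celery -A config worker --loglevel=info
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
```

Keeping the web and worker containers on the same image but with different commands means one build serves both roles, while compose networking lets them reach Redis by service name.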

Platform Results

Results & Impact

Technical Learnings:

  • Deepened expertise in Django ORM, migrations, and admin interface
  • Gained hands-on experience with distributed task queues (Celery/Redis)
  • Developed skills in prompt engineering and LLM integration
  • Enhanced DevOps capabilities with Docker and cloud deployment

Future Roadmap

Planned Enhancements

  • Multi-Subject Support: Enable users to pursue multiple learning paths simultaneously with cross-subject progress tracking
  • Interactive Learning: Implement end-of-lesson quizzes with immediate feedback
  • AI Chat Assistant: Integrate real-time AI tutoring for concept clarification and personalized help
Future Development Roadmap

© 2026 Roberto Costantino. All rights reserved.