Professional Experience
Highlights from My Software Engineering Career

Lead the design and development of scalable backend systems using Python and cloud-native technologies, driving technical vision and mentoring teams. Collaborate with stakeholders to align solutions with business goals while enhancing system reliability and deployment efficiency through DevOps practices.
Responsibilities
- Architect scalable backend services using FastAPI, Django, and PostgreSQL.
- Lead backend initiatives, enforcing clean code and best practices.
- Mentor engineers through code reviews and technical workshops.
- Drive CI/CD, containerization, and infrastructure automation.
- Ensure security compliance and optimize performance.
Achievements
- Boosted application performance by 40% via backend optimizations.
- Led a team of 5 on high-impact projects, achieving on-time delivery.
- Reduced deployment time by 60% with Docker and GitHub Actions CI/CD.
- Built RESTful APIs handling 1M+ daily requests with 99.9% uptime.
- Migrated a monolith to microservices, cutting downtime by 70%.
- Enhanced response-time consistency by 35% using profiling tools.
- Improved throughput by 50% with asynchronous FastAPI request handling (illustrated in the sketch after this list).
- Implemented observability (Prometheus, Grafana, Sentry), reducing MTTR by 40%.
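The throughput gains above came largely from moving blocking request handlers to asynchronous ones. Below is a minimal sketch of that pattern in FastAPI, not the production code: the endpoint, the helper names, and the sleeps standing in for real database or HTTP calls are all illustrative.

```python
# Minimal sketch of async request handling in FastAPI: independent I/O calls
# are awaited concurrently instead of sequentially, which is where the
# throughput gain comes from. Endpoint and helper names are illustrative.
import asyncio

from fastapi import FastAPI

app = FastAPI()


async def fetch_profile(user_id: int) -> dict:
    # Stand-in for a real async database or HTTP call.
    await asyncio.sleep(0.05)
    return {"user_id": user_id, "name": "example"}


async def fetch_orders(user_id: int) -> list[dict]:
    # Stand-in for a second, independent I/O call.
    await asyncio.sleep(0.05)
    return [{"order_id": 1, "user_id": user_id}]


@app.get("/users/{user_id}/summary")
async def user_summary(user_id: int) -> dict:
    # Run both I/O-bound calls concurrently on the event loop.
    profile, orders = await asyncio.gather(
        fetch_profile(user_id), fetch_orders(user_id)
    )
    return {"profile": profile, "orders": orders}
```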
Projects
- CardTrack Engine: Developed an image-based search system for collectible cards using FastAPI, CLIP embeddings, and vector search, handling 1M+ records with real-time responses (a sketch of the embedding-and-search flow follows this list).
- CI/CD Overhaul: Redesigned deployment pipelines with Docker and GitHub Actions for zero-downtime blue-green deployments.
- Monolith to Microservices: Led the migration of a Django monolith to FastAPI/gRPC microservices, improving modularity and performance.
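The CardTrack Engine entry above combines CLIP image embeddings with vector search. The following is a simplified sketch of that flow; the specific model checkpoint, the FAISS index type, and the function names are assumptions rather than the actual implementation.

```python
# Illustrative sketch of the image-embedding + vector-search idea behind
# CardTrack: encode card images with a CLIP model and query a FAISS index.
# Model name, index type, and file paths are assumptions, not the production setup.
import faiss
import numpy as np
from PIL import Image
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("clip-ViT-B-32")  # CLIP checkpoint via sentence-transformers


def build_index(image_paths: list[str]) -> faiss.IndexFlatIP:
    # Encode catalogue images and index them for cosine (inner-product) search.
    embeddings = model.encode([Image.open(p) for p in image_paths])
    embeddings = np.asarray(embeddings, dtype="float32")
    faiss.normalize_L2(embeddings)
    index = faiss.IndexFlatIP(embeddings.shape[1])
    index.add(embeddings)
    return index


def search(index: faiss.IndexFlatIP, query_path: str, k: int = 5):
    # Embed the query image and return the top-k nearest catalogue entries.
    query = np.asarray(model.encode([Image.open(query_path)]), dtype="float32")
    faiss.normalize_L2(query)
    scores, ids = index.search(query, k)
    return list(zip(ids[0].tolist(), scores[0].tolist()))
```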
Impact and Outcomes
- Cut incident recovery time from 2 hours to 40 minutes with observability tooling (see the metrics sketch after this list).
- Reduced onboarding time from 3 weeks to 1 week via automation.
- Increased delivery velocity by 30% through DevOps automation.
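The observability work referenced above centered on exposing application metrics for Prometheus to scrape and Grafana to visualize. A minimal sketch of that idea follows; the metric names, labels, and port are illustrative.

```python
# Minimal sketch of app-level metrics of the kind scraped by Prometheus and
# visualized in Grafana. Metric names and the port are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("api_requests_total", "Total API requests", ["endpoint"])
LATENCY = Histogram("api_request_latency_seconds", "Request latency", ["endpoint"])


def handle_request(endpoint: str) -> None:
    # Record one request and its latency.
    REQUESTS.labels(endpoint=endpoint).inc()
    with LATENCY.labels(endpoint=endpoint).time():
        time.sleep(random.uniform(0.01, 0.05))  # stand-in for real work


if __name__ == "__main__":
    start_http_server(8000)  # expose /metrics for Prometheus to scrape
    while True:
        handle_request("/users")
```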
Tools and Platforms
Python, FastAPI, Django, PostgreSQL, gRPC, Docker, GitHub Actions, Prometheus, Grafana, Sentry
Engineered automation solutions for coupon discovery and application, streamlining discount retrieval and enhancing the user experience across e-commerce platforms. Played a pivotal role in developing browser extension features and ensuring product quality through rigorous QA practices.
Responsibilities
- Developed backend services for automated coupon scraping using Python and headless browser automation.
- Built a Chrome extension for real-time coupon application during checkout on supported e-commerce sites.
- Collaborated closely with QA and frontend teams to maintain high product standards.
- Integrated logging and monitoring tools to track scraping success and coupon validity.
- Maintained and extended scraping rules to adapt to site structure changes.
Achievements
- Increased scraping accuracy by 45% through improved DOM targeting and error handling.
- Boosted scraping speed by 60% with concurrent task execution and request optimizations.
- Contributed to a Chrome extension that reached 5K+ active users within 2 months.
- Reduced manual QA efforts by 70% via automated test scripts and monitoring dashboards.
Projects
- Coupon Scraper Engine: Designed a robust scraping pipeline using Python and Puppeteer to extract and validate coupons from 100+ e-commerce websites (a simplified sketch of the concurrent scraping loop follows this list).
- Smart Apply Extension: Built and maintained a Chrome extension that automatically detects and applies the best coupons at checkout, enhancing UX and driving user retention.
- QA Automation Suite: Implemented automated test cases and monitoring scripts to keep the scraper and extension stable across site and product updates.
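The Coupon Scraper Engine above relied on concurrent scraping with fallback selectors to survive site layout changes. The sketch below shows the shape of that loop under simplified assumptions: fetch_page() is a stub standing in for the headless-browser layer (Puppeteer in the original pipeline), and the selector patterns and site list are invented.

```python
# Simplified sketch of a concurrent scraping loop with selector fallbacks.
# fetch_page() is a stub standing in for the headless-browser layer.
import asyncio
import re

FALLBACK_SELECTOR_PATTERNS = [
    r'data-coupon-code="([A-Z0-9]+)"',
    r'class="promo-code"[^>]*>([A-Z0-9]+)<',
]


async def fetch_page(url: str) -> str:
    # Stand-in for driving a headless browser and returning rendered HTML.
    await asyncio.sleep(0.1)
    return '<span data-coupon-code="SAVE20"></span>'


def extract_coupons(html: str) -> list[str]:
    # Try each selector pattern in order; fall back when a site layout changes.
    for pattern in FALLBACK_SELECTOR_PATTERNS:
        codes = re.findall(pattern, html)
        if codes:
            return codes
    return []


async def scrape_site(url: str) -> tuple[str, list[str]]:
    html = await fetch_page(url)
    return url, extract_coupons(html)


async def scrape_all(urls: list[str], concurrency: int = 10) -> dict[str, list[str]]:
    # Bound concurrency with a semaphore so sites are scraped in parallel
    # without overwhelming the browser pool.
    sem = asyncio.Semaphore(concurrency)

    async def bounded(url: str):
        async with sem:
            return await scrape_site(url)

    results = await asyncio.gather(*(bounded(u) for u in urls))
    return dict(results)


if __name__ == "__main__":
    print(asyncio.run(scrape_all(["https://example.com/deals"])))
```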
Impact and Outcomes
- Reduced coupon discovery and application time by over 80%, improving user satisfaction.
- Enhanced customer retention and checkout conversion through automated savings features.
- Improved system resilience to layout changes via flexible selector strategies and fallback mechanisms.
Tools and Platforms
Python, Puppeteer, headless browser automation, Chrome Extensions
Acted as a liaison between business and technical teams by transforming business needs into clear technical specifications. Designed and developed backend systems using FastAPI and Flask, with an emphasis on maintainability, scalability, and collaboration across departments.
Responsibilities
- Translated complex business requirements into actionable technical tasks.
- Developed high-performance RESTful APIs using FastAPI and Flask.
- Maintained effective communication across sales, marketing, and development teams.
- Collaborated with DevOps to deploy services and monitor system performance on AWS.
- Assisted in logging and tracing service behavior via Elastic Stack (ELK).
Achievements
- Accelerated backend development timelines by 30% through improved coordination.
- Reduced communication gaps between business and tech teams by establishing standardized documentation practices.
- Increased system visibility with integrated logging and monitoring using ELK.
- Delivered stable and well-documented APIs that reduced frontend-backend friction.
Projects
- Business Requirement Mapping Tool: Built internal tooling to convert client requirements into structured engineering tasks, improving backlog clarity and team efficiency.
- API Development Stack: Implemented robust API services using FastAPI and Flask for internal business operations, enabling faster product iteration (a minimal logging-instrumented endpoint sketch follows this list).
- AWS Deployment & Monitoring: Worked with the infrastructure team to containerize services, deploy them on AWS, and set up real-time monitoring using the Elastic Stack.
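The API and monitoring work above hinged on services emitting logs the Elastic Stack could ingest. Below is a minimal Flask sketch with a JSON log formatter; the endpoint, the field names, and the downstream ELK pipeline are assumptions for illustration.

```python
# Minimal sketch of a Flask endpoint emitting JSON-structured logs so that a
# shipper (e.g. Filebeat/Logstash) can forward them into Elasticsearch.
# Field names, the route, and the downstream pipeline are illustrative.
import json
import logging
import time

from flask import Flask, jsonify


class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        # One JSON object per line: easy for the ELK pipeline to parse.
        return json.dumps({
            "ts": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("api")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

app = Flask(__name__)


@app.route("/orders/<int:order_id>")
def get_order(order_id: int):
    start = time.perf_counter()
    order = {"id": order_id, "status": "processed"}  # stand-in for a DB lookup
    logger.info("order fetched in %.1f ms", (time.perf_counter() - start) * 1000)
    return jsonify(order)


if __name__ == "__main__":
    app.run(port=5000)
```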
Impact and Outcomes
- Improved alignment between product vision and technical implementation.
- Reduced production incidents through proactive monitoring setup.
- Enabled quicker stakeholder feedback loops by shortening technical iteration cycles.
Tools and Platforms
Python, FastAPI, Flask, AWS, Elastic Stack (ELK)
Developed a robust backend system to support a machine learning-driven performance review platform. Focused on efficient data extraction, database optimization, and scalable deployment to ensure high availability and reliability for enterprise-level employee feedback processing.
Responsibilities
- Built data pipelines for extracting and processing employee performance metrics.
- Integrated ML-driven logic to generate tailored feedback based on historical work data.
- Optimized PostgreSQL queries and schema for faster access and lower latency.
- Implemented Redis caching to reduce DB read load and improve API response times.
- Deployed backend services on AWS, including EC2 for compute and S3 for data storage.
Achievements
- Reduced database load by 30% through caching, indexing, and query optimization.
- Enabled personalized employee feedback via automated ML-based analysis.
- Deployed and maintained reliable cloud infrastructure with 99.9% uptime.
- Streamlined processing of large employee datasets with minimal performance bottlenecks.
Projects
- ML-Powered Review Engine: Engineered a backend system to support a machine learning pipeline generating automated employee performance feedback.
- Database Optimization Layer: Improved database performance with strategic indexing, optimized query plans, and Redis-based caching (a cache-aside sketch follows this list).
- Cloud Infrastructure Deployment: Set up scalable infrastructure using AWS EC2 and S3 to ensure reliable and cost-efficient application hosting.
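The Database Optimization Layer above paired query tuning with Redis caching. The sketch below shows a generic cache-aside read path, not the production code: the connection settings, key format, TTL, and query helper are all illustrative.

```python
# Sketch of the cache-aside pattern used to take read pressure off PostgreSQL:
# check Redis first, fall back to the database, then populate the cache with a
# TTL. Connection settings, key format, and the query function are assumptions.
import json

import redis

cache = redis.Redis(host="localhost", port=6379, db=0)
TTL_SECONDS = 300


def load_metrics_from_db(employee_id: int) -> dict:
    # Stand-in for the real PostgreSQL query.
    return {"employee_id": employee_id, "score": 4.2}


def get_employee_metrics(employee_id: int) -> dict:
    key = f"metrics:{employee_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit: skip the database
    metrics = load_metrics_from_db(employee_id)
    cache.setex(key, TTL_SECONDS, json.dumps(metrics))  # populate with expiry
    return metrics
```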
Impact and Outcomes
- Improved system responsiveness and scalability, enhancing user experience.
- Enabled faster and more accurate performance reviews through ML integration.
- Reduced operational overhead by automating infrastructure setup and tuning.
Tools and Platforms
PostgreSQL, Redis, AWS EC2, AWS S3
Designed and built scalable Django-based backend systems with RESTful API integrations to improve real-time monitoring and data accessibility. Delivered high-performance visualizations and streamlined user experience for handling large-scale datasets.
Responsibilities
- Developed backend logic for interactive web applications using Django and Django REST Framework.
- Created REST APIs to expose monitoring data for frontend visualization tools.
- Built an intuitive frontend interface to display database metrics and real-time system information.
- Implemented a high-performance real-time filtering and visualization engine to handle millions of records efficiently.
- Collaborated with UI/UX and product teams to improve accessibility for non-technical stakeholders.
Achievements
- Improved monitoring system functionality by integrating new backend features and APIs.
- Reduced query response time by 45% with efficient filtering and indexing strategies.
- Increased stakeholder engagement by 60% through enhanced data visibility.
- Successfully scaled data visualization system to handle millions of records with low memory overhead.
Projects
- Monitoring Dashboard Backend: Designed a Django-based backend and APIs to support a live monitoring dashboard with robust data querying and filtering capabilities (a condensed DRF endpoint sketch follows this list).
- Real-Time Visualization Engine: Engineered a high-performance data pipeline and visualization interface that supports millions of records in real time.
- Stakeholder Web Interface: Built a user-centric frontend connected to REST APIs for real-time data inspection and decision-making.
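The Monitoring Dashboard Backend above exposed large datasets through filtered, paginated Django REST Framework endpoints. The sketch below is a condensed illustration that assumes an existing Django project; the MetricSample model, its fields, and the cursor-pagination settings are invented for the example.

```python
# Condensed Django REST Framework sketch of the filter-and-paginate pattern
# used to keep large monitoring queries cheap. Assumes an existing Django
# project; the MetricSample model, its fields, and the route are illustrative.
from django.db import models
from rest_framework import pagination, serializers, viewsets


class MetricSample(models.Model):
    source = models.CharField(max_length=64, db_index=True)  # indexed filter column
    value = models.FloatField()
    recorded_at = models.DateTimeField(db_index=True)

    class Meta:
        app_label = "monitoring"


class MetricSampleSerializer(serializers.ModelSerializer):
    class Meta:
        model = MetricSample
        fields = ["id", "source", "value", "recorded_at"]


class MetricPagination(pagination.CursorPagination):
    # Cursor pagination stays fast on millions of rows (no large OFFSETs).
    page_size = 200
    ordering = "-recorded_at"


class MetricSampleViewSet(viewsets.ReadOnlyModelViewSet):
    serializer_class = MetricSampleSerializer
    pagination_class = MetricPagination

    def get_queryset(self):
        qs = MetricSample.objects.all()
        source = self.request.query_params.get("source")
        if source:
            qs = qs.filter(source=source)  # hits the db_index on `source`
        return qs
```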
Impact and Outcomes
- Enhanced system observability and responsiveness for internal and external users.
- Increased transparency and usability of critical system metrics across departments.
- Boosted development velocity by streamlining API contracts and frontend/backend integration.