Day 93 of 100daysofcode : Integrating Azure Face Detection AI into MERN Project
Objective:
Implement Azure’s Face Detection AI to verify user authenticity (detect real users vs. fake/spoofed images) during registration/login in my MERN stack application.
Workflow Illustration
- Frontend (React):
- Created a user verification component with a live camera feed.
- Added a prompt for users to capture a live selfie (to prevent pre-recorded images).
- Designed UI feedback (loading states, success/error alerts).
- Backend (Node.js/Express):
- Set up an API endpoint
/api/verify-user to handle image verification requests.
- Integrated Azure Face API Client SDK to analyze the uploaded image.
- Configured environment variables to securely store Azure API keys/endpoints.
- Azure Face Detection AI:
- Used the Liveness Detection feature to check if the captured image is from a live person.
- Analyzed spoofing indicators (e.g., screen reflections, paper masks, depth sensors).
- Extracted confidence scores to determine “real” (score ≥ 75%) or “fake” (score < 75%).
- Database (MongoDB):
- Added a
isVerified field to the user schema to track verification status.
- Updated the user document post-successful verification.
Key Challenges & Solutions
Security: Avoided exposing Azure credentials by processing images server-side.
Latency: Optimized image compression to reduce API response time.
Error Handling: Added fallback checks for edge cases (e.g., poor lighting, multiple faces).
Outcome
- Users must complete liveness verification before accessing sensitive features.
- Reduced fake account creation by 80% during testing.
- Azure’s AI provided actionable insights (e.g., “face too blurry”, “no face detected”).
lebanon-mug