A sophisticated recommendation system that provides cross-domain taste-based recommendations with AI-generated explanations. Built with Next.js, TypeScript, and Google Gemini AI.
- Cross-Domain Recommendations: Get recommendations across different domains (movies, books, music, restaurants, etc.)
- AI-Powered Explanations: Contextual explanations for recommendations using Google Gemini
- Cultural Theme Analysis: Advanced taste profiling using Qloo API
- Multi-Layer Caching: L1 (in-memory), L2 (Redis), and L3 (database) caching
- User Authentication: Secure authentication with Clerk
- Rate Limiting: Tiered rate limiting based on user subscription
- Comprehensive Testing: Full test coverage with Vitest
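The multi-layer cache behaves as a read-through chain: look up L1 (in-memory) first, fall back to L2 (Redis) and then L3 (database), and backfill the faster layers on a hit. A minimal sketch of that flow, using a hypothetical `CacheLayer` interface and in-memory stand-ins rather than the project's real Redis and database clients:

```typescript
// Read-through cache sketch. In the real system L2 wraps Redis and L3 the
// database; here all layers are hypothetical in-memory stand-ins.
interface CacheLayer {
  get(key: string): Promise<string | undefined>
  set(key: string, value: string): Promise<void>
}

class MapLayer implements CacheLayer {
  private store = new Map<string, string>()
  async get(key: string) { return this.store.get(key) }
  async set(key: string, value: string) { this.store.set(key, value) }
}

// Look through L1 -> L2 -> L3; on a hit, backfill the faster layers above it.
async function readThrough(layers: CacheLayer[], key: string): Promise<string | undefined> {
  for (let i = 0; i < layers.length; i++) {
    const value = await layers[i].get(key)
    if (value !== undefined) {
      await Promise.all(layers.slice(0, i).map(layer => layer.set(key, value)))
      return value
    }
  }
  return undefined
}
```

On a miss in every layer, the caller computes the recommendation and writes it to all three layers before returning.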
- Node.js 18+
- Redis server
- Google Gemini API key
- Qloo API key
- Clerk authentication keys
- Supabase database
- Clone the repository
- Install dependencies:

  ```bash
  npm install
  ```

- Set up environment variables in `.env.local`:
  ```env
  # Google Gemini API
  GEMINI_API_KEY=your_gemini_api_key
  GEMINI_MODEL=gemini-2.0-flash-lite

  # Qloo API
  QLOO_API_KEY=your_qloo_api_key
  QLOO_API_URL=https://hackathon.api.qloo.com

  # Clerk Authentication
  NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=your_clerk_publishable_key
  CLERK_SECRET_KEY=your_clerk_secret_key

  # Supabase
  NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
  NEXT_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key

  # Redis Cache
  REDIS_URL=redis://localhost:6379
  ```

- Run the development server:

  ```bash
  npm run dev
  ```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
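Most of the services above fail only at request time if a key is missing, so it can be useful to check the required variables once at startup. A sketch, assuming a hypothetical `missingEnvVars` helper that is not part of the project:

```typescript
// Hypothetical startup check for the environment variables listed above.
// GEMINI_MODEL is omitted because it has a documented default fallback.
const REQUIRED_VARS = [
  'GEMINI_API_KEY',
  'QLOO_API_KEY',
  'NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY',
  'CLERK_SECRET_KEY',
  'NEXT_PUBLIC_SUPABASE_URL',
  'NEXT_PUBLIC_SUPABASE_ANON_KEY',
  'REDIS_URL',
] as const

// Returns the required names that are unset or blank in the given environment.
function missingEnvVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter(name => {
    const value = env[name]
    return !value || value.trim() === ''
  })
}

// Example: an environment with only one key set.
const missing = missingEnvVars({ GEMINI_API_KEY: 'abc123' })
// `missing` now lists the other six required names.
```

At boot you would pass `process.env` instead and throw (or log loudly) when the result is non-empty, so a misconfigured deployment fails immediately rather than on the first API call.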
The application supports configurable Gemini models through environment variables:
- `gemini-1.5-flash` (default fallback)
- `gemini-1.5-pro`
- `gemini-2.0-flash-lite` (recommended)
- `gemini-2.0-flash-exp`
- `gemini-pro`
- `gemini-pro-vision`
Set the model in your `.env.local` file:

```env
GEMINI_MODEL=gemini-2.0-flash-lite
```

You can also configure the Gemini service programmatically:
```typescript
import { GeminiService } from '@/services/gemini.service'

// Create with custom configuration
const geminiService = new GeminiService({
  apiKey: 'your-api-key',
  model: 'gemini-2.0-flash-lite',
  temperature: 0.7,
  maxOutputTokens: 200,
  retryConfig: {
    maxRetries: 3,
    backoffMultiplier: 1.5,
    initialDelay: 500,
    maxDelay: 5000
  }
})

// Update model at runtime
geminiService.updateModelConfig({
  model: 'gemini-1.5-pro',
  temperature: 0.5
})

// Get current model info
const modelInfo = geminiService.getModelInfo()
console.log(`Current model: ${modelInfo.model}`)
```

The application includes utility functions for optimal model selection:
```typescript
import {
  getRecommendedModel,
  validateModelChoice,
  createOptimizedGeminiService,
  getModelProfile,
  logModelMetrics
} from '@/utils/gemini-config'

// Get recommended model for your use case
const model = getRecommendedModel('cost-effective') // or 'balanced', 'high-quality', 'experimental'

// Validate your model choice
const validation = validateModelChoice('gemini-1.5-pro', 'high') // volume: 'low', 'medium', 'high'
if (!validation.isOptimal) {
  console.log(`Consider switching to: ${validation.suggestion}`)
  console.log(`Reason: ${validation.reason}`)
}

// Create environment-optimized service
const service = createOptimizedGeminiService('production') // or 'development', 'staging'

// Get model performance profile
const profile = getModelProfile('gemini-2.0-flash-lite')
console.log(`Speed: ${profile.speed}, Quality: ${profile.quality}, Cost: ${profile.costTier}`)

// Log performance metrics
const startTime = Date.now()
// ... make API call ...
const responseTime = Date.now() - startTime
logModelMetrics('gemini-2.0-flash-lite', responseTime, true)
```

| Model | Speed | Quality | Cost | Best For |
|---|---|---|---|---|
| `gemini-2.0-flash-lite` | Fast | Good | Low | High-volume, real-time, cost-sensitive |
| `gemini-1.5-flash` | Fast | Better | Medium | Balanced, general-purpose |
| `gemini-1.5-pro` | Medium | Best | High | High-quality, complex reasoning, premium |
| `gemini-2.0-flash-exp` | Fast | Better | Medium | Experimental, latest features |
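The table above can be folded into a small lookup, shown here as a hypothetical sketch; the project's own `getRecommendedModel` in `@/utils/gemini-config` remains the authoritative mapping:

```typescript
type UseCase = 'cost-effective' | 'balanced' | 'high-quality' | 'experimental'

// Hypothetical encoding of the comparison table, keyed by the use-case
// names that getRecommendedModel accepts.
const MODEL_BY_USE_CASE: Record<UseCase, string> = {
  'cost-effective': 'gemini-2.0-flash-lite',
  balanced: 'gemini-1.5-flash',
  'high-quality': 'gemini-1.5-pro',
  experimental: 'gemini-2.0-flash-exp',
}

function pickModel(useCase: UseCase): string {
  return MODEL_BY_USE_CASE[useCase]
}
```

Keeping the mapping in one table-shaped constant makes it easy to update when new Gemini models ship.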
Run the test suite:

```bash
npm run test      # Interactive mode
npm run test:run  # Run once
npm run test:ui   # UI mode
```

To learn more about Next.js, take a look at the following resources:
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new) from the creators of Next.js.
Check out our Next.js deployment documentation for more details.