💻
Code Examples
Complete integration examples with production-ready code
Build complete content moderation systems with our step-by-step examples. Each example includes database setup, report thresholds, AI moderation, human review escalation, and secure webhook handling.
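Before diving into either stack, here is the shape of the flow all of the examples share, sketched in Node.js. The threshold value and the countReports and escalateToReview helpers are hypothetical placeholders for your own database and review-queue code; only the Outharm request itself mirrors the call shown in the quick test below.

const REPORT_THRESHOLD = 3 // example value; tune it for your platform

// countReports and escalateToReview are hypothetical stand-ins for your own
// database and review-queue code.
async function handleReport(contentId, title, body) {
  if (await countReports(contentId) < REPORT_THRESHOLD) return

  // Ask the Outharm API for an automated verdict on the reported content
  const response = await fetch('https://api.outharm.com/moderation/automated', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.OUTHARM_TOKEN}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      schema_id: 'your-schema-id',
      content: { title: [title], content: [body] }
    })
  })
  const result = await response.json()

  // Flagged content goes to a human moderator instead of being removed blindly
  if (result.is_harmful) await escalateToReview(contentId, result)
}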
🚀
Node.js & Express
Complete content moderation system built with Express.js, Prisma, and PostgreSQL. Features report-based thresholds, automated AI analysis, and manual review workflows (a webhook-verification sketch follows this card).
Express.js + Prisma ORM
PostgreSQL database
Webhook security & processing
View complete example →
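The webhook side of the Express example is the easiest part to get wrong, so here is a minimal sketch of signature verification. The x-outharm-signature header, the OUTHARM_WEBHOOK_SECRET variable, and the HMAC-SHA256 scheme are illustrative assumptions only; the complete example documents the verification Outharm actually expects.

const crypto = require('crypto')
const express = require('express')

const app = express()

// Read the raw body so the signature is computed over exactly what was sent
app.post('/webhooks/outharm', express.raw({ type: 'application/json' }), (req, res) => {
  // Header name, secret variable, and HMAC-SHA256 scheme are assumptions for illustration
  const expected = crypto
    .createHmac('sha256', process.env.OUTHARM_WEBHOOK_SECRET)
    .update(req.body)
    .digest('hex')
  const received = req.get('x-outharm-signature') || ''

  const valid =
    received.length === expected.length &&
    crypto.timingSafeEqual(Buffer.from(received), Buffer.from(expected))
  if (!valid) return res.status(401).end()

  const event = JSON.parse(req.body)
  // Apply the moderation decision carried by the event here (e.g. hide the content)
  console.log('Verified webhook event:', event)
  res.status(200).end()
})

app.listen(3000)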
🐍
Python & FastAPI
Async content moderation API built with FastAPI, SQLAlchemy, and PostgreSQL. High-performance Python implementation with automatic API documentation.
FastAPI + SQLAlchemy
Async HTTP requests
Pydantic validation
View complete example →
⚡ Quick API Test
Want to quickly test the Outharm API? Here's a minimal example to get you started:
Node.js (set OUTHARM_TOKEN in your environment)
// Submit content to the automated moderation endpoint
const response = await fetch('https://api.outharm.com/moderation/automated', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.OUTHARM_TOKEN}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    schema_id: 'your-schema-id',
    content: {
      title: ['Post title to moderate'],
      content: ['Post content to analyze for harmful material']
    }
  })
})

// is_harmful indicates whether the submitted content was flagged
const result = await response.json()
console.log('Moderation result:', result.is_harmful)
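The response is JSON; its is_harmful field tells you whether the submitted fields were flagged. The complete examples above show how to wire that result into report thresholds, human review escalation, and webhook processing.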