A production-ready agentic AI system that integrates Notion workspace management with Microsoft AutoGen multi-agent framework through Model Context Protocol (MCP).
Frontend: https://mcp-notion-chatbot.lovable.app
Backend: Dynamically tunneled via ngrok (see deployment section)
This project demonstrates advanced implementation of:
- Model Context Protocol (MCP) for standardized tool integration
- Microsoft AutoGen multi-agent conversational AI
- Asynchronous Python architecture with proper concurrency handling
- Production deployment with ngrok tunneling
Frontend (Lovable) → ngrok Tunnel → Flask API → AutoGen Agent → MCP Tools → Notion API
- MCP Server: https://mcp.notion.com/mcp via mcp-remote
- Protocol: Stdio-based MCP server communication
- Tools: Dynamic tool discovery and registration from the Notion MCP server
- Authentication: Secure token-based Notion workspace access
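To illustrate what "stdio-based MCP server communication" means in practice: the client launches the server as a subprocess and exchanges JSON-RPC 2.0 messages over its stdin/stdout. The sketch below shows the shape of the tool-discovery request (the `tools/list` method name comes from the MCP specification; this is an illustration of the wire format, not this project's code):

```python
import json

# MCP clients talk to a stdio server over the subprocess's stdin/stdout
# using JSON-RPC 2.0. Tool discovery is a "tools/list" request:
list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Serialized exactly as it would be written to the server's stdin.
wire_message = json.dumps(list_tools_request)
print(wire_message)
```

The server replies with a `tools` array describing each tool's name, description, and input schema, which is what makes dynamic discovery and registration possible.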
- Framework: Microsoft AutoGen with AssistantAgent implementation
- Team: RoundRobinGroupChat with 5-turn conversation limit
- Model: OpenAI GPT-4o via the autogen-ext OpenAI client
- Termination: Text-based termination condition ("TERMINATE")
- Reflection: Tool use reflection enabled for improved decision making
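The agent and team configuration above can be sketched with the autogen-agentchat API (a sketch against the current API surface; `notion_tools` and `SYSTEM_PROMPT` are placeholders for values this project builds in MCP/notion_mcp_tools.py and config/prompt/system_prompt.py, and exact parameter names may vary across autogen versions):

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4o")

notion_agent = AssistantAgent(
    name="notion_agent",
    model_client=model_client,
    tools=notion_tools,            # placeholder: tools discovered from the Notion MCP server
    reflect_on_tool_use=True,      # reflect on tool results before producing the final reply
    system_message=SYSTEM_PROMPT,  # placeholder: agent system message
)

# Stop on the "TERMINATE" keyword, or after at most 5 turns.
team = RoundRobinGroupChat(
    [notion_agent],
    termination_condition=TextMentionTermination("TERMINATE"),
    max_turns=5,
)
```

With a single participant, RoundRobinGroupChat simply gives the agent repeated turns until a termination condition fires, which is enough for tool-calling loops against Notion.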
# Proper async chain implementation
Flask (sync) → asyncio.run() → AutoGen Team → MCP Tools → Notion API

├── main.py # Flask API server with ngrok tunneling
├── .env.example # Environment variables template
├── pyproject.toml # UV dependency management
├── uv.lock # Dependency lock file
├── agents/
│ └── notion_agent.py # AutoGen AssistantAgent configuration
├── teams/
│ └── notion_team.py # RoundRobinGroupChat team setup
├── MCP/
│ └── notion_mcp_tools.py # MCP server connection and tool registration
├── models/
│ └── openai_model_client.py # OpenAI client configuration
├── config/
│ ├── settings.py # Environment configuration
│ └── prompt/
│ └── system_prompt.py # Agent system message
└── images/
├── frontend_ui.png # Frontend interface screenshot
├── workflow.png # System workflow diagram
└── detailed_workflow_tunnelling.png # Detailed architecture diagram
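The sync-to-async bridge described earlier (Flask → asyncio.run() → AutoGen Team → MCP Tools) can be sketched with a stand-in coroutine in place of the real team; `run_agent` and `handle_request` here are illustrative names, not this project's functions:

```python
import asyncio

async def run_agent(task: str) -> str:
    # Placeholder for the real awaited team run, which in turn awaits
    # MCP tool calls that hit the Notion API.
    await asyncio.sleep(0)
    return f"handled: {task}"

def handle_request(task: str) -> str:
    # Flask view functions are synchronous, so the request handler starts
    # an event loop with asyncio.run() for the duration of one agent run.
    return asyncio.run(run_agent(task))

print(handle_request("create a page named 'Project Timeline'"))
# → handled: create a page named 'Project Timeline'
```

One event loop per request keeps the sync Flask world and the async AutoGen world cleanly separated, at the cost of loop startup overhead on each call.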
Execute tasks through the Notion agent
{
"task": "create a page named 'Project Timeline'"
}

Response:
{
"status": "success",
"result": "Agent conversation stream output"
}

Health check endpoint
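The task endpoint's request/response contract shown above can be captured as a plain function (the error branch and its message are illustrative assumptions, not documented server behavior):

```python
import json

def execute_task(request_body: str) -> dict:
    # Mirrors the endpoint contract: {"task": ...} in,
    # {"status": ..., "result": ...} out.
    payload = json.loads(request_body)
    task = payload.get("task")
    if not task:
        # Assumed error shape; the real server's error handling may differ.
        return {"status": "error", "result": "missing 'task' field"}
    # In the real server, the AutoGen team runs the task here and the
    # agent conversation stream becomes the result.
    return {"status": "success", "result": f"ran: {task}"}

print(execute_task('{"task": "create a page"}'))
```

Keeping the contract in one pure function like this also makes the endpoint easy to unit-test without a running Flask server or agent.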
# Clone repository
git clone <repository-url>
cd MCP-Notion-Project
# Install dependencies
uv sync
# Configure environment
cp .env.example .env
# Add your tokens: NOTION_TOKEN, OPENAI_API_KEY, NGROK_AUTH_TOKEN

# Run the application
uv run main.py

The application will:
- Start Flask server on port 7001
- Establish ngrok tunnel for public access
- Connect to Notion MCP server
- Initialize AutoGen agent with MCP tools
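Since startup depends on three secrets, a fail-fast loader in the spirit of config/settings.py might look like the sketch below (the function name and error message are assumptions, not the project's actual code):

```python
import os

# The three secrets listed in .env.example.
REQUIRED_VARS = ("NOTION_TOKEN", "OPENAI_API_KEY", "NGROK_AUTH_TOKEN")

def load_settings() -> dict:
    # Fail at startup if a secret is missing, rather than mid-conversation
    # when an agent first tries to call Notion or OpenAI.
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Validating all variables at once also reports every missing secret in a single error instead of one per restart.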
Your local machine acts as the production server through ngrok tunneling, enabling public access while maintaining local development flexibility.


