MCP Server for AI Assistants
Enable your AI assistant to scrape web pages using ScrapingAnt's MCP server.
What is MCP?
Model Context Protocol (MCP) is an open standard that enables AI assistants to connect with external data sources and tools. ScrapingAnt's MCP server allows AI assistants like Claude, Cursor, and VS Code Copilot to scrape web pages directly during conversations.
Getting Your API Key
To use the ScrapingAnt MCP server, you need an API key:
- Sign up for a ScrapingAnt account
- Navigate to your Dashboard to find your API key
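All of the configurations below embed the key as a plain `x-api-key` header, so avoid committing those files to version control. One common pattern is to keep the key in an environment variable and only mask it when logging; a minimal sketch (the `SCRAPINGANT_API_KEY` variable name is an arbitrary convention, not something the server reads):

```python
import os

def mask_key(key: str) -> str:
    """Show only the first four characters of an API key for safe logging."""
    return key[:4] + "..." if len(key) > 4 else "***"

# SCRAPINGANT_API_KEY is our own naming choice; the MCP configs below still
# need the raw key value in their x-api-key header.
api_key = os.environ.get("SCRAPINGANT_API_KEY", "<YOUR-API-KEY>")
print("Using API key:", mask_key(api_key))
```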
Getting Started
Install the ScrapingAnt MCP server in your preferred AI tool:
Claude Desktop
Add the following to your claude_desktop_config.json:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "scrapingant": {
      "url": "https://api.scrapingant.com/mcp",
      "transport": "streamableHttp",
      "headers": {
        "x-api-key": "<YOUR-API-KEY>"
      }
    }
  }
}
```
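Editing the file by hand works fine, but if you already have other MCP servers registered, merging programmatically avoids clobbering them. A minimal Python sketch (the `add_scrapingant` helper is our own illustration, not part of any SDK):

```python
import json

def add_scrapingant(config: dict, api_key: str) -> dict:
    """Merge the ScrapingAnt entry into an MCP config dict,
    preserving any servers already registered."""
    servers = config.setdefault("mcpServers", {})
    servers["scrapingant"] = {
        "url": "https://api.scrapingant.com/mcp",
        "transport": "streamableHttp",
        "headers": {"x-api-key": api_key},
    }
    return config

# Example: a config that already registers another server.
existing = {"mcpServers": {"other": {"command": "other-server"}}}
merged = add_scrapingant(existing, "<YOUR-API-KEY>")
print(json.dumps(merged, indent=2))
```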
VS Code / GitHub Copilot
Add the following to your VS Code settings or create .vscode/mcp.json in your workspace:
```json
{
  "mcpServers": {
    "scrapingant": {
      "url": "https://api.scrapingant.com/mcp",
      "transport": "streamableHttp",
      "headers": {
        "x-api-key": "<YOUR-API-KEY>"
      }
    }
  }
}
```
Cursor
- Open Settings → MCP → Add new MCP Server
- Enter the following configuration:
  - Name: `scrapingant`
  - URL: `https://api.scrapingant.com/mcp`
  - Transport: `streamableHttp`
  - Headers: `x-api-key: <YOUR-API-KEY>`
Claude Code (CLI)
Run the following command in your terminal:
```shell
claude mcp add scrapingant --transport http https://api.scrapingant.com/mcp -H "x-api-key: <YOUR-API-KEY>"
```
Cline
Add the following to your cline_mcp_settings.json:
```json
{
  "mcpServers": {
    "scrapingant": {
      "url": "https://api.scrapingant.com/mcp",
      "transport": "streamableHttp",
      "headers": {
        "x-api-key": "<YOUR-API-KEY>"
      }
    }
  }
}
```
Windsurf
Follow the Windsurf MCP documentation using the standard configuration:
```json
{
  "mcpServers": {
    "scrapingant": {
      "url": "https://api.scrapingant.com/mcp",
      "transport": "streamableHttp",
      "headers": {
        "x-api-key": "<YOUR-API-KEY>"
      }
    }
  }
}
```
Available Tools
The ScrapingAnt MCP server provides three tools for web scraping:
| Tool | Description |
|---|---|
| `get_web_page_html` | Fetch a URL and return raw HTML content |
| `get_web_page_markdown` | Fetch a URL and return content as Markdown |
| `get_web_page_text` | Fetch a URL and return plain text content |
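Your AI assistant issues these calls for you, but for reference, a tool invocation over the MCP streamable HTTP transport is a JSON-RPC 2.0 `tools/call` message. A rough sketch of the request shape (session negotiation, response handling, and transport headers omitted; based on the MCP specification rather than ScrapingAnt-specific behavior):

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 request body for an MCP tools/call invocation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

payload = build_tool_call("get_web_page_markdown", {"url": "https://example.com"})
print(json.dumps(payload, indent=2))
```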
Parameters
All three tools accept the same parameters:
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `url` | string | ✅ | - | The URL of the page to scrape |
| `browser` | boolean | ❌ | `true` | Whether to use browser rendering |
| `proxy_type` | string | ❌ | `datacenter` | Proxy type: `datacenter` or `residential` |
| `proxy_country` | string | ❌ | Random | ISO-3166 country code. See available proxy countries |
Parameter Details
browser
By default, ScrapingAnt uses a headless browser to render pages. This is essential for JavaScript-heavy websites and Single-Page Applications (SPAs). Set to false for lightweight HTML parsing of static pages.
proxy_type
- `datacenter` (default): High-speed datacenter proxies. Best for most use cases.
- `residential`: Premium residential proxies. Use when encountering anti-bot detection for improved success rates.
proxy_country
Specify a country code to route requests through proxies in that region. Useful for accessing geo-restricted content or testing location-specific behavior.
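The parameter combinations described above map to tool argument sets like these (illustrative values only; `DE` is just one example of an ISO-3166 code):

```python
# Static page: skip browser rendering for a faster, cheaper fetch.
static_page = {"url": "https://example.com", "browser": False}

# Anti-bot protection: switch to residential proxies.
protected_page = {"url": "https://example.com", "proxy_type": "residential"}

# Geo-restricted content: route the request through a German proxy.
geo_page = {"url": "https://example.com", "proxy_country": "DE"}

for args in (static_page, protected_page, geo_page):
    print(args)
```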
Usage Examples
Once configured, you can ask your AI assistant to scrape web pages in plain language:
Basic Scraping
"Fetch the content from https://example.com and summarize it"
"Get the HTML from https://news.ycombinator.com"
Using Markdown Output
"Scrape https://docs.python.org/3/tutorial/index.html as markdown and explain the main topics"
With Residential Proxies
"Fetch https://example.com using residential proxies"
Geo-targeted Requests
"Get the content from https://example.com using a proxy from Germany"
Credits Usage
MCP server requests consume API credits the same way as regular API calls. Learn more about credits and pricing.