Making Your First Request

Step-by-step tutorial to get started with ClaroEdge

Written by Lukas Mikalauskas
Updated today

Before You Begin

This tutorial will walk you through making your first requests with both ClaroEdge Network (proxy) and ClaroEdge Extract (web scraping API). By the end, you'll have successfully retrieved data using both products.

Prerequisites

Before starting, make sure you have:

✓ Created your ClaroEdge account

✓ Completed KYC verification (to access free trial credits)

✓ Logged into your dashboard at dashboard.claroedge.com

💡 Free Trial Credits

After KYC verification, you receive free credits: 500MB for Network, 1,000 requests for Extract Lite, and 1,000 requests for Extract Pro. No credit card required!



Part 1: Your First Network (Proxy) Request

In this section, you'll generate proxy credentials and test them with a simple request.

Step 1: Navigate to Network

From your dashboard, click "Network" in the left sidebar under the PRODUCTS section.

[Screenshot: Network page showing your data balance and usage statistics]

At the top of the page, you'll see your Network Data Remaining balance. This shows how much proxy data you have available.

Step 2: Open the Proxy Generator

Scroll down the Network page to find the Proxy Generator section. This is where you'll create your proxy credentials.

[Screenshot: Proxy Generator - Configure your proxy settings]

Step 3: Configure Your Proxy Settings

Set up your proxy with these recommended settings for your first test:

| Setting | Value | Why |
| --- | --- | --- |
| Protocol | HTTP(S) | Works with most tools and cURL |
| Session | Sticky | Same IP for multiple requests |
| Country | United States (or your choice) | Easy to verify location |
| Duration | 5 minutes | Sufficient for testing |
| Quantity | 1 | One credential pair for testing |

Step 4: Generate Credentials

Click the cyan "Generate" button at the bottom of the Proxy Generator section.

[Screenshot: Click Generate to create your proxy credentials]
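
Once the generated credentials appear, you can confirm they work with a quick request from code. The Python sketch below uses the requests library and an IP-echo service; the proxy host, port, username, and password are placeholders, so substitute the values the Proxy Generator gives you.

```python
import requests

# Placeholders: replace with the host, port, username, and password
# shown by the Proxy Generator after you click "Generate".
PROXY_HOST = "proxy.example.com"   # hypothetical host
PROXY_PORT = 8080                  # hypothetical port
PROXY_USER = "your-username"
PROXY_PASS = "your-password"

proxy_url = f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}"
proxies = {"http": proxy_url, "https": proxy_url}

# Ask an IP-echo service which address it sees. Routed through the proxy,
# this should report an IP in the country you selected.
response = requests.get("https://api.ipify.org?format=json",
                        proxies=proxies, timeout=30)
print(response.json())
```

Because you chose a sticky session, running the script twice within the five-minute window should report the same IP address.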

Part 2: Your First Extract (Web Scraping) Request

Now let's use the Extract API to scrape a webpage. We'll cover both the dashboard interface and API methods.

Method A: Using the Dashboard

The easiest way to test Extract is through the dashboard interface.

Step 1: Navigate to Extract

Click "Extract" in the left sidebar under the PRODUCTS section.

[Screenshot: Extract page showing your request balances]

You'll see your remaining requests for both Extract Lite and Extract Pro at the top.

Step 2: Find the Single Request Interface

Scroll down to find the "Send Single Request" section.

[Screenshot: Extract interface with tier selection and URL input]

Step 3: Configure Your Extraction

Set up your first extraction with these settings:

| Setting | Value | Why |
| --- | --- | --- |
| Tier | Lite | Good for static pages, lower cost |
| URL | A simple test page | |
| Output Format | HTML | See the raw HTML response |

Step 4: Send the Request

Click the "Send Request" button. The response will appear in the Response panel on the right.

✅ Success!

If you see HTML content in the Response panel, your Extract request worked! You've successfully scraped your first webpage.

Method B: Using the API

For programmatic access, use the Extract API directly.
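
The exact endpoint, field names, and authentication scheme are covered in the Documentation; the Python sketch below only illustrates the general shape of a single-page extraction call. The URL https://api.claroedge.com/v1/extract, the JSON fields (url, tier, output_format), and the bearer-token header are assumptions made for illustration.

```python
import requests

API_KEY = "your-api-key"  # replace with the key from your dashboard

# Hypothetical endpoint and JSON fields, shown only to illustrate the shape
# of a single-page extraction request; consult the API reference for the
# actual schema.
resp = requests.post(
    "https://api.claroedge.com/v1/extract",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "url": "https://example.com",   # the page to scrape
        "tier": "lite",                 # same tier used in Method A
        "output_format": "html",        # raw HTML, as in the dashboard test
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.text[:500])  # preview the first 500 characters of the response
```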



Next Steps

Congratulations! You've successfully made your first requests with both ClaroEdge products. Here's what to explore next:

For Network Users

• Try different countries and cities for geo-targeted requests

• Set up IP whitelisting for enhanced security

• Integrate with your existing tools (Scrapy, Selenium, Puppeteer); see the sketch below
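
As an example of those integrations, here is a minimal Scrapy sketch that routes a single request through the proxy. The proxy URL is a placeholder built from your generated credentials; Scrapy's built-in HttpProxyMiddleware reads the "proxy" meta key and handles the authentication.

```python
import scrapy

# Placeholder proxy URL; build it from the credentials you generated earlier.
PROXY_URL = "http://your-username:your-password@proxy.example.com:8080"

class IpCheckSpider(scrapy.Spider):
    """Fetches an IP-echo page through the proxy to confirm geo-targeting."""
    name = "ip_check"

    def start_requests(self):
        # Scrapy's HttpProxyMiddleware picks up the "proxy" meta key.
        yield scrapy.Request(
            "https://api.ipify.org?format=json",
            meta={"proxy": PROXY_URL},
            callback=self.parse,
        )

    def parse(self, response):
        self.logger.info("Exit IP seen by the target: %s", response.text)
```

Save it as ip_check_spider.py and run it with scrapy runspider ip_check_spider.py.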

For Extract Users

• Try batch processing with the /v1/job endpoint for multiple URLs (see the sketch below)

• Experiment with different output formats (JSON, CSV, Markdown)

• Set up webhooks for async batch result delivery
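
A batch job that combines the first and last points might look like the sketch below. Only the /v1/job path is taken from this article; the base URL and the field names (urls, tier, output_format, webhook_url) are illustrative assumptions, so check the API reference for the exact schema.

```python
import requests

API_KEY = "your-api-key"  # replace with the key from your dashboard

# Hypothetical request body for a batch job; only the /v1/job path comes
# from this article, the rest is an assumed schema for illustration.
job = requests.post(
    "https://api.claroedge.com/v1/job",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "urls": ["https://example.com", "https://example.org"],
        "tier": "lite",
        "output_format": "json",
        # Hypothetical webhook field: results would be POSTed here when ready.
        "webhook_url": "https://your-app.example.com/claroedge-results",
    },
    timeout=60,
)
print(job.status_code, job.json())
```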

Useful Resources

| Resource | Description |
| --- | --- |
| 📚 Documentation | Full API reference with code examples for multiple languages |
| ❓ Help Center | Tutorials, guides, and troubleshooting articles |
| 💬 Customer Support | Live chat for real-time assistance |
| 🤝 Affiliate Program | Earn commissions by referring new customers |

💡 Monitor Your Usage

Keep an eye on your usage statistics in the dashboard. Both the Home page and individual product pages show detailed usage charts and remaining credits.
