Location changes what a page shows: product prices, legal notices, and even search results can differ by country. ScrapingBee's country_code parameter for geo-targeted scraping gets you the data a local user would see. After a few simple steps, you can view pages in their local languages and access them through local servers.
With concise, clear code for cURL, Python, and JavaScript, you can get started without friction. Turn on ScrapingBee's premium proxies where needed, run small tests, and fix common errors as you go.
The country_code tells ScrapingBee to route your request from a country you select. Many sites then respond with local content. With the right code, your scraper sees the same page a local user would see. That makes your data accurate and useful.
Use country_code when the site is location-aware. Common cases include local search results, e-commerce product pages, consent or legal banners, and region-blocked content.
For most cases, enable premium proxies as well. That choice gives cleaner exits and fewer blocks. Before running examples, install ScrapingBee in your environment and store the API key in a secure variable.
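One simple way to keep the key out of your code is an environment variable. A minimal sketch follows; the variable name SCRAPINGBEE_API_KEY is just a convention for this guide, not something the API requires.

```python
import os

def get_api_key(env_var="SCRAPINGBEE_API_KEY"):
    """Read the ScrapingBee API key from the environment, failing loudly if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set the {env_var} environment variable before running.")
    return key
```

Export the variable once in your shell (for example, `export SCRAPINGBEE_API_KEY=...`), then call get_api_key() wherever a request needs the key.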
Here is a minimal cURL request. Replace the API key and URL with your own values.
curl -G "https://app.scrapingbee.com/api/v1/" \
--data-urlencode "api_key=YOUR_API_KEY" \
--data-urlencode "url=https://example.com" \
--data-urlencode "render_js=false" \
--data-urlencode "premium_proxy=true" \
--data-urlencode "country_code=de"
import requests

params = {
    "api_key": "YOUR_API_KEY",
    "url": "https://example.com",
    "render_js": "false",
    "premium_proxy": "true",
    "country_code": "de",
}
r = requests.get("https://app.scrapingbee.com/api/v1/", params=params, timeout=60)
print(r.status_code, len(r.text))
const params = new URLSearchParams({
  api_key: "YOUR_API_KEY",
  url: "https://example.com",
  render_js: "false",
  premium_proxy: "true",
  country_code: "de",
});

fetch("https://app.scrapingbee.com/api/v1/?" + params.toString())
  .then(res => res.text())
  .then(html => console.log(html.length));
Start with render_js=false. Many pages return clean HTML without JavaScript, which saves time and cost. If you plan bulk pulls, ScrapingBee pagination for CSV exports and APIs helps you move through results in order and export clean CSV files.
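The pagination loop itself is simple; a hedged sketch is below. Here fetch_page is a stand-in for your own request function (for example, a ScrapingBee call where the page number goes into the target URL's query string), and the "empty page means done" rule is an assumption about the target site.

```python
def paginate(fetch_page, max_pages=50):
    """Collect rows page by page until a page comes back empty.

    fetch_page(page_number) should return a list of parsed rows for that page;
    max_pages is a safety cap so a misbehaving site cannot loop forever.
    """
    rows = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        if not batch:  # an empty page usually means we ran past the last one
            break
        rows.extend(batch)
    return rows
```

Plug in any fetcher: `paginate(lambda p: scrape_and_parse(base_url + f"?page={p}"))`.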
ScrapingBee follows ISO 3166-1 alpha-2 codes. Common examples include us (United States), gb (United Kingdom), de (Germany), fr (France), and br (Brazil). If you are not sure, check the official ISO 3166 list and pick the code that matches your real target market.
Search results change by region. You may need local SERPs for audits or rank checks. Try this test for Germany.
curl -G "https://app.scrapingbee.com/api/v1/" \
--data-urlencode "api_key=YOUR_API_KEY" \
--data-urlencode "url=https://www.google.com/search?q=site%3Aexample.com+pricing" \
--data-urlencode "premium_proxy=true" \
--data-urlencode "country_code=de" \
--data-urlencode "block_resources=true" \
--data-urlencode "timeout=60000"
Block heavy assets when you only need HTML; the page loads faster and costs less. To reduce parsing work, ScrapingBee extract rules let you return only the fields you need from the HTML response.
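Extract rules are passed as a JSON object mapping field names to CSS selectors. Below is a minimal sketch of building such a request's parameters; the exact rule syntax and advanced options (lists, attributes) should be confirmed against the ScrapingBee extract-rules documentation, and the selectors shown are placeholders.

```python
import json

API_URL = "https://app.scrapingbee.com/api/v1/"

def build_extract_params(api_key, url, rules, country_code="de"):
    """Build query parameters with extract rules serialized as a JSON string.

    `rules` maps output field names to CSS selectors, e.g. {"title": "h1"}.
    """
    return {
        "api_key": api_key,
        "url": url,
        "premium_proxy": "true",
        "country_code": country_code,
        "extract_rules": json.dumps(rules),
    }

params = build_extract_params("YOUR_API_KEY", "https://example.com",
                              {"title": "h1", "price": ".price"})
```

Pass the result to your HTTP client as usual; the response is then JSON with just those fields instead of the full HTML.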
Large shops show different items and currencies by country. For Amazon Germany, test with a product that exists in that market.
curl -G "https://app.scrapingbee.com/api/v1/" \
--data-urlencode "api_key=YOUR_API_KEY" \
--data-urlencode "url=https://www.amazon.de/dp/B0XXXXXXX" \
--data-urlencode "premium_proxy=true" \
--data-urlencode "country_code=de" \
--data-urlencode "render_js=false"
Confirm the language, currency, and availability. For dynamic pages and scripted clicks, ScrapingBee Playwright can render the app, keep country_code active, and return the final HTML you need.
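A cheap way to confirm you really landed in the target market is to scan the returned HTML for local markers. The sketch below uses simple string checks, which is a heuristic: real pages vary, so treat a failed check as a prompt to inspect the HTML, not as proof.

```python
def looks_german_market(html):
    """Rough check that a page came back localized for Germany.

    Looks for a euro price marker and a German lang attribute. String
    matching is fragile by design here; use a real parser for production.
    """
    has_euro = "€" in html or "EUR" in html
    has_lang = 'lang="de' in html
    return has_euro and has_lang
```

Run it right after the request and log the result, so a silent geolocation miss shows up before you parse thousands of pages.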
Some sites use both IP and headers to set location. Add an Accept-Language header that fits the target.
curl -G "https://app.scrapingbee.com/api/v1/" \
-H "Accept-Language: de-DE,de;q=0.9" \
--data-urlencode "api_key=YOUR_API_KEY" \
--data-urlencode "url=https://example.com/" \
--data-urlencode "premium_proxy=true" \
--data-urlencode "country_code=de"
Keep headers simple and honest. Use one realistic User-Agent if needed; too many spoofed headers can look strange and cause blocks. When a target needs form data or JSON, ScrapingBee's POST support lets you send a body while keeping country_code and premium_proxy active.
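For POST requests, the geotargeting options stay in the query string while the payload travels in the request body. The stdlib-only sketch below builds those pieces without sending anything; the body-forwarding behavior should be confirmed against the ScrapingBee docs for your plan.

```python
import json
from urllib.parse import urlencode

API_URL = "https://app.scrapingbee.com/api/v1/"

def build_post_call(api_key, url, payload, country_code="de"):
    """Build (full_url, body, headers) for a POST that ScrapingBee forwards.

    country_code and premium_proxy ride in the query string; the JSON
    payload is the body that gets relayed to the target site.
    """
    params = {
        "api_key": api_key,
        "url": url,
        "premium_proxy": "true",
        "country_code": country_code,
    }
    body = json.dumps(payload).encode()
    headers = {"Content-Type": "application/json"}
    return API_URL + "?" + urlencode(params), body, headers
```

Hand the three values to any HTTP client, for example `requests.post(full_url, data=body, headers=headers)`.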
Premium proxies often give better geolocation and fewer errors. If the content looks wrong, enable them first and test again; whether you call them premium or residential proxies, both terms point to the same fix.
Errors still happen. When a page looks wrong or is blocked, work through short fixes in order: enable premium proxies, add an Accept-Language header that matches the country, increase the timeout, and retry the request. You can repeat this flow for each new job.
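Retries in particular are worth automating. A hedged sketch of a backoff wrapper follows; fetch is any zero-argument callable of your own that raises on failure (for example, a request wrapper that raises on non-200 responses).

```python
import time

def fetch_with_retry(fetch, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky fetch with exponential backoff.

    Delays grow as base_delay * 2**attempt (1s, 2s, 4s, ...); the last
    error is re-raised if every attempt fails.
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception as exc:
            last_error = exc
            sleep(base_delay * (2 ** attempt))
    raise last_error
```

The injectable `sleep` argument keeps the helper testable; in production you leave it at the default.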
The flow above is a complete geotargeting recipe. For network-level routing without extra code changes, ScrapingBee proxy mode lets you forward requests through the service using your existing HTTP client.
import requests

def scrape_geo(url, cc, api_key, js=False, lang=None):
    params = {
        "api_key": api_key,
        "url": url,
        "premium_proxy": "true",
        "country_code": cc,
        "render_js": "true" if js else "false",
        "block_resources": "false" if js else "true",
        "timeout": "60000",
    }
    headers = {}
    if lang:
        headers["Accept-Language"] = lang
    r = requests.get("https://app.scrapingbee.com/api/v1/", params=params, headers=headers, timeout=90)
    r.raise_for_status()
    return r.text

html = scrape_geo("https://example.com/pricing", "gb", "YOUR_API_KEY", js=False, lang="en-GB,en;q=0.9")
print(len(html))
async function scrapeGeo(url, cc, apiKey, js = false, lang = "") {
  const params = new URLSearchParams({
    api_key: apiKey,
    url,
    premium_proxy: "true",
    country_code: cc,
    render_js: js ? "true" : "false",
    block_resources: js ? "false" : "true",
    timeout: "60000",
  });
  const headers = {};
  if (lang) headers["Accept-Language"] = lang;
  const res = await fetch("https://app.scrapingbee.com/api/v1/?" + params.toString(), { headers });
  if (!res.ok) throw new Error("HTTP " + res.status);
  return await res.text();
}

scrapeGeo("https://example.com", "de", "YOUR_API_KEY", false, "de-DE,de;q=0.9")
  .then(h => console.log(h.length));
These helpers keep your choices in one place and make tests easy to share. If you work in spreadsheets, ScrapingBee extension alternatives for Google Sheets help you run requests outside of add-ons and load clean results into your sheet by CSV import.
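Getting scraped rows into a Sheets-friendly CSV takes only the standard library. A minimal sketch, assuming your parser produces a list of dicts:

```python
import csv
import io

def rows_to_csv(rows, fieldnames):
    """Serialize scraped rows (a list of dicts) to CSV text for a Sheets import."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Write the returned text to a .csv file and import it into Google Sheets via File, then Import.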
Small checks prevent large mistakes. Run a few quick validations before any big job: confirm the country code, the target URLs, and your header choices.
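A tiny pre-flight function catches the most common typos before they cost credits. This sketch checks only code and URL shape; it cannot catch policy or geolocation issues.

```python
import re

def validate_run(country_code, urls):
    """Cheap pre-flight checks before a big run.

    ISO 3166-1 alpha-2 codes are exactly two letters, and ScrapingBee
    examples use lowercase, so that is the form checked here.
    Returns a list of human-readable problems (empty means OK).
    """
    problems = []
    if not re.fullmatch(r"[a-z]{2}", country_code):
        problems.append(f"country_code {country_code!r} is not a two-letter lowercase code")
    for url in urls:
        if not url.startswith(("http://", "https://")):
            problems.append(f"URL missing scheme: {url!r}")
    return problems
```

Abort the run if the returned list is non-empty, and log the problems so the fix is obvious.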
Use data in a safe way. Respect robots.txt, rate limits, and local laws. Protect user privacy. Use geotargeting to test pages, compare markets, and fix errors. Avoid attempts to bypass lawful blocks.
| Goal | Key parameters | Country code example | Extra headers | JS rendering | Quick notes |
|---|---|---|---|---|---|
| Local SERP check | premium_proxy=true, country_code=de, block_resources=true | de | Accept-Language: de-DE,de;q=0.9 | No | HTML only is faster and cheaper. |
| Amazon product page | premium_proxy=true, country_code=de | de | Accept-Language: de-DE,de;q=0.9 | No | Pick a product sold in the target market. |
| Price and stock audit | premium_proxy=true, country_code=gb | gb | Accept-Language: en-GB,en;q=0.9 | No | Check currency and units after the request. |
| App or SPA page | premium_proxy=true, country_code=us, render_js=true | us | Accept-Language: en-US,en;q=0.9 | Yes | Turn off JS only if HTML loads correctly. |
| Legal or consent banners | premium_proxy=true, country_code=fr | fr | Accept-Language: fr-FR,fr;q=0.9 | No | Screenshots help verify banner behavior. |
| Speed-focused crawl | premium_proxy=true, country_code=ca, block_resources=true | ca | Accept-Language: en-CA,en;q=0.9 | No | Block images, fonts, and media. |
| Heavy pages that time out | premium_proxy=true, country_code=au, timeout=90000 | au | Optional: Accept-Language: en-AU,en;q=0.9 | Maybe | Increase the timeout and retry. |
| Region-blocked pages | premium_proxy=true, country_code=br | br | Accept-Language: pt-BR,pt;q=0.9 | No | Try a second exit in the same country. |
By now, ScrapingBee's country routing should be clear. Because location changes what pages show, this setting matters for accurate data. For teams that prefer community code, ScrapingBee's open source tools can speed up prototypes, provide reference examples, and support quick integration. With the steps above: set country_code, add helpful headers, enable premium proxies, test pages, and fix common errors.
Next, run a small test. Pick one country. Use the quick start. Check the output. Then add more markets. With these patterns, you can move from guesswork to a stable and trusted workflow.