
Web Unblocker simulates real users' internet behaviour for web crawling

JavaScript rendering
Advanced browser fingerprinting
99.99% success rates

TRUSTED BY 50,000+ CUSTOMERS WORLDWIDE

Access to public data

By enabling businesses to effortlessly reach, access, scrape and utilize public web data for research and analysis, we empower them to make informed decisions and drive innovation.

Never get blocked again

Web Unblocker automatically develops new methods to overcome anti-bot technologies — allowing you to access the public web data you need, without investing heavily in infrastructure or research and development.

The importance of Unblocking

Web Unblocker breaks through network restrictions and allows users to access blocked content while improving browsing speed and security, which is essential for reaching information and expanding the scope of web usage.

Quality web Unblocking and crawling solutions

AI-driven website Unblocking

When a blocked website is requested, the AI system quickly evaluates the availability, speed and stability of the different servers and automatically selects the most suitable connection, so that users reach the target website quickly and reliably. Whether the obstacle is a geographic restriction, network censorship or another type of blocking, the AI breaks through with greater accuracy and efficiency.
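For illustration, this server-selection step can be thought of as a scoring problem. The short Python sketch below is hypothetical (the ExitServer class, field names and weights are not part of Web Unblocker); it simply shows how availability, latency and stability could be combined into a single score before the best connection is chosen.

from dataclasses import dataclass

@dataclass
class ExitServer:
    name: str
    availability: float  # fraction of recent health checks that succeeded (0..1)
    latency_ms: float    # average response time in milliseconds
    stability: float     # fraction of recent sessions without disconnects (0..1)

def score(server: ExitServer) -> float:
    # Higher is better: reward availability and stability, penalise latency.
    return 0.4 * server.availability + 0.4 * server.stability - 0.2 * (server.latency_ms / 1000)

def pick_server(candidates: list[ExitServer]) -> ExitServer:
    return max(candidates, key=score)

servers = [
    ExitServer("eu-1", availability=0.99, latency_ms=120, stability=0.97),
    ExitServer("us-2", availability=0.95, latency_ms=300, stability=0.99),
]
print(pick_server(servers).name)  # prints "eu-1" with these example numbers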

Automated proxy management

When a website needs to be accessed from a specific region, the system automatically selects a proxy server located near that region to ensure the fastest access speed and the best connection quality. It also monitors the status of every proxy server in real time; as soon as a server develops problems, traffic is immediately switched to another reliable server so that the user's connection is never interrupted.
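A simplified Python sketch of region-aware selection with failover might look like the following. The proxy endpoints, region codes and the health-check URL are placeholders, and a production system would run the health checks continuously in the background rather than per request; this only illustrates the idea described above.

import requests

# Hypothetical pools of proxy endpoints, keyed by region code.
PROXIES_BY_REGION = {
    "de": ["http://proxy-de-1.example:8000", "http://proxy-de-2.example:8000"],
    "us": ["http://proxy-us-1.example:8000"],
}

def is_healthy(proxy_url: str) -> bool:
    # Probe the proxy with a small request; treat any network error as unhealthy.
    try:
        requests.get("https://httpbin.org/ip",
                     proxies={"http": proxy_url, "https": proxy_url},
                     timeout=5)
        return True
    except requests.RequestException:
        return False

def get_proxy(region: str) -> str:
    # Prefer proxies in the requested region, then fall back to any other region.
    preferred = PROXIES_BY_REGION.get(region, [])
    fallback = [p for r, pool in PROXIES_BY_REGION.items() if r != region for p in pool]
    for proxy in preferred + fallback:
        if is_healthy(proxy):
            return proxy
    raise RuntimeError("no healthy proxy available")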

Built-in browser for JavaScript rendering

Some sites rely on JavaScript, so a browser must be launched before certain data elements are fully displayed on the page. Web Unblocker detects these sites and automatically launches its built-in browser in the background to ensure seamless crawling and complete, accurate data results.
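With Web Unblocker this detection happens on the service side, but rendering can also be requested explicitly with the X-Abc-Render: html header used in the examples below. The Python sketch here is a hypothetical client-side fallback: it fetches the page normally first and retries with rendering only when the body looks like an empty JavaScript shell (the 2048-byte threshold is an arbitrary example value, and the credentials are placeholders).

import requests

PROXIES = {
    "http": "http://YOUR_USERNAME:YOUR_PASSWORD@unblock.abcproxy.com:17610",
    "https": "https://YOUR_USERNAME:YOUR_PASSWORD@unblock.abcproxy.com:17610",
}

def fetch(url: str) -> str:
    # First try without rendering: faster and cheaper when the page is static HTML.
    response = requests.get(url, proxies=PROXIES, verify=False)
    if len(response.text) < 2048:
        # A tiny body usually means the content is built by JavaScript; retry rendered.
        response = requests.get(url, proxies=PROXIES, verify=False,
                                headers={"X-Abc-Render": "html"})
    return response.text

print(fetch("https://www.abcproxy.com/")[:500])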

Never get blocked again when web scraping

Easily integrate our solutions to your projects

We ensure that integrating our products into your scraping infrastructure is as effortless as possible. With support for multiple languages and ready-to-use code examples, a quick and easy start to your web scraping project is guaranteed.

cURL
curl -k -v -x https://unblock.abcproxy.com:17610 \
-U 'USERNAME:PASSWORD' \
'https://www.abcproxy.com/' \
-H 'X-Abc-Render: html'

Python

import requests

# Use your Web Unblocker credentials here.
USERNAME, PASSWORD = 'YOUR_USERNAME', 'YOUR_PASSWORD'

# Define proxy dict.
proxies = {
  'http': f'http://{USERNAME}:{PASSWORD}@unblock.abcproxy.com:17610',
  'https': f'https://{USERNAME}:{PASSWORD}@unblock.abcproxy.com:17610',
}

headers = {
    'X-Abc-Render': 'html'
}

response = requests.get(
    'https://www.abcproxy.com/',
    verify=False,  # Ignore the proxy's self-signed certificate
    proxies=proxies,
    headers=headers,
)

# Print result page to stdout
print(response.text)

# Save returned HTML to result.html file
with open('result.html', 'w') as f:
    f.write(response.text)

PHP

<?php
$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, 'https://www.abcproxy.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_PROXY, 'https://unblock.abcproxy.com:17610');
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'YOUR_USERNAME' . ':' . 'YOUR_PASSWORD');
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);

curl_setopt_array($ch, array(
    CURLOPT_HTTPHEADER  => array(
        'X-Abc-Render: html'
    )
));

$result = curl_exec($ch);

if (curl_errno($ch)) {
    echo 'Error: ' . curl_error($ch);
} else {
    echo $result;
}
curl_close($ch);

Node.js

import fetch from 'node-fetch';
import { HttpsProxyAgent } from 'https-proxy-agent';

const username = 'YOUR_USERNAME';
const password = 'YOUR_PASSWORD';

const agent = new HttpsProxyAgent(
  `https://${username}:${password}@unblock.abcproxy.com:17610`
);

// We recommend accepting our certificate instead of allowing insecure (http) traffic
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = '0';

const headers = {
   'X-Abc-Render': 'html',
}

const response = await fetch('https://www.abcproxy.com/', {
  method: 'get',
  headers: headers,
  agent: agent,
});

console.log(await response.text());

Go

package main

import (
	"crypto/tls"
	"fmt"
	"io/ioutil"
	"net/http"
	"net/url"
)

func main() {
	const Username = "YOUR_USERNAME"
	const Password = "YOUR_PASSWORD"

	proxyUrl, _ := url.Parse(
		fmt.Sprintf(
			"https://%s:%s@unblock.abcproxy.com:17610",
			Username,
			Password,
		),
	)
	customTransport := &http.Transport{Proxy: http.ProxyURL(proxyUrl)}

	// We recommend accepting our certificate instead of allowing insecure (http) traffic
	customTransport.TLSClientConfig = &tls.Config{InsecureSkipVerify: true}

	client := &http.Client{Transport: customTransport}
	request, _ := http.NewRequest("GET",
		"https://www.abcproxy.com/",
		nil,
	)
	
	// Ask Web Unblocker to render the page with its built-in browser
	request.Header.Add("X-Abc-Render", "html")

	request.SetBasicAuth(Username, Password)
	response, _ := client.Do(request)

	responseText, _ := ioutil.ReadAll(response.Body)
	fmt.Println(string(responseText))
}

Java

package org.example;

import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.CredentialsProvider;
import org.apache.hc.client5.http.classic.methods.HttpGet;
import org.apache.hc.client5.http.config.RequestConfig;
import org.apache.hc.client5.http.impl.auth.CredentialsProviderBuilder;
import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.HttpClients;
import org.apache.hc.client5.http.impl.io.PoolingHttpClientConnectionManagerBuilder;
import org.apache.hc.client5.http.ssl.NoopHostnameVerifier;
import org.apache.hc.client5.http.ssl.SSLConnectionSocketFactoryBuilder;
import org.apache.hc.client5.http.ssl.TrustAllStrategy;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.http.io.entity.EntityUtils;
import org.apache.hc.core5.http.message.StatusLine;
import org.apache.hc.core5.ssl.SSLContextBuilder;

import java.util.Arrays;


public class Main {

    public static void main(final String[] args)throws Exception {
        final CredentialsProvider credsProvider = CredentialsProviderBuilder.create()
                .add(new AuthScope("unblock.abcproxy.com", 17610), "YOUR_USERNAME", "YOUR_PASSWORD".toCharArray())
                .build();
        final HttpHost target = new HttpHost("https", "www.abcproxy.com", 443);
        final HttpHost proxy = new HttpHost("https", "unblock.abcproxy.com", 17610);
        try (final CloseableHttpClient httpclient = HttpClients.custom()
                .setDefaultCredentialsProvider(credsProvider)
                .setProxy(proxy)
                // We recommend accepting our certificate instead of allowing insecure (http) traffic
                .setConnectionManager(PoolingHttpClientConnectionManagerBuilder.create()
                        .setSSLSocketFactory(SSLConnectionSocketFactoryBuilder.create()
                                .setSslContext(SSLContextBuilder.create()
                                        .loadTrustMaterial(TrustAllStrategy.INSTANCE)
                                        .build())
                                .setHostnameVerifier(NoopHostnameVerifier.INSTANCE)
                                .build())
                        .build())
                .build()) {

            final RequestConfig config = RequestConfig.custom()
                    .build();
            final HttpGet request = new HttpGet("/");
            request.addHeader("X-Abc-Render","html");
            request.setConfig(config);

            System.out.println("Executing request " + request.getMethod() + " " + request.getUri() +
                    " via " + proxy + " headers: " + Arrays.toString(request.getHeaders()));

            httpclient.execute(target, request, response -> {
                System.out.println("----------------------------------------");
                System.out.println(request + "->" + new StatusLine(response));
                EntityUtils.consume(response.getEntity());
                return null;
            });
        }
    }
}

C#

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace AbcProxyApi
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var webProxy = new WebProxy
            {
                Address = new Uri("https://unblock.abcproxy.com:17610"),
                BypassProxyOnLocal = false,
                UseDefaultCredentials = false,

                Credentials = new NetworkCredential(
                userName: "YOUR_USERNAME",
                password: "YOUR_PASSWORD"
                )
            };

            var httpClientHandler = new HttpClientHandler
            {
                Proxy = webProxy,
            };

            // We recommend accepting our certificate instead of allowing insecure (http) traffic
            httpClientHandler.ClientCertificateOptions = ClientCertificateOption.Manual;
            httpClientHandler.ServerCertificateCustomValidationCallback =
                (httpRequestMessage, cert, certChain, policyErrors) =>
                {
                    return true;
                };


            var client = new HttpClient(handler: httpClientHandler, disposeHandler: true);
            
            // Ask Web Unblocker to render the page with its built-in browser
            client.DefaultRequestHeaders.Add("X-Abc-Render", "html");
            
            Uri baseUri = new Uri("https://www.abcproxy.com/");
            client.BaseAddress = baseUri;

            var requestMessage = new HttpRequestMessage(HttpMethod.Get, "");

            var response = await client.SendAsync(requestMessage);
            var contents = await response.Content.ReadAsStringAsync();

            Console.WriteLine(contents);
        }
    }
}

Millis-Leland

CEO

I have to say, this is one of the best Unblocking tools we've ever used. It has a huge number of real IP addresses, so I can easily break through all kinds of network restrictions. Our business often needs to visit foreign commercial websites to gather market information; we used to get blocked constantly, but now everything runs smoothly.

@I**t Technology

Lloyd-Stout

Senior Python Engineer

The connection speed is incredibly fast, opening web pages almost instantly, and it's very stable during browsing, never lagging or dropping out. What's even more reassuring is that it's highly secure, so I don't have to worry about personal information leakage. I really appreciate ABC Platform for developing such a great web page Unblocker.

@I**t Technology

Trusted by our customers

152,650 Users

Unblock public data for any website

Questions? We have answers.

Can I use Web Unblocker to interact with or navigate my browser?

No, Web Unblocker does not work with browsers or third-party tools such as Adspower, Puppeteer, Playwright, or Multilogin (MLA). If you need a solution that interacts with your browser and also integrates with ABCProxy's Unblocking tool, Web Unblocker, check out our Scraping Browser solution!

Does ABC Web Unblocker offer a free trial?

Yes, we offer a free trial. Simply submit a trial request to us and, after approval, you will receive a 7-day free trial. To request one, please click Request a Trial.

What is the difference between Web Unblocker and Proxies?

Web Unblocker is more advanced than any proxy because it includes browser fingerprinting, JavaScript rendering, crawling capabilities, and other features that proxies cannot provide.

Can I use Web Unblocker to get parsed data?

Web Unblocker uses JavaScript rendering to provide results as full HTML. If you want to parse them into a different format, we recommend using other specialised tools.
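As one common option, assuming you saved the returned HTML to result.html as in the Python example above, a third-party parser such as BeautifulSoup (not part of Web Unblocker) can turn it into structured data:

from bs4 import BeautifulSoup  # pip install beautifulsoup4

with open("result.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

# Extract the page title and all link targets from the rendered HTML.
print(soup.title.string if soup.title else "no <title> found")
for link in soup.find_all("a"):
    print(link.get("href"))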