I was trying to write a web scraper, so I used
scalpel. The website I wanted to scrape blocks my IP (I run a Tor exit node), so I decided to use proxychains (specifically, version 3.1-6 from Debian). I ran into the following weird behavior: if I tell proxychains to resolve DNS through the proxy, things are fine, but if I tell it to resolve DNS in the clear, or if the URL I'm connecting to is an IP address (i.e. manually resolved), I always get timeouts, and the timeout errors show up much faster than a real timeout should.
With curl, the three cases are (commands sketched just below):
- don't resolve DNS over the proxy: times out
- resolve DNS over the proxy: works
- resolve DNS over the proxy, but use an IP so the lookup never actually happens: times out
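Roughly, those invocations look like this (a sketch, assuming the DNS mode is toggled via the proxy_dns option in /etc/proxychains.conf; the IP is the one from the transcript further down):

# proxy_dns commented out in /etc/proxychains.conf: DNS in the clear -- times out
proxychains curl http://ifconfig.co

# proxy_dns enabled: DNS over the proxy -- works
proxychains curl http://ifconfig.co

# proxy_dns enabled, but the name pre-resolved to an IP on curl's side -- times out
proxychains curl --resolve ifconfig.co:80:188.113.88.193 http://ifconfig.co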
wget and aria2c also behave like curl.
HTTP (the Haskell package) behaves differently from all of the above, failing to connect even in the case where the others succeed:
% proxychains stack exec -- test-http "http://ifconfig.co"
ProxyChains-3.1 (http://proxychains.sf.net)
|DNS-request| ifconfig.co
|R-chain|-<>-201.175.94.245:38746-<><>-4.2.2.2:53-<><>-OK
|DNS-response| ifconfig.co is 188.113.88.193
|R-chain|-<>-201.175.94.245:38746-<><>-188.113.88.193:80-<--timeout
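For reference, test-http just fetches the given URL with the HTTP package; a minimal version looks something like this (a sketch, not the exact code):

module Main where

import Network.HTTP (getRequest, getResponseBody, simpleHTTP)
import System.Environment (getArgs)

main :: IO ()
main = do
  [url] <- getArgs
  -- simpleHTTP does the name lookup and opens the TCP connection itself,
  -- which is the step that hits the timeout in the output above
  body <- getResponseBody =<< simpleHTTP (getRequest url)
  putStrLn body

It's built in a stack project with HTTP in the dependencies and run as shown above.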
Anyone have any idea what's going on?