DS9 customer crawlers may use the user agent "ds9 2.x".

Deep SEARCH 9 GmbH develops intelligent web crawlers for large-scale web information analytics. See our website for more details.

Our customers can send out DS9 crawlers to retrieve freely available information from the web.
These crawlers are designed to strictly follow the bot directives defined in robots.txt.
They also honor robots meta tags in HTML documents and are intended to behave politely at all times.
Nevertheless, a misconfigured crawler or unexpected web content may lead to unexpected behaviour.
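If you wish to restrict or exclude these crawlers, standard robots.txt directives apply. A minimal sketch, assuming the crawlers match the user agent token "ds9" (derived from the user agent string above; check your server logs for the exact token the crawler sends):

```
# Exclude DS9 crawlers from a specific directory
User-agent: ds9
Disallow: /private/

# Or exclude them from the entire site:
# User-agent: ds9
# Disallow: /
```

The equivalent per-page control is a robots meta tag in the HTML head, for example <meta name="robots" content="noindex, nofollow">.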

If you think our crawlers have violated your crawler directives or misbehaved in any other way, please contact netmin (at) deepsearchnine.com and let us know about the incident.