Thank you for the detailed explanation. Adding that functionality to
ModSecurity shouldn't be much work. I was thinking of implementing it
along these lines:
# Perform the lookup. We are not blocking straight away; by specifying
# the "capture" action we instruct the @rbl operator to extract the
# bits and place them into transaction variables (only on a successful
# lookup).
SecRule REMOTE_ADDR "@rbl whatever.org" "nolog,pass,capture"
# At this point the TX:0 variable contains the actual response returned.
# Variables TX:1, TX:2, ..., TX:9 contain the individual bits, so you can
# use them for any purpose. For example:
SecRule TX:1 "@eq 1" "log,deny"
SecRule TX:2 "@eq 1" "pass"
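For clarity, here is roughly what the bit extraction would do, sketched in
Python (the TX variable layout is the one proposed above; a DNSBL answer
such as 127.0.0.6 carries the bitmask in its last octet):

```python
# Rough sketch of the proposed bit extraction: a DNSBL answer such as
# 127.0.0.6 encodes list membership as a bitmask in the last octet.
def extract_bits(answer):
    last_octet = int(answer.rsplit(".", 1)[1])
    tx = {"TX:0": answer}  # TX:0 holds the raw response
    for i in range(1, 10):  # TX:1 .. TX:9, one variable per bit
        tx["TX:%d" % i] = 1 if last_octet & (1 << (i - 1)) else 0
    return tx

# 6 = 0b110, so bits 2 and 3 are set
print(extract_bits("127.0.0.6"))
```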
On 4/24/07, Michael Renzmann wrote:
> Hi Ryan.
> > Interesting... I see the resource/time advantage of being able to just
> > do one @rbl lookup against multi.surbl.org and have it cover the other
> > lists.
> From what I understand the dnsbl server does not perform "local
> sub-queries to other zones". Instead, the database used for that zone is
> fed by an appropriate zonefile (which, in turn, will most probably be
> built by some script that combines the current contents of the
> zones in question).
> > As for inspecting the bitmask portion, does it really matter
> > which list it was on? [...] Is there something specific that you are
> > looking to do depending on which list had the client IP address listed?
> Yes, but I have to explain a bit.
> A friend of mine just started a dnsbl that lists known spider IPs. This
> dnsbl is fed with data from iplists.com, which in turn allows a pretty
> good distinction between various "big" spiders such as Googlebot or the
> live.com spider.
> One of the websites I'm responsible for makes use of Trac
> (http://trac.edgewall.org). This has proven to be very useful for our
> purposes, but it can be pretty demanding when processing some requests.
> Unfortunately the webserver isn't very powerful (P3-850), and I'm happy
> about any unnecessarily expensive request I can prevent. One alternative
> would be to upgrade to a more powerful server (but we don't have the
> money for that at this time), another would be to implement caching for
> Trac (which is actually on my to-do list, but will take some time).
> Analysing the access logs reveals that a good share of the server's load
> seems to be caused by spiders hitting places that have been disallowed in
> robots.txt. At first I thought I was just too dumb to write a proper
> robots.txt, but after double-checking the file's contents and reading
> the information available online on that topic, I came to believe that
> I'm not at fault.
> Hence I'm currently considering using a combination of ModSecurity
> and queries to the aforementioned spider dnsbl to implement some kind of
> "robots.txt enforcement": GET requests from IP addresses that are listed
> in the spider dnsbl, to locations that are disallowed in robots.txt,
> should be answered with an error (403 or 410 or ...).
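That part should be straightforward once the lookup result is available. A
rough sketch of the decision logic, in Python (the paths and status codes
are just examples):

```python
# Hypothetical decision logic for the "robots.txt enforcement" idea:
# deny a request when the client IP is a listed spider and the path is
# disallowed in robots.txt.
DISALLOWED_PREFIXES = ("/trac/search", "/trac/changeset")  # example paths

def enforce(path, ip_is_listed_spider):
    if ip_is_listed_spider and path.startswith(DISALLOWED_PREFIXES):
        return 403  # or 410, as discussed
    return 200  # let the request through

print(enforce("/trac/search?q=foo", True))   # -> 403
print(enforce("/trac/search?q=foo", False))  # -> 200
```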
> You may wonder: where is the need for bitmasking the result? Well, some
> of the locations in robots.txt are not disallowed for all spiders. One
> example is the feed generator, which may be accessed by Google's
> Feedfetcher, but not by any of the other "ordinary" spiders. Without
> bitmasking this will be pretty hard/expensive to achieve.
> I'm aware that this is a very special case, and most probably one that
> looks pretty borked.
> But think of a case where a user is interested in checking an IP against
> three zones of a dnsbl service. If that service offers a multi zone that
> combines more than these three zones, it's still easier to perform one
> lookup and mask only the interesting bits in the result, rather than
> performing three consecutive lookups to get the same result.
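Right. To make that concrete, here is a small Python sketch of the
single-lookup-plus-mask approach (the zone name and bit assignments are
made up for illustration):

```python
# Made-up bit assignments for three hypothetical sub-lists of a
# combined "multi" zone.
LIST_A = 1 << 0
LIST_B = 1 << 1
LIST_C = 1 << 2

def query_name(ip, zone):
    # DNSBL queries use the reversed IP prepended to the zone name.
    return ".".join(reversed(ip.split("."))) + "." + zone

def listed_in(last_octet, mask):
    # One lookup, then mask only the interesting bits.
    return bool(last_octet & mask)

print(query_name("192.0.2.1", "multi.example.org"))
# -> 1.2.0.192.multi.example.org
print(listed_in(6, LIST_B | LIST_C))  # answer 127.0.0.6 -> True
```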
> Bye, Mike
> This SF.net email is sponsored by DB2 Express
> Download DB2 Express C - the FREE version of DB2 express and take
> control of your XML. No limits. Just data. Click to get it now.
> mod-security-users mailing list
> [email protected]