Socoder -> Web Development -> Stop A Proxy?

Mon, 05 Mar 2007, 21:27
Stealth
I was wondering how I could stop proxies on my school website project, to prevent kids at school from bypassing the IP block I have set up. I was thinking this: if JavaScript checks the domain in the URL and compares it to the allowed ones, and then redirects the user to another page when the domains don't match, would that work? And so people can't just disable JavaScript, I could use document.write() to do something like:



So if JavaScript doesn't work, the code will be commented out nicely. Of course, that is just a quick fix, but this is only a minor problem of mine.
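
A minimal sketch of that client-side domain check, assuming a hypothetical allowed hostname and redirect page (the names are placeholders, not taken from the original post):

    <script type="text/javascript">
        // Hypothetical allowed host and redirect target: placeholder names only.
        var allowedHost = "www.myschool.example";
        if (window.location.hostname !== allowedHost) {
            // Page is being served through another domain (e.g. a web proxy), so bounce away.
            window.location.replace("http://" + allowedHost + "/blocked.html");
        }
    </script>

As the later replies point out, anyone who can use a proxy can also disable JavaScript or strip the script, so this only raises the bar slightly.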

I am no JavaScript king, though. I was wondering if someone could point me to something helpful, or show me some code that would help. Or maybe tell me that I am overcomplicating this problem and there is an easier solution.

Thanks!

-=-=-
Quit posting and try Google.
Mon, 05 Mar 2007, 22:52
power mousey

Yes, Stealth,

that's a good idea.

Set up a red herring for them, so to speak.

Redirect them to another page, or maybe even to a page or site you create to collect their info and trap them, so they'd be running around through pages trying to get out.
Tue, 06 Mar 2007, 10:19
garand
Don't forget to block: babelfish.altavista.com/

Maybe you could block any referring site with that address in it?
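
If the site runs on Apache with mod_rewrite enabled, a referrer check along these lines could do that; this is only a sketch, with the Babel Fish host taken from the post above and everything else illustrative:

    RewriteEngine On
    # Refuse requests whose Referer header points at the Babel Fish translator.
    RewriteCond %{HTTP_REFERER} babelfish\.altavista\.com [NC]
    RewriteRule .* - [F]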
Tue, 06 Mar 2007, 12:49
ingenium1
tsk tsk...
https://www.hackthissite.org/
https://www.hackits.de/

...

And do you think a JavaScript function will block them?

-=-=-
Roger Federer is go(o)d.
But he is not alone.

Just relax... sometimes there's no need to argue.
Fri, 09 Mar 2007, 15:44
HoboBen
JavaScript won't pose any problem to people smart enough to use proxies.

I'd use a .htaccess file if you're using an Apache server, or something PHP-based at the least.

You could customise this .htaccess file with a list of bots, but expect a little (usually small enough) server delay from the extra processing. Because that's server-side, it can't be bypassed by viewing the source, saving the HTML locally and editing it, blocking browser headers, etc. (|edit| That link is for user agents rather than site addresses, so web-based proxies aren't blocked by it, but you get the idea; you just need to find a slightly different .htaccess rewrite rule from Google somewhere. |edit|)
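
For the user-agent style of rule mentioned above, a trimmed-down .htaccess sketch might look like this; the agent strings are examples only, not a real blocklist:

    RewriteEngine On
    # Example agent strings only; a real blocklist would be much longer.
    RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} EmailSiphon [NC]
    RewriteRule .* - [F]

Swapping %{HTTP_USER_AGENT} for %{HTTP_REFERER} gives the variant that matches web-based proxies by their address instead.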

-=-=-
blog | work | code | more code
Fri, 09 Mar 2007, 16:25
Stealth
Thanks.