Nimbuzzmasters forum
Web Pentest - Part 1: Information Gathering


r00t d3str0y3r
Member
Join date : 2014-10-25
Posts : 30
Thanks gained : 130
Gender : Male
Age : 23
Website : http://securitymafia.com
Post by r00t d3str0y3r on Sat Nov 01, 2014 8:24 pm

Information Gathering with online websites

Hello and welcome to my first tutorial on Information Gathering.

In this tutorial we will gather information about a target website using some freely available online services:

1. Netcraft
2. YouGetSignal
3. Archive.org
4. robots.txt

The last thing we will use is the robots.txt file, which lists the paths the web admin wants to hide from bots and keep out of public indexes. All this information can often give your testing a head start. I will explain each one with an example.

1. Netcraft
Netcraft gives us detailed information about the web hosting and the server: what is running on it, its IP address, WHOIS records, server-side technologies, and so on. Save all of this in your report so you can use it to choose the right tests and define the attack surface, which is the most important part of a pentest.
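One of the basic facts Netcraft reports, the site's IP address, can also be confirmed yourself. A minimal sketch using Python's standard library (the domain in the comment is a placeholder, not a real target):

```python
import socket

def resolve_ip(domain):
    """Resolve a hostname to its IPv4 address, one of the
    data points a Netcraft site report gives you."""
    return socket.gethostbyname(domain)

# e.g. resolve_ip("example.com") returns the site's IPv4 address,
# which you can then record in your report alongside the WHOIS data.
```

Noting the IP in your report also lets you cross-check it later against reverse IP lookup results.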

2. YouGetSignal
Many times the particular domain you are targeting is not vulnerable, or you cannot find the right attack surface. In that case you can run a reverse IP domain lookup and find the other domains hosted on the same server, one of which may be vulnerable and let you onto the server.

In this way you can work your way into the original website.
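The idea behind a reverse IP lookup is simply grouping domains by the address they resolve to. A small sketch of that concept, using made-up resolution results (the domains and IPs below are hypothetical):

```python
from collections import defaultdict

def group_by_ip(domain_ips):
    """Group domains by the IP they resolve to; domains sharing an IP
    are co-hosted on the same server, which is exactly what a reverse
    IP lookup service like YouGetSignal reveals."""
    hosts = defaultdict(list)
    for domain, ip in domain_ips.items():
        hosts[ip].append(domain)
    return dict(hosts)

# Hypothetical resolution results for illustration:
records = {
    "target-site.example": "203.0.113.10",
    "cohosted-blog.example": "203.0.113.10",
    "unrelated.example": "198.51.100.7",
}
print(group_by_ip(records)["203.0.113.10"])  # the two co-hosted domains
```

Any domain that shares the target's IP becomes an alternative way onto the same server.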

3. Archive.org
Archive.org maintains a history of many websites on the internet. Often you can find information that is no longer displayed on a site, perhaps removed because of a security issue, but something related to it can still be found in the archive.
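Besides browsing Archive.org by hand, the Wayback Machine exposes an availability API that returns the closest archived snapshot of a page. A sketch that just builds the query URL (sending the request is left out so the example stays offline):

```python
from urllib.parse import urlencode

def wayback_query(url, timestamp=None):
    """Build a query URL for the Wayback Machine availability API.
    The optional timestamp (YYYYMMDD) asks for the snapshot closest
    to that date."""
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    return "https://archive.org/wayback/available?" + urlencode(params)

print(wayback_query("example.com", "20140101"))
```

Fetching that URL returns JSON describing the closest archived snapshot, if one exists.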

4. Robots.txt
Robots.txt is a file that websites use to tell crawlers not to index some of their sensitive data or admin panels. It is publicly viewable, so that data could be useful if we find it and use it later on.

After all this we can move to our target domain and view its robots.txt file, which web admins or web applications use to keep private paths away from web bots. Viewing it reveals the paths to all that content, and later we can visit those pages and find hidden content, which may even be left wide open through the carelessness of a web admin.
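Pulling the interesting paths out of a robots.txt file is a few lines of code. A minimal sketch that extracts the Disallow entries from a sample file (the sample content is made up for illustration):

```python
def disallowed_paths(robots_txt):
    """Extract the Disallow paths from robots.txt text. These are
    exactly the locations the admin asked crawlers to skip, and
    therefore worth a manual look during a pentest."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

# Hypothetical robots.txt content:
sample = """User-agent: *
Disallow: /admin/
Disallow: /private/backup.zip
Allow: /public/
"""
print(disallowed_paths(sample))  # ['/admin/', '/private/backup.zip']
```

Each returned path can then be requested directly in the browser to see whether the "hidden" content is actually protected.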

Author: r00t d3str0y3r

Broken Angel
Designer
Join date : 2012-12-16
Posts : 138
Thanks gained : 3778
Gender : Female
Age : 22

Post by Broken Angel on Sat Nov 01, 2014 11:08 pm

NimbuzzMasters™
I'm so lonely, broken angel
