
Question

WRITE THE UNIX CODE FOR EACH STEP PLEASE

Scenario: Imagine that, as part of penetration-testing reconnaissance, you are tasked with finding all the cisco.com subdomains listed on its index page and their corresponding IP addresses, with no duplicates. Doing this manually would be frustrating and take a long time; with Bash commands, however, it becomes much easier (if you know the commands).

As with most problems in our field, there are many ways to solve this one. The lab provides one method where you can use and learn these commands. As we go through the lab you will be combining, or piping, several commands together. You will also need to save your results to files at specific points in the lab; be on the lookout for those areas.
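For instance, a pipe (|) sends one command's output into the next command, and redirection (>) saves the final output to a file. A generic illustration with placeholder file names:

    grep 'error' logfile.txt | cut -d ' ' -f1 > results.txt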

Use wget to retrieve the index page from www.cisco.com.
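A minimal form of this step might be the following (assuming the page should be saved as index1.html, the file name referenced below):

    # -O names the downloaded file so later steps can refer to index1.html
    wget -O index1.html https://www.cisco.com/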

Look over index1.html; you will see entries that look like links (the lines containing the page's link markup).

  • NOTE: If you use cut/paste, be careful, as quotation marks may not copy properly.

    Use grep to pull these lines from the index file (use redirection to save the result as lab3-1.txt). One possible version of the full pipeline is sketched after step 17 below.

    NOTE: Throughout the lab you will pipe one command to another; for the purposes of this lab, do not use the saved files in the commands. For example, the command cat lab3-1.txt | cut -d ... > lab3-2.txt is not allowed.

    Review the links in the file. How can you extract the web address from each line? Is there a delimiter we can use to trim down the result further?

    1. To trim down or cut text from a file we can use the cut command. A basic command such as cut -d, -f3 would use a comma as the delimiter and return the 3rd field (token) based on that delimiter. In addition, use the man pages to learn about cut.

    2. So the question is: what delimiter would help us here? (Hint: the field you want is -f2.)

    3. Pipe the original grep command into the cut command (save as lab3-2.txt).

    4. Review the output. Not optimal yet but closer.

    5. What is unique about the lines we want compared to the other lines? Can we pipe another grep command onto the end of our command to extract the lines we want? (save as lab3-3.txt) (Hint: the \ is the escape character.)

    6. Look over the output; it is almost clean, but it still has some issues. Develop and pipe a cut command so that the output is only web addresses (save as lab3-4.txt).

    7. Review the output again: is there anything else that doesn't belong? If so, what command would get rid of it? (save as lab3-5.txt) (We are only looking for web addresses.)

    8. Is there still a problem with our output? What is it? Hint: take a look at some of the options of the sort command.

    9. Once you have the output set, save it as lab3-6.txt. (Save the code used to generate lab3-6.txt in a file called lab3Script. This must be all of the commands piped together; DO NOT use a saved file in this command. You will add to this file later.)

    10. Now we have a clean list of domain names, so the next step is to convert them to IP addresses.

    11. The host command: host is a simple utility for performing DNS lookups. It is normally used to convert names to IP addresses and vice versa (see man host for details).

    12. Bash allows inline programming constructs such as for-each style loops in the form for <variable> in <list>; do <command>; done. For example:

    for var1 in $(cat mylist);

    do wc $var1;

    done

    would give you a word count of each file listed in the file called mylist. (The $ indicates a variable rather than literal text.)

    13. Use a bash for loop to convert the host names to IP addresses (save as lab3-7.txt).

    14. Review the output; host provides a lot of output, some of which is not relevant. Basically, we want the lines that include "has address" or "has IPv6 address". There's a straightforward way to handle this; a more interesting way is to use grep with wildcards and regular expressions (not required).

    15. Pipe a grep command after this loop that will filter down our output. (save as lab3-8.txt)

    16. Use the cut command to extract only the IP address. (Hint: the IPv4 and IPv6 addresses sit in different places on each line, but both are the last field.)

    17. Complete the lab by removing duplicates (save as lab3-final.txt). (Add the code used to generate lab3-final.txt to the lab3Script file. There should be around 45 unique addresses.)
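Since the question asks for the Unix code for each step, here is a minimal sketch of one way the commands could fit together. It is not a verified answer: the href pattern, the double-quote and slash delimiters, the field numbers, and the grep -v cleanup are assumptions about how the Cisco index page is marked up, so check each intermediate file and adjust the patterns to match what you actually see.

    # Hedged sketch of one possible pipeline; patterns and field numbers are assumptions.

    # Retrieve the index page and save it as index1.html
    wget -O index1.html https://www.cisco.com/

    # lab3-1.txt: pull the lines that look like links
    grep 'href' index1.html > lab3-1.txt

    # lab3-2.txt: use the double quote as the delimiter and keep field 2 (the URL inside href="...")
    grep 'href' index1.html | cut -d '"' -f2 > lab3-2.txt

    # lab3-3.txt: keep only the cisco.com entries; the backslash escapes the dot so it matches literally
    grep 'href' index1.html | cut -d '"' -f2 | grep 'cisco\.com' > lab3-3.txt

    # lab3-4.txt: in a /-delimited URL such as https://host/path, field 3 is the host name
    grep 'href' index1.html | cut -d '"' -f2 | grep 'cisco\.com' | cut -d '/' -f3 > lab3-4.txt

    # lab3-5.txt: drop whatever is left that is not a web address (blank lines here; your output may differ)
    grep 'href' index1.html | cut -d '"' -f2 | grep 'cisco\.com' | cut -d '/' -f3 | grep -v '^$' > lab3-5.txt

    # lab3-6.txt: sort -u sorts the list and removes duplicate names in one step
    grep 'href' index1.html | cut -d '"' -f2 | grep 'cisco\.com' | cut -d '/' -f3 | grep -v '^$' | sort -u > lab3-6.txt

    # lab3-7.txt: look up each name with host (lab3-6.txt is shown here for readability; in
    # lab3Script the $(...) should contain the whole pipeline above instead of a saved file)
    for name in $(cat lab3-6.txt); do host "$name"; done > lab3-7.txt

    # lab3-8.txt: keep only the "has address" / "has IPv6 address" lines
    for name in $(cat lab3-6.txt); do host "$name"; done | grep 'has.*address' > lab3-8.txt

    # lab3-final.txt: the IP is the last space-separated field on both IPv4 and IPv6 lines, so
    # reverse each line, cut the first field, reverse it back, then remove duplicates
    for name in $(cat lab3-6.txt); do host "$name"; done | grep 'has.*address' | rev | cut -d ' ' -f1 | rev | sort -u > lab3-final.txt

Note that sort -u does what sort | uniq would do in a single command, and awk '{print $NF}' is an equally valid alternative to the rev | cut | rev trick for grabbing the last field.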
