Gather the URLs: Challenge 2

This is part 18 of 18 in the series Linux Basics For Hackers

Okay, everyone, it’s time for a challenge. Your challenge looks simple, but it isn’t. You will have to use everything you have learnt so far to complete it.


The Challenge


Let’s get straight into the challenge. You have to extract all of the internal URLs from the home page of https://growinghacker.com. After extracting the URLs, store them in a file (name it whatever you want) and also check the count of URLs. That’s it, but you have to do all of this with a single command in the terminal.

REMEMBER: You should use only one single command, from extracting the URLs to storing them in a file. JUST A SINGLE LINE!!
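If you want to check your answer, here is one possible shape such a one-liner can take (a sketch, not the only solution): fetch the page, grep out the internal links, deduplicate them, write them to a file, and count them in a single pipeline. The sample HTML below stands in for the real page so the pipeline can be tested without a network connection; swap the `printf` for `curl -s https://growinghacker.com` to run it against the live site. The filename `urls.txt` is just an example.

```shell
# Stand-in for `curl -s https://growinghacker.com` — a few sample anchor tags.
printf '%s\n' \
  '<a href="https://growinghacker.com/post-1">one</a>' \
  '<a href="https://growinghacker.com/post-2">two</a>' \
  '<a href="https://example.org/external">ext</a>' |
  grep -oE 'https://growinghacker\.com[^"]*' |  # keep only internal URLs
  sort -u |                                     # deduplicate
  tee urls.txt |                                # save them to a file...
  wc -l                                         # ...and print the count
```

`tee` is what lets a single pipeline both store the URLs and hand them on to `wc -l` for the count, which is the trick that keeps everything on one line.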
