Gather the URLs : Challenge 2

This entry is part 18 of 19 in the series Linux Basics For Hackers


Okay, everyone, it’s time for a challenge. Your challenge looks simple, but it isn’t. You will have to use everything you have learnt so far to complete it.


The Challenge


Let’s get straight into the challenge. You have to extract all of the internal URLs from the home page of https://growinghacker.com. After extracting the URLs, store them in a file (name it whatever you want) and also check the count of URLs. That’s it, but you have to do all of this with a single command in the terminal.

REMEMBER : You must use only one single command, from extracting the URLs to storing them in a file. JUST A SINGLE LINE !!
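Before you peek at any solutions, try it yourself. As a hint, one possible shape for such a pipeline is sketched below on some sample HTML (the `printf` stands in for fetching the page; for the real challenge you would replace it with something like `curl -s https://growinghacker.com`). The regex and filenames here are just assumptions for illustration, not the only correct answer:

```shell
# Sketch: filter internal links out of HTML, save them, and count them.
# The printf below fakes a downloaded home page with two internal links
# and one external link.
printf '%s\n' \
  '<a href="https://growinghacker.com/post1">one</a>' \
  '<a href="https://example.com/external">ext</a>' \
  '<a href="https://growinghacker.com/post2">two</a>' |
  grep -oE 'https?://growinghacker\.com[^"]*' |  # keep only internal URLs
  tee urls.txt |                                  # store them in a file
  wc -l                                           # ...and count them
```

`tee` is what lets a single pipeline both write the URLs to `urls.txt` and pass them on to `wc -l` for the count.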
