You can extract all HTTP/HTTPS links from a webpage and save them to a text file.
Clone the project with the following command:
$ git clone https://github.com/rahulcs754/Scrape-Urls.git
This guide assumes you already have virtualenv and pip installed.
First, create a virtualenv in your working directory:
$ virtualenv virtualenv_name
Then activate the virtualenv with the command for your platform:
# On Linux/macOS
$ source virtualenv_name/bin/activate
# On Windows
$ virtualenv_name\Scripts\activate
# Install the dependencies listed in requirements.txt
$ pip install -r requirements.txt
$ python main.py
Note: Make sure you cd into the cloned folder before running the commands above.
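The scraping itself is done by main.py. As a rough illustration of the idea (this is a standard-library-only sketch, not the project's actual code; the `LinkExtractor` class and `scrape_urls` function names are made up for this example), extracting http/https links and writing them to a text file might look like:

```python
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects http/https href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                # Keep only absolute http/https links, skip relative ones
                if name == "href" and value and value.startswith(("http://", "https://")):
                    self.links.append(value)


def scrape_urls(url, outfile="urls.txt"):
    """Fetch a page, extract its http/https links, and save them one per line."""
    parser = LinkExtractor()
    with urlopen(url) as resp:
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    with open(outfile, "w") as f:
        f.write("\n".join(parser.links))
    return parser.links
```

The project's real implementation may use third-party libraries from requirements.txt (e.g. an HTML parser such as BeautifulSoup) instead of the standard library shown here.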
- Fork the repository, make your changes, and add yourself to AUTHORS.md
- Send a pull request
Requires Python 3.