I need you to build a program that saves links from my database to my hard drive as HTML files.
I have thousands of links in my database, and the program must be able to:
- save dynamic websites, e.g. [login to view URL] !!! - top priority
- save at a configurable speed per domain (e.g. with 10 domains, 5 saves per second on 2 of them and 10 per second on the rest)
- save each page under a previously assigned / predefined name
- images are not important, but it would be nice to have them (lower resolution / lower quality)
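The per-domain speed requirement above can be sketched as a simple rate limiter that tracks the last request time for each domain. This is only an illustration; the class name, the rate values, and the `per_domain_rates` mapping are my assumptions, not part of the spec:

```python
import time
from collections import defaultdict

class DomainRateLimiter:
    """Hypothetical per-domain rate limiter: each domain gets its own
    minimum interval between saves (1 / saves_per_second)."""

    def __init__(self, default_rate=10.0, per_domain_rates=None):
        # per_domain_rates maps a domain name to its saves-per-second limit
        self.default_rate = default_rate
        self.per_domain_rates = per_domain_rates or {}
        self.last_request = defaultdict(float)  # domain -> last save timestamp

    def wait(self, domain):
        """Block until the next save for `domain` is allowed."""
        rate = self.per_domain_rates.get(domain, self.default_rate)
        min_interval = 1.0 / rate
        elapsed = time.monotonic() - self.last_request[domain]
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
        self.last_request[domain] = time.monotonic()
```

A worker loop would call `limiter.wait(domain)` before each fetch, so two of the domains could run at 5/sec while the others run at the 10/sec default.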
Ultimately, the program will run 24/7 on a separate computer.
The link database is available as CSV, XLSX, or Google Sheets, so the program must be able to read/import it. Example row:
23456 | https://www.zbozi.cz/vyrobek/philips-oneblade-face-and-body-qp2630-30/#from=hp
"23456" - is the name under this link to be saved as.
Hi sir, I am interested in your task. I am a PHP developer with 6 years of experience and strong knowledge of scraping. For assurance of quality work you can check my reviews. Ping me for more discussion.