Create a PHP script that will acquire data from 6 website pages and store it in separate MySQL tables.
1) From this page ([login to view URL]),
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes.
This one will create a MySQL table called "unusual_option_activity_stocks"
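The overwrite behavior above (fetch, keep the first 100 rows, replace the table's previous contents) can be sketched as below. This is a minimal illustration only: SQLite stands in for MySQL, and the columns (symbol, price, volume) are placeholders, since the actual page and its columns sit behind the [login to view URL] link. The production script would do the same DELETE-then-INSERT (or TRUNCATE) in PHP via PDO/mysqli.

```python
import sqlite3  # stand-in for MySQL; the real script would use PDO/mysqli in PHP

def overwrite_snapshot(conn, table, rows, limit=100):
    """Replace the table's contents with the first `limit` fetched rows."""
    conn.execute(f"DELETE FROM {table}")  # discard the previous request's data
    conn.executemany(
        f"INSERT INTO {table} (symbol, price, volume) VALUES (?, ?, ?)",
        rows[:limit],  # keep only the first 100 rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE unusual_option_activity_stocks"
    " (symbol TEXT, price REAL, volume INTEGER)"
)
rows = [("AAPL", 1.70, 1200), ("MSFT", 3.30, 900)]  # placeholder data
overwrite_snapshot(conn, "unusual_option_activity_stocks", rows)
overwrite_snapshot(conn, "unusual_option_activity_stocks", rows)  # re-run: table is replaced, not appended
count = conn.execute(
    "SELECT COUNT(*) FROM unusual_option_activity_stocks"
).fetchone()[0]
```

Running the fetch twice leaves the same row count, because each run overwrites the previous snapshot.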
2) [login to view URL]
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes. Write to the table only NEW data that is not currently in it.
This one will create a MySQL table called "unusual_option_activity_etf"
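The "only write NEW data" requirement is easiest to enforce with a unique key on the row's natural identifier, so that re-inserting an existing row is a no-op. A minimal sketch, again with SQLite standing in for MySQL and a hypothetical symbol column as the key (the MySQL equivalents are a UNIQUE/PRIMARY KEY plus INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE):

```python
import sqlite3  # stand-in for MySQL

def insert_new_rows(conn, table, rows):
    # PRIMARY KEY on symbol makes re-inserting an existing row a no-op
    # (MySQL equivalent: INSERT IGNORE)
    conn.executemany(
        f"INSERT OR IGNORE INTO {table} (symbol, price) VALUES (?, ?)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE unusual_option_activity_etf (symbol TEXT PRIMARY KEY, price REAL)"
)
insert_new_rows(conn, "unusual_option_activity_etf", [("SPY", 450.0), ("QQQ", 380.0)])
# second fetch: QQQ is already present and is skipped, only IWM is added
insert_new_rows(conn, "unusual_option_activity_etf", [("QQQ", 381.0), ("IWM", 190.0)])
count = conn.execute("SELECT COUNT(*) FROM unusual_option_activity_etf").fetchone()[0]
```

The same pattern covers items 3 through 6, which carry the identical "only NEW data" clause.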
3) [login to view URL]
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes. Write to the table only NEW data that is not currently in it.
This one will create a MySQL table called "unusual_option_activity_Index"
4) [login to view URL]
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes. Write to the table only NEW data that is not currently in it.
This one will create a MySQL table called "most_active_options_stock"
5) [login to view URL]
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes. Write to the table only NEW data that is not currently in it.
This one will create a MySQL table called "most_active_options_etf"
6) [login to view URL]
with the same columns as displayed on the page.
Download the first 100 rows only, if available. Keep only the latest request in the table (overwrite whatever is currently in the table with the new data).
Repeat this on a schedule, every 30 minutes. Write to the table only NEW data that is not currently in it.
This one will create a MySQL table called "most_active_options_index"
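The every-30-minutes schedule for all six fetches is typically handled by cron rather than by the script itself. A sketch of the crontab entry, with hypothetical paths for the PHP binary, the script, and the log file:

```shell
# Run the scraper every 30 minutes (add via `crontab -e`).
# Paths below are placeholders; adjust to where the script actually lives.
*/30 * * * * /usr/bin/php /var/www/scraper/fetch_option_activity.php >> /var/log/option_scraper.log 2>&1
```

One script invocation can fetch all six pages in turn, so a single cron line covers the whole job.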
Hello, how are you? My name is Xu.
This program will involve auto-login and downloading a CSV; that is all.
I am a Python dev, and I am sure I can do this with Python and Selenium.
I am available full time and can start work immediately.
Please contact me so that we can discuss your project.
Thanks for your posting.
$100 CAD in 3 days
4.9 (228 reviews)
7.2
7 freelancers are bidding on average $216 CAD for this job
Hello, I'm a full-time freelancer and am free now.
I have read your project description and am very interested in this project.
I have a lot of experience with this type of work, and I'm sure I can complete it on time, with high quality, according to your request.
Please get in touch so that we can discuss more, and I can start work immediately.
I'm always ready.
I have read your job posting and am very interested in working on your project.
About Me:
I am a software developer with more than 3.6 years of working experience in Laravel, CodeIgniter, and core PHP,
and I have delivered 25+ successful projects in various technologies, including many web-service APIs for mobile apps.
I have worked on APIs in more than 18 projects.
Specialization in your requirements:
I have worked with many PHP frameworks, including CodeIgniter, as well as core PHP and the Laravel framework, on many websites and hybrid mobile applications, usually with MySQL as the backend.
It would also help if you could explain the project in detail, so I can share my understanding, ideas, and working plan with you.
Please consider my resume for your assignment, and let me know your next available time so we can move forward with your project requirements.
Also let me know if you have any doubts or queries about me.
Regards,
Ratan Singh