Web Scraping is the process of extracting data or information from an online source such as a website, database, or application. Web Scraping Specialists have the skills to help people collect valuable digital data and quickly find the useful information they need from websites, mobile apps, and APIs. These experts typically use web scraping tools and advanced technologies to collect large amounts of targeted data without any manual work for the client.
With web scraping, tasks that otherwise may require a lot of time can be automated and done faster. Our experienced Web Scraping Specialists use their expertise to develop scripts that continuously target structured and unstructured data sources.
Here are some projects that our expert Web Scraping Specialists made real:
Web Scraping Specialists are skilled professionals who know how to help businesses optimize processes while collecting the rich, structured data they need for their specific purposes. Our experts speed up the process and return accurate results in less time, so customers can make better decisions more quickly without any manual labour. If you are looking for a talented professional to build a web scraping project for you, you have come to the right place. Here on Freelancer.com you can find talented professionals who will get the job done with top-quality results! Post your project now and see what our Web Scraping professionals can do for you!
From 363,685 reviews, clients rate our Web Scraping Specialists 4.9 out of 5 stars.
I have a working Chrome extension (supplied as a zipped file) that books USA-visa appointment slots for me, but it requires the full browser UI. I now need the same logic migrated to a true browserless setup—ideally using Chrome-compatible automation such as Puppeteer with the API—so the whole process can run headless on a server. Key goals • Replicate the current extension’s slot-booking workflow exactly, including every request, redirect, and timing nuance that makes it succeed today. • Support multiple user credentials in a single run so I can cycle through several accounts without manual re-logins. • Keep it Chrome-compatible; I do not want to switch engines. • Maintain the stealth techniques in the extension (headers, user-agents, delays...
I need a clean, well-structured list of fresh leads gathered exclusively from UK-based, industry-specific websites. No social media, directories, or third-party databases—just direct website scraping. Scope of work • Identify qualifying UK websites within the target industry niches I provide. • Extract key contact fields: company name, decision-maker name (when available), role/title, direct email address, phone (if listed), and website URL. • Deliver the data in an Excel or Google Sheets file, clearly de-duplicated and ready for outreach. Quality criteria • All emails must be live (no bounces on a quick MailTester check). • Minimum 90% accuracy on contact-to-company matching. • Each entry must show its source URL for validation. Feel free to...
I need my entire site crawled with Screaming Frog SEO Spider (licensed “Pro” edition) and a full report of 404, 301, and 500 errors, etc. EDIT: If you haven't got Screaming Frog Pro, don't bid! Sounds crazy, but... The site has about 2,000 pages and should take no more than 2 hours of run time. I will pay 10-15 dollars to someone who knows what they are doing. I NEED THE REPORT WITHIN 24 HOURS, PLEASE
I’m looking for a simple, reliable way to collect public TikTok comments so I can feed them into my sentiment-analysis pipeline and an Algospeak detector I’m prototyping. The only pieces of each comment I really care about are the raw text and any emojis that appear alongside it; usernames, likes, or other metadata are optional and can be ignored unless they come free with your method. The workflow is up to you—official API, browser automation, or another approach—as long as it stays stable over time and does not violate TikTok’s terms of service. I’ll point you to a set of video or hashtag URLs for testing, and I need the script plus a short README so I can rerun the scrape on fresh links whenever I need new data. Deliverables • A runnable scri...
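For the text-plus-emoji extraction step this brief describes, a minimal post-processing sketch could separate the two once the raw comments are collected. The emoji ranges below are a simplified assumption, not the full Unicode emoji list:

```python
import re

# Simplified emoji ranges -- a rough assumption, not an exhaustive Unicode list.
EMOJI_RE = re.compile(
    "["
    "\U0001F300-\U0001FAFF"   # symbols, pictographs, emoticons, extended
    "\U00002600-\U000027BF"   # misc symbols and dingbats
    "]+"
)

def split_comment(comment: str) -> dict:
    """Return the raw text and any emojis found in a scraped comment."""
    emojis = EMOJI_RE.findall(comment)
    text = EMOJI_RE.sub("", comment).strip()
    return {"text": text, "emojis": emojis}

print(split_comment("this trend is wild 💀🔥 fr"))
```

Keeping emojis as separate tokens like this tends to help downstream sentiment models that treat them as features rather than noise.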
We are looking for a field sales / data collection person who can visit local shops and collect business-related information. The person will physically go to shops and gather the required details as per our format. Work Responsibilities: 1) Visit retail shops in assigned areas 2) Collect shop details such as: - Shop name - Owner name - Mobile number - Shop category (furniture, electrical, hardware, etc.) - Address & location 3) Explain our platform briefly to shop owners 4) Share daily data report in Excel / Google Sheet / WhatsApp Payment: Payment can be per shop / per day / weekly (to be discussed) Long-term work possible for good performance Location: On-ground field work (local market visits) Delhi, NCR
Looking for someone who can scrape Snapchat and get me all lenses. I need it as soon as possible.
Please start your proposal with the word "BRASIL" so I know you've read the full description. I run a PHP / MySQL video platform focused on the LATAM market and I’m looking for a developer who can become my long-term technical teammate. The codebase is a custom CMS that handles thousands of daily views, user uploads, and a growing affiliate program. Your first mission is two-fold: Replace the current international checkout with a seamless PIX integration—front-end flow, server-side validation, webhook handling, and clear logging. Connect the site to our affiliate tracking partners (API) and pull real-time conversion data into the admin dashboard. Ongoing Responsibilities: Maintain the user-upload pipeline (FFmpeg encoding, thumbnail generation, S3/CloudFron...
I need a reliable script that automatically pulls complete tour and pricing details from and pushes the results straight into MySQL database tables I can query. The data set must cover the full catalogue on the site and include, for every tour: the day-by-day itinerary text, available dates, all stated inclusions or exclusions, and the advertised price and details in Australian dollars for each departure date. You are free to use Python with BeautifulSoup, Scrapy (preferred), Selenium or another stack you trust, as long as the final result is a repeatable, well-commented script plus clear connection settings so I can run it myself on a schedule. Deliverables • Script loaded to server, tested and working. • Ongoing support (paid as separate tasks on hourly rate or ...
Hi, this is the process. 1. We receive invoices for product purchases; these come from different suppliers. 2. We run AI scripts in DeepSeek, attaching the invoice we received from the supplier. 3. DeepSeek produces output from the data in the invoice in the format that we use to paste into a Google spreadsheet; the output looks like this file for each respective supplier. 4. In the spreadsheet the last two columns are 'compare at price' / image url, 'metafields'. This is where you come in. The Python script will extract the recommended retail price from supermarkets, e.g. Coles, Woolworths, IGA, Chemist Warehouse. Note that the script will need to be coded using the column 'name', not the column position in the sheet, because for each supplier the position for certain co...
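The column-by-name requirement in this brief maps naturally onto Python's `csv.DictReader`, which keys each row by header text instead of position. A minimal sketch with made-up sample data (only the 'name' and 'compare at price' headers come from the brief; the values are illustrative):

```python
import csv
import io

def rrp_rows(csv_text: str):
    """Read rows keyed by header name, so column position can vary per supplier."""
    return [
        (row["name"], row["compare at price"])
        for row in csv.DictReader(io.StringIO(csv_text))
    ]

# Illustrative supplier sheet -- real data would come from the exported spreadsheet.
sample = (
    "sku,name,compare at price\n"
    "A1,Vitamin C 500mg,12.50\n"
    "A2,Fish Oil 1000mg,19.00\n"
)
print(rrp_rows(sample))
```

Because lookup is by header string, the same function works even when one supplier's file puts 'name' in a different column than another's.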
I’m looking for a developer who can build a data-analysis script that processes image data stored in my database. The goal is to connect to the existing DB, pull the image records, run the required analysis, and return clear, structured results I can store back in the same database or export as a CSV/JSON report. Key needs • Script category: Data analysis (not scraping or automation). • Data type: Images only. • Source: Directly from a database connection—no local-file or URL handling required. Core tasks 1. Establish a secure connection to my database and fetch the image blobs or paths. 2. Run analysis—basic metrics (dimensions, color histograms, format validation) plus any lightweight computer-vision insights you recommend. 3. Output findings in...
I need a clean, well-structured list of email addresses extracted directly from online websites and databases. The emails to get include: 10,000 contact email addresses for each of the following categories: 1. The contact email addresses for every country in the world including their government, military and intelligence force contact emails of each country. 2. Largest 10,000 Evangelical churches in the United States. 3. Largest 10,000 Mosques in the United States. 4. Largest 10,000 African American churches in the United States. 5. Largest 10,000 Christian churches in Europe. 6. 10,000 contact emails from all members of United States federal congress and their staff members. 7. 10,000 emails from the largest media companies in the United States. 8. 10,000 emails from the largest media com...
The development of a Robotic Process Automation (RPA) bot is required to optimize and automate the process of uploading insurance policies to a third-party registration website. The main objective is to reduce manual workload and improve efficiency in policy registration. The bot must perform the following functions: 1. Data Extraction: Capture key information from our internal policy issuance software (initially by reading an Excel file, but with the capability to soon read data via an API call to our system). The data to be extracted includes: * Neighborhoods * Policy number * Insurance company * Policy start date * Policy end date * Insured party data (name, ID number, last name, occupation). 2. Conditional Insured Party Registration: If an insured party's ID number does not alr...
I'm in need of a skilled professional who can add new features to my existing Selenium-based Excel macro tool. The main task involves improving its data output capabilities from CSV files to websites. Furthermore, I would like to enhance the tool's navigation for a more seamless user experience. Ideal skills and experience: - Proficient in Python and Selenium - Strong background in Excel macro development - Experience with data extraction and web interaction - UI/UX design skills for navigation improvements - Ability to implement new features seamlessly
I need help scraping data from a difficult-to-scrape website. I'm looking to have a script made that can be run on a daily basis, and that I can repeat myself, along with any help in setting it up. The script will pull down unique product IDs and high-level descriptions for all the products in inventory. Then, the next day, it will do it again and compare which products have been added or removed. Assume up to 80K items per day. message me if interested, and I'll provide more details. Thanks,
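The day-over-day comparison this brief describes reduces to a set difference over product IDs, which stays fast even at 80K items per day. A hedged sketch (the IDs are illustrative, and persisting yesterday's snapshot to disk is left out):

```python
def diff_inventory(yesterday: set, today: set):
    """Return (added, removed) product IDs between two daily scrapes."""
    added = sorted(today - yesterday)    # in today's scrape but not yesterday's
    removed = sorted(yesterday - today)  # dropped since yesterday's scrape
    return added, removed

# Illustrative daily snapshots; real ones would come from the scraper runs.
added, removed = diff_inventory({"P100", "P101", "P102"}, {"P101", "P102", "P103"})
print("added:", added)      # -> added: ['P103']
print("removed:", removed)  # -> removed: ['P100']
```

In practice each day's ID set would be saved (a flat file or SQLite table is enough) so the next run has something to diff against.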
I need a complete front-end scrape of three Magento-based parts catalogs (a combined 59,214 SKUs). I do not have admin or API access, so the crawler must work solely through the public storefront. Each Catalog should be a separate export file. Data fields required for every SKU: product name, price, full description, all images, category path and product codes, plus fit, finish, color, manufacturer, style, and origin details where shown. I’m comfortable with Octoparse, but feel free to use the scraping tool you know best as long as it can respect the site’s structure and avoid throttling issues. First Catalog Big Twin 1984 & Up Number of products - 21,187 Number of Categories - 24 Second Catalog Classic Pre-1984 Number of products - 22,800 Number of Categories - 25 ...
I need a robust, fully-automated system that can crawl large batches of URLs and return a clear, structured report of every monetisation signal it finds. The core detection scope is fixed: • Adserver implementations • Headerbidding wrappers • SSP footprints • Text-to-speech (TTS) widgets • Web-push-notification scripts • 1-click-login flows built on FedCM • Keyword-link services used for programmatic monetisation • Identification markers for Ad Context Protocol (AdCP) and the Agentic RTB Framework (ARTF) The solution must scale to hundreds of sites without manual intervention. I’m open to the language and stack—Playwright, Puppeteer, Selenium, or comparable headless-browser tooling are all fine—as long as it reliably...
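For the detection side, one plausible approach is pattern-matching over rendered page source. The signatures below are illustrative assumptions only; a production system would need a vetted fingerprint list per signal type and a headless browser to catch scripts injected at runtime:

```python
import re

# Illustrative fingerprints only -- a real crawler would maintain a curated,
# regularly updated signature list for each monetisation signal category.
SIGNALS = {
    "adserver": re.compile(r"doubleclick\.net|adserver", re.I),
    "header_bidding": re.compile(r"prebid", re.I),
    "web_push": re.compile(r"serviceWorker\.register|onesignal", re.I),
}

def detect_signals(html: str) -> list:
    """Return the names of all signal categories whose pattern appears in the page."""
    return [name for name, pattern in SIGNALS.items() if pattern.search(html)]

sample = '<script src="https://cdn.example.com/prebid.js"></script>'
print(detect_signals(sample))  # -> ['header_bidding']
```

Scaling to hundreds of sites would then be a matter of feeding each rendered page through `detect_signals` and writing one row per URL to the report.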
I’m building a catalogue that focuses strictly on modification accessories (alat modifikasi) rather than general spare parts, and I want to start with two accessory lines: lighting systems, plus handlebars and mirrors. Your task is to help me locate and vet reliable suppliers or manufacturers for these items, gather full product specifications, pricing tiers, minimum order quantities, shipping terms, and clear photos, then organise everything in a spreadsheet I can act on immediately. If you can add brief notes on current market trends or best-selling SKUs in each category, that’s a plus. Deliverable • Spreadsheet listing at least five qualified suppliers per product group, complete with contact details, spec sheets, prices, MOQs, lead times, and image links. • One-page summary hi...
I'm planning a marketing campaign and need fresh, well-organized data from specific online databases. I'm interested in a complete record capture: text fields, links, and, if available, email contacts and phone numbers. Scope in brief: I'm looking for companies that exhibited at Cologne Fair. You configure the crawler/script, perform scraping, and cleaning. You take into account server restrictions (rate limit, pagination, captcha) – I want everything to be done ethically and without IP blocking. Then, for each company, I need to find an email address, categorize the company, and specify the language. To be considered, please include a detailed project proposal: what technology you will use, how you will ensure data accuracy and minimize duplicates, estimated implement...
I need the most recent 1,000 leads (so start from today and go back in time) for high-level managers related to AI, especially CEOs, CTOs, AI managers, and HR, with a maximum of 4 leads per startup. Deliver a CSV file with one row per person, so 1,000 rows. I need to find leads for the most recently funded USA AI startups (from today's date, counting down); use Tracxn, PitchBook, Crunchbase. Fields to provide in the CSV file: date of most recent funding; amount of most recent funding; weblink for source; Company Name; Company Website; Company Address; State; Country; Foundation Year; First Name; Last Name; private phone number (USA only); private email; company working phone number (USA only); company working email; link to LinkedIn profile. Also provide a screen copy from the source website listing the companies. I am open for...
I need a clean, repeatable scraper that pulls comprehensive product details directly from Sephora’s website. The data set has to cover far more than the basics: along with live price and stock status I also want the full product description, brand name, SKU code, pack or size information, and any promotional wording that appears on the page (discount banners, “limited-edition” tags, gift-with-purchase notes, etc.). Because this project centres on Sephora only, you can tune your approach to whatever will stay stable against their layout changes—Python with BeautifulSoup or Scrapy, Node with Puppeteer, or another stack you’re comfortable with—as long as the final script is documented and can be rerun on my side without a steep learning curve. Deliver...
Write a script that will pull a list of profiles (links to photos and basic profile info like location, age, name, etc.) from a dating app (id= on Google Play). Programming language doesn't matter. I am not interested in solutions that emulate the mobile app in any way, only direct API requests. More details: Is this project for business or personal use? Personal. What information should successful freelancers include in their application? Detailed project proposals. How soon do you need your project completed? Within a month
I want to scrape the state database of specific contractors in target states. Scrape for the mailing address and phone number
I am reopening this project with full scope clarity. Please only bid if you are experienced, serious, and able to complete the task end to end. ⚠ NO BOT BIDS ⚠ NO bait or placeholder bids (low bid, then higher price later) ⚠ DO NOT APPLY if you cannot identify real data sources for JS-rendered websites. CONTEXT: The website [] renders apartment listings dynamically. Help me find the correct data source for apartment listings and set up a sustainable automation system to ALERT ME IMMEDIATELY WHEN NEW LISTINGS APPEAR. I already have the following set up correctly: * Telegram bot and channel * GitHub Actions workflow (scheduled and manual runs) * GitHub repo with existing scripts (incorrect data source; the wiring needs fixing). I will share the full GitHub repository link immediately after awar...
I need a small, self-contained Python script that logs in to Telegram Web with Selenium or Playwright, opens a given group or channel, and reliably pulls back the information. No broader “bot-style” automation is required—just open, read, return the data, and quit cleanly. Please structure the code so the target link or @handle can be passed in as an argument and the results are printed or written to JSON; I’ll handle any further processing on my side. You’re free to choose Selenium or Playwright, but the solution must: • run on Python 3.x in a headless Linux environment • respect Telegram’s login flow (QR / code) without hard-coding cookies • be well-commented so I can tweak selectors later if Telegram updates its DOM If you’...
I want a small Windows-friendly script or program that captures the current cookies, refreshes or re-injects them before they lapse, and keeps my session alive for at least seven days. The workflow should be completely automated once I provide the initial login details or a valid cookie set. Whether you lean on Python with Selenium/Playwright or a compiled utility in another language is up to you; what matters is reliability on Windows and clear instructions so I can run it again whenever the site changes its expiry window. Deliverables • Source code and a ready-to-run executable or batch file for Windows I’m happy to test early builds and provide a test account so you can capture cookies
I am a realtor and wish to use AI to find the owners of homes using public government information and my existing n8n and ElevenLabs subscriptions, similar to what is mentioned in the podcast found in this link. I want to be able to use Google Maps, choose any house, find who the owner is, and combine it with a directory via n8n, storing the results in Excel files. This is a low-budget task.
I need the contact emails for the top 1000 military organizations including Gold Star mothers. I also need the contact emails for the top criminal justice reform organizations and United States legal law review publications. I also need the contact email address for all American jail and prison dating services including dating for felons.
I’m putting together a verified, permission-safe master list of high-profile email contacts and I need an experienced data researcher who can reliably source and structure the information for me. Here’s what I’m after, with the approximate volumes for each segment: • Largest 50,000 churches in the United States. For each record I just need an email address. I’m flexible on the final format—CSV or Excel both work—so choose whichever lets you keep the data clean, deduplicated, and ready for bulk import. Quality matters more than speed. Before delivery, please run a quick validation pass (e.g., Hunter, NeverBounce, or a similar tool) so that hard bounces stay under 5%. Once you respond, let me know: 1. Your approach to locating hard-...
I’m putting together a verified, permission-safe master list of high-profile email contacts and I need an experienced data researcher who can reliably source and structure the information for me. Here’s what I’m after, with the approximate volumes for each segment: • Every televangelist worldwide plus their associated churches • Top 1,000 Christian bands • Top 10,000 global preachers or pastors • 10,000 leading Wall Street market professionals • The world’s 1,000 wealthiest families • Every reigning royal family • Every known billionaire For each record I just need an email address. I’m flexible on the final format—CSV or Excel both work—so choose whichever lets you keep the data clean, dedu...
I am looking for a freelancer to manually or automatically scrape the website MonCompteFormation Your goal is to identify and extract training providers that meet strict criteria. Scope of work You must find all training programs that match the following conditions: Topics: English, Excel, Website creation, Photoshop Duration: between 15 and 30 hours Price: more than 100 € per hour Format: 100% online or distance learning Deliverables Provide a clean and structured file, Excel or CSV, including for each training: -Training provider name -Training title -Duration in hours -Hourly price -Direct link to the training page -Provider contact details if available, website, email, phone Requirements: -Strong experience in web scraping or manual data collection -Ability to handle dy...
I need a clean, well structured list of email addresses extracted directly from online websites and databases. The emails to extract include: 1. The contact email addresses for every country in the world including their government, military and intelligence force of each country. 2. Largest 10,000 Evangelical churches in the United States. 3. Largest 10,000 Mosques in the United States. 4. Largest 10,000 Churches in the world. 5. Largest 10,000 Mosques in the world. 6. Largest 10,000 African American churches in the United States. 7. 10,000 contact emails from all members of United States federal congress and their staff members. 8. 10,000 emails from the largest television and print media companies in the United States including companies and journalists. 9. 10,000 emails from the l...
I want to build simple scripts and tracker setups for scraping. The goal is to make a system that anyone can set up in one day to scrape jobs and other targeted information based on my strategy.
I need a clean, well-structured Excel file containing unique business emails for two specific roles at U.S. venture-capital firms: • the General Partner (or most senior investment decision-maker) • the finance lead, typically the CFO or Controller My target volume is up to 50,000 contacts. If that exact total is not realistic, let me know the maximum you can confidently verify and deliver within a reasonable timeline. Acceptable sourcing includes company websites, LinkedIn, and any reputable professional databases you already license or access. Please rely only on data you can validate; bounced or generic “info@” addresses will be rejected. No phone numbers or social-media links are needed—just correctly formatted email addresses plus each person’...
Please pull the entire public Mortgage Broker database (URL will be shared as soon as we start) and give me a clean CSV file. For each record I only need 4 fields: the licensee’s full name, their …, their cell-phone number, and city. If the site paginates or hides the phone behind an extra click, the script still has to collect it. I’m happy with a one-off delivery right now, yet I’d like the code kept tidy so we can run it every month without rewriting anything. Feel free to use Python with Scrapy, BeautifulSoup, Selenium—or any stack you trust—as long as: • The final CSV opens without errors in Excel. • Every live record on the site is captured once, no duplicates. • The script, a short README, and any environment notes are included. ...
Project: Spreadsheet Cleanup & Link Coordination (5,000 Rows, 2 Tabs) I’ve already started this project — the spreadsheet is partially organized and just needs to be reviewed, verified, and touched up. I’m looking for a detail-oriented VA to clean and standardize a Google Spreadsheet with approximately 5,000 entries across 2 tabs. This is a data cleanup / quality assurance task, not data entry, scraping, or research. ⸻ Scope of Work 1. Use Tab 1 as the master reference 2. Review and ensure names, handles, links, cities, and follower counts match correctly between Tab 1 and Tab 2 3. Update cities and follower counts accordingly based only on the master tab or existing spreadsheet data (do not guess) 4. Verify all links are correct, clickable, and working 5. C...
I need a Python-savvy content creator to produce a short course called “Introduction to Python Business Automation.” Scope • 5 tutorial videos, each 5–10 minutes. • Screen-recorded walkthroughs with clear voice-over narration (no on-camera shots or animation needed). Topics to cover 1. Data processing and analysis – simple CSV/Excel manipulation with pandas. 2. Web scraping – gathering data with requests and BeautifulSoup or similar. 3. Automating emails – generating and sending messages with smtplib. 4. Quick project that combines the above skills. 5. Recap and next steps for learners. What I expect • Well-structured scripts and live coding so beginners can follow line-by-line. • Clean, readable code and comme...
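For the smtplib topic (lesson 3), a common teaching pattern is to build the message with `email.message.EmailMessage` and keep the send step separate. All addresses and the SMTP host in this sketch are placeholders, not real endpoints:

```python
from email.message import EmailMessage

# Build the message first; sending is a separate, commented-out step below.
# All addresses here are placeholders for the tutorial.
msg = EmailMessage()
msg["From"] = "reports@example.com"
msg["To"] = "boss@example.com"
msg["Subject"] = "Daily summary"
msg.set_content("Attached are today's numbers.")

# Sending would then be (credentials are assumed to exist):
#   import smtplib
#   with smtplib.SMTP("smtp.example.com", 587) as server:
#       server.starttls()
#       server.login(user, password)
#       server.send_message(msg)
print(msg["Subject"])
```

Separating message construction from delivery lets learners run and inspect the first half without any SMTP credentials.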
I already have a spreadsheet listing dozens of YouTube channels—your job is to populate the next column with a verified contact email for each one. Start with the channel’s “About” tab; if nothing is public there, open every social link the creator has shared (Instagram, Twitter, Facebook, or any others) until you locate a valid address. Only write “Unavailable” after every linked account has been checked and no email is found. Accuracy is critical, so please copy each address exactly as displayed and keep the sheet clean, consistent, and free of extra characters or spaces. Ordinary verification tools are fine as long as the result is a genuine, working email. Deliverables • Updated spreadsheet returned within 7 days • One clearly formatt...
Every week I need 1,000 fresh, clean leads of professionals who actively work as financial advisors or planners. These contacts must come strictly from the finance industry—banking, wealth-management firms, broker-dealers, RIAs, insurance-based advisory groups, or similar environments. Required fields for every record • First & last name • Valid, deliverable email address (necessary); phone, address, or company details are optional but welcome if you happen to capture them at no extra cost. Compensation & cadence I pay $0.25 USD per contact. A full batch of 1,000 verified emails earns $250, released once my quick spot-check confirms bounce-free addresses and no duplicates against previous lists. I plan to repeat this order weekly, so reliability and speed m...
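A weekly delivery like this usually gets a first-pass cleanup before any paid verification runs. A minimal sketch (the regex is a syntax check only; true deliverability still needs an SMTP-level verifier, as the brief requires, and the sample addresses are made up):

```python
import re

# Loose syntax check -- not a deliverability test, just a cheap first filter.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def clean_batch(candidates, previously_delivered):
    """Normalise, syntax-check, and dedupe a batch against earlier deliveries."""
    seen = set(previously_delivered)
    batch = []
    for email in candidates:
        email = email.strip().lower()
        if EMAIL_RE.match(email) and email not in seen:
            seen.add(email)
            batch.append(email)
    return batch

print(clean_batch(["a@ria.com", "A@ria.com", "bad@@x"], previously_delivered=[]))
```

Running the previous weeks' lists through `previously_delivered` is what guards against the duplicate-against-prior-batches rejection mentioned above.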
We are seeking a Junior Python Developer to assist with tasks for a data analysis project. You must have practical knowledge of Django, Flask, and BeautifulSoup, as well as experience implementing social auth and writing automated tests. This role requires a commitment of approximately 4 hours per day. We are looking for a reliable freelancer who writes clean code and can start immediately. Please place your bid with examples of your relevant work.
My e-commerce catalogue has grown to roughly five million product pages and I need a Python-based workflow that can: • Stream-generate standards-compliant XML sitemaps, split and indexed so every URL is covered without breaching the 50 000-URL/50 MB limits. • Automatically ping Google, Bing and Yahoo once the files are in place, making use of the Search Console and Bing Webmaster APIs where possible. • Run from the command line (or a simple cron) and finish with a short log that confirms file locations and submission status. Please structure the code so it writes each URL sequentially to disk rather than keeping the full set in memory, add lastmod timestamps pulled from our database (or a stub I can swap in later), and include optional gzip compression. Deliverables 1. We...
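The streaming requirement above can be met by writing each URL to the current chunk file and rolling over at the 50,000-URL cap, so the full five-million-URL set never sits in memory. A sketch under the brief's assumptions (lastmod, gzip, and submission are omitted; note that Google retired its sitemap-ping endpoint in 2023, so Google submission now goes through the Search Console API):

```python
from pathlib import Path

MAX_URLS = 50_000  # per-file cap from the sitemap protocol (50 MB uncompressed also applies)

def write_sitemaps(urls, out_dir="sitemaps", prefix="sitemap", max_urls=MAX_URLS):
    """Stream URLs to disk in <=max_urls chunks; returns the Paths written."""
    Path(out_dir).mkdir(exist_ok=True)
    files, f, count, part = [], None, 0, 0
    for loc in urls:
        if f is None or count == max_urls:
            if f is not None:          # close the previous chunk
                f.write("</urlset>\n")
                f.close()
            part += 1
            count = 0
            path = Path(out_dir) / f"{prefix}-{part}.xml"
            f = open(path, "w", encoding="utf-8")
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            files.append(path)
        f.write(f"  <url><loc>{loc}</loc></url>\n")  # lastmod would be added here
        count += 1
    if f is not None:
        f.write("</urlset>\n")
        f.close()
    return files

written = write_sitemaps(f"https://example.com/p/{i}" for i in range(3))
print([p.name for p in written])  # -> ['sitemap-1.xml']
```

Feeding the generator from a database cursor keeps the whole run at constant memory; a sitemap-index file listing the chunks would be generated the same way.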
I need a compact desktop program written in Python that can automatically scrape data from a predefined set of web pages, clean or transform that information, and store it in a tidy format (CSV or SQLite—whichever you feel is more robust for the job). A simple GUI is required so I can paste in or update target URLs, press a single button, and watch the progress log inside the window while the app does the heavy lifting. Core workflow • Fetch the pages I specify, even if they sit behind basic HTTP auth or require a user-agent header. • Parse the relevant tables, lists, or JSON responses and strip out anything I don’t need. • Save the cleaned dataset locally, overwriting or versioning based on a toggle in the settings pane. I’m comfortable with widely...
I have a spreadsheet ready and waiting; each row lists a YouTube channel, and the empty column beside it needs a valid contact email address. Here’s how I’d like you to work: Begin with the channel’s own YouTube “About” tab and pull the publicly displayed email (yes, please clear the reCAPTCHA if it appears). If that tab shows nothing, jump to any linked Instagram profile first and look for the email button or an address in the bio. Still empty? Follow any other linked social accounts—Twitter, Facebook, whatever is available—and grab the first genuine email you find. Only mark a row “Unavailable” after every linked profile has been checked. Deliverable • My original spreadsheet returned with one verified email per channe...
I need someone who can translate each workday into precise Harvest entries for one executive—me. It’s a blend of billable and non-billable time covering meetings, email work, and project tasks. I’ll send you a brief daily email summarising what happened, and you’ll turn that into properly coded, start-to-finish time blocks inside Harvest. Typical effort is well under an hour a day, adding up to about 20 hours per month. Deliverables • Harvest timesheet updated by the next business morning (U.S. Eastern). • One-line confirmation back to me, flagging anything unclear for quick correction. Accuracy matters because the data feeds client invoices and internal reporting, so prior familiarity with Harvest—or similar tools like Toggl or ...
I need to turn roughly 500 eBay purchases from the last twelve months into a clean, reliable Excel file. eBay only lets me export sales data, so these purchase records have to be captured manually or scraped. Here’s what the finished workbook must contain: • Columns—Date of Purchase, Item Name, Seller Information, Price (GBP). • Every row arranged chronologically so the file drops straight into my existing sales-tracking workbook. Feel free to add a Month column or any other fields that come naturally from the export (order number, item URL) if it helps later analysis. What matters most is that the figures and seller details match exactly what appears in my eBay account; I’ll spot-check several entries against the site. You can either gather the in...
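The chronological ordering and optional Month column described above are straightforward once each purchase is a record with a parseable date. A sketch with illustrative rows (the real data would be transcribed or scraped from the eBay account, and the date format may differ):

```python
from datetime import datetime

def tidy_purchases(rows, date_fmt="%Y-%m-%d"):
    """Sort purchase records chronologically and derive the optional Month column."""
    rows = sorted(rows, key=lambda r: datetime.strptime(r["Date of Purchase"], date_fmt))
    for r in rows:
        r["Month"] = datetime.strptime(r["Date of Purchase"], date_fmt).strftime("%B")
    return rows

# Illustrative records only -- column names follow the brief, values are made up.
purchases = [
    {"Date of Purchase": "2024-03-09", "Item Name": "Cable",
     "Seller Information": "shop_a", "Price (GBP)": "4.99"},
    {"Date of Purchase": "2024-01-21", "Item Name": "Mouse",
     "Seller Information": "shop_b", "Price (GBP)": "12.50"},
]
print([(r["Date of Purchase"], r["Month"]) for r in tidy_purchases(purchases)])
```

The sorted records can then be written straight to CSV/Excel so the file drops into the existing sales-tracking workbook.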
I already sell successfully on Amazon and Noon, and now I’m ready to take my entire sunglasses range to Trendyol UAE. All the information you’ll need—titles, detailed specs, prices, SKUs and high-resolution images—is neatly organised in an Excel file, so your focus is purely on getting the data into Trendyol’s system without errors. The catalogue covers every style we carry: Men’s Sunglasses, Women’s Sunglasses and Unisex Sunglasses. Because the copy and keywords are finalised, there’s no optimisation work required; I just need accurate category placement, attribute selection, image uploads and any mandatory compliance fields completed so each listing goes live without delays. Once the items are published, send me a quick confirmation sheet...
Freelancer Requirement: Email Outreach for US Wedding Videographers
Job Description
We are looking for a freelancer who can help us find and reach out to wedding videographers and small wedding studios in the USA to introduce our wedding video editing services. This role is focused on manual research and personalized email outreach, not spam or bulk blasting.
Key Responsibilities
• Find wedding videographers and studios in the USA
• Collect verified contact details, including:
  • Business name
  • Owner name (if available)
  • Email address
  • Instagram profile
  • Website link
• Send short, personalized outreach emails introducing our services
• Maintain a simple Google Sheet with outreach status
• Follow up once, if required
...
Please gather verified contact information from vendors in open-air markets through direct, door-to-door outreach rather than scraping online directories, business websites, or social media. The data set must include for each business:
• Phone number
• Email address
• Physical (stall or shop) address
Record everything in a clean Excel or Google Sheet, one row per vendor, with separate columns for Business Name (if available), Phone, Email, and Address. Accuracy is essential, so double-check spellings and digits on site before you log each entry. Once complete, share the spreadsheet and any field notes you kept while collecting the information so I can cross-reference if needed.
I need a small script that will visit a specific site (I will share the URL and login details after award), locate every file that meets my criteria, and download those files to a local folder. The job is a single run only; once the files are saved, the task is complete. A lightweight Python solution is ideal; standard libraries such as requests, BeautifulSoup, or Selenium are all fine as long as the finished script:
• Logs in or navigates past any simple gating the site uses
• Crawls the relevant pages and identifies the target files (PDFs and ZIPs in my case)
• Saves each file locally, keeping the original filename intact
• Produces a short README so I can rerun the scraper later if needed
Please keep the code clean, commented, and fully self-contained—...
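The crawl-and-identify step of a script like this can be sketched with only the standard library. Everything here is a placeholder-driven sketch: the sample HTML and `https://example.com` base URL stand in for the real site, and the login and download steps (which depend on the actual site) are described in comments rather than executed.

```python
# Sketch of the link-finding core of a one-run downloader.  The real site's
# URL and login form are shared after award, so this demo runs offline on a
# stand-in HTML snippet instead of a live page.
from html.parser import HTMLParser
from urllib.parse import urljoin

TARGET_EXTENSIONS = (".pdf", ".zip")

class FileLinkFinder(HTMLParser):
    """Collect anchor hrefs that point at the target file types."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if href.lower().endswith(TARGET_EXTENSIONS):
            self.links.append(urljoin(self.base_url, href))

def find_file_links(html, base_url):
    finder = FileLinkFinder(base_url)
    finder.feed(html)
    return finder.links

# Offline demo on a snippet standing in for a crawled page:
sample = '<a href="/docs/report.pdf">Report</a> <a href="/about">About</a>'
print(find_file_links(sample, "https://example.com"))
# → ['https://example.com/docs/report.pdf']

# In the real run: log in with a session (e.g. requests.Session), fetch each
# relevant page, pass its text to find_file_links(), then stream each URL to
# disk keeping the original filename from url.rsplit("/", 1)[-1].
```

With requests and BeautifulSoup available, `soup.find_all("a", href=True)` would replace the `HTMLParser` subclass, but the overall flow (login, crawl, filter by extension, save) stays the same.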
I need a reliable partner who can pull complete vehicle information straight from the mParivahan portal the moment I send a registration number. Timeliness is critical—I expect each lookup to be returned in real time, not in batch files hours later. Please place every result into one continuously updated Excel sheet. Each row should at least contain: insurance expiry date, fitness validity, PUC (pollution) date, and the mobile number listed on the record. If you’re able to include extra fields such as owner name, registration date, or model, that will only strengthen our collaboration. Accuracy must match what appears on mParivahan, and the file should stay properly formatted so I can run simple look-ups and filters without manual cleanup. My target rate is ₹0.50 for ever...
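The "one continuously updated sheet" deliverable above maps naturally to an append-per-lookup pattern. This is a hedged sketch only: the file name, column names, and the sample record are stand-ins (the real values must match what mParivahan displays), and it writes CSV with the stdlib, though an .xlsx maintained via a library such as openpyxl would follow the same append logic.

```python
# Sketch: append each lookup result to one running sheet the moment it
# arrives, writing the header row only on first use.  All values below
# are hypothetical stand-ins, not real vehicle data.
import csv
import os

SHEET = "vehicle_lookups.csv"  # placeholder filename
COLUMNS = ["Registration No", "Insurance Expiry", "Fitness Validity",
           "PUC Date", "Mobile Number"]

def append_lookup(row):
    """Append one lookup result; create the header if the sheet is new."""
    new_file = not os.path.exists(SHEET)
    with open(SHEET, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=COLUMNS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

append_lookup({"Registration No": "MH12AB1234",
               "Insurance Expiry": "2025-08-31",
               "Fitness Validity": "2026-01-15",
               "PUC Date": "2025-02-28",
               "Mobile Number": "9XXXXXXXXX"})
```

Extra fields such as owner name or model would simply be added to `COLUMNS` and each row dict, so the sheet stays filter-friendly without manual cleanup.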