Responsibilities
• Lead projects end-to-end. Meet with business users to determine requirements, analyze the data lake for relevant datasets, collaborate with other developers to design a technical solution, and see the project through to completion.
• Design and build analytical workflows that take data from the lake and transform, filter, aggregate, and compute a meaningful result or report for use by a consumer application
• Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels
• Continuously evaluate new technologies; innovate and deliver solutions for business-critical applications
Preferred Skills
• SDLC Methodology - Agile / Scrum / Iterative Development
• Job Scheduling Tools (Autosys)
• Version Control System (Git, Bitbucket)
• Continuous Integration / Continuous Delivery (CI/CD) pipelines (Jenkins)
• Real-Time Streaming (Kafka)
• Visual Analytics Tools (Tableau)
• NoSQL Technologies (HBase)
Timings: 2-3 hours on weekdays
Experience: 3-4 years
Open to individual applicants from India only
To be involved in work where I can apply my skills and creativity to implementing DevOps activities in an AWS cloud computing environment, and to support AWS cloud at the enterprise level in a way that effectively contributes to the growth of the organization.