Roles And Responsibilities

  • Partner with our client teams and Engineering and IT functions to create clear requirements for build needs, and convey that vision to multiple scrum teams.
  • Demonstrate a deep understanding of the technology stack and its impact on the final product.
  • Collaborate with the architecture team to design optimal solutions that meet business needs.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Understand and work with multiple data sources to meet business rules and support analytical needs.

Minimum Requirements

  • Bachelor’s degree in Computer Science or another “STEM” major (Science, Technology, Engineering, and Math).
  • Minimum of 5 years of relevant experience.


Desired Qualifications

  • Good understanding of cloud computing and big data concepts.
  • Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems.
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytical skills for working with structured, semi-structured, and unstructured datasets.
  • A successful track record of manipulating, processing, and extracting value from large datasets.
  • Experience with big data tools: Hadoop, Hive, Spark, Kafka, etc.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with any of the major cloud platforms: AWS, GCP, or Azure.
  • Experience with object-oriented or functional scripting languages: Python, Java, Scala, etc.

Location: This is a remote-first role; however, if the client mandates work from office, the candidate will need to relocate.

Kindly submit your resume, including your name and phone number. Only PDF, DOCX, or plain-text files are acceptable.