
Sr. Data Engineer (Databricks/BigQuery)

Location: Hyderabad/Bangalore

Work Experience: 8-12 years

Requirements:

  • Key Skills: Databricks, BigQuery, ETL/ELT, Snowflake, SQL, Python.
  • Hands-on experience in data engineering, building production-grade ETL/ELT pipelines and data products. 
  • Strong proficiency in Python and SQL, with practical experience using PySpark or similar frameworks for large-scale data processing. 
  • Experience with modern cloud data platforms such as Google BigQuery, Databricks, Snowflake, or Redshift.
  • Solid understanding of streaming and messaging patterns (e.g., Pub/Sub, Kafka) and how to simulate or process event streams. 
  • Proven experience implementing data quality checks, validation rules, and automated testing for data pipelines. 
  • Familiarity with orchestration tools (Airflow, Databricks Jobs, or similar) and with version control and CI/CD practices.
  • Ability to design clear JSON/REST payloads and collaborate with application/platform teams on integration contracts. 
  • Strong communication and documentation skills; comfortable working with distributed teams and client stakeholders on fast-moving PoV timelines.
  • Prior experience integrating data platforms with ServiceNow, especially for ITSM, FSM, or SecOps analytics use cases (Preferred).
  • Domain exposure to industrial IoT / predictive maintenance and/or cybersecurity analytics such as threat detection and risk scoring (Preferred).
  • Experience with data observability or data quality tools (e.g., Acceldata, Monte Carlo, Ataccama) and basic monitoring dashboards (Preferred).
  • Hands-on work with ML feature engineering, scoring pipelines, or simple model hosting to simulate prediction outputs (Preferred).
  • Experience creating reusable PoC/PoV accelerators: template datasets, notebooks, configuration scripts, and documentation (Preferred).
  • Comfort using AI productivity tools (Cursor, Copilot, Claude, etc.) to accelerate development and documentation (Preferred). 
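To illustrate the data quality expectations listed above, the sketch below shows a minimal pipeline validation step in plain Python. The field names, bounds, and sample records are hypothetical placeholders, not an actual project schema:

```python
def validate_events(events, required_fields, numeric_ranges):
    """Split raw events into valid records and rejects with reasons.

    required_fields: field names every event must carry.
    numeric_ranges: {field: (low, high)} inclusive bounds.
    """
    valid, rejected = [], []
    for event in events:
        errors = [f"missing {f}" for f in required_fields if f not in event]
        for field, (low, high) in numeric_ranges.items():
            value = event.get(field)
            if value is not None and not (low <= value <= high):
                errors.append(f"{field} out of range")
        if errors:
            rejected.append((event, errors))
        else:
            valid.append(event)
    return valid, rejected

# Hypothetical telemetry records; device_id and temperature are assumed names.
events = [
    {"device_id": "d-01", "temperature": 72.5},
    {"device_id": "d-02", "temperature": 412.0},  # out of range
    {"temperature": 70.1},                        # missing device_id
]
ok, bad = validate_events(
    events,
    required_fields=["device_id", "temperature"],
    numeric_ranges={"temperature": (-40.0, 150.0)},
)
```

In production the same checks would typically run as PySpark column expressions or inside a test harness, but the valid/reject split with recorded reasons is the core pattern.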

Qualifications: Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.

Job Description:

  • Design and build synthetic data sets and data contracts for Retail Predictive Maintenance and Proactive Cyberthreat Investigation use cases on BigQuery and Databricks. 
  • Develop batch and streaming data pipelines (Python / SQL / PySpark) to simulate telemetry, security events, model scores, and alert payloads feeding ServiceNow. 
  • Implement reusable event generation logic and REST-ready payloads that integrate cleanly with ServiceNow Incident, SecOps, and FSM workflows. 
  • Ensure strong data quality through validation rules, test harnesses, and basic observability (logging, monitoring, error handling) for all PoV assets. 
  • Optimize pipelines for performance and cost, and package notebooks/scripts so they can be easily re-used for future customer PoCs. 
  • Collaborate closely with the ServiceNow Platform Engineer and AI Engineer during build and demo cycles to troubleshoot data issues and refine scenarios.
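As an illustration of the simulated alert payloads described above, here is a minimal sketch of an event generator producing a REST-ready JSON body. The field names (short_description, urgency, u_risk_score, etc.) loosely mirror common ServiceNow incident fields but are placeholders, not a published integration contract:

```python
import json
import random
from datetime import datetime, timezone

def make_alert_payload(device_id, score, rng):
    """Build a demo alert payload for a predictive-maintenance scenario.

    score is a hypothetical model risk score in [0, 1]; telemetry values
    are drawn from the seeded generator so demo runs are reproducible.
    """
    urgency = "1" if score >= 0.8 else "2" if score >= 0.5 else "3"
    return {
        "short_description": f"Predicted failure risk for {device_id}",
        "urgency": urgency,
        "u_risk_score": round(score, 3),
        "u_telemetry": {
            "vibration_mm_s": round(rng.uniform(0.1, 12.0), 2),
            "temperature_c": round(rng.uniform(20.0, 95.0), 1),
        },
        "opened_at": datetime.now(timezone.utc).isoformat(),
    }

rng = random.Random(42)  # fixed seed keeps PoV demos repeatable
payload = make_alert_payload("pump-017", 0.91, rng)
body = json.dumps(payload, indent=2)  # REST-ready JSON body
```

Packaging generators like this as parameterized notebooks or scripts is what makes them reusable across future customer PoCs.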