Qualifications:
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
Comfort using AI productivity tools (Cursor, Copilot, Claude, etc.) to accelerate development and documentation (preferred).
Job Description:
Design and build synthetic data sets and data contracts for Retail Predictive Maintenance and Proactive Cyberthreat Investigation use cases on BigQuery and Databricks.
Develop batch and streaming data pipelines (Python / SQL / PySpark) to simulate telemetry, security events, model scores, and alert payloads feeding ServiceNow.
Implement reusable event generation logic and REST-ready payloads that integrate cleanly with ServiceNow Incident, SecOps, and FSM workflows.
Ensure strong data quality through validation rules, test harnesses, and basic observability (logging, monitoring, error handling) for all PoV assets.
Optimize pipelines for performance and cost, and package notebooks/scripts so they can be easily reused in future customer PoCs.
Collaborate closely with the ServiceNow Platform Engineer and AI Engineer during build and demo cycles to troubleshoot data issues and refine scenarios.
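To illustrate the kind of synthetic-event work described above, here is a minimal Python sketch that generates a fake predictive-maintenance telemetry reading and wraps it as a REST-ready payload. All field names, thresholds, and value ranges are illustrative assumptions, not an actual ServiceNow table schema or customer data contract.

```python
import json
import random
from datetime import datetime, timezone

def generate_telemetry_event(device_id: str) -> dict:
    """Simulate one predictive-maintenance telemetry reading for a retail device.

    Value ranges are made up for demo purposes.
    """
    return {
        "device_id": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature_c": round(random.uniform(20.0, 90.0), 2),
        "vibration_mm_s": round(random.uniform(0.1, 12.0), 2),
        "error_count": random.randint(0, 5),
    }

def to_incident_payload(event: dict, failure_score: float) -> dict:
    """Wrap a telemetry event plus a model score as a REST-ready incident payload.

    Field names here are hypothetical placeholders, not a real ServiceNow schema.
    """
    return {
        "short_description": f"Predicted failure risk {failure_score:.0%} on {event['device_id']}",
        "urgency": "1" if failure_score > 0.8 else "2",
        "description": json.dumps(event),
    }

event = generate_telemetry_event("POS-0042")
payload = to_incident_payload(event, failure_score=0.91)
print(payload["short_description"])
```

Keeping the event generator separate from the payload formatter mirrors the reusability goal above: the same generator can feed batch files, a streaming simulator, or different downstream workflow payloads.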