Senior Data Engineer – Data Platform & Analytics (Snowflake, Kafka, React)

Tuple


Our Client - Hospital & Health Care company

  • Sunnyvale, CA
$40.00 - $45.83/hour
Exact compensation may vary based on skills, experience, and location.
40 hrs/wk
Contract (w2)
Remote work partially (20%)
Travel not required
Start date: April 28, 2026
End date: April 28, 2027
Superpower
Technology
Capabilities
Business Intelligence
Technology Architecture
Software Development
Preferred skills
Terraform
React.js (JavaScript Library)
Data Engineering
Apache Kafka
Data Modeling
SQL (Programming Language)
Snowflake (Data Warehouse)
Data Visualization
Apache Airflow
Preferred industry experience
Hospital & Health Care
Experience level
9+ years of experience

Job description

***Please note that our customer is currently not considering applicants from the following locations: Alabama, Arkansas, Delaware, Florida, Indiana, Iowa, Louisiana, Maryland, Mississippi, Missouri, Oklahoma, Pennsylvania, South Carolina, and Tennessee.***


Our Customer is a corporation that develops, manufactures, and markets robotic products designed to improve clinical outcomes for patients through minimally invasive surgery. Founded in 1995 with the goal of creating innovative, robotic-assisted systems that empower doctors and hospitals to make surgery less invasive than an open approach, the company works with top medical professionals to develop new, minimally invasive surgical platforms and future diagnostic tools that help solve complex healthcare challenges around the world.


We are seeking a Senior Data Engineer – Data Platform & Analytics (Snowflake, Kafka, React) on a contract basis to support our Customer's business needs. This role is hybrid (4 days on-site and 1 day remote per week) in Sunnyvale, CA.


This position will design, build, and operate a modern data platform that powers engineering and manufacturing analytics. This role owns the full stack: ingesting and modeling data in Snowflake, implementing business logic in data schemas and services, and developing React-based front-end applications that expose this data to users through interactive dashboards and tools. The position partners closely with stakeholders to ensure data structures, APIs, and visualizations align with business needs.



Responsibilities:

  • Architect, develop, and maintain scalable data pipelines for real-time and batch processing
  • Design and build interactive, modular data visualizations using modern JavaScript frameworks and Python-based tools
  • Gather and translate business requirements from Engineering, Manufacturing, Analytics, and Finance into technical solutions
  • Ensure platform reliability through clean, typed code, documentation, version control, and testing practices
  • Implement CI/CD pipelines, infrastructure-as-code (e.g., Terraform), and observability frameworks
  • Present dashboards, metrics, and insights clearly to stakeholders to support decision-making
  • Standardize data schemas, metrics, and contracts across cloud and edge/manufacturing environments
  • Enable self-service analytics by building flexible tools and data solutions for business users
  • Integrate cloud data technologies and open-source platforms to deliver secure, scalable analytics solutions


Skills and Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or related field
  • Specialization in BI, Data Engineering, Full Stack Development, or Enterprise Systems preferred
  • 8-14 years of experience in BI, Data Engineering, Backend, or Full Stack development
  • Strong experience with cloud data platforms, streaming technologies, and orchestration tools (Kafka, AWS, Spark/Flink, Airflow or alternatives, dbt, Terraform, CI/CD)
  • Proficiency in React.js, HTML5, and CSS3 for building data-driven front-end applications
  • Backend development experience using Node.js or Python to build APIs and services
  • Strong expertise in SQL, data modeling, and query optimization
  • Experience implementing testing frameworks, documentation standards, and operational runbooks
  • Experience working with manufacturing data and building related analytics/visualizations
  • Strong collaboration and communication skills with the ability to translate business needs into technical solutions

Preferred Qualifications:

  • Experience with open-source BI/visualization tools such as Streamlit or Superset
  • Familiarity with time-series databases or process historians (e.g., Aveva Historian)
  • Experience enabling self-service analytics and business user reporting
  • Experience with data governance, schema evolution, and metric contract management



We offer a competitive salary range for this position. Most candidates who join our team are hired at the median of this range, ensuring fair and equitable compensation based on experience and qualifications.


Contractor benefits are available through our 3rd-party Employer of Record (available upon completion of a waiting period for eligible engagements). Benefits include: Medical, Dental, Vision, and 401k.


We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.

All applicants applying for U.S. job openings must be legally authorized to work in the United States and are required to have U.S. residency at the time of application.

If you are a person with a disability needing assistance with the application, or at any point in the hiring process, please contact us at support@themomproject.com.