Senior Software Engineer – Data Platform & Streaming (Java, Python, AWS)

Tuple


Our Client - Hospital & Health Care company

  • Remote
  • $72.98 - $87.98/hour (exact compensation may vary based on skills, experience, and location)
  • 40 hrs/wk
  • Contract (W2)
  • Remote work: yes (100%)
  • Travel: not required
  • Start date: December 8, 2025
  • End date: November 8, 2026
Superpower: Technology

Capabilities
  • Technology Product Management
  • Technology Architecture
  • Software Development
  • Technical Program/Project Management

Preferred skills
  • Apache Kafka
  • Python (Programming Language)
  • Terraform
  • Apache Flink
  • Infrastructure as Code (IaC)
  • Snowflake (Data Warehouse)
  • Kubernetes
  • Java (Programming Language)
  • Distributed File Systems
  • Data Streaming
  • Data Engineering
  • Cloud Computing
  • AWS Lambda
  • Amazon Web Services

Preferred industry experience: Hospital & Health Care

Experience level: 5 - 8 years of experience

Job description

***Please note that our customer is currently not considering applicants from the following locations: Alabama, Arkansas, Delaware, Florida, Indiana, Iowa, Louisiana, Maryland, Mississippi, Missouri, Oklahoma, Pennsylvania, South Carolina, and Tennessee.***


Our Customer is a corporation that develops, manufactures, and markets robotic products designed to improve clinical outcomes for patients through minimally invasive surgery. Founded in 1995, the company set out to create innovative, robotic-assisted systems that empower doctors and hospitals to make surgery less invasive than an open approach. Working with top medical professionals, they continue to develop new minimally invasive surgical platforms and future diagnostic tools to help solve complex healthcare challenges around the world.


We are seeking a Senior Software Engineer – Data Platform & Streaming on a contract basis to support our Customer's business needs.


This position is based on-site in Sunnyvale, CA and is open to qualified candidates nationwide who are willing to relocate on their own.


Responsibilities
  • Design and re-architect large-scale data ingestion and processing pipelines, transitioning from batch/file-based to streaming architectures.
  • Build and maintain backend services and data pipelines using Java, Python, AWS, Kafka, and Flink (an illustrative sketch follows this list).
  • Migrate data transformation logic from Snowflake to AWS to improve performance, reliability, and cost efficiency.
  • Integrate existing AWS-based ingest systems with downstream Kafka/Flink streaming platforms.
  • Own production systems end-to-end, including code, deployment, and infrastructure defined via IaC.
  • Collaborate with platform, data, and application teams on design, reviews, and delivery.
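
For context only, the following is a minimal sketch of the kind of Kafka-to-Flink streaming job described in the responsibilities above, written in Java using Flink's KafkaSource connector. The broker address, topic name, consumer group, and the trivial map step are hypothetical placeholders, not details taken from this posting.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class IngestStreamJob {
    public static void main(String[] args) throws Exception {
        // Local or cluster execution environment.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source; broker, topic, and group id are placeholder values.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker:9092")
                .setTopics("ingest-events")
                .setGroupId("ingest-stream-job")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-ingest");

        // Placeholder transformation: a real pipeline would parse, enrich,
        // and route records to downstream sinks rather than print them.
        events.map(String::toUpperCase).print();

        env.execute("ingest-stream-job");
    }
}

In practice the print sink would be replaced by a Kafka sink or a table-format writer, and the job's deployment would be defined alongside the rest of the infrastructure as code.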


Skills and Qualifications
  • 5+ years of professional experience as a software engineer in backend, data, or platform-focused teams.
  • Strong, production-level coding skills in Java and Python (required).
  • Proven experience designing, building, and operating data pipelines in distributed systems.
  • Hands-on experience with streaming and event-driven architectures (Kafka, Flink, or similar).
  • Solid experience with AWS and cloud-native services (e.g., S3, Lambda, DynamoDB, ECS/EKS); a brief Lambda sketch follows this list.
  • Experience working with Snowflake or other cloud data warehouses.
  • Strong understanding of distributed systems, data reliability, latency, and cost optimization.
  • Experience with CI/CD pipelines and modern software delivery practices, including hands-on use of tooling such as GitLab CI or similar systems.
  • Experience using Infrastructure as Code (IaC) and cloud provisioning tools such as Terraform, following IaC best practices.
  • Working knowledge of Kubernetes, other CNCF projects, and containerized application deployment in production environments.
  • Proficiency with SQL and relational data models.
  • Ability to take end-to-end ownership of production systems and collaborate cross-functionally.
  • Experience working on a platform, backend, or data engineering team supporting multiple internal consumers.
  • Exposure to Apache Iceberg or modern table formats used in large-scale data platforms.
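
As a hedged illustration of the AWS and Lambda experience listed above, the sketch below shows a minimal Java Lambda handler reacting to S3 object-created events, roughly the shape of a file-based ingest entry point before data is handed off to a streaming platform. The class name and the logging-only body are hypothetical placeholders.

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

// Minimal handler for S3 object-created notifications; illustrative only.
public class IngestObjectHandler implements RequestHandler<S3Event, String> {

    @Override
    public String handleRequest(S3Event event, Context context) {
        event.getRecords().forEach(record -> {
            String bucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey();
            // Placeholder: a real handler would validate the object and
            // publish a pointer or payload to Kafka or another downstream system.
            context.getLogger().log("received s3://" + bucket + "/" + key);
        });
        return "ok";
    }
}

A real deployment would attach this handler to an S3 event notification via IaC and replace the log statement with a publish to the streaming platform; permissions and error handling are omitted for brevity.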


We offer a competitive salary range for this position. Most candidates who join our team are hired at the median of this range, ensuring fair and equitable compensation based on experience and qualifications.


Contractor benefits are available through our third-party Employer of Record (available upon completion of the waiting period for eligible engagements). Benefits include medical, dental, vision, and 401(k).


An Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.

All applicants applying for U.S. job openings must be legally authorized to work in the United States and are required to have U.S. residency at the time of application.

If you are a person with a disability needing assistance with the application, or at any point in the hiring process, please contact us at support@themomproject.com.