[Hiring] Data Platform Engineer REMOTE USA

Position: Data Platform Engineer

Date Posted: 17 January 2026

Industry: Information Technology / Data Engineering

Employment Type: Long-Term Contract (Remote)

Experience: 5+ Years

Qualification: Bachelor’s Degree in Computer Science, Engineering, or a related field (or equivalent professional experience)

Salary: $80 – $95 per hour (depending on experience)

Location: United States – REMOTE

Company: Piper Companies

Description:

📌 We’re Hiring – Data Platform Engineer | United States (Remote)

Piper Companies is looking for an experienced Data Platform Engineer to join a rapidly growing technology organization on a long-term contract with strong potential for conversion to a permanent role. This fully remote, US-based position focuses on designing, building, and operating a core data platform that supports large-scale integrations, analytics, and event-driven services.

Key Responsibilities:
• Own and continuously improve multi-stage data pipelines that ingest data from external sources such as APIs, direct database queries, and file-based feeds into centralized data lakes and warehouses.
• Develop and maintain event-driven services and ETL workflows to enrich, deduplicate, validate, and process data at scale.
• Design and enforce data contracts, schemas, and migration strategies to enable safe backfills, corrections, and alignment with legacy systems.
• Implement observability across data pipelines and jobs, including monitoring, alerting, lineage tracking, and operational runbooks.
• Operate, secure, and scale cloud-based data infrastructure on AWS using infrastructure-as-code best practices.
• Optimize data models and SQL queries to support high-performance analytical workloads.
• Collaborate closely with engineering and product teams to enable downstream integrations, APIs, and user-facing data access.
• Create and maintain documentation covering architecture, standards, and workflows to support a collaborative and growing platform team.

Qualifications:
• 5+ years of experience in backend and/or data engineering roles with direct ownership of production-grade data platforms.
• Strong hands-on experience building batch or event-driven data pipelines using Python, Node.js, or similar languages.
• Extensive experience operating AWS-based systems and provisioning infrastructure using Terraform.
• Advanced SQL expertise with proven ability to tune performance in MPP data platforms such as Redshift, Snowflake, or similar technologies.
• Demonstrated experience delivering reliable, idempotent data pipelines and validating outputs against legacy systems.
• Background working in regulated or highly governed environments such as healthcare or financial services.
• Strong understanding of security, compliance, and audit-readiness standards including SOC 2 and ISO 27001.
• Comfortable owning systems end-to-end and resolving complex production issues independently.

Preferred Experience:
• Workflow orchestration tools such as Airflow or equivalent platforms.
• Streaming and messaging technologies including Kafka, Kinesis, or SQS.
• Data modeling and transformation tools such as dbt.
• Experience with data quality, metadata management, and lineage tools.
• Exposure to real-time data enrichment or rules-based processing systems.

Compensation & Benefits:
• Hourly compensation of $80 to $95, based on experience (some flexibility may be available).
• Comprehensive benefits package including Medical, Dental, Vision, 401(k), and Sick Leave where required by law.

Disclaimer: The job details above are structured for clarity and based on publicly available content from recruiter/company pages. All rights remain with the original source; names may be withheld for confidentiality. We are not involved in the hiring process.