
Job Details

Principal Data Engineer

2026-05-07 | TechHuman | Springfield, MO
Description:

Note: This is a 6-month contract-to-hire, working 100% remote until converted to permanent employment. Once converted, this position will require 100% onsite work in Springfield, MO.

We're seeking a Principal Data Engineer to lead the development, enhancement, and delivery of data pipelines, analytics solutions, and Business Intelligence-related infrastructure. This role owns Snowflake- and GCP-based environments and partners closely with several business units to provide actionable insights and high-quality data products. The ideal candidate blends strong data engineering expertise with an understanding of how to turn business data into actionable insights that drive revenue and growth for our organization.

Responsibilities

  • Lead data engineering initiatives across eCommerce data environments, leveraging Snowflake, BigQuery, GCP Cloud Functions, and real-time streaming pipelines.
  • Oversee the design, creation, and optimization of ELT pipelines, particularly GFO (Google Analytics) data pipelines, powered by large-scale datasets and growing use cases.
  • Provide clear, actionable insights to other teams and deliver dashboards/presentations to leadership.
  • Manage offshore vendors, including performance oversight, deliverables, headcount approvals, and budget management.
  • Collaborate with Product Owners and Business Analysts to translate business requirements into scalable data solutions.
  • Ensure high-quality data ingestion, transformation, cleansing, and integration to support marketing campaigns and customer engagement strategies.
  • Support and influence analytical environments using tools such as Domo, Sigma, and upcoming BI platforms.
  • Drive architectural decisions related to Snowflake schema design, BigQuery usage, real-time data streaming, and ELT processes.

Must-Haves

  • 15+ years of experience in data engineering, including pipeline creation, ELT processes, data cleansing, and large-scale dataset management.
  • Professional experience building data pipelines in conjunction with BigQuery/Google Cloud Platform, OR hands-on experience with Snowflake (schema design, performance optimization, etc.).
  • Expertise with real-time data streaming platforms such as Kafka, NiFi, or comparable technologies.
  • Proven ability to deliver insights and dashboards to Marketing teams and present to leadership stakeholders.
  • Proven ability to manage offshore teams or global vendors, including budget and headcount oversight.
  • Professional expertise gained working within the retail industry.
  • Bachelor's degree in Computer Science or related field.

Nice-to-Haves

  • Background using or implementing BI tools such as Domo, Sigma, Luca, or Tableau.
  • Additional experience with real-time streaming tools or advanced cloud-native data services.
  • Google Analytics data specialization or advanced knowledge of marketing attribution data.


Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
