Senior Data Engineer
Published: 2025-11-14

Job details
Central America, Americas (sub-continent)
Remote
Freelance
At Teampathy, we connect top-tier software developers like you with exciting global opportunities. Our core purpose is to forge meaningful connections and deliver exceptional value, ensuring mutual success for both clients and developers. We achieve this by providing high-impact, cost-effective software solutions through efficient, quality-driven, agile processes, making us a trusted partner committed to lasting impact in the global tech industry.
If you're looking to connect with meaningful opportunities and be part of a company that values your expertise and growth, we invite you to explore our current openings.
Job Description

Overview
We are seeking a highly experienced Senior- or Principal-level Data Engineer for a long-term contract role. The primary focus of this position is to design, develop, and maintain robust data integration solutions that move and transform data from Oracle databases to various Google Cloud Platform (GCP) data stores. The ideal candidate is a hands-on expert in building scalable, reliable ETL/ELT pipelines and has a deep understanding of both traditional and cloud-native data architectures. You will play a critical role in shaping our data infrastructure and ensuring the timely and accurate availability of data for analytics and business operations.

Responsibilities
- Design, build, and maintain efficient and reliable ETL/ELT pipelines to integrate data from on-premise Oracle databases into Google Cloud Platform (GCP).
- Develop scalable data models and schemas in GCP data stores like BigQuery to support business intelligence and data science initiatives.
- Implement data quality checks, monitoring, and alerting to ensure the accuracy, completeness, and reliability of data pipelines.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Optimize and troubleshoot data pipelines for performance, scalability, and cost-effectiveness.
- Write clean, maintainable, and well-tested code for data transformations and pipeline orchestration.
- Contribute to the establishment of data engineering best practices, standards, and governance policies.
Qualifications
- 5+ years of professional experience in a data engineering role, with a proven track record of building and managing large-scale data pipelines.
- Expert-level proficiency in SQL and extensive experience with Oracle databases, including data extraction and performance tuning.
- Hands-on experience with core Google Cloud Platform (GCP) data services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer).
- Strong programming skills in Python for data processing and automation.
- In-depth understanding of data warehousing concepts, data modeling, and ETL/ELT design principles.
- Excellent problem-solving skills and the ability to work independently in a remote setting while collaborating effectively with a distributed team.
- Experience with Infrastructure as Code (IaC) tools, particularly Terraform, for managing cloud resources.
- Familiarity with CI/CD practices and tools (e.g., GitLab CI, Google Cloud Build) for automating data pipeline deployments.
- Google Cloud Professional Data Engineer certification.
- Experience with containerization technologies like Docker and Kubernetes.
- Knowledge of streaming data architectures and technologies.