Data Warehouse jobs

Data Integration Architect
P\S\L Group
Canada, Northern America (country)
Databricks
AWS
Architect
Lambda
ML Engineer
S3 Bucket
Cloud
Data Warehouse
Posted 1 day ago
Senior Consultant, Data Integration and BI
Myers-Holum
India, Southern Asia (country)
MySql
Cloud
GraphQL APIs
Sales
Video
Data Warehouse
Oracle
SQL
REST APIs
Business Intelligence
Posted 1 day ago
Independent Contractor, Data Integration and BI
Myers-Holum
Los Ángeles, Mexico (city)
$70k - $90k
SQL
GraphQL APIs
Data Warehouse
Oracle
Sales
Video
MySql
Cloud
REST APIs
Business Intelligence
Posted 1 day ago
Software Engineer
VGW
Sydney, Australia (city)
Back-end
Javascript
AWS
Jest
DevOps
Docker
Software engineer
Node.js
Cloud
Typescript
Data Warehouse
Kubernetes
Posted 1 day ago
Software Engineer
VGW
Perth, Australia (city)
AWS
React
Software engineer
Typescript
Kafka
Grafana
Back-end
DynamoDB
Terraform
Kotlin
Cloud
Java
Data Warehouse
Kubernetes
Posted 1 day ago
Senior Backend Engineer
Wellth
Los Angeles, United States (city)
$150k - $180k
Architect
Front-end
Node.js
Jenkins
Developer
GitHub
AI
ElasticSearch
Azure
Scrum
Typescript
Cloud
CircleCi
Datadog
Agile
API
Search
Back-end
Kafka
AWS
Data Warehouse
Terraform
GCP
Kubernetes
GraphQL APIs
Posted 2 days ago
Senior Data Engineer
CodePath
Northern America, Americas (sub-continent)
$110k - $150k
Data science
AI
Big Data Engineer
Cloud
Postgres
Data Warehouse
SQL
Business Intelligence
ML Engineer
Architect
Databricks
Posted 5 days ago
Senior Data Engineer
ResortPass
New York, United States (region)
$160k - $200k
REST APIs
Big Data Engineer
Cloud
Data Warehouse
AWS
SQL
Business Intelligence
Posted 9 days ago
AI Solutions Developer & Data Engineering Analyst
RPA
Santa Monica, United States (city)
$80k - $100k
AI
GPT
Cloud
No Code
LLM
SQL
Business Intelligence
API
Product Manager
REST APIs
Python
Data Warehouse
Prompt Engineer
ML Engineer
Developer
Posted 10 days ago
Azure Data Engineer
APAC
Jaipur, India (city)
AI
Shell
Data Warehouse
Cloud
SQL
Lambda
Big Data Engineer
Databricks
Python
Azure
Agile
Posted 11 days ago
Senior GCP Data Engineer - WPP Open: 6 month Contract
WPP
Singapore, South-Eastern Asia (country)
AI
Cloud
Jira
SQL
Business Intelligence
Marketing
Network
Agile
JSON
Big Data Engineer
Python
Data Warehouse
Bash
GCP
Posted 12 days ago
Staff Business Intelligence Engineer
Headspace
United States, Northern America (country)
$156k - $195k
Big Data Engineer
Databricks
SQL
Data Warehouse
Business Intelligence
S3 Bucket
Posted 13 days ago
Freelance Data Engineer
In The Pocket
Bucharest, Romania (city)
Terraform
AI
Big Data Engineer
Python
Cloud
Apache
Data Warehouse
SQL
Typescript
Git
GCP
Posted 14 days ago
Senior Data Analyst
WelbeHealth
California, United States (region)
$90k - $119k
Data science
Data Analyst
Data Warehouse
SQL
Business Intelligence
ML Engineer
Marketing
Posted 15 days ago
BI Architect
RxSense
Princeton, United States (city)
$120k - $140k
Business Intelligence
Data Warehouse
SQL
Cloud
Azure
SQL Server
AWS
C
Data science
Architect
DevOps
Posted 16 days ago
Data Engineer - Regulatory Reporting
Clear Street
New York, United States (region)
$140k - $190k
Python
Software engineer
Data Warehouse
SQL
Kubernetes
REST APIs
Cloud
Docker
Tech lead
DevOps
Posted 16 days ago
Data Analyst (d/f/m) at Solaris
Solaris
Chennai, India (city)
Cloud
Data Warehouse
SQL
Business Intelligence
Posted 17 days ago
Staff Data Engineer
LendingTree
Charlotte, United States (city)
$100k - $120k
AWS
MongoDB
Python
Marketing
ML Engineer
Data Warehouse
Agile
Java
Databricks
ElasticSearch
SQL
Big Data Engineer
Kafka
DynamoDB
Lambda
Cloud
API
Sales
Data science
Posted 17 days ago
Python and Kubernetes Software Engineer - Data, Workflows, AI/ML & Analytics
Canonical
Northern America, Americas (sub-continent)
$2k - $2k
ML Engineer
Kubernetes
Apache
AI
Python
Data science
AWS
Developer
Linux
Docker
Software engineer
Open Source
Cloud
Data Warehouse
Azure
Posted 17 days ago
Senior Data Engineer
Method, a GlobalLogic company
Poland, Eastern Europe (country)
Linux
Business Intelligence
API
GitHub
Oracle
Docker
REST APIs
Unix
AI
Databricks
Data Warehouse
Java
Shell
Cloud
Python
Bash
Spring Boot
S3 Bucket
SQL
Azure
Git
Posted 21 days ago
Staff Data Engineer
Sotheby's
London, United Kingdom (city)
Business Intelligence
SQL
HTML
AWS
Data Warehouse
Python
Sales
Network
Big Data Engineer
S3 Bucket
Architect
Posted 21 days ago
Lead Data Engineer (Snowflake/DBT)
Careers at Tide
Bengaluru, India (city)
GCP
Git
Agile
Big Data Engineer
Python
Data Warehouse
Business Intelligence
AWS
Scrum
SQL
Apache
Posted 28 days ago
Sr Data Engineer II (Analytics)
eClinical Solutions
Karnataka, India (region)
REST APIs
Cloud
SQL Server
C
PL/SQL
Big Data Engineer
Python
Business Intelligence
Oracle
Data Warehouse
SQL
Azure
HTML
AWS
Java
Posted 28 days ago
Senior Marketing Operations Manager
GoGuardian
United States, Northern America (country)
$100k - $115k
Business Intelligence
Marketing
Data Warehouse
Sales
Posted 1 month ago
Software Engineer, Infrastructure (8+ YOE)
Airtable
San Francisco, United States (city)
$196k - $339k
MySql
Data Warehouse
No Code
Redis
Back-end
DynamoDB
ML Engineer
Apache
Developer
Kafka
AI
Posted 1 month ago
IT Project Manager
NZXT, Inc.
Banqiao, Taiwan (city)
AWS
Business Intelligence
Data Warehouse
Cloud
SQL
ML Engineer
Project manager
Spring Boot
AI
Posted 1 month ago
Data Analyst (Contract)
Glide
United States, Northern America (country)
Marketing
Business Intelligence
Big Data Engineer
SQL
Sales
Python
Video
Data Analyst
Data Warehouse
No Code
Posted 2 months ago
Senior Software Engineer, Data
ACLU - National Office
New York, United States (region)
$161k - $161k
Big Data Engineer
Python
Software engineer
Data Warehouse
Agile
S3 Bucket
Cloud
Lambda
Azure
AWS
C
Terraform
Posted 2 months ago
Data Integration Architect
P\S\L Group
Published: 2025-10-07  •  Canada, Northern America (country)
Data Warehouse
AWS
ML Engineer
Architect
Cloud
Databricks
S3 Bucket
Lambda
On-site
Full-time
Our Vision

P\S\L Group is a global organisation dedicated to putting information at the service of medicine.  The companies and people of the P\S\L Group aim to improve medical care by serving those who need it, those who provide it and those who seek to improve it.

To this end, we want our information and education services to contribute to the goals we share with our clinicians, clients and supporters, namely: to accelerate the advancement of medicine and help people enjoy better, longer lives. 

Purpose

Our key contribution to society is to help clinicians and those who support them provide state-of-the-art medical care. 

Our primary business purpose is to help clients and supporters increase the effectiveness of activities pertaining to scientific communication, medical education and the maintenance of clinician audience intimacy.

Position Summary

We are seeking an experienced Data Integration Architect to lead the ETL architecture, design and modernization of end-to-end data pipeline solutions for our Data Management team. You will shape scalable, efficient and future-ready data solutions enabling advanced analytics, reporting and machine learning across the organization.

Key Responsibilities
  • Design and implement scalable, modular, reusable ETL/ELT solutions across Snowflake, AWS, PostgreSQL and future platforms such as dbt and Databricks
  • Ensure pipelines are optimized for performance and resource utilization, reducing cloud compute and storage costs
  • Define and document source-to-target mappings for structured and semi-structured data (a minimal sketch follows this list)
  • Standardize data integration patterns that improve throughput and reduce maintenance overhead
  • Embed compliance requirements into pipeline design
  • Ensure ETL/ELT processes scale seamlessly with business requirements
  • Tune workflows for higher concurrency and throughput without driving up costs
  • Create clear technical documentation, design patterns and reusable templates
  • Write pseudocode and technical specifications to accelerate development and ensure consistency across teams
  • Work closely with data engineers, analysts and business stakeholders to ensure solutions align with business value
  • Provide architectural guidance for integrating new data ecosystems such as dbt and Databricks
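
To make the source-to-target mapping bullet above concrete, here is a minimal Python sketch of a declarative, reusable mapping object that renders a Snowflake-style MERGE statement. It illustrates the modular pattern the posting describes, not P\S\L Group's actual tooling; every table, column and schema name in it is hypothetical.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ColumnMapping:
        source: str  # column name in the staged extract
        target: str  # column name in the warehouse table

    @dataclass(frozen=True)
    class SourceToTargetMapping:
        source_table: str            # hypothetical staging table (e.g. loaded from S3)
        target_table: str            # hypothetical Snowflake target table
        key_columns: list[str]       # business key used to join source and target
        columns: list[ColumnMapping]

        def to_merge_sql(self) -> str:
            """Render a Snowflake-style MERGE statement from the declarative mapping."""
            on_clause = " AND ".join(f"t.{k} = s.{k}" for k in self.key_columns)
            set_clause = ", ".join(f"t.{c.target} = s.{c.source}" for c in self.columns)
            insert_cols = ", ".join(c.target for c in self.columns)
            insert_vals = ", ".join(f"s.{c.source}" for c in self.columns)
            return (
                f"MERGE INTO {self.target_table} t\n"
                f"USING {self.source_table} s\n"
                f"  ON {on_clause}\n"
                f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
                f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals});"
            )

    # One mapping object doubles as documentation (the source-to-target spec) and as
    # the template the pipeline executes, which keeps the pattern reusable per source.
    orders = SourceToTargetMapping(
        source_table="staging.orders_raw",
        target_table="analytics.fact_orders",
        key_columns=["order_id"],
        columns=[
            ColumnMapping("order_id", "order_id"),
            ColumnMapping("customer_id", "customer_id"),
            ColumnMapping("order_total", "order_total_usd"),
        ],
    )
    print(orders.to_merge_sql())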
Key Skills, Knowledge & Professional Education
  • 6-8+ years of experience in data engineering, ETL and/or architecture
  • Strong hands-on experience with Snowflake, AWS (e.g. S3, Glue, Lambda) and PostgreSQL
  • Familiarity with dbt, Databricks and modern ETL practices preferred
  • Proven experience in pipeline optimization, modular design and reusable frameworks
  • Knowledge of compliance frameworks (e.g. GDPR, HIPAA) and data governance principles
  • Strong documentation and communication skills, including the ability to write clear pseudocode for data engineers (an illustrative example follows this list)
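
As an example of the kind of pseudocode-level specification the last bullet refers to, the short, runnable Python sketch below describes a watermark-driven incremental load. The in-memory source rows and watermark store are stand-ins invented for illustration; in a real pipeline they would be a PostgreSQL/S3 extract and a warehouse control table.

    from datetime import datetime, timezone

    # Hypothetical stand-ins: a source extract and a per-table watermark store.
    SOURCE_ROWS = [
        {"id": 1, "updated_at": datetime(2025, 10, 1, tzinfo=timezone.utc)},
        {"id": 2, "updated_at": datetime(2025, 10, 6, tzinfo=timezone.utc)},
    ]
    WATERMARKS = {"orders": datetime(2025, 10, 3, tzinfo=timezone.utc)}

    def incremental_load(table: str) -> int:
        """Pull only rows changed since the last successful run, upsert them,
        then advance the watermark so reruns stay idempotent and cheap."""
        last_run = WATERMARKS[table]
        changed = [row for row in SOURCE_ROWS if row["updated_at"] > last_run]
        if changed:
            # Placeholder for the real upsert step (e.g. a generated MERGE).
            print(f"Upserting {len(changed)} changed row(s) into {table}")
            WATERMARKS[table] = max(row["updated_at"] for row in changed)
        return len(changed)

    if __name__ == "__main__":
        incremental_load("orders")  # prints: Upserting 1 changed row(s) into orders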
Nice to Have 
  • Healthcare industry and life sciences data experience 
  • Knowledge of Databricks and dbt, including migration strategies from legacy data warehouses to modern lakehouse architectures (a brief migration sketch follows this list)
  • Professional certifications in AWS, Snowflake, dbt or Databricks
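
Purely as a sketch of one legacy-warehouse-to-lakehouse migration step, the snippet below copies a warehouse table over JDBC into partitioned Parquet on S3 with PySpark. It assumes pyspark and the PostgreSQL JDBC driver are available, and every connection string, table and bucket name is a placeholder; a Databricks/Delta target would mainly swap the output format.

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("legacy-warehouse-to-lakehouse")
        .getOrCreate()
    )

    # Read one legacy warehouse table over JDBC (placeholder connection details;
    # the PostgreSQL JDBC driver jar must be on the Spark classpath).
    legacy_df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://legacy-dw.example.com:5432/analytics")
        .option("dbtable", "public.fact_orders")
        .option("user", "etl_user")
        .option("password", "change-me")
        .option("driver", "org.postgresql.Driver")
        .load()
    )

    # Land it as date-partitioned Parquet in the lakehouse bronze layer; writing
    # Delta instead is a one-line change of the output format.
    (
        legacy_df.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-lakehouse/bronze/fact_orders/")
    )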
