10,371 Azure Data jobs in the United States

Azure Data Engineer

41073 Bellevue, Kentucky Purple Drive

Posted 9 days ago

Job Description

Permanent

MUST HAVE technology skills:

  1. Strong/expert Spark (PySpark) using Jupyter Notebooks, Colab, or Databricks (preferred)
  2. Hands-on data pipeline development and ingestion patterns in Azure
  3. Orchestration tools such as ADF or Airflow
  4. SQL
  5. Denormalized data modeling for big data systems (a minimal sketch follows this list)
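
As a minimal illustration of item 5, the sketch below (hypothetical table and column names, assuming a Databricks/Spark environment) denormalizes several normalized sources into one wide Delta table with PySpark:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Normalized sources (placeholder table names).
orders = spark.read.table("bronze.orders")
customers = spark.read.table("bronze.customers")
products = spark.read.table("bronze.products")

# Denormalize into one wide table so downstream big-data queries avoid repeated joins.
orders_wide = (
    orders
    .join(customers, "customer_id", "left")
    .join(products, "product_id", "left")
    .withColumn("order_date", F.to_date("order_ts"))
)

(orders_wide.write.format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .saveAsTable("gold.orders_wide"))
```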

MUST HAVE competencies:

  1. Collaborative; able to work remotely while remaining an engaged team member.
  2. Strong analytical and design skills.

Azure Data Engineer

08101 Camden, New Jersey Subaru of America, Inc.

Posted 12 days ago

Job Description

Permanent
COMPANY BACKGROUND

LOVE. It's what makes Subaru, Subaru®. As a leading auto brand in the US, we strive to be More Than a Car Company®. Subaru believes in being a positive force in the communities in which we live and work, not just with donations but with actions that set an example for others to follow. That's what we call our Subaru Love Promise®.

Subaru is a globally renowned automobile manufacturer known for its commitment to innovation, safety, and sustainability. With a rich history dating back to 1953, Subaru has consistently pushed the boundaries of automotive engineering to deliver vehicles that offer not only exceptional performance but also a unique blend of utility and adventure.

Subaru's company culture is built on collaboration, diversity, and a shared passion for our product. We foster an inclusive environment that encourages employees to bring their unique perspectives and talents to the table. Our team members are driven by a common goal: to create exceptional vehicles that inspire and delight our customers.

ROLE SUMMARY

The Azure Data Analytics Engineer will be the Azure SME tasked with developing and optimizing cloud-based Business Intelligence solutions. The role advances data analytics capabilities, drives innovative solutions, and brings deep technical expertise in data engineering, playing an instrumental role in managing data integrations from on-premises Oracle systems, cloud CRM (Dynamics), and telematics. The engineer collaborates closely with the Data Science and Enterprise Data Warehouse teams as well as business stakeholders.

PRIMARY RESPONSIBILITIES:

Data Ingestion and Storage:

  • Designs, develops, and maintains scalable, efficient data pipelines using Data Factory and Databricks, leveraging PySpark for complex data transformations and large-scale processing (a sketch follows this list).
  • Builds and manages extract, transform, load (ETL) and extract, load, transform (ELT) processes that move data from on-premises Oracle systems, customer relationship management (CRM) platforms, and connected vehicles into storage solutions such as Azure Data Lake Storage and Azure SQL Database.
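
A rough sketch of the ingestion pattern described above (connection details, credentials, and storage paths are placeholders; an Oracle JDBC driver on the Databricks cluster is assumed):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Pull a source table from an on-premises Oracle system over JDBC (placeholder connection details).
orders_raw = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//onprem-db-host:1521/ORCLPDB")
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "<from-key-vault-or-secret-scope>")  # never hard-code secrets in real pipelines
    .load()
)

# Land the raw extract as Delta in the ADLS bronze zone, stamped with an ingestion timestamp.
(orders_raw
 .withColumn("_ingested_at", F.current_timestamp())
 .write.format("delta")
 .mode("append")
 .save("abfss://bronze@<datalake>.dfs.core.windows.net/oracle/sales_orders"))
```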

Data Engineering:
  • Creates high-code data engineering solutions using Databricks to clean, transform, and prepare data for in-depth analysis.
  • Develops and manages data models, schemas, and data warehouses, utilizing Lakehouse Architecture to enhance advanced analytics and business intelligence.
  • Leverages Unity Catalog to ensure unified data governance and management across the enterprise's data assets (see the sketch after this list).
  • Optimizes data storage, retrieval strategies, and query performance to drive scalability and efficiency in all data operations.
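
A minimal sketch of the Lakehouse/medallion flow above, using Unity Catalog's three-level table names (catalog, schema, table, and column names are hypothetical):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze -> silver: deduplicate and conform types.
bronze = spark.read.table("main.bronze.sales_orders")
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_amount") > 0)
)
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.sales_orders")

# Silver -> gold: business-level aggregate for advanced analytics and BI.
gold = (
    silver
    .groupBy("dealer_id", F.date_trunc("month", "order_date").alias("order_month"))
    .agg(F.sum("order_amount").alias("monthly_revenue"),
         F.countDistinct("customer_id").alias("unique_customers"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("main.gold.dealer_monthly_revenue")
```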

Data Integration:
  • Integrate and harmonize data from diverse sources including on-premises databases, cloud services, APIs, and connected vehicle telematics.
  • Ensure consistent data quality, accuracy, and reliability across all integrated data sources.

GitHub Development:
  • Utilizes GitHub for version control and collaborative development, implementing best practices for code management, testing, and deployment.
  • Develops workflows for continuous integration (CI) and continuous deployment (CD), ensuring efficient delivery and maintenance of data solutions.

ADDITIONAL RESPONSIBILITIES:
  • Work closely with Data Science, Enterprise Data Warehouse, and Data Visualization teams, as well as business stakeholders, to understand data requirements and deliver innovative solutions.
  • Collaborate with cross-functional teams to troubleshoot and resolve data infrastructure issues, identifying and addressing performance bottlenecks.
  • Provide technical leadership, mentorship, and guidance to junior data engineers, promoting a culture of continuous improvement and innovation.

REQUIRED SKILLS AND PERSONAL QUALIFICATIONS:
  • Technical Expertise: Extensive experience with Azure Data Factory, Databricks, and Azure Synapse, as well as proficiency in Python and PySpark.
  • Data Integration: Experience integrating data from on-premises Oracle systems and connected vehicle data into cloud-based solutions.
  • Lakehouse Architecture & Governance: Deep knowledge of Lakehouse Architecture and Unity Catalog for enterprise data governance.
  • Version Control & Collaboration: Demonstrated proficiency in GitHub for development, collaboration, and deployment in large-scale environments.
  • Infrastructure as Code (IaC): Experience with Infrastructure as Code tools such as Azure Resource Manager (ARM) templates or Terraform.
  • Problem-Solving & Troubleshooting: Strong analytical skills with the ability to diagnose and resolve complex data infrastructure challenges.
  • Collaboration: Proven ability to work effectively with Data Science teams, business stakeholders, and cross-functional teams to drive data-driven insights.
  • Communication: Excellent verbal and written communication skills with the ability to translate technical concepts to non-technical stakeholders.

Education/Experience Requirements: BA/BS with 4 to 6 years of relevant experience. Relevant experience accepted in lieu of a degree.

Work Environment
  • Hybrid Role: Remote work 2 days per week (After 90 Days Onboarding)
  • Travel Required: 0%

Compensation: The recruiting base salary range for this full-time position is $99,700.00 - $140,000.00/year. Within the range, individual pay is determined by factors, including job-related skills, experience, and relevant education or training. (Internal Job Grade: P3_T) In addition to competitive salary, Subaru offers an amazing benefits package that includes:
  • Medical, Dental, Vision Plans
  • Pension, Profit Sharing, and 401K Match Offerings
  • 15 Vacation days, 9 Company Holidays, 5 Floating Holidays, and 5 Sick days.
  • Tuition Reimbursement Program
  • Vehicle Discount Programs

Microsoft Azure Data Engineer (Fabric, Azure Data Factory)

Purple Drive

Posted 22 days ago

Job Description

We are looking for an Azure Data Engineer with hands-on expertise in Microsoft Fabric, cloud-native data architectures, and AI integration. The ideal candidate will architect and deliver end-to-end data engineering and AI solutions using Microsoft Fabric, Azure OpenAI, and related ecosystem components while driving business value through advanced analytics and responsible AI practices.

Must Have Technical/Functional Skills
10+ years of experience in data engineering within cloud-native architectures (Azure).

Hands-on expertise with Microsoft Fabric components: Data Factory, OneLake, Lakehouse, Power BI, and KQL.

Strong programming and modeling skills in SQL, Python, DAX, and Spark.

Experience with Azure OpenAI Service, Copilot, and LLM integration.

Familiarity with containerization (Docker), Azure Container Apps, and Azure DevOps tools.

Strong communication and documentation skills.

Preferred Certifications
Microsoft Certified: Azure Data Engineer Associate

Microsoft Certified: Azure AI Engineer Associate

Microsoft Certified: Fabric Analytics Engineer Associate

Microsoft 365 Certified: Enterprise Administrator Expert

Primary Responsibilities
Architect end-to-end Microsoft Fabric solutions, including Data Factory, OneLake, Lakehouse, and real-time analytics platforms.

Design and deploy AI models using Azure AI Foundry and Azure OpenAI Service.

Implement Copilot solutions across the Microsoft ecosystem with integration to Fabric data sources.

Build and optimize data pipelines from multiple sources (SQL, APIs, Salesforce, Oracle, Dynamics).

Design and maintain semantic models and implement Medallion architecture (bronze, silver, gold).

Develop real-time streaming analytics using Event Streams and KQL databases.

Monitor and optimize workloads using Fabric Capacity Metrics, addressing ingestion and modeling issues.

Bonus: Build and deliver solutions on Databricks platform (preferred).

AI & Advanced Analytics Focus
Customize and implement Microsoft Copilot solutions (e.g., Copilot for Power Platform, Fabric).

Work with GPT models and apply prompt engineering through Azure OpenAI Service.

Build RAG (retrieval-augmented generation) pipelines and design hybrid search architectures (see the sketch after this section).

Ensure compliance with responsible AI practices and governance frameworks.
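
As a hedged illustration of the RAG work described above, here is a minimal sketch using the openai Python SDK's AzureOpenAI client. The endpoint, deployment names, and toy documents are placeholders; a production pipeline would retrieve from a proper vector or hybrid index (e.g., Azure AI Search or OneLake-backed storage) rather than an in-memory list.

```python
import os
import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-aoai-resource>.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

docs = [
    "Fabric Lakehouse gold tables refresh hourly.",
    "Copilot reports are sourced from the sales_gold semantic model.",
]

def embed(texts):
    # Embedding deployment name is a placeholder.
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(docs)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    # Cosine-similarity retrieval over the toy corpus.
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = docs[int(np.argmax(sims))]
    resp = client.chat.completions.create(
        model="gpt-4o",  # chat deployment name, placeholder
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("How often do the gold tables refresh?"))
```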

Governance & DevOps (Preferred)
Establish data governance frameworks with Microsoft Purview and Fabric governance tools.

Implement CI/CD pipelines and best practices in DevOps for data and AI platforms.

Manage RBAC and zero-trust security across Azure/Fabric workspaces.

Generic Managerial Skills (if any):
- Lead a team of 3-5 developers.
- Follow Scrum practices to lead the team and manage deliveries.

Azure Cloud Data Engineer

60532 Lisle, Illinois International

Posted 2 days ago

Job Description

Position Overview
Engineer International's state-of-the-art cloud technologies as our newest Cloud Data Engineer.
As the commercial vehicle industry undertakes its most significant transformation in a century, International stands at the forefront, a vanguard of the movement. No longer content with merely supplying trucks, buses, and engines, International is on a mission to redefine transportation. Embracing a bold digital transformation, International is ushering in a new era of complete and sustainable transport solutions.
International is not just building trucks - it's forging the future of mobility. As a global industry pioneer, International is assembling a team of makers, problem solvers, and future world builders. Together, we are not just imagining a better world - We're shaping it, one innovative solution at a time. Join International now and be a part of the journey towards a brighter, more connected tomorrow.
Responsibilities
As a Cloud Data Engineer, you will design and manage International's cutting-edge cloud computing systems.
Key Activities:
+ Design and deploy cloud infrastructure on the Azure platform
+ Build and manage scalable, secure, and highly available Azure environments
+ Implement and maintain cloud-based data infrastructure, including pipelines, data stores, and data lakes
+ Configure and manage Azure services
+ Monitor and troubleshoot Azure infrastructure to ensure high availability and performance
+ Automate deployment and configuration using tools such as Terraform and Ansible (see the sketch after this list)
+ Develop and implement security controls to protect cloud infrastructure and data
+ Collaborate with cross-functional teams to design and implement new cloud-based solutions
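The posting names Terraform and Ansible as the usual IaC tooling; as a rough illustration of the same idea from Python, this sketch provisions a resource group with the azure-mgmt-resource SDK (subscription ID, names, and tags are placeholders; an authenticated Azure context such as `az login` or a managed identity is assumed).

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # picks up az CLI login, managed identity, env vars, etc.
client = ResourceManagementClient(credential, subscription_id="<subscription-id>")

# Idempotently provision (or update) a resource group for the data platform.
rg = client.resource_groups.create_or_update(
    "rg-data-platform-dev",
    {"location": "eastus2", "tags": {"owner": "data-engineering", "env": "dev"}},
)
print(f"Provisioned resource group {rg.name} in {rg.location}")
```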
Minimum Requirements
+ Bachelor's degree
+ At least 5 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
OR
+ Master's degree
+ At least 3 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
OR
+ At least 8 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
Additional Requirements
+ Qualified candidates, excluding current International Motors employees, must be legally authorized on an unrestricted basis (US Citizen, Legal Permanent Resident, Refugee or Asylee) to be employed in the United States. International Motors does not anticipate providing employment related work sponsorship for this position (e.g., H-1B status)
Desired Skills
+ Azure Cloud data engineering experience
+ Azure Cloud certification
+ Strong understanding of Azure services and their use cases
+ Experience with infrastructure-as-code tools such as Bicep, Terraform, CloudFormation, or Ansible
+ Proficiency in at least one programming language such as Python, Java or Go
+ Familiarity with network protocols, security controls, and monitoring tools
+ Excellent communication and collaboration skills
+ Ability to work independently as well as part of a team
+ Willingness to learn new technologies and skills as needed
Benefits and Compensation
We provide a competitive total rewards package which ensures job satisfaction both on and off the job. We offer market-based compensation, health benefits, 401(k) match, tuition assistance, EAP, legal insurance, an employee discount program, and more.
For this position, the expected salary range will be commensurate with the candidate's applicable skills, knowledge and experience.
You can learn more about International's comprehensive benefits package at Overview
ABOUT TRATON
With its brands Scania, MAN, International, and Volkswagen Truck & Bus, TRATON SE is the parent and holding company of the TRATON GROUP and one of the world's leading commercial vehicle manufacturers. The Group's product portfolio comprises trucks, buses, and light-duty commercial vehicles. "Transforming Transportation Together. For a sustainable world.": this intention underlines the Company's ambition to have a lasting and sustainable impact on the commercial vehicle business and on the Group's commercial growth.
ABOUT INTERNATIONAL
From a one-man company built on the world-changing invention of the McCormick reaper in 1831, to the 15,000-person-strong company we are today, few companies can lay claim to a history like International. Based in Lisle, Illinois, International Motors, LLC* creates solutions that deliver greater uptime and productivity to our customers throughout the full operation of our commercial vehicles. We build International® trucks and engines and IC Bus® school and commercial buses that are as tough and as smart as the people who drive them. We also develop Fleetrite® aftermarket parts. In everything we do, our vision is to accelerate the impact of sustainable mobility to create the cleaner, safer world we all deserve. As of 2021, we joined Scania, MAN and Volkswagen Truck & Bus in TRATON GROUP, a global champion of the truck and transport services industry. To learn more, visit ( .
*International Motors, LLC is d/b/a International Motors USA in Illinois, Missouri, New Jersey, Ohio, Texas, and Utah.
EEO Statement
International is an Equal Opportunity Employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics.
If you are a qualified individual with a disability and require a reasonable accommodation to access the online application system or participate in the interview process due to your disability, please email   to request assistance. Kindly specify Job Requisition Number / Job Title and Location in response. Otherwise, your request may not be considered.

Senior Azure Data Engineer

27512, North Carolina MetLife

Posted 2 days ago

Job Description

Role Value Proposition:
We are seeking a highly skilled Senior Data Engineer to join our Marketing Data Engineering team, focusing on Azure cloud services, Databricks, and marketing technology integrations (e.g., Adobe, Salesforce). This role will focus on data platform architecture, administration, and automation, ensuring scalable, secure, and cost-efficient data solutions aligned with our Cloud Adoption Framework (CAF) governance. Key responsibilities include supporting the data engineering team, marketing analytics, Power BI dashboards, and advanced personalization, while leveraging DevOps practices and CI/CD pipelines.
Key Responsibilities:
Azure Platform Architecture & Administration:
* Design and manage a secure, scalable Azure-based data platform to support marketing and customer analytics.
* Administer and optimize Azure services including Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Key Vault.
* Implement robust monitoring, alerting, and cost optimization for the Azure data environment.
* Ensure platform availability, data pipeline reliability, and high performance for downstream consumption.
* Establish and manage Azure Data Lake Storage (ADLS) structures, security models, and performance optimization strategies.
* Monitor platform health, performance, and cost usage; implement automation to improve efficiency and reliability
Data Engineering & ETL/ELT Pipelines:
* Architect and implement ETL/ELT pipelines using Databricks (PySpark, SQL, Delta Lake) and Azure Data Factory.
* Optimize data ingestion, transformation, and storage for big data workloads and real-time streaming.
* Work with stakeholders to design data models, schemas, and APIs supporting analytics and operational needs.
* Ensure data quality, lineage, and governance through integration with tools such as Purview or equivalent.
* Build robust data transformations and aggregations to support marketing dashboards and customer 360 views (see the sketch after this list).
* Optimize Python jobs, SQL queries, and data flows for performance and scalability.
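As a rough sketch of the Databricks/Delta Lake work described above (table and column names are hypothetical, and the target table is assumed to already exist), the following builds a customer-360 style aggregate from marketing feeds and upserts it with a Delta MERGE:

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Daily increments from hypothetical Salesforce and Adobe Analytics bronze feeds.
sf = spark.read.table("bronze.salesforce_contacts").select("customer_id", "email", "segment")
web = (spark.read.table("bronze.adobe_web_events")
       .groupBy("customer_id")
       .agg(F.count("*").alias("web_visits_30d"),
            F.max("event_ts").alias("last_seen")))

updates = sf.join(web, "customer_id", "left")

# Upsert into the customer-360 Delta table so dashboards always see the latest profile.
target = DeltaTable.forName(spark, "gold.customer_360")
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```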
Integration with Marketing Platforms:
* Integrate marketing data from platforms such as Salesforce, Adobe Experience Platform, Adobe Analytics, Facebook Ads, HubSpot, and CRM systems.
* Work with MarTech and Marketing Ops teams to ensure clean, timely data feeds for campaigns.
* Build APIs, REST connectors, or use tools such as Azure Logic Apps, Azure Functions, and API Management.
Power BI Support and Semantic Layer Design:
* Provide data models and optimized datasets to support Power BI dashboards for marketing.
* Collaborate with BI developers on semantic models using Synapse SQL Serverless or Power BI Datasets.
* Ensure data freshness and efficient refresh cycles for marketing reports.
Data Governance, Security & Compliance:
* Implement data access controls, encryption, and auditing using Azure RBAC, Purview, and Private Endpoints.
* Ensure compliance with GDPR, CCPA, and internal policies.
* Define and manage metadata, data dictionaries, and quality processes.
Agile Project Delivery & Stakeholder Engagement:
* Work in Agile teams providing estimates and delivery timelines for marketing use cases.
* Collaborate with marketing analytics, Martech, and Marketing data platform teams to prioritize backlog items.
* Translate marketing needs into platform capabilities and deliverables.
DevOps & Infrastructure as Code (IaC):
* Establish DevOps practices for CI/CD pipeline development, deployment, and testing for data workflows using Azure DevOps or GitHub Actions (a unit-test sketch follows this section).
* Manage source control, release management, and environment configuration for data engineering components (ADF, Databricks, Synapse).
* Automate deployment of data pipelines and related resources with rollback strategies and deployment validations.
* Set up monitoring and alerting for CI/CD pipelines and infrastructure health using tools like Azure Monitor, Log Analytics, or Application Insights.
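In the spirit of the CI/CD testing item above, a minimal pytest sketch that could run in an Azure DevOps or GitHub Actions pipeline might look like this (the transformation function and column names are hypothetical; a local PySpark installation is assumed):

```python
# test_transforms.py
import pytest
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window


def dedupe_latest(df, key_col, ts_col):
    """Keep only the most recent record per key (the function under test)."""
    w = Window.partitionBy(key_col).orderBy(F.col(ts_col).desc())
    return df.withColumn("_rn", F.row_number().over(w)).filter("_rn = 1").drop("_rn")


@pytest.fixture(scope="module")
def spark():
    return SparkSession.builder.master("local[1]").appName("ci-tests").getOrCreate()


def test_dedupe_latest_keeps_newest_row(spark):
    df = spark.createDataFrame(
        [("c1", "2024-01-01", 10), ("c1", "2024-02-01", 20), ("c2", "2024-01-15", 5)],
        ["customer_id", "updated_at", "score"],
    )
    out = dedupe_latest(df, "customer_id", "updated_at").collect()
    scores = {r["customer_id"]: r["score"] for r in out}
    assert scores == {"c1": 20, "c2": 5}
```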
Essential Business Experience and Technical Skills:
Required:
* Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field, or equivalent experience.
* 7+ years of experience in data engineering, architecture, or platform administration, with at least 3 years in Azure-based environments.
* Deep expertise in Azure Data Factory, Azure Databricks, Azure SQL Database, and/or Synapse Analytics.
* Strong SQL and Python skills; familiarity with Git, CI/CD (Azure DevOps or GitHub Actions).
* Understanding of customer data models, identity resolution, and campaign data lifecycles.
* Experience with CI/CD pipelines and DevOps best practices for data engineering and infrastructure deployments.
* Experience supporting enterprise Power BI solutions using Azure. Experience integrating marketing or any other platforms via APIs and RESTful services.
* Proficiency with Terraform for managing Azure infrastructure via code.
Preferred:
* Azure Solutions Architect or Azure Data Engineer Associate certifications.
* Experience with marketing technologies (Adobe, Salesforce, Marketo, 6sense, etc.) and CDPs such as Adobe CDP or Tealium.
* Infrastructure as Code (IaC) using Terraform, YAML
* Streaming data architecture using Event Hubs, Kafka, or Stream Analytics.
At MetLife, we're leading the global transformation of an industry we've long defined. United in purpose, diverse in perspective, we're dedicated to making a difference in the lives of our customers.
Equal Employment Opportunity/Disability/Veterans
If you need an accommodation due to a disability, please email us. This information will be held in confidence and used only to determine an appropriate accommodation for the application process.
MetLife maintains a drug-free workplace.

Senior Azure Data Engineer

08807 Bridgeville, Pennsylvania MetLife

Posted 2 days ago

Job Description

Same description, responsibilities, and requirements as the MetLife Senior Azure Data Engineer posting above; only the location differs.

Senior Azure Data Engineer

10176 New York, New York MetLife

Posted 2 days ago

Job Description

Same description, responsibilities, and requirements as the MetLife Senior Azure Data Engineer posting above; only the location differs.

Azure Data Engineer: INTL

75026 Plano, Texas Insight Global

Posted 16 days ago

Job Description

Job Description
A Fortune 100 client is seeking a highly skilled and experienced Senior Data Engineer to support an enterprise-level migration of the metadata layer from Presto to Unity Catalog within the Databricks Lakehouse Platform. This role requires deep expertise in Databricks, Spark, and modern data architecture, with a strong focus on data governance, performance optimization, and scalable pipeline development. The two additional data engineers will join the client's two current team members and will work on an individual basis to complete the migration effort.
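As context for the migration described above, a minimal, hypothetical sketch of re-registering an existing external table under Unity Catalog's three-level namespace on Databricks might look like this (catalog, schema, and storage paths are placeholders, and external-location permissions are assumed to be in place):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.sales")

# Point the Unity Catalog table at the data files the Presto catalog already referenced.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.sales.orders
    USING DELTA
    LOCATION 'abfss://lake@<storageaccount>.dfs.core.windows.net/sales/orders'
""")

# Spot-check that the new metadata layer resolves and returns rows.
spark.sql("SELECT COUNT(*) AS n FROM analytics.sales.orders").show()
```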
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
- 8+ years of experience as a Data Engineer, with a minimum of 5 years of experience in the Azure tool stack (Databricks, ADF, Synapse Analytics)
- Strong expertise in writing complex SQL and transformations
- Ability to write, review and examine code written in Python and PySpark
- Proficient in DevOps CI/CD
- Expertise in RDBMS
- Fundamental knowledge of Data Warehousing concepts
- SQL and SQL tuning within the Data Lake (Delta format)
- Azure services: Azure Functions, Event Grid
- Tableau & Power BI
- Prior PepsiCo experience
- Databricks certification
- Local to HYD

Azure Data Scientist

30383 Atlanta, Georgia r2 Technologies, Inc.

Posted 4 days ago

Job Description

Role: Azure Data Scientist

Location: Atlanta, GA

Role is on-site; candidates must relocate.

This is a W2 role.

You should have subject matter expertise in applying data science and machine learning to implement and run machine learning workloads on Azure, as well as knowledge of optimizing language models for AI applications using Azure AI.

Your responsibilities for this role include:

  • Designing and creating a suitable working environment for data science workloads.
  • Exploring data.
  • Training machine learning models (a minimal training-and-tracking sketch follows this list).
  • Implementing pipelines.
  • Running jobs to prepare for production.
  • Managing, deploying, and monitoring scalable machine learning solutions.
  • Using language models for building AI applications.
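
As a rough illustration of the train-track-deploy loop described above, here is a minimal sketch using scikit-learn with MLflow tracking (which Azure Machine Learning and Databricks both support); the dataset is synthetic and all names are placeholders:

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real training dataset.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_metric("test_auc", auc)          # tracked metric for the run
    mlflow.sklearn.log_model(model, "model")    # model artifact for later deployment
    print(f"Logged run with test AUC = {auc:.3f}")
```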


Skills:

Azure, Data Science, Machine Learning

Azure Data Scientist

60290 Chicago, Illinois Purple Drive

Posted 4 days ago

Job Description

Skills Required:

8+ years of relevant experience as a hands-on quantitative investment professional, with a track record of developing predictive alpha models in fixed income markets.

Master's or PhD in a technical or quantitative discipline (e.g., statistics, mathematics, physics, electrical engineering, computer science, applied economics, or finance).

  • Systematic Fixed Income investing experience a plus.
  • Proficiency in at least one programming language, ideally Python, with additional experience in PySpark, SAS, MATLAB, or R being advantageous.
  • Hands-on experience working with Azure cloud services such as ADLS, ADF, and Databricks will be very helpful.
  • Strong knowledge of asset pricing, factor anomaly literature, and the application of sustainability data within an investing context.
  • Expertise across the entire quantitative investment lifecycle, including alpha research, risk management, portfolio management, trade execution, and the application of technology.
  • Experience in applying machine learning models within an investing context, with familiarity with alternative data as an advantage (see the sketch after this list).
  • Ability to think independently, creatively approach data analysis, and clearly communicate complex ideas.
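
As a toy illustration of the kind of predictive alpha modeling described above (not the client's methodology), the sketch below fits a walk-forward Ridge regression of synthetic next-period returns on hypothetical carry/value/momentum factors and reports an out-of-sample information coefficient:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import TimeSeriesSplit

rng = np.random.default_rng(0)
n = 500
factors = pd.DataFrame(rng.normal(size=(n, 3)), columns=["carry", "value", "momentum"])

# Synthetic next-period excess return: a weak linear signal plus noise.
target = 0.05 * factors["carry"] + 0.03 * factors["momentum"] + rng.normal(scale=1.0, size=n)

ics = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(factors):
    model = Ridge(alpha=1.0).fit(factors.iloc[train_idx], target.iloc[train_idx])
    preds = model.predict(factors.iloc[test_idx])
    # Information coefficient: rank correlation between forecasts and realized returns.
    ics.append(pd.Series(preds).corr(
        target.iloc[test_idx].reset_index(drop=True), method="spearman"))

print(f"Mean out-of-sample IC: {np.mean(ics):.3f}")
```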


Mandatory skills

Data Science

PySpark

Azure Data Factory

Azure Databricks

Secondary skills:

Asset Management