1,955 Cloud Data Engineer jobs in the United States

Cloud Data Engineer

37230 Nashville, Tennessee NTT DATA North America

Posted 3 days ago

Job Description

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Cloud Data Engineer to join our team in Nashville, Tennessee (US-TN), United States (US).
Job Description:
**Job Title: Cloud Data Engineer**
**Location: Nashville (On-site at one of NTT DATA's flagship data delivery centers)**
NTT DATA is seeking a skilled Data Engineer to join our Data Village team. This role involves working on-site in Nashville as part of a dynamic team focused on delivering cutting-edge data solutions.
**The ideal candidate will have at least 3 years of experience in data engineering, with expertise in Databricks, Snowflake and AWS data services.**
**Key Responsibilities:**
**Data Engineering:**
+ Utilize your ETL/Data Engineering expertise in Databricks, Snowflake and Cloud data services to build and maintain robust data solutions.
+ Lead the evaluation of various catalogs and query engines for the Data Lakehouse platform, documenting findings and reviewing them with the architecture teams.
+ SQL, Python, and strong knowledge of the SDLC are required.
+ Build and manage dozens of data pipelines to source and transform data based on business requirements.
**Financial Data Analysis:** Apply your knowledge in financial data analysis, risk, and compliance data management to support our financial services customers.
**Data Analysis and Discovery:** Leverage Databricks and Snowflake for data analysis and discovery, ensuring data is accessible and actionable. Leverage 10+ sources of data to derive insights.
**Innovation and Learning:** Quickly learn new technologies by applying your current skills, staying ahead of industry trends and advancements. Self-identify the need for new skills and adopt new technologies into your skill set within a month.
**Client Collaboration:** Work closely with financial services clients to build modern data solutions that transform how they leverage data for key business decisions, investment portfolio performance analysis, and risk and compliance management. Manage multiple stakeholder groups and their requirements.
**Team Collaboration:** Collaborate within a Pod of 4+ data engineers, working towards common objectives in a consultative fashion with clients.
**Data Movement and Transformation:** Use cloud-native ETL services (e.g., AWS Glue or SnapLogic) and Python/SQL for data movement, streaming, and transformation services, ensuring efficient and reliable data workflows.
**Industry Leadership:** Work with a client that is leading the industry in using data to drive business decision optimization and investment management strategies.
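The ETL responsibilities above (sourcing, transforming, and loading data with Python/SQL) can be sketched at miniature scale with standard-library tools. The table, columns, and filter logic below are hypothetical stand-ins for a Glue- or Snowflake-backed pipeline, not NTT DATA's actual stack.

```python
import csv
import io
import sqlite3

# Hypothetical source extract: in a real pipeline this would come from S3,
# a Glue job, or a Snowflake stage rather than an inline CSV string.
RAW = """account_id,balance,currency
A-1,1500.00,USD
A-2,-25.10,USD
A-3,9800.50,EUR
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast types and keep only positive USD balances."""
    out = []
    for r in rows:
        bal = float(r["balance"])
        if r["currency"] == "USD" and bal > 0:
            out.append((r["account_id"], bal))
    return out

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load transformed rows into a warehouse-style table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE usd_balances (account_id TEXT, balance REAL)")
    con.executemany("INSERT INTO usd_balances VALUES (?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
total = con.execute("SELECT SUM(balance) FROM usd_balances").fetchone()[0]
print(total)  # 1500.0
```

In practice the extract step would read from object storage or a warehouse stage and the load step would target Snowflake or Databricks tables, but the extract/transform/load seams stay the same.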
**Requirements:**
Experience:
+ At least 5 years of experience in data engineering.
**Technical Skills:**
+ Proficiency in Snowflake and Cloud data services.
+ 5+ years of experience working with cloud services.
+ 2+ years of experience working with Databricks or Snowflake.
+ 3+ years working with PySpark. Proficiency in the following areas:
**Data Warehousing:** Strong knowledge of data warehousing concepts.
Python or Java: Advanced skills in Python or Java programming for data engineering and data pipelines.
Data Integration: Proficiency in integrating data from various sources.
Data Platforms: Strong knowledge of data modeling, storage and design within Snowflake.
Data Security: Experience in securing data based on role and policy definitions.
Data Pipelines: Experience in building and managing data pipelines.
Cloud Experience: Experience with AWS, Azure, or GCP data services.
GitHub: Proficiency in using GitHub for version control.
Preferred Technical skills: Apache Iceberg, Databricks, Dremio
Domain Expertise: Nice to have experience in financial data analysis, risk, and compliance data management.
Learning Agility: Ability to quickly learn new technologies by applying current skills. Operate within a sprint model: stories are assigned, developed, validated, and peer reviewed. Each story typically involves:
+ Writing and executing scripts
+ Validating results and capturing observations
+ Documenting outcomes in Confluence
+ Conducting peer reviews and merging code into Git branches
Architect Reviews: Present deliverables and findings to the architecture team for feedback and alignment.
Adaptability: Ability to adapt to new technologies quickly and efficiently.
About NTT DATA:
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Cloud Data Engineer

98194 Seattle, Washington The Boeing Company

Posted 5 days ago

Job Description

**Job Description**
At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
Boeing Defense, Space and Security (BDS) is seeking an **Advanced Information Technologist - Cloud Data Engineer** to join our Data Analytics Engineering & Integration team in **Ridley Park, PA; Seattle, WA; Arlington, VA; San Diego, CA; Long Beach, CA; Mesa, AZ; or Hazelwood, MO** . The Solutions team is responsible for ensuring that solution datasets are discoverable, secure, performant, and reliably delivered to downstream consumers to enable analytics, operations, and program decisioning across BDS.
**Role summary:**
+ As an Advanced Information Technologist - Cloud Data Engineer focused on the solutions space, you will own both DBaaS/platform lifecycle and consumer-facing dataset delivery. You will provision and operate DBaaS instances, implement dataset provisioning and access patterns, optimize query performance, enforce security/compliance (including GovCloud constraints), and collaborate with data architects to design, operationalize, and govern enterprise data ontologies and canonical models that enable consistent semantics and discoverability.
**Position Responsibilities**
+ Participate in all aspects of agile delivery, including design, implementation, testing, and deployment of solution‑space features and operational tooling.
+ Own provisioning, configuration, scaling, tuning, patching, backup/restore, and lifecycle management of DBaaS instances that host solution data (e.g., RDS/Aurora or equivalent), including capacity planning, disaster recovery, and incident response.
+ Implement and maintain dataset provisioning and delivery processes (cataloging, access controls, dataset packaging, freshness/latency SLAs) to support downstream consumers and integrations.
+ Optimize consumer performance and cost‑to‑serve through data layout and query optimization (indexing, partitioning, materialized views, caching) and by advising solution owners on access/query patterns.
+ Design, implement, and operationalize enterprise data ontologies and canonical models: create semantic models, map solution datasets to ontologies, enforce taxonomy/versioning, and partner with governance for discoverability and lineage.
+ Design and enforce authorization models: implement RBAC for role-level permissions and ABAC for fine‑grained, attribute‑driven policies (dataset sensitivity, clearance, environment, ontology tags), integrated via a centralized policy engine.
+ Build and maintain Infrastructure‑as‑Code (Terraform) and CI/CD for DB and dataset lifecycle changes; automate entitlement provisioning and deprovisioning workflows; author runbooks and participate in on‑call rotations.
+ Enforce security, compliance, and GovCloud requirements: manage RBAC/ABAC controls, encryption at rest/in transit, auditing and immutable logging, data classification/tags, masking/anonymization where required, and periodic access attestation.
+ Integrate policy & observability: centralize policy evaluation, instrument audit logs and alerts for policy decisions and anomalous access, and include policy tests in pipelines.
+ Design, deploy, and maintain data integrations and operational patterns within enterprise data platforms (e.g., Palantir Foundry) where applicable, including dataset modeling, Foundry Ontology alignment, transforms, and operationalization.
+ Provide stakeholder support: advise solution owners on SLAs, access patterns, and best practices; validate consumer requirements; perform dataset handoffs and document usage guides.
+ Continuously review and recommend platform improvements to improve reliability, security, performance, and cost efficiency.
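The RBAC + ABAC layering described in the responsibilities above can be illustrated with a toy policy evaluator. The roles, attributes, and rules below are invented for the sketch and stand in for a centralized policy engine such as OPA, not Boeing's actual policies.

```python
from dataclasses import dataclass

# RBAC: coarse, role-level permissions (hypothetical roles and actions).
ROLE_ACTIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
}

@dataclass
class Request:
    role: str
    action: str
    clearance: int             # user attribute
    dataset_sensitivity: int   # resource attribute
    environment: str = "prod"  # environment attribute

def is_allowed(req: Request) -> bool:
    """Centralized policy evaluation: RBAC gate first, then ABAC rules."""
    if req.action not in ROLE_ACTIONS.get(req.role, set()):
        return False  # role does not grant this action at all
    if req.clearance < req.dataset_sensitivity:
        return False  # ABAC: clearance must meet dataset sensitivity
    if req.environment == "prod" and req.action == "write" and req.role != "engineer":
        return False  # ABAC: prod writes restricted to engineers
    return True

print(is_allowed(Request("analyst", "read", clearance=3, dataset_sensitivity=2)))  # True
print(is_allowed(Request("analyst", "read", clearance=1, dataset_sensitivity=2)))  # False
```

A production system would externalize these rules into policy documents evaluated by the engine, and every decision here would also be emitted to an audit log, matching the observability responsibilities listed above.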
**Work arrangement and clearance**
+ This position is hybrid. The selected candidate will be required to work onsite one day per week at one of the listed locations. This requirement is at the hiring team's discretion and could change in the future.
+ This position requires the ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is preferred.
**Basic Qualifications (Required Skills/Experience)**
+ 3+ years of experience in data engineering or cloud data platform operations with hands‑on experience managing cloud managed databases/DBaaS (e.g., AWS RDS/Aurora) and delivering datasets to downstream consumers.
+ Practical experience implementing Infrastructure‑as‑Code (Terraform) for DB and dataset provisioning and lifecycle management.
+ Proficiency in Python and SQL for automation, operational tooling, and query optimization.
+ Experience with version control and CI/CD pipelines (e.g., GitLab CI/CD) and containerization (Docker, Kubernetes).
+ Demonstrated experience with backups/DR, performance tuning, capacity planning, runbooks, and production incident response for DB-backed services.
+ Experience translating consumer requirements into dataset provisioning, access specifications, and SLAs (freshness, latency, availability).
+ Working knowledge of access control models and enforcement (RBAC) and practical exposure to attribute‑based or policy‑driven controls (ABAC or equivalent).
+ Experience implementing security/compliance controls for data (encryption, auditing/logging, data classification, masking/anonymization) in cloud or hybrid environments.
+ Technical Bachelor's degree or equivalent experience.
**Preferred Qualifications (Highly Preferred / Desired)**
+ Highly preferred: Hands‑on experience with **Palantir Foundry** (dataset design, Foundry Ontology, Transforms, Code Repositories) and operational dataset patterns.
+ Highly preferred: Practical experience designing, implementing, and operationalizing data ontologies, canonical models, semantic layers, or knowledge graphs to improve discoverability, lineage, and reuse.
+ Preferred: Experience implementing **RBAC + ABAC** (or policy engine-based controls) end‑to‑end, including attribute/tag definitions, centralized policy evaluation (e.g., OPA, IAM condition keys, Foundry policies), entitlement workflows, and access attestation.
+ Preferred: Experience operating in AWS GovCloud or other regulated cloud environments and applying compliance controls in cloud deployments.
+ Preferred: Experience optimizing consumer performance (indexing, partitioning, materialized views, caching) and cost‑to‑serve tradeoffs for query-heavy consumption workloads.
+ Preferred: Experience implementing observability for access and policy decisions (audit logging, alerting on anomalous access, policy denial metrics) and integrating those signals into incident response.
+ Preferred: Experience in aviation, defense, or other regulated industries and working in large matrixed organizations.
+ Preferred: Advanced degree or relevant certifications (e.g., AWS Specialty certs, Certified Data Management Professional, Palantir Foundry training) a plus.
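As a small illustration of the indexing point in the preferred qualifications, SQLite (standing in here for any relational engine; the table and column names are hypothetical) shows the same lookup switching from a full table scan to an index search once an index exists:

```python
import sqlite3

# Hypothetical events table with 1,000 rows across 10 dataset ids.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (dataset_id TEXT, ts INTEGER, payload TEXT)")
con.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(f"ds-{i % 10}", i, "x") for i in range(1000)],
)

def plan(sql: str) -> str:
    """Return SQLite's query-plan description for a statement."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT COUNT(*) FROM events WHERE dataset_id = 'ds-3'"
print(plan(q))  # full table scan: no index exists yet
con.execute("CREATE INDEX idx_events_ds ON events (dataset_id)")
print(plan(q))  # now a search using idx_events_ds
```

Partitioning, materialized views, and caching follow the same logic at warehouse scale: precompute or narrow the data the engine must touch so cost-to-serve drops for query-heavy consumers.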
**Relocation:**
Relocation assistance is not a negotiable benefit for this position. Candidates must live in the immediate area or relocate at their own expense.
**Drug-Free Workplace:**
Boeing is a Drug-Free Workplace where post-offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.
**Shift:**
This position is for 1st shift
**Pay & Benefits:**
At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.
The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work.
The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.
Pay is based upon candidate experience and qualifications, as well as market and business considerations.
**Summary Pay Range:**
$105,400 - $152,950
Applications for this position will be accepted until **Oct. 13, 2025**
**Export Control Requirements:** This is not an Export Control position.
**Education**
Bachelor's Degree or Equivalent Required
**Relocation**
Relocation assistance is not a negotiable benefit for this position.
**Security Clearance**
This position requires the ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is required.
**Visa Sponsorship**
Employer will not sponsor applicants for employment visa status.
**Shift**
This position is for 1st shift
**Equal Opportunity Employer:**
Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.

Cloud Data Engineer

75219 Dallas, Texas Insight Global

Posted 15 days ago

Job Description

Job Description
Insight Global is seeking a mid-level Cloud Data Engineer to join the Technology Risk Division of a prominent financial institution. This role is instrumental in orchestrating firmwide liquidity and risk metric calculations, ensuring the organization leverages the right data sets for accurate and strategic decision-making. As part of this team, you'll be responsible for updating and optimizing data sources, enhancing the performance of large-scale datasets, and consolidating data currently stored across multiple platforms into Snowflake. Your work will directly support the firm's mission to improve risk exposure strategies through robust, cloud-based data engineering.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Requirements
- 7+ years of experience as a Data Engineer, working with large-scale data systems
- 2 - 3 years of hands-on experience with Amazon Web Services (AWS)
- Good understanding of Apache Airflow for scheduling and orchestration
- Strong proficiency in Python for data engineering and workflow automation
- Advanced skills in writing complex SQL queries, especially for data migration into Snowflake
- Familiarity with the Software Development Life Cycle (SDLC), especially for Python-dominated ecosystems
- Excellent communication skills and ability to translate business needs into technical solutions
- Prior experience with or interest in learning Slang Infrastructure, a proprietary programming language
- Familiarity with Amazon MWAA (Managed Workflows for Apache Airflow) for cloud-based workflow orchestration and monitoring
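Airflow's core scheduling idea, running tasks in dependency order over a DAG, can be sketched without the airflow package using the standard library's graphlib; the task names below are hypothetical, loosely mirroring the liquidity-metric pipeline described above:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# the same wiring an Airflow DAG expresses as extract >> transform >> load.
DAG = {
    "extract_positions": set(),
    "extract_market_data": set(),
    "compute_liquidity_metrics": {"extract_positions", "extract_market_data"},
    "load_snowflake": {"compute_liquidity_metrics"},
}

# static_order() yields every task after all of its predecessors.
order = list(TopologicalSorter(DAG).static_order())
print(order)
```

An actual Airflow (or MWAA) DAG layers scheduling, retries, and operators on top, but the dependency resolution is the same; `TopologicalSorter.prepare()` and `get_ready()` would additionally expose which tasks can run in parallel.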

Azure Cloud Data Engineer

60532 Lisle, Illinois International

Posted 1 day ago

Job Description

Position Overview
Engineer International's state-of-the-art cloud technologies as our newest Cloud Data Engineer.
As the commercial vehicle industry undertakes its most significant transformation in a century, International stands at the forefront, a vanguard of the movement. No longer content with merely supplying trucks, buses, and engines, International is on a mission to redefine transportation. Embracing a bold digital transformation, International is ushering in a new era of complete and sustainable transport solutions.
International is not just building trucks - it's forging the future of mobility. As a global industry pioneer, International is assembling a team of makers, problem solvers, and future world builders. Together, we are not just imagining a better world - We're shaping it, one innovative solution at a time. Join International now and be a part of the journey towards a brighter, more connected tomorrow.
Responsibilities
As a cloud data engineer, you will design and manage International's cutting edge cloud computing systems.
Key Activities:
+ Design and deploy cloud infrastructure on the Azure platform
+ Build and manage scalable, secure, and highly available Azure environments
+ Implement and maintain cloud-based data infrastructure, including pipelines, data stores, data lakes, etc.
+ Configure and manage Azure services
+ Monitor and troubleshoot Azure infrastructure to ensure high availability and performance
+ Automate deployment and configuration using tools such as Terraform and Ansible
+ Develop and implement security controls to protect cloud infrastructure and data
+ Collaborate with cross-functional teams to design and implement new cloud-based solutions
Minimum Requirements
+ Bachelor's degree
+ At least 5 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
OR
+ Master's degree
+ At least 3 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
OR
+ At least 8 years of Information Technology or IT Architecture experience
+ At least 1 year of lead experience
Additional Requirements
+ Qualified candidates, excluding current International Motors employees, must be legally authorized on an unrestricted basis (US Citizen, Legal Permanent Resident, Refugee or Asylee) to be employed in the United States. International Motors does not anticipate providing employment related work sponsorship for this position (e.g., H-1B status)
Desired Skills
+ Azure Cloud data engineering experience
+ Azure Cloud certification
+ Strong understanding of Azure services and their use cases
+ Experience with infrastructure-as-code tools such as Bicep, Terraform, CloudFormation, or Ansible
+ Proficiency in at least one programming language such as Python, Java or Go
+ Familiarity with network protocols, security controls, and monitoring tools
+ Excellent communication and collaboration skills
+ Ability to work independently as well as part of a team
+ Willingness to learn new technologies and skills as needed
Benefits and Compensation
We provide a competitive total rewards package which ensures job satisfaction both on and off the job. We offer market-based compensation, health benefits, 401(k) match, tuition assistance, EAP, legal insurance, an employee discount program, and more.
For this position, the expected salary range will be commensurate with the candidate's applicable skills, knowledge and experience.
You can learn more about International's comprehensive benefits package at .
Overview
ABOUT TRATON
With its brands Scania, MAN, International, and Volkswagen Truck & Bus, TRATON SE is the parent and holding company of the TRATON GROUP and one of the world's leading commercial vehicle manufacturers. The Group's product portfolio comprises trucks, buses, and light-duty commercial vehicles. "Transforming Transportation Together. For a sustainable world.": this intention underlines the Company's ambition to have a lasting and sustainable impact on the commercial vehicle business and on the Group's commercial growth.
ABOUT INTERNATIONAL
From a one-man company built on the world-changing invention of the McCormick reaper in 1831, to the 15,000-person-strong company we are today, few companies can lay claim to a history like International. Based in Lisle, Illinois, International Motors, LLC* creates solutions that deliver greater uptime and productivity to our customers throughout the full operation of our commercial vehicles. We build International® trucks and engines and IC Bus® school and commercial buses that are as tough and as smart as the people who drive them. We also develop Fleetrite® aftermarket parts. In everything we do, our vision is to accelerate the impact of sustainable mobility to create the cleaner, safer world we all deserve. As of 2021, we joined Scania, MAN and Volkswagen Truck & Bus in TRATON GROUP, a global champion of the truck and transport services industry. To learn more, visit .
*International Motors, LLC is d/b/a International Motors USA in Illinois, Missouri, New Jersey, Ohio, Texas, and Utah.
EEO Statement
International is an Equal Opportunity Employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics.
If you are a qualified individual with a disability and require a reasonable accommodation to access the online application system or participate in the interview process due to your disability, please email to request assistance. Kindly specify the Job Requisition Number / Job Title and Location in your response. Otherwise, your request may not be considered.

Sr. Cloud Data Engineer/Snowflake

35226 Hoover, Alabama Regions Bank

Posted 2 days ago

Job Description

Thank you for your interest in a career at Regions. At Regions, we believe associates deserve more than just a job. We believe in offering performance-driven individuals a place where they can build a career --- a place to expect more opportunities. If you are focused on results, dedicated to quality, strength and integrity, and possess the drive to succeed, then we are your employer of choice.
Regions is dedicated to taking appropriate steps to safeguard and protect private and personally identifiable information you submit. The information that you submit will be collected and reviewed by associates, consultants, and vendors of Regions in order to evaluate your qualifications and experience for job opportunities and will not be used for marketing purposes, sold, or shared outside of Regions unless required by law. Such information will be stored in accordance with regulatory requirements and in conjunction with Regions' Retention Schedule for a minimum of three years. You may review, modify, or update your information by visiting and logging into the careers section of the system.
**Job Description:**
At Regions, the Data Engineer focuses on the evaluation, design, and execution of data structures, processes, and logic to deliver business value through operational and analytical data assets. The Data Engineer uses advanced data design and technical skills to work with business subject matter experts to create enterprise data assets utilizing state-of-the-art data techniques and tools.
**Primary Responsibilities**
+ Partners with Regions Technology partners to design, build, and maintain the data-based structures and systems in support of Data and Analytics and Data Product use cases
+ Builds data pipelines to collect and arrange data and manage data storage in Regions' big data environment
+ Builds robust, testable programs for moving, transforming, and loading data using big data tools such as Spark
+ Coordinates design and development with Data Products Partners, Data Scientists, Data Management, Data Modelers, and other Technical partners to construct strategic and tactical data stores
+ Ensures data is prepared, arranged and ready for each defined business use case
+ Designs and deploys frameworks and micro services to serve data assets to data consumers
+ Collaborates and aligns with technical and non-technical stakeholders to translate customer needs into data design requirements, and works to deliver world-class visualizations and data stories while ensuring data quality and integrity
+ Provides consultation to all areas of the organization that plan to use data to make decisions
+ Supports any team members in the development of such information delivery and aid in the automation of data products
+ Acts as trusted adviser and partner to business leads, assisting in the identification of business needs and data opportunities, understanding key drivers of performance, interpreting business case data drivers, turning data into business value, and participating in the guidance of the overall data and analytics strategy
This position is exempt from timekeeping requirements under the Fair Labor Standards Act and is not eligible for overtime pay.
**Requirements**
+ Bachelor's degree and eight (8) years of experience in a quantitative/analytical/STEM field or technical related field
+ Or Master's degree and six (6) years of experience in a quantitative/analytical/STEM field or technical related field
+ Or Ph.D. and four (4) years of experience in a quantitative/analytical/STEM field
+ Five (5) years of working programming experience in Python/PySpark, Scala, SQL
+ Five (5) years of working experience in Big Data Technology in Hadoop, Hive, Impala, Spark, or Kafka
**Preferences**
+ Background in Big Data Engineering and Advanced Data Analytics
+ Experience developing solutions for the financial services industry
+ Experience in Agile Software Development
+ Experience or exposure to cloud technologies and migrations
+ Prior banking or financial services experience
**Skills and Competencies**
+ Experience building data solutions at scale
+ Experience designing and building relational data structures in multiple environments
+ Experience with DevOps principles, CI/CD, and the Software Development Lifecycle
+ Experience with NoSQL databases
+ Experience with large-scale data lakehouses at enterprise scale
+ Proven track record of delivering operational data solutions, including report- and model-ready data assets
+ Significant experience working with senior executives in the use of data, reporting and visualizations to support strategic and operational decision making
+ Strong ability to transform and integrate complex data from multiple sources into accessible, understandable, and usable data assets and frameworks
+ Strong background in synthesizing data and analytics in a large (Fortune 500), complex, and highly regulated environment
+ Strong technical background including database and business intelligence skills
+ Strong communication skills through written and oral presentations
+ Snowflake experience a plus
+ Experience migrating to the cloud
**This position is intended to be onsite, now or in the near future.** Associates will have regular work hours, including full days in the office three or more days a week. The manager will set the work schedule for this position, including in-office expectations. Regions will not provide relocation assistance for this position, and relocation would be at your expense. The locations available for this role are **Birmingham, AL; Atlanta, GA; or Charlotte, NC.**
**Position Type**
Full time
**Compensation Details**
Pay ranges are job specific and are provided as a point-of-market reference for compensation decisions. Other factors which directly impact pay for individual associates include: experience, skills, knowledge, contribution, job location and, most importantly, performance in the job role. As these factors vary by individuals, pay will also vary among individual associates within the same job.
The target information listed below is based on the Metropolitan Statistical Area Market Range for where the position is located and level of the position.
**Job Range Target:**
**_Minimum:_**
$135,668.03 USD
**_Median:_**
$166,971.00 USD
**Incentive Pay Plans:**
Opportunity to participate in the Long Term Incentive Plan.
**Benefits Information**
Regions offers a benefits package that is flexible, comprehensive and recognizes that "one size does not fit all" for benefits-eligible associates. Listed below is a synopsis of the benefits offered by Regions for informational purposes, which is not intended to be a complete summary of plan terms and conditions.
+ Paid Vacation/Sick Time
+ 401K with Company Match
+ Medical, Dental and Vision Benefits
+ Disability Benefits
+ Health Savings Account
+ Flexible Spending Account
+ Life Insurance
+ Parental Leave
+ Employee Assistance Program
+ Associate Volunteer Program
Please note, benefits and plans may be changed, amended, or terminated with respect to all or any class of associate at any time. To learn more about Regions' benefits, please click or copy the link below to your browser.
**Details**
Charlotte Uptown
**Location:**
Charlotte, North Carolina
Equal Opportunity Employer/including Disabled/Veterans
Job applications at Regions are accepted electronically through our career site for a minimum of five business days from the date of posting. Job postings for higher-volume positions may remain active for longer than the minimum period due to business need and may be closed at any time thereafter at the discretion of the company.

Sr. Cloud Data Engineer/Snowflake

30309 Midtown Atlanta, Georgia Regions Bank

Posted 2 days ago

Job Description

Thank you for your interest in a career at Regions. At Regions, we believe associates deserve more than just a job. We believe in offering performance-driven individuals a place where they can build a career --- a place to expect more opportunities. If you are focused on results, dedicated to quality, strength and integrity, and possess the drive to succeed, then we are your employer of choice.
Regions is dedicated to taking appropriate steps to safeguard and protect private and personally identifiable information you submit. The information that you submit will be collected and reviewed by associates, consultants, and vendors of Regions in order to evaluate your qualifications and experience for job opportunities and will not be used for marketing purposes, sold, or shared outside of Regions unless required by law. Such information will be stored in accordance with regulatory requirements and in conjunction with Regions' Retention Schedule for a minimum of three years. You may review, modify, or update your information by visiting and logging into the careers section of the system.
**Job Description:**
At Regions, the Data Engineer focuses on the evaluation, design, and execution of data structures, processes, and logic to deliver business value through operational and analytical data assets. The Data Engineer uses advanced data design and technical skills to work with business subject matter experts to create enterprise data assets utilizing state of the art data techniques and tools.
**Primary Responsibilities**
+ Partners with Regions Technology partners to design, build, and maintain the data structures and systems that support Data and Analytics and Data Product use cases
+ Builds data pipelines to collect and arrange data and manage data storage in Regions' big data environment
+ Builds robust, testable programs for moving, transforming, and loading data using big data tools such as Spark
+ Coordinates design and development with Data Products Partners, Data Scientists, Data Management, Data Modelers, and other Technical partners to construct strategic and tactical data stores
+ Ensures data is prepared, arranged and ready for each defined business use case
+ Designs and deploys frameworks and microservices to serve data assets to data consumers
+ Collaborates and aligns with technical and non-technical stakeholders to translate customer needs into data design requirements, and works to deliver world-class visualizations and data stories while ensuring data quality and integrity
+ Provides consultation to all areas of the organization that plan to use data to make decisions
+ Supports team members in the development of information delivery and aids in the automation of data products
+ Acts as a trusted adviser and partner to business leads, assisting in the identification of business needs and data opportunities, understanding key drivers of performance, interpreting business case data drivers, turning data into business value, and participating in the guidance of the overall data and analytics strategy
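The "robust, testable programs for moving, transforming, and loading data" responsibility above is easiest to picture at a small scale. As a minimal, hedged sketch (plain Python rather than Spark, and every field name is hypothetical, not Regions code), a unit-testable transform step might look like:

```python
from dataclasses import dataclass


@dataclass
class Transaction:
    account_id: str
    amount_cents: int


def transform(raw_rows):
    """Standardize raw rows into Transaction records, skipping bad input.

    Each raw row is a dict like {"account": " A-1 ", "amount": "12.50"};
    rows with malformed amounts are skipped rather than crashing the run.
    """
    out = []
    for row in raw_rows:
        try:
            cents = round(float(row["amount"]) * 100)
        except (KeyError, ValueError):
            continue  # quarantine/skip malformed rows
        out.append(Transaction(account_id=row.get("account", "").strip(),
                               amount_cents=cents))
    return out
```

Because the transform is a pure function over plain records, it can be unit-tested in isolation; the same logic ports to a Spark UDF or `DataFrame` transformation when run at scale.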
This position is exempt from timekeeping requirements under the Fair Labor Standards Act and is not eligible for overtime pay.
**Requirements**
+ Bachelor's degree and eight (8) years of experience in a quantitative/analytical/STEM field or technical related field
+ Or Master's degree and six (6) years of experience in a quantitative/analytical/STEM field or technical related field
+ Or Ph.D. and four (4) years of experience in a quantitative/analytical/STEM field
+ Five (5) years of working programming experience in Python/PySpark, Scala, SQL
+ Five (5) years of working experience in Big Data Technology in Hadoop, Hive, Impala, Spark, or Kafka
**Preferences**
+ Background in Big Data Engineering and Advanced Data Analytics
+ Experience developing solutions for the financial services industry
+ Experience in Agile Software Development
+ Experience or exposure to cloud technologies and migrations
+ Prior banking or financial services experience
**Skills and Competencies**
+ Experience building data solutions at scale
+ Experience designing and building relational data structures in multiple environments
+ Experience with DevOps principles, CI/CD, and the Software Development Lifecycle
+ Experience with NoSQL databases
+ Experience with enterprise-scale data lakehouses
+ Proven track record of delivering operational data solutions, including report- and model-ready data assets
+ Significant experience working with senior executives in the use of data, reporting and visualizations to support strategic and operational decision making
+ Strong ability to transform and integrate complex data from multiple sources into accessible, understandable, and usable data assets and frameworks
+ Strong background in synthesizing data and analytics in a large (Fortune 500), complex, and highly regulated environment
+ Strong technical background including database and business intelligence skills
+ Strong communication skills through written and oral presentations
+ Snowflake experience a plus
+ Experience migrating to the cloud
**This position is intended to be onsite, now or in the near future.** Associates will have regular work hours, including full days in the office three or more days a week. The manager will set the work schedule for this position, including in-office expectations. Regions will not provide relocation assistance for this position, and relocation would be at your expense. The locations available for this role are **Birmingham, AL; Atlanta, GA; or Charlotte, NC.**
**Position Type**
Full time
**Compensation Details**
Pay ranges are job specific and are provided as a point-of-market reference for compensation decisions. Other factors which directly impact pay for individual associates include: experience, skills, knowledge, contribution, job location and, most importantly, performance in the job role. As these factors vary by individuals, pay will also vary among individual associates within the same job.
The target information listed below is based on the Metropolitan Statistical Area Market Range for where the position is located and level of the position.
**Job Range Target:**
**_Minimum:_**
$135,668.03 USD
**_Median:_**
$166,971.00 USD
**Incentive Pay Plans:**
Opportunity to participate in the Long Term Incentive Plan.
**Benefits Information**
Regions offers a benefits package that is flexible, comprehensive and recognizes that "one size does not fit all" for benefits-eligible associates. Listed below is a synopsis of the benefits offered by Regions for informational purposes, which is not intended to be a complete summary of plan terms and conditions.
+ Paid Vacation/Sick Time
+ 401K with Company Match
+ Medical, Dental and Vision Benefits
+ Disability Benefits
+ Health Savings Account
+ Flexible Spending Account
+ Life Insurance
+ Parental Leave
+ Employee Assistance Program
+ Associate Volunteer Program
Please note, benefits and plans may be changed, amended, or terminated with respect to all or any class of associate at any time. To learn more about Regions' benefits, please click or copy the link below to your browser.
**Details**
Charlotte Uptown
**Location:**
Charlotte, North Carolina
Equal Opportunity Employer/including Disabled/Veterans
Job applications at Regions are accepted electronically through our career site for a minimum of five business days from the date of posting. Job postings for higher-volume positions may remain active for longer than the minimum period due to business need and may be closed at any time thereafter at the discretion of the company.

Cleared Cloud Data Engineer (4610)

20080 Washington, District Of Columbia SMX

Posted 3 days ago

Job Description

Cleared Cloud Data Engineer (4610) at SMX, Washington, DC
SMX is seeking a Cloud Engineer to join our team. This role is 100% remote. We are looking for someone who has hands-on experience with DevOps, data and code pipeline deployments, creating automation scripts, and data engineering, along with experience managing data and AWS technologies related to Data Analytics, Big Data & Analytics, Data Warehouses, Data Lakes, etc.
This person will collaborate closely with data engineers/developers on data transformation and loading, data cleansing, etc., providing support on managing environments, schemas, roles, access, and more. In the future, there will be opportunities to produce creative solutions, create data pipelines, and work on dimensional data modeling and data analytics. This role engages with customer, third-party, and SMX engineers/architects to build solutions. You will be part of one of our most impactful teams, working with a great client who loves data as much as we do. This is a remote role supporting a Washington, DC-based team.
**Essential Duties & Responsibilities:**
+ Technical support with AWS core services and data engineering products
+ Deployment of new data pipelines, update code via pipelines, and maintain ETL components.
+ Ability to communicate with data engineering, data scientists and architects clearly both verbally and in written documentation.
+ Ability to develop scripted solutions for deploying ETL and data pipelines using Python, PowerShell, Bash, Terraform, and other automation tools as the client's needs arise.
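The "scripted solutions for deploying ETL and data pipelines" duty above usually begins with fail-fast validation of a pipeline definition before anything reaches Terraform or AWS. The sketch below assumes an invented JSON schema (none of these keys come from SMX) purely to illustrate the pattern:

```python
import json

# Hypothetical required fields for a pipeline definition (illustrative only)
REQUIRED_KEYS = {"name", "source", "target", "schedule"}


def validate_pipeline(definition: str):
    """Parse a JSON pipeline definition and fail fast on missing fields.

    Returns the parsed dict; raises ValueError listing any missing keys,
    so a CI job can reject a bad deploy before it touches infrastructure.
    """
    cfg = json.loads(definition)
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        raise ValueError(f"pipeline definition missing: {sorted(missing)}")
    return cfg
```

Wiring a check like this into the deploy script means a malformed definition fails in seconds in CI rather than mid-deployment.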
**Required Skills & Experience:**
+ Bachelor's Degree in Statistics, Science, Computer Science, Management Information Systems, Engineering, Business Analytics disciplines, or related area
+ 3+ years of experience with AWS administration, Redshift, and SQL
+ Clearance Requirement: Must be able to obtain Public Trust agency clearance
+ Proficient in AWS cloud technologies such as EC2, S3, Redshift, Lambda, etc.
+ Experience with administration of users, roles, security policies, etc. on AWS including via federated identity.
+ Experience with Redshift administration and management
+ Solid command of SQL syntax and use cases
+ Bash and/or PowerShell scripting
+ Ability to read and comprehend Python code
+ Experience with ETL tools
+ Experience with Reporting/Analytics tools
+ Data manipulation & analysis
+ Unstructured, semi-structured, and RDBMS data management
+ Knowledgeable on SageMaker deployments and management
**Desired Skills & Experience**
+ Certification related to any of the AWS technologies specified above
+ Understanding of DevOps practices
+ Version control systems (Git, TFS, Azure DevOps, etc.)
+ Knowledge of IaC products such as CloudFormation or Terraform
+ Python development experience
+ Some network knowledge
+ Some Windows or Linux system administration knowledge
**Application Deadline:** 11-7-2025
#LI-SA-1
The SMX salary determination process takes into account a number of factors, including but not limited to, geographic location, Federal Government contract labor categories, relevant prior work experience, specific skills, education and certifications. At SMX, one of our Core Values is to Invest in Our People so we offer a competitive mix of compensation, learning & development opportunities, and benefits. Some key components of our robust benefits include health insurance, paid leave, and retirement.
The proposed salary for this position is:
$105,100-$176,700 USD
At SMX®, we are a team of technical and domain experts dedicated to enabling your mission. From priority national security initiatives for the DoD to highly assured and compliant solutions for healthcare, we understand that digital transformation is key to your future success.
We share your vision for the future and strive to accelerate your impact on the world. We bring both cutting edge technology and an expansive view of what's possible to every engagement. Our delivery model and unique approaches harness our deep technical and domain knowledge, providing forward-looking insights and practical solutions to power secure mission acceleration.
SMX is an Equal Opportunity employer including disabilities and veterans.
Selected applicant may be subject to a background investigation and/or education verification.
SMX does not sponsor a new applicant for employment authorization or immigration related support for this position (i.e. H1B, F-1 OPT, F-1 STEM OPT, F-1 CPT, J-1, TN, E-2, E-3, L-1 and O-1, or any EADs or other forms of work authorization that require immigration support from an employer).


Cloud Data & AI Engineer

92713 Irvine, California OSI Digital

Posted today

Job Description

Job Title: Cloud Data & AI Engineer
Employment: Full-time

At OSI Digital Inc, we accelerate our clients' digital transformation journeys by delivering modern data solutions, enabling them to unlock the full potential of their data with scalable cloud platforms, intelligent analytics, and AI-driven solutions. With deep expertise across data engineering, cloud platforms, advanced analytics, and AI/ML, our teams bring both technical mastery and business acumen to every engagement. We don't just implement tools; we build scalable, future-ready solutions that drive measurable outcomes for our clients.

Role Summary: We are seeking a highly skilled and results-driven Cloud Data and AI Engineer with a strong background in modern cloud data architecture, specifically on Snowflake, and hands-on experience developing data solutions in Power BI and implementing AI solutions.
The ideal candidate combines strong data engineering, integration, and BI expertise with hands-on AI project execution, supporting OSI's reputation for high-impact consulting in the cloud and digital transformation space, and will be a strong communicator capable of implementing projects from the ground up.

Key Responsibilities:
  • Lead the design, development, and implementation of highly scalable and secure data warehouse solutions on Snowflake, including schema design, data loading, performance tuning, and optimizing cloud costs.
  • Design and build robust, efficient data pipelines (ETL/ELT) using advanced data engineering techniques. This includes hands-on experience in data integration via direct APIs (REST/SOAP) and working with various integration tools (e.g., Talend, Stitch, Fivetran, or native cloud services).
  • Develop and implement high-impact visual analytics and semantic models in Power BI. Apply advanced features such as DAX, Row-Level Security (RLS), and dashboard deployment pipelines.
  • Apply proficiency in Python/R and ML frameworks (scikit-learn, TensorFlow, PyTorch), along with MLOps concepts, to deploy models into production environments on cloud platforms.
  • Develop and deploy AI/ML solutions using Python, Snowpark, or cloud-native ML services (AWS SageMaker, Azure ML).
  • Exposure to LLM/GenAI projects (chatbot implementations, NLP, recommendation systems, anomaly detection) is highly desirable.
  • Implement and manage data solutions utilizing core services on at least one major cloud platform (AWS or Azure).
  • Demonstrate exceptional communication and articulation skills to engage with clients, gather requirements, and lead project delivery from ground up (inception to final deployment).
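As a concrete, hedged illustration of the "data integration via direct APIs" responsibility above, the sketch below flattens a nested JSON API response into warehouse-ready rows for an ELT load; the payload shape and every field name are invented for the example, not any client's API:

```python
def flatten_orders(payload):
    """Flatten a nested API payload into flat dicts for warehouse loading.

    Assumes a response shaped like
    {"orders": [{"id": ..., "customer": {"id": ...},
                 "lines": [{"sku": ..., "qty": ...}, ...]}]}
    and emits one output row per order line item.
    """
    rows = []
    for order in payload.get("orders", []):
        for line in order.get("lines", []):
            rows.append({
                "order_id": order["id"],
                "customer_id": order["customer"]["id"],
                "sku": line["sku"],
                "qty": line["qty"],
            })
    return rows
```

Flat rows like these can then be staged to S3/Azure Blob and loaded with a COPY statement, or handed to a tool such as Fivetran or Snowpark for the load step.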
Required Qualifications:
  • Minimum of 4 years of professional experience in data engineering, consulting, and solution delivery.
  • Bachelor's degree in Computer Science, Engineering, or a related technical field. A master's degree in a relevant field is highly preferred.
  • Strong, hands-on experience in end-to-end Snowflake project implementation. Professional Snowflake certifications are preferred.
  • Expertise in designing, building, and maintaining ELT/ETL pipelines and data workflows, with a solid understanding of data warehousing best practices.
  • Hands-on experience implementing dashboards in Power BI, including DAX and RLS. Professional certifications in Power BI are preferred.
  • Proficiency in Python, with demonstrable experience deploying at least one AI/ML project (e.g., Snowpark, Databricks, SageMaker, Azure ML) including feature engineering, model deployment, and MLOps practices.
  • Experience with machine learning frameworks such as scikit-learn, TensorFlow, or PyTorch, and hands-on exposure to production deployments.
  • Familiarity with projects involving LLM/Generative AI (e.g., chatbots, NLP, recommendation systems, and anomaly detection).
  • Hands-on experience working with cloud platforms, specifically AWS or Azure.
  • Excellent verbal and written communication, presentation, and client-facing consulting skills, with a proven track record of successfully leading projects from inception.


Preferred (Added Advantage) Qualifications:
  • Experience with Tableau or other leading BI tools.
  • Working knowledge of Databricks (e.g., Spark, Delta Lake).
  • Experience or strong understanding of Data Science methodologies and statistical modeling.
  • Relevant industry certifications, including Power BI, Snowflake, Databricks and AWS/Azure Data/AI credentials.

Data/Cloud Engineer

20151 Chantilly, Virginia KBR

Posted 16 days ago

Job Description

Title:
Data/Cloud Engineer
Belong. Connect. Grow. with KBR!
KBR's National Security Solutions team provides high-end engineering and advanced technology solutions to our customers in the intelligence and national security communities. In this position, your work will have a profound impact on the country's most critical role - protecting our national security.
Why Join Us?
+ Innovative Projects: KBR's work is at the forefront of engineering, logistics, operations, science, program management, mission IT and cybersecurity solutions.
+ Collaborative Environment: Be part of a dynamic team that thrives on collaboration and innovation, fostering a supportive and intellectually stimulating workplace.
+ Impactful Work: Your contributions will be pivotal in designing and optimizing defense systems that ensure national security and shape the future of space defense.
KBR is seeking a highly qualified Data/Cloud Engineer to provide support for our customers in Chantilly, VA. This effort includes support to innovation labs developing new tools, models, and algorithms to operate and exploit satellite space and ground systems.
This is a contingent position based on contract award.
Responsibilities:
+ Design/develop cloud-based data storage strategies by leveraging existing cloud services
+ Develop and maintain cloud-based data storage and dissemination retrieval services
+ Migrate on-premises legacy applications, services, and related data to existing cloud infrastructure
+ Identify best practices for cloud services monitoring and management
+ Research and implement cloud services to support cloud apps and maintain cloud services
+ Use scientific methods, processes, algorithms, and systems to extract insights from structured and unstructured data
+ Apply mathematical, problem-solving, and coding skills to manage big data, extracting valuable insights
+ Assess others' R&D proposals for feasibility and progress towards mission goals
+ Other duties as assigned
Required Qualifications:
+ Active TS/SCI with current CI polygraph
+ At least 10 years of experience serving as a Data/Cloud Systems Engineer
+ Bachelor's degree in engineering or related quantitative science field
+ Experience using AWS tools (S3, Lambda, EC2, Cloudwatch, etc)
+ Experience using Microsoft Azure
+ Demonstrated expertise and experience in developing intelligent algorithms capable of learning, analyzing and predicting future events, turning those algorithms into AI models and systems, and then testing and maintaining them
+ Ability to prepare and use large data sets for analysis as part of creating/developing robust data processing systems
+ Generate data exploitation algorithms that can learn, assess, analyze, and organize data for testing while optimizing the Machine Learning (ML) to develop high performance ML models
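As a toy stand-in for the "data exploitation algorithms" described above (production work would use proper ML tooling and far richer models), a first-pass z-score anomaly screen can be written in a few lines of pure Python:

```python
from statistics import mean, stdev


def flag_anomalies(values, threshold=3.0):
    """Return indices of values more than `threshold` sample standard
    deviations from the mean -- a classic cheap screen run before
    heavier ML models are brought to bear on a data set."""
    if len(values) < 2:
        return []
    m, s = mean(values), stdev(values)
    if s == 0:
        return []  # constant series: nothing stands out
    return [i for i, v in enumerate(values) if abs(v - m) / s > threshold]
```

Screens like this are often used to triage large telemetry streams so that only suspicious windows are passed on to trained models.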
Desired Qualifications/Skills:
+ Master's degree in engineering or related quantitative science field
+ Experience leading teams resulting in successful verification and test strategies/solutions for complex programs
+ Experience with cloud development processes
+ Experience with industry system engineering practices
+ Strong people skills to influence, collaborate and work effectively with all levels of internal and external leadership
Security Requirements: Must have an active TS/SCI with a current CI Polygraph
Belong, Connect and Grow at KBR
At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward, being a People First company. That commitment is central to our team of teams philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together.
KBR is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.