4,793 GCP jobs in the United States
GCP Engineer
Posted 14 days ago
Job Description
An employer is looking to add a Google Cloud Platform Engineer to their team. This person will design, implement, and manage secure, scalable, and resilient infrastructure within Google Cloud using Assured Workloads, ensuring all solutions meet CMMC Level 2 (L2) compliance and regulatory security requirements. They will design and deploy GCP infrastructure with Assured Workloads using Terraform and CI/CD pipelines, ensuring CMMC L2 controls are implemented. They will manage core GCP services configured with Assured Workloads controls (IAM, VPC, GKE, DNS, firewall, load balancing) with a focus on regulated compliance requirements. They will automate provisioning and enforce security, compliance, and governance policies aligned with CMMC L2. They will address infrastructure issues, optimize performance, and ensure cost efficiency within compliance boundaries. They will provide architectural guidance and support for platform-level services in a regulated environment, and will design scalable, highly available, and redundant networking solutions for regulated workloads.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
3+ years in GCP infrastructure engineering (preferably in regulated sectors)
Proficiency with Terraform, Python/Bash, and DevOps tools (Jenkins, GitLab, Cloud Build)
Strong understanding of cloud networking, IAM, and security practices for regulated workloads.
Experience with Kubernetes, GKE, and container orchestration in regulated environments
Google Cloud Professional Certification (Architect, DevOps Engineer)
Active Secret clearance
Experience with hybrid/multi-cloud in regulated settings
Familiarity with Apigee X, Anthos, service mesh, and FedRAMP/CMMC frameworks
Strong communication/documentation skills for compliance reporting
Ability to mentor junior engineers and contribute to secure platform strategy
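In practice, much of the "automate provisioning and enforce security, compliance, and governance policies" work described above is done as policy-as-code checks that run before infrastructure changes are applied. As a hedged illustration (the rule names, allow-list, and thresholds below are invented for the example and are not actual CMMC L2 control text), a minimal pre-apply guardrail might look like:

```python
# Illustrative policy-as-code check: validate planned firewall rules against
# simple guardrails before they are applied. The constraints here are
# hypothetical examples, not actual CMMC L2 control requirements.

FORBIDDEN_SOURCE = "0.0.0.0/0"   # no rule may allow ingress from anywhere
ALLOWED_PORTS = {443, 22}        # hypothetical port allow-list

def violations(rules):
    """Return human-readable violations found in a planned rule set."""
    found = []
    for rule in rules:
        if rule["direction"] == "INGRESS" and FORBIDDEN_SOURCE in rule["sources"]:
            found.append(f"{rule['name']}: open ingress from {FORBIDDEN_SOURCE}")
        bad_ports = set(rule["ports"]) - ALLOWED_PORTS
        if bad_ports:
            found.append(f"{rule['name']}: ports not on allow-list: {sorted(bad_ports)}")
    return found

planned = [
    {"name": "allow-ssh", "direction": "INGRESS", "sources": ["10.0.0.0/8"], "ports": [22]},
    {"name": "allow-all", "direction": "INGRESS", "sources": ["0.0.0.0/0"], "ports": [80]},
]
print(violations(planned))
```

A real deployment would express checks like these in the CI/CD pipeline (for example, against a Terraform plan) rather than as a standalone script.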
INTL - GCP Engineer

Posted 9 days ago
Job Description
On a typical day, you will support and maintain our Google Cloud Platform (GCP) environments, assisting with the deployment and management of virtual machines and cloud resources. You will work on automation and development tasks, often using Python scripting, and help manage data operations involving SQL and BigQuery. Your role will involve collaborating with team members to resolve cloud-related issues, contributing to ongoing cloud projects, and ensuring the security and compliance of our systems. You will also participate in process improvements and may occasionally interact with Atlassian tools as an end user, all while working remotely and aligning your schedule with Pacific Time hours.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
- Hands-on experience with Google Cloud Platform (GCP).
- Linux administration skills.
- Python scripting skills.
- Automation support experience.
- Experience with VMs, SQL, and BigQuery.
- Familiarity with Atlassian products as an end user.
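Because the role pairs Python scripting with SQL and BigQuery work, a routine small task is generating standard SQL programmatically. A minimal sketch, with hypothetical table and column names, building a BigQuery-style deduplication query:

```python
# Sketch of a common Python-plus-SQL chore: generating a BigQuery standard
# SQL query that keeps only the newest row per key. The project, dataset,
# table, and column names are hypothetical.

def dedup_query(table: str, key: str, order_col: str) -> str:
    """Build a ROW_NUMBER()-based dedup query keeping the latest row per key."""
    return (
        f"SELECT * EXCEPT(rn) FROM (\n"
        f"  SELECT *, ROW_NUMBER() OVER (\n"
        f"    PARTITION BY {key} ORDER BY {order_col} DESC) AS rn\n"
        f"  FROM `{table}`)\n"
        f"WHERE rn = 1"
    )

print(dedup_query("my-project.analytics.events", "user_id", "event_ts"))
```

In a real environment the rendered string would be submitted through the BigQuery client library or scheduled via automation; this sketch only shows the query-building step.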
GCP Data Engineer
Posted today
Job Description
Role: GCP Data Engineer
Location: Hartford, CT (hybrid)
Must Have Skills:
7+ years of experience with GCP, Python, PySpark, and SQL
GCP services: BigQuery, Dataproc, Pub/Sub, and Dataflow
The client will conduct a CoderPad interview, so strong hands-on Python coding is required.
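Since the screen is a live Python exercise, fluency with small data-manipulation problems matters. A representative warm-up in that style (a hypothetical exercise, not the client's actual question):

```python
# Representative warm-up for a live Python coding screen (hypothetical
# exercise, not the client's actual question): count events per user
# from an iterable of (user_id, event) pairs.

from collections import Counter

def events_per_user(log):
    """Return a mapping of user_id -> number of events for that user."""
    return Counter(user for user, _event in log)

log = [("u1", "click"), ("u2", "view"), ("u1", "view")]
print(events_per_user(log))  # Counter({'u1': 2, 'u2': 1})
```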
GCP Data Engineer
Posted today
Job Description
Role Overview:
We are seeking a highly skilled and technically proficient Lead Data Engineer/Architect with deep expertise in Customer Data Platforms (CDPs), Google Cloud Platform (GCP), and data pipeline orchestration tools like Airflow. This role is critical to driving our CRM, personalization, and measurement initiatives, and will require someone who can hit the ground running with minimal onboarding.
The ideal candidate will bring strong retail domain knowledge, hands-on experience with big data ecosystems, and a proven ability to architect and optimize scalable, modular data solutions. You will be expected to lead technical conversations, resolve complex data issues, and mentor junior team members while collaborating cross-functionally with marketing, product, and engineering teams.
Key Responsibilities:
- Lead analytics and engineering efforts across CRM, CDP, and personalization programs, enabling segmentation, targeting, lifecycle marketing, and campaign execution.
- Architect and optimize data pipelines using Airflow, BigQuery, and other GCP-native tools to support scalable and reusable workflows.
- Analyze and resolve data quality issues across platforms like Amperity, Acxiom, and other CDPs, ensuring accurate customer profiles and campaign performance.
- Design and implement modular, production-grade code for data ingestion, transformation, and measurement workflows.
- Develop and maintain dashboards and reporting frameworks using GCP-native solutions (e.g., Looker, BigQuery).
- Collaborate with cross-functional teams to translate business needs into technical solutions and data products.
- Lead strategic data discussions with senior stakeholders, influencing decisions through actionable insights and technical expertise.
- Serve as a Subject Matter Expert (SME) in data engineering and analytics, mentoring junior analysts and promoting best practices in data governance, storytelling, and campaign measurement.
- Apply agentic AI techniques to automate insight generation and enhance personalization and marketing intelligence.
Required Qualifications:
- Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, or a related field.
- 15+ years of experience in data analytics and engineering, with at least 3 years in a lead or SME role.
- Proven experience in the retail industry, with a deep understanding of customer behavior, merchandising, and omnichannel strategies.
- Hands-on expertise with GCP, including BigQuery, Cloud Composer (Airflow), Dataflow, and Looker.
- Strong experience with Customer Data Platforms such as Amperity, Acxiom, or similar.
- Proficiency in SQL, Python, and modular code development for analytics and engineering workflows.
- Deep understanding of CRM data structures, personalization logic, campaign execution, and marketing measurement.
- Ability to work independently, take ownership, and deliver results with minimal guidance.
- Excellent communication skills, with the ability to present complex data insights to senior stakeholders.
- Experience with agentic AI frameworks or similar technologies is a strong plus.
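The "resolve data quality issues … ensuring accurate customer profiles" responsibility above is, at its core, identity resolution: stitching duplicate customer records into one profile when they share an identifier. A conceptual union-find sketch of that idea (a generic illustration, not how Amperity or Acxiom actually implement it):

```python
# Conceptual identity-resolution sketch: group customer records that share
# an identity key (email or phone) into one profile. Generic union-find
# illustration only; real CDPs use far more sophisticated matching.

def stitch(records):
    """records: dicts with an 'id' and a set of identity 'keys'.
    Returns sorted groups of record ids that belong to one customer."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    key_owner = {}
    for rec in records:
        find(rec["id"])  # register the record even if it matches nothing
        for key in rec["keys"]:
            if key in key_owner:
                union(rec["id"], key_owner[key])
            else:
                key_owner[key] = rec["id"]

    groups = {}
    for rec in records:
        groups.setdefault(find(rec["id"]), []).append(rec["id"])
    return sorted(sorted(g) for g in groups.values())

records = [
    {"id": "r1", "keys": {"a@x.com"}},
    {"id": "r2", "keys": {"a@x.com", "555-0100"}},
    {"id": "r3", "keys": {"555-0199"}},
]
print(stitch(records))  # [['r1', 'r2'], ['r3']]
```

Here r1 and r2 merge because they share an email, while r3 stays a separate profile; the same transitive-matching logic scales up in BigQuery or Spark for production volumes.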
GCP Data Engineer

Posted 3 days ago
Job Description
As a **GCP Data Engineer**, you will make an impact by designing and implementing scalable data pipelines and orchestration workflows on Google Cloud Platform (GCP). You will be a valued member of the Data Engineering team and work collaboratively with cloud architects, DevOps engineers, and business stakeholders to deliver high-performance data solutions.
**In this role, you will:**
+ Build and maintain streaming and batch data pipelines using GCP services such as Dataflow, Pub/Sub, and Cloud Storage
+ Develop and manage workflow orchestration using Apache Airflow and Cloud Composer
+ Design and optimize relational databases, particularly Cloud SQL and PostgreSQL
+ Collaborate with DevOps teams to implement CI/CD pipelines and containerized deployments using Docker and Kubernetes
+ Apply data modeling best practices to support analytics and business intelligence initiatives
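The streaming side of the first bullet rests on event-time windowing, the core idea behind Dataflow and Apache Beam pipelines. A plain-Python sketch of fixed non-overlapping windows (conceptual only; a real pipeline would use Beam's `FixedWindows` rather than hand-rolled bucketing):

```python
# Conceptual illustration of fixed event-time windowing, the idea behind
# Dataflow/Beam streaming aggregation. Plain Python to show the mechanics;
# not a substitute for an actual Beam pipeline.

from collections import defaultdict

WINDOW_SECONDS = 60

def window_counts(events):
    """events: iterable of (event_time_epoch_seconds, payload).
    Returns {window_start: count} over fixed non-overlapping windows."""
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # bucket to window start
        counts[window_start] += 1
    return dict(counts)

events = [(0, "a"), (59, "b"), (60, "c"), (125, "d")]
print(window_counts(events))  # {0: 2, 60: 1, 120: 1}
```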
**Work model:**
We strive to provide flexibility wherever possible. Based on this role's business requirements, this is a remote position open to qualified applicants in the United States. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
The working arrangements for this role are accurate as of the date of posting. This may change based on the project you're engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.
**What you need to have to be considered:**
+ Proven experience as a Data Engineer with a focus on Google Cloud Platform
+ Hands-on expertise with GCP services: Dataflow, Pub/Sub, Cloud Storage, Cloud Composer
+ Strong proficiency in workflow orchestration using Apache Airflow
+ Experience with relational databases such as Cloud SQL or PostgreSQL
+ Solid understanding of ETL processes, data modeling, and pipeline architecture
**These will help you stand out:**
+ Familiarity with DevOps practices, CI/CD pipelines, and containerization using Docker and Kubernetes
+ Experience working in agile environments and cross-functional teams
+ GCP certifications (e.g., Professional Data Engineer or Cloud Architect)
+ Knowledge of data governance and security best practices in cloud environments
We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
**Salary and Other Compensation:**
Applications will be accepted until October 27, 2025.
The annual salary for this position is between $100,000 and $117,000, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
**Benefits:** Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
+ Medical/Dental/Vision/Life Insurance
+ Paid holidays plus Paid Time Off
+ 401(k) plan and contributions
+ Long-term/Short-term Disability
+ Paid Parental Leave
+ Employee Stock Purchase Plan
**Disclaimer:** The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
GCP Cloud Engineer

Posted 3 days ago
Job Description
**Location: Hartford, CT**
- Design, implement, and manage scalable cloud infrastructure on **Google Cloud Platform (GCP).**
- Develop and maintain **Infrastructure as Code (IaC)** using Terraform for automated provisioning and configuration.
- Write clean, efficient, and reusable Python scripts for automation, monitoring, and integration tasks.
- Collaborate with cross-functional teams in an Agile/Scrum environment to deliver cloud solutions.
- Ensure high availability, performance, and security of cloud-based systems.
- Implement CI/CD pipelines using tools like Cloud Build, Jenkins, or GitHub Actions.
- Knowledge of components and templates for Vertex AI workflows.
- Monitor and troubleshoot cloud infrastructure using Cloud Monitoring (formerly Stackdriver), Prometheus, or similar tools.
- Optimize cloud resource usage and cost through effective architecture and automation.
- Maintain documentation for infrastructure, processes, and best practices.
- Perform code reviews and provide mentorship to junior engineers.
- Collaborate with stakeholders to translate business requirements into AI solutions.
- Stay current with **GCP** innovations, especially in AI/ML, and recommend strategic improvements.
- Participate in sprint planning, daily stand-ups, and retrospectives.
- Demonstrate strong problem-solving skills and a proactive approach to cloud engineering challenges.
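Much of the "clean, efficient, and reusable Python scripts for automation, monitoring, and integration" work above comes down to small, well-factored helpers. A hedged sketch of one such reusable helper, retrying a flaky call with exponential backoff (the error type, attempt count, and delays are illustrative; production code would also cap total elapsed time and log each attempt):

```python
# Illustrative automation helper: retry a transient-failure-prone call with
# exponential backoff. Parameters and the transient error type are
# examples, not values mandated by any particular GCP API.

import time

def with_retries(call, attempts=4, base_delay=0.01, transient=(TimeoutError,)):
    """Invoke call(); on a transient error, sleep base_delay * 2**i and retry."""
    for i in range(attempts):
        try:
            return call()
        except transient:
            if i == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** i))

# Demo: a call that fails twice with a transient error, then succeeds.
state = {"tries": 0}
def flaky():
    state["tries"] += 1
    if state["tries"] < 3:
        raise TimeoutError("transient")
    return "ok"

print(with_retries(flaky))  # ok
```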
**Skill Matrix (Mandatory Skills)**
- Google Infra: 7 years
- Google Solutions / CI/CD pipelines: 7 years
- IaC / Terraform: 7 years
- Python Scripting: 5 years
- Vertex AI: 5 years
Regards
Preethy Nathan
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.