14,018 GCP Engineer jobs in the United States
Cloud GCP Engineer
Posted 7 days ago
Job Description
***Onsite: Phoenix, AZ
***Max rate: $65
Cloud GCP Engineer
Job Summary:
Required Skills: Big Data, GCP
Experience: 10 to 13 years
Shift: Day, 9 AM to 7 PM EST
Project Planning and Setup
Understand the project scope; identify activities, tasks, task-level estimates, schedule, dependencies, and risks; and provide inputs to the Module Lead for review
Provide inputs on testing strategy, configuration, deployment, and hardware/software requirements
Review the plan and provide feedback on gaps, timelines, and execution feasibility as required by the project
Participate in knowledge transfer (KT) sessions conducted by the customer and other business teams, and provide feedback on requirements
Understanding and Analysis
Analyze functional and non-functional requirements, and seek clarification to better understand them
Based on an understanding of upstream and downstream systems, provide feedback and inputs on gaps in the requirements and on their technical feasibility
Design
Prepare the LLD (detailed design) documents based on the HLD and a briefing from the Module Lead
Seek inputs from the developers on specific modules as applicable
Consolidate all modules and provide them to the Module Lead, architects, and designers for review
Suggest changes in design on technical grounds
Develop a component inventory for the code to be developed, tying it to the non-functional requirements
Sample data to understand its character and quality (project-dependent, in the absence of a data analyst or designer)
Identify tools and technologies to be used in the project, as well as reusable objects that could be customized for it
Testing Management
Develop unit test cases for each module
Conduct, or guide the conduct of, unit and integration testing, and fix defects
Review and approve code to be moved to the testing environment
Provide support to the QA team and coordinate across the various phases of testing
Address queries raised by QA within defined timelines; investigate critical defects and establish the need for fixes; raise issues to leads/QA; report defect status per the project's standard process within agreed timelines
Share revised code with the supervisor for review
Assist the team lead and project manager
Job Location : Primary Location : USAZPHOC01(ITUSA,Phoenix - AZ USA, CLT)
Job Type : Business Associate (50CW00)
Demand Requires Travel? : No
Certification(s) Required : NA
Sr GCP Engineer
Posted today
Job Description
Infrastructure Automation & Management:
Design, implement, and maintain scalable, reliable, and secure cloud infrastructure using GCP services.
Automate cloud infrastructure provisioning, scaling, and monitoring using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager.
Manage and optimize GCP resources such as Compute Engine, Kubernetes Engine, Cloud Functions, and BigQuery to support development teams.
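The Terraform-based IaC workflow described above can be sketched as a minimal configuration. The project ID, machine type, and resource names below are illustrative placeholders, not details from the posting:

```hcl
# Minimal Terraform sketch: provision a single Compute Engine VM on GCP.
# Project, region, and names are invented for illustration.
provider "google" {
  project = "my-example-project"
  region  = "us-central1"
}

resource "google_compute_instance" "app_server" {
  name         = "app-server-1"
  machine_type = "e2-medium"
  zone         = "us-central1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-12"
    }
  }

  network_interface {
    network = "default"
  }
}
```

In practice the same pattern extends to GKE clusters, Cloud Functions, and BigQuery datasets, with state stored remotely (e.g., in a GCS bucket) so provisioning is repeatable across environments.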
CI/CD Pipeline Management:
Build, maintain, and enhance continuous integration and continuous deployment (CI/CD) pipelines to ensure seamless and automated code deployment to GCP environments.
Integrate CI/CD pipelines with GCP services like Cloud Build and Cloud Source Repositories, or third-party tools like Jenkins.
Ensure pipelines are optimized for faster build, test, and deployment cycles.
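A pipeline like the one described might look like the following `cloudbuild.yaml` sketch; the image path, service name, and region are assumptions for illustration:

```yaml
# Illustrative cloudbuild.yaml: build an image, push it, and deploy it to
# Cloud Run on each commit. All names and paths are placeholders.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'us-docker.pkg.dev/$PROJECT_ID/app/web:$SHORT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'us-docker.pkg.dev/$PROJECT_ID/app/web:$SHORT_SHA']
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'gcloud'
    args: ['run', 'deploy', 'web',
           '--image', 'us-docker.pkg.dev/$PROJECT_ID/app/web:$SHORT_SHA',
           '--region', 'us-central1']
```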
Monitoring & Incident Management:
Implement and manage cloud monitoring and logging solutions using Dynatrace and GCP-native tools like Stackdriver (Monitoring, Logging, and Trace).
Monitor cloud infrastructure health and resolve performance issues, ensuring minimal downtime and maximum uptime.
Set up incident management workflows, implement alerting mechanisms, and create runbooks for rapid issue resolution.
Security & Compliance:
Implement security best practices for cloud infrastructure, including identity and access management (IAM), encryption, and network security.
Ensure GCP environments comply with organizational security policies and industry standards such as GDPR/CCPA, or PCI-DSS.
Conduct vulnerability assessments and perform regular patching and system updates to mitigate security risks.
Collaboration & Support:
Collaborate with development teams to design cloud-native applications that are optimized for performance, security, and scalability on GCP.
Work closely with cloud architects to provide input on cloud design and best practices for continuous integration, testing, and deployment.
Provide day-to-day support for development, QA, and production environments, ensuring availability and stability.
Cost Optimization:
Monitor and optimize cloud costs by analyzing resource utilization and recommending cost-saving measures such as right-sizing instances, using preemptible VMs, or implementing auto-scaling.
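The right-sizing logic mentioned above can be sketched in a few lines; the machine-type downgrade map and the 30% CPU threshold are assumptions for illustration, not values from the posting:

```python
# Sketch of a right-sizing recommendation: flag instances whose average
# CPU utilization stays below a threshold and suggest a smaller tier.
# The tier map and threshold are illustrative assumptions.

SMALLER_TIER = {
    "e2-standard-8": "e2-standard-4",
    "e2-standard-4": "e2-standard-2",
    "e2-standard-2": "e2-small",
}

def rightsizing_recommendations(instances, cpu_threshold=0.30):
    """Return (name, current_type, suggested_type) for underutilized VMs."""
    recs = []
    for inst in instances:
        if inst["avg_cpu"] < cpu_threshold and inst["machine_type"] in SMALLER_TIER:
            recs.append((inst["name"], inst["machine_type"],
                         SMALLER_TIER[inst["machine_type"]]))
    return recs

fleet = [
    {"name": "web-1", "machine_type": "e2-standard-4", "avg_cpu": 0.12},
    {"name": "db-1",  "machine_type": "e2-standard-8", "avg_cpu": 0.71},
]
print(rightsizing_recommendations(fleet))
# → [('web-1', 'e2-standard-4', 'e2-standard-2')]
```

In a real setup the utilization figures would come from the monitoring system rather than a hard-coded list.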
Tooling & Scripting:
Develop and maintain scripts (using languages like Python, Bash, or PowerShell) to automate routine tasks and system operations.
Use configuration management tools like Ansible, Chef, or Puppet to manage cloud resources and maintain system configurations.
Required Qualifications & Skills:
Experience:
3+ years of experience as a DevOps Engineer or Cloud Engineer, with hands-on experience in managing cloud infrastructure.
Proven experience working with Google Cloud Platform (GCP) services such as Compute Engine, Cloud Storage, Kubernetes Engine, Pub/Sub, Cloud SQL, and others.
Experience in automating cloud infrastructure with Infrastructure as Code (IaC) tools like Terraform, Cloud Deployment Manager, or Ansible.
Technical Skills:
Strong knowledge of CI/CD tools and processes (e.g., Jenkins, GitLab CI, CircleCI, or GCP Cloud Build).
Proficiency in scripting and automation using Python, Bash, or similar languages.
Strong understanding of containerization technologies (Docker) and container orchestration tools like Kubernetes.
Familiarity with GCP networking, security (IAM, VPC, Firewall rules), and monitoring tools (Stackdriver).
Cloud & DevOps Tools:
Experience with Git for version control and collaboration.
Familiarity with GCP-native DevOps tools like Cloud Build, Cloud Source Repositories, Artifact Registry, and Binary Authorization.
Understanding of DevOps practices and principles, including Continuous Integration, Continuous Delivery, Infrastructure as Code, and Monitoring/Alerting.
Security & Compliance:
Knowledge of security best practices for cloud environments, including IAM, network security, and data encryption.
Understanding of compliance and regulatory requirements related to cloud computing (e.g., GDPR/CCPA, HIPAA, or PCI).
Soft Skills:
Strong problem-solving skills with the ability to work in a fast-paced environment.
Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
Team-oriented mindset with the ability to work collaboratively with cross-functional teams.
Certifications (Preferred):
Google Professional Cloud DevOps Engineer certification (preferred).
Other GCP certifications such as Google Professional Cloud Architect or Associate Cloud Engineer are a plus.
DevOps certifications like Certified Kubernetes Administrator (CKA) or AWS/GCP DevOps certification are advantageous.
GCP Engineer L3
Posted 7 days ago
Job Description
1. Experience in IT management, cloud services administration, DevOps engineering, or related occupation.
2. At least 1 year of experience with Infrastructure as Code; Terraform required.
3. Experience with Ansible, Puppet or Chef is a bonus.
4. Cloud services including storage, databases, and networking.
5. CI/CD pipelines and tools including Jenkins, GitLab CI, or GitHub Actions.
6. Kubernetes security best practices including network policies, RBAC (Role-Based Access Control) and secrets management.
7. Version control systems like Git. Automating deployment pipelines and integrating them with Kubernetes.
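The Kubernetes RBAC requirement in item 6 can be illustrated with a minimal least-privilege manifest; the namespace and account names are invented for the example:

```yaml
# Illustrative least-privilege RBAC: a Role that can only read Pods in the
# "ci" namespace, bound to a deploy-bot ServiceAccount. Names are placeholders.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: ci
  name: pod-reader
rules:
  - apiGroups: [""]
    resources: ["pods"]
    verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  namespace: ci
  name: read-pods
subjects:
  - kind: ServiceAccount
    name: deploy-bot
    namespace: ci
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
```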
Roles & Responsibilities
• Utilize software that manages and monitors networks, systems, and applications, both to guarantee the performance of cloud software environments and to better orchestrate and automate resource provisioning.
• Promote infrastructure automation, flexible collaboration and communication between development, security, compliance, testing, monitoring and production teams.
• Optimize the release process by leading teams to identify gaps and eliminate barriers to enable increased frequency of accurate code deployment.
• Partner with application teams to provide overall infrastructure guidance and assistance. Conduct infrastructure deployment activities.
• Orchestrate problem resolution efforts in an application's cloud migration journey.
• Identify opportunities to enhance current application automation processes.
• Identify new infrastructure automation opportunities.
Generic Managerial Skills, If any
• Good team player
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
Cloud Engineer - GCP Engineer
Posted today
Job Description
Cloud Engineer (GCP)
Washington, D.C.
$125,000/yr - $200,000/yr
MUST:
Active Secret Clearance required
2-3 years of GCP engineering/development required
2-3 years of overall experience developing on a cloud engineering platform: GCP (Google Cloud Platform)
3+ years developing and engineering within GCP
3+ years deploying and supporting workloads in a GCP cloud environment
3+ years of experience implementing, migrating, managing, and operating systems/applications in GCP
2+ years with cloud resource architecture and operations in GCP
Infrastructure as Code and automation tools
Providing continuous monitoring and support; delivering solutions using Agile methodologies
Experience provisioning systems in the cloud, creating projects, and configuring access for users
Experience with cloud security; experience with cloud networking; experience with cloud storage provisioning and management, including work with large datasets (100s of TBs)
Experience using APIs, scripting, and reporting
Experience with Cloud AD and LDAP integration
DUTIES:
Work with business stakeholders and senior leaders to deliver on complex, enterprise-level initiatives that are a part of the company's overall strategic direction.
Work closely with users, developers, and application owners to gather requirements and deliver solutions based on requirements.
Implement Infrastructure as Code by creating templates and scripts to automate the provisioning of services in GCP
Utilize automation tools such as Terraform, CloudFormation, Ansible, etc. to provision resources.
Implement monitoring and alerting of all services and applications hosted in GCP
Implement sound IAM access-control policies and custom IAM roles to manage access to resources in GCP; troubleshoot any access-related issues for users.
Maintain end-to-end security, ensuring best practices are always implemented.
Provision and maintain all servers using configuration management tools, including Chef.
Develop solutions using cloud services in GCP
Maintain a working knowledge of supporting IT infrastructure technologies and standards, including the software and hardware life cycle, system configuration policies, security hardening, high availability, and disaster recovery.
Monitor and report to management on actual and projected tasks. Generate regular cloud cost reports.
*Progression, Inc. is an equal opportunity and affirmative action employer. Progression, Inc. is committed to administering all employment and personnel actions on the basis of merit and free of discrimination based on race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or status as an individual with a disability. Consistent with this commitment, we are dedicated to the employment and advancement of qualified minorities, women, individuals with disabilities, protected veterans, persons of all ethnic backgrounds and religions according to their abilities.*
GCP Engineer - Networking | Maryland
Posted 7 days ago
Job Description
Responsibilities:
- Infrastructure as Code (IaC):
- Design, implement, and manage infrastructure as code using Terraform for GCP environments.
- Ensure infrastructure configurations are scalable, reliable, and follow best practices.
- GCP Platform Management:
- Architect and manage GCP environments, including compute, storage, and networking components.
- Collaborate with cross-functional teams to understand requirements and provide scalable infrastructure solutions.
- Vertex AI Integration:
- Work closely with data scientists and AI specialists to integrate and optimize solutions using Vertex AI on GCP.
- Implement and manage machine learning pipelines and models within the Vertex AI environment.
- BigQuery Storage:
- Design and optimize data storage solutions using BigQuery Storage.
- Collaborate with data engineers and analysts to ensure efficient data processing and analysis.
- Wiz Security Control Integration:
- Integrate and configure Wiz Security Control for continuous security monitoring and compliance checks within GCP environments.
- Collaborate with security teams to implement and enhance security controls.
- Automation and Tooling:
- Implement automation and tooling solutions for monitoring, scaling, and managing GCP resources.
- Develop and maintain scripts and tools to streamline operational tasks.
- Security and Compliance:
- Implement security best practices in GCP environments, including identity and access management, encryption, and compliance controls.
- Must understand Policy as Code in GCP.
- Perform regular security assessments and audits.
- Bachelor's Degree:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- GCP Certification:
- GCP Professional Cloud Architect or similar certifications are highly desirable.
- Infrastructure as Code:
- Proven experience with Infrastructure as Code (IaC) using Terraform for GCP environments.
- Vertex AI and BigQuery:
- Hands-on experience with Vertex AI for generative AI solutions and BigQuery for data storage and analytics.
- Wiz Security Control:
- Experience with Wiz Security Control and its integration for continuous security monitoring in GCP environments.
- GCP Services:
- In-depth knowledge of various GCP services, including Compute Engine, Cloud Storage, VPC, and IAM.
- Automation Tools:
- Proficiency in scripting languages (e.g., Python, Bash) and automation tools for GCP resource management.
- Security and Compliance:
- Strong understanding of GCP security best practices and compliance standards.
- Collaboration Skills:
- Excellent collaboration and communication skills, with the ability to work effectively in a team-oriented environment.
Google Cloud Platform GCP Data Engineer
Posted 7 days ago
Job Description
Job Summary :
As a GCP Data Engineer, you will lead technology innovation for our clients through robust delivery of world-class solutions. This includes integrating native GCP services and third-party technologies to architect scalable data warehouses, data lakes, and analytics platforms. You will be responsible for architecting the transformation and modernization of enterprise data solutions on GCP, integrating native GCP services and third-party data technologies. You'll work with implementation teams, managing everything from data ingestion to visualization in complex client environments. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions, with an appropriate combination of GCP and third-party technologies, for deployment on GCP.
Key Responsibilities :
- Lead a team in designing, developing, and deploying high-performance data analytics solutions.
- Provide technical expertise from concept to operations, ensuring the successful deployment of large-scale data solutions.
- Build secure and reliable data-centric services in GCP.
- Implement end-to-end data analytics for complex environments, including data ingestion, transformation, and visualization.
- Provide thought leadership on Big Data and analytic strategies for clients.
- Support data migration and transformation projects, leveraging Google AutoML to enhance pipeline intelligence.
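The ingestion-to-visualization flow above can be sketched with pure-Python stand-ins. In a real engagement these stages would run on services like Dataflow and BigQuery; the field names here are invented for illustration:

```python
# Toy sketch of an ingest -> transform -> load pipeline. Each stage is a
# stand-in for a managed GCP service; field names are illustrative.

def ingest(raw_rows):
    """Parse raw CSV-like rows into records (stand-in for ingestion)."""
    for row in raw_rows:
        user_id, amount = row.split(",")
        yield {"user_id": user_id.strip(), "amount": float(amount)}

def transform(records):
    """Curate: drop non-positive amounts and add a derived field."""
    for rec in records:
        if rec["amount"] > 0:
            rec["amount_cents"] = int(round(rec["amount"] * 100))
            yield rec

def load(records):
    """Stand-in for a warehouse load: collect rows into a 'table'."""
    return list(records)

table = load(transform(ingest(["u1, 12.50", "u2, -3.00", "u3, 0.99"])))
print(table)
```

The same three-stage shape (parse, curate, load) is what a production Dataflow pipeline expresses with PCollections and a BigQuery sink.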
Required Experience:
- Data Platform Architecture: 3+ years of experience with GCP data engineering, ingestion, and curation.
- Data Modeling & Optimization: 3+ years of experience designing data models on GCP using BigQuery and Bigtable.
- Vertex AI: 1+ years of experience building and managing machine learning models.
- MLOps for GenAI: 1+ years of implementing MLOps for GenAI model deployment.
Qualifications :
- Extensive experience in large-scale architecture, solution design, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is essential.
- Strong knowledge of GCP services, with at least 5 years in cloud platforms and 2+ years of deep experience in GCP data services (e.g., Spark, Dataproc, Dataflow, BigQuery, Pub/Sub).
- 3+ years of experience re-architecting data warehouses on GCP, designing and building production data pipelines using Java and Python.
- Hands-on experience with GCP data lakes and ingestion solutions.
- Experience with metadata management, Hadoop/NoSQL, performance engineering, and self-service data preparation tools like Trifacta or Paxata.
- Bachelor's degree or equivalent work experience.
- Google Certified Professional Data Engineer certification or Google Professional Machine Learning Engineer certification required.
We cannot work with third-party agencies at this time. Resumes submitted via unapproved agencies will be automatically rejected.
GCP Cloud Engineer
Posted today
Job Description
Join us in making a meaningful impact! At our company, we help people gain access to essential medications that enhance their wellbeing. This mission fuels our passion and drives every aspect of our work.
Position: Cloud Engineer (GCP)
As a vital member of our team, you will take charge of supporting medical data by investigating and implementing changes to our data platform and by developing new data pipelines. This role involves not only analyzing existing data platform functionality but also designing, coding, testing, debugging, and documenting innovative data workflows.
Key Responsibilities:
- Collaborate with System Analysts, Technical Leads, and Project Managers to determine specifications and estimations for various initiatives.
- Create comprehensive descriptions of user requirements, system designs, program functions, and necessary changes to the data platform to ensure compliance.
- Develop, modify, test, and debug data pipelines aligned with change control protocols.
- Provide training to end-users and assist technical support staff with the data platform and BI tools.
- Analyze and propose enhancements to boost efficiency, performance, and quality of data assets while minimizing costs.
- Work closely with cross-functional teams to devise and implement acceptance test plans to meet customer expectations.
- Review code quality and promote automation process routines.
- Perform other duties as assigned.
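A data pipeline step developed, tested, and debugged under change control is typically written as a small, unit-testable function. The sketch below assumes a hypothetical claims feed; the function name and field names are invented for illustration:

```python
# Illustrative pipeline step: keep only the latest version of each record.
# "claim_id", "version", and "status" are invented field names.

def deduplicate_claims(claims):
    """Return the highest-version record per claim_id, sorted by id."""
    latest = {}
    for claim in claims:
        existing = latest.get(claim["claim_id"])
        if existing is None or claim["version"] > existing["version"]:
            latest[claim["claim_id"]] = claim
    return sorted(latest.values(), key=lambda c: c["claim_id"])

claims = [
    {"claim_id": "A1", "version": 1, "status": "open"},
    {"claim_id": "A1", "version": 2, "status": "paid"},
    {"claim_id": "B7", "version": 1, "status": "open"},
]
print(deduplicate_claims(claims))
```

Because the step is a pure function, it can be exercised in an acceptance test plan without touching the production data platform.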
Minimum Qualifications:
- Bachelor's degree in Computer Science or a related field, or equivalent education and work experience.
- At least 2 years of experience supporting enterprise business applications and servers.
- Eligibility to work in the United States without requiring visa or residency sponsorship.
Additional Qualifications:
- Hands-on experience with Google Cloud Composer, Google BigQuery, Google Dataflow, Google Dataplex, Python, and PySpark.
- Strong problem-solving capabilities and analytical skills.
- Excellent documentation and communication skills.
- A strong desire to learn and grow within the team.
Preferred Qualifications:
- Proficiency in Google Cloud Storage, Google Looker Studio, BigQuery Data Transfer Service, Apache Airflow, Apache Beam, SQL, Git, Java, Kafka Libraries/Kafka Connect, and Alteryx Designer.
- Experience in Agile Data Warehousing.
Physical Job Requirements:
- Ability to travel up to 5% of the time and work a flexible schedule, including weekends and shifts outside of core business hours.
- Ability to sit for extended periods and utilize hands for various tasks.
- Capable of lifting and moving up to 25 pounds occasionally.
This position reports to a Manager in the Information Technology department. Our office is located at (precise work address). If you're ready to make a difference with us, apply today!