6,970 Big Data Hadoop jobs in the United States

Big Data Hadoop Snowflake

85223 Arizona City, Arizona Syntricate Technologies

Posted 3 days ago


Job Description

Position : Big Data Hadoop Snowflake
Location : Phoenix, AZ (Hybrid - 3 days onsite / 2 days remote)
Duration : W2 / C2C Contract
Experience : + Years

Job Description :
  • Must have good experience working in large scale application development using Big Data ecosystem
  • Hadoop (HDFS, MapReduce, Yarn), Hive, Kafka.
  • Should have hands-on experience using Spark, Scala, and Spark SQL
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data modeling techniques using Python
  • Should be able to work in person from Phoenix office 3 days a week
  • Big Data and Hadoop ecosystems, including MapR
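The Hadoop components named above (HDFS, MapReduce, Yarn) follow the MapReduce model: a map step emits key/value pairs and a reduce step aggregates them per key. As a purely illustrative sketch — plain Python standing in for what Hadoop distributes across a cluster — a toy word count looks like this:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (word, 1) pairs from each input line, as a MapReduce mapper would."""
    for line in records:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the counts for each key, as a MapReduce reducer would."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data hadoop", "hadoop spark", "big data"]
result = reduce_phase(map_phase(lines))
print(result["hadoop"])  # 2
```

Hadoop runs the same two phases in parallel over HDFS blocks, shuffling the emitted pairs between mappers and reducers by key.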


If you are interested, please share your resume along with the basic details below:
Basic Details:
  • Current Location with Zip Code:
  • LinkedIn URL:

Regards,
Ashish Rastogi
Technical Recruiter | Syntricate Technologies Inc.
Direct :
Email : | Web:
We're hiring! Connect with us on LinkedIn and visit our Jobs Portal.

Minority Business Enterprise (MBE) Certified | E-Verified Corporation | Equal Employment Opportunity (EEO) Employer

This e-mail message may contain confidential or legally privileged information and is intended only for the use of the intended recipient(s). Any unauthorized disclosure, dissemination, distribution, copying or the taking of any action in reliance on the information herein is prohibited. Please notify the sender immediately by email if you have received this email by mistake and delete this e-mail from your system. You have received this email as we have your email address shared by you or from one of our data sources or from our member(s) or subscriber(s) list. If you do not want to receive any further emails or updates, please reply and request to unsubscribe.

Hadoop Big Data Engineer

07308 Jersey City, New Jersey Insight Global

Posted 5 days ago


Job Description

Job Description
For $68.93/hr, we are seeking a skilled Big Data Engineer to support a leading financial institution in building and optimizing large-scale data processing systems. This role involves working with Hadoop and Spark to ingest, transform, and analyze high-volume market and trading data. The engineer will contribute to the development and maintenance of distributed data pipelines, ensuring performance, scalability, and reliability across the analytics infrastructure. You will work with large datasets, a data lake architecture, and related tooling to help build a proprietary analytics platform. The ideal candidate will have a strong understanding of big data frameworks and a passion for enabling data-driven decision-making in a fast-paced financial environment.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
- 7+ years of experience in a Data Engineer role utilizing Python scripting
- Experience with Hadoop and Spark
- Background in financial services or analytics platforms
- Familiarity with tools like Apache Hudi, Hive, and Airflow is a plus
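The ingest/transform/analyze loop described in this posting has a common shape regardless of engine: filter out bad records, then aggregate by key. A hypothetical sketch in plain Python — the record fields and figures are invented for illustration; in PySpark the same steps would be `filter` and `groupBy` over a DataFrame:

```python
# Invented sample of trading records; a real pipeline would read these from HDFS or a data lake.
trades = [
    {"symbol": "AAPL", "qty": 100, "price": 190.0},
    {"symbol": "MSFT", "qty": 0,   "price": 410.0},  # bad record: zero quantity
    {"symbol": "AAPL", "qty": 50,  "price": 191.0},
]

# Validate: drop records that fail a basic sanity check.
valid = [t for t in trades if t["qty"] > 0]

# Aggregate: total notional value per symbol (qty * price).
notional = {}
for t in valid:
    notional[t["symbol"]] = notional.get(t["symbol"], 0.0) + t["qty"] * t["price"]

print(notional)  # {'AAPL': 28550.0}
```

Spark distributes exactly these steps across executors; the logic per partition is the same.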

Big Data Developer with Hadoop Ecosystems

07390 Jersey City, New Jersey Diverse Lynx

Posted 3 days ago


Job Description

  • Strong expertise in Big Data technologies, including hands-on experience with Hadoop ecosystem components like Sqoop and Spark for large-scale data processing and management
  • Proficient in Java programming, with experience in Spring, Spring Boot, and REST services to develop robust backend solutions and integrate with distributed systems
  • Skilled in the Angular framework for building dynamic and responsive front-end applications that enhance user experience
  • Experience with Oracle database, including designing, querying, and optimizing database performance
  • Knowledge of event-driven architecture is an added advantage
  • Ability to collaborate effectively across teams and contribute beyond project tasks by participating in mentorship, conducting interviews, and supporting firmwide initiatives aligned with BNY's culture of ownership and continuous learning
  • Strong financial experience with brokerage and custody services

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.


Hadoop Big Data Engineer - AZ - VMO

85286 Tempe, Arizona ManpowerGroup

Posted 1 day ago


Job Description

Our client, a leading player in the financial services industry, is seeking a Hadoop Big Data Engineer to join their team. As a Hadoop Big Data Engineer, you will be part of the Data Engineering department supporting various teams. The ideal candidate will have strong problem-solving skills, a proactive mindset, and a solid technical background that aligns well with the organization.
**Job Title:** Hadoop Big Data Engineer
**Location:** Chandler, AZ (In-office, 3 days a week)
**Pay Range:** Competitive
**What's the Job?**
+ Build and maintain data pipelines using a big-data stack including Hadoop, Hive, and PySpark.
+ Implement data modeling and database design to ensure efficient data management.
+ Utilize Amazon AWS S3 for object storage and data service integration.
+ Schedule jobs using Autosys and automate processes with Unix/shell scripting.
+ Collaborate with team members to troubleshoot and optimize data transformation processes.
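The scheduling bullet above reduces to a simple contract: Autosys (or any scheduler) invokes a job and inspects its exit code to decide success, alerting, or retry. A minimal illustrative sketch — the row-filtering step is an invented stand-in for the real load/transform:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

def run_job(rows):
    """Minimal batch-job body of the kind a scheduler such as Autosys would invoke:
    do the work, log the outcome, and report success/failure via an exit code."""
    try:
        loaded = [r for r in rows if r is not None]  # stand-in for the real transform
        logging.info("loaded %d of %d rows", len(loaded), len(rows))
        return 0  # 0 = success: the scheduler marks the job green
    except Exception:
        logging.exception("job failed")
        return 1  # nonzero = failure: the scheduler alerts and can retry

# One bad row is dropped; the job still succeeds.
exit_code = run_job([{"id": 1}, None, {"id": 3}])
```

A shell wrapper would pass this value straight through (`exit $?`), which is all the scheduler sees.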
**What's Needed?**
+ Minimum of 4 years of hands-on experience in big data technologies.
+ Proficiency in Python and experience with CI/CD pipelines.
+ Strong understanding of database design principles, preferably MySQL or similar.
+ Experience with PowerBI and Dremio for data visualization.
+ Exposure to GCP cloud data engineering is a plus.
**What's in it for me?**
+ Opportunity to work in a dynamic and innovative environment.
+ Engage with cutting-edge technologies in data engineering.
+ Collaborate with experienced professionals in the field.
+ Potential for conversion to a full-time position after the contract period.
+ Participate in a pilot program that enhances your career growth.
If this is a role that interests you and you'd like to learn more, click apply now and a recruiter will be in touch with you to discuss this great opportunity. We look forward to speaking with you!
**About ManpowerGroup, Parent Company of: Manpower, Experis, Talent Solutions, and Jefferson Wells**
ManpowerGroup® (NYSE: MAN), the leading global workforce solutions company, helps organizations transform in a fast-changing world of work by sourcing, assessing, developing, and managing the talent that enables them to win. We develop innovative solutions for hundreds of thousands of organizations every year, providing them with skilled talent while finding meaningful, sustainable employment for millions of people across a wide range of industries and skills. Our expert family of brands - **Manpower, Experis, Talent Solutions, and Jefferson Wells** - creates substantial value for candidates and clients across more than 75 countries and territories and has done so for over 70 years. We are recognized consistently for our diversity - as a best place to work for Women, Inclusion, Equality and Disability, and in 2023 ManpowerGroup was named one of the World's Most Ethical Companies for the 14th year - all confirming our position as the brand of choice for in-demand talent.
ManpowerGroup is committed to providing equal employment opportunities in a professional, high quality work environment. It is the policy of ManpowerGroup and all of its subsidiaries to recruit, train, promote, transfer, pay and take all employment actions without regard to an employee's race, color, national origin, ancestry, sex, sexual orientation, gender identity, genetic information, religion, age, disability, protected veteran status, or any other basis protected by applicable law.

Senior Hadoop Admin - Big Data - Federal - 2nd Shift

98034 Kirkland, Washington ServiceNow

Posted today


Job Description

Senior Hadoop Admin - Big Data - Federal - 2nd Shift at ServiceNow summary:

The Senior Hadoop Admin on the Federal Big Data Team is responsible for deploying, maintaining, and supporting Big Data infrastructure on federal cloud environments such as ServiceNow Cloud and Azure. The role involves managing Kubernetes containerized applications, automating CI/CD pipelines, ensuring high availability of Big Data systems, and collaborating across teams to troubleshoot and optimize performance. This position supports 24x7 government cloud infrastructure operations with a focus on data governance, analytics, and AI-driven insights.

Company Description
It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today - ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
Job Description
Please Note: This position will include supporting our US Federal Government Cloud Infrastructure.
This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, criminal/misdemeanor check and taking a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens or US Permanent Residents, holding a green card, will be considered.
As a Staff DevOps Engineer on our Big Data Federal Team, you will help deliver 24x7 support for our Government Cloud infrastructure.
The Federal Big Data Team has 3 shifts that provide 24x7 production support for our Big Data Government cloud infrastructure.
This is a 2nd Shift Position - Sunday to Wednesday with work hours from 3 pm Pacific Time to 2 am Pacific Time

Below are some highlights.
  • 4 Day work week (Sunday to Wednesday)
  • No on-call rotation
  • Shift Bonuses for 2nd and 3rd shifts

The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform powered Customer instances - deployed across the ServiceNow cloud and Azure cloud. Our mission is to:
Deliver state-of-the-art Monitoring, Analytics and Actionable Business Insights by employing new tools, Big Data systems, Enterprise Data Lake, AI, and Machine Learning methodologies that improve efficiencies across a variety of functions in the company: Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities, enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:
  • Collecting, storing, and providing real-time access to large amounts of data
  • Providing real-time analytic tools and reporting capabilities for various functions, including:
    • Monitoring, alerting, and troubleshooting
    • Machine Learning, Anomaly detection and Prediction of P1s
    • Capacity planning
    • Data analytics and deriving Actionable Business Insights
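As a purely illustrative example of the anomaly detection mentioned above (not code from the posting), the simplest approach is a z-score check that flags metric samples sitting far from the mean; the latency figures here are invented:

```python
from statistics import mean, stdev

def anomalies(series, threshold=3.0):
    """Return indices of points more than `threshold` standard deviations
    from the mean - the simplest form of metric anomaly detection."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > threshold * sigma]

latency_ms = [40, 42, 41, 39, 43, 40, 41, 400, 42, 41]  # one obvious spike
print(anomalies(latency_ms, threshold=2.0))  # [7]
```

Production systems (Prometheus rules, Spark/Flink jobs) apply the same idea over rolling windows, with seasonality-aware baselines in place of a global mean.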

What you get to do in this role
  • Responsible for deploying, production monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
  • Deploy, scale, and manage containerized applications using Kubernetes, docker, and other related tools.
  • Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
  • Proactively identify and resolve issues within Kubernetes clusters, containerized applications, and CI/CD pipelines. Provide expert-level support for incidents and perform root cause analysis.
  • Understanding of networking concepts related to containerized environments
  • Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineers (SRE), Customer Support (CS), Developers, QA, and Systems Engineering teams in replicating complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
  • Responsible for enforcing data governance policies in Commercial and Regulated Big Data environments.

Qualifications
To be successful in this role you have:
  • Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
  • Deep understanding of Hadoop / Big Data Ecosystem.
  • 6+ years of experience working with systems such as HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
  • Hands-on experience with Kubernetes in a production environment
  • Deep understanding of Kubernetes architecture, concepts, and operations
  • Strong knowledge in querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana
  • Experience supporting CI/CD pipelines for automated applications deployment to Kubernetes
  • Strong Linux Systems Administration skills
  • Strong scripting skills in Bash, Python for automation and task management
  • Proficient with Git and version control systems
  • Familiarity with Cloudera Data Platform (CDP) and its ecosystem
  • Ability to learn quickly in a fast-paced, dynamic team environment

GCS-23
For positions in the Bay Area, we offer a base pay of $158,500 - $277,500, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs (subject to eligibility requirements). Compensation is based on the geographic location in which the role is located, and is subject to change based on work location.
Not sure if you meet every qualification? We still encourage you to apply! We value inclusivity, welcoming candidates from diverse backgrounds, including non-traditional paths. Unique experiences enrich our team, and the willingness to dream big makes you an exceptional candidate!
Additional Information
Work Personas
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work. Learn more here.
Equal Opportunity Employer
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.
Accommodations
We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.
Export Control Regulations
For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.
From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license.

Keywords:

Hadoop, Big Data, Kubernetes, CI/CD, Cloud Infrastructure, ServiceNow, Azure, Linux Administration, Data Analytics, Machine Learning


Senior Hadoop Admin - Big Data - Federal - 2nd Shift

95054 Santa Clara, California ServiceNow, Inc.

Posted 1 day ago


Job Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today - ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
**Please Note:** This position will include supporting our US Federal Government Cloud Infrastructure.
_This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, criminal/misdemeanor check and taking a drug test. Any employment is contingent upon passing the screening._ **_Due to Federal requirements, only US citizens, US naturalized citizens or US Permanent Residents, holding a green card, will be considered._**
As a **Staff DevOps Engineer** on our **Big Data Federal Team,** you will help deliver 24x7 support for our Government Cloud infrastructure.
**_The Federal Big Data Team has 3 shifts that provide 24x7 production support for our Big Data Government cloud infrastructure._**
**_This is a 2nd Shift Position - Sunday to Wednesday with work hours from 3 pm Pacific Time to 2 am Pacific Time_**
_Below are some highlights._
+ **4 Day work week** (Sunday to Wednesday)
+ No on-call rotation
+ Shift Bonuses for 2nd and 3rd shifts
The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform powered Customer instances - deployed across the ServiceNow cloud and Azure cloud. Our mission is to:
Deliver _state-of-the-art Monitoring, Analytics and Actionable Business Insights_ by employing _new tools, Big Data systems, Enterprise Data Lake, AI, and Machine Learning methodologies_ that _improve efficiencies across a variety of functions in the company: Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities_, enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:
+ Collecting, storing, and providing real-time access to large amounts of data
+ Providing real-time analytic tools and reporting capabilities for various functions, including:
+ Monitoring, alerting, and troubleshooting
+ Machine Learning, Anomaly detection and Prediction of P1s
+ Capacity planning
+ Data analytics and deriving Actionable Business Insights
**What you get to do in this role**
+ Responsible for deploying, production monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
+ Deploy, scale, and manage containerized applications using Kubernetes, docker, and other related tools.
+ Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
+ Proactively identify and resolve issues within Kubernetes clusters, containerized applications, and CI/CD pipelines. Provide expert-level support for incidents and perform root cause analysis.
+ Understanding of networking concepts related to containerized environments
+ Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineers (SRE), Customer Support (CS), Developers, QA, and Systems Engineering teams in replicating complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
+ Responsible for enforcing data governance policies in Commercial and Regulated Big Data environments.
**To be successful in this role you have:**
+ Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
+ Deep understanding of Hadoop / Big Data Ecosystem.
+ 6+ years of experience working with systems such as HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
+ Hands-on experience with Kubernetes in a production environment
+ Deep understanding of Kubernetes architecture, concepts, and operations
+ Strong knowledge in querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana
+ Experience supporting CI/CD pipelines for automated applications deployment to Kubernetes
+ Strong Linux Systems Administration skills
+ Strong scripting skills in Bash, Python for automation and task management
+ Proficient with Git and version control systems
+ Familiarity with Cloudera Data Platform (CDP) and its ecosystem
+ Ability to learn quickly in a fast-paced, dynamic team environment
GCS-23
For positions in the Bay Area, we offer a base pay of $158,500 - $277,500, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs (subject to eligibility requirements). Compensation is based on the geographic location in which the role is located, and is subject to change based on work location.
_Not sure if you meet every qualification? We still encourage you to apply! We value inclusivity, welcoming candidates from diverse backgrounds, including non-traditional paths. Unique experiences enrich our team, and the willingness to dream big makes you an exceptional candidate!_
**Work Personas**
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work. Learn more here.
**Equal Opportunity Employer**
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.
**Accommodations**
We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.
**Export Control Regulations**
For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.
From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license.

Senior Hadoop Admin - Big Data - Federal - 2nd Shift

92108 Mission Valley, California ServiceNow, Inc.

Posted 1 day ago


Job Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today - ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
**Please Note:** This position will include supporting our US Federal Government Cloud Infrastructure.
_This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, criminal/misdemeanor check and taking a drug test. Any employment is contingent upon passing the screening._ **_Due to Federal requirements, only US citizens, US naturalized citizens or US Permanent Residents, holding a green card, will be considered._**
As a **Staff DevOps Engineer** on our **Big Data Federal Team,** you will help deliver 24x7 support for our Government Cloud infrastructure.
**_The Federal Big Data Team has 3 shifts that provide 24x7 production support for our Big Data Government cloud infrastructure._**
**_This is a 2nd Shift Position - Sunday to Wednesday with work hours from 3 pm Pacific Time to 2 am Pacific Time_**
_Below are some highlights._
+ **4 Day work week** (Sunday to Wednesday)
+ No on-call rotation
+ Shift Bonuses for 2nd and 3rd shifts
The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform powered Customer instances - deployed across the ServiceNow cloud and Azure cloud. Our mission is to:
Deliver _state-of-the-art Monitoring, Analytics and Actionable Business Insights_ by employing _new tools, Big Data systems, Enterprise Data Lake, AI, and Machine Learning methodologies_ that _improve efficiencies across a variety of functions in the company: Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities_, enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:
+ Collecting, storing, and providing real-time access to large amounts of data
+ Providing real-time analytic tools and reporting capabilities for various functions, including:
+ Monitoring, alerting, and troubleshooting
+ Machine Learning, Anomaly detection and Prediction of P1s
+ Capacity planning
+ Data analytics and deriving Actionable Business Insights
**What you get to do in this role**
+ Responsible for deploying, production monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
+ Deploy, scale, and manage containerized applications using Kubernetes, docker, and other related tools.
+ Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
+ Proactively identify and resolve issues within Kubernetes clusters, containerized applications, and CI/CD pipelines. Provide expert-level support for incidents and perform root cause analysis.
+ Understanding of networking concepts related to containerized environments
+ Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineers (SRE), Customer Support (CS), Developers, QA, and Systems Engineering teams in replicating complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
+ Responsible for enforcing data governance policies in Commercial and Regulated Big Data environments.
**To be successful in this role you have:**
+ Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
+ Deep understanding of Hadoop / Big Data Ecosystem.
+ 6+ years of experience working with systems such as HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
+ Hands-on experience with Kubernetes in a production environment
+ Deep understanding of Kubernetes architecture, concepts, and operations
+ Strong knowledge in querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana
+ Experience supporting CI/CD pipelines for automated applications deployment to Kubernetes
+ Strong Linux Systems Administration skills
+ Strong scripting skills in Bash, Python for automation and task management
+ Proficient with Git and version control systems
+ Familiarity with Cloudera Data Platform (CDP) and its ecosystem
+ Ability to learn quickly in a fast-paced, dynamic team environment
GCS-23
For positions in the Bay Area, we offer a base pay of $158,500 - $277,500, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs (subject to eligibility requirements). Compensation is based on the geographic location in which the role is located, and is subject to change based on work location.
_Not sure if you meet every qualification? We still encourage you to apply! We value inclusivity, welcoming candidates from diverse backgrounds, including non-traditional paths. Unique experiences enrich our team, and the willingness to dream big makes you an exceptional candidate!_
**Work Personas**
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work. Learn more here ( .
**Equal Opportunity Employer**
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.
**Accommodations**
We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.
**Export Control Regulations**
For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.
From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license.
View Now

Senior Hadoop Admin - Big Data - Federal - 2nd Shift

98033 Kirkland, Washington ServiceNow, Inc.

Posted 7 days ago


Job Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today - ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
**Please Note:** This position will include supporting our US Federal Government Cloud Infrastructure.
_This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening._ **_Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered._**
As a **Staff DevOps Engineer** on our **Big Data Federal Team,** you will help deliver 24x7 support for our Government Cloud infrastructure.
**_The Federal Big Data Team has 3 shifts that provide 24x7 production support for our Big Data Government cloud infrastructure._**
**_This is a 2nd Shift Position - Sunday to Wednesday with work hours from 3 pm Pacific Time to 2 am Pacific Time_**
_Below are some highlights._
+ **4 Day work week** (Sunday to Wednesday)
+ No on-call rotation
+ Shift Bonuses for 2nd and 3rd shifts
The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform-powered customer instances deployed across the ServiceNow cloud and Azure cloud. Our mission is to:
Deliver _state-of-the-art Monitoring, Analytics and Actionable Business Insights_ by employing _new tools, Big Data systems, Enterprise Data Lake, AI, and Machine Learning methodologies_ that _improve efficiencies across a variety of functions in the company: Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities_, enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:
+ Collecting, storing, and providing real-time access to large amounts of data
+ Providing real-time analytic tools and reporting capabilities for various functions, including:
+ Monitoring, alerting, and troubleshooting
+ Machine Learning, Anomaly detection and Prediction of P1s
+ Capacity planning
+ Data analytics and deriving Actionable Business Insights
**What you get to do in this role**
+ Responsible for deploying, production monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
+ Deploy, scale, and manage containerized applications using Kubernetes, Docker, and other related tools.
+ Automate Continuous Integration / Continuous Deployment (CI/CD) pipelines for applications, leveraging tools such as Jenkins, Ansible, and Docker.
+ Proactively identify and resolve issues within Kubernetes clusters, containerized applications, and CI/CD pipelines. Provide expert-level support for incidents and perform root cause analysis.
+ Apply an understanding of networking concepts related to containerized environments.
+ Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineering (SRE), Customer Support (CS), Development, QA, and Systems Engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
+ Responsible for enforcing data governance policies in Commercial and Regulated Big Data environments.
**To be successful in this role you have:**
+ Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
+ Deep understanding of Hadoop / Big Data Ecosystem.
+ 6+ years of experience working with systems such as HDFS, YARN, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
+ Hands-on experience with Kubernetes in a production environment
+ Deep understanding of Kubernetes architecture, concepts, and operations
+ Strong knowledge of querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana
+ Experience supporting CI/CD pipelines for automated application deployment to Kubernetes
+ Strong Linux Systems Administration skills
+ Strong scripting skills in Bash, Python for automation and task management
+ Proficient with Git and version control systems
+ Familiarity with Cloudera Data Platform (CDP) and its ecosystem
+ Ability to learn quickly in a fast-paced, dynamic team environment
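As a rough illustration of the Kubernetes troubleshooting and Python scripting skills listed above, here is a minimal health-check sketch. The pod data and the `unhealthy_pods` helper are invented for this example and are not part of the role or any ServiceNow tooling:

```python
import json

def unhealthy_pods(kubectl_json: str) -> list:
    """Return names of pods whose phase is neither Running nor Succeeded,
    given output shaped like `kubectl get pods -o json`."""
    items = json.loads(kubectl_json).get("items", [])
    return [
        pod["metadata"]["name"]
        for pod in items
        if pod.get("status", {}).get("phase") not in ("Running", "Succeeded")
    ]

# Fabricated sample payload in the shape of `kubectl get pods -o json`.
sample = json.dumps({
    "items": [
        {"metadata": {"name": "kafka-0"}, "status": {"phase": "Running"}},
        {"metadata": {"name": "hive-metastore-1"}, "status": {"phase": "Failed"}},
    ]
})

print(unhealthy_pods(sample))  # ['hive-metastore-1']
```

In practice a script like this would be fed by `kubectl` or the Kubernetes API and wired into monitoring and alerting rather than run by hand.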
GCS-23
For positions in the Bay Area, we offer a base pay of $158,500 - $277,500, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs (subject to eligibility requirements). Compensation is based on the geographic location in which the role is located, and is subject to change based on work location.
_Not sure if you meet every qualification? We still encourage you to apply! We value inclusivity, welcoming candidates from diverse backgrounds, including non-traditional paths. Unique experiences enrich our team, and the willingness to dream big makes you an exceptional candidate!_
**Work Personas**
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work.
**Equal Opportunity Employer**
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.
**Accommodations**
We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact for assistance.
**Export Control Regulations**
For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.
From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license.
View Now

Data Processing Specialist

Premium Job
Remote $25 - $30 per hour Devlan LLC

Posted 27 days ago


Job Description

Full time, Permanent

We are looking for a detail-driven Data Processing Specialist to join our team. In this role, you will be responsible for collecting, organizing, and processing large volumes of data with accuracy and efficiency. The ideal candidate will have strong technical skills, excellent attention to detail, and the ability to analyze and interpret information to support business operations.

Responsibilities:

  • Collect, process, and validate data from various sources to ensure accuracy and completeness
  • Prepare, format, and upload data into databases or software systems
  • Identify, investigate, and resolve data discrepancies or inconsistencies
  • Generate reports, summaries, and visual presentations of processed data
  • Collaborate with cross-functional teams to provide data insights and support decision-making
  • Maintain data confidentiality and comply with company policies and industry regulations
  • Perform quality control checks to ensure data integrity and reliability
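The validation and quality-control duties above can be sketched in a few lines of Python. The CSV layout, column names, and the `validate_rows` helper are hypothetical, chosen only to illustrate the kind of check involved:

```python
import csv
import io

def validate_rows(csv_text, required=("id", "email")):
    """Flag rows with missing required fields or duplicate ids.
    Returns a list of (line_number, problem) tuples."""
    seen, problems = set(), []
    # Data rows start at line 2, after the header line.
    for lineno, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        missing = [f for f in required if not (row.get(f) or "").strip()]
        if missing:
            problems.append((lineno, "missing " + ", ".join(missing)))
        if row.get("id") in seen:
            problems.append((lineno, "duplicate id " + row["id"]))
        seen.add(row.get("id"))
    return problems

sample = "id,email\n1,a@x.com\n2,\n1,b@x.com\n"
print(validate_rows(sample))  # [(3, 'missing email'), (4, 'duplicate id 1')]
```

A real pipeline would add per-field format checks (dates, numeric ranges) and log the findings for review rather than just printing them.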

Qualifications:

  • Bachelor’s degree in information systems, computer science, business administration, or related field (or equivalent work experience)
  • Previous experience in data processing, data management, or data analysis
  • Strong knowledge of spreadsheets, databases, and data management tools
  • Proficient in Microsoft Excel; familiarity with SQL or other query languages is a plus
  • Excellent problem-solving and analytical skills
  • Strong attention to detail and ability to work under deadlines
  • Effective communication and teamwork skills

Benefits:

  • Competitive salary
  • Comprehensive health, dental, and vision insurance
  • Retirement savings plan with company match
  • Paid time off, sick leave, and holidays
  • Flexible work arrangements (remote/hybrid options may be available)
  • Professional growth and training opportunities
  • Supportive and collaborative team culture

Company Details

The Land Report Analysis, a service offering of Devlan LLC, provides you with the knowledge you need about the highest and best use of your property: not the current value, but the potential future value based on development. Know what the big developers know. Be informed. Don't leave money on the table. We have over 30 years of land development and engineering experience, and we realized that many people do not know the true value of their properties; when they go to buy or sell, they often leave money on the table or lose money. We're here to help. Contact us for more information on how to receive your initial land report, and get your comprehensive land report analysis today!
Apply Now

Data Processing Specialist

40324 Georgetown, Kentucky Conduent Commercial Solutions, LLC

Posted 2 days ago


Job Description

At Conduent, we are committed to delivering essential services and solutions for Fortune 100 companies and over 500 governments. Our dedicated associates create positive outcomes for our clients and the millions relying on them. You now have the chance to thrive personally, make a meaningful impact, and be a part of a culture that values individuality every day.

Onsite Data Processing Specialist

Location: Onsite in Lexington, KY
Training Schedule: 6-10 weeks, Monday-Friday 10:30 AM-7:00 PM
Production Schedule: Monday-Friday, 8-hour shift between 7:30 AM - 6:00 PM
Pay: $15.00/hour during training, with performance-based pay thereafter

About the Role

As a Data Processing Specialist, your role will be integral in providing document review and data entry support to our clients. Your contributions will enhance the operations of the organization you support, ensuring effective administration.

Your key responsibilities will include:

  • Delivering production services by carrying out administrative tasks such as data entry, document processing, and scanning.
  • Receiving documents in both electronic and hard copy formats for precise processing.
  • Processing documents accurately while identifying any gaps in required information.
  • Classifying documents to help build a comprehensive database of information.
  • Providing excellent customer service to enhance client satisfaction.

Requirements

To excel in this role, you should:

  • Possess a High School Diploma or equivalent education.
  • Be legally authorized to work permanently in the United States without needing a visa transfer or sponsorship.
  • Pass a criminal background check and drug test successfully.
  • Demonstrate typing skills of a minimum of 45 WPM on a computer.
  • Exhibit strong IT skills and the ability to learn new systems quickly.
  • Show exceptional attention to detail.
  • Be organized, capable of multitasking, and adaptable to changing priorities.

Working with Conduent

Join an organization that is quickly growing and is dedicated to supporting your career aspirations.

What We Offer

As part of our team, you'll receive:

  • Paid Training.
  • Opportunities for Career Growth.
  • Comprehensive Benefits Options.
  • A Supportive Work Environment.

About Us

At Conduent, our associates are dedicated to delivering critical services and solutions for Fortune 100 companies and numerous governments, thus ensuring exceptional outcomes for our clients and their stakeholders.

Join Us

If you're looking for an opportunity to create a real impact within a company that values innovative ideas, we invite you to join our team and grow alongside colleagues who will encourage and inspire you to excel.

Conduent is an Equal Opportunity Employer, welcoming applicants of all backgrounds without discrimination. For individuals with disabilities needing reasonable accommodations during the job application process, please reach out with your requests for assistance.

View Now
 
