687 Hadoop jobs in the United States
Hadoop developer
Posted 1 day ago
Job Description
CapB is a global leader in IT Solutions and Managed Services. Our R&D is focused on providing cutting-edge products and solutions across Digital Transformation, from Cloud, AI/ML, IoT, and Blockchain to MDM/PIM, Supply Chain, ERP, CRM, HRMS, and Integration solutions.
We are looking for a Hadoop Developer with 3 to 7 years of experience, including at least 2-4 years on a production support team. The hire will be responsible for supporting Hadoop applications and for expanding and optimizing our data and data-pipeline architecture.
Responsibilities: • Experience with Hadoop distributions, the MapR Data Platform, and Big Data infrastructure
• Lead for production support and enhancements using Hadoop, Spark, Kafka, Grafana and Splunk
• Able to lead and participate in design, solution, and architecture discussions
• Communicate on production support issues and manage tickets, resolutions, and reporting metrics
Requirements: • Bachelor’s degree in Computer Science or a related field and/or equivalent experience.
• 3+ years of hands-on experience with Big Data technologies – Specifically, Hadoop, Hive, Spark, Scala, Java, Sqoop and Pig
• 4+ years of experience in Big data platform production support projects
• Skilled in developing Spark Structured Streaming applications using Scala
• Experience importing and exporting data using stream-processing platforms such as Kafka
• Experience with OpenTSDB is preferred
• Good understanding of Business Intelligence solutions for Customers
Hadoop Administration
Posted 3 days ago
Job Description
Sonsoft , Inc. is a USA based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services.
- At least 4 years of experience in Implementation and Administration of Hadoop infrastructure
- At least 2 years of experience in architecting, designing, implementing, and administering Hadoop infrastructure
- At least 2 years of experience in Project life cycle activities on development and maintenance projects.
- Should be able to advise client / internal teams on which product/flavor is best for which situation/setup
- Operational expertise in troubleshooting; understanding of systems capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
- Hadoop, MapReduce, HBase, Hive, Pig, Mahout
- Hadoop Administration skills: Experience working in Cloudera Manager or Ambari, Ganglia, Nagios
- Experience in using Hadoop Schedulers - FIFO, Fair Scheduler, Capacity Scheduler
- Experience in Job Schedule Management - Oozie or Enterprise Schedulers like Control-M, Tivoli
- Good knowledge of Linux (RHEL, CentOS, Ubuntu)
- Experience in setting up AD/LDAP/Kerberos authentication models
- Experience in data encryption techniques
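As a sketch of the scheduler experience listed above: on YARN, the Capacity Scheduler is typically configured through capacity-scheduler.xml. The two-queue layout below is illustrative only; the queue names and capacity percentages are hypothetical, not from any posting.

```xml
<!-- capacity-scheduler.xml: hypothetical two-queue layout -->
<configuration>
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>etl,adhoc</value> <!-- made-up queue names under the root queue -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.etl.capacity</name>
    <value>70</value> <!-- 70% of cluster resources guaranteed to "etl" -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.adhoc.capacity</name>
    <value>30</value> <!-- sibling capacities must sum to 100 -->
  </property>
</configuration>
```

The Fair Scheduler is configured analogously through fair-scheduler.xml, while FIFO needs no queue layout at all.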
Responsibilities:-
- Upgrades and Data Migrations
- Hadoop Ecosystem and Clusters maintenance as well as creation and removal of nodes
- Perform administrative activities with Cloudera Manager/Ambari and tools like Ganglia, Nagios
- Setting up and maintaining Infrastructure and configuration for Hive, Pig and MapReduce
- Monitor Hadoop Cluster Availability, Connectivity and Security
- Setting up Linux users, groups, Kerberos principals and keys
- Aligning with the Systems engineering team in maintaining hardware and software environments required for Hadoop
- Software installation, configuration, patches and upgrades
- Working with data delivery teams to setup Hadoop application development environments
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performances and capacity planning
- Data modelling, Database backup and recovery
- Manage and review Hadoop log files
- File system management, Disk space management and monitoring (Nagios, Splunk etc)
- HDFS support and maintenance
- Planning of Back-up, High Availability and Disaster Recovery Infrastructure
- Diligently teaming with Infrastructure, Network, Database, Application and Business Intelligence teams to guarantee high data quality and availability
- Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades
- Implementation of Strategic Operating model in line with best practices
- Point of Contact for Vendor escalations
- Ability to work in team in diverse/ multiple stakeholder environment
- Analytical skills
- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- At least 7 years of experience within the Information Technologies.
U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
Note:-
- This is a Full-Time Permanent job opportunity for you.
- Only US Citizen, Green Card Holder, GC-EAD, H4-EAD & L2-EAD can apply.
- No OPT-EAD, TN Visa & H1B Consultants please.
- Please mention your visa status in your email or resume.
Hadoop Tester
Posted 3 days ago
Job Description
Hello,
Greetings!
This is Abhishek from Jconnect Inc. Below is a requirement from my client. Please let me know if you are available for this role.
Title: Hadoop Tester
Location: Charlotte NC (Onsite)
Duration: Fulltime
JOB DESCRIPTION :
Skills Desired:
- 5+ years of experience and knowledge of database and Hadoop testing and test automation; involved in test planning, writing test scripts, and test execution.
- Should be good with tools such as SharePoint, Microsoft Project, Excel, and Word.
- Self-starter, open to learning new tools, and a team player.
- Utilizes in-depth knowledge of testing best practices, defect management methodologies, and Agile tools (Jira). Knowledge of the defect management process and enterprise testing standards.
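A minimal sketch of the kind of test-automation check described above: reconciling row counts and a column aggregate between a source and a target table. SQLite stands in for the real systems here; in a Hadoop engagement the same queries would typically run against Hive or Impala through a connector, and the table and column names below are made up for illustration.

```python
import sqlite3

def reconcile(conn, source, target, amount_col="amount"):
    """Compare row count and a SUM aggregate between two tables;
    return a list of human-readable mismatch descriptions."""
    src_count, src_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {source}").fetchone()
    tgt_count, tgt_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {target}").fetchone()
    mismatches = []
    if src_count != tgt_count:
        mismatches.append(f"row count: {src_count} vs {tgt_count}")
    if src_sum != tgt_sum:
        mismatches.append(f"sum({amount_col}): {src_sum} vs {tgt_sum}")
    return mismatches

# Tiny in-memory fixture: the target is missing one row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10), (2, 20)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10)])
print(reconcile(conn, "src", "tgt"))
```

In practice a test script like this would be driven from a test plan, one reconciliation per mapped table, with results logged as pass/fail defects.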
If you are interested, please send me your updated resume ASAP with the below details:
Full Name:
Current Location/Zip:
Contact Number:
E-Mail Id:
Alternate Email Id:
Visa/Work Permit Status:
Current Rate/Salary:
Expected Base Salary:
Notice Period/Availability to Start:
Skype ID:
Willingness to relocate to job location:
Any Relocation Concern (family/house/weather):
Current/Previous Employer Name:
Preferred Interview timings (Specify Time zone):
Any other job opportunity in process & at which stage:
Overall Experience Summary:
LinkedIn URL:
Looking forward to your response.
Thanks and Regards,
Abhishek Singh
Jconnect Infotech Inc.
168 Barclay Center Ste. 347,
Cherry Hill, NJ 08034
Contact:
Email:
Hadoop Admin
Posted 6 days ago
Job Description
Summary:
Seeking an experienced Hadoop Admin / SRE to support NextGen platforms built on Big Data technologies such as Hadoop, Spark, Kafka, Impala, HBase, Docker, Ansible, and more. The role focuses on cluster management, platform operations, and DevOps across tools like Cloudera, Jupyter, DataRobot, ELK, and others.
Key Responsibilities:
- Manage Cloudera Hadoop and AI/ML platforms in all environments (upgrades, monitoring, deployments, disaster recovery).
- Administer Docker, OpenShift, Jupyter, and containerized platforms.
- Act as SME for Hadoop ecosystem, including Kafka, Spark, Impala, Hive, HBase, etc.
- Handle incident/problem management, performance tuning, and capacity planning.
- Collaborate with developers, vendors, and stakeholders for end-to-end platform solutions.
Required Skills:
- Strong knowledge of Hadoop architecture, HDFS, Kerberos/AD authentication.
- Experience analyzing Hadoop logs, tuning YARN, and configuring clusters.
- Expertise in Cloudera tools (HDFS, Kafka, Hive, HBase, Impala, Hue, SOLR, etc.).
- Proficient in Unix/Linux, Java, Python, Shell/Perl scripting.
- Familiarity with DevOps tools: Ansible, Jenkins, Bitbucket, SVN.
- Experience with job scheduling, monitoring, and automation.
- Hands-on with databases: Sybase, SQL, Oracle.
Certifications Preferred:
- Cloudera Admin / Dev
- Cloud, Docker, and OpenShift technologies
Hadoop Developer
Posted 7 days ago
Job Description
Tekfortune is a fast-growing consulting firm specializing in permanent, contract, and project-based staffing services for the world's leading organizations in a broad range of industries. In this quickly changing economic landscape, virtual recruiting and remote work are critical for the future of work. To support active project demands and skills gaps, our staffing experts can help you find the best job for you.
Role:
Location:
Duration:
Required Skills:
Job Description:
Title: Hadoop Developer
Duration: 6-12+ months W2 contract (with possible extension or CTH)
Locations: Dallas TX, Greater NY, Pennington NJ and Charlotte NC
3 days Onsite
Must have: on-prem experience, strong SQL, HDFS/Hive, data ingestion, data lakes
Required Skills
• 6 years of experience in the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
• Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
• Excellent analytical capabilities - Strong interest in algorithms
• Experienced in HBase, RDBMS, SQL, ETL and data analysis
• Experience with NoSQL technologies (e.g., Cassandra, MongoDB)
• Experienced in scripting (Unix/Linux) and scheduling (Autosys)
• Experience with team delivery/release processes and cadence pertaining to code deployment and release
• Research oriented, motivated, pro-active, self-starter with strong technical, analytical and interpersonal skills.
• A team player with good verbal and written skills, capable of working with a team of Architects, Developers, Business/Data Analysts, QA and client stakeholders
• Versatile resource with balanced development skills and the business acumen to operate quickly and accurately
• Proficient understanding of distributed computing principles. Continuously evaluate new technologies, innovate and deliver solution for business critical applications
Hadoop Developer
Posted 7 days ago
Job Description
Hope you are doing well
Number of Position: 3
NO C2C OR CONTRACT; FULL-TIME ONLY
I, Abhishek would like to share a job opportunity as Hadoop Developer (Hadoop, Unix shell scripting)
in Richmond, VA / Jacksonville, FL for a Fulltime position.
*** If you are not comfortable with this location, please share your preference along with contact details for further requirements ***
Kindly find the JD below and let me know if you are available for the same.
Hadoop Developer (Hadoop, Unix shell scripting)
Job Location: Richmond, VA / Jacksonville, FL (Onsite)
Must Have Technical/Functional Skills
Skill: Hadoop Developer (Hadoop, Unix shell scripting)
7+ years of experience in data warehousing architectural approaches, and a minimum of 3 years of experience in Big Data (Cloudera).
Exposure to and strong working knowledge on distributed systems.
Excellent understanding of client-service models and customer orientation in service delivery.
Ability to grasp the 'big picture' of a solution by considering all potential options in impacted area.
Aptitude to understand and adapt to newer technologies.
Experience in managing and leading small development teams in an agile environment.
The ability to work with teammates in a collaborative manner to achieve a mission.
Presentation skills to prepare and present to large and small groups on technical and functional topics.
Please reply me with your updated resume and required details:
Full Name:
Best number to reach you:
Work authorization/Visa Status:
Current Location:
Current Compensation:
Expected Compensation:
Best time to call you:
Awaiting your earliest response.
Sincerely yours,
Abhishek Rana
Sr. Technical Recruiter | Syntricate Technologies Inc.
Direct: (
LinkedIn -
Email: | Web:
Boston, MA
Hadoop Developer
Posted 7 days ago
Job Description
Akkodis is seeking a Hadoop Developer for a Contract job with a client in Charlotte, NC (Hybrid). This role involves designing, coding, and maintaining applications to support Financial Crimes technology, while also collaborating with business partners to define system requirements.
Rate Range: $60/hour to $65/hour; the rate may be negotiable based on experience, education, geographic location, and other factors.
Hadoop Developer job responsibilities include:
- Design, develop, test, and maintain applications in the Financial Crimes technology space to meet business objectives.
- Work closely with business partners to define system requirements and translate them into technical specifications.
- Develop and enhance data provisioning models for large-scale financial crimes data sourcing initiatives.
- Contribute to story refinement, defining requirements, and estimating work necessary for delivery.
- Perform proof of concept (POC) as needed to mitigate risk or implement new ideas.
- Automate and set up continuous integration/continuous delivery (CI/CD) pipelines.
- Manage technical aspects of the application, including Change Management, Platform Upgrades, and interfacing application changes.
- Analyze and model data, ensuring best practices and governance in data sourcing and provisioning.
- Communicate deliverables and timelines effectively with managers, peers, and business partners.
- Follow Agile practices and ensure all enterprise-changing standards are met.
- BS/MS in Computer Science, Engineering, or a related quantitative discipline.
- 10+ years of applicable experience in software development, with a focus on data systems and financial crime technology.
- Expertise in Hadoop ecosystem components such as Hive, HDFS, and Spark.
- Strong experience with PySpark (2.4 or higher), SQL, and data analysis, with familiarity in REST API and Agile methodologies.
Pay Details: 60.00 to 65.00 per hour
Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable.
Equal Opportunity Employer/Veterans/Disabled
To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
- The California Fair Chance Act
- Los Angeles City Fair Chance Ordinance
- Los Angeles County Fair Chance Ordinance for Employers
- San Francisco Fair Chance Ordinance
Hadoop Developer
Posted 7 days ago
Job Description
- Primary Skills: Hadoop
- Secondary Skill: Python
- Tertiary Skill: UNIX/SHELL SCRIPTS
- Bachelor's degree in a technical or business-related field or equivalent working experience.
- 4+ years of experience in data warehousing architectural approaches.
- Minimum of 4 years in Big Data (Cloudera). Sound understanding of and experience with the Hadoop ecosystem (Cloudera). Able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the relevant problems at hand. Experience working with a Big Data implementation in a production environment.
- Must have experience with Big Data technologies like Hadoop, Hive, Spark, Python, Scala, etc. Experience in Python and Unix shell scripting.
- Experience with scheduling tools like Autosys. Understanding of Agile methodologies and technologies.
- Sound knowledge of relational databases (SQL) and experience with large SQL based systems.
- Exposure to and strong working knowledge of distributed systems. Excellent understanding of client-service models and customer orientation in service delivery.
- Ability to grasp the 'big picture' for a solution by considering all potential options in impacted area.
- Aptitude to understand and adapt to newer technologies.
- The ability to work with teammates in a collaborative manner to achieve a mission.
- Experience in query optimization and performance tuning of complex SQL queries. Benchmark and debug critical issues with algorithms and software as they arise.
Hadoop Developer
Posted 9 days ago
Job Description
Skill: Hadoop Developer
Must Have Technical/Functional Skills:
Primary Skill: Hadoop Developer.
Secondary: UI – Hive, Spark.
Experience: Minimum 7 years.
Roles & Responsibilities:
Hive Experience, HOL, File Ingestion.
SQL knowledge (window and aggregate functions such as RANK, DENSE_RANK, and GROUP BY).
Partitioning concepts.
Slowly Changing Dimensions.
Spark concepts (RDD, DataFrame, Dataset, etc.).
Shell Scripting concepts.
Kerberos Authentication concepts.
Designing and coding Hadoop applications to analyze data collections.
Creating data processing frameworks.
Extracting data and isolating data clusters.
Troubleshooting application bugs.
Creating data tracking programs.
Advanced knowledge of the Hadoop ecosystem and its components.
In-depth knowledge of Hive, HBase, and Pig.
Knowledge of SQL Server and Linux is good to have.
High-level analytical and problem-solving skills.
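The window functions named in the skills above (RANK, DENSE_RANK, partitioning) can be illustrated with a small, hypothetical example. SQLite is used here only as a convenient stand-in; the same window-function syntax carries over to HiveQL and Impala SQL, and the table and data are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 100), ("east", 80),
                  ("west", 200), ("west", 150)])

# RANK leaves gaps after ties; DENSE_RANK does not.
rows = conn.execute("""
    SELECT region, amount,
           RANK()       OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           DENSE_RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS drnk
    FROM sales
""").fetchall()
for r in rows:
    print(r)
```

Within the "east" partition, the two tied 100s both rank 1; the 80 then gets RANK 3 but DENSE_RANK 2, which is exactly the distinction interviewers probe with these functions.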
Hadoop Admin
Posted 10 days ago
Job Description
- Technical Skills: Must be experienced in the following areas: Linux expertise; experience in the set-up and administration of the Hadoop platform; working knowledge of tools like Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, etc.
- Languages: Understanding of scripting languages such as Java, Scala, Python, or Shell scripting; practical knowledge of the end-to-end design and build process of Near-Real-Time and Batch Data Pipelines; expertise with SQL and data modeling work in an Agile development process.
- Ability to work with large data sets: Big Data involves large data sets, so applicants must be able to work with highly diverse data in multiple types, formats, and sources.
- Teamwork abilities: The big data architect must be able to work in a team-oriented environment with people of diverse skill sets.
- Analytical skills: Analyze complex problems using the information provided, understand customer requests, and provide the appropriate solution.
- Communication skills: Engage with clients and stakeholders to understand their objectives for setting up Big Data and apply them in the architecture.