79,916 Kafka Engineer jobs in the United States

Kafka Engineer

21117 Owings Mills, Maryland Georgia IT Inc

Posted 21 days ago


Job Description

Role Description
• The successful candidate will be responsible for infrastructure as code (IaC), software development, continuous integration, system administration, and Linux.
• The candidate will work with Confluent Kafka, Confluent Cloud, Schema Registry, KStreams, and technologies such as Terraform and Kubernetes to develop and manage infrastructure-related code on the AWS platform.

Responsibilities
• Support systems engineering lifecycle activities for Kafka platform, including requirements gathering, design, testing, implementation, operations, and documentation.
• Automate platform management processes using Ansible, Python, or other scripting tools/languages.
• Troubleshoot incidents impacting the Kafka platform.
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
• Develop documentation materials.
• Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
• Monitor, troubleshoot, and optimize the performance and reliability of Kafka in AWS environments.

Experience
• Ability to troubleshoot and diagnose complex issues (e.g., internal and external SaaS/PaaS problems, network flows).
• Able to demonstrate experience supporting technical users and conducting requirements analysis.
• Can work independently with minimal guidance and oversight.
• Experience with IT Service Management and familiarity with incident and problem management.
• Highly skilled in identifying performance bottlenecks, spotting anomalous system behavior, and resolving the root cause of service issues.
• Demonstrated ability to work effectively across teams and functions to influence the design, operations, and deployment of highly available software.
• Knowledge of standard methodologies related to security, performance, and disaster recovery.
• Advanced understanding of agile methodologies and of CI/CD, application resiliency, and security.

Required Technical Expertise
• Develop and maintain a deep understanding of Kafka and its various components.
• Strong knowledge of Kafka Connect, KSQL, and KStreams.
• Implementation experience designing and building secure Kafka/streaming/messaging platforms at enterprise scale, and integrating them with other data systems in hybrid, multi-cloud environments.
• Experience working with Confluent Kafka, Confluent Cloud, Schema Registry, and KStreams, and with infrastructure as code (IaC) using tools like Terraform.
• Strong operational background running Kafka clusters at scale.
• Knowledge of both physical/on-prem systems and public cloud infrastructure.
• Strong understanding of Kafka broker, Connect, and topic tuning and architecture.
• Strong understanding of Linux fundamentals as they relate to Kafka performance.
• Background in both systems and software engineering.
• Strong understanding of, and hands-on experience with, containers and Kubernetes clusters.
• Proven experience as a DevOps Engineer with a focus on AWS.
• Strong proficiency in AWS services such as EC2, IAM, S3, RDS, Lambda, EKS, and VPC. Working knowledge of networking: VPCs, Transit Gateways, firewalls, load balancers, etc.
• Experience with monitoring and visualization tools such as Prometheus, Grafana, and Kibana.
• Competent developing new solutions in one or more high-level languages such as Java or Python.
• Competent with configuration management in code/IaC, including Ansible and Terraform.
• Hands-on experience delivering complex software in an enterprise environment.
• 3+ years of Python and Shell Scripting.
• 3+ years of AWS DevOps experience.
• Proficiency in distributed Linux environments.

Preferred Technical Experience
• Certification in Confluent Kafka and/or Kubernetes is a plus


Kafka Engineer

85285 Tempe, Arizona Diverse Lynx

Posted 21 days ago


Job Description

Role Description: Key skills required for Kafka messaging troubleshooting:
• Deep understanding of Kafka architecture: thorough knowledge of how Kafka components such as brokers, topics, partitions, consumer groups, and replication factors work together.
• Log analysis: ability to interpret Kafka logs from producers, consumers, and brokers to identify error messages, warnings, and potential issues.
• Monitoring and metrics: familiarity with monitoring tools to track key Kafka metrics such as consumer lag, message throughput, broker CPU usage, and network latency.
• Distributed systems knowledge: understanding of concepts such as fault tolerance, data replication, leader election, and distributed consensus to troubleshoot issues related to cluster failures.
• Programming language proficiency: strong coding skills in Java or Scala, as many Kafka applications are written in these languages, allowing you to debug custom producers and consumers.
• Network troubleshooting: ability to diagnose network connectivity issues between brokers and clients, including checking network configurations and firewall rules.
• Kafka configuration management: knowledge of Kafka configuration parameters, including topic creation, partition settings, replication factors, and consumer group settings.
• Security understanding: awareness of Kafka security mechanisms such as authentication, authorization, and encryption to troubleshoot related issues.
• Troubleshooting tools and techniques: familiarity with Kafka management tools, command-line utilities, and debugging techniques to investigate and resolve issues.
• Consumer lag: identifying the cause of high consumer lag (e.g., slow processing, insufficient consumers) and adjusting consumer configurations or application logic.
• Broker failures: analyzing logs and metrics to determine the root cause of a broker failure and taking actions such as rebalancing partitions or restarting the broker.
• Message delivery issues: investigating missing messages, message duplication, or out-of-order delivery by examining producer and consumer configurations.
• Performance bottlenecks: identifying performance issues related to high message throughput, network congestion, or slow disk I/O and optimizing Kafka settings.

As a Kafka Developer: a strong proficiency in Confluent Kafka architecture, a programming language such as Java or Scala, expertise in system design, data management skills, and the ability to understand and implement data streaming pipelines.

Key skills required for a Kafka Developer:
• Deep understanding of Confluent Kafka: thorough knowledge of Kafka concepts such as producers, consumers, topics, partitions, brokers, and replication mechanisms.
• Programming language proficiency: primarily Java or Scala, with potential for Python depending on the project.
• System design and architecture: ability to design robust and scalable Kafka-based data pipelines, considering factors such as data throughput, fault tolerance, and latency.
• Data management skills.

Competencies: Digital : Kafka
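The consumer-lag checks described above come down to simple offset arithmetic. A minimal sketch in plain Python (no broker required; the function names are illustrative, not part of any Kafka client API):

```python
def partition_lag(log_end_offset: int, committed_offset: int) -> int:
    """Lag for one partition: messages produced but not yet consumed."""
    return max(log_end_offset - committed_offset, 0)

def total_lag(end_offsets: dict, committed: dict) -> int:
    """Sum lag across a consumer group's partitions.

    Keys are (topic, partition) tuples; a partition with no committed
    offset is treated as fully unconsumed (offset 0).
    """
    return sum(
        partition_lag(end, committed.get(tp, 0))
        for tp, end in end_offsets.items()
    )

# Example: two partitions, one consumer falling behind.
end = {("orders", 0): 1_000, ("orders", 1): 500}
done = {("orders", 0): 990, ("orders", 1): 100}
print(total_lag(end, done))  # 10 + 400 = 410
```

In practice the two offset maps would come from an admin client or the kafka-consumer-groups tool; lag that keeps rising points at slow processing or too few consumers, exactly the causes listed above.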

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
View Now

Kafka Engineer

78716 Austin, Texas MHK TECH INC

Posted 21 days ago


Job Description

Stand up and administer Kafka clusters. Strong experience with Confluent Kafka.
• Expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Replicator, for both Confluent and Apache Kafka.
• Solid understanding of Kafka architecture, including how to conserve Kafka storage space.
• Working knowledge of Apache Kafka and Linux/Unix systems.
• Use automation and provisioning tools such as Docker, Bamboo, Harness, and GitLab.
• Hands-on command-line usage of Kafka and all its components.
• Ability to perform data-related benchmarking, performance analysis, and tuning.
• Solid knowledge of monitoring and fine-tuning using tools such as Splunk, Prometheus, and Grafana.
• Set up security on Kafka (configuring its different security mechanisms).
• Manage Kafka users and topics.
• Monitor, prevent, and troubleshoot security-related issues.
• Good knowledge of CI/CD pipelines.
• Automation and scripting using Shell, Python, Ansible, and SaltStack.
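Setting up security on Kafka, as the bullets above describe, usually means picking a listener protocol and an authentication mechanism for each client. A hedged sketch using Java-client property names (the values and file path are placeholders, not a working configuration):

```python
# Client-side security settings, expressed as a plain dict
# (Java-client property names; placeholder values).
secure_client_config = {
    "bootstrap.servers": "broker1:9093",
    "security.protocol": "SASL_SSL",    # TLS transport + SASL authentication
    "sasl.mechanism": "SCRAM-SHA-512",  # alternatives: PLAIN, GSSAPI, OAUTHBEARER
    "ssl.truststore.location": "/etc/kafka/truststore.jks",  # placeholder path
}

# The four listener protocols, from least to most protected:
protocols = ["PLAINTEXT", "SSL", "SASL_PLAINTEXT", "SASL_SSL"]
assert secure_client_config["security.protocol"] in protocols
```

Authorization (per-user, per-topic ACLs) and broker-side encryption settings layer on top of this; the properties shown only cover client authentication and transport security.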


KAFKA ENGINEER

75086 Fairview, Texas Keylent Inc

Posted 21 days ago


Job Description

This is an old requirement. Don't work on it. Request-ID: 28703-1
BFS
***Onsite in Plano - TX
***Max rate $65
KAFKA ENGINEER
Experience: 7 to 10 yrs
Shift: Day, 9 AM to 7 PM
Required Skills - Kafka
Nice-to-have skills - Azure DevOps
Roles & Responsibilities
Utilize your 7-10 years of experience and expertise in Kafka to drive effective communication strategies.
Hands-on experience working with Kafka Connect using Schema Registry in a very high-volume environment.
Complete understanding of Kafka config properties: acks, timeouts, buffering, partitioning, etc.
Design and recommend the best approach for data movement to/from different sources using Apache/Confluent Kafka.
Hands-on experience working with converters (Avro/JSON) and Kafka connectors.
Hands-on experience with custom connectors using Kafka core concepts and the API.
Working knowledge of Kafka REST Proxy.
Ensure the optimum performance, high availability, and stability of solutions.
Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply good knowledge of best practices.
Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
Ability to perform data-related benchmarking, performance analysis, and tuning.
Skills Required: Spring, Kubernetes, Kafka, Azure DevOps pipelines, Jenkins pipelines, etc.
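The config properties named above (acks, timeouts, buffering, partitioning) correspond to well-known producer settings. A sketch of what a tuned configuration might look like, using Apache Kafka Java-producer property names (the broker address and the specific values are illustrative, not recommendations):

```python
# Illustrative producer settings, expressed as a plain dict.
producer_config = {
    "bootstrap.servers": "broker1:9092",  # placeholder address
    "acks": "all",                   # wait for all in-sync replicas
    "request.timeout.ms": 30_000,    # timeout for a single request
    "delivery.timeout.ms": 120_000,  # total send budget, incl. retries
    "linger.ms": 5,                  # buffering: wait briefly to batch sends
    "batch.size": 32_768,            # buffering: max batch size in bytes
    "buffer.memory": 33_554_432,     # total producer-side buffer (32 MiB)
}

# acks trade-off: "0" = fire-and-forget, "1" = leader only,
# "all" = every in-sync replica must acknowledge (most durable).
assert producer_config["acks"] in ("0", "1", "all")
```

Partitioning is governed separately by the message key and the configured partitioner; left unset, the Java client places keyed messages with its default (murmur2-based) partitioner.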
Job Location (Primary): USTXPAOC01 Plano - TX USA
Job Type: 60CW00 Business Associate
Demand Requires Travel? N
Certification(s) Required: NA

Kafka Engineer

21117 Owings Mills, Maryland TTI of USA, Inc.

Posted 21 days ago


Job Description

• The successful candidate will be responsible for infrastructure as code (IaC), software development, continuous integration, system administration, and Linux.
• The candidate will work with Confluent Kafka, Confluent Cloud, Schema Registry, KStreams, and technologies such as Terraform and Kubernetes to develop and manage infrastructure-related code on the AWS platform.

Responsibilities
• Support systems engineering lifecycle activities for Kafka platform, including requirements gathering, design, testing, implementation, operations, and documentation.
• Automate platform management processes using Ansible, Python, or other scripting tools/languages.
• Troubleshoot incidents impacting the Kafka platform.
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
• Develop documentation materials.
• Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
• Monitor, troubleshoot, and optimize the performance and reliability of Kafka in AWS environments.

Experience
• Ability to troubleshoot and diagnose complex issues (e.g., internal and external SaaS/PaaS problems, network flows).
• Able to demonstrate experience supporting technical users and conducting requirements analysis.
• Can work independently with minimal guidance and oversight.
• Experience with IT Service Management and familiarity with incident and problem management.
• Highly skilled in identifying performance bottlenecks, spotting anomalous system behavior, and resolving the root cause of service issues.
• Demonstrated ability to work effectively across teams and functions to influence the design, operations, and deployment of highly available software.
• Knowledge of standard methodologies related to security, performance, and disaster recovery.
• Advanced understanding of agile methodologies and of CI/CD, application resiliency, and security.

Required Technical Expertise
• Develop and maintain a deep understanding of Kafka and its various components.
• Strong knowledge of Kafka Connect, KSQL, and KStreams.
• Implementation experience designing and building secure Kafka/streaming/messaging platforms at enterprise scale, and integrating them with other data systems in hybrid, multi-cloud environments.
• Experience working with Confluent Kafka, Confluent Cloud, Schema Registry, and KStreams, and with infrastructure as code (IaC) using tools like Terraform.
• Strong operational background running Kafka clusters at scale.
• Knowledge of both physical/on-prem systems and public cloud infrastructure.
• Strong understanding of Kafka broker, Connect, and topic tuning and architecture.
• Strong understanding of Linux fundamentals as they relate to Kafka performance.
• Background in both systems and software engineering.
• Strong understanding of, and hands-on experience with, containers and Kubernetes clusters.
• Proven experience as a DevOps Engineer with a focus on AWS.
• Strong proficiency in AWS services such as EC2, IAM, S3, RDS, Lambda, EKS, and VPC. Working knowledge of networking: VPCs, Transit Gateways, firewalls, load balancers, etc.
• Experience with monitoring and visualization tools such as Prometheus, Grafana, and Kibana.
• Competent developing new solutions in one or more high-level languages such as Java or Python.
• Competent with configuration management in code/IaC, including Ansible and Terraform.
• Hands-on experience delivering complex software in an enterprise environment.
• 3+ years of Python and Shell Scripting.
• 3+ years of AWS DevOps experience.
• Proficiency in distributed Linux environments.

Preferred Technical Experience
• Certification in Confluent Kafka and/or Kubernetes is a plus


Kafka Engineer

60290 Chicago, Illinois Syntricate Technologies

Posted 23 days ago


Job Description

Required Skills: 10+ years' experience
  • Good knowledge of Kafka
  • Good communication skills
  • Design, develop, and manage Kafka-based data pipelines
  • Work with other big data technologies such as Hadoop, Spark, and Storm
  • Monitor and optimize Kafka clusters
  • Troubleshoot Kafka-related issues
  • Handle customer queries and support

Sr Kafka Engineer

92713 Irvine, California USM

Posted 8 days ago


Job Description

  • Skills: Kafka, Confluent Kafka. Visa Types: H1B, Green Card, US.


  • We are seeking a Senior Kafka Engineer to manage, enhance, and scale an enterprise-grade Apache Kafka implementation deployed on AWS and the Confluent Platform. This person will be responsible for keeping the system reliable, improving it over time, and expanding it to support new applications.

    This role involves performing detailed architectural reviews, monitoring, performance tuning, optimizing existing Kafka pipelines, and partnering with application teams to deliver reliable, secure, and performant streaming solutions.

    Qualifications:

    • 8+ years in platform engineering with 3+ years of hands-on experience with Apache Kafka.
    • Expertise with Confluent Platform (Brokers, Schema Registry, Control Center, ksqlDB).
    • Experience deploying and managing Kafka on AWS (including MSK or self-managed EC2-based setups).
    • Solid understanding of Kubernetes, especially EKS, for microservices integration.
    • Familiarity with monitoring and alerting stacks: Prometheus, Grafana, ELK, or similar.
    • Management and support of Kafka Cloud and on-premise platforms.
    • Capacity management.
    • Good understanding of SRE principles and methodologies, and experience learning, adapting to, and automating new developments in the Kafka ecosystem.
    • Experience troubleshooting integration platform issues (e.g., connectivity, schema management, producer/consumer, etc.) for Kafka, API gateway, etc.
    • In-depth understanding of the Kafka producer and consumer client functionality.
    • Experience troubleshooting custom Kafka client applications written in Java, .NET, Python, and Spring Boot.
    • Hands-on experience with Kafka Connect, Kafka Streams, and Kafka Schema Registry components, understanding their underlying functionality and implementation.
    • Preferably, at least 3 years' experience working in AWS, specifically EKS, EC2, IAM, Route53 and Terraform.
    • Proficiency in deploying, scaling, and managing Kubernetes clusters, with a strong understanding of security best practices.
    • Familiarity with Docker and Helm.
    • Kafka, Kubernetes, Docker, or any Cloud certification.
    Responsibilities:
    • Manage and enhance existing Apache Kafka and Confluent Platform on AWS.
    • Review existing implementations and recommend improvements.
    • Collaborate with internal teams and respective stakeholders to understand user requirements and implement technical solutions.
    • Collaborate with engineering and product teams to integrate new use cases and define scalable streaming patterns.
    • Implement and maintain Kafka producers/consumers, Connectors, and Kafka Streams applications.
    • Enforce governance around topic design, schema evolution, partitioning, and data retention.
    • Monitor, troubleshoot, and tune Kafka clusters using Confluent Control Center, Prometheus, and Grafana.
    • Use Kubernetes and Terraform to automate Kafka infrastructure deployment and scaling.
    • Ensure high availability, security, and disaster recovery, including participation in all DR exercises.
    • Responsibility and ownership for lifecycle management which includes upgrades, maintenance, restart, and migration projects.
    • Analyze all platforms and ensure the environments are right-sized (i.e., capacity management), along with managing configurations, monitors, and alerts.
    • Create guidelines, procedures, standards, conventions, and best practices for Kafka usage and administration.
    • Develop, maintain, and troubleshoot Terraform IaC modules.
    • Automate repetitive tasks using Terraform scripts.
    • Guide and mentor team members on Terraform implementations.
    • Partner with team members deploying releases to production, and support solution teams.
    • Coordinate with the Compute, Database, and other infrastructure teams to support weekend patching activities.
    • Coordinate vendor support.
    • Broad support with troubleshooting and resolving issues, and to be an escalation point for on-premise and cloud integration platform incidents.
    • Remediate security vulnerabilities reported by VMAST across application runtimes, integration services, and messaging platforms.
    • Support production incidents and outages for faster service restoration and provide required support to ensure application connectivity to API services.
    • Work on service requests submitted by application teams requesting application integration team services to integration products (Kafka and any other messaging platforms).
    • Create defects when services like tenant onboarding, resource provisioning, and other customer products are not functioning as per the documentation or working as intended.
    • Analyze ServiceNow incidents (INCs) and requests (REQs) to identify trends and opportunities for improvement.
    • Consider implementing automation for manual and repetitive operations support functions.
    • Troubleshoot operational issues related to the AWS infrastructure.
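Governance around topic design, partitioning, and retention (called out in the responsibilities above) is commonly enforced by validating proposed topic specs against house rules before creation. A sketch of such a check (the thresholds below are invented for illustration and are not Kafka defaults):

```python
def validate_topic(name: str, partitions: int,
                   replication_factor: int, retention_ms: int) -> list:
    """Return a list of governance violations for a proposed topic (empty = OK)."""
    errors = []
    if not name.islower() or " " in name:
        errors.append("topic names must be lowercase with no spaces")
    if not 1 <= partitions <= 48:            # illustrative cap
        errors.append("partitions must be between 1 and 48")
    if replication_factor < 3:               # tolerate one broker loss during maintenance
        errors.append("replication factor must be at least 3")
    if retention_ms > 7 * 24 * 3600 * 1000:  # illustrative 7-day retention cap
        errors.append("retention must not exceed 7 days")
    return errors

print(validate_topic("payments.events", 12, 3, 86_400_000))  # []
print(validate_topic("Bad Topic", 0, 1, 10**12))             # four violations
```

In a real deployment a check like this would sit in the pipeline that provisions topics (e.g., a Terraform module or an admin-client script), so the rules are applied uniformly rather than per request.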

    About the latest Kafka Engineer jobs in the United States!

    Senior Kafka Engineer

    92713 Irvine, California Solugenix Corporation

    Posted 14 days ago


    Job Description

    Senior Kafka Engineer
    Irvine, CA (Hybrid)
    Full-Time
    Job ID 25-09676

    Solugenix is seeking an experienced Senior Kafka Engineer for a full-time, hybrid position based in Irvine, CA.

    We are seeking a Senior Kafka Engineer to manage, enhance, and scale an enterprise-grade Apache Kafka implementation deployed on AWS and the Confluent Platform. This person will be responsible for keeping the system reliable, improving it over time, and expanding it to support new applications.
    This role involves performing detailed architectural reviews, monitoring, performance tuning, optimizing existing Kafka pipelines, and partnering with application teams to deliver reliable, secure, and performant streaming solutions.

    Qualifications:
    • 8+ years in platform engineering with 3+ years of hands-on experience with Apache Kafka.
    • Expertise with Confluent Platform (Brokers, Schema Registry, Control Center, ksqlDB).
    • Experience deploying and managing Kafka on AWS (including MSK or self-managed EC2-based setups).
    • Solid understanding of Kubernetes, especially EKS, for microservices integration.
    • Familiarity with monitoring and alerting stacks: Prometheus, Grafana, ELK, or similar.
    • Management and support of Kafka Cloud and on-premise platforms.
    • Capacity management.
    • Good understanding of SRE principles and methodologies, and experience learning, adapting to, and automating new developments in the Kafka ecosystem.
    • Experience troubleshooting integration platform issues (e.g., connectivity, schema management, producer/consumer, etc.) for Kafka, API gateway, etc.
    • In-depth understanding of the Kafka producer and consumer client functionality.
    • Experience troubleshooting custom Kafka client applications written in Java, .NET, Python, and Spring Boot.
    • Hands-on experience with Kafka Connect, Kafka Streams, and Kafka Schema Registry components, understanding their underlying functionality and implementation.
    • Preferably, at least 3 years' experience working in AWS, specifically EKS, EC2, IAM, Route53 and Terraform.
    • Proficiency in deploying, scaling, and managing Kubernetes clusters, with a strong understanding of security best practices.
    • Familiarity with Docker and Helm.
    • Kafka, Kubernetes, Docker, or any Cloud certification.
    Responsibilities:
    • Manage and enhance existing Apache Kafka and Confluent Platform on AWS.
    • Review existing implementations and recommend improvements.
    • Collaborate with internal teams and respective stakeholders to understand user requirements and implement technical solutions.
    • Collaborate with engineering and product teams to integrate new use cases and define scalable streaming patterns.
    • Implement and maintain Kafka producers/consumers, Connectors, and Kafka Streams applications.
    • Enforce governance around topic design, schema evolution, partitioning, and data retention.
    • Monitor, troubleshoot, and tune Kafka clusters using Confluent Control Center, Prometheus, and Grafana.
    • Use Kubernetes and Terraform to automate Kafka infrastructure deployment and scaling.
    • Ensure high availability, security, and disaster recovery, including participation in all DR exercises.
    • Responsibility and ownership for lifecycle management which includes upgrades, maintenance, restart, and migration projects.
    • Analyze all platforms and ensure the environments are right-sized (i.e., capacity management), along with managing configurations, monitors, and alerts.
    • Create guidelines, procedures, standards, conventions, and best practices for Kafka usage and administration.
    • Develop, maintain, and troubleshoot Terraform IaC modules.
    • Automate repetitive tasks using Terraform scripts.
    • Guide and mentor team members on Terraform implementations.
    • Partner with team members deploying releases to production, and support solution teams.
    • Coordinate with the Compute, Database, and other infrastructure teams to support weekend patching activities.
    • Coordinate vendor support.
    • Broad support with troubleshooting and resolving issues, and to be an escalation point for on-premise and cloud integration platform incidents.
    • Remediate security vulnerabilities reported by VMAST across application runtimes, integration services, and messaging platforms.
    • Support production incidents and outages for faster service restoration and provide required support to ensure application connectivity to API services.
    • Work on service requests submitted by application teams requesting application integration team services to integration products (Kafka and any other messaging platforms).
    • Create defects when services like tenant onboarding, resource provisioning, and other customer products are not functioning as per the documentation or working as intended.
    • Analyze ServiceNow incidents (INCs) and requests (REQs) to identify trends and opportunities for improvement.
    • Consider implementing automation for manual and repetitive operations support functions.
    • Troubleshoot operational issues related to the AWS infrastructure.
    Pay Range for CA, CO, IL, NJ, NY, WA, and DC: $70/hour to $90/hour. Starting rate of pay offered may vary depending on factors including but not limited to, position offered, location, education, training and/or experience.

    Solugenix will consider qualified applicants with a criminal history pursuant to the California Fair Chance Act and Ordinance. Applicants do not need to disclose their criminal history or participate in a background check until a conditional job offer is made to you. After making a conditional offer and running a background check, if we are concerned about conviction that is directly related to the job, applicants will be given the chance to explain the circumstances surrounding the conviction, provide mitigating evidence, or challenge the accuracy of the background report.

    About Solugenix
    Solugenix is a leader in IT services, delivering cutting-edge technology solutions, exceptional talent, and managed services to global enterprises. With extensive expertise in highly regulated and complex industries, we are a trusted partner for integrating advanced technologies with streamlined processes. Our solutions drive growth, foster innovation, and ensure compliance-providing clients with reliability and a strong competitive edge.
    Recognized as a 2024 Top Workplace, Solugenix is proud of its inclusive culture and unwavering commitment to excellence. Our recent expansion, with new offices in the Dominican Republic, Jakarta, and the Philippines, underscores our growing global presence and ability to offer world-class technology solutions. Partnering with Solugenix means more than just business-it means having a dedicated ally focused on your success in today's fast-evolving digital world.

    Messaging(Kafka) Engineer

    75215 Park Cities, Texas Info Way Solutions

    Posted 18 days ago


    Job Description

    Messaging(Kafka) Engineer
    Dallas/Tampa/New Jersey - onsite

    Key Responsibilities:
    Design and implement scalable, secure, and high-throughput messaging solutions using Confluent Kafka, Google Pub/Sub Lite, and Azure Service Bus.
    Develop and maintain Kafka topics, schemas, producers, and consumers to support real-time data pipelines and event-driven architectures.
    Collaborate with engineering, DevOps, and product teams to integrate messaging systems into cloud-native microservices and blockchain-based platforms.
    Ensure message durability, ordering, and replayability for financial transactions, including those related to corporate actions and collateral lifecycle events.
    Monitor and tune messaging infrastructure for performance, reliability, and compliance with regulatory requirements.
    Contribute to the development of internal standards and best practices for messaging and data streaming across hybrid cloud environments.
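The ordering guarantee mentioned above rests on Kafka's keyed partitioning: every message with the same key lands on the same partition, and each partition preserves append order. A simplified stand-in for the partition choice (the real Java-client default partitioner hashes the key with murmur2; the byte-sum hash below is only for illustration):

```python
def pick_partition(key: bytes, num_partitions: int) -> int:
    """Map a message key to a partition (illustrative hash, NOT murmur2)."""
    return sum(key) % num_partitions

# The same key always maps to the same partition, which is what
# preserves per-key ordering for, e.g., one account's transactions.
assert pick_partition(b"account-42", 6) == pick_partition(b"account-42", 6)

# Ordering across different keys is not promised: they may land on
# different partitions that are consumed independently.
print(pick_partition(b"account-42", 6), pick_partition(b"account-99", 6))
```

Replayability then follows from retention plus offsets: as long as a message is inside the topic's retention window, a consumer can seek back to an earlier offset and reprocess it.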

    Technical skills:
    5+ years of experience in distributed systems engineering with a focus on messaging and event streaming.
    Deep expertise in Confluent Kafka (including Schema Registry, Connect, and KSQL), with hands-on experience in production environments.
    Experience with Google Pub/Sub Lite and Azure Service Bus, including hybrid integration patterns.
    Proficiency in Java, Python, or Scala for building Kafka clients and stream processors.
    Familiarity with Kubernetes, Terraform, and CI/CD pipelines for infrastructure automation.

    Sr. Kafka Engineer

    30383 Atlanta, Georgia Truist Inc

    Posted 21 days ago


    Job Description

    The position is described below. If you want to apply, click the Apply Now button at the top or bottom of this page. After you click Apply Now and complete your application, you'll be invited to create a profile, which will let you see your application status and any communications. If you already have a profile with us, you can log in to check status.

    Need Help?

    If you have a disability and need assistance with the application, you can request a reasonable accommodation. Send an email to Accessibility (accommodation requests only; other inquiries won't receive a response).

    Regular or Temporary:

    Regular

    Language Fluency: English (Required)

    Work Shift:

    1st shift (United States of America)

    Please review the following job description:

    ** This position is on-site 4 days per week **

    The primary purpose of this role is to provide consultation and technical direction on translating business requirements and functional specifications into Middleware Messaging designs-particularly Kafka. This includes facilitating the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications. This role serves as a technical expert for project teams throughout the implementation and maintenance of enterprise Middleware solutions. In addition, this role personally develops and delivers automation, operational support, stable platforms, and integrated enterprise software solutions within various computing environments including on premise and cloud datacenters.

    ESSENTIAL DUTIES AND RESPONSIBILITIES

    Following is a summary of the essential functions for this job. Other duties may be performed, both major and minor, which are not mentioned below. Specific activities may change from time to time.

    1. Resolves complex problems throughout the entire lifecycle of Kafka based platforms, from build to retirement. This includes monitoring, security, automation, troubleshooting, and deployment capabilities.

    2. Provides consultation on business requirements and functional specifications in logical program designs, code modules, stable application systems, and software solutions designed around Kafka; facilitates the transition to high level design and supports the project lifecycle with input from executive leadership where needed.

    3. Contributes to and leverages the technical direction for the development, configuration, or modification of integrated business and enterprise Kafka solutions within various computing environments by providing insight and guidance for the design and coding of Kafka infrastructure.

    4. Supports systems integration testing and performance testing for large, complex, cross-functional application initiatives by providing insight to testing teams and ensuring the messaging platform is properly tuned and configured.

    5. Mentors and advises others in all software development lifecycle phases by applying and sharing an in-depth understanding of company and industry methodologies, policies, standards, and controls.

    6. Communicates changes in software architecture and coaches others to apply this understanding to software solutions; resolves escalated issues.

    7. Leads efforts to improve engineering, test, and operational excellence best practices.

    8. Solves complex cross-functional architecture, design, and business problems; ensures solutions are extensible; works to simplify, optimize, and remove bottlenecks.
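    The monitoring and tuning duties above revolve around keeping Kafka consumers healthy. As a minimal illustration of the kind of metric involved, the sketch below computes per-partition consumer lag (log-end offset minus committed offset), a standard Kafka health indicator. The function name and dictionary shapes are assumptions for illustration, not part of this role description or any real client API.

```python
# Hypothetical sketch: per-partition consumer lag for one consumer group.
# Lag = log-end offset (latest message written) - committed consumer offset.
# Dict shapes {partition: offset} are illustrative assumptions.

def consumer_lag(end_offsets, committed_offsets):
    """Return (per-partition lag, total lag) for one consumer group."""
    lag = {}
    for partition, end in end_offsets.items():
        # A partition with no committed offset is treated as fully behind.
        committed = committed_offsets.get(partition, 0)
        lag[partition] = max(end - committed, 0)
    return lag, sum(lag.values())

# Example: partition 0 is caught up; partition 1 is 150 messages behind.
per_partition, total = consumer_lag(
    end_offsets={0: 1000, 1: 2400},
    committed_offsets={0: 1000, 1: 2250},
)
```

In a real deployment these offsets would come from a Kafka admin client or from broker metrics; the arithmetic above is the part that stays the same.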

    QUALIFICATIONS

    Required Qualifications:

    The requirements listed below are representative of the knowledge, skill and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

    1. Bachelor's degree in computer science, CIS, or related field

    2. Five to seven years of experience in software development or a related field

    3. Five to seven years of experience in database technologies

    4. Five to seven years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)

    Preferred Qualifications:

    1. Master's degree in computer science, CIS, or related field

    2. Five to seven years of IT experience developing and implementing business systems within an organization

    3. Five to seven years of experience working with defect or incident tracking software

    4. Five to seven years of experience writing technical documentation in a software development environment

    5. Five to seven years of experience working with an IT Infrastructure Library (ITIL) framework

    6. Three to five years of experience working with Kafka platforms

    7. One to three years of experience working with Kubernetes-based deployments

    8. Experience with other messaging platforms like IBM MQ is a plus

    9. Specific experience with OpenShift- and/or EKS-based Kubernetes is a plus

    For this opportunity, Truist will not sponsor an applicant for work visa status or employment authorization, nor will we offer any immigration-related support for this position (including, but not limited to, H-1B, F-1 OPT, F-1 STEM OPT, F-1 CPT, J-1, TN-1 or TN-2, E-3, O-1, or future sponsorship for U.S. lawful permanent residence status).

    General Description of Available Benefits for Eligible Employees of Truist Financial Corporation: All regular teammates (not temporary or contingent workers) working 20 hours or more per week are eligible for benefits, though eligibility for specific benefits may be determined by the division of Truist offering the position. Truist offers medical, dental, vision, life insurance, disability, accidental death and dismemberment, tax-preferred savings accounts, and a 401k plan to teammates. Teammates also receive no less than 10 days of vacation (prorated based on date of hire and by full-time or part-time status) during their first year of employment, along with 10 sick days (also prorated), and paid holidays. For more details on Truist's benefit plans, please visit our Benefits site. Depending on the position and division, this job may also be eligible for Truist's defined benefit pension plan, restricted stock units, and/or a deferred compensation plan. As you advance through the hiring process, you will also learn more about the specific benefits available for any non-temporary position for which you apply, based on full-time or part-time status, position, and division of work.

    Truist is an Equal Opportunity Employer that does not discriminate on the basis of race, gender, color, religion, citizenship or national origin, age, sexual orientation, gender identity, disability, veteran status, or other classification protected by law. Truist is a Drug Free Workplace.

    EEO is the Law

    Pay Transparency Nondiscrimination Provision

    E-Verify

