39,835 Kafka Engineer jobs in the United States
Kafka Engineer
Posted today
Job Description
- Good knowledge of Kafka
- Good communication skills
- Design, develop, and manage Kafka-based data pipelines
- Work with other big data technologies such as Hadoop, Spark, and Storm
- Monitor and optimize Kafka clusters
- Troubleshoot Kafka-related issues
- Handle customer queries and support
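As a rough illustration of the pipeline work listed above, here is a minimal Java sketch of a Kafka producer and consumer using the plain Apache Kafka client. The broker address, topic name ("events"), group id, and payload are placeholder assumptions, not details from this posting.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class EventsPipelineSketch {
    public static void main(String[] args) {
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Produce a single record to the hypothetical "events" topic.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("events", "order-123", "{\"status\":\"CREATED\"}"));
        }

        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "events-reader");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Consume and print whatever is on the topic.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("events"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            records.forEach(r -> System.out.printf("%s -> %s%n", r.key(), r.value()));
        }
    }
}
```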
Kafka Engineer
Posted 3 days ago
Job Description
Xenith Solutions is a small, family-focused business where we take care of our employees and customers equally. We are focused on serving Federal/Civilian, Defense, and Intelligence organizations with superior service. If you want to be a part of a rapidly growing business with an exceptional culture, then you want to be a part of the Xenith Solutions family.
Xenith offers unmatched Benefits:
- 100% of Medical, Dental, and Vision benefits paid by employer
- FSA or HSA available
- Unlimited Paid Time Off (PTO)
- 401(k) matching (100% up to the first 4%) with NO vesting period
- Tuition / Certification / Training reimbursement
- Accident / Disability / Universal Life Insurance
- And much more…
Xenith Solutions is currently looking for a Kafka Engineer with agile methodology experience to join our BEAGLE (Border Enforcement Applications for Government Leading-Edge Information Technology) Agile Solution Factory (ASF) Team supporting a Customs and Border Protection (CBP) client located in Northern Virginia! Join this passionate team of industry-leading individuals supporting the best practices in Agile Software Development for the Department of Homeland Security (DHS).
Responsibilities include:
- Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java.
- Collaborate with architects and other engineering teams to define and evolve our event-driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention.
- Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features.
- Monitor, troubleshoot, and optimize Kafka clusters and Kafka-dependent applications for throughput, latency, reliability, and resource utilization.
- Build and maintain robust and resilient data pipelines for real-time ingestion, transformation, and distribution of data across various systems.
- Provide operational support for Kafka-based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability.
- Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services.
- Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end-to-end tests, ensuring data integrity and system reliability.
- Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka-related components and processes.
- Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization.
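The responsibilities above call out Kafka Streams among the stream processing options; the sketch below shows what a very small Streams topology in Java can look like. The application id, topic names ("raw-events", "error-events"), and the filtering condition are illustrative assumptions only.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class StreamFilterSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-filter-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw events, keep only the ones flagged as errors, and write them to a second topic.
        KStream<String, String> events = builder.stream("raw-events");
        events.filter((key, value) -> value != null && value.contains("\"level\":\"ERROR\""))
              .to("error-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```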
You have:
- Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include, but are not limited to:
- 3-year check for felony convictions
- 1-year check for illegal drug use
- 1-year check for misconduct such as theft or fraud
- College degree (B.S.) in Computer Science, Software Engineering, Information Management Systems, or a related discipline. Equivalent professional experience will be considered in lieu of a degree.
- Professional Experience: at least seven (7) years of related technical experience, with software design, development, and implementation in a Windows environment.
You’ll Bring These Qualifications:
- Extensive hands-on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups).
- Deep understanding of Kafka's internal architecture, guarantees (at-least-once, exactly-once), offset management, and delivery semantics.
- Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka).
- Programming Proficiency: High-level proficiency in at least one modern backend programming language suitable for Kafka development (Java strongly preferred).
- Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns.
- Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry.
- Solid understanding of relational and/or NoSQL databases, and experience integrating them with Kafka.
- Excellent analytical, debugging, and problem-solving skills in complex distributed environments.
- Strong verbal and written communication skills, with the ability to clearly articulate technical concepts to diverse audiences.
- Knowledge of monitoring and observability tools for Kafka and streaming applications (e.g., Prometheus, Grafana, ELK stack, Datadog).
- Working knowledge of Git and collaborative development workflows.
- Understanding of all elements of the software development life cycle, including planning, development, requirements management, CM, quality assurance, and release management.
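Several of the qualifications above concern delivery guarantees (at-least-once vs. exactly-once) and offset management. As a hedged sketch of how exactly-once producing is commonly configured with the standard Java client, the example below enables idempotence and wraps a send in a transaction; the transactional id, topic, and record values are hypothetical.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class ExactlyOnceProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Idempotence plus a transactional.id upgrades the producer from
        // at-least-once to exactly-once within a transaction.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-writer-1");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("orders", "order-42", "CONFIRMED"));
            producer.commitTransaction();   // or abortTransaction() on failure
        }

        // Downstream consumers only see committed records when configured with
        // "isolation.level" = "read_committed".
    }
}
```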
These Qualifications Would be Nice to Have:
- Hands-on experience with Confluent Platform components (Control Center, ksqlDB, REST Proxy, Tiered Storage).
- Experience with Kafka Connect for building data integration pipelines (developing custom connectors is a plus).
- Familiarity with cloud platforms (AWS, Azure, GCP) and managed Kafka services (e.g., AWS MSK, Confluent Cloud, Azure Event Hubs).
- Experience with containerization (Docker) and orchestration (Kubernetes) for deploying Kafka-dependent applications.
- Experience with CI/CD pipelines for automated testing and deployment of Kafka-based services.
- Familiarity with performance testing and benchmarking tools for Kafka and related applications.
Xenith Solutions LLC is a Service-Disabled Veteran-Owned Small Business founded in 2019. We provide comprehensive, timely and relevant Solutions and Business Consulting support to our customers as a key partner. Our leadership brings over a century of combined experience in Defense and Civilian markets. Our employees possess experience in all aspects of solution development from requirements creation, development, test and evaluation, fielding, and sustainment. At the core of our offerings, we provide strategy and technology solutions, giving our customers valuable insights and thought leadership on the best application of information technology to drive business objectives.
Xenith focuses on solving complex business challenges facing our customers. Our “Success Through Achievement” work ethic means our customers receive quality solutions through our commitment. We pride ourselves on tackling some of the most difficult operational requirements our customers have – ensuring an appropriate match between the mission requirements, financials, schedule, and security.
EEO
Xenith Solutions provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state and local laws.
EEO IS THE LAW
If you are an individual with a disability and would like to request a reasonable accommodation as part of the employment selection process, please contact Xenith Solutions.
E-Verify
As a Federal Contractor, Xenith Solutions is required to participate in the E-Verify Program to confirm eligibility to work in the United States.
Affirmative Action Plan
As a federal government contractor and based on Executive Orders and applicable laws and regulations, Xenith Solutions develops and maintains annual written Affirmative Action Plans and endeavors to hire and advance qualified minorities, females, individuals with disabilities, and protected veterans.
Kafka Engineer
Posted 3 days ago
Job Description
Kafka Engineer
Job Category: Information Technology
Time Type: Full time
Minimum Clearance Required to Start: None
Employee Type: Regular
Percentage of Travel Required: Up to 10%
Type of Travel: Local
The Opportunity:
CACI is seeking a Kafka Engineer to join our team and support the Border Enforcement Applications for Government Leading-Edge Information Technology (IT) (BEAGLE) contract. You will have the opportunity to apply your knowledge, skills, and experience to building a truly modern application that is new development and cloud native. If you thrive in a culture of innovation and bring creative ideas to solve complex technical and procedural problems at the team and portfolio levels, then this opportunity is for you!
Join this passionate team of industry-leading individuals supporting best practices in agile software development for the Department of Homeland Security (DHS). You will support the men and women charged with safeguarding the American people and enhancing the nation's safety and security.
Responsibilities:
- Serve as an Agile Scrum team member providing software development support and maintenance for the delivery of releasable software in short sprint cycles. Responsible for activities associated with the delivery of software solutions for customer-defined systems and software projects, working in close collaboration with software developers/engineers, stakeholders, and end users within Agile processes. Responsibilities include:
- Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java.
- Collaborate with architects and other engineering teams to define and evolve our event-driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention.
- Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features.
- Monitor, troubleshoot, and optimize Kafka clusters and Kafka-dependent applications for throughput, latency, reliability, and resource utilization.
- Build and maintain robust and resilient data pipelines for real-time ingestion, transformation, and distribution of data across various systems.
- Provide operational support for Kafka-based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability.
- Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services.
- Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end-to-end tests, ensuring data integrity and system reliability.
- Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka-related components and processes.
- Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization.
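Since the responsibilities above highlight Schema Registry with Avro, here is a small, non-authoritative Java sketch of producing an Avro GenericRecord with Confluent's KafkaAvroSerializer. The schema, topic name ("crossings"), field values, and the Schema Registry URL (the common default http://localhost:8081) are assumptions for illustration.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AvroProducerSketch {
    // Hypothetical record schema; real schemas would live in source control and the registry.
    private static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"Crossing\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},"
        + "{\"name\":\"port\",\"type\":\"string\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers/validates the schema against Schema Registry.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "C-1001");
        record.put("port", "IAD");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("crossings", "C-1001", record));
        }
    }
}
```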
Qualifications:
Required:
- Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include, but are not limited to:
- 3-year check for felony convictions
- 1-year check for illegal drug use
- 1-year check for misconduct such as theft or fraud
- Extensive hands-on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups).
- Deep understanding of Kafka's internal architecture, guarantees (at-least-once, exactly-once), offset management, and delivery semantics.
- Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka).
- Programming Proficiency: High-level proficiency in at least one modern backend programming language suitable for Kafka development (Java strongly preferred).
- Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns.
- Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry.
- Solid understanding of relational and/or NoSQL databases, and experience integrating them with Kafka.
- Excellent analytical, debugging, and problem-solving skills in complex distributed environments.
- Strong verbal and written communication skills, with the ability to clearly articulate technical concepts to diverse audiences.
- Knowledge of monitoring and observability tools for Kafka and streaming applications (e.g., Prometheus, Grafana, ELK stack, Datadog).
- Working knowledge of Git and collaborative development workflows.
- Understanding of all elements of the software development life cycle, including planning, development, requirements management, CM, quality assurance, and release management.
- Professional Experience: at least seven (7) years of related technical experience, with software design, development, and implementation in a Windows environment.
- College degree (B.S.) in Computer Science, Software Engineering, Information Management Systems, or a related discipline. Equivalent professional experience will be considered in lieu of a degree.
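The required skills above emphasize offset management and delivery semantics. As one common pattern (not necessarily how this program implements it), the sketch below shows a Java consumer with auto-commit disabled that commits offsets only after a batch is processed, yielding at-least-once behavior; the group id, topic, and process() helper are hypothetical.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class AtLeastOnceConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "audit-consumer");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Disable auto-commit so offsets advance only after records are processed:
        // reprocessing after a crash is possible, silent loss is not.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("audit-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    process(record);            // hypothetical downstream handling
                }
                consumer.commitSync();          // commit only after the batch succeeded
            }
        }
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.printf("partition=%d offset=%d value=%s%n",
                          record.partition(), record.offset(), record.value());
    }
}
```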
Desired:
- Hands-on experience with Confluent Platform components (Control Center, ksqlDB, REST Proxy, Tiered Storage).
- Experience with Kafka Connect for building data integration pipelines (developing custom connectors is a plus).
- Familiarity with cloud platforms (AWS, Azure, GCP) and managed Kafka services (e.g., AWS MSK, Confluent Cloud, Azure Event Hubs).
- Experience with containerization (Docker) and orchestration (Kubernetes) for deploying Kafka-dependent applications.
- Experience with CI/CD pipelines for automated testing and deployment of Kafka-based services.
- Familiarity with performance testing and benchmarking tools for Kafka and related applications.
___
What You Can Expect:
A culture of integrity.
At CACI, we place character and innovation at the center of everything we do. As a valued team member, you'll be part of a high-performing group dedicated to our customer's missions and driven by a higher purpose - to ensure the safety of our nation.
An environment of trust.
CACI values the unique contributions that every employee brings to our company and our customers - every day. You'll have the autonomy to take the time you need through a unique flexible time off benefit and have access to robust learning resources to make your ambitions a reality.
A focus on continuous growth.
Together, we will advance our nation's most critical missions, build on our lengthy track record of business success, and find opportunities to break new ground - in your career and in our legacy.
Your potential is limitless. So is ours.
Learn more about CACI here.
___
Pay Range: There are a host of factors that can influence final salary including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at CACI that allows them to balance quality work and their personal lives. We offer competitive compensation, benefits, and learning and development opportunities. Our broad and competitive mix of benefits options is designed to support and protect employees and their families. At CACI, you will receive comprehensive benefits such as healthcare, wellness, financial, retirement, family support, continuing education, and time off benefits. Learn more here.
The proposed salary range for this position is:
$103,800 - $218,100
CACI is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, age, national origin, disability, status as a protected veteran, or any other protected characteristic.
Kafka Engineer
Posted 9 days ago
Job Description
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
Kafka Engineer
Posted 9 days ago
Job Description
Stand up and administer Kafka clusters. Strong experience with Confluent Kafka.
• Expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Replicator, across both Confluent and Apache Kafka.
• Solid understanding of Kafka architecture, including how to reduce Kafka storage usage.
• Working knowledge of Apache Kafka and Linux/Unix systems.
• Use automation and provisioning tools such as Docker, Bamboo, Harness, and GitLab.
• Hands-on command-line usage of Kafka and all of its components.
• Ability to perform data-related benchmarking, performance analysis, and tuning.
• Solid knowledge of monitoring and fine-tuning with Splunk, Prometheus, and Grafana.
• Set up security on Kafka (configure the different security mechanisms Kafka supports).
• Manage Kafka users and topics.
• Monitor, prevent, and troubleshoot security-related issues.
• Good knowledge of CI/CD pipelines.
• Automation and scripting using shell, Python, Ansible, and SaltStack.
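To make the topic and user administration duties above concrete, here is a minimal sketch using Kafka's Java AdminClient to create a topic on a SASL-secured cluster and list existing topics. The broker address, SASL mechanism, credentials, topic name, and retention/compression settings are placeholders, not values from this posting.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class TopicAdminSketch {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-1:9093");
        // Example SASL_SSL settings for a secured cluster (all values are placeholders).
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
                  "org.apache.kafka.common.security.scram.ScramLoginModule required "
                  + "username=\"kafka-admin\" password=\"change-me\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions, replication factor 3, 7-day retention, lz4 compression.
            NewTopic topic = new NewTopic("payments", 6, (short) 3)
                    .configs(Map.of("retention.ms", "604800000",
                                    "compression.type", "lz4"));
            admin.createTopics(List.of(topic)).all().get();
            admin.listTopics().names().get().forEach(System.out::println);
        }
    }
}
```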
KAFKA ENGINEER
Posted 9 days ago
Job Description
BFS
Onsite in Plano, TX
Max rate: $65
KAFKA ENGINEER
Experience: 7 to 10 years
Shift: Day, 9 AM to 7 PM
Required Skills - Kafka
Nice to have skills - Azure DevOps
Roles & Responsibilities
Utilize your 7-10 years of experience and expertise in Kafka to drive effective communication strategies.
Hands-on experience working on Kafka Connect with Schema Registry in a very high-volume environment
Complete understanding of Kafka config properties (acks, timeouts, buffering, partitioning, etc.)
Design and recommend the best approach for moving data to and from different sources using Apache/Confluent Kafka
Hands-on experience working with converters (Avro, JSON) and Kafka connectors
Hands-on experience building custom connectors using Kafka core concepts and APIs
Working knowledge of the Kafka REST Proxy
Ensure optimum performance, high availability, and stability of solutions
Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices
Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages and platforms
Ability to perform data-related benchmarking, performance analysis, and tuning
Skills Required: Spring, Kubernetes, Kafka, Azure DevOps pipelines, Jenkins pipelines, etc.
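As a concrete, hypothetical example of the producer configuration knowledge described above, the Java sketch below sets acks, timeout, buffering/batching, and compression properties on a standard KafkaProducer; the keyed send illustrates how partitioning preserves per-key ordering. All endpoint, topic, and tuning values are illustrative assumptions.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class TunedProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Durability: wait for all in-sync replicas to acknowledge.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        // Timeouts: overall delivery budget and per-request timeout.
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "120000");
        props.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, "30000");
        // Buffering/batching: trade a little latency for higher throughput.
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, "33554432");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, "65536");
        props.put(ProducerConfig.LINGER_MS_CONFIG, "20");
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Records with the same key hash to the same partition, preserving per-key order.
            producer.send(new ProducerRecord<>("transactions", "account-7", "debit:100.00"));
        }
    }
}
```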
Job Location Primary: USTXPAOC01 Plano - TX USA
Job Type: 60CW00 Business Associate
Demand Requires Travel?: N
Certification(s) Required: NA
Kafka Engineer
Posted 9 days ago
Job Description
• Title: Kafka Engineer (80% engineering, 20% admin)
• Duration: 1 year
• Location: Remote (PST hours)
• Interview: 1-2 rounds focused on technical knowledge and configuration
Top Skills:
• Kafka Connectors (most important: integration knowledge)
• Configuration experience
• Nice to have: KSQL (Streaming SQL for Apache Kafka)
Job Responsibilities:
• Confluent Kafka administration across multi-DC brokers, connectors, C3, KSQL DB, Rest Proxy, Schema Registry
• Configure and maintain Kafka topics, RBAC, connectors, KSQL, and schema registry
• Ensure Kafka security, scalability, availability, and disaster recovery
• Support Kafka clients in Java, Node.js, and Python
• Basic administration of Apache NiFi (OSS)
• Design and develop Kafka-based solutions using Confluent Kafka, Connectors, KSQL, and NiFi
• Experience with cloud (AWS), data connectors, and queue connectors (design + configuration)
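Because connector integration is called out as the top skill for this role, here is a hedged Java sketch that registers a connector through the Kafka Connect REST API (default port 8083) using the FileStreamSourceConnector that ships with Apache Kafka. The connector name, file path, topic, and Connect host are placeholder assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnectorSketch {
    public static void main(String[] args) throws Exception {
        // Connector name, file path, and topic are illustrative placeholders.
        String body = """
            {
              "name": "file-source-sketch",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/var/log/app/events.log",
                "topic": "app-events"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))   // Connect REST endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```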
This is a remote position.
Compensation: $45.00 - $50.00 per hour