15,200 Databricks jobs in the United States
Data Engineer / Databricks, PySpark, DevOps
Posted 21 days ago
Job Description
Data Engineer
Seattle, WA
Contract Role
Onsite from Day 1
Essential Skills:
• 7+ years of experience in data engineering or a related field
• Expertise with programming languages such as Python/PySpark, SQL, or Scala
• Experience working in a cloud environment (Azure preferred) with strong understanding of cloud data architecture
• Hands-on experience with the Databricks cloud data platform (required)
• Experience with workflow orchestration (e.g., Databricks Jobs or Azure Data Factory pipelines) (required)
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
Databricks Engineer

Posted 2 days ago
Job Description
Insight Global is seeking a Sr. Databricks Administrator to join the Unified Data Platform team at one of our largest utility clients. This individual will be a key participant and contributor to the Databricks Center of Excellence for data platform tools. The Databricks Administrator will be responsible for providing technical expertise in deploying applications on the AWS Databricks platform. As a Databricks Administrator, you will be responsible for the administration, configuration, and optimization of the Databricks platform to enable data analytics, machine learning, and data engineering activities within the organization. You will collaborate with data engineers, data scientists, and other stakeholders to ensure the platform meets their requirements while maintaining security and performance standards.
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request via the Human Resources Request Form. The EEOC "Know Your Rights" Poster is also available.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
- 5+ years of experience within the IT industry
- Understanding of Databricks administration within an AWS infrastructure
- Proficiency in the AWS cloud platform
- Experience with Terraform
- Experience with GitLab
- Experience with Python and/or Spark in Databricks
- Bachelor's degree in a relevant field (computer science, data engineering, etc.)
- Experience within data analytics
- Enterprise experience
- Experience in the utility/energy industry
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal employment opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment without regard to race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request via the Human Resources Request Form.
Databricks Engineer

Posted 2 days ago
Job Description
Nightwing is supporting a U.S. Government customer on a large mission critical development and sustainment program to design, integrate, build, deliver, and operate a network operations environment including introducing new cyber capabilities to address emerging threats. Nightwing is seeking a Databricks Engineer to support the migration of customer applications, services and platforms to a Medallion Model. This opportunity represents the cornerstone of the future of the organization.
**Responsibilities:**
+ Supporting teams to migrate services, applications and platforms from legacy back-end systems to Databricks.
+ Identifying the optimal path for migration, building the plan for migration and finally, execution of the plan.
+ Data Pipeline migration of legacy pipelines from NiFi to Databricks, complete with validation.
+ Implementing the medallion model (bronze, silver, and gold layers) for each of the data assets being migrated to Databricks; a minimal sketch follows this list.
+ Develop an SOP for integration of data assets into the Databricks platform focused on efficiency, instrumentation and performance.
+ Optimize development, testing, monitoring and security for data assets being added to the Databricks platform.
+ Develop and implement a strategy for optimizing migration and integration of data assets to the Databricks platform.
+ Develop code using various programming and scripting languages to automate/optimize data ingestion, pipeline orchestration and improve data management processes.
+ Ensure ingest transparency, leveraging technologies such as AWS CloudWatch to identify where to measure and gather performance information on automated data pipelines.
+ Ensure Data Engineering Team Standard Operating Procedures are appropriately captured and communicated across the team.
+ Ensure technical correctness, timeliness and quality of delivery for the team.
+ Demonstrate excellent oral and written communication skills to all levels of management and the customer.
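For illustration only, the following is a minimal PySpark sketch of the bronze/silver/gold medallion flow referenced above; every path, table name, and column is a hypothetical placeholder rather than a detail of this program.

```python
# Minimal sketch of a bronze/silver/gold (medallion) flow on Databricks.
# All paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Bronze: land raw records as-is, preserving source fidelity for auditing.
raw = spark.read.json("s3://landing-zone/events/")  # hypothetical landing path
raw.write.format("delta").mode("append").saveAsTable("bronze.events")

# Silver: deduplicate and validate so downstream consumers get trusted data.
silver = (
    spark.read.table("bronze.events")
    .dropDuplicates(["event_id"])
    .filter(F.col("event_ts").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")

# Gold: business-level aggregates ready for reporting and analytics.
gold = silver.groupBy("event_type").agg(F.count("*").alias("event_count"))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.event_counts")
```

In practice each layer would typically run as a separate task in a Databricks Workflow so failures can be retried independently.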
**Required Skills:**
+ Must be a US Citizen
+ Active Secret (S) clearance. Must be able to obtain a TS/SCI clearance
+ Must be able to obtain DHS Suitability
+ 8+ years of directly relevant software development experience required.
+ Minimum of 5 years of experience performing data engineering work in a cloud environment.
+ Experience with relational, NoSQL, and/or file-based storage (e.g., Databricks, Elastic, Postgres, S3, Athena)
+ Experience working in a CI/CD pipeline factory environment
+ Working knowledge of Databricks, Cloud Relational Database Services, NiFi, AWS Redshift, and Elasticsearch
**Desired Skills:**
+ Experience with:
+ Databricks workflows
+ Databricks Unity Catalog
+ Databricks Auto Loader (see the ingestion sketch after this list)
+ Databricks Delta Live Tables/Delta Lake
+ Databricks Workspace/Notebooks
+ MLflow
+ Apache Spark
+ Experience with collaboration tools including MS Teams, MS Outlook, MS SharePoint, and Confluence
+ Amazon Web Services (AWS) Professional certification or equivalent.
+ Excellent problem-solving and communication skills.
+ Familiarity with CISA: Securing the Software Supply Chain
+ Familiarity with CISA: Cybersecurity Best Practices
+ Familiarity with CISA: Open-Source Software Security
+ Familiarity with NIST SP 800-218, Secure Software Development Framework V1.1: Recommendations for Mitigating the Risk of Software Vulnerabilities
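As a rough illustration of the Auto Loader item above, this sketch shows incremental file ingestion with the cloudFiles source into a bronze Delta table; the bucket paths and table name are hypothetical.

```python
# Minimal sketch of Databricks Auto Loader (cloudFiles) incrementally ingesting
# files into a bronze Delta table. Paths and table name are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")                                  # source file format
    .option("cloudFiles.schemaLocation", "s3://bucket/_schemas/events")   # where schema inference is tracked
    .load("s3://bucket/landing/events/")
)

query = (
    stream.writeStream
    .option("checkpointLocation", "s3://bucket/_checkpoints/events")      # exactly-once progress tracking
    .trigger(availableNow=True)                                           # process all new files, then stop
    .toTable("bronze.events")
)
```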
**Required Education:**
+ Bachelor's degree in Software Engineering, Computer Science, or a related discipline is required. (Ten (10) additional years of experience, for a total of eighteen (18) or more years, may be substituted for a degree.)
Syndeo # - 3466
_At Nightwing, we value collaboration and teamwork. You'll have the opportunity to work alongside talented individuals who are passionate about what they do. Together, we'll leverage our collective expertise to drive innovation, solve complex problems, and deliver exceptional results for our clients._
_Thank you for considering joining us as we embark on this new journey and shape the future of cybersecurity and intelligence together as part of the Nightwing team._
_Nightwing is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class._
Databricks Data Engineer
Posted 14 days ago
Job Description
Job Type: Contract
Duration: 6 Months
100% Remote role
Job Summary:
We are seeking a skilled and experienced Data Engineer to join our team and take a pivotal role in building, optimizing, and maintaining our Databricks platform. As a technical team member, you will leverage your expertise in cloud technologies and data engineering to ensure high performance and scalability of our data solutions. This position requires a passion for data, a commitment to innovation, and a focus on delivering efficient and effective solutions.
Key Responsibilities:
- Ability to analyze data and support data preparation
- Design and maintain data pipelines for streaming data, batch processing, and real-time analytics.
- Ability to build data pipelines for different ingestion patterns.
- Apply data modeling concepts to pipelines.
- Implement basic data validation checks in pipelines (a minimal example follows this list).
- Automate execution and deployment of pipelines.
- Collaborate with Data Platform Architects and Data Engineers to integrate feature stores, manage model deployment, and implement data preparation workflows.
- Support data preparation, exploration, and visualization to facilitate data-driven decision-making.
- Work closely with other engineering teams to ensure data pipelines are secure, efficient, and aligned with business requirements.
- Document current and future state data flows
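As a hedged example of the basic validation checks mentioned in the responsibilities above, the sketch below flags rows that break simple expectations and fails the step when the error rate is too high; the table name, columns, and threshold are hypothetical.

```python
# Minimal sketch of basic validation checks inside a pipeline step.
# The table name, columns, and threshold are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("silver.orders")  # hypothetical input table

# Row-level rules: flag records that violate basic expectations.
invalid = df.filter(
    F.col("order_id").isNull()
    | (F.col("amount") < 0)
    | F.col("order_date").isNull()
)

# Fail fast (or route rows to a quarantine table) when the error rate is too high.
error_rate = invalid.count() / max(df.count(), 1)
if error_rate > 0.01:
    raise ValueError(f"Validation failed: {error_rate:.2%} of rows violate basic checks")
```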
Qualifications:
- Bachelor's degree in Computer Science, a technical field, or a related business discipline. Equivalent experience, certifications, or training will be considered.
- 5+ years of experience in designing and delivering secure cloud solutions.
- 2+ years of experience with Databricks
- 2+ years of experience with AWS (preferred)
- Experience with Agile methodology
- Strong problem-solving, analytical, and communication skills.
Databricks Data Engineer
Posted 19 days ago
Job Description
Databricks Data Engineer
The Opportunity:
Ever-expanding technology like IoT, machine learning, and artificial intelligence means that there's more structured and unstructured data available today than ever before. As a Data Engineer, you know that organizing big data can yield pivotal insights when it's gathered from disparate sources. We need an experienced Data Engineer like you to help our clients find answers in their big data to impact important missions, from fraud detection to cancer research to national intelligence.
As a Data Engineer at Booz Allen, you'll implement data engineering activities on some of the most mission-driven projects in the industry. You'll deploy and develop pipelines and platforms that organize and make disparate data meaningful.
Here, you'll work with and guide a multi-disciplinary team of analysts, data engineers, developers, and data consumers in a fast-paced, agile environment. You'll use your experience in analytical exploration and data examination while you manage the assessment, design, building, and maintenance of scalable platforms for your clients.
Join us. The world can't wait.
You Have:
5+ years of experience designing, developing, operationalizing, and maintaining complex data applications at an enterprise scale
3+ years of experience creating software for retrieving, parsing, and processing structured and unstructured data, including Python, SQL, or Java
3+ years of experience building scalable ETL and ELT workflows for reporting and analytics
2+ years of experience with Databricks
Experience with data warehousing using AWS Redshift, MySQL, PostgreSQL, or Snowflake
Experience with distributed data and computing tools, including Spark, AWS EMR, or Kafka
Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud
IRS Moderate Risk Background Investigation (IRS MBI)
Bachelor's degree
Databricks Certified Data Engineer Professional Certification
Nice If You Have:
Experience with Palantir Foundry
Experience with UNIX or Linux, including basic commands and Shell scripting
Experience working on real-time data and streaming applications
Experience with AWS
Master's degree
Vetting:
Applicants selected must possess an active IRS Moderate Risk Background Investigation (IRS MBI) and meet eligibility requirements of the U.S. government client; IRS MBI is required.
Compensation
At Booz Allen, we celebrate your contributions, provide you with opportunities and choices, and support your total well-being. Our offerings include health, life, disability, financial, and retirement benefits, as well as paid leave, professional development, tuition assistance, work-life programs, and dependent care. Our recognition awards program acknowledges employees for exceptional performance and superior demonstration of our values. Full-time and part-time employees working at least 20 hours a week on a regular basis are eligible to participate in Booz Allen's benefit programs. Individuals that do not meet the threshold are only eligible for select offerings, not inclusive of health benefits. We encourage you to learn more about our total benefits by visiting the Resource page on our Careers site and reviewing Our Employee Benefits page.
Salary at Booz Allen is determined by various factors, including but not limited to location, the individual's particular combination of education, knowledge, skills, competencies, and experience, as well as contract-specific affordability and organizational requirements. The projected compensation range for this position is $62,000.00 to $141,000.00 (annualized USD). The estimate displayed represents the typical salary range for this position and is just one component of Booz Allen's total compensation package for employees. This posting will close within 90 days from the Posting Date.
Identity Statement
As part of the application process, you are expected to be on camera during interviews and assessments. We reserve the right to take your picture to verify your identity and prevent fraud.
Work Model
Our people-first culture prioritizes the benefits of flexibility and collaboration, whether that happens in person or remotely.
- If this position is listed as remote or hybrid, you'll periodically work from a Booz Allen or client site facility.
- If this position is listed as onsite, you'll work with colleagues and clients in person, as needed for the specific role.
Commitment to Non-Discrimination
All qualified applicants will receive consideration for employment without regard to disability, status as a protected veteran or any other status protected by applicable federal, state, local, or international law.
Databricks Data Engineer
Posted 21 days ago
Job Description
Title: Databricks Data Engineer (Senior Manager / AVP)
Location: Columbus, OH (1st choice), Remote (2nd choice)
Experience Required: 7 to 12 years
Base Salary: Up to $150k
Job Overview:
We are seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for development and optimization of data pipelines, implementing robust data checks, and ensuring the accuracy and integrity of data flows. This role is critical in supporting data-driven decision-making processes, especially in the context of our insurance-focused business operations.
Key Responsibilities:
- Collaborate with data analysts, reporting team and business advisors to gather requirements and define data models that effectively support business requirements
- Develop and maintain scalable and efficient data pipelines to ensure seamless data flow across various systems, and address any issues or bottlenecks in existing pipelines.
- Implement robust data checks to ensure the accuracy and integrity of data. Summarize and validate large datasets to ensure they meet quality standards.
- Monitor data jobs for successful completion. Troubleshoot and resolve any issues that arise to minimize downtime and ensure continuity of data processes.
- Regularly review and audit data processes and pipelines to ensure compliance with internal standards and regulatory requirements
- Familiar with working on Agile methodologies - scrum, sprint planning, backlog refinement etc.
- 7-12 years of experience in a data engineering role working with Databricks and cloud technologies.
- Bachelor's degree in computer science, Information Technology, or a related field.
- Strong proficiency in PySpark, Python, and SQL.
- Strong experience in data modeling, ETL/ELT pipeline development, and automation
- Hands-on experience with performance tuning of data pipelines and workflows
- Proficient in working with Azure cloud components: Azure Data Factory, Azure Databricks, Azure Data Lake, etc.
- Experience with data modeling, ETL processes, Delta Lake, and data warehousing.
- Experience with Delta Live Tables, Autoloader, and Unity Catalog (a short Delta Live Tables sketch follows this list).
- Preferred - Knowledge of the insurance industry and its data requirements.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Excellent communication and problem-solving skills to work effectively with diverse teams
- Excellent problem-solving skills and ability to work under tight deadlines.
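To illustrate the Delta Live Tables and Auto Loader items above, here is a minimal sketch of a DLT pipeline definition in Python. It only runs inside a Databricks Delta Live Tables pipeline (where `spark` is provided), and the storage path, table names, and expectation rule are hypothetical.

```python
# Minimal sketch of a Delta Live Tables pipeline. DLT resolves dependencies
# between the decorated functions and manages the underlying Delta tables.
# The path, table names, and expectation rule are hypothetical placeholders.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw policy records ingested incrementally with Auto Loader")
def bronze_policies():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@account.dfs.core.windows.net/policies/")  # hypothetical path
    )


@dlt.table(comment="Validated policy records")
@dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")  # rows failing the rule are dropped
def silver_policies():
    return dlt.read_stream("bronze_policies").withColumn("ingested_at", F.current_timestamp())
```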
Databricks Data Engineer
Posted 21 days ago
Job Description
Tekfortune is a fast-growing consulting firm specializing in permanent, contract, and project-based staffing services for the world's leading organizations in a broad range of industries. In this quickly changing economic landscape, virtual recruiting and remote work are critical for the future of work. To support active project demands and skills gaps, our staffing experts can help you find the best job for you.
Role:
Location:
Duration:
Required Skills:
Job Description:
For more information and other jobs available, please contact our recruitment team. To view all the jobs available in the USA and Asia, please visit our website.
Databricks Data Engineer
Posted 21 days ago
Job Description
Cloud platform: GCP preferred, but Azure and AWS are acceptable.
- ETL, ELT, data streaming, data sharing
- Strong data engineering background
- Python (preferred) or Scala
- Retail data experience
- Moving data from a source system to a target e-commerce system; product information data is the focus.
- Understanding of the Databricks platform inside and out, including data sharing; Apache Iceberg is a big plus.