2,658 Databricks jobs in the United States

Databricks Analyst

75215 Park Cities, Texas Diverse Lynx

Posted 2 days ago

Job Description

  • Must have at least 8+ years of extensive data engineering experience in Databricks, ETL/ELT data pipelines, and SQL/stored-procedure development.
  • Must have 8+ years of hands-on proficiency implementing Databricks solutions using Scala and Spark with DataFrames and notebooks/SQL.
  • Must have 8+ years of hands-on development and performance tuning/enhancement experience in Scala/PySpark and SQL.
  • Must have 8+ years of hands-on development experience on Azure Cloud.
  • Must have strong experience with Databricks, including developing and optimizing Spark jobs, data transformations, and data processing workflows.
  • Must have experience with, and a good understanding of, the following topics: Databricks Delta Lake storage, Unity Catalog, Auto Loader, Lakehouse architecture, Medallion architecture, Databricks serverless options, Dataset and DataFrame APIs, Databricks cluster configuration and sizing, Databricks job orchestration, and Spark Structured Streaming.
  • Must have experience with complex, large-volume data streaming and data transformations using Azure Databricks.
  • Must have extensive experience in data warehouse / data lakehouse implementations.
  • Must have experience with monitoring, debugging, and resolving issues in live Databricks jobs.
  • Must have strong hands-on expertise in troubleshooting DevOps pipelines, Azure, and AKS services.
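
As an illustration of several of the topics listed above (Auto Loader, Delta Lake, and the bronze layer of a medallion architecture), here is a minimal, hypothetical PySpark sketch. The paths and table name are placeholders, and it assumes a Databricks notebook where spark is already defined.

    from pyspark.sql import functions as F

    # Hypothetical landing path and checkpoint location.
    raw_path = "/mnt/landing/events/"
    checkpoint = "/mnt/checkpoints/events_bronze"

    # Auto Loader incrementally discovers new files in the landing zone.
    bronze_stream = (
        spark.readStream
             .format("cloudFiles")
             .option("cloudFiles.format", "json")
             .option("cloudFiles.schemaLocation", checkpoint)
             .load(raw_path)
             .withColumn("ingest_ts", F.current_timestamp())
    )

    # Write the raw records to a bronze Delta table (first medallion layer).
    (bronze_stream.writeStream
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)
        .toTable("lakehouse.bronze.events"))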

Databricks Developer

20022 Washington, District Of Columbia Louisiana Staffing

Posted today

Job Description

Databricks Developer

Are you a talented Databricks Developer looking for an exciting opportunity to showcase your skills and make a real impact? Join our dynamic team and take on the challenge of supporting client-specific development of our client's cutting-edge BI solution. We are seeking an experienced Azure Databricks Developer to join our team. The successful candidate will be responsible for designing, developing, and maintaining scalable data pipelines and solutions using Databricks. The ideal candidate will have a strong background in data engineering, experience with big data technologies, and a deep understanding of Databricks and Apache Spark. This position is located in our Arlington, VA office; however, a hybrid working model is acceptable. The role can also be worked hybrid out of Belton, TX; Lafayette, LA; Knoxville, TN; or Lebanon, VA. This role requires U.S. citizenship or a green card.
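
As context for the pipeline and data-quality work this role describes, below is a minimal, hypothetical PySpark sketch of a batch ETL step with a simple data-quality gate. The table and column names are placeholders and are not part of the client's actual solution.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Read a hypothetical raw table, standardize types, and drop duplicates.
    orders = spark.read.table("bronze.orders")
    cleaned = (
        orders.dropDuplicates(["order_id"])
              .withColumn("order_date", F.to_date("order_date"))
              .filter(F.col("amount") > 0)
    )

    # Simple data-quality gate: refuse to publish rows with a missing key.
    null_keys = cleaned.filter(F.col("order_id").isNull()).count()
    if null_keys > 0:
        raise ValueError(f"{null_keys} rows have a null order_id; aborting load")

    cleaned.write.mode("overwrite").saveAsTable("silver.orders")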

Your future duties and responsibilities:

  • Design and develop scalable data pipelines and ETL processes using Databricks and Apache Spark.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Optimize and tune data pipelines for performance and scalability.
  • Implement data quality checks and validations to ensure data accuracy and consistency.
  • Monitor and troubleshoot data pipelines to ensure reliable and timely data delivery.
  • Develop and maintain documentation for data pipelines, processes, and solutions.
  • Implement best practices for data security, governance, and compliance.
  • Participate in code reviews and contribute to the continuous improvement of the BI Reporting team.

Required qualifications to be successful in this role:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering or a related field.
  • Strong experience with Databricks and Apache Spark.
  • Proficiency in programming languages such as Python, Scala, or Java.
  • Experience with big data technologies such as Hadoop, Hive, and Kafka.
  • Strong SQL skills and experience with relational databases.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of data warehousing concepts and technologies.
  • Experience with version control systems such as Git.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.

Desired qualifications/non-essential skills required:

  • Experience with Delta Lake and Databricks Delta.
  • Experience with data visualization tools such as Power BI, Tableau, or Looker.
  • Knowledge of machine learning and data science concepts.
  • Experience with CI/CD pipelines and DevOps practices.
  • Certification in Databricks, AWS, Azure, or Google Cloud.

CGI is required by law in some jurisdictions to include a reasonable estimate of the compensation range for this role. The determination of this range includes various factors not limited to skill set, level, experience, relevant training, and licensure and certifications. To support the ability to reward for merit-based performance, CGI typically does not hire individuals at or near the top of the range for their role. Compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range for this role in the U.S. is $76,600.00 - $134,100.00. CGI Federal's benefits are offered to eligible professionals on their first day of employment to include: competitive compensation, comprehensive insurance options, matching contributions through the 401(k) plan and the share purchase plan, paid time off for vacation, holidays, and sick time, paid parental leave, learning opportunities and tuition assistance, wellness and well-being programs.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.

Qualified applicants will receive consideration for employment without regard to their race, ethnicity, ancestry, color, sex, religion, creed, age, national origin, citizenship status, disability, pregnancy, medical condition, military and veteran status, marital status, sexual orientation or perceived sexual orientation, gender, gender identity, and gender expression, familial status or responsibilities, reproductive health decisions, political affiliation, genetic information, height, weight, or any other legally protected status or characteristics to the extent required by applicable federal, state, and/or local laws where we do business.

CGI provides reasonable accommodations to qualified individuals with disabilities. If you need an accommodation to apply for a job in the U.S., please email the CGI U.S. Employment Compliance mailbox at You will need to reference the Position ID of the position in which you are interested. Your message will be routed to the appropriate recruiter who will assist you. We make it easy to translate military experience and skills! Click here to be directed to our site that is dedicated to veterans and transitioning service members. All CGI offers of employment in the U.S. are contingent upon the ability to successfully complete a background investigation. Background investigation components can vary dependent upon specific assignment and/or level of US government security clearance held. Dependent upon role and/or federal government security clearance requirements, and in accordance with applicable laws, some background investigations may include a credit check. CGI will consider for employment qualified applicants with arrests and conviction records in accordance with all local regulations and ordinances. CGI will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant.

Databricks Developer

28245 Charlotte, North Carolina Sumitomo Mitsui Financial Group, Inc.

Posted today

Job Description

Role Description

The engineer designs and develops solutions as part of a team of Azure Databricks and Actimize engineers, and supports external vendors to deliver compliance and surveillance solutions. The engineer is responsible for supporting and enhancing the surveillance platform, ensuring the bank meets its risk management, legal, and regulatory obligations related to fraud detection and prevention. This role requires deep functional and technical expertise in databases, including strong development skills for customizing and optimizing the platform. The engineer works closely with business units and support teams to deliver system enhancements, perform upgrades, and provide on-call user support. Responsibilities include designing and implementing workflows, plugins, and data integration pipelines, writing custom rules and logic, and troubleshooting complex issues.

Operating in a highly transactional and tactical environment, the engineer exercises sound judgment and discretion to resolve challenges and support the achievement of business objectives.

Role Objectives: Delivery and BAU Support

• Plan and design code and configuration for Actimize applications, ensuring alignment with business and technical requirements.
• Conduct detailed analysis to validate requirements, incidents, process flows, and project deliverables. Prepare comprehensive technical documentation to support development and implementation efforts.
• Troubleshoot and resolve issues across all environments, performing in-depth root cause analysis to prevent recurrence and improve system reliability.
• Maintain high levels of customer satisfaction by delivering timely, high-quality solutions and support for production and project-related issues.
• Communicate proactively and effectively with business users, support teams, vendors, and other stakeholders to ensure alignment and transparency throughout the project lifecycle.
• Manage multiple projects and tasks simultaneously, balancing competing priorities and deadlines in a fast-paced environment.
• Work closely with other technical teams to ensure seamless execution of processes and integration of solutions.
• Develop and execute unit test cases and scripts to validate changes and releases, ensuring high-quality deliverables
• Coordinate and execute system software upgrades in collaboration with end users and technical teams.
• Serve as a key escalation point for complex production issues, ensuring timely resolution and minimal impact to business operations.
• Participate in system implementations and go-lives, which may occasionally require evening or weekend availability to support deployment activities and ensure successful transitions.
• Collaborate with cross-functional teams to troubleshoot and resolve high-priority incidents, and contribute to post-implementation reviews and continuous improvement efforts.

Qualifications and Skills

Core technologies:
  • Azure Technologies (Databricks, Data Lake etc.), Python, Power Apps, Power BI, SQL Server, Oracle / PL-SQL, SQL stored procedures, Actimize (nice to have)

• Minimum of 5 years' experience implementing enterprise technologies and/or vendor platforms to meet business needs. Bachelor's degree in Computer Science or a related field preferred.
• Demonstrated experience in IT development, system design, integration, and data analysis across complex environments.
• Strong understanding of relational and SQL database platforms, with proficiency in Oracle, or similar technologies.
• Familiarity with surveillance and corporate compliance systems.
• Ability to independently manage, organize, and prioritize multiple tasks and projects in a dynamic environment.
• Well-developed research and analytical capabilities, with a strong aptitude for problem-solving and lateral thinking.

Additional Requirements

Databricks Developer

43224 Columbus, Ohio TCS USA / Avance Consulting

Posted 1 day ago

Job Description

Must-Have Functional Skills:

  • 5 years of experience in Databricks
  • 7 or more years of experience working in data warehousing systems
  • At least 2-3 years of experience working with Informatica
  • Good understanding of data warehouse and ETL concepts
  • Very strong SQL programming skills
  • Teradata SQL, including implementing complex stored procedures, and hands-on experience building OLAP queries
  • Experience in any cloud platform (Azure, AWS, GCP), including AWS or Azure data storage and management technologies such as S3 and ADLS
  • Hands-on experience in Python
  • Hands-on experience in Unix and Perl
  • Excellent communication skills
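
As an illustration of the OLAP-style SQL called out above, here is a minimal sketch of a CUBE aggregation, written as ANSI-style SQL and run here through PySpark. The sales table and columns are hypothetical, and the same pattern applies on Teradata.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # CUBE produces subtotals for every combination of region and product line,
    # plus a grand total; the fact table below is a placeholder.
    olap_summary = spark.sql("""
        SELECT region,
               product_line,
               SUM(revenue) AS total_revenue,
               COUNT(*)     AS order_count
        FROM sales.fact_orders
        GROUP BY CUBE (region, product_line)
        ORDER BY region, product_line
    """)
    olap_summary.show()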

Databricks Developer

28245 Charlotte, North Carolina SMBC

Posted 2 days ago

Job Description

Role Description

The engineer designs and develops solutions as part of a team of Azure Databricks and Actimize engineers, and supports external vendors to deliver compliance and surveillance solutions. The engineer is responsible for supporting and enhancing the surveillance platform, ensuring the bank meets its risk management, legal, and regulatory obligations related to fraud detection and prevention. This role requires deep functional and technical expertise in databases, including strong development skills for customizing and optimizing the platform. The engineer works closely with business units and support teams to deliver system enhancements, perform upgrades, and provide on-call user support. Responsibilities include designing and implementing workflows, plugins, and data integration pipelines, writing custom rules and logic, and troubleshooting complex issues.

Operating in a highly transactional and tactical environment, the engineer exercises sound judgment and discretion to resolve challenges and support the achievement of business objectives.

Role Objectives: Delivery and BAU Support

• Plan and design code and configuration for Actimize applications, ensuring alignment with business and technical requirements.

• Conduct detailed analysis to validate requirements, incidents, process flows, and project deliverables. Prepare comprehensive technical documentation to support development and implementation efforts.

• Troubleshoot and resolve issues across all environments, performing in-depth root cause analysis to prevent recurrence and improve system reliability.

• Maintain high levels of customer satisfaction by delivering timely, high-quality solutions and support for production and project-related issues.

• Communicate proactively and effectively with business users, support teams, vendors, and other stakeholders to ensure alignment and transparency throughout the project lifecycle.

• Manage multiple projects and tasks simultaneously, balancing competing priorities and deadlines in a fast-paced environment.

• Work closely with other technical teams to ensure seamless execution of processes and integration of solutions.

• Develop and execute unit test cases and scripts to validate changes and releases, ensuring high-quality deliverables

• Coordinate and execute system software upgrades in collaboration with end users and technical teams.

• Serve as a key escalation point for complex production issues, ensuring timely resolution and minimal impact to business operations.

• Participate in system implementations and go-lives, which may occasionally require evening or weekend availability to support deployment activities and ensure successful transitions.

• Collaborate with cross-functional teams to troubleshoot and resolve high-priority incidents, and contribute to post-implementation reviews and continuous improvement efforts.

Qualifications and Skills

Core technologies:

  • Azure Technologies (Databricks, Data Lake etc.), Python, Power Apps, Power BI, SQL Server, Oracle / PL-SQL, SQL stored procedures, Actimize (nice to have)

• Minimum of 5 years’ experience implementing enterprise technologies and/or vendor platforms to meet business needs. Bachelor’s degree in Computer Science or a related field preferred.

• Demonstrated experience in IT development, system design, integration, and data analysis across complex environments.

• Strong understanding of relational and SQL database platforms, with proficiency in Oracle, or similar technologies.

• Familiarity with surveillance and corporate compliance systems.

• Ability to independently manage, organize, and prioritize multiple tasks and projects in a dynamic environment.

• Well-developed research and analytical capabilities, with a strong aptitude for problem-solving and lateral thinking.

Additional Requirements

Databricks Developer

20022 Washington, District Of Columbia Innovative Object Solutions, Inc.

Posted 3 days ago

Job Description

Are you passionate about data engineering, cloud technologies, and building high-quality scalable solutions? We are assembling a team of talented engineers to develop the next generation of the Federal Student Aid system. This high-profile initiative will leverage advanced analytics, big data, and machine learning to serve millions of students and families. Our platform is cloud-native, built on modern microservices, and powered by Databricks to enable highly performant, scalable data processing and analytics.

Ideal candidates will have strong data engineering skills with experience in Databricks, Apache Spark, Delta Lake, and cloud platforms (AWS, Azure, or GCP).
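
As a small illustration of the Delta Lake and ETL/ELT skills mentioned above, the following is a minimal, hypothetical PySpark sketch of an incremental upsert (MERGE) into a Delta table. The table names are placeholders and do not represent the actual Federal Student Aid data model.

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical staging data with new and changed records.
    updates = spark.read.table("staging.applicant_updates")

    # Upsert into the curated Delta table keyed on applicant_id.
    target = DeltaTable.forName(spark, "silver.applicants")
    (target.alias("t")
           .merge(updates.alias("u"), "t.applicant_id = u.applicant_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())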

Note: U.S. citizenship is required for this role.

Basic Qualifications:

4+ years of experience in IT or software engineering

3+ years of hands-on experience with Databricks or Apache Spark (PySpark, Scala, or SQL)

2+ years of experience building scalable ETL/ELT pipelines with Delta Lake or similar technologies

Strong knowledge of cloud environments (Azure Databricks, AWS EMR/Databricks, or GCP Dataproc/Databricks)

Experience with distributed computing and large-scale data processing

BA or BS degree in Computer Science, Data Engineering, or related field

Ability to obtain a security clearance required

Additional Qualifications:

Experience with MLflow, feature stores, or machine learning pipelines in Databricks

Knowledge of streaming data frameworks (Structured Streaming, Kafka, Kinesis)

Familiarity with CI/CD and DevOps practices for data engineering (Git, Azure DevOps, Jenkins)

Ability to optimize Spark jobs for performance and cost efficiency

Strong problem-solving skills and ability to communicate effectively with technical and non-technical stakeholders

Ability to work independently and in a collaborative team environment

Excellent written and verbal communication skills

U.S. Citizenship required

DataBricks Developer

16371 Youngsville, Pennsylvania Diverse Lynx

Posted 3 days ago

Job Description

Role name: Developer
Role description: Databricks Developer
Competencies: Digital: Databricks, Database Optimization and Tuning
Experience (years): 6-8
Essential skills: Databricks Developer

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.

Databricks Developer

20811 Bethesda, Maryland h3 Technologies

Posted 3 days ago

Job Description

Job Title: Databricks Developer

Position Type: Contract

Location: Bethesda, MD

Remote Work: 100%

Primary Skills: Java

Job Description

We are seeking a highly experienced Data Integration and Business Intelligence (BI) Developer with deep expertise in Databricks, AWS cloud technologies, and Tableau. The ideal candidate will have a strong background in designing and implementing scalable data solutions and BI reporting in cloud environments, particularly within the IRS EDP and Analytical platforms.
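
To illustrate the kind of aggregation workflow this role describes, here is a minimal, hypothetical PySpark sketch that publishes a curated summary table for a downstream Tableau dashboard. The table and column names are placeholders and do not reflect the IRS platforms.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Aggregate a hypothetical curated table into a daily summary.
    filings = spark.read.table("silver.filings")
    daily_summary = (
        filings.groupBy("filing_date", "form_type")
               .agg(F.count("*").alias("filing_count"),
                    F.sum("amount").alias("total_amount"))
    )

    # Tableau would connect to this summary table via the Databricks SQL connector.
    daily_summary.write.mode("overwrite").saveAsTable("gold.daily_filing_summary")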

Key Responsibilities:

Design and develop scalable data integration, aggregation, and extraction workflows using Databricks.

Build and optimize robust data pipelines to support analytics and reporting.

Develop interactive dashboards and reports using Tableau.

Collaborate with cross-functional teams to gather requirements and deliver innovative data and application solutions.

Ensure data solutions are secure, scalable, and aligned with architectural best practices.

Required Skills & Qualifications:

Hands-on experience with AWS cloud platform.

Proven experience designing and implementing scalable and secure AWS cloud solutions using the Databricks platform.

Expertise in Databricks Delta Lake and Lakehouse Architecture.

Strong experience in designing and implementing scalable BI reports in Tableau.

Expert-level skills in architecting solutions on the AWS Databricks platform.

Familiarity with IRS EDP Databricks platform and IRS Analytical platform.

Strong understanding of data integration, data warehousing, and BI best practices.

Excellent problem-solving, communication, and collaboration skills.

Programming Skills: Java 17; MongoDB 5.3.x; Spring Boot; AWS (DynamoDB, MSK); Kafka; React; Node.js; Java-based front end (Thymeleaf); PySpark.

Databricks Developer

20811 Bethesda, Maryland Sparktek

Posted 3 days ago

Job Description

We are seeking a highly experienced Data Integration and Business Intelligence (BI) Developer with deep expertise in Databricks, AWS cloud technologies, and Tableau. The ideal candidate will have a strong background in designing and implementing scalable data solutions and BI reporting in cloud environments, particularly within the IRS EDP and Analytical platforms.

Key Responsibilities:

  • Design and develop scalable data integration, aggregation, and extraction workflows using Databricks .
  • Build and optimize robust data pipelines to support analytics and reporting.
  • Develop interactive dashboards and reports using Tableau .
  • Collaborate with cross-functional teams to gather requirements and deliver innovative data and application solutions.
  • Ensure data solutions are secure, scalable, and aligned with architectural best practices.
Required Skills & Qualifications:
  • Hands-on experience with AWS cloud platform.
  • Proven experience designing and implementing scalable and secure AWS cloud solutions using the Databricks platform.
  • Expertise in Databricks Delta Lake and Lakehouse Architecture.
  • Strong experience in designing and implementing scalable BI reports in Tableau.
  • Expert-level skills in architecting solutions on the AWS Databricks platform.
  • Familiarity with IRS EDP Databricks platform and IRS Analytical platform.
  • Strong understanding of data integration, data warehousing, and BI best practices.
  • Excellent problem-solving, communication, and collaboration skills.
  • Programming Skills: Java 17; MongoDB 5.3.x; Spring Boot; AWS (DynamoDB, MSK); Kafka; React; Node.js; Java-based front end (Thymeleaf); PySpark.

Databricks Developer

95053 Santa Clara, California eTeam

Posted 9 days ago

Job Description

Databricks JD
We are hiring for multiple positions (5-7 and 7-10 years of experience) in Databricks, along with advanced SQL knowledge.
The primary responsibility of this role is to provide technical support for the Databricks environment.
Key Responsibilities:
  • Bug Fixes in the Databricks environment
  • Ability to monitor, transform, and optimize ETL pipelines in Databricks; knowledge of Data Lakehouse architecture and of PySpark (at least mid-level)
  • Experience with complex data migrations and related domain familiarity is a plus
  • Ensure data accessibility and integrity for the migrated objects
  • Collaborate effectively with cross-functional teams
  • Communicate progress and challenges clearly to stakeholders
Qualifications:
  • Experience in SQL and BigData
  • Proficiency in Spark and Impala/Hive
  • Experience with Databricks and cloud platforms, particularly Azure
  • Good understanding of data modeling concepts and data warehouse designs
  • Excellent problem-solving skills and a passion for data accessibility
  • Effective communication and collaboration skills
  • Experience with Agile methodologies


Data Platform: SQL, PySpark, Python, Databricks, job monitoring using Redwood, and other open-source relational databases such as PostgreSQL.
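
As context for the support and optimization duties described in this posting, below is a minimal, hypothetical sketch of routine Delta Lake maintenance commands run from a Databricks notebook (where spark is predefined). The table name is a placeholder.

    # Hypothetical table under support.
    table = "lakehouse.silver.transactions"

    # Review recent writes when debugging a failed or slow pipeline.
    spark.sql(f"DESCRIBE HISTORY {table} LIMIT 10").show(truncate=False)

    # Compact small files and co-locate rows on a common filter column.
    spark.sql(f"OPTIMIZE {table} ZORDER BY (transaction_date)")

    # Clean up data files no longer referenced (default retention applies).
    spark.sql(f"VACUUM {table}")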