247 ETL Developer jobs in Austin
Business Intelligence Engineer II
Posted 19 days ago
Job Description
LOCATION: Onsite in Austin, TX
DURATION: 6 months with possible extension/conversion
PAY RANGE: $43-53/hour
TOP 3 SKILLS:
- Data Analysis and Modeling Experience
- Database Engineering experience
- Data Pipelines and Automation Experience
Job Description:
Mechatronics and Sustainable Packaging Customer Experience (MSP CX) is seeking an experienced and self-driven Business Intelligence Engineer (BIE). In this role, you will build complex data engineering and business intelligence applications using the AWS big data stack. You should have deep expertise in, and passion for, working with large data sets, data visualization, building complex data processes, performance tuning, bringing together data from disparate data stores, and programmatically identifying patterns. You should have excellent communication skills and be able to work with business owners to develop and define key business questions and requirements.
The MSP CX organization is responsible for providing software and data solutions for organizations working on delivering capacity to the Fulfillment network, including planning, configuration management, design, construction, startup, launch, and operations engineering teams. Partnering with app developers, SDE's, data engineers and TPM's, you would support existing front-end tools/applications and help build new ones to be used by various teams across the globe. The individual must have the ability to communicate effectively across multiple technical and non-technical teams, as well as across other geographies. Successful members of this team collaborate effectively to solve problems, implement new solutions, and deliver successfully against high operational standards.
If you are motivated to serve the needs of our customers, you will also be able to satisfy your curiosity working with one of the world's largest datasets. We seek candidates who are passionate about data analysis and data-driven decision making, uncompromisingly detail oriented, smart, efficient, and driven to help our business succeed by providing key insights that translate into action.
The successful candidate will have strong data mining and modeling skills and be comfortable facilitating ideation and working from concept through to execution. This role will also build the tools and support structures needed to analyze data. As a BIE on this team, you will be responsible for developing data architecture components that scale with ever-evolving data needs. You will apply cloud-based AWS services to solve challenging problems around big data processing, data warehouse design, and enabling self-service. You will focus on automation and optimization across all areas of database, data warehouse, and ETL maintenance and deployment. You will work closely with software development teams on many non-standard and unique business problems and use creative problem solving to deliver actionable output. The BIE role requires excellent technical skills to develop systems and tools that process data, as well as the ability to analyze it.
Key Responsibilities:
- Evaluation of the performance of program features and marketing content along measures of customer response, use, conversion, and retention
- Statistical testing of A/B and multivariate experiments
- Design, build and maintain metrics and reports on program health
- Respond to ad hoc requests from business leaders to investigate critical aspects of customer behavior, e.g. how many customers use a given feature or fit a given profile, deep dive into unusual patterns, and exploratory data analysis
- Employ data mining, model building, segmentation, and other analytical techniques to capture important trends in the customer base
- Participate in strategic and tactical planning discussions
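The A/B and multivariate testing responsibility above typically reduces to comparing conversion rates between variants. A minimal sketch, assuming a two-sided two-proportion z-test on hypothetical conversion counts (the function name and numbers are illustrative, not from the posting):

```python
from math import erf, sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (A/B test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, built from math.erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/2400 conversions in A, 150/2400 in B.
z, p = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
```

In practice a library routine (e.g. from statsmodels) would replace the hand-rolled CDF; the point is only the shape of the calculation.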
Job Qualifications:
- 3-6 years of experience working with large-scale complex datasets
- Strong analytical mindset, ability to decompose business requirements into an analytical plan, and execute the plan to answer those business questions
- Strong working knowledge of SQL
- Background (academic or professional) in statistics, programming, and marketing
- SAS experience a plus
- AWS Quicksight experience also a plus
- Graduate degree in math/statistics, computer science or related field, or marketing is highly desirable.
- Excellent communication skills, equally adept at working with engineers as well as business leaders
- Data Analysis and Modeling Experience
- Database Engineering experience
- Data Pipelines and Automation Experience
BENEFITS SUMMARY: Individual compensation is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base hourly rate or annual salary only, unless otherwise stated. In addition to base compensation, full-time roles are eligible for Medical, Dental, Vision, Commuter and 401K benefits with company matching.
ETL Developer
Posted 14 days ago
Job Description
ETL Developer
Hybrid: 2 days a week onsite in Austin, TX (candidates must already live in the area; relocation is not offered)
Interview: two rounds; the first is by video, the second is in person (required)
Rate: $70/hour CTC
Visa status: H4 EAD, USC, GC/GC EAD
Duration: multi-year
Duties and Responsibilities
Essential Functions:
• Performs advanced ETL development work on data primarily residing in SQL Server
• Translates rules provided by PeopleSoft developers (Oracle based) into efficient T-SQL scripts to load data into the target database on SQL Server
• Troubleshoots, supports and maintains the ETL processes developed by the team
• Designs, deploys, and manages standardized, centralized data sources
• Helps to establish best practices for development, testing, migration and error handling of the ETL activities
• Performs other duties as assigned
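The staging-to-target load described above (rules translated into T-SQL scripts that load a target database) can be sketched in miniature. This is an illustration only: SQLite stands in for SQL Server, the table and column names are invented, and the T-SQL analogue of the final statement would be MERGE:

```python
import sqlite3

# In-memory SQLite stands in for SQL Server; schema names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_employee (emp_id INTEGER, name TEXT, dept TEXT);
    CREATE TABLE dim_employee (emp_id INTEGER PRIMARY KEY, name TEXT, dept TEXT);
    INSERT INTO stg_employee VALUES (1, 'Ada', 'ENG'), (2, 'Grace', 'OPS');
""")
# Upsert staged rows into the target; in T-SQL this step would be a MERGE.
conn.execute("""
    INSERT OR REPLACE INTO dim_employee (emp_id, name, dept)
    SELECT emp_id, name, dept FROM stg_employee
""")
rows = conn.execute(
    "SELECT emp_id, name FROM dim_employee ORDER BY emp_id"
).fetchall()
```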
Other Duties and Responsibilities:
• May interact with outside vendors
Essential Work Behaviors:
• Impeccable written and oral communication
• Communicates respectfully and works harmoniously with all co-workers, customers, and vendors.
• Provides exceptional customer service.
• Is flexible: able to work under pressure, adapt to change, and handle multiple problems and tasks.
• Takes initiative to prevent and solve problems.
Qualification Requirements
Education/Special Requirements:
• Bachelor's degree in computer science, engineering, or related business field.
Experience and Training
• Three (3) years of hands on experience in T-SQL and SSIS/SSMS development
• Three (3) years of hands on experience in ETL development
• Three (3) years of hands on experience moving and manipulating large data sets
Preferred Qualifications
• Experience in health/life insurance, accounting and financials, or investment data related activities
• Experience with data tools (e.g., Visual Studio, SSIS, MSSQL)
• Experience with Azure and agile iterations
• Experience with ERP applications (e.g., PeopleSoft)
Knowledge, Skills and Abilities
• Uses all knowledge, skills, and abilities to apply critical thinking to all aspects of the job. Critical thinking is a process of forming reasoned opinions through observation, information collection, interpretation, analysis, inference, evaluation, and other skills necessary to successfully meet performance standards of the job.
• Experience with PeopleSoft HRMS or Finance packages preferred
• Experience with ERD Modeling & Maintenance
• Knowledge of RDBMS architecture and troubleshooting is desirable.
• Knowledge of data mining methods
Requires in-office work. Please submit resumes only for candidates who are currently based in Austin.
Azure ETL Developer
Posted today
Job Description
JOB DESCRIPTION
Job Summary
The Azure ETL Developer is tasked with designing and developing Azure Data Factory solutions, covering projects of moderate to high complexity. This position also necessitates proficiency in Microsoft SSIS ETL solutions to support the architecture, development, and migration of legacy solutions to Azure Data Factory. Candidates should possess three to five years of experience with the Microsoft ETL stack, including experience in SSIS, SSRS, T-SQL, and Azure Data Factory. Furthermore, the role requires a comprehensive understanding of emerging technologies and the capability to design, research, and implement innovative solutions.
Knowledge/Skills/Abilities
- Proven ability to architect and develop solutions which perform data transformations using Azure Data Factory
- Strong knowledge of SQL Server, SSIS, and T-SQL, preferably on Azure and/or SQL Server 2016+
- Design and develop SQL Server stored procedures, functions, views, and triggers
- Design, implement, and maintain SQL database objects (tables, views, indexes) and database security
- Debug and tune existing SSIS/ETL processes to ensure accurate and efficient movement of processed data
- Collaborate with Product Owners to elicit and document business requirements for ETL and report design
- Ability to translate business requirements into sound technical specifications
- Research issues and set up proof-of-concept tests
- Support quality acceptance testing, including the development and/or refinement of test plans
- Lead design review sessions with the scrum team to validate requirements
- Troubleshoot data quality issues and defects to determine root cause
- 5+ years of ETL development experience, with 1-3 years using Azure Data Factory
- Experience working with Azure SQL Database, DevOps, Git, and Continuous Integration (CI)
- Knowledge and/or experience of the Agile framework and working in a scrum team
- Familiarity with healthcare data and concepts
- Familiarity with QNXT
- Excellent analytical and problem-solving abilities
- Strong written and oral communication skills
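The debugging and data-quality points above often begin with a row-count reconciliation between source and target extracts. A minimal sketch with invented key names (the posting's actual data is healthcare claims, not shown here):

```python
from collections import Counter

def reconcile(source_rows, target_rows, key):
    """Return keys whose per-key row counts differ between source and target."""
    src = Counter(r[key] for r in source_rows)
    tgt = Counter(r[key] for r in target_rows)
    return {k: (src.get(k, 0), tgt.get(k, 0))
            for k in src.keys() | tgt.keys()
            if src.get(k, 0) != tgt.get(k, 0)}

# Hypothetical extracts: claim 2 was duplicated in source but loaded once.
source = [{"claim_id": 1}, {"claim_id": 2}, {"claim_id": 2}]
target = [{"claim_id": 1}, {"claim_id": 2}]
diffs = reconcile(source, target, "claim_id")
```

Each mismatched key then becomes a starting point for root-cause analysis in the ETL job that produced it.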
Required Education
Bachelor's Degree or equivalent combination of education and experience
To all current Molina employees: If you are interested in applying for this position, please apply through the intranet job listing.
Molina Healthcare offers a competitive benefits and compensation package. Molina Healthcare is an Equal Opportunity Employer (EOE) M/F/D/V.
Pay Range: $77,969 - $120,000 / ANNUAL
*Actual compensation may vary from posting based on geographic location, work experience, education and/or skill level.
Graph (Neo4j) and ETL Developer
Posted 2 days ago
Job Description
Job Title: Graph (Neo4j) and ETL Developer
Job ID: 2023-12156
Job Location: Austin, TX; Morris Plains, NJ
Job Travel: 0%
# Positions: 1
Employment Type: W2
Candidate Constraints:
Duration: Long term
# of Layers: 0
Work Eligibility: All work authorizations are permitted
Key Technology: Graph database (Neo4j) experience and its query language (Cypher)
Job Responsibilities:
- Work with ETL, data warehousing, and graph database software to store and organize graph data
- Ensure secure and timely delivery of graph data / use-case results to consuming applications and user communities
- Ensure that databases meet user requirements
- Develop archiving procedures for old and expired transactions / data
- Participate in sprint planning sessions and help TPOs
Skills and Experience Required:
- 10+ years of strong graph database (Neo4j) experience and its query language (Cypher)
- Able to demonstrate graph data load pipelines from Hadoop
- Proven skills in Python, PyArrow, and Spark/Scala
- Hands-on experience with SQL/in-memory databases and automation of ETL
- Strong knowledge of Hadoop, Neo4j, and PyArrow technologies
- Strong development experience with Hadoop file formats and organizing warehouse data
- Strong exposure to graph nodes/edges and relations
- Strong experience with DevOps pipeline implementation
- Healthcare domain knowledge is a plus
- Good understanding of the Spark/Hadoop ecosystem, including YARN responsibilities
- Strong debugging skills for Hadoop Spark/graph workloads
- Excellent thinker with exceptional problem-solving skills
- Strong communication and interpersonal skills
- Good exposure to the Agile development process
- Excellent query optimization skills; able to advise on required keys for performance improvement
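The "graph data load pipelines" skill named in this posting usually means batching rows into parameterized Cypher statements. A sketch of the batching side, with an illustrative UNWIND/MERGE query and invented labels; a real pipeline would execute each batch through the Neo4j Python driver's `session.run(query, rows=batch)`:

```python
# Illustrative Cypher upsert: MERGE each node and the relationship between
# them for every row in the batch. Labels and properties are hypothetical.
CYPHER_UPSERT = """
UNWIND $rows AS row
MERGE (p:Patient {id: row.patient_id})
MERGE (c:Claim {id: row.claim_id})
MERGE (p)-[:FILED]->(c)
"""

def batches(rows, size):
    """Yield fixed-size chunks so each Neo4j transaction stays small."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# Hypothetical rows exported from Hadoop.
rows = [{"patient_id": i % 3, "claim_id": i} for i in range(10)]
chunks = list(batches(rows, size=4))
```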
Sr. ETL Developer & Informatica Power Center
Posted today
Job Description
Position: MAL - 960 - Systems Analyst-529601476 / Sr. ETL Developer & Informatica Power Center
Location: Austin, TX (onsite; local candidates only)
Client: State of Texas - Texas Health and Human Services Commission (direct client)
Duration: 12+ months, with likely extensions
Interview Type: Microsoft Teams
Job Description:
Note: This position requires onsite work in Austin, TX. No remote work is available. Candidates must reside within a 50-mile radius of Austin, TX. Please do not submit candidates who are out of state or planning to move to Texas; candidates must already reside in Texas.
HHSC IT is developing a data integration hub with the following goals:
- Implementation and configuration of the infrastructure for the data integration hub
- Design, development, and implementation of the data integration hub using agile methodologies across all SDLC phases, including:
- Validation of performance metrics
- Creation of Epics, User Stories, and Tasks
- Automation of data acquisition from various sources
- Development of complex SQL scripts
- Testing: integration, load, and stress
- Deployment and publication internally and externally
This development will follow an agile approach similar to the Texas Integrated Eligibility Redesign System (TIERS).
Responsibilities include:
- Leading an agile development team as a technical leader
- Data acquisition from multiple sources
- Developing complex SQL scripts for data transformation into a dimensional model, creating views and materialized views in Oracle
- Automating data processes using Informatica Power Center/IICS
- Collaborating with the Data Engineering Team on data design
- Verifying and validating SQL scripts, Informatica automations, and database views
- Developing automated verification and validation methods
- Participating in sprint ceremonies
- Working with architects and data engineers on implementation and data strategies
- Creating mockups and validating with customers
- Addressing technical issues with team members
- Supporting tool implementation and configuration
- Producing technical documentation
- Participating in requirements and design sessions
- Adapting to changing business requirements and proposing enhancements
- Performing other duties as assigned
Company: MasterApp Labs, a direct vendor for various state clients.
Senior ETL Developer with Healthcare DataSystems
Posted 1 day ago
Job Description
Senior ETL Developer with Healthcare DataSystems
Onsite - Austin, TX ( Local to TX )
Long Term
Required Skills:
- Experience in the Healthcare Industry, HHS agency and PII or PHI data.
- Implementation and configuration of the infrastructure for the data integration hub.
- Design, development, and implementation (DD&I) of the data integration hub using an agile methodology for all standard SDLC phases that includes, but is not limited to:
- Validation of performance metric requirements
- Creation of Epics/User Stories/Tasks
- Automation of data acquisition from a variety of data sources
- Development of complex SQL scripts
- Testing: integration, load, and stress
- Deployment / publication internally and externally
- Operations support and enhancement of the data integration hub
- Filling the role of a technical leader, leading an agile development team through a project.
- Data acquisition from a variety of data sources for multiple uses.
- Developing complex SQL scripts to transform the source data to fit into a dimensional model, then to create views and materialized views in Oracle.
- Developing automation with Informatica Power Center/IICS to pull data from external data sources and transform it to fit into a dimensional model.
- Collaborating with other members of the Data Engineering Team on the design and implementation of an optimal data design.
- Verification and validation of SQL scripts, Informatica automation and database views.
- Developing automated means of performing verification and validation.
- Participating in all sprint ceremonies
- Work closely with the Architects and Data Engineering Team on implementation designs and data acquisition strategies.
- Develop mockups and work with customers for validation
- Working closely with other members of the team to address technical problems
- Assisting with the implementation and configuration of developmental tools
- Producing and maintaining technical specifications, diagrams, or other documentation as needed to support the DD&I efforts
- Participation in requirements and design sessions
- Interpreting new and changing business requirements to determine the impact and proposing enhancements and changes to meet these new requirements.
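Several of the points above describe transforming source data to fit a dimensional model. A toy sketch of the core step, assigning surrogate keys to a dimension and re-keying fact rows (column names are invented; the posting's actual model is built in Oracle with views and materialized views):

```python
def build_dimension(source_rows, natural_key):
    """Assign surrogate keys to distinct natural-key values, in arrival order."""
    dim, next_sk = {}, 1
    for row in source_rows:
        nk = row[natural_key]
        if nk not in dim:
            dim[nk] = next_sk
            next_sk += 1
    return dim

def build_fact(source_rows, dim, natural_key, measure):
    """Replace the natural key with its surrogate key in each fact row."""
    return [(dim[r[natural_key]], r[measure]) for r in source_rows]

# Hypothetical source extract.
src = [{"provider": "A", "amount": 10},
       {"provider": "B", "amount": 20},
       {"provider": "A", "amount": 5}]
dim = build_dimension(src, "provider")
fact = build_fact(src, dim, "provider", "amount")
```

In the real pipeline this logic would live in SQL and Informatica mappings rather than Python, but the key-assignment step is the same.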
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: Hospitals and Health Care
ETL Developer/Programmer Analyst Level 3
Posted 1 day ago
Job Description
PROLIM Global Corporation is currently seeking an ETL Developer/Programmer Analyst Level 3 for the Austin, Texas, United States location, for one of our top clients.
Job Description:
Description Of Services
Skills/Experience
- Hands-on experience with data warehouse development (relational and dimensional) and proven ability to implement using databases like Oracle, SQL Server, etc., preferably Oracle, with different data sources (8 years, required)
- Proven ability to work successfully with technical and non-technical groups and manage multiple responsibilities (8 years, required)
- Excellent communication, analytical, and interpersonal skills at all levels (8 years, required)
- Strong technical writing skills (8 years, required)
- Experience with ERwin or other equivalent data modeling tools (6 years, preferred)
- Experience with more than one type of RDBMS (DB2, SQL Server, Oracle) (6 years, preferred)
- Experience in public sector delivery (6 years, preferred)
- Certified Data Management Professional (CDMP) (preferred)
- Design and develop Extraction, Transformation, Load (ETL) processes for an enterprise data warehouse using Informatica; deep hands-on expertise with Informatica ETL processes, development, data management, data profiling, data flows, data relationships, and data quality standards and processes (preferred)
- Experience with the Informatica product suite in the cloud (preferred)
- Experience with the Texas Workforce Commission (preferred)
To apply, please send your updated resume and contact information via email to
About PROLIM Corporation
PROLIM is a leading provider of end-to-end IT, PLM, and Engineering Services and Solutions for Global 1000 companies. They understand both business and technology, helping clients improve profitability and efficiency through high-value technology consulting, staffing, and project management outsourcing services.
Their offerings include advisory, PLM software/services, program management, solution architecture, training/staffing, cloud solutions, servers/networking, infrastructure, ERP practices, and QA services. Engineering services encompass data translation, CAD/CAM/CAE, process & product engineering, prototyping, testing, and validation across various markets and industries.
Senior Full Stack Developer - ETL

Posted today
Job Description
**Workplace Classification:**
**Hybrid**: This role is categorized as hybrid. This means the candidate is expected to report to their primary work location three times per week, at minimum, or at another frequency dictated by the business.
**The Role**
In this Full Stack / ETL Developer role, you will develop and maintain an application designed to ensure GM's compliance with NHTSA TREAD reporting requirements. This involves processing multiple sources of vehicle safety data and working with frontend and backend technologies to build secure and scalable solutions that facilitate timely and accurate reporting.
This role provides a unique opportunity to work on a mission-critical application that reports data to federal regulators with strict legal requirements. You will work across the full technology stack, developing efficient data workflows that handle sensitive and regulated information. Your contributions will directly impact the organization's ability to meet mandatory reporting deadlines and uphold data quality standards.
**What You'll Do (Responsibilities)**
+ Develop, enhance, and support ETL processes using IBM DataStage.
+ Design and develop responsive front-end Web applications using Angular.
+ Build and maintain RESTful APIs using Spring Boot and Java.
+ Write clean, maintainable, and well-documented code and guide others on the same.
+ Participate in code reviews, unit testing, and integration testing.
+ Troubleshoot and debug applications and data pipelines.
+ Ensure system scalability, security, and performance.
+ Work with business partners on data-related technical issues and develop requirements to support their data management needs.
**Additional Job Description**
**What You'll Need (Required Qualifications)**
+ Bachelor's degree or equivalent experience
+ 5+ years working with one or more object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.
+ 3+ years of experience in front-end development with Angular
+ 3+ years of experience developing REST APIs with Java and Spring Boot or Quarkus
+ Strong understanding of REST architecture, microservices, and API security
+ Hands-on experience with IBM DataStage for ETL and data integration tasks
+ Solid knowledge of SQL and experience with relational databases (Postgres, Oracle)
+ Familiarity with Git and CI/CD pipelines
+ Ability to identify tasks which require automation and automate them
+ A demonstrable understanding of networking/distributed computing environment concepts
+ Responsible for defining, maintaining, and verifying compliance with technical and operational metadata standards, data movement patterns, and ETL standards and best practices
+ Knowledge of the Software Development Lifecycle (SDLC), experience working with Agile methodology and Scrum framework
**People Skills:**
+ Ability to lead development effort managing the planning and execution of deliverables in a team environment
+ Independent thinker, self-motivated, quick learner with ability to tackle problems quickly and completely
+ Ability to multi-task and stay organized in a dynamic work environment
+ Excellent verbal and written communication skills and ability to effectively communicate and translate feedback, needs and solutions
**What Will Give You a Competitive Edge (Preferred Qualifications)**
+ Knowledge of cloud platforms such as AWS or Azure
+ Experience with Terraform
+ 3+ years of hands-on experience with Big Data Framework: Hadoop, Spark, Kafka, Kubernetes, Hive
+ Stream-processing systems: Storm, Spark-Streaming, etc.
+ Experience with NoSQL databases
**GM DOES NOT PROVIDE IMMIGRATION-RELATED SPONSORSHIP FOR THIS ROLE. PLEASE DO NOT APPLY FOR THIS ROLE IF YOU WILL NEED GM IMMIGRATION SPONSORSHIP (e.g., H-1B, TN, STEM OPT, etc.) NOW OR IN THE FUTURE.**
**This job may be eligible for relocation benefits.**
#LI-DH2
**About GM**
Our vision is a world with Zero Crashes, Zero Emissions and Zero Congestion and we embrace the responsibility to lead the change that will make our world better, safer and more equitable for all.
**Why Join Us**
We believe we all must make a choice every day - individually and collectively - to drive meaningful change through our words, our deeds and our culture. Every day, we want every employee to feel they belong to one General Motors team.
**Benefits Overview**
From day one, we're looking out for your well-being, at work and at home, so you can focus on realizing your ambitions. Learn how GM supports a career that rewards you personally by visiting Total Rewards Resources.
**Non-Discrimination and Equal Employment Opportunities (U.S.)**
General Motors is committed to being a workplace that is not only free of unlawful discrimination, but one that genuinely fosters inclusion and belonging. We strongly believe that providing an inclusive workplace creates an environment in which our employees can thrive and develop better products for our customers.
All employment decisions are made on a non-discriminatory basis without regard to sex, race, color, national origin, citizenship status, religion, age, disability, pregnancy or maternity status, sexual orientation, gender identity, status as a veteran or protected veteran, or any other similarly protected status in accordance with federal, state and local laws.
We encourage interested candidates to review the key responsibilities and qualifications for each role and apply for any positions that match their skills and capabilities. Applicants in the recruitment process may be required, where applicable, to successfully complete a role-related assessment(s) and/or a pre-employment screening prior to beginning employment. To learn more, visit How we Hire ( .
**Accommodations**
General Motors offers opportunities to all job seekers including individuals with disabilities. If you need a reasonable accommodation to assist with your job search or application for employment, email ( ) us or call us at . In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying.
We are leading the change to make our world better, safer and more equitable for all through our actions and how we behave. Learn more about:
**Our Company and Culture**
**How we hire**
Our diverse team of employees brings their collective passion for engineering, technology and design to deliver on our vision of a world with Zero Crashes, Zero Emissions and Zero Congestion. We are looking for adventure-seekers and imaginative thought leaders to help us transform mobility.
Explore our global locations.
The policy of General Motors is to extend opportunities to qualified applicants and employees on an equal basis regardless of an individual's age, race, color, sex, religion, national origin, disability, sexual orientation, gender identity/expression or veteran status. Additionally, General Motors is committed to being an Equal Employment Opportunity Employer and offers opportunities to all job seekers including individuals with disabilities. If you need a reasonable accommodation to assist with your job search or application for employment, email us. In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying.
Data Engineer

Posted today
Job Description
Meta Platforms, Inc. (Meta), formerly known as Facebook Inc., builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps and services like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. To apply, click "Apply to Job" online on this web page.
**Required Skills:**
Data Engineer Responsibilities:
1. Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products.
2. Manage and execute data warehouse plans for a product or a group of products to solve well-scoped problems.
3. Identify the data needed for a business problem and implement logging required to ensure availability of data, while working with data infrastructure to triage issues and resolve.
4. Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way.
5. Build data expertise and leverage data controls to ensure privacy, security, compliance, data quality, and operations for allocated areas of ownership.
6. Design, build and launch new data models and visualizations in production, leveraging common development toolkits.
7. Independently design, build and launch new data extraction, transformation and loading processes in production, mentoring others around creating efficient queries.
8. Support existing processes running in production and implement optimized solutions with limited guidance.
9. Define and manage SLA for data sets in allocated areas of ownership.
10. Work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors.
11. Exercise judgment within defined procedures and practices to determine appropriate action.
12. Telecommuting is permitted from anywhere in the U.S.
**Minimum Qualifications:**
Requires a Master's degree (or foreign equivalent) in Computer Science, Engineering, Information Systems Management, or a related field and 24 months of experience in the job offered or in a related occupation.
Requires 24 months of experience in the following:
1. Custom ETL design, implementation, and maintenance
2. Object-oriented programming languages
3. Schema design and dimensional data modeling
4. Writing SQL statements
5. Analyzing data to identify deliverables, gaps and inconsistencies
6. Managing and communicating data warehouse plans to internal clients
7. MapReduce or MPP systems
8. Python
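The MapReduce qualification above refers to the split-apply-combine pattern. A minimal in-process sketch of the idea (real workloads would run on Hadoop or an MPP system, not a single Python process):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Emit (key, 1) pairs for each word in a record."""
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    """Sum the values for each key, as a reducer would per partition."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Hypothetical input records.
records = ["etl etl pipeline", "pipeline data"]
counts = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
```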
**Public Compensation:**
$229,173/year to $235,400/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at