2,211 Data Pipeline jobs in the United States
Data Pipeline Engineer (ML)
Posted today
Job Description
Our client is scaling production ML systems and needs a hands-on engineer to help build, maintain, and run essential ML data pipelines. You’ll own high-throughput data ingestion and transformation workflows (including image- and array-type modalities), enforce rigorous data quality standards, and partner with research and platform teams to keep models fed with reliable, versioned datasets.
- Design, build, and operate reliable ML data pipelines for batch and/or streaming use cases across cloud environments.
- Develop robust ETL/ELT processes (ingest, validate, cleanse, transform, and publish) with clear SLAs and monitoring.
- Implement data quality gates (schema checks, null/outlier handling, drift and bias signals) and data versioning for reproducibility.
- Optimize pipelines for distributed computing and large modalities (e.g., images, multi-dimensional arrays).
- Automate repetitive workflows with CI/CD and infrastructure-as-code; document, test, and harden for production.
- Collaborate with ML, Data Science, and Platform teams to align datasets, features, and model training needs.
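The data-quality-gate responsibility above (schema checks, null/outlier handling) can be sketched in a few lines of pandas. The column names and thresholds here are invented for illustration, not taken from the client's actual pipelines:

```python
import pandas as pd

def quality_gate(df: pd.DataFrame) -> pd.DataFrame:
    """Reject or repair records before they reach training pipelines."""
    # Schema check: required columns must be present
    required = {"image_id", "height", "width"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"schema check failed, missing columns: {missing}")

    # Null handling: drop rows with a missing identifier
    df = df.dropna(subset=["image_id"])

    # Outlier handling: keep only plausible image dimensions
    mask = df["height"].between(1, 10_000) & df["width"].between(1, 10_000)
    return df[mask].reset_index(drop=True)

batch = pd.DataFrame({
    "image_id": ["a", "b", None, "d"],
    "height": [512, -1, 256, 1024],
    "width": [512, 640, 256, 768],
})
clean = quality_gate(batch)
print(len(clean))  # → 2 (the null-id and negative-height rows are gated out)
```

In production such a gate would typically emit metrics on rejected rows rather than silently dropping them, so drift in rejection rates can feed the monitoring mentioned above.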
Minimum Qualifications:
- 5+ years building and operating data pipelines in production.
- Cloud: Hands-on with AWS, Azure, or GCP services for storage, compute, orchestration, and security.
- Programming: Strong proficiency in Python and common data/ML libraries (pandas, NumPy, etc.).
- Distributed compute: Experience with at least one of Spark, Dask, or Ray.
- Modalities: Experience handling image-type and array-type data at scale.
- Automation: Proven ability to automate repetitive tasks (shell/Python scripting, CI/CD).
- Data Quality: Implemented validation, cleansing, and transformation frameworks in production.
- Data Versioning: Familiar with tools/practices such as DVC, LakeFS, or similar.
- Languages: Fluent in English or Farsi.
Strongly Preferred:
- SQL expertise (writing performant queries; optimizing on large datasets).
- Data warehousing/lakehouse concepts and tools (e.g., Snowflake/BigQuery/Redshift; Delta/Lakehouse patterns).
- Data virtualization/federation exposure (e.g., Presto/Trino) and semantic/metadata layers.
- Orchestration (Airflow, Dagster, Prefect) and observability/monitoring for data pipelines.
- MLOps practices (feature stores, experiment tracking, lineage, artifacts).
- Containers & IaC (Docker; Terraform/CloudFormation) and CI/CD for data/ML workflows.
- Testing for data/ETL (unit/integration tests, great_expectations or similar).
Soft Skills:
- Executes independently and creatively; comfortable owning outcomes in ambiguous environments.
- Proactive communicator who collaborates cross-functionally with DS/ML/Platform stakeholders.
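The data/ETL testing expectation above (unit/integration tests for transforms) might look like the following minimal sketch; `normalize_prices` and its cleaning rule are hypothetical examples, not a real transform from this role:

```python
import pandas as pd

def normalize_prices(df: pd.DataFrame) -> pd.DataFrame:
    """Example transform: strip a leading currency symbol and cast to float."""
    out = df.copy()
    out["price"] = out["price"].str.replace("$", "", regex=False).astype(float)
    return out

def test_normalize_prices():
    raw = pd.DataFrame({"price": ["$1.50", "$2.00"]})
    result = normalize_prices(raw)
    # The transform should produce numeric values and leave row count intact
    assert result["price"].tolist() == [1.5, 2.0]
    assert len(result) == len(raw)

test_normalize_prices()
print("ok")
```

Frameworks like great_expectations layer declarative checks on top of this idea; plain pytest-style assertions like these are the simplest starting point.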
Location: Seattle, WA
Duration: 1+ year
Pay: $56/hr
Business Analyst - Data Pipeline
Posted 3 days ago
Job Description
Location: Remote - Mexico
Type: Full-Time Contract
About BigRio:
BigRio is a remote-based, technology consulting firm with headquarters in Boston, MA. We deliver software solutions ranging from custom development and software implementation to data analytics and machine learning/AI integrations. As a one-stop shop, we attract clients from a variety of industries due to our proven ability to deliver cutting-edge, cost-effective software solutions.
About the Role:
We are seeking a detail-oriented and technically proficient Business Analyst to support our healthcare data projects, with a focus on ETL pipelines, clinical data integration, and EMR/EHR systems. The successful candidate will work closely with both technical and clinical stakeholders to facilitate the smooth flow of data from electronic health records into analytics platforms. This is a remote position based in Mexico, supporting a US-based client in the healthcare and life sciences sector.
Key Responsibilities:
- Gather, analyze, and translate business and clinical requirements into clear technical documentation.
- Collaborate with engineering teams to support the design and implementation of ETL pipelines for clinical data ingestion.
- Conduct data mapping, gap analysis, and validation across multiple EMR/EHR systems and target data platforms.
- Ensure compatibility with interoperability standards such as HL7, FHIR, and common clinical data models.
- Document workflows, specifications, and test cases to support development and QA processes.
- Engage with clinical and business stakeholders to understand and refine data requirements.
- Maintain awareness of compliance requirements, including HIPAA, 21 CFR Part 11, and other relevant standards.
Requirements:
- 5+ years of experience as a Business Analyst in a healthcare or clinical data environment.
- Strong experience in ETL workflows , data pipeline design , and integration with EMR/EHR systems (e.g., Epic, Cerner, Meditech).
- Familiarity with healthcare interoperability standards, including HL7, FHIR, and CDA.
- Proficient in SQL and capable of analyzing and validating large datasets.
- Strong documentation skills for technical specs, data mappings, and business requirements.
- Experience with data governance, clinical terminology (ICD-10, SNOMED CT, LOINC).
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Familiarity with project management and documentation tools (e.g., JIRA, Confluence).
- Familiarity with the US healthcare system and associated data standards is a major plus.
- Prior experience working with clients in the US healthcare or life sciences domain.
- Excellent communication skills in English (verbal and written).
- Comfortable working in a remote, distributed team setting.
- Strong analytical and stakeholder management skills.
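As a rough illustration of the data-validation work described above, here is one way a simple format check on clinical diagnosis codes could look in Python. The regex is only an approximation of ICD-10-CM code shape (it deliberately ignores special-purpose U codes) and the records are invented:

```python
import re

# Hypothetical validation pass over extracted EMR records: flag diagnosis
# codes that do not match the basic ICD-10-CM shape (a letter, two digits,
# and an optional dot with up to four more alphanumeric characters).
ICD10_PATTERN = re.compile(r"^[A-TV-Z][0-9]{2}(\.[0-9A-Z]{1,4})?$")

def invalid_codes(records):
    """Return the diagnosis codes that fail the format check."""
    return [r["dx_code"] for r in records
            if not ICD10_PATTERN.match(r.get("dx_code", ""))]

sample = [
    {"patient_id": 1, "dx_code": "E11.9"},   # valid format
    {"patient_id": 2, "dx_code": "12345"},   # not ICD-10 shaped
    {"patient_id": 3, "dx_code": "I10"},     # valid format, no subcode
]
print(invalid_codes(sample))  # → ['12345']
```

A real validation pass would check codes against an actual terminology service or code set, not just a regex; the sketch shows the gap-analysis idea, not a production rule.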
Data Pipeline/ETL (Informatica) Developer
Posted today
Job Description
At Maximus, we're proud to be celebrating our 50th year in business, with strong financial performance - including $1.4B in revenue this quarter and 15% growth in our Federal services group.
Becoming part of Maximus means joining a team that offers:
* A generous annual allowance for education or professional certification
* Free access to robust certification and training programs to help you grow your career
* Strong career path with support for internal mobility
* A collaborative, respectful work environment with supportive leadership
* Comprehensive benefits, including medical/dental/vision, generous paid time off, and more
We are seeking a Data Pipeline/ETL (Informatica) Developer to join our team supporting an Internal Revenue Service (IRS) client.
Position is 100% remote, with preference for candidates close to Farmers Branch, TX or Lanham, MD.
Employment is contingent upon successful completion of the IRS-required Moderate Risk Background Investigation (MBI). The MBI requires that the selected candidate have held U.S. citizenship or Permanent Resident (green card) status for at least 3 years. The MBI certification process takes 4 to 5+ months, unless the candidate already holds an active MBI, which may shorten the timeline.
Essential Duties and Responsibilities:
- Provide design and implementation expertise to a cross-functional software development team.
- Design and develop software applications from business requirements in collaboration with other team members.
- Support testing and remediate defects.
- May provide guidance, coaching, and training to other employees within job area.
This role will be responsible for the development of data pipeline (Informatica workflows) and ETL (Post processing) code. The candidate will collaborate with IRS teams to design and develop ETL code to transform and process large amounts of data from disparate IRS data sources.
Job-Essential Duties and Responsibilities:
- Design and develop Informatica PowerCenter mappings including transformations, mappings, mapplets, workflows, workflow monitoring
- Design, code, unit test ETL packages, triggers, and stored procedures, views, SQL transactions
- Migrate data from on-prem managed-service environments to cloud platforms (AWS Redshift/Databricks)
- Ingest data from disparate data sources and apply business rules using SQL/Python to create fact, dimension and summary tables for ingestion by BI tools
- Create, test and refine SQL queries to support data profiling and data management
- Understand and comply with development standards and SDLC to ensure consistency across the project.
- Perform troubleshooting, debugging, and performance tuning
- Support installation, upgrades, and testing of Informatica components.
- Perform data profiling, data standardization, data transformations, data loads.
- Perform detailed data profiling and end-to-end data management analysis to understand data issues, and to identify potential options to resolve issues.
- Support phases of the project lifecycle including requirements gathering, analysis, design, development, testing, deployment, support, and documentation
- Analyze, document and define data management processes, both batch and real-time.
- Evaluate data management and data governance practices to provide options and recommendations.
- Create clear deliverable materials for a range of recipients including technical teams, business teams, project leads and executives.
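The fact/dimension/summary-table duty above can be sketched with pandas; the table names, columns, and business rule here are invented for illustration and bear no relation to actual IRS data:

```python
import pandas as pd

# Invented source data standing in for ingested records
raw = pd.DataFrame({
    "txn_id": [1, 2, 3, 4],
    "region": ["TX", "TX", "MD", "MD"],
    "amount": [100.0, 250.0, 75.0, 125.0],
})

# Dimension table: one row per region, with a surrogate key
dim_region = raw[["region"]].drop_duplicates().reset_index(drop=True)
dim_region["region_key"] = dim_region.index + 1

# Fact table: transactions joined to the dimension's surrogate key
fact_txn = raw.merge(dim_region, on="region")[["txn_id", "region_key", "amount"]]

# Summary table for consumption by BI tools
summary = fact_txn.groupby("region_key", as_index=False)["amount"].sum()
print(summary)
```

In the role itself this shaping would mostly happen in Informatica mappings and SQL; the pandas version just makes the fact/dimension split concrete.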
Job-specific minimum requirements:
- Bachelor's Degree in Information Technology or related field from an accredited college or university required; an additional four (4) years of related work experience can substitute for a degree
- At least five (5) years of relevant experience required
- Oracle and GreenPlum experience
- Proven Experience with SQL (Postgres, Redshift, MySQL), Python
- Strong experience with ETL development and working with diverse data sources
- Working knowledge of S3
- Advanced experience with Informatica Power Exchange, Control-M
- Experience with data modeling concepts
- Hands-on experience with Unix and shell scripts
- Experience in development, maintenance, and enhancement of Informatica Mappings, Work-flows, and processes
- Good verbal and written communication skills
- Ability to interface with all levels of management
- Ability to perform complex tasks with minimal supervision and guidance
- Excellent time management, scheduling and organizational skills
- Ability to work well independently or in a team setting
- Candidates must be a US Citizen or a Legal Permanent Resident (Green Card status) for 3 years and be Federal Tax compliant.
#techjobs #veteranspage #C0reJobs
Minimum Requirements
- Bachelor's degree in relevant field of study and 5+ years of relevant professional experience required, or equivalent combination of education and experience.
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at .
Minimum Salary: $135,000.00
Maximum Salary: $155,000.00
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process-including accessing job postings, completing assessments, or participating in interviews,-please contact People Operations at .
Minimum Salary
$
135,000.00
Maximum Salary
155,000.00
Data Pipeline/ETL (Informatica) Developer
Posted today
Job Viewed
Job Description
At Maximus, we're proud to be celebrating our 50th year in business, with strong financial performance, including $1.4B in revenue this quarter and 15% growth in our Federal services group.
Becoming part of Maximus means joining a team that offers:
* A generous annual allowance for education or professional certification
* Free access to robust certification and training programs to help you grow your career
* Strong career path with support for internal mobility
* A collaborative, respectful work environment with supportive leadership
* Comprehensive benefits, including medical/dental/vision, generous paid time off, and more
We are seeking a Data Pipeline/ETL (Informatica) Developer to join our team supporting an Internal Revenue Service (IRS) client.
Position is 100% remote with preference close to Farmers Branch, TX or Lanham, MD.
Employment is contingent upon successful completion of the IRS-required Moderate Risk Background Investigation (MBI). The MBI requires that the selected candidate have held U.S. citizenship or Permanent Resident (Green Card) status for at least 3 years. The MBI certification process will take 4 to 5+ months, unless the candidate already holds an active MBI, which may shorten the timeline.
Essential Duties and Responsibilities:
- Provide design and implementation expertise to a cross-functional software development team.
- Design and develop software applications from business requirements in collaboration with other team members.
- Support testing and remediate defects.
- May provide guidance, coaching, and training to other employees within job area.
This role will be responsible for the development of data pipeline (Informatica workflows) and ETL (Post processing) code. The candidate will collaborate with IRS teams to design and develop ETL code to transform and process large amounts of data from disparate IRS data sources.
Job-Essential Duties and Responsibilities:
- Design and develop Informatica PowerCenter objects, including transformations, mappings, mapplets, and workflows, and monitor workflow execution
- Design, code, and unit test ETL packages, triggers, stored procedures, views, and SQL transactions
- Migrate data from on-premises managed-service environments to cloud platforms (AWS Redshift/Databricks)
- Ingest data from disparate data sources and apply business rules using SQL/Python to create fact, dimension and summary tables for ingestion by BI tools
- Create, test and refine SQL queries to support data profiling and data management
- Understand and comply with development standards and SDLC to ensure consistency across the project.
- Perform troubleshooting, debugging, and performance tuning
- Support installation, upgrades, and testing of Informatica components.
- Perform data profiling, data standardization, data transformations, data loads.
- Perform detailed data profiling and end-to-end data management analysis to understand data issues and identify options to resolve them.
- Support phases of the project lifecycle including requirements gathering, analysis, design, development, testing, deployment, support, and documentation
- Analyze, document and define data management processes, both batch and real-time.
- Evaluate data management and data governance practices to provide options and recommendations.
- Create clear deliverable materials for a range of recipients including technical teams, business teams, project leads and executives.
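For illustration only (this sketch is not part of the role's actual codebase; the `filings` table, its columns, and the summary-table name are hypothetical), the profiling and summary-table duties above might look like the following in Python with embedded SQL, using the standard-library sqlite3 module as a stand-in for a database such as Postgres or Redshift:

```python
import sqlite3

# Small in-memory table standing in for a feed from a disparate source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE filings (filing_id INTEGER, state TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO filings VALUES (?, ?, ?)",
    [(1, "TX", 120.0), (2, "MD", None), (3, "TX", 95.5), (4, None, 40.0)],
)

# Data profiling: row count, null count, and distinct count per column.
profile = {}
total = conn.execute("SELECT COUNT(*) FROM filings").fetchone()[0]
for col in ("filing_id", "state", "amount"):
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM filings WHERE {col} IS NULL"
    ).fetchone()[0]
    distinct = conn.execute(
        f"SELECT COUNT(DISTINCT {col}) FROM filings"
    ).fetchone()[0]
    profile[col] = {"rows": total, "nulls": nulls, "distinct": distinct}

# Apply a business rule (drop rows with no state) and roll detail rows
# up into a summary table for ingestion by BI tools.
conn.execute(
    "CREATE TABLE state_summary AS "
    "SELECT state, COUNT(*) AS filings, SUM(amount) AS total_amount "
    "FROM filings WHERE state IS NOT NULL GROUP BY state"
)
summary = conn.execute("SELECT * FROM state_summary ORDER BY state").fetchall()
```

The same profile-then-publish pattern carries over to production warehouses; only the connection library and SQL dialect change.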
Job-specific minimum requirements:
- Bachelor's Degree in Information Technology or related field from an accredited college or university required; an additional four (4) years of related work experience can substitute for a degree
- At least five (5) years of relevant experience required
- Oracle and Greenplum experience
- Proven experience with SQL (Postgres, Redshift, MySQL) and Python
- Strong experience with ETL development and working with diverse data sources
- Working knowledge of S3
- Advanced experience with Informatica PowerExchange and Control-M
- Experience with data modeling concepts
- Hands-on experience with Unix and shell scripts
- Experience in the development, maintenance, and enhancement of Informatica mappings, workflows, and processes
- Good verbal and written communication skills
- Ability to interface with all levels of management
- Ability to perform complex tasks with minimal supervision and guidance
- Excellent time management, scheduling and organizational skills
- Ability to work well independently or in a team setting
- Candidates must have been U.S. Citizens or Legal Permanent Residents (Green Card status) for at least 3 years and be Federal Tax compliant.
#techjobs #veteranspage #C0reJobs
Minimum Requirements
- Bachelor's degree in relevant field of study and 5+ years of relevant professional experience required, or equivalent combination of education and experience.
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at .
Minimum Salary
$135,000.00
Maximum Salary
$155,000.00