1,747 ETL Architect jobs in the United States
ETL Architect
Posted today
Job Description
Key Responsibilities
- Design and implement ETL/ELT architecture with Snowflake as the enterprise data warehouse.
- Integrate data from diverse sources (RDBMS, APIs, SaaS apps, flat files, streaming platforms, cloud services) into Snowflake.
- Define data integration best practices, including reusability, scalability, and cost optimization.
- Lead and mentor ETL/ELT developers in building robust pipelines.
- Optimize Snowflake performance (virtual warehouses, clustering, query tuning).
- Establish data quality, governance, and lineage frameworks.
- Collaborate with data architects, BI developers, and business stakeholders for end-to-end data delivery.
- Evaluate and implement ETL/ELT tools and automation frameworks suited for multiple source systems.
- Troubleshoot integration issues and define long-term solutions.
- Keep up to date with Snowflake features and emerging data integration technologies.
Required Skills & Qualifications
- 15+ years in ETL/ELT architecture and development.
- Strong expertise in Snowflake (warehouses, streams, tasks, Snowpipe, data sharing).
- Proficiency in multiple ETL/ELT tools: Informatica, Talend, Matillion, SSIS, AWS Glue, dbt, Fivetran.
- Strong SQL and performance optimization skills.
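The streams, tasks, and Snowpipe features listed above all serve incremental loading: a stream exposes only the rows changed since it was last consumed, and a task merges that delta into a target table. The following is a minimal in-memory sketch of that pattern, with hypothetical table shapes and no actual Snowflake API calls:

```python
# In-memory sketch of the Snowflake "stream + task" pattern: a stream
# exposes only rows appended since the last consumption, and a task
# MERGEs the delta into a target. Table and column names are
# hypothetical illustrations, not from the posting.

def stream_delta(source, last_offset):
    """Return rows appended to `source` since `last_offset`, plus the new offset."""
    delta = source[last_offset:]
    return delta, len(source)

def merge_into(target, delta, key):
    """MERGE-style upsert: update matching keys, insert the rest."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in delta:
        if row[key] in index:
            target[index[row[key]]] = row   # WHEN MATCHED THEN UPDATE
        else:
            target.append(row)              # WHEN NOT MATCHED THEN INSERT
    return target

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target, offset = [], 0

delta, offset = stream_delta(source, offset)
target = merge_into(target, delta, "id")

source.append({"id": 1, "amt": 99})         # a late update arrives
delta, offset = stream_delta(source, offset)
target = merge_into(target, delta, "id")

print(target)  # id 1 now carries the updated amount; id 2 is unchanged
```

In Snowflake itself the second merge would be a `MERGE INTO ... USING <stream>` statement scheduled by a task; the point of the sketch is only that consuming the stream advances the offset, so each delta is processed once.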
ETL Architect

Posted 1 day ago
Job Description
+ Design, develop, and support data ingestion and transformation solutions for reporting and analytics as the ETL Architect. Perform data analysis, construct technical designs, communicate with stakeholders, and coordinate the work of offshore development and support resources
+ Perform data analysis and create data models for peer review, ensuring high standards of data integrity and quality
+ Provide guidance and direction to offshore support and development resources to enhance collaboration and project outcomes
+ Communicate effectively with BI team members, offshore teams, and cross-IT staff regarding support and project activities
+ Assist in task identification and effort estimation for ETL development, contributing to project planning and execution
+ Offer off-hour and weekend support on a rotating basis to ensure the reliability and availability of data solutions
+ Other duties as assigned
Required Education and Experience:
+ Bachelor's Degree in Computer Science, Information Systems, or a related discipline and 4 to 6+ years of related experience, or High School Diploma/General Education Degree (GED) and 8+ years of specific experience
+ 2+ years of ETL design experience
+ Design and development experience with scheduling and automation tools, preferably in the Azure environment, along with excellent ETL development skills using Azure Data Factory, strong SQL proficiency, and a solid understanding of dimensional modeling and SDLC best practices in DW/BI/Analytics
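The dimensional-modeling requirement above centers on one recurring step: resolving a fact row's natural key to a dimension surrogate key, minting a new dimension member when none exists. A hedged sketch, with hypothetical table and key names (in practice Azure Data Factory would orchestrate this logic rather than run Python directly):

```python
# Minimal surrogate-key lookup for a star-schema load: each natural key
# maps to one surrogate key in the dimension; fact rows store only the
# surrogate. Names (dim_product, SKU-*) are illustrative assumptions.

dim_product = {}   # natural key -> surrogate key
_next_sk = [1]     # next surrogate key to mint

def surrogate_key(natural_key: str) -> int:
    """Return the existing surrogate key, or mint a new dimension member."""
    if natural_key not in dim_product:
        dim_product[natural_key] = _next_sk[0]
        _next_sk[0] += 1
    return dim_product[natural_key]

facts = [("SKU-1", 10), ("SKU-2", 5), ("SKU-1", 7)]
fact_rows = [(surrogate_key(nk), qty) for nk, qty in facts]

print(fact_rows)  # [(1, 10), (2, 5), (1, 7)]
```

Repeated natural keys reuse the same surrogate, which is what keeps fact-to-dimension joins stable even if the natural key's source system changes format.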
Preferred Education and Experience:
+ Master's Degree
+ 4+ years of ETL development experience
+ Microsoft Azure, Snowflake, and other relevant certifications
+ Experience with orchestration in the Azure environment, Snowflake data warehouse, Azure DevOps, GitHub, Azure Blob Storage/ADLS for data storage and analysis, and proficiency in JavaScript and Python for stored procedures and scripting
Benefits
At the Reyes Family of Businesses, our Total Rewards Strategy prioritizes the holistic well-being of our employees. This position offers a comprehensive benefits package that includes Medical, Dental, Vision coverage, Paid Time Off, Retirement Benefits, and complimentary Health Screenings.
Equal Opportunity Employee & Physical Demands
Reyes Holdings and its businesses are equal opportunity employers. Company policy prohibits discrimination and harassment against any applicant or employee based on race, color, religion, sex, pregnancy or pregnancy-related medical conditions, marital status, sexual orientation, gender identity or expression, age, national origin, citizenship, disability, genetic information, military or veteran status, or any other basis protected by applicable law. In addition, the Company is committed to providing reasonable accommodation to applicants and employees in accordance with applicable law. Requests for accommodation should be directed to your point of contact in the Talent Acquisition or Human Resources departments.
Background Check and Drug Screening
Offers of employment are contingent upon successful completion of a background check and drug screening.
Pay Transparency
Our compensation philosophy embraces diverse factors for fair pay decisions, valuing skills, experience, and the needs of our business. Moreover, this role may have the opportunity to participate in a discretionary incentive program, subject to program rules.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation and gender identity, national origin, disability, or protected veteran status. Drug Free Workplace.
ETL Solutions Architect
Posted today
Job Description
Role Overview:
The Solution Architect will be responsible for designing, overseeing, and ensuring the successful implementation of data integration pipelines and executive dashboards.
This role requires close collaboration with business stakeholders, project managers, developers, and QA teams to translate requirements into scalable architecture and actionable technical solutions.
Key Responsibilities:
• Define and own the overall solution architecture for dashboards, ETL pipelines, and integrations across tools such as Jira, Jama, TestRail, GitHub, and Excel.
• Lead requirements workshops with stakeholders to finalize KPIs, data sources, and dashboard layouts.
• Design role-based, secure, and scalable architecture aligned with enterprise standards and compliance needs.
• Guide development teams on building Power BI dashboards, data transformation scripts, and CI/CD pipelines.
• Ensure solution alignment with V&V governance process and compliance models (ISO/ASPICE).
• Oversee deployment strategy, training, and transition to client teams.
• Provide architectural oversight during iterative sprints, ensuring quality, performance, and stakeholder satisfaction.
• Collaborate with DevOps and QA teams to validate solution performance and reliability.
Required Skills & Experience:
• 10+ years in IT with at least 5 years as a Solution Architect in analytics, data integration, or dashboard projects.
• Strong expertise in Power BI, ETL/ELT design, and data modeling.
• Experience with GCP cloud services, IAM/AD integration, and CI/CD pipelines.
• Hands-on knowledge of APIs, data pipelines, and enterprise data platforms.
• Familiarity with V&V, QA, and compliance-focused dashboards in automotive or manufacturing domains.
• Proven ability to run workshops, translate business KPIs into technical designs, and deliver executive-ready solutions.
• Excellent communication and stakeholder management skills.
Nice to Have:
• Experience with Jira, Jama, TestRail, GitHub.
• Knowledge of automotive quality frameworks (ISO, ASPICE).
• Prior work on shift-left quality dashboards or similar initiatives.
MuleSoft Kafka ETL Developer/Architect
Posted 4 days ago
Job Description
Location: Plano, TX
Duration: Full-time
Job Description:
Must Have Technical/Functional Skills:
Primary Skill: Mulesoft
Secondary: Kafka, Hadoop, or Informatica.
Experience: Minimum 8 years.
Roles & Responsibilities:
MuleSoft:
- Strong knowledge of MuleSoft Anypoint platform, including Anypoint Studio IDE, Design Center, API Manager, and Runtime.
- Experience with API-led connectivity and microservices architecture.
- Experience with MuleSoft connectors for various data sources and systems.
- Experience with DataWeave scripting and RAML specifications.
Kafka:
- Experience with Kafka brokers, producers, and consumers.
- Knowledge of Kafka concepts such as topics, partitions, and offsets.
- Experience reading/writing messages to Kafka using an ETL tool such as Informatica or Hadoop.
Hadoop:
- Experience with HDFS, MapReduce, and the Hadoop ecosystem.
- Understanding of data storage and processing techniques in Hadoop.
Informatica:
- Work with Informatica PowerCenter or other Informatica products for ETL (Extract, Transform, Load) processes and data integration.
- API-Driven Connectivity: Focus on building APIs that enable seamless data flow between systems, following API-led connectivity principles. Drive value delivery and continuously improve the product by effectively utilizing data such as feedback and metrics (quality, delivery rate, etc.) to identify opportunities.
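The Kafka concepts named above (topics, partitions, offsets) reduce to a simple contract: messages with the same key land on the same partition, and each partition is an ordered log whose positions are the offsets. A self-contained sketch, using a CRC as a stand-in for Kafka's murmur2 partitioner (topic layout and key names are illustrative assumptions):

```python
# Toy model of Kafka keyed partitioning and per-partition offsets.
# zlib.crc32 stands in for the murmur2 hash Kafka's default
# partitioner actually uses; everything else is illustrative.
import zlib

NUM_PARTITIONS = 3

def partition_for(key: bytes) -> int:
    """Messages with the same key always land on the same partition."""
    return zlib.crc32(key) % NUM_PARTITIONS

topic = {p: [] for p in range(NUM_PARTITIONS)}  # partition -> ordered log

def produce(key: bytes, value: str):
    """Append to the chosen partition; the offset is its position in that log."""
    p = partition_for(key)
    topic[p].append(value)
    return p, len(topic[p]) - 1  # (partition, offset)

p1, o1 = produce(b"order-42", "created")
p2, o2 = produce(b"order-42", "shipped")

assert p1 == p2       # same key -> same partition, so per-key ordering holds
assert o2 == o1 + 1   # offsets increase monotonically within a partition
```

This is why keyed production preserves ordering per key but not across the whole topic, a distinction that matters when an ETL tool replays messages from a stored offset.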
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
Data Integration
Posted 4 days ago
Job Description
Client: Options Clearing Corporation
Location: Hybrid downtown Chicago
Interview Mode: Virtual 2 rounds
- Contingent Worker - RCR-956 TDM-6
I. WORK TO BE PERFORMED:
- Responsible for the design, development, implementation, and maintenance of new and existing Data Integration processes within the Strategic Data Platform and other data solutions implementations.
- Design, develop, maintain, enhance, and monitor Data Integration processes sourcing data from various structured and semi-structured data sources, transforming as per provided business rules or data mappings, and loading into target databases/services including DB2, AWS S3, and Apache Iceberg.
- Serve as a subject matter expert on multiple data integration projects, ensuring consistent design and efficient data processing.
- Play a leading role in Build, Test, and Implementation phases of assigned projects, with a diverse development team spread across multiple locations.
- Provide Level 2 and Level 3 technical support to mission-critical applications.
- Perform process improvement and re-engineering with an understanding of technical data problems and solutions as they relate to the current and future business environment.
- Analyze complex distributed production deployments and make recommendations to optimize performance.
- Design and develop innovative data solutions for demanding business situations.
- Develop and document Data Integration standards and procedures.
- Design reusable and portable Data Integration components.
- Transform flat delimited files into multiple formats, including Parquet, JSON, and Protobuf.
- Investigate production Data Integration issues and work to determine the root cause of problems.
- Work with the DBA teams to fix performance issues as required.
- Work with the QA team to address testing issues as they occur.
- Work with the Data Quality/Data Governance team to ensure required data quality and data integrity standards are met.
- Participate in daily Scrum standups. Update task status in Jira.
- Conduct Code reviews and incorporate suggested improvements/fixes.
- Perform other duties as assigned.
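One of the duties above is transforming flat delimited files into Parquet, JSON, and Protobuf. A minimal sketch of the first two targets using only the standard library; the column-oriented dict stands in for what a Parquet writer (e.g. pyarrow, not shown) would consume, and Protobuf is omitted since it requires generated message classes. Field names are hypothetical:

```python
# Transform a pipe-delimited flat file into row-oriented JSON Lines
# and a column-oriented structure (the shape Parquet libraries expect).
# The sample data and field names are illustrative, not from the posting.
import csv, io, json

raw = "trade_id|symbol|qty\n1|OCC|100\n2|SPX|250\n"

rows = list(csv.DictReader(io.StringIO(raw), delimiter="|"))

# Row-oriented: one JSON object per record (JSON Lines style).
json_lines = [json.dumps(r) for r in rows]

# Column-oriented: dict of column name -> list of values.
columns = {name: [r[name] for r in rows] for name in rows[0]}

print(json_lines[0])   # {"trade_id": "1", "symbol": "OCC", "qty": "100"}
print(columns["qty"])  # ['100', '250']
```

A real pipeline would also apply the business-rule type conversions the posting mentions (here every value is still a string), but the row-versus-column pivot is the structural core of the format change.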
II. SKILL AND EXPERIENCE REQUIRED:
- 3+ years in IT application development, with at least 2 years in Data Integration (ETL) Development, using tools such as DataStage, Python, SQL and Unix Shell Scripting.
- Experienced practitioner with a proven track record of implementing mid-scale to large-scale Data Integration (ETL) solutions, performing data analysis and profiling, interacting with client management, and designing and architecting dynamic ETL solutions.
- Working knowledge of data types and databases, including relational, dimensional, vertical, and NoSQL databases.
- Hands-on experience with data cleansing and data visualization and familiarity with Hadoop Ecosystem.
- Experience with platforms such as Hive and Trino/Starburst.
- Strong conceptual knowledge of file formats such as JSON, Parquet and Protobuf.
- Capable of writing and optimizing SQL queries and stored procedures.
- Exposure to AWS services including but not limited to S3, PostgreSQL, Apache Iceberg, and Kafka.
- The ideal candidate will have worked on a Scrum team and be familiar with tools such as Jira and Git.
- Experience developing containerized Data Integration solutions using Kubernetes, Rancher, and CI/CD pipelines using Harness, Jenkins and Artifactory is a plus.
- Experience working with scheduling tools such as UC4, Control-M, or Autosys is a plus.
- Experience working with cloud ecosystems (AWS, Azure, or GCP; AWS preferred) in multi-availability-zone, multi-region deployments.
- Experience in database performance optimization techniques, multidimensional database designs, and physical database implementations for databases including DB2 and Apache Iceberg is a plus.
- Strong verbal/written communication skills.
- Ability to interact with business users and to develop positive working relationships.
- AWS Certified Solution Architect Associate Level is a plus.
- Big data related certification is a plus.
Data Integration/Data Engineer

Posted 1 day ago
Job Description
Job Category: Science
Time Type: Full time
Minimum Clearance Required to Start: Secret
Employee Type: Regular
Percentage of Travel Required: Up to 10%
Type of Travel: Local
* * *
**The Opportunity:**
As a Data Integration/Data Engineer working at Joint Interagency Task Force South (JIATF-S), you will support mission-critical data analytics initiatives to improve decision-making, counter threats, and enhance partnerships. The candidate shall perform the following tasks:
+ Lead data processing and analytics initiatives to create and maintain a Common Intelligence Picture (CIP) and Common Operational Picture (COP) for real-time mission awareness.
+ Enable effective integration across a multi-platform ecosystem, focusing on data platforms such as Advana and Maven Smart to improve data accessibility and operational efficiency.
+ Design, test, and manage advanced data solutions, developing and deploying analytical models that drive data-driven insights.
+ Work collaboratively with intelligence and operational teams to ensure mission data is accurate, timely, and accessible across all relevant systems.
+ Support data literacy and promote data-driven decision-making by contributing to training and adoption efforts within the command.
**Responsibilities:**
+ Integrate mission-specific data to support the development of a Common Intelligence Picture (CIP) and Common Operational Picture (COP), enhancing situational awareness and decision-making.
+ Support data ingestion, cleaning, transformation, and storage in Advana and Maven Smart. Develop and manage ETL pipelines to ensure data accuracy, consistency, and accessibility.
+ Build, test, and deploy machine learning and statistical models tailored to JIATF-S and SOUTHCOM's needs for data-driven insights, such as real-time intelligence analysis, strategic decision-making support, and operational planning.
+ Implement processes for data verification, tagging, and structuring to maintain trusted, secure, and high-quality data resources. Ensure that data assets meet compliance standards and organizational requirements for secure access and usage.
+ Work closely with internal and external stakeholders, including the Chief Data Office, intelligence teams, and operational units, to address data needs and deliver actionable insights that align with JIATF-S and SOUTHCOM's objectives.
+ Utilize specialized knowledge in Advana and Maven Smart for data-driven solutions. Lead efforts to integrate new data sources into these platforms, ensuring interoperability across JIATF-S's data environment.
+ Contribute to data literacy initiatives by creating and delivering training materials for team members and partners. Promote data-driven decision-making practices within JIATF-S and SOUTHCOM.
+ Develop comprehensive documentation of data processes, models, and analytical findings. Generate reports to track progress on data and AI initiatives, ensuring alignment with the SOUTHCOM Campaign Plan and Command priorities.
**Qualifications:**
_Required:_
+ Bachelor's degree in Data Science, Computer Science, Engineering, or a related field with at least 3 years of experience in data science or data engineering. Equivalent combinations of education and specialized experience, such as an Associate degree with 7 years of experience, or more than 11 years of recent specialized experience will also be considered.
+ Must hold a current secret security clearance.
+ Proficiency in SQL, Python, R, or similar data science and engineering languages.
+ Hands-on experience with DoD data platforms, specifically Advana and Maven Smart.
+ Strong knowledge of data management, ETL processes, and data integration best practices.
_Desired:_
+ Experience with data analytics and AI in support of Geographic Combatant Command operations.
+ Prior involvement in data-driven decision support for Operational Planning Teams (OPTs) and similar cross-functional DoD projects.
+ Familiarity with cloud environments and big data platforms, including data security standards and compliance for government systems.
+ Proven track record of promoting data literacy and supporting data-driven organizational culture.
**What You Can Expect:**
**A culture of integrity.**
At CACI, we place character and innovation at the center of everything we do. As a valued team member, you'll be part of a high-performing group dedicated to our customer's missions and driven by a higher purpose - to ensure the safety of our nation.
**An environment of trust.**
CACI values the unique contributions that every employee brings to our company and our customers - every day. You'll have the autonomy to take the time you need through a unique flexible time off benefit and have access to robust learning resources to make your ambitions a reality.
**A focus on continuous growth.**
Together, we will advance our nation's most critical missions, build on our lengthy track record of business success, and find opportunities to break new ground - in your career and in our legacy.
**Your potential is limitless.** So is ours.
Learn more about CACI here.
**Pay Range:** There are a host of factors that can influence final salary including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at CACI that allows them to balance quality work and their personal lives. We offer competitive compensation, benefits, and learning and development opportunities. Our broad and competitive mix of benefits options is designed to support and protect employees and their families. At CACI, you will receive comprehensive benefits such as healthcare, wellness, financial, retirement, family support, continuing education, and time off benefits.
The proposed salary range for this position is:
$78,000 - $163,800
_CACI is_ _an Equal Opportunity Employer._ _All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, age, national origin, disability, status as a protected veteran, or any_ _other protected characteristic._
Data Integration Engineer
Posted today
Job Description
DATA INTEGRATION ENGINEER
REPORTS TO: DIRECTOR OF SOFTWARE DEVELOPMENT AND INTEGRATIONS
STATUS: EXEMPT
Summary
Boot Barn is where community comes first. We thrive on togetherness, collaboration, and belonging. We build each other up, listen intently, and implement out-of-the-box ideas. We celebrate new innovations, congratulate one another’s achievements, and most importantly support each other.
At Boot Barn, we work together to make a positive impact on the world around us, and by working collectively with encouragement, we consider ourselves “Partners.” With the values of the West guiding us, Boot Barn celebrates heritage, welcomes all, and values each unique Partner within our Boot Barn community.
Our vision is to offer everyone a piece of the American spirit – one handshake at a time.
The Data Integration Engineer designs, builds, and operates batch and streaming pipelines that move data from MS SQL Server, MongoDB, legacy files, and third-party APIs into our Data Vault warehouse and machine-learning (ML) cluster, ensuring that data is accurate, timely, and analytics-ready. The role blends hands-on ETL/ELT development in SSIS, Spark, Runbooks, and Azure Data Factory with data-modeling expertise (hubs, links, satellites) to support scalable reporting, predictive models, and AI agents, working closely with the development team and cross-functional product teams.
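In Data Vault modeling of the kind described above, a hub row is identified by a hash of the business key, and a satellite row carries a hash diff of its descriptive attributes so unchanged records can be skipped on reload. A hedged sketch of those two hashes; the column names and normalization rules are illustrative assumptions, not Boot Barn's actual model:

```python
# Data Vault 2.0-style keying: hash_key identifies a hub row from its
# (normalized) business key; hash_diff fingerprints a satellite payload
# so an unchanged load produces no new satellite row. MD5 is the
# conventional choice in Data Vault 2.0, not a security decision.
import hashlib

def hash_key(*business_key: str) -> str:
    """Deterministic hub key from the normalized business key parts."""
    joined = "||".join(part.strip().upper() for part in business_key)
    return hashlib.md5(joined.encode()).hexdigest()

def hash_diff(attributes: dict) -> str:
    """Fingerprint of a satellite payload, independent of column order."""
    joined = "||".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.md5(joined.encode()).hexdigest()

hub_customer = hash_key("C-1001")
sat_v1 = hash_diff({"name": "Ava", "state": "CA"})
sat_v2 = hash_diff({"state": "CA", "name": "Ava"})   # same payload, reordered

assert sat_v1 == sat_v2                       # no change -> no new satellite row
assert hub_customer == hash_key(" c-1001 ")   # normalization keeps keys stable
```

The "automate schema evolution and metadata population" duty below builds on exactly these deterministic keys: because hashes are computed the same way for every source, new hubs, links, and satellites can be generated from metadata rather than hand-coded.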
Essential Duties and Responsibilities
- Design, develop, and deploy incremental and full load pipelines using SSIS, Spark, Runbooks and Azure Data Factory to ingest data into landing, raw, and curated layers of the Data Vault.
- Build CDC (change data capture) solutions to minimize latency for downstream reporting and ML features.
- Automate schema evolution and metadata population for hubs, links, and satellites.
- Implement validation rules, unit tests, and data quality frameworks to enforce referential integrity and conformance to business rules.
- Maintain a requirements traceability matrix and publish data lineage documentation via metadata management and SSAS models.
- Partner with Business Analysts to translate user stories into technical interfaces and mapping specs.
- Provide regular knowledge transfer sessions and production support, serving as Tier 2 escalation for data integration incidents.
- Create CI/CD pipelines (Azure DevOps, Git) to version ETL code, infrastructure as code, and automated tests.
- Develop PowerShell/.NET utilities to orchestrate jobs, manage secrets, and push metrics to Grafana or Azure Monitor.
- Benchmark and tune Spark, SQL, and SSIS performance; recommend indexing, partitioning, and cluster-sizing strategies for cost/performance balance.
- Stay current with emerging integration patterns (e.g., event driven architectures, Delta Lake) and propose pilots for adoption.
- Demonstrates high level of quality work, attendance and appearance.
- Demonstrates high degree of professionalism in communication, attitude and teamwork with customers, peers and management.
- Adhere to all local, federal and state laws in addition to Company policies, procedures, and practices.
- Performs any other duties that may be assigned by management.
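The CDC duty above, minimizing latency by shipping only changed rows, can be sketched at its simplest as a snapshot diff. Real pipelines would read SQL Server change tables or the transaction log instead of comparing full snapshots; the keys and row shapes here are illustrative:

```python
# Snapshot-diff change data capture: compare the current source state to
# the last-loaded state and emit only inserts, updates, and deletes.
# Row contents and keys are hypothetical examples.

def capture_changes(previous: dict, current: dict) -> dict:
    """Both args map primary key -> row. Returns the delta by operation."""
    inserts = {k: current[k] for k in current.keys() - previous.keys()}
    deletes = {k: previous[k] for k in previous.keys() - current.keys()}
    updates = {k: current[k] for k in current.keys() & previous.keys()
               if current[k] != previous[k]}
    return {"insert": inserts, "update": updates, "delete": deletes}

prev = {1: ("boots", 120), 2: ("hat", 35)}
curr = {1: ("boots", 110), 3: ("belt", 40)}

delta = capture_changes(prev, curr)
print(delta["update"])  # {1: ('boots', 110)}
```

Downstream, only `delta` needs to flow into the Data Vault's landing layer, which is what keeps reporting and ML feature latency low relative to full reloads.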
Additional Responsibilities
- Provide functional support for critical business applications and integrations, escalating technical issues to development teams as needed.
- Contribute to data-catalog enrichment, data-dictionary curation, and SLA reporting.
- Mentor developers on ETL best practices and Data Vault modeling.
- Adheres to all Company Policies & Procedures and Safety Regulations
- Perform other duties as assigned in alignment with evolving business priorities.
- Understands and complies with all company rules and regulations
- No direct reports; may lead ad-hoc project teams or vendor resources during integration projects.
Qualifications
- Strong communication, customer service, time management and organizational skills.
- Bachelor’s in Computer Science, Information Systems, or related field.
- 4+ years building data integration with MS SQL, SSIS, or Spark; 2+ years with cloud based ETL platforms.
- Strong T SQL, Python/Scala for Spark, PowerShell/.NET scripting; working knowledge of MongoDB aggregation, SSAS tabular models, and Git CI/CD.
- Data Vault 2.0 certification a plus.
- Excellent problem solving, communication, and stakeholder management abilities.
Competencies
- Ensure Effective Communication - Listens carefully and attentively to others' opinions and ideas. Communicates information clearly, concisely, and professionally.
- Establish Trust - Follows through on commitments. Is honest and direct with others. Promotes a culture of respect for, commitment to, and compliance with Company values, beliefs, and standards. Ensures the protection of confidential information.
Boot Barn Benefits & Additional Compensation Opportunities
- Competitive hourly salary.
- Merchandise discount: 50% off of Exclusive Brands and 40% off of third-party brands.
- Paid Time Off plan for year-round Boot Barn Partners.**
- Medical, Dental, Vision and Life Insurance.**
- 401(k) plan with generous company matching.
- Flexible schedules and work/life balance.
- Opportunities for growth at every level – we are opening 50+ new stores each year.
**For eligible Boot Barn Partners
PAY RANGE: $100,000.00 - $120,000.00*
*compensation varies based on geography, skills, experience, and tenure
Physical Demands
In general, the following physical demands are representative of those that must be met by a Partner to successfully perform the essential functions of this job. Reasonable accommodations may be made to allow differently abled individuals to perform the essential functions of the job.
- Standing, walking and squatting less than fifty percent of the work shift.
- Required to lift, move and carry up to 40 pounds.
- Ability to read, count and write to accurately complete all documentation and reports.
- Must be able to see, hear and speak in order to communicate with partners and customers.
- Specific vision abilities include close vision, distance vision, peripheral vision, depth perception and ability to adjust focus.
- Manual dexterity required: using hands to finger, handle, feel, and type; reach with hands and arms.
( ) Sedentary: Limited activity, no lifting, limited walking
( X ) Light: Office work, some lifting, bending, stooping or kneeling, walking
( ) Moderate: Mostly standing, walking, bending, frequent lifting
( ) Arduous: Heavy lifting, bending, crawling, climbing
Work Environment
In general, the following conditions of the work environment are representative of those that a Partner encounters while performing the essential functions of this job. Reasonable accommodations may be made to allow differently abled individuals to perform the essential functions of the job within the environment.
- The workspace is clean, orderly, properly lighted and ventilated with the proper safety compliance.
- Noise levels are considered moderate.
Boot Barn, Inc. reserves the right to make exceptions to modify or eliminate this document or its content. This document supersedes all previous policies, procedures or guidelines pertaining to this subject.
Our core value of community bands us together in supportive and inclusive ways to drive our collective success. Boot Barn provides equal employment opportunity to all applicants and employees without regard to race, color, religion, sex, sexual orientation, age, national or ethnic origin, veteran or military status, disability, as well as any other protected status under the law.
Americans with Disabilities Act (ADA) - Boot Barn will provide reasonable accommodations (such as a qualified sign language interpreter or other personal assistance) with the application process upon your request as required by applicable laws. If you have a disability and require assistance in this application process, please visit your nearest Boot Barn Store or Distribution Center or reach out to Human Resources at , Option 4.
Data Integration Specialist
Posted today
Job Description
Who You Are
You are detail-oriented, analytical, and thrive on solving problems in real time. With at least 3 years of related IT or data integration experience, you’re comfortable monitoring transactions, troubleshooting issues, and keeping systems running smoothly. You enjoy collaborating across teams and have strong communication skills to explain technical issues in a clear, simple way.
What You Will Be Doing
As our Data Integration Specialist, you’ll ensure the accuracy and reliability of data integrations for purchase orders, sales orders, and other transactional systems. Your day-to-day will include:
- Monitoring inbound and outbound integrations to ensure accuracy and timeliness
- Identifying and resolving discrepancies or integration issues
- Performing initial troubleshooting and escalating when necessary
- Maintaining process documentation and adhering to workflows
- Supporting the business with data integrity across all interfaces
- Being available for off-hours response if urgent issues arise
What You Need
- Bachelor’s degree in IT, computer science, or related field (or equivalent experience)
- 3+ years of related experience in data integration or IT operations
- Strong problem-solving and analytical skills (level 1 troubleshooting)
- Familiarity with integration tools and processes (no coding required)
- Excellent communication and collaboration skills with technical and non-technical teams
- Ability to work onsite full-time in Mansfield, MA
- Flexibility to respond to after-hours emergencies if needed
Additional Information
Our client has over 90 years of experience in the beverage alcohol industry. We’re a family-owned company that values our employees and offers:
- 85k-100k annual salary
- Medical, dental, and 401(k) with profit sharing
- Tuition reimbursement
- A supportive, people-first culture
Data Integration Analyst
Posted 3 days ago
Job Description
* Develop detailed technical specifications and designs for waterfall/XP projects and system enhancements.
* Provide application domain knowledge to the ADM team.
* Manage unstable applications / systems by helping to resolve incidents and issues.
* Support infrastructure first build with Solution Architects and Infrastructure Engineers.
* Review, guide, troubleshoot and resolve contentious issues escalated during implementation.
* Evaluate integration of SaaS / COTS packages to current environment.
* Develop operating procedures and user manuals for transition into routine operations.
* Enable building of knowledge databases for technology, solutions and domain areas.
* Work with architecture, application services, infrastructure, release management and other relevant stakeholders to enable a smooth deployment.
* Continuously work on capturing and transferring tacit knowledge to vendors and knowledge management systems.
Required Job Qualifications
* Bachelor's Degree and 2 years of Information Technology experience, OR Technical Certification and/or college courses and 4 years of Information Technology experience, OR 6 years of Information Technology experience.
* Knowledge of application configuration
* Knowledge of Big Data Analytics - HAVeN / HADOOP/ Autonomy / Vertica
* Knowledge of Cloud computing / SaaS / IaaS / PaaS technologies
* Knowledge of Integration Technologies - Tibco/Informatica/CAS
* Knowledge of Mobile technologies - Phonegap / iOS / Android / Java / HTML5
* Knowledge of Scripting Languages - Bash / Perl / Python / Ruby
* Service Oriented Architecture - SOAP / REST
* Web Technologies - HTML/CSS/Java/ASP.Net/PHP/Ruby/C#
* Rapid prototyping
* Requirements definition & management
* SDLC Methodology - Agile / Scrum / Iterative Development
* System Performance management
* Conceptual thinking
* Creative thinking
* Problem solving and analytical thinking
* Strong oral/written communication skills
Preferred Job Qualifications:
* Bachelor's Degree in Computer Science or an Information Technology-related field
* Ability to execute
Developer, Data Integration
Posted 4 days ago
Job Description
Education Required: Bachelor's degree in computer science or information management
Experience Required: 2 years of work experience as a data management analyst, data analyst or related
Special Skills Required: Must have work experience in each of the following: 1) Working with the Blackbaud SDK, using XML specifications, SSIS packaging solutions, version control tools, the VB .NET programming language, MS SQL, MS Visual Studio, and other PC-based applications to build and document data integration processes; 2) Using the Blackbaud CRM ad hoc query tool and API tools to create automated export processes for data syncs with external data vendors; and 3) Performing analytical programming using SQL, writing audit reports using SSRS, and working with large data sets.
Benefits: Your comprehensive benefits package includes medical, vision, and dental coverage, company paid life insurance, generous vacation time including 24 days annually, and more. Generous pension benefits are also included. We invest in your present and future wellbeing, providing the support you need to succeed.
Salary: $74,275