73 Data Architect jobs in Chicago
Data Architect
Posted 2 days ago
Job Description
* Responsible for ensuring that the data management and governance frameworks are in place and adhered to
* Responsible for architecting data engineering pipelines, including data acquisition, ingestion, management, processing, and publishing, by leveraging various AWS services
* Responsible for developing and optimizing database models to store and retrieve company information
* Responsible for determining strategic data requirements, creating high-level designs according to these requirements
* Responsible for guiding on and resolving issues with AWS service configuration, performance, security compliance, data quality, and data retrieval
* Prepare Architecture and high-level design documents for the data engineering pipelines
* Responsible for designing DevOps & DataOps pipelines and resolving issues during implementation and execution
* Responsible for researching data acquisition opportunities
Skillset matrix (for each skill: candidate self-rating out of 5, candidate writeup, and work experience in years):
* Finance domain data expertise
* Ability to model data in a Data Lake and Data Warehouse for analytical use cases
* Ability to understand and negotiate solutions with business teams/business analysts
* Ability to architect and design data storage, data transformation, data security, data quality, and performance optimization techniques in AWS and Snowflake
* Ability to architect and solution data pipelines for batch, streaming, API, message queue, and real-time sources
* Ability to work with structured, semi-structured, and unstructured data
* Ability to create clear and impactful presentations based on the audience
Data Architect
Posted 6 days ago
Job Description
COGNIZANT / Client / Onsite
We have the following 3 positions to be filled immediately. Please make sure to submit quality resumes for candidates who are local or ready to relocate. Check with consultants whether they have already been submitted.
Location - Chicago, IL (Onsite Hybrid)
Rate - Competitive.
Client - Client (Confidential)
Position 2- Data Architect
- Data Architect - Informatica / IICS. Experience in designing end-to-end solutions. Additional data warehousing capabilities are good to have.
- AWS Redshift exposure
- Very good communication skills
- Experience in the Life Sciences domain; Supply Chain experience is an added advantage.
Data Architect
Posted 6 days ago
Job Description
Cuesta Partners is a technology strategy consulting firm supporting organizations of various sizes throughout their technology journey. We help firms:
- Identify, scope, prioritize, design and deliver AI and Data Solutions that create transformative value for middle and enterprise class organizations
- Consider or prepare for acquisition - often by improving the company's data posture
- Grow through new product development or re-invigoration, new team structures, or adopting new practices and tools - including the use of modern data technologies
- Guide leadership teams in developing vision, direction, and scalable strategies for their technology business - including implementation of comprehensive data programs
Cuesta Partners is looking for a Data Architect to engage with us on transformational data programs with companies looking to take their AI & data capabilities to the next level.
Key Areas of Focus:
Business data modeling
- Trade-offs between different modeling philosophies - dimensional, 3NF, Data Vault
- Conceptual vs physical modeling
- Modeling techniques such as inheritance, parent / child tables, ragged structures, slowly changing dimensions, etc. (see the slowly-changing-dimension sketch after the lists below)
- Normalization vs de-normalization trade-offs
- Detailed understanding of the design trade-offs around different modeling approaches
- Ability to lead model review sessions, lay out the design "options" and their implications, and present them in a way that both executives and technically minded people can understand
- Data as product
- Design compromises & considerations
- Know what exemplar deliverables look like
- Team composition and responsibilities / work to be done
- Streaming vs batch design patterns / considerations
- Pros / cons of data mesh delivery model vs alternatives
- Comparison of modern cloud native platforms vs legacy on-premises data solutions
- Types and most common root causes of DQ issues
- Remediation approaches
- MDM architecture styles / patterns
- Key capabilities of MDM & DQ vendors
- Data management layer
- Snowflake
- Databricks
- Microsoft Fabric
- GCP BigQuery / AWS DB options
- Data acquisition & integration
- Azure Data Factory (ADF)
- Matillion
- Fivetran & HVR
- Keboola
- Data transformation & orchestration
- ETL
- dbt
- Python / SQL
- Apache Airflow etc.
- Visualization:
- PowerBI
- Tableau
- Looker
- Domo / ThoughtSpot / Qlik / platform BI vendors (Oracle, SAP, AWS, etc.)
- Considerations / experience evaluating lift & shift vs re-model trade-offs
- Considerations / experience when consolidating decentralized silos
- Considerations / experience when moving to modern cloud stacks
- Considerations / experience when enabling unified operational & analytics data hubs
- Able to identify & evaluate most important criteria when making design decisions
- Able to look around corners - recognize likely issues before they happen
- Able to communicate complex subjects with executive leaders
- Able to solicit input & feedback to model and design decisions
- Ability to teach / leverage experience to develop team/talent around them
- Use past experiences to help with change management and ease concerns over shifts in approach
- Design the business data model based on the discovered business processes and data analysis
- Translate business requirements into technical design specifications, including data streams, integrations, transformations, databases, and data warehouses.
- Develop work estimates for Data Warehouse & Data Lake deliverables
- Coach and mentor a team of a few dozen data engineers, analysts and ML Engineers on data architecture and modeling best practices
- Define the data architecture framework, standards, and principles, including modeling, metadata, security, reference data such as product codes and client categories, and master data such as clients, vendors, materials, and employees
- Define reference architecture, which is a pattern others can follow to create and improve data systems
- Define data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transition
- Collaborate and coordinate with team members, clients and external SMEs
- Bachelor's degree in a technical or quantitative field (e.g., Computer Science, Math, Economics, Statistics)
- 10+ years of work experience in the data analytics space
- Previous experience in the consulting space is a plus
- A passion for exploring and solving different kinds of problems
- A desire to learn and assimilate technical information quickly
- Hands-on experience deploying solutions in large-scale, high performing databases
- Expertise aligned to technologies listed in Key Areas of Focus
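Purely as an illustration of one of the modeling techniques named under Key Areas of Focus (slowly changing dimensions), the minimal Python sketch below shows a Type 2 update: the current dimension row is closed out and a new versioned row is appended. All table, column, and value names are hypothetical.

```python
# Minimal, hypothetical sketch of a Type 2 slowly changing dimension update:
# when a tracked attribute changes, the current row is closed out and a new
# versioned row is appended. Table, column, and value names are illustrative.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional


@dataclass
class CustomerDimRow:
    customer_id: int               # business key
    segment: str                   # tracked attribute
    effective_from: date
    effective_to: Optional[date]   # None while the row is current
    is_current: bool


def apply_scd2_update(
    dim: List[CustomerDimRow], customer_id: int, new_segment: str, as_of: date
) -> None:
    """Close the customer's current row if its segment changed, then append
    a new current row carrying the new attribute value."""
    for row in dim:
        if row.customer_id == customer_id and row.is_current:
            if row.segment == new_segment:
                return  # no change, nothing to do
            row.effective_to = as_of
            row.is_current = False
            break
    dim.append(
        CustomerDimRow(
            customer_id=customer_id,
            segment=new_segment,
            effective_from=as_of,
            effective_to=None,
            is_current=True,
        )
    )


# Usage: customer 42 moves from "Retail" to "Institutional" on 2024-06-01.
dim_customer = [CustomerDimRow(42, "Retail", date(2023, 1, 1), None, True)]
apply_scd2_update(dim_customer, 42, "Institutional", date(2024, 6, 1))
```

Type 2 preserves full history at the cost of extra rows; the usual alternative, Type 1, simply overwrites the attribute in place when history is not needed.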
At Cuesta, we value entrepreneurship, humility, diversity, learning, speed, and balance. We provide our team members with:
- Constant opportunities for exposure & learning
- Flexible working location, enabling personal-life harmony with work
- Agency and influence in the company's total strategy and direction
- Collaboration with a high-performing team
- Competitive base salary (outlined in this listing) and target bonus of 20-25%
- 401k, healthcare benefits, paid time-off, and more!
Data Architect
Posted 6 days ago
Job Description
Data Architect
Location: Chicago - hybrid, onsite 3x a week (local candidates only)
Interview Mode: Virtual - 2 rounds, 1 hour each (1st round technical; 2nd round questions based on previous project details)
Type: Contract
Key Skills:
- Ability to design and develop (must be hands-on).
- Python (expert level): ability to create your own scripts for dependency injection into Airflow (scheduling, workflows); a minimal illustrative sketch follows below.
- Airflow: strong familiarity.
- Snowflake: primary database.
This role will operate as an individual contributor.
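As a purely illustrative sketch of the Python/Airflow skill described above, the example below builds a small DAG through a factory function so that the Snowflake connection name, query, and schedule are injected as parameters rather than hard-coded. The DAG id, connection id, and query are hypothetical, and it assumes Airflow 2.4+ for the `schedule` argument.

```python
# Illustrative only: a tiny Airflow DAG where the Snowflake connection name,
# query, and schedule are injected as parameters instead of being hard-coded.
# All names (dag_id, conn_id, query) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_snowflake_query(conn_id: str, query: str) -> None:
    """Placeholder task body: a real DAG would open the named Snowflake
    connection (e.g., via a provider hook) and execute the injected query."""
    print(f"Would run against {conn_id}: {query}")


def build_dag(dag_id: str, conn_id: str, query: str, schedule: str) -> DAG:
    """Factory that injects the connection, query, and schedule into a DAG."""
    with DAG(
        dag_id=dag_id,
        start_date=datetime(2024, 1, 1),
        schedule=schedule,
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="run_query",
            python_callable=run_snowflake_query,
            op_kwargs={"conn_id": conn_id, "query": query},
        )
    return dag


# Hypothetical usage: one DAG per nightly Snowflake load.
nightly_load = build_dag(
    dag_id="nightly_positions_load",
    conn_id="snowflake_default",
    query="SELECT 1",
    schedule="0 2 * * *",
)
```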
Description:
- The ideal contractor will be responsible for designing, developing, testing, and deploying software solutions for Hedge Fund Services.
- Propose new designs and modify existing ones to continuously improve performance, functionality, and stability of the system.
- Partner with business leaders and business unit partners to define priorities and deliver custom solutions to solve business problems or address business needs.
- Must be competent to work at the highest technical level across all phases of system design and implementation.
- Provide comprehensive consultation to Business Unit and IT management and staff at the highest technical level on all phases of the project development cycle.
- Act as the principal designer for major systems and their subsystems, utilizing a thorough understanding of available technology, tools, and existing designs.
- Design and develop high-performance programming language components used by trading applications.
- Provide technical expertise to support and enhance core-trading applications.
- Provides leadership and guidance to staff, fostering an environment that encourages employee participation, teamwork, and communication.
- Seasoned multi-disciplinary expert with extensive technical and / or business knowledge and functional expertise. Works at the highest technical level of all phases of system design and implementation.
- Focus of role is on execution of strategic direction of business function activities
- Carries out complex initiatives involving multiple disciplines and/or ambiguous issues
- Displays a balanced, cross-functional perspective, liaising with the business to improve efficiency, effectiveness, and productivity
Experience Level:
- Senior
Qualifications:
- A BS degree in Computer Science, Mathematics, or related Computer Engineering or Science curriculum is required.
- Strong programming skills in Snowflake, Python, Airflow, dbt, and Linux
- Strong server-side programming experience with automation and backend support.
- Experience with Snowflake.
- Experience with agile project methodology and collaboration.
- Excellent communication skills, analytical ability, strong judgment and management skills, and the ability to work effectively with client and IT management and staff required.
- Strong skills in working with open-source technologies, database technologies, microservice architecture, cloud-native development, continuous build, continuous integration, and continuous deployment.
- Ability to work effectively with end users to define requirements.
- Leadership and organizational skills are required to determine the Business Unit's goals, resources needed, and to assess and develop the skills of staff.
- Experience designing and building cloud-native applications using microservices architecture.
- Hands-on experience with Kafka and its use in developing an event-driven architecture model
- Experience in Domain Driven Design.
- Experience with continuous integration and collaboration tools like JIRA, Bitbucket, GitHub, and Confluence.
- Experience with building Data pipelines to Snowflake.
Specific Technical Responsibilities:
- Overall (applies to all technology platforms listed below):
- Provide production support for several data analytics solutions used every day
- Ability to perform as a technical lead in addition to being a contributing developer
- Code review of other team members' work
- Create and enhance data architecture models
- Ability to troubleshoot and identify root causes for a variety of production and data issues
Snowflake:
- Data transformation (ETL)
- Write Snowflake SQL, including stored procedures and complex queries involving CTEs and temp tables (an illustrative sketch follows this list)
- Help design data models for new data to be ingested
- Snowflake SQL performance tuning
- Help complete the migration of the existing SQL Server-based Data Vault into Snowflake
- Continue to support and work on future enhancements for Snowflake Data Vault
- Data ingestion (familiarity with Python and Kafka connectors is a nice-to-have but not strictly required)
- Python/Linux
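As an illustration of the Snowflake responsibilities above, here is a minimal sketch, assuming the snowflake-connector-python package, that stages rows into a temporary table and then queries it with a CTE. Account, credential, table, and column names are placeholders, not details from this posting.

```python
# Illustrative sketch only: run a CTE-based query against a Snowflake temp table
# using the snowflake-connector-python package. All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
    database="HFS_DB",         # placeholder
    schema="STAGING",          # placeholder
)

try:
    cur = conn.cursor()
    # Stage recent raw trades into a session-scoped temporary table.
    cur.execute(
        """
        CREATE TEMPORARY TABLE tmp_trades AS
        SELECT trade_id, account_id, trade_date, notional
        FROM raw_trades
        WHERE trade_date >= DATEADD(day, -7, CURRENT_DATE)
        """
    )
    # CTE that aggregates the temp table, then joins back for detail rows.
    cur.execute(
        """
        WITH daily_totals AS (
            SELECT account_id, trade_date, SUM(notional) AS total_notional
            FROM tmp_trades
            GROUP BY account_id, trade_date
        )
        SELECT t.trade_id, t.account_id, t.notional, d.total_notional
        FROM tmp_trades t
        JOIN daily_totals d
          ON t.account_id = d.account_id AND t.trade_date = d.trade_date
        """
    )
    for row in cur.fetchmany(10):
        print(row)
finally:
    conn.close()
```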
Nice to Have:
- An MS degree is preferred. Experience with multi-threaded application design and development, including testing and deployment phases
Data Architect
Posted 6 days ago
Job Description
Remote
Rate: $90/hr (C2C)
***MUST HAVE MANUFACTURING EXPERIENCE; AUTOMOTIVE HIGHLY PREFERRED.
Join an innovative R&D group focused on enhancing the efficiency and automation of their data pipeline, including data collection, transformation, and system architecture. Work with advanced technologies such as smart displays on off-highway equipment, telematics, edge devices, and cloud computing.
Responsibilities
- Design and architect scalable and reliable data pipelines to automate data processing workflows.
- Develop and test prototype code to prove out designs.
- Collaborate with data scientists, engineers, and stakeholders to understand data needs and requirements.
- Develop and implement ETL processes to transform and load data from various sources into our data warehouse (a small illustrative sketch appears after this posting's lists).
- Optimize data pipelines for performance, scalability, and efficiency.
- Create data architectural designs and develop related best practices.
- Transform business requirements into conceptual, logical, and physical data models.
- Promote and support a data-driven culture of quality, collaboration, rapid delivery, and business impact.
- Collaborate with Product Managers, Delivery leads, System Engineers, Data Analysts, Data Stewards, Business Stakeholders, and Software Development teams during the solution design process.
- Author, refine, and collaborate on the improvement of corporate data architecture standards, procedures, and metrics.
- Communicate data architecture-related concepts to both business and development teams.
- Integrate Data Architecture with the existing software delivery deployment process.
- 7+ years of experience with Data Modeling, Data Warehousing, and working with large-scale datasets.
- Experience working with Business Architects, System Architects, Engineering, Data Architects, and/or Data Stewards to capture business requirements in a Logical Data Model.
- Expertise in data architecture best practices, relational data modeling, and data warehouse/data mart concepts.
- Experience in integrating Data Architecture with CI/CD processes.
- Comprehensive understanding of data lineage, data mapping, data security/confidentiality/privacy, and related data standards.
- Deep knowledge of entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Working knowledge of Relational Database Management Systems (RDBMS) and SQL.
- Strong collaboration skills with experience driving conversations and influencing complex data design challenges.
- Ability to make decisions with limited knowledge while balancing best practices, schedule, and business needs.
- Strong communication skills to model and convey complex concepts both in writing and verbally.
- Ability to work with limited supervision, break down complex data architectural requirements, prioritize work, and execute.
- BS/MS in Computer Engineering or Computer Science.
- Proficiency in programming languages such as Python, Golang, or others.
- Experience with ETL tools and frameworks (Apache tools, AWS Lambda, etc.).
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services and deployment strategies.
- Knowledge of data modeling, database design, and data warehousing concepts.
- Experience with container orchestration tools (Docker, Kubernetes).
- Ability to create supporting documentation such as design documents, architecture diagrams, test procedures, and reports.
- Good oral and written communication skills with the ability to professionally support periodic communication to management and technical teams.
- Experience with Agile Scrum development methodologies and common workflow tools: Git, Jira, etc.
- Ability to code (hands-on architect).
- Experience with ER/Studio.
- Experience in the off-highway heavy machinery, automotive, or industrial control industry.
- Experience designing software for a distributed ECU system using CAN or Ethernet communication.
- Understanding of machine learning and AI concepts.
- Experience developing software in additional programming languages.
- 5+ years of experience leveraging metadata within repositories.
- Working experience in AWS or exposure to other cloud technologies.
- Working knowledge of or exposure to Data Engineering tools and technologies (e.g., Databricks, Data Lake concepts).
- Working knowledge of AWS services such as Lambda, RDS, ECS, DynamoDB, API Gateway, S3, etc.
- Experience with REST API development or providing data architectural support during API development.
- Experience implementing and/or leveraging Data Governance and Stewardship program capabilities.
- Passionate, creative, and eager to learn new complex technical areas.
- Accountable, curious, and collaborative with a focus on quality.
- Skilled in interpersonal communications and ability to communicate complex topics to non-technical audiences.
- Experience working in an agile team environment.
- Ability to function in a fast-paced, collaborative team environment distributed across time zones and locations.
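To illustrate the ETL work referenced above (which mentions frameworks such as AWS Lambda), here is a hedged sketch of a tiny transform step written as a Lambda handler: it reads a JSON batch of telematics records from S3, normalizes a few fields, and writes the curated batch back out. Bucket names, keys, and field names are illustrative assumptions, not details from the posting.

```python
# Hypothetical sketch of a small ETL step as an AWS Lambda handler:
# read a JSON batch of telematics records from S3, normalize a few fields,
# and write the transformed batch to a "curated" prefix. All names are
# placeholders.
import json

import boto3

s3 = boto3.client("s3")

CURATED_BUCKET = "example-curated-bucket"  # placeholder


def handler(event, context):
    # Triggered by an S3 put event; process each new object in the batch.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(raw)

        # Transform: keep only the fields downstream models expect,
        # and convert engine time from seconds to hours.
        curated = [
            {
                "machine_id": r["machine_id"],
                "timestamp": r["timestamp"],
                "engine_hours": round(r["engine_seconds"] / 3600.0, 2),
            }
            for r in rows
        ]

        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"curated/{key}",
            Body=json.dumps(curated).encode("utf-8"),
        )

    return {"processed_objects": len(event.get("Records", []))}
```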
Work in a dynamic R&D environment utilizing advanced technologies such as telematics, edge devices, and cloud computing. Collaborate closely with cross-functional teams to drive innovation and efficiency in data architecture and processing workflows. The role may require working with diverse technologies and tools, including AWS, Python, Golang, Docker, and Kubernetes.
Data Architect
Posted 6 days ago
Job Description
Location: Keyence U.S. Headquarters - Itasca, IL
Total Compensation (Base + Bonus): $103,283
As a Data Architect / DBA you will join our Data Analytics & Governance team. You will support and enhance our data infrastructure by managing SQL Server databases, building reports, and ensuring data security and governance. You'll collaborate across departments to drive data-driven decision-making and maintain the integrity of critical business systems.
- Respond to data-related inquiries from business units with clarity and precision.
- Support and eventually own Keyence's incentive compensation system (training provided).
- Manage and optimize SQL Server databases for performance, availability, and security.
- Develop and enforce database access policies and security standards.
- Build and maintain reports using SQL, Excel, and BI tools.
- Monitor database usage and performance metrics.
- Champion data governance and security best practices across the organization.
- Collaborate with third-party data providers to align with Keyence's architecture.
- Contribute to ongoing data governance initiatives and process improvements.
- Tackle other exciting projects as assigned.
- Bachelor's degree in IT, Computer Science, or a related field.
- 1-2 years of hands-on experience in database administration and design.
- Strong communication skills and a collaborative, problem-solving mindset.
- Solid understanding of relational databases and normalization techniques.
- Intermediate proficiency in:
- T-SQL
- Microsoft Excel
- SQL Server Database Management
- Some experience with Tableau or Power BI, familiarity with SSRS and SSIS, and exposure to AWS and modern data warehouses like Snowflake are preferred but not required.
- Base Salary: $66,650
- Bonus Target: $36,633 annually (performance-based), paid quarterly
- Benefits: Medical, dental, vision, 401K match, ~4 weeks PTO in first full year
- Career Growth: Promote-from-within culture with base and bonus increases
- Recognized by Forbes as one of the World's Most Innovative Companies
- A global leader in factory automation and quality assurance solutions
- Operating profit of over 40% for 25 consecutive years
- A culture that invests in your success from day one
KEYENCE is an at-will, Equal Opportunity Employer.
Data Architect
Posted 6 days ago
Job Description
At the Federal Home Loan Bank of Chicago, employees come first - that's why we offer a highly competitive compensation and bonus package, and access to a comprehensive benefits program designed to meet the needs of our employees.
- Collaborative, in-office operating model
- Retirement program (401k and Pension)
- Medical, dental and vision insurance
- Lifestyle Spending Account
- Competitive PTO plan
- 11 paid holidays per year
Who we are:
Our mission at FHLBank Chicago: To partner with our members in Illinois and Wisconsin to provide them competitively priced funding, a reasonable return on their investment, and support for their community investment activities.
Simply said, we're a bank for banks and other financial institutions, focused on being a strategic partner for our members and working together to reinvest in our communities, from urban centers to rural areas. Created by Congress in 1932, FHLBank Chicago is one of 11 Federal Home Loan Banks, government sponsored in support of mortgage lending and community investment.
What it's like to work here:
At FHLBank Chicago, we bring people together. We are committed to a high performing, engaged workforce, and to supporting the communities we serve across Illinois and Wisconsin. Our Buddy Program pairs new hires with tenured employees to guide their onboarding. Our professional development and training opportunities through upskilling, mentorship programs, and tuition reimbursement allow employees to grow their career with us. Our collaborative, in-office operating model brings teams together to foster innovation, connection, and shared success. To support balance and flexibility, employees are provided with an allocation of remote days to use as needed throughout the year.
What you'll do:
The Data Architect plays a pivotal role in shaping the enterprise data landscape at the Federal Home Loan Bank of Chicago. This position will be responsible for leading the design, development, and governance of scalable, secure, and high-quality enterprise data architecture that will support decision making and strategic initiatives for all business units across the enterprise.
This role will be instrumental in shaping how data is structured, accessed, and leveraged across the organization to drive business insights and operational efficiency. The Senior Data Architect will focus on Data Lakehouse designs, advanced analytics, and forward-looking AI/ML technologies. Additionally, this role will be a key member of the Solution Architecture team, providing an overall view of data architecture across solutions for the enterprise. The ideal candidate will have a strong background in data architecture, data engineering and analytics, with a proven track record of successfully leading data implementations.
How you'll make an impact:
- Drive Strategic Data Utilization : By designing and governing enterprise-wide data architecture, the Data Architect will enable the organization to structure and access data in ways that unlock business insights and improve operational efficiency.
- Ensure Scalable and Compliant Data Infrastructure : Through the integration of modern data technologies and adherence to privacy and security standards (e.g., GDPR, CCPA, SOC 2), this role ensures the data ecosystem is both scalable and compliant.
- Empower Cross-Functional Collaboration and Innovation : By partnering with teams across engineering, analytics, compliance, and business units, the Data Architect will align data systems with strategic goals, supporting initiatives in BI, AI/ML, and centralized data platforms.
- Design and maintain enterprise-wide data architecture aligned with business goals, operational efficiency, cyber security, and data safety standards in alignment with the Bank's regulatory requirements; including data models, data lakes, data warehouses, and real-time data pipelines.
- Participate in strategic planning for the organization, driving high-level technical direction and vision.
- Lead the data transformation from traditional data architectures to a modern data lakehouse architecture.
- Provide leadership to the technical team and colleagues; help lead effort to build target team structure.
- Provide architectural oversight and validation of conceptual, logical, and physical data models developed by the data modeling team, ensuring alignment with enterprise data architecture standards and strategic objectives.
- Lead the integration of data from multiple internal and external sources, ensuring consistency, accuracy, and quality across systems through close collaboration with cross-functional IT teams.
- Collaborate with cross-functional teams including data engineering, network and system operations, identity and access, analytics, compliance, and business stakeholders to align data architecture with business goals.
- Ensure data integration strategies support enterprise standards for high availability, scalability, security, and resilience
- Continuously explore, evaluate, and recommend modern data technologies (e.g., cloud data platforms, Databricks, etc.) to enhance the enterprise data ecosystem.
- Collaborate with Data Governance and business units to understand their data needs and provide solutions.
- Communicate complex technical concepts to non-technical stakeholders in a clear and concise manner.
- Ensure compliance with data privacy and security standards (e.g., PII, SOX, SOC 2)
- Support the development and continuous enhancement of our centralized data platform and contribute to the evolution of our data lakehouse and governance frameworks.
- Provide technical leadership and guidance on data architecture best practices and standards.
- Serve as the primary point of contact for data architecture across the enterprise.
- Work closely with data scientists and analysts to ensure data availability and quality for advanced analytics use cases.
- Foster a culture of data-driven decision-making across the organization.
- Provide architectural guidance and standardization for data warehousing, BI, and AI/ML initiatives.
- Stay current with emerging technologies and trends in data architecture, analytics, and AI/ML.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10+ years of experience in data architecture, data engineering, or database design.
- Proven experience in leading data transformation initiatives.
- Deep expertise in Spark, Python, SQL, data modeling, and ETL/ELT pipeline strategy and optimization (a brief illustrative PySpark sketch follows these lists).
- Extensive experience with Data Lakehouse designs and advanced analytics.
- Hands-on experience with AI/ML technologies and their application in data architectures
- Banking industry knowledge an added plus.
- Ability to work effectively in a fast-paced, dynamic environment where priorities can change quickly.
- Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery.
- Strong understanding of financial data domains such as payments, lending, or capital markets.
- Strong understanding of data governance, security, and compliance frameworks.
- Familiarity with regulatory reporting and compliance in financial services.
- Experience with data cataloging tools and data governance frameworks.
- Knowledge of Python, Spark, or other data processing languages.
- Experience working in an Agile Scrum environment.
- Proven ability to partner with and challenge strategic implementation partner(s).
- Excellent communication and stakeholder management skills.
- Certifications in data architecture or related fields, including Certified Data Management Professional (CDMP), AWS Certified Data Analytics, Google Cloud Professional Data Engineer, or Databricks Certified Engineer.
- Experience with the Databricks platform is required.
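As a hedged illustration of the Spark and lakehouse work described in these lists, the sketch below shows a simple ELT step in PySpark: read raw loan records, apply light typing and cleanup, and write a curated table. Paths, table names, and columns are hypothetical, and the Delta format assumes Delta Lake is available on the cluster.

```python
# Hedged illustration of a simple ELT step in PySpark of the kind a lakehouse
# migration might include: read raw loan records, apply light typing/cleanup,
# and write a curated table. Paths, columns, and formats are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_loans_example").getOrCreate()

raw = spark.read.parquet("s3://example-raw-zone/loans/")  # placeholder path

curated = (
    raw
    .withColumn("origination_date", F.to_date("origination_date", "yyyy-MM-dd"))
    .withColumn("principal_amount", F.col("principal_amount").cast("decimal(18,2)"))
    .dropDuplicates(["loan_id"])
    .filter(F.col("principal_amount") > 0)
)

(
    curated.write
    .format("delta")            # assumes Delta Lake; swap for "parquet" otherwise
    .mode("overwrite")
    .save("s3://example-curated-zone/loans/")
)
```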
The perks:
At FHLBank Chicago, we believe in rewarding our high performing workforce. We offer a highly competitive compensation and bonus package, and access to a comprehensive benefits program designed to meet the needs of our employees. Our retirement program includes a 401(k) and pension plan. Our wellbeing program supports employees at work and in their personal lives: Our PTO plan provides five weeks of vacation for new employees and 11 paid holidays per year; our Lifestyle Spending Account provides an annual stipend for employees to support wellbeing activities; and our central downtown location at the Old Post Office provides easy access to public transportation and breathtaking views from our award-winning rooftop. Step into a brighter future with us.
Salary Range:
$125,825.00 - $221,275.00
The above represents the expected salary range for this job requisition. Ultimately, in determining your pay, we may also consider your experience and other job-related factors. In addition to the base salary, we offer a comprehensive benefits package.
Data Architect
Posted 6 days ago
Job Description
Overview
The Data Architect leads the strategic design, implementation, and governance of enterprise data systems. This role integrates responsibilities across data architecture, master data management (MDM), metadata management, and analytics to ensure scalable, secure, and high-quality data environments. Collaborates across business and technology teams to align data strategies with organizational goals, drive operational excellence, and foster a data-driven culture.
Responsibilities
1. Strategic Leadership & Governance
- Define and lead the enterprise-wide strategy for Master Data Management and Data Architecture.
- Develop and operate governance models balancing structure, agility, and business needs.
- Align data architecture with enterprise goals such as operational efficiency, regulatory compliance, and customer-facing analytics.
- Establish and enforce data governance frameworks, metadata management practices, and architectural standards.
- Drive accountability through robust data governance and measurement frameworks.
2. Data Architecture & Engineering
- Design and implement scalable, secure, and high-performance data architectures supporting analytics and reporting.
- Architect and manage the Enterprise Data Warehouse (EDW), including data lineage, dimensional modeling (e.g., Kimball), and metadata management.
- Develop and maintain data pipelines using Azure Data Factory, Synapse, SSIS, and other modern tools.
- Integrate external and internal data sources to support advanced analytics and decision-making.
3. Master Data Management & Stewardship
- Lead standardization of data structures, attributes, and taxonomies across member and provider domains.
- Collaborate with IT to integrate new data management technologies and APIs for a single source of truth.
- Educate business units on MDM best practices and data stewardship principles.
4. Business Intelligence & Analytics
- Lead the design and development of Power BI dashboards and reports.
- Enable self-service analytics through reusable data models and reduced IT dependency.
- Translate business requirements into technical specifications and data models.
5. Compliance & Security
- Design and implement secure data access infrastructure aligned with privacy regulations (e.g., GDPR, HIPAA).
- Monitor and enforce data quality rules and compliance metrics.
- Provide data governance scorecards and exception reports to stakeholders.
Qualifications
Education & Experience
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, Business, or related fields.
- 7-10 years of experience in data architecture, governance, and MDM.
- Proven track record in delivering enterprise data strategies and governance programs.
Technical Expertise
- Dimensional modeling, metadata management, and data warehousing principles.
- ETL development using Azure Data Factory, Synapse, SSIS.
- Advanced Power BI skills including DAX and dashboard/report development.
- Familiarity with data lake, lakehouse, and metadata-driven design.
- Experience with platforms such as SAP, Oracle, Google Cloud, NetSuite.
- Moderate to advanced SQL/NoSQL proficiency.
Governance & Stewardship
- Deep understanding of MDM, metadata, reference data, and data quality initiatives.
- Experience with tools like Collibra and governance frameworks.
- Ability to establish and drive data stewardship operating models.
Additional Leadership Requirements
- Strong leadership and mentoring capabilities.
- Excellent communication and cross-functional collaboration skills.
- Ability to influence and build consensus across executive and operational levels.
- Strategic thinking and innovation mindset.
Preferred Certifications
- Certified in Governance, Risk and Compliance (ISC²)
- Certified in the Governance of Enterprise IT (CGEIT)
We offer a comprehensive benefits package for our permanent employees. For a complete overview of our benefits package, please visit our Joint Commission Career Page.
This job description is intended to describe the general nature and level of work performed by an employee assigned to this position. The description is not an exhaustive list of all duties, responsibilities, knowledge, skills, and abilities, and working conditions associated with this position. All requirements are subject to possible modification and reasonably accommodate individuals with disabilities.
Min: USD $127,000.00/year
Max: USD $174,500.00/year
Job Locations: US-IL-Oak Brook
Job ID:
# of Openings: 1
Category: Information Technology
- Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities.
- Please view Equal Employment Opportunity Posters provided by OFCCP here.
- The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
- This Organization Participates in E-Verify. Click here for more information.
Data Architect
Posted 6 days ago
Job Description
Hourly Pay Range:
$41.64 - $64.54 - The hourly pay rate offered is determined by a candidate's expertise and years of experience, among other factors.
Position Highlights:
- Position: Data Architect
- Location: 4901 Searle Parkway, Skokie IL
- Full Time
- Hours: Monday-Friday, 8am- 4:30pm
- Hybrid schedule
- On Call (rotating)
What you will do:
- Interact with and lead discussions with stakeholders to understand and document the scope and business requirements, and translate these into the technical design to support Endeavor Health's NorthShore entity Enterprise Data Warehouse.
- Lead the translation of business needs into feasible and acceptable data-centric semantic layer designs.
- Create conceptual, logical, and physical data models using Kimball, Inmon, and other modern frameworks, applying 3NF and de-normalized dimensional design philosophies (a small illustrative star schema sketch follows this list).
- Drive and document the source to target mapping with the ETL developers.
- Design highly performant and secure structures such as staging areas, integrated data, data marts, cubes and operational data stores.
- Suggest, document, and enforce data warehousing best practices, including overall data warehouse architecture relating to the ODS and ETL.
- Drive adoption of cloud data warehouse and lakehouse solutions such as Snowflake, Databricks, BigQuery, or Azure Fabric.
- Partner with engineering teams to design ETL/ELT pipelines, data marts, and semantic layers.
- Ensure data governance, security, lineage, and compliance standards are embedded in architecture.
- Estimate and communicate the work effort to EDW management, Stakeholders and own the deliverables from inception, review and signoff of the deliverables.
- Review, improve and enhance existing models to ensure compliance with best practices around master data management and a 3-layer architecture across technology platforms.
- Act independently under general direction and may also provide technical consulting on complex projects throughout the organization.
- Suggest improvements to data quality through data quality frameworks and processes (preferred).
- Knowledge of defining Master Data Management solutions architecture, setting technical direction, and defining component architecture.
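To illustrate the dimensional-design work described above, the sketch below declares a tiny star schema (one fact table, two dimensions) with SQLAlchemy Core. Table and column names are hypothetical examples, not Endeavor Health structures.

```python
# Hypothetical sketch of a small star schema (one fact, two dimensions)
# expressed with SQLAlchemy Core; all table and column names are illustrative.
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table,
)

metadata = MetaData()

dim_patient = Table(
    "dim_patient", metadata,
    Column("patient_key", Integer, primary_key=True),   # surrogate key
    Column("patient_mrn", String(20)),                   # business key
    Column("sex", String(1)),
    Column("zip_code", String(10)),
)

dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),       # e.g. 20240601
    Column("calendar_date", Date),
    Column("fiscal_quarter", String(6)),
)

# De-normalized fact table: one row per encounter, foreign-keyed to dimensions.
fact_encounter = Table(
    "fact_encounter", metadata,
    Column("encounter_key", Integer, primary_key=True),
    Column("patient_key", Integer, ForeignKey("dim_patient.patient_key")),
    Column("admit_date_key", Integer, ForeignKey("dim_date.date_key")),
    Column("length_of_stay_days", Integer),
    Column("total_charges", Numeric(12, 2)),
)
```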
What you will need:
- Education: 4-year college degree in computer science/data processing, or equivalent work experience
- Experience :
- 6+ years supporting and developing software applications.
- 6+ years of experience as a Data Architect or Data Modeler with a data warehouse (PL/SQL or MS SQL database experience is key; DBA experience not required)
- Unique or Preferred Skills :
- Strong dimensional modeling and star schema design skills
- Ability to translate business needs into technical design.
- 2+ years of required experience with any data modeling tool (Erwin preferred)
- Familiarity with cloud-native architectures, APIs, microservices, and event-driven systems is a plus
- 1+ year of proven hands-on experience with at least one major cloud data platform:
- Snowflake, Databricks, Google BigQuery, or Azure Synapse/Fabric
- Experience with SQL, ETL/ELT frameworks, and data integration tools (Informatica, dbt, Azure Data Factory, Talend, etc.).
- Understanding of data governance, metadata management, MDM, and data security practices.
- Experience with reporting tools such as Cognos/Power BI is a plus.
- Experience with Epic Clarity and Caboodle databases is a plus
- Familiarity with Healthcare industry is a plus
- Experience and detailed understanding of iterative system implementation, programming, integration, data conversion and testing techniques.
- Strong verbal and written communication, presentation and customer service skills
- Ability to solve highly complex technical and operational problems
- Programming skills and the ability to maintain and understand complex applications
Benefits (For full time or part time positions):
- Incentive pay for select positions
- Opportunity for annual increases based on performance
- Career Pathways to Promote Professional Growth and Development
- Various Medical, Dental, Pet and Vision options
- Tuition Reimbursement
- Free Parking
- Wellness Program Savings Plan
- Health Savings Account Options
- Retirement Options with Company Match
- Paid Time Off and Holiday Pay
- Community Involvement Opportunities
Endeavor Health is a fully integrated healthcare delivery system committed to providing access to quality, vibrant, community-connected care, serving an area of more than 4.2 million residents across six northeast Illinois counties. Our more than 25,000 team members and more than 6,000 physicians aim to deliver transformative patient experiences and expert care close to home across more than 300 ambulatory locations and eight acute care hospitals - Edward (Naperville), Elmhurst, Evanston, Glenbrook (Glenview), Highland Park, Northwest Community (Arlington Heights), Skokie and Swedish (Chicago) - all recognized as Magnet hospitals for nursing excellence.
When you work for Endeavor Health, you will be part of an organization that encourages its employees to achieve career goals and maximize their professional potential.
Please explore our website to better understand how Endeavor Health delivers on its mission to "help everyone in our communities be their best".
Endeavor Health is committed to working with and providing reasonable accommodation to individuals with disabilities. Please refer to the main career page for more information.
Diversity, equity and inclusion is at the core of who we are; being there for our patients and each other with compassion, respect and empathy. We believe that our strength resides in our differences and in connecting our best to provide community-connected healthcare for all.
EOE: Race/Color/Sex/Sexual Orientation/ Gender Identity/Religion/National Origin/Disability/Vets, VEVRRA Federal Contractor.
Data Architect
Posted 6 days ago
Job Description
Functional Manager role (no direct reports at this time, but the role could evolve into management). Candidates must have experience with campaign management, pharmacy, customer insights, consumer research, or pricing and promotion. Must have extensive experience with improving data quality and data integrity on a large scale. Problem identification and resolution, working across the organization from sales, marketing, IT, etc. Consults with cross-functional partners in IT and Analytics on data ingestion and database design, identifying areas for optimization. Communicates processes and changes to both technical and non-technical stakeholders.
This person will be responsible for meeting with business partners, making recommendations, performing systems analysis, and bug fixes.