8,297 Senior Data Engineer jobs in the United States

Senior Data Engineer - Cloud Data Warehousing

53202 West Milwaukee, Wisconsin $130,000 Annually WhatJobs

Posted 7 days ago

Job Description

Full-time
Our client, a leading data analytics firm based in **Milwaukee, Wisconsin, US**, is seeking a Senior Data Engineer to architect and implement scalable cloud-based data warehousing solutions. This is a remote role, providing an opportunity to work from anywhere within the US. You will be responsible for designing, building, and maintaining robust data pipelines, optimizing data storage, and ensuring data quality and accessibility for analytics and business intelligence teams. The ideal candidate possesses strong expertise in cloud data platforms, ETL/ELT processes, and big data technologies. Responsibilities include:
  • Designing, developing, and implementing efficient and scalable data pipelines using cloud technologies (e.g., AWS, Azure, GCP).
  • Building and managing cloud-based data warehouses and data lakes, optimizing for performance and cost-effectiveness.
  • Developing and maintaining ETL/ELT processes to ingest, transform, and load data from various sources into the data warehouse.
  • Implementing data quality frameworks and data governance policies to ensure data accuracy, consistency, and reliability.
  • Collaborating with data scientists, analysts, and business stakeholders to understand their data requirements and provide solutions.
  • Optimizing database performance, query tuning, and data modeling techniques.
  • Developing and maintaining infrastructure as code for data platform deployment and management.
  • Mentoring junior data engineers and contributing to best practices in data engineering.
Qualifications:
  • Bachelor's degree in Computer Science, Engineering, or a related quantitative field. Master's degree preferred.
  • Minimum of 6 years of experience in data engineering, with a strong focus on cloud data warehousing and big data technologies.
  • Proficiency in SQL and experience with at least one major cloud data warehouse platform (e.g., Snowflake, Redshift, BigQuery).
  • Hands-on experience with ETL/ELT tools and data pipeline orchestration frameworks (e.g., Airflow, dbt); see the orchestration sketch below.
  • Strong programming skills in Python or Scala.
  • Familiarity with data modeling concepts and data architecture principles.
  • Experience with distributed data processing frameworks (e.g., Spark, Hadoop) is a plus.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration abilities.
This role offers a challenging and rewarding opportunity to leverage your data engineering expertise in a remote setting and drive significant impact through robust data solutions.
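
As an aside for readers unfamiliar with the stack, the Airflow and dbt tooling named in the qualifications above is commonly wired together along the lines of the minimal sketch below. The DAG name, schedule, and dbt project path are placeholders chosen for illustration, not details from the posting.

```python
# Minimal illustrative Airflow DAG that runs dbt build-and-test steps.
# All identifiers (dag_id, schedule, project path) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_daily_build",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build the warehouse models (Snowflake/Redshift/BigQuery sit behind dbt profiles).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",
    )

    # Enforce data-quality expectations after the models are built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )

    dbt_run >> dbt_test
```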

Remote Lead Data Engineer, Cloud Data Warehousing

23218 Richmond, Virginia $140,000 Annually WhatJobs

Posted 7 days ago

Job Description

Full-time
Our client is seeking an experienced and visionary Lead Data Engineer to build and scale their cloud-based data warehousing solutions. This is a fully remote position, offering an exceptional opportunity to architect and implement robust data pipelines and storage solutions that drive business insights. You will be responsible for designing, developing, and optimizing complex ETL/ELT processes, data models, and data lakes on modern cloud platforms. The ideal candidate will possess deep expertise in cloud data warehousing technologies (e.g., Snowflake, Redshift, BigQuery), distributed data processing frameworks (e.g., Spark, Hadoop), and programming languages commonly used in data engineering (e.g., Python, SQL). You will lead a team of talented data engineers, mentor junior members, and collaborate closely with data scientists, analysts, and business stakeholders to ensure data accessibility, quality, and integrity. This role demands strong architectural skills, a strategic mindset for data governance, and a passion for building performant and scalable data infrastructure. You will play a key role in defining the future of our client's data strategy and enabling data-driven decision-making across the organization.

Responsibilities:
  • Design, develop, and implement scalable cloud data warehouse solutions.
  • Architect and build robust ETL/ELT pipelines for data ingestion and transformation.
  • Develop and maintain efficient data models and schemas for analytical purposes.
  • Lead and mentor a team of data engineers, providing technical guidance and oversight.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
  • Ensure data quality, integrity, and reliability across all data platforms (see the sketch below).
  • Optimize data pipelines and warehouse performance for speed and cost-efficiency.
  • Implement data governance policies and best practices.
  • Evaluate and recommend new data technologies and tools.
  • Troubleshoot and resolve complex data-related issues.

Qualifications:
  • Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
  • 8+ years of experience in data engineering, with at least 3 years in a lead or architect role.
  • Extensive experience with cloud data warehousing platforms (Snowflake, Redshift, BigQuery).
  • Proficiency in SQL, Python, and other relevant data engineering languages.
  • Strong experience with big data technologies and distributed processing frameworks (Spark, Hadoop ecosystem).
  • Solid understanding of data modeling techniques and database design principles.
  • Experience with data pipeline orchestration tools (e.g., Airflow, Prefect).
  • Familiarity with data governance, data cataloging, and data security best practices.
  • Excellent leadership, communication, and interpersonal skills.
  • Ability to thrive in a remote, collaborative team environment.
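
To make the data-quality responsibility above concrete, checks of this kind are often written as simple assertions over a batch before it is published to the warehouse. The sketch below is illustrative only and uses pandas; the column names and thresholds are assumptions, not requirements from the posting.

```python
# Minimal illustrative batch data-quality checks with pandas.
# Column names ("order_id", "amount", "order_ts") are hypothetical.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []

    # Completeness: the primary key must be present and unique.
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")

    # Validity: amounts must be non-negative.
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")

    # Freshness: the newest record should be recent enough to publish.
    newest = pd.to_datetime(df["order_ts"], utc=True).max()
    if pd.Timestamp.now(tz="UTC") - newest > pd.Timedelta(hours=24):
        failures.append("stale data: newest record is older than 24 hours")

    return failures


if __name__ == "__main__":
    batch = pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [10.0, 25.5, 7.2],
        "order_ts": ["2024-01-01T08:00:00Z", "2024-01-02T08:00:00Z", "2024-01-02T09:30:00Z"],
    })
    print(run_quality_checks(batch) or "all checks passed")
```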

Big Data Engineer

92713 Irvine, California TP-Link North America, Inc.

Posted 4 days ago

Job Description

Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world’s top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people’s lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc, we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology.

Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.

KEY RESPONSIBILITIES:

  • Design, develop, and maintain scalable big data infrastructure and pipelines, including data ingestion, cleansing, transformation, and data warehouse modeling for large-scale datasets.

  • Design and maintain vector databases and embedding pipelines to support LLM applications, RAG (Retrieval Augmented Generation) systems, semantic search, and agentic capabilities (see the sketch after this list).

  • Collaborate with cross-functional teams to deliver reliable, actionable data solutions that support business and product decisions.

  • Implement and manage batch and streaming ETL/ELT workflows using distributed data processing frameworks such as Spark and orchestration tools.

  • Participate in data integration and ETL pipeline development, ensuring secure and efficient data processing.

  • Investigate system issues, perform troubleshooting, and assist in optimizing data processing workflows.
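
To ground the vector-database and embedding-pipeline bullet above, a retrieval step for RAG or semantic search typically embeds documents and queries with the same model and ranks by vector similarity. The sketch below is a minimal stand-in that uses the sentence-transformers library and in-memory cosine similarity in place of a real vector database; the model name and sample documents are assumptions.

```python
# Minimal illustrative embedding + semantic-search step (stand-in for a vector DB).
# Model name and documents are placeholders, not details from the posting.
import numpy as np
from sentence_transformers import SentenceTransformer

# Small general-purpose embedding model (assumed choice).
model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "How to reset a Wi-Fi router to factory settings.",
    "Steps for pairing a smart plug with the mobile app.",
    "Troubleshooting mesh network dead zones at home.",
]

# Embed the corpus once; normalized vectors make cosine similarity a dot product.
doc_vectors = model.encode(documents, normalize_embeddings=True)

def search(query: str, top_k: int = 2):
    """Return the top_k documents ranked by cosine similarity to the query."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vector
    best = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), documents[i]) for i in best]

if __name__ == "__main__":
    for score, doc in search("my mesh Wi-Fi has weak coverage upstairs"):
        print(f"{score:.3f}  {doc}")
```

In production, the in-memory arrays would be replaced by a managed vector store, and the retrieved passages would be passed to the LLM as context for the RAG answer.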

Requirements

REQUIRED QUALIFICATIONS

  • Bachelor’s degree in Computer Science, Information Systems, or related field.

  • 3-5 years of hands-on experience in data engineering or big data infrastructure, working with large scale datasets in a production environment.

  • Proficiency in Python, with experience developing scalable ETL/ELT pipelines or significant contributions to open-source Python libraries.

  • Ability to work effectively in a team-oriented environment with good communication and problem-solving skills.

PREFERRED QUALIFICATIONS

  • Experience with LLM frameworks and libraries (e.g. LangChain, LlamaIndex) is strongly preferred

Benefits

Salary Range: $100,000 - $150,000

  • Free snacks and drinks, and provided lunch on Fridays

  • Fully paid medical, dental, and vision insurance (partial coverage for dependents)

  • Contributions to 401k funds

  • Bi-annual reviews, and annual pay increases

  • Health and wellness benefits, including free gym membership

  • Quarterly team-building events

At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation, collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone. If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc.

Please, no third-party agency inquiries, and we are unable to offer visa sponsorships at this time.



Big Data Engineer

77007 Houston, Texas Robert Half

Posted today

Job Description

Description
Are you obsessed with data and thrive on turning complexity into clarity? We're looking for a Lead Data Engineer who lives and breathes Python, knows Snowflake inside out, and can architect scalable solutions in AWS. If you love solving hard problems, building elegant data platforms, and communicating your ideas with clarity and impact, this is your role.
What You'll Do
+ Design and implement cloud-native data platforms using AWS and Snowflake
+ Build robust data pipelines and services using Python and modern engineering practices
+ Architect scalable solutions for data ingestion, transformation, and analytics
+ Collaborate with analysts, scientists, and business stakeholders to translate ideas into technical reality
+ Lead cross-functional teams to deliver high-impact data products
+ Migrate legacy systems to cloud-based platforms with minimal disruption
+ Define long-term data architecture strategies aligned with business goals
+ Mentor junior engineers and champion best practices in design, testing, and deployment
+ Communicate complex technical concepts in a way that resonates with non-technical audiences
Requirements
+ 7+ years of hands-on experience in data engineering and architecture
+ Mastery of Python and SQL for building scalable data solutions
+ Deep expertise in Snowflake, including performance tuning and advanced modeling
+ Strong experience with AWS data services (e.g., S3, Glue, Lambda, Redshift, Kinesis)
+ Passion for data modeling: dimensional, data vault, and modern techniques
+ Familiarity with tools like dbt, Airflow, and Terraform
+ Experience with streaming, CDC, and real-time data integration patterns (see the sketch below)
+ Excellent communication skills: you make the complex simple and the abstract tangible
+ A genuine love for data and a drive to innovate
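
As a rough illustration of the streaming and CDC patterns referenced in the requirements above, one common shape is a Spark Structured Streaming job that reads change events from Kafka, parses them, and lands them in object storage for downstream modeling. The sketch below is illustrative only; the topic name, event schema, and S3 paths are assumptions, not systems named in the posting.

```python
# Minimal illustrative Spark Structured Streaming job: Kafka -> parsed events -> S3.
# Topic name, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders_cdc_stream").getOrCreate()

# Shape of the change events carried on the Kafka topic (assumed).
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("op", StringType()),          # e.g. insert / update / delete
    StructField("amount", StringType()),
    StructField("updated_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders_cdc")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; decode and parse the JSON payload into columns.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), event_schema).alias("e"))
    .select("e.*")
)

# Append the parsed events to object storage with checkpointed, fault-tolerant writes.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/landing/orders_cdc/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/orders_cdc/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```
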
Technology Doesn't Change the World, People Do.®
Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.
Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. Download the Robert Half app and get 1-tap apply, notifications of AI-matched jobs, and much more.
All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.
© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use.

Big Data Engineer

20080 Washington, District Of Columbia SAIC

Posted 7 days ago

Job Description

**Description**
SAIC is seeking **Big Data Engineers** to join the Machine-assisted Analytic Rapid-repository System (MARS) Advanced Development Operations (DevOps) and Sustainment Support (ADOS) program and provide on-site technical support to facilitate operations of critical MARS infrastructure and services. This effort focuses on providing a comprehensive set of System/Software Engineering and IT Services to maintain, sustain, enhance, and improve/modernize MARS. This position will be located in the National Capital Region.
**Please note that this is contingent upon contract award, with an anticipated decision expected by Winter/Spring 2026.**
The Big Data Engineer responsibilities include, but are not limited to:
+ Design scalable data architectures, lead ETL processes, and oversee the implementation of data storage solutions
+ Support the development and integration of advanced analytics tools, ensuring efficient data access and insights generation
+ Optimize system performance and scalability, implementing continuous improvement initiatives
**Qualifications**
+ Active TS/SCI with Polygraph
+ Bachelor's degree in Information Technology, Cybersecurity, Computer Science, Information Systems, Data Science, or Software Engineering and 14 years or more relevant experience (will consider an additional 4+ years of relevant experience in lieu of degree)
+ One Active Certification: CCISO, CISM, CISSP, GSLC, SSCP or GSEC
+ Expertise in designing, implementing, and managing Big Data solutions using Hadoop, Spark, and data streaming technologies
+ Proven experience optimizing data pipelines, performing large-scale data processing, and ensuring data quality
+ Strong knowledge of data warehousing concepts, ETL processes, and distributed computing environments
Target salary range: $160,001 - $00,000. The estimate displayed represents the typical salary range for this position based on experience and other factors.
REQNUMBER:
SAIC is a premier technology integrator, solving our nation's most complex modernization and systems engineering challenges across the defense, space, federal civilian, and intelligence markets. Our robust portfolio of offerings includes high-end solutions in systems engineering and integration; enterprise IT, including cloud services; cyber; software; advanced analytics and simulation; and training. We are a team of 23,000 strong driven by mission, united by purpose, and inspired by opportunity. Headquartered in Reston, Virginia, SAIC has annual revenues of approximately $6.5 billion. For more information, visit saic.com. For information on the benefits SAIC offers, see Working at SAIC. EOE AA M/F/Vet/Disability

IT Senior Data Engineer (SAP, Data Warehousing and Modeling Focus)

80631 Briggsdale, Colorado Hensel Phelps

Posted 1 day ago

Job Description

**Description**
**About Hensel Phelps:**
Founded in 1937, Hensel Phelps specializes in building development, construction and facility services in markets ranging from aviation to government, commercial, transportation, critical facilities and healthcare. Ranked #1 in aviation and #6 overall general contractor in 2024 by BD+C, Hensel Phelps is one of the largest employee-owned general contractors in the country. Driven to deliver EXCELLENCE in all we do and supported by our core values of Ownership, Integrity, Builder, Diversity and Community, Hensel Phelps brings our clients' visions to life with a comprehensive approach that begins with innovative planning and extends throughout the life of the property.
**Position Description:**
Our growing Data & Analytics team is seeking a highly skilled Senior Data Engineer with extensive experience in using Azure and SAP technologies to build, maintain, and scale large enterprise datasets. The ideal candidate will have a strong background in data modeling, integration, and migration techniques, and will be adept at handling complex data environments.
**Position Qualifications:**
+ College diploma or university degree in the field of computer science or statistics, and/or 5+ years of equivalent work experience.
+ Certifications in BI solutions are a plus.
+ Strong understanding of database structures, theories, principles, and practices including enterprise data warehouses.
+ Strong familiarity with metadata management and associated processes.
+ Excellent verbal, written and analytical skills with the ability to actively listen and effectively gather business requirements for analysis.
+ Demonstrated expertise with data architecture, enterprise architecture tools, data mining, large-scale data modeling, data mapping tools, data profiling tools, data governance and data life cycle methodologies.
+ Direct experience in implementing enterprise data management processes, procedures, and decision support.
+ Construction industry knowledge a plus.
+ Hands-on database tuning and troubleshooting experience.
+ Excellent customer service skills combined with the ability to solve problems.
+ Attention to detail and strong analytical skills.
+ Ability to prioritize issues and monitor progress.
+ Ability to work in a team environment.
+ SAP S/4 HANA
+ SAP Analytic Cloud
+ SAP Datasphere
  + SAP Master Data Governance
  + Azure Cloud
  + Power BI
  + DevOps
  + Synapse, BigQuery, Databricks
  + Data Vault
  + Python
  + SQL
  + DAX
+ Docker
  + Experience pulling data from APIs (see the sketch after the Essential Duties list).
+ Good written and oral communication skills.
+ Strong technical documentation skills.
+ Good interpersonal skills.
+ Ability to conduct research into data management issues, practices, and products as required.
+ Ability to present ideas in a user-friendly language.
+ Highly self-motivated and directed.
+ Keen attention to detail and strong organizational skills.
+ Proven analytical and problem-solving abilities.
+ Ability to effectively prioritize and execute tasks in a high-pressure environment.
+ Strong customer service orientation.
  + Experience working in a team-oriented, collaborative environment.
**Essential Duties:**
+ Assist data leadership in establishing long-term strategic goals for business intelligent platforms and toolsets.
+ Assist in the development of global maintenance schedules for BI and data warehousing systems.
+ Lead system feasibility studies, proof of concepts, pilot projects and testing.
+ Assist with data migration, data integration, and security best practices and documentation.
+ Lead the deployment, monitoring, maintenance, development, upgrade, and support of BI/EDW systems, including data architecture, data integration, high availability, security, and data privacy.
+ Analyze existing operations and make recommendations for the improvement and growth of the BI/EDW architecture.
+ Conduct research and remain current with the latest data technologies and solutions in support of future data management procurement efforts.
+ Develop, deploy, support, and optimize tools for data extraction, queries, and data manipulation in accordance with business processes utilizing the MS solution suite including Azure SQL DB, Synapse, metadata automation.
  + Data modeling, data architecting and systems architecture development in collaboration with our solutions architecture team to ensure cross-application consistency and business value is realized.
+ Ensure the reliability of data access and data quality across the organization via ongoing database/enterprise data warehouse support and maintenance.
+ Develop, implement, and maintain change control and testing processes for modifications to enterprise data warehouse.
+ Document business and security requirements, test plans, database dictionaries, etc.
+ Identify inefficiencies and gaps in current data warehouse and leverage solutions to ensure data standards.
+ Documentation via automated data discovery and data maps.
+ Implement data security best practices and perform testing/remediation.
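
To ground the "pulling data from APIs" qualification above, a typical extraction step pages through a REST endpoint and lands the results in a tabular, load-ready form for the warehouse. The sketch below is a minimal illustration; the endpoint, pagination scheme, and output file are hypothetical, not systems named in the posting.

```python
# Minimal illustrative REST API extraction: page through an endpoint,
# collect records, and write a load-ready file. Endpoint and fields are hypothetical.
import pandas as pd
import requests

BASE_URL = "https://api.example.com/v1/projects"   # placeholder endpoint


def fetch_all(page_size: int = 100) -> list:
    """Collect every record by following simple page-number pagination."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:                # an empty page signals the end
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    df = pd.DataFrame(fetch_all())
    # Land the extract as Parquet for a downstream warehouse load (e.g. into Synapse).
    df.to_parquet("projects_extract.parquet", index=False)
    print(f"extracted {len(df)} records")
```
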
**Physical Work Classification & Demands:**
+ Light Work. Exerting up to 25 pounds of force occasionally, and/or up to 10 pounds of force frequently, and/or a negligible amount of force constantly to move objects.
+ The individual in this position will periodically walk, kneel, sit, crouch, reach, stoop, read/see, speak, push, pull, lift, stand, and finger/type. The frequency of each action varies by workflow and office activity.
+ Walking - The person in this position needs to occasionally move about inside the office to access file cabinets, office machinery, boxes, cabinets, etc.
+ Constantly operates a computer and other office machinery, such as a calculator, copy machine, phone, computer, and computer printer.
+ The person in this position frequently communicates with employees and external stakeholders regarding a variety of topics related to office administration.
+ Constantly computes, analyzes, and conceptualizes mathematical calculations and formulas.
+ Constantly reads written communications and views email submissions.
+ The person in this position regularly sits in a stationary position in front of a computer screen.
+ Visual acuity and ability to operate a vehicle as certified and appropriate.
+ Rarely exposed to high and low temperatures
+ Rarely exposed to noisy environments and outdoor elements such as precipitation and wind.
**Compensation Range (Colorado only):**
$84,930 - $93,870 annual salary
**Benefits:**
Hensel Phelps provides generous benefits for our salaried employees. This position is eligible for company paid medical insurance, life insurance, accidental death & dismemberment, long-term disability, 401(K) retirement plan, health savings account (HSA) (HSA not available in Hawaii), and our employee assistance program (EAP). It also is eligible for employee paid enrollment in vision and dental insurance. Hensel Phelps also believes in the importance of taking time to recharge. As a result, salaried employees are eligible for paid time off beginning upon hire. Salaried positions (project engineers and above) participate in an annual bonus plan, subject to company and employee performance. Salaried employees (this is all salaried employees) are also eligible for a company cell phone or cell phone allowance in accordance with company policy. Further, salaried employees (project engineers and above) also receive either a vehicle or vehicle allowance in accordance with Hensel Phelps' policies. Based on position location, a cost-of-living adjustment (COLA) may also be included (subject to periodic review and adjustment).
**Any Employment Offers are Contingent Upon Successful Completion of the Following:**
+ Verification of Work Authorization and Employment Eligibility
+ Substance Abuse Screening
+ Physical Exam (if applicable)
+ Background Checks for Badging/Security Clearances (if applicable)
**Equal Opportunity and Affirmative Action Employer:**
Hensel Phelps is an equal opportunity employer. Hensel Phelps is committed to engaging in affirmative action to increase employment opportunities for protected veterans and individuals with disabilities. Hensel Phelps shall not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, age, national origin, sexual orientation, gender identity and expression, domestic partner status, pregnancy, disability, citizenship, genetic information, protected veteran status, or any other characteristic protected by federal, state, or local law.
The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information. 41 CFR 60-1.35(c)
#LI-DG1
Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities
This employer is required to notify all applicants of their rights pursuant to federal employment laws.
For further information, please review the Know Your Rights notice from the Department of Labor.

Senior Big Data Engineer

92713 Irvine, California TP-Link North America, Inc.

Posted 4 days ago

Job Viewed

Tap Again To Close

Job Description

ABOUT US:

Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world’s top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people’s lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc, we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology. 

Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle. 

KEY RESPONSIBILITIES

  • Develop and maintain the Big Data Platform by performing data cleansing, data warehouse modeling, and report development on large datasets (see the sketch after this list). Collaborate with cross-functional teams to provide actionable insights for decision-making.

  • Manage the operation and administration of the Big Data Platform, including system deployment, task scheduling, proactive monitoring, and alerting to ensure stability and security.

  • Handle data collection and integration tasks, including ETL development, data de-identification, and managing data security.

  • Provide support for other departments by processing data, writing queries, developing solutions, performing statistical analysis, and generating reports.

  • Troubleshoot and resolve critical issues, conduct fault diagnosis, and optimize system performance.
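
For a concrete sense of the data-cleansing and warehouse-modeling work in the first bullet above, batch jobs of this kind typically read raw events, deduplicate and conform them, aggregate to the reporting grain, and publish a partitioned table. The sketch below is illustrative; the paths, column names, and partitioning scheme are assumptions, not details from the posting.

```python
# Minimal illustrative PySpark batch job: clean raw events and publish a
# partitioned daily aggregate. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_device_metrics").getOrCreate()

# Read the raw landing zone (assumed layout and columns).
raw = spark.read.parquet("s3a://example-bucket/landing/device_events/")

# Cleansing: drop malformed rows and duplicates, derive a partition date.
clean = (
    raw.dropna(subset=["device_id", "event_ts"])
    .dropDuplicates(["device_id", "event_ts"])
    .withColumn("event_date", F.to_date("event_ts"))
)

# Model to the reporting grain: one row per device per day.
daily = clean.groupBy("event_date", "device_id").agg(
    F.count("*").alias("event_count"),
    F.countDistinct("session_id").alias("sessions"),
)

# Publish as a date-partitioned table for OLAP and report consumption.
(
    daily.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-bucket/warehouse/daily_device_metrics/")
)
```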

Requirements

REQUIRED QUALIFICATIONS

  • Bachelor’s degree or higher in Computer Science or a related field, with at least three years of experience maintaining a Big Data platform.

  • Strong understanding of Big Data technologies such as Hadoop, Flink, Spark, Hive, HBase, and Airflow, with proven expertise in Big Data development and performance optimization.

  • Familiarity with Big Data OLAP tools like Kylin, Impala, and ClickHouse, as well as experience in data warehouse design, data modeling, and report generation.

  • Proficiency in Linux development environments and Python programming.

  • Excellent communication, collaboration, and teamwork skills, with a proactive attitude and a strong sense of responsibility.

PREFERRED QUALIFICATIONS

  • Experience with cloud-based deployments, particularly AWS EMR, with familiarity in other cloud platforms being a plus.

  • Proficiency in additional languages such as Java or Scala is a plus.

Benefits

Salary Range: $150,000 - $180,000

  • Free snacks and drinks, and provided lunch on Fridays

  • Fully paid medical, dental, and vision insurance (partial coverage for dependents)

  • Contributions to 401k funds

  • Bi-annual reviews, and annual pay increases

  • Health and wellness benefits, including free gym membership

  • Quarterly team-building events

At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation, collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone. If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc.

Please, no third-party agency inquiries, and we are unable to offer visa sponsorships at this time.



Hadoop Big Data Engineer

07308 Jersey City, New Jersey Insight Global

Posted 6 days ago

Job Description

Job Description
For $68.93/hr, we are seeking a skilled Big Data Engineer to support a leading financial institution in building and optimizing large-scale data processing systems. This role involves working with Hadoop and Spark to ingest, transform, and analyze high-volume market and trading data. The engineer will contribute to the development and maintenance of distributed data pipelines, ensuring performance, scalability, and reliability across the analytics infrastructure. You will work with large datasets and a data lake architecture to help build a proprietary analytics platform. The ideal candidate will have a strong understanding of big data frameworks and a passion for enabling data-driven decision-making in a fast-paced financial environment.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.

Requirements
- 7+ years of experience in a Data Engineer role utilizing Python scripting
- Experience with Hadoop and Spark
- Background in financial services or analytics platforms
- Familiarity with tools like Apache Hudi, Hive, and Airflow is a plus