334 Data Scientist jobs in Seattle
Distinguished Software Engineer / Big Data

Posted 7 days ago
Job Description
**What you'll do.**
The People Data Platform organization within People Technology is seeking a Distinguished Software Engineer to join the team. The People Data Platform acts as the single source of truth for data related to People at Walmart. In this critical role, you will design, build, and optimize a secure and scalable data management solution which powers analytical and machine learning use cases across Walmart.
**About Team:**
The Enterprise People Technology team supports the successful deployment and adoption of new People technology across the enterprise. As a Fortune #1 company, our work impacts millions of associates globally. We strive to continuously improve people technology and products to help managers and associates so they can focus on what matters most - supporting our customers and members. People Technology is one of the major segments of Walmart Global Tech's Enterprise Business Services, which is invested in building a compact, robust organization that includes service operations and technology solutions for Finance, People, and the Associate Digital Experience.
**What you'll do:**
+ Architect and implement a robust Data Lakehouse solution that meets the needs for analytical and machine learning use cases
+ Build and enhance data ingestion, data transformation, and data storage infrastructure to handle large volumes of data with low latency and high reliability, while protecting the privacy of People data (see the illustrative sketch after this list)
+ Establish and enforce data governance policies and best practices, ensuring that data is consistent, reliable, and compliant with regulatory requirements. Work closely with data stakeholders to monitor and improve data quality across all systems
+ Lead the evaluation and adoption of new tools, frameworks, and methodologies that enhance the scalability and robustness of our data management solution
+ Provide technical leadership and mentorship to a team of engineers, fostering a culture of continuous learning, collaboration, and innovation. Guide the team in solving complex technical challenges and ensuring best practices in software and data engineering are followed
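The posting does not name a specific stack for the Lakehouse work above, so the following is only a hedged, minimal sketch of one common ingestion pattern, here using PySpark with Delta-format output; the paths, column names, and the hashing of an identifier are illustrative assumptions rather than Walmart's actual design.

```python
# Hypothetical lakehouse ingestion sketch (Spark + Delta are assumptions, not the
# posting's stated stack). Reads raw events, normalizes and masks them, and appends
# to a partitioned curated table that analytics and ML jobs could read.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("people-data-ingest").getOrCreate()

# Ingest: raw JSON events landed by an upstream system (illustrative path).
raw = spark.read.json("s3://people-data/raw/associate_events/")

# Transform: normalize types, drop obviously bad records, and hash a sensitive
# identifier as a stand-in for the "protect the privacy of People data" requirement.
curated = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("associate_id").isNotNull())
       .withColumn("associate_id", F.sha2(F.col("associate_id").cast("string"), 256))
)

# Store: append into a partitioned Delta table (requires the Delta Lake package).
(curated.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("s3://people-data/curated/associate_events/"))
```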
**What you'll bring:**
+ 10+ years of experience in software engineering and 5+ years of experience in developing large-scale data management solutions
+ Proven examples of setting technical vision for data management solutions and effectively materializing that vision by working with multiple engineering teams
+ Deep understanding of the architecture of modern data warehouse and data lake solutions
+ Strong problem-solving skills, with the ability to work in a fast-paced, collaborative environment.
+ Excellent technical leadership and communication skills, with a passion for mentoring and developing engineering talent.
**About Walmart Global Tech**
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.
**Flexible, hybrid work:**
We use a hybrid way of working that is primarily in office coupled with virtual when not onsite. Our campuses serve as a hub to enhance collaboration, bring us together for purpose and deliver on business needs. This approach helps us make quicker decisions, remove location barriers across our global team and be more flexible in our personal lives.
**Benefits:**
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include 401(k) match, stock purchase plan, paid maternity and parental leave, PTO, multiple health plans, and much more.
**Equal Opportunity Employer:**
Walmart, Inc. is an Equal Opportunity Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing diversity- unique styles, experiences, identities, ideas and opinions - while being inclusive of all people.
_The above information has been designed to indicate the general nature and level of work performed in the role. It is not designed to contain or be interpreted as a comprehensive inventory of all responsibilities and qualifications required of employees assigned to this job. The full Job Description can be made available as part of the hiring process._
At Walmart, we offer competitive pay as well as performance-based bonus awards and other great benefits for a happier mind, body, and wallet. Health benefits include medical, vision and dental coverage. Financial benefits include 401(k), stock purchase and company-paid life insurance. Paid time off benefits include PTO (including sick leave), parental leave, family care leave, bereavement, jury duty, and voting. Other benefits include short-term and long-term disability, company discounts, Military Leave Pay, adoption and surrogacy expense reimbursement, and more.
You will also receive PTO and/or PPTO that can be used for vacation, sick leave, holidays, or other purposes. The amount you receive depends on your job classification and length of employment. It will meet or exceed the requirements of paid sick leave laws, where applicable.
For information about PTO, see .
Live Better U is a Walmart-paid education benefit program for full-time and part-time associates in Walmart and Sam's Club facilities. Programs range from high school completion to bachelor's degrees, including English Language Learning and short-form certificates. Tuition, books, and fees are completely paid for by Walmart.
Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to a specific plan or program terms.
For information about benefits and eligibility, see One.Walmart.
Bellevue, Washington US-11075: The annual salary range for this position is $156,000.00-$312,000.00
Bentonville, Arkansas US-10735: The annual salary range for this position is $130,000.00-$260,000.00
Additional compensation includes annual or quarterly performance bonuses.
Additional compensation for certain positions may also include:
- Stock
**Minimum Qualifications.**
_Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications._
Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 6 years' experience in software engineering or related area.
Option 2: 8 years' experience in software engineering or related area.
**Preferred Qualifications.**
_Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications._
Master's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area.
We value candidates with a background in creating inclusive digital experiences, including knowledge of Web Content Accessibility Guidelines (WCAG) 2.2 AA standards, assistive technologies, and seamless integration of digital accessibility. The ideal candidate has knowledge of accessibility best practices and will join us as we continue to create accessible products and services following Walmart's accessibility standards and guidelines for supporting an inclusive culture.
**Primary Location.**
10500 Ne 8Th St, Bellevue, WA 98004, United States of America
Walmart, Inc. is an Equal Opportunity Employer- By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing diversity- unique styles, experiences, identities, abilities, ideas and opinions- while being inclusive of all people.
Big Data Systems Engineer (Remote)

Posted 7 days ago
Job Description
Big Data Systems Engineer (Remote)
Belong, Connect, Grow, with KBR!
KBR's National Security Solutions (NSS) team provides high-end engineering and advanced technology solutions to our customers in the intelligence and national security communities. In this position, your work will have a profound impact on the country's most critical role - protecting our national security.
KBR is seeking a Big Data Systems Engineer to join our team. The successful candidate will be part of the KBR team supporting the Test Resource Management Center's (TRMC) Big Data (BD) and Knowledge Management (KM) Team deploying BD and KM systems for DoD testing Ranges and various acquisition programs.
Responsibilities:
+ The Big Data Systems Engineer will work on the deployment and integration of a highly visible data analytic project called Cloud Hybrid Edge-to-Enterprise Evaluation Test & Analysis Suite (CHEETAS) at multiple DoD ranges and labs.
+ As a Big Data Systems Engineer, you will be a critical part of our technical team responsible for deploying CHEETAS within customer environments. You will be the frontline interface that customers will have when first experiencing CHEETAS within their DoD Range and lab environments.
+ This position will require you to work closely with system administrators and software developers to communicate, document and ultimately resolve deployment issues as they arise.
+ You will deploy CHEETAS within disparate DoD testing Ranges and acquisition programs environments (on different non-standard hardware stacks and integrated into different existing ecosystems) sometimes located within DoD vaults with no outside internet connectivity.
Work Environment:
+ Location: Remote - The candidate can either work in one of KBR's facilities or work from home, with a stable internet connection.
+ Travel Requirements: This position is anticipated to require travel of 25% with surges possible up to 50% to support end users located at various DoD Ranges and Labs across the US (including Alaska and Hawaii).
+ Working Hours: Standard
Basic Qualifications/Knowledge:
+ Must have an active TS/SCI Security Clearance to be considered for this position.
+ This position requires a bachelor's degree in a STEM field (Computer Science, Data Science, Statistics, or a related technical field) and 10 years of DoD experience. Entry-level Big Data Engineers will NOT be considered due to the breadth of knowledge necessary to be successful in the position.
+ Previous experience must include three (3) years of hands-on experience in the integration and configuration of SQL Server Big Data Clusters, CentOS, Ubuntu, RedHat, Windows Server, VMware, etc.
+ Previous experience must include five (5) years of hands-on experience in big data environments.
+ Must be adept at deploying and configuring Big Data and Knowledge Management tools in an enterprise environment.
+ Must have extensive technical expertise in the configuration and troubleshooting of big data ecosystems.
+ Must have excellent written and verbal communication skills and be comfortable assisting customers with installation and configuration of their big data infrastructure.
+ Must have strong troubleshooting skills and the ability to become a CHEETAS deployment subject matter expert.
+ Must be comfortable working with a wide range of stakeholders and functional teams at various levels of experience.
+ Excellent interpersonal skills, oral and written communication skills, and strong personal motivation are necessary to succeed within this position.
+ Experience with installation, configuration, integration with and usage of the following tools and technologies: NFS, SMB, S3, SQL Server, Windows Server, Windows 10/11, Linux (CentOS, Ubuntu, RedHat).
+ Must be prepared to learn new business processes or CHEETAS application nuances every Agile sprint release (roughly every 6 weeks) prior to deploying to customer sites.
+ Ability to problem solve, debug, and troubleshoot while under pressure and time constraints is required.
+ Ability to communicate effectively about technical topics to both experts and non-experts at both the management and technical level is required.
+ Ability to work independently and provide appropriate recommendations for optimal design, analysis, and development.
+ Excellent verbal communication skills are required, as the Integration Engineer will be in frequent contact with the project technical lead, take direction from various government leads, and frequently interact with end users to gather requirements and implement solutions while away from other team members.
+ Excellent testing, debugging and problem-solving skills are required to be successful in this position.
+ Experience designing, building, integrating with and maintaining both new and existing big data systems and solutions.
+ Ability to speak and present findings in front of large groups.
+ Ability to document and repeat procedures.
+ This position is anticipated to require travel of 25% with surges possible up to 50% to support end users located at various DoD Ranges and Labs across the United States.
Preferred Qualifications:
+ Experience working in government/defense labs and their computing restrictions.
+ Experience working with major DoD acquisition programs.
+ Knowledge of the Test and Training Enabling Architecture (TENA), the Joint Mission Environment Testing Capability (JMETC) and distributed testing and training.
+ Experience with working in distributed team environment.
+ Ability to teach and mentor engineers with a variety of skill levels and backgrounds.
+ Knowledge of DoD cybersecurity policies.
Basic Compensation:
$142,400 - $80,000 (For the Denver, CO Area Only)
$148,900 - $200,000 (For the Los Angeles, CA Area Only)
$148,900 - $200,000 (For the Washington, DC Area Only)
The offered rate will be based on the selected candidate's working location, knowledge, skills, abilities and/or experience, clearance level, contract affordability and in consideration of internal parity.
Additional Compensation:
KBR may offer bonuses, commissions, or other forms of compensation to certain job titles or levels, per internal policy or contractual designation. Additional compensation may be in the form of a sign on bonus, relocation benefits, short-term incentives, long-term incentives, or discretionary payments for exceptional performance.
Ready to Make a Difference? If you're excited about making a significant impact in the field of space defense and working on projects that matter, we encourage you to apply and join our team at KBR. Let's shape the future together. Come join the ITEA award winning TRMC BDKM team and be a part of the team responsible for revolutionizing how data analysis is performed across the entire Department of Defense!
Belong, Connect and Grow at KBR
At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward, being a People First company. That commitment is central to our team of teams philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together.
KBR is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.
Senior/Staff Big Data Storage and Computing Engineer, Recommendation Data Ecosystem
Posted 1 day ago
Job Description
Responsibilities
Our team plays a crucial role in the data ecosystem of the TikTok Recommendation System, focusing on creating offline and real-time data storage solutions for large-scale recommendation, search, and advertising businesses, serving over 1 billion users. The core goals of the team are to ensure high system reliability, uninterrupted service, and smooth data processing. We are committed to building a storage and computing infrastructure that can adapt to various data sources and meet diverse storage requirements, ultimately providing efficient, cost-effective, and user-friendly data storage and management tools for the business.
Responsibilities:
1. Architecture Design and Implementation: Design and implement offline and real-time data architectures for large-scale recommendation, search, and advertising systems based on Paimon and Flink. Ensure efficient data processing and storage to meet the strict requirements of the business for data timeliness and accuracy.
2. System Construction and Optimization: Design and implement flexible, scalable, stable, and high-performance storage systems and computing models. Use Paimon as the storage foundation and combine it with the powerful computing capabilities of Flink. Continuously optimize system performance to cope with the challenges brought by business growth.
3. Troubleshooting and Stability Assurance: Be responsible for troubleshooting production systems. For problems that occur in the Paimon-Flink architecture during operation, design and implement necessary mechanisms and tools, such as data consistency assurance and exception recovery, to ensure the overall stability of the production system.
4. Distributed System Construction: Build industry-leading distributed systems, including offline and online storage based on Paimon and batch and stream processing frameworks based on Flink, providing solid and reliable infrastructure support for massive data and large-scale business systems.
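To make the Paimon-plus-Flink architecture above concrete, here is a minimal PyFlink sketch of a streaming job that upserts into a primary-keyed Paimon table. It is not TikTok's code: the catalog name, warehouse path, schema, and the datagen stand-in source are illustrative assumptions, and a production job would read from Kafka or CDC and tune bucketing, checkpointing, and compaction.

```python
# Minimal Paimon + Flink sketch (illustrative names and schema, not production code).
from pyflink.table import EnvironmentSettings, TableEnvironment

env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
# Paimon commits data on checkpoints, so enable checkpointing for the streaming job.
env.get_config().set("execution.checkpointing.interval", "60s")

# Register a Paimon catalog backed by a (hypothetical) warehouse path.
env.execute_sql("""
    CREATE CATALOG rec_catalog WITH (
        'type' = 'paimon',
        'warehouse' = 'hdfs:///warehouse/recommendation'
    )
""")
env.execute_sql("USE CATALOG rec_catalog")

# Primary-keyed Paimon table: each new event upserts the latest values per key.
env.execute_sql("""
    CREATE TABLE IF NOT EXISTS user_item_features (
        user_id   BIGINT,
        item_id   BIGINT,
        click_cnt BIGINT,
        update_ts TIMESTAMP(3),
        PRIMARY KEY (user_id, item_id) NOT ENFORCED
    )
""")

# Stand-in streaming source; a real pipeline would use a Kafka or CDC connector.
env.execute_sql("""
    CREATE TEMPORARY TABLE events (
        user_id   BIGINT,
        item_id   BIGINT,
        click_cnt BIGINT,
        update_ts TIMESTAMP(3)
    ) WITH ('connector' = 'datagen', 'rows-per-second' = '10')
""")

# Continuous upsert from the stream into the Paimon table.
env.execute_sql("""
    INSERT INTO user_item_features
    SELECT user_id, item_id, click_cnt, update_ts FROM events
""")
```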
Qualifications
Minimum Qualifications:
- A bachelor's degree or above in computer science, software engineering, or related fields, with more than 2 years of experience in building scalable systems.
Technical Skills:
1. Paimon-Flink Technology Stack: Have a thorough understanding of Paimon and Flink, and be able to understand and use them at the source-code level. Experience in customizing or extending these two systems is preferred.
2. Data Lake Technology: Have an in-depth understanding of at least one data lake technology (such as Paimon), with practical implementation and customization experience, which should be highlighted in the resume.
3. Storage Knowledge: Be familiar with the principles of HDFS; knowledge of columnar storage formats such as Parquet and ORC is preferred.
4. Programming Languages: Be proficient in programming languages such as Java, C++, and Scala, with strong coding and problem-solving abilities.
Project Experience:
- Have experience in data warehouse modeling and be able to design efficient data models that meet complex business scenarios.
- Experience in using other big-data systems/frameworks (such as Hive, HBase, Kudu, etc.) is preferred.
Comprehensive Qualities:
- Have the courage to take on complex problems and be willing to explore problems without clear solutions.
- Be passionate about learning new technologies and be able to quickly master and apply them to practical work.
- Experience in handling large-scale data (PB-level and above) is preferred.
Job Information
(For Pay Transparency)Compensation Description (Annually)
The base salary range for this position in the selected city is $ - $ annually.
Compensation may vary outside of this range depending on a number of factors, including a candidate's qualifications, skills, competencies and experience, and location. Base pay is one part of the Total Package that is provided to compensate and recognize employees for their work, and this role may be eligible for additional discretionary bonuses/incentives, and restricted stock units.
Benefits may vary depending on the nature of employment and the country work location. Employees have day one access to medical, dental, and vision insurance, a 401(k) savings plan with company match, paid parental leave, short-term and long-term disability coverage, life insurance, wellbeing benefits, among others. Employees also receive 10 paid holidays per year, 10 paid sick days per year and 17 days of Paid Personal Time (prorated upon hire with increasing accruals by tenure).
The Company reserves the right to modify or change these benefits programs at any time, with or without notice.
For Los Angeles County (unincorporated) Candidates:
Qualified applicants with arrest or conviction records will be considered for employment in accordance with all federal, state, and local laws including the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Our company believes that criminal history may have a direct, adverse, and negative relationship with the following job duties, potentially resulting in the withdrawal of the conditional offer of employment:
1. Interacting and occasionally having unsupervised contact with internal/external clients and/or colleagues;
2. Appropriately handling and managing confidential information including proprietary and trade secret information and access to information technology systems; and
3. Exercising sound judgment.
About TikTok
TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and we also have offices in New York City, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.
Why Join Us
Inspiring creativity is at the core of TikTok's mission. Our innovative product is built to help people authentically express themselves, discover and connect - and our global, diverse teams make that possible. Together, we create value for our communities, inspire creativity and bring joy - a mission we work towards every day.
We strive to do great things with great people. We lead with curiosity, humility, and a desire to make impact in a rapidly growing tech company. Every challenge is an opportunity to learn and innovate as one team. We're resilient and embrace challenges as they come. By constantly iterating and fostering an "Always Day 1" mindset, we achieve meaningful breakthroughs for ourselves, our company, and our users. When we create and grow together, the possibilities are limitless. Join us.
Diversity & Inclusion
TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.
TikTok Accommodation
TikTok is committed to providing reasonable accommodations in our recruitment processes for candidates with disabilities, pregnancy, sincerely held religious beliefs or other reasons protected by applicable laws. If you need assistance or a reasonable accommodation, please reach out to us at
Lead Engineer, Big Data (AI / Azure Data Services / Data Governance) - REMOTE

Posted 1 day ago
Job Description
A Lead Data Engineer collaborates with the Data Quality and Governance team to ensure reliable data pipelines, data integrity, and compliance by defining data strategies, implementing Data Governance capabilities, creating self-service data assets, and integrating robust data quality and governance frameworks to support trustworthy AI solutions. Key responsibilities include designing scalable data architectures, establishing data quality standards and monitoring, managing data lineage, and ensuring adherence to regulatory requirements and privacy policies.
**KNOWLEDGE, SKILLS & ABILITIES** (Occupational knowledge and specific technical and professional skills and abilities required to perform the essential duties of this job):
**1. Data Quality & Governance Leadership**
+ Define and enforce data quality standards, validation rules, and monitoring systems tailored to healthcare data.
+ Collaborate with data stewards, compliance officers, and business stakeholders to resolve data integrity issues and ensure consistent data across systems.
+ Develop and maintain metadata management, data dictionaries, and stewardship workflows.
**2. Regulatory Compliance & Security**
+ Ensure all data engineering practices align with healthcare regulations such as HIPAA, HITECH, and other privacy laws.
+ Implement data governance policies that support secure, ethical, and compliant data usage across the organization.
**3. Data Architecture & Lifecycle Management**
+ Design scalable data architectures that support healthcare analytics and AI/ML workflows.
+ Automate data lineage tracking, governance documentation, and audit trails to support transparency and traceability.
+ Establish data lifecycle policies for retention, archiving, and disposal in accordance with regulatory and operational requirements.
**5. AI/ML Enablement**
+ Collaborate with data scientists and ML engineers to ensure data pipelines meet model training and inference requirements.
+ Architect and implement end-to-end AI pipelines using Agentic AI and GenAI frameworks.
+ Support the development of trustworthy AI by ensuring that AI-enabled Data Governance solutions are built on reliable, consented, and well-governed data.
**6. Cross-Functional Collaboration**
+ Partner with clinical informatics, compliance, IT, and analytics teams to align data engineering efforts with healthcare delivery goals and governance strategies.
**7. Lead by example**
+ Must be hands-on with data quality and Data Governance tools, Databricks, and Power BI
+ Experience in Azure Data Services (like Azure Databricks, Unity Catalog, Purview, Azure Data Factory) and Power BI (see the illustrative sketch after this list)
+ Ability to lead in solving technical issues while engaging with infrastructure and vendor support teams
+ Analyze current business practices, processes and procedures and identify opportunities for leveraging Microsoft Azure data & analytics PaaS services.
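As a concrete illustration of the hands-on Databricks / Unity Catalog work listed above, here is a minimal sketch (not Molina's implementation) that registers a governed table, attaches a declarative data-quality constraint, and grants least-privilege read access; the catalog, schema, table, and group names are hypothetical.

```python
# Hypothetical Unity Catalog governance sketch for a Databricks workspace.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Governed storage location: a Unity Catalog three-level namespace.
spark.sql("CREATE CATALOG IF NOT EXISTS healthcare_gov")
spark.sql("CREATE SCHEMA IF NOT EXISTS healthcare_gov.claims")

# Delta table with basic schema; comments become searchable metadata.
spark.sql("""
    CREATE TABLE IF NOT EXISTS healthcare_gov.claims.member_claims (
        claim_id     STRING NOT NULL COMMENT 'Unique claim identifier',
        member_id    STRING NOT NULL,
        claim_amount DECIMAL(12,2),
        service_date DATE
    )
    COMMENT 'Curated member claims used for analytics and AI/ML feature pipelines'
""")

# Declarative data-quality rule enforced on write (Delta CHECK constraint).
spark.sql("""
    ALTER TABLE healthcare_gov.claims.member_claims
    ADD CONSTRAINT nonnegative_amount CHECK (claim_amount >= 0)
""")

# Governance: grant read-only access to an analytics group, nothing broader.
spark.sql("GRANT SELECT ON TABLE healthcare_gov.claims.member_claims TO `analytics-readers`")
```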
**8. Leadership & Mentorship**
+ Lead and mentor a team of data engineers, fostering a culture of quality, compliance, and innovation.
+ Oversee project delivery and ensure alignment with enterprise data governance objectives.
**JOB FUNCTION:**
Responsible for all aspects of the architecture, design, and implementation of Data Governance in Databricks.
**REQUIRED EDUCATION:**
Bachelor's Degree
**REQUIRED EXPERIENCE:**
+ 8+ years of data management experience
+ Prior experience leading projects or teams
+ Strong experience with Data Lake, Data Warehouse, Data Validation & Certification, Data Quality, Metadata Management, and Data Governance
+ Experience in Azure Data Services (like Azure Databricks, Unity Catalog, Purview, Azure Data Factory) and Power BI, and in programming languages such as PySpark, Python, and SQL
+ Preferred: experience building stream-processing systems using solutions such as Kafka, Storm, or Spark Streaming
+ Experience implementing AI-based solutions in the Data Governance space
**PREFERRED EDUCATION:**
Master's Degree
**PREFERRED EXPERIENCE:**
Experience in the healthcare industry is preferred
To all current Molina employees: If you are interested in applying for this position, please apply through the intranet job listing.
Molina Healthcare offers a competitive benefits and compensation package. Molina Healthcare is an Equal Opportunity Employer (EOE) M/F/D/V.
Pay Range: $107,028 - $208,446 / ANNUAL
*Actual compensation may vary from posting based on geographic location, work experience, education and/or skill level.
Principal Data Scientist - Machine Learning
Posted today
Job Description
Responsibilities:
- Lead the design, development, and implementation of advanced machine learning models and algorithms to solve complex business problems.
- Conduct exploratory data analysis, feature engineering, and model selection to optimize predictive accuracy and performance.
- Develop and deploy ML models into production environments, ensuring scalability, robustness, and maintainability.
- Stay abreast of the latest research and advancements in machine learning, artificial intelligence, and related fields, and identify opportunities for application.
- Mentor and guide junior data scientists and engineers, fostering technical excellence and knowledge sharing.
- Collaborate with product managers, engineers, and business stakeholders to define project requirements and translate them into data science solutions.
- Communicate complex findings and recommendations clearly and concisely to both technical and non-technical audiences.
- Evaluate and select appropriate tools, technologies, and frameworks for data science projects.
- Contribute to the development of the company's data science strategy and long-term vision.
- Design and conduct experiments to validate hypotheses and measure the impact of ML solutions (an illustrative sketch follows the qualifications below).
Qualifications:
- Ph.D. or Master's degree in Computer Science, Statistics, Mathematics, Physics, or a related quantitative field.
- 8+ years of experience in data science, with a strong focus on machine learning model development and deployment.
- Proven track record of delivering impactful ML solutions in production environments.
- Expertise in a wide range of ML algorithms, including supervised, unsupervised, and deep learning techniques.
- Proficiency in programming languages such as Python (with libraries like TensorFlow, PyTorch, scikit-learn) and R.
- Strong experience with big data technologies (e.g., Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, GCP).
- Excellent understanding of statistical modeling, experimental design, and data mining techniques.
- Exceptional problem-solving skills and the ability to think critically and creatively.
- Outstanding communication and presentation skills, with the ability to explain complex technical concepts to diverse audiences.
- Demonstrated leadership capabilities and experience mentoring technical teams.
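As a hedged illustration of the model-selection and experiment-validation duties above (synthetic data and made-up metric values, not this employer's workflow), a short Python sketch:

```python
# Illustrative sketch: model selection via cross-validation, held-out evaluation,
# and a simple significance test for an experiment readout.
import numpy as np
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Model selection: compare candidates by cross-validated AUC before committing to one.
candidates = {
    "logreg": LogisticRegression(max_iter=1000),
    "gbdt": GradientBoostingClassifier(random_state=0),
}
for name, model in candidates.items():
    auc = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc").mean()
    print(f"{name}: cv AUC = {auc:.3f}")

# Fit the chosen model and report held-out performance.
best = candidates["gbdt"].fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, best.predict_proba(X_test)[:, 1]))

# Experiment readout: two-sample t-test on a per-user metric, control vs. treatment.
control = np.random.default_rng(1).normal(0.10, 0.05, size=2000)
treatment = np.random.default_rng(2).normal(0.11, 0.05, size=2000)
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"lift = {treatment.mean() - control.mean():.4f}, p = {p_value:.4g}")
```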
Senior Data Scientist (Machine Learning)
Posted 23 days ago
Job Description
Key Responsibilities:
- Design, develop, and implement advanced machine learning models and algorithms.
- Perform exploratory data analysis and feature engineering on large datasets.
- Evaluate and optimize model performance using rigorous statistical methods.
- Deploy machine learning models into production environments (an illustrative sketch follows the qualifications below).
- Collaborate with cross-functional teams to define business problems and translate them into data science solutions.
- Communicate complex findings and insights to technical and non-technical stakeholders.
- Stay current with the latest advancements in machine learning, AI, and data science.
- Mentor junior data scientists and contribute to the team's knowledge base.
- Develop and maintain robust data pipelines and workflows.
- Conduct A/B testing and other experiments to validate model effectiveness.
Qualifications:
- Master's or Ph.D. in Computer Science, Statistics, Mathematics, or a related quantitative field.
- 5+ years of professional experience in data science and machine learning.
- Proven expertise in developing and deploying machine learning models for various applications (e.g., prediction, classification, recommendation systems, NLP).
- Proficiency in programming languages such as Python or R and relevant libraries (e.g., scikit-learn, TensorFlow, PyTorch, Keras).
- Experience with big data technologies (e.g., Spark, Hadoop) and SQL.
- Familiarity with cloud computing platforms (AWS, Azure, GCP) and their data science services.
- Strong understanding of statistical concepts, experimental design, and data mining.
- Excellent problem-solving, analytical, and critical thinking skills.
- Effective communication and presentation skills.
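For the deployment responsibility noted above, here is a minimal, hypothetical sketch of persisting a trained model and exposing a scoring entry point; a real production deployment would add input validation, monitoring, and model versioning.

```python
# Illustrative deployment sketch: persist a trained estimator, then load it once
# in the serving process and score individual requests. Names and paths are made up.
import joblib
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Training side (normally a training pipeline or CI job): fit and persist the artifact.
X, y = make_regression(n_samples=1000, n_features=8, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
joblib.dump(model, "model-v1.joblib")

# Scoring side (normally a service or batch job): load once, predict per request.
_loaded = joblib.load("model-v1.joblib")

def predict(features: list[float]) -> float:
    """Score a single observation with the persisted model."""
    return float(_loaded.predict(np.asarray(features, dtype=float).reshape(1, -1))[0])

print(predict(list(X[0])))
```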
Remote Data Scientist - Machine Learning
Posted 24 days ago
Job Description
As a Remote Data Scientist, your responsibilities will include:
- Designing, developing, and implementing sophisticated machine learning algorithms and statistical models to address business needs, such as predictive analytics, recommendation systems, natural language processing, and computer vision.
- Performing extensive data exploration, cleaning, and feature engineering on large, complex datasets from various sources.
- Collaborating closely with product managers, engineers, and other data scientists to define project scope, establish key performance indicators (KPIs), and deliver impactful solutions.
- Conducting rigorous model evaluation, validation, and A/B testing to ensure accuracy, reliability, and scalability.
- Translating complex analytical findings and model results into clear, concise, and actionable recommendations for technical and non-technical stakeholders through compelling visualizations and presentations.
- Staying abreast of the latest advancements in machine learning, deep learning, and artificial intelligence, and exploring their potential applications within the company.
- Contributing to the development and maintenance of our data science infrastructure, tools, and best practices.
- Mentoring junior data scientists and fostering a collaborative learning environment.
- Proactively identifying opportunities to leverage data science to improve business outcomes.
Junior Data Scientist - Machine Learning Applications
Posted 11 days ago
Job Description
Graduate Data Scientist - Machine Learning Focus
Posted 16 days ago
Job Description
Advanced Data Scientist - Machine Learning Specialist
Posted 17 days ago
Job Description
Responsibilities:
- Design, develop, and implement sophisticated machine learning models for a wide range of applications, including but not limited to natural language processing, computer vision, recommendation systems, and predictive analytics.
- Conduct in-depth data analysis, feature engineering, and model validation to ensure accuracy and performance.
- Collaborate with cross-functional teams, including product managers, engineers, and business analysts, to understand requirements and translate them into data-driven solutions.
- Stay at the forefront of ML research and techniques, exploring and applying novel algorithms and methodologies.
- Develop and maintain robust data pipelines and infrastructure for training and deploying ML models.
- Optimize model performance for scalability, efficiency, and real-time application.
- Communicate complex technical findings and insights clearly and effectively to both technical and non-technical audiences.
- Contribute to the development of best practices and standards for machine learning development within the organization.
- Mentor junior data scientists and share knowledge across the team.
- Evaluate and integrate new tools and technologies to enhance the ML development workflow.
Qualifications:
- Ph.D. or Master's degree in Computer Science, Statistics, Mathematics, Physics, or a related quantitative field.
- 5+ years of experience in advanced data science and machine learning, with a strong portfolio of deployed ML models.
- Expertise in Python and relevant ML libraries (e.g., TensorFlow, PyTorch, Scikit-learn, Keras).
- Proficiency in SQL and experience with big data technologies (e.g., Spark, Hadoop).
- Deep understanding of statistical modeling, algorithm development, and data mining techniques.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and MLOps principles is highly desirable.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication and collaboration skills, with the ability to work effectively in a remote, team-oriented environment.
- Published research in top-tier ML conferences or journals is a significant plus.
- Passion for innovation and a drive to solve challenging real-world problems using data.