4,454 Analytics Engineer jobs in the United States
Analytics Engineer
Posted today
Job Viewed
Job Description
About us: Duet empowers Nurse Practitioners (NP) to tackle the primary care crisis by leading their own practices, closing the gap in access while keeping care local. We're a well-funded seed-stage company led by experienced entrepreneurs and Nurse Practitioners, and backed by investors like Kairos HQ and Lerer Hippeau.
We’re building a vertically integrated platform for NP-led practices to thrive as standalone businesses in a time of corporate consolidation. Think of workflows to engage patients, streamline administration, forecast business growth, and drive value-based outcomes while building community among NPs. These solutions sit on a foundation of data that we harness across patients and providers for the benefit of care and practice success.
About the role: As an Analytics Engineer at Duet, you will play a key role in shaping how we collect, transform, model, and leverage data to drive clinical and business outcomes. You will work cross-functionally with product, engineering, and clinical teams to build a modern data stack that supports everything from self-serve analytics to predictive modeling. This role blends strategic thinking, technical execution, and a strong sense of ownership — with the opportunity to build foundational systems that scale with our company.
Key Responsibilities:
- Design, implement, and optimize the end-to-end data infrastructure, including ELT pipelines, semantic layers, and analytics-ready datasets to support real-time and batch analytics.
- Develop and maintain scalable, well-documented data models in our cloud data warehouse to power reporting, analytics, and data science use cases.
- Partner with stakeholders across operations, product, and care delivery to understand data needs and translate them into robust, intuitive data products.
- Elevate our analytics capabilities by building trusted dashboards, reporting frameworks, and self-serve tools that enable data-driven decision-making across the organization.
- Lead data quality initiatives by implementing robust validation and monitoring frameworks to ensure the accuracy, consistency, and integrity of our data.
- Mentor junior team members and contribute to establishing data best practices, governance standards, and scalable analytics processes.
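For flavor, a minimal Python/pandas sketch of the kind of validation check such a data quality framework might run is shown below. This is an illustrative assumption, not Duet's actual stack or schema; the appointments table and its columns are hypothetical.

```python
# Minimal sketch of a validation-and-monitoring style check (illustrative only).
# The table and column names (appointment_id, patient_id, visit_date) are
# hypothetical, not Duet's real schema.
import pandas as pd


def validate_appointments(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality failures."""
    failures = []
    if df["patient_id"].isna().any():
        failures.append("patient_id contains nulls")
    if df.duplicated(subset=["appointment_id"]).any():
        failures.append("duplicate appointment_id values found")
    if (pd.to_datetime(df["visit_date"]) > pd.Timestamp.today()).any():
        failures.append("visit_date contains future dates")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "appointment_id": [1, 2, 2],
            "patient_id": ["a", None, "c"],
            "visit_date": ["2024-01-05", "2024-02-10", "2030-01-01"],
        }
    )
    for failure in validate_appointments(sample):
        print("DATA QUALITY FAILURE:", failure)
```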
Qualifications:
- 5+ years of experience in analytics engineering, data engineering, or related roles, preferably in high-growth startups or healthcare environments.
- Advanced proficiency in SQL and data modeling (e.g., dbt); strong understanding of data warehousing principles and dimensional modeling.
- Experience with cloud data platforms (e.g., GCP, BigQuery) and modern data stack tools (e.g., dbt).
- Proficiency with data visualization and BI tools (e.g., Looker, Metabase) and a track record of building high-impact dashboards.
- Strong communication skills with the ability to partner across technical and non-technical teams, aligning data initiatives with business goals.
- Familiarity with healthcare data (EHRs, claims, clinical KPIs) is a strong plus.
- Experience with version control, testing frameworks, and CI/CD pipelines for data is a bonus.
Ideal Candidate:
- Strategic thinker who can translate ambiguous problems into structured data solutions.
- Passionate about building durable systems that support care delivery and community-building for Nurse Practitioners.
- Thrives in fast-paced, collaborative environments with a bias toward action and iteration.
- Committed to data integrity, scalability, and empowering others through high-quality analytics.
This role is on-site 3 days a week in New York.
Analytics Engineer
Posted today
Job Viewed
Job Description
We're assisting one of our tech clients in the life sciences industry as they hire an Analytics Engineer. In this role, you’ll design and maintain core data models, pipelines, and reporting infrastructure — ensuring that the right people have access to clean, trustworthy, decision-grade data.
This is a highly cross-functional role with visibility across product, ops, growth, and leadership. You’ll help define analytics engineering practices, scale the data stack, and shape how the company makes better, faster decisions.
What you’ll do
- Build and maintain core data models that power internal analytics and client-facing insights
- Design and operate pipelines that transform raw data into analytics-ready tables
- Create dashboards and reporting tools that drive visibility across product, ops, and GTM
- Define and enforce metric consistency across teams (business KPIs, product usage definitions, etc.)
- Improve and scale the analytics stack (GCP BigQuery, Dataform, Fivetran, Postgres, etc.)
- Partner with stakeholders to understand data needs and translate them into reliable systems
- Contribute to internal data culture through documentation, education, and code standards
What we’re looking for
- 2–6 years of experience in analytics engineering, data engineering, or technical data analysis with production ownership
- Strong SQL and Python skills, with experience in modern data stacks (dbt, Fivetran, Airflow, BigQuery, Postgres, Metabase)
- A low-ego, ownership mindset — hungry to learn, comfortable with autonomy, and motivated by impact
Analytics Engineer
Posted today
Job Viewed
Job Description
About the Role
We’re hiring a hands-on Data Analyst to turn business questions into reliable datasets, clear visuals, and automated workflows. You’ll build and maintain Power BI dashboards, shape data models, and orchestrate pipelines using Microsoft Fabric. You’ll also own Make.com automation for API-based data acquisition and notifications, perform web scraping (ethically and compliantly), and write Python to automate, cleanse, and enrich data. The ideal candidate is equal parts analyst and problem-solver, comfortable moving from stakeholder requirements to production-grade dashboards and automations.
What You’ll Do
Business Intelligence & Visualization
- Build, iterate, and maintain Power BI dashboards; design robust data models (Power Query, DAX), optimize performance, and manage Row-Level Security (RLS).
- Translate stakeholder requirements into insightful visuals and clear data stories; support production refreshes and user inquiries.
Data Engineering with Power BI & Fabric
- Use Microsoft Fabric (e.g., Dataflows Gen2/Lakehouse/Pipelines) or Gen1 Power BI dataflows to build reliable ETL/ELT processes and refresh schedules.
- Query and validate data with SQL (plus KQL/PySpark where applicable for larger or log-style datasets).
Automation with Make.com & Power Automate
- Design, build, and maintain Make.com scenarios (HTTP modules, routers, iterators, data stores) to integrate third‑party REST APIs, webhooks, and internal systems; implement scheduling, logging, alerting, retries, and error handling.
- Use Power Automate for complementary internal workflow automation and notifications.
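Make.com scenarios are configured visually, but the retry and error-handling behavior described above can be sketched in code. Below is a rough Python analogue under assumed conditions; the endpoint URL is a placeholder, not a real integration.

```python
# Rough analogue of scheduling-friendly retries with logging and error handling,
# as a Make.com scenario would implement visually. The URL is a placeholder.
import logging
import time

import requests

logging.basicConfig(level=logging.INFO)


def fetch_with_retries(url: str, max_attempts: int = 4, base_delay: float = 2.0) -> dict:
    """Call a JSON endpoint, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            response = requests.get(url, timeout=30)
            if response.status_code == 429 or response.status_code >= 500:
                # Rate limiting and server-side errors are treated as retryable.
                raise requests.ConnectionError(f"retryable status {response.status_code}")
            response.raise_for_status()  # other 4xx errors propagate immediately
            return response.json()
        except (requests.ConnectionError, requests.Timeout) as exc:
            logging.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # surface the failure so alerting can pick it up
            time.sleep(base_delay * 2 ** (attempt - 1))


# Example usage (placeholder URL):
# payload = fetch_with_retries("https://api.example.com/v1/orders")
```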
API Integration & Data Acquisition
- Connect to external services via REST/Graph APIs (OAuth/API keys), handle pagination and rate limits, and ingest JSON into curated datasets for BI and analytics.
- Configure inbound/outbound webhooks between Make.com, Monday.com, and other tools to streamline processes.
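As a hedged illustration of the pagination and rate-limit handling described in this section, the sketch below assumes a hypothetical JSON API that returns an `items` array and a `next_page` cursor and accepts a bearer token; real endpoints differ in their paging conventions.

```python
# Sketch of paginated ingestion with basic rate-limit handling. The endpoint,
# paging fields, and auth scheme are assumptions for illustration.
import time

import pandas as pd
import requests


def ingest_paginated(url: str, token: str) -> pd.DataFrame:
    """Pull every page of a JSON API into a single analytics-ready DataFrame."""
    headers = {"Authorization": f"Bearer {token}"}
    rows, page = [], 1
    while page is not None:
        response = requests.get(url, headers=headers, params={"page": page}, timeout=30)
        if response.status_code == 429:  # rate limited: wait, then retry the same page
            time.sleep(int(response.headers.get("Retry-After", 5)))
            continue
        response.raise_for_status()
        payload = response.json()
        rows.extend(payload["items"])
        page = payload.get("next_page")  # None when there are no more pages
    return pd.json_normalize(rows)


# Example usage (placeholder URL and token):
# df = ingest_paginated("https://api.example.com/v1/records", token="...")
```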
Web Scraping
- Build targeted scrapers (Make.com HTTP + parsing or Python with requests/beautifulsoup4/playwright/scrapy) to collect publicly available data.
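For illustration only, a targeted scraper along the lines this bullet describes might look like the Python sketch below, using the requests and beautifulsoup4 libraries the posting names. The URL and CSS selector are placeholders, and any real scraper should respect robots.txt and site terms.

```python
# Minimal, compliance-minded scraping sketch. The URL, selector, and user agent
# are placeholders; a production scraper would also honor robots.txt and rate limits.
import requests
from bs4 import BeautifulSoup


def scrape_titles(url: str) -> list[str]:
    """Fetch a public page and return the text of elements matching a selector."""
    response = requests.get(url, timeout=30, headers={"User-Agent": "analytics-bot/0.1"})
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [tag.get_text(strip=True) for tag in soup.select("h2.title")]


# Example usage (placeholder URL):
# print(scrape_titles("https://example.com/public-listings"))
```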
Python for Analytics & Ops
- Write production‑minded Python for data cleaning, reconciliation, and transformations (e.g., pandas); package repeatable scripts, add unit tests where practical, and schedule jobs via Fabric or Make.com.
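A small example of what production-minded pandas cleaning with a unit test could look like is sketched below; the payments schema and column names are invented for illustration.

```python
# Illustrative cleaning step plus a tiny unit test. The "payments" columns
# (customer_id, amount) are hypothetical.
import pandas as pd


def clean_payments(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize types, drop exact duplicates, and remove rows missing keys."""
    cleaned = raw.drop_duplicates().dropna(subset=["customer_id"]).copy()
    cleaned["amount"] = pd.to_numeric(cleaned["amount"], errors="coerce").fillna(0.0)
    cleaned["customer_id"] = cleaned["customer_id"].str.strip().str.upper()
    return cleaned


def test_clean_payments():
    raw = pd.DataFrame({"customer_id": [" a1 ", None, " a1 "], "amount": ["10", "5", "10"]})
    out = clean_payments(raw)
    assert list(out["customer_id"]) == ["A1"]
    assert out["amount"].iloc[0] == 10.0


if __name__ == "__main__":
    test_clean_payments()
    print("clean_payments tests passed")
```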
Platform & Stakeholder Support
- Provide first‑line support for BI dashboards, dataset refreshes, and workflow issues; help administer Monday.com (permissions, simple automations, troubleshooting).
- Document data models, metrics, pipelines, and scenarios; champion data quality and governance.
What You’ll Bring
Required Qualifications
- Bachelor’s degree in Computer Science (or equivalent experience).
- 2–5 years in data analytics/BI with hands‑on Power BI (data modeling, DAX, refreshes, RLS).
- Practical experience building automations in Make.com (or similar: Power Automate/Zapier/n8n), integrating REST APIs and webhooks.
- Python skills for ETL/automation/scraping (e.g., pandas, requests, bs4, playwright/scrapy).
- Working SQL proficiency; familiarity with Microsoft Fabric concepts and ETL in Power Query.
- Strong troubleshooting, communication, and stakeholder support skills (especially for dashboards and Monday.com workflows).
Preferred Qualifications
- Experience with KQL (Kusto) or PySpark; Azure Data Factory; integrating data via REST APIs into Power BI.
- Exposure to Copilot in Fabric or Azure OpenAI to accelerate BI workflows.
- Certifications: PL‑300 (Power BI Data Analyst), DP‑600 (Fabric Analytics Engineer), AI‑900 (Azure AI Fundamentals).
- Monday.com administration experience; understanding of access controls and basic governance.
Analytics Engineer
Posted today
Job Viewed
Job Description
Loop is on a mission to streamline the movement of money between shippers, carriers, and brokers to unlock margin and increase liquidity in the supply chain.
Today, logistics payments are a big black box that unnecessarily drains thousands, if not millions, of dollars. Legacy services do not meet the market's needs. Transportation teams need a solution that unlocks insights from their complex datasets, rather than drowning them in manual workflows.
Loop combines AI expertise with deep logistics industry knowledge to deliver automated audits, streamlined resolutions, expedited payments, and, most importantly, actionable insights. All so that shippers and 3PLs can improve spend visibility, eliminate unnecessary costs, and optimize their transportation spend. Loop's customers see a 4% reduction in costs and an 80% boost in back-office productivity.
Loop is a customer-obsessed company that sees the complexity in the supply chain as an opportunity rather than a challenge. Our foundational AI capitalizes on the messiness in the system to maximize outcomes for our customers.
Investors include Founders Fund, 8VC, Susa Ventures, Flexport, and 50 industry-leading angel investors. Our team brings subject matter expertise and experience from companies like Uber, Google, Flexport, Meta, Samsara, Intuit, and Rakuten - as well as traditional logistics companies like CH Robinson.
About Role
As an Analytics Engineer at Loop, you will play a pivotal role in maturing the data organization. You'll work cross-functionally to design, build, and own the core infrastructure and data models. These systems will enable both internal and client-facing analytics, accelerating data-driven decision-making across the company and for our clients.
What you will do
- Own Core Data Models and ETL Pipelines: Design, build, and maintain Loop's core data models and ETL processes.
- Define Metrics and Governance: Establish and govern key business metrics for consistency across teams.
- Maintain Data Quality and Uptime: Ensure data accuracy and reliability with strong SLAs.
- Automate Data Requests: Convert ad-hoc data requests into scalable, repeatable pipelines.
- Optimize Data Infrastructure: Manage and optimize data infrastructure for performance, cost, and compliance.
- Develop Data Products: Create data-driven solutions and embedded analytics to support business and product teams.
- 2+ years of experience building analytical products (systems, dashboards, models).
- Strong proficiency in SQL and Python for data transformations and automation.
- Hands-on experience with modern data stack tools (e.g., Snowflake, dbt, Dagster, Airflow, Looker).
- Excellent business communication skills to translate data insights into actionable business strategies.
- Experience with data dashboarding and visualization tools (e.g., Tableau, Looker, Superset, Mode).
- Ability to manage, optimize, and scale data infrastructure.
- Previous experience setting up a data stack in a startup from scratch.
- Interest or experience in working with cutting-edge open-source data tools.
- Proven ability to convert ad-hoc data requests into scalable, automated pipelines.
- Experience working in the logistics domain.
- EPD (Engineering, Product, and Design): Become the data expert within the EPD organization, empowering data-driven decisions across product development and iterations
- Strategy and Operations: Enable product delivery teams (solutions, analysts) to deliver value to customers through actionable dashboards, metrics, and self-service reports
- AI Platform: Expedite the development of new models and training of existing models by simplifying the necessary data cleansing and preparation
- Base pay: $120k - $190k
- Premium Medical, Dental, and Vision Insurance plans
- Insurance premiums covered 100% for you
- Unlimited PTO
- Fireside chats with industry leading keynote speakers
- Off-sites in locales such as Napa and Tahoe
- Generous professional development budget to feed your curiosity
- Physical and Mental fitness subsidies for yoga, meditation, gym, or ski memberships
Analytics Engineer
Posted today
Job Viewed
Job Description
Remote if located in DC, NYC, Boston, or Chicago
Ideal Candidate: Strong background in designing, developing, and working with users in creating analytical reports. Must be well-versed with Power BI and Tableau. Must also have SQL experience.
Analytics Engineer
Posted today
Job Viewed
Job Description
We're building the all-in-one B2B post-sales support platform powered by conversational data and layered with intelligence to help our customers run their operations in real-time. We're backed by Y Combinator, General Catalyst, and a16z. Pylon is a tool that thousands of users spend most of their day in.
Currently, 780+ companies including Linear, Cognition (makers of Devin), Modal Labs, and Incident.io use us every day to run their support and customer success workflows. Also, we recently made this year's Enterprise Tech 30 List.
Our product has a very large problem space so there's a ton of stuff to build and take ownership of - we'd be excited to get your help as we're hiring several extremely talented software engineers across the stack.
We're now looking for our first Analytics Engineer to help us centralize our data and build the foundation for company-wide reporting and insights.
What you'll work on
- Own our data warehouse: build and maintain pipelines to unify product, GTM, and customer data
- Model product usage, feature adoption, and account activity so teams can self-serve answers
- Partner with Sales, Success, and Product to answer questions like the following (a toy sketch appears after this list):
- "Which enterprise customers use our AI features most?"
- "Who are great reference customers by usage and size?"
- Create dashboards and metrics that the company can operate on
- Help design how Pylon measures success across customers and product
- You've owned analytics or data modeling for a fast-moving SaaS product
- You're fluent in SQL and familiar with modern data stacks (dbt, Snowflake, Fivetran, Looker, etc.)
- You can translate ambiguous business questions into structured analysis
- You're motivated by autonomy, high ownership, and fast iteration
- You're in SF (or open to relocating) and want to help shape our data culture from the start
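To give a flavor of how a question like "Which enterprise customers use our AI features most?" might translate into analysis, here is a toy pandas sketch over a hypothetical usage-events table; none of these names reflect Pylon's actual data model or stack.

```python
# Hypothetical example: ranking enterprise accounts by AI-feature usage.
# The events DataFrame stands in for a modeled usage table; all names are invented.
import pandas as pd

events = pd.DataFrame(
    {
        "account": ["Acme", "Acme", "Globex", "Initech", "Globex"],
        "plan_tier": ["enterprise", "enterprise", "enterprise", "startup", "enterprise"],
        "feature": ["ai_summary", "ai_triage", "ai_summary", "ai_summary", "macro"],
    }
)

ai_usage = (
    events[events["plan_tier"].eq("enterprise") & events["feature"].str.startswith("ai_")]
    .groupby("account")
    .size()
    .sort_values(ascending=False)
    .rename("ai_feature_events")
)
print(ai_usage)  # Acme: 2, Globex: 1
```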
Our perks
Lunch, dinner and snacks at the office
Fully covered medical, dental, and vision insurance for employees
Retirement savings
14 company holidays + unlimited PTO
Annual offsite
Relocation and immigration support
More about Pylon
Traction: Have hundreds of paying customers and are growing fast
Funding: In addition to investment from Y Combinator, General Catalyst, and a16z, we just announced our Series B ($51M total raised)
Founders: Advith Chelikani, Robert Eng, and Marty Kausas
Team: Currently 55+ and growing!
Analytics Engineer
Posted 3 days ago
Job Viewed
Job Description
The Analytics Engineer is responsible for data consolidation and maintenance across the organization's data infrastructure and various data sources. The engineer works with Data Analysts and other business units to create data models and to build and maintain a cloud data infrastructure. Their work will support recurring and ad-hoc reporting, dashboard creation and maintenance, and data requests from internal and external stakeholders. The Analytics Engineer will also build business intelligence tools to help data analysts do their work more efficiently, such as improved or automated data clean up processes and a self-service analytics platform. They will also collaborate with the Product Development team and IT Operations team on data infrastructure and integration projects focused on transforming data and supplying data between operational systems to meet business needs. Lastly, they are responsible for using best practices to maintain version control and perform internal testing and deploy production-ready code.
Position Responsibilities
- Design, develop, implement, and maintain data models used for reporting and analytics.
- Work with Data Analysts and other business teams to identify and develop ETL/ELT processes to automate, orchestrate, and maintain data pipelines.
- Maintain data documentation and definitions, including data quality standards and business logic for metrics/KPIs, naming and formatting conventions and detailed specifications/documentation for analytics code.
- Maintain version control for, test, and continuously integrate analytics code into testing and production environments.
- Monitor data models and pipelines to ensure performance meets internal service level agreement.
- Collaborate with Data Analysts to build dashboards and visualizations that show trends, patterns, and outliers in the data pipeline and models; maintain current inventory of reports and dashboards and publish this information to business users.
- Work with Product Development and other business teams to consolidate and integrate data from and between multiple data sources (such as Salesforce, Financial Systems, HRIS, and other data sources).
- Maintain and update (as necessary) existing integration infrastructure, with a focus on streamlining and consolidating processes.
- Help evaluate and validate new and existing tools or development needs to suggest improvements and streamline operations.
- Support updates/upgrades, configuration and troubleshooting for business intelligence tools.
- Help coach Data Analysts on software development best practices.
Professional Experience
- Proven success in designing, building, and maintaining data warehouses and infrastructure that meet business needs.
- Experience with SQL.
- Experience with Python or R.
- Strong problem-solving skills in relation to data and software applications.
- Excellent interpersonal skills and verbal/written/virtual communication skills to communicate technical details to both other team members and non-technical stakeholders.
- Experience supporting and working with cross-functional teams.
- Prior experience with software development best practices around code (version control using Git, testing, debugging, continuous integration and deployment, clear comments in code).
- Experience with cloud data warehouses and platforms like Snowflake, GCP, Azure.
- Experience with ETL/ELT tools like FiveTran, Matillion, Stitch.
- Experience with DBT, Dataform, Databricks.
- Experience with data visualization tools like Tableau, PowerBI, Looker, Sigma.
- Experience with Integration Platform-as-a-Service like Workato or Mulesoft.
- Clear technical writing experience, including data mapping documents, reports, and presentations.
- Experience in a non-profit, human services setting.
- Dealing with Ambiguity: Remains productive and effective in uncertain, rapidly changing situations by quickly analyzing information to adapt approach. Demonstrates flexibility, composure and good judgment despite challenges
- Technical Learning: Quickly learns and applies new technical skills, knowledge, and industry expertise. Seeks guidance when needed to ensure quality
- Problem Solving: Uses critical thinking to creatively investigate issues from diverse perspectives. Makes evidence-based recommendations addressing short and long-term needs
- Total Work Systems: Designs, implements, and improves organization-wide systems to enhance quality and efficiency. Leverages technology and data to standardize workflows and meet stakeholder needs
- Organizational Agility: Understands how the organization operates through formal and informal structures. Navigates dynamics, communicates rationale behind policies, and builds relationships to achieve goals
- Customer Focus: Proactively understands and meets others' needs through a service-minded approach. Builds trust, leverages insights, and provides responsive support to align with evolving requirements
- Humanity: Putting people first: We are committed to meeting people where they're at, honoring their dignity, diversity, and experience.
- Community: Building a better future: Sustainable housing solutions are fostered through partnership, collaboration, and human connection.
- Ingenuity: Innovating for transformation: Systems-change requires relentless determination, thinking outside the box and challenging the status quo.
Candidates should have physical mobility for tasks such as standing, bending, stooping, kneeling, crouching, reaching, twisting, and walking on uneven surfaces. They should be capable of performing stationary tasks like sitting for up to 6 to 8 hours a day. Additionally, candidates should be able to lift, carry, push, pull light to moderate weights up to 15 pounds safely. Requires mental acuity for analytical reasoning and document interpretation.
Benefits
- Health Care Plan (Medical, Dental, & Vision)
- Retirement Plan (With 5% Match)
- Life Insurance (Basic, Voluntary and AD&D)
- Paid Time Off (Vacation, Sick & Public Holidays)
- Family Leave (Maternity, Paternity)
- Short Term & Long-Term Disability
- Training & Development
- Wellness Resources
- Hybrid Work
Location: Los Angeles, California
Base pay range: $95,000.00/yr - $105,000.00/yr
Analytics Engineer
Posted 3 days ago
Job Viewed
Job Description
HiveWatch is a tech-forward, inclusive organization fostering the evolution of the physical security industry. We are a diverse team of forward thinkers who empower each other to find creative and collaborative solutions in an industry ripe for modernization. We are passionate about the problems we're solving for our customers and equally passionate about the company we're building.
HiveWatch is here to help security teams pivot from chasing threats to preventing them. We protect organizations, people, and property through the intelligent orchestration of physical security programs. With better communication, more insights, and less noise, we are modernizing what it means for businesses and their employees to truly feel safe.
The Role
We're seeking an Analytics Engineer who will be instrumental in transforming how we leverage data across our security platform. This role uniquely combines the technical rigor of analytics engineering with hands-on analysis and visualization work. You'll own the foundational data models that power our insights, while also supporting our Data Analyst on the dashboards and reports that drive decision-making across product, engineering, and customer success teams.
This position is perfect for someone who thrives at the intersection of complex data infrastructure and practical business intelligence, particularly in handling the diverse data streams inherent in security operations, from IoT stream data to application logs and relational databases.
WHAT YOU'LL DO:
- Own and evolve the BI and semantic layer datasets that overlay our OLTP databases and service product analytics, ensuring scalability and performance
- Collaborate directly with both customers and internal stakeholders to translate requirements into optimized pipelines, dashboards, frameworks, and systems that facilitate easier development of data artifacts
- Perform extensive data munging and transformation of non-relational data sources, including IoT stream data, camera telemetry, and application logs
- Conduct deep-dive analyses on camera and device behavior to optimize product performance
- Train teams on how to derive their own insights from data, and contribute to building a strong data culture by advocating best practices and improving data literacy across the company
- 4+ years of experience in analytics engineering, data engineering, or similar role with strong technical foundations
- Advanced SQL skills for complex data transformation and performance tuning
- Proficiency in Python or similar programming language for data processing and automation
- Proficiency in git, bash, and/or linux CLI
- Experience with database fundamentals such as data modeling, indexing, normalization, denormalization, foreign key constraints, slowly changing dimensions
- Strong experience writing data quality checks and validation frameworks
- Proven ability to work with both technical and non-technical stakeholders
- Self-starter comfortable in a fast-paced environment with high autonomy
- Proficiency in creating compelling reporting and visualization solutions using modern dashboarding tools (QuickSight, Tableau, Sigma, etc.)
- Startup experience
- Has developed BI frameworks from scratch
- Hands-on experience with high-volume, real-time data streams
- Experience building ETL/ELT pipelines and orchestrating workflows
- Experience with AWS ecosystem
- Experience with Terraform
- Familiarity with modern data warehousing platforms
- Knowledge of streaming data processing and real-time analytics
- Experience with experimentation frameworks and A/B testing methodologies
- Background in device telemetry, camera systems, or edge computing data
- Experience with security operations, surveillance systems, or physical security technology
- Base salary range for this position is $140,000 - $165,000 USD per year
- Eligible to participate in HiveWatch Equity Incentive Plan
The final offer will be at the company's sole discretion and determined by multiple factors, including years and depth of relevant experience and expertise, location, and other business considerations.
BENEFITS & CULTURE
In an effort to provide for our employees, HiveWatch offers a competitive benefits package which includes:
- Health Benefits: Medical, Vision, Dental and Life Insurance
- Cutting edge solutions in an emerging field with lots of growth potential
- Generous compensation packages
- 401K
- Family friendly & compassionate work culture
- Work with good people who CARE about making the world a better place
What is CARE? HiveWatch enables its employees to CARE for themselves, and each other through unique programs crafted by HiveWatch employees themselves. To deliver on our mission, we empower our employees to consider the meaning of job security on a holistic level. At HiveWatch, you are encouraged to challenge the status quo, provide your unique point of view, and leave fear at the (access controlled) door. In practicing CARE, we:
- Celebrate our diverse workforce and all communities within HiveWatch.
- Assist the varying needs of our employees, from maintaining a work life balance to encouraging personal aspirations.
- Respect one another through our interactions and set personal boundaries.
- Embrace equity through our policies and practices of hiring, promoting, and offering benefits that take care of the whole person, not just the worker.
OUR EEO STATEMENT:
HiveWatch is an equal opportunity employer and we are committed to cultivating a work environment that supports, inspires, and respects all individuals. We execute our hiring practices so that they are merit-based and we do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity/expression, marital status, age, disability, medical condition, genetic information, national origin, ancestry, military or veteran status, or other protected characteristic.
Analytics Engineer
Posted 3 days ago
Job Viewed
Job Description
As our first Analytics Engineer, you'll be instrumental in defining and building our analytics infrastructure, ensuring data is reliable, accessible, and drives critical business decisions across the company. You'll play a key role in shaping how we use data at Perplexity, contributing directly to our mission of becoming the world's most knowledge-centric company.
Perplexity is building the next-generation answer engine, empowering our users to find information more effectively. We are headquartered in San Francisco, and on a hybrid schedule with in-office days on Monday, Wednesday, Friday.
Responsibilities
- Design and build core data models to improve analysis efficiency, enabling rapid, reliable insights for teams
- Define and champion data modeling standards and best practices using tools like dbt
- Boost data team productivity by improving tooling, automating workflows, and streamlining processes
- Own decisions on tooling selection, balancing build vs. buy and managing vendor relationships when necessary
- Partner closely with Data Scientists to ensure analytics requirements are clearly understood and effectively implemented
- Develop and maintain critical dashboards and reporting to track business health and enable better decision-making
- Lead data governance efforts, ensuring security, compliance, and quality standards are consistently met
- Have 6-8+ years of professional experience as an analytics engineer, data engineer, data scientist, or closely related role
- SQL expert
- Experience in data modeling, including dimensional modeling and analytics engineering best practices
- Experience creating high-impact dashboards and visualizations using BI tools (e.g., Omni, Mode, Hex, Looker, or similar)
- Have prior experience in fast-paced, rapidly scaling environments
- Comfortable working autonomously, taking projects from concept through execution with minimal oversight.
- Experience with dbt
- Comfortable with Python
- Experience with Snowflake administration and optimization
- Familiarity or experience with Databricks
- Previous experience as an early or first analytics engineer in a high-growth startup
The cash compensation range for this role is $00,000 - 300,000.
Final offer amounts are determined by multiple factors, including experience and expertise, and may vary from the amounts listed above.
Equity: In addition to the base salary, equity may be part of the total compensation package.
Benefits: Comprehensive health, dental, and vision insurance for you and your dependents. Includes a 401(k) plan.
Analytics Engineer
Posted 3 days ago
Job Viewed
Job Description
- Location: This role is based 3 days per week out of our San Francisco HQ and is not eligible for full-time remote work.
- Compensation: The annual cash compensation range for this position is $170,000-$25,000 in addition to equity & benefits.
About Gem
Gem is the only AI-first all-in-one recruiting platform. Over 1,000 industry leaders including Anthropic, Reddit, Figma, Zillow, Robinhood, and DoorDash trust Gem to fuel their growth and recognize Gem as one of the highest-satisfaction products on G2 with a 4.8/5.0 rating. The company has raised $148M from renowned investors including Accel, Greylock, ICONIQ, Sapphire, and Meritech.
With Gem, you can experience the power of a truly connected recruiting platform - one consistent interface, unified data, smarter AI recommendations, and simplified permissions. Our customers achieve remarkable results:
- Cut costs through consolidation: Replace your scattered recruiting tools with one AI-powered platform for ATS, CRM, sourcing, scheduling, and analytics. Reduce spend on expensive sources of talent and eliminate redundant point solutions.
- Maximize recruiter productivity: Get up to 5x efficiency gains with AI built into every workflow - from sourcing talent and reviewing applications to scheduling interviews and managing candidates. With all products working better together, recruiters spend less time switching systems and more time building relationships.
- Unlock data-driven recruiting: Get access to analytics across the recruiting funnel without complex BI tools. From pipeline metrics to hiring forecasts, Gem helps you make data-driven decisions and demonstrate impact.
- Use AI that actually works: Get better results from AI that learns from all your recruiting data. With insight into every candidate touchpoint and interaction, Gem AI makes smarter decisions about who to engage and how.
- Work with software built for recruiters: Use a platform designed around how recruiters actually work, not a collection of technical features. Quick implementation, intuitive workflows, and clear analytics make Gem the solution teams actually use and trust.
Just as we strive to help our customers find great talent, we also invest in our own people and culture. We are proud of the culture we've built and have recently been recognized as:
- Forbes America's Best Startup Employers 2024
- Great Place to Work Certified, 2024
- Fortune Best Workplaces for Millennials, 2023
The Team & Role
You will be a founding member of our analytics team, responsible for transforming raw data into products for our customers and colleagues. This includes building, maintaining, and improving our data pipelines and reports, architecting product usage event taxonomy, designing metrics to measure success, and running in-depth investigative analyses. As data at Gem becomes more complex, you will also help us shape how we scale the analytics team, identify opportunities for tech stack improvements, and lead the charge on enforcing best practices.
What You'll Do Day-to-Day
- Build and manage core data assets (infrastructure, models, reports, and automation) that ensure teams can self-serve accurate, reliable insights
- Work cross-functionally to define and track metrics that drive product adoption and revenue growth
- Conduct custom analyses to support customers and drive decisions
- Identify and execute on opportunities where data and analytics can have significant impact
- Improve our processes for maintaining high quality data and reporting
- Drive continuous improvement initiatives by analyzing workflow inefficiencies and recommending data-driven solutions that reduce manual processes and improve team productivity
- You're a critical thinker with strong business and product intuition. You stay on top of best practices, but are also unafraid to solve problems from first principles.
- You're extremely analytical. You sweat the details and like to think things through in a very methodical, structured way.
- You're comfortable communicating complex analytical concepts to others in a concise, articulate manner and enjoy collaborating cross-functionally to get things done.
- You're capable of prioritizing your work, setting clear expectations, and delivering against those expectations.
- 2.5+ years working experience as an Analytics Engineer, Data Analyst, or Data Scientist
- Strong proficiency in SQL (dbt experience preferred) and BI tools (Looker preferred) with a deep understanding of dimensional modeling and transformation
- Bachelor's degree in Economics, Finance, Statistics, Engineering, or other quantitative disciplines strongly preferred
- Experience operating in a scrappy, dynamic startup environment (companies with
- Highly competitive salary & equity
- 10-year window to exercise your stock options
- Supportive Flexible Time Off program
- 16 paid holidays, including regular company-wide wellness days
- Best-in-class medical, dental & vision insurance
- $1,200 annual stipend for learning and development opportunities
- 16 weeks of Paid Parental Leave for birthing and non-birthing parents
- New Parent Perks totaling $1,500 and flexibility upon return to work
Gem is an equal opportunity employer. We celebrate our inclusive work environment and encourage folks of all backgrounds and perspectives to apply. At Gem, we're committed to having an inclusive and transparent environment where every voice is heard and acknowledged. We embrace our differences, and know that our diverse team is a strength that drives our success.
Gem is committed to developing a barrier-free recruitment process and work environment. If you require any accommodation, please email us and we'll work with you to meet your accessibility needs.
Gem's Candidate Privacy Notice
By clicking "Submit Application", you acknowledge and agree that you have read and understand Gem's Candidate Privacy Statement, including the information provided on how Gem processes your personal data and your related rights as set forth therein.