11,464 Senior Analytics Engineer jobs in the United States
Data Analytics Engineer
Posted today
Job Description
Job Title: Data Analytics Engineer
Location: Dallas, TX
Employment Type: Full-Time | Hybrid (Onsite Monday–Wednesday, Remote Thursday–Friday after training)
Compensation: $80,000 annually + 10% bonus
Note: C2C and/or sponsorship is not available for this role.
Position Overview
Wheeler Staffing Partners is seeking a Data Analytics Engineer for a client located in Dallas, TX. This role will begin 100% onsite during the initial training period (estimated 2–3 months) and will transition to a hybrid schedule thereafter (in office Monday–Wednesday, remote Thursday–Friday). The Data Analytics Engineer will play a critical role in developing accurate and reliable reporting solutions to support supply chain, procurement, sales, and operations teams.
Key Responsibilities
- Model and transform data to create clean, accurate, and actionable datasets.
- Integrate and manage multiple data sources for reporting and analysis.
- Conduct cause-and-effect analyses to support key business decisions.
- Develop and maintain BI reporting solutions using tools such as WebFOCUS or Vision.
- Forecast inbound distribution center requirements and monitor inventory levels.
- Improve forecasting and demand planning processes across the supply chain.
- Create ad-hoc reports to measure forecast accuracy and track key performance metrics.
- Draft technical requirements, specifications, and reporting documentation.
- Train team members on reporting tools, processes, and best practices.
- Collaborate with cross-functional teams on strategic projects and initiatives.
- Stay current with BI technologies and recommend enhancements as needed.
Qualifications
- 3–5 years of experience in supply chain analysis, demand planning, or materials planning.
- Minimum 3 years of ERP/MRP software experience (ASW, AS400, or similar).
- Minimum 3 years of BI software experience (WebFOCUS, Vision preferred).
- APICS Certification strongly desired.
- Experience with category management in wholesale, retail, manufacturing, or consumer packaged goods (preferred).
- Strong analytical, problem-solving, and project management skills.
- Advanced proficiency in MS Excel, including VBA, macros, and automation.
- Excellent written and verbal communication skills, with the ability to collaborate across teams and organizational levels.
- Strong organizational skills with high attention to detail.
- Ability to work independently in a fast-paced environment and manage multiple priorities.
Why Work with Wheeler Staffing Partners
At Wheeler Staffing Partners, we believe that finding the right role is more than just matching skills to a job description — it’s about creating long-term success for both candidates and employers. Our team is committed to guiding you through every step of the hiring process with transparency, support, and expertise. When you work with us, you gain a partner dedicated to helping you achieve your career goals while connecting you with industry-leading organizations.
Data Analytics Engineer

Posted 1 day ago
Job Description
Meta Platforms, Inc. (Meta), formerly known as Facebook Inc., builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps and services like Messenger, Instagram, and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology.
**Required Skills:**
Data Analytics Engineer Responsibilities:
1. Coordinate and execute data warehouse plans for a product or a group of products to solve well-scoped problems.
2. Identify the data needed for a business problem and implement logging required to ensure availability of data, while working with data infrastructure to triage issues and resolve.
3. Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way.
4. Build data expertise and leverage data controls to ensure privacy, security, compliance, data quality, and operations for allocated areas of ownership.
5. Design, build and launch new data models and visualizations in production, leveraging common development toolkits.
6. Independently design, build and launch new data extraction, transformation and loading processes in production, mentoring others around efficient queries.
7. Support existing processes running in production and implement optimized solutions with limited guidance.
8. Define and manage SLA for data sets in allocated areas of ownership.
**Minimum Qualifications:**
- Requires a Master's degree (or foreign equivalent degree) in Computer Science, Data Science, Information Systems, or a related field, and 2 years of experience in the job offered or in a related occupation.
- Experience must include 24 months of experience in the following skills and technologies:
  1. Custom ETL design, implementation, and maintenance
  2. Object-oriented programming languages
  3. Schema design and dimensional data modeling
  4. Managing and communicating data warehouse plans to internal clients
  5. Writing SQL statements
  6. Analyzing data to identify deliverables, gaps, and inconsistencies
**Public Compensation:**
$191,728/year to $196,900/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at
Analytics Engineer
Posted today
Job Description
About us: Duet empowers Nurse Practitioners (NP) to tackle the primary care crisis by leading their own practices, closing the gap in access while keeping care local. We're a well-funded seed-stage company led by experienced entrepreneurs and Nurse Practitioners, and backed by investors like Kairos HQ and Lerer Hippeau.
We’re building a vertically integrated platform for NP-led practices to thrive as standalone businesses in a time of corporate consolidation. Think of workflows to engage patients, streamline administration, forecast business growth, and drive value-based outcomes while building community among NPs. These solutions sit on a foundation of data that we harness across patients and providers for the benefit of care and practice success.
About the role: As an Analytics Engineer at Duet, you will play a key role in shaping how we collect, transform, model, and leverage data to drive clinical and business outcomes. You will work cross-functionally with product, engineering, and clinical teams to build a modern data stack that supports everything from self-serve analytics to predictive modeling. This role blends strategic thinking, technical execution, and a strong sense of ownership — with the opportunity to build foundational systems that scale with our company.
Key Responsibilities:
- Design, implement, and optimize the end-to-end data infrastructure, including ELT pipelines, semantic layers, and analytics-ready datasets to support real-time and batch analytics.
- Develop and maintain scalable, well-documented data models in our cloud data warehouse to power reporting, analytics, and data science use cases.
- Partner with stakeholders across operations, product, and care delivery to understand data needs and translate them into robust, intuitive data products.
- Elevate our analytics capabilities by building trusted dashboards, reporting frameworks, and self-serve tools that enable data-driven decision-making across the organization.
- Lead data quality initiatives by implementing robust validation and monitoring frameworks to ensure the accuracy, consistency, and integrity of our data.
- Mentor junior team members and contribute to establishing data best practices, governance standards, and scalable analytics processes.
Qualifications:
- 5+ years of experience in analytics engineering, data engineering, or related roles, preferably in high-growth startups or healthcare environments.
- Advanced proficiency in SQL and data modeling (e.g., dbt); strong understanding of data warehousing principles and dimensional modeling.
- Experience with cloud data platforms (e.g., GCP, BigQuery) and modern data stack tools (e.g., dbt)
- Proficiency with data visualization and BI tools (e.g., Looker, Metabase) and a track record of building high-impact dashboards.
- Strong communication skills with the ability to partner across technical and non-technical teams, aligning data initiatives with business goals.
- Familiarity with healthcare data (EHRs, claims, clinical KPIs) is a strong plus.
- Experience with version control, testing frameworks, and CI/CD pipelines for data is a bonus.
Ideal Candidate:
- Strategic thinker who can translate ambiguous problems into structured data solutions.
- Passionate about building durable systems that support care delivery and community-building for Nurse Practitioners.
- Thrives in fast-paced, collaborative environments with a bias toward action and iteration.
- Committed to data integrity, scalability, and empowering others through high-quality analytics.
This role is on-site 3 days a week in New York.
Analytics Engineer
Posted today
Job Description
We're assisting one of our tech clients in the life sciences industry as they hire an Analytics Engineer . In this role, you’ll design and maintain core data models, pipelines, and reporting infrastructure — ensuring that the right people have access to clean, trustworthy, decision-grade data.
This is a highly cross-functional role with visibility across product, ops, growth, and leadership. You’ll help define analytics engineering practices, scale the data stack, and shape how the company makes better, faster decisions.
What you’ll do
- Build and maintain core data models that power internal analytics and client-facing insights
- Design and operate pipelines that transform raw data into analytics-ready tables
- Create dashboards and reporting tools that drive visibility across product, ops, and GTM
- Define and enforce metric consistency across teams (business KPIs, product usage definitions, etc.)
- Improve and scale the analytics stack (GCP BigQuery, Dataform, Fivetran, Postgres, etc.)
- Partner with stakeholders to understand data needs and translate them into reliable systems
- Contribute to internal data culture through documentation, education, and code standards
What we’re looking for
- 2–6 years of experience in analytics engineering, data engineering, or technical data analysis with production ownership
- Strong SQL and Python skills, with experience in modern data stacks (dbt, Fivetran, Airflow, BigQuery, Postgres, Metabase)
- A low-ego, ownership mindset — hungry to learn, comfortable with autonomy, and motivated by impact
Analytics Engineer
Posted today
Job Description
About the Role
We’re hiring a hands-on Data Analyst to turn business questions into reliable datasets, clear visuals, and automated workflows. You’ll build and maintain Power BI dashboards, shape data models, and orchestrate pipelines using Microsoft Fabric. You’ll also own Make.com automation for API-based data acquisition and notifications, perform web scraping (ethically and compliantly), and write Python to automate, cleanse, and enrich data. The ideal candidate is equal parts analyst and problem-solver, comfortable moving from stakeholder requirements to production-grade dashboards and automations.
What You’ll Do
Business Intelligence & Visualization
- Build, iterate, and maintain Power BI dashboards; design robust data models (Power Query, DAX), optimize performance, and manage Row-Level Security (RLS).
- Translate stakeholder requirements into insightful visuals and clear data stories; support production refreshes and user inquiries.
Data Engineering with Power BI & Fabric
- Use Microsoft Fabric (e.g., Dataflows Gen2/Lakehouse/Pipelines) or Gen1 Power BI dataflows to build reliable ETL/ELT processes and refresh schedules.
- Query and validate data with SQL (plus KQL/PySpark where applicable for larger or log-style datasets).
Automation with Make.com & Power Automate
- Design, build, and maintain Make.com scenarios (HTTP modules, routers, iterators, data stores) to integrate third‑party REST APIs, webhooks, and internal systems; implement scheduling, logging, alerting, retries, and error handling.
- Use Power Automate for complementary internal workflow automation and notifications.
API Integration & Data Acquisition
- Connect to external services via REST/Graph APIs (OAuth/API keys), handle pagination and rate limits, and ingest JSON into curated datasets for BI and analytics.
- Configure inbound/outbound webhooks between Make.com, Monday.com, and other tools to streamline processes.
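The pagination and rate-limit handling described above can be sketched as a small helper. This is an illustrative sketch only: the response shape, the `fetch_page` callable, and the stub page data are hypothetical stand-ins for a real API client (for example, `requests` calls carrying an OAuth token or API key header):

```python
import time

def fetch_all_pages(fetch_page, max_retries=3):
    """Collect every record from a paginated API.

    fetch_page(page) must return a dict shaped like
    {"items": [...], "next_page": int or None, "retry_after": float or None}.
    This shape is a hypothetical stand-in for a real API's response.
    """
    records, page = [], 1
    while page is not None:
        for attempt in range(max_retries):
            resp = fetch_page(page)
            if resp.get("retry_after"):           # rate-limited: back off, then retry
                time.sleep(resp["retry_after"])
                continue
            records.extend(resp["items"])         # ingest this page's JSON records
            page = resp.get("next_page")          # follow the pagination cursor
            break
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} retries")
    return records

# Stub standing in for the HTTP layer: two pages, then no next page.
PAGES = {1: ([{"id": 1}, {"id": 2}], 2), 2: ([{"id": 3}], None)}

def fake_fetch(page):
    items, nxt = PAGES[page]
    return {"items": items, "next_page": nxt, "retry_after": None}

print(fetch_all_pages(fake_fetch))  # three records collected across two pages
```

Separating the pagination loop from the transport makes the retry and cursor logic testable without network access; the same helper can then be wired to a live endpoint.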
Web Scraping
- Build targeted scrapers (Make.com HTTP + parsing or Python with requests/beautifulsoup4/playwright/scrapy) to collect publicly available data.
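A minimal version of such a targeted scraper, sketched here with only the Python standard library so it stays self-contained (in practice `requests` plus `beautifulsoup4`, or `playwright`/`scrapy`, would handle fetching and parsing); the `span.price` target and the sample HTML are hypothetical page structure, and a real scraper should respect robots.txt and the site's terms of use:

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collect the text of every <span class="price"> element."""

    def __init__(self):
        super().__init__()
        self.prices, self._in_price = [], False

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        if tag == "span" and ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

# In production this HTML would come from an HTTP response body.
sample_html = ('<ul><li><span class="price">$9.99</span></li>'
               '<li><span class="price">$4.50</span></li></ul>')
scraper = PriceScraper()
scraper.feed(sample_html)
print(scraper.prices)  # ['$9.99', '$4.50']
```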
Python for Analytics & Ops
- Write production‑minded Python for data cleaning, reconciliation, and transformations (e.g., pandas); package repeatable scripts, add unit tests where practical, and schedule jobs via Fabric or Make.com.
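A cleaning step of that kind might look like the following sketch; the column names, cleaning rules, and sample data are invented for illustration, and the example assumes `pandas` is installed:

```python
import pandas as pd

# Hypothetical raw export: inconsistent casing, duplicates, missing values.
raw = pd.DataFrame({
    "customer": ["Acme ", "acme", "Globex", None],
    "amount":   [100.0, 100.0, None, 25.0],
})

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows missing a customer, standardize names, fill amounts, dedupe."""
    out = df.dropna(subset=["customer"]).copy()
    out["customer"] = out["customer"].str.strip().str.title()  # "acme" -> "Acme"
    out["amount"] = out["amount"].fillna(0.0)
    return out.drop_duplicates().reset_index(drop=True)

print(clean_orders(raw))  # two rows remain: Acme (100.0) and Globex (0.0)
```

Wrapping the rules in a single function makes the step easy to unit-test and to schedule as a repeatable job.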
Platform & Stakeholder Support
- Provide first‑line support for BI dashboards, dataset refreshes, and workflow issues; help administer Monday.com (permissions, simple automations, troubleshooting).
- Document data models, metrics, pipelines, and scenarios; champion data quality and governance.
What You’ll Bring
Required Qualifications
- Bachelor’s degree in Computer Science (or equivalent experience).
- 2–5 years in data analytics/BI with hands‑on Power BI (data modeling, DAX, refreshes, RLS).
- Practical experience building automations in Make.com (or similar: Power Automate/Zapier/n8n), integrating REST APIs and webhooks.
- Python skills for ETL/automation/scraping (e.g., pandas, requests, bs4, playwright/scrapy).
- Working SQL proficiency; familiarity with Microsoft Fabric concepts and ETL in Power Query.
- Strong troubleshooting, communication, and stakeholder support skills (especially for dashboards and Monday.com workflows).
Preferred Qualifications
- Experience with KQL (Kusto) or PySpark; Azure Data Factory; integrating data via REST APIs into Power BI.
- Exposure to Copilot in Fabric or Azure OpenAI to accelerate BI workflows.
- Certifications: PL‑300 (Power BI Data Analyst), DP‑600 (Fabric Analytics Engineer), AI‑900 (Azure AI Fundamentals).
- Monday.com administration experience; understanding of access controls and basic governance.
Analytics Engineer
Posted today
Job Description
Loop is on a mission to streamline the movement of money between shippers, carriers, and brokers to unlock margin and increase liquidity in the supply chain.
Today, logistics payments are a big black box that unnecessarily drains thousands, if not millions, of dollars. Legacy services do not meet the market's needs. Transportation teams need a solution that unlocks insights from their complex datasets, rather than drowning them in manual workflows.
Loop pairs AI expertise with deep logistics industry knowledge to deliver automated audits, streamlined resolutions, expedited payments, and, most importantly, actionable insights. All so that shippers and 3PLs can improve spend visibility, eliminate unnecessary costs, and optimize their transportation spend. Loop's customers see a 4% reduction in costs and an 80% boost in back-office productivity.
Loop is a customer-obsessed company that sees the complexity in the supply chain as an opportunity rather than a challenge. Our foundational AI capitalizes on the messiness in the system to maximize outcomes for our customers.
Investors include Founders Fund, 8VC, Susa Ventures, Flexport, and 50 industry-leading angel investors. Our team brings subject matter expertise and experience from companies like Uber, Google, Flexport, Meta, Samsara, Intuit, and Rakuten - as well as traditional logistics companies like CH Robinson.
About Role
As an Analytics Engineer at Loop, you will play a pivotal role in maturing the data organization. You'll work cross-functionally to design, build, and own the core infrastructure and data models. These systems will enable both internal and client facing analytics, accelerating data-driven decision-making throughout the company and our clients.
What you will do
- Own Core Data Models and ETL Pipelines: Design, build, and maintain Loop's core data models and ETL processes.
- Define Metrics and Governance: Establish and govern key business metrics for consistency across teams.
- Maintain Data Quality and Uptime: Ensure data accuracy and reliability with strong SLAs.
- Automate Data Requests: Convert ad-hoc data requests into scalable, repeatable pipelines.
- Optimize Data Infrastructure: Manage and optimize data infrastructure for performance, cost, and compliance.
- Develop Data Products: Create data-driven solutions and embedded analytics to support business and product teams.
What we're looking for
- 2+ years of experience building analytical products (systems, dashboards, models).
- Strong proficiency in SQL and Python for data transformations and automation.
- Hands-on experience with modern data stack tools (e.g., Snowflake, dbt, Dagster, Airflow, Looker).
- Excellent business communication skills to translate data insights into actionable business strategies.
- Experience with data dashboarding and visualization tools (e.g., Tableau, Looker, Superset, Mode).
- Ability to manage, optimize, and scale data infrastructure.
- Previous experience setting up a data stack in a startup from scratch.
- Interest or experience in working with cutting-edge open-source data tools.
- Proven ability to convert ad-hoc data requests into scalable, automated pipelines.
- Experience working in the logistics domain.
- EPD (Engineering, Product, and Design) : Become the data expert within the EPD organization, empowering data-driven decisions across product development and iterations
- Strategy and Operations : Enable product delivery teams (solutions, analyst) to deliver value to customers through actionable dashboards, metrics, and self-service reports
- AI Platform: Expedite the development of new models and training of existing models by simplifying the necessary data cleansing and preparation
- Base pay 120k - 190k
- Premium Medical, Dental, and Vision Insurance plans
- Insurance premiums covered 100% for you
- Unlimited PTO
- Fireside chats with industry leading keynote speakers
- Off-sites in locales such as Napa and Tahoe
- Generous professional development budget to feed your curiosity
- Physical and Mental fitness subsidies for yoga, meditation, gym, or ski memberships
Why you should join Loop
Analytics Engineer
Posted today
Job Description
Remote if located in DC, NYC, Boston, or Chicago
Ideal Candidate: Strong background in designing, developing, and working with users in creating analytical reports. Must be well-versed with Power BI and Tableau. Must also have SQL experience.
Analytics Engineer
Posted 1 day ago
Job Description
We're building the all-in-one B2B post-sales support platform powered by conversational data and layered with intelligence to help our customers run their operations in real time. We're backed by Y Combinator, General Catalyst, and a16z. Pylon is a tool that thousands of users spend most of their day in.
Currently, 780+ companies including Linear, Cognition (makers of Devin), Modal Labs, and Incident.io use us every day to run their support and customer success workflows. We also recently made this year's Enterprise Tech 30 list.
Our product has a very large problem space so there's a ton of stuff to build and take ownership of - we'd be excited to get your help as we're hiring several extremely talented software engineers across the stack.
We're now looking for our first Analytics Engineer to help us centralize our data and build the foundation for company-wide reporting and insights.
What you'll work on
- Own our data warehouse: build and maintain pipelines to unify product, GTM, and customer data
- Model product usage, feature adoption, and account activity so teams can self-serve answers
- Partner with Sales, Success, and Product to answer questions like:
- "Which enterprise customers use our AI features most?"
- "Who are great reference customers by usage and size?"
- Create dashboards and metrics that the company can operate on
- Help design how Pylon measures success across customers and product
- You've owned analytics or data modeling for a fast-moving SaaS product
- You're fluent in SQL and familiar with modern data stacks (dbt, Snowflake, Fivetran, Looker, etc.)
- You can translate ambiguous business questions into structured analysis
- You're motivated by autonomy, high ownership, and fast iteration
- You're in SF (or open to relocating) and want to help shape our data culture from the start
Our perks
Lunch, dinner and snacks at the office
Fully covered medical, dental, and vision insurance for employees
Retirement savings
14 company holidays + unlimited PTO
Annual offsite
Relocation and immigration support
More about Pylon
Traction: Have hundreds of paying customers and are growing fast
Funding: In addition to investment from Y Combinator, General Catalyst, and a16z, we just announced our Series B ($51M total raised)
Founders: Advith Chelikani, Robert Eng, and Marty Kausas
Team: Currently 55+ and growing!
Analytics Engineer
Posted 3 days ago
Job Description
The Analytics Engineer is responsible for data consolidation and maintenance across the organization's data infrastructure and various data sources. The engineer works with Data Analysts and other business units to create data models and to build and maintain a cloud data infrastructure. Their work will support recurring and ad-hoc reporting, dashboard creation and maintenance, and data requests from internal and external stakeholders.

The Analytics Engineer will also build business intelligence tools that help data analysts work more efficiently, such as improved or automated data clean-up processes and a self-service analytics platform. They will collaborate with the Product Development and IT Operations teams on data infrastructure and integration projects focused on transforming data and supplying it between operational systems to meet business needs. Lastly, they are responsible for following software best practices: maintaining version control, performing internal testing, and deploying production-ready code.
Position Responsibilities
- Design, develop, implement, and maintain data models used for reporting and analytics.
- Work with Data Analysts and other business teams to identify and develop ETL/ELT processes to automate, orchestrate, and maintain data pipelines.
- Maintain data documentation and definitions, including data quality standards and business logic for metrics/KPIs, naming and formatting conventions and detailed specifications/documentation for analytics code.
- Maintain version control for, test, and continuously integrate analytics code into testing and production environments.
- Monitor data models and pipelines to ensure performance meets internal service level agreement.
- Collaborate with Data Analysts to build dashboards and visualizations that show trends, patterns, and outliers in the data pipeline and models; maintain current inventory of reports and dashboards and publish this information to business users.
- Work with Product Development and other business teams to consolidate and integrate data from and between multiple data sources (such as Salesforce, Financial Systems, HRIS, and other data sources).
- Maintain and update (as necessary) existing integration infrastructure, with a focus on streamlining and consolidating processes.
- Help evaluate and validate new and existing tools or development needs to suggest improvements and streamline operations.
- Support updates/upgrades, configuration and troubleshooting for business intelligence tools.
- Help coach Data Analysts on software development best practices.
Professional Experience
- Proven success in designing, building and maintaining data warehouse and infrastructure that meet business needs.
- Experience with SQL.
- Experience with Python or R.
- Strong problem-solving skills in relation to data and software applications.
- Excellent interpersonal skills and verbal/written/virtual communication skills to communicate technical details to both other team members and non-technical stakeholders.
- Experience supporting and working with cross-functional teams.
- Prior experience with software development best practices around code (version control using Git, testing, debugging, continuous integration and deployment, clear comments in code).
- Experience with cloud data warehouses and platforms like Snowflake, GCP, Azure.
- Experience with ETL/ELT tools like FiveTran, Matillion, Stitch.
- Experience with DBT, Dataform, Databricks.
- Experience with data visualization tools like Tableau, PowerBI, Looker, Sigma.
- Experience with Integration Platform-as-a-Service like Workato or Mulesoft.
- Clear technical writing experience, including data mapping documents, reports, and presentations.
- Experience in a non-profit, human services setting.
- Dealing with Ambiguity: Remains productive and effective in uncertain, rapidly changing situations by quickly analyzing information to adapt approach. Demonstrates flexibility, composure and good judgment despite challenges
- Technical Learning: Quickly learns and applies new technical skills, knowledge, and industry expertise. Seeks guidance when needed to ensure quality
- Problem Solving: Uses critical thinking to creatively investigate issues from diverse perspectives. Makes evidence-based recommendations addressing short and long-term needs
- Total Work Systems: Designs, implements, and improves organization-wide systems to enhance quality and efficiency. Leverages technology and data to standardize workflows and meet stakeholder needs
- Organizational Agility: Understands how the organization operates through formal and informal structures. Navigates dynamics, communicates rationale behind policies, and builds relationships to achieve goals
- Customer Focus: Proactively understands and meets others' needs through a service-minded approach. Builds trust, leverages insights, and provides responsive support to align with evolving requirements
- Humanity: Putting people first: We are committed to meeting people where they're at, honoring their dignity, diversity, and experience.
- Community: Building a better future: Sustainable housing solutions are fostered through partnership, collaboration, and human connection.
- Ingenuity: Innovating for transformation: Systems-change requires relentless determination, thinking outside the box and challenging the status quo.
Candidates should have physical mobility for tasks such as standing, bending, stooping, kneeling, crouching, reaching, twisting, and walking on uneven surfaces. They should be capable of performing stationary tasks like sitting for up to 6 to 8 hours a day. Additionally, candidates should be able to lift, carry, push, pull light to moderate weights up to 15 pounds safely. Requires mental acuity for analytical reasoning and document interpretation.
Benefits
- Health Care Plan (Medical, Dental, & Vision)
- Retirement Plan (With 5% Match)
- Life Insurance (Basic, Voluntary and AD&D)
- Paid Time Off (Vacation, Sick & Public Holidays)
- Family Leave (Maternity, Paternity)
- Short Term & Long-Term Disability
- Training & Development
- Wellness Resources
- Hybrid Work
Location: Los Angeles, California
Base pay range: $95,000.00/yr - $105,000.00/yr
Analytics Engineer
Posted 3 days ago
Job Description
HiveWatch is a tech-forward, inclusive organization fostering the evolution of the physical security industry. We are a diverse team of forward thinkers who empower each other to find creative and collaborative solutions in an industry ripe for modernization. We are passionate about the problems we're solving for our customers and equally passionate about the company we're building.
HiveWatch is here to help security teams pivot from chasing threats to preventing them. We protect organizations, people, and property through the intelligent orchestration of physical security programs. With better communication, more insights, and less noise, we are modernizing what it means for businesses and their employees to truly feel safe.
The Role
We're seeking an Analytics Engineer who will be instrumental in transforming how we leverage data across our security platform. This role uniquely combines the technical rigor of analytics engineering with hands-on analysis and visualization work. You'll own the foundational data models that power our insights, while also supporting our Data Analyst on the dashboards and reports that drive decision-making across product, engineering, and customer success teams.
This position is perfect for someone who thrives at the intersection of complex data infrastructure and practical business intelligence, particularly in handling the diverse data streams inherent in security operations, from IoT stream data to application logs and relational databases.
WHAT YOU'LL DO:
- Own and evolve the BI and semantic-layer datasets built on our OLTP databases that serve product analytics, ensuring scalability and performance
- Collaborate directly with both customers and internal stakeholders to translate requirements into optimized pipelines, dashboards, frameworks, and systems that facilitate easier development of data artifacts
- Perform extensive data munging and transformation of non-relational data sources, including IoT stream data, camera telemetry, and application logs
- Conduct deep-dive analyses on camera and device behavior to optimize product performance
- Train teams on how to derive their own insights from data, and contribute to building a strong data culture by advocating best practices and improving data literacy across the company
- 4+ years of experience in analytics engineering, data engineering, or similar role with strong technical foundations
- Advanced SQL skills for complex data transformation and performance tuning
- Proficiency in Python or similar programming language for data processing and automation
- Proficiency in git, bash, and/or linux CLI
- Experience with database fundamentals such as data modeling, indexing, normalization, denormalization, foreign key constraints, slowly changing dimensions
- Strong experience writing data quality checks and validation frameworks
- Proven ability to work with both technical and non-technical stakeholders
- Self-starter comfortable in a fast-paced environment with high autonomy
- Proficiency in creating compelling reporting and visualization solutions using modern dashboarding tools (QuickSight, Tableau, Sigma, etc.)
- Startup experience
- Has developed BI frameworks from scratch
- Hands-on experience with high-volume, real-time data streams
- Experience building ETL/ELT pipelines and orchestrating workflows
- Experience with AWS ecosystem
- Experience with Terraform
- Familiarity with modern data warehousing platforms
- Knowledge of streaming data processing and real-time analytics
- Experience with experimentation frameworks and A/B testing methodologies
- Background in device telemetry, camera systems, or edge computing data
- Experience with security operations, surveillance systems, or physical security technology
- Base salary range for this position is $140,000 - $165,000 USD per year
- Eligible to participate in HiveWatch Equity Incentive Plan
The final offer will be at the company's sole discretion and determined by multiple factors, including years and depth of relevant experience and expertise, location, and other business considerations.
BENEFITS & CULTURE
In an effort to provide for our employees, HiveWatch offers a competitive benefits package which includes:
- Health Benefits: Medical, Vision, Dental and Life Insurance
- Cutting edge solutions in an emerging field with lots of growth potential
- Generous compensation packages
- 401K
- Family friendly & compassionate work culture
- Work with good people who CARE about making the world a better place
What is CARE? HiveWatch enables its employees to CARE for themselves, and each other through unique programs crafted by HiveWatch employees themselves. To deliver on our mission, we empower our employees to consider the meaning of job security on a holistic level. At HiveWatch, you are encouraged to challenge the status quo, provide your unique point of view, and leave fear at the (access controlled) door. In practicing CARE, we:
- Celebrate our diverse workforce and all communities within HiveWatch.
- Assist the varying needs of our employees, from maintaining a work life balance to encouraging personal aspirations.
- Respect one another through our interactions and set personal boundaries.
- Embrace equity through our policies and practices of hiring, promoting, and offering benefits that take care of the whole person, not just the worker.
OUR EEO STATEMENT:
HiveWatch is an equal opportunity employer and we are committed to cultivating a work environment that supports, inspires, and respects all individuals. We execute our hiring practices so that they are merit-based and we do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity/expression, marital status, age, disability, medical condition, genetic information, national origin, ancestry, military or veteran status, or other protected characteristic.