3,410 Software Analytics jobs in the United States

Software Engineer - Analytics & AI

94583 San Ramon, California CXApp, Inc

Posted 4 days ago

Job Description

Who we are

At CXApp, we are the innovators of Indoor Intelligence, delivering actionable insights for people, places, and things. Our flagship product, the CXApp, is a workplace experience platform for the enterprise. Our technologies and solutions help enterprise customers deliver a comprehensive business journey in a work-from-anywhere world for employees, partners, customers, and visitors.

We take pride in the way we positively impact the daily lives of our customers and continue to push the boundaries of how our platform can benefit others.

The technology

The CXApp platform tech stack uses native mapping, analytics, on-device positioning and app technologies. The overall solution helps organizations provide a frictionless work environment to employees with features such as: hot desk and room booking, indoor navigation with turn-by-turn directions on a digital map, company-wide news feeds, an in-app company directory of colleagues and workplace amenities, as well as bookable opportunities and experiences.

Job brief

We are seeking a talented and experienced Software Engineer to join our dynamic engineering team. As a Software Engineer specializing in enterprise software, analytics, and AI, you will play a crucial role in designing, developing, and maintaining advanced software solutions for our enterprise customers. Your primary focus will be on leveraging analytics and artificial intelligence technologies to deliver innovative software products that drive data-driven insights and decision-making.

Responsibilities:

  • Collaborate with cross-functional teams, including product managers, data scientists, and UX/UI designers, to understand customer requirements and translate them into scalable software solutions.
  • Design, develop, test, and deploy enterprise software applications with a focus on analytics and AI capabilities.
  • Utilize advanced analytics techniques to extract insights from large datasets, enabling data-driven decision-making for our customers.
  • Develop and integrate machine learning algorithms and models into software solutions to provide predictive and prescriptive analytics functionality.
  • Optimize software performance and scalability by implementing efficient algorithms and leveraging distributed computing frameworks.
  • Stay up-to-date with the latest trends and advancements in analytics, AI, and enterprise software development, and incorporate them into our software products.
  • Collaborate with the DevOps team to ensure smooth deployment and operation of software applications in production environments.
  • Perform code reviews, provide constructive feedback, and mentor junior team members to foster a culture of continuous learning and improvement.
  • Troubleshoot and resolve complex software issues, ensuring high-quality and robust software deliverables.
Requirements:
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
  • Proven experience in enterprise software development, with a focus on analytics and AI.
  • Strong programming skills in languages such as Java, Python, or C++, with a solid understanding of object-oriented programming principles.
  • Experience with analytics frameworks and libraries, such as TensorFlow, PyTorch, or Scikit-learn.
  • Familiarity with big data processing tools and technologies, such as Hadoop, Spark, or Kafka.
  • Solid understanding of machine learning concepts and techniques, including supervised and unsupervised learning, deep learning, and reinforcement learning.
  • Proficiency in designing and implementing RESTful APIs for integrating software applications.
  • Strong problem-solving and analytical thinking skills.
  • Excellent communication and collaboration abilities to work effectively in a team environment.
  • Experience with Agile/Scrum methodologies and practices is a plus.
  • Prior experience in the development of enterprise software applications for industries like finance, healthcare, or e-commerce is a plus.
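The predictive-analytics responsibilities above can be sketched with a minimal supervised-learning example. This is only a shape, not CXApp's actual method: it uses plain Python rather than TensorFlow or Scikit-learn so it runs anywhere, and the data and names are invented for illustration.

```python
# Minimal predictive analytics: fit y = slope*x + intercept by ordinary
# least squares, then predict unseen values. Illustrative only; production
# code would typically use a library such as scikit-learn.

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

# Hypothetical data: app sessions per day vs. desk bookings.
model = fit_linear([1, 2, 3, 4], [3, 5, 7, 9])   # data lies exactly on y = 2x + 1
print(predict(model, 10))                        # 21.0
```

The same fit/predict split is what heavier frameworks expose; swapping in Scikit-learn's `LinearRegression` keeps the workflow identical.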


Join our team of talented engineers and make a significant impact by developing cutting-edge software solutions that empower businesses to harness the power of analytics and AI. Apply now and be part of our exciting journey towards innovation and digital transformation.

Lead Software Engineer (Analytics)

17101 Harrisburg, Pennsylvania DEW Softech, Inc

Posted 17 days ago

Job Description

Lead Software Engineer

Location: Remote

Duration: 6 Months (Potentially Contract-To-Hire)

Interview process: 2 rounds of client interviews

Must be a US citizen or Green Card holder.

Job Summary

Required Skills:
• This role is 70% lead and 30% hands-on technical work
• Bachelor's degree
• 5+ years of experience leading Business Intelligence and Analytics teams and enterprise solutions
• Power BI, Tableau, DAX, SQL, and LOD expressions
• Cloud platforms; AWS preferred
• Data warehousing; Redshift and Databricks preferred
• Agile project management
• Excellent communication and leadership skills

Nice To Haves:
• Databricks Genie or Microsoft Copilot
• Master's degree

Job Description

As a Lead Software Engineer, you will be responsible for leading the Analytics Engineering & Delivery function and for delivering scalable, performant, and insightful analytics solutions that support strategic and operational decision-making across the enterprise. This role manages a team of analytics engineers and oversees the development of enterprise dashboards, semantic models, and data pipelines across Fabric (Power BI), Tableau, and other supported platforms. The position partners closely with Data Architecture, Data Engineering, Governance, and Business Stakeholders to drive consistency, performance, and usability of business analytics products.

What You'll Do

  • Lead and manage the delivery team responsible for the design, development, and deployment of enterprise-grade analytics solutions using Fabric (Power BI), Tableau and other supported platforms
  • Manage the execution of key analytics initiatives including KPI frameworks, cross-platform data integration, visual cognition standards, and optimized data pipelines
  • Provide direct leadership to Analytics Engineers by prioritizing workload, tracking progress, and facilitating solution reviews for consistency and alignment
  • Serve as delivery owner for data analytics projects, collaborating with business stakeholders, product owners, and data platform teams to ensure on-time and high-quality delivery
  • Translate business needs into scalable analytical products with structured semantic models, reusable datasets, and governance-aligned data definitions
  • Support the adoption of version control practices and drive automation, reusability, and design consistency across reporting solutions
  • Oversee continuous improvement efforts including code reviews, performance tuning, and compliance with architectural best practices
  • Lead the team's engagement in agile delivery practices, backlog grooming, and sprint planning in coordination with strategy and capability leads
  • Partner with the BI Strategy & Business Partnership team to align delivery outcomes with enterprise data governance and user enablement goals
  • Contribute to the standardization of data modeling and visualization best practices rooted in analytical literacy, business relevance, and perceptual design principles

What You'll Need

Required:

  • Bachelor's degree in Engineering, Computer Science, Analytics, or related field with 7-10 years of experience
  • 5+ years leading business intelligence or analytics engineering teams delivering production-grade solutions using Power BI, Tableau, and enterprise-scale data platforms
  • Deep understanding of semantic modeling, star and snowflake schema design, dimensional modeling, and visual analytics best practices
  • Strong hands-on experience with DAX, Level of Detail (LOD) expressions, advanced SQL, and programming languages such as Python or R, as well as data preparation tools including Alteryx, Power Query, and Tableau Prep
  • Experience with Cloud Data Platforms such as Azure, AWS, or GCP, including cloud-native storage and compute solutions
  • Experience with enterprise data warehousing platforms such as Databricks, Snowflake, Redshift, Synapse, or similar technologies
  • Experience with generative AI and natural language technologies such as Microsoft Copilot, Salesforce Einstein, Databricks Genie, or other LLM-driven analytics tools
  • Experience leading solution design and implementation using cloud data lakes, and version control tools like Git
  • Demonstrated success managing delivery pipelines and analytics squads in agile or hybrid environments
  • Proven ability to coordinate complex technical efforts across data engineering, governance, and business teams
  • Fluent in English with strong verbal and written communication skills, including executive reporting and stakeholder engagement
  • Familiarity with IBCS, cognitive design for analytics, and standardization frameworks
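The DAX, LOD, and advanced-SQL skills listed above share one underlying idea: comparing row-level data against an aggregate computed at a different level of detail. A Tableau expression like `{FIXED region : AVG(sales)}` maps onto a SQL window function. A hedged sketch using Python's built-in sqlite3 (the table and figures are made up):

```python
import sqlite3

# LOD-style aggregation: attach a region-level average to every row,
# the SQL analogue of Tableau's {FIXED region : AVG(sales)}.
# Requires a sqlite3 build with window-function support (SQLite 3.25+).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("East", 100), ("East", 300), ("West", 50)])

rows = con.execute("""
    SELECT region, amount,
           AVG(amount) OVER (PARTITION BY region) AS region_avg
    FROM sales
""").fetchall()
for region, amount, region_avg in rows:
    print(region, amount, region_avg)
```

Each East row carries the East average (200.0) alongside its own amount, which is exactly the row-vs-aggregate comparison LOD expressions and DAX `CALCULATE`/`ALLEXCEPT` patterns express in their own syntaxes.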

Preferred:

  • Master's degree with 5-7 years of related experience

Physical Demands

  • Ability to safely and successfully perform the essential job functions consistent with the ADA and other federal, state and local standards
  • Sedentary work that involves sitting or remaining stationary most of the time with occasional need to move around the office to attend meetings, etc.
  • Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard and monitor

Lead Software Engineer - Analytics & Data

94420 Foster City, California Coupa Software

Posted 4 days ago

Job Description

Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?

Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.

Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.

Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.

Learn more on Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Lead Software Engineer - Analytics & Data at Coupa:

As a Lead Software Engineer, you'll play a key role in building and scaling Coupa's data infrastructure to support real-time decision-making across global operations. You'll design and optimize high-performance data pipelines and Spark clusters on AWS, ensuring seamless data integration, automation, and availability. Your work will directly enhance Coupa's analytics capabilities, empower internal teams with trusted insights, and drive smarter, faster business decisions through secure, scalable, and AI-enabled data solutions.

What You'll Do:

    • Design, build, and maintain highly scalable and efficient data pipeline architectures, ensuring seamless data integration and high availability.
    • Lead the optimization of Spark clusters, implement advanced monitoring, and proactively identify and resolve performance bottlenecks for continuous improvement.
    • Orchestrate the assembly and delivery of large, complex datasets that align with critical business requirements, while automating data processes and significantly enhancing data delivery efficiency.
    • Architect robust data extraction, transformation, and loading (ETL) infrastructure using SQL and advanced AWS big data technologies, developing sophisticated analytics tools to drive actionable business insights.
    • Spearhead collaboration with cross-functional teams to resolve complex data-related issues, ensure stringent data security across AWS regions, and empower data scientists to optimize advanced product functionalities.
What You Will Bring to Coupa:
    • 8-12 years of data engineering experience with deep expertise in big data (Spark, Kafka), cloud platforms (AWS), and orchestration tools (Airflow, Luigi).
    • Advanced degree in a quantitative field; expert in SQL, relational databases, query optimization, and diverse database systems.
    • Extensive experience in building and optimizing Spark clusters, scalable data pipelines, and ingestion systems across structured/unstructured data.
    • Proven skills in designing cloud-native, API-centric data architectures using REST/GraphQL, with strong support for analytics and reporting.
    • Strong background in data governance, quality control, microservice orchestration, and performance monitoring.
    • Proficient in applying AI/ML techniques for data classification, harmonization, and predictive analytics.
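The ETL responsibilities above reduce to a three-stage shape: extract raw records, transform them, and load them into a destination. In production this would run on Spark with Airflow orchestration; this generator-based sketch in plain Python only illustrates the shape, and all names and data are invented.

```python
# Minimal ETL shape: extract raw records, transform (validate + normalize),
# load into a destination. Generators keep each stage streaming-friendly,
# so records flow through without materializing intermediate lists.

def extract(raw_rows):
    """Yield parsed records from raw CSV-like lines."""
    for line in raw_rows:
        supplier, amount = line.split(",")
        yield {"supplier": supplier.strip(), "amount": float(amount)}

def transform(records):
    """Drop invalid rows and normalize supplier names."""
    for rec in records:
        if rec["amount"] <= 0:          # reject bad data early
            continue
        rec["supplier"] = rec["supplier"].upper()
        yield rec

def load(records, destination):
    """Append records to an in-memory sink; a real job writes to a warehouse."""
    for rec in records:
        destination.append(rec)

warehouse = []
load(transform(extract(["acme, 120.5", "globex, -3", "acme, 80"])), warehouse)
print(warehouse)   # two ACME rows; the negative-amount row is dropped
```

Spark's DataFrame API and Airflow DAGs decompose the same way: sources, pure transformations, and sinks, with the orchestrator owning scheduling and retries.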


The estimated pay range for this role is $144,075 - $212,000

The starting salary for the successful candidate will be based on permissible, non-discriminatory factors such as skills, experience, and geographic location.

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or evaluating performance are made fairly, and we provide equal employment opportunities to all qualified candidates and employees.

Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa's Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.
View Now

Senior Software Engineer, Analytics and Reporting

94199 San Francisco, California Cloudflare Inc

Posted 4 days ago

Job Description

About Us

At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world's largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine's Top Company Cultures list and ranked among the World's Most Innovative Companies by Fast Company.

We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us!

Available Locations: Austin Texas

What you'll do

As a member of the Analytics and Reporting (ART) team within CF1 (Zero Trust), you will work alongside other engineers, designers, and product managers to develop and enhance Cloudflare's analytics and reporting capabilities. Your role will involve developing software applications that efficiently handle large-scale data analytics, delivering full-stack solutions that span data ingestion, storage, processing, and visualization layers, and contributing to the architecture and continuous improvement of analytics platforms. You will see projects through from conception to deployment, delivering data-driven solutions that empower Cloudflare customers to harness the full potential of Zero Trust.

Technologies we use:

  • ART core backend and API services are written in Go.
  • We deploy our solutions in Kubernetes environments.
  • We utilize Postgres, Clickhouse, and a number of cloud native data store technologies.
  • We use HTML, CSS, JavaScript, TypeScript for front-end development.
  • Kafka plays a large role in message streaming, both at Cloudflare broadly and on this team.
  • For service monitoring we use Prometheus and Grafana.
  • For service logging we use Sentry, Elasticsearch and Kibana.

Examples of desirable skills, knowledge and experience

  • 3-5+ years of professional experience building and managing reliable and performant software systems at scale, preferably with some experience in our tech stack.
  • Experience in designing and architecting distributed systems.
  • Experience in designing and implementing REST APIs.
  • Experience in SQL, familiarity with common relational database concepts.
  • Familiarity with the web and technologies such as web browsers, HTTP, JavaScript.
  • Familiarity with Kubernetes, Kafka, Clickhouse.
  • Passion for making the digital world a more secure place.
  • Willingness, curiosity, and enthusiasm to learn new programming languages, technologies, and systems.
  • Strong interpersonal and communication skills. Caring and empathy are coveted traits here!
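The REST API design experience called out above can be illustrated with a minimal routing handler. ART's actual services are written in Go, so this stdlib-Python sketch is only a shape, and the route, resource names, and payload are invented.

```python
import json

# Minimal REST-style handler: map (method, path) to a status code and a
# JSON body. Illustrative only; a real service would use a framework,
# authentication, and persistent storage.

ZONES = {"example.com": {"requests": 1234, "threats": 7}}   # hypothetical analytics

def handle(method, path):
    """Return (status_code, json_body) for a tiny read-only analytics API."""
    parts = path.strip("/").split("/")
    if method == "GET" and len(parts) == 2 and parts[0] == "zones":
        zone = ZONES.get(parts[1])
        if zone is None:
            return 404, json.dumps({"error": "zone not found"})
        return 200, json.dumps(zone)
    return 405, json.dumps({"error": "unsupported route"})

status, body = handle("GET", "/zones/example.com")
print(status, body)
```

The REST conventions are the point: resources in the path, verbs carried by the HTTP method, errors as status codes rather than in-band sentinel values.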

Bonus

  • Previous experience with building enterprise analytics products.

This role may require flexibility to be on-call outside of standard working hours to address technical issues as needed.

What Makes Cloudflare Special?

We're not just a highly ambitious, large-scale technology company. We're a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

Project Galileo: Since 2014, we've equipped more than 2,400 journalism and civil society organizations in 111 countries with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare's enterprise customers, at no cost.

Athenian Project: In 2017, we created the Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration. Since the project began, we've provided services to more than 425 local government election websites in 33 states.

1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure, and privacy-centric public DNS resolver. It is available publicly for everyone to use and is the first consumer-focused service Cloudflare has ever released. Here's the deal: we don't store client IP addresses. Never, ever. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

Sound like something you'd like to be a part of? We'd love to hear from you!

This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at or via mail at 101 Townsend St. San Francisco, CA 94107.

Software Engineer, Infrastructure - Analytics

94199 San Francisco, California OpenAI

Posted 4 days ago

Job Description

About the Team

The Scaling team designs, builds, and operates critical infrastructure that enables research at OpenAI.

Our mission is simple: accelerate the progress of research towards AGI. We do this by building core systems that researchers rely on - ranging from low-level infrastructure components to research-facing custom applications. These systems must scale with the increasing complexity and size of our workloads, while remaining reliable and easy to use.

About the Role

As we grow, we're looking for a pragmatic and versatile software engineer who thrives in fast-moving environments and enjoys building systems that empower others.

This is a generalist software engineering role with an emphasis on distributed systems, data processing infrastructure, and operational excellence. You'll develop and operate foundational backend services that power OpenAI's key research workflows - both by creating new infrastructure and by building on existing systems. The use cases span observability, analytics, performance engineering, and other domains, all with the goal of solving problems that are meaningful and impactful to research.

This role is based in San Francisco, CA or open to being remote within the US. We use a hybrid work model of 3 days in the office per week and offer relocation assistance to new employees.

In this role, you will:
  • Design, build, and operate scalable backend systems that support various ML research workflows, including observability and analytics.

  • Develop reliable infrastructure that supports both streaming and batch data processing at scale.

  • Create internal-facing tools and applications as needed.

  • Debug and improve performance of services running on Kubernetes, including operational tooling and observability.

  • Collaborate with engineers and researchers to deliver reliable systems that meet real-world needs in production.

  • Help improve system reliability by participating in the on-call rotation and responding to critical incidents.
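"Both streaming and batch data processing" in the responsibilities above comes down to the same aggregate computed two ways: in one pass over a complete dataset, or incrementally as events arrive. A minimal sketch in plain Python (real systems here would involve Kafka or Spark; the metric and data are invented):

```python
# Same metric, two processing modes: a batch pass over a full dataset,
# and a streaming accumulator updated one event at a time.

def batch_mean(values):
    """Batch mode: requires the whole dataset up front."""
    return sum(values) / len(values)

class StreamingMean:
    """Streaming mode: O(1) memory, updated per event as it arrives."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        self.count += 1
        self.total += value
        return self.total / self.count

latencies = [12.0, 8.0, 10.0, 14.0]   # hypothetical request latencies (ms)
stream = StreamingMean()
for v in latencies:
    current = stream.update(v)
print(batch_mean(latencies), current)   # both 11.0
```

The design tension the role description hints at is exactly this: batch jobs are simpler and easily recomputable, while streaming accumulators give low-latency answers but must be carefully managed for ordering, retries, and state.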

You might thrive in this role if you have:
  • Strong proficiency in Python/Rust and backend software development, ideally in large codebases.

  • Experience with distributed systems and scalable data processing infrastructure, including technologies like Kafka, Spark, Trino/Presto, Iceberg.

  • Hands-on experience operating services in Kubernetes, with familiarity in tools like Terraform and Helm.

  • Comfort working across the stack - from low-level infrastructure components to application logic - and making trade-offs to move quickly.

  • A focus on building systems that are both technically sound and easy for others to use.

  • Curiosity and adaptability in fast-changing environments, especially in high-growth orgs.

About OpenAI

OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.

We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or other applicable legally protected characteristic.

For additional information, please see OpenAI's Affirmative Action and Equal Employment Opportunity Policy Statement.

Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable law, including the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance for Employers, and the California Fair Chance Act. For unincorporated Los Angeles County workers: we reasonably believe that criminal history may have a direct, adverse and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: protect computer hardware entrusted to you from theft, loss or damage; return all computer hardware in your possession (including the data contained therein) upon termination of employment or end of assignment; and maintain the confidentiality of proprietary, confidential, and non-public information. In addition, job duties require access to secure and protected information technology systems and related data security obligations.

To notify OpenAI that you believe this job posting is non-compliant, please submit a report through this form. No response will be provided to inquiries unrelated to job posting compliance.

We are committed to providing reasonable accommodations to applicants with disabilities, and requests can be made via this link.

OpenAI Global Applicant Privacy Policy

At OpenAI, we believe artificial intelligence has the potential to help people solve immense global challenges, and we want the upside of AI to be widely shared. Join us in shaping the future of technology.

Compensation Range: $310K - $460K

Software Engineer Manager (Analytics)

8501 DEW Softech, Inc

Posted 1 day ago

Job Description

Software Engineer Manager (Analytics)

Location: Remote

Position Type: Contract to hire


Required Skills:
• This role is 70% management and 30% hands-on technical work
• Bachelor's degree
• 5+ years of experience leading Business Intelligence and Analytics teams and enterprise solutions
• Fabric, Power BI, Tableau, DAX, SQL, and LOD expressions
• Cloud platforms; AWS preferred
• Data warehousing; Redshift and Databricks preferred
• Agile project management
• Excellent communication and leadership skills
Nice To Haves:
• Databricks Genie or Microsoft Copilot
• Master's degree

Job Description

As a Software Engineer Manager, you will be responsible for leading the Analytics Engineering & Delivery function and for delivering scalable, performant, and insightful analytics solutions that support strategic and operational decision-making across the enterprise. This role manages a team of analytics engineers and oversees the development of enterprise dashboards, semantic models, and data pipelines across Fabric (Power BI), Tableau, and other supported platforms. The position partners closely with Data Architecture, Data Engineering, Governance, and Business Stakeholders to drive consistency, performance, and usability of business analytics products.

What You'll Do

  • Lead and manage the delivery team responsible for the design, development, and deployment of enterprise-grade analytics solutions using Fabric (Power BI), Tableau and other supported platforms
  • Manage the execution of key analytics initiatives including KPI frameworks, cross-platform data integration, visual cognition standards, and optimized data pipelines
  • Provide direct leadership to Analytics Engineers by prioritizing workload, tracking progress, and facilitating solution reviews for consistency and alignment
  • Serve as delivery owner for data analytics projects, collaborating with business stakeholders, product owners, and data platform teams to ensure on-time and high-quality delivery
  • Translate business needs into scalable analytical products with structured semantic models, reusable datasets, and governance-aligned data definitions
  • Support the adoption of version control practices and drive automation, reusability, and design consistency across reporting solutions
  • Oversee continuous improvement efforts including code reviews, performance tuning, and compliance with architectural best practices
  • Lead the team's engagement in agile delivery practices, backlog grooming, and sprint planning in coordination with strategy and capability leads
  • Partner with the BI Strategy & Business Partnership team to align delivery outcomes with enterprise data governance and user enablement goals
  • Contribute to the standardization of data modeling and visualization best practices rooted in analytical literacy, business relevance, and perceptual design principles

What You'll Need

Required:

  • Bachelor's degree in Engineering, Computer Science, Analytics, or related field with 7-10 years of experience
  • 5+ years leading business intelligence or analytics engineering teams delivering production-grade solutions using Fabric, Power BI, Tableau, and enterprise-scale data platforms
  • Deep understanding of semantic modeling, star and snowflake schema design, dimensional modeling, and visual analytics best practices
  • Strong hands-on experience with DAX, Level of Detail (LOD) expressions, advanced SQL, and programming languages such as Python or R, as well as data preparation tools including Alteryx, Power Query, and Tableau Prep
  • Experience with Cloud Data Platforms such as Azure, AWS, or GCP, including cloud-native storage and compute solutions
  • Experience with enterprise data warehousing platforms such as Databricks, Snowflake, Redshift, Synapse, or similar technologies
  • Experience with generative AI and natural language technologies such as Microsoft Copilot, Salesforce Einstein, Databricks Genie, or other LLM-driven analytics tools
  • Experience leading solution design and implementation using cloud data lakes, and version control tools like Git
  • Demonstrated success managing delivery pipelines and analytics squads in agile or hybrid environments
  • Proven ability to coordinate complex technical efforts across data engineering, governance, and business teams
  • Fluent in English with strong verbal and written communication skills, including executive reporting and stakeholder engagement
  • Familiarity with IBCS, cognitive design for analytics, and standardization frameworks

Preferred:

  • Master's degree with 5-7 years of related experience

Physical Demands

  • Ability to safely and successfully perform the essential job functions consistent with the ADA and other federal, state and local standards
  • Sedentary work that involves sitting or remaining stationary most of the time with occasional need to move around the office to attend meetings, etc.
  • Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard and monitor

Data-Science

94537 Fremont, California NEXTracker

Posted 3 days ago

Job Description

Job Description:

Nextracker is the leading provider of intelligent solar tracking systems, offering advanced solutions that optimize the performance and reliability of utility-scale solar power plants around the world. Our technology enables maximum energy yield, grid resilience, and operational efficiency, supporting a more sustainable energy future.

Key Responsibilities:
  • Assist in the development and evaluation of machine learning models for image-based object detection and tracking.
  • Preprocess and annotate image datasets for training and validation.
  • Implement and fine-tune deep learning architectures (e.g., CNNs, YOLO, Faster R-CNN) for real-time image recognition tasks.
  • Collaborate on model deployment pipelines using tools such as TensorFlow Serving, ONNX, or Docker.
  • Analyze model performance and contribute to iterative improvements.
  • Document findings, experiments, and best practices.
Preferred Qualifications:
  • Currently pursuing a degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
  • Strong foundation in Python and machine learning libraries (e.g., TensorFlow, PyTorch, OpenCV).
  • Familiarity with image processing techniques and computer vision concepts.
  • Experience with version control (Git) and cloud platforms (AWS, GCP, or Azure) is a plus.
  • Excellent problem-solving skills and a collaborative mindset.
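Evaluation work of the kind described above typically begins with box-level matching between predicted and ground-truth detections. A minimal pure-Python sketch, assuming a hypothetical `[x1, y1, x2, y2]` box format:

```python
def iou(a, b):
    # intersection-over-union of two axis-aligned boxes [x1, y1, x2, y2]
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def match_detections(preds, truths, thresh=0.5):
    # greedily match each predicted box to its best unmatched ground-truth
    # box at an IoU threshold; returns (true pos, false pos, false neg)
    unmatched = list(truths)
    tp = 0
    for p in preds:
        best = max(unmatched, key=lambda t: iou(p, t), default=None)
        if best is not None and iou(p, best) >= thresh:
            unmatched.remove(best)
            tp += 1
    return tp, len(preds) - tp, len(unmatched)

# hypothetical example: one near-hit (IoU 0.81) and one stray prediction
truths = [[0, 0, 10, 10], [20, 20, 30, 30]]
preds = [[1, 1, 10, 10], [50, 50, 60, 60]]
tp, fp, fn = match_detections(preds, truths)
```

Counts like these feed directly into precision/recall and mAP-style metrics used to "analyze model performance" in the responsibilities above.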


The hourly rate for this position is $45-50/hr

At Nextracker, we are leading in the energy transition, providing the most comprehensive portfolio of intelligent solar tracker and software solutions for solar power plants, as well as strategic services to capture the full value of solar power plants for our customers. Our talented worldwide teams are transforming PV plant performance every day with smart technology, data monitoring and analysis services.

For us at Nextracker, sustainability is not just a word. It's a core part of our business, values and our operations. Our sustainability efforts are based on five cornerstones: People, Community, Environment, Innovation, and Integrity. We are creative, collaborative and passionate problem-solvers from diverse backgrounds, driven by our shared mission to provide smart solar and software solutions for our customers and to mitigate climate change for future generations.

Nextracker is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Culture is our Passion

Data Science

22042 Falls Church, Virginia ClearanceJobs

Posted 3 days ago


Job Description

Data Science Engineer

Volt is immediately hiring for a Data Science Engineer in Falls Church, VA. As a Data Science Engineer, you will be responsible for:

  • MLOps, Model Engineering, Training on time series data.
  • After a few weeks of training on the customer's AI platform, develop AI-based applications on that platform for operational cloud-deployed and secure lab-deployed environments.
  • Develop candidate models that are promoted to active models when their performance meets the threshold.
  • Train, validate, and deploy machine learning pipelines.
  • Test, troubleshoot, and enhance customer AI-based applications based on feedback.
  • Manage individual project deliverables.
  • Identify application performance bottlenecks and implement optimizations.
  • Write application specifications and documentation.
  • Articulate methodologies, experiments, and findings clearly and in an actionable way.
  • Work in a fast-paced environment.

This is a 12-month contract opportunity. Security clearance: an active Secret clearance is required.

The ideal candidate will have:

  • Bachelor's degree in a Science, Technology, Engineering, or Mathematics (STEM) field, or a comparable area of study.
  • 5+ years of Data Science development experience using Python.
  • Proficiency in data science, machine learning, and analytics, including statistical data analysis, model and feature evaluations.
  • Strong proficiency in NumPy and pandas.
  • Demonstrated skills with Jupyter Notebook or comparable environments.
  • Practical experience in solving complex problems in an applied environment, and proficiency in critical thinking.
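Model engineering on time-series data, as listed above, usually starts with feature preparation in pandas. A minimal sketch over hypothetical hourly sensor readings (all names and values illustrative):

```python
import numpy as np
import pandas as pd

# hypothetical hourly sensor readings over two days
rng = pd.date_range("2024-01-01", periods=48, freq="h")
df = pd.DataFrame({"value": np.sin(np.arange(48) / 4.0)}, index=rng)

# lag and rolling-window features commonly fed to time-series models
df["lag_1"] = df["value"].shift(1)        # previous hour
df["lag_24"] = df["value"].shift(24)      # same hour yesterday
df["roll_mean_6"] = df["value"].rolling(6).mean()  # 6-hour trailing mean

# drop warm-up rows that lack a full feature set (the 24-hour lag dominates)
features = df.dropna()
```

A frame like `features` can then be split chronologically into train/validation sets for the candidate-model workflow described above.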

Qualifications We Prefer:

  • Degree in Data Science, Machine Learning, Computer Science, Engineering, Statistics, or equivalent fields.
  • Strong mathematical background (linear algebra, calculus, probability & statistics).
  • Experience with machine learning model training and analysis through open-source frameworks (PyTorch, TensorFlow, scikit-learn).
  • Experience crafting, conducting, analyzing, and interpreting experiments and investigations.
  • Experience with modern software development tools and practices (Git, pull requests).
  • Experience analyzing model performance with relevant metrics and optimizing.
  • Familiarity with AI agent frameworks.

Pay Rate: $80 - $90/hr

Volt offers benefits (based on eligibility) that include the following: health, dental, vision, term life, short term disability, AD&D, 401(k), Sick time, and other types of paid leaves (as required by law), Employee Assistance Program (EAP).

Volt is an Equal Opportunity Employer and prohibits any kind of unlawful discrimination and harassment. Volt is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment on the basis of race, color, religion or belief, national origin, citizenship, social or ethnic origin, sex, age, physical or mental disability, veteran status, marital status, domestic partner status, sexual orientation, or any other status protected by the statutes, rules, and regulations in the locations where it operates.

If you are an individual with a disability and need a reasonable accommodation to assist with your job search or application for employment, please email hr_dept@ or call (866) - .

Volt does not discriminate against applicants based on citizenship status, immigration status, or national origin, in accordance with 8 U.S.C. 1324b. The company will consider for employment qualified applicants with arrest and conviction records in a manner that complies with the San Francisco Fair Chance Ordinance, the Los Angeles Fair Chance Initiative for Hiring Ordinance, and other applicable laws.


Data Science

85670 Fort Huachuca, Arizona Peraton

Posted 4 days ago


Job Description



Data Science

Job Locations

US-AZ-Fort Huachuca

Requisition ID



Position Category

Cyber Security

Clearance

Top Secret/SCI

Responsibilities

Secure Division Support. The GCC provides CSSP responsibilities and conducts DODIN Operations and DCO - Internal Defensive Measures (IDM) to protect the DODIN IAW the DoDM and the DoD Cybersecurity Services Evaluator Scoring Metrics (ESM). These responsibilities are broken into five (5) CSSP functions: Identify, Protect, Detect, Respond, and Recover. GCC is responsible for conducting these functions for its assigned portion of the DODIN for both unclassified and classified networks/systems.

The division provides support services for the protection, monitoring, analysis, detection, and response to unauthorized activity within DoD Information Systems and Networks. DCO-IDM services are required to defend against unauthorized activity on all Army assets residing on the NIPRNet and SIPRNet. The division provides defensive measures to protect and defend information, computers, and networks from disruption, denial, degradation, or destruction, and provides sensor management and event analysis and response for network and host-based events.

For sensor management, the division provides management of in-line Network Intrusion Protection System/Network Intrusion Detection System (NIPS/NIDS) sensors monitoring all CONUS DoDIN-A NIPRNet and SIPRNet Enterprise traffic to detect sensor outages and activities that attempt to compromise the confidentiality, integrity, or availability of the network. In coordination with GCC Operations, DCO initiates defensive security procedures upon detection of these attacks. Event analysis and response includes the processes involved with reducing multiple cyber incidents to actual malicious threat determinations and mitigating those threats IAW guidance received from GCC Government leadership. Support the Government in providing CSSP services on both the NIPRNet and SIPRNet IAW Appendix E: Secure Division Workload Assessment in support of the CONUS portion of the DoDIN-A.
Develop reports and products, both current and long-term, in support of CSSP and course of action development. Prepare Tactics, Techniques, and Procedures (TTP), SOPs, Executive Summary (EXSUMS), trip reports, and information/point papers. Contribute during the preparation of agreements, policy, and guidance documentation such as Memorandums of Understanding / Agreement (MOU/A), Service Level Agreements (SLA).
Defensive Cyber Infrastructure (DCI) Support. Perform the following DCI functions:
  • Develop and distribute content provided by security platform vendors at least weekly and as needed.
  • Develop and distribute in-house content based on tippers from higher organizations and the Threat Hunt team.
  • Provide content development and distribution to tactical edge customers and develop TTPs for doing so.
  • Consolidate different data sources into a single view used to assess the status of a specific threat on the network.
  • Develop and/or maintain dashboards displaying specific CSSP items of interest (e.g., top 10 attackers, top 10 destinations, top attack vector) and all active cyber incidents, in near real time, within each respective AOR.
  • Maintain, update, test, and implement signatures and policies for each sensor managed by GCC.
  • Ensure changes are approved through the established ITIL process.
  • Develop signatures and policies that include both network- and host-based sensors.
  • Update as necessary to minimize false positives and validate for proper syntax.
  • Conduct all development and testing on isolated networks.
  • Document and conduct testing activity with plan procedures, results, and operational procedures as signatures are developed and/or updated.
  • Develop a signature test plan; update and validate the plan at least annually.

Qualifications

Basic Qualifications:

  • 8 years with BS/BA; 6 years with MS/MA; 3 years with PhD
  • Certifications: DCWF Code 521 Advanced: Certified Information Systems Security Professional (CISSP) or GIAC Certified Intrusion Analyst (GCIA) or GIAC Cloud Security Essentials (GCLD) or GIAC Defensible Security Architecture (GDSA) or GIAC Global Industrial Cyber Security Professional (GICSP) or GIAC Security Essentials Certification (GSEC) or Information Systems Security Architecture Professional (ISSAP) or Information Systems Security Engineering Professional (ISSEP)
  • Active TS/SCI Clearance
  • Ability to conduct vulnerability assessments and monitor networks to support test and operational environment requirements.
  • Solid understanding of data transport, encryption, networking, IT systems, and cybersecurity fundamentals.
Peraton Overview

Peraton is a next-generation national security company that drives missions of consequence spanning the globe and extending to the farthest reaches of the galaxy. As the world's leading mission capability integrator and transformative enterprise IT provider, we deliver trusted, highly differentiated solutions and technologies to protect our nation and allies. Peraton operates at the critical nexus between traditional and nontraditional threats across all domains: land, sea, space, air, and cyberspace. The company serves as a valued partner to essential government agencies and supports every branch of the U.S. armed forces. Each day, our employees do the can't be done by solving the most daunting challenges facing our customers. Visit peraton.com to learn how we're keeping people around the world safe and secure.

Target Salary Range

$66,000 - $106,000. This represents the typical salary range for this position. Salary is determined by various factors, including but not limited to, the scope and responsibilities of the position, the individual's experience, education, knowledge, skills, and competencies, as well as geographic location and business and contract considerations. Depending on the position, employees may be eligible for overtime, shift differential, and a discretionary bonus in addition to base pay.

EEO

EEO: Equal opportunity employer, including disability and protected veterans, or other characteristics protected by law.

Data Science

78716 Austin, Texas Syntricate Technologies

Posted 4 days ago


Job Description

Position: Data Science
Duration: Contract
Location: Austin, TX / Sunnyvale, CA

JD:
1. Experience in Machine Learning and Deep Learning, including regression, classification, neural networks, Natural Language Processing (NLP), and LLMs.
2. Extensive experience with time series.
3. Extensive experience with Natural Language Processing (NLP) libraries such as spaCy, NLTK, flair, and sklearn-crfsuite.
4. Strong background in DNN, CNN, RNN (LSTM), and GAN architectures, and in libraries used to build and deploy these models, such as scikit-learn, Keras, pandas, and TensorFlow.
5. Experience in Text Analytics, developing Statistical Machine Learning and Data Mining solutions to various business problems, and generating data visualizations using R and Python.
6. Experience with common data science toolkits and libraries such as pandas, NumPy, SciPy, and scikit-learn.
7. Experience with data exploration to find actionable insights and make Product Recommendations through Funnel Analyses, A/B testing, Churn analysis, User Segmentation, Retention Rate, and business KPIs.

Keywords: Data Science, Python, Time Series, NLP, LLM
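The classification and NLP experience the JD asks for can be illustrated with a small scikit-learn text classifier. A minimal sketch on a hypothetical toy corpus (labels and texts are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# tiny hypothetical corpus for a sentiment-style classifier (1 = positive)
texts = ["great product, works well", "terrible, broke in a day",
         "really love it", "awful experience, do not buy",
         "works perfectly, very happy", "worst purchase ever"]
labels = [1, 0, 1, 0, 1, 0]

# TF-IDF features (unigrams + bigrams) feeding a logistic regression
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["love this, works great"])[0]
```

The same pipeline pattern extends to the statistical ML and text-analytics work listed in points 1, 5, and 6.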

Regards,
Pallavi Verma
Sr. Technical Recruiter | Syntricate Technologies Inc.
Direct : |
Email : | Web:

We're hiring! Connect with us on LinkedIn and visit our Jobs Portal.

Minority Business Enterprise (MBE) Certified | E-Verified Corporation | Equal Employment Opportunity (EEO) Employer

This e-mail message may contain confidential or legally privileged information and is intended only for the use of the intended recipient(s). Any unauthorized disclosure, dissemination, distribution, copying or the taking of any action in reliance on the information herein is prohibited. Please notify the sender immediately by email if you have received this email by mistake and delete this e-mail from your system. You have received this email as we have your email address shared by you or from one of our data sources or from our member(s) or subscriber(s) list. If you do not want to receive any further emails or updates, please reply and request to unsubscribe .
 
