Senior Data Scientist for eComm Analytics

Company: Dice
Location: Little Ferry
Closing Date: 03/11/2024
Salary: £125 - £150 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Ravh IT Solutions, is seeking the following. Apply via Dice today!


Senior Data Scientist for eComm Analytics

Bentonville, Arkansas OR Remote

Long Term Contract

Must-Have Skills:

  1. Python - 6+ years of experience
  2. PySpark - 6+ years of experience
  3. PyTorch - 6+ years of experience
  4. Google Cloud Platform - 3+ years of experience
  5. Web development - 3+ years of prior experience
  6. Docker - 4+ years of experience
  7. Kubeflow - 4+ years of experience
  8. Domain experience (if any) - Retail experience

Description:

The client is looking for a highly energetic and collaborative Senior Data Scientist with experience building enterprise-level GenAI applications and designing and developing MLOps pipelines. The ideal candidate has a deep understanding of the NLP field, hands-on experience designing and developing NLP models, and experience building LLM-based applications. Excellent written and verbal communication skills and the ability to collaborate effectively with domain experts and the IT leadership team are key to success in this role. We are looking for candidates with expertise in Python, PySpark, PyTorch, LangChain, Google Cloud Platform, web development, Docker, Kubeflow, etc.
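
The role calls for hands-on design and development of NLP models in PyTorch. As a rough illustration only, here is a minimal sketch of a PyTorch text classifier; the class name, layer choices, and hyperparameters are hypothetical and not taken from the posting.

```python
# Illustrative only: a minimal PyTorch text classifier of the kind the
# "design and development of NLP models" requirement refers to. Names and
# hyperparameters are hypothetical.
import torch
import torch.nn as nn


class TextClassifier(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 256, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        embedded = self.embedding(token_ids)      # (batch, seq, embed_dim)
        _, (hidden, _) = self.encoder(embedded)   # hidden: (1, batch, hidden_dim)
        return self.classifier(hidden[-1])        # (batch, num_classes)


# Smoke test with a batch of two dummy sequences of length 10.
model = TextClassifier(vocab_size=10_000)
logits = model(torch.randint(1, 10_000, (2, 10)))
print(logits.shape)  # torch.Size([2, 2])
```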

Key Responsibilities:

  1. Work with the client's AI/ML Platform Enablement team within the eCommerce Analytics team. The broader team is currently on a transformation path, and this role will be instrumental in enabling the broader team's vision.
  2. Work closely with other Data Scientists to help productionize models and maintain them in production.
  3. Deploy and configure Kubernetes components for the production cluster, including API Gateway, Ingress, Model Serving, Logging, Monitoring, Cron Jobs, etc. Improve the model deployment process for ML engineers with faster builds and simplified workflows (a minimal pipeline sketch follows this list).
  4. Be a technical leader on various projects across platforms and a hands-on contributor to the entire platform's architecture.
  5. Lead operational excellence initiatives in the AI/ML space, including efficient use of resources, identifying optimization opportunities, forecasting capacity, etc.
  6. Design and implement different flavors of architecture to deliver better system performance and resiliency.
  7. Develop capability requirements and a transition plan for the next generation of AI/ML enablement technology, tools, and processes to enable Walmart to efficiently improve performance at scale.
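
Responsibility 3 touches on Kubeflow-based model deployment. Purely as an illustration of that workflow, the sketch below uses the Kubeflow Pipelines v2 SDK (kfp) to define and compile a two-step pipeline; the component names, bucket path, and steps are hypothetical placeholders, not the client's actual pipeline.

```python
# Illustrative only: a minimal Kubeflow Pipelines (kfp v2) pipeline.
# Component names and URIs are hypothetical; a real pipeline would add
# training, serving, and monitoring steps.
from kfp import compiler, dsl


@dsl.component
def prepare_data(source_uri: str) -> str:
    # Placeholder: in practice this would read from GCS/BigQuery and
    # return the location of the processed dataset.
    return f"{source_uri}/processed"


@dsl.component
def train_model(dataset_uri: str) -> str:
    # Placeholder: in practice this would launch a PyTorch training job.
    return f"{dataset_uri}/model"


@dsl.pipeline(name="ecomm-mlops-demo")
def training_pipeline(source_uri: str = "gs://example-bucket/raw"):
    data_task = prepare_data(source_uri=source_uri)
    train_model(dataset_uri=data_task.output)


if __name__ == "__main__":
    # Produces a pipeline spec that can be uploaded to a Kubeflow deployment.
    compiler.Compiler().compile(
        pipeline_func=training_pipeline, package_path="training_pipeline.yaml"
    )
```

The compiled YAML could then be uploaded to a Kubeflow Pipelines deployment, for example on GKE given the Google Cloud Platform requirement above.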

Tools/Skills (hands-on experience is a must):

  1. Ability to transform designs from the ground up and lead innovation in system design.
  2. Deep understanding of GenAI applications and the NLP field.
  3. Hands-on experience in the design and development of NLP models.
  4. Experience in building LLM-based applications.
  5. Design and development of MLOps pipelines.
  6. Fundamental understanding of parametric and non-parametric data science algorithms.
  7. Knowledge of AI/ML application lifecycles and workflows.
  8. Experience in the design and development of ML pipelines using containerized components.
  9. Experience with at least one managed Kubernetes offering (EKS/GKE/AKS) or on-prem Kubernetes (native Kubernetes, Gravity, MetalK8s).
  10. Programming experience in Python, PySpark, PyTorch, LangChain, Docker, Kubeflow.
  11. Ability to use observability tools (Splunk, Prometheus, and Grafana) to inspect logs and metrics and diagnose issues within the system (see the instrumentation sketch after this list).
  12. Experience with web development.
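
Item 11 mentions Prometheus and Grafana. As an assumed example of what that instrumentation might look like in Python, the sketch below exposes request-count and latency metrics from a model-serving loop using the prometheus_client library; the metric names and the predict() stub are hypothetical.

```python
# Illustrative only: exposing Prometheus metrics from a model-serving loop so
# request counts and latencies can be scraped and charted in Grafana.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Total prediction requests served")
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency in seconds")


def predict(features: list[float]) -> float:
    """Stand-in for a real model call."""
    time.sleep(random.uniform(0.01, 0.05))
    return sum(features)


@LATENCY.time()
def handle_request(features: list[float]) -> float:
    PREDICTIONS.inc()
    return predict(features)


if __name__ == "__main__":
    start_http_server(8000)  # metrics exposed at http://localhost:8000/metrics
    while True:
        handle_request([random.random() for _ in range(4)])
```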

Education & Experience:

6+ years of relevant experience in roles with responsibility for data platforms and data operations, handling large volumes of data in cloud-based distributed computing environments. A graduate degree in a quantitative discipline (e.g., computer engineering, computer science, economics, math, operations research) is preferred. Proven ability to solve enterprise-level data operations problems at scale that require cross-functional collaboration for solution development, implementation, and adoption.
