Role Responsibilities:
Model and Data Pipelines:
• Develop automated processes for large-scale data pipelines, model development, operationalization, and model monitoring.
• Develop and automate the MLOps pipeline.
• Deploy models on low-code environments and develop integrations
• Take responsibility for production issues, perform root cause analysis, and recommend changes to reduce or eliminate recurrence of issues.
• Accurately apply technical job knowledge and skills to complete all work in a timely manner, in accordance with policies, procedures, and regulatory requirements
Automate and Improve Efficiency:
• Automate monitoring of models for both accuracy degradation and failures
• Optimize deployment and change control processes for models
• Automate logging of model usage and the predictions served
Research, Evolve, and Publish Best Practices:
• Recommend model refinements to optimize cloud spend
• Enrich existing ML frameworks and libraries
• Research and operationalize technology and processes necessary to scale ML models
• Research and recommend best practices for new technologies, platforms, and services
• Improve ML pipeline documentation and understandability
Communication and Collaboration:
• Collaborate with technical teams such as data science, data development, development, and platform
• Share knowledge with the broader analytics team and stakeholders
• Communicate regularly on ongoing work to support the remote, cross-geography culture
• Align on key priorities and focus areas
• Communicate accomplishments, failures, and risks in a timely manner
Embrace a Learning Mindset:
• Continually invest in your own knowledge and skillset through formal training, reading, and attending conferences and meetups
Must-have technical skills and experience:
• Expertise in MLOps with at least 3 years of professional experience
• Software engineering skills and expertise in SQL
• ML skills: deep learning, neural network architectures, NLP, and graph ML
• Knowledge of ML and AI frameworks
• Knowledge of GCP Vertex AI
• At least 5 years of professional experience in data science, MLOps, or a related field
• Professional experience with a major cloud computing platform such as GCP or Azure
• Understanding of AutoML, ensemble models, and their pipelines
• Strong communication skills, both verbal and written, including the ability to interact effectively with colleagues of varying technical and non-technical backgrounds.
• Passionate about agile software processes, data-driven development, reliability, and systematic experimentation.
• Passion for learning new technologies and solving challenging problems.
Good to have skills:
• GCP certification
• Understanding of CPG industry
• Basic understanding of dbt