We are seeking an experienced MLOps / AI Ops Engineer to join our newly established DevOps & Run Team in the Data & AI CoE in Prague. Shape the future of enterprise AI/LLMOps, collaborate across borders, drive innovation, and grow with a competitive package and continuous learning.
The tasks you will perform
AI/ML & LLMOps Pipelines
- Build and maintain CI/CD pipelines using Azure DevOps, Databricks Repos, and Asset Bundles
- Automate model packaging, registration, and promotion with MLflow and Unity Catalog
- Operationalize open-source and vendor LLMs (e.g., Llama, Azure OpenAI, Anthropic)
- Support RAG pipelines with vector stores (Databricks, Azure AI Search, pgvector, FAISS)
- Contribute to internal Python libraries and tooling
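For illustration, the model-promotion step mentioned above often comes down to a champion/challenger gate. Below is a minimal, framework-agnostic sketch; the function name, metric keys, and thresholds are hypothetical, not NN's actual setup:

```python
# Hypothetical promotion gate: compare a challenger model's offline
# metrics against the current champion before registering/promoting it.
# Names, metrics, and thresholds are illustrative only.

def should_promote(champion: dict, challenger: dict,
                   min_gain: float = 0.01,
                   max_latency_ms: float = 200.0) -> bool:
    """Promote only if the challenger beats the champion's AUC by a
    margin and stays within the serving latency budget."""
    gain = challenger["auc"] - champion["auc"]
    return gain >= min_gain and challenger["p95_latency_ms"] <= max_latency_ms

champion = {"auc": 0.81, "p95_latency_ms": 150.0}
challenger = {"auc": 0.84, "p95_latency_ms": 170.0}

if should_promote(champion, challenger):
    # In a real pipeline, this is where the new version would be
    # registered (e.g., via MLflow) and given a Unity Catalog alias.
    print("promote")
else:
    print("hold")
```

In practice, a gate like this would run as a CI/CD stage, with metrics pulled from an experiment tracker rather than hard-coded.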
Monitoring & Reliability
- Implement observability for data drift, model performance, and agent workflows
- Integrate with Azure Monitor, Prometheus, Grafana, and Databricks dashboards
- Build alerting and automated remediation for failing models and endpoints
- Define and track SLOs/SLIs for uptime, latency, and retraining cadence
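As a taste of the SLO/SLI work above, here is a toy error-budget calculation for a serving endpoint. The numbers are made up; in practice the SLIs would come from Azure Monitor or Prometheus:

```python
# Illustrative SLO/error-budget math for a model-serving endpoint.
# All figures are invented for the example.

def availability_sli(total_requests: int, failed_requests: int) -> float:
    """Fraction of successful requests over the measurement window."""
    return (total_requests - failed_requests) / total_requests

def error_budget_remaining(sli: float, slo: float) -> float:
    """Share of the error budget still unspent (can go negative)."""
    allowed = 1.0 - slo          # e.g., 0.1% failures for a 99.9% SLO
    burned = 1.0 - sli
    return 1.0 - burned / allowed

sli = availability_sli(1_000_000, 400)   # 99.96% availability
print(round(error_budget_remaining(sli, slo=0.999), 2))  # 0.6 → 60% budget left
```

An alerting rule would then fire when the remaining budget drops below some threshold, rather than on every individual failure.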
Explainability & Responsible AI
- Apply SHAP and MLflow tools for interpretability
- Ensure traceability of LLM outputs and agent decisions
- Build dashboards aligned with GDPR, EU AI Act, and ethical AI guidelines
- Enable human-in-the-loop reviews for critical use cases
- Enforce policy-as-code guardrails across pipelines
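The drift monitoring that feeds such dashboards can be as simple as a Population Stability Index (PSI) check between a baseline and a live feature distribution. A sketch with illustrative buckets and thresholds:

```python
import math

# Sketch of a drift check using the Population Stability Index (PSI),
# one common way to quantify data drift. Buckets, values, and the
# threshold below are illustrative.

def psi(expected: list[float], actual: list[float]) -> float:
    """PSI over pre-bucketed distributions (each list sums to 1)."""
    eps = 1e-6  # avoid log(0) for empty buckets
    return sum(
        (a - e) * math.log((a + eps) / (e + eps))
        for e, a in zip(expected, actual)
    )

baseline = [0.25, 0.25, 0.25, 0.25]
live     = [0.20, 0.30, 0.25, 0.25]

score = psi(baseline, live)
# Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate, > 0.25 major drift.
print("drift" if score > 0.25 else "stable")
```

A scheduled job would compute this per feature and raise an alert or trigger retraining when the score crosses the chosen threshold.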
Platform Operations
- Optimize Databricks clusters, Delta Live Tables, Feature Store, and Unity Catalog
- Manage hybrid inference setups across Databricks, AKS, and external APIs
- Apply FinOps practices to manage GPU/CPU usage efficiently
- Build reusable Terraform/Bicep modules for infrastructure provisioning
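The FinOps angle above often starts with something very simple: attributing compute hours to workloads and flagging the ones that blow their budget. A toy sketch with made-up rates and job names (not real Databricks pricing):

```python
# Toy FinOps check: flag workloads whose estimated compute spend
# exceeds a per-job budget. Rates and job names are hypothetical.

GPU_RATE = 3.50   # assumed $ per GPU-hour
CPU_RATE = 0.40   # assumed $ per CPU-hour

def job_cost(gpu_hours: float, cpu_hours: float) -> float:
    return gpu_hours * GPU_RATE + cpu_hours * CPU_RATE

def over_budget(jobs: dict[str, tuple[float, float]], budget: float) -> list[str]:
    """Return names of jobs whose estimated cost exceeds the budget."""
    return [name for name, (gpu, cpu) in jobs.items()
            if job_cost(gpu, cpu) > budget]

jobs = {
    "nightly-retrain": (10.0, 4.0),   # 10 GPU-h + 4 CPU-h -> $36.60
    "batch-scoring":   (0.0, 50.0),   # $20.00
}
print(over_budget(jobs, budget=30.0))  # ['nightly-retrain']
```

In a real setup, the usage figures would come from cluster billing exports rather than a hard-coded dict, and the report would feed a cost dashboard.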
Collaboration & Improvement
- Support AI Engineers and Data Scientists in productionizing experiments
- Align with DevOps Engineers on shared infrastructure and tooling
- Mentor squads on MLOps, LLMOps, and AgenticOps
- Evaluate and integrate emerging frameworks and tools
What we expect from you
Must-Haves
- Bachelor's/Master's degree in Computer Science, Data/AI Engineering, or a related field
- 4+ years in IT, ideally with a focus on data, machine learning, and/or operations
- 2+ years of hands-on experience with Python and Azure
- 1+ year of experience in MLOps (running ML models in production)
- 6+ months of experience in AIOps (operating LLM-based applications in production)
- 1+ year working with Databricks
- Familiarity with CI/CD, MLflow, Unity Catalog, and containerized inference
- Advanced English proficiency for working in an international environment
Nice to Have
- Experience running pre-trained LLMs/SLMs in production environments
- Understanding of observability, explainability, and responsible AI principles
- Exposure to hybrid inference setups and FinOps practices
- Familiarity with agent orchestration tools like LangChain, LangGraph, or Mosaic AI
- Knowledge of SHAP, GDPR/EU AI Act compliance, and policy-as-code frameworks
- Experience with vector databases (e.g., FAISS, pgvector, Azure AI Search)
Perks of joining NN
- We allow you to work wherever you feel most comfortable, whether in the office or from home, and we contribute to your home office expenses every month.
- We understand the importance of work-life balance, which is why we offer 5 weeks of vacation, 5 well-being days, additional paid time off for personal and family events, and 1 volunteering day to support our community.
- In addition to your base salary, you will receive a lump-sum meal allowance, up to CZK 20,000 per year in the Cafeteria, the option of a MultiSport card, contributions to supplementary pension insurance / supplementary pension savings, and a discount on life insurance.
- We believe that your professional and personal growth is crucial, which is why we provide tailor-made professional training.
- Your friends and acquaintances are a valuable source of talent for us, which is why we offer up to CZK 60,000 as a reward for recommending a suitable candidate.
- A business laptop and an iPhone with a paid O2 tariff and data package are basic tools for your work.