JobsAisle

GCP AI Engineer

flint-international

Riyadh, Saudi Arabia | SAR 25,000–33,333/mo | Today
Saudi Arabia | IT & Technology | Full Time

Skills Required

Python, SQL, Docker, Kubernetes, Git, DevOps, Machine Learning, ERP

Job Description

<div><p><b>Notice Period</b>: Immediate joiners only, or a notice period of no longer than 30 days.</p><p>Job Description</p><p>We are seeking a highly skilled and motivated Senior AI/ML Engineer with deep, hands‑on expertise in building custom AI models that can be deployed on‑premises and/or on Google Cloud Platform (GCP), bridging the gap between data science research and enterprise IT production.</p><p>In this role, you will be the architectural backbone of our AI practice. You will work across diverse AI fields – from traditional predictive analytics to cutting‑edge Large Language Models (LLMs) and computer vision engines – deploying them into robust, highly available, and secure microservices on‑premises and on GCP. Whether building a real‑time REST API to intercept medical claims in milliseconds or orchestrating massive batch‑scoring pipelines, your work will directly optimize billions of Riyals in healthcare operations.</p><p>Key Responsibilities:</p><ul><li>End‑to‑End ML Model Development and MLOps Pipelines: Design, develop, and implement production‑ready CI/CD pipelines on GCP, encompassing data ingestion, feature engineering, model training, evaluation, and scalable deployment.</li><li>GCP AI Architecture: Leverage and orchestrate the full GCP data stack to build Bupa Arabia’s AI infrastructure:</li><li>Data & Features: Build robust data pipelines and Feature Stores using BigQuery and Dataflow / Apache Beam.</li><li>Model Training & Registry: Train and version‑control models using Vertex AI Workbench and the Vertex Model Registry.</li><li>Deployment & Serving: Deploy low‑latency real‑time inference using Vertex AI Endpoints, and containerize lightweight deterministic rule engines using Cloud Run or Google Kubernetes Engine (GKE).</li><li>Orchestration: Schedule complex batch‑scoring workflows using Cloud Composer (Apache Airflow).</li><li>Hybrid Cloud AI Integration: Design architectures that securely bridge on‑premises data centers (Oracle/SQL) with GCP AI services using Cloud Interconnect, Apigee API gateways, or secure REST endpoints.</li><li>Data Anonymization & Security: Build on‑premises data masking and tokenization pipelines (removing PHI/PII) before sending stateless inference requests to cloud‑based LLMs.</li><li>API & System Integration: Wrap machine learning models in secure, high‑performance RESTful APIs (e.g., FastAPI/Flask) to integrate seamlessly with Bupa’s core claims processing engines (e.g., Care Connect) and API gateways.</li><li>Model Observability: Implement Vertex AI Model Monitoring to continuously track data drift, concept drift, and training‑serving skew, ensuring models adapt to changing healthcare billing behaviors.</li><li>Data Security & KSA Compliance: Architect AI solutions that strictly adhere to Saudi Arabian data sovereignty and healthcare regulations (SAMA, CHI, NDMO, PDPL). Implement Cloud DLP (Data Loss Prevention) and VPC Service Controls to dynamically mask and secure Protected Health Information (PHI) and National IDs.</li><li>Cross‑Functional Collaboration: Partner closely with Data Scientists, FWA Investigators, Medical SMEs, and Product Managers to translate clinical rules and business requirements into scalable technical solutions.</li></ul><p>Qualifications:</p><ul><li>Bachelor’s or Master’s degree in Computer Science, Software Engineering, Artificial Intelligence, or a related quantitative field.</li><li>3–5+ years of professional engineering experience, with a proven track record of taking ML models out of Jupyter notebooks and deploying them into production environments.</li><li>Deep, hands‑on mastery of Google Cloud Platform (GCP) for ML workloads is essential.</li><li>Strong proficiency in Python (OOP, modular design, unit testing) and relevant AI/ML libraries (TensorFlow, PyTorch, scikit‑learn, Pandas).</li><li>Experience with backend API development frameworks (FastAPI, Flask) for high‑throughput model serving.</li><li>Strong DevOps fundamentals: Docker containerization, Git version control, CI/CD tools (Cloud Build, GitHub Actions), and Infrastructure as Code (Terraform).</li><li>Solid understanding of machine learning evaluation metrics (Precision, Recall, ROC‑AUC) and the ability to evaluate algorithmic trade‑offs for specific business problems.</li><li>GCP Certifications: Pro
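The data-anonymization responsibility above — masking PHI/PII on-premises before stateless cloud inference — can be sketched in a few lines. This is a minimal, illustrative example, assuming (hypothetically) that Saudi national IDs are 10-digit numbers starting with 1 or 2; the salt, pattern, and token format are placeholders, not Bupa's actual pipeline:

```python
import hashlib
import re

# Hypothetical pattern: 10-digit national IDs beginning with 1 or 2.
NATIONAL_ID_RE = re.compile(r"\b[12]\d{9}\b")

def tokenize(value: str, salt: str = "demo-salt") -> str:
    """Replace an identifier with a deterministic, irreversible token."""
    return "TOK_" + hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def mask_phi(text: str) -> str:
    """Mask national IDs before the text leaves the on-prem boundary."""
    return NATIONAL_ID_RE.sub(lambda m: tokenize(m.group()), text)

claim_note = "Member 1023456789 submitted claim C-88 for review."
masked = mask_phi(claim_note)
print(masked)  # the national ID is replaced by a TOK_... token
```

Because the token is a salted hash, the same ID always maps to the same token (useful for joining records downstream) while the original value never leaves the data center.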
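The metrics named in the qualifications (Precision, Recall) can be computed from first principles. A small sketch with toy labels, where 1 marks a (hypothetical) fraudulent claim:

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0  # flagged claims that were real fraud
    recall = tp / (tp + fn) if tp + fn else 0.0     # real fraud that was flagged
    return precision, recall

# Toy data: six claims, three truly fraudulent, three flagged by the model.
y_true = [1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 0]
p, r = precision_recall(y_true, y_pred)
print(p, r)  # 2 of 3 flags correct, 2 of 3 frauds caught
```

The trade-off between the two is exactly the "algorithmic trade-offs for specific business problems" the posting mentions: in claims interception, a low-precision model floods investigators with false alarms, while low recall lets fraud through.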
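The model-observability bullet asks for tracking data drift. One common drift score is the Population Stability Index (PSI); the pure-Python sketch below is illustrative only — Vertex AI Model Monitoring computes its own drift statistics, and the sample values are made up:

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between a training sample and a serving sample."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[-1] = float("inf")  # catch serving values above the training max

    def bin_fracs(sample):
        counts = [0] * bins
        for x in sample:
            for i in range(bins):
                if edges[i] <= x < edges[i + 1]:
                    counts[i] += 1
                    break
        # floor empty bins to avoid log(0)
        return [max(c / len(sample), 1e-4) for c in counts]

    e, a = bin_fracs(expected), bin_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

train = [10, 12, 11, 13, 12, 11, 10, 13]   # e.g. claim amounts at training time
drifted = [x + 5 for x in train]           # serving distribution shifted upward
print(psi(train, train), psi(train, drifted))
```

A PSI near 0 means the serving distribution matches training; values above roughly 0.2 are a common (rule-of-thumb) trigger for retraining.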