Our client, a governmental entity in KSA with a strong focus on banking and finance, is looking for a DataOps Engineer.
The ideal candidate will build, automate, and maintain scalable and secure data pipelines, MLOps workflows, and analytics infrastructure to support business and AI/ML initiatives.
Responsibilities
- Design, implement, and manage CI/CD pipelines for data engineering and AI/ML projects.
- Support data scientists and data engineers in deploying and operationalizing data and ML workflows.
- Manage infrastructure as code using Terraform, and containerized workloads using Docker and Kubernetes.
- Monitor system performance, ensure data pipeline health, and implement disaster recovery solutions.
- Automate workflows with tools such as Apache Airflow, Jenkins, or GitLab CI.
- Collaborate with data teams to ensure data quality, versioning, and governance.
- Integrate with platforms and tools such as Dataiku, Informatica, Apache Spark, Kafka, and Snowflake.
- Apply DataOps principles to design and optimize ETL/ELT pipelines and modern data architectures.
- Work with cloud platforms (AWS, Azure, or GCP) to deploy scalable data solutions.
- Utilize scripting languages such as Python and Bash, and configuration formats such as YAML.
- Ensure adherence to data security, governance, and compliance standards.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in DataOps, DevOps, or Data Engineering.
- Strong expertise in ETL/ELT, CI/CD pipelines, and workflow automation.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Proficiency in Python, Bash, Kubernetes, Docker, and Terraform.
- Familiarity with data platforms such as Informatica, Spark, Kafka, Dataiku, or Snowflake.
- Solid knowledge of data governance, security, and compliance.
- Excellent problem-solving and communication skills.