
Role: Data Infrastructure / Platform Engineer (External Consultant)

Location: Belgrade (Hybrid) 

Job Overview 

We are looking for a skilled Data Infrastructure / Platform Engineer to join our Data Infrastructure team. In this role, you will help design, operate, and secure a modern data platform supporting large-scale data processing, streaming, and analytics workloads. 

You will work closely with DevOps, Data Engineering, and Security teams to ensure the platform is reliable, scalable, and secure. 

What You’ll Do 

Platform & DevOps Engineering 

  • Deploy and operate Kubernetes clusters using Rancher 
  • Configure Kubernetes networking (e.g., Calico) 
  • Automate deployments using Helm and GitLab CI/CD 
  • Provision and manage infrastructure with Terraform and GitOps workflows 
  • Build automation scripts, agents, or background services 
  • Develop and support internal tools and lightweight GUIs for platform control and monitoring 
  • Ensure high availability, scalability, and security of the platform 
  • Operate and optimize Ceph distributed storage 
  • Implement observability solutions (Prometheus, Grafana, ELK) 

Data Engineering & Data Lake Operations 

  • Operate and support Big Data technologies including:
      • Spark (PySpark / Scala), Spark Streaming
      • Airflow, NiFi, Trino
      • Kafka
  • Support Zeppelin and Jupyter notebook environments 
  • Maintain complex data ingestion, transformation, and streaming pipelines 
  • Troubleshoot distributed systems and workflow orchestration 
  • Optimize compute and storage resource usage across the data platform 

Security & Governance 

  • Integrate platform security using Keycloak, RBAC, certificates, and secrets management 
  • Apply DevSecOps practices and cluster hardening 
  • Validate and continuously improve platform security controls 

What We’re Looking For 

Required Skills 

  • Strong experience with Kubernetes, Rancher, Helm 
  • Hands-on experience with GitLab CI/CD, Terraform, GitOps 
  • Solid Linux administration and shell scripting skills 
  • Experience with Big Data technologies: Spark, Airflow, NiFi, Kafka, Trino, Zeppelin
  • Understanding of Data Lake architectures and ingestion patterns 
  • Proficiency in Python or Scala 
  • Experience with Ceph or similar distributed storage systems 

Nice to Have 

  • Experience building internal tools, agents, or background services 
  • Experience developing lightweight GUIs for automation or monitoring
  • Experience with Jupyter 
  • Knowledge of HashiCorp Vault or Consul 
  • Experience with streaming workloads (Spark Streaming, Kafka Streams) 

Why Join Us 

  • Work on a modern, large-scale data platform 
  • Collaborate with experienced engineers in an Agile environment 
  • Hybrid working model in Belgrade 
  • Opportunity to make a real impact on data infrastructure and platform reliability 


Apply For This Job

clearclickdigital@gmail.com
https://zenithss.com

