Description
For our banking client in Zurich, we are looking for a
Hadoop Administrator
Start: ASAP
Location: Zurich
Duration: initial contract of 12 months
In this project, you will help to deliver, maintain and support a growing platform which serves multiple areas of the bank, including some very exciting and cutting-edge projects with board-level sponsorship.
Your key responsibilities are:
- Engineering of data pipelines (primarily batch, increasingly intra-day/near-real-time)
- Integration and evaluation of new big data and data science technologies
- Development of platform components
- Consultation to application groups on how best to utilise the platform and technologies
- Engagement with groups for PoCs and full platform on-boarding
Key skills/experience/knowledge needed:
- Detailed knowledge of core Hadoop services
- Experience with Kerberos enabled clusters
- Excellent communication skills
- Experience with streaming services (Kafka, Flume)
- Batch ETL (Hive/PIG/Spark)
- Scripting experience
- Cloudera/Hortonworks
Are you the person we are looking for? Or do you have these skills but maybe with a different focus? (Hadoop, DevOps, Big Data)
We have three positions open on this team, so don't hesitate to contact us for more information.
Looking forward to hearing from you!
Michael Bailey Associates
Beheshta Saya
Michael Bailey International is acting as an Employment Business in relation to this vacancy.