Description
Spark Developer/Engineer - Java/Scala/Big Data - with strong analytical skills wanted for our Basel-based client.
Your experience/skills:
- Proven experience in implementing solutions to process large amounts of data in a Hadoop ecosystem utilising Apache Spark; in-depth experience with Hadoop technologies such as MapReduce, HDFS, and HBase is therefore mandatory
- Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a Big Data platform
- Excellent programming and scripting skills in Java, C/C++, Scala, Bash, R, and Python
- Experience with Master Data Management (MDM), including ontology curation and data cleansing, as well as common SDLC tools and Agile practices, including Continuous Delivery
- Skills in exploratory statistics, enabling effective collaboration with data analysts and statisticians
- Languages: fluent English both written and spoken
Your tasks:
- Interacting with the architecture team to refine requirements
- Cooperating with the project team to implement solutions on the existing Hadoop platform
- Assisting the Platform Engineer to ensure that dependent components are provisioned
Start: ASAP
Duration: 3-6 months, with possible extension
Location: Basel, Switzerland
Ref.Nr.: BH 11362
Does this sound like an interesting and challenging opportunity to you? Then take the next step by sending us your CV as a Word document and a contact telephone number.
Due to work-permit restrictions, we can unfortunately only consider applications from EU or Swiss citizens, as well as current holders of a Swiss work permit.
Going the extra mile
New to Switzerland? In case of successful placement, we support you with:
- All administrative questions
- Finding an apartment
- Health and social insurance
- Work permit and much more