Hadoop Big Data Engineer with Analytics - English-speaking contract in Basel

Basel - On-site
This project is archived and unfortunately no longer active.

Description

Our client in Basel, Switzerland, is seeking an experienced implementer of productive, business-rule- and metadata-driven Big Data solutions using Apache Hadoop and Spark. The successful candidate should have demonstrable experience implementing similar solutions, be capable of working independently, and be comfortable working within an agile project team. The ability to relate complex data problems to business users, and vice versa, is crucial.

The Job:

  • Interact with the architecture team to refine requirements
  • Work within the project team to implement solutions on the existing Hadoop platform
  • Work with platform engineers to ensure dependent components are provisioned
  • Provide input in defining the design/architecture of the envisaged solution
  • Implement business rules for streamlining data feed(s)
  • Implement a rule-based framework that abstracts complex technical implementation into reusable, generic components (see the sketch below)
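
As a rough illustration of the last point, the sketch below shows one way such a rule-driven, reusable component might look in Spark (Scala). The IngestionRule fields, column names, and rule expressions are hypothetical placeholders, not the client's actual framework:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Hypothetical rule definition: a named filter expression plus a column cast,
// standing in for the metadata that would normally drive the framework.
case class IngestionRule(name: String, filterExpr: String, castCol: String, castType: String)

object RuleDrivenIngestion {

  // Generic component: applies declarative rules to any input DataFrame,
  // so new feeds are onboarded by adding metadata rather than code.
  def applyRules(df: DataFrame, rules: Seq[IngestionRule]): DataFrame =
    rules.foldLeft(df) { (acc, rule) =>
      acc.filter(expr(rule.filterExpr))
        .withColumn(rule.castCol, col(rule.castCol).cast(rule.castType))
    }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("rule-driven-ingestion").master("local[*]").getOrCreate()
    import spark.implicits._

    // Toy feed standing in for a real data source.
    val feed = Seq(("2018-04-01", "100"), ("bad-date", "-5")).toDF("trade_date", "amount")

    // Rules that would normally be loaded from a metadata store.
    val rules = Seq(
      IngestionRule("valid_date", "to_date(trade_date, 'yyyy-MM-dd') IS NOT NULL", "trade_date", "date"),
      IngestionRule("positive_amount", "CAST(amount AS double) > 0", "amount", "double")
    )

    applyRules(feed, rules).show()
    spark.stop()
  }
}
```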

We are looking for candidates with:

  • Proven experience implementing solutions to process large amounts of data in a Hadoop ecosystem, utilising Apache Spark
  • Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a distributed data platform
  • Experience implementing complex event processing patterns (see the sketch after this list)
  • Strong programming and scripting skills, e.g. Java, Scala, Python, or R
  • Strong understanding of Hadoop technologies, e.g. MapReduce, HDFS, HBase, Hive, Sqoop, Flume, Kafka
  • Software engineering background, with experience using common SDLC tools and practices for agile development, ideally including Continuous Delivery
  • Experience with BRMS-driven solutions
  • Experience with master data management (MDM), including ontology curation and data cleansing
  • Experience with end-user notebook tools, e.g. Jupyter, Zeppelin
  • Ability to operate in a global team and coordinate activities with remote resources
  • Capable of documenting solutions so they can be reused
  • Skills in exploratory statistics, to relate to data analysts and statisticians
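
As a rough illustration of the complex event processing bullet above, here is a minimal Spark Structured Streaming sketch (Scala). The built-in rate test source, the sensor_id/reading columns, and the threshold condition are hypothetical stand-ins for a real event feed (e.g. Kafka):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal CEP-style pattern: flag sensors whose average reading breaches
// a threshold inside a sliding event-time window.
object ThresholdBreachDetector {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("cep-sketch").master("local[*]").getOrCreate()

    // The rate source is a built-in test stream emitting (timestamp, value) rows;
    // a real feed would come from Kafka or a similar source.
    val events = spark.readStream.format("rate").option("rowsPerSecond", "10").load()
      .withColumn("sensor_id", col("value") % 3) // hypothetical sensor key
      .withColumn("reading", rand() * 100)       // hypothetical measurement

    val breaches = events
      .withWatermark("timestamp", "30 seconds")
      .groupBy(window(col("timestamp"), "20 seconds", "10 seconds"), col("sensor_id"))
      .agg(avg("reading").as("avg_reading"))
      .filter(col("avg_reading") > 75)           // the "complex event" condition

    val query = breaches.writeStream.outputMode("update").format("console").start()
    query.awaitTermination()
  }
}
```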

Nice to have:

  • Macroeconomic domain knowledge
  • Experience in econometrics
  • Experience with data visualisation tools, e.g. Tableau

We look forward to receiving your CV.

Start
May 2018
Duration
3 months+ (extension possible; you could be there 3 years)
From
Stamford Consultants AG
Posted
14.04.2018
Project ID:
1538340
Contract type
Freelance