Big Data Engineer

Vaud - On-site
This project is archived and unfortunately no longer active.

Description

Title: Big Data Engineer

Location: Lausanne, Switzerland

Role: Contract

We are looking for a candidate with 5+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.

The candidate should also have experience supporting and working with cross-functional teams in a dynamic environment.

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Work closely with the infrastructure team on optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, Sqoop, REST APIs, file-based feeds, and Kafka (a rough ETL sketch follows this list).
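
To give a flavour of the ETL work described above, here is a minimal sketch in Java using Spark SQL. All paths, connection details, and column names are hypothetical assumptions, and a production pipeline would add configuration, error handling, and incremental loads:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class EtlJobSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("etl-sketch")
                    .getOrCreate();

            // Extract from a file-based source (placeholder path).
            Dataset<Row> events = spark.read().parquet("hdfs:///data/raw/events");

            // Extract from a relational database via JDBC (placeholder connection details).
            Dataset<Row> customers = spark.read()
                    .format("jdbc")
                    .option("url", "jdbc:postgresql://db-host:5432/crm")
                    .option("dbtable", "customers")
                    .load();

            // Transform: enrich events with customer attributes and keep only
            // the columns downstream consumers need (hypothetical column names).
            Dataset<Row> enriched = events
                    .join(customers, "customer_id")
                    .select("customer_id", "event_type", "event_ts", "segment");

            // Load: write partitioned output for downstream consumption.
            enriched.write()
                    .mode("overwrite")
                    .partitionBy("event_type")
                    .parquet("hdfs:///data/curated/enriched_events");

            spark.stop();
        }
    }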

Qualifications for Data Engineer

  • Candidates must have a Java background and experience working on big data platforms.
  • Advanced SQL skills, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases.
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores (see the consumer sketch after this list).
  • Must have experience operating in an agile and DevOps environment.
  • Must have experience setting up continuous integration and deployment (CI/CD) processes.
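
As a rough illustration of the message-queuing and stream-processing experience asked for above, below is a minimal Kafka consumer sketch in plain Java. The broker address, consumer group, and topic name are placeholder assumptions, and real code would add offset management, deserialization of the actual record format, and error handling:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class StreamIngestSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "pipeline-ingest");         // placeholder group
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events")); // placeholder topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // A real pipeline would transform and load each record;
                        // here we just log what was consumed.
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }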

Technical Skills

  • Experience with Java and big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with Bash scripting, Git, SVN, Jenkins, and deployment tools such as Nolio.
  • Experience with object-oriented/functional programming languages: Python, Java, C++, Scala, etc.
Start: not specified
Duration: 6 months
From: TomGandhi Consulting Limited
Posted: 22.10.2019
Project ID: 1838425
Contract type: Freelance