Data Engineer (Azure/Python/Pyspark) - remote

Zürich – Remote
This project has been archived and is unfortunately no longer active.
You can find open projects in our project board.

Keywords

Information Engineering, Microsoft Azure, Global Finance, Python, Azure Data Lake, SQL, Azure Data Factory, Apache Spark, PySpark, Accounting, Actuarial Science, Big Data, Partnerships, Continuous Integration, Software Design Patterns, DevOps, International Financial Reporting Standards, Software Development, Integration (Software), Databricks

Description

Please note that for this role we can only hire EU-27 citizens or candidates with an existing Swiss working permit/citizenship.

SuisseCo specializes in the international recruitment and placement of highly qualified IT specialists in Switzerland. We support our clients in implementing their IT projects and guarantee quick and flexible solutions of the highest quality.

For one of our clients in the insurance industry, we are currently looking for a Data Engineer.

- Start date: ASAP
- Duration: 12 months, with the possibility of an extension
- Location: remote within EU

About the project

The IFRS Global Finance Transformation Programme's Non-Standard Entities Product Area (NSE PA) is seeking an Azure Data Engineer to join our engineering team. The NSE PA is responsible for integrating the finance of all special case legal entities globally, such as new acquisitions and joint ventures. Our data engineering team uses Azure Databricks, Azure Data Factory, and Azure Data Lake (Delta Lake) to extract, load, and transform data from various source systems. We deliver this data in a harmonized model to the new global finance platform, built on SAP FPSL.

Essential Skills and Qualifications

  • 3+ years of experience in data engineering application development using Spark, preferably in an enterprise environment on Microsoft Azure
  • 5+ years of experience in building enterprise big data/data engineering applications using continuous integration tools like Azure DevOps
  • Strong understanding of reusable software design patterns
  • Experience in Azure Data Factory, Azure Data Lake and Azure Log Analytics
  • Design and implementation of production-grade solutions
  • Delivering high-quality code, focusing on simplicity, performance, maintainability and scalability
  • Practical experience in applying an agile way of working and in applying DevOps methodologies
  • Excellent analytical and conceptual skills to understand complex technology stacks and their dependencies

Desired Skills

  • Proficient in Python/PySpark and SQL (Spark SQL)
  • Enterprise system integration experience
  • Insurance know-how
  • Finance & accounting or actuarial knowledge

Start
May 2023
Duration
12 months
Posted by
SuisseCo GmbH
Posted on
31.03.2023
Project ID:
2575004
Contract type
Freelance
Work arrangement
100% remote