Technical Consultant, Specialist

Zürich, Zürich - Remote

Beschreibung

Itecor is an international IT consulting firm headquartered in Switzerland. At Itecor, we have been advising and supporting our clients for over twenty years to ensure their business systems are effective and aligned with business needs and economic realities. Our clients have come to rely on our work ethic, independence and rigorous approach. Our team of over 200 consultants benefits from the latest technology and market innovations, and we ensure our consultants are able to provide high value-added solutions that have proven their worth in the field.

To reinforce our team in the German-speaking part of Switzerland, we are looking for a Technical Consultant, Specialist (all genders).

Short Project Description: Engineering Data Hub (EDH)/Cyclone: As part of the "Cyclone" project, an executable basic architecture of an EDH (data-centric architecture) based on Apache Kafka (Confluent), together with components for metadata management, was created in previous projects. A number of systems, including KVS and RVS, have already been connected and exchange data via the EDH. Focus topics of the agile development team until the end of 2021:
- advice on data analysis regarding metadata management (concept and architecture) and its configuration in Apache Atlas and Ranger
- description/documentation of the application structure and processing logic according to the Definition of Done (DoD) and IT PEP
- execution of system and change tests: ensuring functionality in accordance with the defined test cases as part of the release process, code review, test review
- documentation of known errors within the release process
- preparation of the operational handover
- release stabilization
- service handover to 3rd level, incl. process and technical documentation and troubleshooting

The requested extension, the SoftwareBroker, consists of a connector to be developed as a software broker that collects data from source systems (e.g. Version42), processes the data, and makes it available for handover.
It is based on the existing design principles of the EDH.
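
For illustration only (not part of the project specification): a minimal sketch of what such a connector could look like using the Kafka Connect SourceTask API. The class name, topic name and configuration key are hypothetical placeholders, and the actual extraction logic depends on the source system.

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

import java.util.Collections;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of a Kafka Connect source task acting as a "software broker":
// it polls a source system (e.g. Version42) and hands records over to an EDH topic.
public class Version42SourceTask extends SourceTask {

    private String topic;

    @Override
    public void start(Map<String, String> props) {
        // Topic name is an assumed placeholder, configured via the connector properties.
        topic = props.getOrDefault("edh.topic", "edh.version42.raw");
    }

    @Override
    public List<SourceRecord> poll() throws InterruptedException {
        // In a real connector this would read from the source system's API or change log.
        String payload = fetchNextRecordFromSource();
        if (payload == null) {
            return Collections.emptyList();
        }
        SourceRecord record = new SourceRecord(
                Collections.singletonMap("source", "version42"),                // source partition
                Collections.singletonMap("offset", System.currentTimeMillis()), // source offset
                topic,
                Schema.STRING_SCHEMA,
                payload);
        return Collections.singletonList(record);
    }

    private String fetchNextRecordFromSource() {
        // Placeholder only: the real extraction logic depends on the source system.
        return null;
    }

    @Override
    public void stop() { }

    @Override
    public String version() {
        return "0.1.0";
    }
}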

Timeframe: 01.10.2021 - 31.12.2021
Effort: 100%
Delivery Location: remote
Language: Mixed (German, English)

Required Key Skills:
Data-centric architecture
e.g. connector configuration, cluster management, metadata management
e.g. Apache Kafka, NiFi, Confluent Kafka, Ranger

Profile & Skills Description:
Scrum, SAFe (incl. events, workflows, etc.) - Jira/Confluence

Product/components/capabilities:
- Apache Kafka/Kafka Connect/build Kafka connectors
- Apache Kafka/Kafka Streams/setup, configuration, integration with apps (see the sketch after this list)
- NiFi/NiFi/implement data ingestion, connection, transformation
- Confluent Kafka/KSQL/setup, configuration of the event streaming database
- Apache Kafka/authentication/configuration of the Kafka authentication module, integration into the authentication ecosystem (e.g. Kerberos, Ranger)
- Kerberos/Kerberos/setup, configuration of Kerberos
- Apache Kafka/Kafka Manager/setup, configuration
- Confluent Kafka/Control Center/setup, configuration
- Apache/ZooKeeper/setup, configuration as a critical element in the Kafka & Hadoop ecosystem, configuration of e.g. leading broker & follower
- Apache Kafka/Schema Registry/understand, implement schemas, schema evolution
- Confluent Kafka/Confluent Registry/understand, implement schemas, schema evolution
- Apache Atlas/Apache Atlas/understand, implement metadata tagging (e.g. classification)
- Apache Atlas/ecosystem integration
- Ranger/data security for Hadoop, Kafka
- MapR Event Store/setup, configuration
- Flink/streaming analytics/setup, configuration
- Rancher
- container platform
- Red Hat OpenShift
- Prometheus/metric monitoring
- InfluxDB
- ELK (Elasticsearch, Kibana)
- Grafana/dashboard build
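
To illustrate the Kafka Streams item in the list above (setup, configuration, integration with apps), here is a minimal topology sketch. The application id, broker address, topic names and the trivial processing step are assumed placeholders, not values from the project.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

// Minimal Kafka Streams application: reads raw source events, applies a simple
// transformation and writes the result to a processed topic.
public class EdhStreamsSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Application id and broker address are hypothetical placeholders.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "edh-streams-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("edh.version42.raw");
        raw.mapValues(value -> value.trim())   // placeholder processing step
           .to("edh.version42.processed");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}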

Basic knowledge:
- Docker
- Kubernetes
- AWS
- OpenShift cloud
- HashiCorp/certificates, identity/integration knowledge only

Optional (not yet used within the project):
- Hadoop/Ambari/setup, configuration
- Collibra/connector, Jobserver/setup, configuration (requires a contract with Collibra and is very high-priced)
- Lenses
- Alation
- Spark/streaming analytics/setup, configuration
- Hadoop/Hive/setup, configuration
- Hadoop/HDFS/setup, configuration
- Data Fabric/filesystem/setup, configuration
- Data Fabric/MapR DB/setup, configuration
- MongoDB

With offices in five countries (Switzerland, France, Spain, Macedonia, and Mexico), Itecor provides the opportunity to grow in a multi-cultural environment. Each consultant is part of a practice where they are regularly encouraged to acquire new skills, develop expertise and share what they have learned. We are looking forward to your application.
Start
10.2021
Duration
3 months
(extension possible)
From
Itecor Schweiz AG
Posted
22.09.2021
Contact person:
Lindsay Rutz
Project ID:
2211112
Contract type
Freelance
Type of engagement
100% remote