Hadoop Big Data Administrator - English speaking contract in Zurich

Zürich ‐ On-site
This project is archived and unfortunately no longer active.

Description

For a global bank based in Zurich, we are searching for a Hadoop Big Data Administrator to join a large team and an exciting project. The team comprises 30 staff spread across five countries.

Overview of business area or project:

The project is set within the Compliance business domain to create a 'single client view' (SCV) available to compliance employees working in the Wealth Management business segment (EMEA, APAC, and CH). The project will enable transparency through a single holistic client view and also enhance Politically Exposed Person (PEP) and associated capabilities.

Key Responsibilities:

  • Strategic planning of the Environment team book of work, and working with application developers and IT operations to support customer-specific production and non-production environments.
  • Manage the provision of global development, UAT, and PROD environments for both internal clients (developers, QA team) and external clients (other systems that hook up to department resources).
  • Manage escalation issues within the team and the wider department, and take responsibility for preventing recurring incidents.
  • Define processes for access rights for specific systems.
  • Improve monitoring of systems and hardware in order to minimise future incidents.
  • Provision of BAU support for development and test teams
  • Manage project book of work. Projects range from adoption of new technologies to managing DR tests and patching cycles.
  • Manage the code release of components from Development through UAT to Production
  • Manage, in conjunction with project managers, the release of components/configuration changes from UAT to Production.
  • Liaise with infrastructure teams where required - across Compute, Storage, Database and Web platforms
  • Work closely with Change Management and Quality Assurance functions on production release cycle, ensuring environments are available and correctly configured for testing.
  • Work on process automation to improve the efficiency of both environment management and release management process.
  • Create and maintain documentation on the environment processes.

Essential Skills and Qualifications:

  • Experience in Hadoop Administration & Big Data Technologies.
  • Experience installing, configuring, supporting, and managing Hadoop clusters using Apache or Cloudera (CDH4, CDH5) Hadoop distributions.
  • Installing and configuring Hadoop ecosystem services such as HDFS, YARN, Pig, Hive or Impala, HBase, Sqoop, ZooKeeper, Oozie, Flume, and Hue.
  • Adding and removing nodes in an existing Hadoop cluster.
  • UNIX administration skills to set up and support the Cloudera environments/clusters.
  • Experience in setting up any Big Data analytical tools would be an added advantage (Palantir/Anaconda Python/scikit-learn/SQLite/PySpark).
  • Basic UNIX Scripting knowledge is essential.
  • Should be able to communicate and coordinate with various core teams (Oracle, UNIX, Windows, Network, Security, Business).
  • Should be willing to adapt to the Credit Suisse environment and processes.
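As a rough illustration of the "basic UNIX scripting" the role calls for in day-to-day monitoring and BAU support (a hypothetical sketch only — `check_disk_usage` and the sample values are not part of the role description), a small disk-usage check over `df -P`-style output might look like:

```shell
# Hypothetical helper: read df -P-style lines on stdin and warn when the
# Use% column (field 5) meets or exceeds the given threshold.
check_disk_usage() {
  local threshold=$1
  awk -v limit="$threshold" 'NR > 1 {
    use = $5; sub(/%/, "", use)                 # strip the trailing %
    if (use + 0 >= limit) printf "WARN %s at %s%%\n", $6, use
  }'
}

# Sample df-style input (made-up values) piped through the check:
printf '%s\n' \
  'Filesystem 1024-blocks Used Available Capacity Mounted' \
  '/dev/sda1 100 90 10 90% /data' \
  '/dev/sda2 100 40 60 40% /var' | check_disk_usage 80
# prints: WARN /data at 90%
```

In practice such a check would be fed by `df -P` on each cluster node and wired into the team's monitoring, per the responsibility above to improve system monitoring and minimise incidents.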

Nice to have skills and qualifications:

  • Basic Oracle skills (basic administration) to work with DBAs when setting up environments.
  • Experience developing applications in Java/Scala/Python or other Scripting language.
  • Experience with configuration management systems like Puppet/SaltStack.
  • SSL/TLS/HTTPS/FTPS/Control-M experience would be an added advantage.

We look forward to reading your CV.

Start
December 2016 or January 2017
Duration
12 months+
(extension possible)
From
Stamford Consultants AG
Posted
24.01.2017
Project ID:
1274134
Contract type
Freelance