Junior Big Data Engineer for Cardano - Málaga

Process Information
% of CVs read: 100%
Reading speed: 0 d. 10 h. 0 m.
CVs submitted to the process: 15

On behalf of Cardano, Ciklum is looking for a Junior Big Data Engineer to join its Málaga team on a full-time basis.
As a Junior Big Data Engineer you will drive innovation through software implementation and the improvement of business processes. This is an opportunity for a bright and enthusiastic individual to learn about Cardano’s Big Data initiatives and make a significant contribution to the business’s productivity and efficiency.
Training will be provided for all data engineers so that a consistent skillset is maintained across the team.
Cardano’s technology requires a step up to the next level of sophistication. The setup is currently modest and the firm recognises the need to invest in an increasingly critical part of its business. The technology stack is currently C# and SQL based, and needs a fresh direction to leverage the scalability and flexibility of web/JavaScript and Big Data technologies.

We are looking for someone to join the Data Engineering team. You will work on collecting, storing, processing, and analyzing sets of data. The primary focus will be on solutions for maintaining, implementing, and monitoring these pipelines, and you will also be responsible for integrating them with the architecture used across the company. You will gain exposure to Big Data technologies and platforms including Hadoop, Spark and Trifacta.

Cardano was founded in 2000 to help pension plans achieve their financial objectives in a steady, predictable way by applying robust investment and risk management techniques. We currently employ 160 people based in London and Rotterdam with clients whose assets total in excess of £140bn.

Responsibilities:
• Implement ETL processes and ensure best practices within Cardano
• Become an expert in the Big Data tools and frameworks used to provide the requested capabilities
• Gain knowledge and an understanding of distributed computing principles
• Learn the Hortonworks Data Platform services
• Solve ongoing issues with the operational cluster
• Build Web APIs for uniform data ingestion
• Learn best practices for ETL processes and work closely with quants and business users to ensure that they have the required tools and datasets at their disposal
• Ensure that all data flows are transparent and monitored in our Operations Dashboard
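As a purely illustrative sketch of the kind of ETL step this role involves (the field names, CSV input and monitoring log below are hypothetical, not Cardano's actual pipeline or schema):

```python
import csv
import io
import logging

# Hypothetical extract-transform-load step: read raw CSV text,
# normalise types, drop bad rows, and log them so the flow stays
# transparent to a monitoring dashboard.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def extract(raw_csv: str) -> list[dict]:
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise field types and drop rows with a missing price."""
    out = []
    for row in rows:
        if not row["price"]:
            log.warning("dropping row with missing price: %s", row)
            continue
        out.append({"asset": row["asset"].strip().upper(),
                    "price": float(row["price"])})
    return out

raw = "asset,price\n aapl ,189.5\nmsft,\n"
clean = transform(extract(raw))
print(clean)  # [{'asset': 'AAPL', 'price': 189.5}]
```

In a real deployment the extract and load sides would talk to the cluster (e.g. via Spark or NiFi) rather than in-memory strings; the shape of the step is the same.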

Requirements:
• Experience with Python, Java or C#
• Knowledge of SQL Server design and development
• Knowledge of an RDBMS such as PostgreSQL or MySQL is desirable
• Big Data skills, or the interest to upskill quickly
• Experience writing RESTful APIs; GraphQL is desirable
• Knowledge of software design patterns and best practices
• Writing clean code and using test-driven development
• Knowledge of CI/CD
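As a hedged illustration of the test-driven style mentioned above (the `daily_return` function and its tests are hypothetical examples, not part of any real codebase): in TDD the tests are written first and the function is then implemented to satisfy them.

```python
import unittest

# Hypothetical function under test, implemented to satisfy the
# test case below (which, in TDD, would have been written first).
def daily_return(open_price: float, close_price: float) -> float:
    """Fractional daily return; rejects a non-positive open price."""
    if open_price <= 0:
        raise ValueError("open price must be positive")
    return (close_price - open_price) / open_price

class DailyReturnTest(unittest.TestCase):
    def test_positive_return(self):
        self.assertAlmostEqual(daily_return(100.0, 105.0), 0.05)

    def test_rejects_non_positive_open(self):
        with self.assertRaises(ValueError):
            daily_return(0.0, 105.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DailyReturnTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a CI/CD setup (TeamCity, Jenkins, or GitHub CI, as listed below) such a suite would run automatically on every commit.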

Training and Additional Skills:
Hortonworks Data Platform (2.6.1):
• Hive2
• Spark2
• Ambari
• Ranger
• Zeppelin
• Web APIs implementing GraphQL
• Azure Functions
Data Flow Platforms:
• Apache NiFi (HDF 3.0)
• Apache Oozie
• Apache Sqoop
Build Services:
• TeamCity
• Jenkins
• GitHub CI Environments
Programming Languages, scripting:
• C#
• Python
• JavaScript
• Java
• Scala
• PowerShell
• Shell Scripts / Bash
• Groovy

What’s in it for you?
• The chance to work on a challenging, large-scale project with a complex, high-load e-commerce platform;
• State-of-the-art, centrally located offices with a warm atmosphere that makes for really good working conditions;
• A unique working environment where you communicate and work directly with the client;
• Competitive salary;
• Career and professional growth;
• The possibility to work in a big and successful company;
• Long-term employment with 20 working days of paid vacation and other social benefits.

Professional Functions

Offer Details
  • Language: English (Intermediate)
  • Experience: No experience required
  • Minimum Education: Other Technology Training
  • Professional Level: Specialist
  • Contract Type: Permanent
  • Working Hours: Full-time
  • Salary: Not specified