Qliro has a track record of delivering new products and services at unmatched speed. We are active in an exciting intersection of the e-commerce, payment and financial services markets. We target both leading online merchants and millions of Nordic consumers seeking superior digital payments and consumer finance products. In addition to our E-commerce business, we have also successfully released D2C savings and loan products, apps, a self-service website and merchant-facing interfaces.
We are looking for Big Data Engineers to take a crucial part in establishing our Qliro Exploration Hub in a cloud-first solution. As a Big Data Engineer, you will work in a team of Big Data engineers, Data Warehouse engineers and data scientists with end-to-end responsibility for the entire data ecosystem, enabling the organization to explore data and gain insights.
The role includes developing batch-oriented as well as real-time data flows, tuning our infrastructure for analytical workloads, and advising other teams on integrations into our environment. You will also ensure that the entire solution evolves at a high pace. At the core of our current solution we have Hadoop and HDFS, and our Kafka deployment is gaining more and more importance as a central part of the data infrastructure. Our target is to be cloud first.
Note that this ad covers two openings at different levels.
Some of the technologies you will get to work with:
AWS, EMR, MSK, Snowflake, Qlik Replicate, Qlik Compose, Qlik Sense, Apache Spark, Hive, Sqoop, Scala, Python, Airflow, Docker, Server, Apache Kafka, Kafka Streams, ksqlDB, Elasticsearch, Hortonworks, Hadoop (HDFS, YARN), GitLab, CI/CD and NiFi.
In this role you will:
- Be the company's go-to expert for anything in the realm of Big Data and data management in that context
- Be responsible for ensuring that the Big Data platform delivers quality and trustworthy data.
- Challenge preconceived notions and create new opportunities for how the Big Data solution can deliver value within the organization.
- Help define the data and identify assets to be managed within the Exploration Hub.
- Together with the team, take responsibility for data and information content and metadata management related to the data assets.
- Monitor data usage to assist teams, share best practices and trends in how to utilize Big Data in a business context, and provide insight into how and where teams can use the Big Data solution in their day-to-day operations.
- Ensure compliance and security of the data within the Big Data platform.
- Build, improve and maintain data pipelines that handle data and information
- Develop, administer, maintain and improve our current Kafka and Big Data installations, on-prem and in the cloud
- Tune and improve the infrastructure to handle ever-changing demands
- Conduct unit testing and troubleshooting
- Continue the journey to the cloud and prepare the Exploration Hub for the move
We are looking for a curious, out-of-the-box-thinking Big Data Engineer to create and manage our magic big data platform. You should have skills and experience in developing big data streaming solutions using Hadoop and Kafka, both on-prem and in the cloud. You are comfortable working with and administering Linux environments and big data clusters, and you have experience with the core software packages and methodologies used in big data deployments.
We believe you have:
- A relevant university degree in engineering, or comparable skills
- Deep knowledge of Hadoop and Kafka, and of creating data pipelines and real-time sourcing in complex environments
- Hands-on knowledge of creating and running big data clusters and data pipelines in AWS
- Experience running and maintaining Airflow for workflow automation
- Knowledge of and hands-on experience with Spark, Scala and Python
- Hands-on experience with data modeling and its impact on platform performance and quality of the data
- Experience with agile ways of working
- Experience with testing and test methodology
- Experience writing complex queries
It is an advantage if you have:
- Experience moving big data solutions to the cloud, preferably from both perspectives: establishing a DR site and setting up an active cluster, including migration approaches such as lift-and-shift and hybrid.
- Hortonworks Certification
- Experience establishing Data Warehouses, Data Lakes and ODS
- Experience with Snowflake on AWS
WOW in everything we do
We are currently transforming Qliro to cater both for the next level of payment services to consumers and merchants and for becoming a Digital Banking Platform. We do that by embracing technology, leveraging partnerships, accelerating through data, empowering sustainability, and driving an outstanding consumer experience. Our goal is to be known as a trusted partner to merchants and to provide an exceptional digital banking experience for consumers - simply WOW.
As a team member, "a Qliroer," you will work in a dynamic and fast-paced Fintech environment. We thrive on challenges and personal growth, and we have a culture that emphasizes learning and development, embracing our core values. We believe that through collaboration and our everyday curiosity, we feel empowered and can take accountability. Together we work hard to create a workplace that is diverse and inclusive. We strive to empower our people to manage their time and work effectively. Wherever possible, we promote flexible working hours to help you juggle your everyday life and love the work you do.