Genzo Vandervelden

About me

A portrait picture of Genzo Vandervelden

  • Nationality: Belgian
  • Residence: 9180 Lokeren, Belgium
  • Native language: Dutch
  • Other languages: English

Background

I completed a data-oriented education that introduced me to the world of data. My passion lies in big data: I carry out analyses both individually and as part of a team, and develop solutions in collaboration with stakeholders. I enjoy working in a team and contributing to a warm, authentic atmosphere.

Education

  • 2014 - 2017 Information Management & Systems @ Thomas More

Certifications

  • AWS Certified Solutions Architect - Associate
  • Google Analytics Qualified Individual
  • Google AdWords Qualified Individual
  • Apache Spark 2 with Scala - Hands On with Big Data

Highlighted Tools and Technologies

  • AWS
  • Apache Spark
  • Data integration
  • Data modelling
  • Data warehousing
  • DevOps
  • DBT
  • ELT
  • Python
  • Scala
  • SQL

Experience

Telenet (Freelance)

May 2022 - …

Telenet is a leading telecom company in Belgium, providing innovative solutions for internet, television, and telephony to both private and business customers.

I joined Telenet to contribute to the migration of an on-premise Cloudera cluster to AWS. This project aimed to modernize the data infrastructure, enhance scalability, and optimize costs. We use Apache Spark and DBT to load and transform data, and make it available in Snowflake. By transitioning to cloud-native solutions, I helped create a future-proof data environment with more room for innovation.

As a Big Data Engineer, I design, develop, and implement scalable data solutions. I work with modern technologies like Scala, Snowflake, and AWS to process and analyze large volumes of data. My responsibilities include building data pipelines, optimizing data processing workflows, and providing insights to support strategic decisions within the company. With my expertise in big data and cloud technologies, I contribute to various projects and improve the efficiency of data-driven operations.

Agile, AWS, Apache Spark, Athena, CI/CD, Data warehousing, Data modelling, DevOps, Data quality, DBT, Docker, ELT, Git, Snowflake, SQL, Scala, Scrum


DPG Media

Jan 2020 - May 2022

DPG Media is a leading media company in Belgium and the Netherlands, with strong and well-known brands such as VTM, Q-Music, Het Laatste Nieuws, Tweakers, and many more. The company places a strong emphasis on technology and digitalization within the industry.

Within DPG Media, I worked as a Data Engineer in the Data Area.

When I started, I joined the B2B team. My task was to work with the team to phase out an old on-premise data warehouse, thoroughly reviewing and refining the data model along the way. This warehouse contained data on advertisements and the corresponding revenue, and was used by sales representatives and for management reporting. Over the years, however, the system had become difficult to maintain, was reaching its limits under ever-growing data volumes, and knowledge of how it worked had been lost.

We gradually phased out this system to make way for a new data warehouse running on AWS S3, Apache Spark, Kubernetes, and Snowflake. During the migration of the data warehouse, I continuously performed analyses and worked closely with stakeholders. We ensured that the new environment remained functionally equivalent but was also faster, more maintainable, and more scalable for the future. I also provided demos to the business team on how to use our data in Tableau so they could get started themselves.

Looking for new internal experiences, I started in the B2C team. There, I performed various tasks, ranging from processing real-time events from email campaigns to maintaining subscription systems.

Agile, AWS, Apache Airflow, Apache Spark, Athena, Data warehousing, Data modelling, Data quality, DevOps, CI/CD, Docker, ELT, Git, Linux, Python, Snowflake, SQL, Scala, Scrum

Crunch Analytics

Sep 2019 - Jan 2020

Crunch Analytics is a data science company providing tailored solutions for its clients.

Example projects include:

  • Automating stock replenishment in retail
  • Intelligent promotion of products

As a data engineer, I contributed to the development and maintenance of the internal data processing framework written in Python. The framework integrates multiple components: Python scripts are scheduled and executed via Apache Airflow, interacting with the Kubernetes cluster on Google Cloud. I was also responsible for operationalizing machine learning models and other code developed by the data scientists.

Apache Airflow, Docker, Google Cloud, Jenkins, Kubernetes, Linux, PostgreSQL, Python, SQL


IntoData

IntoData specializes in providing tailored data engineering solutions for clients.

At IntoData, I worked on various projects for different clients.

Essent Belgium

Nov 2018 - Sep 2019

Essent was an electricity and gas provider in Belgium, catering to both B2B and B2C markets. Essent strategically adopted a data-centric approach to better understand and engage with its customers. Using big data tools like Apache Spark, statistical models were developed for customer segmentation, scoring, and targeted campaigns.

At Essent, I was part of the analytics team, writing Scala code within the Apache Spark framework based on analyses from data analysts. To streamline job scheduling, we used Apache Airflow to automatically start AWS EMR clusters and process data on AWS S3. I was responsible for ensuring the entire pipeline ran successfully, from programming and testing to automation.

AWS, Apache Airflow, Apache Spark, CI/CD, Jenkins, Linux, Machine learning, PostgreSQL, Python, SQL, Scala

Oranje

Mar 2018 - Oct 2018

Oranje manufactures care products for furniture and offers repair and care services.

Oranje wanted to replace its legacy system with a new Salesforce CRM. I managed the data migration from the legacy system to Salesforce, tackling challenges like evolving business logic, structural differences between systems, and resolving data quality issues.

AWS, Data integration, Java, PostgreSQL, SQL, SQL Server, Salesforce, Talend

R&D

Jul 2018 - Aug 2018

IntoData has an R&D department focusing on machine learning, deep learning, and artificial intelligence.

The research focused on the potential of machine learning and deep learning. I started by familiarizing myself with machine learning concepts, particularly deep learning. Before building models, data had to be preprocessed using SQL and various Python libraries such as NumPy, pandas, and scikit-learn. To build the models, we used Keras and TensorFlow.

Throughout the project, I worked with Jupyter Notebooks, conducting numerous experiments. The most successful outcome was a simple sentiment analysis API with a web interface built in HTML and JavaScript. This setup was designed to score the sentiment (positive or negative) of a given message.

I presented the results of this research to my colleagues at our annual conference.

AWS, JavaScript, Linux, Machine learning, PostgreSQL, Python, SQL

Roompot

Mar 2018 - Jun 2018

Roompot offers stays in numerous holiday parks across Belgium, the Netherlands, France, and other countries.

At Roompot, I was responsible for integrating data between their old back-office system and the new CRM system. Files were placed nightly on their FTP server, and I developed the integration in Talend. These Talend jobs read the files, processed the data, and loaded it into Salesforce. The jobs were executed on AWS EC2 instances.

AWS, Data integration, Java, Linux, Salesforce, Talend

Compasity

Jan 2018 - Feb 2018

Compasity offers a service in the Netherlands to help businesses track and prevent employee absenteeism. Compasity connects to various systems, and I was tasked with building one such integration. The goal of this integration was to keep two systems synchronized, ensuring that Compasity always had the latest data.

Data integration, Java, Oracle, SQL, Talend

FCR Media

Sep 2017 - Dec 2017

FCR Media is best known for its Gouden Gids (Golden Pages), but it also manages various digital services for its clients, such as websites, digital marketing, and SEO.

I joined their data team, where my tasks were varied. I worked on enriching their database, synchronizing their internal database with the CRM system, and generating data quality reports.

Data integration, Java, Oracle, SQL, Talend