Senior Systems Architect and Data Engineer with a proven track record of implementing analytical solutions on cloud and on-premise infrastructure. A quick learner and a troubleshooter at heart with a genuine interest in technology. Manages the full life cycle from idea to deployment and helps solve practical issues related to operating large-scale systems in production.
Lead Data Engineer
Actively participated in architecting and developing central parts of ICA's cloud-based data warehouse.
- Design and implementation of a "Batch Framework" built on dbt to support ICA's layered data warehouse. Data transformation managed by dbt, with scheduling through Cloud Workflows and Cloud Build. Focus on developer experience and CI/CD processes.
- Design and implementation of a "Sales API" serving near-real-time store KPIs to end-user applications. Data served from BigQuery through a GraphQL API hosted on Cloud Run. Focus on performance and scalability.
- Design and implementation of a "Semantic Layer" on Looker Core and BigQuery for reporting of point-of-sale data. Focus on reuse, performance and cost-efficiency.
- GraphQL, Python, Terraform, Google Cloud Platform, Looker, dbt, BigQuery
Senior Data Engineer
Implementation of analytical data models (in dbt and BigQuery) with end-user analytical functionality in Looker for international book publishing and streaming.
- Custom Data Studio (now Looker Studio) plugin for integration with Looker for external publishers
- Ingestion mechanism for event streams from Pub/Sub to BigQuery
- Flexible BigQuery batch export in Python
- Python, Terraform, Google Cloud Platform, Apache Airflow, Looker, dbt, BigQuery
Senior Data Engineer
Senior Data Engineer in two parallel initiatives at ICA, both with a clear focus on transitioning from on-premise solutions to Google Cloud Platform.
One project, at ICA Group level, focused on establishing an architecture and design patterns for a common data lake and machine learning platform. The platform was intended to support all companies within the group, with a strong focus on reusability and isolation between tenants. The initial use case was transitioning a legacy solution for ICA Fastigheter onto GCP.
The other project, at ICA Sverige, aimed to establish a new Cloud Data Warehouse platform with several delivery teams working in parallel to onboard data sources.
Both projects had a strong focus on leveraging managed GCP products, security, and efficient use of infrastructure as code.
- Terraform, Google Cloud Platform, Dataflow, BigQuery
Senior Data Engineer
Implementation of a telemetry solution for an upcoming multiplayer video game.
The solution spans from the game client through backend services to a GCP-hosted analytical platform.
High degree of complexity due to the different technologies, protocols and languages used in the underlying frameworks and platforms.
- Java, Go, Protocol Buffers, Unreal Engine, AngelScript, Terraform, C++, BigQuery
Big Data Architect
Initially an advisory role as a Big Data specialist focused on hybrid architectures, event sourcing and analytical platforms.
From January 2020, architect and senior Data Engineer designing and implementing a Google Cloud Platform-based analytical solution coupled with an on-premise Kafka event-integration platform.
- Kafka, Google Cloud Platform, Dataflow, BigQuery
Solution Architect
Solution Architect for a multi-tenant data lake supporting the Tele2 group as a shared service. Focus on building a common foundation with a high degree of security and information integrity to meet existing and upcoming regulations on the storage and processing of sensitive personal information.
Big Data Expert
Architecture and implementation.
Big Data Specialist / Product Owner
Advisory role as Big Data Specialist starting up the "Data Pipeline & Data Store" team. Focus on knowledge transfer in the areas of architecture and analytics.
Big Data Specialist
Svenska Spel is a Swedish state-owned gambling company offering lotteries, sports betting, online casino, and land-based casino services.
- Big Data Architect designing and implementing a Data Warehouse and Analytics infrastructure based on Hadoop technology.
- The project had a strong emphasis on knowledge transfer and helping to form the organization for development and operations as well as showcasing analytics capabilities.
Big Data Expert
Advisory role as Big Data Expert in a program to establish a multi-tenant data lake supporting multiple TeliaSonera markets and organizations.
Big Data Architect
Supported Klarna's Data Vault team as Big Data Architect with hands-on tutoring in Hive, Oozie and general MapReduce.
Data Architect designing, implementing and supporting an analytics platform for social games based on Hadoop technology. Implementation and handover required close co-operation with King's development and operations teams.
Data Architect