About
Profile
Data Engineer at Sollers Consulting, focused on cloud data platforms, ETL/ELT delivery, and data quality. Open to collaboration on data-related work, especially designing, implementing, and auditing data systems.
Key competencies
- Data engineering: Snowflake, dbt, Fivetran, ETL/ELT pipelines, data modeling, data validation
- Programming and analytics: SQL, Python, R, PySpark
- Cloud and infrastructure: AWS (S3), GitHub Actions, Azure (DevOps, Synapse, AI Foundry), Docker, Linux
- Databases: PostgreSQL, DuckDB, MSSQL
Selected achievements
Building Common Data Model for a Leading London Market Broker
- Implemented dbt/Snowflake models for a shared data model spanning multiple similar client data sources.
- Designed and implemented CI/CD in dbt Cloud (DEV/STG/PROD), with project structure standards, naming conventions, and quality gates before merge.
- Collaborated with architects and client stakeholders to clarify requirements and make scalability decisions for onboarding additional source system types.
- Onboarded new team members, prepared project onboarding documentation, and provided code reviews and technical feedback.
Migrating On-prem Reporting System to Cloud for a Leading London Market Insurer
- Reverse-engineered transformation logic from Java to SQL and implemented Snowflake procedures for the reporting layer.
- Improved ingestion and transformation pipelines (Fivetran + Snowflake + S3/CDC), including incremental refresh optimization.
- Built a Python (multiprocessing, pandas) parallel ingestion workflow for Parquet files from S3 to Snowflake staging; reduced ingestion time from about 4 hours to about 45 minutes.
- Supported reconciliation between Snowflake outputs and source data and helped stabilize full-history handling; the final reporting refresh moved from daily to every 15 minutes.
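The ingestion speedup above came from fanning independent per-file loads out across processes. A minimal sketch of that pattern, assuming the worker and file keys are hypothetical placeholders: a real `load_file` would read each Parquet object with `pandas.read_parquet` and load it into a Snowflake staging table (e.g. via `snowflake-connector-python`'s `write_pandas`), which is stubbed out here so the sketch stays self-contained.

```python
from multiprocessing import Pool


def load_file(key: str) -> int:
    # Hypothetical worker: in the real pipeline this would fetch the
    # Parquet file from S3, read it with pandas, and load the frame
    # into Snowflake staging. Stubbed to return a fake row count so
    # the sketch runs without cloud credentials.
    return len(key)


def ingest(keys: list[str], workers: int = 4) -> int:
    # Each file load is independent, so Pool.map parallelizes the
    # work cleanly across processes; results are combined afterwards.
    with Pool(processes=workers) as pool:
        counts = pool.map(load_file, keys)
    return sum(counts)


if __name__ == "__main__":
    keys = [f"s3://bucket/part-{i}.parquet" for i in range(8)]
    print(ingest(keys))
```

The same structure scales by raising `workers` up to the point where the Snowflake warehouse or network bandwidth becomes the bottleneck.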
Education
- B.Eng., Data Engineering and Analytics - Lublin University of Technology