Bachelor's or master's degree in Computer Science, Engineering, Data Science, Data Engineering, or a similar field
4+ years of hands-on experience in:
- Data Management (DWH and Data Lake architectures with Snowflake, Azure Data Lake Store)
- Data Modeling (e.g. Inmon, Kimball, Data Vault etc.)
- ETL/ELT design (with tools such as dbt, Talend etc.)
- Data Ingestion & Processing (Azure Data Factory, Functions / Batch Ingestion, Stream Analytics, etc.)
- DevOps (Continuous Integration / Delivery, testing automation, Azure DevOps, Git/GitHub)
Experience in the following fields will be considered a strong asset:
- Proven track record in partnering with the business to extract value from data.
- Working with data ingested from systems such as SAP (ECC or S4), Salesforce, Tagetik, ProAlpha
- Data Engineering (Python / pySpark, SQL)
- Security (Active Directory, Application Gateway, Key Vault, Virtual Networks, Network Security Groups)
- BI reporting tools, like QlikView/QlikSense, PowerBI
Big-picture thinker with a positive attitude and strong attention to detail.
Experienced in managing vendors/partners as well as projects from PoC to scalable solution. Strong problem-solving skills and the ability to explain complex situations in a clear, understandable manner.
Project Management skills are a big plus.
Experience working with agile methodologies such as Scrum, SAFe, etc. Team player with the ability to motivate others.
Fluent in English.