Azure Data Architect

Job description

We are looking for highly motivated data architects with a focus on Azure services who are able to design and implement high-performance, production-grade big data pipelines on top of modern data lakes, data warehouses and data marts. You will have a unique opportunity to deliver big data solutions to a broad variety of our clients and to participate in the development of our products and services, such as our Insights Platform.

Our Insights Platform is an end-to-end, next-generation big data analytics service composed of best-of-breed tools and services that support the ingestion, hygiene, refinement, analytic processing and visualization of data. Unlike competing offerings, which require long and expensive set-up and configuration, our service is available in hours, useful in days, and benefits the business in weeks.

Your responsibilities

  • Working as part of the Prague Data Analytics team and our global network of experts, you will be responsible for building modern data platforms as well as designing, developing, deploying, and maintaining scalable, production-grade data transformation pipelines.
  • You will need to dive deep into solution architecture, data architecture, data management and warehousing, ETL, data governance and data versioning, all packed into fully automated DevOps pipelines.

Our requirements

  • Strong previous experience in a data architecture or data-driven role (either as an external consultant or as an in-house expert), preferably in the finance industry
  • Deep knowledge of Microsoft Azure data stack (e.g. Azure Databricks, Delta Lake, Azure Data Factory, Azure Storage, Azure SQL Database, Azure Purview) and integration patterns
  • BSc or MSc university degree (preferably in Mathematics, Computer Science, Economics, Physics, Operational Research or a related field)
  • Strong command of data analysis tools and technologies (SQL, Python, PySpark)
  • Knowledge of tools and programming languages used to analyze large data sets in the Hadoop ecosystem (Spark, Hive, Pig)
  • Experience building data pipelines (ETL) that integrate various applications and systems
  • Willingness to learn new tools and software solutions
  • Very good command of English
  • Strong interpersonal and communication skills - essential in day-to-day cooperation with clients and the team
  • Willingness to travel


What we offer

  • Flexible working hours
  • Your preferred equipment and platform provided for efficient work
  • The possibility to work as an employee or as a freelancer, according to your preference
  • Wide range of benefits for employees:
      • Pension and life insurance contributions
      • Private healthcare
      • Meal tickets (100 CZK per working day)
      • Cafeteria system / Multisport card
      • Up to 5 weeks of holiday, depending on length of employment
      • Technical and personal training
      • Language courses
      • Personal event bonus
      • Loyalty presents

Haven’t found the position you were looking for? Create your profile, send us your CV and stay connected!
