Enterprise Data Architect

Hyrhub
Location: Newton, MA, USA
Published: 6/14/2022
Technology
Full Time

Job Description

We are seeking a seasoned Azure Data Architect with deep expertise in the Microsoft Fabric platform to design, develop, and govern enterprise-scale analytics and data-platform solutions.

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • The candidate will be responsible for designing, developing, and maintaining scalable, secure, and efficient data integration pipelines using Microsoft Fabric capabilities.
  • Develop and manage lakehouse and warehouse architectures (using Fabric Lakehouse/Warehouse, Delta Lake, etc.).
  • This role requires strong expertise in Azure Data Factory, Azure Synapse / Data Lake, Data Lake Storage Gen2 and associated Azure data services, along with hands-on experience in ETL/ELT development, performance tuning, and automation.
  • Design and develop data pipelines using Azure Data Factory for ingesting, transforming, and orchestrating data from multiple on-prem and cloud sources.
  • Implement ETL/ELT processes to move data from various sources (SQL, APIs, files, SAP, etc.) to Azure Data Lake, Azure Synapse, or Databricks.
  • Strong problem-solving skills and attention to detail.
  • Experience working in Agile/Scrum development teams is preferred.
  • Certifications such as Microsoft Certified: Fabric Data Engineer Associate, Azure Solutions Architect, or similar.
  • Exposure to customer data platforms (MS Customer Insights Data and MS Customer Insights Journey).

Key Responsibilities

  • Collaborate with business stakeholders and senior leadership to translate business requirements into a coherent data-platform architecture.
  • Define and maintain the data-architecture roadmap, including data-ingestion, transformation, storage and analytics layers using Microsoft Fabric.
  • Design end-to-end data solutions: ingestion pipelines, lakehouses/warehouses, semantic layer, analytics consumption and real-time capabilities.
  • Architect and guide data modelling (conceptual, logical, physical), ensuring consistency, performance, and reusability.
  • Oversee migration of existing platforms (on-premises or legacy cloud systems) to Fabric-centric architecture.
  • Work with data-engineering and analytics teams to implement solutions (e.g., Azure Data Factory/SQL/Synapse, Fabric pipelines, OneLake).
  • Build and maintain parameterized, reusable ADF pipelines with dynamic configurations.
  • Integrate ADF with Azure Functions, Logic Apps, and DevOps CI/CD pipelines.
  • Ensure data quality, data governance, security and compliance across Fabric solutions.
  • Monitor, tune, and optimise performance of data pipelines and storage solutions to ensure efficiency, cost-effectiveness and reliability.