Huquo

Azure Data Architect - Big Data Engineering


Job Location

Pune, India

Job Description

Position: Azure Data Architect
Location: Pune (Hybrid)
Experience: 8-12 years

Clairvoyant is a global technology consulting and services company founded in 2012, headquartered in Chandler, US, with delivery centers across the globe. We help organizations maximize the value of data by providing data engineering, analytics, machine learning, and user experience consulting and development to multiple Fortune 500 clients. Clairvoyant's clients rely on its deep vertical knowledge and best-in-class services to drive revenue growth, boost operational efficiency, and manage risk and compliance. Our team of experts with direct industry experience in data engineering, analytics, machine learning, and user experience has your back every step of the way.

"Our Values: Passion, Continuous Learning, Adaptability, Teamwork, Customer Centricity, Reliability"

As a Software Technical Architect on our Big Data Engineering team, you will play a crucial role in designing, developing, and maintaining complex data solutions. You will collaborate with cross-functional teams to define and implement scalable, high-performance data architectures. Your primary responsibilities will include data modelling, leading software design and development efforts, and ensuring the reliability, scalability, and performance of our big data infrastructure.

Requirements:

- 8+ years of proven experience as a Software Technical Architect in Big Data Engineering.
- Strong understanding of data warehousing, data modelling, cloud, and ETL concepts.
- Experience with Azure cloud technologies, including Azure Data Factory, Azure Data Lake Storage, Databricks, Event Hubs, Azure Monitor, and Azure Synapse Analytics.
- Proficiency in Python, PySpark, Hadoop, and SQL.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience with common data warehouse modelling principles, including Kimball and Inmon.
- In-depth knowledge of big data technologies and ecosystems.
- Excellent problem-solving skills and the ability to address complex technical challenges.
- Strong communication and leadership skills.
- Experience developing security models.

Must Have:

- Strong understanding of the Azure data platform: Azure Data Factory, Azure Databricks, Synapse, Event Hubs, ADLS, Delta files, Azure Monitor, Azure Security, and Azure DevOps.
- Experience developing NoSQL solutions using Azure Cosmos DB.
- Good knowledge of setting up data governance in Azure.
- Hands-on experience with the Python programming language and excellent code-debugging skills.
- Excellent knowledge of SQL.
- Experience with data modelling, ETL, and data warehousing concepts and implementation.
- Strong customer engagement skills to fully understand customer needs for analytics solutions.
- Excellent communication and teamwork skills.
- Strong data analysis and analytical skills.
- Experience working in a fast-paced agile environment.
- Strong problem-solving and troubleshooting skills.

Good to Have:

- Experience with Power BI reporting.
- Experience with Microsoft Purview.
- Experience processing streaming data.
- Additional programming languages: Java or Scala.
- Familiarity with other cloud platforms (e.g., AWS, GCP).
- Azure Architect certification.

Responsibilities:

- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics services.
- Engage and collaborate with customers to understand business requirements and use cases and translate them into detailed technical specifications.
- Collaborate with project managers on project/sprint planning by estimating technical tasks and deliverables.
- Data modelling: Develop and maintain data models to represent our complex data structures, ensuring data accuracy, consistency, and efficiency.
- Technical leadership: Provide technical leadership and guidance to development teams, promoting best practices in software architecture and design.
- Solution design: Collaborate with stakeholders to define technical requirements and create solution designs that align with business goals and objectives.
- Programming: Develop and maintain software components using Python, PySpark, and Hadoop to process and analyze large datasets efficiently.
- Big data ecosystem: Work with components of the Hadoop ecosystem, such as HDFS, Hive, and Spark, to build data pipelines and perform data transformations.
- SQL expertise: Use SQL for data querying, analysis, and database performance optimization.
- Performance optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
- Scalability: Architect scalable, highly available data solutions covering both batch and real-time processing.
- Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.
- Security and compliance: Ensure data solutions adhere to security and compliance standards, implementing the necessary controls and encryption mechanisms.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems to guarantee data consistency and availability.
- Identify and resolve any performance issues.
- Keep up to date with new technology developments and implementations.
- Participate in code reviews to ensure standards and best practices are met.
- Take responsibility for estimating, planning, and managing all tasks, and report on progress.

(ref:hirist.tech)
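For candidates unfamiliar with the kind of pipeline work described above, here is a minimal illustrative sketch of a batch ETL step (API ingestion, transform, load). It uses plain Python as a stand-in; in this role the equivalent logic would typically run as PySpark on Azure Databricks, orchestrated by Azure Data Factory. All record and field names below are hypothetical, not part of the posting.

```python
# Hypothetical batch-ETL sketch: parse API payload, apply a data-quality
# gate, normalize fields, and derive a value. Field names are illustrative.
import json

def transform(records):
    """Keep valid rows, normalize the customer field, and derive a total."""
    out = []
    for r in records:
        if r.get("order_id") is None:  # basic data-quality gate: drop bad rows
            continue
        out.append({
            "order_id": r["order_id"],
            "customer": r.get("customer", "unknown").strip().lower(),
            "total": round(r.get("qty", 0) * r.get("unit_price", 0.0), 2),
        })
    return out

# Simulated API response (JSON), including one invalid record.
raw = json.loads(
    '[{"order_id": 1, "customer": " Acme ", "qty": 3, "unit_price": 9.5},'
    ' {"order_id": null, "customer": "Bad", "qty": 1, "unit_price": 1.0}]'
)
print(transform(raw))
```

In a PySpark version, `transform` would become DataFrame operations (`filter`, `withColumn`) so the same logic scales across a cluster.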


Posted Date: 4/24/2024

Contact Information

Contact Human Resources
Huquo

UID: 4648053006
