Cloudcover Consultancy Private Limited
Cloudcover - Senior Data Engineer - Java/Python
Job Location
Pune, India
Job Description
Job Overview: The Senior Data Engineering Consultant will be responsible for requirements gathering, solutioning, and designing and building modern data platforms to support data-driven decision making. This is a hands-on role: the engineer will execute the technical implementation of data engineering and visualization projects. The Senior Data Engineering Consultant will also help build the data and analytics consulting practice by taking part in recruiting efforts, creating technical collateral, and staying on top of technology trends through ongoing training and certifications. The entire consulting team is responsible for building long-term strategic relationships with clients and participating in all aspects of project delivery.

Responsibilities:
- Lead and drive discovery sessions with external clients and build state-of-the-art data architectures.
- Work with stakeholders to understand their problem statements and data requirements, and implement solutions that meet their needs.
- Design, implement, and develop data pipelines to collect and process large amounts of data from various sources.
- Implement data storage solutions that are scalable, secure, and efficient, such as data warehouses and databases.
- Develop and implement data validation and testing processes to ensure that data is processed accurately and efficiently.
- Automate data collection, processing, and reporting to minimize manual work and improve efficiency.
- Create high-quality documents that capture problem statements, requirements, solutions, and designs.
- Support pre-sales activities, including whiteboard sessions, collaborating on solution architecture design, and assisting in proposal and statement-of-work creation.
- Contribute to the development of reusable, repeatable collateral for use across the practice.
- Obtain and maintain training and certification in cloud technologies.
- Work with the marketing team to produce content that promotes the practice.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 7-9 years of data engineering experience in data management, database architecture, data engineering, or data visualization.
- Excellent problem-solving, organization, debugging, and analytical skills.
- Ability to work independently and in a team environment.
- Excellent communication skills for effectively expressing ideas to team members and clients.
- Strong experience integrating multiple data sources with both structured and unstructured data, in both batch and streaming modes.
- Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure.
- Experience building data pipelines with ETL tools or equivalent cloud services, such as Azure Data Factory or AWS Glue.
- Familiarity with data warehousing solutions such as Snowflake, Google BigQuery, Databricks, and Azure Synapse.
- Experience with at least one programming language, such as Python, Java, or Scala.
- Experience with at least one RDBMS and at least one NoSQL database (MongoDB, Cassandra, or Bigtable).
- Strong experience with orchestration tools such as Apache Airflow, NiFi, or equivalent cloud-native services.
- Strong skills in Apache Spark, Flink, or Beam.
- Experience designing, implementing, and managing event-driven data pipelines using Kafka or similar cloud-native streaming services such as AWS Kinesis, Azure Event Hubs, or Google Cloud Pub/Sub.
- Experience with visualization tools such as Power BI, Looker, Tableau, and QuickSight.
- Familiarity with Docker and Kubernetes.
- Ability to debug and optimize existing data infrastructure and processes as needed.
- Familiarity with version control systems, particularly Git, for managing code repositories, branching strategies, and collaborative development.

Good to Have:
- Experience building large-scale, high-throughput, 24x7 data systems.
- Any data engineering, visualization, or data science certification on any of the clouds.
- Exposure to machine learning algorithms, AI, and/or LLMs, with implementation in practice.
- Experience with legacy data systems (e.g., Hadoop, Informatica).
- Experience with CI/CD concepts and tools for automating data pipeline deployments, testing, and version control, such as Jenkins, GitLab CI, Azure DevOps, or similar platforms.
- Experience with data quality frameworks and data governance tools such as Alation, Collibra, or Dataplex.

(ref:hirist.tech)
Location: Pune, IN
Posted Date: 5/17/2025
Contact Information
Contact: Human Resources, Cloudcover Consultancy Private Limited