Ascendion
Ascendion - Kafka Engineer
Job Location
Bangalore, India
Job Description
Job Title: Kafka Engineer
Location: Bengaluru, Karnataka, India

About Us:
Ascendion is a leading global provider of AI-first software engineering services, delivering transformative solutions across North America, APAC, and Europe. Headquartered in New Jersey, we combine technology and talent to deliver tech debt relief, improve engineering productivity, and accelerate time to value, driving our clients' digital journeys with efficiency and velocity. Guided by our Engineering to the power of AI [EngineeringAI] methodology, we integrate AI into software engineering, enterprise operations, and talent orchestration to address critical challenges of trust, speed, and capital. For more information, please visit www.ascendion.com.

About the Role:
Ascendion is looking to hire a skilled Kafka Engineer with proven expertise in Kafka/Confluent Kafka streaming platforms and experience in the Retail or Banking domains. As a Kafka Engineer, you will design, build, and maintain robust, scalable, high-performance data streaming solutions. You will leverage your deep technical knowledge of Kafka and its ecosystem to implement critical data pipelines that support real-time processing and data integration needs for our clients in the financial and retail sectors.

Key Responsibilities:
- Design, develop, and implement data streaming solutions using Apache Kafka and/or Confluent Kafka platforms.
- Build and maintain reliable, scalable Kafka topics, producers, and consumers.
- Develop data integration pipelines using Kafka Connect to link Kafka with other systems (databases, applications, cloud services).
- Implement data transformation and processing logic using Kafka Streams or ksqlDB.
- Monitor the health, performance, and scalability of Kafka clusters and streaming applications.
- Troubleshoot and resolve issues related to Kafka cluster operations, data flow, and application connectivity.
- Ensure the security and reliability of Kafka-based data pipelines.
- Collaborate with data architects, data engineers, developers, and operations teams to integrate Kafka into the overall data architecture.
- Apply domain knowledge to understand the specific data challenges and requirements of the Retail or Banking sectors.
- Contribute to best practices for Kafka development, deployment, and operations.
- Optimize Kafka configurations and infrastructure for cost-efficiency and performance.

Required Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field (or equivalent practical experience).
- Proven expertise in Apache Kafka and/or Confluent Kafka streaming platforms; at this level, roughly 5+ years of hands-on experience with Kafka and data streaming technologies is expected.
- Demonstrated experience on data streaming or messaging projects.
- Experience working within the Retail or Banking domains.

Required Skills & Experience:
- Deep technical knowledge of Kafka/Confluent Kafka, including brokers, topics, partitions, producers, consumers, and Kafka Connect.
- Experience designing and implementing data streaming architectures.
- Familiarity with Kafka ecosystem tools and technologies (e.g., ZooKeeper, Schema Registry, ksqlDB, Kafka Streams, Kafka Connect).
- Understanding of data processing concepts (real-time vs. batch, exactly-once processing).
- Experience with data modeling for streaming data.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Specific experience applying Kafka solutions within the Retail or Banking industry.

(ref:hirist.tech)
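The role above centers on keyed topics and partitions, so a quick illustration of the core idea may help candidates gauge fit: Kafka routes each keyed record to a partition by hashing its key, which is what preserves per-key ordering (e.g., all events for one bank account or one retail order arrive in sequence). The sketch below is a simplified, stdlib-only stand-in: real Kafka's default partitioner applies murmur2 to the serialized key, while this uses CRC32 purely for a deterministic demonstration; the `events` data and function name are illustrative, not from the posting.

```python
import zlib
from collections import defaultdict

def partition_for(key: str, num_partitions: int) -> int:
    """Pick a partition for a keyed record.

    Simplified stand-in for Kafka's default partitioner (which uses
    murmur2 over the serialized key); CRC32 is used here only to keep
    the sketch deterministic and dependency-free. Records with the same
    key always map to the same partition, preserving per-key ordering.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Route a small stream of (key, value) events into 3 partitions,
# the way a producer's partitioner would.
events = [
    ("acct-1", "debit 40"),
    ("acct-2", "credit 15"),
    ("acct-1", "credit 5"),
]
partitions = defaultdict(list)
for key, value in events:
    partitions[partition_for(key, 3)].append((key, value))

# Both acct-1 events share one partition, so their relative order
# is preserved for any consumer reading that partition.
```

This per-key locality is why choosing a good record key (account ID, order ID) is a recurring design decision in the banking and retail pipelines this role describes.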
Location: Bangalore, IN
Posted Date: 6/2/2025
Contact Information
Contact: Human Resources, Ascendion