Talent500

ETL Developer - Python/Data Warehousing

Job Location

Bangalore, India

Job Description

With your deep expertise and proven success using analytical thinking and iterative problem-solving, you have what it takes to manage programs and processes strategically and tactically. Whether balancing the needs of multiple stakeholders or making sound decisions using data, analysis, past experience, and a risk mindset, you will serve as a trusted advisor who routinely solves complex business problems and delivers against milestones. In the process, you will have exciting opportunities to develop your skills, expand your network, and build your career.

Responsibilities:

- Participate in all phases of the software engineering life cycle and troubleshoot technical problems as needed.
- Design, develop, and maintain applications, infrastructure, and platforms according to changing requirements, following established processes and procedures.
- Create and optimize ETL processes to extract, transform, and load data from various sources into data warehouses and other storage solutions.
- Develop and maintain data models for efficient storage, retrieval, and analytics.
- Develop and maintain Python scripts for data integration.
- Develop and maintain ETL processes using core Java to extract, transform, and load data from multiple sources.
- Implement custom data transformations and integrations using core Java and Spring Boot.
- Test, debug, and/or oversee the testing and validation of applications to ensure that quality and functionality are in line with the requirements as well as industry standards and protocols.
- Ensure ETL processes are optimized for performance and handle large datasets efficiently.
- Implement data validation and error-handling mechanisms to ensure data quality and integrity.
- Work with cross-functional teams to understand data requirements and translate them into scalable ETL solutions.
- Maintain up-to-date technical documentation for ETL processes, data models, and data flows.
- Provide technical guidance and mentorship to junior developers and data engineers.
- Resolve production defects within the defined SLA.

Requirements:

- Bachelor's degree (or foreign equivalent) in Information Technology, Information Systems, Computer Science, Software Engineering, or a related field.
- Experience in the financial services or banking industry is preferred.
- Minimum 5-7 years of development experience with Informatica PowerCenter, covering all SDLC phases.
- Minimum 5-7 years of development experience with SQL and PL/SQL.
- Strong knowledge and experience in data integration using Python scripts.
- Strong knowledge of relational databases such as SQL Server, Oracle, or PostgreSQL.
- Experience with cloud-based data warehousing solutions such as AWS Glue and AWS RDS for SQL Server.
- In-depth understanding of data warehousing concepts and best practices.
- Proficiency in Unix/Linux scripting for automation and process optimization.
- Strong analytical and problem-solving skills, with a focus on troubleshooting and optimizing ETL processes.
- Excellent communication skills, with the ability to collaborate effectively with team members and stakeholders.
- Solid experience implementing custom data transformations and integrations using core Java and Spring Boot.
- Experience working in Agile/Scrum environments and knowledge of the Atlassian suite (Jira, Confluence, etc.).
- Knowledge of DevOps workflow tools, including Azure DevOps Services (ADO).
- Ability to independently manage, organize, and prioritize multiple tasks, projects, and responsibilities.

(ref:hirist.tech)

Location: Bangalore, IN

Posted Date: 5/1/2025

Contact Information

Contact Human Resources
Talent500

UID: 5153870589
