Astellas Pharma
Senior Data Engineer
Job Location
Ciudad de México, Mexico
Job Description
About Astellas
At Astellas, we believe that nurturing exceptional relationships with our employees delivers exceptional business results. Everyone at Astellas has a responsibility for creating a brighter future for patients around the world. From the first moment, Astellas will inspire you to put this ethos into practice, with a positive, agile company culture and with well-defined ethical principles, values, and systems.

Purpose & Scope
As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing a transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives. DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel at embracing change, managing technical challenges, and communicating clearly.

We are seeking committed and talented Senior Data Engineers to join our InformationX team, which lies at the heart of DigitalX. As a member of the InformationX team, you will be responsible for ensuring our data-driven systems are operational and scalable and continue to contain the right data to drive business value. You will play a crucial role in leading, designing, building, and maintaining our data infrastructure. Your data engineering expertise across multiple platforms, including Azure/AWS cloud data warehousing, Databricks, PySpark, SQL, business intelligence (Qlik), application management, and other related technologies, will be instrumental in enabling data-driven decision-making and outcomes.

Essential Skills & Knowledge
- Strong communication and collaboration skills, coupled with excellent problem-solving skills and attention to detail.
- Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and data modeling (e.g. data vault, dimensional data models).
- Experience in the Life Sciences/Pharma/Manufacturing industry is preferred.
- Proven experience building robust data pipelines; experience with (near) real-time processing is preferred.
- Technical Proficiency: strong coding skills, for example Python, PySpark, and SQL. Engineering experience across multiple platforms, for example AWS, Azure, Databricks, or Change Data Capture tools (e.g. Fivetran), is preferred.
- Expertise in building data pipelines and a strong understanding of data management best practices.
- Proficiency in network architecture and security concepts.
- Proven experience with data analytics practices and techniques.
- Agile Practices: experience working in Agile development environments, participating in sprint planning, stand-ups, and retrospectives.
- Cloud Data Solutions: familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.
- Analytical Thinking: demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Agile Champion: adherence to DevOps principles and automation, and a proven track record with CI/CD pipelines for continuous delivery.
- Ability to understand and interpret business requirements and translate them into technical requirements.

Responsibilities
- Data Pipeline Development: design, build, and optimize data pipelines using data warehouse technologies, Databricks, Qlik, and other platforms as required, ensuring data quality, reliability, and scalability.
- Application Transition: support the migration of internal applications to Databricks-based (or equivalent) solutions, collaborating with application teams to ensure a seamless transition.
- Manage continuous improvement, continuous development, DevOps, and RunOps activities at the application, data, and infrastructure levels, whether in the cloud or on premises.
- Mentorship and Leadership: lead and mentor junior data engineers; share best practices, provide technical guidance, and foster a culture of continuous learning.
- Data Strategy Contribution: contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements. Participate in smaller, focused mission teams to deliver value-driven solutions aligned with our global and bold-move priority initiatives and beyond.
- Design, develop, and implement robust and scalable data analytics using modern technologies.
- Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, FoundationX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
- Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
- Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
- Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
- Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
- Stay up to date on the latest trends and technologies in data engineering and cloud platforms.

Experience
At least 5 years of demonstrable experience in:
- Data engineering, with a strong understanding of PySpark and SQL, building data pipelines, and optimization.
- Data engineering and integration tools (e.g. Databricks, Change Data Capture).
- Cloud platforms (AWS, Azure, GCP); a deeper understanding of, or certification in, AWS and Azure is considered a plus.
- Relational and non-relational databases.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.
- Any relevant cloud-based integration certification at the associate or professional level, for example: AWS Certified DevOps Engineer (Associate or Professional), AWS Certified Developer (Associate or Professional), Databricks Certified Data Engineer, Qlik Sense Data Architect / Business Analyst (or similar platform), MuleSoft Certified Integration Architect Level 1, or Microsoft Certified Azure Integration and Security.
- Proficiency with RESTful APIs.
- Any relevant certification, such as AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, or VCP.

MuleSoft
- Understanding of MuleSoft's Anypoint Platform and its components.
- Experience designing and managing API-led connectivity solutions.
- Knowledge of integration patterns and best practices.

AWS
- Experience provisioning, operating, and managing AWS environments.
- Experience developing code in at least one high-level programming language.
- Understanding of modern development and operations processes and methodologies.
- Ability to automate the deployment and configuration of infrastructure using AWS services and tools.
- Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools.

Microsoft Azure
- Fundamental understanding of Microsoft Azure and AWS and the data services they provide.
- Experience with Azure services related to computing, networking, storage, and security.
- Knowledge of general IT security principles and best practices.
- Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management.

Preferred Qualifications
- Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences/Pharma industry across the Commercial, Manufacturing, and Medical domains. Experience in other complex and highly regulated industries, e.g. healthcare, government, or financial services, will also be considered.
- Data Analysis and Automation Skills: proficiency in identifying, standardizing, and automating critical reporting metrics and modeling tools.
- Analytical Thinking: demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Technical Proficiency: strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
- Agile Champion: adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.

Other Critical Skills
- Cross-Cultural Experience: work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
- Innovation and Creativity: ability to think innovatively and propose creative solutions to complex technical challenges.
- Global Perspective: demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
Location: Ciudad de México, MX
Posted Date: 5/23/2025
Contact Information
Contact: Human Resources, Astellas Pharma