buscojobs Brasil
Data Engineer (Dataviz / Power Bi)
Job Location
Quixadá, Brazil
Job Description
Overview
Our client is a U.S.-based company that provides technical expertise, testing, and certification services to the global food and agricultural industry. Their mission is to ensure food safety, quality, and sustainability across international supply chains. This role is critical to building, maintaining, and modernizing data pipelines that process large-scale regulatory data from around the world and transform it into usable datasets for downstream applications and APIs. The engineer will work hands-on with Python, SQL, and related tools to untangle legacy “spaghetti code” pipelines, migrate processes to more maintainable platforms such as Airflow, and ensure that our data is accurate, reliable, and ready for client-facing products. This role requires both strong technical ability and a consulting mindset: the ability to learn undocumented systems, troubleshoot gaps, and design forward-looking solutions that will scale as our data environment evolves.

Required Qualifications:
• Minimum 7 years’ experience using Python for analyzing, extracting, creating, and transforming large datasets.
• Proficiency in Python 3 and common Python libraries and tools for data engineering, specifically Pandas, NumPy, and Jupyter Notebooks.
• Deep experience with SQL and relational data using Oracle, Postgres, or MS SQL Server.
• Solid understanding of database design principles, data modeling, and data warehousing concepts.
• Excellent troubleshooting skills and instincts.
• Curious, self-motivated, and self-directed; comfortable working within an Agile software development team with short, iterative delivery cycles.
• College degree or equivalent experience in computer science, software development, engineering, information systems, math, food science, or another applicable field of study.

Preferred Qualifications:
• NoSQL database design and development using MongoDB, AWS DynamoDB, or Azure Cosmos DB.
• Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and services related to data storage/processing.
• Exposure to Terraform or other Infrastructure-as-Code tooling.
• Proficient in Azure DevOps for source code and pipeline management.
• TotalPass

Company: Tata Consultancy Services — Aracaju, Sergipe
Come to one of the biggest IT services companies in the world! Here you can transform your career! Why join TCS? Here at TCS we believe that people make the difference, which is why we live a culture of unlimited learning, full of opportunities for improvement and mutual development.

Role Details
We are looking for a Data Engineer (remote) who wants to learn and transform their career.

Skills: Snowflake, DBT, SQL, Agile Methodologies

In this role you will:
• Operational Monitoring: Proactively monitor data jobs and pipelines to ensure smooth execution and timely delivery of datasets. Respond to alerts and resolve issues with minimal downtime.
• Pipeline Maintenance: Maintain and enhance DBT models and SQL scripts to support evolving business needs and ensure data accuracy.
• Warehouse Operations: Oversee Snowflake operations including user access, query performance, and resource utilization.
• Incident Response: Act as a first responder for data job failures, conducting root cause analysis and implementing preventive measures.
• Collaboration: Work closely with data engineers, analysts, and business stakeholders to support operational data needs and troubleshoot issues.
• Process Optimization: Identify opportunities to automate manual tasks, improve pipeline efficiency, and reduce operational overhead.
• Documentation & Reporting: Maintain clear documentation of operational procedures, job schedules, and incident logs. Provide regular updates to stakeholders on system health and performance.

What can you expect from us?
• Professional development and constant evolution of your skills, always in line with your interests.
• Opportunities to work outside Brazil.
• A collaborative, diverse, and innovative environment that encourages teamwork.

What do we offer?
• Health insurance
• Life insurance
• Gympass
• TCS Cares – free 0800 line providing psychological assistance (24 hrs/day), plus legal, social, and financial assistance to associates
• Partnership with SESC
• Reimbursement of certifications
• Free TCS Learning Portal – online courses and live training
• International experience opportunities
• Discount partnerships with universities and language schools
• Bring Your Buddy – bonus for each hire
• TCS Gems – recognition for performance
• Xcelerate – free mentoring career platform

About the Product
Niche is the leader in school search. Our mission is to make researching and enrolling in schools easy, transparent, and free. With in-depth profiles on every school and college in America, 140 million reviews and ratings, and powerful search tools, we help millions of people find the right school for them. Niche is all about finding where you belong, and that mission inspires how we operate every day.

About the Role
Niche is looking for a skilled Data Engineer to join the Data Engineering team. You'll build and support data pipelines that can handle the volume and complexity of our data while ensuring scale, data accuracy, availability, observability, security, and optimum performance. You'll develop and maintain data warehouse tables, views, and models for consumption by analysts and downstream applications. This is an exciting opportunity to join our team as we're building the next generation of our data platform and engineering capabilities. You'll report to the Manager, Data Engineering (Core).

What You Will Do
• Design, build, and maintain scalable, secure data pipelines that ensure data accuracy, availability, and performance.
• Develop and support data models, warehouse tables, and views for analysts and downstream applications.
• Ensure observability and quality through monitoring, lineage tracking, and alerting systems.
• Implement and maintain core data infrastructure and tooling (e.g., dbt Cloud, Airflow, RudderStack, cloud storage).
• Collaborate cross-functionally with analysts, engineers, and product teams to enable efficient data use.
• Integrate governance and security controls such as access management and cost visibility.
• Contribute to platform evolution and developer enablement through reusable frameworks, automation, and documentation.

What We Are Looking For
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• 3–5 years of experience in data engineering.
• Demonstrated experience building and supporting large-scale data pipelines, both streaming and batch.
• A software engineering mindset, leading with the principles of source control, infrastructure as code, testing, modularity, automation, CI/CD, and observability.
• Proficiency in Python, SQL, Snowflake, Postgres, DBT, and Airflow.
• Experience working with Google Analytics, marketing, ad, and social media platforms, CRM/Salesforce, and JSON data; government datasets and geospatial data are a plus.
• Knowledge and understanding of the modern data platform and its key components: ingestion, transformation, curation, quality, governance, and delivery.
• Knowledge of data modeling techniques (3NF, Dimensional, Vault).
• Experience with Docker, Kubernetes, and Kafka is a huge plus.
• Self-starter, analytical problem solver, highly attentive to detail, effective communicator, and obsessed with good documentation.

First Year Plan
During the 1st Month:
• Immerse yourself in the company culture and get to know your team and key stakeholders.
• Build relationships with data engineering team members, understand the day-to-day operating model, and meet the stakeholders we interact with daily.
• Start to learn about our data platform infrastructure, data pipelines, source systems, and inter-dependencies.
• Start participating in standups, planning, and retrospective meetings.
• Start delivering on assigned sprint stories and show progress through completed tasks that contribute to team goals.

Within 3 Months:
• Start delivering on assigned data engineering tasks to support our day-to-day work and roadmap.
• Start troubleshooting production issues and participating in on-call activities.
• Identify areas for improving data engineering processes and share them with the team.

Within 6 Months:
• Contribute consistently toward building our data platform, which includes data pipelines and data warehouse layers.
• Start to independently own workstreams, whether periodic data engineering activities or work items in support of our roadmap.
• Deepen your understanding and build subject matter expertise in our data and ecosystem.

Within 12 Months:
• Your contributions have led to significant progress in implementing the data platform strategy and the key data initiatives that support the company's growth.
• You've established yourself as a key team member with subject matter expertise within data engineering.

About These Postings
Amidst a larger set of postings there are additional roles from Aracaju, Sergipe and other employers such as Amaris Consulting, Fidelis Security, Insight Global, HVAR, Kake, Pride Global, and Niche. This description consolidates several disparate postings that share a common data engineering theme.
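The roles above all describe the same core loop: extract rows from a relational source, transform them in Python, validate the result, and load it for downstream use. For candidates, a minimal sketch of that pattern is shown below using only the standard library; sqlite3 stands in for Oracle/Postgres/Snowflake, and the `regulations` table and its columns are invented for illustration.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from the (hypothetical) source table."""
    return conn.execute("SELECT id, country, status FROM regulations").fetchall()

def transform(rows):
    """Normalize country codes and keep only active records."""
    return [
        (rid, country.strip().upper(), status)
        for rid, country, status in rows
        if status == "active"
    ]

def validate(rows):
    """Basic data-quality checks, similar in spirit to dbt tests."""
    ids = [r[0] for r in rows]
    assert len(ids) == len(set(ids)), "duplicate primary keys"
    assert all(r[1] for r in rows), "null or empty country code"
    return rows

def run_pipeline(conn):
    """Extract -> transform -> validate -> load; returns rows loaded."""
    rows = validate(transform(extract(conn)))
    conn.executemany("INSERT INTO regulations_clean VALUES (?, ?, ?)", rows)
    return len(rows)

# Demo with an in-memory database and made-up data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE regulations (id INTEGER, country TEXT, status TEXT)")
conn.execute("CREATE TABLE regulations_clean (id INTEGER, country TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO regulations VALUES (?, ?, ?)",
    [(1, " br ", "active"), (2, "us", "active"), (3, "de", "retired")],
)
loaded = run_pipeline(conn)
print(loaded)  # prints 2
```

In production, each step would typically live in an Airflow task or a dbt model rather than a single script, but the extract/transform/validate/load shape these postings ask about is the same.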
Location: Quixadá, Ceará, BR
Posted Date: 9/11/2025
Contact Information
Human Resources, buscojobs Brasil