buscojobs Brasil

Data Engineer (Dataviz / Power BI)

Job Location

Roraima, Brazil

Job Description

Overview

Our client is a U.S.-based company that provides technical expertise, testing, and certification services to the global food and agricultural industry. Their mission is to ensure food safety, quality, and sustainability across international supply chains. This role is critical to building, maintaining, and modernizing data pipelines that process large-scale regulatory data from around the world and transform it into usable datasets for downstream applications and APIs. The engineer will work hands-on with Python, SQL, and related tools to untangle legacy "spaghetti code" pipelines, migrate processes to more maintainable platforms such as Airflow (see the sketch after the qualifications lists below), and ensure that data is accurate, reliable, and ready for client-facing products. This role requires both strong technical ability and a consulting mindset: the ability to learn undocumented systems, troubleshoot gaps, and design forward-looking solutions that will scale as the data environment evolves.

Required Qualifications:
- Minimum 7 years' experience using Python for analyzing, extracting, creating, and transforming large datasets.
- Proficiency in Python 3 and common Python libraries and tools for data engineering, specifically Pandas, NumPy, and Jupyter Notebooks.
- Deep experience with SQL and relational data using Oracle, Postgres, or MS SQL Server.
- Solid understanding of database design principles, data modeling, and data warehousing concepts.
- Excellent troubleshooting skills and instincts.
- Curious, self-motivated, and self-directed; comfortable working within an Agile software development team with short, iterative delivery cycles.
- College degree or equivalent experience in computer science, software development, engineering, information systems, math, food science, or another applicable field of study.

Preferred Qualifications:
- NoSQL database design and development using MongoDB, AWS DynamoDB, or Azure Cosmos DB.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and services related to data storage and processing.
- Exposure to Terraform or other Infrastructure-as-Code tooling.
- Proficiency in Azure DevOps for source code and pipeline management.
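As a loose illustration of the migration work described in the overview above, a chain of legacy scripts might be restructured as an Airflow DAG with explicit task dependencies. The sketch below is hypothetical: the DAG id, task names, and stub functions (extract_regulations, transform_to_dataset, publish_for_api) are invented for illustration and are not taken from the posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_regulations():
    # Pull raw regulatory documents from the source systems (stub).
    ...


def transform_to_dataset():
    # Normalize raw records into clean, queryable datasets (stub).
    ...


def publish_for_api():
    # Load the datasets where downstream applications and APIs can serve them (stub).
    ...


with DAG(
    dag_id="regulatory_data_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_regulations)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_dataset)
    publish = PythonOperator(task_id="publish", python_callable=publish_for_api)

    # Explicit ordering replaces the implicit coupling of "spaghetti" scripts.
    extract >> transform >> publish

The point of a structure like this is that each step can be retried, monitored, and documented independently, which is what makes the pipeline maintainable compared with the legacy script chain.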
TCS

Come to one of the biggest IT services companies in the world! Here you can transform your career!

Why join TCS? Here at TCS we believe that people make the difference, which is why we live a culture of unlimited learning, full of opportunities for improvement and mutual development: the ideal scenario to expand ideas through the right tools, contributing to our success in a collaborative environment.

We are looking for a Data Engineer (remote) who wants to learn and transform their career.

Responsibilities (in this role you will):
- Work with Snowflake, DBT, SQL, and Agile methodologies.
- Operational Monitoring: Proactively monitor data jobs and pipelines to ensure smooth execution and timely delivery of datasets; respond to alerts and resolve issues with minimal downtime (a sketch of this kind of check follows this posting).
- Pipeline Maintenance: Maintain and enhance DBT models and SQL scripts to support evolving business needs and ensure data accuracy.
- Warehouse Operations: Oversee Snowflake operations, including user access, query performance, and resource utilization.
- Incident Response: Act as a first responder for data job failures, conducting root cause analysis and implementing preventive measures.
- Collaboration: Work closely with data engineers, analysts, and business stakeholders to support operational data needs and troubleshoot issues.
- Process Optimization: Identify opportunities to automate manual tasks, improve pipeline efficiency, and reduce operational overhead.
- Documentation & Reporting: Maintain clear documentation of operational procedures, job schedules, and incident logs; provide regular updates to stakeholders on system health and performance.

What can you expect from us?
- Professional development and constant evolution of your skills, always in line with your interests.
- Opportunities to work outside Brazil.
- A collaborative, diverse, and innovative environment that encourages teamwork.

What do we offer?
- Health insurance
- Life insurance
- Gympass
- TCS Cares – a free 0800 line providing psychological assistance (24 hrs/day), plus legal, social, and financial assistance to associates
- Partnership with SESC
- Reimbursement of certifications
- Free TCS Learning Portal – online courses and live training
- International experience opportunities
- Discount partnerships with universities and language schools
- Bring Your Buddy – by referring people you become eligible for a bonus for each hire
- TCS Gems – recognition for performance
- Xcelerate – free mentoring career platform
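As one possible shape for the operational-monitoring duty in the TCS posting above, a scheduled check could query Snowflake's query history and surface recent failures. This is a minimal sketch assuming the snowflake-connector-python package and read access to the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view; the connection parameters and the one-hour window are placeholders.

import snowflake.connector

# Placeholders; real credentials would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="monitor_user",
    password="...",
    warehouse="OPS_WH",
)

# ACCOUNT_USAGE.QUERY_HISTORY lags real time, so alerting windows
# should be generous; one hour here is an arbitrary example.
FAILED_QUERIES_SQL = """
    SELECT query_id, query_text, error_message
    FROM snowflake.account_usage.query_history
    WHERE execution_status LIKE 'FAIL%'
      AND start_time > DATEADD('hour', -1, CURRENT_TIMESTAMP())
"""

cur = conn.cursor()
for query_id, query_text, error_message in cur.execute(FAILED_QUERIES_SQL).fetchall():
    # In a real setup this would page on-call or open an incident ticket.
    print(f"FAILED {query_id}: {error_message}")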
Boa Vista, Roraima
Amaris Consulting

We are looking for dynamic consultants to grow our Information Systems and Digital team in Brazil. Your experience, knowledge, and commitment will help us face our clients' challenges. You will support different projects through your expertise as a Data Engineer.

Your main responsibilities:
- Develop and maintain robust ETL pipelines to acquire data from diverse sources, including Oracle, SAP, and SQL-based systems.
- Transform raw data into clean, structured datasets that support reporting, analytics, and data science use cases.
- Collaborate with Data Science, Reporting, and Front-End teams to deliver reliable and reusable data solutions.
- Contribute to the creation of reusable frameworks, standardize patterns, and maintain comprehensive technical documentation.
- Participate in agile development processes, engaging in daily stand-ups and iterative releases with product managers and engineering peers.
- Monitor, troubleshoot, and optimize data workflows to ensure high performance and availability in a production environment.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
- 3 years of experience in data engineering.
- Proven experience designing and implementing end-to-end ETL/ELT data pipelines.
- Strong proficiency in SQL and Python for data processing and transformation.
- Hands-on experience with Azure cloud services and familiarity with tools such as Databricks, Delta Lake, and Spark.
- Knowledge of CI/CD pipelines and version control tools such as Git, Azure DevOps, or GitHub.
- Comfortable working in agile environments, with experience in iterative development and A/B testing methodologies.
- An English CV is a must.

Boa Vista, Roraima
Fidelis Security

We are seeking a skilled and experienced Data Engineer to join our Threat Research team. The primary responsibility of this role will be to design, develop, and maintain data pipelines for threat intelligence ingestion, validation, and export automation flows.

Responsibilities:
- Design, develop, and maintain data pipelines for ingesting threat intelligence data from various sources into our data ecosystem.
- Implement data validation processes to ensure data accuracy, completeness, and consistency (a sketch of such a check follows this posting).
- Collaborate with threat analysts to understand data requirements and design appropriate solutions.
- Develop automation scripts and workflows for data export processes to external systems or partners.
- Optimize and enhance existing data pipelines for improved performance and scalability.
- Monitor data pipelines and troubleshoot issues as they arise, ensuring continuous data availability and integrity.
- Document technical specifications, data flows, and procedures for data pipeline maintenance and support.
- Stay updated on emerging technologies and best practices in data engineering and incorporate them into our data ecosystem.
- Provide technical guidance and support to other team members on data engineering best practices and methodologies.

Requirements:
- Proven experience as a Data Engineer or in a similar role, with a focus on data ingestion, validation, and export automation.
- Strong proficiency in Python.
- Experience with data pipeline orchestration tools such as Apache Airflow, Apache NiFi, or similar.
- Familiarity with cloud platforms such as Snowflake, AWS, Azure, or Google Cloud Platform.
- Experience with data validation techniques and tools for ensuring data quality.
- Experience building and deploying images using containerization technologies such as Docker and Kubernetes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.

This position is fully remote and is a contractor (PJ) position. The salary range is 18-25/hr.
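To make the data-validation responsibility in the Fidelis posting above concrete, a batch of ingested indicator records could be checked for completeness, consistency, and plausible timestamps before export. The sketch below uses pandas; the column names (indicator, source, first_seen) and the rules are invented for illustration and are not Fidelis's actual schema.

import pandas as pd


def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return a list of validation errors; an empty list means the batch passes."""
    errors = []

    # Completeness: required fields must exist and be non-null.
    for col in ("indicator", "source", "first_seen"):
        if col not in df.columns:
            errors.append(f"missing column: {col}")
        elif df[col].isna().any():
            errors.append(f"null values in column: {col}")

    # Consistency: no duplicate indicators within a single batch.
    if "indicator" in df.columns and df["indicator"].duplicated().any():
        errors.append("duplicate indicators in batch")

    # Accuracy spot check: timestamps must parse and not be in the future
    # (naive timestamps assumed for simplicity).
    if "first_seen" in df.columns:
        parsed = pd.to_datetime(df["first_seen"], errors="coerce")
        if parsed.isna().any():
            errors.append("unparseable first_seen timestamps")
        elif (parsed > pd.Timestamp.now()).any():
            errors.append("first_seen timestamps in the future")

    return errors

A gate like this would typically run between ingestion and export, so that bad batches are quarantined rather than propagated to external partners.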
Required Skills & Experience:
- 3 years of experience with Power BI development and engineering (DAX and semantic model development and deployment).
- Strong hands-on coding ability in PySpark and Python.
- Professional working experience with Databricks, Fabric, or similar Spark platforms.
- Working knowledge of Azure DevOps (ADO) for repository and version management.
- Strong communication skills and experience engaging directly with business stakeholders and technical teams.

Nice-to-Have Skills & Experience:
- Knowledge of Tabular and ALM tools for semantic model work and automation.
- Experience working for a global company or with a global team.

Job Description
A global biopharmaceutical company is seeking a strong Power BI/Data Engineer to join their team to support a portfolio of ongoing clinical operations projects. These individuals will work on a team of Power BI/Data Engineers of all levels and will partner with technical and functional partners to meet project needs. The ideal candidates should have in-depth expertise with Power BI (including DAX and semantic model development), strong PySpark and Python coding experience, and experience with Databricks, Fabric, or a similar Spark platform. Additionally, strong communication skills are a must, along with a proven ability to work with cross-functional teams and stakeholders to drive project work.

Kake

About the Role
We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting, and decision-making across two distinct brands.

What You'll Do:
- Build highly consumable and cost-efficient data products by synthesizing data from diverse source systems.
- Ingest raw data using Fivetran and Python, staging and enriching it in BigQuery to provide consistent, trusted dimensions and metrics for downstream workflows (a sketch of this staging step follows this posting).
- Design, maintain, and improve workflows that ensure reliable and consistent data creation, proactively addressing data quality issues and optimizing for performance and cost.
- Develop LookML views and models to democratize access to data products and enable self-service analytics in Looker.
- Deliver ad hoc SQL reports and support business users with timely insights.
- (Secondary) Implement simple machine learning features into data products using tools like BQML.
- Build and maintain Looker dashboards and reports to surface key metrics and trends.

What We're Looking For:
- Proven experience building and managing data products in modern cloud environments (GCP preferred).
- Strong proficiency in Python for data ingestion and workflow development.
- Hands-on expertise with BigQuery, dbt, Airflow, and Looker.
- Solid understanding of data modeling, pipeline design, and data quality best practices.
- Excellent communication skills and a track record of effective collaboration across technical and non-technical teams.

Why Join Kake?
Kake is a remote-first company with a global community; we fully believe that it's not where your table is, but what you bring to the table. We provide top-tier engineering teams to support some of the world's most innovative companies, and we've built a culture where great people stay, grow, and thrive. We're proud to be more than just a stop along the way in your career; we're the destination.

The icing on the Kake:
- Competitive pay in USD – work globally, get paid globally.
- Fully remote – simply put, we trust you.
- Better Me Fund – we invest in your personal growth and passions.
- Compassion is Badass – join a community that invests in social good.
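As a rough picture of the BigQuery staging-and-enrichment step in the Kake posting above, a Python workflow could take Fivetran-landed raw tables and rebuild a trusted dimension table from them. This is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are placeholders, not Kake's actual warehouse layout.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

# Rebuild a trusted dimension from raw tables that Fivetran keeps in sync.
# Dataset and table names are placeholders.
ENRICH_SQL = """
    CREATE OR REPLACE TABLE `my-project.staging.dim_customer` AS
    SELECT
        o.customer_id,
        ANY_VALUE(c.customer_name) AS customer_name,
        COUNT(*)                   AS lifetime_orders,
        SUM(o.order_total)         AS lifetime_revenue
    FROM `my-project.raw.orders` AS o
    LEFT JOIN `my-project.raw.customers` AS c USING (customer_id)
    GROUP BY o.customer_id
"""

# query() submits the job; result() blocks until completion and raises on errors.
client.query(ENRICH_SQL).result()

In practice a step like this would be scheduled from Airflow (which the posting also lists), with the resulting staging table exposed to Looker through LookML views.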

Location: Roraima, Brazil, BR

Posted Date: 9/11/2025

Contact Information

Contact Human Resources
buscojobs Brasil

Posted

September 11, 2025
UID: 5386296540

AboutJobs.com does not guarantee the validity or accuracy of the job information posted in this database. It is the job seeker's responsibility to independently review all posting companies, contracts and job offers.