Get ETL / Data Pipeline Engineers
Data pipeline engineers from Brazil, Argentina, and Colombia who build reliable, observable, and maintainable data infrastructure. They work with Airflow, dbt, Spark, Kafka, and modern ELT tools — not spreadsheet imports and cron jobs.
Senior data pipeline engineers in the US earn $155k-$200k. Comparable LATAM engineers deliver at $68k-$105k, a meaningful gap for long-term data infrastructure engagements.
LATAM's 4-8 hour overlap with US teams supports collaborative pipeline debugging and data quality reviews that benefit from real-time coordination.
Data pipelines need clear documentation: lineage diagrams, SLA definitions, and runbooks for when things fail. LATAM pipeline engineers in our network treat documentation as part of the job.
LATAM universities produce strong database and algorithms graduates who move into data engineering. Brazil and Argentina have particularly active data engineering communities.
LATAM data engineers work in the modern stack: Airflow, dbt, Snowflake, Databricks, Kafka, and Spark. The days of custom ETL scripts no one can maintain are behind most of them.
Every candidate completes all five stages before you see their profile. You can also run your own technical round after our screening.
A timed test measuring analytical thinking, pattern recognition, and problem-solving clarity, independent of specific programming language knowledge.
A structured interview assessing communication style, conflict resolution, ownership mindset, and English proficiency in a professional context.
A 90-minute live session covering Airflow DAG design: candidates write a DAG for a realistic ingestion scenario with proper dependency management, error handling, retries, and alerting. We cover dbt model design — incremental vs full-refresh strategy, referencing upstream models, writing schema tests, and handling slowly changing dimensions. The session includes a data quality scenario: a key business metric has been wrong for a week — candidates walk through how they'd diagnose the root cause across ingestion, transformation, and reporting layers, and what monitoring they'd add to prevent recurrence.
Verification of work history, education, and identity with written consent, aligned with applicable privacy rules including LGPD where relevant.
We speak with at least two professional references who worked with the candidate in an engineering context, not personal contacts.
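The retry, backoff, and alerting behavior probed in the live technical session can be sketched outside Airflow itself. The helper below is a hypothetical, plain-Python stand-in (not Airflow's actual API) for the semantics a well-configured task encodes with its `retries` and backoff settings:

```python
import time

def run_with_retries(task, retries=3, base_delay=1.0, alert=print):
    """Retry `task` with exponential backoff; alert on final failure.

    A minimal sketch of the retry/alerting semantics an Airflow task
    gets from its retry settings -- not a real Airflow construct.
    """
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                # Out of retries: page someone, then surface the error.
                alert(f"task failed after {retries} attempts: {exc}")
                raise
            # Exponential backoff: 1x, 2x, 4x, ... the base delay.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

The same shape applies whether the "alert" is a log line, a Slack webhook, or a pager: fail loudly only after retries are exhausted, and back off between attempts so transient upstream outages recover on their own.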
Sample profiles
Anonymized profiles from our vetted talent pool. Actual candidates may vary.
Curitiba, Brazil
4+ years experience
Buenos Aires, Argentina
7+ years experience
São Paulo, Brazil
10+ years experience
Want to see profiles matched to your specific requirements?
Request matched profiles

What to expect
Data pipeline engineers in our network build pipelines that are observable, testable, and recoverable. They treat data quality as a first-class concern and document lineage so the next engineer can understand what flows where. They use AI tools to speed up the repetitive parts of pipeline development.
GitHub Copilot for Airflow operator scaffolding, dbt model generation, and schema inference code. Boilerplate pipeline setup that used to take half a day gets done in an hour.
AI-assisted data quality rule generation: they use tools to suggest initial Great Expectations test suites based on schema inspection, then refine based on business requirements.
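To illustrate what rule suggestion from schema inspection looks like, here is a hypothetical `suggest_expectations` helper. It emits plain rule dicts rather than an actual Great Expectations suite; the function name and output shape are assumptions made for the sketch:

```python
def suggest_expectations(rows):
    """Suggest starter data-quality rules from sample rows.

    Hypothetical helper: a real workflow would emit a Great
    Expectations suite, then refine it against business rules.
    """
    cols = {}
    for row in rows:
        for col, val in row.items():
            info = cols.setdefault(col, {"nulls": 0, "types": set()})
            if val is None:
                info["nulls"] += 1
            else:
                info["types"].add(type(val).__name__)
    rules = []
    for col, info in cols.items():
        # No nulls observed in the sample -> propose a not-null check.
        if info["nulls"] == 0:
            rules.append({"expectation": "not_null", "column": col})
        # A single observed type -> propose a type check.
        if len(info["types"]) == 1:
            rules.append({"expectation": "type", "column": col,
                          "type": next(iter(info["types"]))})
    return rules
```

Suggestions like these are a starting point, not a finish line: the sample may simply not contain the bad rows yet, which is why the refinement pass against business requirements matters.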
They instrument pipelines with data lineage tracking from day one. When something breaks, the impact analysis takes minutes, not days.
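At its core, lineage-backed impact analysis is a graph traversal. A minimal sketch, assuming lineage is available as a `{table: [downstream_tables]}` mapping (`downstream_impact` is an illustrative name, not a specific tool's API):

```python
from collections import deque

def downstream_impact(lineage, changed):
    """Return every table affected by a change to `changed`.

    Breadth-first walk of a lineage graph given as
    {table: [downstream_tables]} -- a sketch of the traversal
    lineage tools run when you ask "what breaks if this breaks?"
    """
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for child in lineage.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen
```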
Documentation includes: data source descriptions, transformation logic, SLA definitions, and known data quality caveats. Analysts don't have to come to them for every question.
They design for idempotency and late-arriving data from the start. Re-running a pipeline twice shouldn't create duplicate records.
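Idempotent, late-data-tolerant merging can be sketched as a keyed upsert where the newest version of a record wins. The names (`upsert`, `key`, `version`) are illustrative, not any particular warehouse's API:

```python
def upsert(target, batch, key="id", version="updated_at"):
    """Merge `batch` into `target` by key, newest version winning.

    Re-running the same batch leaves the result unchanged
    (idempotent), and a late-arriving row replaces an existing one
    only if its version is at least as new. Sketch only.
    """
    merged = {row[key]: row for row in target}
    for row in batch:
        current = merged.get(row[key])
        if current is None or row[version] >= current[version]:
            merged[row[key]] = row
    return list(merged.values())
```

The same logic is what a warehouse `MERGE` statement expresses declaratively; the point is that replaying a batch, or receiving it out of order, never creates duplicates.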
Junior Pipeline Engineer
Mid Pipeline Engineer
Senior Pipeline Engineer
Lead Data Engineer
FAQ
Related roles
Tell us your current data stack and what's breaking. We'll have a shortlist in 3 days.