Overview

About Remote

Remote is solving global remote organizations’ biggest challenge: employing anyone anywhere compliantly. We make it possible for businesses big and small to employ a global team by handling global payroll, benefits, taxes, and compliance (learn more about how it works at remote.com/how-it-works). We’re backed by A+ investors and our team is world-class, literally and figuratively, as we’re all scattered around the world.

Please check out our public handbook (at remote.com/handbook) to learn more about our culture. We encourage folks of all ethnic groups, genders, sexualities, ages, and abilities to apply. You can also check out independent reviews by other candidates on Glassdoor. If this job description resonates with you, we want to hear from you!

All of our positions are fully remote. You do not have to relocate to join us!

We use a LinkedIn feature called “multiplexing”, which creates several location-specific job postings from a single global position we publish. Multiplexing makes our global job post compatible with LinkedIn’s system and lets us manage inbound applications by location.

We encourage candidates to apply to any of these roles, since all of them are in fact global; any location-specific details are called out in the Practicals section below. 🌎

How we work

We love working async (www.notion.so/80c01cd443ad4c77a8ceaef7c5fba5d0), which means you get to set your own schedule.

We empower ownership and proactivity: when in doubt, default to action instead of waiting.

The position

As a data engineer, you will be the link between data producers and data consumers at Remote. You’ll primarily focus on building out our data pipeline to unify our various data sources in a compliant manner. That being said, you should also be able to jump in as needed and help deliver consumable data to internal users.

Requirements

Must have (professional experience)

  • 3+ years of experience with SQL (we use PostgreSQL at Remote)
  • 3+ years of experience with data pipeline tools (e.g., Meltano or Stitch)
  • Experience with BI tools (e.g., Metabase, Looker, or Tableau)

Key responsibilities

  • Maintain our data pipeline by scheduling extractors within Meltano and handling errors.
  • Write custom extractors in Python for our Meltano ELT pipeline.
  • Write transformations using dbt.
  • Identify and address data quality issues.
  • Build documentation around our tools.
  • Work with stakeholders to build the data pipelines that deliver the data they need, while maintaining safe data access.
  • Build a clear vision for the needs of the data team and how to improve our processes.
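To give a flavour of the extractor work above: Meltano custom extractors are Singer taps, small programs that emit SCHEMA, RECORD, and STATE messages as JSON lines on stdout. Below is a minimal sketch of that message flow; the `employees` stream and its fields are hypothetical, and a production tap would typically be built on the Meltano/Singer SDK rather than hand-rolled like this.

```python
import json
import sys

# Hypothetical stream for illustration; field names are made up.
STREAM = "employees"
SCHEMA = {
    "type": "object",
    "properties": {"id": {"type": "integer"}, "name": {"type": "string"}},
}

def run_tap(rows, out=sys.stdout):
    """Emit a SCHEMA message, one RECORD per row, then a final STATE bookmark."""
    # SCHEMA: describes the stream before any records are sent.
    out.write(json.dumps({"type": "SCHEMA", "stream": STREAM,
                          "schema": SCHEMA, "key_properties": ["id"]}) + "\n")
    last_id = None
    for row in rows:
        # RECORD: one message per extracted row.
        out.write(json.dumps({"type": "RECORD", "stream": STREAM,
                              "record": row}) + "\n")
        last_id = row["id"]
    # STATE: a bookmark so the next run can resume incrementally.
    out.write(json.dumps({"type": "STATE",
                          "value": {STREAM: {"last_id": last_id}}}) + "\n")

if __name__ == "__main__":
    run_tap([{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}])
```

The target (loader) on the other end of the pipe reads these lines and writes the records to the warehouse; Meltano wires the two together and persists the STATE message between runs.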

Practicals

  • You’ll report to: Head of Automation
  • Team: Automation
  • Location: Anywhere in the World
  • Start date: As soon as possible

Application process

  1. (async) Profile review
  2. Interview with recruiter
  3. Interview with future manager
  4. (async) Small challenge
  5. (async) Challenge Review
  6. Interview with team members (no managers present)
  7. Prior employment verification check(s)
  8. (async) Offer

#LI-DNI

Remote Compensation Philosophy

Remote’s Total Rewards philosophy is to ensure fair, unbiased compensation and competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labour practices, and therefore pay a minimum salary of USD 40,000 per year in all locations throughout the world. Actual compensation may vary based upon geographical location, experience, and/or skill level, but it will never be below the global minimum mentioned above.

Benefits

You can learn more about the benefits we’re offering to all internal employees at Remote by visiting our public Benefits & Perks Handbook page (at www.notion.so/people-Benefits-perks-1e48a5869c274f40910b76d405b92f63).

How to apply

Please fill out the form below and upload your CV in PDF format. See how to convert your CV to PDF here. Thank you!