Headquarters: Birmingham, AL
Enable and scale self-serve analytics for all Fleetio team members. You’ll architect clean datasets by modeling data and metrics with tools like dbt, empowering employees to make data-driven decisions with accurate information.
Collaborate with other Fleetians around the company to understand data needs and ensure required data is collected, modeled, and available to team members.
Lead the development of our internal data infrastructure stack. Own the hygiene and integrity of our data warehouse and ancillary tools by maintaining and monitoring the ELT pipeline.
Establish governance and enforce standards around our data infrastructure. Document best practices and coach other data analysts, product managers, and engineers on topics such as data modeling and SQL query optimization and reusability. Keep our data warehouse tidy by managing roles and permissions and deprecating old projects.
Ensure that each company metric has a single source of truth and that data validation is applied consistently.
Hire and mentor junior analytics engineers. Provide technical leadership in data modeling, metrics, and visualization.
Join an incredible team that goes above and beyond daily to make Fleetio a great place to work and leave your mark on our growth story.
Work remotely (within the United States), or at our Birmingham, AL HQ. We strive to promote a strong remote working culture and have done so since the beginning.
We place great emphasis on work/life balance. We have families and hobbies and know you do, too.
Watch our culture videos: https://fleet.io/culture
Fleetio overview video: https://www.youtube.com/watch?v=IlvIbwZT3oU
More about the Fleetio platform: https://www.fleetio.com/features
Our careers page: https://www.fleetio.com/careers
3+ years of experience in a data or analytics engineering role
Expert-level SQL skills with experience transforming raw data into clean models, optimizing code, and troubleshooting & improving others’ code
Strong understanding of ELT, data warehousing, and data modeling concepts (e.g., star schema)
Experience in designing, building, and administering modern data pipelines and data warehouses
Experience with dbt and dbt Cloud – both as a user and administrator
Experience with Snowflake, BigQuery, or Redshift
Experience with version control tools such as GitHub or GitLab
Experience administering ELT tools such as Stitch or Fivetran
Experience with business intelligence solutions (Looker, Tableau, Periscope, Mode)
Experience collaborating with multiple business functions and stakeholders to develop metrics and key insights
Detailed understanding of schemas for popular third-party SaaS applications (e.g. Salesforce, Marketo, Snowplow, Segment)
Excellent communication and project management skills with a customer-service-focused mindset
Proficiency with Python (preferred)
Experience with serverless cloud functions (AWS Lambda, Google Cloud Functions, etc.)
100% coverage of employee health and dental insurance (50% for family members)
401(k) + match
Company stock options
Vision insurance, short-term disability (STD) & long-term disability (LTD)
Dependent Care FSA and Medical FSA
Generous PTO, Company Holidays & Floating Holiday
Community service funds
Professional development funds
Health and wellness incentives
Remote working friendly since 2012