Data Warehouse Engineer

Plutus, Remote

$50,000 - $50,000

Remote Full-time
Development · 2-5 years · Engineer · Backend Dev · Front end
By Remote Ok

Partnership with Remote Ok

Job description

We are seeking a skilled Data Warehouse Engineer to join our data team. In this role, you will be responsible for designing, developing, and maintaining scalable data solutions using dbt, Fivetran, and Braze. You will play a key role in building and optimising data pipelines, ensuring data quality, and supporting our business intelligence and marketing efforts through seamless data integration and automation.

As a Data Warehouse Engineer, you will collaborate with data analysts, engineers, and marketing teams to ensure that data flows smoothly between different systems, creating a robust infrastructure that drives business insights and customer engagement.


Key Responsibilities:
  • Data Pipeline Development: Design, build, and maintain ETL/ELT data pipelines using Fivetran to integrate various data sources into the data warehouse.
  • Data Modeling with dbt: Develop and maintain data models and transformations using dbt (Data Build Tool) to optimise the structure of the data warehouse for analytics and reporting.
  • Braze Integration: Work closely with the marketing team to integrate Braze for personalised customer engagement, ensuring smooth data flow between the warehouse and the platform.
  • Data Warehouse Management: Maintain and optimise the performance of the data warehouse (e.g., Snowflake, BigQuery, Redshift) by managing schema design, partitioning, and indexing.
  • Data Quality and Monitoring: Implement data quality checks, conduct audits, and monitor pipeline health to ensure reliable and accurate data delivery.
  • Collaboration: Work closely with data analysts, BI teams, and marketing to understand data needs, improve data availability, and deliver actionable insights.
  • Automation & Optimisation: Implement automation for data ingestion, transformation, and orchestration to improve operational efficiency and reduce manual intervention.
  • Documentation & Best Practices: Create and maintain comprehensive documentation of data architecture, pipeline processes, and best practices for future reference and onboarding.
  • Troubleshooting & Support: Identify, investigate, and resolve data-related issues in a timely manner.


Please mention the word **RECOVER** and tag RMzQuOTYuNDYuMTIw when applying to show you read the job post completely (#RMzQuOTYuNDYuMTIw). This is a beta feature to avoid spam applicants. Companies can search for these words to find applicants who read this and confirm they're human.