Company Detail

Washington Software Inc.


Job Openings

  • Data Engineer

    - Locations: Saint-Georges, Banff, Montcalm, Saint-Laurent, Newmarket, Red Deer, Saint John, Fredericton, Sainte-Marie, Kingston

    Job description
    We're hiring on behalf of an international client for a contract-to-hire position. This is a 100% remote role.

    Role Summary
    As a Data Engineer, you'll help develop scalable, cloud-native data solutions. You'll architect high-performance pipelines, select the right tools for each scenario, and collaborate across teams to drive data-driven decisions.

    Key Responsibilities
    - Design and implement ETL/ELT pipelines using Snowflake and Airflow, tailored to business needs.
    - Build backend services in Golang to support real-time and batch data processing (two illustrative Go sketches follow this listing).
    - Choose the right tools (e.g., Kafka vs. Pub/Sub, dbt vs. plain SQL) based on performance, scalability, and cost.
    - Work closely with analysts, data scientists, and product teams to translate requirements into technical solutions.
    - Optimize Snowflake performance through advanced SQL and data modeling.
    - Ensure data governance, security, and compliance across systems.
    - Mentor junior engineers and promote engineering best practices.
    - Monitor pipeline health and proactively resolve issues.

    Qualifications
    - Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
    - Minimum of 4 years of experience in data engineering.
    - Strong proficiency in Golang and Snowflake.
    - Hands-on experience with Airflow and GCP services (BigQuery, Cloud Functions, etc.).
    - Deep understanding of SQL, data modeling, and pipeline orchestration.
    - Ability to assess and recommend tools based on scenario-specific needs.
    - Familiarity with Kafka, REST APIs, dbt, or Terraform is a plus.
    - Excellent communication and leadership skills.

    Core Technologies
    Snowflake, Golang, Airflow, GCP, SQL & data modeling, ETL/ELT architecture, tool selection & optimization
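
The role's core pattern, Go services that handle both streaming and batch data, can be illustrated with a short sketch. The following is a minimal, self-contained micro-batching worker; it is not part of the posting, and the record type and flush function are hypothetical stand-ins for a real event schema and sink.

    // Minimal sketch of a batch/streaming worker in Go: drain a stream of
    // records and flush them in fixed-size batches. All names here are
    // illustrative, not taken from the posting.
    package main

    import (
        "fmt"
        "time"
    )

    type record struct {
        ID    int
        Value string
    }

    // flush stands in for a real sink (Snowflake, BigQuery, ...).
    func flush(batch []record) {
        fmt.Printf("flushing %d records\n", len(batch))
    }

    // batchWorker drains in, flushing whenever the batch is full or the
    // ticker fires, so latency stays bounded even at low throughput.
    func batchWorker(in <-chan record, size int, every time.Duration) {
        batch := make([]record, 0, size)
        ticker := time.NewTicker(every)
        defer ticker.Stop()
        for {
            select {
            case r, ok := <-in:
                if !ok { // stream closed: flush the remainder and stop
                    if len(batch) > 0 {
                        flush(batch)
                    }
                    return
                }
                batch = append(batch, r)
                if len(batch) == size {
                    flush(batch)
                    batch = batch[:0]
                }
            case <-ticker.C:
                if len(batch) > 0 {
                    flush(batch)
                    batch = batch[:0]
                }
            }
        }
    }

    func main() {
        in := make(chan record)
        go func() {
            for i := 0; i < 10; i++ {
                in <- record{ID: i, Value: "event"}
            }
            close(in)
        }()
        batchWorker(in, 4, 500*time.Millisecond)
    }

The ticker bounds flush latency when traffic is slow, while the size cap bounds memory when traffic is heavy; that trade-off is the crux of most micro-batching loops.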

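Likewise, here is a minimal sketch of connecting a Go service to Snowflake, assuming the official gosnowflake driver (github.com/snowflakedb/gosnowflake); the account, credentials, warehouse, and database names below are placeholders, not real values.

    // Minimal sketch of querying Snowflake from Go via database/sql,
    // assuming the official gosnowflake driver. Connection details are
    // placeholders only.
    package main

    import (
        "database/sql"
        "fmt"
        "log"

        _ "github.com/snowflakedb/gosnowflake" // registers the "snowflake" driver
    )

    func main() {
        // DSN format: user:password@account/database/schema?warehouse=name
        db, err := sql.Open("snowflake", "USER:PASSWORD@my_account/ANALYTICS/PUBLIC?warehouse=ETL_WH")
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // Smoke test: ask Snowflake for its version.
        var version string
        if err := db.QueryRow("SELECT CURRENT_VERSION()").Scan(&version); err != nil {
            log.Fatal(err)
        }
        fmt.Println("connected to Snowflake", version)
    }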

Company Detail

  • Email verified: No
  • Current jobs: 10 (the Data Engineer openings listed above)

