We are seeking a visionary and hands-on Senior Data Architect with 10+ years of experience to lead the design and implementation of scalable, secure, and high-performance data architectures across GCP, AWS, and Azure. This role is central to our cloud-first strategy, enabling advanced analytics, AI/ML, and real-time data capabilities across the enterprise.
Responsibilities
Define and implement enterprise-wide data architecture strategies across multiple cloud platforms
Design data lakehouse, data mesh, and real-time streaming architectures using GCP, AWS, and Azure services
Lead the development of conceptual, logical, and physical data models to support analytics, operational systems, and AI/ML workloads
Collaborate with engineering, analytics, and business teams to align data architecture with strategic goals
Establish data governance, metadata management, and data quality frameworks
Architect ETL/ELT pipelines, data ingestion frameworks, and semantic layers for BI and ML
Optimize data storage, compute, and query performance across cloud environments
Ensure compliance with security, privacy, and regulatory standards (e.g., GDPR, HIPAA, SOC 2)
Evaluate and integrate emerging technologies to future-proof the data ecosystem
Mentor data engineers, modelers, and junior architects
Qualifications
10+ years of experience in data architecture, with proven success in large-scale, cloud-native environments
Deep expertise in GCP (BigQuery, Dataflow, Dataplex, Pub/Sub), plus working knowledge of AWS (Redshift, Glue, S3) and Azure (Synapse, Data Factory, ADLS)
Strong understanding of data modeling, data warehousing, streaming, and NoSQL systems
Proficiency in SQL, Python, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
Experience with metadata catalogs, data lineage tools, and governance platforms
Familiarity with BI tools (Looker, Power BI, Tableau) and ML platforms (Vertex AI, SageMaker, Azure ML)
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
Certifications: GCP Professional Data Engineer, AWS Certified Solutions Architect, or Azure Data Engineer Associate
Experience in regulated industries such as finance, healthcare, or government
Exposure to data mesh principles, event-driven architectures, and real-time analytics
Strong leadership and stakeholder management skills
Seniority level
Associate
Employment type
Full-time
Job function
Business Development
Industries
Technology, Information and Internet
Location: Etobicoke, Ontario, Canada
Salary: CA$110,000.00-CA$125,000.00
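The ETL/ELT and data-quality responsibilities above can be sketched in miniature. This is a hedged, library-free toy, not the posting's actual stack: the field names and records are invented, and a real pipeline would run on Dataflow, Glue, or Data Factory rather than in-process Python.

```python
# Toy extract -> transform -> load pipeline illustrating the ETL/ELT and
# data-quality framework responsibilities above. All data is hypothetical.

def extract():
    # Stand-in for pulling raw records from a source system.
    return [{"id": 1, "email": " A@X.COM "}, {"id": 2, "email": None}]

def transform(rows):
    """Normalize emails and drop rows that fail the data-quality check."""
    out = []
    for r in rows:
        if r["email"]:  # quality rule: email must be present
            out.append({"id": r["id"], "email": r["email"].strip().lower()})
    return out

def load(rows, target):
    # Stand-in for writing to a warehouse table (e.g., BigQuery/Redshift).
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

The same shape (extract, validate/normalize, load) underlies the managed-service pipelines the posting names; only the execution engine changes.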
Location:
Calgary, AB (Hybrid, 3 days onsite)
The client interview round will be conducted in person at the client location in Calgary, AB
Top skills required for this role:
Demonstrate proficiency in Kafka, Oracle, and PL/SQL for effective data management and processing
Possess strong expertise in Spring Boot, Microservices, and REST Web Services for robust application development
Exhibit deep understanding of Core Java for efficient coding and system integration
Showcase domain knowledge in Cards and Payments, and Asset and Wealth Management for targeted solution delivery
Job Description/ Responsibilities
Lead the design and implementation of scalable microservices architecture to enhance system performance and reliability
Oversee the development and deployment of RESTful web services to facilitate seamless communication between applications
Provide technical guidance and mentorship to the development team, fostering a collaborative and innovative environment
Collaborate with cross-functional teams to gather requirements and translate them into technical specifications
Ensure the integration of Kafka for efficient data streaming and real-time processing within the application ecosystem
Utilize Oracle and PL/SQL for database management, ensuring data integrity and optimal performance
Implement Spring Boot and Spring Core frameworks to streamline application development and enhance modularity
Conduct code reviews and ensure adherence to best practices and coding standards
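The Kafka integration work described above reduces to a producer/consumer pattern. Kafka itself requires a broker and a client library; this in-memory stand-in (a queue acting as a "topic") is only a hypothetical sketch of the shape of that integration, with invented card-payment events, and models none of Kafka's partitioning, offsets, or durability.

```python
# In-memory stand-in for a Kafka topic, illustrating the produce/consume
# pattern behind the streaming responsibilities above. Hypothetical events.
from queue import Queue

topic = Queue()  # stand-in for a single topic partition

def produce(event):
    """Publish an event to the topic."""
    topic.put(event)

def consume(handler):
    """Drain the topic, applying `handler` to each event in arrival order."""
    processed = []
    while not topic.empty():
        processed.append(handler(topic.get()))
    return processed

produce({"card": "****1111", "amount": 42.50})
produce({"card": "****2222", "amount": 17.25})
totals = consume(lambda e: e["amount"])
```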
Lead Python Developer for Quantitative Finance
Location:
Toronto, ON (Hybrid, 3 days onsite)
Role Summary
We are seeking a highly skilled Lead Python Developer to join a team building high-performance, scalable reporting solutions for fixed income pricing and risk metrics, leveraging quantitative finance libraries and integrating with Bloomberg APIs. The ideal candidate will bring deep technical expertise in Python and financial computation frameworks.
Key Responsibilities
Design and implement Python-based modules of quantitative finance (QuantLib) software libraries for bond pricing, yield curve modeling, and risk analytics.
Develop and maintain reusable components for QuantLib financial metrics.
Integrate with Bloomberg APIs and other market data sources to retrieve and validate pricing inputs.
Implement and optimize Quantitative Finance libraries (e.g., QuantLib, PyQL, finmath) for high accuracy metric calculations.
Collaborate with data engineers and analysts to ensure seamless integration with cloud‑based data pipelines.
Ensure precision (three-decimal-place accuracy) in all pricing and risk computations.
Support testing, validation, and deployment of analytics modules in production environments.
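As a rough illustration of the bond-pricing work above, here is a library-free present-value sketch. QuantLib would normally build the curve, schedule, and day counts; the flat yield, coupon, and maturity here are hypothetical, and only the three-decimal rounding mirrors the precision requirement stated above.

```python
# Library-free sketch of fixed-rate bond pricing: discount coupon and
# principal cash flows off a flat yield. Hypothetical numbers throughout;
# a real implementation would use QuantLib's curve and schedule objects.

def bond_price(face, coupon_rate, ytm, years, freq=1):
    """Present value of a fixed-rate bond's cash flows, rounded to 3 dp."""
    coupon = face * coupon_rate / freq
    periods = years * freq
    pv = sum(coupon / (1 + ytm / freq) ** t for t in range(1, periods + 1))
    pv += face / (1 + ytm / freq) ** periods
    return round(pv, 3)  # three-decimal precision, per the requirement above

# 5-year annual bond, 4% coupon, priced off a 5% flat yield -> below par
price = bond_price(face=100, coupon_rate=0.04, ytm=0.05, years=5)
```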
Required Technical Skills
7+ years of professional experience in Python development, with a focus on numerical computing or financial analytics.
Strong experience with Quantitative Finance libraries such as QuantLib, PyQL, or similar.
Proven expertise in API integration, especially with Bloomberg APIs or other financial data providers.
Experience working in Agile/Scrum environments and collaborating with cross‑functional teams.
Seniority level
Mid‑Senior level
Employment type
Contract
Job function
Information Technology
A leading IT consulting firm in Toronto seeks a skilled professional to provide subject matter expertise in tax determination and compliance for LATAM. This role requires 3+ years in SAP localization and hands-on experience with e-invoicing platforms. Proficiency in Spanish and knowledge of LATAM regulations are essential. The position emphasizes collaboration with cross-functional teams and includes responsibilities such as creating functional specifications and user documentation. Employment is on a contract basis with a mid-senior level designation.
A technology solutions provider in Toronto is seeking a professional to develop and implement Kofax solutions. In this role, you will collaborate with cross-functional teams to gather requirements and design effective solutions. You will also optimize existing Kofax applications to improve performance and provide technical support to ensure operations run smoothly. Ideal candidates should have strong skills in Kofax Total Agility and Kofax Transformation Modules.
Junior Executive - Recruitment at Galent
Responsibilities:
Assess and analyze business strategy, business requirements, the current-state environment, and technology choices to evaluate solution alternatives that meet business sponsors' needs in the simplest possible way, taking full advantage of CRM product features and factoring in impact across LoBs using the CRM and FSC org structure.
Ensure both functional and non-functional considerations (resiliency, scalability, security, etc.) are factored in, not only pragmatically for the solution at hand but also in support of long-term strategy realization.
Contribute to the design and development on the Salesforce Financial Service Cloud to drive business value.
Assist scrum teams in gathering requirements/user stories to develop/enhance functionality on the Salesforce platform, including the technical analysis and estimates.
Support in DevOps development and process efficiencies.
Stay current on Salesforce’s development platform and implement Salesforce development best practices, following the client internal standards and guidelines.
Must-have:
3-5 years of Salesforce Platform experience delivering large projects/programs: Salesforce configuration, customization, and development using Salesforce.com.
Hands-on experience with the Lightning platform including APEX coding, Lightning Web Components, APIs and Salesforce Data Model.
Deep understanding of the Salesforce platform and its architecture and product suite including Sales & Service Cloud and/or Financial Services Cloud.
Experience integrating Salesforce applications with 3rd party systems using Mulesoft, REST API and event-based integration patterns.
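As an illustration of the REST-based integration skills listed above, here is a minimal sketch that composes a Salesforce REST API SOQL query endpoint. The instance URL, API version, and query are hypothetical; a real integration would add OAuth authentication and issue the request through an HTTP client or Mulesoft rather than just building the URL.

```python
# Compose the Salesforce REST API query endpoint for a SOQL statement.
# Instance URL, API version, and SOQL below are hypothetical examples.
from urllib.parse import urlencode

def soql_query_url(instance_url, api_version, soql):
    """Build the /services/data/vXX.X/query/ URL with the SOQL encoded."""
    return (f"{instance_url}/services/data/v{api_version}/query/"
            f"?{urlencode({'q': soql})}")

url = soql_query_url("https://example.my.salesforce.com", "59.0",
                     "SELECT Id, Name FROM Account LIMIT 5")
```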
Seniority Level:
Mid-Senior level
Employment Type:
Full-time
Job Function:
Information Technology
Industries:
Technology, Information and Internet
Overview
Core Technologies: Java 8+, Spring Boot, RESTful APIs, Microservices.
Optimization: OptaPlanner for constraint-solving and scheduling.
Database: PostgreSQL, MySQL, or NoSQL databases.
Responsibilities
Design and implement robust, scalable solutions using Java, Spring Boot, and a microservices architecture.
Define integration patterns and deployment strategies for AWS-based environments.
Incorporate OptaPlanner for constraint-solving and optimization in business workflows.
Architect solutions leveraging AWS services such as EC2, Lambda, API Gateway, RDS, S3, and CloudFormation.
Ensure security best practices, including IAM, encryption, and disaster recovery protocols.
Lead technical decision-making and mentor development teams.
Conduct code reviews and enforce coding standards.
Collaborate with stakeholders to translate business requirements into technical specifications.
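OptaPlanner's role in the stack above is constraint-based optimization. OptaPlanner itself is a Java library with far richer solvers; this brute-force Python toy only sketches the kind of problem it addresses, and the jobs, durations, and machines are invented.

```python
# Toy constraint-solving illustration: assign jobs to machines so the
# makespan (busiest machine's load) is minimized. Brute-force stand-in
# for the heuristic search OptaPlanner performs; hypothetical data.
from itertools import product

jobs = {"A": 3, "B": 2, "C": 4, "D": 1}   # job -> duration
machines = ["m1", "m2"]

def best_assignment(jobs, machines):
    """Exhaustively try every assignment, keeping the minimal makespan."""
    best, best_makespan = None, float("inf")
    for combo in product(machines, repeat=len(jobs)):
        load = {m: 0 for m in machines}
        for job, m in zip(jobs, combo):
            load[m] += jobs[job]
        makespan = max(load.values())
        if makespan < best_makespan:
            best, best_makespan = dict(zip(jobs, combo)), makespan
    return best, best_makespan

assignment, makespan = best_assignment(jobs, machines)
```

Brute force is exponential in the number of jobs; that scaling wall is exactly why a metaheuristic solver like OptaPlanner is in the stack.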
Performance & Optimization
Identify and resolve performance bottlenecks.
Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline.
Documentation
Maintain architecture diagrams, design documents, and technical standards.
Additional Information
Strong understanding of Agile methodologies, CI/CD, and security best practices
BSA (Liquidity Reporting / Capital Markets product knowledge)
Job Title:
BSA (Liquidity Reporting / Capital Markets product knowledge)
Location:
Toronto, ON (Hybrid, 4 days onsite)
Interview type:
In-person Interview - End Client round
Experience:
7+ years of experience as a Business Systems Analyst, recently in Treasury space with direct involvement in system build and integration projects.
3+ years of hands‑on experience implementing QRM as an ALM, Liquidity Management, Stress Testing or Forecasting tool – consulting, deployment, implementation, and support.
Solid understanding of capital markets instruments, valuation techniques, and interest rate risk (IRR).
Expert at analyzing and reconciling large datasets, and at conducting investigations to drive business requirements, decisions, and solutions.
5+ years of experience in Agile delivery.
High level responsibilities:
Collaborate with business lines to analyze current state and gather, analyze, and document requirements related to ALM, liquidity management, FTP, and SIRR.
Write detailed business and functional specifications for internal and external system interfaces.
Lead requirements workshops and create artifacts such as data flow diagrams, process models, and message mapping documents.
Assist in testing efforts through scenario definition, defect triage, and requirements traceability.
Implement the QRM platform by understanding the business objectives of the Treasury pillars, ensuring it aligns with business needs and objectives.
Support the implementation of QRM‑related initiatives, ensuring successful project outcomes.
Utilize your deep understanding of QRM to address complex business challenges.
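The large-dataset reconciliation work described above can be sketched as a keyed comparison with a tolerance. All trade IDs, balances, and the tolerance itself are hypothetical; a real reconciliation would also report records present only in the target side.

```python
# Toy reconciliation: compare two balance extracts by trade ID and flag
# breaks above a tolerance. Hypothetical feeds and figures throughout.

def reconcile(source, target, tol=0.005):
    """Return {trade_id: (source_val, target_val)} where values differ
    by more than `tol` or the trade is missing from the target."""
    breaks = {}
    for trade_id, src_val in source.items():
        tgt_val = target.get(trade_id)
        if tgt_val is None or abs(src_val - tgt_val) > tol:
            breaks[trade_id] = (src_val, tgt_val)
    return breaks

qrm_feed = {"T1": 100.000, "T2": 250.100, "T3": 75.500}
ledger   = {"T1": 100.000, "T2": 250.500}  # T2 differs, T3 missing
breaks = reconcile(qrm_feed, ledger)
```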
Education:
Bachelor's degree in computer science / engineering, mathematics, finance or related field required.
FRM, CFA, M.Fin, MBA or other related advanced degree preferred.
Seniority level
Mid‑Senior level
Employment type
Contract
Job function and Industries
Information Technology
Technology, Information and Internet