As a technical lead on large-scale, complex initiatives, you will be responsible for end-to-end management of the technical development and delivery of data solutions. This role is the primary point of contact for PMs regarding development activities. The role requires a strong background in file system management, database development, SDLC, and solution support. Beyond core development expertise, this position requires coordination of development estimates, strong analytical and interpersonal skills, knowledge of testing methodology, configuration management, issue resolution, tuning and troubleshooting, and overall solution lifecycle management.

Job Requirements
• 5-8+ years overall IT development experience, with extensive experience in ETL development.
• Lead or provide expertise in defining and managing detailed system design for data integration and ingestion (to data lakes) processes or data mart/warehouse builds or extensions.
• Manage and coordinate the efforts of data modelers, data warehouse/data integration developers, and ETL data management processes in database usage and data warehousing.
• Implement, maintain, and enhance Data Governance, Data Quality, and related policies, in alignment with enterprise standards and frameworks, in collaboration with the Data Managers/Stewards/Owners.
• Create development estimates for the various phases of project requests; review and approve business/system specification requirements, and author system design specifications and system deployment plans.
• Hands-on experience designing, creating, and maintaining a centralized data repository (file system and RDBMS).
• Solid understanding of Oracle Database and Hive; fluent in SQL.
• Strong knowledge of ETL and data warehouse data modeling principles and development best practices.
• Strong verbal and written communication skills: articulating and communicating information and ideas to varying audiences and in multiple written styles (persuasive, informative, narrative) in order to negotiate and gain access to information and records.

Seniority level: Mid-Senior | Employment type: Full-time | Job function: Information Technology | Industries: Banking
Job Title: Business Administrator / Technical Writer
Location: Calgary, AB or Brampton, ON
Duration: 6-12 months
Interview: Video interview
Job Summary: We are seeking a highly organized and detail-oriented Business Administrator / Technical Writer to join our project management team. The ideal candidate will be responsible for providing administrative support, creating and managing documentation, process flows, and technical papers, and ensuring effective communication within the team. This role requires a blend of business administration skills and technical writing expertise to support various projects and initiatives.

Key Responsibilities:
• Technical Writing: Develop clear and concise technical documentation, such as user playbooks, training materials, and process guides, to support project deliverables.
• Documentation Management: Create, maintain, and organize project documentation, including project plans, status reports, meeting minutes, and other relevant documents.
• Administrative Support: Provide comprehensive administrative support to the project management team, including scheduling meetings and managing calendars.
• Communication: Facilitate effective communication within the team and with external stakeholders by preparing and distributing project updates, newsletters, and other communications.
• Data Analysis: Assist in data collection, analysis, and reporting to support project decision-making and performance tracking.
• Process Improvement: Identify opportunities for process improvement and contribute to the development and implementation of best practices within the team.
• Collaboration: Work closely with project managers, business analysts, and other team members to ensure project goals and objectives are met.

Qualifications:
• Proven experience in business administration and technical writing, preferably within a project management environment.
• Strong organizational and time management skills, with the ability to handle multiple tasks and prioritize effectively.
• Excellent written and verbal communication skills, with a keen attention to detail.
• Proficiency in the Microsoft Office Suite (Word, Excel, PowerPoint, Outlook); project management software knowledge would be an asset.
• Ability to work independently and collaboratively in a fast-paced, dynamic environment.
• Familiarity with project management methodologies and tools is a plus.

Optional Additional Skills:
• Experience in the telecommunications industry.
• Knowledge of content management systems and document control processes.
• Certification in project management (e.g., PMP, CAPM) or technical writing (e.g., CPTC) is an asset.
Position: Full Stack Developer / Architect (Java - Collibra)
Mandatory Skills: Java Spring Boot, Collibra, Azure
Location: Toronto, Canada (Hybrid)
Term: Full-time
Seniority level: Mid-Senior | Job function: Information Technology | Industries: IT Services and IT Consulting
Job Description
· At least 12 years of experience; proficient in Java Spring Boot, Python, and Salt scripting
· Good understanding of requirement gathering, design, development, and testing
· Strong knowledge of Azure Functions and SQL
· Hands-on experience in code optimization
· Experience creating test cases
· Aware of Sonar, Veracode, HashiCorp tooling, etc.
· Aware of the complete CI/CD process
· Hands-on experience in API creation and wrapper APIs (see the sketch below)
· Hands-on experience in creating custom adapters
· Understanding of data governance tools (Collibra or any other governance tool)
· Well versed with GitLab and Azure DevOps
· Hands-on experience in Autosys configuration for job automation
· Well versed with documentation
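To make the wrapper-API item concrete, here is a minimal sketch of a thin service in front of Collibra's REST API. It is illustrative only: the Flask framework, the environment-variable names, and the /rest/2.0/assets endpoint with its nameMatchMode parameter are assumptions based on Collibra's public REST API v2 and should be verified against your instance.

```python
# Hypothetical wrapper API sketch -- not any specific production design.
# Assumes Collibra's public REST API v2 (/rest/2.0/assets); verify the
# endpoint, parameters, and response shape against your own instance.
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

COLLIBRA_BASE = os.environ["COLLIBRA_BASE_URL"]  # e.g. https://yourorg.collibra.com
COLLIBRA_AUTH = (os.environ["COLLIBRA_USER"], os.environ["COLLIBRA_PASSWORD"])


@app.route("/assets")
def search_assets():
    """Proxy Collibra asset search, exposing only the fields callers need."""
    resp = requests.get(
        f"{COLLIBRA_BASE}/rest/2.0/assets",
        params={"name": request.args.get("name", ""), "nameMatchMode": "ANYWHERE"},
        auth=COLLIBRA_AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    # Trim the upstream payload to a small, stable contract for downstream apps.
    return jsonify([
        {"id": a["id"], "name": a["name"]}
        for a in resp.json().get("results", [])
    ])


if __name__ == "__main__":
    app.run(port=8080)
```

The value of a wrapper like this is the narrowed contract: downstream consumers depend on two fields rather than on the governance tool's full payload, so vendor upgrades stay contained in one service.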
Position: GCP Data Architect
Term: Full-time
Location: Remote - Canada
GCP Data Architect with healthcare background

Role Overview
We are seeking a highly skilled Solution Architect with extensive experience in the healthcare domain and expertise in designing and implementing data solutions on the GCP data stack. This role involves leading the development of a data lake for a hospital organization, integrating data from on-premises and cloud sources. The candidate must have a strong understanding of healthcare codes and terminologies (e.g., ICD, CPT, LOINC) and data standards such as HL7 and FHIR. They will be responsible for designing the destination data model, building ingestion and transformation pipelines, and creating cloud-native infrastructure using Terraform.

Key Responsibilities
1. Solution Architecture: Design the architecture for a robust and scalable data lake on GCP. Develop a detailed and efficient data model tailored for healthcare use cases, including clinical and operational analytics. Architect pipelines for batch and streaming data ingestion and transformation.
2. Healthcare Data Expertise: Ensure seamless integration of healthcare data standards such as HL7 and FHIR. Work with healthcare terminologies and coding systems, including ICD, CPT, and LOINC, ensuring accurate representation and mapping of clinical data. Address healthcare-specific challenges, such as patient data privacy and regulatory compliance (HIPAA).
3. Infrastructure Development: Design and implement infrastructure for Datastream, Dataflow, BigQuery, Cloud Composer, and Cloud Storage using Terraform. Ensure the infrastructure is optimized for scalability, security, and cost-efficiency.
4. Pipeline and Workflow Design: Develop and implement Dataflow pipelines for data transformation. Use Datastream for real-time data replication from on-premises systems to GCP. Orchestrate workflows using Cloud Composer and Apache Airflow. Leverage Dataform for scheduled data transformations and query automation. (A minimal DAG sketch follows this posting.)
5. Collaboration and Leadership: Collaborate with hospital stakeholders, clinicians, and IT teams to understand requirements and propose tailored solutions. Guide data engineers and developers in implementing best practices for data modeling and pipeline development.
6. Documentation and Governance: Document architectural designs, data models, and pipeline workflows. Define governance policies to ensure compliance with healthcare regulations and maintain data integrity.

Required Skills and Qualifications
• Healthcare Domain Expertise: Strong knowledge of healthcare data standards, including HL7 and FHIR, and terminologies such as ICD, CPT, and LOINC. Experience working with hospital data, including clinical, operational, and financial datasets. Familiarity with healthcare privacy regulations such as HIPAA.
• GCP Expertise: Proficiency in BigQuery, Datastream, Dataflow, Cloud Composer, and Cloud Storage. Experience in designing scalable, secure solutions on the GCP platform.
• Data Engineering and Modeling: Strong expertise in creating and managing data models for healthcare use cases. Proven ability to design data lakes and implement streaming and batch pipelines.
• Terraform and Automation: Hands-on experience with Terraform for provisioning cloud infrastructure. Understanding of Infrastructure as Code (IaC) principles.
• Programming and Tools: Proficiency in Python and SQL for data processing and orchestration. Experience with workflow management tools such as Apache Airflow.
• Soft Skills: Strong communication skills to engage effectively with clinical and technical teams. Leadership capabilities to guide teams and drive project success.
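As an illustration of responsibility 4 above, here is a minimal Cloud Composer (Airflow 2.x) DAG sketch: a templated Dataflow job ingests clinical extracts, then a BigQuery job applies the destination model. Everything named here (project, region, bucket, template, and the clinical_mart.load_encounters procedure) is a placeholder; the operators are the standard ones from the apache-airflow-providers-google package.

```python
# Sketch of a daily ingestion/transformation DAG; all resource names are
# placeholders, assumed for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)

with DAG(
    dag_id="hospital_datalake_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a templated Dataflow job that lands the day's clinical extracts.
    ingest = DataflowTemplatedJobStartOperator(
        task_id="ingest_clinical_extracts",
        project_id="example-project",
        location="us-central1",
        template="gs://example-bucket/templates/clinical_ingest",
        parameters={"inputPath": "gs://example-landing/hl7/{{ ds }}/*"},
    )

    # Apply the destination data model with a stored procedure in BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="apply_destination_model",
        configuration={
            "query": {
                "query": "CALL clinical_mart.load_encounters('{{ ds }}')",
                "useLegacySql": False,
            }
        },
    )

    ingest >> transform
```

In a design like the one described, Datastream replication into the landing bucket would run continuously; the DAG handles only the scheduled transformation layer.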
Role: GCP Infrastructure - DevOps Engineer
Location: Toronto, ON
Duration: Full-time
Role & JD The Google Cloud DevOps Engineer will be responsible for automating infrastructure provisioning and configuration management using Terraform and Ansible. The role involves designing, implementing, and maintaining CI/CD pipelines on GCP using Azure DevOps. The ideal candidate will have extensive experience with GCP resources, particularly in data engineering, and possess strong scripting skills in Python and Bash. Responsibilities: - Automate infrastructure provisioning and configuration management using Terraform and Ansible. - Design, implement, and maintain CI/CD pipelines on GCP using Azure DevOps. -Manage and optimize GCP resources, including Compute Engine, Data Fusion, Dataflow, Cloud Composer, BigQuery, and BigQuery datasets. - Develop and maintain infrastructure as code (IaC) to ensure scalable and reliable deployment of applications. - Collaborate with development and operations teams to ensure seamless integration and delivery of software. - Monitor, troubleshoot, and optimize performance of GCP infrastructure and services. - Implement best practices for security, reliability, and scalability of cloud infrastructure. - Create and maintain documentation for infrastructure and operational processes. - Conduct performance tuning, monitoring, and maintenance of CI/CD pipelines. - Utilize Python and Bash scripting for automation tasks and process improvements.
Qualifications:
- Proven experience as a DevOps Engineer or in a similar role, with a focus on GCP.
- Healthcare domain experience, including HIPAA compliance, is mandatory.
- Strong experience with Terraform and Ansible for infrastructure automation and configuration management.
- Proficient in designing and maintaining CI/CD pipelines using Azure DevOps.
- In-depth knowledge of GCP resources, particularly in data engineering: Compute Engine, Data Fusion, Dataflow, Cloud Composer, BigQuery, and BigQuery datasets.
- Solid scripting skills in Python and Bash.
- Familiarity with containerization technologies such as Docker and Kubernetes.
- Experience with monitoring tools and logging frameworks.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications:
- GCP Professional Cloud DevOps Engineer certification.
- Experience with other CI/CD tools and platforms.
- Familiarity with other programming languages and frameworks.
Role: SAP PO Developer
Duration: Contract

Position Accountabilities:
• Work with development teams from project initiation through to production implementation in a consultative role, representing both EAI and PO.
• Provide status updates on assigned project deliverables.
• Work with application support teams to ensure PO environment availability through effective proactive monitoring and prevention to avoid unplanned outages.
• Ensure SAP PO application failures are handled effectively and resolved in a timely fashion; analyze dumps, traces, and logs to determine the root cause of specific problems.
• Manage stakeholder expectations to ensure the level of service is understood.
• Identify and prioritize enhancements to existing SAP PO interfaces and ensure execution per plan.
• Work with EAI team members to determine best-value service improvement opportunities; actively seek opportunities to increase customer satisfaction.
• Deliver on chosen improvement opportunities following established service development best practices.
• Define, document, implement, and operate repeatable, well-purposed processes.
• Coordinate the diverse groups who contribute to the SAP PO environment and build consensus for requirements across the various IS groups.
• Identify assumptions and constraints, and anticipate and manage risks.
• Liaise with architecture to ensure PO functional and non-functional requirements (e.g., security, compatibility, maintainability) are considered.

Position Requirements:
• University degree in Computer Science, Information Systems, Engineering, Science, Commerce/Business Administration, or equivalent practical experience.
• 7+ years of pure SAP PO experience with increasing degrees of responsibility.
• Proven strong communication skills, both written and oral, with technical and non-technical staff and with management.
• Expertise in facilitation, negotiation, gaining consensus, and managing conflict across diverse stakeholder groups.
• Ability to understand and manage change in the working environment.
• Appreciation and understanding of different cultural values and sensitivities and of how to work in a virtual environment.
• Comprehension of Enterprise Application Integration principles and practices.
• Strong organizational and time management skills.
• Work effectively in ambiguous or stressful situations.
• Stay current with industry directions and technology capabilities.
• Develop relationships with business partners internally and externally to foster a consistent and outstanding business partner experience.
• Interact effectively with various levels of the organization.
• Ability to work independently or in a highly interactive team situation.
• Ability to independently research and learn new skills to solve challenging problems.
• Experience in SAP Process Orchestration 7.5.
• Experience with the System Landscape Directory (SLD).
• Hands-on experience in interface design and configuration using SAP PO.
• Hands-on experience in ABAP, Java, and XSLT.
• Strong graphical mapping skills.
• Experience with the SAP PO Advanced Adapter Engine.
• Experience with SAP PO Alert Management.
• Experience with SAP PO Directory APIs and other standard system APIs.
• Hands-on experience defining and configuring adapters such as File, JMS, JDBC, SOAP, SFTP, REST, and OData.
• Experience with AWS messaging standards, such as S3 and SQS.
• Experience defining high availability for SAP PO systems, message security, etc.
• Experience developing B2B scenarios using the SAP PO B2B Toolkit.
• Experience with API development and its standards.

Assets to have:
• Fundamental knowledge of railway operations and processes.
• Experience with non-SAP EAI tools and their integration with SAP PO.
• Experience with the SAP ASAP methodology.
• Experience with PO and AIF integration and reporting.
• Experience with NetWeaver BPM and BRM.
• Experience developing e-SOA applications using SAP PO and other SAP components.
• Experience with SAP best practices for design and configuration.
• Experience with SAP PO industry adapters.
• Experience with Java Proxy.
• Basic administration experience in SAP PO.
• Experience with third-party adapters.
• Hands-on experience in Boomi.
• Hands-on experience in MuleSoft.
• Hands-on experience in IBM API Connect.

Seniority level: Mid-Senior | Employment type: Contract | Job function: Information Technology | Industries: IT Services and IT Consulting
About the Role
You will be at the forefront of automating and optimizing our processes and systems in the company. We are looking for a dynamic and skilled professional with a strong background in PowerShell, Python, PowerApps, Power Automate, Dataverse, and SharePoint. Your role will involve developing and integrating sophisticated automation solutions, enhancing our change management processes, and contributing to various projects across our IT infrastructure.

Key Responsibilities
• Design, build, and maintain automation solutions using PowerApps, Power Automate, and Dataverse.
• Strong understanding of and experience in building Power Automate logic, including familiarity with connectors and actions and how to integrate them with PowerApps.
• Good understanding of working with multiple screens in PowerApps and integrating them effectively.
• Well versed in Kusto queries, with the ability to write custom queries based on scenarios (see the sketch after this posting).
• Expertise in PowerShell, especially for automation in Microsoft technologies.
• Create and manage dynamic dashboards and interfaces using PowerApps, integrated with SharePoint lists and other data sources.
• Use Figma for front-end development and design.
• Write and maintain PowerShell scripts and Kusto queries for data capture, analysis, and automation of tasks.
• Develop and manage automation for Microsoft's ICM ticketing tool.
• Build and maintain Azure Cloud Adoption Framework (CAF) solutions using PowerShell, ARM, and Azure DevOps CI/CD pipelines.
• Proven experience in PowerApps, Power Automate, Dataverse, and SharePoint.
• Experience with Azure cloud services, including development of cloud infrastructure and monitoring solutions.
• Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.
• Excellent communication and teamwork abilities.
• Prior experience with Azure CAF, ARM templates, and Azure DevOps.

Skills Required
• Advanced knowledge of PowerApps, Power Automate, and Dataverse.
• Expertise in SharePoint development and administration.
• Proficiency in PowerShell scripting and the Kusto Query Language.
• Experience with Figma or similar front-end design tools.
• Familiarity with cloud services and infrastructure (Azure).
• Experience with cloud-based development and deployment tools, including Azure DevOps.
• Strong understanding of CI/CD pipelines and practices.
• Ability to troubleshoot and solve complex technical problems.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple projects simultaneously.
• Keenness to learn new technologies and methodologies.
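To illustrate the custom Kusto queries mentioned above, here is a minimal sketch using the azure-kusto-data Python package with Azure CLI authentication. The cluster URL, ExampleDb database, IncidentEvents table, and the severity rollup are all placeholders, not Microsoft's actual ICM schema.

```python
# Hypothetical KQL rollup sketch; cluster, database, and table names are
# placeholders to replace with your own resources.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER = "https://examplecluster.westus.kusto.windows.net"  # placeholder
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER)
client = KustoClient(kcsb)

# Count incidents per severity over the last day -- a typical ticket rollup
# that could feed a PowerApps dashboard tile.
query = """
IncidentEvents
| where Timestamp > ago(1d)
| summarize Count = count() by Severity
| order by Severity asc
"""

response = client.execute("ExampleDb", query)
for row in response.primary_results[0]:
    print(row["Severity"], row["Count"])
```

The same query body could be pasted into a scheduled automation that writes results to a SharePoint list, keeping the dashboard and the ad hoc analysis on one shared KQL definition.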
Job Title: Data Engineer with Python and Snowflake
Location: Mississauga, ON / Montreal, QC
Duration: Contract

Responsibilities
• Hands-on development experience with Snowflake features such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Zero-Copy Cloning, the optimizer, metadata management, data sharing, and stored procedures.
• Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.
• Working knowledge of MS Azure configuration items with respect to Snowflake.
• Develop EL pipelines in and out of the data warehouse using a combination of Databricks, Python, and SnowSQL (see the sketch below).
• Strong understanding of Snowflake-on-Azure architecture; design, implementation, and operationalization of large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
• Develop scripts in UNIX, Python, etc. to extract, load, and transform data, as well as other utility functions.
• Provide production support for data warehouse issues such as data load problems and transformation/translation problems.
• Translate mapping specifications into data transformation designs, development strategies, and code, incorporating standards and best practices for optimal execution.
• Understand data pipelines and modern ways of automating them using cloud-based testing, and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
• Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.

Requirements
• Minimum 8 years designing and implementing operational, production-grade, large-scale data solutions on the Microsoft Azure Snowflake Data Warehouse, including hands-on experience with productionized data ingestion and processing pipelines using Python, Databricks, and SnowSQL.
• Excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.
• Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.
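To ground the Snowflake work above, here is a minimal EL sketch using snowflake-connector-python: PUT uploads a local file to an internal stage, and COPY INTO loads it into a table (the batch counterpart of a Snowpipe feed). The account settings, LOAD_WH warehouse, raw_stage stage, and raw_orders table are placeholders.

```python
# Stage-and-copy load sketch; all connection values and object names are
# placeholders, assumed for illustration only.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # PUT uploads and compresses the local file into an internal stage.
    cur.execute("PUT file:///tmp/orders.csv @raw_stage AUTO_COMPRESS=TRUE")
    # COPY INTO loads the staged file into the target table.
    cur.execute("""
        COPY INTO raw_orders
        FROM @raw_stage/orders.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load status returned by COPY
finally:
    conn.close()
```

For continuous feeds, the same COPY definition would typically move into a Snowpipe so files load as they arrive, with this batch path kept for backfills.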