Responsibilities
• Responsible for project delivery, including custom development activities, code reviews, alignment to architecture, CI/CD activities, and providing oversight and guidance to developers
• Responsible for release management for implementations, using CI/CD and DevOps tools (e.g., Copado Essentials+)
• Deliver solutions that enable customer business success
• Develop high-level design documents
• Coordinate and lead key project development tasks
• Responsible for integrating vendor solutions and interfaces in medium- to highly complex situations
• Communicate with customer SMEs to define system requirements and develop solutions
• Identify and escalate risks and issues for a client engagement
• Support the Salesforce practice, including solution patterns, technical/development standards, agile delivery methods, solution design, and governance
• Use tools and methods such as proofs of concept to assess alternative solutions and help ensure the team proceeds with the right ones

Qualifications
• 8+ years of progressive experience in a relevant role
• Degree in a related field

Key Skills/Competencies
• Ability to handle complex projects and to mentor and support team members
• Strong collaboration skills
• Excellent domain knowledge (in more than one domain)
• Excellent written and verbal communication, public speaking, and interpersonal skills
• Strategic thinker
• Strong understanding of professional services and techniques
• Customer-centric demeanor
• Embodies the core principles of AppCentrica; leads by example
• Strong technical Salesforce/CRM background
• Demonstrated success in delivering projects
• Strong leadership and influencing skills
• Learns and adapts quickly
• Excellent analytical and problem-solving skills
• Strong multitasking skills; receptive to change
• Attention to detail and quality
• Ability to work independently and with others
• Works well under time constraints/pressure

Salesforce Certifications/Experience
• Salesforce Administrator
• Platform App Builder
• Platform Developer I
• Platform Developer II
• Sales Cloud Consultant
• Service Cloud Consultant
• Experience Cloud Consultant
• JavaScript Developer

Optional
• Copado Essentials
• Data Architect
• Sharing and Visibility Architect
• Integration Architect
5–8 years of experience with the following skill set. LWC is a mandatory skill.
• Translate business requirements into well-architected solutions that best leverage the Salesforce platform and products
• Analyze, design, develop, test, document, and deploy high-quality business solutions on the SFDC platform based on business needs and industry best practices
• Apply best practices and design patterns of best-of-breed applications developed on the Salesforce.com platform
• In-depth understanding of the Salesforce Lightning Platform, including development of Lightning components and LWC implementation
• Responsible for the design, configuration, and administration of Apex code and supporting test classes, triggers, and Visualforce pages
• Apex and Visualforce development, design, configuration, testing, and deployment of Salesforce.com solutions
• Experience integrating Salesforce with other applications via real-time and batch processes (sync/async)
• Manage to the Salesforce.com limits (e.g., data storage, governor limits) to ensure compliance with the Salesforce.com agreement
• Unit test (and/or develop test scripts for) all complex deliverables above (e.g., integration, security, Visualforce)
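The batch-integration and platform-limit points above come down to one habit: never push bulk data at the API one record at a time, and never exceed the per-request record cap. A minimal sketch in Python (the record data is hypothetical; the 200-record figure is the documented per-request limit for Salesforce's sObject Collections REST endpoints):

```python
from typing import Iterator, List


def chunk_records(records: List[dict], batch_size: int = 200) -> Iterator[List[dict]]:
    """Split records into batches no larger than batch_size.

    Salesforce's sObject Collections REST endpoints accept at most 200
    records per request, so a batch integration must chunk its payload
    before each API call rather than sending everything at once.
    """
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]


# Example: 450 staged records become 3 API calls (200 + 200 + 50).
staged = [{"Name": f"Account {i}"} for i in range(450)]
batch_sizes = [len(batch) for batch in chunk_records(staged)]
print(batch_sizes)  # [200, 200, 50]
```

Each yielded batch would then be posted in a single request; the same chunking idea applies to staying under daily API-call limits when scheduling batch jobs.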
Responsibilities
Experience: 7+ years in AI development/architecture, with a focus on Generative AI and large-scale deep learning systems.
Technical skills:
• LLMs and Generative Models: Proven expertise in LLMs (customization, fine-tuning), Retrieval-Augmented Generation (RAG), and other generative models such as GANs and diffusion models
• Programming: Proficient in Python and deep learning frameworks such as PyTorch or TensorFlow
• NLP: Advanced understanding of modern NLP techniques, such as transformer models and tokenization
• Cloud: Extensive experience with cloud platforms (AWS, Azure, GCP) and their respective AI/ML services (e.g., SageMaker, Vertex AI)
• MLOps & Infrastructure: Experience with MLOps practices and building high-availability, low-latency systems using containers (Docker, Kubernetes) and microservices
• RAG: Monitor and improve the performance of the RAG system; understand the basics of RAG and how to validate its output; determine how to increase RAG accuracy; identify the general tools/techniques used on the back end for retrieving and generating data
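The RAG duties above (retrieve relevant context, then validate that generated output is grounded in it) can be sketched in a few lines. This toy example uses bag-of-words vectors and cosine similarity purely to keep the retrieval logic visible; a real system would use a learned embedding model and a vector database:

```python
from collections import Counter
from math import sqrt


def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-count vector (stand-in for
    a real embedding model, chosen only to keep this self-contained)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k corpus documents most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


docs = [
    "governor limits cap apex resource usage",
    "rag retrieves documents before generation",
    "diffusion models generate images from noise",
]
top = retrieve("how does rag retrieve documents", docs)

# One cheap groundedness check for validating RAG output: flag an
# answer whose key terms never appear in the retrieved context.
answer = "rag retrieves documents"
grounded = all(any(word in doc for doc in top) for word in answer.split())
```

Improving accuracy then becomes measurable: change the retriever (better embeddings, hybrid search, reranking) and track how often the groundedness check and human review pass.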
We are seeking a highly skilled Data Analyst with experience in the healthcare industry using HL7 FHIR, FHIR Implementation Guides, healthcare claims and EOBs, insurance eligibility, and knowledge of Azure Health Data Services. The client team is seeking someone who will collaborate with different groups in the organization and who possesses the strong analytical and problem-solving skills to ask the right questions and provide the right feedback. The ideal candidate understands healthcare interoperability (via APIs), possesses strong technical skills in data analysis, and has a proven track record of dealing with complex data patterns. The key responsibilities for this role are as follows:
• Create data mappings of healthcare data to HL7 FHIR-defined resources that will be consumed within Liquid templates
• Create reference data mappings of source reference data to HL7 FHIR reference data (e.g., Gender, ICD-10-CM, CPT)
• Differentiate the mappings that fit into the profile from those that will fit into extensions
• Present and justify the logic of the mappings to the development team
• Understand and know how to extract data from the API
• Provide consulting answers to the development team's questions relating to Azure Health Data Services performance
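The reference-data mapping work above is essentially a lookup from source codes to FHIR-defined code systems. A minimal sketch (the source member codes and field names are hypothetical; the target values are the actual FHIR AdministrativeGender codes: male, female, other, unknown):

```python
# Hypothetical source codes mapped to the FHIR AdministrativeGender
# value set (male | female | other | unknown).
GENDER_MAP = {"M": "male", "F": "female", "O": "other", "U": "unknown"}


def map_patient(source_row: dict) -> dict:
    """Map a source member record to a minimal FHIR R4 Patient resource.

    Codes with no defined mapping fall back to 'unknown' rather than
    failing, so unexpected source values surface as data-quality
    findings instead of pipeline errors.
    """
    return {
        "resourceType": "Patient",
        "identifier": [{"value": source_row["member_id"]}],
        "gender": GENDER_MAP.get(source_row["gender_code"], "unknown"),
    }


patient = map_patient({"member_id": "A123", "gender_code": "F"})
```

Fields covered by the target profile map onto standard resource elements as above; source fields with no standard home would instead be documented as FHIR extensions, which is exactly the profile-versus-extension distinction the role calls out.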
Lead the design and development of an Agentic AI platform, applying deep expertise in machine learning, system architecture, and AI agent frameworks to build scalable, autonomous systems.
• Architect and implement core systems for agent-based AI workflows
• Design and deploy LLM-based pipelines, agent orchestration, and vector-based memory systems
• Develop and optimize ML models, pipelines, and orchestration logic
• Drive technical strategy, tooling, and infrastructure decisions
• Architect and implement agentic AI systems leveraging GCP services (Vertex AI, BigQuery, Cloud Functions, Pub/Sub, etc.)

Requirements:
• Several years of industry experience in AI/ML and data engineering, with a track record of working in large-scale programs and solving complex use cases using GCP AI Platform/Vertex AI
• Agentic AI Architecture: Exceptional command of agentic AI architecture, development, testing, and research for both neural-based and symbolic agents, using current-generation deployments and next-generation patterns/research
• Agentic Systems: Expertise in building agentic systems using techniques including multi-agent systems, reinforcement learning, flexible/dynamic workflows, caching/memory management, and concurrent orchestration; proficiency in one or more agentic AI frameworks such as LangGraph, CrewAI, Semantic Kernel, etc.
• Python Proficiency: Expertise in Python for building large, scalable applications and conducting performance analysis and tuning
• Prompt Engineering: Strong skills in prompt engineering techniques, including the design, development, and refinement of prompts (zero-shot, few-shot, and chain-of-thought approaches) to maximize accuracy and leverage optimization tools
• IR/RAG Systems: Experience designing, building, and implementing IR/RAG systems with vector DBs and knowledge graphs
• Model Evaluation: Strong skills in evaluating models and their tools; experience conducting rigorous A/B testing and performance benchmarking of prompt/LLM variations, using both quantitative metrics and qualitative feedback

Technical Skills Required:
• Programming Languages: Proficiency in Python is essential
• Agentic AI: Expertise in LangChain/LangGraph, CrewAI, Semantic Kernel/AutoGen, and the OpenAI Agents SDK
• Machine Learning Frameworks: Experience with TensorFlow, PyTorch, scikit-learn, and AutoML
• Generative AI: Hands-on experience with generative AI models, RAG (Retrieval-Augmented Generation) architecture, and Natural Language Processing (NLP)
• Cloud Platforms: Familiarity with Google Cloud Platform (GCP)
• Data Engineering: Proficiency in data preprocessing and feature engineering
• Version Control: Experience with GitHub for version control
• Data Science Practices: Skills in building models, testing/validation, and deployment
• Collaboration: Experience working in an Agile framework
• RAG Architecture: Experience with data ingestion, data retrieval, and data generation using optimal methods such as hybrid search
• Google Cloud Platform:
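The agent-orchestration work described above follows one recurring pattern: the model either requests a tool call or emits a final answer, and the orchestrator dispatches tools and feeds observations back until it gets an answer. A toy sketch of that loop (the LLM is a stub standing in for a real model call such as Vertex AI; the tool, order data, and JSON protocol are all hypothetical, not any named framework's wire format):

```python
import json
from typing import Callable, Dict

# Tool registry: functions the agent is allowed to invoke (hypothetical).
TOOLS: Dict[str, Callable[[str], str]] = {
    "lookup_order": lambda arg: f"order {arg}: shipped",
}


def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call. Returns either a tool-call
    directive or a final answer, encoded as JSON."""
    if "observation:" not in prompt:
        return json.dumps({"tool": "lookup_order", "arg": "42"})
    return json.dumps({"answer": "Your order has shipped."})


def run_agent(task: str, max_steps: int = 5) -> str:
    """Minimal ReAct-style loop: ask the model, dispatch any requested
    tool, append the observation, and stop at a final answer."""
    prompt = task
    for _ in range(max_steps):
        decision = json.loads(fake_llm(prompt))
        if "answer" in decision:
            return decision["answer"]
        observation = TOOLS[decision["tool"]](decision["arg"])
        prompt += f"\nobservation: {observation}"
    return "max steps reached"


result = run_agent("Where is order 42?")
```

Frameworks such as LangGraph or the OpenAI Agents SDK wrap this same loop with state graphs, concurrency, and memory; the `max_steps` cap is the simplest form of the runaway-agent guardrail those systems also provide.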
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted, and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing, and security, it brings deep expertise to all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 53,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.
Clarity PPM Developer
• Overall, 8+ years of software developer experience
• 4+ years of hands-on experience configuring and supporting the Clarity PPM modules Project, Demand, Financials, Timesheet, and Resource Management, including a strong understanding of these functional areas
• Strong experience with the modern UX, XOG, process workflows, GEL scripts, the REST API, and Studio
• Solid experience in SQL optimization and performance tuning of NSQL queries
• 4+ years of relevant work experience in a teaming position, with knowledge and experience relating business requirements to sound technical solutions
• Experience utilizing formal testing tools strongly desired (experience with HP ALM testing tools a plus)
• Must have strong communication skills, with the ability to explain complex solutions and ideas

Responsibilities
The candidate will be responsible for supporting and enhancing our Clarity PPM solution, performing the following:
• Requirement gathering, requirement analysis, design, and development for Clarity enhancements (portlets, processes, NSQL queries, lookups, HTML portlets)
• Configure Clarity and integrate it with various applications such as Beeline and Workday
• Deploy deliverables from lower environments to production using XOG and content packages
• Coordinate across multiple teams to ensure on-time delivery
• Application monitoring for processes and jobs, including application performance monitoring
• Troubleshoot issues in existing processes/portlets, including the full resolution lifecycle
• Clarity (SaaS) upgrade support
• Create and schedule Jaspersoft reports in Advanced Reporting
• Interact successfully with business and IT team members to design optimal solutions