- Bachelor's degree in Computer Science or a related field of study.
- Experience managing Databricks workspaces, including cluster configuration, user roles and permissions, and cluster policies, and applying monitoring and cost optimization for efficient, governed Spark workloads.
- 8 years' experience as a Data Architect in a large enterprise, designing and implementing data architecture strategies and models that align data, technology, and business goals with strategic objectives.
- 4 years' experience designing data solutions for analytics-ready, trusted datasets using tools such as Power BI and Synapse, including semantic layers, data marts, and data products for self-service data science and reporting.
- 4 years' experience with Git/GitHub for version control, collaborative development, code management, and integration with data engineering workflows.
- 5 years' experience with Azure services (Storage, SQL, Synapse, networking) for scalable, secure solutions, and with authentication mechanisms (Service Principals, Managed Identities) for secure access in pipelines and integrations.
- 6 years' experience in Python (including PySpark) and SQL, applied to developing, orchestrating, and optimizing enterprise-grade ETL/ELT workflows in a large-scale cloud environment.
- 3 years' experience building scalable data pipelines with Azure Databricks (Delta Lake, Workflows, Jobs, and Notebooks), plus cluster management. Experience extending solutions to Synapse Analytics and Microsoft Fabric is a plus.
- Certification in The Open Group Architecture Framework (TOGAF).
- Experience using AI for code generation, data analysis, and automation, and for enhancing productivity in data engineering workflows.
- 8 years' direct, hands-on experience performing business requirement analysis related to data manipulation, transformation, cleansing, and wrangling.
- 8 years' experience with, and strong technical knowledge of, Microsoft SQL Server, including database design, optimization, and administration in enterprise environments.
- 2 years' experience building scalable ETL pipelines, enforcing data quality, and performing cloud integration using Talend technologies.
- 2 years' experience in data governance, security, and metadata management within a Databricks-based platform.
- 3 years' experience building secure, scalable RESTful APIs for data exchange, with robust authentication, error handling, and support for real-time automation.
- 3 years' experience with message queuing technologies, implementing message queuing with tools such as ActiveMQ and Azure Service Bus for scalable, asynchronous communication across distributed systems.
- 5 years' experience working with cross-functional teams to create software applications and data products.
- Experience working with ServiceNow integrations for an Azure-based data management platform.