The Power of Process Mining Tools: Unlock Efficiency and Drive Innovation in Business Operations
https://www.comidor.com/knowledge-base/business-process-management-kb/process-mining/

In today’s fast-paced world, organizations are constantly looking for ways to streamline operations and boost efficiency. One powerful approach that is gaining attention is process mining. Process mining tools use data from business processes to uncover valuable insights that can transform how organizations work. Gartner projects that the process mining market will grow to $2.3 billion by 2025, driven by a compound annual growth rate (CAGR) of 33%. This significant expansion reflects the increasing adoption of process mining tools among large enterprises. For example, a hospital that used process mining to analyze patient flow in its emergency department was able to identify bottlenecks, reduce wait times, and improve patient satisfaction and efficiency.

Process mining combines data mining and process management, using event logs from IT systems to analyze and improve real-world processes. This blend of data science and process management enables organizations to become more agile and efficient.
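
To make the term “event log” concrete, here is a minimal, hypothetical example of what such a log looks like once extracted from an IT system (the column names case_id, activity, and timestamp are assumptions; real exports use their own field names):

```python
import pandas as pd

# A minimal, illustrative event log: one row per recorded event.
event_log = pd.DataFrame([
    {"case_id": "order-001", "activity": "Create order",  "timestamp": "2024-01-02 09:00"},
    {"case_id": "order-001", "activity": "Approve order", "timestamp": "2024-01-02 11:30"},
    {"case_id": "order-001", "activity": "Ship order",    "timestamp": "2024-01-03 08:15"},
    {"case_id": "order-002", "activity": "Create order",  "timestamp": "2024-01-02 10:00"},
    {"case_id": "order-002", "activity": "Reject order",  "timestamp": "2024-01-02 16:45"},
])

# Reconstruct the path (trace) each case actually followed through the process.
traces = event_log.sort_values("timestamp").groupby("case_id")["activity"].apply(list)
print(traces)
```

Grouping events by case reconstructs the path each case actually took, which is exactly the raw material that process mining algorithms work on.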

Join us to explore process mining—what it is, how it differs from process discovery, how it works, and the techniques and stages involved. Discover the transformative potential of this game-changing technology.

What is Process Mining?

As mentioned in the introduction, process mining is a transformative technique used to analyze business processes by extracting insights from event logs stored in information systems. Unlike traditional process modeling methods, which rely on subjective input, process mining tools utilize real data to provide an objective view of how processes are executed within an organization.

By examining event logs, process mining uncovers hidden patterns, bottlenecks, and variations, offering organizations a clear understanding of their workflows, deviations, and inefficiencies. This data-driven approach enables stakeholders to identify areas for improvement and optimization, ultimately enhancing operational efficiency and driving organizational success.

process mining explanation

Process Mining vs. Process Discovery

While both process mining and process discovery focus on understanding and improving business processes, they differ in approach, methodology, and outcome. Process mining is a data-driven approach that uses event logs from IT systems such as ERP, CRM, and workflow automation software to analyze and improve actual business processes. Process discovery, by contrast, is a technique for uncovering and defining business processes from scratch, often through interviews, workshops, and observations.

Purpose: Process mining focuses on analyzing existing event logs to improve process efficiency and compliance, while process discovery is the initial step in process mining, aiming to construct process models from observed events.

Analysis vs. Construction: Process mining analyzes historical data to understand how processes are executed, while process discovery constructs process models based on observed events, providing a foundation for further analysis.

Insight Generation: Process mining generates insights from existing data, uncovering actual process flows and deviations. In contrast, process discovery focuses on constructing an initial process model to understand process structure and behavior.

Iterative Process: Process mining is often an iterative process, where insights from initial analysis inform further data collection and refinement. Process discovery serves as a starting point for this iterative cycle, providing a baseline model for subsequent optimization efforts.

| Feature | Process Mining | Process Discovery |
| --- | --- | --- |
| Data Source | Event logs and system data | Interviews, workshops, observations |
| Focus | Data-driven analysis | Human-driven process understanding |
| When to Use | When event logs are available | When processes are undocumented |
| Outcome | Visualizations of actual processes | Descriptions or diagrams of processes |
| Precision | High accuracy based on real-time data | May vary based on stakeholder input |

Phases in the Data/Process Mining Process

  1. Discovery: This initial phase involves not only identifying data sources but also understanding the context and objectives of process mining within the organization. Stakeholders define the scope of the analysis, identifying key processes and desired outcomes.
  2. Data Preparation: Once data sources are identified, the next step is to prepare the data for analysis. This involves data cleaning, transformation, and integration from various sources to create a unified dataset suitable for process mining.
  3. Process Modeling: In this phase, process mining algorithms are applied to the prepared dataset to construct process models. These models represent the sequence of activities, dependencies, and decision points within the process, providing a visual representation of how the process flows.
  4. Analysis and Interpretation: Once process models are constructed, they are analyzed to uncover insights and patterns. Stakeholders interpret the results to identify bottlenecks, inefficiencies, and opportunities for improvement. This phase may involve statistical analysis, visualization, and collaboration among different stakeholders.
  5. Validation and Verification: Before implementing any changes based on process mining insights, it’s crucial to validate the findings and verify their accuracy. This may involve comparing the constructed process models with domain knowledge or historical records to ensure they accurately reflect the reality of the process.
  6. Implementation and Monitoring: Finally, the insights gained from process mining are implemented in the organization’s processes. This may involve redesigning workflows, reallocating resources, or introducing new technologies. Continuous monitoring is essential to track the impact of these changes and make further adjustments as needed.
  7. Continuous Improvement: Process mining is not a one-time activity but rather a continuous journey of improvement. Organizations should regularly revisit their process models, collect new data, and refine their analysis to adapt to changing business needs and drive ongoing optimization.

Process Mining Phases | Comidor

Process Mining Techniques

  • Process Discovery: This technique involves extracting process models from event logs to visualize how processes are executed. Various algorithms, such as alpha, heuristic, and genetic algorithms, are employed to construct these models, offering insights into process flow and behavior (a minimal code sketch of discovery and conformance checking follows this list).
  • Conformance Checking: Conformance-checking techniques compare observed behavior with predefined process models to identify discrepancies and deviations. By assessing the alignment between actual executions and expected behavior, organizations can pinpoint areas of non-compliance or inefficiency.
  • Enhancement Mining: Enhancement mining focuses on optimizing existing process models to improve efficiency and performance. This technique involves analyzing process models to identify bottlenecks, redundancies, and opportunities for streamlining. By implementing changes based on these insights, organizations can enhance process efficiency and achieve better outcomes.
  • Predictive Process Analytics: Predictive process analytics utilizes historical event data to forecast future process behavior. By analyzing past patterns and trends, organizations can predict potential issues, anticipate future resource needs, and make proactive decisions to optimize processes and enhance performance.
  • Social Network Analysis: Social network analysis examines the relationships and interactions between individuals or entities involved in a process. By visualizing communication patterns and collaboration networks, organizations can identify key influencers, communication bottlenecks, and opportunities for improving collaboration and knowledge sharing.
  • Performance Mining: Performance mining techniques focus on analyzing process performance metrics to identify areas for improvement. By monitoring key performance indicators (KPIs) such as cycle time, throughput, and resource utilization, organizations can pinpoint inefficiencies and optimize processes to achieve better performance outcomes.
  • Text Mining: Text mining techniques analyze unstructured textual data within event logs to extract valuable insights. By mining text from sources such as emails, chat logs, or support tickets, organizations can uncover hidden patterns, analyze sentiment, and surface emerging issues that impact process performance.
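
To ground the first two techniques, the sketch below uses the open-source pm4py library to discover a process model from an event log and then check how well the log conforms to it. The calls follow pm4py's documented simplified interface (exact function names can vary between versions), the file name is a placeholder, and this is a minimal sketch rather than a description of any specific vendor tool.

```python
import pm4py

# Load an event log exported from an IT system ("orders.xes" is a placeholder).
log = pm4py.read_xes("orders.xes")

# Process discovery: derive a Petri-net model from the observed events.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Conformance checking: measure how well the recorded behaviour fits the model.
fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
print(fitness)
```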

Conclusion

Process mining tools help businesses improve by analyzing event logs to uncover how processes truly work. They reveal patterns, identify bottlenecks, and highlight areas for improvement. As we conclude, think of process mining not just as a tool but as a guide, leading us to a future where efficiency, agility, and innovation drive success.

Author Bio:
Vijayashree Shinde is a Digital Marketing Executive. She has worked in a wide range of industries, including the software testing industry. Currently, she is a Digital Marketer at Testrig Technologies. In addition to her marketing expertise, Vijayashree enjoys writing articles on quality assurance for a wider audience.

How to Integrate ChatGPT and DALL·E into your Business with Comidor
https://www.comidor.com/help-center/process-automation/chatgpt-integration/

Undoubtedly, chatbots and AI-powered solutions are here to stay. Businesses of all types and sizes are trying to find ways to implement chatbots and AI solutions to drive business productivity and stay competitive. In this section, we will explore how you can easily integrate ChatGPT and DALL·E into your business with no code, to improve productivity and boost efficiency.

To begin with, OpenAI provides a suite of powerful tools that allow users to generate text, images, video, or audio in a fraction of the time it would take a human. ChatGPT is an Artificial Intelligence (AI) chatbot developed by OpenAI that can be used in a wide variety of use cases; it uses Natural Language Processing (NLP) and machine learning to generate conversations of human-level quality. DALL·E, on the other hand, produces images based on a description provided by the user, helping content creators save time while increasing the quality of their output.

OpenAI tools are easily accessible both through web interfaces and via APIs that let you integrate their services with your applications and systems. However, be cautious and familiarize yourself with the various security measures in place before attempting any integration.
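
To give a sense of what an API-level integration involves behind the scenes, the sketch below calls the public OpenAI REST endpoints directly from Python. It only illustrates the underlying API (the model name and prompts are assumptions and may change over time); it is not how Comidor implements its component.

```python
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]             # the key created in the steps below
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# Text generation through the chat completions endpoint.
chat = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers=HEADERS,
    json={
        "model": "gpt-3.5-turbo",                  # model name is an assumption
        "messages": [{"role": "user",
                      "content": "Write a short newsletter intro about low-code automation."}],
    },
    timeout=60,
)
print(chat.json()["choices"][0]["message"]["content"])

# Image generation through the images endpoint (DALL·E).
image = requests.post(
    "https://api.openai.com/v1/images/generations",
    headers=HEADERS,
    json={"prompt": "A friendly robot sorting documents, digital art", "n": 1, "size": "512x512"},
    timeout=60,
)
print(image.json()["data"][0]["url"])              # URL of the generated image
```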

Before you use OpenAI ChatGPT and DALL·E, you need to sign up for an OpenAI account and create a valid API key. To create an API key and integrate ChatGPT and DALL·E into Comidor, please follow these steps: 

First step: Visit the OpenAI website and sign up for an account. You can use your Google or Microsoft account or just add another email address.  

Create an openAI account | ChatGPT | Comidor

Before completing your account, you should clarify how you will primarily use the OpenAI tools.

How will you use ChatGPT? | Comidor

Second step: When logged in, click on your profile icon and select “View API Keys”.

view API keys | ChatGPT | Comidor

Third step: Click on “create new secret key”  

generate key | ChatGPT | Comidor

Fourth step: Copy the generated key and store it in a secure place; it is displayed on your screen only once, and after closing this page you won’t be able to view the API key again.

copy the generated key | ChatGPT | Comidor

Fifth step: Log in to your Comidor account and go to Application Parameters. Click on the “+” icon to create a new application parameter. 

App Factory> Integrations and Services> Application Parameters 

create a new application parameter | ChatGPT | Comidor

Sixth step: Fill in the form as shown below. Make sure you add the generated API key to the “Value” field. Once ready, save the new application parameter.

  • Package Code: SYSTEM 
  • Name: GPT_TOKEN 
  • Value: the generated key from OpenAI 

Application Parameter | ChatGPT | Comidor

Note: Double-check that there are no empty spaces before or after the values you have added.
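
Optionally, before storing the key in Comidor, you can confirm that it is valid with a quick call to OpenAI's models endpoint. This is only a sanity-check sketch; the key shown is a placeholder.

```python
import requests

API_KEY = "sk-..."   # placeholder for the generated key

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
# 200 means the key is accepted; 401 usually means a wrong or revoked key.
print(resp.status_code)
```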

Seventh step: Open the process-enabled application or the process design where you want to utilize the OpenAI capabilities. If you haven’t created an application yet, you can easily create it with no code through the Comidor App Designer. 

Eighth step: Open the Data Model and create at least two memo fields by clicking on the “+” icon, one for the question and one for ChatGPT’s response. Of course, you can create as many fields as necessary for your case.

  • To use DALL·E, you need to create a memo field to describe the image you want the AI to draw, and a binary-type field to store the produced image.

create ChatGPT fields | Comidor

Data model | ChatGPT | Comidor

Ninth step: Both the question and response fields should be part of one or more user forms, so the end user can provide the question and get the response. Go to User Forms and create one or more new forms according to your needs by clicking on the “+” icon. Drag and drop the fields you need inside the form. Don’t forget to add the Question field if the form is used to ask ChatGPT a question.

ChatGPT form | Comidor

ChatGPT’s response will be available inside the form in which you have included the Response field. Keep in mind that this form can be a task form or a main form.

Tenth step: Go to the workflow and drag and drop the OpenAI component from the Integration Components list to the workflow design.

  • In the component attribute, define the type:
    • ChatGPT
    • DALL·E
  • For ChatGPT, choose the Input (a text/memo field where the question is stored) and the Response (a memo field where ChatGPT’s answer is saved) from the previously created fields.

integrate ChatGPT | Comidor

  • For DALL·E, choose the Input (a text/memo field where the image description is added) and the Response (a binary field where the image produced by DALL·E is saved) from the previously created fields.

integrate DALL·E | Comidor

 

Now, it’s time to get started using ChatGPT and DALL·E in your business life! 

Let’s see in action how the OpenAI integration services can be used by a marketing team to generate compelling content for a newsletter. Marketers should always review the generated content and adjust it to meet their specific needs and industry standards.

1. A member of the marketing team initiates the new newsletter process from the quick add menu in Comidor.  

initiate newsletter app-quick add | Comidor

2. On the quick add form, the user defines the topic of the newsletter, asks ChatGPT a question, and describes the image to be drawn by DALL·E.

ask ChatGPT | Comidor

3. In a matter of seconds, ChatGPT produces the requested content and DALL·E produces an image in PNG format.

4. Once the response is ready, the marketing team receives a notification that a new task has been assigned to them to review ChatGPT’s response.

notification from newsletter app | Comidor

5. The user can edit the response and complete the task. Finally, the responsible team member receives a notification to use the content for the newsletter preparation.

chatgpt response | Comidor

Check out this special image created by DALL·E! It’s truly incredible what AI can accomplish these days.

Dalle image | AI | Comidor

Final Thoughts

ChatGPT and DALL·E are great tools for any business that wants to automate its marketing operations and save time and money. They offer scalability, flexibility, and customization options. OpenAI tools, with their powerful features and intuitive interfaces, are sure to revolutionize the way organizations manage their operations in the future.

Robotic Process Automation (RPA components)
https://www.comidor.com/help-center/rpa-ai-ml-hc/rpa-components/

RPA Components

Comidor RPA components and elements allow you to automate and manage repetitive tasks. With the RPA Caller and RPA Receiver workflow components, you can retrieve or exchange data with other systems.

RPA can be integrated into:

  • Process initiation
  • Report generation
  • File upload in Comidor Document Management System (DMS)

With Comidor RPA you can:

  • Automate repetitive tasks
  • Increase employee productivity
  • Speed up time-consuming processes

Prior to involving RPA Scripts and Agents in a workflow design, the following actions need to take place:

  1. Install an RPA agent on the PC (or any machine) on which you wish to perform RPA tasks
  2. Save the Agent’s properties (needed to set up the RPA Agent in Comidor)
  3. Install the RPA software (e.g. SikuliX was used for the following examples)
  4. Create your RPA Script, including all actions that you want the RPA bot to replicate (a minimal SikuliX example follows this list)
  5. Save your RPA Script (the Script name is needed to set up the RPA Script in Comidor)
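
For reference, SikuliX scripts are written in Python (Jython) syntax and drive the screen through image matching. The following is a hypothetical example of the kind of script an agent might replay; the .png names are placeholders for screenshots captured in the SikuliX IDE.

```python
# Hypothetical SikuliX script; image names are placeholder screenshots.
wait("login_button.png", 10)     # wait up to 10 seconds for the login button to appear
click("login_button.png")        # click it
type("automation.user")          # type the username into the focused field
click("submit_button.png")       # submit the form
```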

 

RPA Agents

To access RPA Agents, go to App Factory Icon > RPA & AI/ML > RPA Agents.

RPA Agents | Comidor Platform

  1. Click on the “+” icon at the top of the screen to open the Creation Form.
  2. Type an Agent Name.
  3. The Code field refers to the agent code that you defined in the application properties.
  4. Provide other information such as the Operating System, Version, and Description of this Agent.
  5. Select the desired Save option (refer to the Quick Reference Guide).

edit RPA Agents | Comidor Platform

Edit RPA Agents

  1. Go to App Factory Icon > RPA & AI/ML > RPA Agents.
  2. Select the RPA Agent to edit.
  3. Click on the Edit button to open the Edit Form.
  4. Edit the information you want and click on the desired Save option (refer to the Quick Reference Guide).

    edit RPA Agents | Comidor Platform

Delete RPA Agents

  1. Go to App Factory Icon > RPA & AI/ML > RPA Agents.
  2. Select one or more RPA Agents.
  3. Click on Delete to delete one or multiple RPA Agents at the same time. A confirmation pop-up box appears.

 

RPA Scripts

To access RPA Scripts, go to App Factory Icon > RPA & AI/ML > RPA Scripts.

RPA Scripts | Comidor Platform

  1. Click on the “+” icon at the top of the screen to open the Creation Form.
  2. Type in the Script Name exactly as you saved the script file on the Agent PC.
  3. The Integrated Software field refers to the software you have installed on the Agent PC.
  4. Provide other information such as Built with and Description of this Script.
  5. Select the desired Save option (refer to the Quick Reference Guide).

Edit RPA Scripts

  1. Go to App Factory Icon > RPA & AI/ML > RPA Scripts.
  2. Select the RPA Script to edit.
  3. Click on the Edit button to open the Edit Form.
  4. Edit the information you want and click on the desired Save option (refer to the Quick Reference Guide).

Delete RPA Scripts

  1. Go to App Factory Icon > RPA & AI/ML > RPA Scripts.
  2. Select one or more RPA Scripts.
  3. Click on Delete to delete one or multiple RPA Scripts at the same time. A confirmation pop-up box appears.

RPA Components in a Workflow Design

Comidor Workflow designer offers a variety of RPA components and elements, in order to eliminate manual repetitive tasks and allow employees to focus on more significant ones.

In particular, Comidor RPA components and elements are the following:

  • RPA Caller/Receiver
  • RPA Selenium
  • RPA Document Creator
  • RPA Excel Processor
  • RPA Web Scraper

RPA Caller/ Receiver

Add an RPA Caller at any step of the workflow design to send data from the workflow and trigger a series of repetitive actions in a third-party system, or an RPA Receiver to receive data from other systems and act on Comidor workflow fields. A conceptual sketch of this request/response pattern follows the RPA Caller steps below.

To access Workflows, go to App Factory Icon > Workflow Automation > Workflows

RPA Caller

  • Drag-n-drop the RPA Caller element.
  • Give a Title to the component.
  • Set the Parent Stage which is the stage of the parent process once this step is reached.
  • Select which Script you would like to run at this step, from the list of RPA Scripts that you have already created.
  • Select which Agent you would like to run at this step, from the list of RPA Agents that you have already created.
  • Define the Request Body by specifying the Key and its Value as the Runtime Value of a custom field or predefined value.

RPA Caller | Comidor Platform
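
Conceptually, the Caller/Receiver pair is a simple request/response exchange with the agent. Purely as a hypothetical illustration of that pattern (this is not Comidor's actual agent protocol, and nothing here needs to be coded by hand), an agent-side endpoint receiving a key/value payload could look like this:

```python
# NOTE: purely hypothetical sketch of a request/response exchange;
# not Comidor's actual agent protocol.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/run-script", methods=["POST"])
def run_script():
    payload = request.get_json()          # key/value pairs sent by the RPA Caller
    script = payload.get("script")        # e.g. the RPA Script name
    # ... here the agent would execute the script with the supplied values ...
    return jsonify({"status": "completed", "script": script})

if __name__ == "__main__":
    app.run(port=8080)                    # port is an arbitrary example
```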

RPA Receiver

  • Drag-and-drop the RPA Receiver element.
  • Give a Title to the component.
  • Set the Parent Stage which is the stage of the parent process once this step is reached.
  • Select a user field as RPA Response to store the result of the RPA Receiver once this runs.

RPA Receiver | Comidor Platform

 

RPA Selenium

Use an RPA Selenium component in your workflow to replicate repetitive manual steps in the browser. Use unique CSS selectors to specify each element. A generic Selenium sketch in Python follows the action list below.

RPA Selenium | Comidor Platform

  • Drag-and-drop the RPA Selenium element in the workflow design.
  • Give a Title to the component.
  • In the Variables table, define all the actions that you wish the bot to execute step-by-step:
    • Go to URL: select this action to define the URL that the bot should browse.
    • Sleep: select this action to determine how many seconds the bot should wait before the next action. This depends on the loading time of each website.
    • Wait for element: select this action when you are not certain how many seconds the bot should wait for an element to be displayed.
    • Click: select this action to define where the bot should click on.
    • Input-Put: select this action when you wish to add a value to a specific element. Specify the unique id of the element in “Value 1” and the value of the field in “Field (Runtime Value)”
    • Input-Get: select this action when you wish to get the value of a specific element. Specify the unique id of the element in “Value 1”.
  • Select a user field as RPA Response to store the result of the RPA Selenium once this runs. If you used more than one “Input-Get”, all the values will be stored in the response field, separated by commas.
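
For readers who know Selenium itself, the component's actions map onto standard WebDriver calls. The generic Python sketch below is independent of Comidor (URL and selectors are placeholders) and only shows what each action corresponds to.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
driver.get("https://example.com/login")                   # Go to URL
wait = WebDriverWait(driver, 10)                          # Wait for element (up to 10 s)

field = wait.until(EC.presence_of_element_located((By.CSS_SELECTOR, "#username")))
field.send_keys("automation.user")                        # Input-Put: write a value

wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "#submit"))).click()    # Click

result = driver.find_element(By.CSS_SELECTOR, ".result").text                   # Input-Get: read a value
print(result)
driver.quit()
```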

Document Creator RPA Component

Create invoices, reports, or other types of documents by combining a template file with fields from the workflow. A conceptual templating sketch follows the configuration steps below.

  • Give a Title to the component.
  • Give the Parent Stage which is the stage of the parent process as soon as this step is reached.

Document creator | Comidor Platform

  • Set a Template File for your document either in a Form, upload it in a step of the workflow in a Binary Field, or give the Template Name of a file stored in DMS.
    •  Form: Select the User Form in which you have uploaded the Template File. (fixed template scenario)
      • The Template file should be either .docx or .xlsx.
      • The produced file can be either .docx, .xlsx, or .pdf.
      • Apply formatting options in your template, such as font colour, size, alignment, and number format; they will be captured in the produced document.

Document creator template | Comidor Platform

    • Binary Field: Select the binary field in which the Template File will be uploaded during the workflow. (dynamic template scenario)
    • Template Name: Type the name of a file stored in DMS. Keep in mind that this file should be linked with the Account of the process, in order to be used as a template file.
  • Define the Variables being used in the Template File by providing the Key and its Value as the run-time value of a custom field or predefined value. Excel fields and images stored in binary fields can be added too.
    • For Excel fields, specify the area to be replaced in the produced document. For example, r:1:2,c:0:4 includes rows 2-3 and columns A-D.
    • For images, map the binary field in which the image is stored, and define the size in pixels in the value, e.g. 200×200.
  • Choose a field to be the name of the produced document from the list of text fields in File Name.
  • Set the binary field in which you wish to save the Response document.

    Document creator | Comidor Platform

  • Check the Return PDF option if you wish the produced document to be in PDF format. Leave it unchecked and the produced document will be in the same format as the template.
  • In Status, you can add a text field to see the response of this component.
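
Comidor fills the template from the Key/Value variables you define in the component, so no coding is required. Purely as a conceptual illustration of template-based document generation, the sketch below uses the third-party docxtpl library with Jinja-style placeholders; this syntax is an assumption for illustration and not the syntax of Comidor templates.

```python
# Conceptual illustration only: docxtpl with Jinja-style {{ placeholders }}.
from docxtpl import DocxTemplate

doc = DocxTemplate("invoice_template.docx")        # placeholder template file
doc.render({
    "customer_name": "ACME Ltd.",                  # values would come from workflow fields
    "invoice_number": "INV-2024-001",
    "total_amount": "1,250.00 EUR",
})
doc.save("invoice_INV-2024-001.docx")              # the produced document
```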

Excel Processor

Use the RPA Excel Processor component to parse a large Excel file, or part of it. Capture the values of specific cells into user fields, or capture a whole area and display it in an Excel-type user field. A generic cell-reading sketch follows the configuration steps below.

RPA Excel Processor | Comidor Platform

  • Drag-and-drop the RPA Excel Processor component.
  • Give a Title to the component.
  • Set the Parent Stage which is the stage of the parent process once this step is reached.
  • Select a binary field in the Excel Document – this is the document that you upload in a previous step of the workflow.
  • Define the action you wish the Excel Processor to perform, from the following:
    • Parse Excel – select this action to return values of cells or an entire area of the excel and store them in user fields.
    • Get No of Sheets – select this action to get the number of sheets in the uploaded Excel file (useful for large files with multiple sheets).
    • Find in Sheet – select this action to find a certain value in the Excel file. Define the Search Index.
    • Find the row in Sheet – select this action to find a specific value in the Excel file and get the entire row as a response.

      RPA Excel Processor | Comidor Platform

  • Select the option Create Excel From Uploaded when you want to define an area of the Excel file to be saved in an Excel-type field. The fields “Read uploaded from (row), Read uploaded to (row), Read uploaded from (column), Read uploaded to (column), and Produced Excel Field” will appear to guide you through.
  • In Assign to user fields, map the user fields to the cells. For example, if you want to display cell B2, type r1c1 in Index.
  • Select a user field as RPA Response to store the result of the RPA Excel Processor once this runs.
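
To make the r1c1-style indexing concrete, here is a generic sketch that reads a single cell and an area from an Excel file with the openpyxl library. It is independent of Comidor (file name and ranges are placeholders) and only illustrates the kind of parsing the component performs.

```python
from openpyxl import load_workbook

wb = load_workbook("orders.xlsx", data_only=True)   # placeholder file
ws = wb.active

# Single cell: index r1c1 (rows and columns counted from zero) corresponds to
# openpyxl row 2, column 2, i.e. cell B2.
print(ws.cell(row=2, column=2).value)

# An area, e.g. rows 2-3 and columns A-D:
for row in ws.iter_rows(min_row=2, max_row=3, min_col=1, max_col=4, values_only=True):
    print(row)
```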

 

RPA Web Scraper

Use an RPA Web Scraper in your workflow to find information in a selected area of a website and then store it in user fields. A generic scraping sketch follows the configuration steps below.

RPA Web scraper | Comidor Platform

  • Drag-and-drop the RPA Web Scraper element in the workflow design.
  • Give a Title to the component.
  • Set the Parent Stage which is the stage of the parent process once this step is reached.
  • Choose whether you want a hardcoded or a dynamic source.
    • With the dynamic source option, define fields for the host and port; these fields should be given values in a previous step.
    • Alternatively, type a host and port into the respective fields.
  • Define the URL that you wish to be scraped. Select a text field type.
  • Choose the Search selector from the available options (XPath, class, id, etc.) based on the website you are scraping.
  • Define the Selector (based on the Search option above). Select a text field type.
    • You can have a script in a previous step and give a specific value to this field. (eg. #this.USR_SELECTOR# = “//*[@class=’v2-responsive-table__content v2-pav10′]//tbody/tr/td”)
  • Select a memo user field as RPA Response to store the result of the RPA Web Scraper once this runs.
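
As a point of reference, the same pattern (fetch a page, evaluate a selector, collect the matching values) looks like the generic Python sketch below, using requests and lxml with a placeholder URL and XPath; inside a workflow, the component performs this for you.

```python
import requests
from lxml import html

response = requests.get("https://example.com/price-list", timeout=10)   # placeholder URL
tree = html.fromstring(response.content)

# Evaluate an XPath expression, as in the component's Selector field.
cells = tree.xpath("//table//tbody/tr/td")
values = [cell.text_content().strip() for cell in cells]

# The component stores the scraped values in a memo field; here we just print them.
print(", ".join(values))
```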

 


Find out more on how to create and manage workflows, including RPA components, and learn about all Comidor Workflow Elements.
