In the last video, I gave a short introduction to Decision Model and Notation or DMN. It contains sufficient info to understand this demo.
This demo includes:
Enabling the Decision Central DMN Editor
Using the Decision Central DMN Editor
Writing a Test Scenario
Deploying a DMN Project on the Execution Server
Interacting with the deployed DMN Model using REST API
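To give a feel for the last step, here is a minimal sketch of calling the Execution Server's DMN REST endpoint from Java. The container id, model namespace, model name, and the `Age` input are placeholders I made up for illustration; substitute whatever your deployed project actually uses.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;

// Sketch of invoking a deployed DMN model over REST (names are hypothetical).
public class DmnRestSketch {

    static final String BASE = "http://localhost:8080/kie-server/services/rest/server";
    static final String CONTAINER = "dmn-demo";  // placeholder container id

    // Build the JSON body the DMN endpoint expects: the model coordinates
    // plus a "dmn-context" map holding the input data (here, an Age value).
    static String buildPayload(int age) {
        return "{"
             + "\"model-namespace\": \"https://example.com/dmn-demo\","  // placeholder
             + "\"model-name\": \"age-classification\","                  // placeholder
             + "\"dmn-context\": { \"Age\": " + age + " }"
             + "}";
    }

    public static void main(String[] args) {
        String body = buildPayload(25);
        System.out.println("POST " + BASE + "/containers/" + CONTAINER + "/dmn");
        System.out.println(body);

        // The actual call (requires a running server and credentials):
        // HttpRequest req = HttpRequest.newBuilder()
        //         .uri(URI.create(BASE + "/containers/" + CONTAINER + "/dmn"))
        //         .header("Content-Type", "application/json")
        //         .POST(HttpRequest.BodyPublishers.ofString(body))
        //         .build();
        // HttpClient.newHttpClient().send(req,
        //         java.net.http.HttpResponse.BodyHandlers.ofString());
    }
}
```

The response contains the evaluated decision results in a `dmn-evaluation-result` structure, which you can inspect with any JSON library.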
I am going to show you how to create a DMN Decision model from start to finish.
The Decision Requirement Diagram is shown at the top of the page (above).
Notice that I put the decision table in the Business Knowledge Model (BKM), and note the use of FEEL (Friendly Enough Expression Language) in the “age” column.
One can also put the decision table directly in the Decision Node itself. The difference is that by putting the decision table in the BKM, it can be reused in other Decision nodes. There is no advantage in doing it this way in such a simple demo, but imagine the reuse value in a large decision model.
The diagram below shows how the decision node invokes the decision table in the BKM. It maps the data to the variable (Age) used in the decision table. This is like a subroutine call in a programming language.
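As a sketch of the pieces involved (the names below are illustrative, not taken from the demo): the BKM holds a decision table whose “age” column uses FEEL range tests, and the decision node invokes the BKM by binding its input data to the table's parameter, much like the subroutine call described above.

```FEEL
// Hypothetical BKM "Classify Person(Age)" -- a decision table over one input:
//   Age        | Classification
//   < 18       | "Minor"
//   [18..65]   | "Adult"        // FEEL range: 18 to 65 inclusive
//   > 65       | "Senior"
//
// The decision node then invokes the BKM, mapping its own input data
// ("Person Age") to the table's parameter, like a subroutine call:
Classify Person(Age: Person Age)
```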
I am finding less and less time to pursue my passions these days. It is my pleasure to show you this unusual demo in which a robot’s movements are sent over MQTT and saved in a Kafka topic, then taken from the Kafka topic and saved in a file on an FTP site using Fuse Online integration. The file is then used as the source to play back the movements on the same robot using a small Java program.
In this video, I am going to give a short introduction to DMN. I shall describe what its capabilities are and why it is important. Fully describing how DMN works would require a much longer video than this one. You can find more details on DMN using the links provided in the description of this video.
Here is a brief description of DMN: by no means comprehensive, but sufficient for you to know why it exists and to help you understand the demo in the next video.
DMN is an OMG standard. OMG, or the Object Management Group, is the same organisation that brought you BPMN2, the Business Process Model and Notation Version 2 standard. DMN is to decision modelling what BPMN2 is to business process modelling.
When you create a DMN model, you are creating a DRD, or Decision Requirement Diagram: a visual representation of your DMN model. FEEL, or Friendly Enough Expression Language, is used to evaluate expressions, e.g., in a decision table. It has been said that if you can use Microsoft Excel formulas, you will have no problem learning and using FEEL. The specification also defines a metamodel for interchange, meaning that you can export your model as XML and import it into another DMN tool.
To make model interchange possible, the DMN specification defines three conformance levels, ranging from level 1 to level 3, where level 3 is the highest.
In this video, I am going to demo Decision Central, which is Decision Manager’s low-code workbench.
The intention of the demo is to show you the new Decision Central web-based workbench, which is built on the PatternFly web UI framework, giving it a consistent look and feel with other Red Hat web consoles such as Openshift and the 3scale API Management Platform. I shall show you what it looks like and how to navigate the UI, but I am not going to describe everything in detail. If you are familiar with the JBoss BRMS web UI, you may want to contrast what I am about to show you with what you already know.
I started a multi-part video series on Red Hat Decision Manager 7 on YouTube. It covers five main topics, seen in the above slide. In Part 1, I describe what the series is all about and outline what each topic will cover. Certain topics may include more than one video, as I do not want to make each video too long.
In 2009, Gartner said: “Citizen Developers Will Build at Least 25 Percent of New Business Applications by 2014.”
“…Future citizen-developed applications will leverage IT investments below the surface, allowing IT to focus on deeper architectural concerns, while end users focus on wiring together services into business processes and workflows,” said Eric Knipp, senior research analyst at Gartner. “Furthermore, citizen development introduces the opportunity for end users to address projects that IT has never had time to get to — a vast expanse of departmental and situational projects that have lain beneath the surface…”
That prediction is yet to come true, in my opinion, but the idea of a class of developers in the integration domain, referred to as Citizen Integrators, was born. Citizen Integrators are integrators whose day job is not in IT integration. These people understand the business. They are tech-savvy to a certain degree but do not necessarily have a deep understanding of the underlying technology. For simple integrations, Citizen Integrators using the proper tools can do the job, freeing up expensive IT integration resources for more complex integrations.
FUSE Ignite is the tool designed for Citizen Integrators. It is not meant to replace FUSE Standalone or FUSE on Openshift. For complex integrations, FUSE development by developers using an IDE is still the way to go.
FUSE Ignite is supported only on Openshift and comes with a FUSE subscription. Use of FUSE Online, the SaaS version of FUSE Ignite, is not part of a FUSE subscription though.
In Part 1, I implemented a simple database application using Wildfly Swarm and deployed it on Openshift. In this article, I am going to do the same using Spring Boot instead, so that you can see the difference between using different frameworks for your development. I shall also describe the use of spring-cloud-starter-kubernetes-config for accessing a ConfigMap in an Openshift deployment, which works slightly differently from what I read in the documentation.
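For orientation, a typical spring-cloud-starter-kubernetes-config setup binds the application to a ConfigMap via `bootstrap.yml`. The fragment below is a minimal sketch with made-up names; your application, ConfigMap, and namespace names will differ.

```yaml
# bootstrap.yml -- minimal sketch; all names here are placeholders.
spring:
  application:
    name: my-db-app          # a ConfigMap with this name is looked up by default
  cloud:
    kubernetes:
      config:
        name: my-db-app      # explicit ConfigMap name (optional if same as app name)
        namespace: my-project
      reload:
        enabled: true        # pick up ConfigMap changes without redeploying
```

Note that the pod's service account needs permission to read ConfigMaps in the target namespace (on Openshift, typically granted by adding the `view` role), otherwise the lookup fails silently and the defaults are used.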
At the end of this article, I shall summarise my view on the pros and cons of each framework in the context of the ease of developing this simple database application. You may want to compare it with the Vert.x and Fuse Integration Services (FIS) implementations in my previous two-part article on Vert.x (Part 1 and Part 2). By the way, the FIS implementation also uses Spring Boot, but it uses FUSE’s REST Domain Specific Language (DSL). Continue reading RHOAR: Wildfly Swarm vs Spring Boot Microservices – Part 2
Red Hat Openshift Application Runtime (RHOAR) comes with a number of frameworks/toolkits for implementing microservices. In previous articles on Vert.x (Part 1 and Part 2), I compared Vert.x with Fuse Integration Services (FIS). In this article, I am going to compare two other popular frameworks that come with RHOAR: Wildfly Swarm and Spring Boot. I am going to show you how to implement the same database access application implemented in Vert.x and FIS in my two previous articles, so that you can compare the level of difficulty of using these frameworks. This is a somewhat unfair comparison, as most of you are either JEE or Spring developers and will always find your own framework easier to use than others, especially compared to Vert.x, which requires learning a new way (reactive programming) of implementing an application. Continue reading RHOAR: Wildfly Swarm vs Spring Boot Microservices – Part 1
Some users want to use JBoss Business Process Management Suite (BPMS), or its upstream jBPM Open Source Business Process Management solution, as a documentation tool as well as an executing engine for business processes and business rules, but…
BPMS is not designed as a documentation tool
BPMS has some metadata search capability but not sufficient for the scenario customers have in mind
Customers want a much more powerful search capability
Customers want a single source of truth for all business assets for both documentation and execution purposes
THE MISSION: ENHANCE THE DOCUMENTATION AND SEARCHING CAPABILITIES OF BPMS/BRMS
I created a couple of YouTube videos suggesting how this could be done:
The first video defines the mission and discusses two approaches to achieving it.
The second video describes how I created a prototype, what it looks like, and how it works. The search capabilities are on a different level from those of BPMS.
However, more work needs to be done to realise a proper solution for the problem. I shall be adding more videos when new ideas come to mind. Stay tuned!