|Day 1||July 13, 2017|
|Demystifying the Decision Model and Notation Specification
||The Decision Model and Notation (DMN) is an OMG standard for the modelling and execution of decision services, either standalone or seamlessly integrated with business processes using BPMN or CMMN (its sibling standards). The specification defines three progressively increasing levels of compliance, numbered 1 through 3. Level 1 covers basic modelling, level 2 adds execution semantics for some model elements, and level 3 supports the complete specification.
There seems to be a perception in part of the community that although some chapters of the standard are clear and easy to implement, the more obscure requirements for level 3 compliance are either too hard or too complex to implement. Nevertheless, the elements introduced at level 3 are fundamental to enabling practitioners to model effective real-world use cases.
This presentation aims to highlight the importance of level 3 compliance (demonstrated with real-world examples) and to share the experience of learning the specification and implementing a complete level 3 runtime engine. It will cover the scope of the work, the challenges and obstacles encountered, the solutions proposed and the experience acquired along the way. It will present the core context-aware FEEL language implementation, the support for and importance of the DMN interchange format, and the common runtime semantics. Finally, the presenter will list what, in his experience, works well and what requires specific solutions beyond the content of the specification.
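As a taste of what a context-aware FEEL implementation must handle, here is a minimal, purely illustrative Python sketch (not the engine discussed in the talk; all names are our own) of FEEL's context evaluation, where entries are ordered and later entries may reference earlier ones:

```python
# Illustrative sketch: a FEEL context is an ordered map whose entries may
# reference previously defined entries, so an engine evaluates them top to
# bottom, threading the scope built so far into each entry.
def eval_context(entries, scope=None):
    """Evaluate a FEEL-style context given as (name, fn) pairs.

    Each fn receives the scope accumulated so far, mirroring how a FEEL
    context entry may reference entries defined above it.
    """
    scope = dict(scope or {})
    for name, fn in entries:
        scope[name] = fn(scope)
    return scope

# Equivalent to the FEEL context: { base: 100, tax: base * 0.2, total: base + tax }
result = eval_context([
    ("base",  lambda s: 100),
    ("tax",   lambda s: s["base"] * 0.2),
    ("total", lambda s: s["base"] + s["tax"]),
])
```

The ordering requirement is the key point: a real engine must resolve each entry's names against the partially built context, not against a flat global scope.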
Attendees will have the opportunity for a deep look behind the curtain of a real DMN implementation. Engineers, practitioners and implementers will acquire a better understanding of the trade-offs and benefits of adopting a DMN-based approach to modelling real-world solutions.
|Jan Purchase (Lux Magi Ltd)
James Taylor (Decision Management Solutions Inc.)
|‘Mind the Gap’: Lessons Learned in the Application of DMN to Large Projects
||Since the release of DMN 1.1 by the OMG in 2015, a growing number of organisations have sought to use it to model their key business decisions. They use decision modelling to improve the integrity and structure of their business decisions, support precise and transparent definitions, frame their use of analytics and create an executable representation of their requirements comprehensible to their business communities.
But DMN is a young, evolving standard; its application to large or complex decisions, where its precision and clarity are most needed, often requires specific practices and maturity. As organizations adopt these practices, they find that the current version lacks specific features. Because published decision models (in books and internet articles) focus on clear communication, they are often simplified and omit the need for these features. Practitioners attempting to apply DMN to real-world decision modelling, however, rapidly discover and work around these ‘gaps’, and DMN tool vendors are often pressed to address them with their own specific workarounds.
Through the use of case studies, the authors present gaps within DMN’s feature set that most frequently cause significant challenges to its successful application. The lack of these features can be a barrier to adoption of DMN. The authors outline the critical practices these projects require, propose solutions for each gap and walk through model fragments that illustrate these ideas.
Although standards should be small, the authors argue—by looking at the challenges faced in real projects—that users would benefit from a small number of additional features, some of which are already being considered by the DMN committee.
|Bruce Silver||Decision Table Analysis in DMN
||New DMN tools put rich executable standards-based decision modeling in the hands of non-technical (“business”) users. Adoption and effective use of this power now hinges on “whole product” offerings that include documentation, training, and software features geared to business users to ensure model correctness and encourage best practices. Beyond basic syntax checking and logic validation necessary to prevent runtime errors, tools should also provide modeler guidance and best practice methodology built into the software.
||The DMN format of most interest to business users is the decision table. Decision table methodologies intended to enhance readability and maintainability by business users, as described by Vanthienen and others, have a long history, including normalization along relational principles more recently rediscovered and formalized by von Halle and Goldberg in The Decision Model (TDM). This presentation demonstrates the adaptation of those recommendations to DMN. Method and Style Decision Table Analysis is integrated with a third-party DMN tool, and in principle it could be integrated with any tool that creates standard DMN XML, illustrating the benefit of a tool-independent standard.
Proudly “methodology-free”, DMN 1.1 technically allows decision tables that violate these long-established recommendations.
Method and Style analysis flags these and similar issues, and guides modelers to correct them.
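Two of the classic decision table defects that analysis of this kind catches are overlapping rules and gaps in rule coverage. As a purely illustrative sketch (not the Method and Style tooling itself), the following Python code flags both for a single numeric input whose conditions are modelled as half-open intervals:

```python
# Toy decision table analyzer: given rule conditions as half-open numeric
# intervals [lo, hi) over a single input, report overlapping rules and
# uncovered gaps in the input domain.
def analyze(rules, domain):
    """rules: list of (lo, hi) intervals; domain: (lo, hi) of the input."""
    rules = sorted(rules)
    overlaps, gaps = [], []
    cursor = domain[0]                 # highest value covered so far
    for lo, hi in rules:
        if lo < cursor:                # rule starts inside covered region
            overlaps.append((lo, min(cursor, hi)))
        elif lo > cursor:              # rule starts past covered region
            gaps.append((cursor, lo))
        cursor = max(cursor, hi)
    if cursor < domain[1]:             # domain tail never covered
        gaps.append((cursor, domain[1]))
    return overlaps, gaps

# Rules over a score in [0, 100): [0,50) and [40,70) overlap on [40,50),
# and nothing covers [90,100).
overlaps, gaps = analyze([(0, 50), (40, 70), (70, 90)], (0, 100))
```

Real DMN tables have multiple inputs and hit policies, so production analysis is combinatorially harder, but the principle — detect overlap and incompleteness mechanically — is the same.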
|Business Decision Management with Signavio
||We will show how Signavio supports the Decision Model and Notation (DMN) standard and enables the creation, testing, and use of decision models. We will talk about the notion of Multi-Instance Decisions (MID), an extension to DMN, why it was added, and how those needs might be addressed in DMN 1.2. To execute models in enterprise applications, users can choose between standardized XML export and the generation of rules code. We will illustrate the implemented mapping from DMN to the Drools Rule Language (DRL), including decision table aggregations, ternary logic, DMN’s fallback to ‘null’ values, as well as MIDs. We will compare the performance of generated DRL code with the direct interpretation of DMN.|
|Denis Gagné||Integrating DMN FEEL to BPMN and CMMN
As a disciplined approach, Business Decision Management increases an organization’s agility and adaptability by extracting business decisions and making them first-class citizens of the organization’s operational model. The Decision Model and Notation (DMN) standard from OMG is a core enabler of a Decision Management practice, offering a standardized meta-model and notation for defining both the requirements and the logic of business decisions. DMN is by design a complement to both the Business Process Model and Notation (BPMN) and Case Management Model and Notation (CMMN) standards. Trisotech Digital Enterprise Suite was the first solution to allow modeling an organization’s operations using the complementary viewpoints of BPMN, CMMN and DMN. In this session we will explain and demonstrate how the Friendly Enough Expression Language (FEEL) from DMN has been deeply integrated as an alternative expression language in both BPMN and CMMN. For the first time, users will be able to use the same expression language across the OMG Triple Crown of process improvement: BPMN, CMMN and DMN.
|Why Model Business Knowledge?
||Some practitioners avoid the use of Business Knowledge Models (BKMs) in their decision models, considering them to be unpopular, inconvenient, or appropriate only for reuse of logic. These opinions are discussed and weighed against a range of benefits arising from the explicit modelling of business knowledge.|
|Jan Vanthienen||Process-Decision Integration: An overview of different scenarios
||A business process using or calling decisions or decision services seems to be the common scenario for process-decision integration. But that is only one simple case.
This talk will cover different scenarios for integrating decisions and processes: scenarios where intermediate decisions produce results needed earlier in the process model, scenarios with incomplete input, scenarios which impose an order on decision execution, and scenarios where the process model can be optimized based on the decision model (and some criteria).
|Larry Goldberg||Decisions and Big Data – How Decision Management Makes Big Data and Analytics More Impactful
||Big Data is not just becoming pervasive; it is ubiquitous today. And while its potential impact is increasing, the challenges of managing big data continue to rise and haven’t been fully addressed. The main challenge is this: once you have all this data, how do you a) make sense of it, b) decide what actions to take, c) implement those actions successfully, and d) do it all automatically? These challenges and others are obvious candidates for Decision Management.
In fact, Forrester just published a report called “Prescriptive Analytics – the Black Belt of Digital Decisions”, in which Decision Management is called out as a logical enabler and partner to the world of big data and analytics.
Over the past year, Sapiens DECISION has extensively championed and led the implementation and deployment of business logic in big data environments, bringing the business logic to the data rather than bringing the data to the logic. This fundamental change has not only improved performance but also opens a new vista for Decision Management: harnessing big data in business production automation and business transformation initiatives. Larry Goldberg will describe real case studies and the technical interplay between big data and decisions.
|QnA Panel||Real-world Business Decision Management: Vendor and Practitioner Perspectives||
Moderated by James Taylor
|Day 2||July 14, 2017|
|Machine Learning and Decision Optimization||
|Eugene C. Freuder||Constraints and Rules
||There is an old adage: “When the only tool you have is a hammer, every problem looks like a nail.” Even a tool as powerful as rule-based reasoning cannot be expected to be ideal for every job. It behooves us to have multiple tools in our toolbox. Ideally these can also learn from each other, and even be integrated into combinations that are more powerful than the sum of their parts.
One such tool is constraint-based reasoning, or constraint satisfaction and optimization, which has proven successful in a wide range of applications. This talk will:
a) provide a very brief introduction to constraint-based reasoning,
|Geoffrey De Smet
|Real-time Constraint Solving with OptaPlanner
||What’s the shortest trip to visit all European capitals? Or the cheapest vehicle routing schedule to restock all our retail stores? How do we optimize our cloud machines? How do we assign nurses to shifts in our hospitals to make them as happy as possible? Which crops do we plant on which fields for the optimal revenue? What’s the fairest tennis club schedule?
Which algorithms work well and scale out on these kinds of planning problems? Certainly not Brute Force or other exhaustive searches! In this session, we will:
– Introduce constraint satisfaction optimization
– Demo a few use cases
– Use weighted hard and soft constraints to formalize business goals
– Walk through a bit of example code in Java of the open source constraint satisfaction solver OptaPlanner (www.optaplanner.org)
– Explain how continuous planning and real-time planning work
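The weighted hard/soft constraint idea from the list above can be sketched in plain code. This is a toy illustration, not the OptaPlanner API; the nurse rostering data and constraint names are invented for the example:

```python
# Toy hard/soft constraint scoring: hard constraints must never be broken,
# soft constraints express preferences. Scores compare as tuples, so any
# hard violation outranks any amount of soft reward.
def score(assignment, hard_constraints, soft_constraints):
    """Constraints are (weight, predicate) pairs; broken ones cost -weight."""
    hard = sum(-w for w, pred in hard_constraints if not pred(assignment))
    soft = sum(-w for w, pred in soft_constraints if not pred(assignment))
    return (hard, soft)

# Invented nurse rostering example: assignment maps a shift to a nurse.
def no_double_shift(a):
    return a["mon_day"] != a["mon_night"]   # hard: no back-to-back shifts

def alice_prefers_days(a):
    return a["mon_day"] == "alice"          # soft: honor a preference

hard_constraints = [(1, no_double_shift)]
soft_constraints = [(1, alice_prefers_days)]

ok  = score({"mon_day": "alice", "mon_night": "bob"}, hard_constraints, soft_constraints)
bad = score({"mon_day": "bob",   "mon_night": "bob"}, hard_constraints, soft_constraints)
```

A solver then searches the space of assignments for the lexicographically best (hard, soft) score, using metaheuristics rather than brute force.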
|James Taylor (Decision Management Solutions Inc.)||
The Role of Decision Models in Analytic Excellence
|Organizations are increasingly investing in data analytics to improve decision-making. Dashboards, self-service BI, data mining, predictive analytics, machine learning and cognitive technologies are being evaluated, deployed and used as organizations push to adopt data-driven decision-making. Effectively using these analytic technologies, and especially using them consistently, broadly and deeply enough to achieve excellence requires a disciplined focus on better decisions.
Some organizations are using decision modeling, and the DMN standard, to achieve analytic excellence. Decision modeling with DMN provides critical structure and acts as a catalyst to maximize the value of these technologies.
|Marcia Gottgtroy||Decision Management in Tax Administration – lessons learnt and future development
||This presentation shares the lessons learnt, pitfalls, and new practices Inland Revenue NZ has been adopting and developing to establish its decision management capability since the presentation of the Compliance Management Environment (CME) decision management platform (the GST experience) at Decision Camp 2014.
Decision management is in IR’s DNA and at the core of its analytical architecture and practices. This presentation will discuss how we are combining decision management, experimental enterprise and lean analytics principles with traditional analytical methods and processes such as CRISP-DM, delivering hybrid, integrated, enterprise solutions.
IR’s pathway to creating a modern analytical culture will be addressed. Breaking down the silos between mathematicians, statisticians, software engineers, knowledge engineers, criminologists and SMEs, while building a strong collaboration supported by a data science framework, was fundamental in enabling IR’s transformational goal of becoming an intelligence-led, customer-centric, agile, low-touch and network-organized organization. The data science framework adopted will be considered.
The knowledge engineering principles and methodologies employed will be briefly touched on, to explain how they have been applied as an enabler to transparently embed the core decision management principles in many projects and across different groups, creating an organic learning environment.
Some of the business successes achieved will be reviewed. It will be shown that the paradigm shift from product-centric, traditional risk management strategies to a hybrid approach that considers profiles and human intelligence inputs for augmented decisions was a definite step towards bringing the organization to understand the need to manage decisions beyond its internal boundaries: to consider customer journey maps and cross-government risk management, along with the architecture necessary to support every decision, optimizing IR’s services and, more specifically, NZ citizens’ experience in the context of NZ Government drivers.
|Rule based Traffic Management
||Rule based traffic management is a methodology for dynamic traffic management that is a joint development of the different road authorities in the Netherlands. The methodology supports collaboration between various traffic management authorities (e.g. counties, provinces, or cities) by offering a common approach and vocabulary.
This common approach offers an alternative to the response plans that are used today. A response plan translates policy and experience into operational instructions, often in the form of a flow chart. Response plans are intended to help human operators deal with events, incidents and road works, but when used on a large scale they are labour-intensive to maintain and slow down the operational execution process.
The rule based traffic management approach tackles these issues by disentangling problem detection, problem solution and conflict handling for each element in the road network. Each element in the road network follows the same generic rules (presented as decision tables) and tries to defer the traffic problem to other roads in the network that have less priority. This way traffic is distributed across the road network and congestion is prevented.
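The "same generic rules for every network element" idea can be sketched as follows. This is a purely illustrative toy (segment names, loads, capacities and actions are invented, and real decision tables have many more conditions):

```python
# Toy version of the generic per-segment decision table: if a segment is
# over capacity, try to defer traffic to its alternatives in priority
# order; otherwise do nothing. Every segment runs the same logic.
def decide(segment, network):
    """Return the action for one road segment, mimicking a decision-table row."""
    info = network[segment]
    if info["load"] <= info["capacity"]:
        return "no action"
    for alt in info["alternatives"]:            # ordered by decreasing priority
        if network[alt]["load"] < network[alt]["capacity"]:
            return f"divert to {alt}"
    return "hold traffic"                       # no alternative has headroom

# Invented network: motorway A2 is congested; N201 is also full, N196 is not.
network = {
    "A2":   {"load": 120, "capacity": 100, "alternatives": ["N201", "N196"]},
    "N201": {"load": 90,  "capacity": 80,  "alternatives": []},
    "N196": {"load": 40,  "capacity": 60,  "alternatives": []},
}
action = decide("A2", network)
```

Because every segment applies the same table, network-wide behaviour emerges from local decisions rather than from a hand-maintained response plan per scenario.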
Evaluation studies show that the new methodology is effective for daily operational traffic management. The methodology facilitates the transition to traffic management based on big data for new technologies (like in-car systems) and tightens the connection between traffic policy and operational execution (compliance).
In this talk you will learn:
– what the objectives are of rule based traffic management
– how we connect policy with operations using the rule based methodology
– how simple generalized rules may result in complex and dynamic network behavior
– that the development of this methodology was a process of many years
|Gitika Gumbar, Lucas van Biert||Decision Maintenance and Governance: a Worked Example
||As more and more organisations focus on ‘decisions first’, it is critical to understand initial observations on two major value propositions: ease of maintenance and model governance. Decision modeling increases the maintainability of your business solution because (1) all business logic is captured in one place, (2) in one standard notation, (3) which can be easily read and amended by business and technical stakeholders. Centralizing the design and execution of business logic will also ensure (4) standardization, opportunities for reuse, and improved quality of decision models.
A decision model maintenance process and governance framework directly impact an organisation’s agility and the time-to-market of business logic changes. We therefore provide a real-world example illustrating key principles in balancing governance and agility when maintaining executable decision models. Through this example we also aim to highlight challenges around skills, organisational change, model governance and model execution governance.
|Think Big: Scale Your Business Rules Solutions Up to the World of Big Data
||Traditional business rule applications process records of a few megabytes of data at a time, one record at a time. As we move solutions to big data, rules may be applied to terabytes of data, and traditional business rule architecture cannot keep up. To scale business rules solutions up to the world of big data, this session presents an integration of Business Rule Management Systems with Apache Hadoop.
We will demonstrate IBM ODM and Apache Hadoop on the cloud running rules against 100 million passenger data records in twelve minutes (roughly 132,000 records processed per second). This demonstrates that business rules can be applied against huge volumes of passenger data to stop passengers of interest before they fly.
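The partitioned, map-style execution that makes this scale possible can be sketched in a few lines. This is a toy illustration of the pattern, not IBM ODM's or Hadoop's API; the watch-list rule and names are invented:

```python
# "Bring the logic to the data": the same rule function is shipped to each
# partition of records and applied independently, which is what lets a
# cluster scale the rule workload horizontally.
WATCH_LIST = {"J DOE"}          # invented screening list

def screen_passenger(p):
    # Hypothetical stand-in for a full rule set evaluated per record.
    return p["name"] in WATCH_LIST

def run_rules(partition):
    """Apply the rule set to one partition; partitions run in parallel on a cluster."""
    return [p for p in partition if screen_passenger(p)]

partitions = [
    [{"name": "A SMITH"}, {"name": "J DOE"}],
    [{"name": "B JONES"}],
]
hits = [hit for part in partitions for hit in run_rules(part)]
```

Because each partition is screened independently and results are merged afterwards, throughput grows with the number of nodes rather than being capped by a single rule engine instance.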
Other big data and business rules use cases are:
* Trade Validation
The presentation will be based on the article
|Vijay Bandekar||Decision models for the Digital Economy
||The digital economy demands autonomous, self-learning, real-time decision-making systems that sense, comprehend and act. Decisions must be effective and adapt to changing business environments. Decision models and learning models must work in unison to enable adaptive reasoning capabilities. “Store first, process later” does not work: the increasing volume of data generated by a growing number of mobile and Internet of Things (IoT) devices makes the learning process too slow for timely decisions. Traditional big-data analytics such as Hadoop are too slow and require custom integration with decision engines.
We present the results of applying an adaptive decisions framework, which integrates best-of-breed machine learning algorithms with data-driven productions in a stateful knowledge session. The framework allows us to represent relations for continuous learning based on decisions. Decisions are modeled using rules, which form an in-memory Rete network called the stateful working memory (SWM); this enables caching of partial decisions. Data is continuously streamed through the network. Learned models are readily available in the SWM and alter decisions in real time as the data streams change.
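The caching of partial decisions in a stateful working memory can be illustrated with a deliberately simplified sketch (our own simplification, not the authors' framework; a real Rete network caches partial joins across many rules, not a single aggregate):

```python
# Toy stateful working memory: facts stream in one at a time, a partial
# result (here a running total) is cached, and the decision is
# re-evaluated incrementally per event instead of re-scanning all data.
class StatefulWorkingMemory:
    def __init__(self, threshold):
        self.threshold = threshold
        self.total = 0.0            # cached partial aggregation
        self.decisions = []

    def insert(self, reading):
        """Insert one streamed fact and return the current decision."""
        self.total += reading       # incremental update, no re-scan
        decision = "alert" if self.total > self.threshold else "ok"
        self.decisions.append(decision)
        return decision

# Invented wearable-device example: sensor readings stream in, and the
# decision flips to "alert" the moment the cached total crosses 10.
swm = StatefulWorkingMemory(threshold=10)
results = [swm.insert(x) for x in [3, 4, 5]]
```

The point of the cached state is that each new event costs a constant-time update, which is what makes real-time decisions over continuous streams feasible.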
We demonstrate the measurable advantages of our framework with two real-world case studies, in media analytics and wearable devices. Today, online media analytics relies on periodic sales data and media spending data from retailers; the retailers need timely insight into how their media spending on various channels directly influences sales. Similarly, wearable device manufacturers must process continuous streams of data from field-deployed devices to support their business processes.
|Tim Stephenson||Punching Above Your Weight: Automating Decisions for Better Customer Relationships
||We’ve been using and refining a SaaS-based decision modeling and execution practice for two years to enable resource-constrained marketing and account teams to maintain more regular and intimate customer relationships.
We will show the power of integrated modeling and execution, from the initial discovery conversations around whiteboards of ‘as is’ processes and decisions through to execution. We will look at that most contentious of questions: when to use process models and when to use decision models. We’ll explore our experience in identifying where best to place abstractions to maximize robustness in the face of change, and how to support innovation by testing different responses.
The approach discussed relies on both modeling and execution of decision and process models, but a critical part of the story is that we rely on the applicable standards in each area to ensure that the business retains independent ownership of its business knowledge. To explore this we will look at snippets of XML for DMN models as well as their visual aspects, and will make reference to the work of the DMN technology compatibility kit, with which the presenter has been involved for the last year.
Key take-away: Models not only crystallise business thinking, they also enable rapid hypothesis testing.
Audience: Some experience with DMN would be desirable, but given the early nature of the standard the talk will be anchored in the business domain before proceeding to implementation details.