Posted in Dynamics 365, PowerApps

PCF Tagger

Tagging is a common requirement in Dynamics CRM. Accounts can be tagged with multiple types, and contacts can be tagged with multiple interests.

I created a solution using PCF that makes tagging quick and intuitive. No more searching for and linking tags using lookups: just *click on*, *click off*. The data goes into a related many-to-many association. Below is a demo of the Tagger.



Posted in PowerApps

Introducing CRM Timeline control

This article shows how to present CRM data in a chronological timeline format. A timeline is best suited for entities which have one or more date fields in them, for example Tasks, Work Orders, Inspections, Resource Bookings, etc.

A task has a start date and an end date. The GIF below shows how a task timeline can be displayed inside Dynamics 365 using the PowerApps Component Framework.

Key notes about the footage below

  1. There are 7 tasks with start dates and end dates.
  2. Initially, the All Tasks view shows the information in tabular format.
  3. Then the Task Timeline view shows the same information in timeline format.
  4. It is easy to visualise overlaps between tasks and other chronological details which get lost in the tabular view.
  5. On hover, it shows the start date and end date.
  6. A lot more can be done with onClick, onDrag, etc., but for now this is just a basic sample.




The code is on my GitHub.

Posted in PowerApps

From CRM views to Datacards using PCF

PowerApps Component Framework can be used to transform the visual representation of CRM data (which was earlier restricted to only forms and views) into a more sophisticated and intuitive user experience.

This blog explores how a bland CRM view can be turned into a graphical control that makes information more palatable to the visual cortex.


The scenario we have is an entity called Ratings which stores ratings for various venues in Canberra.

  1. We want to show verified Ratings in green and unverified ones in blue.
  2. The layout should render the smaller fields on the left (stacked top to bottom).
  3. More descriptive fields like Review Comments should take the lion’s share of the real estate towards the right.


Once the control is built and deployed to Dynamics, it can easily be consumed by your view designer using the View Customisations > Custom Controls section, as shown below.

Datacard setup

Upon publishing, you can represent the information in ways that were not possible before, meeting the three requirements outlined above.

Veritec Datacard v2

And yes, this is the same typical Dynamics 365 CRM view which we use every day. So custom controls have answered the call for a more intuitive UX design.

The source code used to create this datacard has been published here.

Note: PowerApps Component Framework is still in Preview, so not recommended for Production at the moment.

Posted in Dynamics 365

XrmToolBox Plugin – Workflow Buddy

Workflow Buddy is an XrmToolBox plugin developed by Veritec. This plugin gives workflow developers the functionality to search within

  • Workflow comments
  • Stop statements
  • Parameters used to call child workflows and actions

Business Need

As the focus of CRM implementations shifts towards achieving business requirements with a minimal amount of customisation (read: code), workflows are becoming the core containers of business logic. The complexity and scale of CRM workflows and actions is increasing, precisely because business logic is moving from JScript/plugins (code) into workflows.

In a medium-scale CRM implementation (e.g. one which took a year to build), it is not uncommon to have 200 workflows triggering from various places, doing their bit and handing off the baton to subsequent workflow processes, and so forth.

Visual Studio provides a handy search feature where you can search within a file, a project or the whole solution for a particular keyword, but there is nothing like that for CRM workflows. Although not as spectacular as the VS search, this tool endeavours to cover that gap by providing a basic search on some of the most commonly searched aspects.

How to use Workflow Buddy

After installing the plugin from the Plugin Store, launch it from the list of your plugins in XrmToolBox.


Once launched, the tool should appear in a new tab.


The plugin does a partial (contains) search in the following areas.

Workflow comments

These are shown above with the orange arrow.

Stop statement messages

Stop workflow steps with a status of Canceled are typically used to show validation errors. Those error messages can be searched with Workflow Buddy.



Parameters passed to child workflows and actions

Many times you pass string parameters to actions, like child action names, relationship names, etc. Those string parameters can be searched too.

For example, if we want to find out who is calling an action (new_LeadSubmittherate), we can search for it and the calling workflow will show up in the Workflow Buddy results.




Let us search for the word Calculate and press Search.


Any workflow that uses this keyword in any of the three areas discussed above will appear in the search results, as below.


Then you can click on the workflow and it will open up directly in your default (selected) browser.


I hope this tool will increase the developer productivity of your organisation.


Posted in Blockchain

Part 3 – Connecting Dynamics with Blockchain

If you have been following along with the blog series, you’ve seen in Part 1 why blockchain matters and in Part 2 how we can build a shopping cart in ReactJs, which will be used in this part to send our transactions to the blockchain. In this part, we will look at the blockchain plumbing and setup.


Audience: if you are a functional consultant or an architect who wants to understand how blockchain transactions work at a high level, I have explained them in the Blockchain Request Lifecycle section. If you are a developer who would like to build something similar, I have also posted the source code of my working prototype that demonstrates how Dynamics can be linked to the blockchain. I have not come across blockchain in the Dynamics 365 space before, so this is potentially the first ever working prototype with source code – hooray!!


There are a few blockchains in the market, but the two biggest of them are Bitcoin’s blockchain and the Ethereum Foundation blockchain. We will be using the Ethereum Foundation one because, as per my research, it is growing faster and has recently tied up with Microsoft for a Blockchain as a Service solution. So knowing how to integrate with it will help you understand the underpinnings of that service.

Just to be clear, I will not be using Microsoft’s Blockchain as a Service, for two reasons:

  1. It costs money to spin up a blockchain cluster in Azure and I am a developer on a budget.
  2. I don’t like shortcuts.

I like to start from the grassroots and make my way up the chain. That way I can understand how connectivity works, the dependencies, the install order, etc. This is the more difficult path because you need to install the packages one by one, in a particular sequence, whereas on the Service they are all pre-installed and you get a working infrastructure. So I will leave it up to you; either way you get a blockchain, and the following is what its ecosystem is composed of.


The stack I used for Blockchain development


  1. Ethereum VM (i.e. the platform)
  2. Ethereum client (the one which connects a webpage to the platform)
  3. Metamask (a browser plugin which simulates a virtual Ether wallet; you need ether (gas) to run SmartContracts)
  4. web3.js (a node module which connects JavaScript to the Ethereum client)
  5. Solidity compiler (a compiler which converts SmartContracts written in a JS-like language, Solidity, into a binary format which the platform understands)
  6. Truffle (optional; a framework that deploys your SmartContracts from your project folder onto the platform)
  7. Remix (an online IDE to develop, compile and unit test smart contracts)



Blockchain Demo



Blockchain Request Lifecycle

Let me try to explain how the Dynamics to Blockchain connectivity works.


Loading Products and Pricing from Dynamics 365

  1. When a customer browses to the Shopping page, a WebAPI request reads Product and Pricing information from Dynamics 365. This assumes you have already registered your app in the Azure AD. The code for this can be found in connect.js
  2. This request is raised by a React component via an Action. All the data is fetched as a JSON object and stored in something called a Store (Redux’s Store).

Initiating the Purchase

  1. When the customer goes to the product catalog and presses the Add button, the following things happen
    1. A reducer intercepts this event and updates the state, which updates your UI (decreases the inventory and adds the product to the shopping cart)
    2. A smart contract named Buy is invoked; this is the first port of entry into the blockchain

Calling Blockchain Smart contract

  1. The invocation is done by web3.js, which is a wrapper around the Ethereum client. web3.js looks for a SmartContract with that name/signature in the Ethereum VM and, if it finds one, passes the transaction data as arguments. So web3.js is your conduit between the browser and the blockchain platform.


Below is the code of the smart contract which posts the transaction on the blockchain


I wrote two smart contracts in Solidity

Buy – actually creates the transaction, basically a collection of objects in the blockchain’s memory

TotalBillOfACustomer – returns the total bill of a customer

Mining the blocks

  1. Once this transaction is posted into the blockchain’s data structure, it goes into something called a block (an unverified block). Then a node mines that block (in a bid to put it on the verified chain). But there is a twist here: the node won’t mine the block until you associate a reward with it. That reward is called gas, which is generally paid in Ether (the cryptocurrency of Ethereum).
  2. That is where Metamask comes into the picture, as you can see in my demo GIF. Every time I click Add item, I get a Metamask popup asking me to Submit/Reject. If I choose Submit, gas is purchased with ether from my wallet and sent along with the smart contract request.
  3. Lots of miner nodes will compete to mine my block (it is a competition to crack the nonce; whoever finds the nonce wins the reward).
  4. So Metamask ensures that money is not taken without our manual approval.
  5. Mining eventually produces a response, and you can take some action on it if you want: send an email, update inventory, create an invoice, etc.
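The nonce competition described above can be illustrated with a toy proof-of-work sketch. This is an illustration of the general idea only, not Ethereum's actual mining algorithm, and the transaction data is made up:

```python
import hashlib

def mine(block_data, difficulty=4):
    # Keep trying nonces until the block hash starts with `difficulty`
    # zeros; the difficulty controls how much work a miner must do.
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Hypothetical transaction data; whichever miner finds a valid nonce
# first wins the associated reward (the gas).
nonce, digest = mine("customer 42 bought 2 widgets")
```

Anyone can cheaply verify the winner's claim by hashing the data with the winning nonce once, which is why finding the nonce is hard but checking it is trivial.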

Source Code

I hope that was useful. As promised, the link to the full source code is on my GitHub.

You will have to install the npm packages and get webpack running on your local machine for the dependencies to resolve. It is not going to be an unzip, deploy and run type of application. As explained earlier, there are a lot of underpinning dependencies and tools in use. Follow the trail starting from the package.json file.


Posted in Blockchain

Blockchain Part 2 – ReactJs portal for Dynamics 365

Continuing our journey down the chain: as you can see in the architecture diagram I shared in my previous post, NodeJs (the green patch) is our middleware that makes the communication possible. As of now, Dynamics offers no direct connection to any blockchain; let alone a direct connection, there is not even a connector available. See, that’s the joy of working on the cutting edge: you get to invent. Necessity is the mother of invention. So I thought of building some kind of middleware, and NodeJs was a great choice as it can broker the communication between Dynamics, the portal and the blockchain.


The following are the main roles of the NodeJs layer in this architecture

Retrieving and storing information into Dynamics using Web API

For this, I had to register my application inside Azure AD as a native app.

Module Bundling

I used Webpack 3.0 as my module bundler and Babel as my transpiler. The main purpose of these components is to convert higher-level code like React and Web3 into plain JavaScript which browsers can understand.

State Management using Redux

The Redux package has been used for state management, which basically means storing data in session and collections and keeping it up to date as users interact with the portal. See the demo animation attached at the bottom of the previous post, where a user clicks on the Add button and products keep getting added to the shopping cart in real time. That is made possible by Redux, all done on the client side, hence the great UX. The Shopping Cart and Product Catalog are different components, and they communicate with each other through a common Redux Store using event/listener based communication.

Communicating with Smart Contracts

Smart contracts are to blockchain what plugins are to Dynamics: they contain business logic which can be triggered by external calls. Smart contracts execute in the blockchain and manipulate the transactions. They are written in a language called Solidity and get deployed to the blockchain. The smart contracts are invoked by a NodeJS package called Web3, which runs on the client side.


There are a lot of things at play here, a lot of moving parts that need to be set up in a proper fashion for the end-to-end communication to work. So far I have only listed half of the things I had to use; I will cover the rest in the next post.


React JS Components

React JS is wonderful for functional programming; it makes it possible to write advanced components that can scale. It is a micro-components architecture where your application is built of small components that talk to each other, rather than a heavy monolithic design like MVC. In React, every component is self-sufficient: it uses its own state and its context is localised. There are clear boundaries between what a component gets passed in (Properties) and what it manipulates (State). I loved this new pattern for working with web apps; it is much better than the spaghetti-style MVC pattern.


The diagram below shows how communication happens between React components. Every event (e.g. when I click on the Add button of a particular shopping item) goes via this route. The sequence is:

  1. On click, an event is generated by the Action Creator
  2. That event is dispatched by the Dispatcher
  3. Reducers listen for events and update the big Store object
  4. Then the updated store is passed back into the components as properties


This is a highly scalable pattern called Flux.
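The portal itself uses React/Redux in JavaScript, but the Flux loop above can be sketched in a few lines of Python for illustration. The add_item action and the product names are hypothetical, not from the actual portal code:

```python
# Minimal Flux-style loop: action -> dispatcher -> reducer -> store -> view.

def reducer(state, action):
    # A reducer is a pure function: current store + action -> next store
    if action["type"] == "add_item":
        item = action["payload"]
        return {
            "inventory": {**state["inventory"], item: state["inventory"][item] - 1},
            "cart": state["cart"] + [item],
        }
    return state

store = {"inventory": {"widget": 10}, "cart": []}

def dispatch(action):
    # The dispatcher feeds every action through the reducer; the updated
    # store is then passed back into the components as properties
    global store
    store = reducer(store, action)

# Clicking Add on a catalog item generates and dispatches an action
dispatch({"type": "add_item", "payload": "widget"})
```

After the dispatch, the store holds the decremented inventory and the cart contents, and the UI simply re-renders from that single source of truth.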


Show me some code

Below is some of the code for the Catalog Item component, to give you an idea. As you can see, the bindings, markup, properties and events of this component are all contained in one file. For a new developer it is so easy to understand what is going on, as you do not need to go through five files to connect the jigsaw pieces; it is all in one place. I have put some labels that explain what is going on in the various sections of the code.


How it connects to Dynamics

I have used a NodeJS package called https, which basically makes HTTPS requests to any endpoint: you set the URL, set the headers, make a call and get the results. One limitation I saw with this package is that it uses callbacks, so I changed the code to use promises, which is a better pattern.

Some of the code is shown below


That covers the communication between ReactJs and Dynamics 365. In the next blog post we will look at the blockchain communication.

Posted in Blockchain

Blockchain integration with Dynamics 365

“2017 is the year of Machine Learning, 2018 will be the year of Blockchain. So get ready to chain your CRM to the block”

Manny Grewal

What a great way to start a blog series: with a quote of your own. Last year I blogged at length about how machine learning can take your CRM implementations to the next level and demonstrated some real-world use cases of how artificial intelligence can weave its magic. This year I thought, why not explore the other great things buzzing around, and the exploration left me amazed!!  Not because I learnt a lot of amazing things, but at the amount of untapped potential we are missing out on. Technology paradigms are changing rapidly, and being a technologist feels like running after multiple buses departing in opposite directions: if you run after one, you’ll miss the other.


This is what happened when Blockchain rocked up while people were still trying to understand how Artificial Intelligence works.

The challenge the current market is facing is that business leaders (and even technologists) are unable to cope with the pace of innovation. Before they can understand a technology and how it can benefit their organisation, something else rages up and throws them back to the start of their learning curve.

Such challenges bring an opportunity in disguise for those who vow to get under the skin of these technologies: the ones who put in the hard yards to decode the intricacies, the likes of distributed consensus or a neural network, tame them, and then demonstrate their usefulness. Such was my motivation: to spearhead the cavalcade while we (the IT community) take on these behemoths, bring them under control and understand how they can be ‘leveraged’. I hope the fiction has pumped you up a bit, so get ready to join this journey as great things are about to begin….

What and Why of Blockchain

Blockchain is like an open database which everyone can see. It contains a ledger of transactions, where each entry basically boils down to:

User A did something to User B

where the ‘something’ is typically a transaction like a purchase, transfer, delivery, packing, shipment, etc.

The word block is used because a bunch of these transactions are blocked together, e.g. 1024 transactions in one block.

The word chain is used because all these blocks are chained to each other using cryptographic pointers, something like below.
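As a rough illustration of those cryptographic pointers (a simplified sketch, not Ethereum's actual block format, with made-up transactions), each block embeds the hash of its predecessor, so tampering with any block breaks every link after it:

```python
import hashlib
import json

def make_block(transactions, previous_hash):
    # A block's hash covers both its payload and its predecessor's hash,
    # which is what chains the blocks together.
    payload = {"transactions": transactions, "previous_hash": previous_hash}
    block_hash = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return {**payload, "hash": block_hash}

genesis = make_block(["A paid B 5"], previous_hash="0" * 64)
block2 = make_block(["B shipped goods to A"], previous_hash=genesis["hash"])
block3 = make_block(["A confirmed delivery"], previous_hash=block2["hash"])

# Each block points at its parent; altering any earlier block would
# change its hash and break every pointer after it.
chain_valid = (block2["previous_hash"] == genesis["hash"]
               and block3["previous_hash"] == block2["hash"])
```

If anyone rewrote the transactions in the genesis block, its hash would no longer match the `previous_hash` stored in block 2, which is exactly how the network detects tampering.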



The main benefits of blockchain technology are

Distributed and Transparent

Blockchain information is not controlled by a particular organisation; it is distributed across many nodes around the world. Anyone can search and validate the transactions, and everyone knows what is happening and where. In short: a fair network.

No middleman

It eliminates middlemen like brokers, third parties and institutions, because the two transacting parties can trade directly using the blockchain.


Non-repudiation

Once validated, a transaction is set in stone, so neither party can deny it. This introduces non-repudiation and eliminates fraud.

I will not go into the depths of the concepts, as I do not believe in repeating what can already be found on the internet. If you want to understand more about its inner workings, feel free to explore around. The purpose of this blog is to demonstrate how a blockchain can be integrated with a system like Dynamics CRM – the product everyone loves.

What are we building?

We will build a portal, a shopping cart where our customers can purchase products. Upon purchase, every transaction will be sent to the blockchain; once validated (i.e. committed to the blockchain), the customer will be notified. The portal will read all the product and pricing information from Dynamics CRM, and as products are bought, inventory will be updated in real time. Once a transaction is committed, an invoice can be generated inside CRM.

Below is the technology stack I will be using

  • Dynamics 365 for Product Catalog, Pricelists, Inventory and Invoicing
  • Ethereum blockchain
  • Solidity and web3 to write our smart contracts
  • ReactJs for the portal
  • Redux for state management
  • NodeJs as our messaging engine




I plan to cover this series in three parts and will also host the source code of a working prototype that demonstrates the above stack in action. I will soon cover the following:

Part 2 – Building a Dynamics 365 based portal in React JS

Part 3 – Building the DApp (distributed application) and smart contracts in the blockchain

I will leave you with a teaser of how the shopping cart is shaping up.


Until then…. cheers.

Posted in Artificial Intelligence, Machine Learning

Dedupe Duplicates using Fuzzy / Proximity search

Last year I wrote a post about finding similar accounts in Dynamics CRM, which generated a lot of interest in the community. Understandably so, as this is a very common requirement in nearly every CRM project: duplicate accounts. CRM’s duplicate detection capabilities are only basic; they just do partial matches and cannot do any fuzzy or proximity matching.

Even with the latest and most advanced weaponry in CRM’s armoury, i.e. Relevance Search, it is not yet able to tell that the following accounts are in fact the same companies.


Potential duplicates

  • Waste Management / Waset Manaegment
  • Public Storage Co. / Storage Public Co. (wrong order)
  • Scotts Miracle-Gro / Scott Miracles Gro
  • Melbourne University / Melbourne Univ. (short form)

I decided to improve and generalise my code a bit, so that it can be used not only for CRM but for any general requirement where you need to find duplicates based on proximity. I am going to share the code and approach in this blog.


This proximity search is based on algorithms that measure edit distance. The program starts by finding exact matches first; if it cannot find an exact match, it widens the search filter to find partial and proximity matches (i.e. words in the same neighbourhood, words ordered in a different way, etc.).
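To make edit distance concrete, here is a minimal sketch of the classic Levenshtein distance, the idea fuzzywuzzy's scorers are built on: the minimum number of single-character edits needed to turn one string into another.

```python
def edit_distance(a, b):
    # Levenshtein distance via dynamic programming, keeping only two rows
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # 3
print(edit_distance("Waste Management", "Waset Manaegment"))
```

Two company names with a small distance relative to their length are likely the same company misspelled, which is exactly the signal the scorers below exploit.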


I have also attached the original files that I used during my testing, i.e. the file containing duplicates and the results (where duplicates were found). Below is a brief snapshot of the results from my test run.


Duplicates found

  • Kimberly Clark / San disk
  • Starwood Hotels & Resorts / Starwood Hotels And Resorts
  • Expeditors Washington / Expeditors International of Washington

There were some false positives in the results as well, so you can adjust the thresholds of the algorithm as per your data.

How to use

You have a list of companies and you want to know which of them are duplicates. This is what you need to do:

1. Export the list into a CSV file.

2. Point the code to your file.

3. Run the code and it generates a new file, results.csv, with a new column called Duplicate.

Complete source code

Python is a beautiful language and does big things in just a few lines of code. Just install Python on your desktop and run the following file. No frills, no servers, no deployment. Too easy.


import pandas as pd
from fuzzywuzzy import fuzz
from fuzzywuzzy import process
import os


# Matching thresholds - tune these to suit your data
FULL_MATCHING_THRESHOLD = 90
PARTIAL_MATCHING_THRESHOLD = 90
TOKEN_MATCHING_THRESHOLD = 85
SORT_MATCHING_THRESHOLD = 85
MAX_MATCHES = 3

companies_db = "<local path of your CSV file>/CompaniesShort.csv"
current_db_dataframe = pd.read_csv(os.path.basename(companies_db), skiprows=1, index_col=False, names=['Company'])

def find_matches(matchThis):
    # Exclude the record itself so that it does not match against itself
    rows = [r for r in current_db_dataframe['Company'].values.tolist() if r != matchThis]
    # Start with the strictest scorer and progressively widen the net
    matches = process.extractBests(matchThis, rows, scorer=fuzz.ratio, score_cutoff=FULL_MATCHING_THRESHOLD, limit=MAX_MATCHES)
    if len(matches) == 0:
        matches = process.extractBests(matchThis, rows, scorer=fuzz.partial_ratio, score_cutoff=PARTIAL_MATCHING_THRESHOLD, limit=MAX_MATCHES)
    if len(matches) == 0:
        matches = process.extractBests(matchThis, rows, scorer=fuzz.token_set_ratio, score_cutoff=TOKEN_MATCHING_THRESHOLD, limit=MAX_MATCHES)
    if len(matches) == 0:
        matches = process.extractBests(matchThis, rows, scorer=fuzz.token_sort_ratio, score_cutoff=SORT_MATCHING_THRESHOLD, limit=MAX_MATCHES)
    return matches[0][0] if len(matches) > 0 else None

# Flag the potential duplicate for every company and write out results.csv
current_db_dataframe['Duplicate'] = current_db_dataframe['Company'].apply(find_matches)
current_db_dataframe.to_csv('results.csv', index=False)

Posted in Dynamics 365

Calling external web services from CRM Online Sandboxed plugin

I have seen this question many times: can you call external endpoints from within a plugin running inside the sandbox of Dynamics CRM Online?

Recently I was riddled with the same situation where the sandbox did not allow me to call an external endpoint.

On a positive note, I was able to overcome this issue with a little tweak, and I thought it might be useful to share it with the community.



Say we need to call a JSON-based web service from within a CRM plugin.


Code that would not work

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", wsKey);
client.BaseAddress = new Uri("<your ws url>");
// Say jsonBody is a typed object
HttpResponseMessage response = await client.PostAsJsonAsync("", jsonBody);

if (response.IsSuccessStatusCode)
{
    string result = await response.Content.ReadAsStringAsync();
    var typedResult = JsonConvert.DeserializeObject<Results>(result);
}


Modified Code that will work

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", wsKey);
client.BaseAddress = new Uri("<your ws url>");
// Rather than using a typed object, construct the JSON payload manually as a string
string jsonBody = "{\"Inputs\": {\"input1\": {\"ColumnNames\": [\"AnnualReview\",\"Category\"],........";

// Rather than using PostAsJsonAsync, use PostAsync
HttpResponseMessage response = await client
    .PostAsync("", new StringContent(jsonBody, Encoding.UTF8, "application/json"));

if (response.IsSuccessStatusCode)
{
    string result = await response.Content.ReadAsStringAsync();
    // Rather than using DeserializeObject, parse the JSON string manually
    var parsingResp = result.ParseWSResponse();
}



In a nutshell, my experience has been that you can call external services as long as you stick to the base .NET classes that come packaged with the framework out of the box.

Posted in IoT, Machine Learning

Azure IoT Hub Streaming Analytics Simulator

Azure IoT Hub Streaming Analytics Simulator is an application written by Manny Grewal. The purpose of this blog is to explain the what, why and how of this application.



Streaming analytics is a growing trend that deals with analysing data in real time. Real-time data streams have a short life span; their relevance decreases with time, so they demand quick analysis and rapid action.

Some areas where such applications are highly useful include data streams emitted by:

  • Data centres, to detect intrusions and outages
  • Factory production lines, to detect wear and tear of the machinery
  • Transactions and phone calls, to detect fraud
  • Time-series analysis
  • Anomaly detection


Data used by streaming analytics applications is temporal in nature, i.e. it is based on short intervals of time. What is happening at interval Tx can be influenced by what happened 2 minutes ago, i.e. at interval Tx-2.

So the relationships between the various events are time-based rather than entity-based (as in typical relational database systems).

Take the scenario of a Data Centre which has two sensors that emit a couple of data streams – Fan Speed of the server hardware and its temperature.

If the temperature reading of server hardware is going up, it could be related to a dwindling fan speed reading. We need to look at both readings over an interval of time to establish a hypothesis on their correlation.
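One simple way to test that hypothesis over a window of readings is a plain correlation coefficient. The sketch below uses made-up readings (one per minute over a 10-minute window, not from any real sensor):

```python
def pearson(xs, ys):
    # Plain Pearson correlation coefficient over one window of readings
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    ss_x = sum((x - mean_x) ** 2 for x in xs)
    ss_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / ((ss_x * ss_y) ** 0.5)

fan_speed   = [5000, 4900, 4700, 4400, 4000, 3500, 3000, 2600, 2300, 2100]  # RPM
temperature = [40.0, 40.5, 41.2, 42.5, 44.0, 46.5, 49.0, 51.0, 52.5, 53.5]  # Celsius

r = pearson(fan_speed, temperature)
# A strongly negative r supports the hypothesis that the temperature
# rises as the fan slows down.
```

In a real streaming job this calculation would run repeatedly over a sliding time window rather than once over a fixed list.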




In order to model and work with streaming analytics, it is important to have an event generator that can generate data streams in a time-series fashion.

Some examples of such generators are vehicle sensors, IoT devices, medical devices, transaction systems, etc. that generate data quickly.

The purpose of this application is to simulate the data generated by those devices; it helps you set up quickly and start modelling some data for your IoT experiments.



Main benefits of this app

1. Integrated with Azure IoT Hub, i.e. the messages emitted by this application are sent to the Azure IoT Hub and can be leveraged by the intelligence and big data ecosystem of Azure.

2. This app comes with 4 preset sensors

a. Temperature/Humidity

b. Air Quality

c. Water Pollution

d. Phone call simulator

3. Configure and go. The app can easily be pointed to your Azure instance and start sending messages to your Azure IoT Hub.

4. Can be extended if you are handy with .NET development. I have designed the app on S.O.L.I.D. principles so it can be extended and customised; the link to the source code is below.




The app and source code can be downloaded from my GitHub.


A quick tour of the app is below

IoT Hub




The app needs to be configured with details of your Azure IoT Hub account.

The following configuration steps are needed:

1. App.Config

2. If you are registering devices in the Hub, the keys for the devices need to be stored in SensorBuilder.cs

3. You may need to restore the Nuget Packages to build the application


Once the above three steps have been completed, you can build the application and its EXE will be generated.


Sensor Tuning

Sensors can be tuned from the classes inheriting IDataPoint, e.g. in FloatDataPoint.cs

The following properties can be used to tune the sensors

  • MinValue – the minimum value of the sensor reading, e.g. for climatic temperature it can be -40C
  • MaxValue – the maximum value of the sensor reading, e.g. for climatic temperature it can be 55C
  • CommonValue – the average value of the sensor, e.g. for warmer months it can be 30C
  • FluctuationPercentage – how much variance you want in the generated data
  • AlertThresholdPercentage – when an alert should be generated, i.e. once the reading passes a certain threshold, e.g. 80% of the maximum value
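The actual generation logic lives in FloatDataPoint.cs (C#). As a rough Python illustration of how these tuning properties could interact (the class and formula here are my assumption for illustration, not the app's actual code):

```python
import random

# Hypothetical re-implementation of a tunable sensor; property names
# mirror the tuning table above.
class SimulatedSensor:
    def __init__(self, min_value, max_value, common_value,
                 fluctuation_percentage, alert_threshold_percentage):
        self.min_value = min_value
        self.max_value = max_value
        self.common_value = common_value
        self.fluctuation_percentage = fluctuation_percentage
        self.alert_threshold_percentage = alert_threshold_percentage

    def next_reading(self):
        # Fluctuate around the common value by the configured percentage
        span = (self.max_value - self.min_value) * self.fluctuation_percentage / 100.0
        value = self.common_value + random.uniform(-span, span)
        value = max(self.min_value, min(self.max_value, value))  # clamp to range
        # Raise an alert once the reading crosses the threshold
        alert = value >= self.max_value * self.alert_threshold_percentage / 100.0
        return value, alert

# Climatic temperature sensor using the example values from the table
sensor = SimulatedSensor(-40, 55, 30, 10, 80)
reading, alert = sensor.next_reading()
```

Every generated reading stays inside the MinValue/MaxValue bounds, and the alert flag fires only when a reading exceeds 80% of the maximum.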


Azure IoT Hub

The messages sent by the sensor simulator can be accessed in the Azure IoT Hub. Once you have configured your hub and the related streaming jobs, the messages can be seen in the dashboard as below.



The messages are sent in JSON format; below is the structure of one of the messages emitted by a sensor located at Berwick, VIC.

"IncludeSensorHeader": 1,

"MessageId": "949a3618-c4a4-42bc-9c2a-39da86aa9191",

"EmittedOn": "2017-06-30T11:13:45.3543200",

"SensorDataHeader": {
"Readings": [