Thursday, May 26, 2011

Even with GDSN - retailers' product master data is still crap!?

Just recently GS1 Belgium did a "Mini Data Crunch Survey 2011" in which they examined the impact of bad data on the Belgian grocery market. This survey stands in line with what GS1 UK started with their "GS1 UK Datacrunch Report 2009".

I think GS1 Belgium really put together a very nice report - and the results are - I would like to say "as expected" - very bad. The data between suppliers and retailers is significantly out of sync. And there is a huge cost to the market for that.

What I liked most is that GS1 Belgium looked not only at suppliers and retailers who were not yet doing GDSN, but also at suppliers and retailers who are heavy GDSN users. Surprisingly, looking at the figures they measured for "missing data" and "mismatches supplier-retailer", you really cannot tell who is doing GDSN and who is not. The percentage of incorrect data ranges from 49% to 67% - and keep in mind they only examined the data of 4 suppliers at 4 retailers, and even that with only 100 products:


I would have expected that the retailers who are using GDSN have significantly better data than those who are not using GDSN at all.

If that is not the case, what are the reasons for those issues?

GS1 Belgium does not directly address this question, but indirectly they propose an answer with their analysis of what is happening behind the firewalls of suppliers and retailers.

Obviously suppliers and retailers still have to do a lot of manual work on both sides of the table even if they have implemented an automated GDS process.

From my perspective the two most important recommendations of GS1 Belgium are:
  1. Use an automated exchange of data between suppliers and retailers. Here GS1 Belgium, for good reason, refers to their GDSN data pool offering. But I think that is only one part of what is needed. The other part is that retailers have to implement internal automated integration processes to really integrate the data into all their business applications.

    Just recently I found out that a major German retailer, who has been doing GDSN for quite some time, is not integrating the GDSN data into their purchasing system!? How do they expect to benefit from GDSN in their order and invoice processes???
  2. Introduce a so-called "data team" on both the supplier's and the retailer's side as the one and only team that maintains the data.
They also propose a lot of other very reasonable actions, such as establishing data ownership, data quality dashboards, etc.

But I really think you have to establish a dedicated organisation and then integrate the data into your business applications in an automated way. Actually quite straightforward, isn't it?

But my experience is that suppliers and retailers are really struggling at this point and typically do not implement this consistently ...

Let me repeat my mantra: Suppliers and Retailers simply have to implement an MDM program and make their GDSN initiative part of that MDM program. This is the only way to implement GDS successfully and gain the business benefits from it.

Tuesday, May 24, 2011

Retailers: GDSN Changes - the productivity killer?

Lately I talked to one of the bigger European retailers who has recently fully adopted the GDSN processes to receive item data from its suppliers.

Meaningless changes!?

He was complaining that they were receiving so many changes that he needed more staff to get this stream of changes under control, although he thought that most of those changes were not meaningful to them!?

How can a change of an item from a supplier be not meaningful to a retailer? At first sight I really questioned this statement and we started investigating what was happening.

First of all I sat down for a couple of days with the data stewards who are in charge of approving all changes, to really see what was happening.

50 - 1,000 changes / day

My first observation was that they really had to approve a huge number of changed items each day. They had to manage a minimum of 50 changes per day, but on one day there were more than 1,000 changes, which the existing staff could no longer approve within that day - and changes began to pile up.

Up to 10 minutes for approval of a single item

My second observation was that it was quite painful to look at each and every item and figure out what the change was. Although they have implemented a quite fancy-looking comparison screen, it takes the data steward up to 10 minutes to review an item and decide whether to synchronize, review or reject it.

90% of changes meaningless

My third observation was the eye opener: approximately 90% of the changes were meaningless for this retailer. How come? Every retailer uses only a subset of the attributes available in the GDSN. None uses all 1,600 available attributes; all retailers I have seen use between 100 and 250. But suppliers have to maintain the superset of all attributes required by their retail customers.

So what is happening is that supplierA is maintaining Attr1 - Attr200, while retailerB is using Attr1 - Attr100 and retailerC is using Attr90 - Attr200.

If supplierA changes Attr1 of an item, this change gets sent automatically to retailerB and retailerC. But it is only relevant for retailerB. retailerC is not using Attr1, so the change is not relevant to them. Yet according to the GDSN rules they have to synchronize this version of the item to stay synchronized with the supplier. Sending back a CIC 'review' or 'reject' would not make any sense, because that is just not what they want to do.

Solution 1

What is the solution? The first approach is to automatically evaluate whether a change is relevant to a retailer and, if it is not, to automatically synchronize the item change (meaning store the item and send back a CIC 'synchronized'). Only if the change is relevant should the item be put into the manual approval workflow.

With this solution, this retailer could already save 50-60% of the effort needed in the approval process.
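
To make this more concrete, here is a minimal sketch of such a relevance filter. The attribute names, the retailer profile and the two helper functions (item store, CIC call) are my own illustrative assumptions, not part of the GDSN standard or any specific product.

```python
# Sketch of Solution 1: auto-synchronize changes that only touch attributes
# this retailer does not use. All names here are illustrative assumptions.

# Attributes this retailer actually uses, e.g. Attr1 - Attr100
RETAILER_USED_ATTRIBUTES = {f"Attr{i}" for i in range(1, 101)}

def store_item(item: dict) -> None:
    """Stub: persist the new item version in the retailer's item store."""
    pass

def send_cic(gtin: str, state: str) -> None:
    """Stub: send a Catalogue Item Confirmation with the given state to the data pool."""
    pass

def changed_attributes(old_item: dict, new_item: dict) -> set:
    """Names of all attributes whose value differs between the two versions."""
    keys = set(old_item) | set(new_item)
    return {k for k in keys if old_item.get(k) != new_item.get(k)}

def handle_item_change(gtin: str, old_item: dict, new_item: dict) -> str:
    """Auto-synchronize if no changed attribute is actually used by this retailer."""
    if not (changed_attributes(old_item, new_item) & RETAILER_USED_ATTRIBUTES):
        store_item(new_item)            # keep the new version to stay in sync
        send_cic(gtin, "SYNCHRONISED")  # confirm without bothering a data steward
        return "auto-synchronized"
    return "manual approval"            # only relevant changes reach the approval workflow

# supplierA changes only Attr150, which this retailer does not use -> auto-sync
print(handle_item_change("04012345678901", {"Attr150": "10 g"}, {"Attr150": "12 g"}))
```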

Solution 2 - and my recommendation

My recommendation is to take it even one step further. Why do you have to look at each and every item change, even if it is relevant to you? Do you really think that your data steward can judge for all your items whether a slight change in the measurements, the ingredients, the packaging or wherever is correct, without pulling the actual product and remeasuring it?

My experience is that the whole manual approval process is mostly a pure waste of time.

Instead, turn the process around. Define rules for when an item change cannot be automatically approved (e.g. if the description changes, if measurements change by more than 10%, etc.). If an item change passes all those rules, it should be synchronized automatically. Otherwise you put it into the manual approval process.
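
As an illustration, here is a small sketch of what such "cannot auto-approve" rules could look like. The 10% threshold comes from the example above; the attribute names and the rule structure are assumptions for illustration only.

```python
# Sketch of Solution 2: rules that define when a change must NOT be auto-approved.
# Attribute names and thresholds are illustrative assumptions.

def description_changed(old: dict, new: dict) -> bool:
    return old.get("tradeItemDescription") != new.get("tradeItemDescription")

def measurement_changed_more_than(old: dict, new: dict, field: str, pct: float) -> bool:
    o, n = old.get(field), new.get(field)
    if o in (None, 0) or n is None:
        return o != n                    # missing or zero values: treat as suspicious
    return abs(n - o) / abs(o) > pct

# Each rule returns True if the change needs a human look
BLOCKING_RULES = [
    description_changed,
    lambda o, n: measurement_changed_more_than(o, n, "grossWeightGrams", 0.10),
    lambda o, n: measurement_changed_more_than(o, n, "heightMillimetres", 0.10),
]

def needs_manual_approval(old_item: dict, new_item: dict) -> bool:
    """Synchronize automatically unless one of the blocking rules fires."""
    return any(rule(old_item, new_item) for rule in BLOCKING_RULES)

# A 4% weight change with an unchanged description sails through automatically
print(needs_manual_approval(
    {"grossWeightGrams": 500, "tradeItemDescription": "Muesli 500g"},
    {"grossWeightGrams": 520, "tradeItemDescription": "Muesli 500g"}))
```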

And then measure your ratio during manual approval between sending a 'review' or 'reject' back to the supplier and sending a 'synchronized'. If more than 50% of items get an immediate 'synchronized', your approval rules are still too strict and you should relax them further.

Now you should be down to 10% or even less of the original effort.

Btw. dealing with ADDs is typically a little more complicated, because you typically have a manual enrichment process ...


UPDATE: 
There is a very interesting discussion on LinkedIn regarding this blog post, see here.

Saturday, May 7, 2011

What are the core components of an MDM system?

Actually we are talking a lot about MDM systems - but hey - are you aware of what a full-blown MDM system should provide?

In my opinion the following modules are essential to an MDM system:
  • A very flexible Data Model. This is key and actually the core. You want to manage master data, and you have to be able to put a data model into the system that meets your requirements.

    For me managing the data model is NOT a development task but a pure customization task which should be taken over by the people who are operating and administering the MDM system. In the best case the MDM team itself is able to adjust the data model to its needs.

    The flexible Data Model should be accompanied by a validation rules engine. When defining your data model, you also want to define the data quality rules your master data should comply with. And those data quality rules are typically more complex than just a primitive type check or a check for mandatory attributes (see the rule sketch after this list).
  • A Content or maybe better "Item" Management Module which allows the different departments to actually manage (enter and maintain) the master data itself. And man - do not mix that module up with a web content management system (CMS); that is something completely different.

    This module should be fully driven by the data model (meaning, again, no programming if you add an attribute) and allow you to customize different views for different users or user groups.

    It should also be integrated with a workflow engine, because you might want to establish editorial and approval workflows for master data management.

  • A Data Quality Module. This module should provide you with the capabilities to monitor the data quality in your system. It should be integrated with your validation rules engine so that you can report on which items do not comply with your rules, and the like.

    It is also great if the MDM system already provides simple data quality analysis functionality like pattern analysis or fill-rate analysis, independent of your defined data quality validation rules.

  • A Data Integration Module. This is essential because you want to integrate your master data into the diverse other systems you have. What you need is batch upload and download functionality, plus real-time integration capabilities, e.g. via an Enterprise Service Bus.

  • A Workflow Engine should be part of the MDM system. Just as data models differ from company to company, so do the editorial and approval workflows. A workflow engine gives you great flexibility to model your processes into the system, so that you do not have to adjust yourself to the processes defined within the tool.

  • Webservices should be supported for integration purposes. As master data is needed in most areas of the business, having public webservices often helps to solve integration requirements very quickly.

  • For sure you have to look at Technology and Architecture to ensure the system fits your company policies.
  • Performance, Scalability and Availability are also key requirements for a business-critical application like a central MDM system.
  • You should also look at operating and security requirements and whether the system meets them.
  • And last but not least, an MDM system should also provide at least a minimum of reporting capabilities.
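
To illustrate the kind of validation rule I mean in the "flexible Data Model" bullet above - more than a type check or a mandatory-attribute check - here is a small sketch. The attribute names and the rule itself are assumptions for illustration, not the API of any specific MDM product.

```python
# Sketch of a cross-attribute validation rule, the kind a validation rules
# engine should let the MDM team configure without programming.
# Attribute names and the rule are illustrative assumptions.

def net_content_consistent(item: dict) -> list:
    """An item marked as consumer unit must carry a net content and a unit of measure."""
    errors = []
    if item.get("isConsumerUnit") and not item.get("netContent"):
        errors.append("Consumer unit without netContent")
    if item.get("netContent") and not item.get("netContentUom"):
        errors.append("netContent given without a unit of measure")
    return errors

def validate(item: dict, rules) -> list:
    """Run all configured rules and collect the violations for reporting."""
    return [err for rule in rules for err in rule(item)]

rules = [net_content_consistent]
print(validate({"isConsumerUnit": True, "netContent": 500}, rules))
# -> ['netContent given without a unit of measure']
```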
And as you know, my focus is always not only MDM but also how MDM connects to and enables GDSN. So if your requirement is that your MDM system should also support your GDSN initiative, what is required?

  1. Best would be if your MDM vendor already provides an out-of-the-box GDSN connector which is certified for - or at least known to work with - your favorite data pool. But as there are not that many MDM solutions out there with GDSN connectors out of the box, you should look at option 2.

  2. To support GDSN with your MDM you need:
    1. Your data model has to be aligned with the GDSN data model. Either you have your own data model and map it to the GDSN data model (be aware of code list issues, attribute type issues and the like; see the mapping sketch below this list), or you directly adopt the standard and use it internally as well.
    2. You have to build workflows which support the item confirmation process.
    3. Your MDM system's integration capabilities should support the data formats and protocols your data pool supports.
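
A minimal sketch of point 2.1: mapping internal attribute names and an internal code list to GDSN equivalents. The internal names and target attribute names shown here are illustrative assumptions; always check the actual GDSN attribute definitions and code lists of your data pool.

```python
# Sketch for point 2.1: mapping internal attributes and code lists to GDSN.
# All names and code values are illustrative assumptions, not the real code lists.

ATTRIBUTE_MAP = {
    "article_description": "tradeItemDescription",
    "net_weight_g": "netWeight",           # watch out for type issues (int vs. measurement)
    "unit_of_measure": "measurementUnitCode",
}

CODELIST_MAP = {
    "unit_of_measure": {"g": "GRM", "kg": "KGM"},   # internal code -> assumed GDSN/UN code
}

def to_gdsn(internal_item: dict) -> dict:
    """Translate an internal item record into GDSN attribute names and codes."""
    gdsn_item = {}
    for name, value in internal_item.items():
        if name not in ATTRIBUTE_MAP:
            continue                               # attribute not relevant for GDSN
        if name in CODELIST_MAP:
            value = CODELIST_MAP[name][value]      # translate the internal code value
        gdsn_item[ATTRIBUTE_MAP[name]] = value
    return gdsn_item

print(to_gdsn({"article_description": "Muesli 500g",
               "net_weight_g": 500,
               "unit_of_measure": "g"}))
```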

I hope this helps you structure your evaluation process for an MDM system!