Friday, September 30, 2011

Have you ever thought through how MDM in a multinational company differs from MDM in a purely nationally focused company?


Do you remember my blog post How to organise for MDM? There I described how you should organize your MDM activities in principle. If you are working for a big multinational company, you might have thought that this is by far too simple to solve your challenges.

And you are right!

In a multinational company you face, on top of the generic MDM tasks, additional challenges which you have to address in your organization, your processes and your IT infrastructure (especially in your metadata model).

Even a first look at the product data of a multinational company reveals very serious challenges for MDM.

Global, regional and local products: In a multinational environment you typically find global products which are sold in and relevant to all regions or countries. But you also find products which are region specific (e.g. only for the European or Asian market) and very local products which might be available in and targeted at only one country. Typically you want your local organizations to be able to list and sell the local products, while the global products are more or less defined by a central business unit.

Global, regional and local elements in global products: If you have global products which are available in different regional or even local markets, you typically have some product information which is really only targeted at local requirements. A good example is the “green dot” information required in Germany (which, by the way, has become obsolete again due to changed legislation). This information was only relevant if you wanted to sell a product into Germany. But even if it was the same product globally, for Germany you specifically had to add the information whether the product carried the “green dot” certificate or not.

Multi-branded products: Multiple brandings can occur for the same product already on the industry side, but it is even more common in retail, where two retail brands owned by one company share a big part of their assortment. When presenting to the market, however, each brand typically wants to describe the same product in its own way to clearly differentiate itself from the other.
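
To make the metadata model implications more concrete, here is a minimal sketch in Python of how a single product record could carry global, regional, local and brand-specific values side by side. All attribute names, scopes and brands are made-up illustrations, not a reference to any real data model:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass(frozen=True)
    class AttributeValue:
        """One value of a product attribute, qualified by market scope and brand."""
        name: str
        value: object
        scope: str = "GLOBAL"        # "GLOBAL", a region code like "EU", or a country code like "DE"
        brand: Optional[str] = None  # None = valid for every brand of the company

    @dataclass
    class Product:
        gtin: str
        attributes: list = field(default_factory=list)

        def view(self, country: str, region: str, brand: str) -> dict:
            """Resolve the attributes for one market and brand: the most specific value wins."""
            precedence = {"GLOBAL": 0, region: 1, country: 2}

            def rank(attr):
                # Country beats region beats global; a brand-specific value beats a generic one.
                return (precedence[attr.scope], attr.brand is not None)

            chosen = {}
            for attr in self.attributes:
                if attr.scope not in precedence:
                    continue                        # value not relevant in this market
                if attr.brand not in (None, brand):
                    continue                        # value belongs to a different brand
                best = chosen.get(attr.name)
                if best is None or rank(attr) >= rank(best):
                    chosen[attr.name] = attr
            return {name: a.value for name, a in chosen.items()}

    # The German view adds the local "green dot" flag on top of the global data,
    # and the second retail brand gets its own description for the same product.
    p = Product(gtin="04012345678901", attributes=[
        AttributeValue("description", "Chocolate bar 100g"),
        AttributeValue("description", "Premium chocolate bar 100g", brand="BrandB"),
        AttributeValue("green_dot", True, scope="DE"),
    ])
    print(p.view(country="DE", region="EU", brand="BrandA"))
    # {'description': 'Chocolate bar 100g', 'green_dot': True}
    print(p.view(country="DE", region="EU", brand="BrandB"))
    # {'description': 'Premium chocolate bar 100g', 'green_dot': True}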

How does all of this impact your organization and processes?

If you are a multinational company you will have a strategy which also covers the parameters “need for integration” and “need for local responsibility”. These two parameters define your internationalization strategy quite well. You have to apply the same thinking to your master data management.
In the end you have to set up a distributed organization for your MDM. A first sketch might look like this:

Wednesday, August 31, 2011

Leaving SA2 Worldsync …


I am leaving SA2 Worldsync. I have now worked for over six years in the area of master data management and global data synchronization with SA2 Worldsync and its predecessors.

And I think it was a great time and, especially, a great team to work with. And I think we achieved a lot. SA2 Worldsync has become one of the globally dominant data pool players and also one of the leading PIM vendors, at least in the retail market.

Thinking back, I started all this master data management stuff back in 1997 when founding Cataloom AG. There we successfully developed one of the first PIM (Product Information Management) / MDM (Master Data Management) systems.

And I still remember very vividly coding the first database schemas and graphical user interfaces (still looking very awkward) for cataloom 0.1 myself.

We were very quickly and successfully fulfilling the requirements of customers like AXA, BASF, Deutsche Bahn, Emaro (the famous e-procurement marketplace joint venture of SAP and Deutsche Bank), Siemens Medical and many others. It was a real fun time, working with a small but enthusiastic team, having all these great ideas and concepts which are still in the product and which are still the basis for its success.

In 2005 PIRONET NDH took over Cataloom. My personal journey into the world of GS1 and data synchronization started when PIRONET NDH also took over a majority stake in SINFOS, the predecessor of SA2 Worldsync.

Replacing the proprietary SINFOS technology with my previous Cataloom technology and building these great Webforms with online validations was one of my first responsibilities during that time.

Merging with the Agentrics data pool business unit was the next milestone. And this merger was mainly driven by our superior technology. Especially the usability of our Webforms is, I think, still leading-edge technology, and the team keeps developing and improving it.

After that merger I became responsible for working with our retail customers: understanding their master data management processes and helping them get those processes organized so they could also leverage the SA2 Worldsync services better.

I think in the last two to three years I have worked with most of the leading retailers globally (from Europe via Japan to the US) to understand their challenges in master data management and to work with them on improving their processes, their organization and sometimes also their IT infrastructure.

Right now SA2 Worldsync is not only supporting the world’s largest retailers and brand manufacturers but also some of the most advanced GS1 and GDSN communities, like GS1 UK and GS1 Australia, by providing our technology to them. And not to forget that the SA2 Worldsync WS|PIM product today is run by some of the largest retailers – and right now there are more to come.

I think SA2 Worldsync is now well established and has achieved a truly leading position in the master data management and global data synchronization business.

That makes it the right time for me to leave.

Why?

I am more the entrepreneurial kind of guy. I have to start up something new. Change the world :-)

I think Cataloom and SA2 Worldsync had some impact. At least the world changed a little bit. Product information and data synchronization processes became a little more electronic. But I think there is still a long way to go until all business transactions are electronic.

That is where I still see huge potential! So be prepared to see something new coming up which hopefully will really have some impact!

And in the meantime – if you have requirements regarding master data management consulting, just let me know. I help companies globally to put the organisation, processes and IT infrastructure in place to manage their master data more efficiently.

PS: Stay tuned - this blog will be continued, and now I might even have more time to share my thoughts on MDM and GDSN!

Thursday, August 25, 2011

How to organise for MDM?

Have you ever thought through how to build an organisation for master data management?

One of the key issues I keep finding at customers who are struggling with the quality of their master data is that their organisation is not really prepared to deal with master data in a sustainable manner.

In my view Gartner has done quite a good job in proposing a generic organisation for master data management. Actually this organisation implies that you are introducing a new business process - Master Data Management.

Please look at the following orgchart - this is how I remember the generic organisation Gartner is proposing:
The generic MDM organisation
One of the key ideas is that the business has to lead the MDM organisation. A quick description of the different roles & responsibilities:

  1. The Information Governance Board should consist of executive-level sponsors. They should set and enforce information management policies. In this board you should have representatives from the business - that is key - but you should also have representation from your IT executives.
  2. The MDM Team is responsible for managing the MDM program and is the team that authors and maintains the master data. If you are also implementing GDSN or any other means to collect item data from your suppliers, supplier onboarding and communication is also a key task for this team.
  3. Data Stewards sit in the business units and are responsible for data quality monitoring, improvement and issue resolution. They do NOT maintain the data themselves but work closely with the MDM Team to get the master data quality to the level the business needs.
  4. The MDM Infrastructure Team is a kind of virtual team responsible for all aspects of the IT implementation needed for the MDM business process.
  5. In the GDSN context I only want to highlight the importance of the Data Modeling and Information Architecture role. If you participate in the GDSN, this role is key because it is the link to the GSMP process - defining change requests and also anticipating and adopting the changes coming through GSMP / GDSN.
Please take this organisation template for what it is - only a template. In real life you have to look at your current organisation and see how you can fit the different roles and responsibilities into it.

But also be aware - you are introducing a new business process which you did not have in the past.

And if your organisation is multinational or even global: I will discuss in one of my next posts what impact that has on the MDM organisation.

Friday, June 10, 2011

Don't brands want to communicate transparently to the consumer?

Do you remember one of my earlier postings, Brand owners lose sales due to mobile scanning?

Actually my main message was that brands have to invest in MDM to be able to publish their product information in high quality to all those app providers, in order to have control over the branded product information that is publicly available.

I thought that it was mainly a technical and effort issue!

But I had to learn that there are other issues as well.

Over the last couple of days SA2 Worldsync's Annual User Congress 2011 took place, where we heavily promoted the new mobile century. We had great presentations from the mobile space (e.g. from Mirasense and from barcoo) which made it clear that there is massive consumer demand for extensive product information. And those app providers are fulfilling this demand - either with or without the support of the brand owners.

We got quite some good feedback from some major brand manufacturers on our activities to deliver trusted manufacturer product information to those platforms. See one of our press releases here.

But we also got quite some negative feedback from other major brand manufacturers.

Quote: "That is not in our interest."

It is not in your interest to communicate with your customers?
It is not in your interest to provide correct and honest information to your customers?
It is not in your interest to use the new media?

One argument I heard was: "We cannot give allergen information to the consumer via such platforms because then we could be held liable for it. And we could be sued if the data is incorrect and somebody thereby gets harmed."

But you are putting allergen information on the packaging and on your website.

Are your processes for putting information on the packaging or on your website so much more reliable than those for publishing it to the GDSN?

Or are there other reasons behind your reluctance to provide transparent information to the consumer?

Actually, I still don't get it. I would be happy to receive some feedback on this topic ...

Thursday, May 26, 2011

Even with GDSN - retailers' product master data is still crap!?

Just recently GS1 Belgium did a "Mini Data Crunch Survey 2011", surveying the impact of bad data on the Belgian grocery market. This survey stands in line with what GS1 UK started with their "GS1 UK Datacrunch Report 2009".

I think GS1 Belgium really put together a very nice report - and the results are - I would like to say "as expected" - very bad. The data between suppliers and retailers is significantly out of sync. And there is a huge cost to the market because of that.

What I liked most is that GS1 Belgium not only looked at suppliers and retailers who were not yet doing GDSN but also at suppliers and retailers who are heavy GDSN users. Surprisingly, looking at the figures they measured regarding "missing data" and "mismatches supplier-retailer", you really cannot tell who is doing GDSN and who is not. The percentage of incorrect data ranges from 49% to 67% - and keep in mind they only researched the data of 4 suppliers at 4 retailers, and even that with only 100 products:


I would have expected that the retailers who are using GDSN have significantly better data than those who are not using GDSN at all.

If that is not the case, what are the reasons for those issues?

GS1 Belgium does not directly address this subject, but indirectly they propose an answer with their analysis of what is happening behind the firewalls of suppliers and retailers.

Obviously suppliers and retailers still have to do a lot of manual work on both sides of the table, even if they have implemented an automated GDS process.

From my perspective the two most important recommendations of GS1 Belgium are:
  1. Use an automated exchange of data between suppliers and retailers. Here GS1 Belgium, for a good reason, refers to their GDSN data pool offering. But I think that is only one part of what is needed. The other part is that retailers have to implement internal automated integration processes to really integrate the data into all their business applications.

    Just recently I found out that a major German retailer who has been doing GDSN for quite some time is not integrating the GDSN data into their purchasing system!? How do they expect to benefit from GDSN for their order and invoice processes???
  2. Introduce a so-called "data team" on the suppliers' and on the retailers' side to be the one and only team maintaining the data.
They also propose a lot of other very reasonable actions, like establishing data ownership, data quality dashboards, etc.

But I really think you have to establish a dedicated organisation first, and then you have to integrate the data into your business applications in an automated way. Actually straightforward, isn't it?

But my experience is that suppliers and retailers really struggle at this point and typically do not implement this consistently ...

Let me repeat my mantra: suppliers and retailers simply have to implement an MDM program and make their GDSN initiative part of that MDM program. This is the only way to implement GDS successfully and to gain the business benefits from it.

Tuesday, May 24, 2011

Retailers: GDSN Changes - the productivity killer?

Lately I talked to one of the bigger European retailers who has recently fully adopted the GDSN processes to receive item data from his suppliers.

Meaningless changes!?

He was complaining that they were receiving so many changes and that he needed more staff to get this stream of changes under control, although he thought that most of those changes were not even meaningful to them!?

How can a change to an item from a supplier not be meaningful to a retailer? At first sight I really questioned this statement, and we started investigating what was happening.

First of all I sat down for a couple of days with the data stewards who are in charge of approving all changes, to see what was really happening.

50 - 1,000 changes / day

My first observation was that they really had to approve a huge number of changed items each day. They had to manage a minimum of 50 changes per day, but on one day there were more than 1,000 changes, which they were no longer able to approve with the existing staff that day, and changes began to pile up.

Up to 10 minutes for approval of a single item

My second observation was that it was quite painful to look at each and every item and figure out what the change was. Although they have implemented a quite fancy-looking comparison screen, it takes the data steward up to 10 minutes to review an item and decide whether to synchronize, review or reject it.

90% of changes meaningless

My third observation was the eye opener. Approximately 90% of the changes were meaningless for this retailer. How come? Actually, every retailer uses only a subset of the attributes available in the GDSN. Nobody uses all 1,600 available attributes. All retailers I have seen use between 100 and 250 attributes. But suppliers have to maintain the superset of all attributes required by their retail customers.

So what happens is that supplierA maintains Attr1 - Attr200, while retailerB uses Attr1 - Attr100 and retailerC uses Attr90 - Attr200.

If supplierA changes Attr1 of an item, this change gets sent automatically to retailerB and retailerC. But it is only relevant for retailerB. retailerC does not use Attr1, so the change is not relevant to him. But according to the GDSN rules he has to synchronize this version of the item to stay in sync with the supplier. Sending back a CIC review or reject would not make any sense, because that is just not what he wants to do.

Solution 1

What is the solution? The first approach is to evaluate automatically whether a change is relevant to the retailer and, if it is not, to automatically synchronize the item change (meaning: store the item and send back a CIC synchronized). Only if the change is relevant does the item go into the manual approval workflow.
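
A minimal sketch of this relevance check in Python (the attribute names and ranges are hypothetical, and the actual CIC messaging is left out):

    # Route an incoming GDSN item change based on relevance to this retailer.
    # RETAILER_ATTRIBUTES is the subset of GDSN attributes the retailer actually uses,
    # e.g. retailerC from the example above uses Attr90 - Attr200.
    RETAILER_ATTRIBUTES = {f"Attr{i}" for i in range(90, 201)}

    def changed_attributes(old_item: dict, new_item: dict) -> set:
        """All attribute names whose value differs between stored and incoming version."""
        keys = set(old_item) | set(new_item)
        return {k for k in keys if old_item.get(k) != new_item.get(k)}

    def process_change(old_item: dict, new_item: dict) -> str:
        """Auto-synchronize irrelevant changes; only relevant ones reach the data stewards."""
        if changed_attributes(old_item, new_item) & RETAILER_ATTRIBUTES:
            return "MANUAL_APPROVAL"    # a data steward decides: synchronize / review / reject
        # Not relevant here: store the new version and answer CIC "synchronized" automatically.
        return "AUTO_SYNCHRONIZE"

    # supplierA changes only Attr1 -> irrelevant for retailerC, no manual effort needed.
    print(process_change({"Attr1": "a", "Attr95": "x"}, {"Attr1": "b", "Attr95": "x"}))
    # AUTO_SYNCHRONIZE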

With this solution alone, this retailer could already save 50-60% of the effort needed in his approval process.

Solution 2 - and my recommendation

My recommendation is to take it one step further. Why do you have to look at each and every item change, even if it is relevant to you? Do you really think that your data steward can judge for all your items whether a slight change in the measurements, the ingredients, the packaging or wherever is correct, without pulling the actual product and re-measuring?

My experience is that the whole manual approval process is mostly a pure waste of time.

Instead, turn the process around. Define rules for when an item change cannot be automatically approved (e.g. if the description changes, or if measurements change by more than 10%). If an item change passes all those rules, it is synchronized automatically. Otherwise you put it into the manual approval process.
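
Staying with hypothetical attribute names and thresholds, such a rule set could look like this sketch; every relevant change is auto-approved unless one of the blocking rules fires:

    # Rules that block automatic approval of an item change (illustrative only).
    def description_changed(old: dict, new: dict) -> bool:
        return old.get("description") != new.get("description")

    def measurement_out_of_tolerance(old: dict, new: dict, tolerance: float = 0.10) -> bool:
        """True if any measurement attribute changed by more than the tolerance (here 10%)."""
        for attr in ("height", "width", "depth", "grossWeight"):
            before, after = old.get(attr), new.get(attr)
            if before and after and abs(after - before) / before > tolerance:
                return True
        return False

    BLOCKING_RULES = [description_changed, measurement_out_of_tolerance]

    def approve_automatically(old: dict, new: dict) -> bool:
        """Synchronize automatically unless at least one blocking rule fires."""
        return not any(rule(old, new) for rule in BLOCKING_RULES)

    # A 5% height change is within tolerance -> synchronized without manual approval.
    old = {"description": "Chocolate bar 100g", "height": 20.0}
    new = {"description": "Chocolate bar 100g", "height": 21.0}
    print(approve_automatically(old, new))   # True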

And then measure your ratio during manual approval between sending a review or reject to the supplier and sending a synchronized. If more than 50% get an immediate synchronized, your approval rules are still too strict and you should relax them further.

Now you should be down to 10% or even less of the original effort.

Btw., dealing with ADDs is typically a little more complicated, because you typically have a manual enrichment process ...


UPDATE: 
There is a very interesting discussion on LinkedIn regarding this blog post, see here.

Saturday, May 7, 2011

What are the core components of an MDM system?

Actually we talk a lot about MDM systems – but hey - are you aware of what a full-blown MDM system should provide?

In my opinion the following modules are essential to an MDM system:
  • A very flexible Data Model. This is key and actually the core. You want to manage master data, so you have to be able to put the data model which meets your requirements into the system.

    For me, managing the data model is NOT a development task but a pure customization task which should be handled by the people who operate and administrate the MDM system. In the best case the MDM Team itself is able to adjust the data model to its needs.

    The flexible Data Model should be accompanied by a validation rules engine. When defining your data model, you also want to define the data quality rules your master data should comply with. And those data quality rules are typically more complex than just a primitive type check or a check for mandatory attributes (see the sketch after this list).
  • A Content or, maybe better, “Item” Management Module which actually allows the different departments to manage (enter and maintain) the master data itself. And man – do not mix that module up with a web content management system (CMS); that is something completely different.

    This module should be fully driven by the data model (meaning, again, no programming if you add an attribute) and should allow customizing different views for different users or user groups.

    It should also be integrated with some workflow engine, because you might want to establish editorial and approval workflows for master data management.

  • A Data Quality Module. This module should provide the capabilities to monitor the data quality you have in your system. It should be integrated with your validation rules engine so that you can report on which items do not comply with your rules, and the like.

    It is also great if the MDM system already provides simple data quality analysis functionality, like pattern analysis or fill-rate analysis, independent of your defined data quality validation rules.

  • A Data Integration Module. This is essential because you want to integrate your master data into the various other systems you have. What you need is batch upload and download functionality, plus real-time integration capabilities, e.g. via an Enterprise Service Bus.

  • A Workflow Engine should be part of the MDM system. Just as data models differ from company to company, so do the editorial and approval workflows. A workflow engine gives you great flexibility to model your processes into the system, and it means you do not have to adjust yourself to the processes predefined within the tool.

  • Webservices should be supported for integration purposes. As master data is needed in most areas of the business, having public webservices often helps to solve integration requirements very quickly.

  • For sure you have to check that Technology and Architecture fit into your company policies.
  • Performance, Scalability and Availability are also key requirements for a business-critical application like a central MDM system.
  • You should also look at operating and security requirements and whether the system meets them.
  • And last but not least, an MDM system should also provide at least a minimum of reporting capabilities.
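
To illustrate what I mean by validation rules beyond type and mandatory checks, here is a minimal sketch in Python of a declarative rule set which the MDM Team could maintain as configuration rather than code. The attribute names and rules are made up:

    # Each rule: a name, a predicate over the whole item, and an error message.
    RULES = [
        ("gtin_present",
         lambda item: bool(item.get("gtin")),
         "GTIN is mandatory"),
        ("gtin_length",
         lambda item: len(item.get("gtin", "")) == 14,
         "GTIN must have 14 digits"),
        # A cross-attribute rule: net weight may not exceed gross weight.
        ("net_vs_gross",
         lambda item: item.get("netWeight", 0) <= item.get("grossWeight", 0),
         "Net weight exceeds gross weight"),
    ]

    def validate(item: dict) -> list:
        """Return the messages of all violated rules; an empty list means compliant."""
        return [msg for name, check, msg in RULES if not check(item)]

    print(validate({"gtin": "04012345678901", "netWeight": 95, "grossWeight": 100}))
    # []
    print(validate({"gtin": "123", "netWeight": 110, "grossWeight": 100}))
    # ['GTIN must have 14 digits', 'Net weight exceeds gross weight']
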
And as you know, my focus is always not only on MDM but also on how MDM connects to and enables GDSN. So if your requirement is that your MDM system should also support your GDSN initiative, what is required then?

  1. Best would be if your MDM vendor already provides you with an out-of-the-box GDSN connector which is certified for – or at least known to work with – your favorite data pool. But as there are not that many MDM solutions out there with out-of-the-box GDSN connectors, you should look at option 2.

  2. To support GDSN with your MDM system you need:
    1. Your data model has to be aligned with the GDSN data model. Either you have your own model and map it to the GDSN data model (be aware of codelist issues, attribute type issues and the like - see the mapping sketch below), or you directly adopt the standards and use them internally as well.
    2. You have to build workflows which support the item confirmation process.
    3. Your MDM system's integration capabilities should support the data formats and protocols your data pool supports.
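
To give a feeling for the codelist and attribute type issues mentioned in point 1, here is a last minimal sketch. The internal field names, the GDSN attribute names and the mapping itself are illustrative only:

    # Codelist issue: internal unit codes differ from the codelist values used in GDSN.
    UNIT_CODELIST = {"kg": "KGM", "g": "GRM", "pc": "H87"}

    # Attribute name mapping: internal field -> GDSN attribute.
    ATTRIBUTE_MAP = {"ean": "gtin", "name": "descriptionShort", "weight": "grossWeight"}

    def to_gdsn(internal_item: dict) -> dict:
        gdsn_item = {target: internal_item[src]
                     for src, target in ATTRIBUTE_MAP.items() if src in internal_item}
        # Attribute type issue: internally the weight is a string like "0.105 kg",
        # while the target model expects a numeric value plus a unit code from the codelist.
        if "grossWeight" in gdsn_item:
            value, unit = gdsn_item["grossWeight"].split()
            gdsn_item["grossWeight"] = {"value": float(value), "unitCode": UNIT_CODELIST[unit]}
        return gdsn_item

    print(to_gdsn({"ean": "04012345678901", "name": "Chocolate bar", "weight": "0.105 kg"}))
    # {'gtin': '04012345678901', 'descriptionShort': 'Chocolate bar',
    #  'grossWeight': {'value': 0.105, 'unitCode': 'KGM'}}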

I hope this helps you to structure your evaluation process for an MDM system!