Wednesday, December 28, 2011

9 Predictions on GDSN and MDM for 2012


Inspired by the many predictions for 2012 you can find right now around the web, I thought I would write down my own assumptions and predictions for 2012. Maybe it is a little unstructured, but this is how it came to my mind.

I hope you will enjoy them a little bit and maybe enrich them with your own thoughts and comments!

Here we go:

1. GDSN will overcome the national Data Synchronization standards and become the real global Datasync standard
Especially in Europe, data synchronization started long before the rise of the GDSN – and was already successfully implemented before it. With the rise of the GDSN and its adoption by the large, global manufacturers, the European communities were urged to build bridges from their national, proprietary data synchronization standards into the GDSN. This has happened over the last couple of years, for example in Germany, Sweden, France, Spain and other European countries. By now many of those communities have learned that using one single, global standard is really beneficial, and therefore they are in the process of switching from their national, proprietary programs to the global GDSN. One good example here is the German community, which has decided to switch from its former SINFOS standard to the GDSN. This initiative is led by GS1 Germany and backed by all major players in the German retail marketplace.

2. Master Data Management also gets implemented by European retailers
My observation is that MDM as a discipline was for quite some time much more widely adopted by US retailers than by European ones. Therefore European retailers had much more difficulty adopting automated data synchronization and getting rid of paper- or email-based new item forms and the manual processes attached to them.
What I am seeing now is that many European retailers are also looking into MDM as a discipline or a business process, and that many of them have already started their MDM program or are at least right now preparing to start one.


3. Retailers build their own portals to gather item master data on top of GDSN to have a free solution for suppliers and to collect data beyond the standards – and this will boost the usage of GDSN
GDSN on its own is not sufficient for retailers. The goal for retailers regarding data synchronization is to get the data of ALL their suppliers electronically, in a reasonable time frame and in an automated process, instead of getting only part of it through data synchronization and sticking with manual processes for the rest.
To overcome the chicken-and-egg problem (“only a ‘small’ number of suppliers are doing GDSN, therefore retailers are not implementing GDSN” versus “retailers are not implementing GDSN and therefore suppliers are not participating in the GDSN”), retailers are, in addition to implementing GDSN, increasingly implementing their own web portals where suppliers can maintain their item data manually at no charge. By offering a free option to suppliers, retailers really can mandate electronic delivery of item data – either via GDSN or via their own portal.
Why will this boost the usage of GDSN? Because retailers are implementing GDSN, and suppliers only get any benefit from electronic delivery of item data if they implement GDSN themselves instead of manually feeding the retailer portals.

4. E-Commerce in Europe becomes the driver for MDM and GDSN in retailers
E-Commerce is THE driver for electronic product information par excellence. This is how suppliers can advertise and promote their products and directly impact their sales through the different online channels. 
Electronic product information is the equivalent to the packaging in the store!
To manage the product information on the supplier and the retailer side, you absolutely need to implement MDM for product information; otherwise you will not be able to control the data.
But why will it also drive the adoption of GDSN? GDSN is an established infrastructure between suppliers and retailers for exchanging product information, and it is absolutely capable of carrying the more sales-oriented product information for e-commerce as well. Why try to establish something else?

5. B2C item information will become integral part of GDSN
As e-commerce is one of the key drivers, sales-oriented product information (aka ‘B2C item information’) has to become an integral part of GDSN.
Do we need ‘Modular Item’ for this? Probably not in the short term, because we could easily live with some more extensions, or even with using the ‘Specifics Technical Characteristics’ extension.
Key is to regard GDSN as the means to receive “trusted data” from suppliers.
Btw. what is your understanding of “B2C data”? Mine is: all data on a product that comes from the business and whose target audience is the consumer. This includes all product feature information, every kind of marketing information, manufacturer images and the like. It explicitly excludes all kinds of consumer-generated content (e.g. recommendations) and 3rd-party content (e.g. test reports etc.).

6. Brick-and-mortar retailers will learn from Amazon that the electronic supply chain is key for success
Working a lot with brick-and-mortar retailers, my observation is that those retailers have a lot of challenges adopting electronic processes for their key business processes, mainly because they have well-established (manual) processes and their business is up and running. It is obviously a huge change task for them (mentally, but also just because of the huge number of employees impacted) to switch from manual processes which have been working for decades to automated, electronic processes.
But as Amazon is by now very well established globally and is competing more and more with brick-and-mortar retailers, the latter really have to investigate how they can stay competitive. Full adoption of the electronic supply chain is one of the must-haves.

7. The GS1 system demands an integrated solution for supplier data, item master data and transactional data
You implement GDSN and then you start your first rollout / onboarding program. And you will have your first failure, because you learn that your supplier address data is too poor. This can be repaired only with a huge manual effort (contacting each and every supplier to collect their GLN, the correct contact person and probably some more address information).
But why is there no viable GS1 solution which provides supplier address information to retailers?
Each and every retailer faces the same challenge here. By the way, suppliers have a similar problem: where can they get all the DC and store addresses of their retailers?
As a retailer you first need correct supplier information, then you need the item information, and based on that you want to run the electronic business processes like orders, invoices, etc.
Why is there no integrated solution for this?
Will this solution emerge in 2012? Probably not. But the more the GDSN is used, the more people will recognize that some other puzzle pieces are still missing.

8. GEPIR will be abandoned
I have to admit – I am not seriously predicting that this service will be shut down. But I still have not understood the intention behind the GEPIR service. You could think that it is a wonderful service where you can find detail information for each and every GLN and GTIN GS1 has ever issued.
But the data quality there is even worse than what I have seen in most retailers’ databases. And that just does not help.
From my perspective, one of the key issues is the way GLNs and GTINs are distributed / sold: it is not mandatory to report what you are using a GLN or GTIN for.

9. GDSN at the tipping point - Not for profit vs. commercial services
The two largest GDSN datapools are meanwhile owned by the two largest GS1 organizations (SA2 Worldsync by GS1 Germany and 1sync by GS1 US). Many smaller GS1 organizations also offer GDSN services to their communities, either based on 3rd-party technology (many use either SA2 Worldsync or 1sync) or based on their own developments (e.g. GS1 Sweden or GS1 Hungary).
Where are the commercial players in the GDSN area? They have either been bought by GS1 organizations (1sync was formerly Transora, a commercial company; SA2 Worldsync was majority-owned by Pironet NDH, a public company) or they are slowly stepping out of this business, like GXS for example. In the US there are still some commercial companies engaged in the GDSN market, and there are also new players arising, like FSEnet lately, and others.
What does that mean for 2012? It will be very interesting to see how “not for profit” vs. “commercial” performs.
Regarding an integrated GS1 platform as described above, I would think that commercial vendors are in a better position, because GS1 in certain areas has to be careful not to create too much competition for its own solution providers. For example, GS1 will probably not be willing to enter the field of transactional data (EDI), because there are well-established, commercial offerings available.

So 2012 will become very interesting in the areas of MDM and GDSN!

Happy New Year!

Sunday, December 11, 2011

How to implement a Data Quality Management system ...


... for product master data!?

Have you ever thought it through? Have you ever tried to implement one? Actually, I was happy to be part of a team at a large retailer implementing a Data Quality Management system.

In principle it is very simple:
But the problems already start with the very first bubble, "Measure the current Data Quality". What is Data Quality? What are the KPIs (KPI = Key Performance Indicator) to measure Data Quality?

First of all you really have to define what Data Quality means to you. And this really depends on your business processes which use the product data and the requirements those business processes have.

In my case, you can probably guess very quickly that the core processes are purchasing and logistics – these were the main drivers at the retailer where I was working on the Data Quality Management system.

And thank goodness – our team was not the very first one that had to implement a Data Quality Management system for a retailer. We did a little research and very quickly found the "Data Quality Framework (DQF)", which was compiled by GS1 to help companies get ready for GDSN.

The DQF is very helpful for implementing a Data Quality Management system for product information if your product information is mainly targeted at supporting business processes in purchasing and logistics. If your requirements go beyond that, you have to extend it.

The DQF consists of KPIs, a guide on how to implement a Data Quality Management system, and a self-assessment procedure.

Most interesting to us were the KPI definitions:

  1. Overall item accuracy: Ok, this indicator is not that surprising - it is the percentage of items that have correct attribute values for the attributes in scope.
  2. Generic attribute accuracy: That is already more interesting. This is the percentage of items that have correct values for some more generic attributes. GS1 defines the following attributes to be in scope for this KPI:
    1. GTIN - that is the key for identification of an item, therefore it is really key ;-)
    2. Classification Category Code - As the classification code is relevant for reporting, it is a very important attribute within the retail industry.
    3. TradeItemDescription - to me this is really a difficult attribute in retail. At all retailers I have been to so far, the buyers always insisted that item descriptions are a means to differentiate from competitors and therefore have to be handcrafted at the retailer, or at least have to comply with the retailer's rules for how the description has to be built. Just as a side note - I think that is wrong and item descriptions in no way drive revenue, but I might be wrong here.
      Therefore we decided to leave that attribute out of our reporting.
    4. Net Content - is important for shelf tags and therefore one of the really important pieces of information.
  3. Dimension and weight accuracy: Depth, Width, Height and Gross Weight are the key attributes here. And those attributes are not only key for distribution centers but also for your transport/route planning and therefore have a very strong and immediate impact on logistics.
  4. Hierarchy accuracy: This is absolutely relevant because in different business processes you are using different units of the same item. E.g. you might order an item at pallet level, but your stores order it from your distribution center at case level or even at each level. If you do not have the packaging hierarchy correct, you are in serious trouble!
  5. Active / Orderable: You should not order item units which are not active at your supplier, or units which are simply not orderable. This immediately disrupts every automated, electronic process and therefore has to be avoided.
So with those KPIs you cover pretty much all the requirements from business processes in purchasing and logistics.

But the question now is: How to measure accuracy for those attributes?

A retailer has two approaches he can take:
  1. Compare the data to the data provided by the supplier.
  2. Do a self-assessment and go to your DC and really measure the physical products to gain that information.
In our project we are doing both. We have implemented a system which compares the supplier data to our data according to the above KPIs on an ongoing basis. As the supplier data provided through the data pool does not cover 100% of the business, we also calculate how much of the business is covered by this report.
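To make this concrete, here is a minimal sketch in Python of how such an ongoing comparison could compute an accuracy KPI plus the coverage figure. The attribute names, data structures and the 1% dimension tolerance are my own illustrative assumptions, not prescriptions from the DQF:

```python
# Sketch: compare retailer data against supplier (data pool) data and
# compute a DQF-style accuracy KPI plus business coverage. Attribute names
# and the 1% dimension tolerance are illustrative assumptions.

DIMENSION_ATTRS = {"depth", "width", "height", "grossWeight"}

def attribute_matches(attr, retailer_value, supplier_value, tolerance=0.01):
    """Missing on both sides counts as a match; dimensions match within a
    relative tolerance; all other attributes must be strictly equal."""
    if retailer_value is None and supplier_value is None:
        return True
    if attr in DIMENSION_ATTRS:
        try:
            return abs(float(retailer_value) - float(supplier_value)) \
                <= tolerance * abs(float(supplier_value))
        except (TypeError, ValueError):
            return False
    return retailer_value == supplier_value

def accuracy_kpi(retailer_items, supplier_items, attrs):
    """Both arguments map GTIN -> {attribute: value}.

    Returns (accuracy %, coverage %): accuracy over the items present on
    both sides, coverage as the share of retailer items the supplier
    data covers at all."""
    comparable = [g for g in retailer_items if g in supplier_items]
    correct = sum(
        1 for g in comparable
        if all(attribute_matches(a, retailer_items[g].get(a),
                                 supplier_items[g].get(a)) for a in attrs)
    )
    accuracy = 100.0 * correct / len(comparable) if comparable else 0.0
    coverage = 100.0 * len(comparable) / len(retailer_items) if retailer_items else 0.0
    return accuracy, coverage

ours = {"04012345678901": {"depth": 10.0, "grossWeight": 250.0}}
pool = {"04012345678901": {"depth": 10.05, "grossWeight": 250.0}}
print(accuracy_kpi(ours, pool, DIMENSION_ATTRS))  # -> (100.0, 100.0)
```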

On top of this we are doing a self assessment. The reason for this is mainly to figure out what quality the supplier data has.

From our experience, a Data Quality Management system based on the GS1 Data Quality Framework is a solid basis for managing your MDM program. It gives you the means to document and communicate the progress your MDM program achieves.

---
Update 12.12.2011:

You made it till the end of this post? ;-)
Ok, then I have some more fun stuff for you. I just stumbled over this quite old video from GS1 Netherlands on data quality. But I think it is still to the point, and at least it is fun to watch and listen to:

Saturday, December 3, 2011

Why GEPIR sucks ...

Have you ever tried to use GEPIR?

Lately I needed the address of a Distribution Center of one of the major German retailers. As usual I first tried Google, but there was no address to be found. I found a lot of information on the DC – press releases mentioning it and stuff like that – but no address.

Then I happily remembered GEPIR. GEPIR is a global service from GS1, which GS1 promotes as "a unique, internet-based service that gives access to basic contact information for companies that are members of GS1. These member companies use GS1's globally unique numbering system to identify their products, physical locations, or shipments".

And as you know, each and every retailer uses GLNs (Global Location Numbers – the GS1 numbering system to identify physical locations) to identify their Distribution Centers (DCs) as well.

Ok – I quickly went to http://www.gepir.de (you can also use the more international version http://www.gepir.org or any of the other nationally provided URLs; they all access the same data), typed in the name of the retailer – and was first very surprised and then quite disappointed.

There were only 17 hits and none of them was a Distribution Center!?

I quickly checked several of the major German retailers – EDEKA, Metro, REWE – and none of their Distribution Centers is listed by GEPIR.

And all of them have more than one thousand GLNs to identify their DC and store locations!

GS1 sells each and every GLN, and yet they are not able to provide the correct master data for their GLNs!???

How come?

Did I get their vision for GEPIR wrong? Is GEPIR not meant to provide all the master data behind the GS1 numbers? If that is not the vision of GEPIR, what else would its value be??

I think the vision of GEPIR is to provide all the master data identified by the GS1 identifiers. Regarding the GLN, I expect GEPIR to have all valid GLNs and the physical addresses they refer to. And I would also expect GEPIR to show what type of location that is. Is it a DC? Is it a store? Is it a location used for ordering? Or is it a location used for invoicing? But that is a second issue with GEPIR.

I think GS1 has a simple master data management issue, as many companies do. There is a vision of what GEPIR should be (see above), there might even be a strategy for how to achieve that vision (implement a globally shared database), but they made the typical mistakes you can see in many companies regarding master data management.

They did not establish any metrics. For example, they should measure the coverage of the GLNs within GEPIR compared to the GLNs sold. Then they would know that the coverage is in the best case only 50-60%, and that they have to take some measures to improve the situation.
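Such a coverage metric would be trivial to compute; a sketch with made-up figures:

```python
# Sketch: a GEPIR coverage KPI - the share of all sold GLNs that are
# actually resolvable in GEPIR with correct master data. Figures made up.
glns_sold = 1_000_000
glns_resolvable_in_gepir = 550_000
coverage = 100.0 * glns_resolvable_in_gepir / glns_sold
print(f"GEPIR GLN coverage: {coverage:.0f}%")  # -> 55%
```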

They did not establish any sustainable organisation and processes. Who is responsible for getting new GLNs into GEPIR? What about updates of addresses?

And finally, their IT infrastructure does not seem to be really tailored to best support their vision. I even heard that a couple of times GLNs from a single, local GS1 organisation were sold twice. What a mess. And what kind of IT infrastructure is not able to ensure the uniqueness of such a simple number as the GLN?

So GEPIR currently ends up like many MDM and PIM projects: there is a great tool, even with an iPhone app, but it really sucks because the master data is either not available or of such poor quality that you do not want to rely on it.

Care for the content! Care for your master data! Care for your master data quality!

Ok GS1, I really would recommend that you start a Master Data Management program, just as you expect your members to do data synchronisation. A possible starting point you can find here: What is a MDM program?

GEPIR is really out of sync right now!


----
Update 03.12.2011:
There are some really interesting comments on this post on LinkedIn, which you can find here.

Tuesday, October 4, 2011

Why is GDSN ignored by PIM vendors?

Lately I had a very intensive discussion on GDSN with one of the major PIM software producers. They were really interested to see whether GDSN could bring them any additional business and whether there were relevant customers in need of GDSN support in a PIM solution.

They got very detailed presentations on GS1 activities in the Retail & CPG and Healthcare spaces. They were also shown the actual GDSN growth figures, and actually everything was tried to sell them on the relevance of GDSN for their PIM business. They even talked to major retailers using GDSN about their experiences and advice, and also to some data pools.

Finally they took some time to evaluate what had been presented to them and drew some conclusions for their business. Actually I was quite surprised by their key conclusions:

  1. Datapools only have relevance in Europe and in the US. This really puzzled me, because at least Australia is one of the most advanced GDSN communities with the highest adoption rates. But looking at most of the other countries – even within Europe – GDSN is indeed not that widely adopted, and manual data exchange (even on paper) is still the most widespread way of data synchronisation, although the adoption rate is continuously growing, as you can see from the Global Registry statistics.

  2. The data is not maintained sufficiently and the data quality is considered to be very low.
    In this case it is really a two-sided issue. First, I think bad data quality in GDSN data pools is one of the worst myths of all. In all cases where I compared retailer product data with the data in a data pool, the retailer data was much worse than the data in the data pool. The data in the data pool actually was always in quite good shape. This is also supported by all recent studies, like the GS1 UK Datacrunch report or the report from GS1 Belgium.
    The second side of the issue is that their customers are mainly in the e-commerce space, and what you need there is the marketing and sales information on products – and that is today really not available in the GDSN.
    It really seems to be a challenge within the GSMP process to get a standard developed for marketing- and sales-relevant (or let's use the new hype term "B2C") data. Look, for example, at consumer electronics: the first hints of additional attributes and an additional standard I found date from 2008. And now, in 2011, we are getting calls to action to work on the requirements for consumer electronics ... To me that does not look convincing either ...

  3. Data enrichment is not covered within the GDSN
    What they meant by "data enrichment" is adding B2C data to an item if that data is not available directly from any internal source at the supplier.
    I think here they are going wrong. In my perception, the PIM of a retailer should offer a supplier portal where suppliers can enrich their data manually. Data pools like SA2 Worldsync are also starting to offer such functionality on top of their data pools.
    As GDSN is "only" a protocol between data pools (and in some respects between a data source or data recipient and its corresponding data pool), it does not deal with "manual" data enrichment processes by design.

  4. They consider GDSN only an infrastructure-side topic
    Here I think they are really making a big mistake. GDSN is not only about publishing data to a data recipient (retailer); it is also about communication with the retailer (think of the CIC messages, which allow retailers to send feedback back to suppliers).
    In my point of view, every serious PIM system has to support the full GDSN choreography. And this also means having specific UIs for the corresponding user roles, and being aware that there is a new type of user role which also has to be established in the customer's organisation.

  5. As they consider retail one of their major markets, they also consider supplier data onboarding for retailers one of their key tasks – but they do not consider GDSN one of their main means to do so
    As their customers are very much in the e-commerce space and deal a lot with sales- and marketing-oriented product information, this decision seems to be very much linked to topics 2 and 3 above, and to the fact that this type of data is just not available in the GDSN.
    Although I think that onboarding of supplier data always follows the same principle: it is independent of whether I need supplier logistics data or some feature information. Only the content differs.
What is my takeaway?

From my point of view, GDSN seems to be struggling with two challenges:
  1. Market perception: The market does not perceive GDSN as a major, global and successful way to synchronize product information – and this despite the huge community behind GS1 standards and GDSN.
    I think that is really a misperception, mainly because of the huge GS1 community. I am not aware of any comparable standardization organisation which is so user-driven and has such a huge supporting community. But here the second challenge comes into play.
  2. GDSN adoption and perception by the implementing companies: Although there is such a huge supporting community, the real success stories where the implementing companies have actually achieved the promised savings are still rare. And this is not because GDSN is the wrong approach, but because companies either have not implemented it at all or have not implemented it properly as part of a MDM program.
In one of my next postings I will discuss some alternatives to GDSN and how a combined approach of GDSN with some alternatives might help user companies achieve their goals – and thereby also help GDSN improve its market perception.

Friday, September 30, 2011

Have you ever thought through how MDM in a multinational company differs from MDM in a company with a purely national focus?


Do you remember my blog post on how to organize for MDM? There I described how you should organize your MDM activities in principle. If you are working for a big multinational company, you might have thought that this is by far too simple to solve your challenges.

And you are right!

In a multinational company you have, on top of the generic MDM tasks, additional challenges which you have to address within your organization, your processes and your IT infrastructure (esp. in your metadata model).

Even if you just start looking into the data of multinational products, you can find very serious challenges for MDM.

Global, regional and local products: In a multinational environment you typically find global products which are sold in and relevant to all regions or countries. But you also find products which are region-specific (e.g. only for the European or the Asian market), and also very local products which might be available in and targeted at only one country. Typically you want your local organizations to be able to list and sell the local products, while the global products are more or less defined by a central business unit.

Global, regional and local elements in global products: If you have global products which are available in different regional or even local markets, you typically have some product information which is really only targeted at local requirements. One good example of this type of requirement is the "green dot" information in Germany (which, by the way, became obsolete again due to changed legislation). This information was only relevant if you wanted to sell a product in Germany. But even if it was the same product globally, for Germany you specifically had to add the information whether the product had the "green dot" certificate or not.

Multi-branded products: Multiple brandings can occur for the same product already on the industry side, but it is even more common in retail, where two retail brands owned by one company share a big part of their assortment. When presenting to the market, however, they typically want to describe the same product in different ways to clearly differentiate themselves from each other.
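These data challenges translate directly into the metadata model: attribute values need a market dimension, so that local values can override regional and global ones. A minimal sketch of such a resolution order, with hypothetical market codes and attribute names:

```python
# Sketch: resolving product attributes across global / regional / local
# layers. Market hierarchy, attributes and values are illustrative.

MARKET_PARENT = {"DE": "EU", "FR": "EU", "EU": "GLOBAL", "GLOBAL": None}

def resolve(item, attr, market):
    """Walk up the market hierarchy until a value is found.

    `item` maps market -> {attribute: value}; the most local value wins."""
    while market is not None:
        value = item.get(market, {}).get(attr)
        if value is not None:
            return value
        market = MARKET_PARENT.get(market)
    return None

chocolate = {
    "GLOBAL": {"brand": "ChocoBar", "netContent": "100 g"},
    "DE":     {"greenDot": True},          # local, legally driven element
}
print(resolve(chocolate, "netContent", "DE"))  # inherited from GLOBAL: "100 g"
print(resolve(chocolate, "greenDot", "DE"))    # local element: True
```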

How does all of this impact your organization and processes?

If you are a multinational company, you will have a strategy which also covers the parameters "need for integration" and "need for local responsibility". These two parameters define your internationalization strategy quite well. The same has to be applied to your master data management.
In the end you have to set up a distributed organization for your MDM. A quick sketch might look like this:

Wednesday, August 31, 2011

Leaving SA2 Worldsync …


I am leaving SA2 Worldsync. I have worked for over 6 years now in the area of master data management and global data synchronization with SA2 Worldsync and its predecessors.

And I think it was a great time and especially a great team to work with. And I think we achieved a lot. SA2 Worldsync has become one of the globally dominant data pool players and also one of the leading PIM vendors, at least in the retail market.

Thinking back, I started all this master data management stuff back in 1997 when founding Cataloom AG. There we successfully developed one of the first PIM (Product Information Management) / MDM (Master Data Management) systems.

And I still remember very vividly coding the first database schemas and the (still very awkward-looking) graphical user interfaces for cataloom 0.1 myself.

We were very quickly and successfully fulfilling the requirements of our customers like AXA, BASF, Deutsche Bahn, Emaro (the famous e-procurement marketplace joint venture of SAP and Deutsche Bank), Siemens Medical and many others. It was a real fun time, working with a small but enthusiastic team, having all these great ideas and concepts which are still in the product and which are still the basis for its success.

In 2005 PIRONET NDH took over Cataloom. My personal journey into the world of GS1 and data synchronization started when PIRONET NDH also took over a majority stake in SINFOS, the predecessor of SA2 Worldsync.

Replacing the proprietary SINFOS technology with my previous Cataloom technology and building those great webforms with online validations was one of my first responsibilities during that time.

Merging with the Agentrics data pool business unit was the next milestone. And this merger was mainly driven by our superior technology. Especially the usability of our webforms is, I think, still leading-edge, and the team keeps developing and improving it.

After that merger I became responsible for working with our retail customers, understanding their master data management processes, and helping them get those processes organized so that they could better leverage the SA2 Worldsync services.

I think in the last two to three years I have worked with most of the leading retailers globally (from Europe via Japan to the US) to understand their challenges in master data management and to work with them on improving their processes, their organization and sometimes also their IT infrastructure.

Right now SA2 Worldsync is not only supporting the world's largest retailers and brand manufacturers but also some of the most advanced GS1 and GDSN communities, like GS1 UK and GS1 Australia, by providing our technology to them. And not to forget: the SA2 Worldsync WS|PIM product today is run by some of the largest retailers – and right now more are to come.

I think SA2 Worldsync is settled now and has achieved a really leading position in the master data management and global data synchronization business.

That is the time for me to leave.

Why?

I am the more entrepreneurial kind of guy. I have to start up something new. Change the world :-) 

I think Cataloom and SA2 Worldsync had some impact. At least the world changed a little bit. Product information and data synchronization processes became a little more electronic. But I think there is still a long way to go before all business transactions are electronic.

That is where I still see huge potential! So be prepared to see something new coming up which hopefully will really have some impact!

And in the meantime – if you have requirements regarding master data management consulting, just let me know. I help companies globally to put in place the organisation, processes and IT infrastructure to manage their master data more efficiently.

PS: Stay tuned - this blog will be continued and now I might even have more time to share my thoughts on MDM and GDSN!

Thursday, August 25, 2011

How to organise for MDM?

Have you ever thought through how to build an organisation for master data management?

One of the key issues I always find at customers who are struggling with the quality of their master data is that their organisation is not really prepared to deal with master data in a sustainable manner.

In my view Gartner has done quite a good job in proposing a generic organisation for master data management. Actually this organisation implies that you are introducing a new business process - Master Data Management.

Please look at the following orgchart - this is how I remember the generic organisation Gartner is proposing:
The generic MDM organisation
One of the key ideas is that business has to lead the MDM organisation. A quick description of the different roles & responsibilities:

  1. The Information Governance Board should consist of executive-level sponsors. They set and enforce information management policies. In this board you should have representatives from the business - that is key - but you should also have representation from your IT executives.
  2. The MDM Team is responsible for managing the MDM program and is the team that authors and maintains the master data. If you are also implementing GDSN or any other means of collecting item data from your suppliers, supplier onboarding and communication is also a key task for this team.
  3. Data Stewards sit in the business units and are responsible for data quality monitoring, improvement and issue resolution. They do NOT maintain the data themselves but work closely with the MDM Team to get the master data quality to the level the business needs.
  4. The MDM Infrastructure Team is a kind of virtual team responsible for all aspects of the IT implementation needed for the MDM business process.
  5. In the GDSN context I only want to highlight the importance of the Data Modeling and Information Architecture role. If you participate in the GDSN, this role is key, because it is the link to the GSMP process: defining change requests and also anticipating and adopting the changes coming through GSMP / GDSN.
Please take this organisation template for what it is - only a template. In real life you have to look at your current organisation and see how you can adopt the different roles and responsibilities within it.

But also be aware - you want to introduce a new business process which you did not have in the past.

And if your organisation is multinational or even global: what impact that has on the MDM organisation is something I will discuss in one of my next posts.

Friday, June 10, 2011

Don't brands want to communicate transparently to the consumer?

Do you remember one of my earlier postings, Brand owners lose sales due to mobile scanning?

Actually, my main message was that brands have to invest in MDM to be able to publish their product information in high quality to all those app providers, in order to have control over the branded product information that is publicly available.

I thought that it was mainly a technical and effort issue!

But I had to learn that there are also other issues.

During the last couple of days, SA2 Worldsync's Annual User Congress 2011 took place, where we heavily promoted the new mobile century. We had great presentations from the mobile space (e.g. from Mirasense and from barcoo) which made it clear that there is massive demand from consumers for extensive product information. And those app providers are fulfilling this demand – either with or without the support of the brand owners.

We got quite some good feedback from some major brand manufacturers on our activities to deliver trusted manufacturer product information to those platforms. See one of our press releases here.

But we also got quite some negative feedback from some other major brand manufacturers.

Quote: "That is not in our interest."

It is not in your interest to communicate with your customers?
It is not in your interest to provide correct and honest information to your customers?
It is not in your interest to use the new media?

One argument I heard was: "We cannot give allergen information to the consumer via such platforms, because then we could be held liable for it. And we could be sued if the data is incorrect and somebody gets harmed as a result."

But you are putting allergen information on the packaging and on your website.

Are your processes for putting information on the packaging or on your website so much more reliable than those for publishing it to the GDSN?

Or are there other reasons behind your reluctance to provide transparent information to the consumer?

Actually, I do not get it yet. I would be happy to get some feedback on this topic ...

Thursday, May 26, 2011

Even with GDSN - retailers' product master data is still crap!?

Just recently, GS1 Belgium did a "Mini Data Crunch Survey 2011", in which they surveyed the impact of bad data on the Belgian grocery market. This survey stands in line with what GS1 UK started with their "GS1 UK Datacrunch Report 2009".

I think GS1 Belgium really put together a very nice report - and the results are - I would like to say "as expected" - very bad. The data between suppliers and retailers is significantly out of sync. And there is a huge cost to the market for that.

What I liked most is that GS1 Belgium not only looked at suppliers and retailers who were not yet doing GDSN, but also at suppliers and retailers who are heavy GDSN users. Surprisingly, looking at the figures they measured for "missing data" and "mismatches supplier-retailer", you really cannot tell who is doing GDSN and who is not. The percentage of incorrect data ranges from 49% to 67% – and keep in mind they only researched the data of 4 suppliers at 4 retailers, and even that with only 100 products:


I would have expected that the retailers who are using GDSN have significantly better data than those who are not using GDSN at all.

If that is not the case, what are the reasons for those issues?

GS1 Belgium does not directly address this question, but indirectly they propose an answer with their analysis of what is happening behind the firewalls of suppliers and retailers.

Obviously suppliers and retailers still have to do a lot of manual work on both sides of the table even if they have implemented an automated GDS process.

From my perspective the two most important recommendations of GS1 Belgium are:
  1. Use an automated exchange of data between suppliers and retailers. Here GS1 Belgium, for a good reason, refers to their GDSN datapool offering. But I think that is only one part of what is needed. The other part is that retailers have to implement internal, automated integration processes to really integrate the data into all their business applications.

    Just recently I found out that a major German retailer who has been doing GDSN for quite some time is not integrating the GDSN data into their purchasing system!? How do they expect to benefit from GDSN in their order and invoice processes???
  2. Introduce a so-called "data team" on the supplier and on the retailer side, to be the one and only team maintaining the data.
They also propose a lot of other very reasonable actions, like establishing data ownership, data quality dashboards, etc.

But I really think you have to establish a dedicated organisation, and then you have to integrate the data into your business applications in an automated way. Actually straightforward, isn't it?

But my experience is that suppliers and retailers really struggle at this point and typically do not implement this consistently ...

Let me repeat my mantra: suppliers and retailers simply have to implement a MDM program and make their GDSN initiative part of that MDM program. This is the only way to implement GDS successfully and gain the business benefits from it.

Tuesday, May 24, 2011

Retailers: GDSN Changes - the productivity killer?

Lately I talked to one of the bigger European retailers who has recently fully adopted the GDSN processes to receive item data from its suppliers.

Meaningless changes!?

He was complaining that they were receiving so many changes and that he needed more staff to get this stream of changes under control, although he thought that most of those changes were not meaningful to them!?

How can a change to an item from a supplier not be meaningful to a retailer? At first sight I really questioned this statement, and we started investigating what was happening.

First of all, I sat down for a couple of days with the data stewards who are in charge of approving all changes, to really see what was happening.

50 - 1,000 changes / day

My first observation was that they really had to approve a huge number of changed items each day. They had to manage a minimum of 50 changes per day, but on one day it was more than 1,000 changes, which they were no longer able to approve with the existing staff that day, and changes began to pile up.

Up to 10 minutes for approval of a single item

My second observation was that it was quite painful to look at each and every item and figure out what the change was. Although they have implemented a quite fancy-looking comparison screen, it takes the data steward up to 10 minutes to review an item and decide whether to synchronize, review or reject it.

90% of changes meaningless

My third observation then was the eye-opener: approx. 90% of the changes were meaningless for this retailer. How come? Actually, every retailer uses only a subset of the attributes available in the GDSN. None uses all 1,600 available attributes. All retailers I have seen use between 100 and 250 attributes. But suppliers have to maintain the superset of all attributes required by their retail customers.

So what happens is that supplierA maintains Attr1 - Attr200, while retailerB uses Attr1 - Attr100 and retailerC uses Attr90 - Attr200.

If supplierA changes Attr1 of an item, this change gets sent automatically to retailerB and retailerC. But it is only relevant for retailerB. retailerC does not use Attr1, so the change is not relevant to him. But according to the GDSN rules he has to synchronize this version of the item to stay synchronized with the supplier. Sending back a CIC review or reject would not make any sense, because that is just not what he wants to do.

Solution 1

What is the solution? The first approach is to evaluate automatically whether a change is relevant to the retailer and, if it is not, to automatically synchronize the item change (meaning: store the item and send back a CIC 'synchronized'). Only if the change is relevant is the item put into the manual approval workflow, as sketched below.

With this solution this retailer could already save 50-60% of the effort needed in his approval process.
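A minimal sketch of such a relevance check; the attribute names are the placeholders from the example above, and the CIC handling is reduced to a returned status string instead of the real GDSN message:

```python
# Sketch: auto-synchronize GDSN item changes that do not touch any attribute
# the retailer actually uses. Attribute names and the CIC handling are
# simplified placeholders for the real GDSN XML messages.

RETAILER_ATTRS = {f"Attr{i}" for i in range(90, 201)}   # e.g. retailerC's subset

def handle_change(stored_item: dict, new_item: dict) -> str:
    """Decide what to do with an incoming item change."""
    changed = {a for a in set(stored_item) | set(new_item)
               if stored_item.get(a) != new_item.get(a)}
    if changed.isdisjoint(RETAILER_ATTRS):
        # Store the new version to stay in sync with the supplier, answer
        # CIC 'synchronized' - and never bother a data steward with it.
        return "auto-synchronized"
    return "manual approval"            # only relevant changes reach a human

old = {"Attr1": "a", "Attr100": "x"}
new = {"Attr1": "b", "Attr100": "x"}    # supplier changed Attr1 only
print(handle_change(old, new))          # retailerC: "auto-synchronized"
```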

Solution 2 - and my recommendation

My recommendation is to take it even one step further. Why do you have to look at each and every item change, even if it is relevant to you? Do you really think that your data steward can judge, for all your items, whether a slight change in the measurements or the ingredients or the packaging or wherever is correct – without pulling the concrete product and remeasuring?

My experience is that the whole manual approval process is mostly a pure waste of time.

Instead, turn the process around. Define rules for when an item change cannot be automatically approved (e.g. if the description changes, or if measurements change by more than 10%, etc.). If an item change passes all those rules, it gets synchronized automatically. Otherwise you put it into the manual approval process. A sketch of such rules follows below.
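A sketch of such exception rules; the attributes and the 10% threshold are the examples from this post, not general recommendations:

```python
# Sketch: rule-based auto-approval for relevant item changes. An item change
# goes to manual approval only if one of these exception rules fires.
# Attribute names and the 10% threshold are illustrative.

def description_changed(old, new):
    return old.get("tradeItemDescription") != new.get("tradeItemDescription")

def measurement_jumped(old, new, threshold=0.10):
    for attr in ("depth", "width", "height", "grossWeight"):
        o, n = old.get(attr), new.get(attr)
        if o and n and abs(n - o) > threshold * o:
            return True
    return False

EXCEPTION_RULES = [description_changed, measurement_jumped]

def approve(old, new):
    if any(rule(old, new) for rule in EXCEPTION_RULES):
        return "manual approval"
    return "auto-synchronized"

old = {"tradeItemDescription": "Choco bar", "grossWeight": 105.0}
new = {"tradeItemDescription": "Choco bar", "grossWeight": 106.0}  # ~1% change
print(approve(old, new))  # -> "auto-synchronized"
```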

And then measure your ratio during manual approval between sending a review or reject to the supplier and sending a synchronized. If more than 50% get an immediate 'synchronized', your approval rules are still too strict and you should relax them further.

Now you should be down to 10% or even less of the original effort.

Btw. dealing with ADDs is typically a little more complicated, because you typically have a manual enrichment process ...


UPDATE: 
There is a very interesting discussion on LinkedIn regarding this blog post; see here.

Saturday, May 7, 2011

What are the core components of a MDM system?

Actually we talk a lot about MDM systems – but hey – are you aware of what a full-blown MDM system should provide?

In my opinion the following modules are essential to a MDM system:
  • A very flexible Data Model. This is key and actually the core. You want to manage master data, and you have to be able to put the data model which meets your requirements into the system.

    For me, managing the data model is NOT a development task but a pure customization task which should be taken over by the people who operate and administer the MDM system. In the best case, the MDM Team itself is able to adjust the data model to its needs.

    The flexible data model should be accompanied by a validation rules engine. When defining your data model, you also want to define the data quality rules your master data should comply with. And those data quality rules are typically more complex than just a primitive type check or a check for mandatory attributes (see the sketch after this list).
  • A Content or maybe better "Item" Management Module which allows the different departments to actually manage (enter and maintain) the master data itself. And man – do not mix that module up with a web content management system (CMS); that is something completely different.

    This module should be fully driven by the data model (meaning, again, no programming if you add an attribute) and should allow customizing different views for different users or user groups.

    It should also be integrated with a workflow engine, because you might want to establish editorial and approval workflows for master data management.

  • A Data Quality Module. This module should provide the capabilities to monitor the data quality you have in your system. It should be integrated with your validation rules engine so that you can report on which items do not comply with your rules, and so on.

    It is also great if the MDM system already provides simple data quality analysis functionality like pattern analysis or fill-rate analysis, independent of your defined data quality validation rules.

  • A Data Integration Module. This is essential because you want to integrate your master data into the diverse other systems you have. What you need is batch upload and download functionality, plus real-time integration capabilities, e.g. via an Enterprise Service Bus.

  • A Workflow Engine should be part of the MDM system. Just as data models differ from company to company, so do the editorial and approval workflows. A workflow engine gives you great flexibility to model your processes into the system, and it helps that you do not have to adjust yourself to the processes defined within the tool.

  • Webservices should be supported for integration purposes. As master data is needed in most areas of the business, having public webservices often helps to solve integration requirements very quickly.

  • For sure you also have to check that the Technology and Architecture fit into your company policies.
  • Performance, Scalability and Availability are also key requirements for a business-critical application like a central MDM system.
  • You should also look at operating and security requirements and whether the system meets them.
  • And last but not least a MDM system should also provide at least a minimum of reporting capabilities.
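As promised above, here is a minimal sketch of a validation rules engine where rules are data rather than code, so the MDM Team can extend them without development. The attribute names and rules are my own illustrative assumptions, not taken from any specific MDM product:

```python
# Sketch: a tiny validation rules engine. Each rule is a (description, check)
# pair; a check returns True when the item complies. Attribute names and
# rules are illustrative.

RULES = [
    ("GTIN is mandatory",
     lambda i: bool(i.get("gtin"))),
    ("net content is mandatory for food items",             # cross-attribute rule
     lambda i: i.get("category") != "FOOD" or bool(i.get("netContent"))),
    ("gross weight must not be less than net weight",       # cross-attribute rule
     lambda i: not (i.get("grossWeight") and i.get("netWeight"))
               or i["grossWeight"] >= i["netWeight"]),
]

def validate(item: dict) -> list[str]:
    """Return the descriptions of all violated rules."""
    return [name for name, check in RULES if not check(item)]

item = {"gtin": "04012345678901", "category": "FOOD",
        "grossWeight": 95.0, "netWeight": 100.0}
print(validate(item))
# -> ['net content is mandatory for food items',
#     'gross weight must not be less than net weight']
```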
And as you know, my focus is always not only MDM but also how MDM connects to and enables GDSN. So if your requirement is that your MDM system should also support your GDSN initiative, what is required then?

  1. Best would be if your MDM vendor already provides an out-of-the-box GDSN connector which is certified for – or at least known to work with – your favorite data pool. But as there are not that many MDM solutions out there with out-of-the-box GDSN connectors, you should look at option 2.

  2. To support GDSN with your MDM you need:
    1. A data model aligned with the GDSN data model. Either you have your own and can map it to the GDSN data model (be aware of codelist issues, attribute type issues and the like – see the mapping sketch after this list), or you directly adopt the standards and use them internally as well.
    2. Workflows which support the item confirmation process.
    3. Integration capabilities in your MDM system which support the data formats and protocols your data pool supports.
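For point 2.1, a minimal sketch of such a mapping, including a codelist translation. The internal attribute names are hypothetical, and the target names and codes only stand in for the real GDSN data model and codelists:

```python
# Sketch: mapping internal attributes to GDSN-style ones, including a
# codelist translation. All names and codes here are illustrative.

ATTRIBUTE_MAP = {                  # internal name -> GDSN-style name
    "ean": "gtin",
    "desc": "tradeItemDescription",
    "unit": "netContentUnitOfMeasure",
}
UNIT_CODELIST = {"g": "GRM", "kg": "KGM", "ml": "MLT"}   # internal -> code

def to_gdsn(item: dict) -> dict:
    out = {}
    for internal, gdsn in ATTRIBUTE_MAP.items():
        if internal in item:
            value = item[internal]
            if internal == "unit":                        # the codelist issue
                value = UNIT_CODELIST[value]
            out[gdsn] = value
    return out

print(to_gdsn({"ean": "04012345678901", "desc": "Choco bar", "unit": "g"}))
# -> {'gtin': '04012345678901', 'tradeItemDescription': 'Choco bar',
#     'netContentUnitOfMeasure': 'GRM'}
```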

I hope this helps you to structure your evaluation process of a MDM system!

Thursday, April 14, 2011

What is “100% Data Quality”?

Isn’t your goal “100% Data Quality”? And don’t you expect (at least if you are a retailer) your suppliers to deliver 100% correct item data via GDSN?

But what does this simple term “100% Data Quality” really mean?

Imagine that you are a retailer. You receive item data from your suppliers via GDSN. You feed that data directly and exclusively into your warehouse system. And your warehouse is fully automated, so you really depend on receiving the correct measurements from your suppliers. You are very proud of your warehouse logistics, and you manage it in the most efficient manner, which gives you serious benefits over your competition. But because you are so focused on efficient logistics, you have not yet implemented any e-commerce strategies and therefore have very little demand for additional product information like nutritional information or things like that.

So for you, “100% Data Quality” means: get me the measurements and my other logistics information (like packaging hierarchies) correct!

Now imagine you are an e-commerce retailer. What you need to sell are good images and very detailed product feature descriptions. If you are selling food, you need nutritional and allergen information. In some countries you need this for legal reasons alone.

Now “100% Data Quality” means logistics information including measurements + feature information + images.

And now try to imagine what “100% Data Quality” means to a small supplier who is confronted with forms with hundreds of attributes to fill out. He is happy when he gets his data past the damn data pool validations!

Got my point?

Data quality – and my examples so far have only touched on the completeness of the data – is “relative”. It depends on the requirements of the recipient. If you use the measurements in your systems and your business processes depend on that information, you need them to have “100% Data Quality”. If you do not use them, you can still have 100% Data Quality for your purposes even if the measurements are incorrect or empty.

The same applies to all dimensions of data quality: correctness, completeness, in-time delivery and validity.

In a perfect world, all retailer requirements regarding data quality would be equal, because everyone would fully use the full set of product information in all their processes. But in today's world the implementation levels of retailers differ, and therefore they may today have very different requirements, depending on the business processes in which they use the data.

My recommendation to retailers: communicate and document clearly to your suppliers what you consider to be “100% Data Quality”, and build validations into your systems to ensure that quality level, including feedback for suppliers if their data does not meet those requirements.

To suppliers I recommend: ask your retail customers what they understand by the term “100% Data Quality”, and then start your implementation accordingly. A sketch of how such recipient-specific quality definitions could look follows below.
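To make the "relative" nature of data quality concrete, here is a minimal sketch of recipient-specific quality profiles; the profile and attribute names are made up for illustration:

```python
# Sketch: "100% Data Quality" as a per-recipient profile. Each retailer
# checks only the attributes its own processes actually use; the profile
# and attribute names are illustrative.

PROFILES = {
    "logistics_retailer": ["depth", "width", "height", "grossWeight"],
    "ecommerce_retailer": ["depth", "width", "height", "grossWeight",
                           "images", "featureText", "allergenInfo"],
}

def quality(item: dict, profile: str) -> float:
    """Completeness in percent, relative to one recipient's requirements."""
    required = PROFILES[profile]
    filled = sum(1 for attr in required if item.get(attr))
    return 100.0 * filled / len(required)

item = {"depth": 10, "width": 20, "height": 5, "grossWeight": 250}
print(quality(item, "logistics_retailer"))  # 100.0 - perfect for this recipient
print(quality(item, "ecommerce_retailer"))  # ~57.1 - far from "100%" here
```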

Tuesday, April 5, 2011

MDM and GDSN hard to connect? The original post ...

Actually, when I started writing my last blog post on why MDM and GDSN are hard to connect, my original intention was not to write about a flaw in the design of GDSN. That just happened to come to my attention while I was thinking through the whole "How to connect a MDM system to GDSN?" question.

My original thought was that GDSN actually allows – and in my view even "requires", if you want to get all the benefits out of it – interaction between the supplier (data source) and the retailer (data recipient) when synchronising item data. This interaction is mainly the feedback from the retailer to the supplier, at item level, about what he has done with the data, via the CIC (Catalog Item Confirmation) message. With the CIC message the retailer can ask the supplier to review the sent data, because it might not comply with the retailer's data requirements, or he can simply indicate that he has accepted the data and synchronized it with his backend systems.

This type of interaction between data source and data recipient is very unique to GDSN. I am not aware of any other item data exchange standard which has a similar technique.

In a MDM system, the typical approach for data integration on the data source side is more of a "fire and forget" approach. That means you maintain your master data in the MDM system until it is approved, and then you publish it to the consuming systems. Typically a MDM system does not expect any feedback on its publication. If you are lucky, you use some kind of EAI tool for the data distribution, which at least handles the technical protocols regarding the success of the data distribution. Those protocols are then typically managed by some technical team.

But the GDSN CIC process is a business process! So the CIC feedback has to be dealt with on the supplier side by the business people who normally maintain the item data. So your MDM has to provide some kind of user interface for its users to see the CICs which have been received and to process them efficiently.

On the data recipient side it is a little bit different. There, your MDM system has validation rules to ensure the data quality of your item data. If the data you receive from a supplier does not comply with those rules, your MDM system has to be capable of sending back a CIC review message automatically. On top of that, your MDM system also has to enable its business users to either accept an item or send back a review manually, because not everything can be validated automatically. A minimal sketch of this recipient-side behaviour follows below.
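A minimal sketch of that recipient-side behaviour; the validation rules are illustrative, and the CIC is reduced to a plain status string instead of the real GDSN message:

```python
# Sketch: data-recipient side of the CIC choreography. Incoming items that
# violate validation rules trigger an automatic CIC "REVIEW"; clean items
# are stored and confirmed as "SYNCHRONISED". Rules and statuses are
# simplified stand-ins for the real GDSN messages.

RULES = [  # (description, predicate that is True when the rule is violated)
    ("GTIN missing", lambda i: not i.get("gtin")),
    ("gross weight missing", lambda i: not i.get("grossWeight")),
]

def on_item_received(item: dict) -> tuple[str, list[str]]:
    violations = [name for name, broken in RULES if broken(item)]
    if violations:
        return "REVIEW", violations      # send the CIC review back automatically
    # Store the item in the MDM, then confirm. Manual accept/review stays
    # possible for everything the rules cannot catch.
    return "SYNCHRONISED", []

print(on_item_received({"gtin": "04012345678901"}))
# -> ('REVIEW', ['gross weight missing'])
```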

And as those workflows are very specific to GDSN, there is no generic solution for them in MDM systems. If you look at MDM systems which have implemented a standard connector to GDSN, they have all implemented a separate module (a "GDSN console" or something alike) to deal with those special processes.

It gets even worse if you also take price synchronisation into account. In GDSN, price sync is a different message set and a different choreography from item sync. It is similar in a way (similar message names, similar idea of the choreography), but the details are quite different. For example, the "Price Sync Confirmation" process is not optional like in the item sync process – it is mandatory. Additionally, you first have to synchronise and accept a business relationship before you can start synchronising prices.

Right now I am not aware of any MDM system that has implemented support for the GDSN Price Synchronisation!

And that casts doubt on the whole value of a GDSN implementation, in my view. If I am only able to synchronise item data and no price data, my business problem is only half solved. I then still have to find a solution for synchronising prices between trading partners. And they really want to get rid of those error-prone, unversioned, unprocessable Excel sheets!

So if you want to connect your MDM to the GDSN (either because you are the software manufacturer of the MDM or because you have introduced a MDM system into your business) be aware of the business processes GDSN implies. They have to be supported.

Also be aware that only the whole GDSN package of Item sync + Price sync really unlocks the whole value of GDSN for you.

Just to be fair, I should mention here that price sync adoption today is really low globally – not to say zero, if we exclude Australia. But Australia is good proof that it can be done and really brings the value expected!

Sunday, March 20, 2011

MDM and GDSN: Hard to connect? The design flaw in GDSN …

Did you ever look for a MDM or a PIM system with a GDSN connector certified by GS1?

If you did I am pretty sure that you have not found any. Why? Because the whole GDSN standardization is only meant for and certified for the communication between data pools.
The communication between the suppliers and retailers and their data pools is not standardized at all.

You do not believe that?

Look at GS1's Interoperability and Certification document. It explicitly reads "GDSN was not intended to govern DP-to-Trading Partner activity." (see the end of chapter 4). There is not even a recommendation for what type of message format and standard should be used for the communication.

And what is the datapool reality?

Some datapools promote GS1 XML for the data pool to trading partner communication, some promote PRICAT based data formats and a third group even promotes their own proprietary XML format.

What does that mean?

  1. Switching data pools is quite a challenge, because typically you have to reimplement your communication with the datapool.
  2. Independent software vendors do not have a reliable interface which they can implement and which would allow them to connect to any data pool.

This explains why each data pool has to certify every software manufacturer on his own and why every software vendor has to implement its GDSN support individually per data pool. That puts a big barrier in front of independent software manufacturers.

Although GDSN has existed for more than 7 years, SAP MDM, for example, is today only certified for two data pools (and the implementations differ significantly), and IBM WebSphere Product Center is only certified for one data pool. Customers certainly use those solutions with other data pools as well, but this always requires significant implementation effort by the customer or their service provider. Having invested that effort locks customers into their data pool provider significantly.

Having no standard interface that can be implemented once and used with every GDSN data pool prevents software manufacturers from investing heavily in standard solutions.

Can this be solved?

Yes. GDSN Inc. can make it mandatory that data pools also support the current version of GS1 XML (at least the GDSN subset) and AS2 as the communication protocol between trading partners and their data pool. GDSN Inc. could provide a test harness consisting of a number of messages which have to be executed during certification. That would not add cost to the certification process - it could even reduce certification costs.
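
Such a test harness would not even be complicated to build. A minimal sketch (Python with lxml; the file names are hypothetical, and a real harness would also replay the messages over AS2 and check the data pool's responses) that validates a folder of GS1 XML test messages against the schema could look like this:

```python
from pathlib import Path
from lxml import etree

# Minimal sketch of a certification test harness: validate a folder of
# GS1 XML test messages against the GDSN schema subset. File names are
# hypothetical; a real harness would also send the messages over AS2
# and verify the data pool's responses.

schema = etree.XMLSchema(etree.parse("gs1_gdsn_subset.xsd"))

for msg in sorted(Path("test_messages").glob("*.xml")):
    doc = etree.parse(str(msg))
    if schema.validate(doc):
        print(f"PASS {msg.name}")
    else:
        print(f"FAIL {msg.name}: {schema.error_log.last_error}")
```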

Trading partners would not be impacted at all, because GS1 XML would not replace the proprietary data formats. But it should be the one mandatory option, so that standard software providers get a standard interface to GDSN data pools.

What is the impact on data pools?

It should not be too much of an impact. Every certified GDSN data pool is already capable of processing GS1 XML messages, because that is what they send to and receive from other data pools. Making those protocols also available for the communication between trading partners and the data pool should be quite simple.

What would be the benefit?

Standard software manufacturers would only have to implement one interface and could immediately connect to every certified GDSN data pool.
That would be a real benefit to trading partners, because standard software with out-of-the-box GDSN connectivity would really drive down the required implementation effort.
And this, finally, would be very beneficial for the whole GDSN.

So, let’s make the GDSN plug&play for trading partners!

Tuesday, March 8, 2011

Brand Owners lose sales due to mobile scanning!

GS1 UK recently released a report called "Mobile-savvy shopper report". In this report they researched the impact of poor third-party app data on shopper behaviour.

You can find the report on their website http://www.gs1uk.org.

What did they do in their research?
  1. They tested the data from three third-party barcode scanning apps.

    They chose the top downloaded paid-for barcode scanning app, the top downloaded free barcode scanning app and the top downloaded health and fitness barcode scanning app. It is a little pity that they did not name the tested apps (although I can understand why).
     
  2. They scanned 375 random grocery products and compared the product information provided by those apps to the product information provided by the brand owners to GS1 UK's TrueSource Product Catalogue.

And what are their findings?
  1. The very first finding and statement is that consumers use all available means to inform themselves about the products they are going to buy, and that this information heavily influences their buying decision. If a consumer finds no data on a product, or only data they do not trust, then in the worst case they will not buy that product.


  2. Only 9% of the scanned products had correct descriptions in the third-party apps.


  3. 75% of scans returned no product information at all.


  4. 87% did not return any images.


  5. The more data is available, the higher the percentage of wrong data (see the chart in the report).


How would you feel as a brand owner hearing such results, while also hearing from everybody that mobile commerce and B2C are the next wave that will radically influence shopper behaviour?

You will definitely want to get control of your product information in the mobile and internet universe!

Just out of pure interest I did a small (and not representative) piece of research on my own. For a quite mature target market, I asked the leading barcode scanning app provider with a focus on grocery to give me the scanned GTINs for the first two weeks of February 2011.

That was an impressive 3 million scans! And they were able to provide product information for nearly 90% of those scans. Even more impressive!

And then I matched that with the product information available in the GDSN (do not argue that this is not possible because recipient data pools do not store the data - for some target markets life is different ;-).

You can imagine what comes next? The matching rate was significantly, very significantly lower than what the app guys had :-/
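
For the curious, the matching itself is trivial. A minimal sketch of the kind of comparison I did (hypothetical file names, one GTIN per line) could look like this:

```python
# Minimal sketch of the GTIN matching described above. Assumes two
# plain-text files (hypothetical names), one GTIN per line:
# scans.txt holds the GTINs reported by the app provider,
# gdsn_extract.txt holds the GTINs found in the GDSN for that market.

def load_gtins(path):
    """Read one GTIN per line, normalised to 14 digits (GTIN-14)."""
    with open(path) as f:
        return {line.strip().zfill(14) for line in f if line.strip()}

scans = load_gtins("scans.txt")
gdsn = load_gtins("gdsn_extract.txt")

matched = scans & gdsn
rate = len(matched) / len(scans) * 100
print(f"{len(matched)} of {len(scans)} scanned GTINs found in GDSN ({rate:.1f}%)")
```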

So what is my conclusion?
  1. For the consumer, the 3rd party barcode scanning apps are great - you have all the data you need for your buying decision at your fingertips. The data is bad? Then the product must be bad, and the consumer will not buy it.


  2. From a brand owner's perspective, the data that is available in 3rd party barcode scanning apps is very, very poor.


  3. Brand owners are really not yet taking care of their product information. Even in the GDSN you can so far only find a fraction of the data consumers are looking for. And that is NOT because the standards are behind and attributes are missing - it is just because the industry does not take care of its data.
Finally, can you imagine why the industry does not take care of its data? I am convinced that many manufacturers are simply not capable of doing so, because - and here comes my mantra - they just have not yet launched a serious MDM program.

Manufacturers - take control of your data! Launch an MDM program! Provide the data through the GDSN to those platforms! The consumer is already there!

Tuesday, February 22, 2011

GDSN: The vicious circle

At one of the last GDSN workgroup meetings we discussed how to improve the adoption of GDSN.

One of the comments was that there is a vicious circle: retailers say that as long as there is not more item data available from suppliers, they will not start to engage with GDSN. At the same time, suppliers argue that they will not start delivering their article data through GDSN as long as retailers keep demanding product information on paper, via Excel files or similar.

How to deal with that vicious circle?

Suppliers should launch an MDM program independently of GDSN.

Suppliers have to prepare their own product data in a central database to fulfill the different requirements they face today. They need the product data for their internal manufacturing processes, for their sales processes, for internal information purposes, for all kinds of electronic business. Providing that data via GDSN to their retail customers is only one additional use case for their correct, complete and quality-checked product information.

If they take GS1 standards into consideration when defining their MDM program, they will easily be able to deploy GDSN as part of it. And as there is already so much value in their MDM program, it does not matter if in the beginning they deliver the data via GDSN to only one or two retailers. Usage of the GDSN will evolve over time.

Retailers should take the same approach. 

Just launch your MDM program and make GDSN just one (additional) source of product data. The MDM program itself will bring so much value that it already pays for your GDSN implementation. What is key to your MDM program is that your vision includes no longer accepting item information on paper or via email.

Wednesday, February 16, 2011

Cap Gemini: MDM Top IT Topic for 2011!??

In a recent study by Cap Gemini (which you can find here, in German) they identify Master Data Management and Data Quality Management among the Top 5 most important IT topics mentioned by mid-sized companies for 2011.


Hurray! Companies are becoming aware of their master data issues and that they have to do something about them!

But STOP! Didn't I mention that MDM is NOT an IT topic? Just compare with Gartner's 10 myths regarding MDM.

OK, there is still some education to be done! It is great that companies are becoming aware of their MDM issues. Now they have to learn that IT cannot solve them. Business has to solve them with organisation and processes. IT has to support!

Wednesday, February 9, 2011

Let's build a new platform to exchange Rich Product Information!

This is what one of my colleagues reported back to me today. He attended one of those new GS1 MobileCom meetings, where they were discussing how to exchange rich product information.

Just to get everybody on the same page - what is rich product information? My understanding is that it is the consumer-oriented information that helps the consumer make a buying decision. Examples are certainly images, but also feature information, nutrition facts, allergens and similar data.

Everybody complained that today there is no standard and no platform in place for exchanging that type of information.

And believe me, there were many of the big manufacturers and retailers sitting in that room and complaining too!

If you are familiar with some of the details of the GDSN, you surely know that all of the above can easily be exchanged via the GDSN.

The discussion went on for a while and became more and more ridiculous; people even suggested building a new platform to be able to exchange that type of data.

At one point my colleague remarked that the GDSN and even current standards are technically capable of supporting the exchange of this type of rich product information, but that manufacturers just were not entering the data. He then asked one of the manufacturers participating in that meeting why they were not entering that type of information into the GDSN.

The representative of that manufacturer could not answer the question! And the rest of the audience was obviously just as overwhelmed by the question of why not simply use the GDSN.

But my favourite comment during that discussion was: "I just do not like the GDSN. Let us build something new."

(And just to be clear - I do not want to put anybody on the spot, which is why I am not mentioning any names here.)

What is my takeaway?

  1. Building new things is more fun (at least in the beginning) than understanding and leveraging existing stuff.
  2. The knowledge of the people participating in such meetings is often not sufficient:
    1. No knowledge of GDSN - you not only need to understand the principles, you also have to know some details about attributes, extensions and the like.
    2. No knowledge of their own internal organisation, processes and IT infrastructure, and of how they manage their product information.
    3. No knowledge of what is going on in the rest of the world.
And now? I would suggest "Let's build something new! That will be fun!" ;-)

PS: You want to know why I think manufacturers are not putting the "richer" product information into the GDSN? Remember one of my former posts regarding "every manufacturer has a database with all his product information"? I just think that most manufacturers do not have that data available in a form that can easily be sent through the GDSN ...

Tuesday, February 8, 2011

Retailer GDS Implementation: US vs. Europe

In the last few months I worked with a couple of European retailers and a couple of US retailers on GDSN and MDM.

My perception was that the US retailers were quite a bit ahead of their European colleagues. Not all of them - in the US, too, there are a lot of retailers who have not yet started a GDS initiative or an MDM initiative.

But comparing the most advanced retailers (regarding GDS and MDM implementation) in Europe and in the US, the US seems to me to be more advanced.

What is the difference?

European retailers seem to have as their vision: "I want to do GDS to receive suppliers' product data electronically." US retailers seem to have a different vision: "I want to replace my paper-based product introduction and change process with an electronic process."

You think there is no real difference in the outcome?

Wrong. There is. The goal in the US is not to do GDSN but to replace paper-based processes. In Europe the goal is to do GDSN, in the hope of thereby replacing paper-based processes.

What I saw with the retailers I worked with in the US was that they all launched an MDM program. A lot of this was about setting up an organization and processes to manage their item master data. To receive the data from their suppliers, they not only implemented GDSN but also web portals where suppliers can manually maintain item data free of charge. This led to the comfortable situation that they could receive 100% of their product data electronically by mandating delivery of item data either via GDSN or via their web portal.

In Europe, retailers engaged heavily in community alignment and standards development (which, by the way, led to all those country-specific requirements and standards) but only implemented GDS (and to be honest, they all first developed country-specific data pools and only later tried to switch to GDSN). No free portal for their suppliers, nor - in many cases - a dedicated MDM program to ensure data quality. With GDS as the only means of delivering item data electronically, retailers in Europe are not able to mandate its usage. In the end, even successful retailers in Europe might only sync item data with 10-15% of their suppliers, while for the rest manual, often paper-based processes are still in place.

This explains why retailers in the US are typically much happier with their GDSN implementation than their European counterparts, although GDSN adoption does not differ significantly between the US and Europe. In both regions it is not as good as all participants expected.

Do not get me wrong.

I am not suggesting that retailers implement proprietary portals just to be able to offer free item maintenance to their suppliers. This functionality is today typically offered by data pools. Those offerings are more standards-compliant and should be a lot cheaper than a retailer implementing it on their own.

So I would suggest that retailers who are struggling with their GDSN implementation and its realized benefits define and implement an MDM program, with the vision: "Replace manual item listing and maintenance processes with automated data synchronisation processes".

Sunday, January 30, 2011

Retailer 1: Data from GDSN data pools is too bad!

If you talk to retailers about GDSN, they always complain about the quality of the data suppliers provide. And I always wondered: how come?

A couple of months ago I was invited by a large retailer to do a data quality analysis. Their reason was that they knew they had a problem with their product master data, but they did not know what their problems really were. Everybody at that retailer complained about the quality of their own product master data, but they had no metrics on what exactly was bad.

So what did we do?

First of all, we went through all the departments and talked to the operational people to understand what was bad about their product master data. From there we got an extensive list of problems in their product data. The three most interesting ones were:

1. The warehouse people complained that measurements were often missing or wrong, so they frequently had to redo the intake of physical goods because the automatically calculated shelf space did not fit.

2. The route planning people also complained about wrong and missing measurements, which led to wrong route plans and "overfilled" trucks.

3. The C-level people complained about wrong supplier scorecards, which led to very bad discussions and negotiations with their suppliers, especially about order fulfillment.

From there we focused on investigating how good or bad the measurements in the retailer's systems were, and on whether bad master data could somehow lead to wrong scorecards.

Our results were a little bit surprising:

1. 70% of their products had no measurements at all in the central master data management system, or only default values (length = 1mm, width = 1mm, height = 1mm, weight = 1kg) - a pattern that is easy to detect automatically, see the sketch after this list.

2. We took some products at random and compared their measurements across the different warehouse systems (as they have multiple warehouses, they have multiple instances of their warehouse system, and therefore each item exists multiple times). In most cases the measurements differed significantly. Because the measurements were missing in the central system, a measurement process had been established in the warehouses. But obviously the people there were not able to measure the same product in the same way. The funniest case was one of the private label products, whose measurements differed by more than 100%.

3. When we tried to figure out how master data issues could impact the scorecards, we found that their system did not support a flag indicating whether a unit is orderable or not. On some sample scorecards we could prove that this was the reason for wrong orders and bad order fulfillment. The retailer ordered a number of units of an item that was not orderable. Because the retailer was a very good customer, the supplier executed the order anyway (probably after internal manual correction) and sent back the dispatch advice with the correct (orderable) GTIN and a corresponding quantity which differed from the order. And bang: the scorecard showed a delivery error. We also had cases where the supplier's manual correction was simply wrong and the wrong quantity was delivered. But all those errors were caused by the missing "orderable" flag and the manual intervention it required on the supplier side.

4. We compared the retailer's product data to the data from a data pool and found that the data pool's data looked completely different.
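
As referenced in finding 1, a check for missing or default measurements is simple to automate. A minimal sketch (Python with pandas; the file and column names are hypothetical) could look like this:

```python
import pandas as pd

# Hypothetical extract of the central master data system, one row per
# product; columns assumed: gtin, length_mm, width_mm, height_mm, weight_kg.
items = pd.read_csv("item_master.csv")

dims = ["length_mm", "width_mm", "height_mm"]

# Missing: no measurements maintained at all.
missing = items[dims + ["weight_kg"]].isna().all(axis=1)

# Defaults: the tell-tale 1mm/1mm/1mm/1kg placeholder values.
defaults = (items[dims] == 1).all(axis=1) & (items["weight_kg"] == 1)

share = (missing | defaults).mean() * 100
print(f"{share:.0f}% of products have missing or default measurements")
```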

There were many more outcomes and results from the data quality analysis, but listing them all would really exceed this posting. I think you now have an impression of how bad the data at that retailer was and what problems it caused.

Now, as this retailer has been doing data synchronisation for a couple of years already, I was really keen to figure out why the master data could be that bad and whether this really was a supplier issue.

To make a long story short, we found that from an IT perspective the retailer had implemented everything quite well. There was an approval workflow where the people in the buying department could look at incoming changes and new items and then decide whether to take them over or change something.

So we decided to talk to the people who were managing this approval process.

Surprise, surprise!

We learned that they preferred NOT to work with that approval process. Mainly because so many changes were coming in every day, but also because they were very unsure what would happen in their backend systems if they approved a change. Would their backend processes still work? Or would it create some kind of failure for which they would be held responsible? Therefore they preferred not to change anything - that way, they thought, they could not cause any process failure.

When we then looked into the queue of items to be approved, we found item changes going back to 2005!

The final surprise came when we presented our findings to middle management. One comment after the whole presentation I will never forget: "The product data from suppliers has to be 100% correct; as long as nobody guarantees that, we cannot work with it."

Thank goodness the C-level people got the message and launched an MDM program to start improving their master data management.