Sunday, February 12, 2012

Cap Gemini: MDM in 2012 still top IT topic

Do you remember my blog post from nearly a year ago? There I commented on Cap Gemini saying that MDM is one of the Top 5 IT topics for 2011.

Recently Cap Gemini released a report on the top IT topics for 2012, and MDM and Data Quality Management are both among the top topics again.


What does that mean? Actually it demonstrates that master data and its quality are key for sustainable IT. Therefore Master Data Management and Data Quality Management will continue to be key topics for IT and business as long as IT exists.

And my comment from a year ago that MDM is not only an IT topic but has to be driven by the business is even more valid today than it was then. So let's continue to work on these topics!

Friday, January 6, 2012

Amazon is struggling too with product information quality ...

... I am always very impressed by the product information, and especially by the quality of the product information, that Amazon provides in their shop system.

But even Amazon seems to struggle from time to time with the data quality of their product information.

Look at the following screenshot, which I took today while shopping for a landline phone.

For those of you who are not familiar with German, here is how it should be written:

"Plug&Play AnrufbASINtworters:" -> "Plug&Play Anrufbeantworter:"
"Bedienung des AnrufbASINtworters:" -> "Bedienung des Anrufbeantworters:"
"AkkuladASINzeige:" -> "Akkuladeanzeige:"
"SignalstärkASINzeige:" -> "Signalstärkeanzeige:"

Got it?

Somebody did a kind of global "Search & Replace" and replaced "ean" with "ASIN". Actually EAN is the former name of the GTIN, and ASIN is the name of the Amazon article number. Within Amazon you can convert the GTIN to the ASIN and vice versa.
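Just to illustrate what probably happened, here is a minimal Python sketch (the feature lines and the EAN value are made up for illustration, they are not the actual Philips feed): a blind, case-insensitive substring replace also hits the "ean" hidden inside ordinary German words, while a replacement restricted to the standalone EAN token would not.

```python
import re

# Illustrative feature lines as a supplier might deliver them (made up).
features = [
    "Plug&Play Anrufbeantworter: ja",
    "Bedienung des Anrufbeantworters: Telefon",
    "Akkuladeanzeige: ja",
    "EAN: 4012345678901",
]

# What apparently happened: a blind, case-insensitive global replace.
# It hits the intended "EAN" label, but also the "ean" hidden inside
# ordinary German words.
naive = [re.sub("ean", "ASIN", line, flags=re.IGNORECASE) for line in features]
# -> "Plug&Play AnrufbASINtworter: ja", ..., "ASIN: 4012345678901"

# A safer variant: only replace the standalone, upper-case token "EAN".
safe = [re.sub(r"\bEAN\b", "ASIN", line) for line in features]
# -> only the last line changes: "ASIN: 4012345678901"

for before, after in zip(naive, safe):
    print(before, "|", after)
```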

So what does this example tell us:
  1. Amazon obviously does not do any checks on the provided features. They obviously do not classify feature names as real attributes, and they are obviously not trying to standardize them. There is also no product comparison in their shop.
  2. Philips either has a manual process for providing their data to Amazon, and in this process somebody did the global search & replace by hand, or they have an automated interface and a very bad testing process - because then this must have gone wrong in their interface implementation.
How could this situation be avoided?

  1. Amazon could predefine all feature names and force all suppliers to deliver their data accordingly. I think this is not a feasible approach, because Amazon would have to standardize all attributes across all product categories they have, plus they would have to force all their suppliers to comply with that standard.
    Do you think Amazon could do this? I do not think so. Therefore I think Amazon can only rely on its suppliers recognizing that it is in their own best interest to provide the best data quality they can.
  2. Philips could use a PIM with defined attributes for the features of their products and implement an automated upload process to Amazon.
    I think this could easily be done and would solve the problem for both Philips and Amazon.
Btw., although the data was corrupt, I ordered exactly that phone, because I hope that Philips is way better at designing and producing phones than at providing product information to Amazon :-)

Wednesday, December 28, 2011

9 Predictions on GDSN and MDM for 2012


Inspired by the many predictions for 2012 you can find all over the place right now, I thought I would write down my own assumptions and predictions for 2012. Maybe it is a little bit unstructured, but this is how it came to mind.

I hope you will enjoy them a little bit and maybe enrich them with your own thoughts and comments!

Here we go:

1. GDSN will overcome the national Data Synchronization standards and become the real global Datasync standard
Especially in Europe, data synchronization started long before the rise of the GDSN - and was already successfully implemented before it. With the rise of the GDSN and its adoption by the large, global manufacturers, the European communities were urged to build bridges from their national, proprietary data synchronization standards into the GDSN. This has happened over the last couple of years, for example in Germany, Sweden, France, Spain and other European countries. By now many of those communities have learned that using one single, global standard is really beneficial, and therefore they are in the process of switching from their national, proprietary programs to the global GDSN. One good example is the German community, which has decided to switch from its former SINFOS standard to the GDSN. This initiative is led by GS1 Germany and backed by all major players in the German retail marketplace.

2. Master Data Management also gets implemented by European retailers
My observation is that MDM as a discipline was for quite some time much more widely adopted by US retailers than by European retailers. Therefore European retailers had much more difficulty adopting automated data synchronization and getting rid of paper- or email-based new item forms and the manual processes attached to them.
What I am seeing now is that many European retailers are also looking into MDM as a discipline or a business process, and that many of them have already started an MDM program or are at least preparing to start one right now.


3. Retailers build their own portals to gather item master data on top of GDSN to have a free solution for suppliers and to collect data beyond the standards - and this will boost the usage of GDSN
GDSN on its own is not sufficient for retailers. The goal for retailers regarding data synchronization is to get the data of ALL their suppliers electronically, in a reasonable time frame and in an automated process, instead of getting only a part of it through data synchronization and sticking with manual processes for the rest.
To overcome the chicken-and-egg problem ("only a 'small' number of suppliers are doing GDSN, therefore retailers are not implementing GDSN" versus "retailers are not implementing GDSN and therefore suppliers are not participating in the GDSN"), retailers are more and more implementing their own web portals where suppliers can maintain their item data manually at no charge, in addition to implementing GDSN. By offering a free option to suppliers, retailers really can mandate electronic delivery of item data - either via GDSN or via their own portal.
Why will this boost the usage of GDSN? Because retailers are implementing GDSN, and suppliers only get real benefits from electronic delivery of item data if they implement GDSN themselves instead of manually feeding the retailer portals.

4. E-Commerce in Europe becomes the driver for MDM and GDSN in retailers
E-Commerce is THE driver for electronic product information par excellence. This is how suppliers can advertise and promote their products and directly impact their sales through the different online channels. 
Electronic product information is the equivalent to the packaging in the store!
To manage product information on both the supplier and the retailer side, you absolutely need to implement MDM for product information; otherwise you will not be able to control the data.
But why will it also drive the adoption of GDSN? GDSN is an established infrastructure between suppliers and retailers for exchanging product information, and it is absolutely capable of exchanging the more sales-oriented product information for E-Commerce as well. Why try to establish something else?

5. B2C item information will become integral part of GDSN
As E-Commerce is one of the key drivers, sales-oriented product information (aka 'B2C item information') has to become an integral part of GDSN.
Do we need 'Modular Item' for this? Probably not in the short term, because we could easily live with some more extensions or even with using the 'Specifics Technical Characteristics' extension.
Key is to regard GDSN as the means to receive “trusted data” from suppliers.
Btw., what is your understanding of "B2C data"? Mine is: all data on a product that comes from the business, where the targeted audience is the consumer. This includes all product feature information, every kind of marketing information, manufacturer images and the like. It explicitly excludes all kinds of consumer-generated content (e.g. recommendations) and 3rd-party content (e.g. test reports etc.).

6. Stationary retailers will learn from Amazon that the electronic supply-chain is key for success
Working a lot with stationary retailers, my observation is that those retailers have a lot of challenges in adopting electronic processes for their key business processes, mainly because they have well-established (manual) processes and their business is up and running. It is obviously a huge change task for them (mentally, but also simply because of the huge number of employees impacted) to switch from manual processes which have been working for decades to automated, electronic processes.
But as Amazon is meanwhile very well established globally and is competing more and more with stationary retailers, the latter really have to investigate how they can stay competitive. Full adoption of the electronic supply chain is one of the must-haves.

7. The GS1 system demands an integrated solution for supplier data, item master data and transactional data
You implement GDSN and then you start your first rollout / onboarding program. And you have your first failure, because you learn that your supplier address data is too poor. This can be repaired with a huge manual effort (contact each and every supplier and collect their GLN, the correct contact person and probably some more address information).
But why is there no viable GS1 solution which provides supplier address information to retailers?
Each and every retailer has the same challenges here. By the way, suppliers have a similar problem: where can they get all the DC and store addresses of the retailers?
As a retailer you first need correct supplier information, then you need the item information and based on that you want to do the electronic business processes like orders, invoices, etc.
Why is there no integrated solution for this?
Will this solution emerge in 2012? Probably not. But the more the GDSN is used the more people will recognize that there are some other puzzle pieces still missing.

8. Gepir will be abandoned
I have to admit - I am not seriously predicting that this service will be shut down. But I still have not understood the intention of the Gepir service. You could think that it is a wonderful service where you can find some detail information for each and every GLN and GTIN GS1 has ever issued.
But the data quality there is even worse than what I have seen in most retailers' databases. And that just does not help.
From my perspective, one of the key issues is the way GLNs and GTINs are distributed / sold: it is not mandatory to report what you are using a GLN or GTIN for.

9. GDSN at the tipping point - Not for profit vs. commercial services
The two largest GDSN data pools are meanwhile owned by the two largest GS1 organizations (SA2 Worldsync by GS1 Germany and 1sync by GS1 US). Many smaller GS1 organizations are also offering GDSN services to their communities, either based on 3rd-party technology (many use either SA2 Worldsync or 1sync) or based on their own developments (e.g. GS1 Sweden or GS1 Hungary).
Where are the commercial players in the GDSN area? They have either been bought by GS1 organizations (1sync was formerly Transora, a commercial company; SA2 Worldsync was majority-owned by Pironet NDH, a public company) or they are slowly stepping out of this business, like GXS for example. In the US there are still some commercial companies engaged in the GDSN market, and there are also new players arising, like FSEnet lately, and others.
What does that mean for 2012? It will be very interesting to see how “not for profit” vs. “commercial” performs.
Regarding an integrated GS1 platform as described above, I would think that commercial vendors are in a better position, because GS1 in certain areas has to be careful not to create too much competition for its own solution providers. For example, GS1 will probably not be willing to enter the field of transactional data (EDI), because there are well-established commercial offerings available.

So 2012 will become very interesting in the areas of MDM and GDSN!

Happy New Year!

Sunday, December 11, 2011

How to implement a Data Quality Management system ...


... for product master data!?

Did you ever think it through? Did you ever try to implement one? Actually I was happy to be part of a team at a large retailer implementing a Data Quality Management system.

In principle it is very simple:
But the problems already start with the very first bubble, "Measure the current Data Quality". What is Data Quality? What are the KPIs (KPI = Key Performance Indicator) to measure Data Quality?

First of all you really have to define what Data Quality means to you. And this really depends on your business processes which use the product data and the requirements those business processes have.

In my case, you can probably guess very quickly that the core processes are purchasing and logistics - these were the main drivers at the retailer where I was working on the Data Quality Management system.

And thank goodness, our team was not the very first one that had to implement a Data Quality Management system for a retailer. We did a little research and very quickly found the "Data Quality Framework (DQF)", which was compiled by GS1 to help companies get ready for GDSN.

The DQF is very helpful for implementing a Data Quality Management system for product information if your product information is mainly targeted at supporting business processes in purchasing and logistics. If your requirements go beyond that, you have to extend it.

The DQF consists of KPIs, a guide on how to implement a Data Quality Management system, and a self-assessment procedure.

Most interesting to us were the KPI definitions:

  1. Overall item accuracy: Ok, this indicator is not that surprising - it is the percentage of items that have correct attribute values for the attributes in scope.
  2. Generic attribute accuracy: That is already more interesting. This is the percentage of items that have correct values for some more generic attributes. GS1 defines the following attributes to be in scope for this KPI:
    1. GTIN - that is the key for identifying an item, therefore it really is key ;-)
    2. Classification Category Code - as the classification code is relevant for reporting, it is a very important attribute within the retail industry.
    3. TradeItemDescription - to me this is really a difficult attribute in retail. At all retailers I have been to so far, the buyers always insisted that item descriptions are a means of differentiating from competitors and therefore have to be handcrafted at the retailer, or at least have to comply with the retailer's rules on how the description has to be built. Just as a side note - I think that is wrong and item descriptions in no way drive revenue, but I might be wrong here.
      Therefore we decided to leave that attribute out of our reporting.
    4. Net Content - is important for shelf tags and is therefore one of the really important pieces of information.
  3. Dimension and weight accuracy: Depth, Width, Height and Gross Weight are the key attributes here. Those attributes are not only key for distribution centers but also for your transport/route planning, and therefore have a very strong and immediate impact on logistics.
  4. Hierarchy accuracy: This is absolutely relevant because different business processes use different units of the same item. E.g. you might order an item at pallet level, but your stores order it at case level or even each level from your distribution center. If the packaging hierarchy is not correct, you are in serious trouble!
  5. Active / Orderable: You should not order item units which are not active at your supplier, or which are simply not orderable. This immediately disrupts every automatic, electronic process and therefore has to be avoided.
So with those KPIs you are covering pretty much all the requirements of business processes in purchasing and logistics.

But the question now is: How to measure accuracy for those attributes?

A retailer has two approaches he can take:
  1. Compare the data to the data provided by the supplier.
  2. Do a self-assessment and go to your DC and really measure the physical products to gain that information.
In our project we are doing both. We have implemented a system where we compare the supplier data to our data according to the above KPIs on an ongoing basis. As the supplier data provided through the data pool does not cover 100% of the business, we also calculate how much of the business is covered by this report.
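To make this a bit more concrete, here is a minimal Python sketch of such a comparison, assuming both sides are already matched by GTIN. The attribute names, the 5% tolerance on measured values and the structure of the records are my own illustrative assumptions, not the actual implementation.

```python
# Minimal sketch: compare retailer data against supplier data per KPI.
# Both inputs are dicts keyed by GTIN: {gtin: {attribute: value, ...}}.

IN_SCOPE = ["classification", "net_content", "depth", "width", "height", "gross_weight"]
MEASURED = {"depth", "width", "height", "gross_weight"}
TOLERANCE = 0.05  # accept 5% deviation on measured dimensions and weights

def attribute_matches(attr, retailer_value, supplier_value):
    if retailer_value is None or supplier_value is None:
        return False
    if attr in MEASURED:
        return abs(retailer_value - supplier_value) <= TOLERANCE * abs(supplier_value)
    return retailer_value == supplier_value

def overall_item_accuracy(retailer_items, supplier_items):
    """Return (accuracy, coverage): the share of comparable items whose in-scope
    attributes all match, and the share of retailer items covered by supplier data."""
    comparable = [g for g in retailer_items if g in supplier_items]
    if not retailer_items or not comparable:
        return 0.0, 0.0
    accurate = sum(
        1 for g in comparable
        if all(attribute_matches(a,
                                 retailer_items[g].get(a),
                                 supplier_items[g].get(a))
               for a in IN_SCOPE)
    )
    return accurate / len(comparable), len(comparable) / len(retailer_items)
```

The same pattern applies to the other KPIs; only the attribute set and the match rule change.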

On top of this we are doing a self assessment. The reason for this is mainly to figure out what quality the supplier data has.

From our experience, a Data Quality Management system based on the GS1 Data Quality Framework is a solid basis to manage your MDM program. It gives you the means to document and communicate the progress your MDM program achieves.

---
Update 12.12.2011:

You made it till the end of this post? ;-)
Ok, then I have some more fun stuff for you. I just stumbled over this quite old video from GS1 Netherlands on data quality. But I think it is still to the point, and at least it is fun to watch and listen to:

Saturday, December 3, 2011

Why GEPIR sucks ...

Have you ever tried to use GEPIR?

Lately I needed the address of a Distribution Center of one of the major German retailers. As usual I first tried Google, but there was no address to be found. I found a lot of information on the DC - press releases mentioning it and stuff like that - but no address.

Then I happily remembered GEPIR. GEPIR is a global service from GS1 and GS1 promotes it as "a unique, internet-based service that gives access to basic contact information for companies that are members of GS1. These member companies use GS1's globally unique numbering system to identify their products, physical locations, or shipments." .

And as you know, each and every retailer uses GLNs (Global Location Numbers - the GS1 numbering system for identifying physical locations) to identify their Distribution Centers (DCs) as well.

Ok - I quickly went to http://www.gepir.de (you can also use the more international version http://www.gepir.org or any of the other nationally provided URLs, they all access the same data), typed in the name of the retailer - and was first very surprised and then quite disappointed.

There were only 17 hits and none of them was a Distribution Center!?

I quickly checked several of the major German retailers - EDEKA, Metro, REWE - and none of their Distribution Centers is listed by GEPIR.

And all of them have more than one thousand GLNs to identify their DC and store locations!

GS1 sells each and every GLN, and they are not able to provide the correct master data for their GLNs!???

How come?

Did I get their vision for GEPIR wrong? Is GEPIR not meant to provide all the master data for the GS1 numbers? If that is not the vision of GEPIR, what else would its value be??

I think the vision of GEPIR is to provide all the master data identified by the GS1 identifiers. Regarding the GLN, I expect GEPIR to have all valid GLNs and the physical addresses they refer to. And I would also expect GEPIR to show what type of location that is. Is it a DC? Is it a store? Is it a location used for ordering? Or is it a location used for invoicing? But that is a second issue with GEPIR.

I think GS1 has a simple master data management issue, as many companies do. There is a vision of what GEPIR should be (see above), there might even be a strategy for how to achieve that vision (implement a globally shared database), but they made the typical mistakes you can see in many companies regarding master data management.

They did not establish any metrics. For example, they should measure the coverage of the GLNs within GEPIR compared to the GLNs sold. Then they would know that the coverage is at best only 50-60% and that they have to take measures to improve the situation.
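That metric would be trivial to compute if the issued GLNs and the GLNs resolvable in GEPIR were available side by side - a small sketch with hypothetical inputs:

```python
def gln_coverage(issued_glns, glns_in_gepir):
    """Share of issued GLNs that can actually be resolved via GEPIR."""
    issued = set(issued_glns)
    if not issued:
        return 0.0
    return len(issued & set(glns_in_gepir)) / len(issued)

# e.g. a result of 0.55 would confirm the 50-60% best-case estimate above
```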

They did not establish any sustainable organisation and processes. Who is responsible for getting new GLNs into GEPIR? What about updates of addresses?

And finally, their IT infrastructure does not seem to be really tailored to support their vision. I have even heard that a couple of times GLNs from a single, local GS1 organisation were sold twice. What a mess. And what kind of IT infrastructure is not able to ensure the uniqueness of such a simple number as the GLN?

So in the end, GEPIR currently ends up like many MDM and PIM projects. There is a great tool, even with an iPhone app. But it really sucks, because the master data is either not available or its quality is so poor that you do not want to rely on it.

Care for the content! Care for your master data! Care for your master data quality!

Ok GS1, I really would recommend that you start a Master Data Management program, just as you expect your members to do data synchronisation. A possible starting point can be found here: What is a MDM program?

GEPIR is really out of sync right now!


----
Update 03.12.2011:
There are some really interesting comments on this post on LinkedIn, which you can find here.

Tuesday, October 4, 2011

Why is GDSN ignored by PIM vendors?

Lately I had a very intense discussion on GDSN with one of the major PIM software producers. They were really interested to see whether GDSN could bring them any additional business and whether there were relevant customers in need of GDSN support in a PIM solution.

They got very detailed presentations on GS1 activities in the Retail & CPG and Healthcare spaces. They were also shown the actual GDSN growth figures, and everything was tried to convince them of the relevance of GDSN for their PIM business. They even talked to major retailers using GDSN about their experience and advice, and also to some data pools.

Finally they took some time to evaluate what had been presented to them and drew some conclusions for their business. Actually I was quite surprised by their key conclusions:

  1. Data pools only have relevance in Europe and in the US. This really puzzled me, because at least Australia is one of the most advanced GDSN communities with the highest adoption rates. But looking at most of the other countries - even within Europe - GDSN is not that widely adopted, and manual data exchange (even on paper) is still the most widespread way of data synchronisation, although the adoption rate is continuously growing, as you can see from the Global Registry Statistics.

  2. The data is not maintained sufficiently and the data quality is considered to be very low.
    In this case this is really a two-sided issue. First, I think bad data quality in GDSN data pools is really one of the worst myths of all. In all cases where I did comparisons between retailer product data and the data in a data pool, the retailer data was much worse than the data in the data pool. The data in the data pool was actually always in quite good shape. This is also supported by all recent studies, like the GS1 UK datacrunch report or the report from GS1 Belgium.
    The second side of the issue is that their customers are mainly in the e-commerce space, and what you need there is the marketing and sales information on products - and this is today really not available in the GDSN.
    It really seems to be a challenge within the GSMP process to get a standard developed for marketing- and sales-relevant (or, to use the new hype term, "B2C") data. Take Consumer Electronics, for example: the first hints of additional attributes and an additional standard I found in 2008, and now in 2011 we are getting calls to action to work on the requirements for Consumer Electronics ... To me that does not look convincing either ...

  3. Data enrichment is not covered within the GDSN
    What they meant by "data enrichment" is adding B2C data if it is not available directly from any internal source at the supplier.
    I think here they are going wrong. In my perception, the PIM of a retailer should offer a supplier portal where suppliers can enrich their data manually. Data pools like SA2 Worldsync are also starting to offer such functionalities on top of their data pools.
    As GDSN is "only" a protocol between data pools (and in some respects between a data source or data recipient and its corresponding data pool), it does not deal with "manual" data enrichment processes by design.

  4. They consider GDSN only as an infrastructure-side topic
    Here I think they are really making a big mistake. GDSN is not only about publishing data to a data recipient (retailer), it is also about communication with the retailer (think of the CIC messages which allow retailers to send feedback back to suppliers).
    In my point of view, every serious PIM system has to support the full GDSN choreography. This also means having specific UIs for the corresponding user roles and being aware that there is a new type of user role which also has to be established in the customer's organisation.

  5. As they consider retail one of their major markets, they also consider supplier data onboarding for retailers one of their key tasks - but they do not consider GDSN to be one of their main means of doing so
    As their customers are very much in the e-commerce space and they deal a lot with sales- and marketing-oriented product information, this decision seems to be very much linked to topics 2 and 3 above, i.e. that this type of data is just not available in the GDSN.
    I think, though, that onboarding of supplier data always follows the same principle, independent of whether I need supplier logistics data or some feature information. Only the content differs.
What is my takeaway?

From my point of view, GDSN seems to be struggling with two challenges:
  1. Market perception: The market does not perceive GDSN as a major, global and successful way to synchronize product information - and this despite the huge community behind GS1 Standards and GDSN.
    I think that is really a misperception, mainly because of the huge GS1 community. I am not aware of any comparable standardization organisation which is so user-driven and has such a huge supporting community. But here the second challenge also comes into play.
  2. GDSN adoption and perception by the implementing companies: Although there is such a huge supporting community, the real success stories where the implementing companies have achieved the promised savings are still rare. And this is not because GDSN is the wrong approach, but because companies either have not implemented it at all or have not implemented it properly as part of an MDM program.
In one of my next postings I will discuss some alternatives to GDSN and how a combined approach of GDSN with some alternatives might help user companies to achieve their goals and thereby also help GDSN to improve its market perception.

Friday, September 30, 2011

Have you ever thought through how MDM in a multinational company differs from MDM in a company with a purely national focus?


Do you remember my blog post on how to organize for MDM? There I described how you should organize your MDM activities in principle. If you are working for a big multinational company, you might have thought that this is by far too simple to solve your challenges.

And you are right!

In a multinational company you have, on top of the generic MDM tasks, additional challenges which you have to address within your organization, your processes and your IT infrastructure (especially in your metadata model).

Even if you just start looking at the data of multinational products, you can find very serious challenges for MDM.

Global, regional and local products: In a multinational environment you typically find global products which are sold in and relevant to all regions or countries. But you also find products which are region-specific (e.g. only for the European or Asian market) and very local products which might be available in and targeted at only one country. Typically you want your local organizations to be able to list and sell the local products, while the global products are more or less defined by a central business unit.

Global, regional and local elements in global products: If you have global products which are available in different regional or even local markets, you typically have some product information which is really only targeted at local requirements. One good example of this type of requirement is the "green dot" information in Germany (which, by the way, became obsolete again due to changed legislation). This information was only relevant if you wanted to sell a product in Germany. But even if it was the same product globally, for Germany you specifically had to add the information whether the product had the "green dot" certificate or not.

Multi-branded products: Multiple brandings for the same product can already occur on the industry side, but it is even more common in retail, where you may have two retail brands owned by one company and therefore sharing a big part of their assortment. But when presenting to the market, they typically want to describe the same product in different ways to clearly differentiate their offerings from each other.
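To show how these three challenges could be reflected in the metadata model, here is a minimal sketch in Python. The scope levels, the brand override and the GTIN are purely illustrative assumptions, not taken from any real system.

```python
# Illustrative sketch: product attributes carry a scope (global / region / country)
# and an optional brand, and lookups resolve from most to least specific.
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class AttributeValue:
    value: Any
    scope: str = "global"            # e.g. "global", "region:EU", "country:DE"
    brand: Optional[str] = None      # set for brand-specific overrides

@dataclass
class Product:
    gtin: str
    scope: str                       # where the product itself is listed
    attributes: Dict[str, List[AttributeValue]] = field(default_factory=dict)

    def resolve(self, name: str, scope_chain: List[str], brand: Optional[str] = None):
        """Return the most specific value along e.g. country -> region -> global,
        preferring a brand-specific override when one exists."""
        candidates = self.attributes.get(name, [])
        for scope in scope_chain:                    # most specific first
            for wanted_brand in (brand, None):
                for av in candidates:
                    if av.scope == scope and av.brand == wanted_brand:
                        return av.value
        return None

# Example: the German "green dot" flag only exists at country scope (GTIN made up).
phone = Product(gtin="04012345678901", scope="global", attributes={
    "description": [AttributeValue("Cordless phone", "global"),
                    AttributeValue("Schnurlostelefon", "country:DE", brand="BrandA")],
    "green_dot":   [AttributeValue(True, "country:DE")],
})
print(phone.resolve("green_dot", ["country:DE", "region:EU", "global"]))       # True
print(phone.resolve("green_dot", ["country:FR", "region:EU", "global"]))       # None
print(phone.resolve("description", ["country:DE", "global"], brand="BrandA"))  # Schnurlostelefon
```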

How does all of this impact your organization and processes?

If you are a multinational company, you will have a strategy which also covers the parameters "need for integration" and "need for local responsibility". These two parameters define your internationalization strategy quite well. You have to apply the same thinking to your master data management.
In the end you have to set up a distributed organization for your MDM. A quick shot might look like this: