DataPorts and Why They Matter


Improve the Quality of Product Content

Retailers today struggle to manage the product content needed to publish and maintain their online product catalogs. Assembling content from manufacturers is difficult when the information arrives in many different forms. Likewise, manufacturers struggle to publish new product content and syndicate it throughout their channels.


Standardization is one approach, and it works well for critical attributes such as the Global Trade Item Number (GTIN). However, richer structures such as shared product schemas remain difficult to implement throughout the value chain and across different markets and regions.

Recently, IceCream Labs became involved with the Consumer Goods Forum (CGF). One of the CGF's core goals is to improve data exchange throughout the value chain. With executive leadership from many of the largest worldwide retailers and manufacturers behind it, the CGF's initiatives are starting to build real momentum.

Make it Simple

The executives within the CGF want to make sharing information less painful and more interactive while reducing the cost of managing this data. From this need emerged the idea of DataPorts. Conceptually, a DataPort provides a method for peer-to-peer data exchange between any two points in the value chain. This removes the need for data aggregation or hub-and-spoke interactions: any point in the supply chain can talk to any other point.

At the heart of the DataPort implementation are GraphQL and GraphQL schemas. GraphQL is significant because it is emerging as a performant solution for quickly finding related information in a network of connected data. It was developed at Facebook to serve data at social media scale and has since seen broad industry adoption.
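To make that concrete, here is a minimal sketch of what a DataPort's product-content schema might look like, built with the graphql-js reference implementation. The type and field names (Product, netWeight, WeightUnit, and so on) are hypothetical illustrations for this post, not part of a published DataPort standard.

```typescript
// Minimal sketch of a hypothetical DataPort product-content schema,
// built with the graphql-js reference implementation (npm: graphql).
import { buildSchema } from "graphql";

export const dataPortSchema = buildSchema(`
  "A single trade item, keyed by its Global Trade Item Number."
  type Product {
    gtin: ID!
    name: String!
    brand: String
    description: String
    "Net weight, converted on request to the unit the caller asks for."
    netWeight(unit: WeightUnit = GRAM): Float
  }

  enum WeightUnit {
    GRAM
    KILOGRAM
    OUNCE
    POUND
  }

  type Query {
    "Look up published content for one product by GTIN."
    product(gtin: ID!): Product
    "Discover products published since an ISO-8601 date."
    productsSince(date: String!): [Product!]!
  }
`);
```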

What is a DataPort?

In its simplest form, a DataPort is a method for publishing information and then discovering and using that information through data virtualization rather than data federation: the source data stays where it lives and is exposed through a virtual view on demand. DataPorts share a common programming model, and this is what makes peer-to-peer integration possible.

For example, a manufacturer can release a new product line, publish the content to a DataPort, and make that DataPort available for any retailer to query from its own product-catalog DataPort. The retailer can request the new product content, specify the schema it wants, and normalize any values in the process.
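Assuming the hypothetical schema sketched above, the retailer's side of that exchange might look like the following. The query names only the fields the retailer's catalog needs and asks the DataPort to normalize weights in the same round trip (the dataPortResolvers module and dataPortRoot object are also hypothetical, sketched after the service list below).

```typescript
import { graphql } from "graphql";
import { dataPortSchema } from "./dataPortSchema";   // the schema sketched above
import { dataPortRoot } from "./dataPortResolvers";  // hypothetical resolvers, sketched below

// Ask only for the fields the catalog needs, with weights in kilograms.
const retailerQuery = `
  query NewProductContent {
    productsSince(date: "2019-01-01") {
      gtin
      name
      brand
      netWeight(unit: KILOGRAM)
    }
  }
`;

graphql({ schema: dataPortSchema, source: retailerQuery, rootValue: dataPortRoot })
  .then((result) => console.log(JSON.stringify(result.data, null, 2)));
```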

The services delivered by a DataPort currently break down into three broad areas (a sketch of how they might fit together in code follows the list):

1. Abstraction is the process of virtualizing the source data.
2. Transformation operates on the data, performing tasks such as unit conversion.
3. Composition takes a set of results and creates a response from the DataPort.
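As illustration only, here is how those three services might line up in code against the hypothetical schema above. The fetchFromSourceSystem call and the conversion table are stand-ins; a real DataPort would plug in its own backends.

```typescript
// Illustration of the three DataPort services for the hypothetical schema above.

// Abstraction: virtualize the source data. Rather than copying the
// manufacturer's records into a central store, fetch them on demand.
async function fetchFromSourceSystem(sinceDate: string) {
  // e.g. a call into a PIM, an ERP, or a flat-file export
  return [
    { gtin: "00012345678905", name: "Vanilla Ice Cream 500g", brand: "Acme", netWeightGrams: 500 },
  ];
}

// Transformation: operate on the data, here a simple unit conversion.
const GRAMS_PER: Record<string, number> = {
  GRAM: 1,
  KILOGRAM: 1000,
  OUNCE: 28.3495,
  POUND: 453.592,
};
const convertWeight = (grams: number, unit: string) => grams / GRAMS_PER[unit];

// Composition: graphql-js walks the query's selection set and shapes the
// response from exactly the fields the caller asked for.
export const dataPortRoot = {
  productsSince: async ({ date }: { date: string }) => {
    const rows = await fetchFromSourceSystem(date);
    return rows.map((row) => ({
      gtin: row.gtin,
      name: row.name,
      brand: row.brand,
      // graphql-js calls property functions with the field's arguments,
      // so the unit the retailer requested arrives here.
      netWeight: ({ unit }: { unit: string }) => convertWeight(row.netWeightGrams, unit),
    }));
  },
};
```

One design consequence worth noting: because composition is driven by the caller's query, the manufacturer publishes once and every retailer shapes its own response, which is what removes the hub-and-spoke aggregation step.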

At IceCream Labs, we are actively applying our expertise and experience in data extraction, using machine learning models in the abstraction and transformation processes. We already normalize and extract data from source images and unstructured information to generate high-quality product content.

Stay tuned as we continue to explore how DataPorts are changing the way data moves through the supply chain and improving the entire end-to-end process.



Download your copy of the latest DataPorts whitepaper: Solving End-to-End Value Chain Content Integration from the Consumer Goods Forum

About the Author

Mike leads the marketing and product management teams at IceCream Labs. He brings more than 20 years of machine vision, automation, and enterprise software marketing and sales experience to the team.
