The buzzword of early 2012, for anyone in ecommerce, is “big data.” As storage and transmission costs decrease exponentially, the amount of data exchanged, processed, and utilized seems to increase just as fast. The word “data” carries many meanings in the context of communications. In ecommerce, data is the discrete input from customers and the feedback from business analytics, machine processing, transactions, and so on that arrives unformed into a management system, where it is organized and analyzed to eventually become “information” and be used as “knowledge.” Many technologies are being used to perform “big data” functions in ways that are meaningful to organizations.
There is also a great deal of “magical thinking” about finding utility in all the data flowing in and out of large organizations: artificial intelligence, natural language processing, and other sci-fi-sounding terms get thrown around. Though the systems vary as much as the people who use them, one thing has stood the test of time as crucial to leveraging “big data” efficiently and effectively at any scale: quality.
Quality can mean many things when it comes to data. Data originates somewhere and goes somewhere. How it is taken in, where it goes, and where it remains accessible is one form of quality. Accuracy is another. Consistency another. Structure can be crucial to some and a burden to others (MySQL versus NoSQL, relational versus non-relational). The underlying theme of quality is, again, utility: can the data be used and accessed in many different contexts in valuable ways?
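To make those quality dimensions concrete, here is a minimal sketch of what checking an incoming ecommerce record for structure, accuracy, and consistency might look like. The field names and validation rules are hypothetical illustrations, not any particular platform's schema.

```python
def check_record(record):
    """Return a list of quality problems found in one raw record.

    Hypothetical example: an ecommerce order record with an ID,
    a customer email, and an order total.
    """
    problems = []

    # Structure: required fields must be present.
    for field in ("order_id", "customer_email", "total"):
        if field not in record:
            problems.append("missing field: " + field)

    # Accuracy: a total should be a non-negative number.
    total = record.get("total")
    if total is not None and (not isinstance(total, (int, float)) or total < 0):
        problems.append("invalid total")

    # Consistency: an email should at least contain an '@'.
    email = record.get("customer_email")
    if email is not None and "@" not in email:
        problems.append("malformed email")

    return problems


# A clean record passes; a damaged one is flagged.
clean = check_record({"order_id": "A1", "customer_email": "a@b.com", "total": 9.99})
dirty = check_record({"order_id": "A2", "customer_email": "no-at-sign", "total": -5})
```

Even simple gatekeeping like this, applied at intake, is what keeps downstream analysis trustworthy enough to become “information.”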
GigaOM has been tracking this ongoing debate for some time now. It will be interesting to watch “big data” issues grow in importance for public and private organizations as technology embeds itself into the “smart grid,” connected devices, and even “smart home” appliances (offered by Kenmore and others).