A Global Consumer Brand was given a few weeks to create a high-level social media report that would aggregate social media data into a daily report delivered to SVP and C-level global leaders company-wide, each of whom is responsible for different brands in different geographies. The report was developed for a seven-month campaign, in which the data would drive social marketing efforts around the globe and revolutionize digital marketing within the brand. The report would help fuel a data-driven competition among the various country marketing teams, with winners awarded each day based on top engagement metrics. We were asked to provide the data that powered the 'data mart' from which the report would be produced. Three different outside firms were used: a consulting company to create the report, a software company to build the reporting software, and our Clickable team, which was responsible for aggregating all of the social media API data and delivering it to the data mart.


The Clickable solution was to help the client understand the API data and determine what information could reasonably be included in the report to satisfy all of the stakeholders. The extracts would be delivered each day at a specific time to ensure that analysts received "fresh" data for their reports. The Global Consumer Brand's portfolio included 2,000+ locations across 6 different channels (FB, YT, IN, TW, Vine, VK) and required both page-level and post-level data for public and authenticated locations. To accomplish this task, Clickable created an automated pull process that ran each day, just before the reports were to be assembled and delivered to stakeholders. To ensure that Clickable delivered the most recent data, we also created an automated "polling" process that confirmed the APIs had updated their analytics data before each pull (note: each API updates its data on a different schedule). We then worked with the Global Consumer Brand and its other firms to streamline and tie all of the systems together. This process ran for the full seven-month campaign, with data delivered twice daily throughout. Due to the size of the client's portfolio, the frequency of data pulls required to populate the client's data mart, and the sometimes unpredictable results from the APIs, Clickable implemented automated error-detection and QA processes to catch bugs and other data discrepancies coming from the APIs. We also delivered data through a second FTP link and developed a dashboard that gave the Client and its outside firms additional ways to populate their stakeholder reports, if needed.
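The polling process described above can be sketched as follows. This is a minimal illustration, not the production implementation: the function names (`is_fresh`, `poll_until_fresh`) and the idea of each channel exposing a "last updated" timestamp are assumptions for the example, and timeouts and intervals are placeholders.

```python
import time
from datetime import datetime, timedelta, timezone

def is_fresh(last_delivered: datetime, api_updated: datetime) -> bool:
    """True when the API has published analytics newer than our last pull.
    (Assumes each channel exposes some 'last updated' timestamp.)"""
    return api_updated > last_delivered

def poll_until_fresh(last_delivered: datetime, fetch_updated_at,
                     timeout_s: float = 3600, interval_s: float = 60,
                     sleep=time.sleep, now=time.monotonic) -> bool:
    """Poll a channel's 'last updated' value until fresh data appears,
    or give up after timeout_s seconds so the miss can be flagged to QA.
    sleep/now are injectable so the loop can be tested without waiting."""
    deadline = now() + timeout_s
    while now() < deadline:
        if is_fresh(last_delivered, fetch_updated_at()):
            return True   # safe to start the pull for this channel
        sleep(interval_s)
    return False          # stale past the deadline: escalate, don't pull
```

Because each API refreshes on its own schedule, a loop like this per channel lets the daily pull start as soon as that channel's data is actually new, rather than at a fixed guess of when it might be.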
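One simple form the automated error detection can take is a sanity check between consecutive pulls. The sketch below is a hypothetical example of that idea, assuming records keyed by page ID and that certain cumulative metrics (e.g. follower counts, lifetime impressions) should never decrease; the field names are illustrative, not the actual schema.

```python
def find_discrepancies(previous: dict, current: dict,
                       cumulative_fields=("followers", "lifetime_impressions")):
    """Compare two daily pulls and flag suspect records:
    pages that vanished, or cumulative metrics that went backwards
    (both common symptoms of an API returning bad or partial data)."""
    issues = []
    for page_id, prev in previous.items():
        cur = current.get(page_id)
        if cur is None:
            issues.append((page_id, "missing", None))
            continue
        for field in cumulative_fields:
            if field in prev and cur.get(field, 0) < prev[field]:
                issues.append((page_id, field, cur.get(field)))
    return issues
```

Records flagged this way can be held back from the data mart and re-pulled or reviewed, rather than silently feeding a bad number into a stakeholder report.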


Of the three vendors, Clickable's data and solution turned out to be the only one that could be relied upon to generate the report for stakeholders. The other two vendors were let go as the project came to delivery, and Clickable remained as the sole vendor, working with an internal data team to deliver the report. According to the press, the Brand itself is outshining its largest competitor and winning the brand battle in the campaign on which we are reporting.