Monday, July 11, 2011

Order of chaos - how to integrate social media feeds into your websites

Today you absolutely have to integrate social media into a website, especially a blog. At the very least you include buttons such as the Facebook Send or Like button, the Twitter Tweet button, maybe a Google +1 button, and so on.
This helps spread the information and improves search engine ranking. Of course, it also helps the social media platforms collect behavioral data, but who cares, right? Well, I care, but there is no other way.
Integrating social buttons is generally straightforward: you grab the code, include it in your page and boom, all done! However, if you plan to display message feeds on your pages, things get more complicated. You may decide to use the out-of-the-box widgets, so the content is loaded directly in the browser page; the problem is that this can be reeeeeeally slow. If the content is loaded asynchronously, users may be frustrated by pieces of content appearing late. Waiting for feeds from Facebook, Twitter, YouTube and Flickr is painful: a connection opened to each service, latency on each one; not a good user experience.

There are availability issues too. The Facebook, YouTube and Flickr websites are almost always up, the Twitter website is often up. Still, any service might be down at some point. That includes your own website, but hey, if your site is down it doesn't matter whether the other services are working. Client-side widgets can still be a good solution if you don't display much social services data. But what if you do?
You can of course grab data from all the services using a cron job or a scheduled task, store the content locally and then serve it from your own server. All social services have APIs that let you make server-side calls to obtain data. The front-end social media websites are mostly up, but the APIs are completely different animals. APIs have all kinds of issues, to name a few: services are unavailable or very slow, causing timeouts; there are API call limits (per hour, per IP address); and the APIs change all the time.
The cron job (or scheduled task on Windows) works around the API call limits, and if a service is unavailable the server simply serves the latest information it grabbed. But how do you deal with API changes? Facebook, for instance, changes something every week.
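The "serve the latest data you have" behavior can be sketched as a small cache class; `FeedCache` and its `TryRefresh` method are my own illustrative names, not an existing library:

```csharp
using System;
using System.Collections.Generic;

// Keeps the last successful result; if a refresh fails (timeout, API down,
// rate limit hit), callers still get the previously fetched data.
public class FeedCache<T>
{
    private IList<T> lastGood = new List<T>();
    public DateTime LastUpdated { get; private set; }

    public void TryRefresh(Func<IList<T>> fetch)
    {
        try
        {
            lastGood = fetch();           // e.g. the API call made by the cron job
            LastUpdated = DateTime.UtcNow;
        }
        catch (Exception)
        {
            // API unavailable or throttled: keep serving lastGood.
        }
    }

    public IList<T> Get() { return lastGood; }
}
```

The point of the design is that a failed refresh is invisible to the website: it just shows slightly stale messages.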
The best solution is to build interfaces specific to your application needs, and then write platform-specific providers. This way you only have to adjust the providers when APIs change. For example, you create an interface called IMessage exposing several properties, such as Author, Message and Date, and several methods such as GetLatestMessages(), returning a list of IMessage objects. Your application code only uses IMessage, so when something changes (like Facebook no longer allowing access to public message feeds without token authentication, which broke a lot of applications), you update the library containing the IMessage implementation and the application code does not change at all.
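One way to sketch this in C# (the names IMessageProvider, SocialMessage and TwitterProvider are my own illustration of the idea, not a fixed API):

```csharp
using System;
using System.Collections.Generic;

// Platform-neutral message abstraction used everywhere in the application.
public interface IMessage
{
    string Author { get; }
    string Message { get; }
    DateTime Date { get; }
}

// One provider per platform; only these classes know about the real APIs.
public interface IMessageProvider
{
    IList<IMessage> GetLatestMessages(int count);
}

// Simple implementation shared by all providers.
public class SocialMessage : IMessage
{
    public string Author { get; private set; }
    public string Message { get; private set; }
    public DateTime Date { get; private set; }

    public SocialMessage(string author, string message, DateTime date)
    {
        Author = author; Message = message; Date = date;
    }
}

// A provider stub; a real one would call the Twitter API and translate
// the response into SocialMessage objects.
public class TwitterProvider : IMessageProvider
{
    public IList<IMessage> GetLatestMessages(int count)
    {
        // The API call would go here; returning canned data for illustration.
        return new List<IMessage>
        {
            new SocialMessage("someone", "Hello from Twitter", DateTime.UtcNow)
        };
    }
}
```

When Twitter changes its API, only TwitterProvider needs to be touched; everything consuming IMessage stays the same.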
In conclusion:
  • write a library for social media feeds
  • use interfaces
  • write an import application to store data in a local cache (maybe a database)
  • run the import application on a schedule, just often enough to stay below the social media API call limits
  • use the created library to get data and display it on the website
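The import step above can be sketched as a tiny console application run by cron or Task Scheduler; the FeedImporter name, the string-based fetchers and the file cache are all assumptions for illustration (a real importer would use the IMessage library and probably a database):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

public static class FeedImporter
{
    // Grab the latest messages from each platform and store them locally;
    // the website only ever reads the local copy.
    public static void Run(IEnumerable<Func<IList<string>>> fetchers, string cachePath)
    {
        var lines = new List<string>();
        foreach (var fetch in fetchers)
        {
            try { lines.AddRange(fetch()); }
            catch (Exception) { /* skip an unavailable service this round */ }
        }
        if (lines.Count > 0)
            File.WriteAllLines(cachePath, lines.ToArray());
    }
}
```

Because a failing service is simply skipped, one platform's outage never blocks the import of the others.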
Note: you may find an Inversion of Control technique useful in order to use only interfaces in your application code. Check the examples I wrote for Simple C# inverse of control and C# inverse of control for more details on this subject.
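For a flavor of what that buys you, here is a minimal constructor-injection sketch (all names here — FeedWidget, StaticProvider — are made up for illustration); the widget never names a concrete provider, so swapping Twitter for Facebook or for the local cache happens entirely outside it:

```csharp
using System.Collections.Generic;

// Hypothetical interfaces standing in for the ones discussed above.
public interface IMessage { string Author { get; } string Message { get; } }
public interface IMessageProvider { IList<IMessage> GetLatestMessages(int count); }

public class SimpleMessage : IMessage
{
    public string Author { get; set; }
    public string Message { get; set; }
}

// A trivial provider used to demonstrate the injection; a real one would
// read from the local cache or call a platform API.
public class StaticProvider : IMessageProvider
{
    public IList<IMessage> GetLatestMessages(int count)
    {
        return new List<IMessage> { new SimpleMessage { Author = "me", Message = "hi" } };
    }
}

// The widget depends only on the interface; which concrete provider it
// uses is decided by whoever constructs it.
public class FeedWidget
{
    private readonly IMessageProvider provider;

    public FeedWidget(IMessageProvider provider) { this.provider = provider; }

    public IList<string> Render()
    {
        var lines = new List<string>();
        foreach (var m in provider.GetLatestMessages(5))
            lines.Add(m.Author + ": " + m.Message);
        return lines;
    }
}
```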