In a previous post, we saw how easy it was to start consuming the Twitter Streaming API and display the messages on a console window. In this post, we’ll take it a step further and try to derive some useful information about the activity stream on Twitter in real time. I’ve given myself a hypothetical goal of deriving an answer to the following question:

Which Twitter users are mentioned most often when a given keyword is included in the tweet?

How useful is this information? One idea could be to identify influencers in real time when you’re expecting a sudden spike in a keyword or hashtag for a sponsored event or TV spot. If people repeatedly mention a particular user alongside your brand, you could connect with that user and help spread your message.

Let’s Get Started

For demonstration purposes, the components I’ll be building include:

  1. twitter_stream_db (SQL Server Database) – This will store the mention count for individual users
  2. MSMQ – Ideally, a queue would sit between the Twitter Streaming API and the database. I’m going to leave this for a future post since I’m just doing a POC at this point. If I were to consider using this in production I would definitely develop a queuing system, but at this stage it’s overkill.
  3. TwitterReader (console) – A small application that reads the Twitter Streaming API. In the full design it would drop messages onto an MSMQ queue; since we’re skipping MSMQ for now, it will write directly to the database.
  4. TwitterWriter (console) – A small application that would read messages from MSMQ and update the database. With no MSMQ, as explained above, there’s no queue to read from. I’ll structure these components so we can scale them out in a later post. For now, consider this just a POC.


This will be a slightly modified version of the console application we created in the previous post. Instead of writing to a console window, we’re going to parse the JSON objects using Json.NET and then insert the users mentioned in the tweet into a table using the SQL MERGE command.

Looking at the code, you’ll see we modified the stream URL slightly to include the keyword we want to track, and we send the JSON result to a new method, ParseJson. In this case, let’s see who the most popular users mentioned are every time someone tweets something with the keyword “love” in it. (I know it sounds corny, but I needed something popular so that I can show off the results.)

We’re using Json.NET’s LINQ to JSON feature to navigate to the user_mentions array. Once we have it, we just loop through all the users in the array and MERGE them into the database table through the stored procedure (see below).
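Here’s a sketch of what that ParseJson method might look like, assuming Json.NET (Newtonsoft.Json.Linq) is referenced; UpdateMentionCount is a hypothetical helper name standing in for the call to the stored procedure:

```csharp
using System;
using Newtonsoft.Json.Linq;

// Sketch only: parse one tweet and upsert each mentioned user.
private static void ParseJson(string json)
{
    JObject tweet = JObject.Parse(json);

    // user_mentions lives under the "entities" object of each tweet.
    JArray mentions = (JArray)tweet.SelectToken("entities.user_mentions");
    if (mentions == null) return;

    foreach (JToken user in mentions)
    {
        long id = (long)user["id"];
        string screenName = (string)user["screen_name"];
        Console.WriteLine("{0} - {1}", id, screenName);

        // Hypothetical helper that invokes the MERGE stored procedure.
        UpdateMentionCount(id, screenName);
    }
}
```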


For simplicity, I’m going to create a single table to store the data as it comes in.  The primary key is the Twitter user’s id since there should be only one record per user at any given time.
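A minimal version of that table might look like this; the table and column names are illustrative:

```sql
-- One row per mentioned Twitter user; the user id is the primary key.
CREATE TABLE dbo.user_mentions
(
    twitter_user_id BIGINT       NOT NULL PRIMARY KEY,
    screen_name     NVARCHAR(50) NOT NULL,
    mention_count   INT          NOT NULL DEFAULT (1)
);
```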

I used the new MERGE command to perform an “upsert” of the data.  If it’s the first time the user has been mentioned, it will perform an insert and set the mention_count to 1.  Otherwise, we’ll update the record by setting mention_count to mention_count + 1.

Since I’m using the MERGE command, I encapsulated it into a stored procedure as opposed to writing LINQ queries. The stored procedure receives a user id and Twitter name and performs the insert/update logic.

The MERGE command is an incredibly useful feature introduced in SQL Server 2008.
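As a sketch, the upsert procedure could look something like the following; the procedure, table, and column names here are assumptions:

```sql
-- Upsert one mention: insert with a count of 1 on first sight,
-- otherwise increment the existing count.
CREATE PROCEDURE dbo.merge_user_mention
    @twitter_user_id BIGINT,
    @screen_name     NVARCHAR(50)
AS
BEGIN
    MERGE dbo.user_mentions AS target
    USING (SELECT @twitter_user_id, @screen_name)
          AS source (twitter_user_id, screen_name)
        ON target.twitter_user_id = source.twitter_user_id
    WHEN MATCHED THEN
        UPDATE SET mention_count = target.mention_count + 1
    WHEN NOT MATCHED THEN
        INSERT (twitter_user_id, screen_name, mention_count)
        VALUES (source.twitter_user_id, source.screen_name, 1);
END
```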

Parsing the JSON Result

As mentioned above,  we’re just going to parse the JSON object and iterate through the users mentioned in the tweet.  As we do that, we’ll pass the users into the stored procedure above and MERGE the data into the SQL table.

Important: Since we’re not using a queuing system, the rate at which we can process tweets will depend largely on the speed of our SQL stored procedure. If you’re considering something similar in a production environment, please implement a queuing system to handle the load.


Once you start running the application, you’ll start to see Twitter ids and screen names appearing on the console window.  Let it run for a few minutes and, depending on the popularity of your search term, you should start to see some results.  You can then go to SQL Server Management Studio and run a simple query to get a view of the activity on Twitter for that keyword / user mention combination.
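For example, a query along these lines would show the most mentioned users; the table and column names are assumptions:

```sql
-- Top 10 mentioned users for the tracked keyword so far.
SELECT TOP 10 screen_name, mention_count
FROM dbo.user_mentions
ORDER BY mention_count DESC;
```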

Hope you enjoyed this post and have some ideas for implementing something similar with your next social media and Twitter campaigns!

Several social networking sites, including Twitter and Digg, have implemented some form of streaming API. This is an emerging pattern among many high-volume content providers. At first glance, a streaming API may seem more resource intensive than a traditional polling API, but as discussed here, it is actually much more streamlined.

Most of these server implementations use some sort of queuing system and a long-lived HTTP connection which clients use to have data delivered in near-realtime. In this post, I’m going to demonstrate just how easy it is to hook into the Twitter Streaming API.  Specifically, we’ll be using the streaming/filter method to consume any tweets which match a filter condition in realtime. I’m sure you can think of some neat ideas that can leverage this concept!

To demonstrate, let’s just create a console application in Visual Studio. We’re going to create a simple WebRequest to connect and just start reading the stream. Then, we’ll print out the JSON result to the console window. I’ll leave it up to you to write some parsing logic to actually do something with the data.
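Here’s a rough sketch of such a console app; the endpoint URL, track keyword, and Basic-auth credentials are placeholders (at the time of writing, the streaming endpoint accepted Basic authentication):

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class TwitterStreamReader
{
    static void Main()
    {
        // streaming/filter endpoint; "track" limits the stream to
        // tweets matching the keyword. Placeholder values throughout.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://stream.twitter.com/1/statuses/filter.json?track=love");
        request.Credentials = new NetworkCredential("username", "password");

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(
            response.GetResponseStream(), Encoding.UTF8))
        {
            string line;
            // Each line of the long-lived response is one JSON-encoded tweet.
            while ((line = reader.ReadLine()) != null)
            {
                Console.WriteLine(line);
            }
        }
    }
}
```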

That’s all there is to it!   Your output on the console window will look like this:

What you do with the data is where the real magic happens. The last I heard, Twitter was producing somewhere in the range of 1,000 tweets per second, and I’m sure it’s much higher than that now.  You will probably want to implement a message queuing system where you hand off the response as soon as possible for some other process to handle without blocking the incoming stream.

In a future post I’ll describe some neat ideas for what you can do with the data from the stream.

Hope this helps.

A common pattern in iPhone applications that load data remotely in a table is to utilize a button at the bottom of the table with a “Load More” label.  Typically, when you click on the Load More button, a request is made to the server to download additional items.  This form of lazy loading enhances the user experience and improves application performance.

I knew Three20 handled this scenario but I couldn’t find a good example that walked through each component that was required to implement it.  It was a challenge to get all the bits to work together so here is my breakdown in case others are facing a similar challenge. I’m going to divide the post into two parts. In part 1 we’ll just hook up the classes with each other then in part 2 we’ll implement the actual request / response logic.

If you’re not familiar with the Three20 library, I highly recommend you take a look at it for your iPhone / iPad projects. For this example, I’ll assume you have integrated Three20 into your project already and are somewhat familiar with the library.

To demonstrate the load more feature we’ll be doing a simple integration to display my public Twitter timeline with paging. Since my timeline doesn’t require authentication, we won’t have to get bogged down with implementing the OAuth protocol. Now, let’s look at the players:


To start, let’s add a subclass of the TTTableViewController. This will just provide us with a base controller with a table view to utilize for demonstration purposes. In addition, we’re going to stub in a class which will serve as the datasource for this controller.


Subclass of TTTableViewController
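As a sketch, the controller can be as small as this; the class name MBTwitterTimelineController is hypothetical, and Three20 invokes createModel when the controller needs its data:

```objc
// MBTwitterTimelineController — hypothetical controller name.
@interface MBTwitterTimelineController : TTTableViewController
@end

@implementation MBTwitterTimelineController

// Three20 calls createModel when the controller needs its data;
// pointing dataSource at our TTListDataSource subclass is all
// the wiring the controller itself needs.
- (void)createModel {
    self.dataSource = [[[MBTwitterDataSource alloc] init] autorelease];
}

@end
```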

That’s pretty much all you have to do within the table view controller, thanks to Three20 encapsulating a lot of the delegate logic into the datasource object. Now, let’s create the datasource for the table.


If you’ve worked with Three20 and table implementations, you’re probably familiar with this class. Again, we’ll subclass it to create a new datasource object which will be consumed by our table view controller.  We’ll also add some code which will work with the TTURLRequestModel.

Subclass of TTListDataSource

A few things are going on in this class. First, we allocate the TTURLRequestModel subclass, which we’ll add in just a minute. We also implement the model method, which we’ll set to return that model instance. This is very important if you want to keep your sanity when debugging later on. Below is the header file for the class.


@interface MBTwitterDataSource : TTListDataSource {
	  MBTwitterRequestModel* _twitterFeedModel;
}
@end
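The implementation side might look roughly like this sketch; only the ivar and class names come from the header, the rest is illustrative:

```objc
@implementation MBTwitterDataSource

- (id)init {
    if (self = [super init]) {
        // Allocate the request model that will talk to the Twitter API.
        _twitterFeedModel = [[MBTwitterRequestModel alloc] init];
    }
    return self;
}

- (void)dealloc {
    [_twitterFeedModel release];
    [super dealloc];
}

// Returning the request model here lets Three20 drive the table's
// loading states from the model's lifecycle.
- (id<TTModel>)model {
    return _twitterFeedModel;
}

@end
```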

The last class we’re going to create will be a subclass of the TTURLRequestModel. This class will encapsulate all the network operations to interact with the Twitter API.


@interface MBTwitterRequestModel : TTURLRequestModel {
}
@end

These are all the classes involved in getting this to work. In the next part, we’ll look at writing the code to interact with the Twitter API and how to actually add the “Load More” button to the table cells.

One of the most popular formats for writing agile user stories follows this template:

As a [role], I want to [feature] so that [goal]

Many organizations translate this into a spreadsheet where each column represents a different field the story writer can fill in:

I have used this template on almost every agile project I have participated in.  For the most part, it works very well. It typically provides a great launch point for additional discussions before it makes it into a sprint.

One of the most common mistakes I see when introducing the template (and agile stories in general) to a new team is a tendency to define every role as just a generic “user”. For example, I’ve seen user stories as generic as:

“As a user, I want to upload photos to a library so that I can use them on a post”

Although it’s syntactically correct, it is vital that we define a role and not a “seat” on the system. A more valuable user story would look like this:

“As a designer or content author, I want to upload photos to a library so that I can use them on a post”

Notice how this story now has a better context of who will be using the feature and how. This could provide valuable insight into the intended audience and how best to implement the feature — a designer may imply more advanced features while a content author would use the image as is.

Always avoid the generic “user” role in your user stories.

Hope this helps.

These kinds of projects are surely a sign of things to come. More and more executives are realizing the value of social media as a robust, scalable and (relatively) inexpensive channel for marketing their brands and generating demand for their products.