Twitter Tools’ Data Upgrade Process

One of the important changes in Twitter Tools 3.0 is the way that tweet data is stored in your WordPress database. Instead of writing that content to a separate database table, tweets are now stored as a custom post type within the posts table. This gives us all sorts of great benefits:

  • it utilizes current WordPress best practices, making it more forward compatible
  • an admin interface for editing and deleting tweets (a very common request)
  • the ability to relate tweets via custom taxonomies (view your tweets by account, @mentions, #hashtags)
  • the ability to store the original tweet data as post meta (and allow access to that data for display purposes)
  • the ability to download and save an included photo as the featured image for the tweet (and resulting blog post, if that feature is enabled)
  • theoretical compatibility with other plugins that work with custom post types

The challenge is how best to implement the upgrade. There are two main things that need to happen in the upgrade process:

  1. Convert legacy data to the new format.
  2. Backfill full tweet data for the legacy data.

These sound quite simple on the surface, but there are considerations that complicate matters. The first is that some folks have a lot of tweets. The second concern is with Twitter’s API limits. I didn’t want activity from Twitter Tools to use up all of the Twitter API calls for a given account. An upgrade process that handled these limitations was needed.

First let’s look at the conversion process for the existing data. I’ve seen several people mention that they have 20,000+ tweets to upgrade with 3.0. With this amount of data, you can’t just pull it all in a database query and loop through it. Luckily I anticipated this and created a solution that would handle scale. The upgrade process first adds an “upgraded” column to the legacy tweets table so that it can keep track of status.
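As a rough sketch of that tracking logic (the plugin itself is PHP; this models the idea in JavaScript with hypothetical names, using an in-memory stand-in for the legacy table rows):

```javascript
// Sketch of batch conversion against the legacy tweets table, assuming
// each row now carries the new `upgraded` column. Hypothetical names;
// the real plugin does this with SQL queries in PHP.
function nextBatch(legacyTweets, batchSize) {
  // Only rows that have not been converted yet are candidates.
  return legacyTweets.filter((t) => !t.upgraded).slice(0, batchSize);
}

function convertBatch(legacyTweets, batchSize) {
  const batch = nextBatch(legacyTweets, batchSize);
  for (const t of batch) {
    // ...create the custom post type copy of the tweet here...
    t.upgraded = 1; // mark done so the next request skips this row
  }
  // Report how many rows remain so the caller knows whether to continue.
  return legacyTweets.filter((t) => !t.upgraded).length;
}
```

Because each pass only ever touches one small batch, the work stays bounded no matter how big the legacy table is.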

The upgrade page itself has some JavaScript functionality on it to make everything work. When you click the Upgrade button, it makes an AJAX call to code that grabs 25 tweets¹ that need upgrading, creates new copies of those tweets as custom post type items, marks them as upgraded in the old table, then sends back a response. When that response is received, the progress bar is updated to let the user know things are moving along, and a new request is kicked off to upgrade the next 25 tweets. This continues until all tweets have been converted to custom post types.
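A minimal sketch of that client-side loop, assuming a hypothetical fetchBatch() that wraps the AJAX call and resolves with the number of tweets still waiting:

```javascript
// Sketch of the upgrade page's JavaScript loop (hypothetical names; the
// real plugin uses WordPress's admin AJAX machinery). fetchBatch() stands
// in for the AJAX request: it asks the server to convert one batch and
// resolves with the number of tweets still left to upgrade.
async function runUpgrade(fetchBatch, onProgress) {
  let remaining;
  do {
    remaining = await fetchBatch(); // server converts up to 25 tweets
    onProgress(remaining);          // update the progress bar
  } while (remaining > 0);          // keep going until nothing is left
}
```

Chaining one request at a time like this keeps each server hit small and the browser responsive, which is why a 20,000-tweet archive can be upgraded without a single long-running query.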

So that takes care of the initial conversion from the old data table to the nice elegant post types. Now we just need to get full tweet data for all of these tweets.

I think it’s important to have the full tweet data stored locally because there are lots of interesting things you can do once you have this data. One thing I’m using this for is to create “in reply to X” links where appropriate.
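For example, the raw tweet data from Twitter's API includes in_reply_to_screen_name and in_reply_to_status_id_str fields, so a reply link can be built straight from the stored meta. A sketch (replyLink() is a hypothetical helper, not part of Twitter Tools):

```javascript
// Build an "in reply to X" link from stored tweet data. The field names
// come from Twitter's API; replyLink() itself is a hypothetical helper.
function replyLink(tweet) {
  if (!tweet.in_reply_to_status_id_str) {
    return null; // not a reply, so there is no link to show
  }
  const name = tweet.in_reply_to_screen_name;
  const url = "https://twitter.com/" + name +
              "/status/" + tweet.in_reply_to_status_id_str;
  return '<a href="' + url + '">in reply to ' + name + "</a>";
}
```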

However, I didn’t want your iPhone app to stop working because your WordPress site was trying to load in full data from Twitter and using up all of your API requests. Since I don’t know how many other apps you might be using, I decided to be quite conservative with the process to fetch tweet data for each of the upgraded tweets.

Every hour Twitter Tools makes requests to grab full content for 10 of the upgraded tweets, attaches that data (and processes it to attach @mentions and #hashtags), then clears the “this tweet needs more data” flag. The backfill process can stretch out over days/weeks/months if you have lots and lots of tweets, but that’s OK.
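Sketched out (hypothetical names again; the real plugin schedules this with WordPress cron), one hourly pass looks like:

```javascript
// One hourly backfill pass: take up to 10 tweets still flagged as needing
// full data, fetch and attach that data, and clear the flag. Names are
// hypothetical; the real plugin hooks this into WordPress cron.
const BACKFILL_PER_HOUR = 10; // conservative share of the Twitter API limit

function backfillPass(tweets, fetchFullData) {
  const batch = tweets.filter((t) => t.needsData).slice(0, BACKFILL_PER_HOUR);
  for (const t of batch) {
    t.data = fetchFullData(t.id); // @mentions / #hashtags get processed here
    t.needsData = false;          // clear the "needs more data" flag
  }
  return batch.length; // how many tweets were filled in this hour
}
```

At 10 tweets an hour, a 20,000-tweet archive takes roughly 2,000 hours, or about 12 weeks, to backfill, which matches the "days/weeks/months" expectation above.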

This is data we didn’t have before, so we haven’t given up any features while we wait for it, and Twitter Tools is coded in such a way that this data is not required.

The upgrade process seems to be taking a long time for some users. I can only assume this is due to limitations of their host or server configuration. For my ~4000 tweets the initial upgrade took about 5 minutes.

When I see someone complaining about how long it is taking, I simply translate that in my head to the following compliment:

Wow, I can’t believe that Alex was thoughtful enough to create this elegant upgrade process. Not only does it handle my 20,000+ tweets, but it’s also fetching and filling in useful data in a way that respects the rate limit on my Twitter account.

Thanks Alex!

  1. This was changed in 3.0.1 in the hopes that it would be faster for users who are seeing slow upgrade times. I’m not sure it has helped.

This post is part of the project: Twitter Tools. View the project timeline for more context on this post.