I have a number of scripts I’ve written that send updates to Twitter automatically. They are Perl scripts that run database queries and then post to Twitter using curl. Unfortunately, since Twitter turned off basic authentication they’re all dead.
Twitter ended support for basic authentication in order to prevent third party applications from asking Twitter users for their passwords. Phishing for Twitter passwords is rampant, and it’s harder to combat phishing when legitimate sites are asking Twitter users for their usernames and passwords.
While I was working on this blog post, Jon Udell beat me to the punch by posting about the good and bad aspects of migrating to OAuth and a technical guide to the migration.
My use case is simple: I just need to come up with a working equivalent of the following code:
exec("curl -s -u $username:$password -d status=\"$tweet\"
http://twitter.com/statuses/update.json");
Updating a command line script to use OAuth involves three steps. The first is registering the application with Twitter. The second is obtaining an OAuth token that the script can use. The third is updating the script itself to authenticate using OAuth.
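Concretely, the first two steps just leave the script holding four strings, and the third step is where the real work is. Something like this, with placeholder names and values:

# Registering the app and authorizing it for the account yields four
# OAuth credentials (placeholders here):
my $consumer_key        = '...';   # issued when the application is registered
my $consumer_secret     = '...';
my $access_token        = '...';   # issued when the account authorizes the app
my $access_token_secret = '...';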
Twitter’s recommendation in this case is that I stop using curl and migrate to an OAuth or Twitter library instead. To be frank, this sucks. Our servers run Red Hat Enterprise Linux and my systems administrator doesn’t like to install random Perl modules. In researching how to solve this problem, I decided to start with Net::Twitter, which has the following dependencies:
DateTime [requires]
Data::Visitor::Callback [requires]
DateTime::Format::Strptime [requires]
Net::OAuth [requires]
Moose [requires]
JSON::Any [requires]
Try::Tiny [requires]
Moose::Role [requires]
URI [requires]
namespace::autoclean [requires]
Moose::Exporter [requires]
JSON [requires]
MooseX::MultiInitArg [requires]
Those dependencies each have dependencies of their own as well. So I’m looking at moving from a script that is dependent only on the curl command line tool, which is already installed, to a script that requires dozens of Perl modules to be installed in order to work. That’s a deal breaker. As an aside, when I tried to install Net::Twitter on my Mac, the installation failed because the tests for the module didn’t pass.
Before I can even bother registering my silly 15-line Perl script with Twitter as an application and authorizing it for the account that receives these status updates, I have to rewrite it to use a library that I was unable to install on my laptop and probably can't install on the server. The alternative is to write my own OAuth implementation from scratch to avoid getting caught in the mire of dependencies.
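For the record, here is roughly what the from-scratch route would involve. This is only a sketch, untested, using nothing beyond core modules (Digest::SHA, MIME::Base64) and the curl binary; the credential values are whatever Twitter issues when the application is registered and authorized, and the endpoint may need adjusting.

#!/usr/bin/perl
# Rough sketch of a from-scratch OAuth 1.0a signed status update.
use strict;
use warnings;
use Digest::SHA qw(hmac_sha1);
use MIME::Base64 qw(encode_base64);

my ($consumer_key, $consumer_secret) = ('KEY', 'SECRET');          # from app registration
my ($access_token, $token_secret)    = ('TOKEN', 'TOKEN_SECRET');  # from authorizing the account
my $tweet = 'hello world';
my $url   = 'http://twitter.com/statuses/update.json';

# RFC 3986 percent-encoding, as the OAuth spec requires.
sub enc {
    my $s = shift;
    $s =~ s/([^A-Za-z0-9\-._~])/sprintf('%%%02X', ord($1))/ge;
    return $s;
}

my %oauth = (
    oauth_consumer_key     => $consumer_key,
    oauth_token            => $access_token,
    oauth_signature_method => 'HMAC-SHA1',
    oauth_timestamp        => time(),
    oauth_nonce            => time() . $$ . int(rand(100000)),
    oauth_version          => '1.0',
);

# Signature base string: method, URL, and every parameter (including the
# status text), sorted and percent-encoded.
my %params       = (%oauth, status => $tweet);
my $param_string = join '&', map { enc($_) . '=' . enc($params{$_}) } sort keys %params;
my $base         = join '&', 'POST', enc($url), enc($param_string);
my $signing_key  = enc($consumer_secret) . '&' . enc($token_secret);
$oauth{oauth_signature} = encode_base64(hmac_sha1($base, $signing_key), '');

# Send the signed request with curl, OAuth credentials in the Authorization header.
my $header = 'Authorization: OAuth ' . join ', ', map { $_ . '="' . enc($oauth{$_}) . '"' } sort keys %oauth;
system('curl', '-s', '-H', $header, '-d', 'status=' . enc($tweet), $url);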
My Twitter script is just dead for now.
Twitter has traded simplicity for the potential for greater security. The emphasis is on "potential," because the tough part isn't getting third party sites to migrate to OAuth; it's teaching users not to give their passwords to sites that ask for them. Just because sites don't need your password any more doesn't mean that third parties can't still ask for passwords, or that users won't keep entering them when asked.
Update: My scripts started working again a few hours ago with no changes on my end. Has Twitter re-enabled basic authentication temporarily?
September 12, 2010 at 3:38 pm
Don't know if it'd help, but there does exist an OAuth-aware wrapper around curl:
http://code.google.com/apis/buzz/v1/oacurl.html
(although in Java)
September 12, 2010 at 7:37 pm
I wonder how long this will work for write-only use:
September 12, 2010 at 7:42 pm
The most amusing thing I've found in this is that all the Twitter clients I've used since the OAuth switchover had me log in in a framed window, rather than a separate browser window. So they could very easily have been phishing me; I have no way to tell (short of running a sniffer).
September 12, 2010 at 11:43 pm
@Rafe: Not that this makes your job any easier, but you could always install Perl modules in your own directory, assuming you are running your tweeting script as yourself.
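For example, assuming the modules end up somewhere under ~/perl5, the script just needs to be pointed at that directory, roughly:

# Sketch: point the script at a per-user module directory (the path is an
# example; use wherever the modules were actually installed).
use lib "$ENV{HOME}/perl5/lib/perl5";
use Net::Twitter;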
@xiojason, WWW::Mechanize is awesome but IIRC has a bunch of dependencies, although not as many as Net::Twitter appears to have.
September 13, 2010 at 9:19 am
I think there is a bit of a chicken-and-egg problem with OAuth. Until there are enough important sites using it, simple libraries won't emerge. Until simple libraries emerge, important sites will be hesitant to use it.
Perhaps Twitter will start the snowball rolling down the hill with their bold shift to OAuth.
Hopefully eventually you’ll have an OAuth-ified equivalent of:
exec("curl -s -u $username:$password -d status=\"$tweet\" http://twitter.com/statuses/update.json");
September 13, 2010 at 10:03 am
You may want to look at this: http://blog.nelhage.com/2010/09/dear-twitter/
Looks like (at least for the moment) adding a 'source=twitterandroid' param to the URL will allow you to keep using the older Basic Auth API…
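That is, something like this, untested, with the extra parameter added to the request:

exec("curl -s -u $username:$password -d status=\"$tweet\" -d source=twitterandroid http://twitter.com/statuses/update.json");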
September 13, 2010 at 11:07 am
Do you have any secondary server that you’d be able to install the modules on? You could write a two-stage script. Stage 1 can use curl to write to a web service on a different server, and the web service can use OAuth to post to Twitter.
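The second stage could be as small as a CGI along these lines (just a sketch, assuming Net::Twitter is available there; the keys and tokens are whatever Twitter issues for the registered app):

#!/usr/bin/perl
# Sketch of the second stage: a tiny CGI on the machine that can have the
# modules installed, relaying status updates to Twitter over OAuth. The
# first stage stays a plain curl POST to this script's URL.
use strict;
use warnings;
use CGI;
use Net::Twitter;

my $q  = CGI->new;
my $nt = Net::Twitter->new(
    traits              => [qw/API::REST OAuth/],
    consumer_key        => 'KEY',
    consumer_secret     => 'SECRET',
    access_token        => 'TOKEN',
    access_token_secret => 'TOKEN_SECRET',
);
$nt->update(scalar $q->param('status'));
print $q->header('text/plain'), "ok\n";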
September 13, 2010 at 11:23 am
I couldn’t agree more. That’s exactly why I built and put up the http://supertweet.net Proxy service so the following works like it used to on the Twitter API:
exec("curl -u $username:$password -d status=\"$tweet\" http://api.supertweet.net/statuses/update.json");
September 13, 2010 at 11:29 am
The idea of a proxy is a good one. Thanks!
September 13, 2010 at 12:07 pm
You might want to use the command line tool 'bti', which now handles OAuth communication with Twitter. It will give you the ability to do what you were doing with curl in the same way.
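For example, from the existing Perl script, something along these lines should work once bti is installed and configured for the account (untested):

# Sketch: hand the status text to bti on stdin instead of shelling out to curl.
open(my $bti, '|-', 'bti') or die "couldn't run bti: $!";
print $bti "$tweet\n";
close($bti);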
September 13, 2010 at 3:02 pm
I find Twitter's policy of requiring you to "do all you can" to ensure no one exploits your consumer_key and consumer_secret awfully entertaining for open source desktop apps.