Monday, May 09, 2011

How to get an RSS feed from a Twitter account

So it seems that Twitter has killed RSS feeds (I'm not sure what that has to do with implementing OAuth), but fortunately, there's an easy workaround.

You can get an RSS feed by querying another URL for the desired user. For instance, the feed for user ggreenwald is available at this URL:

Saturday, March 12, 2011

Facebook Wall question

I sometimes have the question asked of me, "Danny, you are such a strong advocate of transparency and access; aren't you being a hypocrite by not allowing others to comment on your Facebook wall?"

This has always struck me as a rather strange question because it confuses the obligations of a person with the obligations of a state or corporation. It is true that I think that, say, governments should be relatively transparent -- that's a prerequisite for accountability. I, on the other hand, am a priori accountable to no one so I don't have those same obligations.

This accusation is analogous to claiming that WikiLeaks is a hypocritical organization because its members don't share all their internal communication with the world (many in the mainstream media do make this point after each release from WikiLeaks in order to discredit the organization). As if 1) there is any equivalence between the transparency obligations of states/corporations and activist groups or 2) WikiLeaks isn't facing an existential threat from organizations that would use that internal information to try and destroy the group.

But that aside, there is a valid question contained within the query: why do I prefer to configure my Facebook Wall settings this way? I just don't feel it is worth the time and mental strain to police my Facebook profile around the clock, expunging things said by others (whether out of malice, ignorance or superfluousness) that I would rather not have there. Enough potentially undesirable comments can land on one's profile for this to be a valid concern.

Saturday, May 22, 2010

Multidimensional arrays with GCC's variable length arrays

GCC's extensions allow you to do some crazy things with multidimensional arrays. For C99, GCC implements variable-length arrays. So a declaration like this:

int matrix[a][b + 1];

Can be passed by reference to a function with this prototype:

void foo(int a, int b, int matrix[a][b+1]);

Or, if you want to get really crazy, you can use forward declarations and change around the parameter order:

void foo(int a; int b; int matrix[a][b+1], int a, int b);

No malloc needed! Much better than the old way.

Saturday, April 10, 2010

Which application is using that port?

See which ports are open, as an attacker would:


(Some of the following commands seem to require superuser privileges.)

See which process is using port 25:

netstat -nlp | grep ':25 '

Same, with a bit less info:

fuser -n tcp 25

Or you could also do:

fuser -u smtp/tcp

Discovered here

Wednesday, March 17, 2010

Sansa Clip and Karmic

Upon upgrading to Ubuntu Karmic Koala (9.10), my trusty Sansa Clip player was no longer autodetected as it had been under the Ubuntu release I was previously using. So I scoured the web and found this site, which gives a workaround: with the Sansa Clip disconnected, go to Settings->USB Mode on the Sansa Clip and set it to MSC. It should then present itself to the computer as a normal file system. Unless, of course, the file system is corrupted, in which case you'll have to go to Settings->Format and format the Sansa Clip drive before connecting it again.

Friday, March 05, 2010

Funny C tricks

Taken from Bill Rowan's Stanford ACM presentation:

The "downto" operator:

#include <stdio.h>

int main() {
    int i = 5;
    while (i --> 0)   /* --> is the downto operator! (really "(i--) > 0") */
        printf("%d\n", i);
    return 0;
}

Cast any type to "bool" type (that is, 1 or 0) with double negation:

int b = !!x;   /* 1 if x is nonzero, 0 otherwise */

"Computed goto" (compiler-dependent; uses GCC's unary && label-address operator):

#include <stdio.h>
#include <assert.h>

void print_loop(int s, int e) {
    assert(s < e);
top:
    printf("%d\n", s);
    /* s++/e stays 0 until s reaches e, so this jumps back to top
       until then, and to end afterwards */
    goto *( &&top + ( !!(s++/e) ) * ( &&end - &&top ) );
end:
    return;
}

Tuesday, January 12, 2010

Make a big file into many small files (and back again)

Adapted from this slashdot thread

Create many little files called "chunks00000", "chunks00001", etc. from bigfile, each at most 10 MB in size: split -a 5 -b 10MB -d bigfile chunks

Put the chunks back together: cat chunks* > newbigfile

Do the same, except with compression: tar -zc bigfile | split -a 5 -b 10MB -d - chunks

Put the compressed chunks back together (chunks* rather than chunks000*, which would miss everything after the first hundred chunks): cat chunks* | tar -zxO > newbigfile