Google and Privacy
Michael Arrington at TechCrunch notes with alarm the release of Google Desktop 3.0, which enables search of your desktop files from other computers by maintaining an index of your local files on Google's servers. (It should be noted that use of the feature is optional.)
Arrington links to a Gillmor Daily discussion about the privacy implications of this. I usually like listening to these guys, but I thought their opposing positions both missed the point, badly.
Arrington seems to be most worried about the potential for abuse by The Government, especially in the wake of Google's agreement to share information with the Big Bad Bush Administration.
Gillmor scoffs at the concern, not because he doesn't think the current administration is indeed The BogeyMan, but because (if I understand him correctly) They have more and better means to abuse our privacy anyway, and the value of storing our data in the internet cloud is so compelling that the right response is to manage ownership of that data. He specifically asks why Google's storage of indices is any worse than storing private data with an online backup service.
What I Find Troubling About This
To take the last point first, here's the fundamental difference from my point of view: in the case of backup, I build a package of my backup data locally, encrypt it to make it (hopefully) impenetrable, then move it into the cloud. In the case of Google Desktop, my data is moved to Google's servers in a form where they can peek into it, in the name of search.
I find this troubling. Google promises not to share my data with anyone. Fine, let's accept that and hope that Google is better at protecting it than the Boston Globe, by whose idiocy I was personally affected.
Being able to search my data and affirmatively determine whether it contains references to certain keywords conveys information about me, even if the content of that data is not explicitly shared. Google does not do this, as far as I know; they certainly don't let anyone ask "who has references to Al Qaeda or Barry Manilow in their index?" But there's a spectrum of possibilities between complete privacy and outright sharing.
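The point is easy to demonstrate. Here's a minimal sketch (hypothetical data, standard library only) of why a search index by itself answers "does this person's corpus mention X" questions, even if the underlying documents are never read back:

```python
# A toy inverted index: the data structure behind keyword search.
# Whoever holds the index can answer membership queries about your
# documents without ever looking at the documents themselves.
from collections import defaultdict

def build_index(docs):
    """Map each word to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

# Hypothetical private documents; they stay on "my" machine...
docs = {
    "note1": "meeting about the manilow tickets",
    "note2": "grocery list milk eggs",
}

# ...but once the index is on someone else's server, that party can
# test any keyword against it:
index = build_index(docs)
print("manilow" in index)  # True: the corpus mentions Manilow
print("qaeda" in index)    # False: no such reference
```

A real search index is far more sophisticated, but the principle is the same: the index is derived from the content, and membership queries against it leak facts about the content.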
As for government surveillance, challenging the government on its exercise of power is always healthy, but Arrington and Gillmor are so focused on it that they race each other to proudly label themselves as unpatriotic for wanting to retain their privacy; it's actually comical.
While it's legitimate to be vigilant to ensure that we don't become a police state over the next 20 years, I also try to make sure that many small, annoying things aren't done to me on a daily basis between now and then.
Up to now, Google has commercialized what they know about your interests as expressed by your search queries, and there's a strong element of fairness to this. If you go to Google and type "Barry Manilow" into the search box, you're transmitting your interest in the subject to Google, and they use this to deliver discreet ads with your search results that reflect your expressed interest.
Controlling Attention Data
Gillmor is an advocate of giving the user control of this information about themselves, as exemplified by the principles espoused by AttentionTrust.org, with which he is intimately involved. Their notion is that the user ought to have control of the information about their attention, what is included in it, and how it is shared.
In the discussion with Arrington, Gillmor seems content that users can exclude specific folders from Google's net-based indexing, and that this event will trigger the concern and discussions necessary to convince Google that they need to provide better control of this information. I'm less confident about this. Desktop search is most valuable when it can help you find stuff in obscure places, and constraining the search indexing reduces its usefulness. It also places the burden on the user to proactively identify all the places sensitive data lives now or might possibly land in the future, and I don't think this is realistic. In theory, the user has the control; in practice, it's too easy not to exercise it.
Capability May Not Be Intent, But That Hasn't Stopped Us Before
Years ago, ActiveX came into being because the techies at Microsoft (and elsewhere) thought it would be cool and powerful to enable program code to be embedded in emails (and later, web pages). There were some clever ideas; when someone received an email requesting them to perform an action, the email could embed code that would perform, or support the user's performance of, the requested actions. There were all sorts of scenarios drawn up where this could be a significant productivity enhancement.
In hindsight, most people agree that this was a Bad Idea: used maliciously, this capability became a conduit, perhaps the most exploited one, for spreading malware. The consequence is that the innocent have been inconvenienced. For example, my wife is an elementary school teacher, and in the name of security she is now prevented from installing and trying out many potentially useful pieces of new software, the very thing that is supposed to make computers flexible and powerful.
Like Microsoft, Google is, I think, optimistically focusing on the potential value of desktop search across the network, and in their enthusiasm is blind to the potential for abuse. I hope I'm wrong.
To those who think, like Steve Gillmor, that it's alarmist to worry about the potential for abuse instead of the deed itself, I would point out that wiretapping and gun control laws are based on the same premise: good intentions aren't sufficient protection.
I'm not proposing that Google Desktop be outlawed. (When Google is outlawed, only outlaws will Google.) But I'm not installing Google Desktop (I use Macs anyway; small sacrifice), and I don't use GMail for my personal correspondence. (*)
(*) I do use GMail to collect the traffic from related public mailing lists, so I can search them in one place.