In mid-December, the company pledged on its blog, "Going green at Google," that it would make its operations carbon-neutral and reduce the greenhouse gases blamed for global warming.
One big challenge for search engines is to implement a measure of quality that is not based solely on popularity. Search engines must determine both relevance (is the item pertinent to the user's query?) and quality (is the item inherently accurate, useful and understandable, independent of the query?). Current relevance measures do reasonably well. Measures of quality require better models of the concepts and relations expressed in documents and how they relate to the reality of the world, as well as models of the trustworthiness of authors. Thus, a site that claims that the Moon landings were a hoax and seems to have a coherent argument structure will be judged to be lower quality than a legitimate astronomy site, because the premises of the hoax argument are at odds with reality. Understanding and improving these models is a key challenge for the coming decade.
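The relevance/quality split described above can be sketched in a few lines. This is a toy illustration, not any search engine's actual ranking: the term-overlap relevance function, the `trust_signals` inputs, and the blending weight `alpha` are all hypothetical stand-ins for the far richer models the passage calls for.

```python
def relevance(query_terms, doc_terms):
    """Query-dependent score: fraction of query terms the document contains (toy measure)."""
    doc = set(doc_terms)
    return sum(t in doc for t in query_terms) / len(query_terms)

def quality(trust_signals):
    """Query-independent prior: average of hypothetical trust signals in [0, 1]
    (e.g. author trustworthiness, consistency of premises with known facts)."""
    return sum(trust_signals) / len(trust_signals)

def score(query_terms, doc_terms, trust_signals, alpha=0.7):
    """Blend relevance with quality; alpha sets their relative weight."""
    return alpha * relevance(query_terms, doc_terms) + (1 - alpha) * quality(trust_signals)
```

Under this sketch, a Moon-hoax page and an astronomy site can be equally *relevant* to a query yet rank differently once the quality prior is applied, which is the distinction the passage draws.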
Between this announcement and the piece from a Google bigwig in the latest Nature rag stating that they want to control the quality of information in future searches, it is becoming obvious that this is quite the neo-fascist outfit.
(Since Microsoft removed it from their code, it is next to impossible to get a good Google logo with the o character transposed to a swastika, as is meet to illustrate the principle, but as an optimist I am sure this will be forthcoming soon.)