I recently completed an article for the HPC conference held in Cetraro, Italy, in 2006. (Ok, I was a little late.) I took the opportunity to write about a topic that I find fascinating, namely the vision and legacy of J.C.R. Licklider. The abstract:
Licklider advocated in 1960 the construction of computers capable of working symbiotically with humans to address problems not easily addressed by humans working alone. Since that time, many of the advances that he envisioned have been achieved, yet the time spent by human problem solvers in mundane activities remains large. I propose here four areas in which improved tools can further advance the goal of enhancing human intellect: services, provenance, knowledge communities, and automation of problem-solving protocols.
Needless to say, I could hardly do justice to such a grand topic, but perhaps my comments will spur some interesting thoughts and responses.
Sun's CTO Greg Papadopoulos coined the term "red shift" to denote the massive IT buildout occurring at the likes of Google, Amazon, and eBay as they provide ever-more-sophisticated services to ever-larger numbers of customers. He posits that this trend will continue, ultimately resulting in a "neutron star collapse of datacenters" to a small number of massive, centralized, highly efficient providers. It's a bizarre choice of term--doesn't a bigger red shift mean that the star we are seeing is farther away and thus older (and probably already extinct)?--but it's certainly a compelling analysis.
I liked an article by Phil Wainewright (riffing on an article by Dan Farber about red shift theory) on the pros and cons of ultra-large data centers. As others have commented to me in the past, having all of your data and computing in a single location is not necessarily a good idea.
I've been talking a lot recently to colleagues in the social sciences, following a workshop on virtual organizations that I ran for NSF with Carl Kesselman, Tom Finholt, and Jonathan Cummings. If I may say so myself, it was a great meeting--certainly not the "same old" discussions (at least from my perspective). The Building Effective Virtual Organizations (BEVO) workshop is a next step in this process.
An interesting article on Red Hat's plans for its "Enterprise MRG" distribution. MRG stands for Messaging (an implementation of the Advanced Message Queuing Protocol), Real-time (a real-time Linux kernel), and Grid (which, for Red Hat, means virtualization support and Condor support for application deployment).
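For readers who haven't run into AMQP, the basic publish path looks roughly like the sketch below. This is just a minimal illustration using pika, one Python AMQP client library (the broker host and queue name are placeholders of my own; Red Hat's own tooling may of course look different):

import pika

# Connect to an AMQP broker (host and queue name are just placeholders).
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='work')

# Publish a message to the default exchange, routed to the 'work' queue.
channel.basic_publish(exchange='', routing_key='work', body='hello, MRG')

connection.close()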
And for those who have always wanted to see the supposedly open-source but never really accessible Condor software, some good news:
As part of the agreement between Red Hat and the University of Wisconsin, the Condor software will seek an OSI-compliant software license that will allow the code to be distributed as part of a RHEL
JOIN US for an exciting 3-day course in large-scale and high-performance grid computing to take place January 23-25, 2008, at Florida International University, Miami, FL.
This intensive course introduces the techniques of grid and distributed computing for science and engineering fields, with hands-on training in the use of large-scale grid computing resources. The course teaches skills that will be needed by researchers in the natural and applied sciences, engineering, and computer science to conduct and support large-scale computation and data analysis in emerging grid and distributed computing environments. The workshop will focus on enabling the use of Open Science Grid (OSG) and TeraGrid cyberinfrastructure to perform large-scale computations and data-intensive processing in various research fields. Participants will learn how to use grids of thousands of processors and will be able to continue to use these resources for their research after the course.
The workshop will cover:
* Overview of distributed computing concepts and tools
* Wide-area high speed optical networking
* Concepts, tools, and techniques of grid computing
* Discovering and using grid resources
* Grid scheduling and distributed data management
* Web service and grid service concepts
* Techniques for workflow and collaboration
Undergraduate and graduate students, researchers, educators, and professionals in engineering, computer science, or any scientific, data- or computing-intensive discipline may apply. Applicants should have at least intermediate programming skills (one to two semesters of experience in C/C++, Java, Perl, and/or Python) and hands-on experience with UNIX/Linux in a networked environment.
Application Deadline: Dec 10.
Notification of Acceptance will be sent by Dec 20.
Registration Deadline: Jan 10.
A report released by McKinsey, Reducing U.S. Greenhouse Gas Emissions: How Much at What Cost?, concludes that the United States could cut its projected 2030 greenhouse gas emissions by 33% to 50% at modest cost, using energy efficiency and other proven, on-the-shelf technologies. As much as 40 percent of the emission reductions could result from policy measures that more than pay for themselves over their lifetimes.
In one of the report's more notable findings, aggressive application of energy efficiency technology in new and existing homes and commercial buildings could eliminate the need for 80 percent of new electricity generation that would be required to meet new demand in 2030 under a business-as-usual scenario in which no new efficiency improvements are made.
Interestingly, of the many emission reduction methods considered, hybrid cars are by far the most expensive.
Many, many years ago I wrote a book on parallel programming. My initial plan was to show how just about everything could be expressed using map-reduce, but I couldn't quite see how to package things in a way that made sense. (Probably I wanted to do it all via language constructs.) So I ended up covering a lot of other material instead. It's neat to see how, with the right abstractions, things turn out so nicely.
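To make that concrete, here is a minimal map-reduce-style word count in Python. The document strings and function names are placeholders of my own, not anything from the book; the point is simply that the "map" step turns each document into (word, 1) pairs and the "reduce" step merges them:

from collections import Counter
from functools import reduce

documents = [
    "man computer symbiosis",
    "computer networks and symbiosis",
]

# Map: turn each document into a list of (word, 1) pairs.
def map_words(doc):
    return [(word, 1) for word in doc.split()]

# Reduce: fold the per-document pairs into a single set of counts.
def merge(counts, pairs):
    for word, n in pairs:
        counts[word] += n
    return counts

word_counts = reduce(merge, map(map_words, documents), Counter())
print(word_counts)  # e.g. Counter({'computer': 2, 'symbiosis': 2, ...})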
example, in Year 1 that useless letter "c" would be dropped to be replased either by "k" or "s", and likewise "x" would no longer be part of the alphabet. The only kase in which "c" would be retained would be the "ch" formation, which will be dealt with later. Year 2 might reform "w" spelling, so that "which" and "one" would take the same konsonant, wile Year 3 might well abolish "y" replasing it with "i" and Iear 4 might fiks the "g/j" anomali wonse and for all.
then, the improvement would kontinue iear bai iear with Iear 5 doing awai with useless double konsonants, and Iears 6-12 or so modifaiing vowlz and the rimeining voist and unvoist konsonants. Bai Iear 15 or sou, it wud fainali bi posibl tu meik ius ov thi ridandant letez "c", "y" and "x" -- bai now jast a memori in the maindz ov ould doderez -- tu riplais "ch", "sh", and "th" rispektivli.
Fainali, xen, aafte sam 20 iers ov orxogrefkl riform, wi wud hev a lojikl, kohirnt speling in ius xrewawt xe Ingliy-spiking werld.