Wednesday, September 24, 2008

Writing Clearly

There is an interesting brief essay on Ubiquity by Philip Yaffe, entitled "The Three Acid Tests of Persuasive Writing." It is located at http://www.acm.org/ubiquity/volume_9/v9i35_yaffe.html.

Yaffe writes, "I have yet to encounter a computing professional who is not concerned about communicating effectively. It is a waste of time to have to communicate again because an idea did not get across. And it can be enormously frustrating when people tell you that they do not 'get' what you are saying. Especially since most people will not actually tell you that; they will just wander away, leaving you hanging." In the essay, he offers three principles of clear, concise, and dense writing to help overcome this problem.

Philip Yaffe is a former reporter/feature writer with The Wall Street Journal and a marketing communication consultant. He currently teaches a course in good writing and good speaking in Brussels, Belgium. His recently published book In the "I" of the Storm: the Simple Secrets of Writing & Speaking (Almost) like a Professional is available from Story Publishers in Ghent, Belgium (http://www.storypublishers.be/) and Amazon (http://www.amazon.com/).

The Four Kinds of Free

I found this article by Chris Anderson interesting, particularly since many in the industry give something away for "free". I am not going to quote the whole thing, but here is a teaser to get you to visit the item:

A few weeks ago, I posted a diagram grouping free business models into three categories: cross-subsidies (eg, razor-and-blades), three-party markets (ads) and "freemium" (what economists call "versioning"; in this case most people get the free version). But as I was writing through that chapter, I realized that wasn't quite right.

Transformations of organizational structures

If you have heard my talks in the past several years, then you know that I get interested in industry organization and reorganization, especially with regard to the telecommunications industry. Thus, I found this article interesting.

Quoting the article

In brief, most companies today are still an unnatural bundle of three fundamentally different, and often competing, business types:

  • Infrastructure management businesses – high volume, routine processing activities like running basic assembly line manufacturing, logistics networks or routine customer call centers
  • Product innovation and commercialization businesses – developing, introducing and accelerating the adoption of innovative new products and services
  • Customer relationship businesses – building deep relationships with a target set of customers, getting to know them very well and using that knowledge to become increasingly helpful in sourcing the products and services that are most relevant and useful to them
These three business types remain tightly bundled together within most companies today even though they have completely different skill sets, economics and cultures required for success. Inevitably, companies deeply compromise on their performance as they seek to balance the competing needs of these business types. More broadly, this tight bundling decreases agility and diminishes learning capacity.

So how does this apply to the structure of the information organizations that we study? Has anyone taken this kind of analysis to a library?

Sunday, September 21, 2008

Future Technologies

For those interested in ruminating on the implications of future technologies, you might want to read David D. Friedman, Future Imperfect: Technology and Freedom in an Uncertain World (New York: Cambridge University Press, 2008). Friedman states early in the book, "The conclusion I want readers to draw from this book is not that any one of the futures I sketch is going to happen. The conclusion I want them to draw is that the future is radically uncertain. In interesting ways" (p. 4). He discusses, among other things, intellectual property, personal privacy, transparency and its mixed blessings, e-business, open space and scholarship, computer crime, biotechnology, and virtual reality. He is deliberately provocative, and he often presents extreme opposite possibilities of various scenarios.

Thursday, September 18, 2008

Race and Admissions

In this afternoon's Chronicle of Higher Education:

The Education Department’s Office for Civil Rights has aroused the ire of at least one leading civil-rights group by telling colleges receiving federal aid that they may not consider race in admissions unless it is “essential” to their “mission and stated goals.”

The advice to colleges came in a letter of guidance sent to them by Stephanie J. Monroe, the department’s assistant secretary for civil rights, late last month. The letter represents the first attempt by the federal civil-rights office to tell colleges how it will interpret the U.S. Supreme Court’s last major rulings on the use of affirmative action in college admissions, its 2003 Grutter v. Bollinger and Gratz v. Bollinger decisions involving the University of Michigan at Ann Arbor.

The new letter also tells colleges that the diversity they seek “must be broader than mere racial diversity,” that “quotas are impermissible,” and that “providing individualized consideration is paramount and there must be no undue burden on other-race applicants.” In addition, it says, colleges must give “serious good-faith consideration” to race-neutral alternatives before using race in admissions, and the use of race “must have a logical end point.”

Chauncey Bell on Design

Chauncey Bell provides an interesting essay on design in his "My Problem with Design," Ubiquity Volume 9, Issue 34 (September 16 - 22, 2008), available at http://www.acm.org/ubiquity/volume_9/v9i34_bell.html.
Here is what he tries to do in the essay:

"I will raise several questions about the way we commonly interpret "design."

First, our way of understanding design strips apart components, activities, and contexts. I like simplification, but not this kind of atomistic simplification that destroys the context.

Second, the commonplace notions of design don't give observers of the design process strong ways of making sense of the object of the designer's attention - what the designer thinks he or she is designing.

Third, the designer doesn't have a useful way of thinking about who he or she is in the process of design - the role they think they are playing.

Fourth, I want to question the accountability the designer takes in the invention of whatever he or she is designing."

Wednesday, September 17, 2008

GAO reports on cyber security

Given the emphasis on cyber security at SIS, this report from the US GAO is relevant. A second report, also released today, is quite a bit stronger. Quoting the latter report:

While US-CERT’s cyber analysis and warning capabilities include aspects of each of the key attributes, they do not fully incorporate all of them. For example, as part of its monitoring, US-CERT obtains information from numerous external information sources; however, it has not established a baseline of our nation’s critical network assets and operations. In addition, while it investigates if identified anomalies constitute actual cyber threats or attacks as part of its analysis, it does not integrate its work into predictive analyses. Further, it provides warnings by developing and distributing a wide array of notifications; however, these notifications are not consistently actionable or timely.

US-CERT faces a number of newly identified and ongoing challenges that impede it from fully incorporating the key attributes and thus being able to coordinate the national efforts to prepare for, prevent, and respond to cyber threats. The newly identified challenge is creating warnings that are consistently actionable and timely. Ongoing challenges that GAO previously identified, and made recommendations to address, include employing predictive analysis and operating without organizational stability and leadership within DHS, including possible overlapping roles and responsibilities. Until US-CERT addresses these challenges and fully incorporates all key attributes, it will not have the full complement of cyber analysis and warning capabilities essential to effectively performing its national mission.

The ITU and IP Traceback

This item is an example of a topic that should generate interest at SIS. IP traceback is the technical ability to ascertain the source of a message sent over the Internet.

At the heart of the issue is the tradeoff between the need for security and the desire for privacy (in this case, actually, anonymity) in Internet communications. In certain cases, such as speech in a repressive regime, it is easy to make the case for anonymity. In other cases, anonymity seems to be a way to enable socially irresponsible behavior because there is little consequence to the individual. Where does this fall?

Tuesday, September 09, 2008

Science for Sale

Daniel S. Greenberg, Science for Sale: The Perils, Rewards, and Delusions of Campus Capitalism (Chicago: University of Chicago Press, 2007) provides a balanced, critical, and somewhat hopeful analysis of the relationship between corporate research and the university. For example, Greenberg concludes that "overall, for protecting the integrity of science and reaping in benefits for society, wholesome developments now outweigh egregious failings -- though not by a wide margin. Nonetheless, the changes and trends are hopeful" (p. 256). While he argues that there are procedures, good ones generally, in place at universities to enable them to work carefully in research with the corporate and government sector, he is also cautious about the challenges ahead: "Temptations for ethical lapses are abetted by institutional factors that are untamed. The academic arms race giddily accelerates. In Ponzi-scheme fashion, it inflames the pursuit of money for constructing research facilities needed to attract high-salaried scientific superstars who can win government grants to perform research that will bring glory and more money to the university. Academe's pernicious enthrallment by the rating system of U.S. News & World Report is a disgrace of modern higher education" (p. 276). And so forth. . . . this is an important book.

Down with Blackboard?

There is an interesting new article in the Chronicle of Higher Education about the continuing problems with Blackboard, the company’s lack of service to its customers, and the growing range of alternative teaching software. Any of you who have found yourselves screaming at problems with Blackboard will want to read the article. Jeffrey Young, “Blackboard Customers Consider Alternatives,” CHE, September 12, 2008, available at http://chronicle.com/free/v55/i03/03a00103.htm?utm_source=at&utm_medium=en

Monday, September 08, 2008

Cloud computing

In his recent book, The Big Switch, Nicholas Carr makes the case for the emergence of cloud (or utility) computing as the next means by which we will process information. In this item, Joe Weinman asserts 10 laws of cloud computing:

Cloudonomics Law #1: Utility services cost less even though they cost more. An on-demand service provider typically charges a utility premium — a higher cost per unit time for a resource than if it were owned, financed or leased. However, although utilities cost more when they are used, they cost nothing when they are not. Consequently, customers save money by replacing fixed infrastructure with clouds when workloads are spiky, specifically when the peak-to-average ratio is greater than the utility premium.
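Law #1 boils down to a simple break-even comparison. This is not from Weinman's piece, just a minimal sketch with made-up numbers (the demand figures and the 2x premium are illustrative assumptions):

```python
# Illustrative sketch of Cloudonomics Law #1: owned infrastructure must
# be sized for peak demand and paid for around the clock, while a
# utility charges a premium but only for actual usage.

def owned_cost(peak_demand, unit_cost, hours):
    # Fixed infrastructure is provisioned for the peak and paid for 24/7.
    return peak_demand * unit_cost * hours

def cloud_cost(avg_demand, unit_cost, hours, utility_premium):
    # On-demand capacity costs more per unit-hour, but only when used.
    return avg_demand * unit_cost * utility_premium * hours

# A spiky workload: peak-to-average ratio of 4, utility premium of 2x.
peak, avg = 100, 25          # servers
hours, unit = 720, 1.0       # one month, $1 per server-hour

print(owned_cost(peak, unit, hours))      # 72000.0
print(cloud_cost(avg, unit, hours, 2.0))  # 36000.0
# The cloud wins whenever peak/average exceeds the premium (here 4 > 2).
```

With a flat workload (peak close to average), the same arithmetic flips and owning is cheaper, which is exactly the condition the law states.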

Cloudonomics Law #2: On-demand trumps forecasting. The ability to rapidly provision capacity means that any unexpected demand can be serviced, and the revenue associated with it captured. The ability to rapidly de-provision capacity means that companies don’t need to pay good money for non-productive assets. Forecasting is often wrong, especially for black swans, so the ability to react instantaneously means higher revenues, and lower costs.

Cloudonomics Law #3: The peak of the sum is never greater than the sum of the peaks. Enterprises deploy capacity to handle their peak demands – a tax firm worries about April 15th, a retailer about Black Friday, an online sports broadcaster about Super Sunday. Under this strategy, the total capacity deployed is the sum of these individual peaks. However, since clouds can reallocate resources across many enterprises with different peak periods, a cloud needs to deploy less capacity.

Cloudonomics Law #4: Aggregate demand is smoother than individual. Aggregating demand from multiple customers tends to smooth out variation. Specifically, the “coefficient of variation” of a sum of random variables is always less than or equal to that of any of the individual variables. Therefore, clouds get higher utilization, enabling better economics.
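The smoothing effect in Law #4 is easy to check numerically. The following is my own quick simulation, not part of the article; the three customers and their 50 percent coefficient of variation are assumed for illustration:

```python
# Numerical check that aggregating independent demands lowers the
# coefficient of variation (stdev / mean) relative to the individuals.
import random
import statistics

random.seed(42)

# Simulate hourly demand for three independent customers, each with a
# true coefficient of variation of about 0.5.
customers = [[random.gauss(mu, mu * 0.5) for _ in range(10_000)]
             for mu in (10, 20, 40)]

def cv(samples):
    # Coefficient of variation: standard deviation divided by the mean.
    return statistics.pstdev(samples) / statistics.mean(samples)

# Aggregate the three demand streams hour by hour.
aggregate = [sum(hour) for hour in zip(*customers)]

print([round(cv(c), 3) for c in customers])  # each roughly 0.5
print(round(cv(aggregate), 3))               # noticeably smaller
```

Because variances of independent demands add while means add linearly, the aggregate's relative variability falls, which is why a shared cloud runs at higher utilization than any single tenant could.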

Cloudonomics Law #5: Average unit costs are reduced by distributing fixed costs over more units of output. While large enterprises benefit from economies of scale, larger cloud service providers can benefit from even greater economies of scale, such as volume purchasing, network bandwidth, operations, administration and maintenance tooling.

Cloudonomics Law #6: Superiority in numbers is the most important factor in the result of a combat (Clausewitz). The classic military strategist Carl von Clausewitz argued that, above all, numerical superiority was key to winning battles. In the cloud theater, battles are waged between botnets and DDoS defenses. A botnet of 100,000 servers, each with a megabit per second of uplink bandwidth, can launch 100 gigabits per second of attack bandwidth. An enterprise IT shop would be overwhelmed by such an attack, whereas a large cloud service provider — especially one that is also an integrated network service provider — has the scale to repel it.

Cloudonomics Law #7: Space-time is a continuum (Einstein/Minkowski). A real-time enterprise derives competitive advantage from responding to changing business conditions and opportunities faster than the competition. Often, decision-making depends on computing, e.g., business intelligence, risk analysis, portfolio optimization and so forth. Assuming that the compute job is amenable to parallel processing, such computing tasks can often trade off space and time, for example a batch job may run on one server for a thousand hours, or a thousand servers for one hour, and a query on Google is fast because its processing is divided among numerous CPUs. Since an ideal cloud provides effectively unbounded on-demand scalability, for the same cost, a business can accelerate its decision-making.

Cloudonomics Law #8: Dispersion is the inverse square of latency. Reduced latency — the delay between making a request and getting a response — is increasingly essential to delivering a range of services, among them rich Internet applications, online gaming, remote virtualized desktops, and interactive collaboration such as video conferencing. However, to cut latency in half requires not twice as many nodes, but four times. For example, growing from one service node to dozens can cut global latency (e.g., New York to Hong Kong) from 150 milliseconds to below 20. However, shaving the next 15 milliseconds requires a thousand more nodes. There is thus a natural sweet spot for dispersion aimed at latency reduction, that of a few dozen nodes — more than an enterprise would want to deploy, especially given the lower utilization described above.
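The inverse-square relationship in Law #8 can be sketched in a few lines. The 150-millisecond baseline comes from the text above; the function itself is my back-of-envelope model, not Weinman's:

```python
# Back-of-envelope model of Law #8: the number of service nodes needed
# grows with the inverse square of the target latency, n ~ (L0 / L)^2.

def nodes_needed(target_latency_ms, base_latency_ms=150.0, base_nodes=1):
    # Halving latency requires four times as many nodes.
    return base_nodes * (base_latency_ms / target_latency_ms) ** 2

for latency in (150, 75, 20, 5):
    print(latency, round(nodes_needed(latency)))
```

The numbers illustrate the sweet spot the law describes: getting from 150 ms down to around 20 ms takes on the order of dozens of nodes, while each further halving multiplies the node count by four again.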

Cloudonomics Law #9: Don’t put all your eggs in one basket. The reliability of a system with n redundant components, each with reliability r, is 1 - (1 - r)^n. So if the reliability of a single data center is 99 percent, two data centers provide four nines (99.99 percent) and three data centers provide six nines (99.9999 percent). While no finite quantity of data centers will ever provide 100 percent reliability, we can come very close to an extremely high reliability architecture with only a few data centers. If a cloud provider wants to provide high availability services globally for latency-sensitive applications, there must be a few data centers in each region.
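The redundancy arithmetic in Law #9 is worth seeing worked out; this one-liner just evaluates the formula quoted above:

```python
# Law #9's redundancy formula: probability that at least one of n
# independent replicas (each with reliability r) is up.

def system_reliability(r, n):
    # The system fails only if all n components fail simultaneously.
    return 1 - (1 - r) ** n

print(round(system_reliability(0.99, 1), 6))  # 0.99
print(round(system_reliability(0.99, 2), 6))  # 0.9999, "four nines"
print(round(system_reliability(0.99, 3), 6))  # 0.999999, "six nines"
```

Each added data center multiplies the failure probability by another factor of (1 - r), which is why a handful of sites gets so close to, yet never reaches, 100 percent.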

Cloudonomics Law #10: An object at rest tends to stay at rest (Newton). A data center is a very, very large object. While theoretically, any company can site data centers in globally optimal locations that are located on a core network backbone with cheap access to power, cooling and acreage, few do. Instead, they remain in locations for reasons such as where the company or an acquired unit was founded, or where they got a good deal on distressed but conditioned space. A cloud service provider can locate greenfield sites optimally.

A comic book user manual

OK, so perhaps they should be called graphic novels ... in any case, I found this article from the NY Times interesting. It describes how Google used a comic book as the technical documentation for its new "Chrome" browser. Quoting the article:

Mapping out the comic involved several brainstorming sessions. “It was, in the earliest stages, an enormously complex process, trying to work from these very geek technical details towards a visualization that would be accessible, but not condensing, not shallow,” Mr. McCloud said. Explaining new browser technology means getting into potentially eye-glazing details, but Mr. McCloud offset that arcane matter with clever, anthropomorphic depictions of overworked browsers and guilty-looking plug-ins.

For Mr. McCloud, the real opportunity was not to introduce a browser, but to show how effective comics can be at communicating complex ideas. “I don’t think the potential for comics in nonfiction has been exploited nearly as much as it could be,” he said. And what they can teach people should not be underestimated. “When you’re on an airplane and your life depends on it,” Mr. McCloud said, “comics are going to tell you how to open an emergency exit.”

Wednesday, September 03, 2008


I thought this, just posted on the afternoon edition of the Chronicle of Higher Education, was really interesting.

September 3, 2008
NIH Tries to Buy Eureka Moments With New Round of Grants
Few, if any, scientific discoveries prompt a slap to the forehead and a shout of “Eureka,” as Archimedes is said to have done.

But that hasn’t stopped the National Institutes of Health from chasing after truly novel work that could push research in new directions. Today, the agency announced it was giving out $42.2-million to 38 “exceptionally innovative research projects that could have an extraordinarily significant impact on many areas of science.” Each will get $200,000 in direct costs for up to four years.

The awards are the first to be made under a new initiative with the acronym “Eureka,” for “Exceptional, Unconventional Research Enabling Knowledge Acceleration.” For example, Iswar K. Hariharan, a professor of cell and developmental biology at the University of California at Berkeley, won a grant to study new ways to promote tissue regrowth.

The Eureka awards are one way the NIH has been trying to respond to criticism that its ultracompetitive grant-making process favors established researchers and relatively safe projects that are sure to deliver results.

Last year the agency started granting new innovator awards to support researchers early in their careers. The Pioneer award program, by contrast, provides $500,000 per year for five years to researchers with a track record of opening new areas in research.

Monday, September 01, 2008

Technology Irritations

Writer Jonathan Franzen’s entertaining account of new technologies and the implications for society can be found in “I Just Called to Say I Love You,” Technology Review 111 (September/October 2008): 88-95, also available at http://www.technologyreview.com/Infotech/21173/page1/.

His essay starts out: “One of the great irritations of modern technology is that when some new development has made my life palpably worse and is continuing to find new and different ways to bedevil it, I'm still allowed to complain for only a year or two before the peddlers of coolness start telling me to get over it already Grampaw--this is just the way life is now.”