Thursday, December 18, 2008

3G network test

On the off chance that you have missed the advertisements in which carriers brag about their latest "3G" network speeds, this item might not be of particular interest. If you have seen them, you might find the non-scientific study done by gadget blog Gizmodo worth a look. In brief, here is what they found. In the tested cities, download speeds are shown in this graph:



And upload speeds are in this graph:



So, what does this mean to you? Well, first off, if you want a more symmetric connection, AT&T appears to be the best choice. For download speeds, on average, there isn't much difference between carriers. Finally, speeds vary considerably depending on where you are. Helpful?

Educational Values

As we prepare to enter the holidays, burdened by complaints from students upset about getting an A instead of an A+, whining about grades rather than wanting to talk about what they might have learned, I thought this article by Ralph Hexter, president of Hampshire College, in Inside Higher Education provided a bigger perspective.

Here are some excerpts:

“Much of what lies behind our current economic train-wreck stems from short-sightedness — focus on short-term goals and gains — and near-sightedness — seeking to maximize one vector without regard for context in which that vector has value to begin with.”

Hexter discusses how higher education has, perhaps, missed the boat in educating students rather than simply perpetuating a race for high grades and credentials.

“The system we use to grade students doesn’t just mirror this scale of values. It blesses and promotes it. Even as the admissions officers of our most prestigious colleges and universities claim to seek “well-rounded students,” they are choosing among students who have already learned to play the high-score-and-grades game in high school. Most colleges and universities do not question what students and their parents want of them: Enough seats in the “right” majors so they can get their passport to a professional school. How? By wracking up the same string of A’s during their undergraduate years as they did before. Little time for experimentation, for taking risks — where the only “loss” might be a less than perfect transcript. If they don’t get into the right graduate or professional program they might not get the credential that is the ticket to a job where they can reap larger profits more quickly than those who went before them, in the same fields. Because, the assumption is, those fields will always be profitable.”

Hexter then describes how his institution has done away with grades.

“This philosophy undergirds Hampshire’s whole system of education. Instead of choosing among pre-set majors — predetermined fields with established questions — each student crafts a unique educational plan of work that must be approved by two professors. Each student submits a portfolio to show that she or he has achieved the agreed-upon goals, and faculty evaluate the totality of each student’s accomplishments. Our students come to know that the first step in learning is defining the question and setting it in context. Even more: To take responsibility for deciding which questions to ask, quite often of a status quo that seems unassailable, and then by means of study, research, interrogation, and creative reflection, to reframe the question in light of changing circumstances.”

I need to search for something like this. I am tired of students preoccupied with grades and credentials who miss the point that they are here to learn something. Radical experimentation is in order.

The article is “The Economic Collapse and Educational Values” and is at http://www.insidehighered.com/views/2008/12/18/hexter.

Thursday, December 11, 2008

New report on "cord cutters"

I have been following the "cord cutter" trend for some time. This report from Nielsen provides a bit more fodder.



From this, we can conclude that cord cutters are much more common among the under-34 age group who rent. Not surprising to me, but it is good to see data.

Tuesday, November 25, 2008

Cyberchondria

With people doing self-diagnosis online (say, with WebMD), it is easy to fall into this trap: "you've got a raging stomachache and you're feeling kind of fatigued, so you search online for the cause of your malady and conclude that you've got cancer." This link to the related Microsoft study comes via Lifehacker and the NY Times.

Monday, November 24, 2008

University of the Future?

Maybe here is the solution to some of our large class sizes. . . . from Chronicle of Higher Education, November 28, 2008
Jeffrey R. Young, “Will Electric Professors Dream of Virtual Tenure?”

Excerpts . . . .

Last month at the NASA-Ames Research Center, a group of top scientists and business leaders gathered to plan a new university devoted to the idea that computers will soon become smarter than people.

The details of Singularity University, as the new institution will be called, are still being worked out — and so far the organizers are tight-lipped about their plans. But to hold such a discussion at all is a sign of growing acceptance that a new wave of computing technologies may be just ahead — with revolutionary implications for research and teaching.

The idea that gave the new university its name is championed by Ray Kurzweil, an inventor, entrepreneur, and futurist who argues that by 2030, a moment — the "singularity" — will be reached when computers will outthink human brains. His argument is that several technologies that now seem grossly undeveloped — including nanotechnology and artificial-intelligence software — are growing at an exponential rate and thus will mature much faster than most linear-minded people realize. Once they do, computers will take leaps forward that most people can hardly imagine today.

Computerized research assistants might even do some of the work that graduate assistants do today. Professors will be able to assign hundreds of these electronic assistants to problems without having to get grant money to pay them.
Computers will become better at teaching than most human professors are once artificial intelligence exceeds the abilities of people. . . . These new computer teachers will have more patience than any human lecturer, and they will be able to offer every student individual attention — which sure beats a 500-person lecture course.

http://chronicle.com
Section: Information Technology
Volume 55, Issue 14, Page A13

Sunday, November 23, 2008

If you liked this ...

There is an interesting article by Clive Thompson in this Sunday's NY Times Magazine on the progress of the $1 million Netflix challenge to improve their recommender results by 10%. The article does not shy away from the mechanics of search algorithms, including a discussion of how Singular Value Decomposition is useful for summarizing the search space. This level of detail is often left out of the popular press when describing algorithms.

(By the way, one of the earliest and most cited papers on SVD is Deerwester, S., Dumais, S. T. , Furnas, G. W., Landauer, T. K., & Harshman, R., Indexing by Latent Semantic Analysis, JASIS, 1990.)
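To make the SVD idea a bit more concrete, here is a minimal sketch in Python (not anything from the contest, and with made-up ratings) showing how a low-rank decomposition of a small user-by-movie matrix yields estimates for the cells a user has not rated:

    import numpy as np

    # Toy ratings matrix: rows are users, columns are movies, 0 means "not rated".
    # (Illustrative numbers only; real systems treat missing entries more carefully.)
    R = np.array([
        [5, 4, 0, 1],
        [4, 5, 1, 0],
        [1, 0, 5, 4],
        [0, 1, 4, 5],
    ], dtype=float)

    # Compute the SVD, then keep only the top-k singular values as a compact summary.
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    k = 2
    R_approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The reconstructed matrix fills the unrated cells with estimates that reflect
    # the dominant "taste" dimensions found by the decomposition.
    print(np.round(R_approx, 2))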


Thompson goes on to talk about the “Napoleon Dynamite” problem and Netflix’s internal debate over whether hiring cinephiles to tag all 100,000 movies would help. The article ends with Pattie Maes questioning whether computer search alone will ever be enough. Instead, future recommender systems might need to be a mixture of algorithmic and social networking tools.

In all, the article provides a great discussion of rather complex issues about recommender systems. Thompson makes it clear how progress can be measured empirically, how difficult it is to improve algorithms even by a fraction of a percent, and how science can advance through an open dialogue among highly competitive research teams.

Saturday, November 22, 2008

Women and Computer Science

The NY Times (http://www.nytimes.com/2008/11/16/business/16digi.html) reported this week on the noted decline of women in computer science in recent years. As the graph at the right indicates, interest in CS among female college freshmen has dropped from 4.2 to 0.3 since 1982. Recent trends are no better. "In 2001-2, only 28 percent of all undergraduate degrees in computer science went to women. By 2004-5, the number had declined to only 22 percent. ... at research universities like M.I.T.: women accounted for only 12 percent of undergraduate degrees in computer science and engineering."

The article goes on to point out that there is little agreement as to the cause of the decline and therefore little agreement on how to reverse the trend.

Friday, November 21, 2008

E-mail Responses: Rational or Random?

I chanced upon this news report from Northwestern University on research by a faculty member in chemical and biological engineering into whether (timely?) responses to e-mail are rational or random, based on a study of 3,000 e-mail accounts. I couldn't find the actual paper, but if someone does, please leave a comment.

Thursday, November 20, 2008

Now Online: "Europeana", Europe's Digital Library

According to this press release, Europe's digital library is open to the public today:

Europeana, Europe’s multimedia online library opens to the public today. At www.europeana.eu, Internet users around the world can now access more than two million books, maps, recordings, photographs, archival documents, paintings and films from national libraries and cultural institutions of the EU's 27 Member States. Europeana opens up new ways of exploring Europe’s heritage: anyone interested in literature, art, science, politics, history, architecture, music or cinema will have free and fast access to Europe's greatest collections and masterpieces in a single virtual library through a web portal available in all EU languages. But this is just the beginning. In 2010, Europeana will give access to millions of items representing Europe's rich cultural diversity and will have interactive zones such as communities for special interests. Between 2009 and 2011, some €2 million per year of EU funding will be dedicated to this.

Monday, November 17, 2008

Act now...

This is off-topic. It has little to do with Information Science, but it has a lot to do with getting things moving in the school. Here is an article that is worth reading (in my humble opinion) about reducing process inefficiencies and getting to action. It starts with a quote from da Vinci: "I have been impressed with the urgency of doing. Knowing is not enough; we must apply. Being willing is not enough; we must do."

Saturday, November 15, 2008

Some Need to Rethink the Immediate Future

An article in today’s New York Times – Ashlee Vance, “Tech Industry Feels a Slump,” http://www.nytimes.com/2008/11/15/technology/15tech.html?th&emc=th, suggests the need for the School to do some intense contingency planning for next academic year, especially in terms of recruiting.

Here are some excerpts:

The technology industry, which resisted the economy’s growing weakness over the last year as customers kept buying laptops and iPhones, has finally succumbed to the slowdown.

In the span of just a few weeks, orders for both business and consumer tech products have collapsed, and technology companies have begun laying off workers. The plunge is so severe that some executives are comparing it with the dot-com bust in 2000, when hundreds of companies disappeared and Silicon Valley lost nearly a fifth of its jobs.

This time around, the tech sector finds itself at the mercy of a double-barreled slump in both corporate and consumer spending caused by the housing decline and the economic crisis on Wall Street. Technology companies are also feeling the effect of frozen credit markets as business and government customers struggle to finance computer and software purchases that can run to millions of dollars.

Thursday, November 06, 2008

The Last Professors

As we try to be responsible, efficient, and smart about our academic programs in the new world of fiscal crisis, students demanding to be treated as customers, the aims of being business-like, and the temptations of new funding sources with many strings-attached, a book like Frank Donoghue’s The Last Professors: The Corporate University and the Fate of the Humanities (New York: Fordham University Press, 2008) is an important read.

Any academic, whether connected to the humanities or not, ought to read this book, because it provides a good account of how corporate influences, new uses of information technologies, and new challenges to the fiscal bottom line all contribute to a radical redefining of what faculty can and ought to do in the university. Faculty are reduced to productivity measures, salesmanship, revenue generators, and societal or business relevance. Donoghue reflects, for example, that “disciplines that hold out the promise of money and cultivate a knowledge of money both attract and produce expert professionals who stand at the farthest remove from the humanities” (p. 69). I would argue that those of us not in the humanities, safely ensconced for the moment in professional schools, ought not to be too smug about this impact on the humanities; it is possible that the same forces squeezing that sector of the university could come back to pressure other activities in professional schools – such as the constituencies they serve and what research they determine to pursue.

In other words, Donoghue’s commentary can be read as a roadmap indicating the challenges ahead for every component of the university. He predicts, for example, the eventual disappearance of tenured faculty, and he makes a compelling case for this happening because of the increasing attention to fiscal matters as the determinants of all academic affairs. Donoghue also predicts a new kind of credentialism for undergraduate education: “The B.A. and B.S. will largely be replaced by a kind of educational passport that will document each student’s various educational certifications from one of several schools, the credentials directly relevant to his or her future occupation” (p. 84). From my vantage in a professional school, I would argue we can already see some of this in our new master's students, where they often seem ill-equipped in their knowledge, critical thinking and research skills, and other areas – some of these problems perhaps attributable to more of a stress on vocational goals rather than being educated broadly and deeply (and where they see our own graduate degrees as just a credential to practice).

Donoghue sees what is at stake as the very meaning of higher education, and he thinks that faculty must “use the tools of critical thinking to question the widespread assumption that efficiency, productivity, and profitability are intrinsically good” (p. 88). And this may be harder to do than we think. He considers the advent of online education, and rather than critiquing it on pedagogical and related grounds, Donoghue expresses concern that it potentially shifts the ownership and control of teaching to the course management businesses, something that has not happened yet although there are troubling signs of what lies ahead. We need to be mindful of the implications of all that we do, even those decisions that seem straightforward and commonsensical. I admit to losing sleep sometimes over what appear to be the simplest things, because I worry about their long-term consequences.

Sunday, November 02, 2008

Rare Books and Teaching

Roger Mummert, "Handle This Book! Curators Put Rare Texts in 18-Year-Old Hands," NY Times, November 2, 2008, in the special education supplement, writes about the emergence of new courses about the history of books and printing for undergraduates. Mummert notes, "Courses on the history of the book itself have grown along with the ascendancy of electronic information. Students today often blindly grant authority to the online world. Curators want to reconnect them with original sources and teach them to question those sources." One of the potential beauties of an I-School is the possibility of providing well-rounded exposure to both technology and traditional issues such as the nature of the printed or manuscript book. The question is, as always, how to do it.

Wednesday, October 29, 2008

Visualizing Internet "buzz" at Nielsen

This item over at Nielsen is interesting to me for the way in which they have sought to represent "buzz" on social network sites. The topic (General Motors) will likely be of less interest to the community, but the graph should be.





Do you find this an effective visualization? Why or why not?

Friday, October 24, 2008

Audit supports integrity of county voting machines

Somehow this does not seem all that reassuring. Doesn't an election seem too important not to provide paper verification of the actual votes cast, as every scientific and computing organization has recommended?


Allegheny County officials said a random-sample audit showed that the software loaded in 18 touch-screen voting machines is what the state certified when it approved the instruments.

...

The audit, which is expected to cost about $15,000 when the contract is settled, essentially decommissioned the 18 machines that were used for the test, said Mr. Flynn. The county's contract with ES&S stipulates once the seal on a machine has been broken, it cannot be used again until it is recertified.

"This is the first time that any county in Pennsylvania has verified the software on these machines to the extent that we have," Mr. Flynn said.

Hey, you! Cell-phone zombie! Get off the road!


Great article in Slate today on the problem of cellphone zombies.

"Last month, 25 people died and 130 were injured in a train crash near Los Angeles. The cause, apparently, was a cell phone. In three hours of work before the crash, one of the engineers received 28 text messages and sent 29 more. He sent his last message 22 seconds before impact, just after passing a signal that would have alerted him to the disaster ahead.

Scientists call this phenomenon "cognitive capture" or "inattention blindness." The mind, captured by the world inside the phone, becomes blind to the world outside it. Millions of people move among us in this half-absent state. Mentally, they're living in another world."

See the rest at http://www.slate.com/id/2202978/

Tuesday, October 21, 2008

How to Brand a School

Jessica R. Feldman and Robert Shilling, eds., What Should I Read Next? 70 University of Virginia Professors Recommend Readings in History, Politics, Literature, Math, Science, Technology, the Arts, and More (Charlottesville: University of Virginia Press, 2008) is a commendable project suggesting how to reach the public. Each faculty member tackles an interesting area (examples -- media and politics or how computing changes thinking), writes several pages of introduction to the topic, and annotates five significant or compelling books for reading into the topic. Any school, including any professional school, could do the same; all that is required is for faculty to work together, to read, and to be willing to exert a little effort. We could transform this blog into such a public device, except that it seems to be dying -- there are just about three of us, maybe four on a good day, contributing to it.

Monday, October 13, 2008

Preserving the Books in the Google Digital Book Project

Jeffrey Young, “University Libraries in Google Project to Offer Backup Digital Library,” Chronicle of Higher Education, October 13, 2008, reports on an effort to “create a stable backup of the digital books should Google go bankrupt or lose interest in the book-searching business.” Young reports, “The project is called HathiTrust, and so far it consists of the members of the Committee on Institutional Cooperation, a consortium of the 11 universities in the Big Ten Conference and the University of Chicago, and the 10 campuses in the University of California system. The University of Virginia is joining the project, it will be announced today, and officials hope to bring in other colleges as well.” There is one problem: “Because most of the millions of books are still under copyright protection, the libraries cannot offer the full text of the books to people off their campuses, though they can reveal details like how many pages of a given volume contain any passage that a user searches for.”

Sunday, October 12, 2008

Teaching with and about Technology (K-12 Style)

Matthew Kay, discussing teaching 13-year-olds armed with laptops and other technology at the Science Leadership Academy:

“As important as it is for students to expand their sense of community and learn to collaborate — it is more crucial that they learn how to sift thoughtfully through increasing amounts of information. The Internet presents a unique challenge to scholarship — many of the questions that once required extensive research can now be answered with 10-minute visits to Google. The issue now is distinguishing between rich resources and the online collection of surface facts, misinformation, and inexcusable lies that masquerade as the truth. It will be hard for our students to be thoughtful citizens without this ability to discern the useful from the irrelevant. This is especially clear during this election season. If they are never asked to practice dealing with this new onslaught of information, they will have to practice when the stakes are much higher.”

Matthew Kay, “Putting Technology in Its Place,” New York Times, October 11, 2008, available at http://lessonplans.blogs.nytimes.com/2008/10/11/putting-technology-in-its-place/?th&emc=th

Friday, October 10, 2008

Copyright and Digital Stuff

There is an interesting essay about intellectual property and the digital era in the recent issue of Spiked. Andrew Orlowski, “This Digital Utopianism is Glorified Piracy,” Spiked, October 9, 2008, http://www.spiked-online.com/index.php?/site/earticle/5795/.

Here is an excerpt:

“In polite company, sympathy for copyright is in short supply, while for politicians, the ‘creative economy’ is little more than a platitude. Such attitudes are most deeply held amongst people who consider themselves liberal, forward thinking or progressive.

Which is deeply odd, because for 150 years liberals and progressives have embraced the artistic creator as both an ally and a pathfinder. From William Morris’ Arts and Crafts movement, to the many schemes devised by postwar social democratic governments, the creator was an aesthetic rebel, a political ally and a visionary, an ethos that owed much to Shelley’s view of the poet as the ‘unacknowledged legislator’. What many of these initiatives had in common was a creator’s economic independence, typically supported through the mechanism of copyright.

The progressive’s support of creator’s rights expressed an optimistic view of society and human nature. But ever since digital utopianism swept through the chattering classes in the early 1990s, this positive view has been replaced by one of misanthropy and paranoia.”

Thursday, October 02, 2008

Born Digital Students

Andy Guess, in “Understanding Students Who Were ‘Born Digital,’” interviews John Palfrey and Urs Gasser, authors of Born Digital: Understanding the First Generation of Digital Natives (Basic Books, 2008). Here is an excerpt:

Q: The intersection of education and technology is riddled with gimmicks that never worked and promises to revolutionize the way students learn. What are realistic expectations for the application of technologies to learning, and what are the potential advantages and disadvantages?

JP: Technology is never a panacea. And technology on its own can do nothing; it’s just a tool for teachers and students to put to work in support of how they want to teach and to learn. A realistic expectation is that technology may be able to help support your pedagogical goals, but it’s not going to (nor should it) do anything on its own.

A key advantage of using technology in education is that, through its use, we can give young people the digital media learning skills that they need. Right now, we are not teaching young people to sort credible information from less credible information online, despite the proliferation of sources and the extent to which we know young people are relying on such sources. Technology can also be very engaging and interactive and — truly — fun for young people to use as they learn.
The disadvantages could be many: over-reliance on the tools to do the teaching, potentially just a distraction, and used at the expense of sometimes better forms of learning (such as reading an entire book).

Q: How should libraries adapt to the changing ways that students (and faculty) do research?

JP: Libraries are adapting every day to changes in research methods. At Harvard Law School Library, we’re just updating our Web site in response to extensive focus-grouping that the reference staff did with students. The site is oriented toward those research tasks that we know start in the digital world, much as a Google search is the first stop for many young people on their way to find information.

There’s much more to be done, of course. One key is to figure out how best to acquire, catalog, and make e-resources accessible to users. Right now, most libraries are set up to do a great job in acquiring, cataloguing, and offering books for use to students and faculty members, but are not organized to handle e-resources. We need to teach students and faculty how to make use of both rivers and oceans of information. A lot of good innovative work is going into solving these issues. I’m sure libraries will adapt.

Another key area of adaptation has to do with the growing interdisciplinary nature of research and learning. More fields are becoming interdisciplinary, but libraries at universities are often stove-piped, much as the schools themselves are. So, we need to be offering research materials but also support for research methods, such as empirical work in law schools.

The original story and user comments can be viewed online at http://insidehighered.com/news/2008/10/02/digital.

Tuesday, September 30, 2008

Wednesday, September 24, 2008

Writing Clearly

There is an interesting brief essay on Ubiquity by Philip Yaffe, entitled "The Three Acid Tests of Persuasive Writing." It is located at http://www.acm.org/ubiquity/volume_9/v9i35_yaffe.html.

Yaffe writes, "I have yet to encounter a computing professional who is not concerned about communicating effectively. It is a waste of time to have to communicate again because an idea did not get across. And it can be enormously frustrating when people tell you that they do not "get" what you are saying. Especially since most people will not actually tell you that; they will just wander away, leaving you hanging. Phil Yaffe offers three clear, concise, and dense principles to help you overcome this issue."

Philip Yaffe is a former reporter/feature writer with The Wall Street Journal and a marketing communication consultant. He currently teaches a course in good writing and good speaking in Brussels, Belgium. His recently published book In the "I" of the Storm: the Simple Secrets of Writing & Speaking (Almost) like a Professional is available from Story Publishers in Ghent, Belgium (http://www.storypublishers.be/) and Amazon (http://www.amazon.com/).

The Four Kinds of Free

I found this article by Chris Anderson interesting, particularly since many in the industry give something away for "free". I am not going to quote the whole thing, but here is a teaser to get you to visit the item:

A few weeks ago, I posted a diagram grouping free business models into three categories: cross-subsidies (eg, razor-and-blades), three-party markets (ads) and "freemium" (what economists call "versioning"; in this case most people get the free version). But as I was writing through that chapter, I realized that wasn't quite right.

Transformations of organizational structures

If you have heard my talks in the past several years, then you know that I get interested in industry organization and reorganization, especially with regard to the telecommunications industry. Thus, I found this article interesting.

Quoting the article:

In brief, most companies today are still an unnatural bundle of three fundamentally different, and often competing, business types:

  • Infrastructure management businesses – high volume, routine processing activities like running basic assembly line manufacturing, logistics networks or routine customer call centers
  • Product innovation and commercialization businesses – developing, introducing and accelerating the adoption of innovative new products and services
  • Customer relationship businesses – building deep relationships with a target set of customers, getting to know them very well and using that knowledge to become increasingly helpful in sourcing the products and services that are most relevant and useful to them
These three business types remain tightly bundled together within most companies today even though they have completely different skill sets, economics and cultures required for success. Inevitably, companies deeply compromise on their performance as they seek to balance the competing needs of these business types. More broadly, this tight bundling decreases agility and diminishes learning capacity.

So how does this apply to the structure of the information organizations that we study? Has anyone taken this kind of analysis to a library?

Sunday, September 21, 2008

Future Technologies

For those interested in ruminating on the implications of future technologies, you might want to read David D. Friedman, Future Imperfect: Technology and Freedom in an Uncertain World (New York: Cambridge University Press, 2008). Friedman states early in the book, "The conclusion I want readers to draw from this book is not that any one of the futures I sketch is going to happen. The conclusion I want them to draw is that the future is radically uncertain. In interesting ways" (p. 4). He discusses, among other things, intellectual property, personal privacy, transparency and its mixed blessings, e-business, open space and scholarship, computer crime, biotechnology, and virtual reality. He is deliberately provocative, and he often presents extreme opposite possibilities of various scenarios.

Thursday, September 18, 2008

Race and Admissions

In this afternoon's Chronicle of Higher Education:

The Education Department’s Office for Civil Rights has aroused the ire of at least one leading civil-rights group by telling colleges receiving federal aid that they may not consider race in admissions unless it is “essential” to their “mission and stated goals.”

The advice to colleges came in a letter of guidance sent to them by Stephanie J. Monroe, the department’s assistant secretary for civil rights, late last month. The letter represents the first attempt by the federal civil-rights office to tell colleges how it will interpret the U.S. Supreme Court’s last major rulings on the use of affirmative action in college admissions, its 2003 Grutter v. Bollinger and Gratz v. Bollinger decisions involving the University of Michigan at Ann Arbor.

The new letter also tells colleges that the diversity they seek “must be broader than mere racial diversity,” that “quotas are impermissible,” and that “providing individualized consideration is paramount and there must be no undue burden on other-race applicants.” In addition, it says, colleges must give “serious good-faith consideration” to race-neutral alternatives before using race in admissions, and the use of race “must have a logical end point.”

Chauncey Bell on Design

Chauncey Bell provides an interesting essay on design in his "My Problem with Design," Ubiquity Volume 9, Issue 34 (September 16 - 22, 2008), available at http://www.acm.org/ubiquity/volume_9/v9i34_bell.html.
Here is what he tries to do in the essay:

"I will raise several questions about the way we commonly interpret "design."

First, our way of understanding design strips apart components, activities, and contexts. I like simplification, but not this kind of atomistic simplification that destroys the context.

Second, the commonplace notions of design don't give observers of the design process strong ways of making sense of the object of the designer's attention - what the designer thinks he or she is designing.

Third, the designer doesn't have a useful way of thinking about who he or she is in the process of design - the role they think they are playing.

Fourth, I want to question the accountability the designer takes in the invention of whatever he or she is designing."

Wednesday, September 17, 2008

GAO reports on cyber security

Given the emphasis on cyber security at SIS, this report from the US GAO is relevant. A second report, also released today, is quite a bit stronger. Quoting the latter report:

While US-CERT’s cyber analysis and warning capabilities include aspects of each of the key attributes, they do not fully incorporate all of them. For example, as part of its monitoring, US-CERT obtains information from numerous external information sources; however, it has not established a baseline of our nation’s critical network assets and operations. In addition, while it investigates if identified anomalies constitute actual cyber threats or attacks as part of its analysis, it does not integrate its work into predictive analyses. Further, it provides warnings by developing and distributing a wide array of notifications; however, these notifications are not consistently actionable or timely.

US-CERT faces a number of newly identified and ongoing challenges that impede it from fully incorporating the key attributes and thus being able to coordinate the national efforts to prepare for, prevent, and respond to cyber threats. The newly identified challenge is creating warnings that are consistently actionable and timely. Ongoing challenges that GAO previously identified, and made recommendations to address, include employing predictive analysis and operating without organizational stability and leadership within DHS, including possible overlapping roles and responsibilities. Until US-CERT addresses these challenges and fully incorporates all key attributes, it will not have the full complement of cyber analysis and warning capabilities essential to effectively performing its national mission.

The ITU and IP Traceback

This item is an example of a topic that should generate interest at SIS. IP traceback is the technical ability to ascertain the source of a message sent over the Internet.

At heart is the tradeoff between the need for security and the desire for privacy (in this case, actually, anonymity) in Internet communications. In certain cases, such as speech in a repressive regime, it is easy to make the case for anonymity. In other cases, anonymity seems to be a way to enable socially irresponsible behavior because there is little consequence to the individual. Where does this fall?

Tuesday, September 09, 2008

Science for Sale

Daniel S. Greenberg, Science for Sale: The Perils, Rewards, and Delusions of Campus Capitalism (Chicago: University of Chicago Press, 2007) provides a balanced, critical, and somewhat hopeful analysis of the relationship between corporate research and the university. For example, Greenberg concludes that "overall, for protecting the integrity of science and reaping in benefits for society, wholesome developments now outweigh egregious failings -- though not by a wide margin. Nonetheless, the changes and trends are hopeful" (p. 256). While he argues that there are procedures, good ones generally, in place at universities to enable them to work carefully in research with the corporate and government sector, he is also cautious about the challenges ahead: "Temptations for ethical lapses are abetted by institutional factors that are untamed. The academic arms race giddily accelerates. In Ponzi-scheme fashion, it inflames the pursuit of money for constructing research facilities needed to attract high-salaried scientific superstars who can win government grants to perform research that will bring glory and more money to the university. Academe's pernicious enthrallment by the rating system of U.S. News & World Report is a disgrace of modern higher education" (p. 276). And so forth. . . . this is an important book.

Down with Blackboard?

There is an interesting new article in the Chronicle of Higher Education about the continuing problems with Blackboard, the company’s lack of service to its customers, and the growing options for other teaching software. Any of you who have found yourselves screaming at Blackboard will want to read the article. Jeffrey Young, “Blackboard Customers Consider Alternatives,” CHE, September 12, 2008, available at http://chronicle.com/free/v55/i03/03a00103.htm?utm_source=at&utm_medium=en

Monday, September 08, 2008

Cloud computing

In his recent book, The Big Switch, Nicholas Carr makes the case for the emergence of cloud (or utility) computing as the next means by which we will process information. In this item, Joe Weinman asserts 10 laws of cloud computing:

Cloudonomics Law #1: Utility services cost less even though they cost more. An on-demand service provider typically charges a utility premium — a higher cost per unit time for a resource than if it were owned, financed or leased. However, although utilities cost more when they are used, they cost nothing when they are not. Consequently, customers save money by replacing fixed infrastructure with clouds when workloads are spiky, specifically when the peak-to-average ratio is greater than the utility premium.
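(A back-of-the-envelope illustration of Law #1, with invented numbers, not taken from Weinman's post: owning capacity means paying for the peak around the clock, while the cloud charges a premium only for what is actually used.)

    # Law #1 break-even sketch (hypothetical numbers).
    unit_cost_per_hour = 1.00   # cost of one unit of owned capacity per hour
    utility_premium = 2.0       # the cloud charges 2x per unit-hour actually used
    peak_demand = 100           # units needed at the busiest hour
    average_demand = 25         # units needed on average (peak-to-average ratio = 4)
    hours = 24 * 30             # one month

    # Owned infrastructure is sized for the peak and paid for every hour.
    owned_total = peak_demand * unit_cost_per_hour * hours

    # The cloud is paid only for average usage, at the premium rate.
    cloud_total = average_demand * unit_cost_per_hour * utility_premium * hours

    # Cloud wins here because the peak-to-average ratio (4) exceeds the premium (2).
    print(owned_total, cloud_total)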

Cloudonomics Law #2: On-demand trumps forecasting. The ability to rapidly provision capacity means that any unexpected demand can be serviced, and the revenue associated with it captured. The ability to rapidly de-provision capacity means that companies don’t need to pay good money for non-productive assets. Forecasting is often wrong, especially for black swans, so the ability to react instantaneously means higher revenues, and lower costs.

Cloudonomics Law #3: The peak of the sum is never greater than the sum of the peaks. Enterprises deploy capacity to handle their peak demands – a tax firm worries about April 15th, a retailer about Black Friday, an online sports broadcaster about Super Sunday. Under this strategy, the total capacity deployed is the sum of these individual peaks. However, since clouds can reallocate resources across many enterprises with different peak periods, a cloud needs to deploy less capacity.

Cloudonomics Law #4: Aggregate demand is smoother than individual. Aggregating demand from multiple customers tends to smooth out variation. Specifically, the “coefficient of variation” of a sum of random variables is always less than or equal to that of any of the individual variables. Therefore, clouds get higher utilization, enabling better economics.
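(A quick simulation, not from Weinman's post, that makes Law #4 tangible: the pooled demand of many independent customers has a much smaller coefficient of variation than any single customer's demand. The demand distribution below is made up purely for illustration.)

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated hourly demand for 20 independent customers (hypothetical distribution).
    demand = rng.gamma(shape=2.0, scale=50.0, size=(20, 10_000))

    def cv(x):
        # Coefficient of variation: standard deviation relative to the mean.
        return x.std() / x.mean()

    individual_cv = np.mean([cv(d) for d in demand])  # a typical single customer
    aggregate_cv = cv(demand.sum(axis=0))             # all customers pooled

    print(individual_cv, aggregate_cv)  # the pooled demand is markedly smoother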

Cloudonomics Law #5: Average unit costs are reduced by distributing fixed costs over more units of output. While large enterprises benefit from economies of scale, larger cloud service providers can benefit from even greater economies of scale, such as volume purchasing, network bandwidth, operations, administration and maintenance tooling.

Cloudonomics Law #6: Superiority in numbers is the most important factor in the result of a combat (Clausewitz). The classic military strategist Carl von Clausewitz argued that, above all, numerical superiority was key to winning battles. In the cloud theater, battles are waged between botnets and DDoS defenses. A botnet of 100,000 servers, each with a megabit per second of uplink bandwidth, can launch 100 gigabits per second of attack bandwidth. An enterprise IT shop would be overwhelmed by such an attack, whereas a large cloud service provider — especially one that is also an integrated network service provider — has the scale to repel it.

Cloudonomics Law #7: Space-time is a continuum (Einstein/Minkowski) A real-time enterprise derives competitive advantage from responding to changing business conditions and opportunities faster than the competition. Often, decision-making depends on computing, e.g., business intelligence, risk analysis, portfolio optimization and so forth. Assuming that the compute job is amenable to parallel processing, such computing tasks can often trade off space and time, for example a batch job may run on one server for a thousand hours, or a thousand servers for one hour, and a query on Google is fast because its processing is divided among numerous CPUs. Since an ideal cloud provides effectively unbounded on-demand scalability, for the same cost, a business can accelerate its decision-making.

Cloudonomics Law #8: Dispersion is the inverse square of latency. Reduced latency — the delay between making a request and getting a response — is increasingly essential to delivering a range of services, among them rich Internet applications, online gaming, remote virtualized desktops, and interactive collaboration such as video conferencing. However, to cut latency in half requires not twice as many nodes, but four times. For example, growing from one service node to dozens can cut global latency (e.g., New York to Hong Kong) from 150 milliseconds to below 20. However, shaving the next 15 milliseconds requires a thousand more nodes. There is thus a natural sweet spot for dispersion aimed at latency reduction, that of a few dozen nodes — more than an enterprise would want to deploy, especially given the lower utilization described above.
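(The inverse-square arithmetic in Law #8 is easy to reproduce; this is only an illustration of the scaling rule, using the 150-millisecond single-node figure from the text as a starting point.)

    # Law #8: nodes needed scale roughly with 1 / latency^2.
    def nodes_needed(target_latency_ms, base_latency_ms=150.0, base_nodes=1):
        return base_nodes * (base_latency_ms / target_latency_ms) ** 2

    for target in (150, 75, 20, 5):
        # Halving latency quadruples the node count; going from ~20 ms to ~5 ms
        # pushes the total from dozens of nodes toward a thousand.
        print(target, "ms ->", round(nodes_needed(target)), "nodes")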

Cloudonomics Law #9: Don’t put all your eggs in one basket. The reliability of a system with n redundant components, each with reliability r, is 1-(1-r)^n. So if the reliability of a single data center is 99 percent, two data centers provide four nines (99.99 percent) and three data centers provide six nines (99.9999 percent). While no finite quantity of data centers will ever provide 100 percent reliability, we can come very close to an extremely high reliability architecture with only a few data centers. If a cloud provider wants to provide high availability services globally for latency-sensitive applications, there must be a few data centers in each region.
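(Checking the Law #9 arithmetic takes only a few lines; this is just an illustration of the formula above, not anything from the original post.)

    def availability(r, n):
        # Probability that at least one of n independent data centers is up,
        # each with individual availability r.
        return 1 - (1 - r) ** n

    for n in range(1, 4):
        print(n, availability(0.99, n))
    # 1 -> 0.99, 2 -> 0.9999 ("four nines"), 3 -> 0.999999 ("six nines")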

Cloudonomics Law #10: An object at rest tends to stay at rest (Newton). A data center is a very, very large object. While theoretically, any company can site data centers in globally optimal locations that are located on a core network backbone with cheap access to power, cooling and acreage, few do. Instead, they remain in locations for reasons such as where the company or an acquired unit was founded, or where they got a good deal on distressed but conditioned space. A cloud service provider can locate greenfield sites optimally.

A comic book user manual

OK, so perhaps they should be called graphic novels ... in any case, I found this article from the NY Times interesting. It describes how Google used a comic book as the technical documentation for its new "Chrome" browser. Quoting the article:

Mapping out the comic involved several brainstorming sessions. “It was, in the earliest stages, an enormously complex process, trying to work from these very geek technical details towards a visualization that would be accessible, but not condescending, not shallow,” Mr. McCloud said. Explaining new browser technology means getting into potentially eye-glazing details, but Mr. McCloud offset that arcane matter with clever, anthropomorphic depictions of overworked browsers and guilty-looking plug-ins.

For Mr. McCloud, the real opportunity was not to introduce a browser, but to show how effective comics can be at communicating complex ideas. “I don’t think the potential for comics in nonfiction has been exploited nearly as much as it could be,” he said. And what they can teach people should not be underestimated. “When you’re on an airplane and your life depends on it,” Mr. McCloud said, “comics are going to tell you how to open an emergency exit.”

Wednesday, September 03, 2008

Eureka!

I thought this, just posted on the afternoon edition of the Chronicle of Higher Education, was really interesting.

September 3, 2008
NIH Tries to Buy Eureka Moments With New Round of Grants
Few, if any, scientific discoveries prompt a slap to the forehead and a shout of “Eureka,” as Archimedes is said to have done.

But that hasn’t stopped the National Institutes of Health from chasing after truly novel work that could push research in new directions. Today, the agency announced it was giving out $42.2-million to 38 “exceptionally innovative research projects that could have an extraordinarily significant impact on many areas of science.” Each will get $200,000 in direct costs for up to four years.

The awards are the first to be made under a new initiative with the acronym “Eureka,” for “Exceptional, Unconventional Research Enabling Knowledge Acceleration.” For example, Iswar K. Hariharan, a professor of cell and developmental biology at the University of California at Berkeley, won a grant to study new ways to promote tissue to regrow.

The Eureka awards are one way the NIH has been trying to respond to criticism that its ultracompetitive grant-making process favors established researchers and relatively safe projects that are sure to deliver results.

Last year the agency started granting new innovator awards to support researchers early in their careers. The Pioneer award program, by contrast, provides $500,000 per year for five years to researchers with a track record of opening new areas in research.

Monday, September 01, 2008

Technology Irritations

Writer Jonathan Franzen’s entertaining account of new technologies and the implications for society can be found in “I Just Called to Say I Love You,” Technology Review 111 (September/October 2008): 88-95, also available at http://www.technologyreview.com/Infotech/21173/page1/.

His essay starts out: “One of the great irritations of modern technology is that when some new development has made my life palpably worse and is continuing to find new and different ways to bedevil it, I'm still allowed to complain for only a year or two before the peddlers of coolness start telling me to get over it already Grampaw--this is just the way life is now.”

Tuesday, August 26, 2008

Digital Resources and Libraries

Jennifer Howard, “Scholars' View of Libraries as Portals Shows Marked Decline,” Chronicle of Higher Education, August 26, 2008, http://chronicle.com/daily/2008/08/4351n.htm, describes a report issued by Ithaka about the “relationship between libraries and the faculty at institutions of all sizes, and how the digital shift is altering that relationship.” According to Howard, “The report confirms what everyone already knows—that electronic resources are ever more central to scholarly activity. It emphasizes that scholars still value libraries as buyers and archivers of scholarship, and many still use them as gateways to scholarly information. However, it also confirms that researchers increasingly find what they need through Google Scholar and other online resources, a trend the report's authors anticipate will accelerate as more and more knowledge goes digital.” “In an interview, the report's authors said that they hoped the report would get librarians talking about whether libraries should ‘ambitiously redirect resources’ toward new and better ways to serve scholars operating in a digital environment.”

Monday, August 25, 2008

Reinventing Knowledge

We are very accustomed to hearing or reading the self-congratulating messages that we presently live in THE Information Age. Ian F. McNeely with Lisa Wolverton, Reinventing Knowledge: From Alexandria to the Internet (New York: W. W. Norton & Co., 2008) provides an easy to read historical analysis of that claim identifying the library, monastery, university, Republic of Letters, disciplines, and the laboratory as the major means for generating and using new knowledge. While the authors focus on the Western tradition, they also trace the influence of that tradition in non-Western cultures. And near the beginning they suggest that doing this kind of analysis corrects our view of new digital information systems: “We risk committing a serious error by thinking that cheap information made universally available through electronic media fulfills the requirements of a democratic society for organized knowledge. Past generations had to win knowledge by using their wits, and never took what they knew for granted. Recalling their labor and travail is, if anything, more important than ever if we are to distinguish what is truly novel about the ‘information age’ from what is transient hype” (p. xx). Towards the end of the book, McNeely and Wolverton firmly state that “Promoters of the vaunted ‘information age’ often forget that knowledge has always been about connecting people, not collecting information” (p. 271). And if one walks away from reading the book with nothing other than this idea, the time will have been well spent. This is a good book for use in the classroom.

Friday, August 22, 2008

Managing Your Time -- Some Pointers

David Perlmutter, “Do You Really Not Have the Time?,” Chronicle of Higher Education, August 22, 2008, available at http://chronicle.com/jobs/news/2008/08/2008082201c.htm, provides a good list of suggestions for how faculty can manage their time. I extracted below some of his main points.

“To get your work life under control, you must first recognize that the problem is controllable. To view yourself as a martyr to work, fated to slog through the faculty years overburdened with cares and labors, is an exercise in self-indulgence.”

“Discover and stake out your preferred work environment.”

“Whatever your favored venue, carve out time to concentrate there.”

“Frame of mind is also important. . . . You need some spiritual focus as well — a sort of ‘Zen and the Art of Research.’”

“Begin with the big picture: Consider what you want to complete in teaching, research, and service, and calculate the probable time needed to finish the projects for, say, three years ahead.”

David D. Perlmutter is a professor in the William Allen White School of Journalism and Mass Communications at the University of Kansas.

Wednesday, August 20, 2008

In case you are thinking of writing a book about SIS. . .

From the American Library Association --

"Sally Stern-Hamilton’s controversial book, The Library Diaries, written under the pseudonym Ann Miketa, resulted in her termination July 25 as a Mason County (Mich.) District Library employee after 15 years on the job. Written in the first person and set in what she calls the Lake Michigan town of Denialville, the book, produced by print-on-demand publisher PublishAmerica, is a series of fictional vignettes about mostly unsavory characters encountered daily at the library...."

Makes me want to read it.

Why Complex Systems Do Better Without Us

ACM Technews posted this interesting thought on systems research this week:

Why Complex Systems Do Better Without Us
New Scientist (08/06/08) Vol. 199, No. 2668, P. 28; Buchanan, Mark

Research by Swiss Federal Institute of Technology physicist Dirk Helbing suggests humans' desire to force complex systems into a regular, predictable model is misguided, and a much better strategy is to cede a certain degree of control and let systems work out solutions on their own. "You have to learn to use the system's own self-organizing tendencies to your advantage," he argues. Helbing and Stefan Lammer at Germany's Technical University of Dresden have considered whether traffic lights could be engineered to reduce congestion by giving the devices the means to adapt their behavior rather than have engineers shape traffic into patterns that seem favorable. The researchers have found that traffic lights, when provided with some simple operating rules and left alone to organize their own solution, can do a better job. Helbing and Lammer have crafted a mathematical model that assumes a fluid-like movement for traffic and describes what happens at intersections. The researchers make the lights at each intersection responsive to increasing traffic pressure via sensors. Lights that only adapt to conditions locally might give rise to problems further away, and to address this the researchers have engineered a scheme in which neighboring lights share their information so that what occurs around one light can affect how others respond, preventing the formation of long traffic jams. Helbing and Lammer have shown through simulation that this setup should substantially reduce overall travel times and keep no one waiting at a light too long, even though the lights' behavior runs counter to accepted human concepts of efficiency.

See article at:

http://www.newscientist.com/channel/being-human/mg19926681.500-why-complex-systems-do-better-without-us.html
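The idea of letting each signal respond to local "pressure" rather than follow a fixed plan is easy to caricature in a few lines of code. The sketch below is a deliberately crude toy, not Helbing and Lammer's fluid-dynamic model: a single intersection gives the green to whichever approach has the longer queue once a minimum green time has elapsed.

    import random

    random.seed(1)
    queues = {"north-south": 0, "east-west": 0}
    green = "north-south"
    min_green, time_in_phase = 3, 0
    total_queue = 0

    for t in range(200):
        # Cars arrive at random on each approach.
        for d in queues:
            if random.random() < 0.4:
                queues[d] += 1

        # The approach with the green light discharges one car per time step.
        if queues[green] > 0:
            queues[green] -= 1

        # Self-organizing rule: after the minimum green, hand the green to
        # whichever approach currently has the greater "pressure" (longer queue).
        time_in_phase += 1
        if time_in_phase >= min_green:
            other = "east-west" if green == "north-south" else "north-south"
            if queues[other] > queues[green]:
                green, time_in_phase = other, 0

        total_queue += sum(queues.values())

    print("average queue length:", total_queue / 200)

In Helbing and Lammer's scheme the lights also share information with their neighbors so that relieving one queue does not simply push the jam downstream; the toy above omits that coordination.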

NSF on the History of the Internet

This site has been making the rounds on the Internet, in case you managed to miss it. As a multimedia presentation, you might find it a useful adjunct to some of your classes.

Scott McLemee has written an amusing essay about the transition of a 45-year-old to using chat and IM; it is published as "Cogito Interruptus" in Inside Higher Education today and available at http://www.insidehighered.com/views/2008/08/20/mclemee.

Here is the concluding part of his essay:

"At one level, texting and IM are just slight variations on the now-familiar medium of e-mail. They tend to be even more casual — without so much formality as a subject line, even — yet they finally seem more similar to e-mail than anything else.

But now that e-mail itself is both so commonplace and so prone to abuse (“naked Angelina Jolie pics here!”), these supplementary forms have a slightly different valence. They seem more urgent. In the case of IM in particular, there is a suggestion of presence – the sense of an individual on the other end, waiting for a reply. (Indeed, the IM format indicates whether someone you know is online at a given time. The window indicates when a person is typing something to send to you.)

For anyone now accustomed to texting and IM – that is, most people in their teens and 20s – all of this goes without saying. And for lots of folks over a certain age, it probably won’t matter: the number of people in their social circles using these format won’t reach critical mass.

Those of us stuck in between, though, are left with questions about civility. Do you have to respond? How rude is it not to do so? (The other day, I ignored an IM from a friend and still feel positively antisocial for it.) Is it necessary to withdraw entirely from all forms of digital communication for a while, just to sustain, as Baudrillard put it two decades ago, “the minimal separation of public and private ... a restricted space”? And will withdrawal even be a possible, a few years down the line?"

Monday, August 18, 2008

Forbes College ranking

Forbes has joined USN&WR and others in coming up with a college ranking. See this article. Quoting the article:

CCAP's methodology attempts to put itself in a student's shoes. How good will my professors be? Will the school help me achieve notable career success? If I have to borrow to pay for college, how deeply will I go into debt? What are the chances I will graduate in four years? Are students and faculty recognized nationally, or even globally?

To answer these questions, the staff at CCAP (mostly college students themselves) gathered data from a variety of sources. They based 25% of the rankings on 7 million student evaluations of courses and instructors, as recorded on the Web site RateMyProfessors.com. Another 25% depends on how many of the school's alumni, adjusted for enrollment, are listed among the notable people in Who's Who in America.

The other half of the ranking is based equally on three factors: the average amount of student debt at graduation held by those who borrowed; the percentage of students graduating in four years; and the number of students or faculty, adjusted for enrollment, who have won nationally competitive awards like Rhodes Scholarships or Nobel Prizes.
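To make the weighting concrete, here is a minimal Python sketch of how such a composite could be combined. The 25/25/50 split and the equal thirds come from the article; the 0-100 normalization and the sample numbers are my own illustrative assumptions, not CCAP's actual formula.

def ccap_style_score(rate_my_prof, whos_who, low_debt, four_year_grad, awards):
    """Combine five component scores (each assumed pre-normalized to 0-100).
    Weights per the article: 25% student evaluations, 25% Who's Who listings,
    and the remaining 50% split equally (1/6 each) across debt at graduation,
    four-year graduation rate, and nationally competitive awards."""
    return (0.25 * rate_my_prof +
            0.25 * whos_who +
            (0.50 / 3) * (low_debt + four_year_grad + awards))

# A hypothetical school: strong course evaluations, middling everything else.
print(round(ccap_style_score(82, 60, 55, 70, 40), 1))   # -> 63.0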

The data show that students strongly prefer smaller schools to big ones. The median undergraduate enrollment in the top-50-ranked schools is just 2,285, and only one of the top 50 (the University of Virginia) has more than 10,000 undergraduate students.

Sunday, August 17, 2008

Common Usability Terms

I am not sure how correct this series of articles is, but it may be interesting to the SIS community.

Wednesday, July 23, 2008

Digital Preservation

Andy Guess, “At Libraries, Taking the (Really) Long View,” Inside Higher Education, July 23, 2008, provides a description of the efforts by librarians and archivists to preserve digital stuff, focusing on the challenge of managing the older digital information these repositories now hold. Guess describes the various approaches as ranging from “hardware complexities, such as constructing storage devices that continuously monitor and repair data while remaining easily scalable; redundancy measures, such as distributing and duplicating data across storage devices and even across the country; universal standards, such as formats that could conceivably remain readable in the distant future; and interfaces, such as open software protocols that manage digital holdings and make them accessible to the public.” He cites a 2006 report from Britain’s Digital Preservation Coalition and describes work being done at Stanford University and its partnership with Sun Microsystems, setting up the “Sun Preservation and Archiving Special Interest Group, or PASIG, to bring together leaders in research libraries, universities and the government to periodically meet and collaborate on digital archiving issues.”

The essay is at http://insidehighered.com/news/2008/07/23/preservation.
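The "continuously monitor and repair data" idea mentioned above usually comes down to some form of fixity checking against redundant copies. The following is only a generic Python illustration of that pattern, not any particular system described in the article, and the file paths in the commented-out usage line are hypothetical.

import hashlib
from pathlib import Path

def checksum(path):
    """Compute a SHA-256 fixity value for one stored file."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def audit_and_repair(primary, replica, expected):
    """Compare two copies against a checksum recorded at ingest; overwrite a
    damaged copy from an intact one. Returns a short status string."""
    ok_primary = checksum(primary) == expected
    ok_replica = checksum(replica) == expected
    if ok_primary and ok_replica:
        return "both copies intact"
    if ok_primary:
        Path(replica).write_bytes(Path(primary).read_bytes())
        return "replica repaired from primary"
    if ok_replica:
        Path(primary).write_bytes(Path(replica).read_bytes())
        return "primary repaired from replica"
    return "both copies damaged -- restore from another source"

# Hypothetical usage, with the expected hash taken from an ingest manifest:
# print(audit_and_repair("store_a/item.tif", "store_b/item.tif", recorded_hash))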

Wednesday, July 16, 2008

Pittsburgh -- Robot Capital

This article over at CNET is interesting since it represents a national view of our region. Quoting the article:

Pittsburgh touts on its official Web site that it's the only city to have won "America's Most Livable City" award twice. But perhaps the "Take me to your robot," or "Go ahead, make my robot," slogans used for its Robot 250 festival are more appropriate.

Pittsburgh is famously home to one of the leading academic research centers for robotics in the country, The Robotics Institute at Carnegie Mellon University, which is also home to one of the country's leading roboticists, Matt Mason.

There has long been a rivalry between Boston and Pittsburgh as to which city is the tech leader in robotics. Both cities have academic and private research centers, as well as major companies, heavily involved in the robotics industry.

But in recent years, Pittsburgh has been playing up its ties to robotics through a series of public announcements, events, and community projects.

On the site for its 250th anniversary, Pittsburgh proudly states that the first robot was created in the Pittsburgh region, as well as the first polio vaccine and the first advanced organ transplantation.

Taiwan digital archive

I came across this article, which ought to be of interest here. Gordon Cook normally reports on Internet-related issues, so I would expect that many of you missed this. From the article:
In 2006, with the beginning of Phase Two, a further decision was made to merge the 16 into six thematic groups. These are: first - maps and architectures; second - languages and multimedia; third - biosphere and nature; fourth - lives and cultures; fifth - archives and databases; sixth - artifacts and illustrations.

The bridging across disciplines will help raise many new possibilities for research, such as the more widespread use of things like the largest known time series of human measurements, namely the three centuries of month-by-month changes in agricultural prices registered as part of the Imperial Ch’ing Dynasty archives in Beijing.

Several other serious digital archive projects have begun elsewhere in the world. However, the one in Taiwan is the only one of which I am aware that is so blatantly cross disciplinary – something that, given the capabilities of computer technology, seems to be an obviously desirable course to follow.

Tuesday, July 15, 2008

Blackboard & Open Source

From today’s Inside Higher Education

Blackboard, the dominant player in course management software, has the ability to inspire devotion and, for the more fervid open-source adherents, not a little contempt. So today’s announcement may cause a stir among those more apt to liken Blackboard to the devil than a gentle giant: The company is partnering with Syracuse University to develop a way to integrate Blackboard with Sakai, one of the primary open-source alternatives.

Once the project is completed late this year or early next year, the company hopes it will have created a platform that any campus can adopt to import Sakai data into Blackboard or vice versa. The solution, which is being developed in-house at Syracuse with support from Blackboard, will be released as a “Building Block,” or software plug-in, that will itself be open source.

The announcement comes as part of an increased effort at Blackboard to reach out to proponents of open-source software who have in the past bristled at the company’s tactics or its proprietary, license-based model. In addition to the Sakai partnership, the company is pursuing a similar arrangement with another university to develop integration with another open-source course management system, Moodle.

—Andy Guess. The original story and user comments can be viewed online at http://insidehighered.com/news/2008/07/15/sakai.

Wednesday, July 02, 2008

Web 2.0 Reflections

The July/August 2008 issue of MIT’s Technology Review features a cluster of essays on “The Future of Web 2.0,” including the implications of social networking and business, security and privacy issues, the use of bandwidth, and reviews of new and emerging social computing providers.

Monday, June 30, 2008

Open Access

Stay tuned as this comes to Pitt; this is from this afternoon's Chronicle of Higher Education

Stanford's Education School Mandates Open Access

Open-access advocates predicted that the move last February by Harvard University’s Faculty of Arts and Sciences and, later, by its Law School to require free online access to all faculty members’ scholarly articles would prompt other universities to adopt similar policies. The movement has not exactly snowballed, but another institution did just join in.

Last week Stanford University’s School of Education revealed that it would require faculty members to allow the university to place their published articles in a free online database.

The school’s faculty passed a motion unanimously — just as Harvard’s two faculties had — on June 10. A faculty member and open-access advocate, John Willinsky, made the policy public last week at the International Conference on Electronic Publishing, in Toronto. A video of his presentation is available. —Lila Guterman

Saturday, June 28, 2008

Persuasive Academic Writing

A new primer on academic writing worth recommending to students and other faculty is Gerald Graff and Cathy Birkenstein, "They Say/I Say: The Moves That Matter in Academic Writing" (New York: W.W. Norton and Co., 2008). Here is a sample of their premise: "For us, the underlying structure of effective academic writing -- and of responsible public discourse -- resides not just in stating our own ideas, but in listening closely to others around us, summarizing their views in a way that they will recognize, and responding with our own ideas in kind. Broadly speaking, academic writing is argumentative writing, and we believe that to argue well you need to do more than assert your own ideas" (p. 3). This is a useful little book.

Tuesday, June 24, 2008

IT as the new Liberal Arts

In today's Post-Gazette, Ted Roberts writes "If success in the 21st century is being defined by collaborative training that combines computer science/engineering skills with social sciences, languages, psychology and other disciplines, then IT is emerging as the "new" liberal arts. Just as traditional liberal arts education includes the study of theology, art, literature, languages, philosophy, history, mathematics and science, the new world view of Information Technology is evolving to include interdisciplinary skill sets."

In particular, he argues that we should be teaching collaboration across geographies, immersion in new media, the power of critical thinking, and a passion for citizenship.

It was not the list that I would have created, which is why I found it rather thought-provoking as we look toward what should be taught in our graduate and undergraduate curricula.

See http://www.post-gazette.com/pg/08176/892059-28.stm for the entire commentary.

The Kindle and University Presses

Scott Jaschik, “University Presses Start to Sell Via Kindle,” Inside Higher Education, June 24, 2008, provides an interesting insight into a new direction for university presses. Jaschik reports, “By the beginning of the fall, Princeton plans to have several hundred books available for sale through Kindle. Yale University Press and Oxford University Press already have a similar presence there. The University of California Press recently had about 40 of its volumes placed on Kindle and is ramping up.” The story is at http://insidehighered.com/news/2008/06/24/kindle.

New IT jobs report

It appears that the job market for IT continues to improve, according to this article in BusinessWeek. Quoting the article:
Here's a hint for high school graduates or college students still majoring in indecision: Put down that guitar or book of poetry and pick up a laptop. Study computer science or engineering, and plan to move to a big city.

A new survey out this week from AeA, the group formerly known as the American Electronics Assn., reports that jobs in the technology industry are growing at a healthy clip, especially in large cities. The organization's Cybercities 2008 survey says that 51 cities added high-technology jobs in 2006, the most recent year for which data were available. The survey tracks new jobs related to the creation of tech products, including fields such as chip manufacturing and software engineering. It is the AeA's first such survey since 2000, which was taken before the crash of the tech bubble that created so many jobs in the late 1990s.

And while slowing economic conditions have dulled the pace of growth since the 2006 data were collected, AeA researcher Matthew Kazmierczak says it's far from turning south. "Nationally, there are some data that show the rate of growth has slowed since 2006, but it hasn't gone negative," he says.

The real market value of social networks

Here is the article from TechCrunch (via NY Times).

Saturday, June 21, 2008

Google and Research

Andy Guess, “Research Methods ‘Beyond Google’,” Inside Higher Education, June 17, 2008, describes efforts at some universities to develop information competency programs. Here is an excerpt:

“The problem is near-universal for professors who discover, upon assigning research projects, that superficial searches on the Internet and facts gleaned from Wikipedia are the extent — or a significant portion — of far too many of their students’ investigations. It’s not necessarily an issue of laziness, perhaps, but one of exposure to a set of research practices and a mindset that encourages critical thinking about competing online sources. Just because students walk in the door as “digital natives,” the common observation goes, doesn’t mean they’re equipped to handle the heavy lifting of digital databases and proprietary search engines that comprise the bulk of modern, online research techniques.

Yet the gap between students’ research competence and what’s required of a modern college graduate can’t easily be solved without a framework that encompasses faculty members, librarians, technicians and those who study teaching methods. After all, faculty control their syllabuses, librarians are often confined to the reference desk and IT staff are there for when the network crashes.

So instead of expecting students to wander into the library themselves, some professors are bringing the stacks into the classroom. In an effort to nudge curriculums in the direction of incorporating research methodology into the fabric of courses themselves, two universities are experimenting with voluntary programs that encourage cooperation between faculty and research specialists to develop assignments that will serve as a hands-on and collaborative introduction to the relevant skills and practices.”

The article can be found at http://insidehighered.com/news/2008/06/17/institute.

Wednesday, June 18, 2008

Report on Women in Science, Engineering and Technology

You might find this item over at the Chronicle interesting. It would be useful to discuss how we might apply these findings to higher education. Key findings:

Rich Talent Pipeline. 41% of highly qualified scientists, engineers and technologists on the lower rungs of corporate career ladders are female — a talent pipeline that is surprisingly deep and rich. Despite the challenges girls face at school and in our culture, a significant number make the commitment to begin careers in science.

Fight or Flight. Across SET, women hit a break point in their mid- to late-30s. Career and family pressures ratchet up at one and the same time. The losses are massive: over time, 52% of highly qualified SET women quit their jobs. Stepping in with targeted support before this “fight or flight” moment has the potential of lowering female attrition significantly.

Antigens and other Barriers. Five powerful antigens in SET corporate cultures help explain the female exodus. Women are seriously turned off by hostile macho cultures, severe isolation, mysterious career paths, systems of reward that emphasize risk-taking, and extreme work pressures.

Cutting Edge Models. New initiatives like WOVEN (Alcoa), Crossing the Finish Line (J&J), Mentoring Rings (Microsoft), ETIP (Cisco) and Restart (G.E.) are game changers that will allow many more women to stay on track in SET careers.

Comparative prices of liquids

In these days of $4/gallon for gasoline, you might find this amusing/interesting:

Lipton Iced Tea, 16 oz @ $1.19 = $9.52 per gallon

Diet Snapple, 16 oz @ $1.29 = $10.32 per gallon

Gatorade, 20 oz @ $1.59 = $10.17 per gallon

Ocean Spray, 16 oz @ $1.25 = $10.00 per gallon

Evian water, 9 oz @ $1.49 = $21.19 per gallon (!)

Wite-Out, 7 oz @ $1.39 = $25.42 per gallon

Brake fluid, 12 oz @ $3.15 = $33.60 per gallon

Scope, 1.5 oz @ $0.99 = $84.48 per gallon

Pepto-Bismol, 4 oz @ $3.85 = $123.20 per gallon

Vicks NyQuil, 6 oz @ $8.35 = $178.13 per gallon

HP 02 Black Ink Cartridge, 16 ml @ $18 online = $4,294.58 per gallon.
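For anyone who wants to check or extend the list, the arithmetic is just a unit conversion: 128 fluid ounces (or roughly 3,785 ml) per US gallon. A quick Python sketch, using a few of the items above as test data:

GALLON_FL_OZ = 128.0      # US fluid ounces per gallon
GALLON_ML = 3785.41       # milliliters per gallon

def per_gallon(price, size, unit="oz"):
    """Scale a unit price up to dollars per US gallon."""
    gallon = GALLON_FL_OZ if unit == "oz" else GALLON_ML
    return price * gallon / size

print(round(per_gallon(1.19, 16), 2))         # Lipton Iced Tea -> 9.52
print(round(per_gallon(8.35, 6), 2))          # NyQuil -> 178.13
print(round(per_gallon(18.00, 16, "ml"), 2))  # ink cartridge (exact figure depends on the ml-per-gallon value used)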

Thursday, June 12, 2008

A Look at the Internet, Then and Now

You might find this item interesting. I think it gives a good sense of how the underlying infrastructure has changed in ten years.

Digital convergence

I have begun producing podcasts for my courses using Camtasia. So, when doing a dry run for an upcoming talk, it was natural for me to turn that into a podcast as well. I also discovered that Blogger lets you embed podcasts into web pages.

On June 20, I will be giving a presentation entitled "Future Perspectives on Digital Convergence" to an audience of telecommunications industry professionals and regulators in Seoul. The podcast of my "dry run" is here.

I welcome your comments and feedback!

Saturday, June 07, 2008

Google and Stupidity

Nicholas Carr, "Is Google Making Us Stupid?" Atlantic 302 (July/August 2008): 56-58, 60, 62-63, wonders if our immersion in networked communications isn't transforming the way we read or even how and what we can read. He argues that it is clear that we are reading more than ever, but he draws in some anecdotal and historical evidence to suggest that this is having an impact on our ability to read longer texts and to focus on certain other kinds of documents. He admits that we have much to learn about this: "Never has a communication system played so many roles in our lives -- or exerted such broad influence over our thoughts -- as the Internet does today. Yet, for all that's been written about the Net, there's been little consideration of how, exactly, it's reprogramming us. The Net's intellectual ethic remains obscure" (p. 60). Carr's writing is always worth some consideration.

Friday, June 06, 2008

Keeping Up with Email and Other Results of the New Technologies

For an entertaining, informative essay on the struggles of academics to keep up with all the email, blogs, text messages, etc., see Piper Fogg's "When Your In Box Is Always Full," Chronicle of Higher Education, June 6, 2008, available at http://chronicle.com/weekly/v54/i39/39b01901.htm. This consists of a variety of vignettes of faculty trying to cope.

Wednesday, June 04, 2008

Cellphone tracking reveals information

This article, about the privacy violations involved in using cellphones to track how people move, also discusses how the data were actually collected and what conclusions were drawn. The authors are from Notre Dame, and here is a link to their webpage. They have also looked at other social-networking-related research.


Tuesday, June 03, 2008

Here Comes Everybody

Clay Shirky, Here Comes Everybody: The Power of Organizing Without Organizations (New York: Penguin Press, 2008) provides a readable account of the impact of IT on supporting new ways for people to work together. Shirky, a faculty member at NYU’s Interactive Telecommunications Program, considers how Flickr, blogs, MySpace, wikis, flash mobs, and open source software enable groups to form, organizations to manage differently, amateurs to challenge professionals, and actions to be carried out collectively. Shirky blends case studies (a mass movement to retrieve a stolen cell phone, reactions to off-hand comments made by a politician, and public scrutiny of cover-ups) with research studies and classic writings. Shirky also considers both the positive and negative implications of the new and emerging technologies.

Friday, May 30, 2008

Libraries and the Future of Digitization

Interesting article in Inside Higher Education, May 30, by Andy Guess, "Post-Microsoft, Libraries Mull Digitization," about Microsoft's withdrawal from the digitization game.

Here is an excerpt: "Libraries increasingly see digitization as a preservation strategy. While Microsoft’s departure probably won’t cause significant upheaval, it will reinforce for universities the necessity of ensuring that they retain the rights to their scanned materials — or that their digitization projects will be around next semester, let alone forever. One way to do that is to continue pursuing internal, proprietary scanning projects which, for many libraries, existed for years before Google and Microsoft made it possible to vastly increase their scope and scale. Another is to work with nonprofit initiatives. But if there’s one thing libraries agree on, it’s that the competition between the two companies was healthy."

The original story and user comments can be viewed online at http://insidehighered.com/news/2008/05/30/microsoft.

Thursday, May 29, 2008

State of the Internet

Akamai is using data gathered by its global server network along with publicly available information to deliver a quarterly "State of the Internet" report. The first such report (January-March 2008) is available here.

Tuesday, May 27, 2008

The Future of Libraries

Historian Robert Darnton has an interesting essay considering the future of libraries in his "The Library in Your Future," New York Review of Books 55 (June 12, 2008): 72-73, 76, 78-80. Comparing print and digital publishing, Darnton seeks to remind us that every era is an information age, although this one suggests that information may be more fluid than in earlier times. Essentially, Darnton argues that we need both traditional and new forms of libraries, and that we need to learn to appreciate them both for what they represent. Darnton is always provocative, and this essay is no exception.

Friday, May 23, 2008

Technology, Artifacts, & Secrets

The reunification of Germany provided an opportunity to study how secret police have operated in repressive regimes. Kristie Macrakis, in her Seduced by Secrets: Inside the Stasi’s Spy-Tech World (New York: Cambridge University Press, 2008), opines, “The fall of the Berlin Wall in 1989 created an unprecedented opportunity for historians to examine the files of a defunct intelligence and secret police organization” (p. xix). The files are truly amazing – a hundred miles of files and 35 million index cards – reflective of a government out of control in watching its own citizens. Macrakis focuses on the Stasi’s uses of information technology (cameras, containers, radios, and computers), providing one of the most detailed analyses of how secret police function. In this volume, we also get some insights about understanding what the artifacts of information technology tell us. Noting that there were efforts made to destroy all the equipment and other artifacts used by the Stasi, Macrakis makes this interesting comment about the secret police artifacts: “Artifacts often reflect the ideas, beliefs, achievements, and attitudes of long-lost civilizations; they also mirror their culture. Technology talks, it speaks the language of culture . . . . the technological artifacts offer us rare and valuable insight into a very secret culture and community within which, like in a secret cult, every member was trained to keep the methods and sources of their work hidden from the enemy and outsiders” (p. 196). Macrakis is careful to note that the kind of spying we see reflected in the Stasi files is something we should be aware can happen in our own democracy. In fact, she is worried that all of this stems from a “faith in technology,” arguing that “as technological developments have accelerated over the last century, our dependence on technology and faith in it have only increased” (p. 315).

Thursday, May 22, 2008

Talking to Oneself

From Rachel Toor, “Did You Publish Today?” Chronicle of Higher Education, May 22, 2008, available at http://chronicle.com/jobs/news/2008/05/2008052201c/careers.html

What does it look like to do intellectual work? What does it look like to have an insight? To formulate a theory? To solve a philosophical problem? What does it take to get to the point at which you're ready to sit down and write something, ready to present something to the world?

Experience tells me that sometimes it looks like playing Spider Solitaire. Or twirling one's hair, talking to oneself, or sitting stock still and staring into space.

My friend Andrew, a psychiatrist, is an expert in the physiology of sleep. He has come up with a host of good ideas that have resulted in a fat sheaf of academic publications. He believes that sleep is the result of conditioning, ritual, and circumstance. You can't force yourself to go to sleep. What you can do, he says, is set up the conditions and rituals that will allow it to happen. You let the dog out (or put the rat back in her cage). You change into your footy pajamas. You brush your teeth. You get into bed. And then, having provided the right environment, eventually, you fall asleep.

That process, Andrew believes, is similar to what academics go through when trying to solve an intellectual problem. We shuffle off to our offices and plant ourselves in front of a computer. Or slink into the library and sink into a comfy chair. Or walk around the block 43 times.

We go through the motions that have led us, in the past, to cerebral success. We can no more force ourselves to make an intellectual breakthrough than we can will ourselves to sleep. All we can do is prepare the environment and perform the rituals associated with thinking.

Wednesday, May 21, 2008

National Security Letters

From time to time I have stated that it would be interesting to take some problem facing the information professions and have a group of faculty from various perspectives write position papers or reflections on how to deal with the issue. Well, here is an example of a problem, the continuing assault by the current presidential administration on free speech in the guise of national security. Given that we have faculty engaged in research about digital libraries, national security, archives and records management, ethics and public policy, e-government, and so forth, what an opportunity something like this presents. I am not arguing we should pursue such a project on this particular case, I just think it represents a relevant example.

Toni Carbo tipped me off about this particular story. It is a short news story, so it is here in full.

“Libraries Win Second Round against National Security Letters,” American Libraries, May 9, 2008, available at http://www.ala.org/ala/alonline/currentnews/newsarchive/2008/may2008/kahlewinsnslbattle.cfm

“I’m grateful that I am able now to talk about what happened to me, so that other libraries can learn how they can fight back from these overreaching demands,” Internet Archive founder and digital librarian Brewster Kahle stated May 7, two days after records were unsealed documenting his six-month legal battle to force the FBI to withdraw a National Security Letter because it sought details of several patrons’ archive use without a court order.

The disclosure about the existence of Internet Archive v. Mukasey came two days after the records were unsealed about Kahle’s federal complaint against the Justice Department. As legal counsel representing the digital library, the American Civil Liberties Union and the Electronic Frontier Foundation named themselves as co-plaintiffs because the gag order that has accompanied NSLs since the 2001 enactment of the Patriot Act also forbids legal counsel from speaking about any aspect of such a case.
The disclosed documents reveal that the FBI issued an NSL to the Internet Archive on November 19, 2007, seeking the patrons’ names and contact information and “all electronic mail header information (not to include message content and/or subject fields).” Kahle responded December 14, 2007, with a First Amendment challenge to the constitutionality of serving an NSL on a library. “The FBI cannot demand records from libraries [under the reauthorized Patriot Act], unless they are providers of wire or electronic communication services. The archive is not a provider,” EFF Senior Staff Attorney Kurt B. Opsahl wrote the agency three days later.

The complaint never became a full-fledged lawsuit because Opsahl offered the FBI a deal: “If the government is willing to withdraw the NSL, including the nondisclosure order, the archive will voluntarily dismiss the lawsuit.” The FBI apparently agreed to negotiate, and reached a settlement agreement April 21 in which the NSL was withdrawn but the case itself remained under court seal until the Justice Department and the plaintiffs agreed on how relevant documents were to be redacted.

Thanking the plaintiffs for “their brave stand against this unconstitutional federal intrusion,” American Library Association President Loriene Roy said May 7, “While librarians fully support the efforts of law enforcement in legitimate investigations, those efforts must be balanced against the right to privacy.” Roy went on to call for the passage of the National Security Letters Reform Act of 2007 (H.R. 3189) “for meaningful Congressional oversight of these risky law enforcement tools.”

ACLU staff attorney Melissa Goodman noted, “It appears that every time a National Security Letter recipient has challenged an NSL in court and forced the government to justify it, the government has ultimately withdrawn its demand for records.” In response, John Miller of the FBI’s Office of Public Affairs said, “National Security Letters remain indispensable tools for national security investigations and permit the FBI to gather the basic building blocks for our counterterrorism and counterintelligence investigations.”

The Internet Archive is the third known instance of an NSL challenge, and became public two years after four Connecticut librarians successfully defended patron privacy from a similar NSL demand. The American Library Association as well as its Freedom to Read Foundation filed amicus briefs in an unrelated challenge by an Internet Service Provider to NSL gag provisions; Judge Victor Marrero of the U.S. District Court for the Southern District of New York overturned the entire NSL statute September 7, 2007, and the Justice Department is scheduled to offer oral arguments in June before the Second Circuit Appeals Court seeking to reverse Marrero’s ruling.

Tuesday, May 20, 2008

Children and Technology

For an interesting critique of the John D. and Catherine T. MacArthur Foundation's Digital Learning Initiative see Mark Bauerlein's "Research Funds for Technophiles" in Inside Higher Education, May 21, 2008.

Here is the concluding paragraph to give you a taste:

"This is not to fault Jenkins, Davidson, and MacArthur for arguing the benefits of digital learning, or for disputing the claims of skeptics and dissenters. It is to fault them for not allowing a dispute to happen through open debate. In a word, they stigmatize the other side. In doing so, they turn the Digital Learning Initiative into an advocacy program, not a research project. The first rule of research is to consider evidence from all sources, to open the marketplace to anybody willing to observe norms of evidence and collegiality. Throwing labels such as “moral panic” and “Hall of Shame” breaks the rule, and when the speakers have $50 million behind them, it corners the market on legitimacy. MacArthur and other sponsors of digital learning would serve the research and policy worlds better if they allowed more reflection into their programming and tempered the enthusiasm of participants with the presence of dissenters."

The full article can be found at http://insidehighered.com/views/2008/05/20/bauerlein.

Monday, May 19, 2008

The Knowledge Commons

The idea of knowledge as a commons, something shared by a group of people and challenged by social dilemmas, is an interesting topic in our digital age, and the volume edited by Charlotte Hess and Elinor Ostrom, Understanding Knowledge as a Commons: From Theory to Practice (Cambridge, MA: MIT Press, 2007), is a good place to start. The essays in this volume are grouped into sections on studying the knowledge commons, protecting the knowledge commons, and building the knowledge commons. And the volume brings together a group of leading researchers and proponents of this concept – including Hess and Ostrom, David Bollier, James Boyle, and Donald Waters – commenting on new standards and initiatives such as the Open Archives Initiative, MIT DSpace, digital libraries, and so forth.

One of the more interesting commentaries comes from Peter Levine in discussing the knowledge commons and community projects, with some connection to what academics do. Levine writes, “Academics are strongly influenced by policies regarding funding, hiring, promotion, and tenure. Often universities that compete internationally for academic prominence do not reward applied research – let alone service – despite rhetoric to the contrary” (p. 261). Levine adds, “Fortunately, universities do reward scholars who break new ground in their disciplines by working with communities. Thus a strategy of using community engagement to achieve genuine scholarly insight is better suited to the existing academic marketplace than a strategy based on ‘service’” (p. 263). Some interesting ideas for faculty in professional schools at research universities to reflect on.

Friday, May 16, 2008

Technology and Teaching

Julie Frechette, associate professor of communication and the director of the Center for Community Media at Worcester State College, has an interesting essay about the use of technology in teaching, in Inside Higher Education, available at http://insidehighered.com/views/2008/05/16/frechette.

Here is how the article concludes: "As with other digital developments, faculty continue to grapple with these questions as pedagogical paradigms for effective learning are rapidly changing. Fortunately, clichés of the professor as preoccupied with research over teaching, the political over the personal, literature over television, print over digital media, high art over popular culture, and conferencing over social networking have increasingly been challenged through profound socio-cultural changes, many of which have undoubtedly been promulgated by new technologies and a new generation of learners. If social networking, 3D simulations, blogs and Web pages are means to enhancing the student-teacher relationship, then perhaps we should be less hesitant about using them as we strive to find powerful and creative means to improve the learning experience."

Does zebra-striping help?

This article discusses some experimentation to see if "zebra-striping" - shading alternate lines in tables or forms - actually makes a difference for humans reading the data.
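For anyone curious what the technique amounts to in practice, here is a minimal Python sketch that emits an HTML table with every other data row shaded; the shade color and the sample rows are arbitrary choices of mine, not anything from the article.

def zebra_table(header, rows):
    """Render rows as an HTML table, shading alternate data rows."""
    html = ["<table>",
            "  <tr>" + "".join(f"<th>{h}</th>" for h in header) + "</tr>"]
    for i, row in enumerate(rows):
        shade = ' style="background:#eee"' if i % 2 else ""
        html.append(f"  <tr{shade}>" + "".join(f"<td>{c}</td>" for c in row) + "</tr>")
    html.append("</table>")
    return "\n".join(html)

print(zebra_table(("Item", "Per gallon"),
                  [("Lipton Iced Tea", "$9.52"), ("NyQuil", "$178.13"), ("HP ink", "$4,294.58")]))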

Wednesday, May 14, 2008

A New Book on Intellectual Property

“Our scattershot cultural policy has failed to balance the public interest with the marketplace,” writes Bill Ivey in his Arts, Inc.: How Greed and Neglect Have Destroyed Our Cultural Rights (Berkeley: University of California Press, 2008). Ivey, the former chairman of the National Endowment for the Arts, carefully follows the growing corporate ownership of our documentary heritage, creative arts, and art of lasting value (like classical music) and the fading cultural and other institutions that collect and care for them. Adding to the growing literature about intellectual property, most of it quite pessimistic, Ivey also spends considerable energy considering the cultural heritage, a topic that will be of interest to archivists. [This is how I start my review of the book on my blog posting today, May 14th.] The book is also of interest to everyone in our school!