Tuesday, December 18, 2007

Pew Internet: Digital Footprints

Ellen pointed me to this new report from the Pew Internet and American Life Project entitled "Digital Footprints". Quoting the teaser blurb:

Internet users are becoming more aware of their digital footprint; 47% have searched for information about themselves online, up from just 22% five years ago. However, few monitor their online presence with great regularity. Just 3% of self-searchers report that they make a regular habit of it and 74% have checked up on their digital footprints only once or twice.

Indeed, most internet users are not concerned about the amount of information available about them online, and most do not take steps to limit that information. Fully 60% of internet users say they are not worried about how much information is available about them online. Similarly, the majority of online adults (61%) do not feel compelled to limit the amount of information that can be found about them online.

Cyberinfrastructure and the liberal arts

Given our recent success with the Mellon Foundation, you might find this of interest. In introducing this special issue of Academic Commons, David Green writes:
Made possible by dramatic advances in networking technologies, cyberinfrastructure promises to combine new computing capabilities, massive data resources and distributed human expertise to enable qualitatively different creative product from new generations of "knowledge environments." Introducing this timely collection of observations on how this will affect liberal arts disciplines and institutions, David Green reviews the distance we've come in the last 15 years and identifies the main themes of the essays, interviews and reviews that follow.
Happy reading!

Friday, December 14, 2007

Design at Cisco

I found this article interesting. It describes how Cisco, a classic engineering-focused network equipment company, is now paying attention to customer experience.

He [Ratzlaff] and his dozen-or-so staffers have created a blueprint for how Cisco's products should work together for customers. With the support of CEO John Chambers and other top brass, they are trying to impose it on the San Jose company.

Their degree of success will help determine whether Cisco can reach beyond the business of selling routers and other basic networking gear, an area it dominates, into faster-growing markets for products that make use of those networks. "Cisco is respected for their technology and for their financial success, but nobody really knows what they do," says Hartmut Esslinger, founder of consultancy frog design.


I think that this is a great example of what should be an i-School-like project ... what kinds of information do network managers need, what kinds of interfaces do the job, etc.

Thursday, December 13, 2007

Study Finds 90 to 95 Percent of All Email in 2007 Spam

You might find this item interesting (and not surprising):


A study conducted by Barracuda Networks suggests spam email has accounted for 90 to 95 percent of all email sent in 2007. The study, based on an analysis of more than one billion daily email messages sent to its more than 50,000 customers worldwide, found a staggering percentage of all email sent in 2007 was spam, increasing from an estimated 85 to 90 percent of email in 2006.

Saturday, December 08, 2007

Free for All in the Public Library

Every so often, an amusing book about some aspect of the information professions surfaces, such as Don Borchert’s Free for All: Oddballs, Geeks, and Gangstas in the Public Library (New York: Virgin Books, 2007). I discovered this book just before our December 2007 graduation ceremony and used it in my brief remarks, as follows:

Our School has been educating individuals to function as librarians for well over a century. The LIS program is internationally recognized for preparing public, school, children’s and youth services, medical, academic, digital, and special librarians, as well as archivists and preservation administrators. Some of these positions are quite new in their origins. Now you might find one working as a Webmaster or an Electronic Records Specialist, jobs the first graduates in 1901 could never have imagined. We also have graduates of our program teaching in countries around the world. And we are happy to be recognizing the most recent graduates of our program today, hoping they understand the proud legacy they will become part of here in just a few moments.

All of you will face new and exciting challenges. Don Borchert, in a humorous book just published, Free for All: Oddballs, Geeks, and Gangstas in the Public Library, provides a sense of these challenges: “Libraries are a footnote to our civilization, an outpost to those unfamiliar with the concept, and a cheap, habit-forming narcotic to the regular patron. Walk into a public library and it is usually as calm and inviting as a warm bath. It is clean, well kept, and quiet enough to do the Sunday crossword puzzle (the one you brought with you from home, not the one torn surreptitiously out of the library’s copy of the paper while no one was looking). . . . The staff is invariably professional, courteous, and unobtrusive. They are almost always educated – not just disillusioned college grads who could find nothing in their own field but majors in Library Science, a degree as arcane as alchemy or predicting the future by reading the entrails of a recently slaughtered lamb.”

Now, some might not agree with Borchert’s assessment, and his book is a bit of a trip, but we know you are ready for your new careers. And we know as well that, if you don’t know what alchemy is or lack experience reading lamb entrails, you know where to find the information about such activities and then preserve it for others to use.

Wednesday, December 05, 2007

For college students, if it's Facebook, it's love - Yahoo! News

This article reminds me of Prof. Hirtle's statement "if it isn't on the web, it doesn't exist":

When a couple was "going steady" in the 1950s, the young man might have let his girlfriend wear his Varsity team sweater or given her his fraternity pin. But the 1960s swept aside those rituals. Now the Facebook link has become a publicly-recognized symbol of a reasonably serious intent short of being engaged or moving in together.

"For those in a relationship, the theme that kept echoing was that Facebook made it official," said Nicole Ellison, an assistant professor of telecommunication and information studies at Michigan State University who has studied social networking sites. "That was the term they used. And when the relationship fell apart, when you broke up on Facebook, that's when the breakup was official."

Monday, December 03, 2007

Internet Usage in the EU

You might find this item, which reports some internet usage statistics from the EU, interesting. The report looks at the percentage of households with broadband connections, gender and IT use, and internet skills (though not broken down by age or gender).

The Importance of Intellectual Community

Today’s Chronicle of Higher Education features an article – George Walker et al., “The Importance of Intellectual Community” – about a new study of American doctoral education coming out next month. The five-year study, run by the Carnegie Foundation for the Advancement of Teaching, examined “84 departments in six fields: chemistry, education, English, history, mathematics, and neuroscience.” It emphasizes the notion of “intellectual community”; here are some excerpts:

“What qualities make intellectual communities more vibrant, enriching, stimulating, welcoming, and suited to the formation of scholars and the building of knowledge? First, they have a shared purpose -- a commitment to help students develop into the best scholars possible so that they, in turn, may contribute to the creation and growth of knowledge. Strong intellectual communities are also:

Diverse and multigenerational. True intellectual exchange must include a wide range of opinions that challenge and inform thinking. Scholars who are not actively involved in an environment of diverse viewpoints and healthy debate may find their work intellectually malnourished. Often doctoral programs approach the topic of diversity as a concern for numbers of people who can be counted in different ways and attention to access is a crucial agenda. But an equally important motivation for diversity is to ensure access to a wide range of viewpoints that enrich intellectual exchange. In addition, a vibrant intellectual community is one in which students are integrated as junior colleagues. Indeed, a key finding of our research is just how great a contribution students, who bring fresh perspectives, can make to the intellectual life of a department.

Flexible and forgiving. Mistakes can be a source of strength. Unfortunately, however, departments are often structured and supported in ways that leave little time and few resources for projects that might not pan out, and in an academic culture that increasingly values "productivity," the need for reflection and thought is profoundly undervalued. Creating a space, literally and metaphorically, to try out new ideas, to "take a flyer," to play, and to step back and reflect on what has been learned is essential.

Respectful and generous. Without creating a climate of political correctness, it is necessary to treat one another respectfully regardless of differing opinions. Understanding that one person's success does not come at the expense of another's, scholars should also share opportunities ("Have you seen this grant application?"), intellectual resources ("Here are three articles you might find helpful"), and connections ("Let me introduce you to Professor X because you will find each other's work interesting"). Generosity seems to flourish when senior faculty members are confident in their own expertise and assume the responsibility to serve as mentors to the next generation of scholars.

Strong intellectual communities also:

Engage students fully in the life of the department. A department with a healthy intellectual community involves students in serving on committees, hosting outside scholars, planning events, being mentors to junior students, and shaping policy. Students (especially those at the beginning of their program) need explicit invitations and routines for such engagement. For instance, in response to the type of problem that we described at the beginning through Anna's story, the history department at the University of Pittsburgh has instituted a rule for one of its seminar series: The first three questions must come from students. That small gesture speaks volumes.

Collaborate on the curriculum. Like the work that goes into a set of departmental goals, curriculum design and course development can bring people together around questions of purpose, as they often quickly move from discussions of specific content to larger debates about what knowledge scholars in the discipline should acquire. For instance, faculty members in the School of Education at the University of Colorado at Boulder engaged in lengthy discussions about what students should know and be able to do. Although the process was often contentious, there is now a clear understanding, among both faculty members and students, of what course work and thus what content and skills are expected of all students. That understanding informs a continuing revision of other aspects of the doctoral program, including comprehensive exams and expectations for the dissertation. Similarly, in the University of Michigan at Ann Arbor chemistry department, doctoral students have opportunities to work with faculty members on the design of undergraduate curriculum—again a chance both to debate ideas and build professional community.

Share research across boundaries. Every department has program subareas, and those are often lively intellectual communities in themselves. One strategy for creating intellectual community is to create research seminars that bridge subspecialties; such connections are especially important as disciplinary boundaries blur. Connections with others in different subareas can lead to new collaborations, especially if the department invites students to organize such activities.

Open classroom doors. For graduate students, seeing how and what others teach is an opportunity not only to expand their pedagogical repertoire but also to observe various modes of explanation, different metaphors, and other models for transforming key ideas in the field. For faculty members, that approach communicates an interest in the work of colleagues and students and also provides a chance to reflect on their own teaching. Departments where classroom doors are open, metaphorically and otherwise, are settings for building a particular kind of intellectual community that some are calling a "teaching commons."

Set aside time for reflection. Many of our partners in the Carnegie Initiative on the Doctorate use departmental retreats as a time to step away from day-to-day demands. Agendas vary from a focused attempt to solve a specific problem to a wide-ranging program that includes many different opportunities to think, discuss, argue, and create. Setting aside time to think and to build the community in which careful thought is possible sends a powerful signal.

Create physical spaces for community. Much of the research on organizational culture points to the value of informal interaction. Although, by definition, that is not something that can be planned, the chances that it will happen increase when there are kitchens, lounges, bulletin boards, and electronic spaces where department members can connect with others. In that spirit, the English department at Texas A&M University at College Station provides refreshments for students each week at a regular time and invites them to get together in a new lounge to talk about whatever issues are important to them.

Encourage social events. Although intellectual community requires more than potlucks and softball games, social activities clearly strengthen a community that already has intellectual ties. Many math departments, for instance, have a tradition of afternoon teas—informal times when students and faculty gather to discuss ideas and problems. Such events allow students to get to know faculty members in a relaxed setting.

The full essay can be found at http://chronicle.com/daily/2007/12/862n.htm. The book will be published by Jossey-Bass.

Saturday, December 01, 2007

Toothpicks and Design

Henry Petroski is a rare individual, an engineer who writes (and well) about engineering for the lay public. In his most recent book, The Toothpick: Technology and Culture (New York: Alfred A. Knopf, 2007), Petroski relates how he had found an “engagingly simple device that would serve to illustrate some basic principles of engineering and design and that at the same time would help reveal the inevitable interrelationships between technology and culture” (p. xi). Petroski writes an engaging, informal history of the toothpick, the machinery supporting the manufacture of toothpicks, and the difficulty of finding information about the toothpick (as it turns out, the industry is ultra-secretive about how it works). As Petroski relates, “Trying to divine how a toothpick was made is no mean feat. Ironically, more complicated things – like automobiles and cellular phones – might be more readily reverse-engineered than the very simple. . . . Things of a whole cannot be disassembled because they have no component parts” (p. 173). This is a good read.

Monday, November 26, 2007

The Giant Global Graph

Tim Berners-Lee's blog post on the next tier of connectivity on the web (connecting computers, data/documents, and people) is discussed here.

Blogged with Flock

Friday, November 16, 2007

Tasty Data Goodies - Swivel

I came across a new web service/site in connection with some research I was doing, Tasty Data Goodies - Swivel. A warning ... there is some interesting stuff here so it can be a time sink once you go there!

Thursday, November 15, 2007

Is email for old people?

If you have a teen or a young adult in your life, this may not come as a surprise. So, you might find this article interesting. Quoting the article:
just as older generations have embraced emails, kids have moved on to many different forms of communication from instant messaging to text messaging to private messaging through social networks to broadcast messaging through Twitter and Facebook news feeds. And, while it worries the reporter a bit, he's come to accept it and realize that kids are simply figuring out the best, most efficient way to communicate different messages -- where email as a one-size-fits-all communication system is a bit clunky.

Wednesday, November 14, 2007

Scholarship in the Digital Age

Today’s Inside Higher Ed features an interview with Christine L. Borgman about her new book, Scholarship in the Digital Age: Information, Infrastructure, and the Internet, published by the MIT Press.

Here is an excerpt of the interview. . . .

Q: What do you see as the key unexplored policy issues raised by digital scholarship?

A: The overarching policy issue is what the new scholarly information infrastructure should be. Cyberinfrastructure is the policy answer of the moment. My concern is whether this is a solution in search of a problem that we don’t yet fully understand. Building something is much easier than is determining what to build – the risk today is that we construct a new infrastructure that locks in a number of questionable assumptions about what scholarship is and what it could be in the future.

Some aspects of a successful new scholarly infrastructure are these:

* It would support both collaborative and independent research and learning.
* It would provide relatively easy and equitable access to information resources and to the tools to use them.
* It would provide scholars in all fields with the ability to use their own research data and that of others to ask new questions and to visualize and model their data in new ways. For example:
  o Scientists – better models of the environment.
  o Social scientists – better ways to analyze social trends.
  o Humanists – new ways to explore and explain culture – and to mine all those books being digitized by Google, the Open Content Alliance, and other international projects.
* Open access would prevail, and access to digital content would be permanent.
* Institutional responsibility for obtaining and maintaining digital content would be clear and would be sustainable.

We haven’t achieved any of these goals yet. Each of the many stakeholders – scholars, students, universities, publishers, librarians, archivists, funding agencies, and the taxpaying public – has different concerns for how these functions should be addressed. What we need is a broader conversation that includes these many interested parties.

The full interview can be found at http://insidehighered.com/news/2007/11/14/borgman

Thursday, November 08, 2007

Designing the Future

Donald Norman has written extensively about design, education, information technology, and a host of other topics. His latest book, The Design of Future Things (New York: Basic Books, 2007), will be of interest to faculty in our school, especially those interested in social computing.

Norman presents his thesis early in his book, suggesting that “as machines start to take over more and more, . . . they need to be socialized; they need to improve the way they communicate and interact and to recognize their limitations. Only then can they become truly useful” (p. 9). Norman is dealing, of course, with the “limitations” of machines – “they do not sense the world in the same way as people, they lack higher order goals, and they have no way of understanding the goals and motives of the people with whom they must interact” (p. 14).

One of the more interesting points Norman makes, at least a point that is interesting for SIS, is the interdisciplinary nature of design: “Design cuts across all disciplines, be it the arts or sciences, humanities or engineering, law or business. In universities, the practical is often judged less valuable than the abstract and theoretical. Universities, moreover, put each discipline into separate schools and departments, where people mainly talk to others within their own narrowly defined categories. This compartmentalization is optimal for developing specialists who understand their narrow area in great depth. It is not well suited for the development of generalists whose work cuts across disciplines. Even when the university tries to overcome this deficit by establishing new, multidisciplinary programs, the new program soon becomes its own discipline and grows more and more specialized each year” (pp. 171-172). Norman’s assessment seems awfully close to the nature of the challenges we face here, from the reorganization of the School to the design and offering of a common introductory course. Norman argues that we need a “science of design” (p. 172). “We need a new approach, one that combines the precision and rigor of business and engineering, the understanding of social interactions, and the aesthetics of the arts” (p. 173) – and even here we can hear echoes of our discussions about technology, people, and society.

Wednesday, November 07, 2007

Networking's 50 greatest arguments

For those of you who are not from the telecom world, you might enjoy scanning this item, which claims to list the "50 greatest arguments" among networking professionals. As one from this world, I can attest that these are indeed significant debates that have happened or are happening. Many have never been resolved (except by administrative fiat in an organization). I would also note that the list contains items of varying significance (e.g., Intel vs. AMD). Anyway, I think this list might be helpful in giving you some insight into what practicing telecom professionals concern themselves with.

Tuesday, October 30, 2007

Google news and more

There have been a couple of items related to Google lately that caught my attention (I have posted on some of Google's (apparent) strategic intentions before, if you search my blog). I have come to be skeptical of their "Don't be evil" motto ... Hence, I take the liberty of copying my post from that blog over here.

The first thing is this item, which purports to reveal Google's "wireless plans". According to the Wall Street Journal (2007-10-30):

Within the next two weeks, Google is expected to announce advanced software and services that would allow handset makers to bring Google-powered phones to market by the middle of next year, people familiar with the situation say. In recent months Google has approached several U.S. and foreign handset manufacturers about the idea of building phones tailored to Google software, with Taiwan's HTC Corp. and South Korea's LG Electronics Inc. mentioned in the industry as potential contenders. Google is also seeking partnerships with wireless operators. In the U.S., it has the most traction with Deutsche Telekom AG's T-Mobile USA, while in Europe it is pursuing relationships with France Télécom's Orange SA and Hutchison Whampoa Ltd.'s 3 U.K., people familiar with the matter say. A Google spokeswoman declined to comment.

The Google-powered phones are expected to wrap together several Google applications -- among them, its search engine, Google Maps, YouTube and Gmail email -- that have already made their way onto some mobile devices. The most radical element of the plan, though, is Google's push to make the phones' software "open" right down to the operating system, the layer that controls applications and interacts with the hardware. That means independent software developers would get access to the tools they need to build additional phone features.

Developers could, for instance, more easily create services that take advantage of users' Global Positioning System location, contact lists and Web-browsing habits. They also would be able to interact with Google Maps and other Google applications. The idea is that a range of new social networking, mapping and other services would emerge, just as they have on the open, mostly unfettered Web. Google, meanwhile, could gather user data to show targeted ads to cellphone users.

-----snip----

Google helped push through controversial rules for a coming spectrum auction at the Federal Communications Commission that would result in a new cellular network open to all devices and software applications, even those not favored by an operator. Google has said it will probably bid for the frequencies.

For now, the company knows it has no choice but to work with operators to make its open platform successful. D.P. Venkatesh, CEO of mPortal Inc., which makes software for wireless operators, puts it this way: "There are a few things carriers control that will always keep them in charge at the end of the day."

But broader (and deeper) thoughts come from Robert Cringely and Nicholas Carr. In his recent article, Cringely writes:


Here is what's significant about Google putting code into MySQL: they haven't done it before. Google has been a MySQL user from almost the very beginning, customizing the database in myriad ways to support Google's widely dispersed architecture with hundreds of thousands of servers. Google has felt no need previously to contribute code to MySQL. So what changed? While Google has long been able to mess with the MySQL code in ITS machines, it hasn't been able to mess with the code in YOUR machine and now it wants to do exactly that. The reason it will take so long to roll out MySQL 6.1 is that Google will only deliver its MySQL extensions for Linux, leaving MySQL AB the job of porting that code to the 15 other operating systems they support. That's what will take until early 2009.

Then what? I think the best clue comes from the agreement Google recently signed with IBM to co-promote cloud computing in universities.

Cloud computing is, of course, the ability to spread an application across one or many networked CPUs. You can think of it as renting computer power or having the ability to infinitely scale a local application without buying new hardware. Cloud computing can be anything from putting your entire business on other people's computers to running a huge Photoshop job from the lobby computer at Embassy Suites before jumping on the shuttle bus to Disney World with your kids. For all its promise, though, cloud computing has been pretty much a commercial failure so far.

-------snip----------

But Google wants us to embrace not just cloud computing but Google's version of cloud computing, the hooks for which will be in every modern operating system by mid-2009, spread not by Google but by a trusted open source vendor, MySQL AB.

Mid-2009 will also see the culmination of Google's huge server build-out. The company is building data centers large and small around the world and populating them with what will ultimately be millions of generic servers. THAT's when things will get really interesting. Imagine a much more user-friendly version of Amazon's EC2 and S3 services, only spread across 10 or more times as many machines. And as with all its services, Google will offer free versions at the bottom for consumers and paid, but still cost-effective versions nearer the top for businesses and education.

If you are completely OK with this and are looking forward to this environment -- and there is much to look forward to -- you might read and consider Nicholas Carr's allegory (that is in the spirit of Halloween), which is difficult to summarize here.

Monday, October 29, 2007

Will the Internet transform storytelling?

Here is an article on MSNBC that looks at interactive story-telling and how the Internet may influence it.

Blogged with Flock

Sunday, October 28, 2007

Minding the Store

Howard Gardner, the prolific Harvard cognitive scientist, has two new books out that deserve a reading by SIS faculty and students. His Five Minds for the Future (Boston: Harvard Business School Press, 2006) addresses the new ways we need to prepare for the future in education, business, and the professions. He identifies and discourses on five cognitive abilities that we need to develop and nurture – the disciplinary mind, the synthesizing mind, the creating mind, the respectful mind, and the ethical mind. His comments on the disciplinary mind provide a window into his thinking as represented in the book. He argues that it takes a decade to master a discipline, and that education, seen as a lifelong activity, is a key to enabling such mastery. Gardner lays out the steps for achieving the disciplined mind. He argues that “an individual is disciplined to the extent that she has acquired the habits that allow her to make steady and essentially unending progress in the mastery of a skill, craft, or body of knowledge” (p. 40). There is a lot in this volume that will provide food for thought about how and what we teach in a school like ours. For example, Gardner at one point notes that the “ability to knit together information from disparate sources into a coherent whole is vital today” (p. 82), a statement nearly all of us will acknowledge without trouble but one that we will then counter by our own resistance to cooperating across disciplinary, professional, and political boundaries within our own school.

His other new book is the edited volume Responsibility at Work: How Leading Professionals Act (or Don’t Act) Responsibly (San Francisco: John Wiley and Sons, Jossey-Bass, 2007). Another volume in the GoodWork research project, it brings together essayists who explore issues of ethics, responsibility, and accountability in a number of professions (such as journalism, law, medicine, and education). Needless to say, there is a lot here about both how we teach ethical and related issues and how we ourselves act in our own school.

Friday, October 26, 2007

The Science Education Myth

This article in BusinessWeek reports on a new study that runs counter to conventional wisdom (and, for that matter, to the National Academies). I think it provides some grist for the mill, both within and outside of SIS. A teaser from the article:

Yet a new report by the Urban Institute, a nonpartisan think tank, tells a different story. The report disproves many confident pronouncements about the alleged weaknesses and failures of the U.S. education system. This data will certainly be examined by both sides in the debate over highly skilled workers and immigration (BusinessWeek.com, 10/10/07). The argument by Microsoft (MSFT), Google (GOOG), Intel (INTC), and others is that there are not enough tech workers in the U.S.

The authors of the report, the Urban Institute's Hal Salzman and Georgetown University professor Lindsay Lowell, show that math, science, and reading test scores at the primary and secondary level have increased over the past two decades, and U.S. students are now close to the top of international rankings. Perhaps just as surprising, the report finds that our education system actually produces more science and engineering graduates than the market demands.

Thursday, October 25, 2007

GAO on Internet Infrastructure

This GAO Report might be of broader interest within the school. The "Highlights" section reads as follows:

Why GAO Did This Study

Since the early 1990s, growth in the use of the Internet has revolutionized the way that our nation communicates and conducts business. While the Internet originated as a U.S. government-sponsored research project, the vast majority of its infrastructure is currently owned and operated by the private sector. Federal policy recognizes the need to prepare for debilitating Internet disruptions and tasks the Department of Homeland Security (DHS) with developing an integrated public/private plan for Internet recovery.

GAO was asked to summarize its report on plans for recovering the Internet in case of a major disruption (GAO-06-672) and to provide an update on DHS’s efforts to implement that report’s recommendations. The report (1) identifies examples of major disruptions to the Internet, (2) identifies the primary laws and regulations governing recovery of the Internet in the event of a major disruption, (3) evaluates DHS plans for facilitating recovery from Internet disruptions, and (4) assesses challenges to such efforts.

What GAO Recommends

In its report, GAO made recommendations to DHS to strengthen its ability to help recover from Internet disruptions. In written comments, DHS agreed with these recommendations.

Tuesday, October 23, 2007

Vaidhyanathan and his book in progress on Google

Siva Vaidhyanathan is working on a book called "The Googlization of Everything". The book project has a blog, where he invites people to comment on various postings related to Google. One of the topical areas in the blog is "Is Google a Library?" You can find his blog here.

Here is what he says in one of his first postings about what he is doing:

"As you can tell from the title of this blog, the book will be about Google and all they ways that Google is shaking up the world. Google is a transformative and revolutionary company. I hesitate to use terms like that. We live in an era of hyperbole. So I try my best to discount claims of historical transformation or communicative revolutions.

But in the case of Google, I am confident it is both.

Now, I am approaching this book as both a fan and a critic. I am in awe of all that Google has done and all it hopes to do. I am also wary of its ambition and power.

As I use this site to compose the manuscript (an archaic word that I love too much to discard) for the book The Googlization of Everything, I hope to do so with your help.

This is the latest in a series of “open book” experiments hosted and guided by The Institute for the Future of the Book. The Institute has been supportive of my work for years – long before I became affiliated with it as a fellow and certainly long before we thought up this project together. As with the other projects by Ken Wark and Mitch Stephens, this one will depend on reader criticism and feedback to work right. So this is an appeal for help. If you know something about Google, hip me to it. If you have an observation about how it works or how it affects our lives, write to me about it."

Monday, October 22, 2007

interesting NYTimes article on Open Content Alliance

October 22, 2007: Libraries Shun Deals to Place Books on Web

By KATIE HAFNER

Several major research libraries have rebuffed offers from Google and Microsoft to scan their books into computer databases, saying they are put off by restrictions these companies want to place on the new digital collections.

The research libraries, including a large consortium in the Boston area, are instead signing on with the Open Content Alliance, a nonprofit effort aimed at making their materials broadly available.

Libraries that agree to work with Google must agree to a set of terms, which include making the material unavailable to other commercial search services. Microsoft places a similar restriction on the books it converts to electronic form. The Open Content Alliance, by contrast, is making the material available to any search service.

Google pays to scan the books and does not directly profit from the resulting Web pages, although the books make its search engine more useful and more valuable. The libraries can have their books scanned again by another company or organization for dissemination more broadly.

It costs the Open Content Alliance as much as $30 to scan each book, a cost shared by the group’s members and benefactors, so there are obvious financial benefits to libraries of Google’s wide-ranging offer, started in 2004.

Many prominent libraries have accepted Google’s offer — including the New York Public Library and libraries at the University of Michigan, Harvard, Stanford and Oxford. Google expects to scan 15 million books from those collections over the next decade.

But the resistance from some libraries, like the Boston Public Library and the Smithsonian Institution, suggests that many in the academic and nonprofit world are intent on pursuing a vision of the Web as a global repository of knowledge that is free of business interests or restrictions.

Even though Google’s program could make millions of books available to hundreds of millions of Internet users for the first time, some libraries and researchers worry that if any one company comes to dominate the digital conversion of these works, it could exploit that dominance for commercial gain.

“There are two opposed pathways being mapped out,” said Paul Duguid, an adjunct professor at the School of Information at the University of California, Berkeley. “One is shaped by commercial concerns, the other by a commitment to openness, and which one will win is not clear.”

Last month, the Boston Library Consortium of 19 research and academic libraries in New England that includes the University of Connecticut and the University of Massachusetts, said it would work with the Open Content Alliance to begin digitizing the books among the libraries’ 34 million volumes whose copyright had expired.

“We understand the commercial value of what Google is doing, but we want to be able to distribute materials in a way where everyone benefits from it,” said Bernard A. Margolis, president of the Boston Public Library, which has in its collection roughly 3,700 volumes from the personal library of John Adams.

Mr. Margolis said his library had spoken with both Google and Microsoft, and had not shut the door entirely on the idea of working with them. And several libraries are working with both Google and the Open Content Alliance.

Adam Smith, project management director of Google Book Search, noted that the company’s deals with libraries were not exclusive. “We’re excited that the O.C.A. has signed more libraries, and we hope they sign many more,” Mr. Smith said.

“The powerful motivation is that we’re bringing more offline information online,” he said. “As a commercial company, we have the resources to do this, and we’re doing it in a way that benefits users, publishers, authors and libraries. And it benefits us because we provide an improved user experience, which then means users will come back to Google.”

The Library of Congress has a pilot program with Google to digitize some books. But in January, it announced a project with a more inclusive approach. With $2 million from the Alfred P. Sloan Foundation, the library’s first mass digitization effort will make 136,000 books accessible to any search engine through the Open Content Alliance. The library declined to comment on its future digitization plans.

The Open Content Alliance is the brainchild of Brewster Kahle, the founder and director of the Internet Archive, which was created in 1996 with the aim of preserving copies of Web sites and other material. The group includes more than 80 libraries and research institutions, including the Smithsonian Institution.

Although Google is making public-domain books readily available to individuals who wish to download them, Mr. Kahle and others worry about the possible implications of having one company store and distribute so much public-domain content.

“Scanning the great libraries is a wonderful idea, but if only one corporation controls access to this digital collection, we’ll have handed too much control to a private entity,” Mr. Kahle said.

The Open Content Alliance, he said, “is fundamentally different, coming from a community project to build joint collections that can be used by everyone in different ways.”

Mr. Kahle’s group focuses on out-of-copyright books, mostly those published in 1922 or earlier. Google scans copyrighted works as well, but it does not allow users to read the full text of those books online, and it allows publishers to opt out of the program.

Microsoft joined the Open Content Alliance at its start in 2005, as did Yahoo, which also has a book search project. Google also spoke with Mr. Kahle about joining the group, but they did not reach an agreement.

A year after joining, Microsoft added a restriction that prohibits a book it has digitized from being included in commercial search engines other than Microsoft’s. “Unlike Google, there are no restrictions on the distribution of these copies for academic purposes across institutions,” said Jay Girotto, group program manager for Live Book Search from Microsoft. Institutions working with Microsoft, he said, include the University of California and the New York Public Library.

Some in the research field view the issue as a matter of principle.

Doron Weber, a program director at the Sloan Foundation, which has made several grants to libraries for digital conversion of books, said that several institutions approached by Google have spoken to his organization about their reservations. “Many are hedging their bets,” he said, “taking Google money for now while realizing this is, at best, a short-term bridge to a truly open universal library of the future.”

The University of Michigan, a Google partner since 2004, does not seem to share this view. “We have not felt particularly restricted by our agreement with Google,” said Jack Bernard, a lawyer at the university.

The University of California, which started scanning books with the Open Content Alliance, Microsoft and Yahoo in 2005, has added Google. Robin Chandler, director of data acquisitions at the University of California’s digital library project, said working with everyone helps increase the volume of the scanning.

Some have found Google to be inflexible in its terms. Tom Garnett, director of the Biodiversity Heritage Library, a group of 10 prominent natural history and botanical libraries that have agreed to digitize their collections, said he had had discussions with various people at both Google and Microsoft.

“Google had a very restrictive agreement, and in all our discussions they were unwilling to yield,” he said. Among the terms was a requirement that libraries put their own technology in place to block commercial search services other than Google, he said.

Libraries that sign with the Open Content Alliance are obligated to pay the cost of scanning the books. Several have received grants from organizations like the Sloan Foundation.

The Boston Library Consortium’s project is self-funded, with $845,000 for the next two years. The consortium pays 10 cents a page to the Internet Archive, which has installed 10 scanners at the Boston Public Library. Other members include the Massachusetts Institute of Technology and Brown University.

The scans are stored at the Internet Archive in San Francisco and are available through its Web site. Search companies including Google are free to point users to the material.

On Wednesday the Internet Archive announced, together with the Boston Public Library and the library of the Marine Biological Laboratory and Woods Hole Oceanographic Institution, that it would start scanning out-of-print but in-copyright works to be distributed through a digital interlibrary loan system.

Copyright 2007 The New York Times Company

Friday, October 19, 2007

The Brain is NOT your Friend

I found these two articles (article 1, article 2) interesting. I don't know if this is true or can be backed up scientifically, but it did make interesting reading.

Blogged with Flock

Wednesday, October 17, 2007

What does it take to hire and retain technical talent?

Continuing with my recent thread on IT employment, you might find this article from Network World interesting. Quoting the article:

IT managers say now more than ever they struggle to find qualified IT employees and to keep key talent on staff, while tasked with delivering ever more services to the business.

The concerns certainly are not new, but they are growing in importance, according to the Society for Information Management (SIM). This week the organization revealed that "Attracting, developing and retaining IT professionals" topped its annual list of IT management concerns, ahead of financial woes related to reducing costs and technology-related worries such as security and privacy. SIM reports that IT managers are experiencing staff turnover rates of less than 10% this year, which is far lower than years past, and that attracting talent worries most because of a dwindling resource pool.

"The challenges with attracting talent stems from a pipeline of qualified candidates that isn't as large as the demand companies are seeing for new people, which is a good thing for our discipline," says Jerry Luftman, SIM's vice president of academic affairs. "It's good news that the amount of resources being demanded from IT are on the rise, but it's happening at the same time a large number of workers are leaving the workforce."

As a result, IT managers say they wrestle often with the challenge of hiring people that fit well into their organization in the hopes they won't lose the talent quickly.

Tuesday, October 16, 2007

U.S. Tech Workers Fight Back

You might find this story interesting in the context of technology employment in the US. Not surprisingly, there are diverse interests at play in Congressional policy discussions around immigration reform for tech workers.

Monday, October 15, 2007

2008 IT Outlook

While crystal-ball gazing is always a hazardous business, we ignore prognostications at our own peril. Thus, you might find this item interesting. Unfortunately, it is not terribly encouraging (though also not discouraging), just as enrollments in the IS programs are beginning to recover.

Thursday, October 11, 2007

Gartner's top 10 strategic technologies for 2008

You might find this article interesting. While I believe that it is important not to focus too heavily on year-to-year trends, there may be some deeper kernels here that are worth responding to in our teaching and research.

A Primer on the Structured Web

Please see this article.

Blogged with Flock

Tuesday, October 09, 2007

'Burgh goes virtual on Google


3D cityscapes to appear on Google Maps' Street View
Tuesday, October 09, 2007
By Elwin Green, Pittsburgh Post-Gazette

Smile, Pittsburgh -- you're on Google-vision.

Starting today, Pittsburgh joins the list of 14 cities that Internet users can tour online by using Google Maps' Street View. The service, which launched in May, offers three-dimensional, photographic views of cityscapes that a user can navigate street by street to locate everything from apartments to gas stations to theaters.

With Street View, dragging and dropping the icon of a human figure onto a highlighted street will pull up a photo of the street, placing the user in a virtual world in which it is possible to move forward, backward, to the side -- even to rotate 360 degrees.

See http://www.post-gazette.com/pg/07282/823985-96.stm for full story.

UC Berkeley Classes on YouTube

After iTunes and iPod, I guess this was the next step for UC Berkeley. Here are six videos on search engines by UC Berkeley on YouTube.

Blogged with Flock

Monday, October 08, 2007

Encore Generation

At a couple of recent meetings I found myself making reference to a recent book on the encore generation, healthy baby boomers retiring and then returning to the workforce to look for new ways of making contributions to society. The book in question is Marc Freedman, Encore: Finding Work That Matters in the Second Half of Life (New York: Public Affairs, 2007). Freedman describes how many baby boomers are facing 30 years of retirement and how many were dissatisfied with their former careers and now see a way to do new and more positive things, especially as predictions of labor shortages of 9 million in 2010 and 18.1 million in 2020 loom ahead of us. Since many of these baby boomers are fairly sophisticated in their knowledge and use of information technology, this may be a new market of new and different students for a school like ours. In my opinion, this is a book worth a read and worth some thinking about new programs, especially for our underutilized CAS degree.

Bill Gates on health care (WSJ)

I thought that this opinion piece in the Wall Street Journal on October 5 would be of interest to the SIS community.  Here is the article:

We live in an era that has seen our knowledge of medical science and treatment expand at a speed that is without precedent in human history. Today we can cure illnesses that used to be untreatable and prevent diseases that once seemed inevitable. We expect to live longer and remain active and productive as we get older. Ongoing progress in genetics and our understanding of the human genome puts us on the cusp of even more dramatic advances in the years ahead.

But for all the progress we've made, our system for delivering medical care is clearly in crisis. According to a groundbreaking 1999 report on health-care quality published by the Institute of Medicine (the medical arm of the National Academy of Sciences) as many as 98,000 Americans die every year as a result of preventable medical errors. That number makes the health-care system itself the fifth-leading cause of death in this country.

Beyond the high cost in human life, we pay a steep financial price for the inability of our health-care system to deliver consistent, high-quality care. Study after study has documented the billions of dollars spent each year on redundant tests, and the prolonged illnesses and avoidable injuries that result from medical errors. The impact ripples through our society, limiting our ability to provide health care to everyone who needs it and threatening the competitiveness of U.S. businesses, which now spend an average of $8,000 annually on health care for employees.

At the heart of the problem is the fragmented nature of the way health information is created and collected. Few industries are as information-dependent and data-rich as health care. Every visit to a doctor, every test, measurement, and procedure generates more information. But every clinic, hospital department, and doctor's office has its own systems for storing it. Today, most of those systems don't talk to each other.

Isolated, disconnected systems make it impossible for your doctor to assemble a complete picture of your health and make fully informed treatment decisions. It also means that the mountain of potentially lifesaving medical information that our health-care system generates is significantly underutilized. Because providers and researchers can't share information easily, our ability to ensure that care is based on the best available scientific knowledge is sharply limited.

There is widespread awareness that we need to address the information problem. In 2001, the Institute of Medicine issued a follow-up report on health-care quality that urged swifter adoption of information technology and greater reliance on evidence-based medicine. In his 2006 State of the Union address, President Bush called on the medical system to "make wider use of electronic records and other health information technology."

But increased digitization of health-care information alone will not solve the problems we face. Already, nearly all procedures, test results and prescriptions are recorded in digital form -- that's how health-care providers transmit information to health insurers so they can be paid for their work. But patients never see this data, and doctors are unable to share it. Instead, individuals do their best to piece together the information that they think their caregivers might need about their medical history, the medications they take and the tests they've undergone.

What we need is to place people at the very center of the health-care system and put them in control of all of their health information. Developing the solutions to help make this possible is an important priority for Microsoft. We envision a comprehensive, Internet-based system that enables health-care providers to automatically deliver personal health data to each patient in a form they can understand and use. We also believe that people should have control over who they share this information with. This will help ensure that their privacy is protected and their care providers have everything they need to make fully-informed diagnoses and treatment decisions.

I believe that an Internet-based health-care network like this will have a dramatic impact. It will undoubtedly improve the quality of medical care and lower costs by encouraging the use of evidence-based medicine, reducing medical errors and eliminating redundant medical tests. But it will also pave the way toward a more important transformation.

Today, our health-care system encourages medical professionals to focus on treating conditions after they occur -- on curing illness and managing disease. By giving us comprehensive access to our personal medical information, digital technology can make us all agents for change, capable of pushing for the one thing that we all really care about: a medical system that focuses on our lifelong health and prioritizes prevention as much as it does treatment. Putting people at the center of health care means we will have the information we need to make intelligent choices that will allow us to lead healthy lives -- and to search out providers who offer care that does as much to help us stay well as it does to help us get better.

The technology exists today to make this system a reality. For the last 30 years, computers and software have helped industry after industry eliminate errors and inefficiencies and achieve new levels of productivity and success. Many of the same concepts and approaches that have transformed the world of business -- the digitization of information, the creation of systems and processes that streamline and automate the flow of data, the widespread adoption of tools that enable individuals to access information and take action -- can be adapted to the particular requirements of health care.

No one company can -- or should -- hope to provide the single solution to make all of this possible. That's why Microsoft is working with a wide range of software and hardware companies, as well as with physicians, hospitals, government organizations, patient advocacy groups and consumers to ensure that, together, we can address critical issues like privacy, security and integration with existing applications.

Technology is not a cure-all for the issues that plague the health-care system. But it can be a powerful catalyst for change, here in the U.S. and in countries around the globe where access to medical professionals is limited and where better availability of health-care information could help improve the lives of millions of people.

This discussion of the Wikipedia phenomenon was insightful, IMHO: "Wikipedia's Awkward Adolescence" [from CIO magazine], http://www.cio.com/article/141650

Friday, October 05, 2007

Stephen Levy & Technology

There is an interesting interview with Newsweek’s technology writer, Steven Levy, on Ubiquity, Volume 8, Issue 39 (October 2, 2007 – October 8, 2007). You can find the entire interview at http://www.acm.org/ubiquity/interviews/v8i39_levy.html.

The abstract reads: Steven Levy, the chief technology writer for Newsweek magazine, has written a number of best-selling books, the latest of which is The Perfect Thing: How the iPod Shuffles Culture, Commerce, and Coolness. When we asked acclaimed software developer Marney Morris to comment on this interview, she responded: "I've been reading Steven Levy's thoughts on technology for over 20 years, and he is still as fresh and insightful as he was back then -- in the beginning days of the personal computer. Steven weaves the implications of technology change into social meaning with wit and intelligence. He was the best, brightest and funniest guy in the tech arena before he moved to Newsweek in '95, and he still is. He is a great guy and a great journalist. I'm honored that I got to say so."

Here is a sample:

UBIQUITY:
What about the future of the book, and printing, and reading?

LEVY:
I think about that a lot. I think that, as wonderful as the form of the book is, it's ridiculous to think that we're not going to come up with some electronic device that is able to replicate 99% of the good stuff about a physical book along with all the extra virtues you could have, like electronic storage. It's going to happen. I don't know whether it's going to happen in five years or ten years or thirty years. But it's got to happen, and you're going to have something which is flexible and as readable - and pleasurable as a physical book. And they'll have all the stuff that comes with being digital - searchability, connectivity, you name it. And that's going to be a huge change, and eventually it will change the way writers work and what they write, just as the printed book made the novel possible. So I think that, in the short term, yeah, you find me still writing books and hoping that people will still buy the books. And I think they will buy books, if not necessarily mine. But in the long term all publishing has got to be electronic, and I think that's going to change a lot of things. Some of those changes will be things that we'll miss, but I think that if you take a broad view of history, everything will work out just fine. You know, it's sad in a sense that we don't have the oral tradition, that we don't sit and tell long stories over bonfires. We've moved on to something else. Maybe it's time to return.

Tuesday, October 02, 2007

A review of "Surface"

This article takes a look at Microsoft Surface. I am not sure we can call it a new paradigm, but it does take some new steps in human-computer interaction.

Blogged with Flock

Monday, October 01, 2007

Flickr and permissions

Raymond Yee, in his talk last week, made reference to a case in which a photo on Yahoo's Flickr service was used in an advertisement without consulting the person involved. Here is an article that discusses the case.

Friday, September 28, 2007

Energy consumption and computing

This article reports on a rough estimate of the gross power consumption of computers in the US.  Quoting the article:
The numbers, in billions of kilowatt-hours, break down as follows:
Data center servers: 45
PCs and monitors: 235
Networking gear: 67
Phone network: 0.4
That amounts to about 350 billion kWh a year, representing a whopping 9.4% of total US electricity consumption. On a global basis, Sarokin estimates that the computing grid consumes 868 billion kWh a year, or 5.3% of total consumption.

I find these kinds of estimates interesting. It is striking how little the phone network consumes in these figures (assuming they are close to accurate).

In these days when some people are concerned about their "carbon footprints", this estimate may get some visibility. How much does computing substitute for other forms of energy consumption (notably transportation in its various forms)? Should computing therefore "count" toward carbon offsets, as a kind of modern indulgence?
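For what it's worth, the arithmetic behind the quoted figures is easy to check. The short Python sketch below just sums the categories and back-calculates the implied US and global electricity totals from the stated 9.4% and 5.3% shares; the implied totals are my own derivation, not numbers reported in the article.

# Back-of-the-envelope check of the figures quoted above (billions of kWh per year).
us_figures = {
    "Data center servers": 45,
    "PCs and monitors": 235,
    "Networking gear": 67,
    "Phone network": 0.4,
}
us_computing = sum(us_figures.values())     # roughly 347, i.e. "about 350"
implied_us_total = us_computing / 0.094     # implied total US consumption, ~3,700
implied_global_total = 868 / 0.053          # implied total global consumption, ~16,400
print(f"US computing load: {us_computing:.1f} billion kWh")
print(f"Implied US electricity total: {implied_us_total:,.0f} billion kWh")
print(f"Implied global electricity total: {implied_global_total:,.0f} billion kWh")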

Thursday, September 27, 2007

Ranking of Universities Internationally

 

This article points to this annual survey ... and highlights the failures of European universities. Quoting the article:

Lack of financing is a key weakness. Public and private spending on higher education in the European Union amounts to only 1.3% of the EU economy—an average $12,000 per student. That compares with 3.3% of gross domestic product in the U.S., or $50,000 per student.

The Bruegel report calls for European countries to invest an additional 1% of their economies in higher education. But the report cautions that distribution of the money should be based on performance, not egalitarianism.

To foster excellence, Europe's universities should be given more autonomy and incentives for better performance—and they should be rewarded financially when they excel, says André Sapir, an author of the report. "There needs to be more performance evaluation," Sapir says. "Differentiation is key."

By the way ... Pitt ranked 49th on the list.

Wednesday, September 26, 2007

IT Salary survey

You might find this article to be a worthwhile read. In particular, it lists "Hot IT skills" (the table didn't cut/paste very well, so you'll have to go to the article).

Monday, September 24, 2007

Using Spam Blockers To Target HIV, Too

This article, in the latest issue of BusinessWeek, is very interesting. Quoting the article:

Early this decade, Heckerman was leading a spam-blocking team at Microsoft Research. To build their tool, team members meticulously mapped out thousands of signals that a message might be junk. An e-mail featuring "Viagra," for example, was a good bet to be spam--but things got complicated in a hurry.



If spammers saw that "Viagra" messages were getting zapped, they switched to V1agra, or Vi agra. It was almost as if spam, like a living thing, were mutating.

This parallel between spam and biology resonated for Heckerman, a physician as well as a PhD in computer science. It didn't take him long to realize that his spam-blocking tool could extend far beyond junk e-mail, into the realm of life science. In 2003, he surprised colleagues in Redmond, Wash., by refocusing the spam-blocking technology on one of the world's deadliest, fastest- mutating conundrums: HIV, the virus that leads to AIDS.


It seems that this article can help us understand how IT is joining with traditional areas of study ... so how must IT education mutate to adapt?
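As a toy illustration (my own sketch, not the Microsoft team's actual tool) of the kind of signal the article describes, here is how a filter might still catch the mutated spellings mentioned above by stripping separators and undoing common character substitutions before matching:

import re

# Toy example only: normalize a message so that obfuscated variants of a
# watch-listed token ("V1agra", "Vi agra") still match the plain spelling.
LEET_MAP = str.maketrans({"1": "i", "!": "i", "0": "o", "3": "e", "@": "a", "$": "s"})

def contains_token(message, token="viagra"):
    collapsed = re.sub(r"[\s._-]+", "", message.lower())   # drop spaces and other separators
    return token in collapsed.translate(LEET_MAP)          # undo common substitutions

for sample in ["Cheap Viagra here", "Buy V1agra now", "Vi agra deals", "Quarterly budget update"]:
    print(sample, "->", contains_token(sample))

Real filters, as the article notes, combine thousands of such signals with statistical weighting; the point here is only that each signal has to keep pace with the mutations.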

Wednesday, September 12, 2007

Thin clients and technology history

Today's NY Times had an article on thin clients. I found the comments on technology history interesting (see this), but I think this approach to computing could have large implications for what we include in our technology programs if it becomes dominant.

Hackers and system failures

This article in the NYTimes is worth reading. Quoting the article:
Whether it’s the Los Angeles customs fiasco or the unpredictable network cascade that brought the global Skype telephone service down for two days in August, problems arising from flawed systems, increasingly complex networks and even technology headaches from corporate mergers can make computer systems less reliable. Meanwhile, society as a whole is growing ever more dependent on computers and computer networks, as automated controls become the norm for air traffic, pipelines, dams, the electrical grid and more.

“We don’t need hackers to break the systems because they’re falling apart by themselves,” said Peter G. Neumann, an expert in computing risks and principal scientist at SRI International, a research institute in Menlo Park, Calif.

Steven M. Bellovin, a professor of computer science at Columbia University, said: “Most of the problems we have day to day have nothing to do with malice. Things break. Complex systems break in complex ways.”

Tuesday, September 11, 2007

Technorati attempts to organize the blogosphere

This is interesting. If you go to www.technorati.com and click on the "Topics" tag, you get a real-time feed of blog posts as they change. It is cool to watch for a bit.

UCB is reengineering its engineering curriculum

Since each of the programs in IS&T has undergone (or is undergoing) significant retooling, you might find this article of interest. It may also help us deepen our appreciation for the goals of the proposed SIS 2000 course. Quoting the article:
A major thrust of that effort will be mixing courses from the oft-derided "soft sciences" like sociology and economics, as well as law and design, into engineering students' academic load.

"That is the big, new thing that I'd like to do," Sastry said during a lunch meeting last week. "The time has come for us in engineering to look outwards. The stereotype has always been a quadrangle looking inward."

Two articles from MSNBC.com

These two articles may be of interest for SIS-related topics: the first talks about evolutionary psychology and social networking, and the second asks whether it will be important for people to be able to read in the future.

Blogged with Flock

Friday, September 07, 2007

eBooks again?

The demise of the (paper) book has been oft forecast, yet it persists. This article, in the NY Times, speculates about whether the next generation of eBooks will prevail:
Two new offerings this fall are set to test whether consumers really want to replace a technology that has reliably served humankind for hundreds of years: the paper book.

In October, the online retailer Amazon.com will unveil the Kindle, an electronic book reader that has been the subject of industry speculation for a year, according to several people who have tried the device and are familiar with Amazon’s plans. The Kindle will be priced at $400 to $500 and will wirelessly connect to an e-book store on Amazon’s site.

That is a significant advance over older e-book devices, which must be connected to a computer to download books or articles.

Also this fall, Google plans to start charging users for full online access to the digital copies of some books in its database, according to people with knowledge of its plans. Publishers will set the prices for their own books and share the revenue with Google. So far, Google has made only limited excerpts of copyrighted books available to its users.

Tuesday, August 28, 2007

Grand Challenges for Engineering

It seems that there is no shortage of "grand challenges" documents. This one, a project of the National Academy of Engineering, may be different in that it actively invites the engineering community to contribute. Please take a look, especially at the comments that have been contributed, and consider contributing your own.

Zotero - The Next-Generation Research Tool

You might find this Firefox extension handy in these days of web-based research. I haven't played with it extensively yet ...

Girls in STEM

You might find this article interesting; it addresses five myths about girls in STEM disciplines.

Monday, August 27, 2007

Six figure IT jobs

This article, referred to us by Glen Ray, highlights high-paying IT jobs that should be accessible to MSIS graduates. Perhaps this can be used to support our recruitment efforts.

Mind your manners!

You might find this article enjoyable (I did) ... perhaps this is an area where we need to educate our students, given that increasing numbers are coming to class with laptops. Now, we just need to tackle mobile phones!
Quoting the article:
The Microsoft.com Web site even lists seven rules for using laptops in meetings, including “Make Sure There’s a Point” and “Turn Down Bells and Whistles.” In some meetings, especially if the topic is sensitive, it just seems more respectful to leave the laptops closed. On the other hand, if the meeting is covering a variety of areas and the conversation is moving into something I’m not involved in, I don’t feel too bad about catching up on my e-mail. It beats doing so at 11 p.m.

BBC's Tech Lab

BBC's Tech Lab features articles in which "the world's leading thinkers give a personal view of future technologies". Among the many articles: one by Vint Cerf with his take on the Internet and one by Dave Winer on future "blog-like" technologies.

Blogged with Flock

Friday, August 24, 2007

Beloit's list: This year's Freshman

This list is always entertaining ... some excerpts:
  • What Berlin wall?
  • Pete Rose has never played baseball.
  • Russia has always had a multi-party political system.
  • U2 has always been more than a spy plane.
  • Fox has always been a major network.
  • Women’s studies majors have always been offered on campus.
  • Being a latchkey kid has never been a big deal.
  • They learned about JFK from Oliver Stone and Malcolm X from Spike Lee.
  • China has always been more interested in making money than in reeducation.
  • They’re always texting 1 n other.
  • They will encounter roughly equal numbers of female and male professors in the classroom.
  • Avatars have nothing to do with Hindu deities.
  • The World Wide Web has been an online tool since they were born.

Thursday, August 23, 2007

2007 OFCOM Communications Review

I posted this item over at my blog, and I thought it might be of broader interest:

The British regulator OFCOM released its 2007 review. The report is worth reading ... here are a few of the report's key points:

  • The availability of broadband to more than half of UK households has driven the development of converged services and devices.
  • Convergence has opened up major revenue opportunities for the producers of many content types. Over the first half of 2007 90% of UK singles sales by volume came from digital downloads to the computer or a mobile handset. The market for computer game playing has also been transformed, with millions of consumers worldwide now engaging in shared online gaming experiences.
  • Audiovisual content, by contrast, continues to be largely broadcaster-funded, although independent producer revenue from new media rights more than doubled to £42m in 2006.
  • The traditional advertiser-funded model of broadcast audiovisual output faces pressures both from the growing popularity of online advertising (it rose by nearly half in 2006 to £2bn) and from the multichannels (which attracted advertising revenue of over £1bn in 2006).
  • Increasingly sophisticated devices are beginning to influence consumer behaviour. Fifteen percent of individuals now have a digital video recorder (DVR) and up to 78% of adults who own them say they always, or almost always, fast-forward through the adverts when watching recorded programmes.
  • Bundled communications services are increasingly popular with consumers, with 40% of households now taking more than one communications service from the same provider (up a third on last year). A majority of broadband customers take it as part of a bundle.
  • Each person now consumes more than seven hours of media and communications services cumulatively per day. However, the tendency to consume some media simultaneously means that the actual time spent on media is likely to be less than this.
  • Digital television penetration broke through the 80% barrier in Q1 2007, taking the total number of homes with multichannel television to 20.4 million (80.5% of the total).
  • Radio reach has been stable over the last five years at around 90%. However, total listening hours fell by 1.4% in the year to Q1 2007, and are down 4.0% on five years ago. Listening hours have fallen furthest among 25-34 year olds, down by 17.3% over five years, and among children, down 8.7%. However, the over-55’s are now listening to more radio, with hours up by 5.5%.
  • Some 58% of listeners say they have accessed radio through one of the digital platforms (up seven percentage points on last year); 41% have listened via DTV, 24% over the internet, and 8% via mobile phone. Twenty seven per cent of UK adults now own an MP3 player, with 5% using them to listen to radio podcasts.
  • Average household spend on telecoms services fell by nearly a pound in 2006 to £64.73 per month. For the first time, average mobile spend fell (by 70p to £31.72) as falling prices more than compensated for an increase in the total number of connections and in the average number of voice calls and text messages per subscriber. Like-for-like prices (based on a basket of services with average usage at 2006 levels) fell by nearly 9%.
  • Total industry revenue in 2006 was £47.0bn, of which £38.5bn was retail revenue (i.e. revenue from end-users). This was an increase of 1.4% on 2005 but represents significantly slower growth than the previous five years as fixed-line revenues declined and growth in mobile and broadband revenues slowed.
  • More than half of UK households had broadband by March 2007. The average (blended) headline speed in June 2007 was 4.6Mbit/s, although actual speeds experienced are often considerably lower, varying according to the quality and length of line from premises to exchange and the number of simultaneous users.
  • Households with a mobile connection (93%) exceeded households with a fixed connection (90%) for the first time in 2006. Average calls per mobile connection rose above 100 minutes a month for the first time, while average calls per fixed-line connection fell below 300 minutes.
  • Local loop unbundling accelerated through 2006 so that by the end of March 2007, 72% of UK premises were connected to an unbundled exchange (an increase from 45% in March 2006). The proportion of premises in unbundled areas taking LLU services rose from 3% in March 2006 to 9% in March 2007.
  • Analysis of time spent online reveals that Britain is a nation of shoppers and social networkers. More time was spent on eBay than on any other web site, and social networking sites Bebo, MySpace, Facebook and YouTube are all in the top ten sites by time spent.
  • Women aged 25-34 spend over 20% more time online than their male counterparts. ‘Silver surfers’ also account for an increasing amount of internet use with nearly 30% of total time spent on the internet accounted for by over-50s (although, as over-50s account for 41% of the UK population, their internet usage remains significantly lower than average).
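The qualifier in that last bullet is easy to verify with a quick per-capita calculation (a rough sketch using only the two shares quoted above):

# Per-capita check of the over-50s figure: time share divided by population share.
time_share_over50 = 0.30        # "nearly 30%" of total UK internet time
population_share_over50 = 0.41  # over-50s' share of the UK population
over50_index = time_share_over50 / population_share_over50               # ~0.73
under50_index = (1 - time_share_over50) / (1 - population_share_over50)  # ~1.19
print(f"Over-50s usage index: {over50_index:.2f} (about 27% below the overall average)")
print(f"Under-50s usage index: {under50_index:.2f}")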

Monday, August 20, 2007

Will The Response Of The Library Profession To The Internet Be Self-Immolation?

I am posting this item for Ellen:

Will The Response Of The Library Profession To The Internet Be Self-Immolation?
by Martha M. Yee, with a great deal of help from Michael Gorman

There are two components of our profession that constitute the sole basis for our standing as a profession. The first is our expertise in imparting literacy to new generations, something we share with the teaching profession. The other is specific to our profession: human intervention for the organization of information, commonly known as cataloging. The greater goals of these kinds of expertise are an educated citizenry, maintenance of the cultural record for future generations, and support of research and scholarship for the greater good of society. If we cease to practice either of these kinds of expertise, we will lose the right to call ourselves a profession.

*** Read on by following the link ...***

Wednesday, August 15, 2007

Overloaded

Perhaps of interest

RUMORS OF GLORY
Overloaded
The "information flood" has a longer history than we suppose.
By Alan Jacobs | posted 08/13/07

Books & Culture

James O'Donnell, in his thoughtful book Avatars of the Word (1998)—there's a supplementary website here—argues that until fairly recently the role of the university professor was to be "the supreme local authority" on his or her subject. But then university libraries started getting larger, and the most learned professors started writing more books, books with which those libraries could be stocked. (Whether the increasing numbers of books or the libraries' cravings for them came first, I cannot say.) In England especially the late 19th century saw the creation of vast reference projects—the two most famous of them being the Oxford English Dictionary and the Dictionary of National Biography—which would do much to make great erudition available throughout the English-speaking world.

Later came additional technologies that were especially important for newer academic libraries, many of which were in North America: microfilm and microfiche allowed such schools to provide their patrons with old newspapers, magazines, and books that they never could have found paper copies of. More recently came scanning and digitizing: the magnificent high-resolution images of old and rare books from Octavo, for instance; the vast accumulation of plain-text files of public-domain texts from Project Gutenberg; specialized collections of books in various formats, like the Christian Classics Ethereal Library; and so on. Add to this the increasingly wide online availability of the best scholarly articles—though at a price—from Project Muse and JSTOR, and it becomes increasingly difficult to think of the professor, supreme local authority though he or she may be, as the best possible authority.

In this context, O'Donnell thinks, who needs professors to provide information? Rather, "the real roles of the professor in an information-rich world will be … to advise, guide, and encourage students wading through the deep waters of the information flood." And O'Donnell is moved to think in this way largely because, as a scholar of the late classical world, he has worked primarily on thinkers whose problem was a scarcity of information, not an overabundance. Much of O'Donnell's work has been on Augustine, a man who, as O'Donnell's rival biographer Peter Brown has noted, was "steeped too long in too few books." And his first book was about Cassiodorus—you can get the full text of the book on O'Donnell's homepage—a man who devoted the latter part of his life to collecting what we would think of as a very small monastic library and teaching monks how to use it.

So for O'Donnell the contrast between the current scene and the world of his own scholarly expertise could scarcely be greater. But there are other scholars who remind us that we are not the first to wrestle with the problem of "information overload"—and there may be lessons for us in how some of our predecessors dealt with this issue. Perhaps the most interesting of those scholars is Ann Blair of Harvard, who is working on what promises to be a fascinating book about information overload in the 16th century—the early days of the printing press and the Protestant Reformation. (You can find a copy of a preliminary article on the subject here: scroll about halfway down or search for the name "Blair.")

Blair shows that many scholars, starting in the 16th century and continuing for another hundred and fifty years or so, were—how shall I put it?—freaked out by the sudden onslaught of texts. In the 17th century one French scholar cried out, "We have reason to fear that the multitude of books which grows every day in a prodigious fashion will make the following centuries fall into a state as barbarous as that of the centuries that followed the fall of the Roman Empire." That is, "unless we try to prevent this danger by separating those books which we must throw out or leave in oblivion from those which one should save and within the latter between what is useful and what is not."

Blair shows that such alarm was quite common, and indeed can be found in various periods going back to the high Middle Ages. For instance, in the 13th century we hear from Vincent of Beauvais: "Since the multitude of books, the shortness of time and the slipperiness of memory do not allow all things which are written to be equally retained in the mind, I decided to reduce in one volume in a compendium and in summary order some flowers selected according to my talents from all the authors I was able to read." A kind of Culture's Greatest Hits. But after the invention of the printing press books had multiplied so greatly that a project like Vincent's would have been self-evidently absurd.

So what did those poor deluged people do? Well, they adopted several strategies. First, they practiced various ways of marking important passages in books: with special symbols, with slips of paper, and so on. Then they came up with various ways of organizing books: there were now so many that figuring out how to arrange them became quite a puzzle, so the learned began debates on this subject that would culminate in the creation of the great Dewey Decimal Classification.

But there was also the most fundamental question of all: how to read. One of the most widely quoted sentences of Sir Francis Bacon—it comes from his essay "Of Studies"—concerns the reading of books: "Some books are to be tasted, others to be swallowed and some few to be chewed and digested; that is, some books are to be read only in parts; others to be read, but not curiously; and some few to be read wholly, and with diligence and attention." This is usually taken as a wise or sententious general comment about the worthiness of various texts, but Ann Blair shows that Bacon was making a very practical recommendation to people who were overwhelmed by the availability of books and couldn't imagine how they were going to read them all. Bacon tells such worried folks that they can't read them all, and so should develop strategies of discernment that enable them to make wise decisions about how to invest their time.

Blair also points out that certain enterprising scholars recognized that this information overload created a market for reference works and books that claimed to summarize important texts—We read the books so you don't have to!—or promised to teach techniques for the rapid assimilation of knowledge. But serious scholars like Meric Casaubon—son of the famous Isaac Casaubon—denounced the search for "a shorter way" to learning, insisting that "the best method to learning … is indefatigable (soe farr as the bodie will beare) industrie, and assiduitie, in reading good authors, such as have had the approbation of all learned ages." No shortcuts!

The correspondences to the present day are pretty obvious—Casaubon might be a professor today warning students against Wikipedia—but perhaps the application of lessons is not so easy. The scholars of the 16th and seventeenth centuries were resourceful indeed as they struggled to cope with the influx of books, pamphlets, broadsides, and so on; but their strategies are tailored to the particular technology of the book, and may not be directly applicable to our current problem of managing not just a torrent of books, but a far greater torrent of digital texts.

We are probably at the beginning of our attempts to manage our own information overload. For instance, though many people today—doctors and other scientists in particular—do much of their reading in PDF files, PDF readers that allow annotation are quite new and highly imperfect. And the habit of tagging digital data, while growing, is also still new. Tags are incredibly useful: services like the free del.icio.us, for instance, allow you to keep your browser bookmarks online, tag them, share them with other people, and see bookmarks that others have tagged similarly. Another web-based service, Stikkit—also free, at least at the moment—provides a free-form repository for almost any kind of data and encourages you to tag that data for retrieval later.

Those of us who use Macs—I'm sure there are Windows equivalents—now can choose from among several applications that collect data in a range of formats and feature tagging or something similar: Steven Johnson has written a vigorous commendation [restricted content] of DevonThink, a fine application, though I prefer the more streamlined Yojimbo. Frankly, I don't know what I would do without Yojimbo: I keep my whole brain in it, and relate everything to everything else with tags. And modern operating systems are increasingly building in tagging functions at the system-wide level.

Tagging is coming to replace the "folder" metaphor that has dominated personal computing for twenty years because, while a file can only be in one folder, the same file can be given as many tags as its owner wants, and so can be seen in relation to several different sets of other files. It's a great advance in organizing information. But it's just one step towards coping with a flow of data that can only get bigger. James O'Donnell is absolutely correct when he says that the role of the teacher will be, increasingly, to manage this flow and teach others to manage it—that is, to teach the kind of discernment that Bacon counsels, so that we and our students will better understand what is but to be tasted, and what by contrast should be read wholly, with diligence and attention. But surely this is not just a task for educators and their pupils. In Bacon's time, and in Meric Casaubon's, literacy was still rare, and those who had to cope with books were few. Now information overload is everyone's problem.

Alan Jacobs teaches English at Wheaton College in Illinois; his Bad to the Bone: an Exemplary History of Original Sin (HarperOne) will appear in Spring 2008. His Tumblelog is here.
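Jacobs's point about folders versus tags is easy to see in a few lines of code. In the minimal sketch below (the file names and tags are invented for illustration), each file can live under only one folder path, while the same file can carry any number of tags and therefore appear in several overlapping views, which is exactly the advance he describes:

from collections import defaultdict

# One folder per file: each file can appear in only one place.
folders = {
    "levy_interview.txt": "Articles/Technology",
    "ofcom_2007.pdf": "Reports/UK",
}

# Many tags per file: the same file shows up in several views at once.
tags = {
    "levy_interview.txt": {"ebooks", "journalism", "to-read"},
    "ofcom_2007.pdf": {"broadband", "statistics", "to-read"},
}

# Build the inverted index (tag -> files) that tagging services maintain for retrieval.
by_tag = defaultdict(set)
for filename, filetags in tags.items():
    for t in filetags:
        by_tag[t].add(filename)

print(by_tag["to-read"])  # both files, even though they sit in different folders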

Monday, August 13, 2007

SIS Librarian in City Paper

POSTED ON AUGUST 9, 2007:

City Wants Pedestrians to Turn in Dangerous Motorists

By Violet Law



Pedestrians, especially at the city's busiest intersections, can now report inconsiderate and dangerous drivers by calling 311.
Photo by Heather Mull
Native Manhattanite and San Francisco transplant Alina Del Pino considers herself a seasoned pedestrian, but she says the rude drivers on Bigelow Boulevard have turned her leisurely 10-minute commute into a daily crucible.

Many times Del Pino has found herself stuck between lanes of traffic, waiting for the rare law-abiding driver to stop at the two marked crossings on her way to her office at Pitt's School of Information Sciences, in Oakland.

"Those two crosswalks are the bane of my existence," says Del Pino. "I always have to play chicken with the cars. I've come close to being hit."

Now both Pittsburgh police and planners are urging more pedestrians, bicyclists and other road-users to call so that they can muster sufficient information to monitor the trouble spots, like the one that Del Pino has been contending with. The number to dial is 311, via Mayor Luke Ravenstahl's Response Line. (Cell-phone users should call 412-255-2621.)

While previously some people were irate enough to have complained, city officials say the calls they have received were still too sporadic for them to pin down the hot spots. In a new drive to make the city's roadways friendlier, even for those who aren't driving, city planners are trying to gather as much input as possible to draw up a plan that would improve pedestrian safety. To that end, they are holding a meeting for pedestrians on Aug. 15, at the Brashear Association on the South Side.

"We here in Pittsburgh are very vehicle-oriented. You really don't think about pedestrians a lot," says Amanda Panosky, a city planning department intern working on the pedestrian plan. "We want to change that mindset. We want to make sure the pedestrians are included."

Meanwhile, police brass says officers can be more effectively deployed to catch traffic-law violations if only they know where to go.

"If we get complaints from the citizens, then we can focus our enforcement efforts. We want to do what we can to redirect our efforts to make it safer," explains commander Scott Schubert, of the special deployment division, which includes 26 motorcycle officers in charge of traffic-law enforcement.

Officials say they prefer that 311 complainants be specific about the location and the time of the violations.

Del Pino vows she is going to do her share to make her complaints known.

"I think I will call every opportunity, every time a car doesn't stop for me. I want to claim my right -- the law says the pedestrian has the right of way," says Del Pino. "I just think that people need to be involved."

The pedestrian meeting is scheduled for 7 p.m. Wed., Aug. 15, at Brashear Association, 2005 Sarah St., South Side. For more information, call the city planning department at 412-255-2200.

URL for this story: http://www.pittsburghcitypaper.ws/gyrobase/Content?oid=oid%3A34162

Thursday, July 26, 2007

Is computer science education keeping up with pervasive computing?

Thanks to Prashant for this link: http://www.ibm.com/developerworks/podcast/dwi/cm-int030607.html. This podcast (or its transcript) discusses computer science education in a way that is relevant to IS and, to a lesser extent, telecom education.

A New Direction for University Presses

Here is the beginning of an article about university presses in today's Inside Higher Ed:

July 26

Ideas to Shake Up Publishing

With some regularity, reports or op-eds note the economic struggles of most university presses and the difficulties they face publishing monographs that are vital to individual scholars’ careers, but that typically aren’t read by that many people — and that libraries can’t afford to buy. Concerns about the relationship between university presses and tenure, for example, led the Modern Language Association to propose moving beyond the “fetishization” of the monograph.

Today, a new report called “University Publishing in a Digital Age” is being released by a group of experts on scholarly publishing — and they too are proposing radical changes in the way publishing works. The report — from Ithaka, a nonprofit group that promotes research and strategy for colleges to reflect changing technology — is based on a detailed study of university presses, which morphed into a larger examination of the relationship between presses, libraries and their universities.

The report and its authors are suggesting that university presses focus less on the book form and consider a major collaborative effort to assume many of the technological and marketing functions that most presses cannot afford, and that universities be more strategic about the relationship of presses to broader institutional goals.

Get the rest of it at http://insidehighered.com/news/2007/07/26/ithaka