Tuesday, October 30, 2007

Google news and more

There have been a couple of items related to Google lately that caught my attention (I have posted before on some of Google's apparent strategic intentions; search my blog). I have come to be skeptical of their "Don't be evil" motto. Hence, I take the liberty of copying my post from that blog over here.

The first thing is this item, which purports to reveal Google's "wireless plans". According to the Wall Street Journal (2007-10-30):

Within the next two weeks, Google is expected to announce advanced software and services that would allow handset makers to bring Google-powered phones to market by the middle of next year, people familiar with the situation say. In recent months Google has approached several U.S. and foreign handset manufacturers about the idea of building phones tailored to Google software, with Taiwan's HTC Corp. and South Korea's LG Electronics Inc. mentioned in the industry as potential contenders. Google is also seeking partnerships with wireless operators. In the U.S., it has the most traction with Deutsche Telekom AG's T-Mobile USA, while in Europe it is pursuing relationships with France Télécom's Orange SA and Hutchison Whampoa Ltd.'s 3 U.K., people familiar with the matter say. A Google spokeswoman declined to comment.

The Google-powered phones are expected to wrap together several Google applications -- among them, its search engine, Google Maps, YouTube and Gmail email -- that have already made their way onto some mobile devices. The most radical element of the plan, though, is Google's push to make the phones' software "open" right down to the operating system, the layer that controls applications and interacts with the hardware. That means independent software developers would get access to the tools they need to build additional phone features.

Developers could, for instance, more easily create services that take advantage of users' Global Positioning System location, contact lists and Web-browsing habits. They also would be able to interact with Google Maps and other Google applications. The idea is that a range of new social networking, mapping and other services would emerge, just as they have on the open, mostly unfettered Web. Google, meanwhile, could gather user data to show targeted ads to cellphone users.


Google helped push through controversial rules for a coming spectrum auction at the Federal Communications Commission that would result in a new cellular network open to all devices and software applications, even those not favored by an operator. Google has said it will probably bid for the frequencies.

For now, the company knows it has no choice but to work with operators to make its open platform successful. D.P. Venkatesh, CEO of mPortal Inc., which makes software for wireless operators, puts it this way: "There are a few things carriers control that will always keep them in charge at the end of the day."

But broader (and deeper) thoughts come from Robert X. Cringely and Nicholas Carr. In his recent article, Cringely writes:

Here is what's significant about Google putting code into MySQL: they haven't done it before. Google has been a MySQL user from almost the very beginning, customizing the database in myriad ways to support Google's widely dispersed architecture with hundreds of thousands of servers. Google has felt no need previously to contribute code to MySQL. So what changed? While Google has long been able to mess with the MySQL code in ITS machines, it hasn't been able to mess with the code in YOUR machine and now it wants to do exactly that. The reason it will take so long to roll out MySQL 6.1 is that Google will only deliver its MySQL extensions for Linux, leaving MySQL AB the job of porting that code to the 15 other operating systems they support. That's what will take until early 2009.

Then what? I think the best clue comes from the agreement Google recently signed with IBM to co-promote cloud computing in universities.

Cloud computing is, of course, the ability to spread an application across one or many networked CPUs. You can think of it as renting computer power or having the ability to infinitely scale a local application without buying new hardware. Cloud computing can be anything from putting your entire business on other people's computers to running a huge Photoshop job from the lobby computer at Embassy Suites before jumping on the shuttle bus to Disney World with your kids. For all its promise, though, cloud computing has been pretty much a commercial failure so far.


But Google wants us to embrace not just cloud computing but Google's version of cloud computing, the hooks for which will be in every modern operating system by mid-2009, spread not by Google but by a trusted open source vendor, MySQL AB.

Mid-2009 will also see the culmination of Google's huge server build-out. The company is building data centers large and small around the world and populating them with what will ultimately be millions of generic servers. THAT's when things will get really interesting. Imagine a much more user-friendly version of Amazon's EC2 and S3 services, only spread across 10 or more times as many machines. And as with all its services, Google will offer free versions at the bottom for consumers and paid, but still cost-effective versions nearer the top for businesses and education.

If you are completely OK with this and are looking forward to this environment -- and there is much to look forward to -- you might also read and consider Nicholas Carr's allegory, written in the spirit of Halloween, which is difficult to summarize here.

Monday, October 29, 2007

Will the Internet transform storytelling?

Here is an article on MSNBC that looks at interactive storytelling and how the Internet may influence it.

Blogged with Flock

Sunday, October 28, 2007

Minding the Store

Howard Gardner, the prolific Harvard cognitive scientist, has two new books out that deserve a reading by SIS faculty and students. His Five Minds for the Future (Boston: Harvard Business School Press, 2006) addresses the new ways we need to prepare for the future in education, business, and the professions. He identifies and discourses on five cognitive abilities that we need to develop and nurture – the disciplined mind, the synthesizing mind, the creating mind, the respectful mind, and the ethical mind. His comments on the disciplined mind provide a window into his thinking as represented in the book. He argues that it takes a decade to master a discipline, and that education, seen as a lifelong activity, is key to enabling such mastery. Gardner lays out the steps for achieving the disciplined mind, arguing that “an individual is disciplined to the extent that she has acquired the habits that allow her to make steady and essentially unending progress in the mastery of a skill, craft, or body of knowledge” (p. 40). There is much in this volume that provides food for thought about how and what we teach in a school like ours. For example, Gardner at one point notes that the “ability to knit together information from disparate sources into a coherent whole is vital today” (p. 82), a statement nearly all of us will acknowledge without trouble, but one that we then counter with our own resistance to cooperating across disciplinary, professional, and political boundaries within our own school.

His other new book is the edited collection Responsibility at Work: How Leading Professionals Act (or Don’t Act) Responsibly (San Francisco: John Wiley and Sons, Jossey-Bass, 2007). Another volume in the GoodWork© research project, it gathers essayists who explore issues of ethics, responsibility, and accountability in a number of professions (such as journalism, law, medicine, and education). Needless to say, it has a lot to offer us, both about how we teach ethical and related issues and about how we ourselves act in our own school.

Friday, October 26, 2007

The Science Education Myth

This article in BusinessWeek reports on a new study that runs counter to conventional wisdom (and, for that matter, the National Academies of Science). I think it provides some grist for the mill, both within and outside of SIS. A teaser from the article:

Yet a new report by the Urban Institute, a nonpartisan think tank, tells a different story. The report disproves many confident pronouncements about the alleged weaknesses and failures of the U.S. education system. This data will certainly be examined by both sides in the debate over highly skilled workers and immigration (BusinessWeek.com, 10/10/07). The argument by Microsoft (MSFT), Google (GOOG), Intel (INTC), and others is that there are not enough tech workers in the U.S.

The authors of the report, the Urban Institute's Hal Salzman and Georgetown University professor Lindsay Lowell, show that math, science, and reading test scores at the primary and secondary level have increased over the past two decades, and U.S. students are now close to the top of international rankings. Perhaps just as surprising, the report finds that our education system actually produces more science and engineering graduates than the market demands.

Thursday, October 25, 2007

GAO on Internet Infrastructure

This GAO Report might be of broader interest within the school. The "Highlights" section reads as follows:

Why GAO Did This Study

Since the early 1990s, growth in the use of the Internet has revolutionized the way that our nation communicates and conducts business. While the Internet originated as a U.S. government-sponsored research project, the vast majority of its infrastructure is currently owned and operated by the private sector. Federal policy recognizes the need to prepare for debilitating Internet disruptions and tasks the Department of Homeland Security (DHS) with developing an integrated public/private plan for Internet recovery.

GAO was asked to summarize its report on plans for recovering the Internet in case of a major disruption (GAO-06-672) and to provide an update on DHS’s efforts to implement that report’s recommendations. The report (1) identifies examples of major disruptions to the Internet, (2) identifies the primary laws and regulations governing recovery of the Internet in the event of a major disruption, (3) evaluates DHS plans for facilitating recovery from Internet disruptions, and (4) assesses challenges to such efforts.

What GAO Recommends

In its report, GAO made recommendations to DHS to strengthen its ability to help recover from Internet disruptions. In written comments, DHS agreed with these recommendations.

Tuesday, October 23, 2007

Vaidhyanathan and his book in progress on Google

Siva Vaidhyanathan is working on a book called "The Googlization of Everything". The book project has a blog, where he invites people to comment on various postings related to Google. One of the topical areas in the blog is "Is Google a Library?" You can find his blog here.

Here is what he says in one of his first postings about what he is doing:

"As you can tell from the title of this blog, the book will be about Google and all the ways that Google is shaking up the world. Google is a transformative and revolutionary company. I hesitate to use terms like that. We live in an era of hyperbole. So I try my best to discount claims of historical transformation or communicative revolutions.

But in the case of Google, I am confident it is both.

Now, I am approaching this book as both a fan and a critic. I am in awe of all that Google has done and all it hopes to do. I am also wary of its ambition and power.

As I use this site to compose the manuscript (an archaic word that I love too much to discard) for the book The Googlization of Everything, I hope to do so with your help.

This is the latest in a series of “open book” experiments hosted and guided by The Institute for the Future of the Book. The Institute has been supportive of my work for years – long before I became affiliated with it as a fellow and certainly long before we thought up this project together. As with the other projects by Ken Wark and Mitch Stephens, this one will depend on reader criticism and feedback to work right. So this is an appeal for help. If you know something about Google, hip me to it. If you have an observation about how it works or how it affects our lives, write to me about it."

Monday, October 22, 2007

interesting NYTimes article on Open Content Alliance

October 22, 2007: Libraries Shun Deals to Place Books on Web


Several major research libraries have rebuffed offers from Google and Microsoft to scan their books into computer databases, saying they are put off by restrictions these companies want to place on the new digital collections.

The research libraries, including a large consortium in the Boston area, are instead signing on with the Open Content Alliance, a nonprofit effort aimed at making their materials broadly available.

Libraries that agree to work with Google must agree to a set of terms, which include making the material unavailable to other commercial search services. Microsoft places a similar restriction on the books it converts to electronic form. The Open Content Alliance, by contrast, is making the material available to any search service.

Google pays to scan the books and does not directly profit from the resulting Web pages, although the books make its search engine more useful and more valuable. The libraries can have their books scanned again by another company or organization for dissemination more broadly.

It costs the Open Content Alliance as much as $30 to scan each book, a cost shared by the group’s members and benefactors, so there are obvious financial benefits to libraries of Google’s wide-ranging offer, started in 2004.

Many prominent libraries have accepted Google’s offer — including the New York Public Library and libraries at the University of Michigan, Harvard, Stanford and Oxford. Google expects to scan 15 million books from those collections over the next decade.

But the resistance from some libraries, like the Boston Public Library and the Smithsonian Institution, suggests that many in the academic and nonprofit world are intent on pursuing a vision of the Web as a global repository of knowledge that is free of business interests or restrictions.

Even though Google’s program could make millions of books available to hundreds of millions of Internet users for the first time, some libraries and researchers worry that if any one company comes to dominate the digital conversion of these works, it could exploit that dominance for commercial gain.

“There are two opposed pathways being mapped out,” said Paul Duguid, an adjunct professor at the School of Information at the University of California, Berkeley. “One is shaped by commercial concerns, the other by a commitment to openness, and which one will win is not clear.”

Last month, the Boston Library Consortium, a group of 19 research and academic libraries in New England that includes the University of Connecticut and the University of Massachusetts, said it would work with the Open Content Alliance to begin digitizing the books among the libraries’ 34 million volumes whose copyright had expired.

“We understand the commercial value of what Google is doing, but we want to be able to distribute materials in a way where everyone benefits from it,” said Bernard A. Margolis, president of the Boston Public Library, which has in its collection roughly 3,700 volumes from the personal library of John Adams.

Mr. Margolis said his library had spoken with both Google and Microsoft, and had not shut the door entirely on the idea of working with them. And several libraries are working with both Google and the Open Content Alliance.

Adam Smith, project management director of Google Book Search, noted that the company’s deals with libraries were not exclusive. “We’re excited that the O.C.A. has signed more libraries, and we hope they sign many more,” Mr. Smith said.

“The powerful motivation is that we’re bringing more offline information online,” he said. “As a commercial company, we have the resources to do this, and we’re doing it in a way that benefits users, publishers, authors and libraries. And it benefits us because we provide an improved user experience, which then means users will come back to Google.”

The Library of Congress has a pilot program with Google to digitize some books. But in January, it announced a project with a more inclusive approach. With $2 million from the Alfred P. Sloan Foundation, the library’s first mass digitization effort will make 136,000 books accessible to any search engine through the Open Content Alliance. The library declined to comment on its future digitization plans.

The Open Content Alliance is the brainchild of Brewster Kahle, the founder and director of the Internet Archive, which was created in 1996 with the aim of preserving copies of Web sites and other material. The group includes more than 80 libraries and research institutions, including the Smithsonian Institution.

Although Google is making public-domain books readily available to individuals who wish to download them, Mr. Kahle and others worry about the possible implications of having one company store and distribute so much public-domain content.

“Scanning the great libraries is a wonderful idea, but if only one corporation controls access to this digital collection, we’ll have handed too much control to a private entity,” Mr. Kahle said.

The Open Content Alliance, he said, “is fundamentally different, coming from a community project to build joint collections that can be used by everyone in different ways.”

Mr. Kahle’s group focuses on out-of-copyright books, mostly those published in 1922 or earlier. Google scans copyrighted works as well, but it does not allow users to read the full text of those books online, and it allows publishers to opt out of the program.

Microsoft joined the Open Content Alliance at its start in 2005, as did Yahoo, which also has a book search project. Google also spoke with Mr. Kahle about joining the group, but they did not reach an agreement.

A year after joining, Microsoft added a restriction that prohibits a book it has digitized from being included in commercial search engines other than Microsoft’s. “Unlike Google, there are no restrictions on the distribution of these copies for academic purposes across institutions,” said Jay Girotto, group program manager for Live Book Search from Microsoft. Institutions working with Microsoft, he said, include the University of California and the New York Public Library.

Some in the research field view the issue as a matter of principle.

Doron Weber, a program director at the Sloan Foundation, which has made several grants to libraries for digital conversion of books, said that several institutions approached by Google have spoken to his organization about their reservations. “Many are hedging their bets,” he said, “taking Google money for now while realizing this is, at best, a short-term bridge to a truly open universal library of the future.”

The University of Michigan, a Google partner since 2004, does not seem to share this view. “We have not felt particularly restricted by our agreement with Google,” said Jack Bernard, a lawyer at the university.

The University of California, which started scanning books with the Open Content Alliance, Microsoft and Yahoo in 2005, has added Google. Robin Chandler, director of data acquisitions at the University of California’s digital library project, said working with everyone helps increase the volume of the scanning.

Some have found Google to be inflexible in its terms. Tom Garnett, director of the Biodiversity Heritage Library, a group of 10 prominent natural history and botanical libraries that have agreed to digitize their collections, said he had had discussions with various people at both Google and Microsoft.

“Google had a very restrictive agreement, and in all our discussions they were unwilling to yield,” he said. Among the terms was a requirement that libraries put their own technology in place to block commercial search services other than Google, he said.

Libraries that sign with the Open Content Alliance are obligated to pay the cost of scanning the books. Several have received grants from organizations like the Sloan Foundation.

The Boston Library Consortium’s project is self-funded, with $845,000 for the next two years. The consortium pays 10 cents a page to the Internet Archive, which has installed 10 scanners at the Boston Public Library. Other members include the Massachusetts Institute of Technology and Brown University.

The scans are stored at the Internet Archive in San Francisco and are available through its Web site. Search companies including Google are free to point users to the material.

On Wednesday the Internet Archive announced, together with the Boston Public Library and the library of the Marine Biological Laboratory and Woods Hole Oceanographic Institution, that it would start scanning out-of-print but in-copyright works to be distributed through a digital interlibrary loan system.

Copyright 2007 The New York Times Company

Friday, October 19, 2007

The Brain is NOT your Friend

I found these two articles (article 1, article 2) intriguing. I don't know whether their claims are true or can be backed up scientifically, but they made for interesting reading.

Blogged with Flock

Wednesday, October 17, 2007

What does it take to hire and retain technical talent?

Continuing with my recent thread on IT employment, you might find this article from Network World interesting. Quoting the article:

IT managers say now more than ever they struggle to find qualified IT employees and to keep key talent on staff, while tasked with delivering ever more services to the business.

The concerns certainly are not new, but they are growing in importance, according to the Society for Information Management (SIM). This week the organization revealed that "Attracting, developing and retaining IT professionals" topped its annual list of IT management concerns, ahead of financial woes related to reducing costs and technology-related worries such as security and privacy. SIM reports that IT managers are experiencing staff turnover rates of less than 10% this year, which is far lower than years past, and that attracting talent worries most because of a dwindling resource pool.

"The challenges with attracting talent stems from a pipeline of qualified candidates that isn't as large as the demand companies are seeing for new people, which is a good thing for our discipline," says Jerry Luftman, SIM's vice president of academic affairs. "It's good news that the amount of resources being demanded from IT are on the rise, but it's happening at the same time a large number of workers are leaving the workforce."

As a result, IT managers say they wrestle often with the challenge of hiring people that fit well into their organization in the hopes they won't lose the talent quickly.

Tuesday, October 16, 2007

U.S. Tech Workers Fight Back

You might find this story interesting in the context of technology employment in the US. Not surprisingly, there are diverse interests at play in Congressional policy discussions around immigration reform for tech workers.

Monday, October 15, 2007

2008 IT Outlook

While crystal-ball gazing is always a hazardous business, we ignore prognostications at our own peril. Thus, you might find this item interesting. Unfortunately, it is not terribly encouraging (though not discouraging either), just as enrollments in the IS programs are beginning to recover.

Thursday, October 11, 2007

Gartner's top 10 strategic technologies for 2008

You might find this article interesting. While I believe it is important not to focus overly on year-to-year trends, there may be some deeper kernels here worth responding to in our teaching and research.

A Primer on the Structured Web

Please see this article.

Blogged with Flock

Tuesday, October 09, 2007

'Burgh goes virtual on Google

'Burgh goes virtual on Google
3D cityscapes to appear on Google Maps' Street View
Tuesday, October 09, 2007
By Elwin Green, Pittsburgh Post-Gazette

Smile, Pittsburgh -- you're on Google-vision.

Starting today, Pittsburgh joins the list of 14 cities that Internet users can tour online by using Google Maps' Street View. The service, which launched in May, offers three-dimensional, photographic views of cityscapes that a user can navigate street by street to locate everything from apartments to gas stations to theaters.

With Street View, dragging and dropping the icon of a human figure onto a highlighted street will pull up a photo of the street, placing the user in a virtual world in which it is possible to move forward, backward, to the side -- even to rotate 360 degrees.

See http://www.post-gazette.com/pg/07282/823985-96.stm for full story.

UC Berkeley Classes on YouTube

After iTunes and iPod, I guess this was the next step for UC Berkeley. Here are six videos on search engines by UC Berkeley on YouTube.

Blogged with Flock

Monday, October 08, 2007

Encore Generation

At a couple of recent meetings I found myself referring to a recent book on the encore generation: healthy baby boomers retiring and then returning to the workforce, looking for new ways to contribute to society. The book in question is Marc Freedman, Encore: Finding Work That Matters in the Second Half of Life (New York: Public Affairs, 2007). Freedman describes how many baby boomers face 30 years of retirement, and how many, dissatisfied with their former careers, now see a way to do new and more positive things, especially as predictions of labor shortages of 9 million workers in 2010 and 18.1 million in 2020 loom ahead of us. Since many of these baby boomers are fairly sophisticated in their knowledge and use of information technology, they may be a new market of new and different students for a school like ours. In my opinion, this book is worth a read and worth some thinking about new programs, especially for our underutilized CAS degree.

Bill Gates on health care (WSJ)

I thought that this opinion piece in the Wall Street Journal on October 5 would be of interest to the SIS community.  Here is the article:

We live in an era that has seen our knowledge of medical science and treatment expand at a speed that is without precedent in human history. Today we can cure illnesses that used to be untreatable and prevent diseases that once seemed inevitable. We expect to live longer and remain active and productive as we get older. Ongoing progress in genetics and our understanding of the human genome puts us on the cusp of even more dramatic advances in the years ahead.

But for all the progress we've made, our system for delivering medical care is clearly in crisis. According to a groundbreaking 1999 report on health-care quality published by the Institute of Medicine (the medical arm of the National Academy of Sciences), as many as 98,000 Americans die every year as a result of preventable medical errors. That number makes the health-care system itself the fifth-leading cause of death in this country.

Beyond the high cost in human life, we pay a steep financial price for the inability of our health-care system to deliver consistent, high-quality care. Study after study has documented the billions of dollars spent each year on redundant tests, and the prolonged illnesses and avoidable injuries that result from medical errors. The impact ripples through our society, limiting our ability to provide health care to everyone who needs it and threatening the competitiveness of U.S. businesses, which now spend an average of $8,000 annually on health care for employees.

At the heart of the problem is the fragmented nature of the way health information is created and collected. Few industries are as information-dependent and data-rich as health care. Every visit to a doctor, every test, measurement, and procedure generates more information. But every clinic, hospital department, and doctor's office has its own systems for storing it. Today, most of those systems don't talk to each other.

Isolated, disconnected systems make it impossible for your doctor to assemble a complete picture of your health and make fully informed treatment decisions. It also means that the mountain of potentially lifesaving medical information that our health-care system generates is significantly underutilized. Because providers and researchers can't share information easily, our ability to ensure that care is based on the best available scientific knowledge is sharply limited.

There is widespread awareness that we need to address the information problem. In 2001, the Institute of Medicine issued a follow-up report on health-care quality that urged swifter adoption of information technology and greater reliance on evidence-based medicine. In his 2006 State of the Union address, President Bush called on the medical system to "make wider use of electronic records and other health information technology."

But increased digitization of health-care information alone will not solve the problems we face. Already, nearly all procedures, test results and prescriptions are recorded in digital form -- that's how health-care providers transmit information to health insurers so they can be paid for their work. But patients never see this data, and doctors are unable to share it. Instead, individuals do their best to piece together the information that they think their caregivers might need about their medical history, the medications they take and the tests they've undergone.

What we need is to place people at the very center of the health-care system and put them in control of all of their health information. Developing the solutions to help make this possible is an important priority for Microsoft. We envision a comprehensive, Internet-based system that enables health-care providers to automatically deliver personal health data to each patient in a form they can understand and use. We also believe that people should have control over who they share this information with. This will help ensure that their privacy is protected and their care providers have everything they need to make fully-informed diagnoses and treatment decisions.

I believe that an Internet-based health-care network like this will have a dramatic impact. It will undoubtedly improve the quality of medical care and lower costs by encouraging the use of evidence-based medicine, reducing medical errors and eliminating redundant medical tests. But it will also pave the way toward a more important transformation.

Today, our health-care system encourages medical professionals to focus on treating conditions after they occur -- on curing illness and managing disease. By giving us comprehensive access to our personal medical information, digital technology can make us all agents for change, capable of pushing for the one thing that we all really care about: a medical system that focuses on our lifelong health and prioritizes prevention as much as it does treatment. Putting people at the center of health care means we will have the information we need to make intelligent choices that will allow us to lead healthy lives -- and to search out providers who offer care that does as much to help us stay well as it does to help us get better.

The technology exists today to make this system a reality. For the last 30 years, computers and software have helped industry after industry eliminate errors and inefficiencies and achieve new levels of productivity and success. Many of the same concepts and approaches that have transformed the world of business -- the digitization of information, the creation of systems and processes that streamline and automate the flow of data, the widespread adoption of tools that enable individuals to access information and take action -- can be adapted to the particular requirements of health care.

No one company can -- or should -- hope to provide the single solution to make all of this possible. That's why Microsoft is working with a wide range of software and hardware companies, as well as with physicians, hospitals, government organizations, patient advocacy groups and consumers to ensure that, together, we can address critical issues like privacy, security and integration with existing applications.

Technology is not a cure-all for the issues that plague the health-care system. But it can be a powerful catalyst for change, here in the U.S. and in countries around the globe where access to medical professionals is limited and where better availability of health-care information could help improve the lives of millions of people.

This discussion of the Wikipedia phenomenon was insightful, IMHO: "Wikipedia's Awkward Adolescence" (from CIO magazine): http://www.cio.com/article/141650

Friday, October 05, 2007

Steven Levy & Technology

There is an interesting interview with Newsweek’s technology writer, Steven Levy, on Ubiquity, Volume 8, Issue 39 (October 2, 2007 – October 8, 2007). You can find the entire interview at http://www.acm.org/ubiquity/interviews/v8i39_levy.html.

The abstract reads: Steven Levy, the chief technology writer for Newsweek magazine, has written a number of best-selling books, the latest of which is The Perfect Thing: How the iPod Shuffles Culture, Commerce, and Coolness. We asked acclaimed software developer Marney Morris to comment on this interview, and she responded: "I've been reading Steven Levy's thoughts on technology for over 20 years, and he is still as fresh and insightful as he was back then -- in the beginning days of the personal computer. Steven weaves the implications of technology change into social meaning with wit and intelligence. He was the best, brightest and funniest guy in the tech arena before he moved to Newsweek in '95, and he still is. He is a great guy and a great journalist. I'm honored that I got to say so."

Here is a sample:

What about the future of the book, and printing, and reading?

I think about that a lot. I think that, as wonderful as the form of the book is, it's ridiculous to think that we're not going to come up with some electronic device that is able to replicate 99% of the good stuff about a physical book along with all the extra virtues you could have, like electronic storage. It's going to happen. I don't know whether it's going to happen in five years or ten years or thirty years. But it's got to happen, and you're going to have something which is flexible and as readable -- and pleasurable -- as a physical book. And they'll have all the stuff that comes with being digital -- searchability, connectivity, you name it. And that's going to be a huge change, and eventually it will change the way writers work and what they write, just as the printed book made the novel possible. So I think that, in the short term, yeah, you find me still writing books and hoping that people will still buy the books. And I think they will buy books, if not necessarily mine. But in the long term all publishing has got to be electronic, and I think that's going to change a lot of things. Some of those changes will be things that we'll miss, but I think that if you take a broad view of history, everything will work out just fine. You know, it's sad in a sense that we don't have the oral tradition, that we don't sit and tell long stories over bonfires. We've moved on to something else. Maybe it's time to return.

Tuesday, October 02, 2007

A review of "Surface"

This article takes a look at Microsoft Surface. I am not sure we can call it a new paradigm, but it does take some new steps in human-computer interaction.


Monday, October 01, 2007

Flickr and permissions

Raymond Yee, in his talk last week, made reference to a case in which a photo on Yahoo's Flickr service was used in an advertisement without the permission of the person depicted. Here is an article that discusses this case.