Wikipedia offers offline access

BBC News reports that Wikipedia will sell nearly 2,000 of its articles on compact disc to give people without an internet connection access to highlights of the web resource. The articles selected for distribution were chosen by software that rated both their quality and their importance to the greater Wikipedia community. The featured articles aim to be well written, comprehensive, and free from errors and bias…

http://news.bbc.co.uk/2/hi/technology/6566749.stm

Fairfax Schools back down on NCLB defiance

The Washington Post reports that Fairfax County, Virginia, school officials have backed down from a vow to defy federal testing rules for students with limited English, saying they will administer reading tests to those students–even if the students stumble over items dealing with metaphors and other difficult material. The decision is a sharp turn for educators in Virginia’s largest school system, who had led others in opposition to the federal No Child Left Behind law…

http://www.washingtonpost.com/wp-dyn/content/article/2007/04/18/AR2007041802375.html

Most teens restrict access to online profiles

USA Today reports that, according to a new study by the Pew Internet & American Life Project, two-thirds of teens with profiles on blogs or social-networking sites have restricted access to their profiles in some way, such as requiring passwords or allowing only those on an "approved" list to view them. The research comes as concerns are being raised over online predators and other dangers associated with the use of such sites as Facebook and MySpace…

http://www.usatoday.com/tech/news/2007-04-18-teen-net-study_N.htm

MySpace to test news service

USA Today reports that in an effort to attract more advertisers, the popular social-networking site MySpace plans to test a news service that scours the web for news stories and lets users rate them. The service, called MySpace News, resembles a mix of Google News, which collects and arranges news stories, and Digg.com, which ranks news stories according to their popularity within the site…

http://www.usatoday.com/tech/webguide/internetlife/2007-04-19-myspace-news_N.htm

Net challenges lead to clean-slate approach

Government and university researchers have been exploring ways to redesign the internet from scratch. Here are some of the challenges that led researchers to start thinking of clean-slate approaches:

SECURITY

THE CHALLENGE: The internet was designed to be open and flexible, and all users were assumed to be trustworthy. Thus, the internet’s protocols weren’t designed to authenticate users and their data, allowing spammers and hackers to easily cover their tracks by attaching fake return addresses onto data packets.
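
For illustration, here is a minimal sketch in Python (using made-up documentation addresses) that packs a bare-bones IPv4 header by hand. The source-address field is just four bytes the sender writes in; nothing in the packet format verifies them, which is exactly the gap spoofers exploit.

```python
import socket
import struct

def build_ipv4_header(src: str, dst: str, payload_len: int) -> bytes:
    """Pack a minimal IPv4 header. The source field is whatever the
    sender writes into it; nothing in the format proves it is real."""
    version_ihl = (4 << 4) | 5            # IPv4, header length 5 * 32 bits
    total_length = 20 + payload_len
    return struct.pack(
        "!BBHHHBBH4s4s",
        version_ihl, 0, total_length,     # version/IHL, DSCP/ECN, total length
        0, 0,                             # identification, flags + fragment offset
        64, 17, 0,                        # TTL, protocol (UDP), checksum (0 for brevity)
        socket.inet_aton(src),            # source address: an unverified claim
        socket.inet_aton(dst),            # destination address: what routers act on
    )

# A "return address" of the sender's choosing; routers forward on the
# destination field and never check where the packet really came from.
header = build_ipv4_header("203.0.113.99", "198.51.100.7", payload_len=0)
print(header.hex())
```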

THE CURRENT FIX: Internet applications such as firewalls and spam filters attempt to control security threats. But because such techniques don’t penetrate deep into the network, bad data still get passed along, clogging systems and possibly fooling the filtering technology.

THE CLEAN-SLATE SOLUTION: The network would have to be redesigned to be skeptical of all users and data packets from the start. Data wouldn’t be passed along unless the packets were authenticated. Today’s faster computers should be able to handle the additional processing required within the network.
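
How per-packet authentication would actually work remains an open research question; the following is only a toy sketch of the "verify before forwarding" idea, using an HMAC tag and a hypothetical pre-shared key.

```python
import hashlib
import hmac

SECRET = b"key-established-out-of-band"   # hypothetical shared key

def seal(packet: bytes) -> bytes:
    """Append an authentication tag so the network can verify the sender."""
    return packet + hmac.new(SECRET, packet, hashlib.sha256).digest()

def verify_then_forward(sealed: bytes) -> bytes:
    """Skeptical by default: drop the packet unless the tag checks out."""
    packet, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(SECRET, packet, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("unauthenticated packet dropped")
    return packet

print(verify_then_forward(seal(b"hello")))   # authentic: passed along
# Flipping one byte of the payload or tag would raise instead of forwarding.
```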

MOBILITY

THE CHALLENGE: When the internet was designed, computers rarely moved, so numeric internet addresses were assigned to devices based on their location. A laptop, by contrast, is constantly on the move.

THE CURRENT FIX: A laptop changes its address and reconnects as it moves from one wireless access point to another, disrupting data flow. Another workaround is to have all traffic channel back to the first access point as a laptop moves to a second or a third location, but delays could result from the extra distance.

THE CLEAN-SLATE SOLUTION: The address system would have to be restructured so that addresses are based more on the device and less on the location. This way, a laptop could retain its address as it hops through multiple hot spots.
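
One way researchers describe this is an identifier/locator split: the device keeps a permanent name, and a mapping tracks where it currently is. A toy sketch, with invented names and structure:

```python
# Identifier/locator split: the device keeps a permanent ID, and only the
# mapping to its current point of attachment changes as it roams.
locator_map: dict[str, str] = {}             # device ID -> current access point

def attach(device_id: str, access_point: str) -> None:
    """Record the device's new location when it reaches another hot spot."""
    locator_map[device_id] = access_point

def route(device_id: str) -> str:
    """Senders address the stable ID; the network resolves the location."""
    return locator_map[device_id]

attach("laptop-42", "cafe-ap")
attach("laptop-42", "library-ap")            # same ID, new locator
print(route("laptop-42"))                    # library-ap
```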

UBIQUITY

THE CHALLENGE: The internet was designed when there were relatively few computers connecting to it. The proliferation of personal computers and mobile devices led to a scarcity in the initial address system. There will be even more demand for addresses as refrigerators, air conditioners, and other devices come with internet capability, as will stand-alone sensors for measuring everything from the temperature to the availability of parking spaces.

THE CURRENT FIX: Engineers expanded the address pool with a system called IPv6, but nearly a decade after most of the groundwork was completed, the vast majority of software and hardware still use the older, more crowded IPv4 technology. Even if more migrate to IPv6, processing the addresses for all the sensors could prove taxing.
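
The arithmetic behind the crowding is straightforward: IPv4’s 32-bit addresses allow about 4.3 billion endpoints, while IPv6’s 128-bit addresses allow about 3.4 × 10^38. A quick check in Python:

```python
ipv4_addresses = 2 ** 32      # 32-bit addresses: about 4.3 billion
ipv6_addresses = 2 ** 128     # 128-bit addresses: about 3.4e38
print(f"IPv4: {ipv4_addresses:,}")
print(f"IPv6: {ipv6_addresses:.1e}")
```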

THE CLEAN-SLATE SOLUTION: Researchers are questioning whether all devices truly need addresses. Perhaps sensors in a school or home could talk to one another locally and relay the most important data through a gateway bearing an address. This way, the internet’s traffic cops, known as routers, wouldn’t have to keep track of every single sensor, improving efficiency.
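
A rough sketch of that gateway idea, with invented names: sensors report locally, and only the gateway carries a routable address.

```python
class Gateway:
    """Sensors report locally; only the gateway holds a routable address,
    so the internet's routers never track individual sensors."""

    def __init__(self, public_address: str):
        self.public_address = public_address     # the one routed address
        self.readings: dict[str, float] = {}     # local sensor ID -> last value

    def report(self, sensor_id: str, value: float) -> None:
        """Local-only hop: no global address needed for the sensor."""
        self.readings[sensor_id] = value

    def publish(self) -> dict:
        """Relay the aggregate through the single addressed point."""
        return {"from": self.public_address, "data": dict(self.readings)}

gw = Gateway("198.51.100.1")                     # hypothetical address
gw.report("classroom-temp", 21.5)
gw.report("parking-spot-7", 0.0)                 # 0.0 free, 1.0 occupied (invented)
print(gw.publish())
```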

New web design must tackle many interests

The university researchers who began building the internet some four decades ago never imagined the power their creation would have today. They toiled away quietly in their labs, and few outside cared.

That won’t be the case with a next-generation internet envisioned as an ultimate replacement for the current one.

Commercial and policy interests will likely play a bigger role this time, as researchers explore “clean slate” designs that scrap the internet’s underlying architecture to better address security, mobility, and other emerging needs.

Will the greater attention on these efforts ultimately be their undoing?

“The success of the internet can be largely credited to the fact that it began in a backwater,” said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. “It had the amazing advantage of not having to turn a profit. It didn’t need a business model.”

The bulk of the work is still being done in ivory towers, with grants from leading high-tech companies and government agencies.

Stanford University, for instance, has partnered with Cisco Systems Inc., Japan’s NTT DoCoMo Inc., Germany’s Deutsche Telekom AG, and other companies, though for now they are limited to advisory and sponsorship roles.

Bruce Davie, a Cisco fellow, said industry can take advantage of academia’s long-term vision, while giving feedback on what areas of research might actually be useful.

But commercial considerations are clearly in the minds of researchers.

Carnegie Mellon professor Hui Zhang said some of the work there surrounds building incentives for network operators to update systems and pass along data efficiently. Researchers are realizing they can’t simply rely on network operators’ altruism, a tenet in the original design.

Participants in a new network also could include law-enforcement officials, who are already demanding that internet service providers retrofit the existing network to ease wiretapping of internet-based phone calls. Governments around the world, including the United States, also could seek ways to block porn and politically sensitive web sites—and better identify those who distribute the forbidden.

“The more mature these ideas become, the closer they get to reality, I’m sure many stakeholders like that will come to the table,” said Larry Peterson, a Princeton professor who heads a planning group for an experimental network called GENI.

Building surveillance capabilities from the start could certainly cut costs, said Les Szwajkowski, a former FBI official who had worked on applying wiretap laws to new technologies. But he said engineers and other Americans shouldn’t worry.

“In theory this would be an excellent idea, but I think there are political issues to overcome,” Szwajkowski said. “There would be a reluctance to say you have an investigative agency at the table involved in a deep reworking of the internet.”

He said many in law enforcement share his view that any involvement should be limited to advising engineers rather than meddling in the details.

Guru Parulkar, incoming executive director of Stanford’s initiative following a tenure as the NSF’s clean-slate manager, said researchers recognize they must build any system with “the right balance of privacy and accountability,” leaving it flexible enough to adapt to wherever policy makers decide to draw the line.

Difficulties abound in move to new internet

Transitioning to a next-generation internet could be akin to changing the engines on a moving airplane.

Routers and other networking devices likely will need replacing; personal computers could be in store for software upgrades. Headaches could arise, given that it won’t be possible simply to shut down the entire network for maintenance, with companies, schools, and individuals depending on it every day.

And just think of the costs–potentially billions of dollars.

Advocates of a clean-slate internet–a restructuring of the underlying architecture to better handle security, mobility, and other emerging needs–agree that any transition will be difficult.

Consider that the groundwork for the IPv6 system for expanding the pool of internet addresses was largely completed nearly a decade ago, yet the vast majority of software and hardware today still use the older, more crowded IPv4 technology. The clean-slate initiatives are far more ambitious than that.

But researchers aren’t deterred.

“The premise of the clean-slate design is, let’s start by saying, ‘How should it be done?’ independent of ‘Can we retrofit it?’” said Andrea Goldsmith, an electrical engineering professor at Stanford. “Once we know what the right thing to do is, then we can say, ‘Is there an evolutional path?’”

One transition scenario is to run a parallel network for applications that truly need the improved functions. Schools, businesses, and individuals would migrate to the new system over time, the way some are now abandoning the traditional telephone system for internet-based phones, even as the two networks run side by side.

“There’s no such thing as a flag day,” said Larry Peterson, chairman of computer science at Princeton. “What happens is that certain services start to take off and attract users, and industry players start to take notice and adapt.”

That’s not unlike the approach NASA has in mind for extending the internet into outer space. NASA has started to deploy the Interplanetary Internet so its spacecraft would have a common way of communicating with one another and with mission control.

But because of issues unique to outer space–such as a planet temporarily blocking a spacecraft signal, or the 15 to 45 minutes it takes a message to reach Mars and back–NASA can’t simply slap on the communications protocols designed for the earthbound internet. So project researchers have come up with an alternative communications protocol for space, and the two networks hook up through a gateway.
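
NASA’s actual space protocols aren’t spelled out here, but the store-and-forward principle behind delay-tolerant networking can be sketched simply: each node takes custody of a message and holds it until a contact window opens, rather than assuming an end-to-end path exists. The node names below are invented.

```python
from collections import deque

class DTNNode:
    """Store and forward: hold messages until a link to the next hop exists,
    instead of assuming the end-to-end path is up (in space, it often isn't)."""

    def __init__(self, name: str):
        self.name = name
        self.stored: deque = deque()     # messages awaiting a contact window

    def receive(self, bundle: str) -> None:
        self.stored.append(bundle)       # take custody even with no route yet

    def contact(self, neighbor: "DTNNode") -> None:
        """A contact window opened (the planet is out of the way): drain."""
        while self.stored:
            neighbor.receive(self.stored.popleft())

earth, orbiter, rover = DTNNode("earth"), DTNNode("orbiter"), DTNNode("rover")
earth.receive("command: take photo")     # queued while Mars is out of view
earth.contact(orbiter)                   # minutes later, a window opens
orbiter.contact(rover)
print(list(rover.stored))                # ['command: take photo']
```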

To reduce costs, schools and businesses might buy networking devices that work with both networks–and they’d do so only when they would have upgraded their systems anyhow.

Some believe the current internet will never go away, and the fruits of the research could go into improving–rather than scrapping–the existing architecture.

“You can’t overhaul an international network very easily and expect everyone to jump on it,” said Leonard Kleinrock, a UCLA professor who was one of the driving forces in creating the original internet. “The legacy systems are there. You’re not going to get away from it.”

Researchers explore new internet design

Although it has already taken nearly four decades to get this far in building the internet, some university researchers–with the federal government’s blessing–want to scrap all that and start over.

The idea might seem unthinkable, even absurd, but many believe a “clean slate” approach is the only way to truly address security, mobility, and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

The internet “works well in many situations but was designed for completely different assumptions,” said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. “It’s sort of a miracle that it continues to work well today.”

With networks no longer constrained by slow connections, slow processors, and high storage costs, researchers say the time has come to rethink the internet’s underlying architecture–a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Even Vinton Cerf, one of the internet’s founding fathers and co-developer of the key communications techniques, said the exercise was “generally healthy” because the current technology “does not satisfy all needs.”

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There’s no evidence they are meddling yet, but once any research looks promising, “a number of people [will] want to be in the drawing room,” said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. “They’ll be wearing coats and ties and spilling out of the venue.”

The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon, and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, also have been exploring the concept.

The European Union also has backed research on such initiatives, through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.

A new network could run parallel with the current internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren’t expected to bear fruit for another 10 or 15 years–assuming Congress comes through with funding.

Guru Parulkar, who will become executive director of Stanford’s initiative after heading NSF’s clean-slate programs, estimated that GENI alone could cost $350 million, while government, university, and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.

And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn’t necessarily mesh with the realities and needs of the commercial internet.

“The network is now mission-critical for too many people, when in the [early days] it was just experimental,” Zittrain said.

The internet’s early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible–qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the internet doesn’t have built-in mechanisms for knowing with certainty who sent what.

The network’s designers also assumed that computers are in fixed locations and always connected. That’s no longer the case with the proliferation of laptops, personal digital assistants, and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.

Engineers tacked on improvements to support mobility and improved security, but researchers say all that adds complexity, reduces performance, and–in the case of security–amounts at most to bandages in a high-stakes game of cat and mouse.

Workarounds for mobile devices “can work quite well if a small fraction of the traffic is of that type,” but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford’s clean-slate program.

The internet will continue to face new challenges as applications require guaranteed transmissions–not the “best effort” approach that works better for eMail and other tasks with less time sensitivity.

Think of a doctor using teleconferencing to perform surgery remotely, or a customer of an internet-based phone service needing to make an emergency call. In such cases, even small delays in relaying data can be deadly.
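
The article doesn’t specify a mechanism, but one classic way to provide guarantees, in the spirit of reservation protocols such as RSVP, is admission control: guaranteed flows book capacity up front, and the network refuses flows it cannot carry. A toy model:

```python
class Link:
    """Toy admission control: guaranteed flows book capacity in advance;
    best-effort traffic shares whatever is left over."""

    def __init__(self, capacity_mbps: float):
        self.capacity = capacity_mbps
        self.reserved = 0.0

    def reserve(self, mbps: float) -> bool:
        """Admit a guaranteed flow only if capacity remains; refusing up
        front is what lets admitted flows keep their guarantee later."""
        if self.reserved + mbps > self.capacity:
            return False
        self.reserved += mbps
        return True

    def best_effort_share(self) -> float:
        return self.capacity - self.reserved

link = Link(capacity_mbps=100.0)
print(link.reserve(10.0))         # True: the surgery stream is admitted
print(link.reserve(95.0))         # False: would oversubscribe the link
print(link.best_effort_share())   # 90.0 Mbps left for less urgent traffic
```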

And one day, sensors of all sorts will likely be internet-capable.

Rather than create workarounds each time, clean-slate researchers want to redesign the system to easily accommodate any future technologies, said Larry Peterson, chairman of computer science at Princeton and head of the planning group for the NSF’s GENI.

Even if the original designers had the benefit of hindsight, they might not have been able to incorporate these features from the start. Computers, for instance, were much slower then, possibly too weak for the computations needed for robust authentication.

“We made decisions based on a very different technical landscape,” said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.

“Now, we have the ability to do all sorts of things at very high speeds,” he said. “Why don’t we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?”

Of course, a key question is how to make any transition–and researchers are largely punting for now.

“Let’s try to define where we think we should end up, what we think the internet should look like in 15 years’ time, and only then would we decide the path,” McKeown said. “We acknowledge it’s going to be really hard–but I think it will be a mistake to be deterred by that.”

Kleinrock, the internet pioneer at UCLA, questioned the need for a transition at all, but he said such efforts are useful for their out-of-the-box thinking.

“A thing called GENI will almost surely not become the internet, but pieces of it might fold into the internet as it advances,” he said.

Think evolution, not revolution.

Princeton already runs a smaller experimental network called PlanetLab, while Carnegie Mellon has a clean-slate project called 100 x 100.

These days, Carnegie Mellon professor Hui Zhang said he no longer feels like “the outcast of the community” as a champion of clean-slate designs.

Construction on GENI could start by 2010 and take about five years to complete. Once operational, it should have a decade-long lifespan.

FIND, meanwhile, funded about two dozen projects last year and is evaluating a second round of grants for research that could ultimately be tested on GENI.

These go beyond projects like Internet2 and National LambdaRail, both of which focus on next-generation needs for speed.

Any redesign may incorporate mechanisms, known as virtualization, for multiple networks to operate over the same pipes, making further transitions much easier. Also possible are new structures for data packets and a replacement of Cerf’s TCP/IP communications protocols.
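
As a rough illustration of the virtualization idea, the sketch below tags traffic with a slice ID so two networks, an old one and an experimental one, share the same physical pipe while staying isolated. All names are invented.

```python
# One physical pipe, many virtual networks: frames carry a slice ID, and
# each network sees only its own traffic.
wire: list[tuple[int, bytes]] = []

def send_on_slice(slice_id: int, payload: bytes) -> None:
    wire.append((slice_id, payload))         # shared link, tagged frames

def receive_for_slice(slice_id: int) -> list[bytes]:
    """Isolation: a slice never sees another slice's frames."""
    return [p for sid, p in wire if sid == slice_id]

send_on_slice(1, b"legacy IPv4 traffic")
send_on_slice(2, b"experimental clean-slate protocol")
print(receive_for_slice(2))                  # only the new network's traffic
```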

“Almost every assumption going into the current design of the internet is open to reconsideration and challenge,” said Parulkar, the NSF official heading to Stanford. “Researchers may come up with wild ideas and very innovative ideas that may not have a lot to do with the current internet.”

Links:

Stanford program
http://cleanslate.stanford.edu

Carnegie Mellon program
http://100x100network.org

Rutgers program
http://orbit-lab.org

National Science Foundation’s GENI
http://geni.net

Campus massacre: Turning to technology

Almost immediately after the deadly shootings at Virginia Tech on April 16, students created an “I’m OK” page on Facebook to let one another and their loved ones know that they had survived. Other students posted photos and cell phone videos on their own sites, or shared them just hours after the shootings with news organizations.

Thanks to the portability and speed of today’s technology, the students’ shots are likely to become some of the “defining images” of the tragedy, says Amanda Lenhart, a senior researcher at the Pew Internet & American Life Project, which monitors high-tech culture.

And nowhere, she says, has the impact of the internet been seen more than on social networking sites, most often frequented by young people.

“What better place to mourn someone than a place that they themselves build to express who they are, and a place where the deceased and his or her friends may have spent a great deal of time interacting?” Lenhart asks.

Since April 16, there has been a nonstop flood of postings on the popular Facebook student site, on MySpace and LiveJournal, and on personal blogs–expressing everything from grief to anger to confusion.

Jesse Connolly, a 21-year-old from Lynn, Mass., made a posting April 17 on the MySpace page of Ross Alameddine, one of the Virginia Tech students who was killed. The pair worked together last summer at an electronics store in their home state.

“If only you were here to read this Ross … You’d know what an imaginative, intelligent, compassionate, and most of all hysterically funny human being you were, and how appreciative I am to have spent last summer working with such a great kid,” Connolly wrote. “My every thought is with you and your family.”

Even before names of the victims were officially released, a few students created Facebook memorial pages for some of the dead–though others worried that it was too soon, since family and friends were still being notified.

There are myriad other ways the internet continues to shape the grieving process.

In addition to using the university’s web site to communicate with the world, Virginia Tech officials planned to set up a site where families of the victims could post photos.

TechSideline.com, a site for Virginia Tech sports fans, also quickly morphed into a meeting place where students, family, and friends could communicate–especially when phones were jammed.

And as a show of support, many students, including scores from other colleges, replaced their Facebook profile photos with a Virginia Tech logo shrouded in a black ribbon.

Patti Jacobs, a junior at Canisius College in Buffalo, N.Y., was among them. Saddened by the shootings, she went searching for memorial pages on Facebook April 17.

Jacobs was alarmed when she also came across several pages that included hateful, sometimes racist remarks toward shooter Cho Seung-Hui, other Asians, and his family.

“This is not about just one guy and his problems,” Jacobs wrote. “Yes–he alone is accountable for all the damage and pain caused yesterday–but the reason for this was not his race, his child-rearing by his family, or his girlfriend breaking up with him. …

“How much of our society is accountable as well?”

Some of the hateful postings were removed, likely after other Facebook users flagged them–a process of communal self-editing used on many sites.

Those kinds of entries are a product of the open nature of the internet, where rumors and inaccuracies also can linger.

Such was the case for 23-year-old Wayne Chiang, who was mistaken by some as the shooter–partly because his Facebook profile includes references to graduating from Virginia Tech and several photos of him with his gun collection.

At first, Chiang says he “played along with it” on his personal web page, partly to see how much money he could make, since payment from the ads he places on his site is based on the number of hits the site gets. (He claims he’s going to donate the proceeds to a fund for the shooting victims at his alma mater.)

Chiang decided to post the truth after he received death threats. But many of those who thought he was the shooter had the same question: Why did the killings happen?

“I always knew the internet was very powerful, just not to this extent,” Chiang, who lives in suburban Washington, D.C., said in a telephone interview.

“People just want to blame it on somebody in order to understand the situation. It’s completely understandable.”

Despite technology’s darker side, Lenhart at Pew says the help the internet provides during tragedies like these is undeniable.

“No longer do you need to drive to a headstone in a cemetery or a roadside flower-strewn cross, or fly across the country to a funeral,” she says, “but you can log on and express yourself, and interact with others who are feeling the same thing.”

Contacted through his MySpace page, Connolly, the 21-year-old in Massachusetts, agreed with that sentiment.

“Reading everyone’s thoughts and communicating with friends makes that lonely, empty feeling inside a little bit easier to deal with,” he says, “knowing you’re not alone.”

Links:

Virginia Tech web page: “Tragedy at Virginia Tech”
http://www.vt.edu/tragedy

Pew Internet & American Life Project
http://www.pewinternet.org

American Psychological Association (APA) Help Center
http://www.apahelpcenter.org
