Stanford Report, September 20, 2000

Stanford researchers win National Science Foundation grants for information technology research

BY DAWN LEVY

Making the World Wide Web more useful and ubiquitous. Figuring out the functions of the genes deciphered by the Human Genome Project. Visualizing weather, human thinking and other complex processes as they happen in time and three-dimensional space. These are some of the heady challenges to be tackled by five Stanford projects that were awarded more than $5 million in National Science Foundation (NSF) grants on Sept. 13. The projects were selected from more than 1,400 proposals to receive the first grants in a new NSF program to spur fundamental research and innovative applications of information technology.

Jim Plummer, dean of the School of Engineering, said that "information technology is the basis of many new initiatives in the School of Engineering. These grants are tremendously helpful in strengthening our existing programs and helping us build these new initiatives."

President Bill Clinton said the "initiative will help strengthen America's leadership in a sector that has accounted for one-third of U.S. economic growth in recent years. High technology is generating jobs that pay 85 percent more than the average private-sector wage. I am pleased that the National Science Foundation is expanding its investment in long-term information technology research."

Information Technology Research (ITR) awards went to 62 large projects whose funding will average $1 million per year for three to five years, and 148 smaller projects funded up to $500,000 for as long as three years. Projects range from creating software, scalable information infrastructures and human-computer interfaces to managing information and studying social and economic implications of information technology. (For a complete list of ITR awards, see www.itr.nsf.gov/.)

"These projects represent major innovations in information technology, rather than routine applications of existing technology," said NSF director Rita Colwell. "Our strategy to support long-term, high-risk research responds to a challenge from the President's Information Technology Advisory Committee, which called for increased federal investment to maintain the U.S. lead in this important sector of the global economy."

Stanford Professor Hector Garcia-Molina, Assistant Professor Christopher Manning, Professor Jeffrey Ullman and Associate Professor Jennifer Widom won a $2.2 million grant to transform today's World Wide Web into a Global InfoBase (GIB). The GIB will be a ubiquitous information resource that is easy to use, current and comprehensive. The researchers will attempt to integrate existing technologies into a "universal" information model and query language. They hope to personalize information management to make data relevant and timely for users. And they expect to create sophisticated tools to analyze semantics and algorithms to "mine" data for patterns that reveal new knowledge.

"The Web has created a resource comprising much of the world's knowledge," the researchers wrote in their proposal. "Yet today our ability to use the Web as an information resource is in a primitive state. The GIB project is developing technology that will allow society far more effective and efficient use of the dramatically growing amount of information available online."

NSF awarded $1,003,417 to Associate Professor Monica Lam and Assistant Professor Dawson Engler to create static and dynamic tools for software design. They are developing a new methodology whereby the programmer, compiler and runtime system all cooperate to maintain the integrity of a software program. The proposed system allows programmers to capture application-level semantics and invariants of interest at a high level of abstraction. Whereas compiler writers have built specialized tools to detect common programming errors, this system will allow programmers themselves to formulate the correctness property or safety criterion they wish to check in their programs. It places the full power of sophisticated static and dynamic analyses in programmers' hands, allowing them to analyze and manipulate the program with ease.

"Success of this research will have a significant impact on improving software reliability," Lam says.

Awarded $1 million were Professors Kincho Law and James Leckie in Civil and Environmental Engineering, Gio Wiederhold in Computer Science and Barton Thompson in the Law School. Their project will develop a distributed information management framework called REGNET, a formal but practical information-technology infrastructure for making governmental regulations publicly and beneficially available online. The pilot application focuses on regulations related to hazardous waste management. Federal and state environmental protection agencies, as well as local governments, impose strict regulations on the treatment and disposal of chemical wastes, and locating and using that regulatory information can be a daunting task. REGNET includes repositories for regulatory information and tools to locate, merge, compare and analyze the information. Five phases are planned: textual storage; semi-structured, indexed storage; means to resolve semantic ambiguities; cross-referencing appropriate for automated access from relevant legal and related documents; and online compliance checking of governmental regulations.

"We hope this collaborative, interdisciplinary effort will lead to a better understanding of legal and social issues related to information technology," Law says.

Stanford's Assistant Professor of Computer Science Daphne Koller and Assistant Professor of Medicine Peter Small in the Division of Infectious Diseases and Geographic Medicine are teaming up with Professor Nir Friedman of Hebrew University in Jerusalem to develop innovative technology for analyzing biological data. Awarded $494,034, the researchers aim to help scientists analyze complex structured databases and find interesting and useful patterns in them. In recent years, new technologies and data-gathering projects have been producing biological data at rates that exceed the analysis capacities of traditional research methodologies. The data include information that may reveal the functions of specific genes in an organism's genome, and population data that may improve treatment of diseases such as tuberculosis. The project will develop languages for statistical modeling of biological processes, techniques for learning the models from data and algorithms for reasoning with the resulting models.

"These techniques will allow us to extract the most significant statistical patterns from the data, thereby providing a deeper scientific understanding of critical biological phenomena," Koller says.

Professor Lambertus Hesselink in Electrical Engineering, Aeronautics and Astronautics, and Applied Physics was awarded $489,998. His project will enable new ways of studying complicated physical phenomena, such as electromagnetic fields, the behavior of fluids and weather systems, quantum mechanics and biological processes like the workings of the brain, by using computers to determine how these systems evolve in space and time. Hesselink will develop topological methods based on mathematical analysis of global data. Results will be visualized using 3-D and time-dependent graphics techniques developed specifically for this purpose.

"Visualization and analysis of large multi-dimensional vector and tensor data sets is a difficult task as traditional 2-D and 3-D display methods are only suitable for very small data sets," Hesselink says. Topological information provides several orders of data compression as well as a simplified global topological skeleton that can be used for analysis and visualization of complicated data. "Using these skeletons as a basis, we have further developed novel methods to quantitatively compare data sets and detect similarities and differences between them, which is very difficult to do with previous methods," Hesselink says.

SR