

{bio,medical} informatics


Friday, January 19, 2001


MSNBC: Genome hunters, Compaq to hook up
"Genome hunters at the U.S. Department of Energy and Celera Corp. announced a marriage of computer technology and biology on Friday, saying they will mine the human gene map for information they hope will transform medicine."

"“We in the nuclear weapons industry thought for years that nothing could be more complex than nuclear physics,” Sandia’s Robinson said. He said it was clear that modeling the 250,000 interacting proteins in the human body is far more complex.

Sandia and Celera will write new computer software and the algorithms, the mathematical formulas, that are used to analyze the dense maze of genetic code."
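Some back-of-the-envelope math (mine, not Sandia's or Celera's) shows why 250,000 interacting proteins dwarfs even nuclear physics codes: restricting attention to pairwise interactions alone yields tens of billions of combinations, before any higher-order interactions or dynamics enter the picture.

# a rough sketch of the scale, not actual Sandia/Celera code
n_proteins = 250_000

# possible pairwise interactions: n choose 2
pairwise = n_proteins * (n_proteins - 1) // 2
print(f"{pairwise:,} possible pairwise interactions")  # 31,249,875,000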
The Washington Post: Celera and National Labs to Collaborate
"Sandia's scientists will gain experience applying their expertise in computer science, mathematics and statistics to problems in biology, which promise to be among the most daunting big science undertakings of the new century. "This is the most challenging and the most exciting problem that I've ever been exposed to," said Bill Camp, director of strategic computing at Sandia.

Celera and Sandia will continue to own any patented or copyrighted work they bring to the collaboration, and they will jointly own any newly developed software. The new software may find its way into Celera's commercial products and might also be made available for use by federal laboratories or federally funded scientists in academic labs.

The computer programs written by Sandia and Celera will serve, for Compaq, as models of the kinds of problems supercomputers will need to tackle in the next few years. Out of the effort, Compaq expects to produce an ultra-fast computer that would eventually be sold commercially. Because biological problems closely resemble some other types of computing problems, including those confronted by cryptologists trying to intercept and read enemy communications, the new machine might be of interest to military planners and spy agencies."

Computerworld: Compaq, DOE, biotech firm to build $150 million Linux supercomputer
"However, the system will be a long time in coming. Bill Blake, vice president of high-performance technical computing at Compaq, said the three partners hope to have a prototype machine ready by 2004. That system is expected to deliver performance ranging between 100 and 150 trillion floating point operations per second, Blake said.

The prototype supercomputer will likely use 10,000 to 20,000 of Compaq's Alpha processors and is being budgeted at $150 million in current costs, according to Blake. He added that the first system could eventually lead to the development of a so-called "petacruncher" -- a machine capable of 1,000 teraflops -- by the end of the decade."
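For a sense of scale (my arithmetic, not Compaq's): spreading the quoted 100 to 150 teraflops across 10,000 to 20,000 Alpha processors implies roughly 5 to 15 gigaflops per chip, and the "petacruncher" would need to be another seven to ten times faster than the prototype.

# sanity-checking the quoted figures
flops_low, flops_high = 100e12, 150e12  # 100-150 teraflops
cpus_low, cpus_high = 10_000, 20_000    # Alpha processor count

per_cpu_low = flops_low / cpus_high / 1e9   # ~5 gigaflops per processor
per_cpu_high = flops_high / cpus_low / 1e9  # ~15 gigaflops per processor
print(f"per-processor: {per_cpu_low:.0f}-{per_cpu_high:.0f} gigaflops")

petaflop = 1000e12  # a "petacruncher" is 1,000 teraflops
print(f"petacruncher speedup over prototype: {petaflop / flops_high:.1f}-{petaflop / flops_low:.1f}x")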

redux [12.18.00]
GenomeWeb: IBM Seeks 'Major Market Presence' with NuTec Supercomputer
"The 7.5-teraflop computing cluster that IBM is building for Atlanta-based NuTec Sciences will give IBM a “distinct advantage” in the genomics marketplace, an IBM spokesperson said Monday"

"Some market watchers speculated that IBM, whose computers are historically among the more expensive, might have cut NuTec a deal in order to secure a stronger foothold in the genomics sector.

IBM software for web application serving, information portals, and data integration will also be included in the system. NuTec Sciences will use the system to manage, mine and integrate genetic data from a wide variety of sources, and share this information via the Internet with the global life sciences community."

"NuTec plans to run several massively parallel applications on the cluster. Morrissey said that a combinatorics algorithm that NuTec is developing in collaboration with the NIH to analyze disease-causing gene combinations is particularly compute-intensive. This algorithm is running as a test set on the company’s IBM computer in Houston, but Morrissey said they’re awaiting delivery of the supercomputer before it can be scaled up to optimal efficiency."
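The combinatorial explosion behind that claim is easy to see (an illustration of the general problem, not NuTec's actual algorithm): exhaustively scoring every k-gene combination out of the roughly 30,000 genes in the human genome gets out of hand almost immediately.

from math import comb

n_genes = 30_000  # rough human gene-count estimate

# how many k-gene combinations would an exhaustive search have to score?
for k in (2, 3, 4):
    print(f"{k}-gene combinations: {comb(n_genes, k):,}")

# 2 genes gives ~450 million combinations; 3 genes, ~4.5 trillion;
# 4 genes, ~3.4e16. hence the need for a massively parallel cluster.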


[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed, and their solutions promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.

BIOINFORMATICS IN THE 21st CENTURY

[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb /

bioinformatics.org / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag!

[ et cetera ]


This site designed by Eric C. Snowdeal III.
© 2000-2005