
{bio,medical} informatics


Sunday, September 23, 2001


find related articles. powered by google. Wired News The Little Screensaver That Could

"IBM is spending $100 million building the world's fastest supercomputer to do cutting-edge medical research, but a distributed computing effort running on ordinary PCs may have beaten Big Blue to the punch.

"IBM's proposed Blue Gene, a massively parallel supercomputer, aims to help diagnose and treat disease by simulating the ultra-complex process of protein folding.

"But Folding@Home , a modest distributed computing project run by Dr. Vijay Pande and a group of graduate students at Stanford University, has already managed to simulate how proteins self-assemble, something that computers, until now, have not been able to do."

redux [07.13.01]
find related articles. powered by google. Wired Magazine Gene Machine

"Ambuj Goyal, IBM Research's general manager for software, solutions, and strategy, was more ambitious than that. Why not build a machine to model molecular dynamics using general-purpose chips rather than specialized ones? That way you'd produce a prototype for a whole new family of supercomputers. Not only would it be great technology development, it would be great marketing, too. Whereas the Department of Energy has the greatest interest in top-end supercomputing - with its need to understand how nuclear weapons work - focusing on the life sciences rather than the death sciences could make supercomputing more widely appealing. What's more, a biology program would be a way of telling one of the newest markets for big iron - the post-genome biotech world - that IBM took its interests seriously. "We believe that the life sciences are going to be a rapidly growing area," says Blue Gene project manager Bill Pulleyblank, "a huge growth area for IBM."

find related articles. powered by google. Scientific American The Do-It-Yourself Supercomputer

"Our solution was to construct a computing cluster using obsolete PCs that ORNL would have otherwise discarded. Dubbed the Stone SouperComputer because it was built essentially at no cost, our cluster of PCs was powerful enough to produce ecoregion maps of unprecedented detail. Other research groups have devised even more capable clusters that rival the performance of the world's best supercomputers at a mere fraction of their cost. This advantageous price-to-performance ratio has already attracted the attention of some corporations, which plan to use the clusters for such complex tasks as deciphering the human genome. In fact, the cluster concept promises to revolutionize the computing field by offering tremendous processing power to any research group, school or business that wants it."

"Above all, the Beowulf concept is an empowering force. It wrests high-level computing away from the privileged few and makes low-cost parallel-processing systems available to those with modest resources. Research groups, high schools, colleges or small businesses can build or buy their own Beowulf clusters, realizing the promise of a supercomputer in every basement. Should you decide to join the parallel-processing proletariat, please contact us through our Web site (http://extremelinux.esd.ornl.gov/) and tell us about your Beowulf-building experiences."

redux [08.29.01]
find related articles. powered by google. Nature: Science Update Parasite corrals computer power

"According to The Hitchhiker's Guide to the Galaxy, hyper-intelligent pan-dimensional beings (disguised as mice) are using us to compute The Ultimate Question of Life, The Universe, And Everything. Now earthling scientists have roped unsuspecting web servers into a similar - albeit slightly less ambitious - exercise in parasitic computing.

Using the Internet itself as a computer, Jay Brockman and colleagues at the University of Notre Dame, Indiana, have solved a mathematical problem with the unwitting assistance of machines in North America, Europe and Asia."
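The published scheme reportedly rides on the TCP checksum: candidate answers to an NP-complete problem are encoded into packets so that only a correct candidate produces a valid checksum, and a remote server that replies has therefore performed the check for free. The arithmetic being exploited is the standard Internet checksum (RFC 1071), a one's-complement sum sketched below in Python; the packet construction in the actual experiment is considerably more involved.

    def internet_checksum(data: bytes) -> int:
        # sum the data as 16-bit words with end-around carry, then complement
        if len(data) % 2:
            data += b"\x00"                           # pad odd-length input
        total = 0
        for i in range(0, len(data), 2):
            total += (data[i] << 8) | data[i + 1]
            total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
        return ~total & 0xFFFF

    # a receiver accepts a segment only if checksumming it with the checksum
    # field included yields 0 - the property the parasitic scheme exploits
    msg = b"\x12\x34\x56\x78"
    cksum = internet_checksum(msg)
    assert internet_checksum(msg + cksum.to_bytes(2, "big")) == 0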

find related articles. powered by google. EyeForPharma Novartis evaluates Entropia's distributed computing technology for accelerating drug discovery

""The vast quantities of data involved in the genomic era of drug discovery are quickly outpacing advances in computing technology," said Robert North, Entropia CEO. "Distributed computing allows companies to cost-effectively access the massive computing power they'll need by using their existing PC networks. It's quite exciting that companies like Novartis are deploying our platform to demonstrate the potential of distributed computing as a valuable tool in drug discovery efforts.""

redux [07.22.00]
find related articles. powered by google. The Standard Distributed Computing Goes Commercial

"The distributed-computing model could be one of those rare cases where capitalism and pure scientific research mesh. Not every lab can afford to pay $200,000 for an eight-processor Origin 2000 SGI supercomputer, much less $1 million for a 40-processor machine, says David Fenstermacher, director of scientific computing for the medical school at the University of North Carolina at Chapel Hill. (Fenstermacher is also acting director of the campus' Center for Bioinformatics and a United Devices adviser.) And even the most powerful supercomputers need time to process data.

A project that would take several months on a supercomputer - creating a 3D model of a protein from its linear sequence - could be accomplished in much less time using thousands of distributed computers"

redux [10.09.00]
find related articles. powered by google. ACM CrossRoads The SETI@Home Problem

"The SETI@Home problem can be thought of as a special case of the distributed computation verification problem: "given a large amount of computation divided among many computers, how can malicious participating computers be prevented from doing damage?" This is not a new problem. Distributed computation is a venerable research topic, and the idea of "selling spare CPU cycles" has been a science fiction fixture for years."

"The Internet makes it possible for computation to be distributed to many more machines. However, distributing computing around the internet requires developers to consider the possibility of malicious clients."

"The general study of secure multiparty computation has produced much interesting work over the last two decades. Less well studied, unfortunately, are the tools and techniques required to move the theoretical results to the real world. The old dream of massively distributed computations is finally coming true, and yet our tools for building and analysing real systems still seem primitive. The challenge of the next few years will be to bridge this gap."

redux [04.05.00]
find related articles. powered by google. egroups : Decentralization Description

""*Is decentralization ever a good idea? If so, when? Is there non-anecdotal evidence on costs and benefits?

*What protocol issues are there? Can we begin assembling a good protocol for decentralized messaging? To what degree do the protocols for Freenet, Gnutella or WorldOS meet the need? Do we need an application protocol or something lower level? Can HTTP do the job? Can we implement peer routing as an add-on to existing protocols? Is there a call to develop an IETF working group?

* Given that authoring and versioning are critical but hard in a decentralized environment, how can we approach the job? Is it possible to integrate WebDAV with peer networking?

* What are the business issues? Who are the players? Who else stands to win or lose, and why?

At present many people and groups are working on the issues in isolation, some for competitive reasons and some for lack of an alternative. My belief is that a communal approach will be more productive."
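One of the protocol questions above - can HTTP do the job? - is at least easy to prototype: make each peer a small HTTP server that accepts a message and relays it to its neighbours with a decrementing hop limit. In this Python sketch the ports, neighbour list and X-TTL header are all made up, and real systems like Gnutella or Freenet add duplicate suppression, peer discovery and much more on top.

    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    NEIGHBOURS = ["http://localhost:8002", "http://localhost:8003"]  # hypothetical peers

    class PeerHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers["Content-Length"]))
            ttl = int(self.headers.get("X-TTL", "0"))
            self.send_response(200)
            self.end_headers()
            if ttl > 0:  # relay with a decremented hop limit
                for peer in NEIGHBOURS:
                    req = urllib.request.Request(peer, data=body,
                                                 headers={"X-TTL": str(ttl - 1)})
                    try:
                        urllib.request.urlopen(req, timeout=1)
                    except OSError:
                        pass  # unreachable peers are simply skipped

    if __name__ == "__main__":
        HTTPServer(("localhost", 8001), PeerHandler).serve_forever()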



[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.

BIOINFORMATICS IN THE 21st CENTURY

[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb /

bioinformatics.org / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag!

[ et cetera ]

valid xhtml 1.0?

This site designed by
Eric C. Snowdeal III.
© 2000-2005