
archives

{bio,medical} informatics

Wednesday, October 23, 2002


News.Com: Stanford gives distributed computing an A

"Scientists at Stanford University have demonstrated tangible proof that scientific experiments can be conducted using thousands of low-end PCs wrangled together into loosely linked networks.

A group of chemists, including Stanford assistant professor Vijay Pande, said they successfully predicted the folding rate of a protein using calculations worked out on a so-called distributed computing network. Their research, conducted last year, was published this week in the science journal Nature."

redux [08.30.02]
Genomeweb: San Diego Supercomputer Center Using Entropia Grid to Build Protein-Structure Databases

"A research team at the San Diego Supercomputer Center is using a grid-based computer system from Entropia to build a set of protein structure databases."

"With the help of 250 desktop computers with processing power ranging from 180 MHz to 2.2 GHz, the platform has so far completed calculations on almost 1,000 proteins, said Elbert."

""What they've done in principle they could have done on one of their supercomputers, but those machines are heavily used for other projects," he said. "This is a way of expanding capacity. And it's a whole lot cheaper.""

redux [11.28.01]
News.Com: IBM computers picked for cancer research

"IBM will supply the University of Pennsylvania and four hospitals with computers that will link into a computing "grid" to check for breast cancer, the company will announce Wednesday.

The grid will be used to detect breast cancer in patients, store mammograms in digital form and identify populations that are particularly susceptible, the company said in a statement. The system can be used, for example, to compare a new mammogram to a previous year's image to detect changes.

IBM, along with rivals such as Sun Microsystems and Compaq Computer, has been backing grid computing, which joins computers and storage systems into a large pool of computing power."
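
The year-over-year comparison described above is, at its core, an image-differencing task that parallelizes easily across a grid. A toy sketch in pure Python (the threshold, data, and function name are invented for illustration; a real system would compare registered, full-resolution scans):

```python
def changed_regions(prev, curr, threshold=30):
    """Flag pixel coordinates whose intensity changed by more than
    `threshold` between two same-sized grayscale scans. A toy
    stand-in for the image comparison the grid would perform."""
    flagged = []
    for y, (row_a, row_b) in enumerate(zip(prev, curr)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                flagged.append((x, y))
    return flagged

# Two tiny "scans"; one pixel brightened sharply between years.
prev = [[10, 10], [10, 10]]
curr = [[12, 80], [10, 9]]
print(changed_regions(prev, curr))  # [(1, 0)]
```

Because each image pair is independent of every other pair, work like this can be farmed out one patient at a time to whatever grid nodes are idle.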

redux [11.21.01]
Scientific Computing World: Scientific sharing across computer networks in USA

"The US National Science Foundation has announced a $12 million programme - called the NSF Middleware Initiative (NMI) - to develop middleware: software that allows scientists to share applications, scientific instruments and data, and collaborate with their colleagues across high-performance networks.

The effort will build on the success of the Globus project in developing middleware tools for grid computing, and will integrate Globus and other emerging middleware components into a well-tested, comprehensive, commercial-quality, middleware distribution package that runs on multiple platforms. These middleware distributions will be disseminated to research labs and universities worldwide."

redux [11.12.01]
ZDNet News: New boost for open-source supercomputing

"Platform Computing, a company that tries to harness the collective computing power on computer networks, has signed a deal to commercialize an open-source supercomputing project.

Platform is working with the Globus Project to commercialize the Globus Toolkit for governing the use of computers and storage systems joined into a large computing "grid," Platform said Wednesday."

"Grid computing, though, often uses higher-powered computers than mere desktop PCs, and has attracted the interest of IBM, which thinks corporate customers as well as academics will use grid methods. IBM is working with Globus to boost this expansion.

Grid computing has long held potential for some types of computing tasks--typically those that don't require much communication between one computing task and another. For this reason, grids don't replace single mammoth supercomputers such as those from Cray. However, grid computing is popular among pharmaceutical companies and others."
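
The "loosely coupled" character described above is exactly why grids suit these workloads: work units need no communication with one another, so idle machines can simply pull tasks from a queue until it empties. A minimal single-machine sketch of that master/worker pattern (threads stand in for grid nodes; the task itself is invented):

```python
import queue
import threading

def worker(tasks, results):
    """Pull independent work units until the queue is empty.
    On a real grid, each worker would be a separate machine."""
    while True:
        try:
            n = tasks.get_nowait()
        except queue.Empty:
            return
        # Stand-in for a compute-heavy, communication-free task.
        results.append((n, sum(i * i for i in range(n)) % 1000))

tasks = queue.Queue()
for n in range(20):
    tasks.put(n)

results = []
threads = [threading.Thread(target=worker, args=(tasks, results))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # all 20 work units completed
```

Tightly coupled simulations, by contrast, exchange data between tasks at every step, which is why they stay on single supercomputers with fast interconnects.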

Technical Report, Monash University: The Virtual Laboratory: Enabling On-Demand Drug Design with the World Wide Grid

"Computational Grids are emerging as a popular paradigm for solving large-scale compute and data intensive problems in science, engineering, and commerce. However, application composition, resource management and scheduling in these environments is a complex undertaking. In this paper, we illustrate the creation of a virtual laboratory environment by leveraging existing Grid technologies to enable molecular modeling for drug design on distributed resources. It involves screening millions of molecules in a chemical database (CDB) against a protein target to identify those with potential use for drug design. We have grid-enabled the molecular docking process by composing it as a parameter sweep application using the Nimrod-G tools. We then developed new tools for remote access to molecules in the CDB small-molecule database. The Nimrod-G resource broker, along with the CDB data broker, is used for scheduling and on-demand processing of jobs on distributed grid resources. The results demonstrate the ease of use and suitability of the Nimrod-G and virtual laboratory tools."
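
The "parameter sweep" framing means the same docking computation is run once per (molecule, target) combination, each run independent of the rest. A hedged sketch of the idea (the molecule names, targets, and scoring function are all invented; a real sweep would invoke a docking engine for each combination):

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

# Hypothetical inputs: in the paper these would come from the CDB
# small-molecule database and a protein target file.
MOLECULES = ["mol-%03d" % i for i in range(6)]
TARGETS = ["protein-A", "protein-B"]

def dock_score(molecule, target):
    """Stand-in for one docking run; deterministic toy score."""
    return sum(ord(c) for c in molecule + target) % 100

def sweep(molecules, targets, keep=3):
    # The cross product of parameter values defines the sweep; every
    # job is independent, so a resource broker can place each one on
    # any available grid node.
    jobs = list(itertools.product(molecules, targets))
    with ThreadPoolExecutor(max_workers=4) as pool:
        scored = list(pool.map(lambda job: (job, dock_score(*job)), jobs))
    # Keep only the best-scoring candidates for further screening.
    return sorted(scored, key=lambda item: item[1], reverse=True)[:keep]

print(len(sweep(MOLECULES, TARGETS)))  # 3 top candidates
```

The broker's job in such a system is then purely scheduling: matching the queue of (molecule, target) jobs against whatever resources are free.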

redux [04.04.01]
BioMedNet: Intel supports online protein project
[requires 'free' registration]

"Intel is providing equipment and software downloads for a project in which volunteers are donating spare home computer cycles to a Stanford University project studying the protein-folding process. The project, Folding@Home, was the first to model successfully a complete protein fold - a task not even achieved by supercomputers."

""We want to increase the value of the PC," said Scott Griffin, Intel's program manager. "The PC is there when people aren't at it, like when they are in meetings. A great thing about this is you get every day users involved in research that they care about. Not only do they get to help out, but they get to help cure these terrible diseases.""

redux [09.23.01]
Wired News: The Little Screensaver That Could

"IBM is spending $100 million building the world's fastest supercomputer to do cutting-edge medical research, but a distributed computing effort running on ordinary PCs may have beaten Big Blue to the punch.

IBM's proposed Blue Gene, a massively parallel supercomputer, is intended to help diagnose and treat disease by simulating the ultra-complex process of protein folding.

"But Folding@Home, a modest distributed computing project run by Dr. Vijay Pande and a group of graduate students at Stanford University, has already managed to simulate how proteins self-assemble, something that computers, until now, have not been able to do."

redux [10.09.00]
ACM CrossRoads: The SETI@Home Problem

"The SETI@Home problem can be thought of as a special case of the distributed computation verification problem: "given a large amount of computation divided among many computers, how can malicious participating computers be prevented from doing damage?" This is not a new problem. Distributed computation is a venerable research topic, and the idea of "selling spare CPU cycles" has been a science fiction fixture for years."

"The Internet makes it possible for computation to be distributed to many more machines. However, distributing computation across the Internet requires developers to consider the possibility of malicious clients."

"The general study of secure multiparty computation has produced much interesting work over the last two decades. Less well studied, unfortunately, are the tools and techniques required to move the theoretical results to the real world. The old dream of massively distributed computations is finally coming true, and yet our tools for building and analysing real systems still seem primitive. The challenge of the next few years will be to bridge this gap."
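
One common practical answer to the verification problem sketched above is redundancy: hand the same work unit to several unrelated clients and accept a result only when a quorum of them agree. A minimal sketch (client responses are simulated; the quorum size is an arbitrary choice):

```python
from collections import Counter

def accept_result(responses, quorum=2):
    """Accept a work unit's result only if at least `quorum`
    independent clients returned the same value; otherwise the
    unit should be reissued. Returns the value, or None."""
    value, count = Counter(responses).most_common(1)[0]
    return value if count >= quorum else None

# Three clients report on the same work unit; one is faulty or lying.
print(accept_result([42, 42, 17]))  # honest majority wins -> 42
print(accept_result([42, 17, 99]))  # no quorum -> None (reissue)
```

Redundancy trades throughput for integrity: each unit costs several times the compute, which is one reason the gap between theory and deployed systems that the article describes has been slow to close.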

[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.


[ search ]

[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag!

[ et cetera ]

valid xhtml 1.0?

This site designed by
Eric C. Snowdeal III.
© 2000-2005