snowdeal logo

archives

{bio,medical} informatics


Tuesday, May 29, 2001


ZDNet Supercomputing: Virtually fighting disease
"Juno Online Services burns through US$9.6 million per quarter. But the free Internet service provider hopes its Virtual Supercomputer Project will help reverse its cash flow and keep email free, while helping scientists search the human genome for disease-fighting proteins."

"Juno already collects responses to the ads it serves up to subscribers. Using the same technology, Juno will download the mathematical tasks to subscribers' computers. The processing will be done offline, while the subscribers still have their computers on but are taking breaks. The next time the user signs onto the network, the results of the tasks will be uploaded."
redux [04.05.01]
New Scientist Screen test
"There's never been a better reason to ditch the flying windows or gyrating text drifting across your computer screen. Cancer researchers are offering a new computer screen saver that will use spare processor time on your computer to look for drugs to fight the disease.

Pinching the idea from the search for extraterrestrial intelligence (SETI), the researchers hope that drafting in PC users across the globe will speed up drugs discovery. "We're going to need perhaps three or four million hours of computing time," says Graham Richards of Oxford University. "Using our own computers, we'd be dead before we finished."

To look for new drugs, the team takes 3D computer models of four key proteins that seem to promote cancer - by encouraging the growth of blood vessels to supply tumours, for example. The models include all the active binding sites of the proteins. "We'd like to design little molecules that would get into these sites and block them," says Richards."
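
Richards's three-to-four-million-hour estimate is exactly why the screensaver model is attractive: spread over enough idle PCs, the wall-clock time collapses. A back-of-the-envelope check (the volunteer counts and idle hours per day are illustrative assumptions, not project figures):

    # Back-of-the-envelope: 3.5 million CPU-hours done serially versus spread
    # over volunteer PCs. Volunteer counts and idle hours per day are
    # illustrative assumptions, not figures from the Oxford project.
    cpu_hours_needed = 3_500_000

    years_on_one_machine = cpu_hours_needed / 24 / 365
    print(f"one machine, flat out: about {years_on_one_machine:.0f} years")

    idle_hours_per_day = 6
    for volunteers in (10_000, 100_000, 1_000_000):
        days = cpu_hours_needed / (volunteers * idle_hours_per_day)
        print(f"{volunteers:>9} PCs at {idle_hours_per_day} idle hours/day:"
              f" about {days:.1f} days")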

redux [02.18.01]
Wired News Genome Effort Hits Home
"A new distributed computing project is comparing gene data with protein structures to determine their genome sequences.

"Genome@home is the second project from Stanford University's chemistry department, which also runs the Folding@Home project.

"Whereas Folding@Home is designed to learn how genomes fold into proteins, Genome@Home was launched this week to try and reverse engineer known proteins by guessing the genome sequence of their structures."

redux [02.02.01]
The Standard Juno Online Launches Into Outerspace
"Facing shrinking ad revenues, ISP Juno Online is jumping into a new field that to date has enjoyed its greatest fame from the search for extraterrestrial life.

The New York-based company announced Thursday the creation of Juno Virtual Supercomputer Project, a distributed computing effort that would tap the computing power of Juno's 4 million subscribers. Juno hopes to sell that vast power to large-scale research projects, initially focusing on bioinformatics and pharmaceutical work, said Juno President and CEO Charles E. Ardai."

redux [10.09.00]
ACM CrossRoads The SETI@Home Problem
"The SETI@Home problem can be thought of as a special case of the distributed computation verification problem: "given a large amount of computation divided among many computers, how can malicious participating computers be prevented from doing damage?" This is not a new problem. Distributed computation is a venerable research topic, and the idea of "selling spare CPU cycles" has been a science fiction fixture for years.

In real life, distributed computation has been used since at least the late 1980s to create "farms" of machines for rendering 3-D images. Farms allow graphic artists to create large images without needing to buy a supercomputer. More recently, the needs of scientific computation have led to the creation of frameworks such as Parallel Virtual Machine (PVM) and Beowulf, which make it easier to distribute computations across many machines. The machines involved are usually owned by the same entity, and a machine is "good" or "bad" according to whether it is operating correctly or malfunctioning. There are no blatantly malicious machines.

The Internet makes it possible for computation to be distributed to many more machines. However, distributing computation around the Internet requires developers to consider the possibility of malicious clients."

"The general study of secure multiparty computation has produced much interesting work over the last two decades. Less well studied, unfortunately, are the tools and techniques required to move the theoretical results to the real world. The old dream of massively distributed computations is finally coming true, and yet our tools for building and analysing real systems still seem primitive. The challenge of the next few years will be to bridge this gap."

redux [08.09.00]
BBC Screensavers could save lives
"Your computer could be helping to save lives when you are not using it to play games or surf the internet.

Instead of it sitting idle, it could be taking part in scientific experiments being distributed across thousands of computers on the internet.

Drugs to beat cancer and flu are starting to be tested in simulations split up and run on personal computers that would otherwise be doing nothing useful." [via slashdot.org]
PC Magazine New Apps Exploit Connectivity
"A natural complement to distributed file-sharing capabilities is distributed computation. The idea behind distributed computation is that a really big problem gets split into discrete, independent chunks, which are then parceled out to individual computers whose owners have volunteered their idle processor time to the cause. In aggregate, the users' computers form a sort of distributed supercomputer. The concept was first popularized by U.C. Berkeley's SETI@Home project, a 1999 PC Magazine Technical Excellence finalist that's now been downloaded by more than 2 million users. Though SETI@Home is a single-purpose tool designed solely to scour radio-telescope signals for signs of extraterrestrial transmissions, you can expect to see general-purpose mechanisms for distributing all kinds of massive computations. United Devices, for example, is a company that will use distributed computing for projects in areas such as bioinformatics research, drug design, and climate studies."
redux [07.22.00]
The Standard Distributed Computing Goes Commercial
"The distributed-computing model could be one of those rare cases where capitalism and pure scientific research mesh. Not every lab can afford to pay $200,000 for an eight-processor Origin 2000 SGI supercomputer, much less $1 million for a 40-processor machine, says David Fenstermacher, director of scientific computing for the medical school at the University of North Carolina at Chapel Hill. (Fenstermacher is also acting director of the campus' Center for Bioinformatics and a United Devices adviser.) And even the most powerful supercomputers need time to process data.

A project that would take several months on a supercomputer – creating a 3D model of a protein from its linear sequence, for example – could be accomplished in much less time using thousands of distributed computers."

redux [04.05.00]
egroups : Decentralization Description
"* Is decentralization ever a good idea? If so, when? Is there non-anecdotal evidence on costs and benefits?
* What protocol issues are there? Can we begin assembling a good protocol for decentralized messaging? To what degree do the protocols for Freenet, Gnutella or WorldOS meet the need? Do we need an application protocol or something lower level? Can HTTP do the job? Can we implement peer routing as an add-on to existing protocols? Is there a call to develop an IETF working group?
* Given that authoring and versioning are critical but hard in a decentralized environment, how can we approach the job? Is it possible to integrate WebDAV with peer networking?
* What are the business issues? Who are the players? Who else stands to win or lose, and why?

At present many people and groups are working on the issues in isolation, some for competitive reasons and some for lack of an alternative. My belief is that a communal approach will be more productive."


[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.

BIOINFORMATICS IN THE 21st CENTURY

[ search ]

[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb /

bioinformatics.org / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag !

[ et cetera ]

valid xhtml 1.0?

This site designed by
Eric C. Snowdeal III .
© 2000-2005