
{bio,medical} informatics


Friday, February 02, 2001


The Standard Juno Online Launches Into Outerspace
"Facing shrinking ad revenues, ISP Juno Online is jumping into a new field that to date has enjoyed its greatest fame from the search for extraterrestrial life.

The New York-based company announced Thursday the creation of Juno Virtual Supercomputer Project, a distributed computing effort that would tap the computing power of Juno's 4 million subscribers. Juno hopes to sell that vast power to large-scale research projects, initially focusing on bioinformatics and pharmaceutical work, said Juno President and CEO Charles E. Ardai."
redux [10.09.00]
ACM CrossRoads The SETI@Home Problem
"The SETI@Home problem can be thought of as a special case of the distributed computation verification problem: "given a large amount of computation divided among many computers, how can malicious participating computers be prevented from doing damage?" This is not a new problem. Distributed computation is a venerable research topic, and the idea of "selling spare CPU cycles" has been a science fiction fixture for years.

In real life, distributed computation has been used since at least the late 1980s to create "farms" of machines for rendering 3-D images. Farms allow graphic artists to create large images without needing to buy a supercomputer. More recently, the needs of scientific computation have led to the creation of frameworks such as Parallel Virtual Machine (PVM) and Beowulf, which make it easier to distribute computations across many machines. The machines involved are usually owned by the same entity, and a machine is "good" or "bad" according to whether it is operating or malfunctioning. There are no blatantly malicious machines.

The Internet makes it possible for computation to be distributed to many more machines. However, distributing computation around the Internet requires developers to consider the possibility of malicious clients."

"The general study of secure multiparty computation has produced much interesting work over the last two decades. Less well studied, unfortunately, are the tools and techniques required to move the theoretical results to the real world. The old dream of massively distributed computations is finally coming true, and yet our tools for building and analysing real systems still seem primitive. The challenge of the next few years will be to bridge this gap."

redux [08.09.00]
BBC Screensavers could save lives
"Your computer could be helping to save lives when you are not using it to play games or surf the internet.

Instead of it sitting idle, it could be taking part in scientific experiments being distributed across thousands of computers on the internet.

Drugs to beat cancer and flu are starting to be tested in simulations split up and run on personal computers that would otherwise be doing nothing useful." [via slashdot.org]
PC Magazine New Apps Exploit Connectivity
"A natural complement to distributed file-sharing capabilities is distributed computation. The idea behind distributed computation is that a really big problem gets split into discrete, independent chunks, which are then parceled out to individual computers whose owners have volunteered their idle processor time to the cause. In aggregate, the users' computers form a sort of distributed supercomputer. The concept was first popularized by U.C. Berkeley's SETI@Home project, a 1999 PC Magazine Technical Excellence finalist that's now been downloaded by more than 2 million users. Though SETI@Home is a single-purpose tool designed solely to scour radio-telescope signals for signs of extraterrestrial transmissions, you can expect to see general-purpose mechanisms for distributing all kinds of massive computations. United Devices, for example, is a company that will use distributed computing for projects in areas such as bioinformatics research, drug design, and climate studies."
redux [07.22.00]
The Standard Distributed Computing Goes Commercial
"The distributed-computing model could be one of those rare cases where capitalism and pure scientific research mesh. Not every lab can afford to pay $200,000 for an eight-processor Origin 2000 SGI supercomputer, much less $1 million for a 40-processor machine, says David Fenstermacher, director of scientific computing for the medical school at the University of North Carolina at Chapel Hill. (Fenstermacher is also acting director of the campus' Center for Bioinformatics and a United Devices adviser.) And even the most powerful supercomputers need time to process data.

A project that would take several months on a supercomputer – creating a 3D model of a protein from its linear sequence, for example – could be accomplished in much less time using thousands of distributed computers."

redux [04.05.00]
Wired Researcher Borrows from Napster
"A researcher working on the Human Genome Project is using Napster technology, and he's not looking for T3 connections to download Moby.

Dr. Lincoln Stein, an associate professor of bioinformatics at the Cold Spring Harbor Lab in New York, is investigating ways to use Napster-type technology to allow scientists to share their discoveries of the genome.

"I was very interested when I saw Napster," Stein said. "It has a similar architecture (to what we use now), but it allows for 'peer-to-peer' data exchange and it dawned on me that it would be marvelous for our annotation system.""
egroups : Decentralization Description
"* Is decentralization ever a good idea? If so, when? Is there non-anecdotal evidence on costs and benefits?
* What protocol issues are there? Can we begin assembling a good protocol for decentralized messaging? To what degree do the protocols for Freenet, Gnutella or WorldOS meet the need? Do we need an application protocol or something lower level? Can HTTP do the job? Can we implement peer routing as an add-on to existing protocols? Is there a call to develop an IETF working group?
* Given that authoring and versioning are critical but hard in a decentralized environment, how can we approach the job? Is it possible to integrate WebDAV with peer networking?
* What are the business issues? Who are the players? Who else stands to win or lose, and why?

At present many people and groups are working on the issues in isolation, some for competitive reasons and some for lack of an alternative. My belief is that a communal approach will be more productive."
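On the question above about implementing peer routing as an add-on to existing protocols: Gnutella's answer was hop-limited query flooding, in which each peer answers a query if it can and forwards it to its neighbors with a decremented time-to-live. The toy simulation below captures that routing rule; the Peer class and resource names are invented for illustration, and a real HTTP-based add-on would also need a query id so peers can drop duplicates rather than sharing a seen set.

class Peer:
    def __init__(self, name, resources):
        self.name = name
        self.resources = set(resources)
        self.neighbors = []

def route_query(peer, query, ttl, seen=None):
    """Flood `query` through the neighborhood, decrementing a
    time-to-live at each hop so the search stays bounded."""
    seen = set() if seen is None else seen
    if peer.name in seen or ttl < 0:
        return []
    seen.add(peer.name)
    hits = [peer.name] if query in peer.resources else []
    for neighbor in peer.neighbors:
        hits.extend(route_query(neighbor, query, ttl - 1, seen))
    return hits

a, b, c = Peer("a", []), Peer("b", []), Peer("c", ["genome.gff"])
a.neighbors, b.neighbors = [b], [c]
print(route_query(a, "genome.gff", ttl=2))  # ['c']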


[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.

BIOINFORMATICS IN THE 21st CENTURY


[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb /

bioinformatics.org / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /
