"IBM will supply the University of Pennsylvania and four hospitals with computers that will link into a computing "grid" to check for breast cancer, the company will announce Wednesday.
The grid will be used to detect breast cancer in patients, store mammograms in digital form and identify populations that are particularly susceptible, the company said in a statement. The system can be used, for example, to compare a new mammogram to a previous year's image to detect changes.
IBM, along with rivals such as Sun Microsystems and Compaq Computer, has been backing grid computing, which joins computers and storage systems into a large pool of computing power.
redux [11.21.01]
Scientific Computing World Scientific sharing across computer networks in USA
"The US National Science Foundation has announced a $12 million programme - called the NSF Middleware Initiative (NMI) - to develop middleware: software that allows scientists to share applications, scientific instruments and data, and collaborate with their colleagues across high-performance networks.
The effort will build on the success of the Globus project in developing middleware tools for grid computing, and will integrate Globus and other emerging middleware components into a well-tested, comprehensive, commercial-quality, middleware distribution package that runs on multiple platforms. These middleware distributions will be disseminated to research labs and universities worldwide."
redux [11.12.01]
ZDNet News New boost for open-source supercomputing
"Platform Computing, a company that tries to harness the collective computing power on computer networks, has signed a deal to commercialize an open-source supercomputing project.
Platform is working with the Globus Project to commercialize the Globus Toolkit for governing the use of computers and storage systems joined into a large computing "grid," Platform said Wednesday."
"Grid computing, though, often uses higher-powered computers than mere desktop PCs, and has attracted the interest of IBM, which thinks corporate customers as well as academics will use grid methods. IBM is working with Globus to boost this expansion.
Grid computing has long held potential for some types of computing tasks--typically those that don't require much communication between one computing task and another. For this reason, grids don't replace single mammoth supercomputers such as those from Cray. However, grid computing is popular among pharmaceutical companies and others."
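That "loosely coupled" point deserves a concrete illustration. Here is a minimal sketch of the kind of embarrassingly parallel workload grids handle well - independent tasks that never talk to each other. Python's multiprocessing stands in for a real grid scheduler (such as Globus-managed resources or Platform's software), and score_region and its workload are invented for illustration:

# Minimal sketch of an "embarrassingly parallel" workload: each task is
# independent, so a grid scheduler can place it on any free node.
# multiprocessing stands in for a real scheduler; the task is illustrative.
from multiprocessing import Pool

def score_region(region_id):
    """Stand-in for one independent compute task; it never needs to
    communicate with the other tasks."""
    return region_id, sum(i * i for i in range(region_id * 1000)) % 9973

if __name__ == "__main__":
    with Pool(processes=4) as pool:  # four local workers ~ four grid nodes
        for region, score in pool.map(score_region, range(16)):
            print(f"region {region:2d} -> score {score}")

Because no task waits on another, adding nodes scales throughput almost linearly - which is exactly why this style of problem suits a grid better than a tightly coupled supercomputer job.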
Technical Report, Monash University The Virtual Laboratory: Enabling On-Demand Drug Design with the World Wide Grid
"Computational Grids are emerging as a popular paradigm for solving large-scale compute and data intensive problems in science, engineering, and commerce. However, application composition, resource management and scheduling in these environments is a complex undertaking. In this paper, we illustrate the creation of a virtual laboratory environment by leveraging existing Grid technologies to enable molecular modeling for drug design on distributed resources. It involves screening millions of molecules from a chemical database (CDB) against a protein target to identify those with potential use for drug design. We have grid-enabled the molecular docking process by composing it as a parameter sweep application using the Nimrod-G tools. We then developed new tools for remote access to molecules in the CDB. The Nimrod-G resource broker, along with the CDB data broker, is used for scheduling and on-demand processing of jobs on distributed grid resources. The results demonstrate the ease of use and suitability of the Nimrod-G and virtual laboratory tools."
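A hedged sketch of the parameter sweep pattern the paper describes: one independent docking job per molecule/setting combination, ready for a broker like Nimrod-G to farm out. The molecule IDs, the "setting" knob, and the dock command below are invented stand-ins - this is not Nimrod-G plan-file syntax or real CDB data:

# Sketch of a parameter sweep: one independent docking job per
# (molecule, setting) pair, each schedulable on any grid node.
# All identifiers and the command line are illustrative.
from itertools import product

molecules = [f"CDB-{i:06d}" for i in range(1, 6)]  # stand-in for millions
settings = [1, 4, 8]                               # illustrative docking knob

def make_job(molecule, setting):
    """Describe one independent docking task for a resource broker."""
    return {"molecule": molecule,
            "setting": setting,
            "command": f"dock --mol {molecule} --level {setting}"}

jobs = [make_job(m, s) for m, s in product(molecules, settings)]
print(f"{len(jobs)} independent jobs; first: {jobs[0]['command']}")

The sweep is just a cross product of inputs, so the job list can be generated up front and handed to the scheduler; nothing about any one job depends on another's result.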
redux [11.06.01]
Washington Business Journal Celera strikes deal with Parabon to improve efficiency
"Celera is utilizing a novel approach to bioinformatics to speed up its proteomics research.
The Rockville-based company is using Frontier, a proprietary technology, to harness computer power while the computers are shut down, in sleep mode, or being used for other applications."
" Through Frontier, Celera can tie together its employees' computers so that they all work on large-scale, proteomic sequencing. Frontier detects when a computer has idle capacity, and therefore room, to work on a research application. Such capacity could be available even when a person is using his or her computer during everyday tasks."
redux [04.04.01]
BioMedNet Intel supports online protein project
[requires 'free' registration]
"Intel is providing equipment and software downloads for a project in which volunteers are donating spare home computer cycles to a Stanford University project studying the protein-folding process. The project, Folding@Home, was the first to model successfully a complete protein fold - a task not even achieved by supercomputers."
""We want to increase the value of the PC," said Scott Griffin, Intel's program manager. "The PC is there when people aren't at it, like when they are in meetings. A great thing about this is you get every day users involved in research that they care about. Not only do they get to help out, but they get to help cure these terrible diseases.""
redux [09.23.01]
Wired News The Little Screensaver That Could
"IBM is spending $100 million building the world's fastest supercomputer to do cutting-edge medical research, but a distributed computing effort running on ordinary PCs may have beaten Big Blue to the punch.
IBM's proposed Blue Gene, a massively parallel supercomputer, is intended to help diagnose and treat disease by simulating the ultra-complex process of protein folding.
"But Folding@Home , a modest distributed computing project run by Dr. Vijay Pande and a group of graduate students at Stanford University, has already managed to simulate how proteins self-assemble, something that computers, until now, have not been able to do."
redux [08.29.01]
Nature: Science Update Parasite corrals computer power
"According to The Hitchhiker's Guide to the Galaxy, hyper-intelligent pan-dimensional beings (disguised as mice) are using us to compute The Ultimate Question of Life, The Universe, And Everything. Now earthling scientists have roped unsuspecting web servers into a similar - albeit slightly less ambitious - exercise in parasitic computing.
Using the Internet itself as a computer, Jay Brockman and colleagues at the University of Notre Dame, Indiana, have solved a mathematical problem with the unwitting assistance of machines in North America, Europe and Asia."
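The Nature paper behind this story exploited the checksum that TCP performs on every packet: candidate solutions were encoded so that only a correct one produced a valid checksum, making remote servers unwitting verifiers. Here's a purely local, illustrative re-creation of that 16-bit ones'-complement checksum (RFC 1071 style) - no remote hosts are conscripted, and the "candidate" is just a stand-in byte string:

# Local, illustrative re-creation of the TCP/IP checksum the parasitic
# computing trick relied on; nothing here touches the network.
def ones_complement_sum(data: bytes) -> int:
    """Sum 16-bit big-endian words with end-around carry."""
    if len(data) % 2:
        data += b"\x00"              # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
    while total >> 16:               # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return total

def internet_checksum(data: bytes) -> int:
    return ~ones_complement_sum(data) & 0xFFFF

candidate = b"candidate-solution"    # stand-in for an encoded candidate
csum = internet_checksum(candidate)
# Receiver's rule: data plus its checksum must sum to 0xFFFF to be valid.
assert ones_complement_sum(candidate + csum.to_bytes(2, "big")) == 0xFFFF
print(f"checksum 0x{csum:04x} verifies")

A server that silently drops packets with bad checksums is, in effect, evaluating the validity test for free - which is the whole "parasitic" point.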
EyeForPharma Novartis evaluates Entropia's distributed computing technology for accelerating drug discovery
""The vast quantities of data involved in the genomic era of drug discovery are quickly outpacing advances in computing technology," said Robert North, Entropia CEO. "Distributed computing allows companies to cost-effectively access the massive computing power they'll need by using their existing PC networks. It's quite exciting that companies like Novartis are deploying our platform to demonstrate the potential of distributed computing as a valuable tool in drug discovery efforts.""
redux [10.09.00]
ACM CrossRoads The SETI@Home Problem
"The SETI@Home problem can be thought of as a special case of the distributed computation verification problem: "given a large amount of computation divided among many computers, how can malicious participating computers be prevented from doing damage?" This is not a new problem. Distributed computation is a venerable research topic, and the idea of "selling spare CPU cycles" has been a science fiction fixture for years."
"The Internet makes it possible for computation to be distributed to many more machines. However, distributing computing around the internet requires developers to consider the possibility of malicious clients."
"The general study of secure multiparty computation has produced much interesting work over the last two decades. Less well studied, unfortunately, are the tools and techniques required to move the theoretical results to the real world. The old dream of massively distributed computations is finally coming true, and yet our tools for building and analysing real systems still seem primitive. The challenge of the next few years will be to bridge this gap."
“Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.”
BIOINFORMATICS IN THE 21st CENTURY