
archives

{bio,medical} informatics


Tuesday, October 03, 2000


Individual.Com: Pushing Boundaries
""What's needed in this field is a knowledge-management system that lets researchers query across any and all of this data, write complex queries, and have the data appear to be coming from a common repository," says [VP of IBM Life Sciences Caroline] Kovac. Data warehousing doesn't do all that well with disparate data, and it also has a rigid schema, so you can't add new sources very easily. DiscoveryLink helps researchers integrate data in a flexible and scalable way.

The hardware side is also changing, Kovac says. Historically, biological computing has been done on supercomputers, but there's now strong interest in Linux clusters. Many tasks that biologists and life scientists need to do can be partitioned to run on Linux clusters. Says Kovac, "Some of the most interesting installations we're doing are hybrids with our SP2 machine and Linux clusters.""
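
DiscoveryLink is IBM's own middleware, but the idea Kovac describes, a single query that spans disparate sources as if they sat in one repository, is easy to sketch. The toy Python below (the source names and schema are invented for illustration; this is not DiscoveryLink's actual interface) joins rows from a relational store and a flat-file-style lookup on a shared accession id:

    # toy "federated" query: two unrelated sources read as one result set.
    # the schema and accession ids are invented for the example.
    import sqlite3

    # source 1: a relational store of sequence annotations.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE annotation (accession TEXT, gene TEXT)")
    db.executemany("INSERT INTO annotation VALUES (?, ?)",
                   [("X01234", "BRCA1"), ("X05678", "TP53")])

    # source 2: expression results living somewhere else entirely.
    expression = {"X01234": 2.4, "X05678": 0.7}

    # one pass that behaves like a cross-source join on the accession id.
    for accession, gene in db.execute("SELECT accession, gene FROM annotation"):
        print(accession, gene, expression.get(accession))

A real federation layer would push the query planning down to each source rather than joining in application code, which is presumably the part DiscoveryLink sells.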
redux [09.07.00]
ScienceDaily: Supercomputers Help University Of Idaho Scientists Explore Genetics And Bioinformatics
"The mapping of the human genome is the tip of the iceberg that is the biological information revolution.

University of Idaho computer scientists and mathematicians are joining biologists to explore new ways to interpret the complex genetic information that describes all living things and their relationships.

Along the way, UI students returning to school this fall will find a new course few schools could hope to offer: building a new supercomputer."

"The students will work on every step of the project, from determining the requirements the supercomputer must meet, though the purchase, assembly, software selection and installation. "They are involved from start to finish. It should be a great experience for them," Heckendorn added."

""It's commodity computing. If you can only buy commodity computers and hook them together with the right stuff in the right way, you can get supercomputing power," he said. Although multi-million dollar specialty supercomputers still dominate the high end of the market, Beowulf-style supercomputers are gaining."
redux [08.28.00]
The New York Times: Supercomputers Track Human Genome
[requires 'free' registration]
"Kwang-I Yu, president of Paracel Inc., will not say which secretive government agency buys his company's specialized supercomputers. "We sell to the federal government," he demurs.

But J. Craig Venter, the president of Celera Genomics, is less circumspect. Paracel's machines, he said, are used by the National Security Agency, the code-breaking unit that eavesdrops on other nations. And Dr. Venter should know. Celera Genomics, the company that mapped the human genome, acquired Paracel for nearly $250 million in stock in June."

"The machines are being snapped up to sift the blizzard of data being generated by the Human Genome Project and various private genomics efforts. "They're all character strings," said Dr. Yu, comparing the gene sequence to a text message."
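
Yu's "character strings" remark is literal: a sequence on disk is a long text over a four-letter alphabet, and the core operation the hardware accelerates is string matching. A plain-Python version of the same idea (the sequence and motif here are toy data):

    # a gene sequence is just a character string over {A, C, G, T};
    # finding a motif is ordinary substring search. toy data for illustration.
    sequence = "ATGCGTACGTTAGCATCGATTACGATCGTACG"
    motif = "TACG"

    hits = [i for i in range(len(sequence) - len(motif) + 1)
            if sequence[i:i + len(motif)] == motif]
    print("motif found at positions:", hits)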

"I.B.M. estimates that the market for hardware and software for life sciences will grow from $3.5 billion now to more than $9 billion by 2002. Carolyn Kovac, who heads a newly formed life sciences division at I.B.M., said biologists had replaced physicists as the main scientific users of supercomputers."

redux [06.14.00]
Scientific Computing World: Biotech and clusters dominate Mannheim Supercomputing Conference
"Biotechnology and cluster computing were the main focus of attention in the corridors and private conversations at the recent Mannheim Supercomputing Conference, even though US Defense Department machines again dominated the upper ranks of the most recent Top500 listing of the world's fastest supercomputers, published at the meeting. "

"...in the breaks between the formal sessions at the conference, many of the delegates appeared to be focusing more on the prospective growth in bioinformatics and the demands that this is creating for high performance computing. In an interview, Dr Martin Walker from Compaq reviewed the figures for the growth of the supercomputing market. At present, according to the International Data Corporation figures, the high performance computing market was worth about $5.6 billion in 1999 and is expected to grow by about 9 per cent annually through to 2003. However, bioinformatics may change all that, he noted. The total R&D budget of the pharmaceutical industry is about $40 billion annually and so if some 10 per cent of it were to go on IT, this would provide $4bn for computing, especially high performance computing applications. Thus bioinformatics could grow, relatively quickly, from a minor component of the market to a 40 per cent share. 'This is a phase transition,' he commented. "

redux [03.29.00]
LinuxWorld: Farming, Linux-Style
"Gone are the days when any pioneer with a bit of hardware, hard code, and hard work could run a small Linux farm and compete with the best plantations. The smart folks at biotech firm Incyte Genomics of Palo Alto, Calif., have just invented agribusiness. You remember everything you ever tried to tell your boss or colleagues about Linux's stability, price performance, and reliability? Well, Incyte has put those ideas to the test and come up grinning like a bandit.

To map the human genome, Incyte runs the world's largest commercial Linux farm, with more than 2,000 Linux processors chomping away on tens of millions of jobs per day. In its datacenter, laid out like a temple in the middle of Incyte's corporate headquarters, space costs a king's ransom -- but the company has come up with clever ways to address that problem..."
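
"Tens of millions of jobs per day" across a couple of thousand processors is the classic work-farm pattern: a queue of small, independent jobs drained by a pool of workers. A single-machine stand-in for that pattern (the job body is a placeholder, not Incyte's pipeline):

    # single-machine stand-in for a Linux "farm": a pool of worker processes
    # draining a queue of small, independent jobs. the job body is a placeholder.
    from multiprocessing import Pool

    def analyze(job_id):
        # placeholder for one unit of sequence analysis
        return job_id, job_id % 7

    if __name__ == "__main__":
        with Pool(processes=8) as pool:   # roughly one worker per processor
            results = pool.map(analyze, range(10_000))
        print(len(results), "jobs completed")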


[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools are fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.

BIOINFORMATICS IN THE 21st CENTURY


[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb /

bioinformatics.org / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag !

[ et cetera ]


This site designed by
Eric C. Snowdeal III .
© 2000-2005