
archives

{bio,medical} informatics

Wednesday, September 20, 2000


kuro5hin: Why not parallel (agent frameworks)?
"Several months ago I was given the task of writing a very fast genetic sequence alignment software. For people that don't know, this software takes two gene sequences of arbitrary length and produces all of their possible alignments (see chapter four of this tutorial)..."

"The real problem arises in lab use, where a scientist has to compare a new gene sequence to an existing database of hundreds of thousands of sequences. There are already a lot of web sites to do this, but this particular piece of software was supposed to be "ours" (sigh). At first I didn't know much about the whole subject, so I decided to conduct an experiment using a dynamic programming algorithm. To be fast, C was the programming language of choice. The sequential version of the algorithm worked smoothly, but it wasn't nearly fast enough for what we wanted to do. So I decided to try and parallelize the algorithm. Writing threaded applications that work under different platforms can be a pain, no matter how much standardization people have thrown at it. Soon, my parallelized version worked on a two-CPU Alpha machine and I was pleased (it was faster too). However, when compiled under Linux or Solaris or FreeBSD it trashed the memory, and then the painful task of debugging a multi-threaded application started. Due to lack of time and proper tools (my finals were coming) I abandoned the code.

"For now, I am stuck with solving matters on a single machine and telling it to do exactly what it has to, no matter what the complexity of the problem attacked is. Many people observe that Open Source usually produces tools and methods that are (better) copies of commercial originals. Perhaps an Open Source tool such as the one proposed above would be a good thing to do. Give people the power, for free."
SunWorld: Distributing computing the GNU way
"We often mention distributed computing models: JavaSpaces, Sash, BizTalk, WebL, and so on. Our lead column in August gave particular attention to the technical prospects for Microsoft's .NET initiative. Piper is an alternative to those models, and .NET in particular, on both engineering and business levels.

Microsoft, for example, has specific business motivations with .NET that involve licensing issues and how the company is paid for its products. Crudely, Microsoft wants to use .NET capabilities to ensure it receives payment every time its software is used. Piper, in contrast, is a free software project to make "anything and everything buildable by linking small components," even across a network, according to J.W. Bizzaro, director of "

"Moreover, Piper's connections are considerably richer than Unix's pipes. Rather than just a one-way, unstructured datafeed, Piper "[l]inks can depict protocol-independent data flow, procedural steps, and relationships," according to one Piper document. Those links, further, "can merge or split streams."

Most compelling of all, perhaps, is the opportunity to escape the confines of a single desktop and access resources throughout a rich network. Piper knows how to do that, too." [via]

piper
"Piper is a system for managing multi-protocol connections between Internet-distributed objects. Networks, programs, files, widgets, and so on, are all treated as objects and represented in a graphical user interface (GUI) as the nodes of a flow chart (with the Pied/Piper user interface). The user can join nodes via lines that depict links for data flow, procedural steps, relationships, and so forth.

The Internet-distributed nature of Piper lets the user work in a unique way: Only the graphical representation of an object resides on a local workstation. Compute-intensive programs and large data sets can reside remotely on high-performance, high-capacity computers.

Joining nodes across the Internet can also be used to form world-wide collaboratives (such as The Loci Project) and provide an almost limitless collection of objects for the user. "

[ rhetoric ]

Bioinformatics will be at the core of biology in the 21st century. In fields ranging from structural biology to genomics to biomedical imaging, ready access to data and analytical tools is fundamentally changing the way investigators in the life sciences conduct research and approach problems. Complex, computationally intensive biological problems are now being addressed and promise to significantly advance our understanding of biology and medicine. No biological discipline will be unaffected by these technological breakthroughs.


[ search ]

[ outbound ]

biospace / genomeweb / bio-it world / scitechdaily / biomedcentral / the panda's thumb / nodalpoint / flags and lollipops / on genetics / a bioinformatics blog / andrew dalke / the struggling grad student / in the pipeline / gene expression / free association / pharyngula / the personal genome / genetics and public health blog / the medical informatics weblog / linuxmednews / nanodot / complexity digest /

eyeforpharma /

nsu / nyt science / bbc scitech / newshub / biology news net /

informatics review / stanford / bmj info in practice /

[ schwag ]

look snazzy and support the site at the same time by buying some snowdeal schwag !

[ et cetera ]

valid xhtml 1.0?

This site designed by
Eric C. Snowdeal III .
© 2000-2005