
Re: [sdpd] Seen at Geneva - Call for reviews



>3) Bruker presented 2 programs which took a quite different approach by
>using Monte Carlo trials followed by rapid refinement and assessment.
>One of them actually made its refinements directly against the observed
>profile rather than peak positions - this is a bold move and it will be
>interesting to see whether it succeeds.

I think that neither refinement directly against the observed profile nor
refinement against the peak positions is the best approach.

There is a third possibility: refining against a pseudo powder
pattern rebuilt from the peak positions and intensities, just as it is
done in the pseudo-direct-space program ESPOIR for structure
solution. The advantage here would be the possibility of
enlarging the peak widths in order to find the global minimum more easily.
ESPOIR could easily be modified (by me) for indexing that way,
because it already applies that pseudo-powder-pattern concept
together with Monte Carlo and pseudo simulated annealing.
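A minimal sketch of the idea (hypothetical Python, not ESPOIR's actual code): rebuild a pseudo pattern from extracted peak positions and intensities as a sum of Gaussians of adjustable width, and score trial models with the conventional Rp factor.

```python
import numpy as np

def pseudo_pattern(positions, intensities, width, grid):
    """Rebuild a pseudo powder pattern from extracted peak positions and
    intensities, as a sum of Gaussians of adjustable (enlarged) width."""
    pattern = np.zeros_like(grid, dtype=float)
    for pos, inten in zip(positions, intensities):
        pattern += inten * np.exp(-0.5 * ((grid - pos) / width) ** 2)
    return pattern

def rp(obs, calc):
    """Profile R-factor: Rp = sum|obs - calc| / sum(obs)."""
    return np.abs(obs - calc).sum() / obs.sum()
```

The point of the adjustable width: with broad peaks, a trial cell whose calculated positions are only slightly off still overlaps the pseudo-observed pattern, so Rp decreases smoothly toward the global minimum instead of being flat everywhere except at the exact solution.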

Moreover, applying "brute force" (trying all possibilities on a
grid of steps) may not need prohibitive time if enlarging
the peak widths allows the grid step to be increased (say for instance
a step of 0.03 or 0.02 Angstrom instead of 0.01 or 0.005). The main
time problem is finding the time to write the program ;-).

Best,

Armel

PS - Obviously prohibitive times:

- Testing all possible cubic cells from 2 to 52 Angstroms by steps
of 0.001 corresponds to 50000 trials, and current processors should
allow testing that in less than one second (tests being made on
Rp of a small pseudo-powder pattern corresponding to the 20 first
peaks).
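Such a cubic scan could look like the following sketch (hypothetical code, not an existing program): for simplicity it scores each trial cell directly on how close the quantities (a/d)^2 fall to allowed integer sums h^2+k^2+l^2, rather than on the full pseudo-profile Rp described above.

```python
import numpy as np

def cubic_q_sums(hkl_max=8):
    """Allowed values of h^2 + k^2 + l^2 for a cubic lattice."""
    return np.array(sorted({h*h + k*k + l*l
                            for h in range(hkl_max + 1)
                            for k in range(hkl_max + 1)
                            for l in range(hkl_max + 1)} - {0}), float)

def brute_force_cubic(d_obs, a_min=2.0, a_max=52.0, step=0.001):
    """Scan trial cubic cells on a grid (~50000 for the default range);
    for a cubic cell 1/d^2 = (h^2+k^2+l^2)/a^2, so each trial a is
    scored by the distance of (a/d)^2 to the allowed integer sums."""
    allowed = cubic_q_sums()
    d_obs = np.asarray(d_obs, float)
    best_a, best_score = a_min, np.inf
    for a in np.arange(a_min, a_max, step):
        q = (a / d_obs) ** 2
        score = np.abs(allowed[None, :] - q[:, None]).min(axis=1).sum()
        if score < best_score:
            best_a, best_score = a, score
    return best_a
```

Since a strict "<" keeps the first (smallest) cell reaching the minimum score, the smallest of the redundant multiple cells is returned.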

- Finding the 2 parameters for hexagonal/trigonal and tetragonal
would already need 50000 times more. Enlarging the step to 0.01 or
to 0.03 Angstroms makes a difference (respectively 500 or 55 times more
than the cubic case, between 1 and 10 minutes).

- Orthorhombic starts to pose a problem. Then monoclinic and triclinic,
with a systematic 0.03 Angstrom grid step to explore 2-32 Angstroms
(1000 steps per cell length), and a 0.04° step to explore 90-130°
(1000 steps per angle), would correspond to 231 days and 633000 years
of calculation, respectively (if 3000000 trials are tested per
minute) - not taking account of redundant cells.
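As a sanity check, the grid sizes and times above follow from simple arithmetic (the 3000000 trials per minute rate is the figure quoted above):

```python
# Back-of-the-envelope check of the quoted trial counts and times.
cubic = int(round((52 - 2) / 0.001))           # 50000 one-parameter trials
n_len = int(round((32 - 2) / 0.03))            # 1000 steps per cell length
n_ang = int(round((130 - 90) / 0.04))          # 1000 steps per angle
monoclinic = n_len ** 3 * n_ang                # 3 lengths + 1 angle = 1e12
triclinic = n_len ** 3 * n_ang ** 3            # 3 lengths + 3 angles = 1e18
rate = 3_000_000                               # trials per minute (quoted)
mono_days = monoclinic / rate / 60 / 24        # about 231 days
tric_years = triclinic / rate / 60 / 24 / 365  # about 6.3e5 years
print(cubic, round(mono_days), round(tric_years))
```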

But Monte Carlo random steps are expected to reduce these times a lot.
Making a very simple indexing program following these recommendations
would be straightforward. Whether it is worth doing when processors are
close to 3 GHz is another story.
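A minimal Monte Carlo sketch for the cubic case can illustrate the random-steps idea (hypothetical code, not ESPOIR): random trial cells replace the exhaustive grid, each scored by how close the quantities (a/d)^2 fall to allowed integer sums h^2+k^2+l^2 (a crude stand-in for the Rp test on a pseudo-powder pattern).

```python
import numpy as np

def mc_index_cubic(d_obs, trials=20000, a_min=2.0, a_max=52.0, seed=0):
    """Monte Carlo search for a cubic cell parameter: random trial
    cells, scored by the distance of (a/d)^2 to sums h^2+k^2+l^2."""
    rng = np.random.default_rng(seed)
    allowed = np.array(sorted({h*h + k*k + l*l
                               for h in range(9)
                               for k in range(9)
                               for l in range(9)} - {0}), float)
    d_obs = np.asarray(d_obs, float)
    best_a, best_score = a_min, np.inf
    for a in rng.uniform(a_min, a_max, trials):
        q = (a / d_obs) ** 2
        score = np.abs(allowed[None, :] - q[:, None]).min(axis=1).sum()
        if score < best_score:
            best_a, best_score = a, score
    return best_a, best_score
```

A fixed number of random trials replaces the grid whose size explodes with the number of cell parameters; the same loop extends to several parameters by drawing each one at random, which is where the gain over the 1e12-1e18 systematic counts would come from.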

The test case in the Kariuki et al. paper (J. Synchrotron Rad. 6, 1999,
87-92), which used a genetic algorithm, was orthorhombic with cell
parameters close to 5 Angstroms.

Using high-quality synchrotron raw data clearly forces a decreased
grid step. We need the opposite, an enlarged grid step, and this is
obtained by enlarging the peak widths.

Insensitivity to <10% impurities would come for free. But I think that
indexing a 50-50% mixture of two unknown phases would obviously be
quite difficult on the Rp test basis.

Comments ?

