The NRC rankings are out

The National Research Council graduate department rankings are out. These are based on objective, measurable quantities, like how many papers are published per faculty member and how long it takes Ph.D. students to graduate (and where they go afterward). This is fundamentally different from the U.S. News & World Report ranking, which is based on reputation.

On reputation we’re about number 300. Based on facts, my department ranks about 10x higher (around 30) nationally.

I wonder how long it will take perceptions to catch up with reality?

I plan to make the GSU/CS logo much more prominent on posters in the future.

Written by Rob in: pedagogy |


I think it was Arthur C. Clarke who pointed out that any sufficiently advanced technology is indistinguishable from magic. I didn’t think I’d live to see that happen. But the Republican candidate for senator in Delaware had “dabbled in witchcraft,” and many (but certainly not all) of the conservatives, as well as a fair number of the “green left,” show an abysmal lack of understanding of science and technology.

While I certainly can’t claim to understand all of the details of every bit of semiconductor technology, the ‘gentleman’s acquaintance’ with materials physics I had as an undergraduate is still valid, and I can appreciate the mechanisms of solid-state electronics – including the rather complex ones that run our computers. However, if I didn’t have that, it would be easy to drift into magical thinking about the most interesting machines people have built.

Technology must be scary to the ignorant.

Written by Rob in: rant,science |

DIRECT optimizer and its cousins

Back after a busy summer with trips and other sorts of diversions (not all of them pleasant).

There is a class of function-value-only optimizers which build a tree and use Lipschitz continuity to ensure global convergence. DIvided RECTangle (DIRECT) algorithms recursively build a tree of rectangles. With a bit of cunning, based on Lipschitz continuity (the condition that the change in F is bounded by a finite constant K: |delta F| <= K |delta x|), this approach quickly finds global optima over a bounded region.
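The role the Lipschitz bound plays can be sketched in a few lines. This is an illustration, not the post’s code: the names `box_lower_bound`, `f_center`, `K`, and `half_diagonal` are mine, and a real DIRECT implementation folds this bound into its rectangle bookkeeping rather than calling a helper like this.

```python
# Illustrative sketch: a Lipschitz constant K turns one sampled value at a
# box's centre into a certified lower bound over the whole box.
def box_lower_bound(f_center, K, half_diagonal):
    """No point in the box can lie more than K * half_diagonal below the
    centre value, since |delta F| <= K |delta x| (hypothetical helper)."""
    return f_center - K * half_diagonal

# A box whose lower bound already exceeds the best value found so far can
# never contain the global minimum, which is what lets the search discard
# regions safely and still converge globally over a bounded domain.
```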

The cunning bit is how the algorithm selects candidate optima for further expansion. Basically, if a box has size X and its function value is the lowest among all boxes of size X or larger, then that box is a candidate for search.
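That selection rule can be sketched directly. This is a minimal illustration of the rule as stated in the text, with names of my own choosing; the standard DIRECT algorithm actually uses a slightly more elaborate "potentially optimal" condition involving a convex hull over (size, value) pairs, which this sketch omits.

```python
# Sketch of the candidate-selection rule: a box is a candidate for further
# subdivision if its function value is the lowest among all boxes of equal
# or larger size. Each box is represented as a (size, f_value) tuple.
def select_candidates(boxes):
    # Sort largest-first so a running minimum over the processed prefix is
    # exactly the best value among boxes of equal or larger size.
    ordered = sorted(boxes, key=lambda b: -b[0])
    candidates = []
    best = float("inf")
    i = 0
    while i < len(ordered):
        # Gather the group of boxes sharing the current size.
        j = i
        while j < len(ordered) and ordered[j][0] == ordered[i][0]:
            j += 1
        group = ordered[i:j]
        group_min = min(f for _, f in group)
        # This size qualifies only if it ties or beats every larger box.
        if group_min <= best:
            candidates.extend(b for b in group if b[1] == group_min)
        best = min(best, group_min)
        i = j
    return candidates
```

For example, with boxes [(4, 5.0), (2, 3.0), (2, 4.0), (1, 2.0), (1, 6.0)], the candidates are (4, 5.0), (2, 3.0), and (1, 2.0): each is the best value at its size or any larger size, while (2, 4.0) and (1, 6.0) are dominated by larger boxes.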

It is immediately obvious that such an algorithm could efficiently search many problems that are thought to be NP-complete (but aren’t really), like protein folding, as well as other problems that are simply a pain – like image modeling from incomplete Fourier data.

I’ve generalized the algorithm with an amortized generic tree based on Delaunay triangulations, and I get both a simpler program and somewhat better convergence (or at least what looks like better convergence). It certainly handles some very painful challenge problems quite well. The reason to use a generic tree is to allow seeding the optimizer with known potential solutions and/or to avoid having to encode the search space and function in a fairly complex manner.

So now to try more interesting problems.

Written by Rob in: science |
