Thursday, November 18, 2010

Google Code vs GitHub for hosting opensource projects

The Cython project is now considering where to move its main (Mercurial) repository, and Robert Bradshaw (one of the main Cython developers) asked me about my experience with Google Code and GitHub, since we use both for SymPy.

Google Code is older, and it was the first service that let you set up a free, virtually unlimited number of projects easily and immediately. At that time (four years ago?) that was unheard of. In the meantime, however, the GitHub guys not only matched this, but also implemented features that (as far as I know) no one else offers at all: hosting your own pages at your own domain (but on GitHub's servers; some examples are sympy.org and docs.sympy.org), commenting on git branches and pull requests before the code gets merged in (I am 100% convinced this is the right approach, as opposed to commenting on the code after it is in), and making it easy to fork a repository. GitHub simply has more social features than Google Code.

I believe that managing an open source project is mainly a social activity, and GitHub's social features really make many things easier. From this point of view, GitHub is clearly the best choice today.

I think there is only one (but potentially big) problem with GitHub: its issue tracker is very bad compared to Google Code's. For that reason (and also because we already use it), we keep SymPy's issues at Google Code.

The above are the main things to consider. There are also some smaller things to keep in mind, which I will briefly touch on below. Google Code doesn't support git, and it blocks access from Cuba and other countries. On Google Code, you need to be an admin to change the front page, while at GitHub I simply give push access to all SymPy developers, so anyone just pushes a patch to this repository: https://github.com/sympy/sympy.github.com, and it automatically appears on our front page (sympy.org). With Google Code we had to write long pages (in our docs) about how to send patches; with GitHub we just say "send us a pull request" and point to http://help.github.com/pull-requests/. In other words, GitHub takes care of teaching people how to use git and how to send patches, and we can concentrate on reviewing the patches and pushing them in.

Wiki pages at GitHub are maintained in git, and GitHub provides the web frontend to them as open source, so there is no vendor lock-in. Anyone with a GitHub account can modify our wiki pages, while the Google Code wiki can only be modified by people I add to the Google Code project. That forced us to install MediaWiki on my Linode server (hosted at linode.com, which by the way is an excellent VPS hosting service that I have been using for a couple of years and can fully recommend), and I had to manage it all the time. Now we are moving our pages to the GitHub wiki, so I have one less thing to worry about.

So as you can see, I, as an admin, have fewer things to worry about, since GitHub now manages everything for me, while with Google Code I had to manage lots of things on my Linodes myself.

One other thing to consider: GitHub is built around git, but it also provides svn and hg access (both push and pull; the repository is translated automatically between git and svn/hg). I never really used this much, so I don't know how stable it is. As I wrote before, I think git is the best tool for maintaining a project now, and I think GitHub is now the best place to host one (except for the issue tracker, where Google Code is better).

Sunday, October 31, 2010

git has won

I switched to git from Mercurial about two years ago. See here why I switched and here my experience after 4 months. Back then I was unsure whether git would win, but I thought it had the bigger momentum. Well, I think it's now quite clear that git has already won. Pretty much everybody I collaborate with is using git now.

I use GitHub every day, and now, thanks to GitHub pull requests, I think it's the best collaboration platform out there (compared to Google Code, Sourceforge, Bitbucket or Launchpad).

I think it's partly because the GitHub guys have a clear vision of what has to be done to make collaboration easier, and they do it. More importantly, though, git branches are the way to go, as are other git features that were "right" from the beginning (branches, interactive rebase, and so on), while other VCSes like bzr and Mercurial either don't have them or are only now getting them, and it's hard to get used to the workarounds (for example, Mercurial uses "mercurial queues", which I think is totally the wrong approach).

Anyway, this is just my own personal opinion. I'll be happy to discuss it in the comments, if you disagree.

Friday, August 13, 2010

Week Aug 9 - 13

On Monday I learned fwrap (an excellent piece of software, btw). There were a few minor technical issues, which I discussed with Kurt on the fwrap mailing list, and I also sent him a simple patch so that it works well with my Fortran code (functions returning the value itself, instead of a tuple of length 1).

Then I took my old Fortran-based shooting method solvers that I wrote a couple of years ago, wrapped them using fwrap, and ran a couple of simulations against my FE solver.

On Tuesday we had lunch with all the advisors and students and the LLNL director, and gave a little presentation about what we did.

On Wednesday I ran shooting method calculations for 50 states of silver, both for the self-consistent DFT potential and the Z/r potential. I then also ran the FE solver for the same DFT potential and compared the results. There were lots of small technical issues; for example, I had to use cubic splines to interpolate the potential, play with the mesh for the shooting method, and so on.
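For illustration, the spline interpolation step might look like this with scipy (the mesh and potential values below are made up; the real ones come from the DFT code):

import numpy as np
from scipy.interpolate import CubicSpline

# made-up radial mesh and tabulated potential values (in practice these
# come from the DFT code on its own mesh)
r = np.logspace(-4, 2, 200)
V = -47.0 / r * np.exp(-r)      # some screened Coulomb-like values

V_spline = CubicSpline(r, V)

# the shooting method can now evaluate the potential anywhere:
print(V_spline(0.5), V_spline(10.0))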

However, after making sure the mesh is fine for both methods, the shooting method and FE agree to every single printed digit, for all potentials that I tried. That's very cool.
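To show the principle (this is not the Fortran solver from the text, just a minimal sketch), here is the shooting method for the hydrogen ground state in atomic units; bisection on E drives the boundary value of u at r_max to zero:

import numpy as np
from scipy.integrate import solve_ivp

def shoot(E, r_max=30.0, r0=1e-6):
    # radial equation for l = 0, V = -1/r:  u'' = 2*(V(r) - E)*u
    def rhs(r, y):
        u, du = y
        return [du, 2.0 * (-1.0 / r - E) * u]
    # near the origin u ~ r, so start with u(r0) = r0, u'(r0) = 1
    sol = solve_ivp(rhs, (r0, r_max), [r0, 1.0], rtol=1e-10, atol=1e-12)
    return sol.y[0, -1]

# bisect on the energy; the exact ground state is E_1s = -0.5
a, b = -0.6, -0.4
fa = shoot(a)
for _ in range(50):
    m = 0.5 * (a + b)
    fm = shoot(m)
    if fa * fm <= 0:
        b = m
    else:
        a, fa = m, fm
print(0.5 * (a + b))   # -> close to -0.5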

In the process, I also wrote a patch for SymPy to calculate exact energies for the hydrogen atom, both from the Schroedinger and the Dirac equation. I still need to polish it a bit.
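The formulas involved are standard; here is a hedged sketch of both energy levels in sympy (my own illustration, not the patch itself):

from sympy import S, sqrt, Rational

def E_schroedinger(n, Z=1):
    # nonrelativistic energy in Hartree units: E_n = -Z**2/(2*n**2)
    return -Rational(Z**2, 2 * n**2)

def E_dirac(n, j, Z=1, c=Rational(137035999, 1000000)):
    # standard Dirac fine-structure formula, rest energy subtracted
    k = j + Rational(1, 2)
    beta = (S(Z) / c)**2
    return c**2 / sqrt(1 + beta / (n - k + sqrt(k**2 - beta))**2) - c**2

print(E_schroedinger(1))                     # -1/2
print(E_dirac(1, Rational(1, 2)).evalf())    # slightly below -1/2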

On Thursday I ran a couple more calculations, set up a poster and had a two-hour poster session. I think around 7 people (not counting other students and people from our group) stopped by and talked with me about it, so I was very happy. Being able to solve the radial Schroedinger and especially Dirac equations robustly is something that several people in the lab really need.

Today I finally talked a little bit about Green's functions in QM and QFT with a postdoc in the Quantum Simulations group, which I had always wanted to do but never had time for; then I packed my things and went back to Reno.

My plan for the next week(s) is to wrap up what I did and turn it into articles. I already have enough material for several articles, so it has to be done. In parallel, I'd like to finish the FE Dirac solver; the coding is done, but now I need to play with adaptivity and also investigate whether we are getting the spurious states that other people get when using B-splines.

Friday, August 6, 2010

Week Aug 2 - 6

This week I essentially only worked on my LLNL poster, which I finally finished about two hours ago. I have created a web page for it:

http://certik.github.com/ccms-10-poster/

where you can download the pdf and the sources; I also put some relevant info and links there.


It turned out to be a lot more work than I expected (well, as usual), but John and I were very thorough, and in the process I discovered several bugs in my program, so I am glad we did it. I used to generate all the plots by hand, manually adjusting all the parameters in the Python script (atomic number, mesh parameters, element orders, adaptivity parameters, error tolerance and so on). Essentially I had to remember all these parameters for each of the plots (about 10 of them). Then I settled on having a Python dictionary that holds all the parameters, which I just pass to a radial_schroedinger_equation_adapt(params, error_tol=1e-8) function.

Here are examples of the parameters:

params_hydrogen_p_L = dict(l=0, Z=1, a=0, b=100, el_num=4, el_order=1,
    eig_num=3, mesh_uniform=False, mesh_par1=20, adapt_type="p",
    eqn_type="R")
params_hydrogen_p_U = dict(l=0, Z=1, a=0, b=100, el_num=4, el_order=2,
    eig_num=3, mesh_uniform=True, adapt_type="p", eqn_type="R")
params_hydrogen_hp_U = dict(l=0, Z=1, a=0, b=100, el_num=4, el_order=2,
    eig_num=3, mesh_uniform=True, adapt_type="hp", eqn_type="R")
params_hydrogen_h_U = dict(l=0, Z=1, a=0, b=100, el_num=4, el_order=6,
    eig_num=3, mesh_uniform=True, adapt_type="romanowski",
    eqn_type="rR")

params_silver_p_L = dict(l=0, Z=47, a=0, b=150, el_num=4, el_order=13,
    eig_num=50, mesh_uniform=False, mesh_par1=35, adapt_type="p",
    eqn_type="R")
params_silver_hp_L = dict(l=0, Z=47, a=0, b=150, el_num=4, el_order=13,
    eig_num=50, mesh_uniform=False, mesh_par1=35, adapt_type="hp",
    eqn_type="R")

I mean, this is kind of obvious if you think about it, but for some reason I didn't do it at the beginning, because I thought I'd just run everything once and be done with it. But I ended up running it about 20 times: regenerating the plots, then creating a table about meshes, then redoing the table after changing the error tolerance, and so on.
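With that, regenerating everything becomes a single loop over the named parameter sets; a sketch (radial_schroedinger_equation_adapt and the dicts are from the text above, the plotting helper is hypothetical):

# hypothetical driver; save_convergence_plot is a made-up helper
all_runs = {
    "hydrogen_p_L": params_hydrogen_p_L,
    "hydrogen_p_U": params_hydrogen_p_U,
    "silver_p_L": params_silver_p_L,
    "silver_hp_L": params_silver_hp_L,
}

for name, params in all_runs.items():
    results = radial_schroedinger_equation_adapt(params, error_tol=1e-8)
    save_convergence_plot(results, "conv_%s.pdf" % name)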

Besides that, I also got permission to release my code, so I'll go over it in the coming days and generate nice patches against Hermes1D.

Also, in the process of creating the poster, I played a lot with p-FEM, uniform-p-FEM, hp-FEM and h-FEM, and I will keep playing with them. It's clear to me now that our current Hermes1D is not optimal; in particular, the convergence of hp-FEM and p-FEM (as implemented right now) depends greatly on the initial mesh.

Nevertheless, even with the above limitations, hp-FEM seems to be really good if you don't know anything about the problem/mesh a priori. One should not draw any deep conclusions in 1D (it might be a bit different in 2D and 3D, and I only did a couple of test problems), but from my experience so far, hp-FEM is a really good choice if you just want to solve the problem and get decent convergence (way better than h-FEM, and in general about the same as uniform-p-FEM with an optimized mesh).

Another conclusion is that uniform-p-FEM (also called the spectral element method) is very fast if you optimize the mesh for the problem. All you have to do is increase the polynomial order, and the convergence graph goes straight down; it's very hard to beat. Also, and this is what I'd like to write about in the coming days, the algorithm for optimizing the mesh is really simple: solve the problem with a high "p", then play with the mesh parameters (for a logarithmic mesh there are only two: the number of elements, and the ratio of the first vs. the last element) until the eigenvalues you are interested in converge to the given accuracy, and optimize with respect to DOFs. The algorithm can also "look" at the convergence graphs and make sure they are steep enough. For atomic problems, my experience shows that the logarithmic mesh is good enough (as long as you optimize it). The advantage is that you do this once, and then (for similar enough potentials in the Schroedinger equation) you just increase "p"; it's very robust and fast (no need for a reference mesh, trial refinements and so on).
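For concreteness, here is a minimal sketch of building such a logarithmic (geometrically graded) mesh from its two parameters; the exact parametrization below is my guess at the scheme described:

import numpy as np

def log_mesh(a, b, n_elem, ratio):
    # element sizes grow geometrically; 'ratio' is the size of the last
    # element divided by the size of the first one
    q = ratio ** (1.0 / (n_elem - 1))    # growth factor per element
    sizes = q ** np.arange(n_elem)       # relative element sizes
    sizes *= (b - a) / sizes.sum()       # scale to cover [a, b]
    return a + np.concatenate(([0.0], np.cumsum(sizes)))

print(log_mesh(0.0, 150.0, 4, 35.0))     # 5 points, graded toward the origin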

When I get back to Reno, Pavel and I will do more research on hp-FEM, and I don't think this is the last word. We need to review how we choose the candidates for eigenvectors, especially "p" vs. "hp", and make it more robust. We'll see.

Saturday, July 31, 2010

Week July 26 - 30

This week I have been wrapping up my work at LLNL and trying to generate some comparisons between different approaches to adapting to multiple eigenvectors at once.

It turns out that most of the issues are in how we create the new mesh for the next adaptivity iteration, in particular how we choose which candidates to refine. I have tried converging to the lowest eigenvector (as well as to any other single eigenvector), to the sum of the eigenvectors, and to each eigenvector individually while taking the union of the meshes, and so on. I have also implemented uniform p-FEM, and tried p-FEM and hp-FEM using the approaches mentioned above.

It seems to be crucial to have a good initial mesh, at least for p-FEM. If I use a good mesh and p-adaptivity, I am able to get the best results so far. In principle hp adaptivity should be at least as good, but our current approach doesn't show it yet. Hopefully we'll manage to make it work.

Besides that, I have also implemented H1 norms for the Function class (based on Fekete points) by calculating the coefficients with respect to an FE basis, plus some other little things.
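The norm itself is simple once the function and its derivative can be evaluated; a minimal quadrature-based sketch (not the actual Function class code):

import numpy as np

def h1_norm(f, df, a, b, n_quad=64):
    # ||f||_H1^2 = int_a^b (f^2 + f'^2) dx, via Gauss-Legendre quadrature
    x, w = np.polynomial.legendre.leggauss(n_quad)
    xm = 0.5 * (b - a) * x + 0.5 * (a + b)   # map nodes to [a, b]
    wm = 0.5 * (b - a) * w
    return np.sqrt(np.sum(wm * (f(xm)**2 + df(xm)**2)))

# example: ||sin||_H1 on [0, pi] = sqrt(int sin^2 + cos^2) = sqrt(pi)
print(h1_norm(np.sin, np.cos, 0.0, np.pi))   # -> 1.7724...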

Friday, July 23, 2010

Week July 19 - 23

This week I have learned how projections work in detail and wrote it up here:

http://theoretical-physics.net/dev/src/math/la.html

including all the proofs (that the orthogonal projection finds the closest vector, and so on). At the end of the notes I calculated some examples in 1D, so one can see that the result indeed doesn't depend on the basis, and that the basis doesn't even have to be orthogonal. One then has to use this approach to calculate the coefficients:

http://theoretical-physics.net/dev/src/math/la.html#nonorthogonal-basis
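To make that concrete, here is a minimal numpy/scipy sketch of projecting onto a non-orthogonal basis by solving the Gram (mass) matrix system; the basis and function are made-up toy choices:

import numpy as np
from scipy.integrate import quad

# project f(x) = x**2 onto the non-orthogonal basis {1, x} on [0, 1]:
# solve  sum_j <b_i, b_j> c_j = <b_i, f>  for the coefficients c_j
basis = [lambda x: 1.0, lambda x: x]
f = lambda x: x**2

n = len(basis)
G = np.array([[quad(lambda x: basis[i](x) * basis[j](x), 0, 1)[0]
               for j in range(n)] for i in range(n)])
rhs = np.array([quad(lambda x: basis[i](x) * f(x), 0, 1)[0] for i in range(n)])
c = np.linalg.solve(G, rhs)
print(c)   # -> [-1/6, 1], i.e. the L2-closest line to x**2 is x - 1/6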


I have then implemented it in h1d. Due to the lack of time, I am now developing everything in my private branch. I'll obtain permission in about 2 or 3 weeks, and then I'll push it into master.

Besides that, I implemented Chebyshev points for orders greater than 48, for which I don't have exact Fekete points anymore (it'd just be a matter of running my sympy script longer, but I was hitting accuracy issues when numerically solving those large polynomials -- one needs to obtain all the roots -- so Chebyshev points are ok for now). So I can now represent arbitrary polynomials in 1D.
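For reference, the Chebyshev (Gauss-Lobatto) points have an explicit formula, so no polynomial root-finding is needed, which is exactly why they are a convenient stand-in for Fekete points at high orders; a one-liner sketch:

import numpy as np

def chebyshev_points(n):
    # the n+1 Chebyshev-Gauss-Lobatto points on [-1, 1]
    return -np.cos(np.pi * np.arange(n + 1) / n)

print(chebyshev_points(4))   # [-1, -0.707..., 0, 0.707..., 1]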

I implemented powering of a discrete function: it automatically determines which polynomial order it has to use and creates a new discrete function (the power) on the new mesh. I wrote lots of tests for that, and I hit an interesting bug: my naive comparison code

assert abs(x-y) < eps

was not good enough anymore for larger numbers, so I had to read some documentation and implement the following function:

@vectorize(0, 1)
def feq(a, b, max_relative_error=1e-12, max_absolute_error=1e-12):
    a = float(a)
    b = float(b)
    # if the numbers are close enough (absolutely), then they are equal
    if abs(a-b) < max_absolute_error:
        return True
    # if not, they can still be equal if their relative error is small
    if abs(b) > abs(a):
        relative_error = abs((a-b)/b)
    else:
        relative_error = abs((a-b)/a)
    return relative_error <= max_relative_error

Then I implemented the global H1 and L2 projections. So far the projected function is hardwired; I still need to allow the user to specify any discrete function to be projected, precalculate it, and so on.

I wrote a bunch of tests for the projections and powers, and writing more tests always uncovered more bugs, so the progress is slow, but at least I can trust the code that is tested.

I also helped Pavel fix a couple of segfaults, as well as some other things.

Monday, July 19, 2010

Theoretical Physics Reference Book

Today I fulfilled an old dream of mine --- I just created my first book! Here is how it looks:

More images here.

Here is the source code of the book: http://github.com/certik/theoretical-physics. The repository contains a 'master' branch with the code and a 'gh-pages' branch with the generated HTML pages, which are hosted at GitHub at the URL theoretical-physics.net.

Then I published the book at Lulu: http://www.lulu.com/product/hardcover/theoretical-physics-reference/11612144. I wanted a hardcover book, so I set up a project at Lulu, used some Lulu templates for the cover, and that was it. Lulu's price for the book is $19.20 (166 black & white pages, hardcover); I can then set my own price, and the rest of the money presumably goes to me. I set the price to $20, because Lulu has free shipping for orders of $20 or more. You can also download the pdf (for free) at the above link (or just use my git repository). So far this didn't cost me anything.

I then ordered the book myself (just like anybody else would, at the above address) and it arrived today. It's a regular hardcover book. Beautiful --- you can browse the pictures above. It smells delicious (that you'll have to take my word for). And all it cost me was $19.20.

As for the contents, you can browse it online at theoretical-physics.net; essentially it's most of the physics notes that I have collected over the years. I'd like to treat books like software --- release early, release often. This is my first release, and I would call it a beta. The main purpose was to see whether everything goes through, how long it takes (the date inside the book is July 4, 2010; I created and ordered it on July 5 and got the physical book on July 19) and what the quality is (excellent). I also wanted to see how the printed pages correspond to the pdf (you can judge for yourself from the photos; click on the picasa link above).

Now I need to improve the first and last pages a bit, improve the index, write a foreword, and so on. I also need to think about how to better organize the contents and generally improve it. I also need to figure out a versioning scheme; so far this is version 0.1. I think I'll do edition 1, edition 2, edition 3, and so on, and whenever I feel I have added enough new content, I'll just publish it as a new edition. So if you want to buy it, I suggest waiting for my 1.0 version, which will have the mentioned improvements.

It'd also be cool to have all the editions online somehow and create nice web pages for it (currently theoretical-physics.net points directly to the book html itself).

So far the book is just text. I still need to figure out how to handle pictures, and also whether or not to use program examples (in Python, using sympy, scipy, etc.). So far I am inclined not to put any program code in, as then I don't need to maintain it.

Overall I am very pleased with the quality; up to the minor issues mentioned above, everything else ended up just fine. I think we have come a long way since the discovery of the printing press. Anybody can now create a book for free, and if you want to hold a hardcopy in your hands, it costs around $20. You don't need to order a certain number of books, or partner with a publisher, etc. I think that's just awesome.

Friday, July 16, 2010

Week July 12 -- July 16

This week I was implementing hp-adaptivity based on H1 projections, first for the ground state and later for the other states.

It took me the whole week to debug things and I am still not done. There were issues with normalizing vectors and other problems. I am now able to converge any individual vector, and that seems to work fine, but somehow I am still not able to converge multiple vectors at once.

I also spent some time speeding up my implementation of Fekete points, and I implemented fast evaluation based on Lagrange interpolation polynomials. For example, I made the l2_norm() method about 10x faster. I think it's now about as fast as if it were written directly in C++; the main loops are now optimized C without any Python C/API calls.
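The idea of the fast evaluation, sketched in plain numpy below; this is an illustration of barycentric Lagrange interpolation, not the actual hermes1d code. The weights are computed once per node set, and every evaluation after that is a short vectorized expression:

import numpy as np

def barycentric_weights(x):
    # w_j = 1 / prod_{k != j} (x_j - x_k)
    n = len(x)
    w = np.ones(n)
    for j in range(n):
        w[j] = 1.0 / np.prod(x[j] - np.delete(x, j))
    return w

def lagrange_eval(x, y, w, t):
    # barycentric formula; assumes the points t avoid the nodes x exactly
    d = t[:, None] - x[None, :]
    tmp = w / d
    return (tmp * y).sum(axis=1) / tmp.sum(axis=1)

x = -np.cos(np.pi * np.arange(9) / 8)   # 9 Chebyshev nodes on [-1, 1]
y = np.sin(np.pi * x)                   # values to interpolate
w = barycentric_weights(x)
t = np.linspace(-0.9, 0.9, 4)           # evaluation points (off the nodes)
print(lagrange_eval(x, y, w, t) - np.sin(np.pi * t))   # small errors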

My plan now is to implement taking squares of the solutions (exactly), and so on, and projecting them back onto the original mesh. I'll try converging to that, and at the same time I'll try to converge to multiple eigenvalues. Hermes1d needs some improvements, which take me quite some time to implement.

Saturday, July 10, 2010

Week July 5 -- July 9

This week I learned how to use the new C++ support in Cython and posted a demo project at github (http://github.com/certik/cpp_test). Then I refactored the Python wrappers in hermes1d, and I think things are now way cleaner.

Then I finished my reimplementation of Romanowski's algorithm for adapting eigenvalues for the radial Schroedinger equation; I extracted his values from the graphs, calculated the same thing myself, and it agrees perfectly.

I am now polishing the H1 adaptivity in h1d. I essentially reimplemented it myself, so that I can consider more candidates (all possible combinations of "p" and "h"; soon I'll also consider bisecting, trisecting, etc. of the interval).

I've implemented hydrogen wavefunctions in sympy and use them to project onto a very fine mesh (order 12, lots of elements), convert to Fekete points and use that for the adaptivity. It's in pure Python, as I need to develop very fast to get some results and see which approach is the best. Everything is fast except the selection of candidates, which I'll now rewrite in Cython, trying to use hermes1d whenever possible.
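A quick check of the kind described, using sympy's hydrogen module (today this lives in sympy.physics.hydrogen; adapt if your version differs):

from sympy import integrate, simplify, oo
from sympy.abc import r
from sympy.physics.hydrogen import R_nl

# analytic radial wavefunction R_21 for hydrogen (Z=1)
print(R_nl(2, 1, r, 1))

# sanity check: the radial functions are normalized,
# int_0^oo R_nl**2 * r**2 dr == 1
print(simplify(integrate(R_nl(2, 1, r, 1)**2 * r**2, (r, 0, oo))))   # -> 1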

Next week I'll try to merge my adaptivity with the hermes1d adaptivity and make it fast, and see how it converges.

I also spent about 10 hours improving Pavel's hermes2d branch, as well as implementing the Vector class in hermes_common.

Saturday, July 3, 2010

Week June 28 -- July 2

On Saturday I flew to Prague, then went to Pilsen for the ESCO2010 conference. I had two presentations there and I met lots of awesome people.

I discussed with Dmitri how to solve the Euler equations using FEM, and I think I now know how to implement it in hermes2d. I will try to give it a shot at the end of the summer, as this is something that has been bothering me a lot.

I have made progress in my Fekete points code, and I also wrote analytic solutions for the hydrogen atom in sympy. I'll wrap it up over the weekend and push it to h1d.

I have investigated how to use github for collaboration using git; here is how it looks for sympy: http://github.com/sympy.

I have figured out how to host any domain at github, in particular this:

http://theoretical-physics.net/

now runs at github. I think this is really awesome.

Besides that, I spent some time discussing with Pavel and others how to do hp-adaptivity in h1d, and I helped a couple of people with hermes2d, and so on.

My plan for the next week is to implement hp-adaptivity in h1d for eigenproblems, as well as to finish my Fekete machinery, make it fast using Cython, and use it in conjunction with the h1d adaptivity.

Friday, June 25, 2010

Week June 20 -- June 25

This week I implemented a Dirac solver in 1D. I have it in my private branch, and at the end of the summer I'll get permission to open source it. Originally it didn't work at all, but then I found a bug and it seems to be working now.

Before that, I implemented an example for the y'' + k^2*y = 0 equation in hermes1d by writing it as two coupled first-order equations and solving the eigenproblem for "k".
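As a sanity check of what the spectrum should look like (this is a crude finite-difference version, not the hermes1d FEM formulation): on [0, pi] with y(0) = y(pi) = 0, the exact eigenvalues are k = 1, 2, 3, ...

import numpy as np

n = 200
h = np.pi / (n + 1)
main = 2.0 * np.ones(n)
off = -np.ones(n - 1)
# second-difference discretization of -d^2/dx^2 with Dirichlet BCs
A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
k = np.sqrt(np.sort(np.linalg.eigvalsh(A)))
print(k[:4])   # -> approximately [1, 2, 3, 4]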

I also started to implement a Function class that represents a function in 1D using Fekete points and I plan to implement orthogonal projections and adaptive algorithms for refining the mesh, so that I can play with adaptivity in 1D.

Besides that, I studied a couple of articles about using B-splines to solve the radial Dirac equation, and also about using the variational formulation for FEM.

I helped fix hermes_common and did lots of administration at the lab (10 online courses and other things). We also got an awesome NIF tour.

Hermes1D patches:

Ondrej Certik (26):
Remove the autogenerated Makefile
hermes_common updated
Dirac solver added
Plot only 10 eigenfunctions
Fix the number of equations bug
Generate the Gauss Lobatto points
Add the autogenerated file
Don't use strings as dict keys
Fekete points manipulation implemented
fekete: solve the coefficients, tests pass
More tests added
system_sin_eigen example added
Add a comment about the Lagrange interpolation
Function.project_onto() implemented
Function.plot() implemented
Comparisons implemented
Fix precision problems
Improve tests
Start implementing the adaptivity
fekete: add a debug code
Removing the Dirac solver for now
Use smaller basis, while still getting reasonable results
Add harmonic oscillator option
hermes_common updated
Use jsplot if it is available
Better demo


hermes_common patches:

Ondrej Certik (6):
Revert "fixed pxd"
Update the generated .cpp file
Increase the precision in tests to 10^-10
Add tests for solve_linear_system_dense_lu()
Enable scipy tests
Test for solve_linear_system_cg() added

Friday, June 18, 2010

June 12 -- June 18

This week I implemented JSPlot, a library which allows you to use the matplotlib API but plot into your web browser:
http://github.com/certik/jsplot
(Scroll down a little bit to see the screenshots and examples.)

This was the last missing piece needed to be able to develop with FEMhub (I don't have root access to my computer, which runs RHEL5). Now I have the full Python stack working (scipy, all the solvers, mayavi, ...), plus nice plotting in the browser.

I spent several days investigating how to write weak forms for the radial Dirac equation, as well as how to derive its action. It turns out that one takes the Lagrangian from quantum electrodynamics (QED), converts it to spherical coordinates (which, you can imagine, is a hell of a job) and then integrates over the angles; what remains is the Lagrangian (and thus the action) for the radial Dirac equation. People use this in the literature a lot, but I just could not figure out where they take the functional from. So that part is now clear, and then I went on to coding.

Here are the commits for JSPlot:

Ondrej Certik (38):
Initial commit
Make index.html work
README added
Hook up raphael
Serve raphael-min.js locally
Plot simpler stuff
Works fine
plots work
Download all the raphael stuff locally
raphael.html added
Use {% url %}
Use simpler urls
Typo fix
Flotr demo added
Use flotr in the index page
Don't recalculate
Add data by hand
Generate the data in Python
Add another flotr demo
Example mpl plot added
jsplot example added
runserver.py added
Hook it up with jsplot
Add a screenshot
Better README added
README improved
Show some testing data
Use better testing data
License added
setup.py added
Use testing data if data == []
Just return from the function on CTRL-C
spkg-install added
prepend the .. path, instead of append
Use the proper local files in ./manage.py
Turn off points
Fix a bug
Add grid and legend

I fixed some scipy warnings in hermes2d:

ondrej@crow:~/repos/hermes2d(master)$ git weekreport
Ondrej Certik (4):
hermes_common updated
Fix the other SciPy warning in Python wrappers
Update the generated .cpp file
update hermes_common

I have fixed Python wrappers in hermes1d, made the Schroedinger example work again:

ondrej@crow:~/repos/hermes1d(master)$ git weekreport
Ondrej Certik (19):
Use jsplot if available
Build everything for now
Make the Python wrappers work again
Fix the lhs/rhs/residual to build properly
Polish the forms a little bit
Enable assembling
Implement c2py_Mesh
Use c2py_Mesh in the schroedinger example
Update insert_matrix in Schroedinger
Fixed the rest in Schroedinger
copy_mesh_to_vector() and copy_vector_to_mesh() added to the Mesh class
Use JSPlot
Linearizer.get_xy() fixed
Implement Mesh.copy_vector_to_mesh() and use it from get_xy()
Use CooMatrix to assemble
Implement pysparse solver
Polish the numpy solver
Polish the printing a little bit
Rename assemble_jacobian() to assemble_matrix()

I fixed some build issues with FEMhub:

ondrej@crow:~/repos/femhub(master)$ git weekreport
Ondrej Certik (3):
Add gnutls dependency for Python
opencdk added
Fix libgpg_error to build in parallel

and I have some more patches to the build system in my "pu" branch at github:

* 44398d8 (github/pu) Use absolute paths in "femhub -i"
* 2b17bf0 Fix "femhub --devel-install" to normalize the paths
* 3b41261 Make "femhub --shell" stay in the current directory

I still need to test that femhub builds fine with those (it always takes quite some testing to make sure it builds properly).

Friday, June 11, 2010

Week June 5 -- June 11

On Sunday I moved to Livermore.

On Monday I rebased my patches for Euler equations, DG and FVM for hermes2d and sent them for review. Unfortunately it took till Friday until they got reviewed, but they are now in.

I couldn't sleep, so at around midnight I turned on my computer and rewrote the build system for femhub; it took me about 1.5 hours. Then I could finally sleep well.

On Tuesday I had my new hire orientation. During it I wrote a couple more patches for femhub.

On Wednesday I got my badge and spent the rest of the day on paperwork. They didn't manage to get my computer set up, so I studied a bit of quantum field theory from a book that I found in our room and then spent the rest of the night figuring out the weak formulation for the radial Dirac equation. So far no luck, but at least I wrote it in many different forms in my handwritten notes. Sometimes it's not bad to be cut off from the internet and computers.

On Thursday I got my computer account set up and spent most of the day on paperwork.

On Friday I finally did some real work. Then we went to a pub, got home, and drank the rest of my Budvars (Czechvar) and Plzeňs (Pilsner Urquell).

The FEMhub build system is really cool. It's written in Python and I totally got rid of the old Sage build system. Some features of the new one:

* dependencies
* automatically uses all your processors (unless told otherwise)
* simple Python script (414 lines) that handles everything, plus sage-env, sage-spkg and sage-make_relative scripts (the rest I simply deleted)
* allows you to install just some packages, for example "./femhub -i python" just installs python and its dependencies
* allows you to unpack any package into the devel/ directory and then build it

I will now use it to develop hermes1d + Schroedinger and Dirac solvers over the summer, so it will likely get improved in the coming weeks. (I have to use it, because my computer is some old RHEL5 and I don't have root access. With FEMhub, I have all the Python libraries, plus cmake and similar stuff, right there. It's really awesome.)

In total, I wrote 52 patches for FEMhub, 3 for hermes_common, 9 for hermes2d, 8 for hermes1d and 1 for sympy:

FEMHUB:
Ondrej Certik (52):
scipy and python upgraded
Remove the old buildsystem
New Python based buildsystem implemented
Install hermes2d by default
Compile in parallel
Let CmdException propagate
Handle dependencies
Fix the PYTHONPATH issue
Add a check that IPython is installed
Add Cython dependency to hermes2d
Add matplotlib to hermes2d deps
Keep track of installed packages
Add the -j option
Add the rest of the packages
Disable sphinx for now
Add a todo item
Disable mayavi for now
Add pysparse to fipy deps
Change the banner
Fix a typo
Remove old .hg files
setuptools added into fipy's deps
lab() implemented
Don't let other import errors to pass silently
Build bzip2 before Python
Simplify the makefile
Add the jinja2 package
Remove one forgotten " from a message
--shell and -s/--script options implemented
Fix cmake so that it builds if old cmake is present
expandvars() added to cmd()
Remove the rest of old buildsystem files
Move spd imports to femhub imports
Move download_packages into the femhub buildsystem
Remove a forgotten file from the old buildsystem
Create the standard directory if it doesn't exists
Polish the banner appearance
femhub-run: little refactoring
Use proper ipythonrc and matplotlibrc
Wrap lab(), add debugging statements
Don't import lab() in the ipythonrc
Fix the missing "-p" when creating the standard directory
Fix the problem with ambiguous names
cpu_count refactored
hermes1d added
-f/--force option added
--unpack added
--pack option implemented
--devel-install option implemented
Use MAKEFLAGS instead of MAKE env variable
Improve the FEMhub shell prompt
Print info when unpacking


hermes2d:
Ondrej Certik (9):
Removing the old-code directory
hermes_common update
Reformat the documentation in the Geom class
Python wrappers updated
Regenerate the _hermes2d.cpp
plot.py: ScalarView updated
Element orientation exported
Sanity checks and docs
Add all shapefunctions to the space L2

hermes1d:
Ondrej Certik (8):
spkg-install added
"make install" implemented
Make it build on the Mac
Allow to turn off examples
Use std::max(), fixed several warnings
hermes_common updated
Fix h1_polys.py to work with FEMhub's SymPy
Fix the new line warning

hermes_common:
Ondrej Certik (3):
Add numpy2c_double_inplace() to _hermes_common.pxd
Update the generated cpp file
Fix a small double -> int bug/warning

sympy:
Ondrej Certik (1):
pyglet: fix string exceptions

Friday, June 4, 2010

Week May 30 -- June 4

Over the weekend and on Monday I went to Livermore, CA to rent a room. On Monday and Tuesday my uncle visited me from Prague, so I showed him Virginia City and Lake Tahoe; that was lots of fun.

Then I tried to debug hermes1d and some matrix issues in there for a couple of hours, but I have failed so far.

I helped Sameer fix h5py to compile with FEMhub.

Yesterday I spent 12 hours setting up gitosis on our server and finally got it done; here is the web interface:

http://git.hpfem.org/

I had to patch the gitosis repository itself; here are my changes: http://github.com/certik/gitosis (I've also sent them to Tommi Virtanen, the gitosis author, but I haven't heard back yet.)

In case you want to install it as well, I've posted instructions here.

I've figured out how to use msysgit on Windows and use ssh keys to log in to a linux box. It turns out it's not so trivial; see this issue.


I submitted the following patches to our projects at hpfem.org:

ondrej@raven:~/repos/hermes1d(master)$ git weekreport
Ondrej Certik (3):
Reverting the change dfa8580
hermes_common updated
hermes_common updated

ondrej@raven:~/repos/hermes2d(master)$ git weekreport
Ondrej Certik (5):
Fix "make install" to work again
hermes_common updated
remove hermes_common/doc/Makefile from .gitignore
Remove _XOPEN_SOURCE and _POSIX_C_SOURCE hacks
Fix the compilation warning

Btw, my "git weekreport" alias is defined as:

[alias]
    weekreport = shortlog --since=1.weeks --author=ondrej


Next week I am moving to Livermore, and my summer internship will get started.

Wednesday, May 26, 2010

SymPy GSoC has started

This summer SymPy got 5 students, you can see their proposals and blogs here:

http://code.google.com/p/sympy/wiki/GSoC2010

All students are required to blog once a week (the deadline is Friday night PST), and I'll do so too.

This summer I will be at Lawrence Livermore National Laboratory, applying hp-FEM to the radial Schroedinger and Dirac equations, as well as to 1D density functional theory. I need to check with the management there how much I am allowed to blog about it. If I am, I'll keep blogging once a week about it too; if not, then only about sympy. I am super excited about the job, as it is in the electronic structure field, which is what I have always wanted to do.