I am running three catalyst algorithms on a Google Cloud virtual machine (g1-small, 0.5 virtual CPU, 1.70 GB RAM) running Ubuntu. The CPU usage looks as follows:
As you can see, catalyst uses a negligible share of the computing power at the start. But then the CPU usage increases steadily up to 100%. Once it is around 100% it stays there for a while and sometimes crashes the virtual machine. There is nothing I can see in my algos that could justify such a steady increase in usage: apart from more intensive computations that run every 10 minutes and every 2 hours, the usage should be fairly low and constant.
I tried upgrading to a virtual machine with a full virtual CPU. Then I get essentially the same graph as above: it starts lower but eventually reaches 100% CPU usage and may then crash the virtual machine.
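In case it helps anyone reproduce or diagnose this: one thing I am checking is whether the CPU climb correlates with memory growth, since a slow leak on a 1.70 GB machine could push the box into swapping, which would look like a steady CPU increase followed by a crash. Here is a minimal stdlib-only logger I run alongside the algos (nothing catalyst-specific; `log_resources` and its parameters are just illustrative, and `ru_maxrss` is reported in kilobytes on Linux):

```python
import os
import resource
import time

def log_resources(samples=3, interval=1.0):
    """Print system load averages and this process's peak RSS
    every `interval` seconds, `samples` times."""
    for _ in range(samples):
        # 1-, 5-, and 15-minute system load averages (Unix only)
        load1, load5, load15 = os.getloadavg()
        # Peak resident set size of this process; kilobytes on Linux
        peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        print(f"load1={load1:.2f} load5={load5:.2f} peak_rss={peak_kb} KB")
        time.sleep(interval)

if __name__ == "__main__":
    log_resources()
```

If the peak RSS keeps growing while the load climbs, that would point at a memory leak rather than the scheduled computations.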
Has anybody encountered something similar?
Edit: An issue with an example algorithm has been opened on GitHub.