MemoryError Backtest


I'm running a backtest over up to a year of data at 1-minute frequency, calculating dual moving averages. However, I get a MemoryError. Is there a way to fix this?
I suspect it has to do with plotting the data?
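For context, a dual moving average strategy at minute frequency can be sketched with plain pandas like this (a minimal illustration, not the poster's actual Catalyst algorithm; the window lengths and the synthetic price series are made up):

```python
import numpy as np
import pandas as pd

# Synthetic minute-frequency close prices standing in for exchange data;
# in Catalyst these would come from data.history(...) inside handle_data.
idx = pd.date_range("2017-02-22", periods=1440, freq="min", tz="UTC")
close = pd.Series(
    np.random.default_rng(0).lognormal(size=1440).cumsum() + 100, index=idx
)

short_ma = close.rolling(window=50).mean()   # fast moving average
long_ma = close.rolling(window=200).mean()   # slow moving average

# Crossover signal: +1 when the fast MA is above the slow MA, else -1.
signal = np.where(short_ma.to_numpy() > long_ma.to_numpy(), 1, -1)
```

The moving averages themselves are cheap; as the traceback below shows, the memory pressure comes from the per-minute performance stats Catalyst accumulates, not from this computation.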

Here is the error log:

[2018-04-04 07:56:21.284339] INFO: run_algo: Catalyst version 0.5.6
[2018-04-04 07:56:24.284752] INFO: run_algo: running algo in backtest mode
[2018-04-04 07:56:25.212409] INFO: exchange_algorithm: initialized trading algorithm in backtest mode
[2018-04-04 09:02:44.176321] INFO: Performance: Simulated 214 trading days out of 214.
[2018-04-04 09:02:44.176820] INFO: Performance: first open: 2017-02-22 00:00:00+00:00
[2018-04-04 09:02:44.177323] INFO: Performance: last close: 2017-09-23 23:59:00+00:00
Traceback (most recent call last):
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\Scripts\", line 11, in <module>
    load_entry_point('enigma-catalyst==0.5.6', 'console_scripts', 'catalyst')()
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 722, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 697, in main
    rv = self.invoke(ctx)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 1066, in invoke
    return process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 535, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\__main__.py", line 104, in _
    return f(*args, **kwargs)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\click\", line 17, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\", line 291, in run
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\utils\", line 342, in _run
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\exchange\", line 373, in run
    data, overwrite_sim_params
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\exchange\", line 330, in run
    data, overwrite_sim_params
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\", line 730, in run
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\exchange\", line 367, in analyze
    stats = self._create_stats_df() if self.data_frequency == 'minute'
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\catalyst\exchange\", line 362, in _create_stats_df
    stats = pd.DataFrame(self.frame_stats)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 305, in __init__
    arrays, columns = _to_arrays(data, columns, dtype=dtype)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 5522, in _to_arrays
    coerce_float=coerce_float, dtype=dtype)
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 5647, in _list_of_dict_to_arrays
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 5666, in _convert_object_array
    arrays = [convert(arr) for arr in content]
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 5666, in <listcomp>
    arrays = [convert(arr) for arr in content]
  File "C:\Users\x\Desktop\Code\enigmaCatalyst1\lib\site-packages\pandas\core\", line 5662, in convert
    arr = lib.maybe_convert_objects(arr, try_float=coerce_float)
  File "pandas\src\inference.pyx", line 745, in pandas.lib.maybe_convert_objects (pandas\lib.c:56863)
MemoryError
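The failing frame is `pd.DataFrame(self.frame_stats)`: at minute frequency Catalyst keeps one stats dict per simulated minute, and pandas must materialize all of them at once, going through intermediate object arrays in `_to_arrays`. A back-of-the-envelope estimate shows how fast this grows (the ~40-column figure is an assumption for illustration, not Catalyst's actual stats schema):

```python
# Rough size of the DataFrame pandas tries to build from frame_stats
# for a full year of minute-frequency crypto data.
minutes_per_day = 24 * 60          # crypto markets trade around the clock
days = 365
columns = 40                       # assumed number of stats fields per minute
bytes_per_float = 8

rows = minutes_per_day * days
raw_bytes = rows * columns * bytes_per_float
print(rows)                        # 525600 rows
print(raw_bytes / 1024 ** 2)       # ~160 MiB of raw floats alone
```

The raw floats are only part of the story: building the frame from a list of dicts first creates Python objects and object-dtype arrays, which can transiently need several times that much memory on top of everything else the simulation already holds.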


Hi, I tried to reproduce your error with the same algo and dates, and it runs fine on my PC. Maybe other processes on your machine are also putting a heavy load on its memory. Try running Catalyst after closing heavy processes, and update us if you still get the same issue.
Good luck!


I have 16 GB of RAM, and I usually run out of memory if a backtest at minute frequency runs longer than two years.
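One common workaround when a single long run exhausts RAM is to split the backtest period into shorter windows and run each one separately, stitching the results afterwards. A sketch of the date splitting (the 90-day window is arbitrary, and the commented-out `run_algorithm(...)` call stands in for however you invoke Catalyst in your version):

```python
import pandas as pd

def chunk_range(start, end, freq="90D"):
    """Split [start, end] into consecutive windows of at most `freq`."""
    edges = list(pd.date_range(start, end, freq=freq))
    if edges[-1] < pd.Timestamp(end):
        edges.append(pd.Timestamp(end))
    return list(zip(edges[:-1], edges[1:]))

for chunk_start, chunk_end in chunk_range("2017-02-22", "2017-09-23"):
    # run_algorithm(start=chunk_start, end=chunk_end, ...)  # one run per
    # window keeps frame_stats small; concatenate the per-chunk stats after.
    print(chunk_start.date(), "->", chunk_end.date())
```

Note that adjacent chunks share a boundary timestamp here, and indicator warm-up (e.g. the long moving average) restarts at each chunk start, so you may want to overlap the windows slightly and drop the warm-up period from each chunk's results.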