
I was testing the application by inserting about 1000 users, each with 1000 contacts, into a database table under Mnesia. Partway through the insertion I got the following error:

Crash dump was written to: erl_crash.dump
binary_alloc: Cannot allocate 422879872 bytes of memory (of type "binary").
Aborted

I started the Erlang emulator with erl +MBas af (MB = binary allocator, af = "a fit" strategy) and tried again, but the error was the same.

Note: I am using Erlang R12B, and the system has 8 GB of RAM, on Ubuntu 10.04. How can I solve this?

The record definitions are:

%% database
-record(database, {dbid, guid, data}).

%% changelog
-record(changelog, {dbid, timestamp, changelist, type}).

Here data is a vCard (contact info), dbid and type are "contacts", and guid is an integer automatically generated by the server.

The database table contains all the vCard data of all users. If there are 1000 users and each user has 1000 contacts, then we will have 10^6 records.

The changelog record contains the changes made to the database table at a given timestamp.

The code for creating the tables is:

mnesia:create_table(database,
                    [{type, bag},
                     {attributes, Record_of_database},
                     {record_name, database},
                     {index, [guid]},
                     {disc_copies, [node()]}])

mnesia:create_table(changelog,
                    [{type, set},
                     {attributes, Record_of_changelog},
                     {record_name, changelog},
                     {index, [timestamp]},
                     {disc_copies, [node()]}])

The insertion of records into the table is:

commit_data(DataList = [#database{dbid = DbID} | _]) ->
    io:format("commit data called~n"),
    [mnesia:dirty_write(database, {database, DbId, Guid, Key})
     || {database, DbId, Guid, X} <- DataList].


write_changelist(Username,Dbname,Timestamp,ChangeList) ->
    Type="contacts",
    mnesia:dirty_write(changelog,{changelog,DbID,Timestamp,ChangeList,Type}).
Show us your record definitions and table structures. This usually happens when you use very long lists as part of a user record and have a bad way of appending to them; you may simply be using a data structure badly. Edit your question to include all your record and table definitions so we can tell you where the problem is. Also show the code that creates and inserts the user records into Mnesia, and how you are inserting the 1000 contacts. We need to see all of these before we can continue. Thanks – Muzaaya Joshua

3 Answers


I suppose that the list DataList is huge and should not be sent all at once from a remote node; it should be sent in small pieces. The client can send the items of DataList one by one as it generates them. Also, because this problem occurs during insertion, I think we should parallelise the list comprehension: with a parallel map, the insertion for each item in the list is done in a separate process. Then, something is also wrong with the list comprehension itself: the variable Key is unbound and the variable X is unused. Otherwise, the entire methodology probably needs a change. Let's see what others think. Thanks.
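A minimal sketch of the chunked, parallel insertion I mean (the module name, the chunk size, and the assumption that each element of DataList is a complete database record are mine, not the asker's):

```erlang
-module(chunked_insert).
-export([insert_in_chunks/2]).

%% Split DataList into sublists of at most ChunkSize elements and
%% write each chunk to mnesia in its own process (a simple parallel map).
insert_in_chunks(DataList, ChunkSize) ->
    Chunks = chunk(DataList, ChunkSize),
    Parent = self(),
    Pids = [spawn(fun() ->
                          [mnesia:dirty_write(database, Rec) || Rec <- Chunk],
                          Parent ! {done, self()}
                  end)
            || Chunk <- Chunks],
    %% Wait for every worker to report completion.
    [receive {done, Pid} -> ok end || Pid <- Pids],
    ok.

%% chunk/2: break a list into sublists of at most N elements.
chunk([], _N) -> [];
chunk(List, N) when length(List) =< N -> [List];
chunk(List, N) ->
    {Chunk, Rest} = lists:split(N, List),
    [Chunk | chunk(Rest, N)].
```

Keeping each chunk small means no single process ever holds the whole million-record list, which is what blows up the allocator.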


This error normally occurs when the ERTS memory allocator responsible for binaries, binary_alloc, cannot allocate more memory for the binary heap. Check the current binary heap size with erlang:memory() or, more specifically, erlang:memory(binary). If the binary heap is huge, run a garbage collection to free all unreferenced binary objects; note that erlang:garbage_collect/0 only collects the calling process, so to release binaries held by other processes you need erlang:garbage_collect/1 on each of them.
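A small sketch of that check-and-collect step (the function name is mine; erlang:memory/1, erlang:processes/0 and erlang:garbage_collect/1 are standard BIFs):

```erlang
%% Report binary heap usage before and after forcing a garbage
%% collection on every process in the node.
check_and_collect() ->
    Before = erlang:memory(binary),   %% bytes currently used by binaries
    [erlang:garbage_collect(P) || P <- erlang:processes()],
    After = erlang:memory(binary),
    io:format("binary heap: ~p -> ~p bytes~n", [Before, After]).
```

If After is still close to Before, the binaries are genuinely referenced and the problem is data volume, not garbage.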


If you use long strings (which are just lists in Erlang) for the vCard data or elsewhere, they consume a lot of memory. If this is the case, convert them to binaries to reduce memory usage (use list_to_binary before inserting into Mnesia).

This may not be helpful, because I don't know your data structure (type, length and so on)...
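For illustration, something along these lines (the vCard content and the bound variables DbId and Guid are placeholders of mine):

```erlang
%% A string is a list: at least one machine word per character.
%% A binary stores roughly one byte per character instead.
VCardString = "BEGIN:VCARD\nFN:John Doe\nEND:VCARD\n",
VCardBin = list_to_binary(VCardString),
mnesia:dirty_write(database, {database, DbId, Guid, VCardBin}).
```

On a 64-bit VM a list string costs 16 bytes per character, so converting a million vCards to binaries can shrink the data by an order of magnitude.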