Consider that I'm going to have the following data in my graph:
100 million nodes, more than 1 billion connections/relationships
Node properties: around 10 per node, a mix of ints, doubles, strings, HashMaps, etc.
Relationship properties: around 10 double values and 2-3 string values (avg. 50 chars each)
Now, suppose I want to update all node and relationship property values by querying the neighbours of each node once, i.e.:
step 1: look up a node, say X, by a given ID,
step 2: get its neighbours,
step 3: update the node properties of X and all relationship properties between X and its neighbours.
Repeat these three steps once for every node. How much time would one such full update take (an approximate figure is fine for me, whether in seconds, minutes, or hours) on the following system configuration:
Two dual-core processors at 3.0 GHz each, 4 × 4 GB memory, 250 GB hard disk space.
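To make the traversal concrete, here is a minimal runnable sketch of steps 1-3 in Python, using a plain in-memory dict as a stand-in for the real graph store; the property names (`score`, `weight`) and the update rules are purely illustrative, not my actual schema:

```python
# Tiny in-memory stand-in for the graph; in reality these would be
# calls into the graph database's API.
nodes = {                      # node_id -> node properties (illustrative)
    1: {"score": 0.0},
    2: {"score": 0.0},
    3: {"score": 0.0},
}
edges = {                      # (node_id, node_id) -> relationship properties
    (1, 2): {"weight": 0.0},
    (1, 3): {"weight": 0.0},
    (2, 3): {"weight": 0.0},
}

def neighbours(x):
    """Step 2: all nodes sharing an edge with x."""
    return [b for (a, b) in edges if a == x] + [a for (a, b) in edges if b == x]

def update(x):
    """Steps 1 and 3: given a node ID, update its properties and the
    properties of every relationship between it and its neighbours."""
    nodes[x]["score"] += 1.0                      # step 3: node properties
    for n in neighbours(x):
        key = (x, n) if (x, n) in edges else (n, x)
        edges[key]["weight"] += 0.5               # step 3: relationship properties

for node_id in nodes:                             # repeat once for every node
    update(node_id)
```

Note that in this scheme each relationship is touched twice per full pass (once from each endpoint), which is part of what I would like the performance analysis to account for.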
How much storage space, approximately, will the data described above require?
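To show the level of estimate I'm after, here is my own naive back-of-envelope for the raw property payload alone, assuming 8 bytes per numeric property and 1 byte per string character; it ignores all database record overhead, pointers, and indexes:

```python
# Naive lower bound on raw property bytes only -- actual on-disk storage
# will be larger once database record overhead and indexes are added.
NODES = 100_000_000
RELS = 1_000_000_000

# Nodes: ~10 properties, assumed ~8 bytes each on average
# (strings and HashMaps would push this up).
node_bytes = NODES * 10 * 8

# Relationships: ~10 doubles (8 bytes each) + 3 strings of ~50 chars.
rel_bytes = RELS * (10 * 8 + 3 * 50)

total_gb = (node_bytes + rel_bytes) / 1e9
print(round(total_gb), "GB (property payload only)")
```

Even this payload-only figure comes out around 238 GB, close to my 250 GB disk, so I'd especially like to know how much extra overhead the database itself adds on top.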
Please help me with any approximate, sample performance analysis (time and storage); even a rough worked example would help me visualize my requirements. Thanks.