Hi guys, memory-related question again, thanks in advance. I have the following declarations in a single C program:

-------------------------------------------------------------------
typedef struct { float i, j, k; } vector;

vector a[52][20005];
vector b[20005];
vector c[52][1010];
vector d[52];
--------------------------------------------------------------------

I get segmentation faults. Is this because of some sort of memory overloading? The seg fault might not be due to the above, but my question is: does it seem obvious to any of you that the seg fault is definitely due to the above, or NOT?? Jatsui
It may be that you're blowing the stack limit. What OS are you using? If a Unix clone, what is the output of "ulimit -a"? As a test, you could redefine a and c as arrays of pointers (vector *a[52] and vector *c[52]), then assign each pointer in a to malloc(20005*sizeof(vector)) and each pointer in c to malloc(1010*sizeof(vector)), thus allocating the memory off the heap instead of the stack. This should have no recoding implications, since a[r][i] still works as before. If that fixes the segfault then I'd say your stack is too small.
Thanks for that! Yes, it is a Unix clone. 'ulimit -a' does not work, but typing 'limit' gives me stacksize = 10240 KB. Also, a bit of an easy question: if I had an array such as arr[50][10005], how can I find its size in KB? And does this value change depending on whether the element is declared as a structure or a plain type? I presume it makes no difference... please excuse my ignorance/laziness! Just wanted to say how deeply I appreciate all the help from you guys. Life savers, you lot. THANKS!!! :pleased:
You can do sizeof(vector) to get the size of the structure. If float is 4 bytes (and there's no padding), then arr[50][10005] is going to be at least 4*3*50*10005 = 6,003,000 bytes, about 5.7 MB. Alternatively sizeof(arr) will give you the true byte count. Adding up a-d from your first post comes to about 13.4 MB (a alone is 52*20005*12 = about 12.5 MB), which is already over your 10240 KB stack if those arrays are local variables. Are a-d the only large arrays defined in the whole program or do you have more? Ignorance can be fixed (sizeof). Laziness too, if you try out code that sizeof's all the permutations you're interested in.
All well answered. BTW, I have two more arrays which look similar to this: arr[52][1010][10005]. In total, I seem to have about 1.2e10 bytes, i.e. about 12 GB. This could be it... the error, I mean. Thanks for the help. Any solutions to this? :thinking:
I think that's the total RAM available - 12 GB in my case! OK, now I have modified the code and I'm using only the following structure arrays:

a[52][10005]
b[10005]
c[52][1010]
d[52]

and one more in a function: e[52][1010][10005]. According to your calculation, I get about 6.3 GB used with all the above declared; is this right? And now I'm sure that the used space is well below what's available (6.3 GB < 12 GB), but it still gives me a seg fault. Must be something else... any ideas? Don't bother too much, I'll have a look at my code again and again, but if something strikes hard, please mention it. It's got to be something else though; it can no longer be memory overloading... Thanks again.

HOLD ON... it gives me a seg fault at the start of a function... it does not even get to the declaration bit... what would this be? The parameters passed in are fine and all the input files the program is trying to read do exist!!
You have a ten ***MEGA*** byte stack (remember stacksize = 10240 KB?). Think about what happens when you try to dump 6 ***GIGA*** bytes onto that stack. That also explains the crash at the start of the function: the stack frame for all the locals (including e[52][1010][10005]) is set up at function entry, before any of your statements run, so it blows the limit immediately. It's not total RAM that matters here, it's the stack size. If you've got enough RAM you may be able to solve this by allocating the data from the heap instead (via malloc). A 52x1010 grid of pointers will sit on the stack happily, so make e a vector* array (vector *e[52][1010]) and malloc 52x1010 chunks of vector[10005]. Otherwise your only option will be to store the data in disk files.