Running out of memory with MiKTeX
I am attempting to build a number of figures with TikZ. One keeps failing to build, and the log file indicates I'm running out of memory.
I've used
initexmf --edit-config-file pdflatex
initexmf --dump=pdflatex
to increase the available memory as far as I think I can. The config file now reads:
pool_size=40000000
main_memory=50000000
extra_mem_bot=40000000
And yet the log file tells me
43000001 words of memory out of 43000000
What's frustrating is that I have built this figure before on an older computer with less memory; it's cut-and-paste code, which makes me think I've screwed something up that's letting something run away.
Here's the code for the figure:
%%%%%%%%%%%%%%%%%%%%%%
\documentclass{singlecol-new}
%%%%%%%%%%%%%%%%%%%%%%
\usepackage{pgfplots}
\usepgfplotslibrary{groupplots}
\usetikzlibrary{pgfplots.groupplots}
\usetikzlibrary{plotmarks}
\usetikzlibrary{patterns}
\usetikzlibrary{calc}
\usepgfplotslibrary{external}
\usepackage[external]{tcolorbox}
\tcbset{
  external/prefix=\jobname-,
  external/safety=0mm,
  external/input source on error=false,
}
\pgfplotsset{compat = 1.12}
\tcbEXTERNALIZE
\tikzexternalize
%%%%%%%%%%%%%%%%%%%%%%
\begin{document}
%%%%%%%%%%%%%%%%%%%%%%
\begin{figure}[h!]
\centering
\begin{extikzpicture}[runs=2]{fig7}
\begin{axis}[
    height=8cm,
    width=8cm,
    xmin=0,
    xmax=10000,
    %legend style={draw=none},
    legend style={at={(0.9,0.4)}},
    xlabel = $\frac{V(A)}{Resource Cost}$ Ratio,
    ylabel = P(X),
    width=0.75\textwidth,
    y tick label style={
        /pgf/number format/.cd,
        fixed,
        fixed zerofill,
        precision=1,
        /tikz/.cd
    },
    x tick label style={
        /pgf/number format/.cd,
        fixed,
        fixed zerofill,
        precision=0,
        /tikz/.cd
    },
    scaled ticks=false,
]
\addplot+[black, mark=o, line join=round, mark repeat=1000] table[col sep=comma, y=Empirical, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Empirical Data}}
\addplot+[black, mark=x, line join=round, mark repeat=1000] table[col sep=comma, y=PT, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Extended Pearson-Tukey}}
\addplot+[black, mark=|, line join=round, mark repeat=1000] table[col sep=comma, y=SM, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Extended Swanson-Megill}}
\addplot+[black, mark=square, line join=round, mark repeat=1000] table[col sep=comma, y=BM, x=X]{CDFs.csv};
\addlegendentry{{\scriptsize Bracket Median}}
\end{axis}
\end{extikzpicture}
\caption{Elicited CDFs}
\label{CDFGraph}
\end{figure}
\end{document}
Obviously the data is important here, but I'm not sure how to include it (a sketch of embedding a trimmed sample follows the listing below). It's 10,002 rows by 5 columns of floating-point numbers used to make 4 line graphs; it's honestly one of the simplest graphs in this paper. The data file looks like this:
X Empirical PT BM SM
0 0 0.001 0.001 0.001
1 0 0.001 0.001 0.001
2 0 0.001 0.001 0.001
3 0 0.001 0.001 0.001
4 0 0.001 0.001 0.001
5 0 0.001 0.001 0.001
6 0 0.001 0.001 0.001
7 0 0.001 0.001 0.001
8 0 0.001 0.001 0.001
9 0 0.00101 0.001 0.001
10 0 0.00101 0.001 0.001
11 0 0.00101 0.001 0.001
12 0 0.00101 0.00101 0.00101
....
9990 1 1 1 1
9991 1 1 1 1
9992 1 1 1 1
9993 1 1 1 1
9994 1 1 1 1
9995 1 1 1 1
9996 1 1 1 1
9997 1 1 1 1
9998 1 1 1 1
9999 1 1 1 1
10000 1 1 1 1
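For a minimal working example, a trimmed sample of the rows above can be embedded in the .tex file itself with filecontents. This is a sketch only: CDFs-sample.csv is a made-up name, and the commas assume the real file is comma separated, as the col sep=comma option in the plot code suggests.
\begin{filecontents*}{CDFs-sample.csv}
X,Empirical,PT,BM,SM
0,0,0.001,0.001,0.001
1,0,0.001,0.001,0.001
2,0,0.001,0.001,0.001
3,0,0.001,0.001,0.001
\end{filecontents*}
% then point the plots at the sample instead of the full file:
% \addplot+[...] table[col sep=comma, y=Empirical, x=X]{CDFs-sample.csv};
% (older LaTeX kernels need \usepackage{filecontents} and the environment in the preamble)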
Tags: miktex, tikz-external, memory
Asked Sep 7 at 18:17 by jerH
One sure way to fix the problem is to do all your tikzpictures using standalone or externalize. If you run out of memory for a single graph, you are using too many data points.
– John Kormylo, Sep 8 at 15:37
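A minimal sketch of the standalone route mentioned above, assuming the tcolorbox/extikzpicture wrapper is dropped, CDFs.csv sits next to the file, and fig7-standalone.tex is a made-up name; the resulting PDF would then be pulled into the main document with \includegraphics:
% fig7-standalone.tex -- compile this file on its own with pdflatex,
% then include the resulting PDF in the main document via \includegraphics (graphicx)
\documentclass{standalone}
\usepackage{pgfplots}
\pgfplotsset{compat=1.12}
\begin{document}
\begin{tikzpicture}
\begin{axis}[height=8cm, width=8cm, xmin=0, xmax=10000]
  \addplot+[black, mark=o, line join=round, mark repeat=1000]
      table[col sep=comma, y=Empirical, x=X]{CDFs.csv};
  % ...remaining three \addplot lines as in the question...
\end{axis}
\end{tikzpicture}
\end{document}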
@JohnKormylo I am running them externalized, and though there are a lot of points (10,000 per series) I have built this graph before. I'm just not sure why it's running out of memory now. Other charts that gave me memory issues in the past are working fine, but this one never caused a problem before.
– jerH, Sep 10 at 1:28
The data file, cdfs.csv, is available at pastebin.com/u/jerH
– jerH, Sep 10 at 2:00
2 Answers
Interestingly, I uninstalled and then re-installed the pgf package and this issue disappeared. Not sure if that counts as an "answer"...
– jerH, Sep 19 at 20:59
No, I tried uninstalling/reinstalling both pgf and pgfplots and it did not make a jot of difference. The only way it works for me is to use 25% of the .csv, giving 2008247 words of memory out of 3000000 (the default), so it could perhaps run, say, 4500 points max? Update: I tried that, and 4500 gives 2938707 words of memory out of 3000000, so it's OK up to there, but 5000 failed. Would it not be much simpler to reduce the data to every second entry and bump the memory a little?
– KJO, Oct 20 at 1:03
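A sketch of that every-second-entry thinning done at plot time rather than by editing the .csv, assuming pgfplots' each nth point filter is acceptable for these CDFs (with half the points stored, mark repeat drops to 500 to keep the marks where they were):
% applied to each \addplot in the question; raise "each nth point" to thin further
\addplot+[black, mark=o, line join=round, mark repeat=500,
          each nth point=2, filter discard warning=false, unbounded coords=discard]
    table[col sep=comma, y=Empirical, x=X]{CDFs.csv};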
Having a need to bump the memory in MiKTeX, I searched around and collected the following recommendations, which allowed me to revisit and run this plot.
> initexmf --edit-config-file=pdflatex
In Notepad, change the following lines, or add them if there is no prior value:
main_memory=12000000
extra_mem_bot=99999999
font_mem_size=3000000
Save the file and, back at the prompt, run
> initexmf --dump=pdflatex
If you get an error message, repeat the process and reduce the values until --dump=pdflatex no longer errors (for speed, use a "binary chop": halve the last difference).
Previously, the points required
2008247 words of memory out of 3000000 (the default) for 25%
2938707 words of memory out of 3000000 (the default) for 45%
However, the final working log for the full 10000 points (100%) shows
793011 words of memory out of 12000000
approximately 10% of what I expected! (I guess it gets partially cleared at run time.)
– KJO, Nov 13 at 2:23 (edited Nov 13 at 2:30)
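Put together, a sketch of what ends up in whichever config file initexmf opens (the same values as above; trim them if the dump errors out), followed by the dump that rebuilds the pdflatex format:
main_memory=12000000
extra_mem_bot=99999999
font_mem_size=3000000
> initexmf --dump=pdflatex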