I have some large log files and in general DPlot handles them admirably. Where it fails, however, is on the seemingly benign "Reorder Curves" function. Both the menu command and the macro command fail. The menu command fails immediately with "Insufficient memory for 44941312 data points", followed by "Too many curves... Erase a curve or resize the X, Y arrays.".
The command is successful with 6*2874368 points and fails with 6*5617664 points.
The computer has 8GB of RAM and reports 5GB free.
Oh, an important discovery: if I move the ReorderCurves command to the start of my macro, it succeeds! The only thing it originally came after was a SelectCurve followed by an OperateOnY command. Given that this is XYYY... data, I assume the OperateOnY converts it to XYXYXY or something?
Well at least this bug report has found me a workaround, if nothing else!
Reorder Curves fails with more than 7*3000000 points
Moderator: DPlotAdmin
Oh, I see I'm simply bumping up against the practical limit on # of points. The do-reorder-first trick didn't work the next time I tried it. It seems the actual number of points limit changes from time to time and I'm just close enough to it to cause problems. Guess I'll have to split my files up until DPlot goes 64-bit!
- DPlotAdmin
- Posts: 2312
- Joined: Tue Jun 24, 2003 9:34 pm
- Location: Vicksburg, Mississippi
- Contact:
Sooner or later you'll run out of memory, but I think you're running out before it is time. The "ReorderCurves" macro command and the "Reorder Curves" menu command both allocate (or attempt to) a new curve with the maximum number of points (shown in the status bar), and then swap data with that extra curve. Your example that fails (6*5617664) would require an array of at least 7*5617664=39,323,648 points, which is somewhat less than 44,941,312. And that means the maximum number of points/curve is larger than it needs to be unless I'm missing something. What is the source of your data? If read from a CSV or similar text file it may be that DPlot is overestimating the size requirement. If instead you're sending this data to DPlot via DPLOTLIB.DLL calls... well... you already know the answer there.
As for why this would fail after EditOperateY but not before... I have no idea, sorry. EditOperateY doesn't eat up any memory so either way should behave the same. But I'll look into it.
Your 8GB of RAM will sadly be mostly irrelevant until I come up with a 64-bit version.
Visualize Your Data
support@dplot.com
- DPlotAdmin
> Guess I'll have to split my files up until DPlot goes 64-bit!

Right! But something I've been meaning to do for a while that may help is to offer a few choices for large files:
- Prompt you for a start/end X
- Offer to read every Nth line, or
- Average every N points.
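The last option, averaging every N points, is simple enough to sketch. This is a hypothetical pre-processing helper, not anything DPlot currently exposes; a trailing partial block is averaged over however many points remain.

```python
def average_every_n(values, n):
    """Collapse each run of n consecutive samples to its mean.

    Hypothetical decimation helper (not a DPlot function); the final
    partial block, if any, is averaged over the points that remain.
    """
    return [sum(block) / len(block)
            for block in (values[i:i + n] for i in range(0, len(values), n))]
```

For slowly varying signals this keeps the overall shape while dividing the point count by N, at the cost of smoothing out any features shorter than N samples.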
Visualize Your Data
support@dplot.com
DPlotAdmin wrote:
> Your example that fails (6*5617664) would require an array of at least 7*5617664=39,323,648 points, which is somewhat less than 44,941,312. And that means the maximum number of points/curve is larger than it needs to be unless I'm missing something. What is the source of your data? If read from a CSV or similar text file it may be that DPlot is overestimating the size requirement.

Somehow my 6*5617664 data caused DPlot to allocate 7*5617664. Perhaps that's because I wasn't counting the x values. So the ReorderCurves required 8*5617664, which is why DPlot tried to allocate memory for 44,941,312 points.
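The counts line up exactly once the shared X column and the scratch curve are included. A quick arithmetic check (plain arithmetic, not DPlot's actual allocator):

```python
curves, pts = 6, 5_617_664

# XYYY... layout: one shared X column plus six Y columns = 7 arrays
stored = (curves + 1) * pts
# ReorderCurves allocates one extra scratch curve of the same length
with_scratch = (curves + 2) * pts

print(stored)        # the "at least" figure quoted above: 39,323,648
print(with_scratch)  # the number in the error message: 44,941,312
```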
The source is indeed a CSV file, and DPlot is getting the size right, so no dramas there.
DPlotAdmin wrote:
> But something I've been meaning to do for a while that may help is to offer a few choices for large files:

In an ideal world DPlot would read every Nth line so the data looks effectively the same when you display it all on the screen. Then, when you zoom in on a section it would re-read with N equal to something smaller. If you zoom in far enough, N would equal one. And this would all happen seamlessly and miraculously fast!
> ... unless you really do want to see 5.6 million points on your ~2K-pixel wide monitor. I don't get that, but if that's the case you are not alone!

To try to elucidate this strange behaviour I'll describe what I'm doing. My data is various voltages and signals, and the x values are time stamps. The data frequency is about 4Hz and the data is collected over many days. Occasionally interesting things happen that require 4Hz resolution to analyse, but there are a lot of "boring" bits in between. So instead of cutting the data into days and opening them one by one, I collate a month's worth of data, open the lot, and drill down into the "interesting" parts.