Dear Matt, Dave, good afternoon,
Any news or schedule for the release date of the x64 version?
Should we expect it in Q2 2012?
16 GB are already "on board", but we can still use only 2 GB.
Please help!
We don't have any specific news at this point. ArtRage can currently access the standard 32 bit memory limit (about 3.8 GB).
Dave
Resident Bug-Hunter
Ambient Design
Hello, Dave!
Thank you for your reply.
Quote: "ArtRage can currently access the standard 32 bit memory limit (about 3.8 GB)."
Unfortunately, that 3.8 GB is the ideal case for 32-bit allocation.
In fact, once memory consumption exceeds 1.8 GB (up to 2 GB), the program becomes extremely unstable and starts to hang. There is no way to save the file, plus other critical bugs appear.
So it is a problem, for example, to make a painting sized 1 m x 2 m @ 150 dpi. The standard workaround is to create the file at 1 m x 2 m @ 75-100 dpi and then use Photoshop to resize (enlarge) it, with the quality loss as a bonus.
P.S.: Sorry, but there is strong demand for big, even huge "objects" in Russia!
I'd be curious to know how being 64 bit would help Artrage specifically. As in, what benefits would we see? Or what things are tied to RAM in Artrage?
My presumption is that it's tied to canvas size? For example, if I work at 4500 x 6000, there's definitely more lag than there is at 1500 x 2000 when zooming and rotating the canvas, etc. If I could use my 8 GB of RAM, would this be improved? Etc etc.
I definitely have 8 GB and it would be nice to know it was being used and improving performance.
Hi Steve B,
Quote: "I'd be curious to know how being 64 bit would help Artrage specifically."
It (x64) will not help ArtRage itself; it will help us make huge paintings, e.g. >= 8000 x 8000 pixels.
Quote: "As in, what benefits would we see?"
We would see the following benefits, IMHO:
1. The ability to create large files (e.g. 2 m x 2 m @ 150 dpi) for interior printing (on photo paper, "Oracal"-like films, PVC plastics, wood, glass, etc.) with solvent, UV-cured and similar printers;
2. Much faster and smoother operations with layers, filters and brushes, given software optimisations (MMX, SSE, newer CPU features such as x64 mode, plus perhaps CUDA for GPU acceleration);
3. Advanced painting features and tools (see below for details) by utilising more RAM, as 16 GB now seems to be the standard configuration for a "loaded" PC.
Quote: "Or what things are tied to RAM in Artrage?"
Bingo! Exactly the question! As I remember from discussions with Matt regarding CMYK, my understanding is the following.
ArtRage uses RAM "aggressively" for all its needs: layers, the tools in use, the painting itself, and especially (!) paint (colour) mixing (!) and other related tasks (probably allocating memory as a heap).
Now, due to the "physical construction" of the x86 (32-bit) version of ArtRage, it can allocate only up to 2 GB of RAM. After exceeding that, the program starts crashing.
With x64 memory allocation, a new ArtRage could use all the RAM physically available in the PC (8 GB, 16 GB, etc.) for its needs and features even more aggressively, which would lead to faster output and higher stability (assuming good RAM and overall hardware).
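To make the address-space point concrete, here is a minimal, self-contained sketch (not ArtRage code) that simply grabs 64 MB blocks until the allocator refuses. Built as a 32-bit executable it stops somewhere around 2 GB (or nearer 4 GB if the binary is large-address-aware on 64-bit Windows); built as a 64-bit executable it keeps going until physical RAM and pagefile are exhausted:
[CODE]
// Minimal sketch (not ArtRage code): allocate 64 MB blocks until the
// allocator refuses, to see the per-process ceiling.
#include <cstddef>
#include <cstdio>
#include <new>
#include <vector>

int main() {
    const std::size_t blockSize = 64 * 1024 * 1024;   // 64 MB per block
    std::vector<char*> blocks;
    std::size_t total = 0;

    for (;;) {
        char* p = new (std::nothrow) char[blockSize];
        if (!p) break;                                  // allocation refused: hit the ceiling
        for (std::size_t i = 0; i < blockSize; i += 4096) p[i] = 1;  // touch the pages
        blocks.push_back(p);
        total += blockSize;
    }

    std::printf("Allocated %zu MB before failure\n", total / (1024 * 1024));
    for (char* p : blocks) delete[] p;
    return 0;
}
[/CODE]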
On top of that, by using the features of modern Intel and AMD CPUs such as MMX, SSE, AVX, etc., parallel computing (multi-core and multi-threading - AMD FX cores and Intel Hyper-Threading), and even GPU acceleration via CUDA, OpenCL or OpenGL, ArtRage could drastically increase the throughput of painting and filter operations, even for huge files (> 8000 x 8000 pixels).
This is only my humble opinion, and I can guess how much work would have to be done, but... for the sake of the result!
More RAM access via 64 bit may provide some speed increase but it probably won't make a huge difference. RAM usage is primarily for storing paint data - each layer consists of multiple different overlaid maps of data to track the different physical properties required by the simulation, but layers store only as much data as they require for their contents; blank space does not take RAM. When you compare the amount taken for the paint data to other items like stored tools, the difference is enormous. ArtRage deallocates memory for objects not being used wherever possible to try and limit the amount taken up by non-essential items.
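To picture the kind of layout being described (a hypothetical sketch only; the real ArtRage internals are not documented in this thread), imagine a layer split into tiles, where each tile carries several overlaid channels of paint data and is only allocated once a stroke touches it:
[CODE]
// Hypothetical layout, not ArtRage internals: tiles are only allocated once
// paint touches them, so blank canvas costs no RAM.
#include <cstdint>
#include <memory>
#include <vector>

struct Tile {
    enum { kSize = 64 };            // 64 x 64 pixels per tile
    // Channel names are illustrative stand-ins for "physical properties".
    std::vector<uint16_t> colour;   // RGBA, 16 bits per channel
    std::vector<uint16_t> volume;   // paint thickness
    std::vector<uint16_t> wetness;  // how wet the paint still is
    Tile() : colour(kSize * kSize * 4), volume(kSize * kSize), wetness(kSize * kSize) {}
};

class Layer {
public:
    Layer(int w, int h)
        : tilesX_((w + Tile::kSize - 1) / Tile::kSize),
          tilesY_((h + Tile::kSize - 1) / Tile::kSize),
          tiles_(tilesX_ * tilesY_) {}                  // every slot starts empty

    // A tile is allocated only when paint first touches it.
    Tile& touch(int tx, int ty) {
        std::unique_ptr<Tile>& slot = tiles_[ty * tilesX_ + tx];
        if (!slot) slot.reset(new Tile());
        return *slot;
    }

private:
    int tilesX_, tilesY_;
    std::vector<std::unique_ptr<Tile>> tiles_;          // empty slot = blank area, no RAM
};

int main() {
    Layer layer(8000, 8000);   // a blank 8000 x 8000 canvas allocates no tiles
    layer.touch(0, 0);         // the first stroke allocates just one 64 x 64 tile
    return 0;
}
[/CODE]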
We can currently handle the 32 bit limit for RAM allocation, i.e. around 3.8 GB. If you're finding crashes after 2 GB then you should look at whether there's anything else on the system that might be causing a problem.
Speed hitches likely come from processing large numbers of interacting pixels under huge brush heads through the physical simulation - the size of the brush head defines how much processing power is required to track the behaviour of the paint under the brush head on the canvas.
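As a back-of-the-envelope illustration of that scaling (illustrative numbers only, not measurements of ArtRage):
[CODE]
// Every dab visits roughly the pixels under the brush head, so the work per
// dab grows with the square of the brush radius.
#include <cstdio>

int main() {
    const int radii[] = {16, 32, 64, 128, 256};
    for (int r : radii) {
        long long pixelsPerDab = static_cast<long long>(3.14159 * r * r);  // circular head
        std::printf("radius %3d px -> ~%lld pixels touched per dab\n", r, pixelsPerDab);
    }
    return 0;
}
[/CODE]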
We already support multithreading on physical multi-cores, but hyperthreads are inappropriate for the simulation. Activating hyperthreads actually slows down painting due to the slower speed at which they execute (we have tested this). For some applications, hyperthreads can work fine, but for the kind of complex simulation we're doing here they won't help at all.
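For anyone curious how an application can distinguish the two on Windows, here is a small generic sketch (not ArtRage's code) that counts physical cores, since std::thread::hardware_concurrency() reports logical processors including hyperthreads:
[CODE]
// Generic Windows sketch: count physical cores as opposed to logical processors.
#include <windows.h>
#include <cstdio>
#include <thread>
#include <vector>

static unsigned PhysicalCoreCount() {
    DWORD bytes = 0;
    GetLogicalProcessorInformation(nullptr, &bytes);    // ask for the required size
    std::vector<SYSTEM_LOGICAL_PROCESSOR_INFORMATION> info(
        bytes / sizeof(SYSTEM_LOGICAL_PROCESSOR_INFORMATION));
    if (info.empty() || !GetLogicalProcessorInformation(info.data(), &bytes)) return 1;
    unsigned cores = 0;
    for (const SYSTEM_LOGICAL_PROCESSOR_INFORMATION& entry : info)
        if (entry.Relationship == RelationProcessorCore) ++cores;
    return cores ? cores : 1;
}

int main() {
    std::printf("logical processors: %u\n", std::thread::hardware_concurrency());
    std::printf("physical cores:     %u\n", PhysicalCoreCount());
    // A paint-simulation worker pool would be sized to the second number.
    return 0;
}
[/CODE]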
Similarly, GPU acceleration is inappropriate for the kind of processing we're doing here. We've looked into it a number of times and each time it has quickly become apparent that while it might seem like a good idea from the outside, the technical limitations of the system are such that it makes no sense for the simulation when you understand the backbone code.
Matt
ArtRage UI
Ambient Design.
Hello Matt,
Thank you for your reply.
Quote: "We can currently handle the 32 bit limit for RAM allocation, i.e. around 3.8 GB. If you're finding crashes after 2 GB then you should look at whether there's anything else on the system that might be causing a problem."
We have tested the > 2 GB artefacts (related to big canvas sizes) with ArtRage from v3.0.x onwards, on different hardware (3 different PCs):
1. Intel Core 2 Duo E6850 with 4 GB OCZ RAM - Vista x64;
2.1 Intel Core 2 Quad Q9500 with 8GB Kingston RAM - Win 7 x64;
2.2 Intel Core 2 Quad Q9500 with 8GB Kingston RAM - Ubuntu x64 with Wine 1.3;
3. AMD FX-8150 with 16GB Kingston RAM - Win 7 x64.
The same story on all the machines: if I create a file at 1 m x 2 m @ 150 dpi and add two or three layers with small paintings (Task Manager shows 1.77 GB), the program crashes.
If I create a file at 1 m x 2 m @ 200 dpi, a message box appears:
Title: "Not enough memory to perform action".
Body text: "A required object could not be created, the operation can not be completed. You may need to restart the application."
I am sure the memory limit for ArtRage is 2 GB, not 4 GB. And there is no problem with the physical RAM, because I've tested 4-6 GB memory allocations with Photoshop CS3 (32-bit), and overnight memory tests were also done.
Matt, I definitely notice what to me seems like a lot of slowing as well when zooming or panning my canvas, when I use a rather large canvas -- 9000 x 4500 or something like that. I understand that the brushes are largely CPU based, but is this sort of stuff RAM based? I can, of course, track more closely the amount of RAM being used by my computer... Haven't really done that. Just thought I'd point out my user experience, and was curious how it might be improved (or not) by RAM.
Task Manager isn't an accurate measure of RAM being used so it's going to be hard to find exact limits using it (it updates at odd intervals and doesn't always indicate released memory as it happens depending on how the memory was allocated).
This is particularly important here because if you are seeing that error message, the application has run out of memory while trying to create something, and the amount of memory it requested will not be reflected there. Chances are the application is hitting the ~3.8 GB limit in the process of requesting RAM for the operation that spawned the error message, and that's where the problem lies.
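If you want a reading more reliable than the Task Manager column, a small test program (generic Windows API usage, nothing ArtRage-specific) can report commit and working set directly:
[CODE]
// Reports the calling process's memory via psapi. To inspect a different
// process you would OpenProcess its PID instead of using GetCurrentProcess.
#include <windows.h>
#include <psapi.h>   // link with psapi.lib
#include <cstdio>

int main() {
    PROCESS_MEMORY_COUNTERS pmc = {};
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc))) {
        std::printf("working set:       %zu MB\n", pmc.WorkingSetSize / (1024 * 1024));
        std::printf("commit (pagefile): %zu MB\n", pmc.PagefileUsage / (1024 * 1024));
        std::printf("peak working set:  %zu MB\n", pmc.PeakWorkingSetSize / (1024 * 1024));
    }
    return 0;
}
[/CODE]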
Quickly checking here, a 1m x 2m 150 dpi image with 5 layers that have basic paint strokes on them is taking around 1 GB of RAM. If you fill a layer with paint completely, the system has to allocate memory for the entire layer (~64 bits per pixel at 2835 x 5669) and for the base undo buffer to contain that. That's a big operation and causes it to blow out, but Task Manager isn't reporting the full amount being used there when I check, because the attempt to allocate the additional memory failed. The application is not capped at 2 GB.
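For a rough sense of the sums involved (taking the ~64 bits per pixel figure above at face value; the exact internal layout may differ):
[CODE]
// Rough arithmetic behind the example above.
#include <cstdio>

int main() {
    const long long width = 2835, height = 5669;   // canvas from the example above
    const long long bytesPerPixel = 8;             // ~64 bits of paint data per pixel
    long long layerBytes = width * height * bytesPerPixel;
    long long withUndo  = layerBytes * 2;           // plus an undo snapshot of the layer
    std::printf("one fully painted layer: ~%lld MB\n", layerBytes / (1024 * 1024));
    std::printf("layer + undo snapshot:   ~%lld MB\n", withUndo / (1024 * 1024));
    return 0;
}
[/CODE]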
Zooming and panning are CPU based, and to some extent the size of the canvas will have an effect on how long it takes because the larger the canvas is, the more data there is to push around when you zoom/pan (we mitigate this with some cunning scale stuff however). RAM may have an effect but it's not likely to be huge.
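The "cunning scale stuff" isn't spelled out, but one common technique for keeping zoomed-out navigation cheap - purely a guess here, not a description of ArtRage's implementation - is to keep pre-downscaled copies of the canvas and draw from the level closest to the current zoom:
[CODE]
// Guesswork sketch: pick the pre-downscaled (mip) level nearest the current zoom.
#include <cmath>
#include <cstdio>

int main() {
    const double fullWidth = 9000, fullHeight = 4500;   // Steve B's canvas size
    const double zoom = 0.18;                            // viewing at 18%
    int level = static_cast<int>(std::floor(std::log2(1.0 / zoom)));  // nearest coarser level
    if (level < 0) level = 0;
    double scale = 1.0 / (1 << level);
    std::printf("zoom %.0f%% -> mip level %d (%.0f x %.0f source pixels to push around)\n",
                zoom * 100, level, fullWidth * scale, fullHeight * scale);
    return 0;
}
[/CODE]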
Matt
ArtRage UI
Ambient Design.
Thank you, Matt. This has been a very informative conversation. Much appreciated.