Can Xara support NVIDIA CUDA technology?
Sorry for my English.
Why should they? What is the parallel processing task that will benefit a subset of Xara users and justify the effort of support?
More power for the same operations: large files, live effects, etc.
Adobe includes this technology in recent versions of Photoshop.
I am not a specialist, but I think this technology has big potential.
Sorry for my English.
Yes, but Adobe has rather a larger customer base with a much higher revenue to fund such development. You will notice that most of the CUDA examples are related to number-crunching applications in software used by companies with deep pockets.
The technology does have a big potential, but perhaps not for a company with limited development resources and a technology that would not be usable by the majority of the userbase.
I know, I only ask out of curiosity. :P
I work on large files, and sometimes the power is lacking. :]
Sorry for my English.
I have to say the defining feature of Xara has always been the remarkable speed it typically maintains without needing a lot of hardware.
Also, vector graphics processing is generally much less obviously open to simple parallelisation than bitmap/pixel processing, where by definition you have a large dataset of independent values (the bitmap) to process.
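To illustrate the point about pixel independence, here is a minimal sketch (in plain C, not Xara code) of a typical per-pixel operation. The `brighten` function is a hypothetical example: each output pixel depends only on the corresponding input pixel, so every loop iteration is independent and could, in principle, map to one GPU thread under something like CUDA.

```c
#include <stdint.h>

/* Hypothetical per-pixel brightness boost. Each iteration reads and
   writes only pixels[i], so iterations are independent of one another:
   this is exactly the structure that parallelises trivially on a GPU. */
static void brighten(uint8_t *pixels, int n, int delta) {
    for (int i = 0; i < n; i++) {
        int v = pixels[i] + delta;                 /* widen to int to avoid wrap-around */
        pixels[i] = (uint8_t)(v > 255 ? 255 : v);  /* clamp to the 8-bit range */
    }
}
```

Vector rendering, by contrast, involves dependent stages (path flattening, winding rules, compositing order), which is why it resists this kind of simple data-parallel split.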
For this reason, whereas 64-bit support and multi-core parallelisation would be high on my wishlist for a pure image-editing application, they are not top of my list for Xara. Having said that, large memory capacity (over 3GB) and multi-core processors will increasingly be the norm for desktop PCs going forward, so I would put both of these ahead of supporting something like CUDA, which is linked to a very specific hardware configuration (NVidia GPUs) and is, I believe, mostly oriented towards image-type datasets.
One exception to this could be plugins. Most plugins are written to support Photoshop, and Photoshop itself is now, I believe, quite significantly parallelised to take advantage of multi-core CPUs. Plugins also inherently work on raster (bitmap) data.
It is easy to envisage (if it hasn't already happened) high-end plugin packages being parallelised, and even using hardware shader support in high-end graphics cards to deliver high performance on large images and complex effects.
Regards: Colin
Rendering diffusion curves?
Didn't CUDA come out to optimise vector gfx in games in the first place?
Xara's software uses its own code written in Assembler to obtain its amazing speed. Bolting on code to support CUDA would be like donning an old-fashioned diving helmet and lead shoes to enter an Olympic swimming competition.
Soquili
a.k.a. Bill Taylor
Bill is no longer with us. He died on 10 Dec 2012. We remember him always.