New GPU Renderer - Octane

 From:  Samuel Zeller
3233.21 In reply to 3233.20 
Well, I also have a MacBook Pro with a 9600M GT and it's already faster than a quad-core at 2.4 GHz (Q6600), according to this graph:
http://www.refractivesoftware.com/forum/viewtopic.php?p=1150#p1150

That means you can render about 4 times the number of teapots in the same amount of time!

 From:  Rudl
3233.22 
My MacBook Pro is a little older and only has the 8600M GT in it.

Is it worth giving this renderer a try?

Rudl

 From:  ed (EDDYF)
3233.23 
.

EDITED: 12 Mar 2010 by EDDYF


 From:  Phr0stByte
3233.24 
I just purchased my CUDA-enabled card and will be purchasing Octane as soon as they put up the link.

On a side note, maybe MoI could get CUDA enabled? This may allow us poor Linux users to enjoy MoI in a VM, or better yet, natively. I am sure it would be much easier than porting to OpenGL..? I purchased MoI v1.0 way back and honestly just threw my money away, as I have only used it a couple of times at work (when I was supposed to be working - I don't run Windows at home).

 From:  PaQ
3233.25 
An Octane viewport inside MoI would be terrific ... (sorry, just thinking out loud)

... I'm half sold (well, completely in fact - at 100 Euro I can't go wrong) ... it's just this memory limitation that is a bit annoying. Today's 'render' stations easily have 8 or 12 GB (I'm on 16 GB here, and I already feel limited sometimes). I'm not really sure NVIDIA will put that much memory on a gaming card any time soon ... the biggest available are 'only' 2 GB (and that's already more memory than current games require).

 From:  Samuel Zeller
3233.26 In reply to 3233.25 
PaQ, it will support multiple cards, so 4 cards at 1 GB will be like one card at 4 GB.
Also, 2 GB cards are coming soon (at mainstream prices).

 From:  PaQ
3233.27 In reply to 3233.26 
Yes, I know about the 2 GB cards ... but having 4 of them doesn't give you 8 GB, as the cards can't share memory ... one copy of the scene data is stored per GPU to speed up the rendering. (I already asked on the Octane forum ;))

 From:  BurrMan
3233.28 In reply to 3233.25 
>>>>An Octane viewport inside MoI would be terrific ...

Should be the other way around. :o

 From:  Michael Gibson
3233.29 In reply to 3233.24 
Hi Phr0stByte,

> On a side note, maybe MoI could get CUDA enabled?

What part of MoI would you expect to use CUDA?

I'm not sure if CUDA is what you are expecting - it's a mechanism for giving programs access to the resources of your GPU for certain kinds of calculations.

It's most useful for work that can be broken up into something like a million little tasks that can all run in parallel (similar to rendering).
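
Just as a rough illustration (a hypothetical sketch, not anything from MoI or Octane), this is the shape of work CUDA is built for - one trivial, independent calculation per thread, launched about a million times in parallel:

#include <cuda_runtime.h>
#include <cstdio>

// Hypothetical example, not MoI code: one tiny, independent calculation per
// thread - the "million little tasks" pattern that CUDA is designed for.
__global__ void scalePoints(float* pts, float scale, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                 // guard the final, partially filled block
        pts[i] *= scale;
}

int main()
{
    const int n = 1 << 20;     // about a million values
    float* d = 0;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    // Each of the n elements is handled by its own lightweight GPU thread.
    scalePoints<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d);
    printf("scaled %d values on the GPU\n", n);
    return 0;
}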

It is not generally very feasible to use CUDA for the regular real-time viewport display of a modeler, which is what you seem to be thinking of? I mean it is possible, but it would mean writing a lot of custom code just to reproduce what is already set up in Direct3D or OpenGL.


> I am sure it would be much easier than porting to OpenGL..?

Nope - it would be more like 100 times more work than porting to OpenGL, because I'd basically be trying to rewrite what OpenGL does.

- Michael

 From:  Michael Gibson
3233.30 In reply to 3233.24 
Hi Phr0stByte, also you wrote:

> This may allow us poor linux users to enjoy MoI in a VM

Actually this may already be possible - there are several Mac users that run MoI in a VM on Mac OSX using Parallels or VMWare.

I think that VMWare is also available for Linux so you can try that.


> or better yet, natively

Actually, you can already do that as well, at least for basic stuff, by using WINE.

You do have to set up IE6 or IE7 under WINE though - look for something like "winetricks" to find out how to get IE installed, and then you can launch MoI running the code directly in WINE. Not everything works (a couple of little menus don't show up), but for the most part it appears to be ok:
http://appdb.winehq.org/objectManager.php?sClass=version&iId=7383

- Michael

 From:  Samuel Zeller
3233.31 In reply to 3233.30 
PaQ >>>> but having 4 of them doesn't give you 8 GB, as the cards can't share memory

OK. That's bad, but not too bad, because 4 GB cards are coming soon :)

 From:  anthony
3233.32 
Michael, could MoI use CUDA to speed up slow operations -- like booleans?

 From:  Brian (BWTR)
3233.33 In reply to 3233.29 
3D-Coat has an option for CUDA and, combined with DX, operates in both 32-bit and 64-bit versions.
The speed-up in calculation is excellent.

Brian

 From:  Michael Gibson
3233.34 In reply to 3233.32 
Hi anthony,

> Michael, could MoI use CUDA to speed up slow
> operations -- like booleans?

No, it's not really feasible to do that - things that are suited for CUDA are more the case where you have a very large quantity of fairly simple individual tasks.

Booleans don't particularly fit into that category - although it would be theoretically possible (with quite a lot of work to rewrite some chunks of the geometry library) to separate booleans into some list of individual tasks, each of those tasks is fairly complex and involves the intersection between 2 NURBS surfaces.

It's not easy to make more complex individual functions run on the GPU, and it is also not very easy to take existing program code and automatically turn it into CUDA - the GPU is a fairly different environment for processing things than the CPU, and you cannot just push a button and automatically run the same kind of process on it.

See here for some information: http://en.wikipedia.org/wiki/CUDA#Limitations

It wasn't even too many years ago that shader programs on the GPU did not have any branching instructions (like if/then) available to them...

I can't really think of any particular area of MoI that would fit the kind of massively parallel processing of simpler individual calculations that works well with CUDA. In general, the things that MoI does have more potential for multi-core CPU processing, and there has already been a big step forward with that in MoI v2, where mesh generation at export makes use of multiple CPU cores.

- Michael

 From:  Michael Gibson
3233.35 In reply to 3233.33 
Hi Brian, re: 3D-Coat - yup the kind of data that 3D-Coat works on, with a huge number of simple individual voxel elements, is well suited for CUDA.

That is a very different kind of geometry from the NURBS surface data that MoI uses, though.

- Michael

 From:  Frenchy Pilou (PILOU)
3233.36 In reply to 3233.35
Maybe this can http://blog.renderstream.com/?p=442 enlight you ;)

 From:  Michael Gibson
3233.37 In reply to 3233.36 
Hi Pilou,

> Maybe this can http://blog.renderstream.com/?p=442 enlight you ;)

Yup, but just keep in mind that it is generally referring to rendering there.

That does not apply so much to MoI because as you know MoI is not focused on rendering, but instead on NURBS modeling calculations.

Rendering is the kind of thing that can be more easily split up into a very large number of fairly simple individual tasks, so it's well suited for GPU processing.
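
As a hypothetical sketch of what that splitting looks like (made-up names, not Octane's actual code): one GPU thread shades one pixel, and no thread depends on any other.

#include <cuda_runtime.h>

// Hypothetical sketch, not Octane's real kernel: each thread handles exactly
// one pixel, which is why path tracing maps so naturally onto the GPU.
__global__ void shadePixels(float3* framebuffer, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height)
        return;

    // Stand-in for "trace a ray through pixel (x, y) and shade it" - the real
    // work is independent per pixel, so no thread ever waits on a neighbour.
    float3 c;
    c.x = (float)x / (float)width;
    c.y = (float)y / (float)height;
    c.z = 0.5f;
    framebuffer[y * width + x] = c;
}

int main()
{
    const int width = 640, height = 480;
    float3* d_fb = 0;
    cudaMalloc(&d_fb, width * height * sizeof(float3));

    // A 2D grid of 16x16-thread blocks covers the whole image.
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    shadePixels<<<grid, block>>>(d_fb, width, height);
    cudaDeviceSynchronize();

    cudaFree(d_fb);
    return 0;
}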

- Michael

 From:  Frenchy Pilou (PILOU)
3233.38 In reply to 3233.37 
yes :)
---
Pilou
Is beautiful that please without concept!
My Gallery

 From:  Mark Brown (MABROWN)
3233.39 In reply to 3233.38 
I hadn't read this thread until just now but I'm glad I did. This looks a bit special, particularly at the 99 Euro price.

One of the things which caught my eye was that it imports RIB as well as OBJ. Does anyone know how compliant it is with the RenderMan spec? One of the things I would like in my current renderer (Kerkythea) is the ability to use displacement shaders, particularly for ocean water. I've never been able to get a satisfactory-looking ocean surface in any of my renders. It makes me very sad to see excellent *free* ocean shaders available only for outrageously expensive software like Max.

---
Mark
http://www.homepages.ihug.com.au/~mabrown/index.html


 From:  Brian (BWTR)
3233.40 In reply to 3233.39 
Mark
For example, the "Ocean" primitive in Carrara 7 Pro is quite brilliant.

The built-in versatility of Carrara really does make it an ideal app in conjunction with MoI modelling.

Brian
Attachments:
