MoI discussion forum

Full Version: New GPU Renderer - Octane

Show messages:  1-11  12-31  32-42

From: Michael Gibson
10 Jan 2010   [#12] In reply to [#1]
Right now it kind of seems semi "power user" oriented - lots of stuff to tweak, which is cool - but have you also considered some kind of HyperShot-esque features for people who don't have much background in rendering?

That would maybe be something like bundling a variety of HDRs for lighting setups, making it more prominent in the UI to browse through them, stuff like that...

Although I guess since you are focused on users with CUDA cards looking for high performance, a "power user" is a good fit with that.

- Michael
From: Anis
11 Jan 2010   [#13] In reply to [#12]
Interesting link about Hypershot : http://develop3d.com/blog/2010/01/bunkspeed-the-future
From: neo
11 Jan 2010   [#14]
Radiance's new baby looks cool indeed :) Phil, I have 3 CUDA cards in my Mac Pro waiting for a demo... :)

I can see Octane Render becoming a must-have for product visualization and archviz too, once the RAM management becomes more solid...
From: neo
11 Jan 2010   [#15] In reply to [#13]
>Interesting link about Hypershot : http://develop3d.com/blog/2010/01/bunkspeed-the-future

How dishonest is that? I always thought HyperShot was the most overpriced, rookies-only piece of software...
From: jbshorty
11 Jan 2010   [#16]
Wow, looks nice! Good luck with this Philbo! I'm glad to see Radiance has been working on a commercial product. I had no idea this was under development. Also it's really well priced and perfectly timed with both the arrival of Thea and the questionable future of Hypershot.

My GTX280 is waiting to test this. One question I have: does this use only HDRI lighting, or are there user-created lights as well?

jonah

EDIT - Just noticed it has emissive material properties, so that answers my question about lighting... :)
From: PaQ
11 Jan 2010   [#17]
Hi Phil,

This looks promising.

Are there any restrictions with the .obj files coming from MoI? I mean, is it possible to use n-gons? Are the vertex normals fully translated? Except for Maya, .obj is really a problem with this n-gon/normals stuff, whether you use 3ds Max, modo or LightWave ... not to mention the UV limitation too (only one channel).

Are you planning to extend the file support (true .3dm import with tessellation on the fly, .lwo, .fbx)?

I'm also really curious to see more complex indirect lighting scenarios, and how long it takes for a render to clean up the noise.

As a final note, there is something a little bit strange with the textures in the current examples provided ... I'm not a technical guy, but they look really blurred/washed out/compressed ... there is this kind of realtime-texture feeling about it; maybe it's the filtering, I don't know. (I know it's still in alpha.)
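For reference on the n-gon/normals question above, the .obj format itself can express both: an `f` record takes `v/vt/vn` index triples and any number of corners, so whether n-gons and vertex normals survive is up to the importer, not the file format. A minimal sketch of reading those face records (the parser and sample data here are illustrative, not Octane's or MoI's code):

```python
# Minimal sketch of OBJ "f" record parsing, showing how n-gons and
# vertex normals appear in the format. Each corner is v/vt/vn with
# 1-based indices; vt and vn may be omitted (e.g. "1//1" has no vt).

def parse_obj_faces(text):
    """Return a list of faces; each face is a list of
    (vertex, texcoord, normal) index tuples (None where absent)."""
    faces = []
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0] != "f":
            continue
        face = []
        for corner in parts[1:]:
            v, vt, vn = (corner.split("/") + ["", ""])[:3]
            face.append((int(v),
                         int(vt) if vt else None,
                         int(vn) if vn else None))
        faces.append(face)
    return faces

sample = """\
v 0 0 0
v 1 0 0
v 1 1 0
v 0 1 0
v 0.5 1.5 0
vn 0 0 1
f 1//1 2//1 3//1 4//1 5//1
"""

faces = parse_obj_faces(sample)
print(len(faces[0]))  # 5 corners: a pentagon (n-gon) with per-vertex normals
```

So the format carries the data; an application that triangulates on import or drops the `vn` indices is making its own choice.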
From: Rudl
11 Jan 2010   [#18] In reply to [#16]
What is a CUDA card? I have a MacBook Pro. Does it have a CUDA card?

Rudl
From: Ralf-S
11 Jan 2010   [#19]
CUDA-Enabled Products: http://www.nvidia.com/object/cuda_learn_products.html
From: neo
11 Jan 2010   [#20] In reply to [#18]
>What is a CUDA card? I have a MacBook Pro. Does it have a CUDA card?

I also have a MacBook Pro with a 9600M GT (CUDA-enabled)... BUT that card can only render TEAPOTS :) if you know what I mean.
From: Samuel Zeller
11 Jan 2010   [#21] In reply to [#20]
Well, I also have a MacBook Pro with a 9600M GT, and it's already faster than a quad core at 2.4 GHz (Q6600), according to this graph:
http://www.refractivesoftware.com/forum/viewtopic.php?p=1150#p1150

That means you can render like 4 times the amount of teapots in the same amount of time!!!
From: Rudl
11 Jan 2010   [#22]
My MacBook Pro is a little bit older and only has the 8600M GT in it.

Is it worth giving this renderer a try?

Rudl
From: Phr0stByte
12 Jan 2010   [#24]
I just purchased my CUDA-enabled card and will be purchasing Octane as soon as they put up the link.

On a side note, maybe MoI could get CUDA enabled? This may allow us poor Linux users to enjoy MoI in a VM, or better yet, natively. I am sure it would be much easier than porting to OpenGL...? I purchased MoI v1.0 way back and honestly just threw my money away, as I have only used it a couple of times at work (when I was supposed to be working - I don't run Windows at home).
From: PaQ
12 Jan 2010   [#25]
An Octane viewport inside MoI would be terrific ... (sorry, just thinking out loud)

... I'm half sold (well, completely in fact; at 100 Euro I can't go wrong) ... it's just this memory limitation that is a bit annoying. Today's 'render' stations easily have 8 or 12 GB (I'm on 16 GB here, and I already feel limited sometimes). I'm not really sure NVIDIA will put that much memory on a gaming card any time soon ... the biggest available are 'only' 2 GB (and that's already a waste of memory for current game requirements).
From: Samuel Zeller
12 Jan 2010   [#26] In reply to [#25]
PaQ, it will support multiple cards, so 4 cards at 1 GB will be like one card at 4 GB.
Also, 2 GB cards are coming soon (at mainstream prices).
From: PaQ
12 Jan 2010   [#27] In reply to [#26]
Yes, I know about the 2 GB cards ... but having 4 of them doesn't give you 8 GB, as the cards can't share memory ... one copy of the scene data is stored per GPU to speed up the rendering. (I already asked on the Octane forum ;))
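The point above can be sketched in a couple of lines: with the scene duplicated onto every GPU, the largest scene you can load is bounded by the smallest card's memory, and only the rendering speed scales with the card count. The numbers here are hypothetical.

```python
# Sketch of per-GPU scene duplication: each card holds its own full
# copy of the scene, so memory budgets don't add up - but the cards
# render samples independently, so throughput roughly scales.

def multi_gpu_capacity(card_mem_gb):
    """Return (usable scene budget in GB, relative speedup) for cards
    that each store their own complete copy of the scene data."""
    return min(card_mem_gb), len(card_mem_gb)

budget, speedup = multi_gpu_capacity([1, 1, 1, 1])  # four 1 GB cards
print(budget, speedup)  # still only a 1 GB scene budget, but ~4x the samples/sec
```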
From: BurrMan
12 Jan 2010   [#28] In reply to [#25]
>>>>An Octane viewport inside MoI would be terrific ...

Should be the other way around. :o
From: Michael Gibson
12 Jan 2010   [#29] In reply to [#24]
Hi Phr0stByte,

> On a side note, maybe MoI could get CUDA enabled?

What part of MoI would you expect to use CUDA?

I'm not sure if CUDA is what you are expecting - it's a mechanism for giving programming access to the resources of your GPU for certain kinds of calculations.

It's most useful for something that can be broken up into something like a million little smaller tasks that can be run in parallel (similar to rendering).

It is not generally very feasible to use CUDA for the regular real-time viewport display of a modeler, which is what you seem to be thinking of? I mean it is possible, but it would mean writing a lot of custom code to just reproduce what is already set up in Direct3D or OpenGL.


> I am sure it would be much easier than porting to OpenGL..?

Nope - it would be more like 100 times more work than porting to OpenGL, because I'd basically be trying to rewrite what OpenGL does.

- Michael
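Michael's "million little smaller tasks" point can be illustrated in plain Python: rendering decomposes into one small independent task per pixel, which is exactly the shape of work CUDA runs well. The `shade()` function below is a hypothetical stand-in for a real per-pixel computation, and the thread pool stands in for the GPU's cores; this is a sketch of the decomposition, not actual GPU code.

```python
# Rendering as an embarrassingly parallel workload: one independent
# task per pixel, no shared state, no ordering between tasks. This is
# the kind of decomposition a GPU renderer like Octane exploits.

from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 8, 8  # tiny "image"; a real frame would be millions of pixels

def shade(pixel):
    """Hypothetical per-pixel task: depends only on its own inputs,
    so every pixel can run on its own thread (or GPU core)."""
    x, y = pixel
    return (x, y, (x * 31 + y * 17) % 256)  # fake color value

pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(shade, pixels))

print(len(image))  # 64 independent results, one per pixel
```

A modeler's interactive viewport, by contrast, is dominated by rasterization state and scene traversal that Direct3D/OpenGL already handle, which is why reproducing it in CUDA would be far more work, as Michael says.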
From: Michael Gibson
12 Jan 2010   [#30] In reply to [#24]
Hi Phr0stByte, also you wrote:

> This may allow us poor linux users to enjoy MoI in a VM

Actually this may already be possible - there are several Mac users that run MoI in a VM on Mac OSX using Parallels or VMWare.

I think that VMware is also available for Linux, so you can try that.


> or better yet, natively

Actually, you can already do that as well at least for basic stuff by using WINE.

You do have to set up IE6 or IE7 under WINE, though - look for something like "wine tricks" to find out how to get IE installed, and then you can actually launch MoI running the code directly in WINE. Not everything works - a couple of little menus don't show up - but for the most part it appears to be ok:
http://appdb.winehq.org/objectManager.php?sClass=version&iId=7383

- Michael
From: Samuel Zeller
12 Jan 2010   [#31] In reply to [#30]
PaQ >>>> but having 4 of them doesn't give you 8 GB, as the cards can't share memory

OK. That's bad, but not so much, because 4 GB cards are coming soon :)
