From: David (BLEND3D)
Argh... My laptop uses an Nvidia Go 7600... No CUDA for me :(
From: Frenchy Pilou (PILOU)
Funny, the buzz went around the Earth in one hour! :)
You are now everywhere! :D
From: Samuel Zeller
I hope the company behind Octane is strong and doesn't fail to fulfill all the requests/support/bug fixes :)
If so, it will be a major competitor in the rendering field. I'm waiting for the beta.
(I'm registered with the username "Sam" on the forum)
From: Michael Gibson
Hi Phil, I checked out the video and it looks like a very interesting renderer, quite unique to have something like this at an accessible price!
- Michael
From: Michael Gibson
Right now it seems somewhat "power user" oriented (lots of stuff to tweak, which is cool), but have you also considered something more HyperShot-esque for people who don't have much background in rendering?
That could be something like bundling a variety of HDRIs for lighting setups, making it more prominent in the UI to browse through them, stuff like that...
Although I guess since you are focused on users with CUDA cards looking for high performance, a "power user" is a good fit with that.
- Michael
From: Anis
Interesting link about Hypershot :
http://develop3d.com/blog/2010/01/bunkspeed-the-future
From: neo
Radiance's new baby looks cool indeed :) Phil, I have 3 CUDA cards in my Mac Pro waiting for the demo... :)
I can see Octane Render becoming a must-have for product visualisation and archviz too, once the RAM management becomes more solid...
From: neo
>Interesting link about Hypershot :
http://develop3d.com/blog/2010/01/bunkspeed-the-future
How dishonest is that? I always thought HyperShot was the most overpriced, rookies-only piece of software...
From: jbshorty
Wow, looks nice! Good luck with this Philbo! I'm glad to see Radiance has been working on a commercial product. I had no idea this was under development. Also it's really well priced and perfectly timed with both the arrival of Thea and the questionable future of Hypershot.
My GTX280 is waiting to test this. One question I have: does this use only HDRI lighting, or are there user-created lights as well?
jonah
EDIT - Just noticed it has emissive material properties, so that answers my question about lighting... :)
From: PaQ
Hi Phil,
This looks promising.
Are there any restrictions on the .obj coming from MoI? I mean, is it possible to use ngons? Are the vertex normals fully translated? Except for Maya, .obj is really a problem with this ngon/normals stuff, whether you use 3dsmax, modo or LightWave ... not to mention the UV limitation too (only one channel).
Are you planning to extend the file support (true .3dm import with tessellation on the fly, .lwo, .fbx)?
I'm also really curious to see a more complex indirect lighting scenario, and how long it takes to clean up the noise.
As a final note, there is something a little bit strange with the textures in the current examples ... I'm not a technical guy, but they look really blurred/washed out/compressed ... there is this kind of realtime-texture feeling about it; maybe it's the filtering, I don't know. (I know it's still in alpha.)
From: Rudl
What is a CUDA card? I have a MacBook Pro. Does it have a CUDA card?
Rudl
From: Ralf-S
CUDA-Enabled Products:
http://www.nvidia.com/object/cuda_learn_products.html
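For anyone who'd rather check programmatically than scan that list, here's a minimal sketch using the CUDA runtime API (assuming the CUDA toolkit and an Nvidia driver are installed); it just counts the devices the driver reports and prints each one's name, compute capability and memory:

// check_cuda.cu - minimal sketch: list the CUDA-capable GPUs the driver reports.
// Build (assuming the CUDA toolkit is installed): nvcc check_cuda.cu -o check_cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found (%s)\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, compute capability %d.%d, %zu MB of memory\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}

If the count comes back as zero, the GPU isn't CUDA-capable and a CUDA-only renderer won't run on it.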
From: neo
>What is a CUDA card? I have a MacBook Pro. Does it have a CUDA card?
I also have a MacBook Pro with a 9600M GT (CUDA-enabled)... BUT that card could only render TEAPOTS :) if you know what I mean.
From: Samuel Zeller
Well, I also have a MacBook Pro with a 9600M GT, and it's already faster than a quad core at 2.4 GHz (Q6600)
According to this graph
http://www.refractivesoftware.com/forum/viewtopic.php?p=1150#p1150
That means you can render like 4 times the amount of teapots in the same amount of time!!!
From: Rudl
My MacBook Pro is a little bit older and only has the 8600M GT in it.
Is it worth giving this renderer a try?
Rudl
From: ed (EDDYF)
.
From: Phr0stByte
I just purchased my CUDA-enabled card and will be purchasing Octane as soon as they put up the link.
On a side note, maybe MoI could get CUDA enabled? That might allow us poor Linux users to enjoy MoI in a VM, or better yet, natively. I am sure it would be much easier than porting to OpenGL..? I purchased MoI v1.0 way back and honestly just threw my money away, as I have only used it a couple of times at work (when I was supposed to be working - I don't run Windows at home).
From: PaQ
An Octane viewport inside MoI would be terrific ... (sorry, just thinking out loud)
... I'm half sold (well, completely in fact; at 100 Euro I can't go wrong) ... it's just this memory limitation that's a bit annoying. Today's 'render' stations easily have 8 or 12 GB of RAM (I'm on 16 GB here, and I already feel limited sometimes). I'm not really sure nvidia will put that much memory on a gaming card any time soon ... the biggest available are 'only' 2 GB (and that's already a waste of memory for current games' requirements).
From: Samuel Zeller
PaQ, it will support multiple cards, so 4 cards at 1 GB will be like one card at 4 GB.
Also, 2 GB cards are coming soon (at mainstream prices).
From: PaQ
Yes, I know about the 2 GB cards ... but having 4 of them doesn't give you 8 GB, as the cards can't share memory ... one copy of the scene data will be stored per GPU to speed up the rendering. (I already asked on the Octane forum ;))
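A minimal sketch of why the memory doesn't add up (just an illustration of how the CUDA runtime treats multiple devices, not Octane's actual code): each GPU has its own address space, so a multi-GPU renderer ends up uploading a full copy of the scene to every device it uses, and the smallest card's VRAM stays the ceiling on scene size.

// multi_gpu_copy.cu - sketch: scene data is duplicated on every GPU, not pooled.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t sceneBytes = 512ull * 1024 * 1024;   // pretend the scene needs 512 MB
    std::vector<char> scene(sceneBytes, 0);           // scene data prepared on the host

    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);

    std::vector<void*> gpuCopies(deviceCount, nullptr);
    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);                           // switch to this device's own memory space
        if (cudaMalloc(&gpuCopies[dev], sceneBytes) != cudaSuccess) {
            std::printf("Device %d: not enough memory for the scene\n", dev);
            continue;
        }
        // Every GPU gets its own full copy; VRAM is never shared or summed across cards.
        cudaMemcpy(gpuCopies[dev], scene.data(), sceneBytes, cudaMemcpyHostToDevice);
        std::printf("Device %d holds a full %zu MB scene copy\n", dev, sceneBytes >> 20);
    }

    for (int dev = 0; dev < deviceCount; ++dev) {
        if (gpuCopies[dev]) { cudaSetDevice(dev); cudaFree(gpuCopies[dev]); }
    }
    return 0;
}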