This week we try out a new section: community news. We also mention our progress in the studio and on Blender Cloud add-on development, and even reveal some fun facts about the Open Movie projects. We tried to avoid it, but we ended up talking about VR as well, of course.
In this podcast: Andy, Hjalti, Pablo, Francesco, Julian
Music: Blabetté de Metz (Psy-Jazz Mix) - (cc-by-nc) keytronic
I have a question for the next podcast (apart from the one I already asked)... how did you composite all the 3D elements over the real footage if there is no shadow catcher shader in Cycles? I'm doing a shadow catcher study comparing it to my other render engine (Corona), and the current shadow catcher as it stands right now (patch D1788) is not working correctly, so I wonder how you managed to mix all the 3D with the real-life footage.
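(A note on the classic workaround, in case it helps: before a native shadow catcher existed, a common trick was to place a plain diffuse plane under the CG elements, enable the Shadow pass, multiply that pass over the live plate, and then alpha-over the CG render on top. Below is a minimal sketch of that compositor setup using the 2.7x-era Python API; the plate path and render layer name are illustrative, not from the actual project.)

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Enable the Shadow pass so the ground plane's received shadows
# come out as a separate image ("RenderLayer" is the 2.7x default name).
rl = scene.render.layers['RenderLayer']
rl.use_pass_shadow = True

rlayers = tree.nodes.new('CompositorNodeRLayers')

# The live-action plate (hypothetical path).
plate = tree.nodes.new('CompositorNodeImage')
plate.image = bpy.data.images.load('//plate.png')

# Darken the plate wherever the shadow pass is dark.
mul = tree.nodes.new('CompositorNodeMixRGB')
mul.blend_type = 'MULTIPLY'

# Put the CG elements (with alpha) back on top of the shadowed plate.
over = tree.nodes.new('CompositorNodeAlphaOver')
out = tree.nodes.new('CompositorNodeComposite')

links = tree.links
links.new(plate.outputs['Image'], mul.inputs[1])
links.new(rlayers.outputs['Shadow'], mul.inputs[2])
links.new(mul.outputs['Image'], over.inputs[1])
links.new(rlayers.outputs['Image'], over.inputs[2])
links.new(over.outputs['Image'], out.inputs['Image'])
```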
Love the "radio show" version of your studio feed, as video can be hard to access on mobile. Question for podcast, does Blender produce stereo vision by rendering 2 complete eyes or does it produce a center version, then interpolate the differences to create the left and right? This would be less processor intensive when creating 4K "lat-long" VR images.
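(For what it's worth: as far as I know, Blender's multi-view system renders each eye as a complete render rather than interpolating from a center view. Here's a minimal sketch of enabling it through the Python API, assuming Blender 2.75+ where multi-view landed; the interocular value is just the common human-eye default, not anything the team stated.)

```python
import bpy

scene = bpy.context.scene

# Stereo rendering: Blender renders BOTH eyes in full, offset by the
# camera's interocular distance; there is no interpolated center view.
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'

# For "lat-long" VR output in Cycles, use an equirectangular panorama camera.
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# The stereo baseline in Blender units; 0.065 approximates human eyes.
cam.stereo.interocular_distance = 0.065
```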
Also, what do the devs think of the new feature request site RightClickSelect.com?
I was still wondering about VR in Blender and Hjalti's concern about editing language (when to cut, how quickly, etc.) in this new storytelling medium.
A VR experience seems more like a stage show seen from the stage, where the situation is still manipulated but the passage of time is often achieved by modifying sets and lighting. For real-life performers you would have to merge takes and reaction times, à la George Lucas in The Phantom Menace (https://youtu.be/j8sBsnYNucM?t=45m45s), stretching motion and morphing actors as if they had shape keys. While the actual cutting of shots seems jarring, the use of dissolves might overcome that. Listen to this excellent podcast from CG Garage with guest Justin Denton for lots of great insight: https://labs.chaosgroup.com/index.php/cg-garage-podcast/cg-garage-podcast-33-justin-denton/
I have a question for Hjalti and Julian to discuss this week. Julian posted his patch on Right Click Select today, and it sparked a discussion about which useful features you could pull over from the viewport display settings.
https://rightclickselect.com/p/animation/nbbbbc/graph-editor-background-from-3d-view
Hjalti: What settings would you find useful to be able to enable/disable from the graph editor backdrop? For example: SSAO, DoF, Only Render, etc.
You mention Agent 327 in this podcast, and I've wondered about that project sometimes. What's the status of the Agent 327 project? Are you planning to make other co-productions in the future, or is this just a one-time thing?
PS: Community news is a great idea!
Nice to hear you guys! :) Thanks for the Ian Morgenstern trivia; that kind of detail is amazing! But I have to say, seeing you on video is always even better, because the stream carries more than just sound: with the picture you also sell the mood, the feeling, the Blender Institute charm :)
@piotao: Thanks! We used to use Periscope, but it meant always having someone's phone there, setting it up, and checking we were all in frame (we sit facing each other now; it's much simpler and conversations feel more natural). Also, Periscope streams are only saved for 24 hours, so we dropped it.
We have cameras, but... would you really watch a one-hour video of this same thing? Five guys sitting in a dark room surrounded by mics and cables: I don't know how much it would add to the final product, but it would definitely slow us down a bit because of syncing video/audio, exporting, uploading, and so on. We are getting better at recording everything in one go, so we don't edit the podcast itself much anymore (we try not to swear, but sometimes it happens and we mute that out). We record everything at the end of Wednesday and edit/publish/PR on Thursday morning, to keep the number of office hours used by the team to a minimum. We should be making content for our subscribers, not radio shows! :)
Any new Open Movies on the horizon? Blender evolves so much with each Open Movie that I'm eager to see what BI can produce now, and the development that comes out of it! Please have a testing build with Alembic and generic OpenVDB soon! :D Begging for it here... but hey, as they say... no pressure! hehehe