Re: Fwd: [Bf-committers] GSoC 2017: Camera breathing



Мукаев Виктор
What feature in particular are you talking about? Almost all trackers support zooming shots, so it would be great if Blender supported this feature too! I'm not sure how hard it would be to implement, but supporting variable focal length in the tracker is definitely a priority.

Date: Mon, 27 Mar 2017 19:41:57 +0000
From: Sean Kennedy <[hidden email]>
Subject: Re: [Bf-vfx] Fwd: [Bf-committers] GSoC 2017: Camera breathing
To: Blender motion tracking & VFX <[hidden email]>


I myself have never needed this feature. I checked with one of the trackers here where I work, and he said he hasn't done much of that (tracking the focus of a shot, if I am understanding the documents correctly).

Things I think would be more useful to the community at large would be:

Automatic tracking - While not useful for all shots, for basic shots like aerial flyovers or simple handheld shots, this would be a time saver.

Easier rebuilding of geometry - I know we have the "3d markers to mesh" button, but it simply creates vertices, which then have to be manually stitched together to create rough geo. There's gotta be an easier way to get rough scene geo, even if it's only updating that button to create vertices for selected tracks. Or being able to build geometry after the camera solve by specifying a few points, then moving to a different frame and re-specifying those same points. The solve should be able to rebuild that geo correctly from just that small amount of information.
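The suggestion above can be illustrated with a minimal sketch: given solved 3D bundle positions and a few user-picked point groups, build mesh data (verts plus faces) rather than loose vertices. The track names, coordinates, and helper below are all invented for illustration; a real version would read bundles from the tracking solve and create the mesh through Blender's API.

```python
# Sketch: solved 3D marker positions + user-picked point groups -> mesh data.
# Bundle names and coordinates here are made up for illustration only.

def tracks_to_mesh(bundles, face_groups):
    """bundles: {track_name: (x, y, z)} solved bundle positions.
    face_groups: lists of track names the user picked per face.
    Returns (verts, faces) in the usual index-based mesh layout."""
    names = sorted(bundles)
    index = {name: i for i, name in enumerate(names)}
    verts = [bundles[n] for n in names]
    faces = [[index[n] for n in group] for group in face_groups]
    return verts, faces

# Four tracks on a wall, picked by the user as one quad face:
bundles = {
    "wall.01": (0.0, 0.0, 0.0),
    "wall.02": (1.0, 0.0, 0.0),
    "wall.03": (1.0, 0.0, 1.0),
    "wall.04": (0.0, 0.0, 1.0),
}
verts, faces = tracks_to_mesh(bundles, [["wall.01", "wall.02", "wall.03", "wall.04"]])
print(verts)
print(faces)
```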

Easier planar tracking - Planar tracking where we can, for example, simply draw a grease pencil stroke around a flat, planar area, and have that area tracked throughout the shot to easily stick a plane track on to.
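The plane-track idea boils down to carrying the drawn region's corners from frame to frame with a per-frame homography. A bare sketch of that last step, using a made-up homography matrix rather than one estimated by a real tracker:

```python
# Sketch: once a per-frame 3x3 homography H is known (estimated elsewhere),
# the drawn planar region's corners are carried to any frame by projective
# transform. The H below is an illustrative example, not tracker output.

def apply_homography(H, pt):
    """Map a 2D point through a 3x3 homography (row-major nested lists)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Pure translation by (10, 5) is the simplest possible homography:
H = [[1.0, 0.0, 10.0],
     [0.0, 1.0, 5.0],
     [0.0, 0.0, 1.0]]
corners = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
print([apply_homography(H, c) for c in corners])
```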

Obviously I can't speak for everyone, but these are the things that would be the most helpful for how I use tracking and solving in Blender here at work every day.


From: [hidden email] <[hidden email]> on behalf of Ton Roosendaal <[hidden email]>
Sent: Monday, March 27, 2017 11:12 AM
To: [hidden email]
Subject: [Bf-vfx] Fwd: [Bf-committers] GSoC 2017: Camera breathing support


FYI. A student proposal.

I can't judge this feature well, please give feedback.
Is it really essential? Are there other ideas he could work on?


Ton Roosendaal  -  [hidden email]   -<>
Home of the Blender project - Free and Open 3D Creation Software

Chairman Blender Foundation, Director Blender Institute
Entrepotdok 57A, 1018 AD, Amsterdam, the Netherlands

> Begin forwarded message:
> From: Tianwei Shen <[hidden email]>
> Subject: Re: [Bf-committers] GSoC 2017: Camera breathing support
> Date: 27 March 2017 at 19:06:37 GMT+2
> To: bf-blender developers <[hidden email]>
> Reply-To: bf-blender developers <[hidden email]>
> Hi all,
> FYI, you can check out my proposal draft on this project at <>, if you are interested.
> Thanks,
> Tianwei
>> On Mar 25, 2017, at 12:08 AM, Jacob Merrill <[hidden email]> wrote:
>> What about using an object of known scale to calibrate (like a 3d printed
>> susan)?
>> On Fri, Mar 24, 2017 at 8:27 AM, Tianwei Shen <[hidden email] <mailto:[hidden email]>>
>> wrote:
>>> Hi Levon,
>>> Thank you so much for this long reply. First of all, I've been looking
>>> for user tests and suggestions for the multi-view reconstruction project.
>>> If you have ideas for making it better, feel free to drop me emails. On
>>> the other hand, it is still quite a large patch, so we need time to split
>>> it up and gradually merge it into master. But hopefully this can be
>>> integrated well with the camera breathing support project and even
>>> automatic tracking in the future.
>>> As for the camera breathing support, I didn't realize lens distortion
>>> parameters would also change with the focal length. I thought we'd only
>>> deal with changing focal lengths from zoom-in/out motions. So things
>>> become complicated here, since it seems to me that focal lengths and
>>> distortion parameters cannot be estimated on the fly. Users would have to
>>> first calculate this information (the focal length for each frame and the
>>> corresponding lens distortion parameters) using some calibration tool. Can
>>> the solver reliably deal with changing focal lengths and distortions? On
>>> the other hand, if users have to first calculate focal lengths with some
>>> external tool (if Blender doesn't have its own), would that impose a
>>> burden and inconvenience on users?
>>> Thanks,
>>> Tianwei
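The interplay Tianwei worries about can be sketched numerically: interpolate the focal length per frame between calibrated keyframes, then project with a one-term Brown radial distortion whose strength is (hypothetically) coupled to that focal length. All numbers and the coupling below are illustrative assumptions, not Blender's actual solver.

```python
# Sketch of why per-frame intrinsics matter for a breathing lens.
# The numbers and the k1-to-focal coupling are illustrative assumptions.

def interp_focal(keyframes, frame):
    """Linearly interpolate focal length (in pixels) between calibrated
    keyframes. keyframes: list of (frame, focal_px) pairs."""
    pts = sorted(keyframes)
    if frame <= pts[0][0]:
        return pts[0][1]
    if frame >= pts[-1][0]:
        return pts[-1][1]
    for (f0, v0), (f1, v1) in zip(pts, pts[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

def project(point3d, focal_px, k1):
    """Pinhole projection with a one-term Brown radial distortion."""
    X, Y, Z = point3d
    x, y = X / Z, Y / Z            # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2              # radial distortion factor
    return (focal_px * x * d, focal_px * y * d)

# A lens breathing from ~1400 px to ~1600 px focal length over 10 frames,
# with distortion that (hypothetically) weakens as focal length grows:
keys = [(1, 1400.0), (10, 1600.0)]
for frame in (1, 5, 10):
    f = interp_focal(keys, frame)
    k1 = -0.1 * (1400.0 / f)       # made-up coupling of k1 to focal length
    print(frame, f, project((0.5, 0.2, 4.0), f, k1))
```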
>>>> On Mar 24, 2017, at 8:57 PM, Levon <[hidden email]> wrote:
>>>>> Message: 1
>>>>> Date: Fri, 24 Mar 2017 02:26:38 +0800
>>>>> From: Tianwei Shen <[hidden email]>
>>>>> Subject: [Bf-committers] GSoC 2017: Camera breathing support
>>>>> To: bf-blender developers <[hidden email]>
>>>>> Message-ID: <[hidden email]>
>>>>> Hi Everyone,
>>>>> Last summer I participated in GSoC 2016 and worked on the multi-view
>>>>> camera reconstruction project. Some of my efforts are summarized in this
>>>>> blog: <>. And this patch (https://developer.blender.org/D2187) is now
>>>>> being reviewed and revised.
>>>>> This year I would like to apply again and work on camera breathing
>>>>> support, which was already requested by some users during the time I
>>>>> worked on the motion tracking project. Now I need clarification on some
>>>>> specific problems.
>>>>> 1. Should we automatically detect changes in focal length, or should it
>>>>> be specified by users as additional input (like the focal length for
>>>>> each frame)? I know we can read EXIF tags to get focal lengths for
>>>>> photos. Do we have a similar approach for videos?
>>>>> 2. Is the current UI able to handle camera breathing, if we need
>>>>> additional inputs from users?
>>>>> I think this project also has something to do with my revisions to the
>>>>> motion tracking system last summer. Hopefully I will be able to merge
>>>>> those revisions and move towards the goal of automatic tracking.
>>>>> Thanks,
>>>>> Tianwei
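One way to answer question 1 with user-supplied input would be a simple per-frame focal length file. The "frame focal_mm" layout below is a purely hypothetical convention sketched for discussion, not an existing Blender format:

```python
# Hypothetical input format for user-supplied per-frame focal lengths,
# as raised in question 1 above. Not an existing Blender convention.

def parse_focal_track(text):
    """Parse lines of 'frame focal_mm' into a {frame: focal_mm} dict.
    Blank lines and '#' comments are ignored."""
    track = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()
        if not line:
            continue
        frame_s, focal_s = line.split()
        track[int(frame_s)] = float(focal_s)
    return track

sample = """
# frame  focal length (mm)
1   35.0
12  36.5
25  40.0
"""
print(parse_focal_track(sample))
```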
>>> _______________________________________________
>>> Bf-committers mailing list
>>> [hidden email] <mailto:[hidden email]>
>>> <>

Bf-vfx mailing list
[hidden email]