Tagged with depth - Personal View Talks
https://personal-view.com/talks/discussions/tagged/depth/feed.rss

Focos for Video
Posted by tida, Thu, 06 Feb 2020 - https://personal-view.com/talks/discussion/23499/focos-for-video

Focos is an app for iPhone which emulates 3D lens focus for still images. In the article below, the company says it is currently developing a version for video. This should be very interesting if it works as well for video as it already does for images.

https://www.theverge.com/2019/9/9/20856585/focos-2-0-hands-on-portrait-mode-blur-any-photo

Bit depth and sample rate - Impact on audio editing quality
Posted by joethepro, Thu, 28 Jul 2016 - https://personal-view.com/talks/discussion/15438/bit-depth-and-sample-rate-impact-on-audio-editing-quality

If I were to record two WAV files, one at 16-bit/48 kHz and one at 24-bit/96 kHz, I understand that there would be little to no perceptible difference in sound quality to most listeners. But when it comes to editing, how do the higher-quality files make a difference? For example, will I be able to pitch shift or slow down the 24/96 file further before it starts breaking up?
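
A rough back-of-the-envelope sketch (mine, not the poster's; the figures are the usual textbook approximations) of the two things heavy editing actually runs into: the quantisation noise floor, which limits how far you can push gain before the floor becomes audible, and the recorded bandwidth, which sets how much top end survives a big slow-down.

    # Roughly 6 dB of signal-to-quantisation-noise per bit, and a 2x slow-down
    # (without pitch correction) halves every recorded frequency.

    def noise_floor_dbfs(bits: int) -> float:
        """Approximate quantisation noise floor relative to full scale."""
        return -6.02 * bits

    def top_freq_after_slowdown(sample_rate_hz: float, factor: float) -> float:
        """Highest recorded frequency (Nyquist) after slowing playback by `factor`."""
        return (sample_rate_hz / 2.0) / factor

    for bits, rate in [(16, 48_000), (24, 96_000)]:
        print(f"{bits}-bit/{rate // 1000} kHz: "
              f"noise floor ~{noise_floor_dbfs(bits):.0f} dBFS, "
              f"top end after a 2x slow-down ~{top_freq_after_slowdown(rate, 2) / 1000:.0f} kHz")

So the 24/96 file does not sound better on straight playback, but it leaves roughly 48 dB more space below the signal and a full audible band after a 2x slow-down, which is where the extra editing headroom shows up.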

Frame Rate Downscaling / Frame Averaging
Posted by Athiril, Fri, 16 Jan 2015 - https://personal-view.com/talks/discussion/12188/frame-rate-downscaling-frame-averaging

Anyway, the basic idea is the same as image stacking: you stack two or more images, and one of the advantages is that if you do it in a high bit depth space, the bit depth of the result goes up. That means 8-bit images gain bit depth when the stack is calculated in a higher bit depth space.

I commented offhand that if you're desperate enough you can get higher bit depth by merging frames together if you have access to 50p, 60p etc. This also works if you downscale a higher resolution to a lower one within a higher bit depth space - so for all you people downscaling footage, if you're able to do that in a 16-bit space and save to a high bit depth format, you'll have something to gain. I didn't speak for practicality, only that it can be done.
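
A minimal NumPy sketch of the merging idea (assumed random arrays standing in for two consecutive 50p frames, not the actual tool or footage discussed here): average in a wider space, then quantise back out to 16-bit so the in-between values keep their own codes.

    import numpy as np

    # Two consecutive 8-bit frames (random stand-ins; in practice, adjacent 50/60p frames).
    frame_a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    frame_b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

    # Average in float so half-steps like 240.5 survive; averaging directly in uint8
    # would just round them away again.
    avg = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2.0

    # Re-quantise into a 16-bit container, where those in-between values are representable.
    out_16bit = np.round(avg / 255.0 * 65535.0).astype(np.uint16)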

When I wrote my program to count unique colours in very large 16-bit TIFF images, I didn't see other software that would do it, since a naive lookup array over the whole colour space would require tremendous memory. The simpler way I found trades processing power for memory: sort the pixel colour values in an array from smallest to largest, then walk through them and only add 1 to the unique colour count when a value differs from the one before it - identical colours end up next to each other, so repeats are easy to skip.
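
For what it's worth, here is a small sketch of that sort-and-compare trick as I read it (a NumPy version with an assumed 16-bit RGB input, not the author's actual program):

    import numpy as np

    def count_unique_colours(img: np.ndarray) -> int:
        """img: HxWx3 uint16 array. Counts distinct colours without a huge lookup table."""
        r = img[..., 0].astype(np.uint64)
        g = img[..., 1].astype(np.uint64)
        b = img[..., 2].astype(np.uint64)
        # Pack each pixel into one integer so a colour can be compared as a single value.
        packed = r * np.uint64(1 << 32) + g * np.uint64(1 << 16) + b
        # Sort so identical colours sit next to each other...
        flat = np.sort(packed.ravel())
        # ...then count only the positions where the value changes from its neighbour.
        return int(1 + np.count_nonzero(flat[1:] != flat[:-1]))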

I had other reasons for doing this - it's part of my experiments with image stacking - but I've done it with various image sources: scans, jpegs from shitty cameras with no raw, raw files, jpegs from better cameras, and snapshots of two frames of video.

I've been told it "can't be done" many times, because the information from one frame to the next isn't the same - well no, that's exactly why it works. Even in the worst case scenario of a tripod-locked shot of a still life with nothing moving, it works.

But if you're recording 50p, blending every 2 frames together simply gives you 25p footage. The average of the two frames still contains the same amount of motion blur/movement as recording in 25p at the same shutter angle, though it may be divided into 2 sections over the one frame instead of 1 long streak of motion.

Generally that shouldn't be an issue, as 180 degrees is accepted as a good general-purpose shutter angle and leaves a gap between the motion blur streaks anyway; if you wanted no gap you could shoot at 360 degrees.
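
The standard shutter-angle arithmetic behind that (just the usual formula, not from the post) shows the total blur per output frame works out the same either way:

    def exposure_time(frame_rate_fps: float, shutter_angle_deg: float) -> float:
        """Per-frame exposure time for a given frame rate and shutter angle."""
        return (shutter_angle_deg / 360.0) / frame_rate_fps

    print(exposure_time(25, 180))      # 0.02 s: one 1/50 s blur streak per native 25p frame
    print(2 * exposure_time(50, 180))  # 0.02 s: the same total blur, split into two streaks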

Now to the screenshot. The R, G, B bit values represent how much of the scale red uses on its own, regardless of which green and blue values it is paired with (and the same goes for green and blue): a value of 6 bits would mean the image only has 64 differing red values in total. I made this to identify problems with some images.
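
A quick sketch of how I read those per-channel figures (an assumed helper, not the screenshot's actual code): count the distinct values a channel uses and express that as a bit count, so 64 distinct red values reports as 6 bits.

    import numpy as np

    def per_channel_bits(img: np.ndarray) -> dict:
        """img: HxWx3 array. Effective bits per channel = log2 of the distinct values it uses."""
        return {name: int(np.ceil(np.log2(len(np.unique(img[..., i])))))
                for i, name in enumerate(("R", "G", "B"))}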

Unique Colours is the total number of colours. An 8-bit image has a maximum value of 24 bits here (about 16.7 million colours), but that would be very hard to achieve: the one image would have to contain every combination of colour, hue, intensity and saturation, from bland to neon, and the resolution would need to be 16.7 million pixels or more.

The Unique Colour Factor: a 1920x1080 frame has 2,073,600 pixels, so a value of 50% would mean 1,036,800 unique colours in the image. This percentage can be used as a measure of colour quality and separation.
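
The factor itself is just the unique colour count divided by the pixel count - here as a self-contained sketch (my phrasing, not the original program):

    import numpy as np

    def unique_colour_factor(img: np.ndarray) -> float:
        """Unique colours as a percentage of total pixels (50% = 1,036,800 of 2,073,600 at 1080p)."""
        pixels = img.reshape(-1, img.shape[-1])
        return 100.0 * len(np.unique(pixels, axis=0)) / len(pixels)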

The first example is a single frame grab from some A7S (720p 120fps) video I downloaded off the net (I was going to use my GH2, but I left it at work this afternoon). It was placed into Photoshop, converted to 16-bit and saved as a tiff - partly to rule out the conversion to 16-bit itself as the cause of the rise in the bit depth count, and also because the program I wrote only reads 16-bit tiffs at the time of writing.

The second example is the first 2 frames in 16-bit mode in Photoshop, blended with the top layer at 50% opacity, flattened and saved as a tiff.

The fourth is the same but with 4 frames.

As you can see, the more frames you stack, the more in-between values with higher-than-8-bit precision appear - roughly 9x the unique colours here.

It should only be logical: even if the video (or the two photos) is a tripod-locked shot of a still life, pixel values will show some variance across the two frames, especially since 8-bit has high quantisation error compared to higher bit depths, plus variable noise, micron-scale image movement and so on. So if one frame reads 241 and the other 240, for example, the in-between value cannot be represented in 8 bits.
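
Putting that 240/241 case into numbers (hypothetical values, just to show the rounding):

    a, b = 240, 241                     # the same pixel in two consecutive 8-bit frames
    avg_8bit = (a + b) // 2             # 240: rounded back into 8-bit, the half-step is lost
    avg_16bit = int((a + b) / 2 * 257)  # 61808: 257 = 65535/255, so this sits between
                                        # 240*257 = 61680 and 241*257 = 61937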

Unfortunately I'm working tomorrow, but that means I can get my camera back before Monday, so I probably can shoot some footage with the lowest contrast settings and see if I can induce banding in something and post up an actual video sample.

Consolidated Specs for Cinema Cameras with RAW
Posted by thepalalias, Thu, 22 Nov 2012 - https://personal-view.com/talks/discussion/5270/consolidated-specs-for-cinema-cameras-with-raw

I went through the list of topics in Raw Cameras and did not see a list like this, so since I had been compiling some specs for the blog post I put up today at http://www.pasadenapulse.net, I thought I would put them here so that others can add to the list (and correct any errors that are statistically likely to have been added :)

These cameras all have (or are slated to have soon) RAW recording capability in some form.

Max Horizontal Resolution

  • BMCC: 2.4K
  • Arri Alexa: 2.88K max horizontal resolution.
  • Sony FS-700: 4K via a planned upgrade; 1.92K at release.
  • Sony F55: 4K max horizontal resolution (4096x2160).
  • Canon C500: 4K max horizontal resolution (4096x2160 RAW max).
  • Sony F65: Greater than 4K.*
  • Red One/Red One MX: 4.5K max horizontal resolution.
  • Red Scarlet: 5K max horizontal resolution (but 4K max horizontal resolution at 24/25/30P).
  • Red Epic: 5K max horizontal resolution.
  • Red Dragon: 6K max horizontal resolution. Unreleased.

*As for the Sony F65, I don't want to get into that can of worms. With its 20MP sensor, let's just say "easily 4K, probably more in the future" and not get too far into it, other than to say that the current recording format is 4K.

Cameras with Global Shutter

  • Sony F55

Max Framerates at 4K and Highest Resolution (with the color depth recorded)

  • BMCC: Cannot do 4K. Max 30P 12-bit log at 2.5K.
  • Arri Alexa: Cannot do 4K. Max 60P 12-bit log at 2.88K in 16x9 mode.
  • Red One MX: 30P 12-bit at both 4.5K and 4K.
  • Canon C500: 60P 10-bit at 4K.
  • Sony FS-700: Upcoming 60P at unspecified RAW bit-depth in 4K.
  • Sony F55: 60P 16-bit linear 4K.
  • Sony F65: 120P 16-bit linear 4K.
  • CMV12000: 90P 12-bit or 150P 10-bit at 4K.
  • Red Scarlet: 30P 16-bit at 4K. 12P at 5K.
  • Red Epic: 150P 16-bit at 4K. 120P 16-bit at 5K.
  • Red Dragon: 4K framerate unknown (expected to meet or exceed the Red Epic at all framerates, with 120P at 5K guaranteed). At least 85P 16-bit at 6K. Unreleased.

Native DR

  • Sony FS-700: Untested in RAW.
  • Red One (not MX): 11.3 stops official.
  • Canon C500: 12 stops official. (Though the highlight roll-off has been garnering praise)
  • BMCC: 13 stops official.
  • Red MX (as used in MX/Scarlet/Epic): 13.5 official. (Some sources measured at under 12 but I'm not invested in one measurement over the other).
  • Arri Alexa: 14 stops official.
  • Sony F55: 14 stops official.
  • Sony F65: 14 stops official.
  • Red Dragon: Over 15 stops official. Unreleased.

And of course, the Red Scarlet and Epic (and the upcoming Dragon) support the HDRx mode that some people like and others do not; it is officially specified as increasing dynamic range to 18 stops (with a lot of caveats to go along with it).
