Panasonic IR interview - talks about GH2 hacker community
  • Thought some of you might find this interesting - first time I've really heard what Panasonic thinks about the GH2 hacker community - seems quite positive!

    http://www.imaging-resource.com/news/2012/01/15/panasonic-learning-from-the-fringe-taking-small-steps

  • 41 Replies
  • I have only one comment on this: it's a marketing guy.

  • What I found interesting:

    "The GH2 has a full digital Micro Four Thirds 16 megapixel sensor, whereas the GX and the G3 have analog 16 megapixel Micro Four Thirds sensors. The numbers are the same, but the technology is vastly different. Now, what the digital version gets you is far better video--far better, faster readout rates--but it’s a lot more expensive. So between the two cameras you get a better video read on a GH2 than you will with a G3, for example."

  • @Strangways

    Again, it is a marketing guy.

    And as usual, he doesn't understand shit about the products.

  • LOL:


    AE: So there has been a lot of interest in hacking GH2s. Might you take to heart some of those optimizations they're making when you're looking at future developments?

    DP: You know... Yes and no? We do take a look at what they do, and we take it very seriously, because obviously that's what the customer wants. There are some limitations within the hardware and within the system specifications. For example, the AVCHD format; you know, we have to stay within the confines of what that AVCHD format says. Yes, you can go outside that with some hyped-up features and firmware updates and that sort of thing, but then you're outside the normal specification, and for us as a manufacturer, we really have to stay within it. At the same time, we know the level that the sensors can handle, we know the amount of cooling it's going to take, and to push it past it's limits, you're going to degrade the life expectancy of the product. Now, with a hopped-up camera, you somewhat expect that. You know, it's just like a car; if you push your car to the limit, you know something's going to happen eventually, right? So, as a consumer, go for it! But as a manufacturer, we have to stay within the limits of the standards and what we know the system itself can handle long term.


    Hehehe, this kinda sounds like a BS answer to me. I dunno too much, but as far as I know AVCHD doesn't have any such limitations. Right? It's a file format; it only accepts parameters, and everything we do is nothing special. We only do what they didn't do. Whatever limitations there are, are determined by the hardware, aren't they? It's kinda like blaming sensor noise on the JPEG file format. LOL. He gets to the hardware limitations too, but I just thought it was kinda funny that he started off by claiming our mods were out of specification for the format. :D

    Yup, "it's a marketing guy". ;)

  • AVCHD has clear limitations.

    H.264 does not have such limitations.

    And they really can't start making a camera with 100-150Mbps, as return numbers would be huge. It won't play on their new TVs. It won't record on their $5 SD cards. Many reasons to be unhappy.

  • Really? There are bitrate limitations written directly into the format? Sounds odd.

  • They could hide a switch in a third menu level that requires entering a code from the website. Then nobody could moan that he did not know. Then again... people will enter the code, and then they will fail, and then they will blame Panasonic, no matter what the huge red warning sign said... so I guess if I were responsible for the decision, I would take the safe route as well.

  • I believe that a 1080/50p mode at 35 or 50 Mbit/s in the GH3 would not harm anybody. I hope Panasonic learned something from the hacks and makes something similar to the 24 Mbit HBR mode.

  • AVCHD is a brand name used by Sony and Panasonic to market a specific subset of H.264 encoder technology. While it's true that Panasonic is constrained by AVCHD specs to 25 Mbps 8-bit 4:2:0 encoding quality, these limits are entirely self-imposed. In past instances Panasonic has seen fit both to extend the AVCHD spec and to step outside of it in order to market video features the standard doesn't support.

    Technically, GH2 MTS files are internally tagged as High Profile Level 4.0 H.264 encodings. The maximum bitrate, color depth, and resolution for this and higher H.264 profile levels can be found here:

    http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels
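
    To make the distinction concrete, here's a small Python sketch of the per-level bitrate caps from that Wikipedia table (values from the H.264 level table; High Profile permits 1.25x the Baseline/Main MaxBR). It's illustrative only, but it shows why a 100 Mbit/s hacked stream is perfectly legal for H.264 itself at higher levels, while blowing far past the Level 4.0 cap the GH2's files are tagged with:

```python
# Maximum video bitrates by H.264 level, per the public level table.
# Values in kbit/s for Baseline/Main/Extended; High Profile gets 1.25x.

BASE_MAX_BR = {
    "4.0": 20_000,   # what GH2 MTS files are tagged as
    "4.1": 50_000,
    "4.2": 50_000,
    "5.0": 135_000,
    "5.1": 240_000,
}

HIGH_PROFILE_FACTOR = 1.25  # High Profile bitrate multiplier

def high_profile_max_mbps(level: str) -> float:
    """Maximum High Profile video bitrate for a level, in Mbit/s."""
    return BASE_MAX_BR[level] * HIGH_PROFILE_FACTOR / 1000

for lvl in BASE_MAX_BR:
    print(f"Level {lvl}: {high_profile_max_mbps(lvl)} Mbit/s max")
```

    So Level 4.0 tops out at 25 Mbit/s for High Profile, while Level 5.0 allows well over 150 Mbit/s; the format isn't the bottleneck, the level tag is.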

  • Yes, but they could theoretically use AVC-intra/AVC-Ultra for the GHx series too.

    http://en.wikipedia.org/wiki/AVC-Intra

    I think it comes down to what VK said, it needs to play on consumer devices and use cheap media since the camera was designed for the general consumer.

    Maybe the next GHx model will be more "professional" and will utilize higher quality cards and higher level AVC. Then again, that might eat into their professional market and they won't.

    Who knows.

  • They can use plain old .mp4 to avoid the constraints of AVCHD. If my memory is correct, that's how the GX1 records 1080/30p. Here's hoping future Lumix cameras will support high bit rate progressive .mp4 recording, in addition to the AVCHD format that they seem so attached to. We are seeing .mp4 more and more from other manufacturers. There's a JVC camera now that records 1080/60p at 35 Mbps.

  • Just good software engineering could overcome some of the troubles Vitaliy mentioned. As just one example: if they're intent on supporting the $5 memory cards, all they would need to do is read and write a test file every time a card is inserted or formatted, rate the card's speed, and write a bitrate limit to a spec file located on the card itself. We might see a "Detecting Media Capabilities" message for a second or two (or it might be transparent to the user), but that would ensure media compliance across the board, if they actually wanted to deliver the goods, so to speak. ;)

    Also, what some may not be considering here is that Panasonic designs and manufactures competing products. And in order to do that successfully, they need to (or they think they need to) successively demote (read: cripple) the lesser models. Remember, they're trying to sell a pro-level video camera too. And there are models below the GH series as well. I know of several people on the forums, for example, who would not have purchased the pro video camera had the GH1 (or GH2) exploited its full potential. I myself would have gone for a G1 had both it and the GH1 been fully enabled at the time I made the purchase. Etcetera.

    I just thought it was funny that the guy started to blame everything on software limits that to my knowledge don't really exist. Which of course is something a guy from marketing would do. Vitaliy nailed it. ;)

  • Is it true what the rep says? the hack shortens sensor lifespan?

  • @brianluce

    No, it is not true.

    The hack does not touch the sensor at all :-)

  • @brianluce: use some logic

    A sensor is, well, a sensor: it is a pickup device with a certain voltage run across it, and values are read directly from it (in the GH2's case). Everything the hack does is play with those values/numbers after they have come off the sensor, in the image processor. The sensor does the same amount of 'work' running constantly for 4 minutes stock as it does running for 4 minutes at a higher bitrate with the other hack settings.

  • Okay so in other words, the Panny marketing rep is full of crap.

  • I'm actually really embarrassed for Panasonic for letting this guy open his mouth at all. When he stated:

    "At the same time, we know the level that the sensors can handle, we know the amount of cooling it's going to take, and to push it past its limits, you're going to degrade the life expectancy of the product."

    This made me laugh out loud. He obviously doesn't have the slightest clue. He's acting like the hack is overclocking the processor... What a nitz.

  • That might actually be true "TECHNICALLY" speaking. LOL

    For example, if you ALWAYS operate your camera in 100˚ F weather, then using the hack could shorten the camera or sensor's "average" lifespan from 15 years (or whatever) to about 14 years and 250 days. ;)

    Of course you'll no longer own the camera by then, and 14 years and 250 days from now you probably won't have any interest in µ4/3 cameras at all. Not to mention that if you actually use your camera, the shutter will probably give out looooong before then. And Panasonic of course will not service ANY of their cameras with a busted shutter after about 100 shots or so - according to Panasonic's own service management's statements to me, and as written directly in the warranty.

    But oops... I guess he forgot to mention all those factors. :)

    In the end your camera/sensor is about 99.9999999999% more likely to break from an encounter with cosmic rays than from using different encoding parameters.

  • uh... What might be true?

    Definitely not technically. When these parameters are changed, they will either run within the clock cycle or not. Again, this IS NOT overclocking. Period.

  • @proaudio4 Don't be too cynical; it may not be overclocking or affecting the sensor, but making the processor run at higher load levels for longer periods of time than it normally would, i.e. as a high-bitrate hack does, is inevitably going to make the camera run hotter than it was originally designed to. However, in practice the cooling may be more than adequate, and the effect on lifespan from the stress may well be negligible.

  • @Ptchaw Care to cite your technical basis for claiming that a high-bitrate hack "is inevitably going to make the camera run hotter than it was originally designed for"? What is this "load" you're speculating about and how are you estimating its power dissipation? Or perhaps by "hotter" you're referring to the GH2's sex appeal?

  • @Ptchaw is exactly right to say that making the encoder run at a higher bit rate will make the camera run hotter. CMOS circuit designs, which modern computers generally use (and certainly any computers designed for high computing power and low electrical power), consume more power and generate more heat when their transistors switch more frequently. Making an encoder produce a higher bit rate leads to more frequent switching: the encoder is doing more work in the computational sense (more calculations per second).

    But whether the higher temperature exceeds the camera's thermal designs is just speculation. The camera will run hotter than it would as designed, but not necessarily hotter than the design allows for.

    Let's not scrutinize the technical things a marketing guy has said. I give him credit for knowing more than I would have expected him to know.
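
    The CMOS argument above can be put in back-of-envelope form with the classic dynamic-power relation P = a * C * V^2 * f, where a is switching activity. Every number below is invented purely to show the scaling; none of them are GH2 measurements:

```python
# Dynamic power in a CMOS circuit: P = a * C * V^2 * f
#   a = switching activity factor (fraction of gates toggling per cycle)
#   C = effective switched capacitance, V = supply voltage, f = clock

def dynamic_power_watts(activity: float, cap_farads: float,
                        volts: float, freq_hz: float) -> float:
    return activity * cap_farads * volts ** 2 * freq_hz

C = 1e-9    # 1 nF effective capacitance (illustrative)
V = 1.1     # core voltage (illustrative)
F = 200e6   # 200 MHz clock (illustrative)

stock = dynamic_power_watts(0.10, C, V, F)   # lighter encoder load
hacked = dynamic_power_watts(0.15, C, V, F)  # heavier encoder load
print(f"stock: {stock:.3f} W, hacked: {hacked:.3f} W")
```

    The point is only that power scales linearly with activity even at a fixed clock, so "more work per second" does mean more heat; whether that extra heat matters is exactly the thermal-headroom question, which these toy numbers can't answer.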

  • @balazer Yes, I agree, running the encoder at a higher bitrate definitely makes the GH2 a hotter camera. The raw sex appeal it generates raises my blood pressure, and the heat from my sweaty palms is no doubt conducted directly to the image sensor. But I think that just makes the pixels swell up even fatter, giving it that juicy organic film grain we all cream over. All those hot computations pimp out that phat bitrate, which must be why my SD cards wear out faster, but no matter, it's all good!

  • @LPowell

    lol Watch it....you might go blind!!! :-)

  • Call me crazy, but isn't the processor actually doing less work with less compression / higher bit rates?