TV streams @ 1280 x 720 @ 25 or 50 fps with typical graphics card


  1. Posts : 11,247
    Windows / Linux : Arch Linux
       #1



    Hi folks
    Just wondering, with an average graphics card and a standard 26 inch monitor, whether there is any discernible difference between capturing a TV/video stream at standard broadcast resolution (1280 x 720p) at 25 fps or 50 fps.

    While the resolution is the same in both cases, a 1 hr program at 50 fps occupies approx 2 GB on the HDD and the 25 fps version takes approx 1 GB.
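    Rough sums on those sizes (treating the 2 GB / 1 GB figures as approximate and assuming the whole hour is video):

    Code:
    # Back-of-the-envelope average bitrate from the file sizes above
    # (the ~2 GB and ~1 GB figures are approximate, so these are ballpark numbers).

    def avg_bitrate_mbps(size_gb, duration_s):
        """Average bitrate in Mbit/s for a file of size_gb (GiB) lasting duration_s seconds."""
        return size_gb * 1024**3 * 8 / duration_s / 1e6

    hour = 3600
    print(f"50 fps, ~2 GB: {avg_bitrate_mbps(2, hour):.1f} Mbit/s")   # roughly 4.8 Mbit/s
    print(f"25 fps, ~1 GB: {avg_bitrate_mbps(1, hour):.1f} Mbit/s")   # roughly 2.4 Mbit/s

    So going by those sizes, the 50 fps file isn't just doubling frames for free; the encoder is spending roughly twice the bitrate on it.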

    I've downloaded (legally) the same program from BBC iPlayer at 50 fps and 25 fps and run them on computers with decent graphics cards -- and I can't see any difference. So are my eyes going bonkers, or are the TV companies in cahoots with the ISPs, hoodwinking you into using more bandwidth for program downloads than you need?

    (Note: most current TV channels, including Amazon, usually allow downloads at 1280 x 720p - very few "saveable" streams at full HD (1920 x 1080p) and none, as far as I can see, at full UHD 4K.)

    Cheers

    jimbo


  2. Posts : 4,666
    Windows 10 Pro x64 21H1 Build 19043.1151 (Branch: Release Preview)
       #2

    @jimbo45

    The biggest difference between the two is the amount of motion blur and sharpness. But all this depends on how the original footage was filmed.

    A 50 fps movie/video is a lot sharper during fast-paced action scenes. If you watch a documentary where they film the same picture for 10 minutes in a row, you will see absolutely no difference between the two.


    What is more important is that if you watch on a 50/100/200 Hz TV, your source should be 25 or 50 fps so it synchronizes better. On a 60/120/240 Hz TV, 30 or 60 fps footage is much better.
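    To illustrate (just a rough sketch, assuming the TV simply repeats whole source frames to fill its refresh rate, with no motion interpolation):

    Code:
    # How many panel refreshes each source frame gets held for.
    def repeat_pattern(fps, refresh_hz, frames=5):
        pattern, shown = [], 0
        for frame in range(1, frames + 1):
            target = round(frame * refresh_hz / fps)  # refreshes elapsed by the end of this frame
            pattern.append(target - shown)
            shown = target
        return pattern

    print("25 fps on 50 Hz:", repeat_pattern(25, 50))  # [2, 2, 2, 2, 2] -> even, smooth
    print("25 fps on 60 Hz:", repeat_pattern(25, 60))  # [2, 3, 2, 3, 2] -> uneven, judder
    print("30 fps on 60 Hz:", repeat_pattern(30, 60))  # [2, 2, 2, 2, 2] -> even again

    The uneven 2-3-2-3 hold pattern is what shows up as judder on pans.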


  3. Posts : 11,247
    Windows / Linux : Arch Linux
    Thread Starter
       #3

    slicendice said:
    @jimbo45

    The biggest difference between the two is the amount of motion blur and sharpness. But all this depends on how the original footage was filmed.

    A 50 fps movie/video is a lot sharper during fast-paced action scenes. If you watch a documentary where they film the same picture for 10 minutes in a row, you will see absolutely no difference between the two.


    What is more important is that if you watch on a 50/100/200 Hz TV, your source should be 25 or 50 fps so it synchronizes better. On a 60/120/240 Hz TV, 30 or 60 fps footage is much better.
    Hi there

    220 V AC 50 Hz is the standard mains here, but US standards are different - 110 V AC 60 Hz I think.

    It is better on a large TV - then I can see the difference quite clearly with motion, and even the subtitles are clearer. I suppose smaller screens mask a lot of these differences, so I have to be a bit sceptical about Apple supposedly wanting 4K-type resolution on their new 6 inch iPhone screens!
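    Quick pixel-density sums that make me sceptical (assumed panel sizes, 16:9 throughout):

    Code:
    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch for a panel of the given resolution and diagonal."""
        return hypot(width_px, height_px) / diagonal_in

    print(f'6 inch phone at 3840x2160: {ppi(3840, 2160, 6):.0f} ppi')      # ~734 ppi
    print(f'26 inch monitor at 1280x720: {ppi(1280, 720, 26):.0f} ppi')    # ~56 ppi
    print(f'26 inch monitor at 1920x1080: {ppi(1920, 1080, 26):.0f} ppi')  # ~85 ppi

    At 700+ ppi you are well past what the eye can pick out at normal phone viewing distance, which is why 4K on a 6 inch screen seems like overkill to me.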

    TVs are something else though - the newer UHD 4K models are quite good now, but I'd avoid those curved ones; there's something not quite right with them when viewing, at least to me.

    Anyway, marked as solved - plenty of HDD space, so I might as well take the best quality streams available.

    Cheers
    jimbo


 
