Windows 10: Can't use 1080p on full HD TV over Miracast
Hi, I'm currently using my notebook to connect to my LG webOS LH6000 TV, and while the connection works, I can't seem to change the resolution.
My notebook screen is 1366x768 natively. My TV is 1920x1080. The steps I take to connect are: Win+P, connect to a wireless display, choose my TV, connect.
Then I right-clicked the desktop, opened Display settings, and chose "Extend these displays", since I knew mirroring at 1080p wouldn't be possible. However, even in extend mode I can't change my TV's resolution to 1080p; it is capped at the same resolution as the notebook. I also tried the Intel graphics settings, with the same result.
Here's my DxDiag: DxDiag - Pastebin.com
Intel Report from Cpanel: Intel Report - Pastebin.com
And here are two screenshots of configurations mentioned above.
Any ideas on how to run my TV at 1080p? I even tried running it as "Second screen only", disabling my notebook screen and making the TV the main display, but it still only goes as far as 1366x768.
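To illustrate why the TV can end up capped at the notebook's resolution: a Miracast source and sink negotiate a common display mode from the lists each side offers, so if the source only advertises its panel's modes, the sink is stuck with them. A minimal Python sketch of that idea (the mode lists here are hypothetical; the real protocol exchanges Wi-Fi Display capability parameters):

```python
# Minimal sketch of Miracast-style mode negotiation.
# The mode lists are hypothetical examples, not values read from real hardware.

def negotiate_mode(source_modes, sink_modes):
    """Return the highest-area mode both endpoints support, or None."""
    common = set(source_modes) & set(sink_modes)
    if not common:
        return None
    return max(common, key=lambda m: m[0] * m[1])

# If the source only offers its panel's modes, the negotiated mode is
# capped at 1366x768 even though the TV could display 1920x1080.
source = [(1024, 768), (1366, 768)]
sink = [(1366, 768), (1920, 1080)]
print(negotiate_mode(source, sink))  # (1366, 768)
```

In other words, the cap can live in the source's Miracast stack rather than in the TV or the display adapter itself.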
Last edited by Nimb; 25 Sep 2016 at 22:19.
Apparently your laptop display adapter (video card) does not support any resolution higher than 1366x768. It must at least support 1920x1080 to cast 1920x1080. It is as simple as that.
I hadn't thought of that, but apparently it does support higher resolutions:
GeForce GT 740M | Specifications | GeForce
Intel support for HD 4000 also says regarding resolutions:
In clone mode: 1920 x 1200 at 60 Hz.
In extended desktop mode: One display at 2560 x 1600 at 60Hz and the remaining two displays at 1920 x 1200 at 60 Hz.
If using a mobile processor, embedded Display Port (eDP) supports 1920 x 1200 at 60Hz, and two external displays using 2560 x 1600 at 60Hz.
Any other ideas of what could be causing this?
I would say your computer's hardware and the size of your TV are both factors in your problem. Upgrading your computer or getting a new one might solve it, depending on what you do.
1. Go to the NVIDIA Control Panel.
2. Click "Change resolution" under Display on the left side of the window.
3. You will see a "Customize..." button; click it.
4. Check "Enable resolutions not exposed by the display". If you see 1920x1080 in there, add it. If not, go to step 5.
5. Click "Create Custom Resolution...".
6. Set horizontal pixels to 1920 and vertical lines to 1080. The refresh rate must be THE SAME as the maximum your TV or monitor can support.
7. Click Apply.
If this does not work, change the horizontal pixel and vertical line values to whatever fits your TV's screen exactly. The GeForce GT 740M will support 1080p resolution.
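If you do create a custom resolution in step 5, the timing numbers behind it come from formulas like VESA's CVT with reduced blanking. As a rough illustration (this mirrors the common CVT-RB formula, not NVIDIA's exact implementation), here is what a 1920x1080 @ 60 Hz custom mode works out to:

```python
# Rough CVT reduced-blanking (CVT-RB v1) timing sketch for a custom mode.
# Constants follow the published CVT-RB convention; this is an illustration,
# not NVIDIA's internal algorithm.

def cvt_rb_pixel_clock(h_active, v_active, refresh_hz):
    RB_H_BLANK = 160           # fixed horizontal blanking, pixels
    RB_MIN_V_BLANK_US = 460.0  # minimum vertical blanking, microseconds
    CLOCK_STEP_HZ = 250_000    # pixel clock is rounded down to 0.25 MHz

    # Estimate the horizontal line period, then derive blanking lines.
    h_period_us = (1e6 / refresh_hz - RB_MIN_V_BLANK_US) / v_active
    v_blank_lines = int(RB_MIN_V_BLANK_US / h_period_us) + 1

    h_total = h_active + RB_H_BLANK
    v_total = v_active + v_blank_lines
    clock_hz = h_total * v_total * refresh_hz
    return (clock_hz // CLOCK_STEP_HZ) * CLOCK_STEP_HZ

# 1920x1080 at 60 Hz needs roughly a 138.5 MHz pixel clock.
print(cvt_rb_pixel_clock(1920, 1080, 60) / 1e6)  # 138.5
```

The point is that a custom 1080p mode is only viable if the GPU and the link can actually drive that pixel clock.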
Hmm. I have the stripped-down NVIDIA Control Panel, so I think it's not NVIDIA controlling my output settings but Intel, and I don't know if there's a way to change that. Here's a screenshot of my NCP: http://s22.postimg.org/6k3qqv7fl/Capture.png
So it's probably Intel messing up somehow (or not supporting 1920x1080), as was mentioned above. The weird thing is, the support page for HD Graphics 4000 says it *does* support that resolution, so I don't know what is going on. Stranger still, the Intel Cpanel report says it supports it! Intel Report - Pastebin.com
Last edited by Nimb; 25 Sep 2016 at 22:20.
If the actual hardware in your computer doesn't support more than the specified resolution -- then you are STUCK with it.
In general, cheaper tablets etc. will only support lower resolutions -- however, most modern smartphones can cast at full 1080p, so that might be a better option -- use mains power though, or your battery will run out very quickly. Note also that casting sometimes involves nasty things like DRM crud, so you might not always be able to cast.
The hardware spec of your computer will specify the maximum output video resolution.
Another issue often forgotten -- unless you use an HDMI connection to the TV, you could be prevented from getting 1080p by the TV hardware (for example if using SCART/AV/RGB/Component/VGA connections).
Okay, I'm guessing the driver you have for that GeForce 740M is the one that came from the laptop manufacturer? You shouldn't even be seeing that stripped-down panel with a driver that came directly from NVIDIA.
Have you tried switching to High Performance in the battery settings? Set it to that and check whether anything changes in the NVIDIA Control Panel's settings.
Hi, Jimbo! I went ahead and got an HDMI cable to test it, and it works perfectly. The same computer and TV can handle 1920x1080 over cabled HDMI, so I think the computer's graphics/chipset can handle it just fine. Not sure why it won't work over Miracast, though.
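One note on what the HDMI test does and doesn't prove: HDMI carries uncompressed pixels, while Miracast re-encodes the desktop to H.264 and streams it over Wi-Fi, so the wired test shows the GPU can render 1080p but not that the Miracast encode path will negotiate it. A back-of-the-envelope comparison (the 24 bits-per-pixel figure is an assumption for standard RGB output):

```python
# Why HDMI working doesn't prove the Miracast path can do 1080p:
# HDMI sends raw pixels; Miracast compresses them before streaming.

def raw_bitrate_bps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video bitrate in bits per second."""
    return width * height * refresh_hz * bits_per_pixel

raw = raw_bitrate_bps(1920, 1080, 60)
print(raw / 1e9)  # ~2.99 Gbit/s of raw pixels over the HDMI cable

# A Miracast stream compresses this down to tens of Mbit/s, so the
# H.264 encoder and the negotiation logic, not the display output,
# decide which resolution actually gets streamed.
```

That would be consistent with the symptom here: the display hardware is fine, but the wireless-display stack never offers 1080p.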
I actually downloaded the driver directly from NVIDIA and performed a clean install; it's not the driver that came from the manufacturer. Both Intel's and NVIDIA's drivers are updated, generic drivers (not from the manufacturer).
Yep, the computer has always been on High Performance.
Never mind what I had to say, then.