
Thread: Vista does not detect external display

  1. #1
    Join Date
    Oct 2005

    Vista does not detect external display

    Hi, I am using a Dell XPS M1330 laptop which came pre-installed with Windows Vista Ultimate 32-bit. I usually connect external displays to this system as it has a VGA as well as an HDMI port. In the past I was able to use external monitors or projectors through this HDMI port via Windows Mobility Center without any problem, but for the past 10 days I have been having trouble. Recently I connected my TV using the same HDMI port, and since then I guess something went wrong.

    Now no matter whether I try to connect an external monitor, projector or TV, nothing seems to work. Mobility Center is not able to detect any external displays connected to VGA either. Whenever I try activating the option to extend the desktop to an external monitor using the graphics card's driver or the Display panel in Control Panel, it works fine, but it won't work at all with Windows Mobility Center. I also tried mirroring to the external monitor, but it is still not detected.

    Any idea what would have gone wrong and how can I fix it? Many Thanks.

  2. #2
    Join Date
    Apr 2009
    Austin Texas

    Re: Vista does not detect external display

    Hey there! I ran across a very similar issue. I was researching my problem when I came upon your note. Well, I figured out what seemed to resolve the issue - but I still don't understand what the actual problem is... I picked up a new laptop:
    • Sony Vaio VGN-FW351J
    • Vista Home Premium 64-bit
    • Dual Core @ 2.0 GHz apiece
    • 4 GB RAM
    • Blu-ray player + CD/DVD read and write
    • HDMI out, among other readable media drives

    I have a Toshiba Regza 42" LCD HDTV that accepts PC input and also has 4 HDMI inputs. I am not using the PC input for this scenario, but I believe the problem isn't with the type of connection; it's more of a problem with the OS and auto-detect that most likely spans all connection types. I'm anticipating that this post will be helpful to others, so I'll give some background as I go along.

    The 1st few times I connected the laptop to the HDTV via HDMI cable, a little two-note sound would play when inserting the HDMI cable to the PC. Vista immediately recognized the output device and the "wizard" would pop up asking, "What do you want with the external display device: mirrored, extension of monitor, or external display only?"

    The external display "mirrored" setting will not allow the external display to use its own max resolution/display configuration, because the OS is still outputting to the PC monitor rather than to the external HD output device. Thus, I was only getting 720p and a small screen size (the max for the PC monitor) on the HDTV (the TV recognizes what signal it is receiving). Sony's VAIO website had a cool lil' How-To Flash video that explained this. I was using this setting for a while and noticed that the output was only 720p, and the Blu-ray movie rips I had been watching had noticeable "refresh lines" that would scroll downward and slightly disrupt the picture. It was annoying.

    Extension of monitor setting: I haven't tried this, but by deductive reasoning this setting would still send the display signal to the PC monitor first, and would still be unable to take full advantage of the HDTV's display power.

    External display only:
    This one turns your PC monitor off and makes the external display device the primary display output. My television recognizes the signal as 1080p. Working directly on the HDTV as my monitor, once the OS realized that it was no longer limited by the PC monitor, it allowed me to increase my display resolution.

    Note: The following Windows® desktop resolution settings can be used for different output device types:

    HD Display Type     Desktop Screen Resolution
    1080p / 1080i       1920 x 1080 or 1768 x 992
    720p                1280 x 720 or 1176 x 664

    After I realized this and started using the "external display device only" setting, Vista stopped detecting the external display device automatically.

    The first solution was to use a different HDMI input port on the HDTV. Vista would auto-detect the "new" port on the TV. A few days later, though, that solution stopped working as well...

    So, there I was
    - HDTV switched to the proper "video source/input" setting
    - Newly rebooted PC without any HDMI connections on boot (just in case).
    - connected the two: TV first, then PC.

    My first thought was that the HDMI cable or the TV's or PC's HDMI ports had somehow got fried. After connecting the two, however, my television would immediately go from "no video signal" to "unsupported video signal." The TV would not say unsupported video signal if it was not receiving any signal at all - it would continue to display "no video signal." This was a good sign; I was glad the problem wasn't my hardware.

    The PC's behavior was strange. Only the 1st time the HDMI cable was plugged in to the PC would that two-note sound play; subsequent reconnection attempts would produce no "Hey, you've connected something..." sounds from the OS.
    I have also experienced issues like that when connecting PCs (not just this laptop) to external hard drives or mobile phones via USB. Windows seems to have problems detecting/releasing external devices for/from OS control. Most often, there is a "Safely Remove Hardware" option that comes along with using any external device. Why would any OS need something like that? Just unplugging it doesn't do the trick? What's the hold-up here? Why does the OS need a wizard for unplugging hardware if the OS "understands" exactly what happens when you unplug something? Why do I not have a "safe to remove hardware" option for my HDMI port? The OS obviously needs it...

    I believe the answers to those questions will provide a solution or the reasoning behind this problem, but I don't know them. Those questions led me to believe that the problem was with the OS and that, "Yes, some amount of digging/configuring should be able to resolve this."

    While looking through Vista's device manager while the TV and PC were still connected, I noticed that the display monitors that were listed were two "Generic PnP" monitors. Before, it had been appearing as one generic PnP monitor and one HDMI output device. I disabled one of the generic PnP monitors. I am not certain about the outcome of this action. I have not been able to relate this step to resolving the problem, it's just something I did.
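    If anyone else gets to this stage, the same Device Manager information can be pulled from a command prompt with the stock WMI command-line tool (wmic, which ships with Vista). A quick diagnostic sketch - the exact device names it prints will vary by machine:

    ```shell
    rem List every monitor device Windows currently knows about,
    rem including any lingering "Generic PnP Monitor" entries.
    wmic desktopmonitor get Name,Status,PNPDeviceID

    rem And the video adapter itself, for comparison.
    wmic path Win32_VideoController get Name,Status
    ```

    A disabled or stale duplicate entry should show up here with a Status other than "OK", which is easier to spot than clicking through the Device Manager tree.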

    Using the "Windows Mobility Center" application would allow me to click the "connect display" button but nothing would happen. Intermittently, it would show "display connected" and "not connected." I have yet to determine why this is.

    I powered down the HDTV and the PC and disconnected them. Then restarted...
    - HDTV switched to the proper "video source/input" setting
    - Newly rebooted PC without any HDMI connections on boot (just in case).
    - connected the two: TV first, then PC. I tried using a different HDMI input on the TV just for jollies.

    TV still was saying "unsupported video signal" when connected. Still no autodetect pop-up that used to happen, but I did have the two-note signal that something had been connected. I opened up the "Windows Mobility Center" which -in the external display section - said "display connected." I still clicked on the "connect display" button. The application said, "detecting..." Then, popped up with the options for "how do you want to use the external display?" I got excited, noticed that "external display only" was already selected and clicked, 'ok.' Nothing happened. Unsupported video signal on the TV. Display still on the PC only.

    I clicked on the "connect display" button again from the Windows Mobility Center -> Detecting -> pop-up options for "how do you want to use the external display" again. This time I chose to see if the "mirrored" setting would work. I clicked the radio button to select it, then clicked 'apply.' My monitor flickered and - whammo - mirrored settings on my HDTV. The TV noticed it was receiving a 720p signal.

    Hm.. I clicked on the "connect display" button again from the Windows Mobility Center -> Detecting -> pop-up options for "how do you want to use the external display" again. This time I switched it back to "external display only" and clicked apply. The monitor went completely black and then the TV noticed the signal had changed to 1080p!

    So, the problem seemed to correct itself when I modified/switched around the external display "what do you wanna do with it" settings and actually applied them. It seemed like the OS went, "oh yeah, I can do this - no problem," as if it had forgotten.
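    For what it's worth, on Windows 7 and later this same "toggle the mode and apply it" kick can be scripted, which makes the workaround repeatable. Note that DisplaySwitch.exe does not ship with Vista (there the Mobility Center route above is the equivalent), so treat this as a sketch for newer systems:

    ```shell
    rem Cycle the projection mode to re-trigger detection
    rem (Windows 7 and later only; these map to the wizard's options).
    DisplaySwitch.exe /clone
    rem ...wait for the mirrored picture to appear on the TV, then:
    DisplaySwitch.exe /external
    ```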

    Anyway, I typed this up because I was extremely irritated about the amount of worry I put into this. Laptops and HDTVs are expensive and I wanna watch Blu-ray, damn it!
    Hopefully, it will give you a few items to try and a few things to note when you are troubleshooting.


    OS = operating system, like: Vista, XP, Mac OS X, Linux
    Last edited by Zenrocks; 26-04-2009 at 10:25 PM. Reason: Adding more detail..

  3. #3
    Join Date
    Mar 2011

    Re: Vista does not detect external display

    Hey Zenrocks,

    Have you ever fully resolved this issue? I have the same problem and you seem to be the only person who has identified it. It is driving me crazy...

    I am connecting a Dell Inspiron 1525 to a Sony Bravia using an HDMI to HDMI cable.

    At first, it worked fine; then I'd have to switch HDMI ports; now it has stopped working completely, no matter what on-off, plug-in/plug-out sequence I follow!

    Any help would be appreciated.



  4. #4
    Join Date
    May 2008

    Re: Vista does not detect external display

    If it worked correctly the first time, then why not now? It is possible that the HDMI cable is no longer working properly, and that is why you are getting this issue. I would advise you to replace the HDMI cable with a new one, then connect it again to check whether it works for you. It is also advisable to check the graphics driver, which supports this functionality: install the latest version from the official site.
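    As a quick way to see which driver version is actually installed before hunting for an update, the WMI command line can report it (output formatting varies by vendor):

    ```shell
    rem Show the installed display driver's version and date,
    rem to compare against the latest package on the vendor's site.
    wmic path Win32_VideoController get Name,DriverVersion,DriverDate
    ```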

  5. #5
    Join Date
    May 2011

    Re: Vista does not detect external display

    Well here is my issue. It's similar I believe.

    I have a custom-built PC: Intel Pentium Dual Core at 2.4 GHz, 3 GB RAM, and a GeForce 8800 GT 512 MB card.

    I'm using a 24" monitor, and trying to also use a 50" Panasonic Viera 1080p 600 Hz plasma. My card has two HDMI outputs and I'm using an HDMI-to-DVI cable on the TV.

    Before anyone says something obvious like "maybe your cable is bad": that's not the case. I have 3 different cables, and all work.

    So back to the problem. I can get the plasma to work as a monitor, either extended or as a clone, for a few hours. It seems to work for 6 or so hours at a time, but then the plasma TV stops being detected after that. Then it takes me about 3 or 4 hours of restarting, turning the system off, or whatnot in order to get it working again.

    I have installed the card in different slots on the motherboard. I have downloaded and installed about a half dozen different drivers. It seemed at one point when the TV would stop being detected if I reinstalled the driver, it would then be detected. That worked 3 or 4 times, but has since stopped working. Then if I wiggled the cable, it would seem to work, but that fix too stopped working. Same for switching ports on the card.

    I thought maybe the card was the problem because at one point, I was getting no video at all. The card is not overheating however, it's cool to the touch when all this is happening, and the processor isn't overheating either.

    So I'm basically at a complete loss here. It doesn't seem to be a driver issue: I have installed the latest drivers, beta drivers, and even the old drivers that were working, and none of that resolved it (and before you ask, I completely removed the old drivers each time).

    Moving the card to a different slot on the MB and swapping the ports on the card seem to show it isn't a hardware issue either, so I am just at a complete loss. It seems that it's either a compatibility issue with Vista, or I've got a ghost in the machine. There doesn't seem to be any logical reason I can come up with as to why I'm having this issue.

    Anyone have any other ideas?

  6. #6
    Join Date
    May 2006

    Re: Vista does not detect external display

    Hi Cavethug,

    Have you tried going into the NVIDIA Control Panel and forcing it to pick up the TV? In the NVIDIA Control Panel click on "Set up multiple displays" on the left, enable the plasma, and it should work; it will act as a dual-monitor setup. You can also try to hook it up via a component / S-Video dongle. Try going into the display settings on the plasma and dialing back the sharpness all the way; some color adjustments should probably be made as well.

