Experiences with USB/HDMI to VGA/DVI adapters

You want the answer?

It depends...

On?

The Video hardware's wiring...

VGA is analog, and most modern machines have nothing but HDMI and DisplayPort. Those are both digital, so you can't just convert them UNLESS the graphics output in question has the internal electronics to detect the adapter and drive it. Current generation Intel iGPUs all have this circuitry on the DISPLAYPORT output, but not the HDMI output.

But then again, the modern Intel iGPUs don't like that DP port being adapted to HDMI.

I've been running into more and more cases where old displays like this simply will not light up, and the only fix is replacement.

Now, on to the USB adapter mess... these things are WORTHLESS for accessing the system BIOS, so no matter what you do, you need at least one display working off a card or an onboard output directly. Beyond that, they work fine over USB 3.0 IF (and we go back to system specs once again) the GPU in the system SUPPORTS enough displays. And once again I go back to Intel, because on SOME of the 12th gen Core chips the i3 CPUs only support two displays while the i5 and up can support four! Then again, most of the i3s also do four... check the spec sheet of the CPU! It will tell you!
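
If you want to sanity-check how many outputs the box is actually driving (rather than just reading the spec sheet), a quick diagnostic like the sketch below works from a Linux bench stick with xrandr installed. Nothing Intel- or Dell-specific about it; on Windows you'd just count heads in Display settings or look the CPU up on Intel ARK.

```python
# Quick bench check: list the video outputs the GPU currently reports as connected.
# Assumes a Linux environment with X running and the xrandr utility available.
import subprocess

def connected_outputs():
    """Return the output names xrandr reports as 'connected'."""
    result = subprocess.run(["xrandr", "--query"],
                            capture_output=True, text=True, check=True)
    names = []
    for line in result.stdout.splitlines():
        parts = line.split()
        # Connector lines look like "DP-1 connected 1920x1080+0+0 ..." or
        # "HDMI-1 disconnected (normal left inverted ...)".
        if len(parts) >= 2 and parts[1] == "connected":
            names.append(parts[0])
    return names

if __name__ == "__main__":
    outputs = connected_outputs()
    print(f"{len(outputs)} connected output(s): {', '.join(outputs) or 'none'}")
```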

Intel's current designs also push the DisplayPort daisy-chain (MST) method to get more than one display hooked up, so if you don't have monitors that can do that... good luck!

I sold a brand new OptiPlex to a client a couple of weeks ago, and that Dell OptiPlex 5000 Micro on his desk has an HDMI and a DisplayPort output. I do NOT recommend anyone go that route... make sure your Dells have dual DisplayPort outputs! The DP output will NOT ADAPT to HDMI, so DP to HDMI cables are out. You're welcome to try, but I have yet to see a single monitor light up.

In this case the HDMI output lit up my bench display for all of about 20 minutes before it went stupid: lines and shadows and all sorts of ugly. THREE MOTHERBOARDS LATER, this port hasn't lit up another monitor since. The DP output works fine directly into a monitor as DP, and that same port with a VGA adapter in it still works fine. His 2nd and 3rd displays are now attached via USB 3.0 as HDMI and VGA and working fine.

So again, it's not drivers, it's the low-level electrical feature set. Honestly, I'd tell this customer to keep his old displays on his old machine and call me when he's ready to replace it all. Because he will be replacing it all if he goes current generation on the replacement.
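
One way to show it's the electrical layer and not drivers: boot the machine from a Linux stick and ask the kernel whether each connector physically detects a sink and reads back an EDID, with no desktop driver stack in the picture. A minimal sketch, assuming connectors are exposed under /sys/class/drm (standard for Intel iGPUs on the i915 driver):

```python
# Report, per DRM connector, whether the kernel sees something plugged in and
# whether the attached display returned an EDID. A port that stays "disconnected"
# with 0 EDID bytes while a known-good monitor is attached points at the
# hardware/wiring level, not the OS driver.
# Assumes Linux with connectors exposed under /sys/class/drm (e.g. Intel i915).
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status = (conn / "status").read_text().strip()
    edid_path = conn / "edid"
    edid_len = len(edid_path.read_bytes()) if edid_path.exists() else 0
    print(f"{conn.name}: {status}, EDID {edid_len} bytes")
```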
 
The Video hardware's wiring...
Thanks - this is great information to add to the standard speech I use whenever someone asks if they can continue to use their old monitor with a new computer. I've been down this path before but never knew the details on exactly why it didn't work.
 
I would provide a computer with at least 2x DisplayPort, and use a DisplayPort to VGA and DisplayPort to DVI adapter.

Not that I don't get what you're saying, but what I'm saying is that the two computers that will be CAD/CAM workstations do have multiple DisplayPort ports as well as an HDMI and a DVI port, all on the graphics card. Those will most likely get new monitors, or at the very least be a cinch to get fully functioning plug-and-play DisplayPort to HDMI/DVI/VGA adapters (as a number of adapter options from DisplayPort to all three of the other connectors exist).

But the two office machines have a single HDMI output port as they're being shipped. For one of those two machines, which uses a single monitor, I anticipate a simple HDMI to VGA adapter will work, and work fine. But for the other (unless I were to put in a graphics card, which I really don't want to do if it can be avoided, as it's not really needed) I will have to use one HDMI to (DVI/VGA) adapter and one USB 3.0 to (DVI/VGA) adapter.

At this point, I thank everyone for their input. I've simply ordered the adapters and will hope for the best. If it gives the client exactly what they want, great; if not, then we will most likely get a new monitor or monitors. There was one Dell 32" curved monitor when I was at Costco today (for just over $225) that made me think, "Who needs multiple monitors with this thing?!"
 
List of my experiences with passive adapters (i.e., the source natively supports the output).

DisplayPort > DVI - Yes
DisplayPort > HDMI - Yes
DisplayPort > VGA - Yes

HDMI > DVI - Yes
HDMI > VGA - No
HDMI > DisplayPort - No

DVI > HDMI - Yes
DVI > VGA - Yes, but only if DVI-I
DVI > DisplayPort - No

VGA > just about anything - hit or miss.

You get exceptions to this with "active" adaptors, which in theory can convert anything to anything as they convert the signal in-line. Often these require additional USB power. I've not had great experiences with them in general.
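
For anyone who keeps a scripted checklist when quoting hardware, here's the list above restated as a lookup table. It's just my experience encoded, not a spec, and the active-adapter exception above still applies.

```python
# The passive-adapter experience list above, restated as a lookup table.
# True = a passive adapter has worked for me, False = it has not,
# None = hit or miss. Personal experience, not a spec guarantee.
PASSIVE_ADAPTS = {
    ("DisplayPort", "DVI"): True,
    ("DisplayPort", "HDMI"): True,
    ("DisplayPort", "VGA"): True,
    ("HDMI", "DVI"): True,
    ("HDMI", "VGA"): False,
    ("HDMI", "DisplayPort"): False,
    ("DVI", "HDMI"): True,
    ("DVI", "VGA"): True,   # only if the source is DVI-I
    ("DVI", "DisplayPort"): False,
    ("VGA", "DVI"): None,   # VGA to just about anything: hit or miss
    ("VGA", "HDMI"): None,
    ("VGA", "DisplayPort"): None,
}

def passive_ok(source, sink):
    """True/False for a passive source->sink adapter, None if it's hit or miss."""
    return PASSIVE_ADAPTS.get((source, sink))

print(passive_ok("HDMI", "VGA"))   # False: this pairing needs an active converter
```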

USB adaptors work but don't have the same performance. They can add significant CPU overhead, especially at higher resolutions, as the data is compressed/decompressed for transmission over the USB bus. If it's for CAD I'd definitely avoid these.
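
The overhead point falls straight out of the raw numbers. A rough sketch, assuming 24-bit color, 60 Hz, and the nominal 5 Gbps USB 3.0 gen 1 signaling rate (real-world throughput is lower once protocol overhead is counted):

```python
# Back-of-the-envelope: why USB display adapters must compress the video stream,
# and why that compression costs CPU, especially at higher resolutions.
def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw, uncompressed video bandwidth in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h} @ 60 Hz: ~{uncompressed_gbps(w, h, 60):.1f} Gbps raw "
          f"vs ~5 Gbps nominal USB 3.0")
```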

Short answer to the original question:
1x HDMI > DVI adapter
1x DisplayPort > VGA adapter
 
That's odd. I've never had that issue. I currently have at least three OptiPlex 3090s that are using DP to HDMI adapters, and also two older ProDesk 400 G4s that are on DP to HDMI as well.
Same here, but I'm talking specifically about 12th gen Core (i-series) equipped units with the brand new Intel iGPUs in them. They do NOT like being adapted in all cases.
 