This sounds like my area of expertise :). If your graphics card has a DVI-I port, you shouldn't have any problems using a DVI-to-HDMI cable or adapter, because DVI-D and HDMI carry the same digital signal. DVI-I can transmit both digital and analog (the analog pins are present in the DVI-I female connector), whereas DVI-D is digital only. That means you can plug either a DVI-A or a DVI-D cable into a DVI-I port.

If you have a high-resolution monitor (4K, for instance), you will want to avoid a single-link cable and go with dual link. Single link only has enough bandwidth for roughly 1920x1200 (around 1080p), so above that you won't get the picture you expect from the graphics card output (see the rough calculation below).

You can also buy DVI-to-HDMI adapters, so you just plug the HDMI cable into one side of the adapter and plug the adapter itself into your graphics card -- I use these all the time.
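To put rough numbers on the single-link vs. dual-link point, here's a small back-of-the-envelope sketch in Python. It assumes the nominal DVI TMDS clock limits (about 165 MHz for single link, 330 MHz for dual link) and a ~20% blanking overhead, which is only an approximation of real display timings:

```python
# Rough estimate of the pixel clock a mode needs, compared against
# the nominal single-link (165 MHz) and dual-link (330 MHz) DVI limits.
SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.20):
    """Approximate pixel clock in MHz, assuming ~20% blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

modes = [
    ("1920x1080", 1920, 1080, 60),   # typical 1080p monitor
    ("2560x1600", 2560, 1600, 60),   # higher-resolution monitor
    ("3840x2160", 3840, 2160, 30),   # 4K, at a reduced 30 Hz refresh
]

for name, w, h, hz in modes:
    clock = pixel_clock_mhz(w, h, hz)
    if clock <= SINGLE_LINK_MHZ:
        link = "single link is enough"
    elif clock <= DUAL_LINK_MHZ:
        link = "needs dual link"
    else:
        link = "beyond dual-link DVI"
    print(f"{name} @ {hz} Hz ~ {clock:.0f} MHz -> {link}")
```

The exact cutoff depends on the blanking the monitor's timing standard actually uses, but the takeaway matches the rule of thumb above: anything much past 1080p at 60 Hz needs a dual-link cable.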
I also believe that with DVI-I, audio is not sent over the cable when you're transmitting an analog signal, whereas when you're transmitting a digital signal the audio is sent along with the digital data.
There are 3 types though:
DVI-I
DVI-D
DVI-A
Think of the last letter of DVI-A and DVI-D as standing for the kind of signal each can carry: 'A' for Analog, 'D' for Digital. For the odd one out, 'I' (Integrated), just remember it supports both.
If you get them mixed up, it can be frustrating trying to work out why no audio is coming over the HDMI cable, because in most scenarios HDMI does carry audio, and even bidirectionally if both the display and the source device support ARC.