Sorry, this is incorrect - graphics cards/chips have ASICs built in specifically for encoding. They do have quality settings, but you do lose a bit of quality by going ASIC over brute force. Whether this matters to you is entirely up to your use case. If you're doing it professionally, I'd err on the side of caution and push CPU power; if you're doing it for personal use, I'd go for GPU.
The minimum I'd put in a system these days is a GTX 960.
In the earlier days of encoding via GPU that would have been an accurate statement, but as of roughly 2014, GPUs have been able to do the job at the same quality and at a faster pace. If you head over to the Adobe forums, there are plenty of topics on this with screenshot comparisons. Pretty much any turnkey system provided to professional broadcast companies comes spec'd with the GPU used for encoding, as recommended by the vendor of the video editing suite. I would do a little more research and not discount it so quickly. Several years back, when I worked at a TV studio on some side projects, we used GPU encoding and didn't have any visible quality losses. We used Nvidia Quadro cards with EDIUS Pro.
Edit: As an example, you can look at Adobe's performance and quality notes in the Premiere FAQs on their forums. They note that at medium and high bitrates, quality when exporting via NVENC is the same as with the CPU-based Mainconcept H.264 encoder, but roughly 2-4x faster. At low bitrates you do get some artifacting, but in a professional environment you won't be dealing with low-bitrate video streams anyway.
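If anyone wants to test the tradeoff themselves, here's a rough sketch using ffmpeg (assuming a build with NVENC support; the input filename and the 20M bitrate are just placeholders, and note the Adobe comparison was against Premiere's Mainconcept encoder, not libx264):

```shell
# GPU encode via the NVENC ASIC - fast, comparable quality at medium/high bitrates
ffmpeg -i input.mov -c:v h264_nvenc -preset slow -b:v 20M -c:a copy out_gpu.mp4

# CPU ("brute force") encode via libx264 - slower, holds up better at very low bitrates
ffmpeg -i input.mov -c:v libx264 -preset slow -b:v 20M -c:a copy out_cpu.mp4
```

Run both on the same clip at your target bitrate and compare frame grabs - at broadcast-level bitrates you'll struggle to tell them apart, which matches what the Adobe forum screenshots show.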