I'm looking at building a big array as well, also with mixed disks. It's all data that is replicated elsewhere, and also backed up elsewhere, so it's not like mixing random disks of various sizes and odd configurations is going to make my arrays very scary. But I am very curious to see what happens.
The usual advice is all about vibration, temperature and load types, but there is surprisingly little data on what actually happens if you just build a 'bad' array.
One of the things I might do is run a ±44-disk array of old disks (i.e. Samsung disks of various single-digit-TB sizes) for a while to see if I can get them to fail, since they are old consumer drives. I'd be mirroring the entire dataset anyway, and speeds aren't all that important either. But given the reports from Backblaze and Google versus the anecdotes of consumer drives failing within days in 'enterprise' arrays, I'm curious to see whether they really do vibrate each other to death, cook themselves to well-done and drip bits like a leaky faucet.