> AVX2 in this case.

Thanks for the result. Another question: when you compiled the non-AVX512 binary, did you compile it for generic instructions (SSE4) or for AVX/AVX2?
> My thoughts were the same, until I got some critical proof.

If I may interject here, I think some ppl may be overlooking something:
You don't necessarily have to pay people these days to spout lies and half-truths on the internet. You don't even have to pay them to shill for a specific company. So I would be rather reluctant to point fingers at fanbois shouting "paid shill". The vast majority of them do it without getting paid, or being explicitly told to do so.
> My thoughts were the same, until I got some critical proof.

You do not have critical proof; you have circumstantial evidence at best. And your first dozen comments came off as super spam-heavy... and nearly got you booted. You seem to get caught up in fanboi wars, and complaining about censorship on a Chinese website is... amusing, to say the least.
> AVX2 in this case.

OK. From the information you provided, I think in this case it has something to do with thermal and turbo throttling. The AVX2-based binary could have performed better if the AVX2 code had not been dragged down in CPU frequency by the AVX512 code.
> I really disagree with this.

Yeah, why don't people just buy their favored product? Selecting a product based on one's own preferences is all well and good, until you start to notice why people prefer one thing over another: how do they make that decision?
People usually can't test everything themselves, so their decisions are usually based on others' tests, reviews, datasheets, or simply others' opinions. When someone starts to pour large amounts of fake data and unreasonably biased opinions into the public media, and starts attacking anyone who disagrees, it becomes a problem.
If 3 people start to convey the same idea, the listener may begin to change his mind, although the idea they tried to convey may be completely false and ridiculous. Such is human nature.
> OK. From the information you provided, I think in this case it has something to do with thermal and turbo throttling. The AVX2-based binary could have performed better if the AVX2 code had not been dragged down in CPU frequency by the AVX512 code.

That 5% comes just from switching from AVX512 to AVX2 at compile time, running on the same Intel platform. I know the core code is not affected, but the difference coming from the other parts does, I think, show something.
Under the same thermal budget, AVX2 will be slower than AVX512, but not by a huge amount: AVX512 runs twice as fast per CPU clock, but it downclocks. In your first benchmark, the AVX2 code has to run at the same clock as the AVX512 core code, and that penalty is huge. BTW, I would be interested to know how well it performs on a Ryzen, since at least Zen 3 should have full AVX2 speed and doesn't downclock the way Intel does.
Now, back to the original debate. I claimed that the core code is in assembly, and thus comparing the speed of x265 is not valid proof of innocence on Intel's side. That still holds, given that the worst case is a 5% difference. While it's beyond measurement error, it's still not valid proof: if Intel's compiler had actually generated crappy code for the C part, you'd still see a < 10% difference. (It might actually be less, since Ryzen runs AVX2 code at a higher clock than Intel does.)
To demonstrate, you should disable all ASM optimization, i.e. --no-asm, then measure the numbers on both platforms.
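For x265 specifically, such a measurement could look roughly like this (paths, preset, and input file are placeholders; the `ENABLE_ASSEMBLY` cmake option and the `--no-asm` flag are the documented switches for disabling the hand-written kernels):

```shell
# Build x265 with the hand-written assembly kernels compiled out,
# so only compiler-generated code is measured.
cmake ../source -DENABLE_ASSEMBLY=OFF
make -j"$(nproc)"

# Or keep a normal build and disable the assembly kernels at run
# time, then compare the reported fps on both platforms:
./x265 --no-asm --preset medium input.y4m -o out.hevc
```

Run the identical binary and input on both the Intel and AMD machines so that only the compiler-generated code path is being compared.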
Also, I just want to be clear: my intention was never to claim that the Intel compiler is guilty here. It used to be, as there was a legal battle and Intel was forced to stop "crippling AMD"; they presumably fixed the issue after settling the court case. My point has always been that "proofs" can still be biased by people's knowledge and opinions. You and I are no experts in CPU design, and I would say even the LTT or GN teams are not experts in those areas. Maybe Lisa Su qualifies. If the numbers look good to you, you could just hold them up as "proof" without knowing why the numbers look the way they do. That's why I would put a question mark on these "proofs": they may not be true proofs.
> You do not have critical proof; you have circumstantial evidence at best. And your first dozen comments came off as super spam-heavy... and nearly got you booted. You seem to get caught up in fanboi wars, and complaining about censorship on a Chinese website is... amusing, to say the least.

You have a point. Censorship is not really limited to politics these days. Perhaps I should never have trusted any of them from the beginning, when there were signs.
> I really disagree with this.

Perhaps you overestimated people's rationality and underestimated the auto-narcosis effect.
It is more “Is the person who paid for the platform happy with what they got?”
If they are unhappy after the fact, then it’s time to consider buying a competing product the next time. It’s that simple tbh. The buyer’s money speaks, not shouting on Internet forums.
There have been many instances where I wanted this feature or that support, and I had to take/use what was available to me.
> [...] his Minecraft lost a lot of frames due to EPYC's interconnection latency issue.

That is a very weird and specific thing to pick as the example you show.
> That is a very weird and specific thing to pick as the example you show.

The interconnect speed I mentioned mainly manifests as memory latency, because Minecraft is a single/dual-core game.
And I find your conclusion that it must be the interconnect latency very, very weak.
I find it far more likely that the much lower base and boost clocks are to blame.
Or, of course, even more so if it is a Naples chip.
Were high CB points really his only reason? I find that unlikely.
> If only the crowd were as reasonable as you say, this conversation might never have happened in the first place.

@111alan Rationality and "auto-narcosis" have nothing to do with anything for a serious user and serious buyer. Serious people don't have the time to worry about these things. They also wouldn't get tricked that easily; as you may infer from my polite replies thus far, I am implying that the people who get upset are those who expected magic and specialness, and then realized they got a normal product, with a set of limitations like any other product. Yes, even products from Intel have limitations.
The SSD example is a bit irrelevant. It's well known that SSD firmware may be tuned so that the SLC cache presents the best "face" of the drive's performance; many English-language reviews point this out specifically. I have not followed Chinese-language reviews for a while, but Chinese or not, if someone is so gullible and naive as to believe stated performance based on the seller's or manufacturer's PR marketing, then tbh they deserve to have their money taken away from them.
Still, as I said before, if the user is happy, then that’s enough. If they’re not happy they are welcome to go buy the other product from another company. It’s that simple.
Why do people get unhappy with an expensive product they bought? I think it's quite clear. They bought into hype and bought a product that was too expensive for their budget, that they don't understand, and whose capabilities they possibly don't even use 50% of, because it's "the best." Then some time later they find out that, by god, there's some other product that was actually "better," or their product is "missing" some feature. They are mad because their product no longer feels special and the best.
What serious person has time for that? Do the best research a person can do, buy within the budget, and call it a day.
> I've seen way too many communities, and too much rationality, destroyed by lies and attacks. That's why I'm posting all this stuff.

I find this weird; I think you have sufficiently described the issues on, for example, Chiphell.
> If I were malicious, I would interpret your responses so far as:
> You claiming all we spout is the propaganda nonsense we get paid to say by AMD.
> Hence, you only need to say it's propaganda and here is a chart of real-world "benchmarks" to disprove our nonsense.

Of course not. If someone is confirmed to be an AMD fan or paid to speak, I will not talk to him at all. It's just pointless. If an environment allows such a person to bamboozle others, I'll leave that environment. But I'm still here.
> I'm not even sure what you guys are talking about right now, AVX2, AVX512, or Chinese trolls lol. As a bystander.

To sum it up, it all started with the strong statement: "But on the outside AMD fans are famously notorious, fabricating test data and bending truths with their explanations to suit their needs, while swarm-attacking everyone who has second thoughts. There is also a lot of evidence indicating AMD paying independent reviewers and commenters for its reputation campaign." 111alan has tried his best to tell us that he is right and that people who don't agree with him are wrong. One of his proofs was to show that the Intel C Compiler didn't cripple AMD, where he measured the performance of x265, a tool mostly written in assembly, and claimed that Intel was innocent.
Hope this helps those who don't want to read pages.
> 111alan has tried his best to tell us that he is right and people who don't agree with him are wrong.

Where does this come from?
> I have no affiliation with Intel whatsoever BTW! Just in case.

Wise, most wise. Just don't get involved in any political affairs.