AMD making a fool of Threadripper customers - AGAIN?

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
Please when and where did someone here do that?
How?

Are you perchance not reading this in English?
Maybe through a translation app?
Because I have no clue how else you could come up with such nonsense.
It's not about you. And they are not here. That's why I'm still posting here.

BTW, "political correctness" here means one or several ideas enforced mindlessly by a group of people who never cared about truth, methodology, ethics, etc.
 
Last edited:

Wasmachineman_NL

Dell Precisions FTW!
Aug 7, 2019
1,277
422
83
It's not about you. And they are not here. That's why I'm still posting here.
I think he means that Chiphell is biased as hell (CARLOS!) in one way or another, and actively censors or slanders people who go against the narrative.
 
  • Like
Reactions: 111alan

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
Is there any proof that these Chiphell forum posters are paid AMD operatives? I highly doubt that. Seems like a typical case of fanboy-ism to me. Fanboys have existed in every subject for as long as the internet has existed. Who cares? Let’s just pick the platform that works best for our needs at the time of build, and move on.
Actually, it's a long story. If you had played with PC hardware in China during 2011-2014, you would have no doubts. Everyone was talking about them writing fake reviews for sponsorship, blocking discussion of certain brands, and banning people as they pleased. There should still be articles on PCEVA and PC426 exposing them.

But I have a lot more proof than that. I'll post it here later when I have time.

Move on? Easier said than done. I was once a mod on PCEVA, until the admin went weird, kicked everyone out, and killed the forum. Now there is literally no forum in China where people can talk about technology freely. People are constantly at war, with fake benchmarks everywhere, even contradicting each other. Fans, bots, and paid commenters/we-media have eliminated almost all rational discussion in public forums.

Now you see why a lot of veteran reviewers in China just gave up? Perhaps I should try some English forums instead.
 
Last edited:

ReturnedSword

Active Member
Jun 15, 2018
526
226
43
Santa Monica, CA
I find it very hard to believe that AMD, which is many times smaller than Intel, and did not have/probably still doesn’t have a PR department throwing a lot of money to influence the public perception, would hire random forum posters anywhere China or Western forums to influence the narrative. If anyone has actual proof I’m more than happy to change my mind on that.

I’m well aware that in China the fanboyism is quite strong. There may be various reasons for that, which I decline to go into in public to protect China-side people. This doesn’t change the fact that it’s still fanboyism and not constructive to the larger conversation. Fanboyism is characterized by overly passionate shouting down others who don’t agree, exaggerating good points, and downplaying or often ignoring bad points. It is not rational, nor does it help anyone including the posters understand the strengths and weaknesses of a particular platform.

I think those upset at architecture changes are entirely missing the point. I don’t understand why anyone would expect HEDT platforms to have socket compatibility across multiple generations. These are professional systems that are often purpose-built, either by big integrators or as white boxes by smaller integrators like me. Professional workstations for actual clients are designed for the needs of the lifetime of the use case, and specified accordingly. Sometimes “and then some” is added for future-proofing.

I understand that there are also enthusiast users of HEDT platforms, but to expect multi-generational support is going a bit too far, for either Intel or AMD. With HEDT CPUs costing $2,000-7,000 USD easily, what is the huge gripe at that point about upgrading a motherboard platform that costs much less? The HEDT market is incredibly small. Traditionally it’s been an afterthought market segment where server platforms are adapted to those who need high performance workstations.

It is usually in a company’s profit interest to have a platform retain some form of upgradeability, but that is only to capture sales from users who don’t want to upgrade their entire system to a new platform. Otherwise it’s obvious a company will rather have a user upgrade the entire platform. If someone can say AMD is a profit motivated entity, what does that say about Intel? They are both for profit entities.

A better, and more constructive way to do things is to architect a system for the use needed, with the available components. I believe, and have seen it happen over and over that companies respond to how users use their systems as long as there is profit to be made. For example, after the Pentium III era I was on Athlon XP/MP through Athlon/Opteron 64/X2. Then I was on Core derived platforms for over 10 years, until AMD forced Intel to increase core counts. Subsequently I’m on Zen derived platforms, but my upcoming NAS will probably be an Intel platform.
 

NablaSquaredG

Well-Known Member
Aug 17, 2020
675
287
63
You are aware of the meaning of "HEDT"? High-End Desktop. Not workstation.

AMD is marketing the standard Threadripper line as "Desktop", not workstation. The primary target of Threadripper was enthusiasts and power users who need the lanes or the cores, but are not professionals willing to spend twice the money on an EPYC CPU.

Why would there be overclocking support if the primary target is workstation?

With HEDT CPUs costing $2,000-7,000 USD easily, what is the huge gripe at that point about upgrading a motherboard platform that costs much less? The HEDT market is incredibly small.
That is generally not true. The upper bound is $3,999, the MSRP of the 3990X; the lower bound is $1,399 for the 3960X.
Motherboards for Threadripper easily cost up to $1,000 (Zenith II Extreme Alpha), and the median is around $700 (590€ incl. tax).
Given that $700 median, an upgrade costs about 20% extra relative to the 3990X MSRP because you have to buy a new motherboard, or even more for the smaller SKU (about 50% for the 3960X).
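For what it's worth, those percentages can be reproduced in a few lines. This is a throwaway sketch; the MSRPs and the $700 median board price are simply the figures quoted in this thread, not independently verified numbers:

```python
# Extra cost of a forced motherboard swap, relative to each CPU's MSRP.
# Prices are the USD figures quoted in this thread, not verified MSRPs.
cpu_msrp = {"3990X": 3999, "3960X": 1399}
median_board = 700

for sku, price in cpu_msrp.items():
    extra = median_board / price  # board price as a fraction of CPU price
    print(f"{sku}: +{extra:.0%} on top of the CPU")
```

This lands on roughly the ~20% (3990X) and ~50% (3960X) figures above.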

A 50% higher cost is quite significant, and you don't want to toss your $1,000 board in the bin just because AMD rolled the dice.
We already have a significant worldwide electronics waste problem and this just makes it worse.

There's a good reason why right to repair is coming, but AMD is just trying to be as evil as possible with unnecessary vendor locking of CPUs (PSB) or this shitshow with Threadripper.

These are professional systems that are often purpose-built, either by big integrators or as white boxes by smaller integrators like me. Professional workstations for actual clients are designed for the needs of the lifetime of the use case, and specified accordingly. Sometimes “and then some” is added for future-proofing.
Threadripper is not the professional lineup. Threadripper PRO (as the name says) is.
And guess what: even the "Pro-Pro" line, EPYC, has at least limited upgradability.

It is usually in a company’s profit interest to have a platform retain some form of upgradeability, but that is only to capture sales from users who don’t want to upgrade their entire system to a new platform. Otherwise it’s obvious a company will rather have a user upgrade the entire platform. If someone can say AMD is a profit motivated entity, what does that say about Intel? They are both for profit entities.
This all wouldn't have been a problem if AMD had been straight up about it. "Hey folks, Threadripper 3000 is not going to be compatible with X399 motherboards and we don't know whether a potential successor of Threadripper 3000 will be compatible with Threadripper 3000 boards" and everything would've been fine.

Instead they decided to publish a statement about long-term support, which is just vague enough that they probably can't be held accountable for it.
 

ReturnedSword

Active Member
Jun 15, 2018
526
226
43
Santa Monica, CA
Let’s be completely honest with ourselves and realize that any argument can be made about anything. Anything bad can be argued to be good; anything good can be argued to be bad. Not to mention all the in-betweens.

Tbh I’m quite surprised by many of the opinions here. I respect the passion. It doesn’t change the fact that views posted are opinions, my own comments included.

This discussion reminds me of when I started making real money to afford fancy things about 20 years ago and immediately spent it on the latest and greatest platforms available, “just because I could,” and because if I’m honest about it, I fell for the marketing. That’s the thing, marketing is the very definition of making claims about a product, sometimes a bit too fanciful. PR is quite different from reality.

Then after a few cycles of blowing tons of money on stuff and trying to “make it work,” I realized I was falling for marketing. Since then I only bought what I needed at the time + 20% future proofing. Yes I lost my passion, but also gained a lot of peace, not to mention free time.
 

msg7086

Active Member
May 2, 2017
397
144
43
35
During the making of my x265 benchmark, I tried compiling x265 with both the Intel ICC compiler and MSVC, and ran both on a 3700X. There aren't any intentional slowdowns as far as I can see.
Maybe I'm wasting time talking to someone who claims that people with a different opinion are paid. Maybe you should educate yourself about x265 first, that is --

The core of x265 is written in hand-optimized assembly. The most expensive parts are not written in C++; they are written in ASM and compiled by YASM/NASM. There shouldn't be a huge performance difference anyway.

Disclaimer: I was a contributor to x265 and I maintain a modded x265 fork.

EDIT:

I feel I didn't phrase it clearly, so I'm adding this. I'm NOT saying Intel is to blame here; however, it's pretty easy to make mistakes when picking examples. The example you gave was not a good one because, even though it seems legit, it's actually meaningless. A professional user can easily point out your mistake, and then accuse you of collecting false evidence to advertise for Intel, even though it could be an honest mistake. In people's eyes, it looks like you are the one who got paid.
 
Last edited:

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
Maybe I'm wasting time talking to someone who claims that people with a different opinion are paid. Maybe you should educate yourself about x265 first, that is --

The core of x265 is written in hand-optimized assembly. The most expensive parts are not written in C++; they are written in ASM and compiled by YASM/NASM. There shouldn't be a huge performance difference anyway.

Disclaimer: I was a contributor to x265 and I maintain a modded x265 fork.

EDIT:

I feel I didn't phrase it clearly, so I'm adding this. I'm NOT saying Intel is to blame here; however, it's pretty easy to make mistakes when picking examples. The example you gave was not a good one because, even though it seems legit, it's actually meaningless. A professional user can easily point out your mistake, and then accuse you of collecting false evidence to advertise for Intel, even though it could be an honest mistake. In people's eyes, it looks like you are the one who got paid.
The core of x265 is written in hand-optimized assembly. The most expensive parts are not written in C++; they are written in ASM and compiled by YASM/NASM. There shouldn't be a huge performance difference anyway.
I tested many times, and nothing obvious was seen. But even if the differences only come from overhead, there WILL be differences. When the instruction-set settings at compile time are not correct, performance drops noticeably.

If you're trying to say that x265 compiled in different ways shows no difference in performance, even when someone cheats by disabling instructions etc., then try it yourself. Even just enabling AVX512 (at compile time, not in the code) brings a performance uplift.

And you think I've only tested x265?

Here's another example: OSPRay, which is not only compiled but also written directly by Intel. Its simple benchmark tool was run on a 3700X and an 8700K (lower is better).
10.JPG
Do you see them intentionally limiting AMD's performance? No?

They have little reason to do that now; it doesn't match their current business model, and they should not be accused of it.

So this accusation is problematic.
A professional user can easily point out your mistake, and then accuse you for collecting false evidence for the purpose of advertising for Intel, even though it could be an honest mistake.
At least I won't claim that based on a singular "doesn't look right to me" example. What I see as proof is like this:
9.jpg

How interesting that, in my own render test, a 16-core shows lower power consumption than an 8-core of the same architecture when running Prime95.

And this kind of graph is everywhere now.
In people's eyes, it looks like you are the one who got paid.
Nobody other than a fanboy will see it that way just because somebody says it that way.
 
Last edited:

RageBone

Active Member
Jul 11, 2017
570
141
43
@111alan I think you are the perfect example of the exact issue you are crying about.

It's like wrecking one's car and then blaming the manufacturer because they marketed the car as safe.
And then claiming that they must be using brainwashing and bribery to force people to buy their cars.

Even worse, you only criticize.
You are not discussing any of your own or others' points; you just add new ones.
That is not a discussion.
I think that can be called misguided propaganda.
You aren't even trying to be objective and responsible.
And I find that to be exactly what you cry and scream bloody murder about.
You still make some valid points, but touch your own nose and do better!

EDIT: I also really agree on the "lost in translation" feeling.
 
  • Like
Reactions: ReturnedSword

ReturnedSword

Active Member
Jun 15, 2018
526
226
43
Santa Monica, CA
@111alan As I said before I respect everyone’s opinions, and keep an open mind, as should all people. I respect that you have put a lot of time to create the graphs to share as well. I can’t help but feel like some of this discussion is being lost in translation though.

But here’s the thing: every platform built on every architecture is going to have limitations. Why can’t I have the best cores and best I/O, with the specific features XYZ that I want to use? In the enterprise area that I work in, my advice in enterprise design meetings has always been: let’s look at what’s available, test the platforms against our requirements, and pick the best choice within our budget. And I can say there have been times when I personally managed very large multi-million-dollar USD budgets and still couldn’t get what I and the design team wanted. If we in the enterprise field, with direct or second-tier access (through the vendor) to the component makers, can’t get 100% of what we want, what does that say about end users (even “big budget” ones)?

My question is a simple one though and still stands: If a platform doesn’t work for the user, why not go over to the other side and buy their platform instead? Going back and forth about theoretical is very fun, but at the end of the day, isn’t very helpful.
 
  • Like
Reactions: RageBone

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
@111alan I think you are the perfect example of the exact issue you are crying about.

It's like wrecking one's car and then blaming the manufacturer because they marketed the car as safe.
And then claiming that they must be using brainwashing and bribery to force people to buy their cars.

Even worse, you only criticize.
You are not discussing any of your own or others' points; you just add new ones.
That is not a discussion.
I think that can be called misguided propaganda.
You aren't even trying to be objective and responsible.
And I find that to be exactly what you cry and scream bloody murder about.
You still make some valid points, but touch your own nose and do better!

EDIT: I also really agree on the "lost in translation" feeling.
Sorry, there are just too many things that prove my point; I don't know which to post. But I do think I've addressed everything people asked me about.

BTW, about the more concrete proof you asked for two weeks ago: I'm working on it.

Did I make up any rumors, fake any test data, or insult anyone personally? Please point it out if I did.

I'm criticizing this kind of behavior, not writing a review; what else should I do other than "criticize"? Yes, there are multiple aspects of AMD product design worth talking about in a review, but wouldn't they belong in a different topic?

About the objective part: I've probably posted more proof in this thread than anyone else. Did I say anything that's purely opinion, or make anything up?

If there is anything, point it out. Perhaps it's due to poor phrasing on my part; I'd be happy to answer again. Sorry, I've never engaged in an English argument before; I'll be careful.

I don't usually attack people; I've always tried to discuss things peacefully. But I've learned from experience that when people start by attacking me, my attitude probably won't be so good.
 
Last edited:

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
@111alan As I said before I respect everyone’s opinions, and keep an open mind, as should all people. I respect that you have put a lot of time to create the graphs to share as well. I can’t help but feel like some of this discussion is being lost in translation though.

But here’s the thing: every platform built on every architecture is going to have limitations. Why can’t I have the best cores and best I/O, with the specific features XYZ that I want to use? In the enterprise area that I work in, my advice in enterprise design meetings has always been: let’s look at what’s available, test the platforms against our requirements, and pick the best choice within our budget. And I can say there have been times when I personally managed very large multi-million-dollar USD budgets and still couldn’t get what I and the design team wanted. If we in the enterprise field, with direct or second-tier access (through the vendor) to the component makers, can’t get 100% of what we want, what does that say about end users (even “big budget” ones)?

My question is a simple one though and still stands: If a platform doesn’t work for the user, why not go over to the other side and buy their platform instead? Going back and forth about theoretical is very fun, but at the end of the day, isn’t very helpful.
Selecting based on requirements is exactly what I always tell people to do. What I'm talking about is more the faking of results and the use of internet mobbing to silence any opposing opinion, which distorts many people's views and makes them irrational without their knowing it.
 

msg7086

Active Member
May 2, 2017
397
144
43
35
Just say even by activating AVX512(during compiling, not during coding) brings a perf uplift.
I don't understand how you could compile x265 with or without AVX512. The AVX512 code is embedded in the asm files such as ipfilter8 and ipfilter16, and is compiled by YASM; the C++ compiler is not involved here. Like I said, all the core code of x265 is written in hand-crafted asm. It's not even possible to strip that code out of the asm files. Just take a look at https://github.com/msg7086/x265-Yuuki-Asuna/blob/stable/source/common/x86/ipfilter16.asm#L386 : that is the AVX512 core code that's already there. Go grab a yasm.exe and compile this file, and you get an object with all the AVX512 instructions inside it.

If you are talking about the I/O, such as y4m parsing or the mp4 muxer, yes, those are written in C++ and can be optimized by C++ compilers, but they take, like, 0.1% of all CPU time.
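To make the split concrete, here is a minimal sketch (the embedded three-line snippet is a made-up stand-in, not actual x265 source) of why the C++ compiler choice cannot touch the hot kernels: anything using 512-bit `zmm` registers lives in the assembly sources and is assembled verbatim by YASM/NASM:

```python
import re

# Made-up stand-in for a few lines of an x265 .asm kernel; the real
# files (e.g. ipfilter16.asm) contain hand-written code like this.
asm_source = """\
movu      m0, [r5]                  ; width-agnostic macro
vpmaddwd  zmm2, zmm0, zmm1          ; 512-bit math -> AVX-512
paddd     xmm3, xmm4                ; 128-bit SSE
"""

# Lines touching zmm registers are AVX-512 by definition; YASM/NASM emit
# them as-is, so MSVC vs. ICC makes no difference to these kernels.
avx512 = [line for line in asm_source.splitlines()
          if re.search(r"\bzmm\d+\b", line)]
print(len(avx512))  # the one zmm line in this snippet
```

The same scan over the real `.asm` files would show where the AVX-512 paths actually live, independent of any C++ toolchain flags.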

Speaking of AVX512, interestingly Intel just announced that AVX512 will be taken away from all 12th-gen CPU users via a BIOS update. Sounds like Intel users will lose their performance boost from that nice instruction set.
 

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
Is there any proof that these Chiphell forum posters are paid AMD operatives? I highly doubt that. Seems like a typical case of fanboy-ism to me. Fanboys have existed in every subject for as long as the internet has existed. Who cares? Let’s just pick the platform that works best for our needs at the time of build, and move on.
I find it very hard to believe that AMD, which is many times smaller than Intel, and did not have/probably still doesn’t have a PR department throwing a lot of money to influence the public perception, would hire random forum posters anywhere China or Western forums to influence the narrative. If anyone has actual proof I’m more than happy to change my mind on that.

I’m well aware that in China the fanboyism is quite strong. There may be various reasons for that, which I decline to go into in public to protect China-side people. This doesn’t change the fact that it’s still fanboyism and not constructive to the larger conversation. Fanboyism is characterized by overly passionate shouting down others who don’t agree, exaggerating good points, and downplaying or often ignoring bad points. It is not rational, nor does it help anyone including the posters understand the strengths and weaknesses of a particular platform.

I think those upset at architecture changes are entirely missing the point. I don’t understand why anyone would expect HEDT platforms to have socket compatibility across multiple generations. These are professional systems that are often purpose-built, either by big integrators or as white boxes by smaller integrators like me. Professional workstations for actual clients are designed for the needs of the lifetime of the use case, and specified accordingly. Sometimes “and then some” is added for future-proofing.

I understand that there are also enthusiast users of HEDT platforms, but to expect multi-generational support is going a bit too far, for either Intel or AMD. With HEDT CPUs costing $2,000-7,000 USD easily, what is the huge gripe at that point about upgrading a motherboard platform that costs much less? The HEDT market is incredibly small. Traditionally it’s been an afterthought market segment where server platforms are adapted to those who need high performance workstations.

It is usually in a company’s profit interest to have a platform retain some form of upgradeability, but that is only to capture sales from users who don’t want to upgrade their entire system to a new platform. Otherwise it’s obvious a company will rather have a user upgrade the entire platform. If someone can say AMD is a profit motivated entity, what does that say about Intel? They are both for profit entities.

A better, and more constructive way to do things is to architect a system for the use needed, with the available components. I believe, and have seen it happen over and over that companies respond to how users use their systems as long as there is profit to be made. For example, after the Pentium III era I was on Athlon XP/MP through Athlon/Opteron 64/X2. Then I was on Core derived platforms for over 10 years, until AMD forced Intel to increase core counts. Subsequently I’m on Zen derived platforms, but my upcoming NAS will probably be an Intel platform.
Opinions become real when there is proof, and there really is a lot of it.

Now let me prove my point with my own experience. I don't really like to talk about it; it makes me tired just looking at these proofs, but something has to be shown.

There are multiple ways to tell a fan from a paid storyteller. Let's analyze this from several angles.

1. Are they aiming to silence people?
I'm not talking about using sharp words; I'm talking about "physically" silencing the opposing side, no matter who they are.

For example, these people managed to find almost all my posts and replies and rate them negatively. It's not just about hate: that forum stops you from posting once your points fall below 100.

But I've posted several reviews before that earned me hundreds of thousands of views and a lot of points, so they failed.
rate.JPG

But some others are not that lucky. Like this one.

And if they can't do it this way? If their side is exposed as biased? Then lock the thread.

And search their posts: do you see these people doing this when no particular brand is involved? Perhaps never. Yeah, just remove those people so nobody talks bad.

Perhaps I'm not banned only because of my past contributions to that forum.


2. Do they use the same organized set of patterns?

Aside from the situation in the first point, here are other patterns I noticed.
① Making up misleading opinions that can convince the crowd, then spamming them everywhere.

Things like "Intel would double its prices if AMD weren't around", "Intel will be stuck at 4 cores forever (let's not look at its server lineup)", "AMD's next flagship will have 5GHz, 16 cores and a 2000 CNY price" (before Zen 3, of course), and "AMD has much lower power usage" (without mentioning any other conditions).

Even things like "Intel can't compete any more" or "Intel will go bankrupt".

Of course, anyone with the most basic knowledge will know these are wrong, but they sound pretty legit, tbh, especially when combined with fake data.

② Personal attacks without a point.

Fans will usually state the reasons behind their opinion before any kind of personal attack, if there is one. But these people always start with personal attacks and never bring any real proof. For example, there are a lot of them in that thread. They don't even have the knowledge to understand the technical details I talked about, but they always manage to make things up for personal attacks.

BTW, they're obviously violating almost every single forum rule there, so I reported them to the mod in question.

Then, about that mod I mentioned: he basically said, "I don't understand either, but you're wrong. You have no logic, but I can't explain why."
8.PNG
There was also a fanboy there, but when I posted the whole study he agreed with me. Compare him with those people.

And people on NGA talked about this too, which also came with silencing.
nga.JPG

Note that these are basically the only living computer-hardware forums in China that aren't as shallow as Tieba.

There are of course many other examples, including friends of mine who did the testing themselves. They all got their share of baseless internet mobbing, even without using AMD CPUs when testing SSDs.

③ Picking sides: accusing the other side of ill intent rather than reasoning.

This can also be seen a lot in this thread.


④ They come in groups and agree with each other's points, to create momentum.

This usually starts when the previous methods have failed. I don't think it needs an explanation; if you want one, look at the post I referenced above.

You can argue that no single point definitively proves these are 100% paid activities, but together they form a pattern so consistent that you can almost predict what they will do next.

3. Do they have a past record of paid speech control?

A lot. There was a period when Chiphell decided to ban anyone who talked about Yeston or Gigabyte. They promote ASUS like crazy. As mentioned before, PCEVA and PC426 have both exposed them. Excavating articles from several years back is not easy, but I probably will, if this topic continues. Let's just say this is something every veteran PC player here knows.

In fact, paid commenters and paid speech control are very common in China. These people are usually called "5-dime" or "water army" here, names that hint at their cheap recruitment fee and the massive number of posts and replies they produce. How cheap? Well, even some food companies and mobile-phone companies can afford them.

As for paid reviewers, 1500 CNY can buy you a very good commissioned review article. I happen to have a friend doing this kind of work (though he is not biased, as far as I know). You can calculate how many well-known reviewers they would need to buy.
14.JPG

If Chiphell is doing paid speech control, who's paying them? I think you understand.

Let's just say you don't know China. I wish I didn't either.

I'm not saying the entire Chiphell forum is bad; I'm only aiming at the mod in power who is doing this, as well as his collaborators. I take full responsibility for everything I've said, and I have everything backed up, both as photos and as offline pages. If it comes to legal matters, I'm happy to participate.

I think I've explained enough. If there are other questions about the posts I referenced, you can ask me. Just note: don't take what people are saying at face value. My part all started with a simple unboxing-plus-discussion post.
 
Last edited:

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
I don't understand how you could compile x265 with or without AVX512. The AVX512 code is embedded in the asm files such as ipfilter8 and ipfilter16, and is compiled by YASM; the C++ compiler is not involved here. Like I said, all the core code of x265 is written in hand-crafted asm. It's not even possible to strip that code out of the asm files. Just take a look at https://github.com/msg7086/x265-Yuuki-Asuna/blob/stable/source/common/x86/ipfilter16.asm#L386 : that is the AVX512 core code that's already there. Go grab a yasm.exe and compile this file, and you get an object with all the AVX512 instructions inside it.

If you are talking about the I/O, such as y4m parsing or the mp4 muxer, yes, those are written in C++ and can be optimized by C++ compilers, but they take, like, 0.1% of all CPU time.

Speaking of AVX512, interestingly Intel just announced that AVX512 will be taken away from all 12th-gen CPU users via a BIOS update. Sounds like Intel users will lose their performance boost from that nice instruction set.
ICC has some integrated features in VS, including two related select boxes that let you choose the instruction set used and the specific arch to optimize for (like Skylake or Knights Landing). Yes, even when not compiled with AVX512, the instruction set can still be enabled during encoding. I was using a raw video for testing. Still, the performance difference between different compile-time instruction-set settings can be seen; I remember it was less than 5%, but not within the error range.

They actually made the AVX512 hardware, and they are even planning to expand it with AMX. I think they're trying to disable it on desktop platforms now because a lot of reviewers test power consumption with it at full turbo but never show the performance uplift. That makes them look really bad.
 

msg7086

Active Member
May 2, 2017
397
144
43
35
Still, the performance difference between different compile-time instruction-set settings can be seen
I think they're trying to disable it on desktop platforms now because
Well, thanks Intel, I guess? We won't be able to see the difference any more :) unless we pay a premium and get the workstation / server line?

I develop free and open-source tools, and years ago I implemented a few with AVX512 support for Intel users, hoping that Intel would soon bring AVX512 to the desktop line so people could actually benefit from it on affordable hardware. Guess I was wrong, and I should just stick with AVX2?

I'm not saying AVX512 is a good idea. It's complicated, has so many subsets, occupies a large area on the chip, and runs crazily hot. It's those policies and decisions, however, that make me sick. Just fking tell us desktop users won't get it, so we don't waste time working on it and supporting it. (F to those who worked on IA64 lol.) We've gotten used to motherboards and chipsets that last 1 or 2 gens. How about instruction sets that last 1 or 2 gens? lol

Let's see what AMD's gonna do with Zen 4. If they add AVX512 to the desktop line I'll buy AMD again.
 
  • Like
Reactions: RageBone

ReturnedSword

Active Member
Jun 15, 2018
526
226
43
Santa Monica, CA
They actually made the AVX512 hardware, and they are even planning to expand it with AMX. I think they're disabling it on desktop platforms now because a lot of reviewers test power consumption with it at full turbo but never show the performance uplift. This makes them look really bad.
The real reason is likely that the E cores don't support AVX512 at all. To enable AVX512 on ADL, the E cores need to be disabled. That, and artificial market segmentation, which is Intel as usual. That being said, AVX512 is mostly pointless for most users. I'd rather have real ECC support.
 
  • Like
Reactions: NablaSquaredG

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
The real reason is likely that the E cores don't support AVX512 at all. To enable AVX512 on ADL, the E cores need to be disabled. That, and artificial market segmentation, which is Intel as usual. That being said, AVX512 is mostly pointless for most users. I'd rather have real ECC support.
That is one of the reasons. They can't make the small cores as big as a Xeon Phi core just to support AVX512.

But people thought AVX2 was pointless in the past too, and it's not now. An ecosystem needs to be built up before an instruction set becomes useful.
 

111alan

Active Member
Mar 11, 2019
208
47
28
Haerbing Institution of Technology
Well, thanks Intel, I guess? We won't be able to see the difference any more :) unless we pay a premium and get the workstation / server line?

I develop free and open source tools, and years ago I implemented a few with AVX512 support for Intel users, hoping that Intel would bring AVX512 to the desktop line soon so people could actually benefit from it on affordable hardware. Guess I was wrong and should just stick with AVX2?

I'm not saying AVX512 is a good idea. It's complicated, has so many subsets, occupies a large area on the chip, and runs crazily hot. It's those policies and decisions, however, that make me sick. Just fking tell us desktop users won't get it, so we don't waste time working on it and supporting it. (F to those who worked on IA64 lol.) We've gotten used to motherboards and chipsets that last 1 or 2 gens. How about instruction sets that last 1 or 2 gens? lol

Let's see what AMD's gonna do with Zen 4. If they add AVX512 to the desktop line I'll buy AMD again.
I think they'll eventually bring it back in future gens, when the big cores get bigger and each big-core slot offers more than four small cores' worth of area to fit the extra units in. Or they could go for an EMIB-linked multi-die solution.

BTW AVX512 doesn't take any extra space on the desktop platform (only one unit, made by combining two existing 256-bit AVX units). On servers it looks like this (upper-left corner):
Skylake_core.jpg

The reason for the extra heat is higher utilization of the entire architecture due to the higher throughput, plus the additional vector engine on servers. Of course it will run extra hot without a clock offset, but with a proper one it will not use more power than usual while still delivering more throughput (2x per clock).

If Zen 4 is to include AVX512, they would need to get rid of the current FP unit design first, otherwise it would be far too space-consuming. We'll see how things turn out.
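The offset tradeoff described above can be made concrete with a toy calculation: if AVX512 doubles per-clock throughput but the sustained clock drops by the offset, the net gain is the product of the two. All clock figures below are made-up illustrative numbers, not measurements of any real CPU.

```python
# Toy model of the AVX512 offset tradeoff: double the per-clock throughput,
# delivered at a lower sustained clock.

def net_speedup(base_clock_ghz, avx512_clock_ghz, per_clock_gain=2.0):
    """Throughput of AVX512 code relative to 256-bit code at base clock."""
    return per_clock_gain * (avx512_clock_ghz / base_clock_ghz)

# A hypothetical 4.0 GHz all-core clock that drops to 3.0 GHz under the
# AVX512 offset still comes out well ahead:
print(net_speedup(4.0, 3.0))  # 1.5
```

So even a sizable clock offset leaves AVX512 code faster per watt-second, which is the point about throughput versus raw power draw.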
 
Last edited:

RageBone

Active Member
Jul 11, 2017
570
141
43
Opinions become real when there is proof. And there really are a lot of them.
Now let's prove my point, with my own experience.
So, I don't think that's how any of that works.
Proving something especially does not work, and should not work, like that!

Would you agree that you are a proven Intel shill just because I perceived you as such?
Does it become proof when you write a thing X times and I perceive you that way Y times?

I hope and think not!

Proof would start with witness testimony from people who were paid and did the deed,
or with bills showing the payments.

A related issue is the location of the problems you mention, for example with Chiphell.

Every time you bring them up, it makes me think you misunderstood or misconstrued something I or others said as an attack on you, in the way you accuse those Chiphell people of.

Which appears to be wrong.

Strictly hypothetically speaking:
If I were to say that AMD makes the best CPUs!
You might disagree.
You might ask me why I think that.
If I then say that I don't want to be silenced and that I was silenced by bullies earlier,
how does that make me look?

Because that is how you appear to me right now.

It doesn't get better if I then claim that it wasn't you, that it was some other place on the web.
And it gets worse if I keep doing that instead of answering your goddamn questions.

Now on to a more real "doing my own benchmarks" part of the topic.
You are hopefully aware that benchmarking is very hard, at least proper benchmarking!
Why is that?
Well, each vendor does things differently.
BIOSes often apply overclocks by default to make the manufacturer and the board look better.
Just because you have set it to JEDEC or XMP settings does not mean that ALL the timings, especially the weirder secondary and tertiary ones,
are actually the same, and those can have an effect.
Another obvious issue is power limits and boost clocks, which aren't as sure to apply as one thinks.
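The pitfalls above can be softened with basic measurement hygiene: warm up first, repeat the run, and report the median rather than a single number. A minimal sketch (the `workload` here is just a stand-in for whatever you actually measure):

```python
import statistics
import time

# Minimal benchmarking harness: warm up, repeat, take the median so a
# stray boost spike or thermal-throttle event doesn't decide the result.

def bench(workload, warmup=3, runs=10):
    for _ in range(warmup):               # stabilize caches and clocks
        workload()
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        workload()
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples)

median_s = bench(lambda: sum(i * i for i in range(10_000)))
print(f"median: {median_s * 1e6:.1f} us")
```

It obviously doesn't fix mismatched memory timings or hidden power limits, but it does make a single run's noise visible instead of baking it into the headline number.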

One example I can give of my own is with a Rome 64C ES.
It does about 6K CB20 points.
Considering a 7401P 24C Naples does more than 7K points, that is pretty bad.
So I could now scream and shout about how shit Rome CPUs are.
I am not doing that.
Why?

Because it should be obvious that there are issues with my test-setup.

The CPU has a turbo of 2 GHz and a base clock of 1.4 GHz on all cores.
It has no issues hitting both at idle and in few-core loads.
The moment all cores are loaded, the core clock drops to 400 MHz.

So one could argue that at 400 MHz, 6K CB20 points is actually not bad.
Power usage is reported at about 80 W total, and the last time I looked,
I noticed the core temperature on one or more cores was at 90°C or more.
The rest were fine at around 40°C.

A reasonable question from you might be whether I was running it without a cooler.
I sadly did use one, a TR NH-U12; running without it would have been a funny way to massacre the performance.
Thinking about this now, maybe it's bad TIM on a chiplet,
or a power- and temperature-reporting issue.
I will have to investigate that further.

Still, there is valid criticism to be had that my specific CPU and setup aren't behaving as they should.
Using that to proclaim Intel's superiority in server hardware would not speak well for my mental faculties and character.
It might still be my valid opinion, but not truth or facts.

EDIT: CB23 to CB20, my mistake.
Thinking about it, 6K CB20 points at 80 W isn't that bad either, right?
Ok, nah, it's shit. A 5700G does that too, and it's a 65 W part.
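To sanity-check the "not bad at 400 MHz" intuition, a quick per-clock calculation helps. Only the Rome ES figures (about 6000 pts, 64 cores, 0.4 GHz under load) come from this post; the 5700G all-core clock below is a rough assumption, not a measurement.

```python
# Back-of-the-envelope points-per-core-GHz for the scores in this thread.

def pts_per_core_ghz(score, cores, all_core_ghz):
    return score / (cores * all_core_ghz)

rome_es = pts_per_core_ghz(6000, 64, 0.4)
print(f"Rome ES (throttled): {rome_es:.0f} pts per core-GHz")

apu = pts_per_core_ghz(6000, 8, 4.3)  # assumed ~4.3 GHz all-core
print(f"5700G (assumed clock): {apu:.0f} pts per core-GHz")
```

Normalized per clock, the throttled Rome ES actually looks healthy, which supports blaming the setup (clocks, reporting, TIM) rather than the silicon's IPC.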
 
Last edited: