EXPIRED Samsung/NetApp 7.68TB PM1643 SAS SSD - eBay - EUR 250


luckylinux

Well-Known Member
Mar 18, 2012
1,527
478
83
The main benefit of the API is that it returns a JSON object and you can apply lots of filters on it. So for instance you can get rid of items where multiple sizes are advertised and the search picks them up because of the price of the lowest size. You also get to search on multiple eBay platforms (right now I only do UK, US and DE). You can't really search by part number as the descriptions are awfully inconsistent and imprecise on eBay, and then you have all the OEM alternatives. But if you search for enterprise SSD sizes (3.2TB, 3.84TB), you filter out lots of garbage. The only problem is Intel 4TB drives, which are my main source of garbage.

Then you need to be smart about the logic for parsing the description and filtering out unwanted stuff (cases, USB, PS5, external, etc.). You also need some logic for spotting lots of multiple drives, so you can filter by price per TB rather than absolute price: a $400 lot of 4x 3.84TB drives is a great deal.
But do you use the "raw" JSON API or the Python (or other Language) one?

The eBay API makes the Atlassian JIRA API look like Child's Play, to be honest ...
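Neither of us posted code, but the lot-spotting and price-per-TB filtering described above is roughly what I'd picture in Python (the field names, regexes and $/TB threshold are just my guesses for illustration, not the real eBay schema or anyone's actual code):

```python
import re

# Rough sketch of the filtering described above. Field names ("title",
# "price") and the $/TB threshold are made up for illustration, not the
# real eBay schema.
JUNK_WORDS = {"case", "caddy", "usb", "ps5", "external", "enclosure"}

def parse_lot(title):
    """Guess (quantity, capacity in TB) from a listing title."""
    t = title.lower()
    m = re.search(r"(?:lot of\s*)?(\d+)\s*x\b", t)
    qty = int(m.group(1)) if m else 1
    c = re.search(r"(\d+(?:\.\d+)?)\s*tb", t)
    return qty, (float(c.group(1)) if c else None)

def price_per_tb(listing):
    qty, cap = parse_lot(listing["title"])
    return listing["price"] / (qty * cap) if cap else None

def keep(listing, max_per_tb=30):
    """Drop junk listings and anything above the $/TB threshold."""
    title = listing["title"].lower()
    if any(w in title for w in JUNK_WORDS):
        return False
    ppt = price_per_tb(listing)
    return ppt is not None and ppt <= max_per_tb
```

A $400 lot of 4x 3.84TB comes out around $26/TB and passes, while USB enclosures and overpriced single drives get dropped.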
 

ca3y6

Well-Known Member
Apr 3, 2021
748
718
93
I use the raw REST API using C#. I only needed to implement a couple of types of calls so not a huge amount of work. The difficulty is parsing and filtering the description. The only third-party library I used is to handle the authentication. You also have to be careful about your calls to stay under the daily rate limits.
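Stripped of the HTTP details, the pagination-with-a-call-budget part is basically this (in Python rather than my C#; fetch_page stands in for the actual API call, and the itemSummaries/total keys mirror the Browse API search response):

```python
import time

# Offset pagination with a simple call budget, to avoid burning through
# the daily rate limit. fetch_page stands in for the real HTTP call.
def paginate(fetch_page, page_size=200, max_calls=50, delay=0.0):
    """Yield items page by page until the reported total is exhausted."""
    offset, calls = 0, 0
    while calls < max_calls:
        page = fetch_page(limit=page_size, offset=offset)
        calls += 1
        items = page.get("itemSummaries", [])
        yield from items
        offset += len(items)
        if not items or offset >= page.get("total", 0):
            break
        time.sleep(delay)  # crude pacing between calls
```

In practice fetch_page would wrap a GET against the Browse API's item_summary/search with your OAuth token.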
 

luckylinux

Well-Known Member
Mar 18, 2012
1,527
478
83
I use the raw REST API using C#.
It's been a long Time since I tried to use C# :(. I'm mainly doing Python Stuff nowadays :cool: .

I only needed to implement a couple of types of calls so not a huge amount of work.
I can imagine that. If it's only a Couple of EndPoints and a couple of Methods (GET/POST etc.), IMHO most of the Work might be implementing Pagination to walk through the Result Pages (via "Cursor", if that's how they implement it).

The difficulty is parsing and filtering the description.
I guess I would have to try it to see how hard it is indeed. Shouldn't be too hard IMHO. But already Locale-Dependent Settings play a huge Part in how good the Results are: whether you look for 3.84TB / 3,84TB / 3840GB makes all the Difference, and of course it's never consistent.
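Normalizing those Spellings to one Number is probably the first Helper I'd write. A quick Sketch (decimal comma and GB→TB only, no GiB handling):

```python
import re

# Normalize the capacity spellings above (3.84TB / 3,84TB / 3840GB)
# to a single float in TB. Decimal commas are treated as points.
_CAP = re.compile(r"(\d+(?:[.,]\d+)?)\s*(tb|gb)", re.IGNORECASE)

def capacity_tb(text):
    m = _CAP.search(text)
    if not m:
        return None
    value = float(m.group(1).replace(",", "."))
    return value / 1000 if m.group(2).lower() == "gb" else value
```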


The only third-party library I used is to handle the authentication.
Pretty sure I looked briefly into the raw API, saw there were like 4 or so Types of API Keys, then said "Forget it".


You also have to be careful about your calls to stay under the daily rate limits.
Probably I will need to split it into a "Crawler" Part and a "Processor" Part, especially when testing in the Beginning.

Let the first Part cache the Results (either save a JSON File, or possibly much more advanced stuff like Valkey / Redis or anything in-between), then once that's done run the 2nd Part.
 

heromode

Well-Known Member
May 25, 2020
534
327
63
I would recommend Geo-Ship.com for an easy solution; you can do a few searches for free when you register, and more if you pay.
 

ericloewe

Active Member
Apr 24, 2017
344
168
43
32
Seller did go for 240 instead of 250 (good enough to cover shipping). Ordered four, will share details once they arrive next week-ish.
 
  • Like
Reactions: Benno

usbTypeD

Member
Apr 2, 2020
30
14
8
Also bought two. Have yet to find a use case for them, but I don't think people here will judge.
 

ca3y6

Well-Known Member
Apr 3, 2021
748
718
93
I am not related to the seller, but from the seller's screenshots, the drives are already formatted to 512b; that's the logical block size in the smartctl output. Now when you buy an OEM drive there is always a small risk, but NetApp drives have been fairly solid for me, and if a drive is not formatted to 512b and cannot be formatted to 512b, I think you would have a claim that it was mis-advertised by the seller on eBay, which would warrant a return.
 
  • Like
Reactions: fossxplorer

luckylinux

Well-Known Member
Mar 18, 2012
1,527
478
83
To be fair, I think he said "Should be able to be formatted ...". Should != Shall/Must/Can.

EDIT 1: he said "most Probably":
They can either be used in Netapp or with a low level format most probably also in other servers supporting SAS SSD.
 
  • Like
Reactions: fossxplorer

vitamins

New Member
Oct 27, 2023
8
13
3
NetApp E-Series drives (the E-X4133A sticker) do not come in 520 bytes/sector, only 512 bytes/sector. E-Series runs a different OS, SANtricity, rather than ONTAP. The version compatible with ONTAP would have an X319A label on it and show up as an X319A with NAXX firmware in the seller's smartctl output.
 

ca3y6

Well-Known Member
Apr 3, 2021
748
718
93
any tips on how to find deals like this? I’d like to snag a few.
There is a discussion earlier in this thread about that. But to be honest I am completely full on SSDs, so I pretty much post here any deal I find (not including auctions, which are rarely good deals, at least not in a deterministic way). The only exceptions would be 15.36TB SAS SSDs, or 7.68TB SATA or U.2 SSDs, where I could potentially upgrade my drives. But these are rarely good value. 3.84TB is generally the sweet spot.

You can watch the Great Deals forum, you will receive an email every time there is a new thread.

I might start a thread for small good deals like the link above that have a quantity of 1 and do not deserve a thread of their own, that people can watch if they want.
 

heromode

Well-Known Member
May 25, 2020
534
327
63
Without knowing anything of the second-hand enterprise SSD market, I have my hopes up that these new 60 and 120TB drives will eventually lead to a flood of 7.68 and 15.36TB drives onto the market...

For big companies the cost savings with large-capacity SSD drives are so significant, they will upgrade.
 

whoknew123

Member
Jun 21, 2025
30
18
8
I don’t work in tech, so coding my own search is a bit beyond my skill range. I see someone recommended a site that does search alerts, so I’ll give that a shot.

I see a 32k hour bug mentioned on some models. Is there a central guide or thread that has various models/issues? Or will I need to google around.

I can’t speak for anyone else but I’d be super appreciative if you posted small quantities.
 

ca3y6

Well-Known Member
Apr 3, 2021
748
718
93
I see a 32k hour bug mentioned on some models. Is there a central guide or thread that has various models/issues? Or will I need to google around.

I can’t speak for anyone else but I’d be super appreciative if you posted small quantities.
Sure, I will start this thread next time I see one.

The 32k hours bug is a nasty bug in the Samsung SAS drive firmware, where the drive bricks itself at 32k power-on hours. I think it affects pretty much all Samsung SAS models (I read somewhere there is a bug at 40k hours too, not sure if it is the same). It was discovered in the second half of 2018 I think, so any drive manufactured from 2019 on shouldn't have it (they might have other bugs).

Now I have seen, and owned, some drives that have not been patched for that buggy firmware but are over 40k hours, so it doesn't seem to kill every drive, and it is not clear whether, if you passed 40k, you are in the clear or not. The problem is Samsung doesn't distribute firmware directly, only some OEMs do. Dell firmware is publicly available. NetApp firmware can sometimes be found.
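The arithmetic for a quick triage on a drive you already own is trivial. A sketch (32,768 = 2^15 is the commonly reported trip point; power-on hours come from smartctl output):

```python
# Quick triage against the ~32k-hour bug described above. 32,768 hours
# (2^15) is the commonly reported trip point; power_on_hours comes from
# the power-on time reported by smartctl.
BUG_HOURS = 32_768

def bug_risk(power_on_hours, patched):
    if patched:
        return "patched, should be fine"
    if power_on_hours >= BUG_HOURS:
        return "already past the trip point, apparently a survivor"
    return f"unpatched, ~{BUG_HOURS - power_on_hours}h until the trip point"
```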
 

ca3y6

Well-Known Member
Apr 3, 2021
748
718
93
Without knowing anything of the second-hand enterprise SSD market, I have my hopes up that these new 60 and 120TB drives will eventually lead to a flood of 7.68 and 15.36TB drives onto the market...

For big companies the cost savings with large-capacity SSD drives are so significant, they will upgrade.
Looking at my "estate" of SSDs (table below), made up mostly of used enterprise SSDs bought over the last 12 months, it is mostly drives made in 2016-2018, so the drives coming onto the market are typically drives that were retired and recycled, and it seems to be a 6-8 year cycle.

Manufactured   NVMe   SAS   SATA
2014            -      -     1%
2015            8%     -     3%
2016            -     19%   22%
2017            8%    38%    6%
2018            8%    30%   13%
2019           19%     3%    5%
2020           13%     9%    8%
2021           21%     1%   12%
2022            6%     -     9%
2023            8%     -     7%
2024           11%     -    15%

So I think we will see larger capacities become increasingly common, as they started being prevalent from 2018-2020. But as for the 30TB+ drives being sold today, they tend to go mostly to hyperscalers (AWS, Azure, etc.) and those typically don't resell their old drives, they tend to shred them (the simple mention of shredding a perfectly good enterprise SSD gives me shivers...).
 

heromode

Well-Known Member
May 25, 2020
534
327
63
That's encouraging; 7.68 and 15.36TB are sizes that homelabbers can start replacing their spinning rust with.

Shredding 30TB+ drives is highly immoral, but I can understand why: it wouldn't take a big mistake in wiping them (or even no mistake at all, given some firmware bug) for customer data to be retrievable by some eBay kid. That could destroy a multi-billion dollar company, and they can't take that risk.