AI and Deep Learning Lab storage - is this what you're using?


AI4me

New Member
Dec 7, 2016
I'm starting forums with vigor today. If I'm signing up, I may as well join in, right?

So I'm going to build a small AI learning machine for my office. Here: https://forums.servethehome.com/index.php?posts/117292/

We've got pretty good connectivity at work, but I don't want to put my skill-development storage on our enterprise arrays, in case I want to do this full time in 2-3 years.

I know you need LOTS of training data. How big are the arrays you're all using? A colleague suggested FreeNAS since it's... well... free. He says it's slow, but it connects to most things.

I'm not really an Ubuntu fan, since it's a distribution for people who don't know how to install RHEL or CentOS. I've heard Ubuntu ZFS and NFS are faster than FreeNAS. Or should I just try a low-power three-node Ceph cluster built from cheap Pentium N systems?
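For reference, a minimal sketch of what the Ubuntu ZFS + NFS route looks like. The pool name (`tank`), dataset name, device names, and subnet are all placeholder assumptions; adapt them to your hardware.

```shell
# Minimal ZFS + NFS sketch for Ubuntu (16.04+); run as root.
# Device names /dev/sdb and /dev/sdc are hypothetical placeholders.
sudo apt install -y zfsutils-linux nfs-kernel-server

# Mirror two disks into a pool named "tank".
sudo zpool create tank mirror /dev/sdb /dev/sdc

# Create a dataset for training data and export it over NFS
# using ZFS's built-in sharenfs property.
sudo zfs create tank/datasets
sudo zfs set sharenfs="rw=@10.0.0.0/24" tank/datasets

# A client on that subnet would then mount it with:
#   sudo mount -t nfs server:/tank/datasets /mnt/datasets
```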
 

Markus

Member
Oct 25, 2015
If you set up a FreeNAS system correctly, the throughput is fast enough...

It depends on:
- Hardware
- Network
- Protocol

You can also go the napp-it route, or OMV and other distributions, but FreeNAS is not slow at all.
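To make the "hardware, network, protocol" point concrete, here is a back-of-envelope sketch of where a NAS bottlenecks. All numbers are illustrative assumptions, not benchmarks of any particular box.

```python
# Rough estimate of the limiting factor for sequential NAS throughput.
# Inputs are illustrative assumptions, not measured values.

def effective_throughput_mb_s(disk_mb_s, network_gbps, protocol_efficiency):
    """Return the limiting sequential throughput in MB/s.

    disk_mb_s: aggregate pool read speed
    network_gbps: link speed in gigabits per second
    protocol_efficiency: fraction kept after NFS/SMB overhead (e.g. ~0.9)
    """
    network_mb_s = network_gbps * 1000 / 8  # Gbit/s -> MB/s (decimal)
    return min(disk_mb_s, network_mb_s * protocol_efficiency)

# Example: a 4-disk pool (~400 MB/s) behind gigabit Ethernet:
print(effective_throughput_mb_s(400, 1, 0.9))   # -> 112.5, the gigabit link limits
print(effective_throughput_mb_s(400, 10, 0.9))  # -> 400, on 10GbE the disks limit
```

The takeaway: on gigabit, almost any sane pool saturates the wire, so "slow FreeNAS" is usually a network or protocol-tuning problem, not an OS one.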

Regards
Markus
 

AI4me

New Member
Dec 7, 2016
Markus said:
> If you set up a FreeNAS system correctly, the throughput is fast enough... It depends on hardware, network, and protocol.
I've not seen many people use FreeNAS for data science. What if I want to download the Yahoo or Wikipedia datasets and train models on them?
 

Markus

Member
Oct 25, 2015
In the end it is just "data", isn't it?

Whether a solution solves your problem depends entirely on your requirements.

What exactly are those requirements (how much storage, how many clients, what kind of requests, read/write ratio...)?

Regards
Markus
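To help answer that, a small sketch of the sizing arithmetic. The growth factor and free-space headroom are assumptions (ZFS pools are often kept well below full for performance), not rules from any vendor.

```python
# Back-of-envelope capacity sizing; all inputs are illustrative assumptions.

def required_raw_tb(dataset_tb, growth_factor=3.0, zfs_headroom=0.8):
    """Usable TB to provision for a given dataset footprint.

    dataset_tb: size of the raw datasets you plan to keep
    growth_factor: headroom for working copies, preprocessed variants,
                   and snapshots (assumed 3x)
    zfs_headroom: fraction of the pool you allow yourself to fill
                  (assumed 80%, leaving 20% free for performance)
    """
    return dataset_tb * growth_factor / zfs_headroom

print(required_raw_tb(1.0))  # 1 TB of datasets -> about 3.75 TB usable
```

So even a modest 1 TB of training data argues for a pool several times that size once intermediate copies and free-space headroom are counted.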