Hi guys,
Just picked myself up an EliteDesk 800 G6 SFF:
Current specs:
- CPU: i5-10500
- RAM: 8 GB
- NVMe SSD: 256 GB
My plan is to beef this up with:
- RAM: Crucial Pro DDR4 64 GB kit (2x32 GB), 3200 MHz
- HDD: 2x 4 TB IronWolf NAS drives
- NVMe SSD: Samsung 970 EVO Plus 1 TB PCIe NVMe M.2
How I’m planning my setup:
The existing 256 GB NVMe will host Proxmox.
The new 1 TB NVMe will be for VMs and LXCs.
The 4 TB IronWolf drives will be configured as a mirror and used as a NAS (what’s the best way to do this?), as well as for bulk data from my services, like Frigate recordings.
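What I’m currently leaning towards for the mirror is ZFS on the Proxmox host, with separate datasets for the NAS shares and for bulk data like Frigate recordings. Rough sketch of what I mean (the drive IDs are placeholders and the pool/dataset names are just what I’d probably use), very open to better suggestions:

    # Sketch only: create a ZFS mirror from the two IronWolf drives on the Proxmox host.
    # Replace the placeholders with the real /dev/disk/by-id paths.
    zpool create -o ashift=12 tank mirror \
        /dev/disk/by-id/<ironwolf-1> \
        /dev/disk/by-id/<ironwolf-2>

    # Separate datasets for NAS shares and for bulk service data (e.g. Frigate recordings).
    zfs set compression=lz4 tank
    zfs create tank/nas
    zfs create tank/frigate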
Services:
- Home Assistant (currently running on a Pi 4)
- Frigate (currently running on a Pi 4)
- Pi-hole (maybe; already running on an OG Pi)
- Nextcloud (calendar, photos)
- Tailscale
- Vaultwarden
- Windows 11
My follow-on projects will be:
Set up PBS to back up my host (via proxmox-backup-client, rough command sketch below) as well as my VMs and LXCs.
I have a Raspberry Pi 4 that I’m thinking of using for PBS in the short term, but I’ll eventually move it to something like an N100 mini PC.
I will also set up a second NAS (most likely TrueNAS on bare metal) to back up the 4 TB IronWolf mirror.
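For the host backup piece, my understanding is it’s basically a single proxmox-backup-client run pointed at the PBS datastore, something like the sketch below (the repository string and datastore name are placeholders for whatever my PBS install ends up being); VM and LXC backups would just be normal Proxmox backup jobs targeting the PBS storage.

    # Sketch only: back up the Proxmox host’s root filesystem to PBS.
    # "backup@pbs@pbs.local:datastore1" is a placeholder repository string.
    proxmox-backup-client backup root.pxar:/ \
        --repository backup@pbs@pbs.local:datastore1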
This is my first proper homelab; I’ve mostly tinkered with Raspberry Pis and Arduinos up to this point, so any advice on my setup would be really appreciated.
I’m curious what you’re doing with Frigate and how you’re doing it without a graphics card?
I’ve been using it for object detection, but I had to install it on my workhorse because my server doesn’t have a graphics card. I suppose it doesn’t need one if you’re not doing ML processing, but I’m still curious.
I’m sure it depends on your workload, but I’ve been running object detection just fine off the iGPU on my i5-8600. I think the key is to make sure the frame size isn’t unnecessarily large for object detection.
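In case it helps: as far as I know, the iGPU detection path in Frigate is the OpenVINO detector, and the relevant bit of config is roughly this (the top-level model section also needs to point at an OpenVINO model, and exact paths vary by Frigate version, so treat it as a sketch):

    # Sketch of the detector section as I understand it; check the docs for your Frigate version.
    detectors:
      ov:
        type: openvino
        device: GPU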
Cool, I might try it.
I wanted to get a dedicated card for video transcription anyway, but it’s good to know I don’t necessarily need it.
Frigate is currently running on a Raspberry Pi 4 with a USB Coral TPU and it runs great, though I only have 2 cameras currently…
If the Raspberry Pi can handle it, I’ve no doubt this server will be more than capable.
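For reference, the Frigate side of the Coral is basically just the edgetpu detector block, something like this (from memory, so double-check against the detector docs for your version):

    # Sketch of the detector section for a USB Coral.
    detectors:
      coral:
        type: edgetpu
        device: usb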
It’s well worth it to get a $50 Coral TPU for object detection. Fast inference speed and nearly zero CPU usage.
I thought about it, but I have a couple of other services that could benefit from a dedicated GPU anyway. Might as well just save for a proper PCIe card.
I have a similar setup and it’s been a huge pain whenever I have to do OS updates.
The Coral needs a DKMS module, but the sources and Google’s own documentation for it are out of date. I would highly recommend using the iGPU for inference.
How efficient is using a GPU? I understood the efficiency wasn’t nearly as good, but that may be outdated info.
I am currently migrating away from my 6th gen i5 to a newer N100.
Speed-wise, it was about the same as the Coral, around 6-8 ms on the i5.