512 GB of unified memory is insane. The price will be outrageous, but for AI enthusiasts it will probably be worth it.

  • KingRandomGuy@lemmy.world
    2 days ago

    This type of thing is mostly used for inference with extremely large models, where a single GPU will have far too little VRAM to even load a model into memory. I doubt people are expecting this to perform particularly fast, they just want to get a model to run at all.
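    The back-of-the-envelope math is simple: weights alone take roughly (parameter count) × (bytes per parameter). A quick sketch, using a hypothetical 405B-parameter model as the example size:

    ```python
    def model_memory_gb(params_billions, bytes_per_param):
        """Rough memory needed just to hold the weights, in GiB.

        Ignores activations, KV cache, and runtime overhead, so real
        usage is somewhat higher.
        """
        return params_billions * 1e9 * bytes_per_param / 1024**3

    # A 405B-parameter model at common inference precisions:
    for name, bpp in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
        print(f"{name}: {model_memory_gb(405, bpp):.0f} GiB")
    # fp16: 754 GiB
    # int8: 377 GiB
    # int4: 189 GiB
    ```

    Even quantized to 8 bits, a model that size needs ~377 GiB for weights alone, which is why 512 GB of unified memory matters: no consumer GPU comes close, and it runs at all rather than not at all.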