We specified a very optimistic setup for a data science platform for a client...

Minimum: one machine with a 16-core CPU and 64 GB of RAM to process data...

Client's IT department: Best we can do is an 8-core, 16 GB server.

Literally what I have on my laptop.

Data scientist doesn't use any out-of-memory data processing framework (e.g. Dask), despite my telling him it's the most economical way to use memory; ipykernel kills the computation anyway because it runs out of memory.
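
The idea behind frameworks like Dask is to partition the data and process one piece at a time, so peak memory stays bounded no matter how many rows there are. A rough illustration of that principle with only the standard library (this is a sketch of chunked, out-of-core aggregation, not Dask itself; `chunked_sum` and the column name are made up for the example):

```python
import csv
import os
import tempfile

def chunked_sum(path, column, chunk_size=10_000):
    """Stream a CSV and aggregate one column without loading all rows.

    Only `chunk_size` rows are held in memory at a time, so the peak
    footprint is bounded regardless of file size -- the same idea Dask
    applies at scale with partitioned DataFrames.
    """
    total = 0.0
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        chunk = []
        for row in reader:
            chunk.append(float(row[column]))
            if len(chunk) >= chunk_size:
                total += sum(chunk)
                chunk.clear()
        total += sum(chunk)  # flush the final partial chunk
    return total

if __name__ == "__main__":
    # Tiny demo file -- in practice this would be the multi-GB dataset.
    with tempfile.NamedTemporaryFile(
        "w", suffix=".csv", delete=False, newline=""
    ) as tmp:
        tmp.write("value\n")
        tmp.writelines(f"{i}\n" for i in range(1, 101))
        path = tmp.name
    try:
        print(chunked_sum(path, "value", chunk_size=16))  # 5050.0
    finally:
        os.remove(path)
```

A 16 GB box can chew through arbitrarily large files this way; what it can't do is hold the whole dataset in a single in-memory DataFrame, which is exactly where the computation was dying.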

Data scientist has a 64GB machine himself so he says it's fine.

Purpose of the server: rendered pointless.

  • If the data scientist actually is one, they should know best what they need. And if you can do it in-memory, you normally want to do it in-memory...
  • @Oktokolo we're past the point of in-memory at 16 GB. Lots of rows.

    No one is born an expert. My critique was that my suggestion fell on deaf ears.
  • No cloud options?
  • @asgs the client wants everything hosted on premises. No cloud.
  • @brnrdo You suggested a 64 GiB machine and the data scientist uses a 64 GiB machine. So they sort of did follow your suggestion.