
Pff, today I started 400 threads in Python, initialized in a for loop, one thread per record, each record kicking off a heavy process. Memory Limit Exceeded. I'm frustrated with this shit!

How do I run a heavy task over 400 records without the CPU killing each task on a timeout, or how do I run multiple threads so that each task finishes before the timeout limit without exceeding the memory limit?

☠💩🐱‍💻

Comments
  • 2
I'm thinking I'm a shitty programmer hahahahaha.
  • 2
    You’re going about it the wrong way me thinks
  • 2
If you are running a 64-bit Python, the memory limit is your system's memory limit (more or less).

    As you are writing about threads: if you are not calling into a C library to process your data, multithreading in Python actually decreases performance for CPU-bound work, because the GIL lets only one thread execute Python bytecode at a time.
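
    For CPU-bound work, the usual way around the GIL is processes instead of threads. A minimal sketch with the stdlib's `multiprocessing.Pool`; `heavy_task` is a made-up stand-in for the poster's real per-record work:

    ```python
    # Sketch only: a process pool sized to the machine, not to the
    # record count, so only a handful of heavy tasks run at once.
    from multiprocessing import Pool

    def heavy_task(record):
        # placeholder for the real per-record computation
        return sum(i * i for i in range(record))

    if __name__ == "__main__":
        records = list(range(400))
        # Pool() defaults to os.cpu_count() workers; the 400 records
        # are queued internally and handed out as workers free up,
        # which caps both CPU contention and memory use.
        with Pool() as pool:
            results = pool.map(heavy_task, records)
        print(len(results))
    ```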
  • 4
    Run fewer threads and wait a few seconds longer 🤷‍♂️
  • 7
    How about running only a few threads, with a queue to feed them the tasks?
  • 0
    @ilikeglue That would be sane.

    Where is the fun in it?
  • 6
    I second @ilikeglue .

    400 active threads is a performance pit. It slows you down. Learn a bit about multithreading and heavy calculations: where multithreading is efficient, and where a large number of threads is slower (i.e. the more threads, the slower).

    Unless you have a server with 450 active processing units, 400 heavy-calc threads is a bad idea. Very bad.

    Partition your work into tasks and feed them to a worker-thread pool through a queue. Keep the pool small: the more intensive the tasks, the closer the pool size should be to the number of CPUs/CPU threads.
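
    The pool-plus-queue setup described above can be sketched with the stdlib's `ThreadPoolExecutor`, which queues the tasks internally; `heavy_task` and the pool size of 8 are illustrative assumptions, not the poster's actual code:

    ```python
    # Sketch: a small worker pool fed 400 tasks through an internal queue.
    import os
    from concurrent.futures import ThreadPoolExecutor

    def heavy_task(record):
        return record * record  # placeholder for the real work

    records = range(400)

    # a handful of workers sized to the machine, never one per record;
    # the executor hands out queued tasks as workers become free
    with ThreadPoolExecutor(max_workers=min(8, os.cpu_count() or 1)) as pool:
        results = list(pool.map(heavy_task, records))

    print(len(results))
    ```

    Note that threads only help here if `heavy_task` releases the GIL (I/O, C extensions); for pure-Python number crunching, swap in `ProcessPoolExecutor` with the same interface.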
  • 0
    In addition to the previous answers, you could also use a language with green threads (golang, elixir, ...).
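
    Python has no stdlib green threads, but asyncio coroutines are the closest analogue: cheap concurrent tasks on one OS thread. This only helps if the heavy part is I/O-bound (CPU-bound work still needs processes). A hedged sketch; `fetch_record` and the limit of 10 are made up for illustration:

    ```python
    # Sketch: 400 lightweight coroutines, capped by a semaphore so
    # only 10 are "in flight" at once, all on a single OS thread.
    import asyncio

    async def fetch_record(i, sem):
        async with sem:             # at most 10 run concurrently
            await asyncio.sleep(0)  # stands in for real I/O
            return i * 2

    async def main():
        sem = asyncio.Semaphore(10)
        tasks = [fetch_record(i, sem) for i in range(400)]
        return await asyncio.gather(*tasks)

    results = asyncio.run(main())
    print(len(results))
    ```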