
Is there a goddamn way of making a setTimeout() call in Node.js work like a ManualResetEvent with a timeout in C#?

where you just say "HEY GO AHEAD GODDAMN IT!" and it goes on ahead, cancelling the delay?

Comments
  • 0
    You mean you want to cancel it?
  • 1
    Save the handle when you call setTimeout() and use it in clearTimeout():

    const timeoutId = setTimeout(function () {
      console.log("Hello World");
    }, 2000);

    clearTimeout(timeoutId);
    console.log(`Timeout ID ${timeoutId} has been cleared`);
  • 0
    @ScriptCoded no, I mean I want to make it proceed immediately.
  • 1
    - "Hey, I wanna use setTimeout"
    - "Why, to run something after a scheduled delay or at least queue it in the event loop?"
    - "No to proceed immediately!"

    Sorry doesn't make sense.
    Either you didn't explain well, or this is an X/Y problem. Or you want something like:

    if (flagManualReset) {
      // run
    } else {
      setTimeout(() => {
        // run ...
      }, delay);
    }
  • 1
    @webketje no, I want something that causes a thread to proceed, like a semaphore.

    To keep the method going I use setTimeout, like setInterval, but simply don't renew it when everything is processed.

    So no, the mockery doesn't make sense. If you READ THE QUESTION and looked up what a MANUAL RESET EVENT is,

    you'd understand that JavaScript is asynchronous but not truly multithreaded like C# is. Because it's a crap language only losers specialize in, youuuuuuu BASTARD YOU LOL
  • 0
    @webketje I think I'll just start more jobs from the queue and repeat a small amount of code since simply pausing the thread would be too efficient for fucking javascript.
  • 0
    Use two arrays then. One for locked and one for unlocked calls. Process the unlocked with a worker. Move the calls between them as needed.

    If you want control over which locked call you fetch, use an object instead with what ever key you want to use to identify them.
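    That two-queue idea could look roughly like this. A minimal sketch with invented names (`enqueue`, `unlock`, `worker`), assuming jobs are promise-returning functions:

```javascript
// Two queues: "locked" jobs wait, "unlocked" jobs get processed.
// Moving a job from one array to the other acts as the gate.
const locked = [];
const unlocked = [];

function enqueue(job, isLocked) {
  (isLocked ? locked : unlocked).push(job);
}

// Release a specific locked job so the worker may pick it up.
function unlock(job) {
  const i = locked.indexOf(job);
  if (i !== -1) unlocked.push(...locked.splice(i, 1));
}

// Drain the unlocked queue one job at a time.
async function worker() {
  while (unlocked.length > 0) {
    const job = unlocked.shift();
    await job();
  }
}
```

    To pick locked jobs by a key rather than by reference, swap the arrays for plain objects keyed however you like, as the comment suggests.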
  • 0
    @RagingCodeChimp

    a familiar apology:

    if (hashjobs < hashjoblimit) {
      // maybe it's an irrational fear, but the reason I don't just leapfrog off
      // method completion is the fear that this will silently error out and die
      // like can happen with threads in other programming languages.
      // yes, it may be irrational.
      if (hashqueue.length == 0) {
        await grabNextHashQueue();
      }
      if (hashqueue.length > 0) {
        var item = hashqueue.shift();
        await startHashJob(item);
      }
    }

    familiar apology in this strategy instead.

    sigh weird weird weird node js.
  • 0
    @AvatarOfKaine that comment really made my day hahaa brilliant. XD

    I'm aware of JS vs C#'s "threadedness" and their drawbacks and advantages. I've never been confronted with the need for this feature, so I'm curious why promises or workers (as @RagingCodeChimp suggested) wouldn't cut it.

    Then again, mostly been working w Node as a "negotiator" server between front-end & back-ends that handle lower-level OS stuff
  • 0
    @webketje well, herein we touch on my well-learned Microsoft-based paranoia about stuff dying quietly, giving the illusion of functioning lol like... threads.. and thread lock.. etc

    certainly what you could do is this:

    populate a processing queue to the initial maximum

    start the maximum number of asynchronous jobs and hope they ALWAYS resolve their promises.

    on promise resolution start a replacement job.

    continue until the processing queue cannot be refilled.

    It's still UGLY AND DOESN'T WORK RIGHT!!! LMAO I'd have to mark another status flag in the fucking database to keep my query from pulling back shit already in queue, or pass a set of keys in to check against to make sure of no duplication.

    i'm annoyed.
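    The refill strategy described above (fill a queue to a maximum, start a replacement job on each resolution, stop when the queue runs dry) can be sketched like this; `fetchBatch` and `processItem` are hypothetical placeholders, and `Promise.all` is used so a rejected job surfaces instead of dying quietly:

```javascript
// Keep at most `limit` jobs in flight; every finished job
// immediately pulls the next queued item.
async function runPool(fetchBatch, processItem, limit) {
  const queue = await fetchBatch(); // e.g. rows from the database

  // Each "lane" loops until the shared queue is empty.
  async function lane() {
    while (queue.length > 0) {
      const item = queue.shift();
      await processItem(item); // a rejection here rejects the pool
    }
  }

  const lanes = Math.min(limit, queue.length);
  await Promise.all(Array.from({ length: lanes }, () => lane()));
}
```

    A rejection from any `processItem` call rejects the `Promise.all`, so the error propagates to whoever awaits `runPool` rather than vanishing, which addresses the "silent death" worry.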
  • 0
    @webketje mainly I'm just afraid of the job dying quietly and not propagating the error to the rest of the app, which is probably completely unfounded, as I've seen code buried in the depths of an async module levels down crash my whole fucking app without an easily traceable cause! lol
  • 0
    @webketje I think I enjoy the self abuse of node js. because seriously other than the 'ease' of oauth processing i don't know why the hell i used it.
  • 1
    @webketje all things considered, I just processed 15 GB of SHA-256 hashing in the time it took to bitch about this with my method, and didn't really touch my available RAM that much..... not bad, I suppose.
  • 1
    @AvatarOfKaine glad it turned out better than expected :D. For metalsmith I wrote a batchAsync method that might be of interest: https://github.com/metalsmith/...

    Though the main preoccupation there was with max simultaneously open file handles on an OS, you can use it to limit concurrency to n for any promise-returning methods
  • 0
    @webketje I've returned to this goddamn SIMPLE project now multiple times with large gaps between them. It's a Sisyphean task! And I'm not dead!
  • 0
    @webketje this is good practice for bypassing issues in other peoples systems.

    google doesn't always publish all metadata characteristics immediately of an uploaded file even after processing is done. so. yay.
  • 1
    @webketje sigh something just demonstrated
  • 1
    Is this what you're looking for?
  • 0
    JS has no standard library, so this is the best we have. You (or someone else) can make it into an NPM package, though I'd recommend against using such a package because it makes it impossible to inject time for unit testing.
  • 0
    @lbfalvy so it would return a function I could just call ?
  • 0
    @lbfalvy I made my own thread sleep with await and async and a while loop lol it sucks lol but it works to keep google from throttling me
  • 0
    @AvatarOfKaine yeah, that's a function you can call to immediately fire the event.

    Now looking at it, this version can call the function twice if you call the return value twice or call it after time has expired.
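    A version that avoids that double-fire problem might look like this: a sketch of a "manual reset"-style delay (`signalableDelay` is an invented name) where a `done` flag makes the callback fire at most once, whether it's released early, released twice, or left to time out:

```javascript
// A delay that can be released early, loosely like waiting on a
// C# ManualResetEvent with a timeout.
function signalableDelay(ms) {
  let done = false;
  let release;
  const promise = new Promise((resolve) => {
    const id = setTimeout(() => {
      if (!done) {
        done = true;
        resolve("timeout"); // the delay expired normally
      }
    }, ms);
    release = () => {
      if (!done) {          // guard: fire at most once
        done = true;
        clearTimeout(id);   // cancel the pending timer
        resolve("signaled");
      }
    };
  });
  return { promise, release };
}

// Usage:
//   const { promise, release } = signalableDelay(5000);
//   release(); // "HEY GO AHEAD" — skips the rest of the delay
//   await promise; // resolves with "signaled"
```

    Awaiting `promise` gives the setTimeout behavior by default, while calling `release()` skips the remaining wait; extra calls and post-expiry calls are no-ops.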
  • 0
    @AvatarOfKaine Why not just

    await new Promise(r => setTimeout(r, 1000))
  • 0
    Here's an improved version of the above:

    https://pastie.io/ocyinl.js
  • 0
    @lbfalvy yea, I've noted the poor organization of some things; JS can do this if one is not careful
  • 0
    @lbfalvy wouldn’t r have to be another function ?
  • 0
    @lbfalvy or wait wouldn’t that just keep calling the wait ?
  • 0
    PS: if people do the same things and we respond the same way, it's their fault. Stop incentivizing insanity.
  • 0
    @AvatarOfKaine r is the resolve function which Promise passes to its argument. The promise is resolved when you call this function. I should've spelled resolve out for easier parsing but this expression is pretty much muscle memory to me.
  • 0
    Promise takes one argument, a function which takes two arguments, both functions. Calling the first resolves the promise with whatever you pass to it, calling the second asynchronously throws the error you pass to it. (Unfortunately these are typically not error objects but that's a different topic)
  • 0
    The promise constructor is literally built to wrap a callback-based API.
  • 1
    @lbfalvy I often access it via the async keyword; then it's just implied, and I use either await or then/catch
  • 0
    @AvatarOfKaine well yeah, but if your API isn't already promisified await doesn't do much. That's where the monadic promise API shines.
  • 1
    @lbfalvy all that aside, do you know how to extract one frame at a time from a video file with ffmpeg starting at an exact frame number, not a timestamp?

    like, the goddamn ffprobe thinger says there are x frames.

    i want to grab each frame.

    ffmpeg doesn't seem to like this or provide a way to say "start at frame x". -ss is goddamn time!

    and i don't see another way, and i really don't want to rebuild OpenCV, which for some reason doesn't install with pip!
  • 0
    Put your code in a function and do this:

    doStuff();
    setTimeout(doStuff, 1000);
  • 1
    @AvatarOfKaine I don't know, can't you use some attribute extraction library or program to get the framerate and divide your frame number by it to get a floating point second-based timestamp for ffmpeg?
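    If the video has a constant framerate, that conversion is just frame ÷ fps. A sketch (the ffprobe/ffmpeg invocations in the comments are from memory, so double-check them; none of this holds for variable-framerate files):

```javascript
// ffprobe reports r_frame_rate as a fraction such as "30000/1001".
// Convert a frame index into a floating-point timestamp in seconds.
function frameToTimestamp(frameIndex, rFrameRate) {
  const [num, den] = rFrameRate.split("/").map(Number);
  return frameIndex / (num / (den || 1));
}

// Roughly:
//   ffprobe -v error -select_streams v:0 \
//     -show_entries stream=r_frame_rate -of csv=p=0 input.mp4
//   ffmpeg -ss <timestamp> -i input.mp4 -frames:v 1 out.png
```

    ffmpeg's select filter can also address a frame number directly (something like -vf "select=eq(n\,120)" together with -frames:v 1), which sidesteps the timestamp conversion entirely; worth checking against the ffmpeg filter documentation.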
  • 1
    @lbfalvy if there are x number of frames, that would mean there are key frames and differential frames, but in the end x number of images that can be composed via transition

    Why would I want half images and hops when I want to process every image?
  • 1
    @lbfalvy course if I wanted to do media recognition at different frame rates, that might make more sense

    But that’s not what I’m doing
  • 0
    @AvatarOfKaine I forgot there are dynamic framerate video formats.
  • 1
    @lbfalvy ya confusing and very detailed subject

    I learned all about it back in the day

    Container vs streams vs compression, and how that compression works, and why videos get distorted sometimes and suddenly recover their data, and all that happy crap, and video bitrate, and wtf PAL vs NTSC is. Fun fun