4

WTF COULD PYCHARM BE DOING? IT JUST STARTED UP AND IT'S ALREADY USING 200 MB SHY OF A GB OF RAM ???

Comments
  • 5
    Executing subsystems that you aren't using but they decided to save money on the disabling mechanism for

    Rendering HTML because they wanted to save on a GUI dev

    Storing enormous libraries that it uses like 3 functions from because they didn't want to reinvent the wheel

    For the last 20 years the nature and magnitude of the tasks we use our computers for haven't changed substantially. The reason old computers are useless is greed.
  • 0
    @homo-lorens if i were gay i'd blow you right now. :P
    lmao

    so hey another question, jupyter notebooks :P
    i know how to solve this problem WITHOUT using the included examples for wavenet, but where do you specify where jupyter searches for .py files ? their demo isn't running out of the box because it can't find the .py files one level up.
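    (fwiw the usual workaround is to push the parent directory onto sys.path at the top of the notebook. a minimal sketch, assuming the demo's .py files really do live one level up - the path and module name are just examples:)

        import os
        import sys

        # make the directory one level above the notebook importable;
        # point this at wherever the wavenet demo keeps its .py files
        sys.path.insert(0, os.path.abspath(os.path.join(os.getcwd(), "..")))

        # after this, `import some_module_one_level_up` should resolve
        print(sys.path[0])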
  • 0
    @homo-lorens god i'm so fucking tired.
  • 0
    @homo-lorens this is more than just getting older, no, this is the environment, even online, being so ridiculously repetitive

    deja vu just now, not sure if it's the same person, there have been times it wasn't
  • 1
    It's made with java, enough said
  • 0
    @AvatarOfKaine I know nothing about jupyter notebooks.
  • 0
    @Hazarth And Python, says Wikipedia. So...
  • 3
    Don’t you love it when your code/computer/device/politician is doing something you didn’t tell it to?
  • 0
    @Root oh yes very much so
    you know what else I love ?
    using anything but c# on windows, the idea being i'd be able to leave my sim game open and running.

    nope. no such luck.

    pytorch. trying to load the samples for wavenet and it's now complaining about not being able to load the dll when i relocate them into the main user site-packages directory.
  • 2
    So you don't want an IDE... You want a text editor. Then use one.
  • 1
    @IntrusionCM why are you so aggro ? You're throwing off
    my feng shui, which has just been so ridiculously loaded with inner peace since I started seeing fucked up dead people
    wandering around everywhere smirking about being dead
  • 1
    @killames I'm not aggro. Rather a calm, soothing voice.

    Little Boy Blue, come blow your horn...

    The sheep's in the meadow, the cow's in the corn.

    Where is that boy who looks after the sheep?

    He's under a haystack, fast asleep.

    Will you wake him?
    Oh no, not I, for if I do, he'll surely cry...

    *keeps on singing nursery rhymes*
  • 1
    @IntrusionCM *slaps him* sing something more soothing.
  • 1
    @AvatarOfKaine he was just pretending poorly to help in a smartass manner
    You shouldn’t slap him for it..

    Here take this shotgun
  • 0
    you know what the problem is with people trying to make you remember things by upsetting you ?
    eventually the effect wears off, so negative feeders just end up running in a lifelong hamster wheel.
  • 0
    @AvatarOfKaine who intrusion ? He’s just being a troll a grumpy grumpy troll

    Kind of endearing really
  • 0
    @killames well, amusing. endearing may be a tad warm of a term.
  • 0
    @IntrusionCM why don't you love people ?
  • 0
    My guess is it is loading something into RAM.
  • 0
    @killames seems like you don't like me.

    But all IDEs do the same:

    - index all files
    - run static analyzer(s) and parsers
    - create associations between the indexed files via the parser results (e.g. resolving imports)
    - store the result(s) of the static analyzers

    - use an event-driven loop / service to detect file changes, e.g. inotify (rough sketch below)
    - provide a UI framework for highlighting / hints

    - integrate different environments / deployments (e.g. SSH / parsers / linters / ...)

    and so on.

    And all this requires RAM....

    If you want no features or a smaller footprint, choose a text editor.
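    The file-watching part alone looks roughly like this - a minimal sketch with the third-party watchdog package (which wraps inotify on Linux); the handler and watched path are made up:

        # pip install watchdog
        import time
        from watchdog.observers import Observer
        from watchdog.events import FileSystemEventHandler

        class ReindexHandler(FileSystemEventHandler):
            def on_modified(self, event):
                # an IDE would re-parse and re-index the changed file here
                if event.src_path.endswith(".py"):
                    print("would reindex:", event.src_path)

        observer = Observer()
        observer.schedule(ReindexHandler(), ".", recursive=True)  # watch the project root
        observer.start()
        try:
            time.sleep(60)  # keep watching for a minute
        finally:
            observer.stop()
            observer.join()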
  • 0
    @IntrusionCM I never said I don’t like you
    I
    Love you
    I’d do anything to protect you
    I’d do anything to make you happy
    And you seem like you’d do anything to rightly kill me like I deserve because I’m a terrible cheesy piece of garbage that deserves anything that happens to me
  • 0
    @killames wow.

    Not really, more like don't care, don't give a fuck.

    And no, there are few people I can think of that fit your description and none of them are you. To reach that level I had to care and give a fuck, and I think I told them explicitly what I think of them 🤔
  • 0
    @IntrusionCM does that mean there is a lack of caring or there is caring on the intrusions side ?
  • 0
    @killames it means that I don't care - but out of curiosity I post my stream of consciousness, maybe someone gives me another insight into the topic / more information / a different opinion.

    To learn you have to share.
  • 0
    @IntrusionCM honestly your aggro posts are amusing
  • 1
    @killames I still don't get the aggro part... Aggressive would sound different 😂
  • 0
    @IntrusionCM always thought aggro in chat terms meant aggravated and grumpy and venomous lol
  • 0
    @IntrusionCM god I hate this lady near me
    I would like to hang her
  • 0
    @IntrusionCM see that was a bit beyond aggro lol
  • 1
    @killames https://urbandictionary.com/define....

    Nah. It's an abbreviation for aggressive.
  • 0
    @IntrusionCM hey unrelated question.
    what hashing algorithm is acceptable and right for file verification, except in cases where thousands to millions of files are being created and stored and hashed by different users ?
  • 0
    @IntrusionCM like are md5 hash collisions something a normal user would really have to worry about ?
  • 0
    @killames drive em crazy and stuff mr aggro
  • 0
    @AvatarOfKaine If this is a shared system you should use secure hashes, otherwise malicious parties might be able to overwrite known files with colliding alternatives, causing data loss at best.
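    A minimal sketch of file verification with a secure hash (sha256 from hashlib, read in chunks so file size doesn't matter; the path and expected digest are placeholders):

        import hashlib

        def file_sha256(path, chunk_size=1 << 20):  # 1 MiB chunks
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk_size), b""):
                    h.update(block)
            return h.hexdigest()

        # placeholder digest (this particular value is the sha256 of an empty file)
        expected = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
        print(file_sha256("some_video.mp4") == expected)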
  • 1
    Also I think I'll change my nickname, it's meant to be a biology joke but like every other person I meet online thinks I'm gay.
  • 1
    @AvatarOfKaine Not sure what you mean.

    The "except" part is confusing.

    But there are 3-4 separate issues in your question:

    1) concurrent access to file
    2) concurrent access to file hash storage
    3) hashing algorithm
    4) preventing resource starvation / DDoS

    1/2/4 have a lot to do with the specific programming language and OS

    The hashing algorithm is a matter of taste, though Blake3 might be worth looking at.

    It was specifically designed for that.
  • 1
    @AvatarOfKaine md5...

    Is really outdated. Performance- and security-wise rather meh.

    Collisions are... A theoretical problem.

    I'm not saying impossible, but the question is what you want to do.

    If you want to detect duplicate files, it might be a no go. If you just want to verify a file hash, it shouldn't matter - you verify the file path and the file, so the collision shouldn't matter...

    Edit: duplicate file detection based on hash alone.
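    Rough sketch of what I mean by hash-based duplicate detection (sha256 here; the scanned directory is just an example, and you'd still want to byte-compare the grouped files before acting on them):

        import hashlib
        from collections import defaultdict
        from pathlib import Path

        def digest(path, chunk_size=1 << 20):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk_size), b""):
                    h.update(block)
            return h.hexdigest()

        groups = defaultdict(list)
        for p in Path(".").rglob("*"):  # scan the current directory tree
            if p.is_file():
                groups[digest(p)].append(p)

        for h, paths in groups.items():
            if len(paths) > 1:
                print("possible duplicates:", paths)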
  • 0
    @IntrusionCM i think @homo-lorens got the idea.

    the except part being that statistically a hash collision is going to be hard to create, one would think.

    otherwise the hashing algorithm you mentioned, md5, would never have worked in the first place.

    but if people are, say, storing thousands of large video files on a server, chances are the possibility of a hash collision showing up as a false equality would be something to worry about.

    I'm not google so i'm not worried lol

    however i thought md5 wasn't expecting files several gigs in size.... based on some strange need to replace it.
  • 0
    @IntrusionCM also yeah the reason i'm asking is performance, i may not be google but i gots a lot of files i want to check in the case of the same filename showing up somewhere.
  • 1
    @AvatarOfKaine MD5 sucks.

    :)

    There are several alternatives, e.g. Blake 3 I mentioned, Whirlpool or others.

    The obsession with deprecated / completely out of date algorithms is really fascinating in IT. :)
  • 0
    @IntrusionCM well sometimes when you get the rubies and the node people trying to replace apache :P and things of this nature, and the new things bomb out, you can understand :P

    sometimes stable is good where a base system is concerned.

    what about sha ? sha is more of a recognizable standard, the question is which bit level would be adequate.
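    (quickest way to answer the bit-level question for your own machine is to just time the candidates - a throwaway benchmark sketch with hashlib, the numbers will differ per CPU:)

        import hashlib
        import os
        import timeit

        buf = os.urandom(16 * 1024 * 1024)  # 16 MiB of random data

        for name in ("md5", "sha256", "sha512", "blake2b"):
            algo = getattr(hashlib, name)
            t = timeit.timeit(lambda: algo(buf).digest(), number=10)
            print(f"{name:8s} {t:.3f}s for 10 x 16 MiB")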
  • 0
    @IntrusionCM mainly performance is more important in this scenario than 'ommmgzzzzzz a million files and one of them generated the same hash i'm fucked !', which will likely never ever happen.
  • 1
    @AvatarOfKaine yes.

    I'm aware that performance is your primary concern.

    That's the reason I can only recommend newer algorithms - MD5 was designed at a time when we were happy to have 133 MHz.

    The thing with hashes and standards is more a concern in case of cryptography.

    You want a reliable way to determine file hashes - the more modern approaches / implementations of standards have chunking and parallelism by design, not piggybacked via the implementation.

    Blake (the original, not Blake2) was e.g. in the finals of the SHA-3 competition
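    For comparison, hashing a file with the third-party blake3 package vs. the stdlib's blake2b - just a sketch, the file name is made up (pip install blake3):

        import hashlib
        import blake3

        def file_digest(path, hasher, chunk_size=1 << 20):
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk_size), b""):
                    hasher.update(block)
            return hasher.hexdigest()

        print(file_digest("bigfile.bin", blake3.blake3()))
        print(file_digest("bigfile.bin", hashlib.blake2b()))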
  • 0
    @IntrusionCM wouldn’t that likely suggest that it was higher performance ? Hehe
  • 1
    @killames depends.

    https://keccak.team/2017/...

    https://github.com/BLAKE3-team/...

    The trouble with standards and a theoretical approach to cryptography, like the one NIST / NSA take, is that it's slow, theoretical and less practical.

    I'm a fan of standards.... Usually.

    But a lot of cryptography experts dislike some of the choices NIST / NSA made.

    And to go one step further - NIST / NSA made the standardization a lengthy and painful process.

    Blake and other FOSS implementations of cryptography have one plus - they adapt. To better match current hardware, to follow current recommendations / concerns, to discuss in a transparent manner.

    https://csrc.nist.gov/projects/...

    When you look at the timetable of SHA-3... It was nearly 8 years... A very very long time.
  • 0
    @IntrusionCM hehe I looked up Blake hashing and the first words I noticed reminded me of this

    https://youtu.be/lWZ7yLrSicw
  • 0
    @IntrusionCM again I'm thinking the only way to be sure would be to try to engineer scenarios where the unique values are not so unique compared to the unique data they are representing

    So many things require a person to test things or devise tests for themselves

    I looked recently at some of the hashing standards for sha and honestly I think I'd need a math degree not to implement them or understand them, but to figure out how to make them break
  • 0
    @IntrusionCM in general slow progress for purposes such as hashing doesn't seem to be that critical unless it's being applied to information security.