72
Parzi
4y

Forgot FAT32 had a file count limit.

Turns out Linux won't stop you from writing too many files to a FAT32 drive.

Turns out this makes FAT32 do *really weird shit.*

Comments
  • 6
    @Parzi What did you need FAT32 on Linux for?
  • 5
    @groxx 3DS.

    That'd be nice, but no, and also Microsoft royalties...
  • 5
    @metamourge 3DS' SD card got a No-Intro collection or 37 put on it. Easily bypassed the 200-some million file limit.
  • 6
    @Parzi Maybe you should report this to the kernel project; it could be a bug.
  • 4
    @metamourge all the shit is broken. I don't think it'd matter that much anymore, either, as FAT32/VFAT is a third-party driver and I dunno who made it, so I can't send it to them.
  • 5
    When it gets too fat it starts fucking up

    Sounds like a metaphor :p
  • 2
  • 12
    NTFS also has this really weird limit on path length when reading files. So software like Dropbox, npm, and others can write crazy long, deep directory trees, but because of that reading limit Windows won't let you navigate to the file, and you can't delete it either because "durr, filename too long". However, you can mount the directory from PowerShell, then go as far in as you can, mount that, rinse and repeat until you can actually delete shit
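    (For what it's worth, a different way out, sketched in Python on Windows: the \\?\ extended-length path prefix makes the wide-character Win32 calls skip the 260-character MAX_PATH check, so the file can be deleted directly instead of mount-and-descend. The path below is made up.)

        # Delete a file whose absolute path is longer than MAX_PATH (260 chars).
        import os

        def remove_long_path(path):
            abs_path = os.path.abspath(path)        # must be absolute and normalized
            if not abs_path.startswith("\\\\?\\"):  # add the \\?\ prefix if missing
                abs_path = "\\\\?\\" + abs_path
            os.remove(abs_path)                     # prefix bypasses the MAX_PATH check

        # Hypothetical deeply nested npm path:
        # remove_long_path(r"C:\work\app\node_modules\a\node_modules\b\index.js")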
  • 3
    Of course it has a file count limit! When it comes to non-empty files, you can have at most as many files as there are clusters on the file system. If all files are empty, the limit should be 32 * free space in KiB (16 entries per 512-byte sector).
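    (Putting rough numbers on that, a quick sketch; the 128 GB size and 4 KiB cluster size are just an illustration, and the real figures depend on how the volume was formatted.)

        # Back-of-the-envelope FAT32 file-count bounds from the reasoning above.
        DIR_ENTRY_BYTES = 32             # 16 directory entries per 512-byte sector
        FAT32_CLUSTER_CEILING = 2 ** 28  # 28-bit cluster numbers cap the count just under this

        def fat32_file_bounds(partition_bytes, cluster_bytes):
            # Non-empty files: each occupies at least one cluster.
            non_empty = min(partition_bytes // cluster_bytes, FAT32_CLUSTER_CEILING)
            # Empty files: each still burns a 32-byte directory entry,
            # i.e. 32 entries per KiB spent on directories.
            empty = partition_bytes // DIR_ENTRY_BYTES
            return non_empty, empty

        non_empty, empty = fat32_file_bounds(128 * 10**9, 4096)
        print(f"non-empty files: at most {non_empty:,}")  # 31,250,000 with 4 KiB clusters
        print(f"empty files:     at most {empty:,}")      # 4,000,000,000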
  • 2
    What crazy shit does it do?
  • 3
    Let's do this to a USB stick and plug it into Windows, I bet it would catch fire or sth
  • 0
    @Charmesal Access Denied on *read*, as if the user didn't have access to the file. FAT32 doesn't have this feature normally, so the driver just crashes (thus kernel-panicking the 3DS' firmware at pid like 3 or 4)
  • 0
    @SomeNone it's actually in the 210-million range per partition. I got over 300 million onto it.
  • 0
    @inaba Never had that issue. Usually it fails to write for me, so...
  • 1
    Ouch, I'd call that a Linux issue 😅 maybe file a bug report or something?
  • 0
    @linuxxx yeah, but who makes the VFAT driver?
  • 1
    I came here just to read some classic anti-Windows comments, even when Windows isn't the topic... Linux fanboys are amazing
  • 2
    @dontbeevil I genuinely hope you don't mean mine, otherwise I'd really have to call you fucking blind.

    @Parzi I have no clue
  • 1
    @linuxxx why do you always think that? I've told you several times already that luckily you're not one of them; you should be more confident and not take it personally :) just read the comments from the bottom and go up about 8 positions, you'll find your answer :)
  • 2
    @dontbeevil Oh oopsie, can you point me to one of those comments? I genuinely missed that!
  • 0
    @dontbeevil you referring to the comment about Dropbox being cancer?
  • 2
    @dontbeevil Although, reading the comments, I mostly see troubleshooting and Linux 'blaming'...
  • 1
    @linuxxx I mean... it *was* Linux's fault, as it didn't warn or deny me at all...
  • 2
    @Parzi Yeah, that haha. Who develops this? If Microsoft, then you can 'blame' Microsoft in some way as well (not Windows, Microsoft)
  • 1
    @linuxxx "...plug it into windows... Will catch fire"
  • 1
    @Parzi 300 million files on a partition of what size?
  • 1
    @Parzi So it's Linux's fault for not telling you you had too many files?

    I'm pretty sure the same would have happened under Windows.
  • 2
    @SomeNone 128GB, and Windows does actually stop me ("General I/O Error" on a pristine SD card and a known-good USB adapter)
  • 1
    @Parzi If we are talking about empty files, a 128G partition can accommodate about 4 billion of them.

    How does a 'General I/O error' tell you you have too many files?

    (Salutes to General I/O…)
  • 1
    @SomeNone again, FAT32 has a per-partition file limit of like 210 million according to several sites, and they weren't empty. I have a hacked N3DSXL and put as many No-Intro sets as I could get ahold of on it (it's a decent emulation platform, praise Libretro!)
    Also, I'm assuming the I/O error is due to the file limit, as it only occurs on Windows when I get pretty close to it (I'm assuming the number also includes directories, which brings it up to almost the listed max-file number.)
  • 1
    Linux fanboys are so cute, there is a rant against Linux

    "But but I'm sure that on Windows there is the same problem or even worse"

    Without having a single clue, of course
  • 1
    @dontbeevil who you accusing?
  • 2
    @Parzi "... It's linux fault...the same would have happened under windows"
  • 2
    @dontbeevil I mean

    in this case Linux broke a standard

    that's a legit worrying case, as only Linux ever gets standards implemented properly

    and if LINUX fucked up a standard...
  • 1
    @Parzi yeah... but still, that wasn't really my point, I'm just annoyed by how some people react to Linux issues
  • 1
    @Parzi nice, you doing homebrew too?
  • 1
    @netherfail04 yeah... homebrew. Let's go with that.
    (yeah, I'm the only active Linux 3DS dev atm. Not going well, considering all the custom hardware and weird bullshit of the processors.)
  • 1
    @Parzi YOO REALLY? I have a version installed. I'm not sure if it's the same version as yours though. Can you post a link to the GitHub?
  • 0
    @netherfail04 https://gbatemp.net/threads/...
    End of the thread; at some point back you'll find the pieces you need for the initramfs and chainload image. Basic, buggy SD R/W is implemented, not much else. We kinda dropped off the map.
  • 1
    @Parzi oh yeah I have this on mine.
  • 0
    @netherfail04 it's like Page 36?
  • 1
    @Parzi no just the main one on the thread
  • 0
    @netherfail04 the first one on the thread is Xerpi's build, the last one before takeover.
  • 1
    Linux is a hobbyist's system. What do you expect?