
Why do I program everything myself in C, even a REST service? By writing everything yourself in C, you make simple things complex in order to make complex things simple.

Writing a REST service, for example, teaches you part of the HTTP protocol, how sockets work, and how to write a parser (in this case JSON). Three things you would miss if you used Python.

On top of that, your REST service uses WAY fewer resources than one written in Python, for example. Especially in CPU usage.

Allocating and freeing still cause issues there now and then, but I consider that a skill problem / discipline issue. Not blaming C for that. The rules are clear.
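
To make the sockets-and-HTTP point above concrete: below is a minimal sketch of the kind of server loop you end up writing. It is not my actual service; request parsing and error handling are stripped to almost nothing, and the port is arbitrary.

```c
/* Minimal sketch, not the real service: accept a TCP connection,
 * read whatever request bytes arrive, answer with a tiny JSON body.
 * Error handling and real request parsing are left out on purpose. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    int yes = 1;
    setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &yes, sizeof(yes));

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);               /* arbitrary port */

    bind(srv, (struct sockaddr *)&addr, sizeof(addr));
    listen(srv, 16);

    for (;;) {
        int client = accept(srv, NULL, NULL);
        char req[4096];
        recv(client, req, sizeof(req) - 1, 0);  /* naive: one read, no parsing */

        const char *body = "{\"ok\":true}";
        char resp[256];
        int n = snprintf(resp, sizeof(resp),
                         "HTTP/1.1 200 OK\r\n"
                         "Content-Type: application/json\r\n"
                         "Content-Length: %zu\r\n"
                         "Connection: close\r\n"
                         "\r\n%s",
                         strlen(body), body);
        send(client, resp, (size_t)n, 0);
        close(client);
    }
}
```

Even this much forces you to deal with sockets, byte buffers and the HTTP wire format directly, which is exactly the point.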

Comments
  • 2
    My last REST service required sqlite3, and now I have a bit of a feel for how a database is implemented. I also added a native database function, 'bool' in this case. It converts true/1 to "true" and "true" to 1, so bool(bool(true)) == "true". I can use "select bool(a) from table where a = true". So I have learned that along the way too (a rough sketch of how such a function is registered follows below).

    What's next: build my own db, but that's too much work even for me. It's more work than an average general-purpose programming language, since it also includes a parser / interpreter, and that's only the start! I watched an hour-long video about sqlite internals and decided I'm not writing a database myself, although I'm probably capable. Besides, sqlite3 has too many extras (e.g. a journal with several storage options) that you really want on top of the basic db implementation.
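
    For the curious: registering such a native function goes through sqlite3_create_function. The sketch below follows the conversion rules described above, but it is an illustration, not my exact implementation.

```c
/* Sketch of a custom bool() SQL function for sqlite3.
 * The conversion rules follow the idea above (1 <-> "true"),
 * but this is an illustration, not the exact implementation. */
#include <sqlite3.h>
#include <string.h>

static void bool_func(sqlite3_context *ctx, int argc, sqlite3_value **argv) {
    (void)argc; /* registered with exactly one argument */
    if (sqlite3_value_type(argv[0]) == SQLITE_TEXT) {
        const char *s = (const char *)sqlite3_value_text(argv[0]);
        sqlite3_result_int(ctx, s && strcmp(s, "true") == 0 ? 1 : 0);
    } else {
        sqlite3_result_text(ctx, sqlite3_value_int(argv[0]) ? "true" : "false",
                            -1, SQLITE_STATIC);
    }
}

/* Register once after sqlite3_open(): */
int register_bool(sqlite3 *db) {
    return sqlite3_create_function(db, "bool", 1, SQLITE_UTF8, NULL,
                                   bool_func, NULL, NULL);
}
```

    After that, queries like "select bool(a) from table where a = true" just work.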
  • 3
    I remember at university I had to do an HTTP 1.0 server in C and it was fun!
  • 2
    @cafecortado same! We also had to implement HTTPS, so all the certificate negotiation stuff, and it was super fun
  • 2
    Implementing HTTP is fun, however JSON parsing 💀💀💀
  • 2
    @cafecortado did you use the recommended limits for total header size and header values? Did you build in protections so a user couldn't go to "/../"?

    What other interesting assignments did you build during university?

    The cool thing about my server is that it supports HTTP chunking. It can deliver a shitload of data while barely using any memory, and the client is happy because it gets data all the time. I do not first generate the response and THEN send it; I send while generating. So a stream. Very good for large datasets (the framing is sketched below).
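
    The chunk framing itself is small. Roughly the sketch below, assuming a connected socket and a response header that already declared Transfer-Encoding: chunked; this is an illustration, not my actual code.

```c
/* Sketch of HTTP/1.1 chunked framing: each piece is sent as
 * <hex length>\r\n<data>\r\n while it is being generated, and a final
 * 0\r\n\r\n terminates the stream. Assumes the response header already
 * declared "Transfer-Encoding: chunked". Illustration only. */
#include <stdio.h>
#include <sys/socket.h>

static int send_chunk(int fd, const char *data, size_t len) {
    char head[32];
    int n = snprintf(head, sizeof(head), "%zx\r\n", len);  /* length in hex */
    if (send(fd, head, (size_t)n, 0) < 0) return -1;
    if (len > 0 && send(fd, data, len, 0) < 0) return -1;
    return send(fd, "\r\n", 2, 0) < 0 ? -1 : 0;
}

static int end_chunks(int fd) {
    return send(fd, "0\r\n\r\n", 5, 0) < 0 ? -1 : 0;  /* zero-length chunk = done */
}
```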
  • 3
    @Tounai that JSON library was done in one very sharp evening. Not only parsing, but also constructing JSON. In the tests I check that serialize(deserialize(serialize())) equals the original content (a sketch of the check is below). Works flawlessly. The serializer and the deserializer are each a single recursive function, between 70 and 80 lines. The total JSON library is 590 lines.
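
    The round-trip check is only a handful of lines. A sketch of the idea, where json_parse, json_serialize and json_free are hypothetical placeholder names, not my library's real API:

```c
/* Round-trip test in the spirit of the one described: the second
 * serialization must equal the first. json_parse, json_serialize and
 * json_free are hypothetical names standing in for the real library. */
#include <assert.h>
#include <stdlib.h>
#include <string.h>

typedef struct json json_t;              /* opaque value type (assumed)  */
json_t *json_parse(const char *text);    /* assumed deserializer         */
char   *json_serialize(const json_t *v); /* assumed serializer           */
void    json_free(json_t *v);

static void test_roundtrip(const char *input) {
    json_t *a = json_parse(input);
    char *first = json_serialize(a);

    json_t *b = json_parse(first);
    char *second = json_serialize(b);

    assert(strcmp(first, second) == 0);  /* stable after one round trip */

    free(first);
    free(second);
    json_free(a);
    json_free(b);
}
```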
  • 1
    @retoor I don't remember. Probably it was filled with vulnerabilities that I didn't care enough about xD
  • 1
    @retoor It’s 590 lines I would have hated to write. It is among the things I hate doing, especially since it has performance considerations
  • 2
    If implementing everything in C will make me a better dev, then I'm all for it
  • 2
    @UberSalt I really think so. Also, if you're lazy: after writing some C, not anymore. It's a good way to get rid of laziness. But last night I wrote some Python and damn, that goes fast, but damn, does it use a lot of resources. My C server handles three Python clients requesting full-time with just 40% CPU, while all the Python processes are at 100%. And that C application even does the heavy database stuff. A million requests in, still at 0.3 memory usage or something. Writing C is rewarding
  • 1
    C is nice, but I think the architecture is lacking. That’s why I use Zig. The standard library is beefy enough - and optimized enough - that I can get machine-language performance without pulling in tons of dependencies or losing performance by reinventing the wheel. Unless I want to.
  • 0
    @AlgoRythm I work with nearly no dependencies. I nearly always reinvent the wheel; that's the fun part for me. I have a big lib with all my (re)inventions that gets compiled to a single header file I use in all my projects. I may have reinvented the wheel, but I won't do it twice! This library consists of about 40 (or so) header files, with a .c file for each containing the tests. So, before compiling my library into one big header file, it builds and runs 40 very well tested components.

    The merger of all those files only inlines local includes ("", not <>) and prevents double includes. It includes everything in the right order. I have one file that includes all the header files, since order matters a bit (overriding malloc), and that one is the merge source (roughly like the sketch after this comment).

    My REST service uses only sqlite3 as a dependency, something too big to write myself. I'll watch the sponsored Zig episode of tsoding. I probably won't use it though; reinventing the wheel is my goal, which makes C kinda perfect for me. So educational.
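
    For illustration, the merge source looks roughly like the header below. The file names are made up; the point is the include guard, the local-only includes and the ordering, with the malloc override first.

```c
/* Hypothetical merge-source header, in the spirit of what's described
 * above. File names are invented for illustration; the real library
 * has ~40 of them, each with its own test .c file. */
#ifndef RLIB_H
#define RLIB_H

#include "rmalloc.h"   /* must come first: overrides malloc/free   */
#include "rstring.h"   /* only local ("") includes, never <> ones  */
#include "rbuffer.h"
#include "rjson.h"
#include "rhttp.h"
/* ...remaining components, in dependency order... */

#endif /* RLIB_H */
```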
  • 0
    There's tons of videos about how HTTP and other protocols work, but they only provide surface-level knowledge and don't dig deeper into what the protocols actually look like in binary or textual form. It's so much easier to show the internals of HTTP/1.1, since it is text based, compared to HTTP/2 or 3. When I implemented a super simple HTTP server in C#, I understood more about how sockets work. Like why there's a "how many bytes to read" argument when I'm like "I dunno, you tell me?". It's just an endless stream of bytes; the protocol implementer has to define when to stop reading from that socket stream (roughly like the read loop sketched below).
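
    That "endless stream of bytes" part is exactly why the read loop has to know the protocol. A sketch of the idea for HTTP/1.1 headers, in C rather than C#, and not a complete or hardened implementation:

```c
/* Sketch: TCP gives an endless byte stream, so the protocol framing
 * decides when to stop reading. For HTTP/1.1, the blank line \r\n\r\n
 * marks the end of the headers. Illustration only, not hardened. */
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>

/* Read until the end-of-headers marker or the buffer is full.
 * Returns bytes read, or -1 on error / early close. */
static ssize_t read_headers(int fd, char *buf, size_t cap) {
    size_t used = 0;
    while (used < cap - 1) {
        ssize_t n = recv(fd, buf + used, cap - 1 - used, 0);
        if (n <= 0) return -1;
        used += (size_t)n;
        buf[used] = '\0';
        if (strstr(buf, "\r\n\r\n")) break;  /* protocol says: headers done */
    }
    return (ssize_t)used;
}
```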