33

Some of the dinosaurs maintaining legacy systems are rather well-meaning, but somewhat frustratingly don't seem to have updated their idea of what counts as "a lot of storage space" since the early '90s. Just asked one of them if they could dump all the legacy logs to AWS each day so we could keep them for six months or so (they're rolled over weekly otherwise).

"Well I mean we can, but are you sure? *All* the logs? You could end up storing close to 100MB of logs a month! Won't that be prohibitively expensive?!"

It's rather cute really.

Comments
  • 7
    Aawwww :) it IS cute!
  • 16
    Imagine how efficient most software would be today if we kept worrying about that stuff to that degree. One can dream.
  • 7
    I used to work with some super smart ASIC engineers. Really great guys.

    But I'm 90% sure that some of them didn't know what their chips actually 'did' IRL.

    Like, I'm pretty sure that when they picked up their smartphones and fumbled around with them... they had no clue that their chips were handling that network traffic...
  • 8
    With just some tiny little bit of optimizing or forethought, software could be so much faster and more efficient.

    Related example:
    In our Rails project at work, we have thousands of “resources” — basically named fancy nested macros. The lists of resource names are stored in string arrays, and they’re referenced by string everywhere. Using symbols instead would reduce the memory size by about 32%, and increase lookup speed by about 63%. (Just benchmarked this)

    (Symbols in Ruby are immutable, interned strings represented by an integer ID under the hood, so comparisons and lookups are considerably faster)

    It’s such a small change for such a large benefit, but even so, that’s a line-by-line optimization. Better architectural patterns? Those can take you from 30% speedups to 10x, 20x, etc.

    If devs don’t give much thought to what their code is doing and why, it will always end up slow and voracious.
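    Something like this toy benchmark shows the gap (made-up names here, not our actual resource list, so take the exact numbers with a grain of salt):

    require "benchmark"

    # Hypothetical resource names standing in for the real list
    names = 10_000.times.map { |i| "resource_#{i}" }

    string_keys = names.to_h { |n| [n, true] }          # hash keyed by Strings
    symbol_keys = names.to_h { |n| [n.to_sym, true] }   # hash keyed by Symbols

    lookups_str = names.sample(1_000)
    lookups_sym = lookups_str.map(&:to_sym)

    Benchmark.bm(8) do |x|
      x.report("strings") { 100.times { lookups_str.each { |k| string_keys[k] } } }
      x.report("symbols") { 100.times { lookups_sym.each { |k| symbol_keys[k] } } }
    end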
  • 1
    A month? What are these, logs for ants?
  • 1
    @Root are symbols comparable to enums?
  • 1
    @Lensflare They work a little differently, but that’s a really good parallel.

    Think of a global enum list where new enums are initialized upon first use. They’re just ints under the hood, so instantiation is quick, and lookups are quick too. Dynamically created symbols can also be garbage collected eventually (since Ruby 2.2), so there’s no runaway memory issue.

    Compare this to using strings for everything. Wasted memory, slower (and repeated) instantiations, slower lookups, uglier syntax colors 😅, ...

    Ruby’s memory management is a little more complicated than this implies (and quite interesting), but the takeaway is that symbols are absolutely faster.
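    A quick console sketch of the "enum-ish" behaviour (assuming frozen string literals aren't enabled):

    # Every occurrence of :active is the same interned object...
    :active.equal?(:active)                  #=> true
    :active.object_id == :active.object_id   #=> true

    # ...while each "active" literal allocates a fresh String
    "active".equal?("active")                #=> false

    # Symbol equality is basically an identity check;
    # String equality compares the contents character by character
    :active == :active                       #=> true (cheap)
    "active" == "active"                     #=> true (walks the bytes)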
  • 2
    @Root

    Not in Ruby land, but where I am I just saw something similar-ish. Some queries were checking a perfectly good string as part of what they normally do, but were running slower than they should.

    Someone was all "hey, what if that wasn't just a string?" and switched to checking something that wasn't a string but effectively indicated the same thing.

    Speed increased dramatically.

    Well shit....

    Just one of those things nobody thought of at the start.
  • 0
    @Root I just looked up memory management in Ruby and it seems to do garbage collection.
    Since you said it's interesting, I'm curious: is it different from what C# or Java does?
  • 1
    @Lensflare How Ruby handles memory is kind of interesting.

    Instead of typing up a long description of allocating heap pages and object blocks, etc. on my phone, have a link!

    https://joyfulbikeshedding.com/blog...

    There’s also some interestingness around symbols vs short vs long strings, arbitrary-length integers, etc., but you’ll have to dig through the mailing list for the history on those (or look for other articles, I suppose!)
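    If you want to poke at the heap-page stuff yourself, GC.stat and ObjectSpace expose the counters that article walks through (the exact key names vary a bit between Ruby versions):

    stats = GC.stat
    p stats[:heap_allocated_pages]  # heap pages the VM has allocated
    p stats[:heap_live_slots]       # object slots currently holding live objects
    p stats[:count]                 # GC runs so far

    # Rough breakdown of what's sitting on the heap right now
    p ObjectSpace.count_objects.slice(:TOTAL, :FREE, :T_STRING, :T_SYMBOL)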
  • 0
    @SortOfTested Batch jobs. Logs are quite noisy but they don't run that often.