85 · ddephor · 7y

Biggest scaling challenge?
The imaginary scaling issues from clients.

Client: How do you cope with data that's a billion times bigger than our current data set? Can you handle that? How much longer will it take to access the data then?

I could give a speech about optimized internal data structures and access algorithms with O(log n) complexity, but that wouldn't help; non-technical people won't understand it (see the sketch below).
And telling them the system will be outdated and hopefully replaced long before that amount of data is reached would just be misinterpreted as "Our system can't handle it".

So the usual answer is: "No problem, our algorithms are optimized so they can handle any amount of data."
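For what it's worth, the O(log n) point is easy to show with plain numbers. Here's a minimal sketch (the record counts are made up, and it assumes a balanced index or binary search, not any particular system): even if the data really did get a billion times bigger, the worst-case lookup only gains a few dozen comparisons.

```python
import math

def lookup_steps(n: int) -> int:
    """Worst-case comparisons for a binary search over n sorted keys."""
    return max(1, math.ceil(math.log2(n)))

current = 1_000_000                # hypothetical current record count
future = current * 1_000_000_000   # the client's "a billion times bigger"

print(lookup_steps(current))  # ~20 comparisons
print(lookup_steps(future))   # ~50 comparisons, not a billion times more
```

Still doesn't make for a good client-facing speech, though.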

Comments
  • 17
Our bogosort handles any data volume

    ....I never said in what time frame....
  • 9
"We support 100 tetrakvods of data out of the box, don't you worry"
  • 11
    "premature optimization is the root of all evil"
  • 3
    @antic Tell that to Google lol
  • 4
    The correct answer being "was that part of the original specification?"
  • 6
    "What kind of hash salting do you use for password security"

    "Ughh...Mega Hash."

    "Great!"

    "Christ I can't believe that worked."